Congratulations, Bam!

Congratulations to President Barack Obama for the award of the 2009 Nobel Prize for Peace!
Former speechwriter to President Jimmy Carter, James Fallows, analyzes Bam’s speech yesterday here.  The Peace Prize is yet another commonality between US Presidents 44 and 26.
I am stunned that much of the commentariat seems to think Obama has not done anything to deserve this, as if ending the Bush-Cheney doctrine of global bullying was nothing at all.  Let us not forget that an unelected US Administration decided, in August 2002, to invade Iraq; that said administration and their allies refused for several months to provide the public with any reasons for that decision (a refusal which led the Australian Senate, for example, to pass its first-ever and so-far-only motion of censure against a sitting Prime Minister, and which led to the largest public demonstrations in Europe for four decades); and that the decision was then justified to the public on grounds the justifiers appear to have known at the time to be misleading and possibly also false.  For 8 long years, the US Government was led by a secretive, macho, power-hungry, war-mongering, torture-mongering, jingoistic, neoconservative cabal, and as a consequence the peace and safety of all of us around the world was lessened.  The prospects for global peace improved dramatically at 12 noon on 20 January 2009, immediately upon the removal of that cabal from office, a removal that was itself also a major achievement, and the Nobel Committee has recognized that real achievement for peace with this award.
Among the churlish commentary, I was most surprised by the reaction of former Polish President Lech Walesa, who apparently said, “So soon? Too early. He has no contribution so far.  He is still at an early stage.”  But the Nobel Prize for Peace is sometimes awarded to people or groups as a statement of solidarity by the Nobel Committee, and thus the world, with the person or cause receiving it.  Recent examples include a courageous national political leader under house arrest (Aung San Suu Kyi, 1991), a courageous dissident scientist also held under house arrest by his Government (Andrei Sakharov, 1975), and the leader of an outlawed trade union, whose cause appeared at the time not only to have failed completely but to have been entirely counter-productive, leading as it did to martial law and more political repression in response than would otherwise have been the case (Lech Walesa, 1983).
Meanwhile, Andrew Sullivan quotes an anonymous correspondent:

Remember how Obama should have stepped aside and let Hillary win the primaries? Remember how America wasn’t ready for a black President, of course, so why didn’t he just realize it and wait his turn? Remember last summer when the candidate went to Germany and gave a speech before hundreds of thousands of adoring fans?  How arrogant.  Who does he think he is?  Only a president should do that.  He should have at least waited until he won. And then he did win.  And he took a world tour and gave a game-changing speech in Cairo.  Who did he think he was?  A rock star?  The arrogance and audacity – it’s breathtaking. If the man would just wait his turn, dammit.

Myopic utilitarianism

What are the odds, eh?  On the same day that the Guardian publishes an obituary of the theoretical computer scientist Peter Landin (1930-2009), pioneer of the use of Alonzo Church’s lambda calculus as a formal semantics for computer programs, it also reports that the Government is planning to fund only research which has relevance to the real world.  This is GREAT NEWS for philosophers and pure mathematicians!
What might have seemed, for example, mere pointless musings on the correct way to undertake reasoning – by Aristotle, by Islamic and Roman Catholic medieval theologians, by numerous English, Irish and American abstract mathematicians in the 19th century, by an entire generation of Polish logicians before World War II, and by those real-world men-of-action Gottlob Frege, Bertrand Russell, Ludwig Wittgenstein and Alonzo Church – turned out to be EXTREMELY USEFUL for the design and engineering of electronic computers.  Despite Russell’s Zen-influenced personal motto – “Just do!  Don’t think!” (later adopted by IBM) – his work turned out to be useful after all.  I can see the British research funding agencies right now, using their sophisticated and proven prognostication procedures to calculate the society-wide economic and social benefits we should expect to see from our current research efforts over the next 2300 years – i.e., the length of time that Aristotle’s research on logic took to be implemented in technology.  Thank goodness our politicians have shown no myopic utilitarianism this last couple of centuries, eh what?!
All this while the man himself apparently received no direct state or commercial research funding for his efforts as a computer pioneer, playing with “pointless” abstractions like the lambda calculus.
And Normblog also comments.
POSTSCRIPT (2014-02-16):  And along comes The Cloud and ruins everything!  Because the lower layers of the Cloud – the physical infrastructure, operating system, even low-level application software – are fungible, and dynamically so, the Cloud is effectively “dark” to its users beneath some level.  Specifying and designing applications that will run over it, or systems that will access it, thus requires specification and design to be undertaken at high levels of abstraction.  If all you can say about your new system is that in 10 years’ time it will grab some data from the NYSE, and nothing (yet) about the format of that data, then you need to speak in abstract generalities, not in specifics.  It turns out the lambda calculus is just right for this task, and so London’s big banks have been recruiting logicians and formal methods people to spec & design their next-gen systems.  You can blame those action men, Church and Russell.
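For readers who have never met it: in the lambda calculus everything, even a number, is a function.  As a purely illustrative toy (my own sketch, nothing from Landin’s papers or any bank’s systems), here is Church’s encoding of the natural numbers, written in Python, a language whose anonymous functions are literally called lambdas:

```python
# Church numerals: a number n is "apply f to x, n times".
ZERO = lambda f: lambda x: x                      # apply f zero times
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))   # one more application

# Arithmetic falls out as function composition.
ADD = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
MUL = lambda m: lambda n: lambda f: n(m(f))

def to_int(n):
    """Convert a Church numeral to a Python int by counting applications."""
    return n(lambda k: k + 1)(0)

ONE = SUCC(ZERO)
TWO = SUCC(ONE)
THREE = ADD(ONE)(TWO)

print(to_int(THREE))            # 3
print(to_int(MUL(TWO)(THREE)))  # 6
```

No built-in numbers are used anywhere above the conversion function: all the arithmetic is done by composing functions, which is exactly the kind of “pointless” abstraction Church was studying in the 1930s.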

Nicolas Fatio de Duillier

Nicolas Fatio de Duillier (1664-1753) was a Genevan mathematician and polymath who, for a time in the 1680s and 1690s, was a close friend of Isaac Newton. After coming to London in 1687, he became a Fellow of the Royal Society (on 1688-05-15), as later did his brother Jean-Christophe (on 1706-04-03).  He played a major part in Newton’s feud with Leibniz over who had invented the differential calculus, and was all his life a proponent of Newton’s thought and ideas.
Continue reading ‘Nicolas Fatio de Duillier’

Guerrilla logic: a salute to Mervyn Pragnell

When a detailed history of computer science in Britain comes to be written, one name that should not be forgotten is Mervyn O. Pragnell.  As far as I am aware, Mervyn Pragnell never held any academic post and he published no research papers.  However, he introduced several of the key players in British computer science to one another and, as importantly, to the lambda calculus of Alonzo Church (Hodges 2001).  At a time (the 1950s and 1960s) when logic was not held in much favour in either philosophy or pure mathematics, and before it came to be highly regarded in computer science, he studied the discipline not as a salaried academic in a university, but in a private reading-circle of his own creation, almost as a guerrilla activity.

Pragnell recruited people for his logic reading-circle by haunting London bookshops, approaching people he saw buying logic texts (Bornat 2009).  Among those he recruited to the circle were later-famous computer pioneers such as Rod Burstall, Peter Landin (1930-2009) and Christopher Strachey (1916-1975).  The meetings were held after hours, usually in Birkbeck College, University of London, without the knowledge or permission of the college authorities (Burstall 2000).  Some were held or continued in the neighbouring pub, The Duke of Marlborough.  It seems that Pragnell was employed for a time in the 1960s as a private research assistant for Strachey, working from Strachey’s house (Burstall 2000).   By the 1980s, he was apparently a regular attendee at the seminars on logic programming held at the Department of Computing in Imperial College, London, then (and still) one of the great research centres for the application of formal logic in computer science.

Pragnell’s key role in early theoretical computer science is sadly under-recognized.   Donald MacKenzie’s fascinating history and sociology of automated theorem proving, for example, mentions Pragnell in the text (MacKenzie 2001, p. 273), but manages to omit his name from the index.  Other than this, the only references I can find to his contributions are in the obituaries and personal recollections of other people.  I welcome any other information anyone can provide.

UPDATE (2009-09-23): Today’s issue of The Guardian newspaper has an obituary for theoretical computer scientist Peter Landin (1930-2009), which mentions Mervyn Pragnell.

UPDATE (2012-01-30):  MOP appears also to have been part of a production of the play The Way Out at The Little Theatre, Bristol in 1945-46, according to this web-chive of theatrical info.

UPDATE (2013-02-11):  In this 2001 lecture by Peter Landin at the Science Museum, Landin mentions first meeting Mervyn Pragnell in a cafe in Sheffield, and then talks about his participation in Pragnell’s London reading group (from about minute 21:50).

UPDATE (2019-07-05): I have learnt some further information from a cousin of Mervyn Pragnell, Ms Susan Miles.  From her, I understand that MOP’s mother died in the Influenza Pandemic around 1918, when he was very young, and he was subsequently raised in Cardiff in the large family of a cousin of his mother’s, the Miles family.  MOP’s father’s family had a specialist paint manufacturing business in Bristol, Oliver Pragnell & Company Limited, which operated from 25-27 Broadmead.  This establishment suffered serious bomb damage during WW II.   MOP was married to Margaret and although they themselves had no children, they kept in close contact with their relatives.  Both are remembered fondly by their family.   (I am most grateful to Susan Miles, daughter of Mervyn Miles whose parents raised MOP, for sharing this information.)

References:

Richard Bornat [2009]:  Peter Landin:  a computer scientist who inspired a generation, 5th June 1930 – 3rd June 2009.  Formal Aspects of Computing, 21 (5):  393-395.

Rod Burstall [2000]:  Christopher Strachey – understanding programming languages.  Higher-Order and Symbolic Computation, 13:  51-55.

Wilfrid Hodges [2001]:  A history of British logic.  Unpublished slide presentation.  Available from his website.

Peter Landin [2002]:  Rod Burstall:  a personal note. Formal Aspects of Computing, 13:  195.

Donald MacKenzie [2001]:  Mechanizing Proof:  Computing, Risk, and Trust.  Cambridge, MA, USA:  MIT Press.

Alan Turing

Yesterday, I reported on the restoration of the world’s oldest, still-working modern computer.  Last night, British Prime Minister Gordon Brown apologized for the country’s treatment of Alan Turing, computer pioneer.  In the words of Brown’s statement:

Turing was a quite brilliant mathematician, most famous for his work on breaking the German Enigma codes. It is no exaggeration to say that, without his outstanding contribution, the history of World War Two could well have been very different. He truly was one of those individuals we can point to whose unique contribution helped to turn the tide of war. The debt of gratitude he is owed makes it all the more horrifying, therefore, that he was treated so inhumanely. In 1952, he was convicted of ‘gross indecency’ – in effect, tried for being gay. His sentence – and he was faced with the miserable choice of this or prison – was chemical castration by a series of injections of female hormones. He took his own life just two years later.

It might be considered that this apology required no courage of Brown.

This is not the case.  Until very recently, and perhaps still today, there were people who disparaged and belittled Turing’s contribution to computer science and computer engineering.  The conventional academic wisdom is that he was good only at the abstract theory and the formal mathematizing (as in his “schoolboy essay” proposing a test to distinguish human from machine interlocutors), and not good at anything practical.  This belief is false.  As the philosopher and historian B. Jack Copeland has shown, Turing was actively and intimately involved in the design and construction work (mechanical & electrical) of creating the machines developed at Bletchley Park during WWII, the computing machines which enabled Britain to crack the communications codes used by the Germans.


Perhaps, like myself, you imagine this revision to conventional wisdom would be uncontroversial.  Sadly, not.  On 5 June 2004, I attended a symposium in Cottonopolis to commemorate the 50th anniversary of Turing’s death.  At this symposium, Copeland played a recording of an oral-history interview with engineer Tom Kilburn (1921-2001), first head of the first Department of Computer Science in Britain (at the University of Manchester), and also one of the pioneers of modern computing.  Kilburn and Turing had worked together in Manchester after WW II.  The audience heard Kilburn stress to his interviewer that what he learnt from Turing about the design and creation of computers was all high-level (i.e., abstract) and not very much, indeed only about 30 minutes worth of conversation.  Copeland then produced evidence (from signing-in books) that Kilburn had attended a restricted, invitation-only, multi-week, full-time course on the design and engineering of computers which Turing had presented at the National Physical Laboratory shortly after the end of WW II, a course organized by the British Ministry of Defence to share some of the lessons of the Bletchley Park people in designing, building and operating computers.  If Turing had so little of practical relevance to contribute to Kilburn’s work, why then, one wonders, would Kilburn have turned up each day to this course?

That these issues were still fresh in the minds of some people was shown by the Q&A session at the end of Copeland’s presentation.  Several elderly members of the audience, clearly supporters of Kilburn, took strident and emotive issue with Copeland’s argument, with one of them even claiming that Turing had contributed nothing to the development of computing.   I repeat: this took place in Manchester 50 years after Turing’s death!    Clearly there were people who did not like Turing, or in some way had been offended by him, and who were still extremely upset about it half a century later.  They were still trying to belittle his contribution and his practical skills, despite the factual evidence to the contrary.

I applaud Gordon Brown’s courage in officially apologizing to Alan Turing, an apology which at least ensures the historical record is set straight for what our modern society owes this man.

POSTSCRIPT #1 (2009-10-01): The year 2012 will be a centenary year of celebration of Alan Turing.

POSTSCRIPT #2 (2011-11-18):  It should also be noted, concerning Mr Brown’s statement, that Turing died from eating an apple laced with cyanide.  He was apparently in the habit of eating an apple each day.   These two facts are not, by themselves, sufficient evidence to support a claim that he took his own life.

POSTSCRIPT #3 (2013-02-15):  I am not the only person to have questioned the coroner’s official verdict that Turing committed suicide.    The BBC reports that Jack Copeland notes that the police never actually tested the apple found beside Turing’s body for traces of cyanide, so it is quite possible it had no traces.     The possibility remains that he died from an accidental inhalation of cyanide or that he was deliberately poisoned.   Given the evidence, the only rational verdict is an open one.

Switch WITCH

The Guardian today carries a story about an effort at the UK National Museum of Computing at Bletchley Park to install and restore the world’s oldest working modern electric computer, the Harwell Dekatron Computer (aka the WITCH, pictured here), built originally for the UK Atomic Energy Research Establishment at Harwell in 1951.  The restoration is being done by the UK Computer Conservation Society.
Note:  The Guardian claims this to be the world’s oldest working computer.  I am sure there are older “computers” still working elsewhere, if we assume a computer is a programmable device.  As late as 1985, in Harare, I saw at work in factories programmable textile and brush-making machinery which had been built in Britain more than a century earlier.
WITCH Computer

“One of the things that attracted us to the project was that it was built from standard off-the-shelf Post Office components, of which we have a stock built up for Colossus,” says Frazer. “And we have some former Post Office engineers who can do that sort of wiring.”
Frazer says he can imagine the machine’s three designers – Ted Cooke-Yarborough, Dick Barnes and Gurney Thomas – going to the stores with a list and saying: “We’d like these to build a computer, please.”
Dick Barnes, now a sprightly 88, says: “We had to build [the machine] from our existing resources or we might not have been allowed to build it at all. The relay controls came about because that was my background: during the war I had produced single-purpose calculating devices using relays. We knew it wasn’t going to be a fast computer, but it was designed to fulfil a real need at a time when the sole computing resources were hand-turned desk calculators.”

Sharks and murder: a Sydney story

Any aspiring crime novelists among you may relish the details of this report from tomorrow’s Sydney Morning Herald:

No wonder the Shark Arm Murder of 1935 remains Sydney’s best-known homicide. Apart from combining two of the city’s enduring interests, crime and sharks, the dismemberment of Jim Smith and its aftermath involved an unusually large number of suburbs, providing plenty of local colour.
. . .
For those uninitiated in the detail of the Shark Arm Murder, Jim Smith, a knockabout and police informer, was killed in Cronulla in a cottage called Cored Joy. According to Alex Castles’s book, most of his body was probably given a “Sydney send-off” and dumped at sea.
But the arm itself, once detached, seems to have been taken by killer Patrick Brady on a journey to the McMahons Point residence of one Reginald Holmes, a pillar of the local Presbyterian church and cocaine smuggler. After the meeting with Holmes, Brady took the arm to Maroubra, and threw it into the ocean. It was eaten by a small shark that was then consumed by a four-metre tiger shark, which was caught and exhibited alive in a pool at the Coogee Aquarium Baths (now the Beach Palace Hotel). When the shark vomited the arm before a fascinated crowd on Anzac Day, police from Randwick and the city were called. During the investigation, Holmes, who ran his business from Lavender Bay, got into a motorboat, consumed a lot of brandy, and tried unsuccessfully to shoot himself. He then led the police on a four-hour chase around the harbour.
On the morning of the inquest into Smith’s death, Holmes was found dead in the driver’s seat of his Nash sedan in The Rocks, with three gunshot wounds in his chest. No one was convicted of the deaths of Smith or Holmes.

Against the macho-securocrats

Andrew Sullivan on torture expresses my views exactly.  Richard B. Cheney and that egregious horseman of the apocalypse, John Bolton, keep making the macho-security argument – that only brute force and brutal methods will guarantee the West’s security.  Not only are such means ineffective and counter-productive, their very immorality vitiates our ends.  Just what western values, precisely, could be defended with torture and arbitrary arrest and detention?  That we inflict cruel and unusual punishments, in secret, on our perceived enemies?  That we treat even innocent people as less than human?  That we think laws and due process are dispensable?  Just whose western values are these?
The macho-security argument needs to be forcefully countered every time it is made, as Andrew Sullivan does here:

Actually, I can [believe that America is now safer because of the new restrictions on torture]. I think the intelligence we now get will be much more reliable; I believe that torture recruited thousands of Jihadists; I believe holding torturers accountable will help restore our alliances and give moral integrity back to the war on terror; I believe that without torture, we may actually be able to bring terrorists to justice; and that restoring America’s moral standing will make the war of ideas against Jihadism more winnable and therefore the West less vulnerable than it is now.

Guest Post: Michael Holzman on Writing Intelligence History

In response to my review of his book on the life of Jim Angleton, Michael Holzman has written a thoughtful post on the particular challenges of writing histories of secret intelligence organizations:
Histories of the activities of secret intelligence organizations form a specialized branch of historical research, similar, in many ways, to military and political history, dissimilar in other ways.  They are similar in that the object of study is almost always a governmental institution and like the Army, for example, a secret intelligence organization may produce its own public and private histories and cooperate or not cooperate with outside historians.  They are dissimilar due to the unusual nature of secret intelligence organizations.
The diplomatic historian has at his or her disposal the vast, rich and often astonishingly frank archives of diplomacy, such as the Foreign Relations of the United States (FRUS).   Needless to say, there is no publication series entitled the Secret Foreign Operations of the United States (or any other country).  What we have instead is something like an archeological site, a site not well-preserved or well-protected, littered with fake artifacts, much missing, much mixed together and all difficult to put in context.
The overwhelming majority of publications about secret intelligence are produced by secret intelligence services as part of their operations, whether purportedly written by “retired” members of those services, by those “close to” such services, or by writers commissioned, directly or through third or fourth parties, by such services.  There are very few independent researchers working in the field.  The most distinguished practitioners, British academics, for example, have dual appointments—university chairs and status as “the historian” of secret intelligence agencies.  There are, of course, muckrakers, some of whom have achieved high status among the cognoscenti, but they are muckrakers nonetheless and as such exhibit the professional deformations of their trade, chiefly a certain obscurity of sourcing and a lack of balance in judgment.
Thus, an academically trained researcher, taking an interest in this field, finds challenges unknown elsewhere.  The archives are non-existent, “weeded,” or faked; the “literature” is tendentious to a degree not found otherwise outside of obscure religious sects; common knowledge, including fundamental matters of relative importance of persons and events, is at the very least unreliable, and research methods are themselves most peculiar.  Concerning the latter, the privileged mode is the interview with secret intelligence officials, retired secret intelligence officials, spies and so forth.  Authors and researchers will carefully enumerate how many interviews they held, sometimes for attribution, more often not, the latter instances apparently more valued than the former.  This is an unusual practice, not that researchers do not routinely interview those thought to be knowledgeable about the subject at hand, but because these particular interviewees are known to be, by definition, unreliable witnesses.  Many are themselves trained interrogators; most are accustomed to viewing their own speech as an instrument for specific operational purposes; nearly all have signed security pledges.  The methodological difficulties confronting the researcher seem to allow only a single use for the products of these interviews:  the statement that the interviewee on this occasion said this or that, quite without any meaningful application of the statements made.
An additional, unusual, barrier to research is the reaction of the ensemble of voices from the secret intelligence world to published research not emanating from that world, or emanating from particular zones not favored by certain voices.  Work that can be traced to other intelligence services is discredited for that reason; work from non-intelligence sources is discredited for that reason (“professor so-and-so is unknown to experienced intelligence professionals”); certain topics are off-limits and, curiously, certain topics are de rigueur (“The writer has not mentioned the notorious case y”).  And, finally, there is the scattershot of minutiae always on hand for the purpose—dates (down to the day of the week), spelling (often transliterated by changing convention), names of secret intelligence agencies and their abbreviations (“Surely the writer realizes that before 19__ the agency in question was known as XXX”).  All this is intended to drown out dissident ideas or, more importantly, inconvenient facts and non-received opinions.
What is to be done?  One suggestion would be that of scholarly modesty.  The scholar would be well-advised to accept at the beginning that much will never be available.  Consider the ULTRA secret—the fact that the British were able to read a variety of high-grade German ciphers during the Second World War.  This was known, in one way or another, to hundreds, if not thousands, of people, and yet remained secret for most of a generation.  Are we sure that there is no other matter, as significant, not only to the history of secret intelligence, but to general history, that is not yet known?  Secondly, that which does become available must be treated with extraordinary caution in two ways:  is it what it purports to be, and how does it fit into a more general context?  To point at two highly controversial matters, there is VENONA, the decryptions and interpretations of certain Soviet diplomatic message traffic, and, on a different register, the matter of conspiracy theories.  Just to approach the prickly pear of the latter, the term itself was invented by James Angleton, chief of the CIA counterintelligence staff, as a way of discouraging questioning of the conclusions of the Warren Commission.  It lives on, an undead barrier to the understanding of many incidents of the Cold War.  The VENONA material is available only in a form edited and annotated by American secret intelligence.  There are, for example, footnotes assigning certain cover names to certain well-known persons, but no reasons are given for these attributions.  The original documents have not been made available to researchers, nor have the stages of decryption and interpretation.  And yet great castles of interpretation have been constructed on these foundations.
Intelligence materials can be used, indeed, if available, must be used, if we are to understand certain historical situations:  the coups d’état in Iran, Guatemala and Chile, for example.  The FRUS itself incorporates secret intelligence materials in its account of the Guatemala matter.  But such materials can only be illustrative; the case itself must be made from open sources.  There are exceptions:  Nazi-era German intelligence records were captured and are now available nearly in their entirety; occasional congressional investigations have obtained substantial amounts of the files of American secret intelligence agencies; other materials become misplaced into the public realm.  But this is a diminuendo of research excellence.  The historian concerned with secret intelligence matters must face the unpleasant reality that little can be known about such matters and, from the point of view of the reader, the more certainty with which interpretations are asserted, the more likely it is that such interpretations are yet another secret intelligence operation.
— Michael Holzman