Alan Turing

Yesterday, I reported on the restoration of the world’s oldest, still-working modern computer.  Last night, British Prime Minister Gordon Brown apologized for the country’s treatment of Alan Turing, computer pioneer.  In the words of Brown’s statement:

Turing was a quite brilliant mathematician, most famous for his work on breaking the German Enigma codes. It is no exaggeration to say that, without his outstanding contribution, the history of World War Two could well have been very different. He truly was one of those individuals we can point to whose unique contribution helped to turn the tide of war. The debt of gratitude he is owed makes it all the more horrifying, therefore, that he was treated so inhumanely. In 1952, he was convicted of ‘gross indecency’ – in effect, tried for being gay. His sentence – and he was faced with the miserable choice of this or prison – was chemical castration by a series of injections of female hormones. He took his own life just two years later.

One might think that this apology required no courage of Brown.

This is not the case. Until very recently, and perhaps still today, there were people who disparaged and belittled Turing’s contribution to computer science and computer engineering. The conventional academic wisdom is that he was good only at abstract theory and formal mathematizing (as in his “schoolboy essay” proposing a test to distinguish human from machine interlocutors), and not good for anything practical. This belief is false. As the philosopher and historian B. Jack Copeland has shown, Turing was actively and intimately involved in the design and construction work (mechanical and electrical) of creating the machines developed at Bletchley Park during WWII, the computing machines which enabled Britain to crack the communications codes used by the Germans.


Perhaps, like myself, you imagine this revision to conventional wisdom would be uncontroversial. Sadly, not. On 5 June 2004, I attended a symposium in Cottonopolis to commemorate the 50th anniversary of Turing’s death. At this symposium, Copeland played a recording of an oral-history interview with engineer Tom Kilburn (1921-2001), first head of the first Department of Computer Science in Britain (at the University of Manchester), and also one of the pioneers of modern computing. Kilburn and Turing had worked together in Manchester after WW II. The audience heard Kilburn stress to his interviewer that what he learnt from Turing about the design and creation of computers was all high-level (ie, abstract) and not very much, indeed only about 30 minutes’ worth of conversation. Copeland then produced evidence (from signing-in books) that Kilburn had attended a restricted, invitation-only, multi-week, full-time course on the design and engineering of computers which Turing had presented at the National Physical Laboratory shortly after the end of WW II, a course organized by the British Ministry of Defence to share some of the learnings of the Bletchley Park people in designing, building and operating computers. If Turing had so little of practical relevance to contribute to Kilburn’s work, why then, one wonders, would Kilburn have turned up each day to the course?

That these issues were still fresh in the minds of some people was shown by the Q&A session at the end of Copeland’s presentation.  Several elderly members of the audience, clearly supporters of Kilburn, took strident and emotive issue with Copeland’s argument, with one of them even claiming that Turing had contributed nothing to the development of computing.   I repeat: this took place in Manchester 50 years after Turing’s death!    Clearly there were people who did not like Turing, or in some way had been offended by him, and who were still extremely upset about it half a century later.  They were still trying to belittle his contribution and his practical skills, despite the factual evidence to the contrary.

I applaud Gordon Brown’s courage in officially apologizing to Alan Turing, an apology which at least ensures the historical record is set straight for what our modern society owes this man.

POSTSCRIPT #1 (2009-10-01): The year 2012, the centenary of Alan Turing’s birth, will be a year of celebration of his life and work.

POSTSCRIPT #2 (2011-11-18):  It should also be noted, concerning Mr Brown’s statement, that Turing died from eating an apple laced with cyanide.  He was apparently in the habit of eating an apple each day.   These two facts are not, by themselves, sufficient evidence to support a claim that he took his own life.

POSTSCRIPT #3 (2013-02-15):  I am not the only person to have questioned the coroner’s official verdict that Turing committed suicide.    The BBC reports that Jack Copeland notes that the police never actually tested the apple found beside Turing’s body for traces of cyanide, so it is quite possible it had no traces.     The possibility remains that he died from an accidental inhalation of cyanide or that he was deliberately poisoned.   Given the evidence, the only rational verdict is an open one.

Obama the policy-wonk

Andrew Sprung, over at XPOSTFACTOID, has a powerful deconstruction of the myth that Barack Obama does not do detail.   Of course he does, as has been evident – from the start of his Presidential campaign 33 months ago – to anyone who actually listens to what he says.  Why has the myth persisted?  Partly, I think it is laziness:  it is easier to repeat a cliche than to listen and think for oneself. Partly, I think it is right-wing spin:  his enemies think they can paint him as an airhead, as some tried to paint Tony Blair (remember “Bambi”?).

Switch WITCH

The Guardian today carries a story about an effort at the UK National Museum of Computing at Bletchley Park to install and restore the world’s oldest working modern electric computer, the Harwell Dekatron Computer (aka the WITCH, pictured here), built originally for the UK Atomic Energy Research Establishment at Harwell in 1951. The restoration is being done by the UK Computer Conservation Society.
Note: The Guardian claims this to be the world’s oldest working computer. I am sure there are older “computers” still working elsewhere, if we assume a computer is a programmable device. As late as 1985, in Harare, I saw at work in factories programmable textile and brush-making machinery which had been built in Britain more than a century earlier.

“One of the things that attracted us to the project was that it was built from standard off-the-shelf Post Office components, of which we have a stock built up for Colossus,” says Frazer. “And we have some former Post Office engineers who can do that sort of wiring.”
Frazer says he can imagine the machine’s three designers – Ted Cooke-Yarborough, Dick Barnes and Gurney Thomas – going to the stores with a list and saying: “We’d like these to build a computer, please.”
Dick Barnes, now a sprightly 88, says: “We had to build [the machine] from our existing resources or we might not have been allowed to build it at all. The relay controls came about because that was my background: during the war I had produced single-purpose calculating devices using relays. We knew it wasn’t going to be a fast computer, but it was designed to fulfil a real need at a time when the sole computing resources were hand-turned desk calculators.”

Bonuses yet again

Alex Goodall, over at A Swift Blow to the Head, has written another angry post about the bonuses paid to financial sector staff. I’ve been in several minds about responding, since my views seem to be decidedly minority ones in our present environment, and because there seems to be so much anger abroad on this topic. But so much of what is written and said, including by intelligent, reasonable people such as Alex, misunderstands the topic that I feel a response is again needed. It behooves none of us to make policy on the basis of anger and ignorance.

Computing-as-interaction

In its brief history, computer science has enjoyed several different metaphors for the notion of computation. From the time of Charles Babbage in the nineteenth century until the mid-1960s, most people thought of computation as calculation, or the manipulation of numbers. Indeed, the English word “computer” was originally used to describe a person undertaking arithmetical calculations. With widespread digital storage and processing of non-numerical information from the 1960s onwards, computation was re-conceptualized more generally as information processing, or the manipulation of numerical, text, audio or video data. This metaphor is probably still the prevailing view among people who are not computer scientists. From the late 1970s, with the development of various forms of machine intelligence, such as expert systems, a yet more general metaphor of computation as cognition, or the manipulation of ideas, became widespread, at least among computer scientists. The fruits of this metaphor have been realized, for example, in the advanced artificial intelligence technologies which have now been a standard part of desktop computer operating systems since the mid-1990s. Windows 95, for example, included a Bayesian network for automated diagnosis of printer faults.
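To make the printer-diagnosis example concrete, here is a minimal sketch in Python of Bayesian fault diagnosis. The faults, priors and likelihoods are invented purely for illustration; Microsoft’s actual troubleshooter model was of course far richer.

```python
# A toy Bayesian diagnosis of printer faults: given the symptom "no
# output", update the prior belief in each candidate fault via Bayes'
# rule. All numbers here are invented for illustration.

priors = {"driver fault": 0.2, "cable unplugged": 0.1, "out of paper": 0.7}

# P(no output | fault), one hypothetical likelihood per fault:
likelihood = {"driver fault": 0.6, "cable unplugged": 0.95, "out of paper": 0.9}

def posterior(priors, likelihood):
    """Return P(fault | symptom) for each fault, by Bayes' rule."""
    joint = {f: priors[f] * likelihood[f] for f in priors}
    total = sum(joint.values())          # P(symptom), the normaliser
    return {f: p / total for f, p in joint.items()}

for fault, p in posterior(priors, likelihood).items():
    print(f"P({fault} | no output) = {p:.2f}")
```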
With the growth of the Internet and the Web over the last two decades, we have reached a position where a new metaphor for computation is required: computation as interaction, or the joint manipulation of ideas and actions. In this metaphor, computation is something which happens by and through the communications which computational entities have with one another. Cognition and intelligent behaviour are not something which a computer does on its own, or not merely that, but something which arises through its interactions with other intelligent computers to which it is connected. The network is the computer, in Sun’s famous phrase. This viewpoint is a radical reconceptualization of the notion of computation.
In this new metaphor, computation is an activity which is inherently social, rather than solitary, and this view leads to new ways of conceiving, designing, developing and managing computational systems. One example of the influence of this viewpoint is the model of software as a service, for example in Service-Oriented Architectures. In this model, applications are no longer “compiled together” in order to function on one machine (single-user applications), or distributed applications managed by a single organization (such as most of today’s Intranet applications), but instead are societies of components (a toy rendering of this idea follows the list below):

  • These components are viewed as providing services to one another rather than being compiled together.  They may not all have been designed together or even by the same software development team; they may be created, operate and de-commissioned according to different timescales; they may enter and leave different societies at different times and for different reasons; and they may form coalitions or virtual organizations with one another to achieve particular temporary objectives.  Examples are automated procurement systems comprising all the companies connected along a supply chain, or service creation and service delivery platforms for dynamic provision of value-added telecommunications services.
  • The components and their services may be owned and managed by different organizations, and thus have access to different information sources, have different objectives, have conflicting preferences, and be subject to different policies or regulations regarding information collection, storage and dissemination.  Health care management systems spanning multiple hospitals or automated resource allocation systems, such as Grid systems, are examples here.
  • The components are not necessarily activated by human users but may also carry out actions in an automated and co-ordinated manner when certain conditions hold true.  These pre-conditions may themselves be distributed across components, so that action by one component requires prior co-ordination and agreement with other components.  Simple multi-party database commit protocols are examples of this, but significantly more complex co-ordination and negotiation protocols have been studied and deployed, for example in utility computing systems and in ad hoc wireless networks.
  • Intelligent, automated components may even undertake self-assembly of software and systems, to enable adaptation or response to changing external or internal circumstances. An example is the creation of on-the-fly coalitions in automated supply-chain systems in order to exploit dynamic commercial opportunities. Such systems resemble those of the natural world and human societies much more than they do the example arithmetical-calculation programs typically taught in Fortran classes, and so ideas from biology, ecology, statistical physics, sociology, and economics play an increasingly important role in computer science.
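As promised above, here is a toy rendering of the "society of components" idea; every name and class in it is invented for illustration, not drawn from any real SOA framework.

```python
# A toy "society of components": components advertise services, discover
# one another at run time, and enter or leave the society independently.

class Society:
    def __init__(self):
        self.providers = {}                     # service name -> component

    def register(self, service, component):
        self.providers[service] = component     # component joins the society

    def deregister(self, service):
        self.providers.pop(service, None)       # component leaves

    def request(self, service, *args):
        if service not in self.providers:
            raise LookupError(f"no component currently provides {service!r}")
        return self.providers[service].provide(service, *args)

class Supplier:
    """One component in an automated-procurement society."""
    def provide(self, service, item):
        return f"quote for {item}: 100 units at 5.00 each"

society = Society()
society.register("quotation", Supplier())       # supplier joins the chain
print(society.request("quotation", "brushes"))  # another component's request
society.deregister("quotation")                 # supplier departs on its own timescale
```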

How should we exploit this new metaphor of computation as a social activity, as interaction between intelligent and independent entities, adapting and co-evolving with one another?  The answer, many people believe, lies with agent technologies.  An agent is a computer programme capable of flexible and autonomous action in a dynamic environment, usually an environment containing other agents.  In this abstraction, we have software entities called agents, encapsulated, autonomous and intelligent, and we have demarcated the society in which they operate, a multi-agent system.  Agent-based computing concerns the theoretical and practical working through of the details of this simple two-level abstraction.
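A minimal sketch of this two-level abstraction might look as follows. The behaviour is deliberately trivial, since the point is only the shape of the abstraction: autonomous agents, and the multi-agent system they inhabit.

```python
# A minimal sketch of the two-level abstraction: an Agent chooses its
# own actions given what it perceives, and a MultiAgentSystem is the
# society in which the agents interact.

import random

class Agent:
    def __init__(self, name):
        self.name = name

    def act(self, heard):
        # Autonomy: the agent itself decides whether to respond to what
        # it has heard, or to initiate something new.
        if heard and random.random() < 0.5:
            return f"{self.name} responds to: {heard[-1]}"
        return f"{self.name} makes a proposal"

class MultiAgentSystem:
    def __init__(self, agents):
        self.agents = agents
        self.log = []                    # the shared, dynamic environment

    def step(self):
        for agent in self.agents:
            self.log.append(agent.act(self.log))

mas = MultiAgentSystem([Agent("A"), Agent("B")])
for _ in range(3):
    mas.step()
print("\n".join(mas.log))
```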
Reference:
Text edited slightly from the Executive Summary of:
M. Luck, P. McBurney, S. Willmott and O. Shehory [2005]: The AgentLink III Agent Technology Roadmap. AgentLink III, the European Co-ordination Action for Agent-Based Computing, Southampton, UK.

Social forecasting: Doppio Software

Five years ago, back in the antediluvian era of Web 2.0 (the web as enabler and facilitator of social networks), we had the idea of social-network forecasting. We developed a product to enable a group of people to share and aggregate their forecasts of something, via the web. Because reducing greenhouse-gas emissions was also becoming the flavour du jour, we applied these ideas to social forecasts of the price of the European Union’s carbon emission permits, in a nifty product we called Prophets-360. Sadly, due mainly to poor regulatory design of the European carbon emission market, supply greatly outstripped demand for emissions permits, and the price of permits fell quickly and has mostly stayed fallen. A flat curve is not difficult to predict, and certainly there was little value in comparing one person’s forecast with that of another. Our venture was also felled.
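For what it is worth, the aggregation at the heart of such a product can be sketched in a few lines. This is an illustration only, not the Prophets-360 code, and all the figures are invented: each member submits a forecast, and the group view is a weighted average, with weights reflecting, say, past accuracy.

```python
# An illustrative sketch of social forecast aggregation: pool individual
# forecasts into a group view, optionally weighting each person.

def aggregate(forecasts, weights=None):
    """forecasts: {person: forecast price}; weights: {person: weight}."""
    if weights is None:
        weights = {p: 1.0 for p in forecasts}           # unweighted mean
    total = sum(weights[p] for p in forecasts)
    return sum(forecasts[p] * weights[p] for p in forecasts) / total

group = {"ann": 22.50, "bob": 19.00, "carla": 21.25}    # EUR per permit
print(aggregate(group))                                 # simple average
print(aggregate(group, weights={"ann": 2.0, "bob": 0.5, "carla": 1.0}))
```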
But now the second generation of social-network forecasting tools has arrived. I see that a French start-up, Doppio Software, has recently launched publicly. They appear to have a product which has several advantages over ours:

  • Doppio Software is focused on forecasting demand along a supply chain.  This means the forecasting objective is very tactical, not the long-term strategic forecasting that CO2 emission permit prices became.   In the present economic climate, short-term tactical success is certainly more compelling to business customers than even looking five years hence.
  • The relevant social network for a supply chain is a much stronger community of interest than the amorphous groups we had in mind for Prophets-360.  Firstly, this community already exists (for each chain), and does not need to be created.  Secondly, the members of the community by definition have differential access to information, on the basis of their different positions up and down the chain.  Thirdly, although the interests of the partners in a supply chain are not identical, these interests are mutually-reinforcing:  everyone in the chain benefits if the chain itself is more successful at forecasting throughput.
  • In addition, Team Doppio (the Doppiogangers?) appear to have included a very compelling value-add:  their own automated modeling of causal relationships between the target demand variables of each client and general macro-economic variables, using  semantic-web data and qualitative modeling technologies from AI.  Only the largest manufacturing companies can afford their own econometricians, and such people will normally only be able to hand-craft models for the most important variables.  There are few companies IMO who would not benefit from Doppio’s offer here.

Of course, I’ve not seen the Doppio interface and a lot will hinge on its ease-of-use (as with all software aimed at business users).  But this offer appears to be very sophisticated, well-crafted and compelling, combining social network forecasting, intelligent causal modeling and semantic web technologies.
Well done, Team Doppio!  I wish you every success with this product!
PS:  I have just learnt that “doppio” means “double”, which makes it a very apposite name for this application – forecasts considered by many people, across their human network.  Neat!  (2009-09-16)
Article in The Observer (UK) about Doppio 2009-09-06 here. And here is an AFP TV news story (2009-09-15) about Doppio co-founder, Edouard d’Archimbaud.  Another co-founder is Benjamin Haycraft.

Action-at-a-distance

For at least 22 years, I have heard business presentations (ie, not just technical presentations) given by IT companies which mention client-server architectures. For the last 17 of those years, this is not surprising, since both the Hyper-Text Transfer Protocol (HTTP) and the World-Wide Web (WWW) use this architecture. In a client-server architecture, one machine (the client) requests that some action be taken by another machine (the server), which responds to the request. For HTTP, the standard request by the client is for the server to send to the client some electronic file, such as a web-page. The response by the server is not necessarily to undertake the action requested. Indeed, the specifications of HTTP define 41 responses (so-called status codes), including outright refusal by the server (Client Error 403 “Forbidden”), and allow for hundreds more to be defined. Typically, one server will be configured to respond to many simultaneous or near-simultaneous client requests. The functions of client and server are conceptually quite distinct, although of course one machine may undertake both functions, and a server may even have to make a request as a client to another server in order to respond to an earlier request from its clients. As an analogy, consider a library, which acts as a server of books to its readers, who are its clients; a library may have to request a book via inter-library loan from another library in order to satisfy a reader’s request.
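As an illustration of the request-response cycle and the status codes just described, here is a small Python sketch using only the standard library; the host name is merely an example.

```python
# A client-server exchange over HTTP: the client requests a file, and
# the server's status code says whether the request was fulfilled
# (200 OK) or refused (eg 403 Forbidden).

import http.client

conn = http.client.HTTPConnection("example.com")   # example host only
conn.request("GET", "/")                           # client's request
response = conn.getresponse()                      # server's response
print(response.status, response.reason)            # eg: 200 OK

if response.status == 403:
    print("The server refused outright, as HTTP permits it to do.")
conn.close()
```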
Since the rise of file sharing, particularly illegal file sharing, over a decade ago, it has also been common to hear talk about Peer-to-Peer (P2P) architectures. Conceptually, in these architectures all machines are viewed equally, and none are especially distinguished as servers. Here, there is no central library of books; rather, each reader him- or herself owns some books and is willing to lend them to any other reader as and when needed. Originally, peer-to-peer architectures were invented to circumvent laws on copyright, but they turn out (as do most technical innovations) to have other, more legal, uses – such as the distributed storage and sharing of electronic documents in large organizations (eg, x-ray images in networks of medical clinics).
Both client-server and P2P architectures involve attempts at remote control.  A client or a peer-machine makes a request of another machine (a server or another peer, respectively), to undertake some action(s) at the location of the second machine.   The second machine receiving the request from the first may or may not execute the request.   This has led me to think about models of such action-at-a-distance.
Imagine we have two agents (human or software), named A and B, at different locations, and a resource, named X, at the same location as B. For example, X could be an electron microscope, B the local technician at the site of the microscope, and A a remote user of the microscope. Suppose further that agent B can take actions directly to control resource X. Agent A may or may not have permissions or powers to act on X.
Then we have the following five possible situations (sketched in code after the list):

1.  Agent A controls X directly, without agent B’s involvement (ie, A has remote access to and remote control over resource X).
2.  Agent A commands agent B to control X (ie, A and B have a master-slave relationship; some client-server relationships would fall into this category).
3.  Agent A requests agent B to control X (ie, both A and B are autonomous agents; P2P would be in this category, as well as many client-server interactions).
4.  Both agent A and agent B need to take actions jointly to control X (eg, the double-key system for launch of nuclear missiles in most nuclear-armed forces; coalitions of agents would be in this category).
5.  Agent A has no powers, direct or indirect, to control resource X.
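These five situations can be rendered schematically in code; this is my own rendering, not a standard formalism, and the outcomes simply trace through who must agree before anything happens to X.

```python
# A schematic rendering of the five control relationships between
# agent A, agent B, and resource X.

from enum import Enum

class Control(Enum):
    DIRECT = 1    # 1. A acts on X itself, without B
    COMMAND = 2   # 2. A commands B (master-slave); B must comply
    REQUEST = 3   # 3. A requests B; B, being autonomous, may refuse
    JOINT = 4     # 4. A and B must act together (double-key)
    NONE = 5      # 5. A has no power over X at all

def action_on_x(case: Control, b_agrees: bool) -> bool:
    """Does A's initiative lead to an action on resource X?"""
    if case is Control.DIRECT or case is Control.COMMAND:
        return True                  # B's agreement is not required
    if case in (Control.REQUEST, Control.JOINT):
        return b_agrees              # B's consent or participation needed
    return False                     # Control.NONE

for case in Control:
    print(f"{case.name:7}  B refuses -> action on X: {action_on_x(case, False)}")
```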

As far as I can tell, these five situations exhaust the possible relationships between agents A and B acting on resource X, at least for those cases where potential actions on X are initiated by agent A. From this outline, we can see the relevance of much that is now being studied in computer science:

  • Action co-ordination (Cases 1-5)
  • Command dialogs (Case 2)
  • Persuasion dialogs (Case 3)
  • Negotiation dialogs (dialogs to divide a scarce resource) (Case 4)
  • Deliberation dialogs (dialogs over what actions to take) (Cases 1-4)
  • Coalitions (Case  4).

To the best of my knowledge, there is as yet no formal theory which encompasses these five cases.   (I welcome any suggestions or comments to the contrary.)  Such a formal theory is needed as we move beyond Web 2.0 (the web as means to create and sustain social networks) to reification of the idea of computing-as-interaction (the web as a means to co-ordinate joint actions).
Reference:
Network Working Group [1999]: Hypertext Transfer Protocol – HTTP/1.1. Technical Report RFC 2616.  Internet Engineering Task Force.

Thinkers of renown

The recent death of mathematician Jim Wiegold (1934-2009), whom I once knew, has led me to ponder the nature of intellectual influence.  Written matter – initially, hand-copied books, then printed books, and now the Web – has been the main conduit of influence.   For those of us with a formal education, lectures and tutorials are another means of influence, more direct than written materials.   Yet despite these broadcast methods, we still seek out individual contact with others.  Speaking for myself, it is almost never the knowledge or facts of others, per se, that I have sought or seek in making personal contact, but rather their various different ways of looking at the world.   In mathematical terminology, the ideas that have influenced me have not been the solutions that certain people have for particular problems, but rather the methods and perspectives they use for approaching and tackling problems, even when these methods are not always successful.

To express my gratitude, I thought I would list some of the people whose ideas have influenced me, either directly through their lectures, or indirectly through their books and other writings. In the second category, I have not included those whose ideas have come to me mediated through the books or lectures of others, which therefore excludes many mathematicians whose work has influenced me (in particular: Newton, Leibniz, Cauchy, Weierstrass, Cantor, Frege, Poincare, Pieri, Hilbert, Lebesgue, Kolmogorov, and Godel). I have also not included the many writers of poetry, fiction, history and biography whose work has had great impact on me. These two categories also exclude people whose intellectual influence has been manifest in non-verbal forms, such as through visual arts or music, or via working together, since those categories need posts of their own.

Teachers & lecturers I have had who have influenced my thinking includeLeo Birsen (1902-1992), Sr. Claver Butler RSM (ca. 1930-2009), Burgess Cameron (1922-2020), Sr. Clare Castle RSM (ca. 1920- ca. 2000), John Coates (1945-2022), Dot Crowe, James Cutt, Bro. Clive Davis FMS, Tom Donaldson (1945-2006), Aleksandr Doronin, Gary Dunbier, Sol Encel (1925-2010), Felix Fabryczny de Leiris, Claudio Forcada, Richard Gill (1941-2018), Myrtle Hanley (1909-1984), Sr. Jennifer Hartley RSM, Chip Heathcote (1931-2016),  Hope Hewitt (1915-2011), Alec Hope (1907-2000),  John Hutchinson, Marg Keetles, Joe Lynch, Robert Marks, John McBurney (1932-1998), David Midgley, Lindsay Morley, Leopoldo Mugnai, Terry O’Neill, Jim Penberthy* (1917-1999), Malcolm Rennie (1940-1980), John Roberts, Gisela Soares, Brian Stacey (1946-1996), James Taylor, Frank Torpie (1934-1989),  Neil Trudinger, David Urquhart-Jones, Frederick Wedd (1890-1972), Gary Whale (1943-2019), Ted Wheelwright (1921-2007), John Woods and Alkiviadis Zalavras.

People whose writings have influenced my thinking include John Baez, Ole Barndorff-Nielsen (1935-2022), Charlotte Joko Beck (1917-2011), Johan van Benthem, Mark Evan Bonds, John Cage (1912-1992), Albert Camus (1913-1960), Nikolai Chentsov (1930-1992), John Miller Chernoff, Stewart Copeland, Sam Eilenberg (1913-1998), Paul Feyerabend (1924-1994), George Fowler (1929-2000), Kyle Gann, Alfred Gell (1945-1997), Herb Gintis, Jurgen Habermas, Charles Hamblin (1922-1985), Vaclav Havel (1936-2011), Lafcadio Hearn (1850-1904), Jaakko Hintikka (1929-2015), Eric von Hippel, Wilfrid Hodges, Christmas Humphreys (1901-1983), Jon Kabat-Zinn, Herman Kahn (1922-1983), John Maynard Keynes (1883-1946), Andrey Kolmogorov (1903-1987), Paul Krugman, Imre Lakatos (1922-1974), Trevor Leggett (1914-2000), George Leonard (1923-2010), Brad de Long, Donald MacKenzie, Saunders Mac Lane (1909-2005), Karl Marx (1818-1883), Grant McCracken, Henry Mintzberg, Philip Mirowski, Michel de Montaigne (1533-1592), Michael Porter, Charles Reich (1928-2019), Jean-Francois Revel (1924-2006), Daniel Rose, Bertrand Russell (1872-1970), Pierre Ryckmans (aka Simon Leys) (1935-2014), Oliver Sacks (1933-2015), Gunther Schuller (1925-2015), George Shackle (1903-1992), Cosma Shalizi, Rupert Sheldrake, Raymond Smullyan (1919-2017), Rory Stewart, Anne Sweeney (d. 2007), Nassim Taleb, Henry David Thoreau (1817-1862), Stephen Toulmin (1922-2009), Scott Turner, Roy Weintraub, Geoffrey Vickers VC (1894-1982), and Richard Wilson.

FOOTNOTES:
* Which makes me a grand-pupil of Nadia Boulanger (1887-1979).
** Of course, this being the World-Wide-Web, I need to explicitly say that nothing in what I have written here should be taken to mean that I agree with anything in particular which any of the people mentioned here have said or written.
A more complete list of teachers is here.

Sharks and murder: a Sydney story

Any aspiring crime novelists among you may relish the details of this report from tomorrow’s Sydney Morning Herald:

No wonder the Shark Arm Murder of 1935 remains Sydney’s best-known homicide. Apart from combining two of the city’s enduring interests, crime and sharks, the dismemberment of Jim Smith and its aftermath involved an unusually large number of suburbs, providing plenty of local colour.
. . .
For those uninitiated in the detail of the Shark Arm Murder, Jim Smith, a knockabout and police informer, was killed in Cronulla in a cottage called Cored Joy. According to Alex Castles’s book, most of his body was probably given a “Sydney send-off” and dumped at sea.
But the arm itself, once detached, seems to have been taken by killer Patrick Brady on a journey to the McMahons Point residence of one Reginald Holmes, a pillar of the local Presbyterian church and cocaine smuggler. After the meeting with Holmes, Brady took the arm to Maroubra, and threw it into the ocean. It was eaten by a small shark that was then consumed by a four-metre tiger shark, which was caught and exhibited alive in a pool at the Coogee Aquarium Baths (now the Beach Palace Hotel). When the shark vomited the arm before a fascinated crowd on Anzac Day, police from Randwick and the city were called. During the investigation, Holmes, who ran his business from Lavender Bay, got into a motorboat, consumed a lot of brandy, and tried unsuccessfully to shoot himself. He then led the police on a four-hour chase around the harbour.
On the morning of the inquest into Smith’s death, Holmes was found dead in the driver’s seat of his Nash sedan in The Rocks, with three gunshot wounds in his chest. No one was convicted of the deaths of Smith or Holmes.