Catwoman, my old flame

Those of you paying attention to these lectures will realize how obsessed I am with Economics.  That flaxen-haired lady promised so much, but she has so many flaws and failings.   When we first meet her, it seems she is everything you could wish for:  she is concerned with how society should be organized, how people should be given material goods, how the benefits of new technology and material well-being should be shared with all, and how the poor should be enriched, so that they can spend their time on self-improving and fulfilling activities, like art and sport.  So much is promised!
But then, once the flirtation and seduction are over, her flaws become evident. I have been thinking about these flaws again, having just read Deirdre McCloskey’s superb 2002 pamphlet, The Secret Sins of Economics.  Many of McCloskey’s criticisms are ones I (and many others) have made before, but some are new.   I decided, for comparison, to list here my chief complaints with this blemished beauty, this feline seductress, Our Lady of the Catallacts.  Date her if you wish, but you should read these accounts by her ex-lovers before you do.
First, she is blinkered, often unable to see what is obvious to anyone else – that we are all shaped by social and cultural forces, and peer pressures.   Instead, Catwoman and her acolytes invariably assume an individualist explanation for any economic or social phenomenon, and then seek to demonstrate it.  McCloskey calls this a focus on the P-variables (price, individual prudence, profit, the profane) as distinct from the S-variables (solidarity, speech, stories, shame) which Anthropology, that Indiana Jones of academic disciplines – creative, unruly, a thorn in everyone else’s side – has focused on.   A classic example is Levitt and Dubner’s Freakonomics.
Because of her blindness to the social, Cat Lady mostly ignored (until recently) major aspects of society, such as institutions, legal frameworks, norms, and power relationships, aspects which can make or break the marketplaces she says she studies.   She can’t claim that no one mentioned these to her, since 19th-century economists such as Karl Marx made the study of these aspects the work of a lifetime, and their study has been continued to the present day by sociologists and anthropologists and political scientists.
She has also been blind to anything historical or temporal, as if all her work stood outside the mundane and messy world in which we live.  This blindness manifests itself most strongly in the complete disregard (until recently) for endowments:  how did we get to where we are?  So, for example, free trade theory says that if England produces textiles more cheaply than Portugal, and Portugal produces wine more cheaply than England, the two should trade, exchanging English textiles for Portuguese wine.   And the choice of these products is a subtly clever one, obfuscating much, since wine needs sunshine and not too much rain, while textiles (in the 18th and early 19th centuries) needed lots of rain, in order that the damp air would ensure cotton threads did not break when woven by machines.   So, Portugal’s sunshine and Northern England’s rain, being part of the God-given climate, were natural advantages, beyond the control or manipulation of any temporal human powers.  Free trade seems to have been ordained by the Almighty.  But why consider only England’s textiles and not Ireland’s?    The answer is that Ireland had no textile industry to speak of.  And just why is that?  After all, much of Ireland is as damp as the valleys of Lancashire.   The reason is that the owners of northern English textile factories lobbied the British authorities to exclude Irish-made textiles from entering England.  When Ireland lost its own Parliament in a hostile takeover by Westminster, this protectionism for English textiles was entrenched, and the growing British Empire provided the critical mass of customers to ensure bonuses in Bury and Bolton and Burnley.     (Is it any wonder that people in Ireland and India and elsewhere sought independence, when colonialism so powerfully stifled economic aspirations?)  Northern England has no natural comparative advantage in textile production, at least not when compared to Ireland, but an artificial, man-made advantage.  The same type of advantage, in fact, that South Korea today has in ship-building, or the USA in most computer and aerospace technologies.   Where, in the mainstream theory of free trade, are these aspects studied, or even mentioned?
And when, angered by these failings, you face her with them, the wench promises you that that was all in the past, and she will be different from now on.  Path dependence and network goods and institutional economics are all the rage, she says.   But then you find she’s still up to her old tricks:  she says she’s building models of economic phenomena in order to understand, predict and control, just like physicists do.  But, although it looks like that’s what she’s doing, in fact her models are not models of real phenomena, but models of stylized abstractions of phenomena.  Her acolytes even use that very word – stylized – to describe the “facts” which they use to calibrate or test their models.
Of course, she will say, physicists do this too.  Newton famously assumed the planets were perfect spheres in order to predict their relative movements using his theory of gravitation.   But physicists later relaxed their assumptions in order to build revised models, in a process that has continued from Newton to the present day.  Physicists also allow their models to be falsified, and overturned, by the data they collect, even when that data too is stylized.     Catwoman, by contrast, is still assuming that people are maximizers of individual utility, with perfect foresight and unlimited processing capabilities, obeying the axiom of independence of irrelevant alternatives, when all these assumptions have been shown to be false about us.   When was the last time a mainstream economic model was overturned?
Indeed, here is another of her flaws:  her loose grasp of reality.  She says we are always, all of us, acting in our own self-interest.  When you quiz her on this, pointing out (say) a friend who donated money to a charity, she replies that he is making himself feel better by doing something he thinks virtuous, and thus is maximizing his own self-interest.  Her assumption, it turns out, is unfalsifiable.   It is also naive and morally repugnant – and false!  Anyone with any experience of the world sees through this assumption straight away, which is why I think our feline friend is borderline autistic.   She just does not know much about real people and how they interact and live in the world.  Who would want to step out with someone holding such views, and unable to revise them in the light of experience?
And, despite her claims to be grounded in the material world (Paul Samuelson:  “Economics is the study of how people and society end up choosing, with or without the use of money, to employ scarce productive resources that would have alternative uses,  . . .”), she sure is fond of metaphysical entities for which no hard evidence exists:  invisible hands, equilibria, perfect competition, free trade, commodities, in fact, the whole shebang.   As marketers say, the existence of a true commodity is evidence that a marketing manager is not doing his or her job.  In comparison, Richard Dawkins with his memes is a mere amateur in this creation of imaginary objects for religious veneration.
One could perhaps accept the scented candles and the imaginary friends if she were a little more humble and tolerant of the opinions of others.  But no, the feline femme fatale and her acolytes are among the most arrogant and condescending of any academic discipline.  Read the recovering Chicago economist McCloskey for an account of this, if you don’t believe me.   McCloskey’s anecdotes and experiences were very familiar to me, especially that sneer from an economist who thinks you’ve not acted in your own self-interest – for example, by helping your colleagues or employer with something you are not legally required to do.  Indeed, the theft by economists from philosophers of the word “rational” to describe a very particular, narrow, autistic behavior is the best example of this.   Anyone whose behavior does not fit the models of mainstream economics can thus be labeled irrational, and dismissed from further consideration as if insane.
Date her at your peril!  You have been warned!

Doing a PhD

These are some notes on deciding to do a PhD, notes I wrote some years ago after completing my own PhD.
Choosing a PhD program is one of the hardest decisions we can make. For a start, most of us only make this decision once in our lives, and so we have no prior personal experience to go on.
Second, the success or otherwise of a PhD depends a great deal on factors about which we have little advance knowledge or control, including, for example:
Continue reading ‘Doing a PhD’

That deadline

Nate Fick, whom I saluted here, had an op-ed in the NYT last week on the decision by the Obama administration to announce a deadline for withdrawal, here.  His conclusion:

Announcing the timeline was risky, and it could turn out to be our undoing. The president delivered two intertwined messages in his speech at West Point outlining his Afghan policy: one to his American audience (“I see the way out of this war”), and one to the people of Afghanistan and Pakistan, including the Taliban (“I’m in to win”). The danger of dual messages, of course, is that each may find the other audience, with Americans hearing over-commitment and Afghans hearing abandonment.
The only way to reassure both is to show demonstrable progress on the ground.  A credible declaration of American limits may, paradoxically, be the needed catalyst.”

Poem: O world, thou choosest not the better part!

Today’s poem is a sonnet by George Santayana (1863-1952), whom I have blogged about here and here.

Sonnet III
O world, thou choosest not the better part!
It is not wisdom to be only wise,
And on the inward vision close the eyes,
But it is wisdom to believe the heart.
Columbus found a world, and had no chart,
Save one that faith deciphered in the skies;
To trust the soul’s invincible surmise
Was all his science and his only art.
Our knowledge is a torch of smoky pine
That lights the pathway but one step ahead
Across a void of mystery and dread.
Bid, then, the tender light of faith to shine
By which alone the mortal heart is led
Unto the thinking of the thought divine.

Vale William Safire

William Safire, a speech-writer for Richard Nixon and later an op-ed columnist with The New York Times, has just died.  To his memory, I retrieve a statement from his novel Full Disclosure, which nicely expresses a different model of decision-making to that taught in Decision Theory classes:

The truth about big decisions, Ericson mused, was that they never marched through logical processes, staff systems, option papers, and yellow pads to a conclusion.  No dramatic bottom lines, no Thurberian captains with their voices like thin ice breaking, announcing, “We’re going through!”    The big ones were a matter of mental sets, predispositions, tendencies – taking a lifetime to determine – followed by the battering of circumstance, the search for a feeling of what was right – never concluded at some finite moment of conclusion, but in the recollection of having “known” what the decision would be some indeterminate time before.  For weeks now, Ericson knew he had known he was ready to do what he had to do, if only Andy or somebody could be induced to come up with a solution that the President could then put through his Decision-Making Process. That made his decision a willingness not to obstruct, rather than a decision to go ahead, much like Truman’s unwillingness to stop the train of events that led to the dropping of the A-bomb – not on the same level of magnitude, but the same type of reluctant going-along.”  (pp. 491-492)

Reference:
William Safire [1977]:  Full Disclosure. (Garden City, NY, USA:  Doubleday and Company).

GTD Intelligence at Kimberly-Clark

I started talking recently about getting-things-done (GTD) intelligence.  Grant McCracken, over at This Blog Sits At, has an interview with Paula Rosch, formerly of fmcg company Kimberly-Clark, which illustrates this nicely.

I spent the rest of my K-C career in advanced product development or new business identification, usually as a team leader, and sometimes as what Gifford Pinchot called an “Intrapreneur” – a corporate entrepreneur, driving new products from discovery to basis-for-interest to commercialization.  It’s the nature of many companies to prematurely dismiss ideas that represent what the world might want/need 5, 10 years out and beyond in favor of near-term opportunities – the intrapreneur stays under the radar, using passion, brains, intuition, stealth, any and every other human and material resource available to keep things moving.  It helps to have had some managers that often looked the other way.
Continue reading ‘GTD Intelligence at Kimberly-Clark’

Bonuses yet again

Alex Goodall, over at A Swift Blow to the Head, has written another angry post about the bonuses paid to financial sector staff.  I’ve been in several minds about responding, since my views seem to be decidedly minority ones in our present environment, and because there seems to be so much anger abroad on this topic.  But so much of what is written and said, including by intelligent, reasonable people such as Alex, misunderstands the topic that I feel a response is again needed.  It behooves none of us to make policy on the basis of anger and ignorance.
Continue reading ‘Bonuses yet again’

Computing-as-interaction

In its brief history, computer science has enjoyed several different metaphors for the notion of computation.  From the time of Charles Babbage in the nineteenth century until the mid-1960s, most people thought of computation as calculation, or the manipulation of numbers.  Indeed, the English word “computer” was originally used to describe a person undertaking arithmetical calculations.  With widespread digital storage and processing of non-numerical information from the 1960s onwards, computation was re-conceptualized more generally as information processing, or the manipulation of numerical-, text-, audio- or video-data.  This metaphor is probably still the prevailing view among people who are not computer scientists.  From the late 1970s, with the development of various forms of machine intelligence, such as expert systems, a yet more general metaphor of computation as cognition, or the manipulation of ideas, became widespread, at least among computer scientists.  The fruits of this metaphor have been realized, for example, in the advanced artificial intelligence technologies which have now been a standard part of desktop computer operating systems since the mid-1990s.  Windows 95, for example, included a Bayesian network for automated diagnosis of printer faults.
With the growth of the Internet and the Web over the last two decades, we have reached a position where a new metaphor for computation is required:  computation as interaction, or the joint manipulation of ideas and actions.  In this metaphor, computation is something which happens by and through the communications which computational entities have with one another.  Cognition and intelligent behaviour are not something which a computer does on its own, or not merely that, but something which arises through its interactions with the other intelligent computers to which it is connected.  The network is the computer, in Sun Microsystems’ famous phrase.  This viewpoint is a radical reconceptualization of the notion of computation.
In this new metaphor, computation is an activity which is inherently social, rather than solitary, and this view leads to new ways of conceiving, designing, developing and managing computational systems.  One example of the influence of this viewpoint is the model of software as a service, for example in Service-Oriented Architectures.  In this model, applications are no longer “compiled together” in order to function on one machine (single-user applications), nor distributed applications managed by a single organization (such as most of today’s Intranet applications), but are instead societies of components:

  • These components are viewed as providing services to one another rather than being compiled together.  They may not all have been designed together or even by the same software development team; they may be created, operate and de-commissioned according to different timescales; they may enter and leave different societies at different times and for different reasons; and they may form coalitions or virtual organizations with one another to achieve particular temporary objectives.  Examples are automated procurement systems comprising all the companies connected along a supply chain, or service creation and service delivery platforms for dynamic provision of value-added telecommunications services.
  • The components and their services may be owned and managed by different organizations, and thus have access to different information sources, have different objectives, have conflicting preferences, and be subject to different policies or regulations regarding information collection, storage and dissemination.  Health care management systems spanning multiple hospitals or automated resource allocation systems, such as Grid systems, are examples here.
  • The components are not necessarily activated by human users but may also carry out actions in an automated and co-ordinated manner when certain conditions hold true.  These pre-conditions may themselves be distributed across components, so that action by one component requires prior co-ordination and agreement with other components.  Simple multi-party database commit protocols are examples of this, but significantly more complex co-ordination and negotiation protocols have been studied and deployed, for example in utility computing systems and in ad hoc wireless networks.
  • Intelligent, automated components may even undertake self-assembly of software and systems, to enable adaptation or response to changing external or internal circumstances.  An example is the creation of on-the-fly coalitions in automated supply-chain systems in order to exploit dynamic commercial opportunities.  Such systems resemble those of the natural world and human societies much more than they do the example arithmetical calculation programs typically taught in Fortran classes, and so ideas from biology, ecology, statistical physics, sociology, and economics play an increasingly important role in computer science.

How should we exploit this new metaphor of computation as a social activity, as interaction between intelligent and independent entities, adapting and co-evolving with one another?  The answer, many people believe, lies with agent technologies.  An agent is a computer programme capable of flexible and autonomous action in a dynamic environment, usually an environment containing other agents.  In this abstraction, we have software entities called agents, encapsulated, autonomous and intelligent, and we have demarcated the society in which they operate, a multi-agent system.  Agent-based computing concerns the theoretical and practical working through of the details of this simple two-level abstraction.
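To make this two-level abstraction a little more concrete, here is a minimal sketch in Python; the class and method names (Agent, MultiAgentSystem, request_service) are my own invention for illustration only, and a real agent platform would add messaging, negotiation and lifecycle management on top of such a skeleton.

class Agent:
    def __init__(self, name, services):
        self.name = name
        self.services = services   # maps a service name to a callable
        self.society = None        # set when the agent joins a system
    def request_service(self, service, *args):
        # Ask the society for another agent offering the service; the
        # provider, being autonomous, may simply not exist or may refuse.
        provider = self.society.find_provider(service, excluding=self)
        return provider.services[service](*args) if provider else None

class MultiAgentSystem:
    def __init__(self):
        self.agents = []
    def join(self, agent):
        agent.society = self
        self.agents.append(agent)
    def find_provider(self, service, excluding=None):
        for agent in self.agents:
            if agent is not excluding and service in agent.services:
                return agent
        return None

# Example: a buyer agent asks a seller agent for a price quotation.
mas = MultiAgentSystem()
seller = Agent("seller", {"quote": lambda item: {"widget": 10.0}.get(item)})
buyer = Agent("buyer", {})
mas.join(seller)
mas.join(buyer)
print(buyer.request_service("quote", "widget"))   # prints 10.0
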
Reference:
Text edited slightly from the Executive Summary of:
M. Luck, P. McBurney, S. Willmott and O. Shehory [2005]: The AgentLink III Agent Technology Roadmap. AgentLink III, the European Co-ordination Action for Agent-Based Computing, Southampton, UK.

Social forecasting: Doppio Software

Five years ago, back in the antediluvian era of Web 2.0 (the web as enabler and facilitator of social networks), we had the idea of social-network forecasting.  We developed a product to enable a group of people to share and aggregate their forecasts of something, via the web.  Because reducing greenhouse gases was also becoming flavour-du-jour, we applied these ideas to social forecasts of the price for the European Union’s carbon emission permits, in a nifty product we called Prophets-360.  Sadly, due mainly to poor regulatory design of the European carbon emission market, supply greatly outstripped demand for emissions permits, and the price of permits fell quickly and has mostly stayed fallen.  A flat curve is not difficult to predict, and certainly there was little value in comparing one person’s forecast with that of another.  Our venture was also felled.

But now the second generation of social networking forecasting tools has arrived.  I see that a French start-up, Doppio Software, has recently launched publicly.   They appear to have a product which has several advantages over ours:

  • Doppio Software is focused on forecasting demand along a supply chain.  This means the forecasting objective is very tactical, not the long-term strategic forecasting that CO2 emission permit prices became.   In the present economic climate, short-term tactical success is certainly more compelling to business customers than even looking five years hence.
  • The relevant social network for a supply chain is a much stronger community of interest than the amorphous groups we had in mind for Prophets-360.  Firstly, this community already exists (for each chain), and does not need to be created.  Secondly, the members of the community by definition have differential access to information, on the basis of their different positions up and down the chain.  Thirdly, although the interests of the partners in a supply chain are not identical, these interests are mutually-reinforcing:  everyone in the chain benefits if the chain itself is more successful at forecasting throughput.
  • In addition, Team Doppio (the Doppiogangers?) appear to have included a very compelling value-add:  their own automated modeling of causal relationships between the target demand variables of each client and general macro-economic variables, using  semantic-web data and qualitative modeling technologies from AI.  Only the largest manufacturing companies can afford their own econometricians, and such people will normally only be able to hand-craft models for the most important variables.  There are few companies IMO who would not benefit from Doppio’s offer here.

Of course, I’ve not seen the Doppio interface and a lot will hinge on its ease-of-use (as with all software aimed at business users).  But this offer appears to be very sophisticated, well-crafted and compelling, combining social network forecasting, intelligent causal modeling and semantic web technologies.

Well done, Team Doppio!  I wish you every success with this product!

PS:  I have just learnt that “doppio” means “double”, which makes it a very apposite name for this application – forecasts considered by many people, across their human network.  Neat!  (2009-09-16)

Article in The Observer (UK) about Doppio 2009-09-06 here. And here is an AFP TV news story (2009-09-15) about Doppio co-founder, Edouard d’Archimbaud.  Another co-founder is Benjamin Haycraft.

Action-at-a-distance

For at least 22 years, I have heard business presentations (ie, not just technical presentations) given by IT companies which mention client-server architectures.   For the last 17 of those years, this has not been surprising, since both the Hyper-Text Transfer Protocol (HTTP) and the World-Wide Web (WWW) use this architecture.    In a client-server architecture, one machine (the client) requests that some action be taken by another machine (the server), which responds to the request.  For HTTP, the standard request by the client is for the server to send to the client some electronic file, such as a web-page.  The response by the server is not necessarily to undertake the action requested.    Indeed, the specifications of HTTP define 41 responses (so-called status codes), including outright refusal by the server (Client Error 403 “Forbidden”), and allow for hundreds more to be defined.  Typically, one server will be configured to respond to many simultaneous or near-simultaneous client requests.   The functions of client and server are conceptually quite distinct, although of course, one machine may undertake both functions, and a server may even have to make a request as a client to another server in order to respond to an earlier request from its clients.   As an analogy, consider a library which acts like a server of books to its readers, who are its clients;  a library may have to request a book via inter-library loan from another library in order to satisfy a reader’s request.
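As a small illustration of the client side of this architecture, the following Python sketch (standard library only; the URL is a placeholder) issues a request and inspects whichever status code the server chooses to return:

import urllib.request
import urllib.error

def fetch(url):
    # The client requests a resource; the server answers with a status code
    # indicating whether, and how, it has complied with the request.
    try:
        with urllib.request.urlopen(url) as response:
            return response.status, response.read()
    except urllib.error.HTTPError as err:
        # The server has replied but declined or failed the request,
        # eg, 403 "Forbidden" or 404 "Not Found".
        return err.code, None

status, body = fetch("http://example.com/")   # placeholder URL
print(status)                                 # eg, 200, 403, 404, ...
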
Since the rise of file sharing, particularly illegal file sharing, over a decade ago, it has also been common to hear talk about Peer-to-Peer (P2P) architectures.   Conceptually, in these architectures all machines are viewed equally, and none are especially distinguished as servers.   Here, there is no central library of books; rather, each reader him or herself owns some books and is willing to lend them to any other reader as and when needed.   Originally, peer-to-peer architectures were invented to circumvent laws on copyright, but they turn out (as do most technical innovations) to have other, more legal, uses – such as the distributed storage and sharing of electronic documents in large organizations (eg, X-ray images in networks of medical clinics).
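By way of contrast with the client-server sketch above, a peer-to-peer arrangement can be caricatured by making every node symmetric: each peer holds some documents, serves them to others, and fetches missing ones from its neighbours.  The names below are again invented purely for illustration.

class Peer:
    def __init__(self, name, documents):
        self.name = name
        self.documents = dict(documents)   # document id -> content
        self.neighbours = []               # other peers this peer knows of
    def serve(self, doc_id):
        # Acting as a server: hand over the document if this peer holds it.
        return self.documents.get(doc_id)
    def fetch(self, doc_id):
        # Acting as a client: look locally, then ask each neighbour in turn.
        if doc_id in self.documents:
            return self.documents[doc_id]
        for peer in self.neighbours:
            content = peer.serve(doc_id)
            if content is not None:
                self.documents[doc_id] = content   # keep a local copy
                return content
        return None

clinic_a = Peer("clinic_a", {"xray-001": "<image bytes>"})
clinic_b = Peer("clinic_b", {})
clinic_b.neighbours.append(clinic_a)
print(clinic_b.fetch("xray-001"))   # clinic_b obtains the image from clinic_a
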
Both client-server and P2P architectures involve attempts at remote control.  A client or a peer-machine makes a request of another machine (a server or another peer, respectively), to undertake some action(s) at the location of the second machine.   The second machine receiving the request from the first may or may not execute the request.   This has led me to think about models of such action-at-a-distance.
Imagine we have two agents (human or software), named A and B, at different locations, and a resource, named X, at the same location as B.   For example, X could be an electron microscope, B the local technician at the site of the microscope, and A a remote user of the microscope.  Suppose further that agent B can take actions directly to control resource X.   Agent A may or may not have permissions or powers to act on X.
Then,  we have the following five possible situations:

1.  Agent A controls X directly, without agent B’s involvement (ie, A has remote access to and remote control over resource X).
2.  Agent A commands agent B to control X (ie, A and B have a master-slave relationship; some client-server relationships would fall into this category).
3.  Agent A requests agent B to control X (ie, both A and B are autonomous agents; P2P would be in this category, as well as many client-server interactions).
4.  Both agent A and agent B need to take actions jointly to control X (eg, the double-key system for launch of nuclear missiles in most nuclear-armed forces; coalitions of agents would be in this category).
5.  Agent A has no powers, not direct nor indirect, to control resource X.
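
A rough way to see that these five situations really are distinct is to encode them; the Python sketch below (with names invented for the purpose) separates them along two dimensions: who must act on resource X itself, and whether agent B is free to refuse.

from enum import Enum
from dataclasses import dataclass

class Case(Enum):
    DIRECT_CONTROL = 1   # 1. A acts on X directly, without B
    COMMAND = 2          # 2. A commands B, who must comply
    REQUEST = 3          # 3. A requests B, who may refuse
    JOINT_ACTION = 4     # 4. A and B must both act (double-key)
    NO_POWER = 5         # 5. A cannot affect X at all

@dataclass(frozen=True)
class Profile:
    actors_on_x: frozenset   # who must act on resource X itself
    b_may_refuse: bool       # is agent B free to decline?

PROFILES = {
    Case.DIRECT_CONTROL: Profile(frozenset({"A"}), False),
    Case.COMMAND:        Profile(frozenset({"B"}), False),
    Case.REQUEST:        Profile(frozenset({"B"}), True),
    Case.JOINT_ACTION:   Profile(frozenset({"A", "B"}), True),
    Case.NO_POWER:       Profile(frozenset(), False),   # refusal is moot here
}

# No two cases share the same pair of values, so these two dimensions
# already separate all five situations.
for case, profile in PROFILES.items():
    print(case.name, sorted(profile.actors_on_x), profile.b_may_refuse)
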

As far as I can tell, these five situations exhaust the possible relationships between agents A and B acting on resource X, at least for those cases where potential actions on X are initiated by agent A.  From this outline, we can see the relevance of much that is now being studied in computer science:

  • Action co-ordination (Cases 1-5)
  • Command dialogs (Case 2)
  • Persuasion dialogs (Case 3)
  • Negotiation dialogs (dialogs to divide a scarce resource) (Case 4)
  • Deliberation dialogs (dialogs over what actions to take) (Cases 1-4)
  • Coalitions (Case  4).

To the best of my knowledge, there is as yet no formal theory which encompasses these five cases.   (I welcome any suggestions or comments to the contrary.)  Such a formal theory is needed as we move beyond Web 2.0 (the web as means to create and sustain social networks) to reification of the idea of computing-as-interaction (the web as a means to co-ordinate joint actions).
Reference:
Network Working Group [1999]: Hypertext Transfer Protocol – HTTP/1.1. Technical Report RFC 2616.  Internet Engineering Task Force.