On prophecy

“They know not what to make of the Words, little time, speedily, shortly, suddenly, soon. They would have me define the Time, in the Prophecies of my ancient Servants. Yet those Predictions carried in them my authority, and were fulfilled soon enough, for those that suffered under them . . . I have seen it best, not to assign the punctual Times, by their Definition among Men; that I might keep Men always in their due distance, and reverential Fear of invading what I reserve, in secret, to myself . . . The Tower-Guns are the Tormenta e Turre aethera, with which this City I have declared should be battered . . . I have not yet given a Key to Time in this Revelation.”

John Lacy, explaining to his followers among a millenarian French Huguenot sect in Britain in 1707 why his prophecies had not yet been fulfilled, cited in Schwartz 1980, p. 99.
Reference:
Hillel Schwartz [1980]:  The French Prophets: The History of a Millenarian Group in Eighteenth-Century England.  Berkeley, CA, USA: University of California Press.

The decade around 1664

We noted before that one consequence of the rise of coffee-houses in 17th-century Europe was the development of probability theory as a mathematical treatment of reasoning with uncertainty.  Ian Hacking’s history of the emergence of probabilistic ideas in Europe has a nice articulation of the key events, all of which took place within a decade either side of 1664:

  • 1654:  Pascal wrote to Fermat with his ideas about probability
  • 1657: Huygens wrote the first textbook on probability to be published, and Pascal was the first to apply probability ideas to problems other than games of chance
  • 1662: The Port Royal Logic was the first publication to mention numerical measurements of something called probability, and Leibniz applied probability to problems in legal reasoning
  • 1662:  London merchant John Graunt published the first set of statistics drawn from records of mortality
  • Late 1660s:  Probability theory was used by John Hudde and by Johan de Witt in Amsterdam to provide a sound basis for reasoning about annuities (Hacking 1975, p.11).

Developments in the use of symbolic algebra in Italy in the 16th century provided the technical basis upon which a formal theory of uncertainty could be erected.  And coffee-houses certainly aided the dissemination of probabilistic ideas, both in spoken and written form.  Coffee-houses may even have aided the creation of these ideas – new mathematical concepts are only rarely created by a solitary person working alone in a garret, but usually arise instead through conversation and debate among people each having only partial or half-formed ideas.
However, one aspect of the rise of probability in the mid-17th century is still a mystery to me:  what event or phenomenon led so many people across Europe to be interested in reasoning about uncertainty at this time?  Although 1664 saw the establishment of a famous brewery in Strasbourg, I suspect the main motivation was the prevalence of bubonic plague in Europe.  Although plague had been around for many centuries, the Catholic vs. Protestant religious wars of the previous 150 years had, I believe, led many intelligent people to abandon or lessen their faith in religious explanations of uncertain phenomena.  René Descartes, for example, was led to cogito, ergo sum when seeking beliefs which people of all faiths or none could agree on.  Without religion, alternative models to explain or predict human deaths, morbidity and natural disasters were required.  The insurance of ocean-going vessels provided a financial incentive for finding good predictive models of such events.
Hacking notes (pp. 4-5) that, historically, probability theory has mostly developed in response to problems about uncertain reasoning in other domains:  in the 17th century, these were problems in insurance and annuities; in the 18th, astronomy; in the 19th, biometrics and statistical mechanics; and in the early 20th, agricultural experiments.  For more on the connection between statistical theory and experiments in agriculture, see Hogben (1957).  For the relationship of 20th-century probability theory to statistical physics, see von Plato (1994).
POSTSCRIPT (ADDED 2011-04-25):
There appear to have been major outbreaks of bubonic plague in Seville, Spain (1647-1652), in Naples (1656), in Amsterdam, Holland (1663-1664), in Hamburg (1663), in London, England (1665-1666), and in France (1668).   The organist Heinrich Scheidemann, teacher of Johann Reincken, for example, died during the outbreak in Hamburg in 1663.   Wikipedia now has a listing of global epidemics (albeit incomplete).
 
POSTSCRIPT (ADDED 2018-01-19):
The number 1664 in Roman numerals is MDCLXIV, which uses every Roman numeric symbol precisely once.  The number 1666 has the same property, and for that number, the Roman symbols are in decreasing order.
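For readers who like to check such claims, here is a minimal Python sketch (a throwaway verification of my own, not drawn from any source) that converts the two years to Roman numerals and tests both properties:

```python
# A quick check of the claim above: 1664 and 1666 each use every Roman symbol once,
# and 1666 has its symbols in decreasing order of value.

def to_roman(n: int) -> str:
    """Convert a positive integer to its standard Roman-numeral form."""
    values = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
              (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
              (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in values:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

symbol_value = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

for year in (1664, 1666):
    numeral = to_roman(year)
    uses_each_once = sorted(numeral) == sorted("MDCLXVI")
    # "Decreasing order" means each symbol's value is no greater than the one before it.
    decreasing = all(symbol_value[a] >= symbol_value[b]
                     for a, b in zip(numeral, numeral[1:]))
    print(year, numeral, uses_each_once, decreasing)
# Output: 1664 MDCLXIV True False
#         1666 MDCLXVI True True
```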
 
References:
Ian Hacking [1975]:  The Emergence of Probability: A Philosophical Study of Early Ideas about Probability, Induction and Statistical Inference.  London, UK: Cambridge University Press.
Lancelot Hogben [1957]: Statistical Theory. W. W. Norton.
J. von Plato [1994]:  Creating Modern Probability:  Its Mathematics, Physics and Philosophy in Historical Perspective.  Cambridge Studies in Probability, Induction, and Decision Theory.  Cambridge, UK:  Cambridge University Press.

Old, beardy revolutionaries wielding spreadsheets

If you were aiming to model global, 21st-century capitalism, the obvious place to start would not be with a model of the firm based on mid-Victorian Lancashire textile manufacturing companies.  Firstly, the textile industry developed in Lancashire in the 18th century because there, and almost nowhere else, was the climate sufficiently conducive for the then leading-edge technology to operate successfully.  (The air needed sufficient dampness, but not heat, for the cotton fibres not to be broken by the early textile machines.)  Technology in most industries has progressed so much in the two centuries since that very few industries are now tied to specific climates, and so most industries are no longer tied to place.
Secondly, large swathes of work – even most work undertaken in companies described as manufacturers – are not anything a Victorian economist would recognize as manufacturing.  Rather, such work would be better described as symbol analysis and manipulation.  A relative of mine recently embarked on training as a surface-materials mixer for a road-building company – a great job, all done inside in an air-conditioned office, mixing different ingredients and assessing the results, achieved by moving graphic objects around on a computer screen.  Of course, some person still has to be outside moving and switching on the automated machinery which actually lays the road surface once it has been created, so not every task is symbol processing.  But mixing is no longer done by eye, by a man using a shovel in front of a furnace.  The relevant attribute of symbol manipulation – unlike, say, the operation of a loom – is that it is no longer tied to place, thanks to our global telecommunications infrastructure and to digitisation.  Thus, the processing of US insurance claims can move from Hoboken, New Jersey, to Bangalore and Mauritius, and then maybe back again, if circumstances so dictate.
Thirdly, for companies whose sole business involves symbol manipulation – eg, banks, investment firms, graphics designers, media companies, software developers, consultancy companies, etc – the economics of traditional manufacturing industry no longer applies.  Information is a product whose value increases as more people use it, and whose marginal costs of production can decline to zero with multiple users.  It costs Microsoft between US$1 and $2 billion to develop the first copy of each new generation of its Windows operating system, but less than $1 each for the second and subsequent copies; the cost of producing these subsequent copies comprises only the cost of the media used to store the product (a DVD or a filestore).  (Of course, I am not including the cost of marketing and distribution in this statement of the cost of production.)  Similarly, a well-crafted, well-timed IPO and financial plan may raise (to cite the case of one IPO whose writing we led) US$5 billion on the world’s capital markets if successful, and nothing at all if less well crafted or placed at a different time.  These information-economy attributes also apply to those parts of so-called manufacturing companies which undertake symbol manipulation:  the design team of Mazda cars, for example, which relocated from Japan to the UK because London is a world-centre of art, design and marketing, or the two-man in-house forex-trading desk which two decades ago first enriched and then almost bankrupted Australia’s largest defence electronics firm, AWA.
So, although I do not share the sentiment, I think it fine for someone to express a personal dislike of an alleged bonus culture in our banks and financial sector companies, as Alex Goodall has done.  But to argue against this feature of our modern world using a micro-economic model based on mid-Victorian manufacturing would seem inappropriate.   Much as I admire Karl Marx for his startling and still-interesting contributions to the 250-year-old conversation that is economic theory, for his insightful criticisms of the world he knew, and for his desire to make that world better for all, his model of the firm describes almost nothing about the world of 21st-century business that I know.

The Better Angels

[WARNING:  In-jokes for telecoms people!]
Prediction, particularly of the future, is difficult, as we know.  We notice a good example of the difficulties when reading Charles McCarry’s riveting political/spy thriller, The Better Angels.  Published in 1979 but set during the final US Presidential election campaign of the 20th century (2000? 1996?), McCarry gets some of the big predictions spot on: suicide bombers, Islamic terrorism, oil-company malfeasance, an extreme right-wing US President, computer voting machines, a Greek-American in charge of the US foreign intelligence agency, uncollected garbage and wild animals in Manhattan’s streets, and, of course, the manned space mission to Jupiter’s moon, Ganymede.  But he makes a real howler with the telephone system:  a brief mention (p. 154) of “the Bell System” indicates he had no anticipation of the 1982 Modified Final Judgment of Judge Harold H. Greene.  How could he have failed to see that coming, when AT&T’s managers were preparing for decades for the competition which would follow, evident in the masterful way these managers and their companies have prospered since?!  A future with a unified Bell system was so weird that I was barely able to concentrate on the other events in the novel after this.
Reference:
Charles McCarry [1979]: The Better Angels. London, UK:  Arrow Books.

Evaluating prophecy

With the mostly-unforeseen global financial crisis uppermost in our minds, I am led to consider a question that I have pondered for some time:   How should we assess forecasts and prophecies?   Within the branch of philosophy known as argumentation, a lot of attention has been paid to the conditions under which a rational decision-maker would accept particular types of argument.

For example, although it is logically invalid to accept an argument only on the grounds that the person making it is an authority on the subject, our legal system does this all the time.  Indeed,  the philosopher Charles Willard has argued that modern society could not function without most of us accepting arguments-from-authority most of the time, and it is usually rational to do so.  Accordingly, philosophers of argumentation have investigated the conditions under which a rational person would accept or reject such arguments.   Douglas Walton (1996, pp. 64-67) presents an argumentation scheme for such acceptance/rejection decisions, the Argument Scheme for Arguments from Expert Opinion, as follows:

  • Assume E is an expert in domain D.
  • E asserts that statement A is known to be true.
  • A is within D.

Therefore, a decision-maker may plausibly take A to be true, unless one or more of the following Critical Questions (CQ) is answered in the negative:

  • CQ1:  Is E a genuine expert in D?
  • CQ2:  Did E really assert A?
  • CQ3:  Is A relevant to domain D?
  • CQ4:  Is A consistent with what other experts in D say?
  • CQ5:  Is A consistent with known evidence in D?

One could add further questions to this list, for example:

  • CQ6:  Is E’s opinion offered without regard to any reward or benefit upon statement A being taken to be true by the decision-maker?

Walton himself presents some further critical questions first proposed by Augustus De Morgan in 1847 to deal with cases under CQ2 where the expert’s opinion is presented second-hand, or in edited form, or along with the opinions of others.
Clearly, some of these questions are also pertinent to assessing forecasts and prophecies.  But the special nature of forecasts and prophecies may enable us to make some of these questions more precise.  Here is my  Argument Scheme for Arguments from Prophecy:

  • Assume E is a forecaster for domain D.
  • E asserts that statement A will be true of domain D at time T in the future.
  • A is within D.

Therefore, a decision-maker may plausibly take A to be true at time T, unless one or more of the following Critical Questions (CQ) is answered in the negative:

  • CQ1:  Is E a genuine expert in forecasting domain D?
  • CQ2:  Did E really assert that A will be true at T?
  • CQ3:  Is A relevant to, and within the scope of, domain D?
  • CQ4:  Is A consistent with what is said by other forecasters with expertise in D?
  • CQ5:  Is A consistent with known evidence of current conditions and trends in D?
  • CQ6:  Is E’s opinion offered without regard to any reward or benefit upon statement A being adopted by the decision-maker as a forecast?
  • CQ7:  Do the benefits to the decision-maker of adopting A as true at time T in D outweigh the costs of doing so?

In attempting to answer these questions, we may explore more detailed questions:

  • CQ1-1:  What is E’s experience as forecaster in domain D?
  • CQ1-2: What is E’s track record as a forecaster in domain D?
  • CQ2-1: Did E articulate conditions or assumptions under which A will become true at T, or under which it will not become true?  If so, what are these?
  • CQ2-2:  How sensitive is the forecast of A being true at T to the conditions and assumptions made by E?
  • CQ2-3:  When forecasting that A would become true at T, did E assert a more general statement than A?
  • CQ2-4:  When forecasting that A would become true at T, did E assert a more general time than T?
  • CQ2-5:  Is E able to provide a rational justification (for example, a computer simulation model) for the forecast that A would be true at T?
  • CQ2-6:  Did E present the forecast of A being true at time T qualified by modalities, such as possibly, probably, almost surely, certainly, etc.?
  • CQ4-1:  If this forecast is not consistent with those of other forecasters in domain D, to what extent are they inconsistent?   Can these inconsistencies be rationally justified or explained?
  • CQ5-1: What are the implications of A being true at time T in domain D?  Are these plausible?  Do they contradict any known facts or trends?
  • CQ6-1:  Will E benefit if the decision-maker adopts A being true at time T as his/her forecast for domain D?
  • CQ6-2:  Will E benefit if the decision-maker does not adopt A being true at time T as his/her forecast for domain D?
  • CQ6-3:  Will E benefit if many decision-makers adopt A being true at time T as their forecast for domain D?
  • CQ6-4:  Will E benefit if few decision-makers adopt A being true at time T as their forecast for domain D?
  • CQ6-5:  Has E acted in such a way as to indicate that E had adopted A being true at time T as their forecast for domain D (eg, by making an investment betting that A will be true at T)?
  • CQ7-1:  What are the costs and benefits to the decision-maker for adopting statement A being true at time T in domain D as his or her forecast of domain D?
  • CQ7-2:  How might these costs and benefits be compared?  Can a net benefit/cost for the decision-maker be determined?

Automating these questions and the process of answering them is on my list of next steps, because automation is needed to design machines able to reason rationally about the future.   And rational reasoning about the future is needed if  we want machines to make decisions about actions.
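As a very rough indication of what such automation might involve, here is a minimal Python sketch of my scheme (the class, function and variable names are purely illustrative inventions of mine, not a finished design):  the critical questions are encoded as data, and a forecast is presumptively accepted only when no critical question has been answered in the negative.

```python
# A minimal, illustrative sketch of the Argument Scheme for Arguments from Prophecy.
# All names here are hypothetical; a real system would need richer answer values,
# evidence tracking, and the sub-questions CQ1-1 to CQ7-2.

from dataclasses import dataclass

CRITICAL_QUESTIONS = [
    "CQ1: Is E a genuine expert in forecasting domain D?",
    "CQ2: Did E really assert that A will be true at T?",
    "CQ3: Is A relevant to, and within the scope of, domain D?",
    "CQ4: Is A consistent with what is said by other forecasters with expertise in D?",
    "CQ5: Is A consistent with known evidence of current conditions and trends in D?",
    "CQ6: Is E's opinion offered without regard to any reward or benefit?",
    "CQ7: Do the benefits to the decision-maker of adopting A outweigh the costs?",
]

@dataclass
class ProphecyArgument:
    forecaster: str      # E
    domain: str          # D
    statement: str       # A
    time: str            # T
    answers: dict        # critical question -> True ("yes"), False ("no"), or absent

    def presumptively_acceptable(self) -> bool:
        """Accept A as true at T unless some critical question is answered in the negative."""
        return not any(self.answers.get(cq) is False for cq in CRITICAL_QUESTIONS)

# Example usage with made-up content: CQ5 has been answered "no", so the presumption fails.
arg = ProphecyArgument(
    forecaster="E", domain="telecoms demand", statement="A", time="2030",
    answers={CRITICAL_QUESTIONS[4]: False},
)
print(arg.presumptively_acceptable())   # False: the presumption in favour of A is defeated
```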
References:
Augustus De Morgan [1847]: Formal Logic.  London, UK:  Taylor and Walton.
Douglas N. Walton [1996]:  Argument Schemes for Presumptive Reasoning. Mahwah, NJ, USA: Lawrence Erlbaum.
Charles A. Willard [1990]: Authority.  Informal Logic, 12: 11-22.

Kaleidic moments

Is the economy like a pendulum, gradually oscillating around a fixed point until it reaches a static equilibrium?  This metaphor, borrowed from Newtonian physics, still dominates mainstream economic thinking and discussion.  Not all economists have agreed, not least because the mechanistic Newtonian viewpoint seems to allow no place for new information to arrive or for changes in subjective responses.   The 20th-century economists George Shackle and Ludwig Lachmann, for example, argued that a much more realistic metaphor for the modern economy is a kaleidoscope.  The economy is a “kaleidic society, interspersing its moments or intervals of order, assurance and beauty with sudden disintegration and a cascade into a new pattern.” (Shackle 1972, p.76).
The arrival of new information, or changes in the perceptions and actions of marketplace participants, or changes in their subjective beliefs and intentions, are the events which trigger these sudden disruptions and discontinuous realignments.   Recent events in the financial markets show we are in a kaleidic moment right now.  If there’s an invisible hand, it’s not holding a pendulum but busy shaking the kaleidoscope.
Reference:
George L. S. Shackle [1972]: Epistemics and Economics.  Cambridge, UK:  Cambridge University Press.
 

Perceptions and counter-perceptions

The recent death of Yuri Nosenko allows me to continue an intelligence arc in these posts.  What is the connection to marketing, I hear you cry!  Well, marketing is about the organized creation and management of perceptions, which could also be a definition of secret intelligence activities.  In any case, the two disciplines have many overlaps, including some coincident goals and some similar methods, which I intend to explore on this blog.

First, let us focus on Nosenko.  He presented himself in Geneva in 1962 to CIA as an agent of KGB willing to spy for the Americans, and then defected to the USA in 1964.  Among other information, he came bearing a firm denial that the USSR had had anything to do with the assassination of John F. Kennedy in November 1963.  JFK’s alleged assassin, Lee Harvey Oswald, was, after all, one of the very few (perhaps under 1000) people who had defected from the USA to the USSR between 1945 and 1963, and one of the even fewer (perhaps under 50) people who had defected back again.  Nosenko claimed to have read Oswald’s KGB file.
From the start, lots of doubts arose regarding Nosenko’s testimony.  He did not seem to know his way around KGB headquarters, his testimony contradicted other information which CIA knew, there were internal inconsistencies in his story, and he cast serious aspersions on an earlier defector from KGB to CIA, claiming him to be a KGB plant.  Was Nosenko, then, a KGB plant or was he a genuine defector?  Within CIA the battle raged throughout the 1960s, with first the sceptics of Nosenko and then subsequently the believers in his bona fides holding sway.  Chief among the sceptics was James J. Angleton, who came to see conspiracies everywhere, and who was eventually fired from CIA for his paranoia.  (Robert De Niro’s film “The Good Shepherd” is based on the life of Angleton, with Matt Damon taking this part, and features a character based on Nosenko.)  Finally, CIA decided officially to believe Nosenko, and he was placed in a protection programme.  He was even asked to give lectures to new CIA recruits on the practices of KGB, such was his apparent acceptance by the organization.
This final position so angered one of the protagonists, Tennent Bagley, that, 40 years later, he wrote a book arguing the case for Nosenko being a KGB plant who duped CIA.  The book is very compelling, and I find myself very much inclined to the sceptic case.  However, one last mirror is missing from Bagley’s hall.  What if the top-most levels of CIA really did doubt that Nosenko was genuine?  Would it not be better for CIA not to let KGB know this?  In other words, if your enemy tries to dupe you, and you realise that this is what they are trying to do, is it not generally better to let them think they have succeeded, if you can?  Certainly, more information (about their methods and plans, about their agents, about their knowledge) may potentially be gained from them if you manage to convince them that they have indeed duped you.  All you lose is – perhaps – some pride.  Pretending to be duped by Soviet intelligence is perhaps what Britain’s MI6 did regarding Kim Philby, Donald Maclean and Guy Burgess:  it is possible that MI6 knew many years before their defections that these men were working for the Soviets, and used them in that period as conduits for messages to Moscow.
In the case of Nosenko, the dupe arrived bearing a message about the JFK assassination.  For many and various reasons (not all of them necessarily conspiratorial), CIA may have been keen to accept the proposition that KGB were not involved in JFK’s assassination.  How do you convince KGB that you believe this particular message if you don’t believe the messenger is genuine?  So, also for pragmatic reasons, the top levels of CIA may have decided to act in a way which would lead (they hoped) to KGB thinking that the KGB’s ruse had worked.
How then to convince KGB that their plant, Nosenko, was believed by CIA to be for real?  Simply accepting him as such would be too obvious – even KGB would know that his story had holes and would not believe that a quick acceptance by CIA was genuine.   Better, rather, for CIA to argue internally, at length and in detail, back-and-forth-and-forth-and-back, about the question, and then, finally, in great pain and after much disruption, decide to believe in the defector.   Bagley either does not understand this last mirror (something I sincerely doubt, since his book evidences a fine mind and very keen understanding of perception management), or else perhaps his book is itself part of a plan to convince KGB that Nosenko was fully believed by CIA.
 
References:
Tennent H. Bagley [2007]: Spy Wars: Moles, Mysteries, and Deadly Games.  New Haven, CT: Yale University Press.
Robert De Niro, Director [2006]:  The Good Shepherd. Universal Pictures.
Tim Weiner [2007]:  Legacy of Ashes: The History of the CIA.  New York, NY, USA: Doubleday.

The post-modern corporation

Anyone who has done any strategic planning or written a business case knows that planning requires one to forecast the future.  If you want to assess the financial viability of some new product or company, you need to make an estimate of the likely revenues of the company, and this requires making a prognosis of the level and nature of demand for whatever it is the company plans to provide.   “Taking a view on the future” is what the M&A people call this.
The problem is that the future is uncertain and different people may have different views of it.  There are usually many possible views one could take, and stakeholders are not always able to agree on which is the most likely.  Financial planners typically deal with this uncertainty by developing a small number of scenarios:  often called a best case, an average case, and a worst case.  These scenarios are very rarely the actual “best” or the actual “worst” that the planners could conceive.  More typically, they are the best or worst “plausible” cases.  Similarly, the middle case may not be average in any sense of the word, but simply a case the planners happen to favour that is somewhere between the best and worst.  Often the average case is the best the planners think they can get away with, and they contrast this with an outlandish upside and a still-profitable downside.  As with other human utterances (eg, speeches and published papers), effective business planners take into account the views of their likely audience(s) when preparing a business plan.
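To make the mechanics concrete, here is a minimal Python sketch of this kind of three-scenario planning, using entirely invented demand, price and cost figures (none drawn from any real business case):

```python
# A minimal sketch of best / average / worst scenario planning with invented numbers.
# Each scenario is simply a different view of future demand; everything else is shared.

def npv(cashflows, discount_rate):
    """Net present value of a list of yearly cashflows, discounted from year 1."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cashflows, start=1))

# Hypothetical units sold per year under each view of the future.
scenarios = {
    "best":    [30_000, 70_000, 120_000],
    "average": [20_000, 45_000,  70_000],
    "worst":   [15_000, 30_000,  40_000],
}
price_per_unit = 30.0          # revenue per unit (invented)
cost_per_unit = 18.0           # variable cost per unit (invented)
fixed_costs_per_year = 300_000.0
discount_rate = 0.10

for name, demand in scenarios.items():
    cashflows = [(price_per_unit - cost_per_unit) * units - fixed_costs_per_year
                 for units in demand]
    print(f"{name:8s} NPV = {npv(cashflows, discount_rate):12,.0f}")
```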
For telecommunications companies operating in a regulated environment, there is a further wrinkle:  the fifth “P” of telecoms marketing, Permission.  To gain regulatory approval or an operating licence for a new service, telcos in many countries need to make a business case to the regulatory agency.  Here, the regulators may have their own  views of the future.  Quite often, governments and regulators, especially those in less developed countries, feel they are behind in technology and believe that their country has a vast, untapped market ready for the taking.   Sometimes, governments have public policy or even party-political reasons for promoting a certain technology, and they want the benefits to be realized as quickly as possible.   For these and other reasons, governments and regulators often have much more optimistic views of likely demand than do the companies on the ground.
Thus, we have the situation where a company may prepare different business plans for different stakeholders, each plan encoding a different view of the future:  an optimistic plan for the regulator, a parsimonious plan for a distribution partner and yet another for internal use.   Indeed, there may be different views of the future and thus different plans for different internal audiences also, for reasons I will explain in my next post.   Living with uncertainty, the post-modern corporation treats its view of the future as completely malleable — something which can be constructed and re-constructed as often as occasion or audience demands.
In my next post, I’ll talk about the challenges of planning with multiple views of the future, and give some examples.
Reference:  This post was inspired by Grant McCracken’s recent post on Assumption-Hunting.