Of quacking ducks and homeostasis

After reading a very interesting essay (PDF) by the biologist J. Scott Turner discussing Intelligent Design (ID) and Evolution, which presents an anti-anti-ID case, I was led to read Turner’s recent book, The Tinkerer’s Accomplice: How Design Emerges from Life Itself. Turner argues that Darwinian Evolution requires, but lacks, a notion of intentionality. Despite using an apparently teleological concept, he is no creationist: he argues that both Evolutionary theorists (who refuse to consider any such notions) and Creationists/IDers (who have such a notion, but refuse to examine it scientifically) are missing something important and necessary.

Turner’s key notion is that biological and ecological systems contain entities which create environments and seek to regulate them. Typically, such entities aim to maintain their environment in a particular state, i.e., they seek environmental homeostasis. The concept of homeostasis derives from the French pioneer of physiology, Claude Bernard (1813-1878), who observed that the human body and its various organs seek to maintain various internal states homeostatically, for example, the chemical composition of the bloodstream. (The word “homeostasis” itself was coined later by Walter Cannon, elaborating Bernard’s notion of the milieu intérieur.) That indefatigable complex-systems theorist and statistician Cosma Shalizi has thus proposed calling entities which create and regulate environments Bernard Machines, a name Turner also uses. (Turner credits Shalizi for the name but provides no citation to anything written by Shalizi, not even a URL; I think this very unprofessional of Turner.)
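As a toy illustration (my own, drawn from neither Turner nor Bernard), homeostasis can be sketched as negative feedback: a regulator repeatedly nudges a variable back toward its set point after a disturbance.

```python
# Toy sketch of homeostasis as proportional negative feedback.
# The names and numbers here are illustrative, not from the literature.

def regulate(value: float, set_point: float, gain: float = 0.5) -> float:
    """One step of negative feedback: move part-way back to the set point."""
    return value + gain * (set_point - value)

value = 37.0       # e.g. core body temperature, degrees Celsius
value += 3.0       # an external disturbance
for _ in range(20):
    value = regulate(value, set_point=37.0)
# after repeated regulation, the disturbance has all but vanished
```

Each step halves the remaining error, so the system settles back to its set point without any central plan, which is all that "seeking" homeostasis need mean here.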
For Turner, these entities have some form of intentionality, and thus supply the component missing from Darwinian evolution. To a computer scientist, at least one who has kept up with research since 1990, a Bernard Machine is just an intelligent agent: such agents are reactive (they respond to changes in their environment), pro-active (i.e., goal-directed), and autonomous (they may decide, within some parameters, how, when, and whether to act). Some Bernard Machines may also have a sense of sociality, i.e., awareness of the existence of other agents in their environment, completing the superfecta of the now-standard definition of agenthood due to Wooldridge and Jennings (1995).
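To make the mapping concrete, here is a minimal and entirely hypothetical Bernard Machine, cast in the Wooldridge-Jennings vocabulary; the class and method names are my own sketch, drawn from neither Turner nor the agents literature.

```python
class BernardMachine:
    """Hypothetical sketch: a homeostatic regulator described as an agent."""

    def __init__(self, set_point: float, tolerance: float = 0.5):
        self.set_point = set_point    # pro-active: a standing goal
        self.tolerance = tolerance    # autonomous: its own threshold for acting
        self.neighbours = []          # social: it may know of other agents

    def perceive(self, environment: dict) -> float:
        # Reactive: the agent senses its environment on every cycle.
        return environment["temperature"]

    def act(self, environment: dict) -> None:
        error = self.set_point - self.perceive(environment)
        # Autonomous: the agent itself decides whether acting is warranted.
        if abs(error) > self.tolerance:
            environment["temperature"] += 0.5 * error

env = {"temperature": 42.0}
agent = BernardMachine(set_point=37.0)
for _ in range(10):
    agent.act(env)
# the environment settles within the agent's tolerance of its set point
```

The point of the sketch is only that nothing mystical is required: all four properties fall out of an ordinary feedback loop plus a threshold for deciding when to act.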
I understand that more materialist biologists become agitated at any suggestion that non-human entities might have anything like intentionality (a concept with teleological or spiritual connotations, apparently), and thus they question whether goal-directedness can truly be equated with intentionality. But this argument is exactly like the one we witnessed over the last two decades in computer science concerning the autonomy of software systems: if it looks like a duck, walks like a duck, and quacks like a duck, there is nothing to be gained, in practice or in theory, by insisting that it isn’t really a duck. Indeed, as software-agent researchers know very well (see Wooldridge 2000), one can never conclusively verify the internal states of agents (or of Bernard Machines, or of ducks, for that matter), since a sufficiently clever software developer can design an agent exhibiting any required internal state. The cleverest developers can even design agents themselves clever enough to emulate, insincerely and wittingly so, any required internal states.
POSTSCRIPT: Of course, with man-made systems such as economies and societies, we cannot assume all agents are homeostatic; some may simply seek to disrupt the system. For computational systems, we cannot even assume all agents always act in their own self-interest (however they perceive that), since they may simply have buggy code.
J. Scott Turner [2007]: Signs of design. The Christian Century, June 12, 2007, 124: 18-22. Reprinted in: Jimmy Carter and Philip Zaleski (Editors): Best American Spiritual Writing 2008. Houghton Mifflin.
J. Scott Turner [2007]: The Tinkerer’s Accomplice: How Design Emerges from Life Itself. Cambridge, MA, USA: Harvard University Press.
Michael J. Wooldridge [2000]: Semantic issues in the verification of agent communication languages. Journal of Autonomous Agents and Multi-Agent Systems, 3 (1): 9-31.
Michael J. Wooldridge and Nicholas R. Jennings [1995]: Intelligent agents: theory and practice. The Knowledge Engineering Review, 10 (2): 115-152.
