

The second postulate of incommensurability gained strong apparent support from Boltzmann's interpretation of the second law of thermodynamics as a law of disorder, and it is still promoted actively by leading Darwinians today (e.g., Mayr's [1985] arguments for the autonomy of biology from physics). As noted above, it is prominently at work in Darwin's Dangerous Idea in Dennett's (1995b, p. 38) assertions that living things are "organized in the service of the battle" against the second law of thermodynamics, or that living things "are things that defy" or constitute a "systematic reversal" of the second law (Dennett, 1995b, p. 69). The view of an impoverished physical world that is thus built into the core of Dennett's scheme, as with Cartesian schemes in general, becomes the justification for invoking extra-physical, immaterial, or ideal agents to animate the world and get it ordered. More specifically, it becomes the justification for adopting Dawkins's idealist reductionism to support Dennett's computational world view, where, with some minor repackaging, immaterial, active, striving algorithms are used to bring all the agency into the world and account for intentional ordering.

As shown above, Dennett makes a number of separate and nonequivalent claims with respect to selfish algorithm theory that he erroneously blurs together or elides. The major example is the pairing of the claim that natural selection, and hence evolution, is an algorithmic process, with the claim that it is algorithms on which natural selection works, that is, that living things have descended from algorithms and hence constitute branching phyla of algorithms. Although Dennett attempts to move seamlessly from one to the other, the two are not equivalent claims. In addition, both are erroneous. The rest of this section is in four parts. The first will refute the claim that natural selection, and hence evolution, is an algorithmic process; the second, the claim that our ancestors were algorithms; the third, the claim that all agency in the universe is due to bits of program or algorithms; and the fourth, the claim that evolution is for the good of "immortal" replicators.

Algorithmic Processes Have Been Produced By Evolution, But Evolution Is Not An Algorithmic Process

Computer programs are algorithms, and algorithms, as Dennett describes them, and as they are often described by others, are "recipes": lists of step-by-step procedures of discrete rules or instructions for completing a task, solving some problem, or accomplishing some end. Like recipes and other rule-based procedures, algorithms, as ordinarily understood and defined, are artifactual productions of cultural systems (human social systems) and thus very recently evolved products of evolution. In his effort to computationalize evolution, Dennett would like to turn this empirical fact on its head and make evolution the product of an algorithmic process. What Darwin discovered with natural selection, Dennett says, was an algorithm, and his dangerous idea was that the products of evolution are thus explained as consequences of an algorithmic process. But natural selection is not an algorithmic process, and to claim that it is, as Dennett does, is to commit a category error.
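The "recipe" character of an algorithm in this ordinary sense can be illustrated with a classic example, Euclid's procedure for the greatest common divisor. This sketch is offered only as an illustration of what the text means by a discrete, step-by-step list of rules; it is not drawn from Dennett's or the author's text.

```python
def gcd(a, b):
    """Euclid's algorithm: a finite recipe of discrete, ordered rules."""
    # Step 1: if b is zero, the task is complete; a is the answer.
    # Step 2: otherwise, replace (a, b) with (b, a mod b) and go back to step 1.
    while b != 0:
        a, b = b, a % b
    return a

result = gcd(48, 18)  # the recipe halts after finitely many steps
```

Each line is an explicit instruction; nothing happens except by consulting the next rule in the list. That is precisely the sense in which algorithms, like recipes, are artifacts.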

Laws, Rules, and the Modeler's Fallacy. As noted in the previous section, some years ago, Popper (1985) described natural selection as being entailed by a "situational logic", namely, if certain conditions are present then natural selection necessarily follows. Natural selection is a lawful process in this sense since it always happens if the conditions are met, and the requisite conditions, all quite well-known, are the fecundity principle, heritable variation, and finiteness of accessible resources. Dennett, who does not cite Popper, notes the if...then logic of natural selection, and, pointing out that algorithms are based on if...then logic, asserts that natural selection is an algorithmic process. But this conclusion simply does not follow. Dennett's assertion is based on the category error that follows from conflating the model with the thing being modeled (call this "the modeler's fallacy").

The error follows from the assumption that if a rule-based system such as a model or mechanical device simulates or captures the behavior of some part of the world in some sense, then that part of the world is itself a rule-based system or mechanical device. The illegal or erroneous move that Dennett repeatedly makes is from "can be considered as an algorithmic process", as in modeled with an algorithmic process, to "is an algorithmic process". But this is an unsuccessful sleight of hand. The fact that every lawful process can be simulated by an algorithm using an if...then set of rules does not mean that lawful processes entail algorithmic procedures or sets of rules in order to occur. In fact, the complete opposite is true. A defining property of lawful, as opposed to rule-based, behavior is that, as in the case of natural selection, it follows directly from initial conditions and the respective law or laws, with no list of procedures or instructions required for its occurrence.
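The point that a lawful process can be *simulated* by if...then rules without itself *being* rule-based can be made concrete. The following sketch encodes the three conditions named above (the fecundity principle, heritable variation, and finite resources) as explicit rules and shows that mean fitness rises as a lawful consequence. All names and parameters here are illustrative assumptions, not anything from the source; the sketch is the model, and the author's point is precisely that the worldly process it mimics runs on no such rule list.

```python
import random

def one_generation(population, capacity, mutation_sd=0.1):
    """Simulate the three conditions for natural selection as explicit rules."""
    offspring = []
    for fitness in population:
        # Fecundity: each individual overproduces offspring (at least two,
        # more if its fitness is higher).
        for _ in range(max(2, round(fitness))):
            # Heritable variation: offspring resemble the parent, with noise.
            child = fitness + random.gauss(0, mutation_sd)
            offspring.append(max(0.0, child))
    # Finite resources: only `capacity` individuals can survive.
    offspring.sort(reverse=True)
    return offspring[:capacity]

random.seed(0)
pop = [1.0] * 20          # start with a uniform population
for _ in range(50):
    pop = one_generation(pop, capacity=20)
mean_fitness = sum(pop) / len(pop)  # rises above the initial value of 1.0
```

Note that the simulation needs an explicit instruction list, while the natural process it models follows directly from the conditions themselves; the sketch thus illustrates, rather than undermines, the distinction between a model and the thing modeled.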