## Dembski's Complex Specified Information

JEFFREY SHALLIT AND WESLEY ELSBERRY

Throughout science, random chance is an accepted component of our explanation of observed physical events. In chemistry, the ideal-gas laws can be explained as the average behavior of the random motion of molecules. In physics, the concept of half-life tells us what percentage of radioactive nuclei can be expected to decay within a given time period, even if we cannot identify, before the fact, which ones specifically will survive. In biology, random mutations and genetic drift are two of the probabilistic components of the modern theory of evolution.

But chance cannot explain everything. If we were to draw letters at random from a bag of Scrabble tiles, and the resulting sequence formed the message CREATIONISM IS UTTER BUNK, we would be very surprised (notwithstanding the perspicacity of the sentiment). So under what circumstances can we reject chance as the explanation for an observed physical event?

Let's be more specific. Suppose I entered a room alone, shut the door, and flipped a coin 50 times, recording the outcomes as heads (H) or tails (T). I then came out of the room and showed you the record of coin flips. Imagine that I produced this list of outcomes:

A: HHHTTHTHTTHHTHTHTTHTTHTHHHHHTHTHTHTHHTTTHHHTHTHHHT

No one would be surprised in the least. But what if I produced this record?

B: HHHHHHHHHHHHHHHHHHHHHHHHHTTTTTTTTTTTTTTTTTTTTTTTTT

You would probably view with skepticism my claim of having produced it through random coin flips, and you would seek an explanation other than random chance. Perhaps I was really using two coins, one of which had two heads, the other two tails, and I accidentally switched between the two halfway through. Or perhaps I simply made up the record without flipping a coin at all. Can your skepticism be given a rigorous theoretical basis?

It does no good at all to say that B is a very unlikely outcome. According to standard probability theory, A is just as unlikely as B. In fact (assuming a fair coin), events A and B both occur with this probability:

(1/2)^50 = 1/1,125,899,906,842,624 ≈ 8.9 × 10^-16

In other words, the probability is about 10^-15, or about 1 in a million billion. Yet B seems to us a much more unlikely result than A. We have stumbled on what appears to be a paradox of probability theory.
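The arithmetic here is easy to verify. A quick sketch in Python (the language choice is ours, not part of the chapter) computes the probability of any one specific sequence of 50 fair coin flips:

```python
from fractions import Fraction

# Each fair flip contributes a factor of 1/2, so any *specific*
# 50-flip record -- whether it looks "random" like A or "patterned"
# like B -- has probability (1/2)^50.
p = Fraction(1, 2) ** 50

print(p)         # exact value: 1/1125899906842624
print(float(p))  # roughly 8.9e-16, i.e. about 10^-15
```

The point the exact fraction makes vivid is that the probability calculation cannot distinguish A from B: both sequences get precisely the same number.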

It is not a new paradox. James Boswell (1740-95), the biographer of lexicographer and essayist Samuel Johnson, wrote this about the events of 24 June 1784:

I recollect nothing that passed this day, except Johnson's quickness, who, when Dr. Beattie observed, as something remarkable which had happened to him, that he had chanced to see both No. 1 and No. 1000, of the hackney-coaches, the first and the last; "Why, Sir, (said Johnson,) there is an equal chance for one's seeing those two numbers as any other two." He was clearly right; yet the seeing of the two extremes, each of which is in some degree more conspicuous than the rest, could not but strike one in a stronger manner than the sight of any other two numbers. (Boswell 1983, 1319-20)

French mathematician Pierre Simon Laplace (1749-1827) discussed the paradox in his 1819 *Essai philosophique sur les probabilités*:

On a table we see letters arranged in this order, C o n s t a n t i n o p l e, and we judge that this arrangement is not the result of chance, not because it is less possible than the others, for if this word were not employed in any language we should not suspect it came from any particular cause, but this word being in use among us, it is incomparably more probable that some person has thus arranged the aforesaid letters than that this arrangement is due to chance.

This is the place to define the word extraordinary. We arrange in our thought all possible events in various classes; and we regard as extraordinary those classes which include a very small number. Thus at the play of heads and tails the occurrence of heads a hundred successive times appears to us extraordinary because of the almost infinite number of combinations which may occur in a hundred throws; and if we divide the combinations into regular series containing an order easy to comprehend, and into irregular series, the latter are incomparably more numerous. (Laplace 1951, 231)

These remarks of Boswell and Laplace suggest a possible resolution of our paradox. In flipping a fair coin 50 times, some outcomes fit a short, simple pattern,

B: HHHHHHHHHHHHHHHHHHHHHHHHHTTTTTTTTTTTTTTTTTTTTTTTTT

whereas others do not:

A: HHHTTHTHTTHHTHTHTTHTTHTHHHHHTHTHTHTHHTTTHHHTHTHHHT

The number of very simple patterns is small, so when a record that fits such a pattern is produced, we might legitimately reject "flips of a fair coin" as a valid explanation.
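One informal way to make "fits a short, simple pattern" concrete is compressibility: a sequence with a simple pattern admits a shorter description than an irregular one. The sketch below uses zlib compression as a rough proxy for description length; this is our illustration of the intuition, not the formal notion of a pattern taken up later in the chapter.

```python
import zlib

# The two 50-flip records from the text.
A = "HHHTTHTHTTHHTHTHTTHTTHTHHHHHTHTHTHTHHTTTHHHTHTHHHT"
B = "H" * 25 + "T" * 25

# Compressed length is a crude stand-in for "length of the shortest
# description of the sequence": the more regular the record, the
# fewer bytes the compressor needs to encode it.
size_A = len(zlib.compress(A.encode()))
size_B = len(zlib.compress(B.encode()))

print(size_A, size_B)  # B compresses to fewer bytes than A
```

Though both sequences are equally probable as coin-flip outcomes, B's regularity lets a general-purpose compressor encode it more briefly, which is the asymmetry our intuition is tracking.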

So far we have spoken imprecisely. What, exactly, is a valid pattern? How many valid patterns are there, and what does it mean to say this quantity is "small"? We will take up these questions later in the chapter. But now it is time to see how our paradox and its resolution can be misused to reach extraordinary conclusions.