Monday, April 18, 2005

The Law of Very Large Numbers

I was put onto Deborah Bennett's excellent book Randomness (see The Random and the Deliberate) by an endnote in William Dembski's Intelligent Design. The note pointed to Bennett's account of a fifteenth-century rabbi, Isaac ben Mosheh Aramah. In the Bible, said the rabbi, Jonah's being found guilty of bringing God's wrath upon his shipmates was a legitimate verdict, since many lots had been cast in sequence and all had fallen upon Jonah. Had just one lot been cast, though, the adverse result could have been written off as mere chance.

Casting lots was an ancient equivalent of flipping a coin or rolling a die. Other randomizers used in antiquity included the oddly shaped, four-sided talus (Latin) or astragalus (Greek), an animal ankle bone — see Bennett, pp. 19-20. In the first century B.C., the Roman orator-statesman Cicero is on record as disagreeing with the thinking that Isaac ben Mosheh Aramah would endorse some sixteen centuries later. Of the "Venus-throw" — four tali thrown together, each displaying a different one of the four faces — Cicero said that multiple successive Venus-throws, however improbable, would be just as much a matter of chance as a single Venus-throw (see Bennett, p. 74).

Modern probability theory agrees with Cicero. The reason, as Bennett points out, is the law of very large numbers. It is not impossible for, say, a tossed coin to come up heads 100 times in a row — just very unlikely. The odds are one in two to the 100th power, and thus vanishingly small, but still greater than zero.

Let's call two to the 100th power a "kazillion." If you tossed the coin a kazillion times, you would expect to get 100 heads in a row roughly once, on average, somewhere along the line. And if you tossed the coin several kazillion times, the chance of getting a 100-heads sequence at some point would approach certainty. This is the power of the law of very large numbers (sometimes quite erroneously called the "law of averages"):

... with many trials the number of times a particular outcome will occur is very close to mathematical conjecture, or mathematical expectation. And this applies to even the most unlikely events; if it is possible for them to happen, given enough opportunities, eventually they will happen, in accordance with the laws of probability. (Bennett, p. 76)
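
To make the arithmetic concrete, here is a small Python sketch (my own illustration, not Bennett's) of the "kazillion" argument: the chance of 100 heads on one specific run of tosses, and a rough, back-of-the-envelope estimate of the probability of seeing at least one such run somewhere in one or several kazillion independent fair tosses.

from math import exp

# Chance that 100 specified tosses of a fair coin all come up heads.
p_single_run = 0.5 ** 100
print(f"P(100 heads in a row on one specific attempt) = {p_single_run:.3e}")

kazillion = 2 ** 100   # the post's nickname for two to the 100th power

def prob_at_least_one_run(n_tosses, run_len=100):
    # Rough Poisson-style estimate: a fresh run of run_len heads can begin
    # at roughly n_tosses positions, each requiring a tail followed by
    # run_len heads, i.e. probability 2**-(run_len + 1) per position.
    expected_fresh_runs = n_tosses * 0.5 ** (run_len + 1)
    return 1.0 - exp(-expected_fresh_runs)

for k in (1, 2, 5, 10):
    p = prob_at_least_one_run(k * kazillion)
    print(f"P(at least one 100-head run in {k} kazillion tosses) ≈ {p:.3f}")

On this rough estimate, a single kazillion tosses gives you less than an even chance of actually witnessing such a run, while several kazillion push the probability toward certainty; that is why "roughly once, on average" is the honest way to state the expectation, rather than "exactly once."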


In comparing Cicero's view of chance with that of his contemporaries — the latter being closer to Rabbi Isaac ben Mosheh Aramah's, in the feeling that "these things happen at the direction of the gods" — Bennett calls Cicero's understanding "more mature" (p. 74).

She also notes that children in modern times don't grasp the law of very large numbers:

At very young ages children do not understand this concept. Part of the problem is that young children do not accept the notion of randomness, which is at the heart of any understanding of probability. Piaget and Inhelder [child psychology researchers] found that young children conceive of random results as displaying regulated but hidden rules. (p. 78)

Bennett also links this (shall we call it) "pre-randomness mentality" with notions of fairness. When a game is "fair" by the standards of probability theory, all that is meant is that there is no bias against any player's hopes. A "fair" six-sided die has an equal chance of coming up 1, 2, 3, 4, 5, or 6. In the very, very long run, it will bear that expectation out. However, this does not guarantee that a player at dice will not lose all his money before that happens — a seemingly "unfair" result.
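
A toy simulation (my own, not from Bennett) can make the distinction vivid. Below, a bettor with a modest bankroll stakes one dollar per roll on a fair die showing a six, at mathematically fair five-to-one odds, so the expected value of every single bet is exactly zero. The die is "fair" in the probability-theory sense, yet most such bettors still go broke long before the long run arrives.

import random

def goes_broke(bankroll=20, max_rolls=2_000, seed=None):
    # Bet $1 per roll that a fair die shows a 6, paid at fair 5-to-1 odds.
    # Expected value per bet: (1/6)(+5) + (5/6)(-1) = 0.
    rng = random.Random(seed)
    for _ in range(max_rolls):
        if rng.randint(1, 6) == 6:
            bankroll += 5        # a win pays five times the $1 stake
        else:
            bankroll -= 1        # a loss forfeits the $1 stake
        if bankroll <= 0:
            return True          # broke before the "long run" arrived
    return False                 # still solvent after max_rolls bets

busted = sum(goes_broke(seed=s) for s in range(1_000))
print(f"{busted} of 1,000 bettors went broke despite a perfectly fair die")

Nothing here is rigged against the player; the "unfairness" is simply that a finite bankroll can run out long before the law of very large numbers has had its say.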

There is nothing in the laws of probability that guarantees fairness in this second sense, the sense in which no player ends up wiped out by an unbiased game. But children, as well as many adults, are prone to the "gambler's fallacy" of thinking otherwise:
... the heart of the gambler's fallacy lies in a misconception about the fairness of the laws of chance. We believe chance to be a self-correcting process — in which deviations in one direction [say, "too many heads" in a row] will soon be countered with a deviation in the other direction [a long run of tails]. But in fact, deviations in the short run are not corrected; they are merely diluted over the long run ... . (p. 79)
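
Bennett's "diluted, not corrected" point is easy to check numerically. In the sketch below (again my own illustration), we pretend we have just watched 100 heads in a row and then continue with thousands of further fair tosses, averaging over many independent continuations. The proportion of heads drifts back toward one half, but the surplus of heads over the fifty-percent mark stays, on average, right where the streak left it.

import random

rng = random.Random(42)
TRIALS = 1_000      # independent continuations to average over
EXTRA = 10_000      # additional fair tosses after the initial streak

mean_surplus = 0.0
mean_proportion = 0.0
for _ in range(TRIALS):
    heads, tosses = 100, 100            # start just after 100 straight heads
    for _ in range(EXTRA):
        heads += rng.getrandbits(1)     # one more fair toss: 1 = heads
        tosses += 1
    mean_surplus += (heads - tosses / 2) / TRIALS
    mean_proportion += (heads / tosses) / TRIALS

print(f"mean surplus of heads over the 50% mark: {mean_surplus:+.1f} (stays near +50)")
print(f"mean proportion of heads: {mean_proportion:.4f} (diluted toward 0.5)")

The hundred-head head start is never "paid back" by a compensating excess of tails; it simply becomes an ever smaller fraction of an ever larger total.
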
There's a pattern here. The more "modern" and "mature" our views of chance, the more we reject the notion of God's determining hand on the dice ... and the more we also reject belief in any personal or impersonal force that very soon will make events come "fair" or "just" by this second, "childish" definition.

Note how much the emphasis is on the word "soon." Given several kazillion coin tosses or die throws, events will indeed come out "fair" and "just" in the sense that every outcome eventually matches its mathematical expectation. So says the law of very large numbers.


I have to note, before closing, that the Stuart Kauffman view of evolution which I favor (see Welcome to Beyond Darwin), and which forms the core of his book At Home in the Universe, could well be grounded in a so-called "childish" view "of random results as displaying regulated but hidden rules."

After all, the basic idea Kauffman advances is that there is a previously unsuspected "lawfulness" to evolutionary history. Though at the level of fine detail, the history of life on Earth is indeed unpredictable and "incompressible" — it cannot be reduced to any simpler, more-quickly-run computer algorithm — at a macro level, it might well produce outcomes that are "robust" and eminently predictable.

Among the results that are "robust" and predictable may well be creatures that have our signal qualities, such as intelligence, consciousness, and self-awareness.

Let's say for the sake of argument that in terms of Kauffman's personal psychology — and mine — there is an underlying need not to abandon a "childish" belief in ... how shall I put it? Human fairness over dice fairness? Goodness and justice over blindness? Meaning over meaninglessness?

However we may put it, we must also remind ourselves that science is science because, over time, it factors out personal psychology. Kauffman in his book gives us many "mature" grounds for believing that his vaunted laws of "self-organization" may complement blind-chance mutations, winnowed by natural selection, in guiding evolutionary history. Other researchers — whatever their own biases — can test his bold hypotheses, although some of that will have to wait until science gains more experimental control of the requisite bio-molecules in the lab.


However "immature" some of our biases may be against believing in blind chance and the law of very large numbers, there is also great value in "growing up," probability-wise. I do not necessarily mean that the only "mature" stance is an atheistic one. Rather, I think it vital for the grown-up theist to recognize that we work with God to "make our own luck."

Consider the victim of the "gambler's fallacy" who loses all his or her money out of a conviction that Lady Luck is "fair" and will soon reward the bettor's persistence. Put differently, this equates to the assumption that the bettor can passively rely on an equalizing tendency built into the grand scheme of things. The bettor doesn't actually have to do anything to "change his luck." Which, if worst comes to worst and the bettor loses all, makes him or her a victim, not of chance, but of rank injustice.

Thus does the childish "gambler's fallacy" go along with a victim mentality, when things don't work out.

A more mature mentality is to assume that there is a way for us to work with God to change our outcome into a more just one.

It's not all up to us, as the atheist believes ... but neither is it all up to God, or Lady Luck, or Fortuna, the Roman goddess who gave us the word "fortune" (see Bennett, p. 31). Instead, there is an active, dynamic, world-creative partnership between God and ourselves. In faith, and with God's help, we can create our own luck as we create our own world. We do not have to passively accept all the "bad stuff" that happens to us. This I would call a "mature" faith in God.
