Sunday, May 15, 2005

Information, Order, and Entropy (I.D. XXI)

Here is another in my series of posts inspired by Intelligent Design, William A. Dembski's book-length claim that scientific inquiry can demonstrate God's hand in the design of evolved biological systems. My most recent prior posts were A Place Saver (I.D. XX) and Questioning Specification (I.D. XIX).

[Book cover: John R. Pierce's An Introduction to Information Theory]

This post introduces a book I will be using to educate myself about information theory, John R. Pierce's An Introduction to Information Theory: Symbols, Signals, and Noise. I obtained it because the crux of Dembski's argument concerns this formal branch of mathematical science, also called communication theory, which among other things seeks a suitably general way to quantify the amount of information in any "message" sent or received across any "communication channel."

Dembski argues that only a divine designer could originate the "complex specified information" (CSI) that is found in the natural world: information that only very improbably could have arisen by chance, and that also betrays a demonstrable, if hidden, pattern. This pattern reflects "side" information, a body of knowledge which — independently of the CSI "main event" — is capable of generating the pattern that lurks behind or within the CSI. Hence, the pattern can be identified as the specification of the CSI.

In this post, however, I will be concerned not with CSI but with just plain information, whether or not complex, whether or not specified. Information is, at its most abstract, that by which uncertainty can be reduced. What I am interested in is whether self-organization of the type championed by Stuart Kauffman in At Home in the Universe can create information. Dembski says it cannot.
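To make "reducing uncertainty" concrete, here is a minimal sketch of Shannon's way of counting information in bits. It is my own illustration, not something drawn from Pierce or Dembski, and the eight-message scenario is invented purely for the example.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a probability distribution, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Uncertainty about which of 8 equally likely messages will arrive:
before = entropy([1/8] * 8)   # 3.0 bits

# A received signal narrows the field to 2 equally likely candidates:
after = entropy([1/2] * 2)    # 1.0 bit

# The information the signal conveyed is the uncertainty it removed:
print(before - after)         # 2.0 bits
```

The particular numbers don't matter; the point is that information, on this accounting, is measured by how much uncertainty it takes away.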

Yet, as Pierce shows, the antonym of information is entropy. And its synonym is order. Kauffman says that the "laws of self-organization and complexity" which he describes produce "order for free." If that isn't the same thing as information creation, why not?


Pierce broaches the topic of entropy and order (pp. 21-23) with reference to two related fields of physics, thermodynamics and statistical mechanics.

Thermodynamics has to do with thermal energy — heat — in systems whose molecules are in dynamic motion — canonically, gases. If a gas is allowed to expand against a moving metal piston in a metal cylinder, and if this takes place so slowly that no heat flows between the gas and the metal, some of the erstwhile thermal energy of the gas is converted to work, as the gas cools.

But if an equal amount of work is done by an external force that pushes the piston slowly back to its original position in the cylinder, the gas compresses and heats back up to its former temperature. Since no heat has been allowed to escape, this process is the exact reversal of the first process. Because the two processes exactly reverse one another, the entropy of the gas remains constant. So, says Pierce, "entropy is an indicator of reversibility; when there is no change of entropy, the process is reversible" (p. 21).
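In the usual thermodynamic notation (my gloss, not Pierce's), the point is simply that a reversible process in which no heat flows leaves the entropy unchanged:

```latex
\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T} = 0
\qquad \text{because } \delta Q_{\mathrm{rev}} = 0 \text{ throughout (no heat flows in or out).}
```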

But what if, instead of a piston, the cylinder is simply divided into two parts by a membrane or partition, such that all the gas molecules start out on one side of the partition? This is a situation of minimal entropy, since this particular arrangement of molecules — all on one side, none on the other — is the epitome of order.

Now, instead of moving the piston, let us imagine that the membrane dissolves. The gas molecules spread out to fill both halves of the cylinder. Entropy — disorder — increases. Yet the thermal energy of the gas remains the same, since no mechanical work has been done.

And, once the partition has been removed or the membrane has vanished, no work can be done. Before, work was possible, if the membrane simply became a piston. After, that option is no longer available. An increase in entropy means a decrease in "the possibility of converting thermal energy into mechanical energy" (p. 23).

But here's the key thing, with respect to information theory. When all the gas molecules were quarantined on one side of the membrane, we knew more about their positions than we did after they had spread out into both halves of the cylinder. We had greater certainty as to the position of any one molecule, call her Hermione. Before, Hermione's location was definitely confined to one half of the cylinder. After, she could be anywhere.

So an increase in entropy corresponds to greater uncertainty. Since information is that which reduces uncertainty, entropy is "negative information." The information in a system goes down when its entropy goes up. Which is simply another way of saying the obvious: the order in a system goes down when the disorder goes up.
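Here is a back-of-the-envelope version of the Hermione point, a toy calculation of my own that coarse-grains each molecule's position down to "left half" or "right half" of the cylinder; the ten-molecule gas is invented for the example.

```python
from math import log2

n_molecules = 10  # a toy gas

# Before: each molecule is certainly in the left half -> 1 possible half apiece.
ways_before = 1 ** n_molecules

# After: each molecule may be in either half -> 2 possible halves apiece.
ways_after = 2 ** n_molecules

# Information lost = entropy gained = log2 of the growth in possibilities.
print(log2(ways_after / ways_before))   # 10.0 bits: one bit of uncertainty per molecule
```

Removing the partition costs us exactly one bit of positional knowledge per molecule, which is the "negative information" Pierce is talking about.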

That means that anything that increases a system's order increases its information content.


But isn't that exactly what happened when, in the first scenario, the piston was forced back into the cylinder, thus moving all the gas molecules to one side of the chamber? Yet, we said there was no change in entropy — and so there couldn't have been added order, added information content, right?

The trick here is that Pierce assumes, in the first scenario, that the work done when the piston moves outward is stored by having it lift a weight, and then is fully recovered by letting the weight fall, thereby forcing the piston back to its original position. Under the idealized assumption that no heat escapes during the process, the overall system (including the weight) returns exactly to its initial state. There was neither a net gain nor a net loss in entropy, or order, or information.

Yet, in the middle of the stroke-counterstroke process, after the stroke but before the counterstroke, there was less order — less information — in the cylinder, taken as a system separate from the external weight. If we frame just the cylinder as "the system," excluding the weight, and compare its state prior to the stroke with its state prior to the counterstroke, there is a (temporary) entropy gain, and thus an information loss, once the stroke has been completed.

So my conclusion is this: whether information has been "created" or "destroyed" is a question that cannot be answered until you decide how you want to frame or matte the system in terms of its spatial inclusiveness (i.e., is the weight part of the system?) and how you want to bracket the system's behavioral history in terms of its temporal inclusiveness (i.e., is the post-counterstroke situation also to be considered?).
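To put numbers on that framing-and-bracketing point, here is a toy entropy ledger. The figures are invented purely for illustration; only the bookkeeping matters.

```python
# Toy entropy of the gas in the cylinder at three moments (arbitrary units, invented):
gas_entropy = {
    "before stroke":       1.0,   # compressed, orderly
    "after stroke":        2.0,   # expanded; the lifted weight holds the recoverable work
    "after counterstroke": 1.0,   # weight released, gas recompressed
}

def entropy_change(start, end):
    """Entropy change of the cylinder-only frame between two bracketed moments."""
    return gas_entropy[end] - gas_entropy[start]

print(entropy_change("before stroke", "after stroke"))        # +1.0: information apparently destroyed
print(entropy_change("after stroke",  "after counterstroke")) # -1.0: information apparently created
print(entropy_change("before stroke", "after counterstroke")) #  0.0: widen the bracket; nothing net happened
```

The same three snapshots yield a loss, a gain, or no change at all, depending entirely on which two you choose to compare.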


It is believed that, overall, the entropy of the cosmos increases as the universe "runs down" and heads ineluctably for "heat death," billions of years from now. As the world we know becomes on net more disordered, it would seem accordingly that its information content must inexorably diminish. After the universe's ultimate heat death, it will offer no means whatever for reducing the uncertainty of any post-facto observer from any hypothetical sister cosmos as to what transpired in our cosmos before it died.

Yet, here we are, alive and kicking. Stuart Kauffman's view of this fact in At Home in the Universe is clearly that we and all other living things possess less entropy and contain more order and information than we have any "right" to possess. We are the beneficiaries of "order for free." We are also its bestowers. This dual truth arises, he says, because we are self-organizing systems.

I imagine the idea at the heart of Kauffmanian self-organization is the development — nay, the evolution — of living things' ability to "export" entropy.

Visualize the piston-in-cylinder system, framed spatially without inclusion of the external weight in the frame. Bracket it temporally from the beginning of the counterstroke to the end thereof. The system goes from high entropy to low. As it gains in order, it gains in information.

This looks like magic, because of how we've framed and bracketed the situation. We've excluded the weight which stores the energy which the stroke converts to mechanical form, and we've bracketed out the time of the stroke itself, focusing exclusively on the "information-creating" counterstroke. Once we broaden the frame and remove the bracket, we see that there's nothing mysterious going on. In fact, there is no net change in entropy or information.

Still, we cannot deny that the cylinder-piston-counterstroke system, seen with its original spatial frame and its original temporal bracket, does appear to "export" its entropy. As (to borrow Kauffman's Shakespearean allusion) it "struts and frets its appointed hour upon the stage," it flourishes a seeming "order for free."

So, the question becomes, when the process of "entropy exportation" cannot be as readily demystified simply by removing an arbitrary spatial frame and an arbitrary temporal bracket, isn't "self-organization" the likely verdict?

If so, then William Dembski's claim that self-organization, like other natural processes, cannot originate information qua information seems justified. On this view, self-organization simply changes the flow of existing information; it doesn't make new information.

Yet, even though it's by such lights something of a cheat, self-organized information flow could be responsible, all by itself, for life's origin and evolution on this planet. If, over eons of time as you evolve, you as an arbitrary initial life form become very, very good at entropy exportation — staving off death — you could even become quite brainy and start writing books like At Home in the Universe and Intelligent Design.
