Terrestrial life as negative entropy
by Schuyler Erle
Tim O'Reilly forwarded the following Slashdot posting to the ORA editors' mailing list:
"To distinguish images derived from living vs. non-living sources, USC and NASA JPL researchers report today using the standard gzip compression utility. As a measure of overall pattern complexity, they find that the inherent pixel content of biologically generated fossils produces higher image compression ratios [more data redundancy], compared to their non-biological counterparts. The more the file shrinks, the more likely it is that a living process was involved."
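You can approximate the spirit of the experiment at the command line, or in a few lines of Python. This is only a sketch under made-up inputs -- the repeated "layer-band" string stands in for a layered biological texture, and random bytes stand in for an abiotic one; the researchers' actual image pipeline isn't described in the posting:

```python
import gzip
import random

def compression_ratio(data: bytes) -> float:
    """Original size over compressed size: higher means gzip found more redundancy."""
    return len(data) / len(gzip.compress(data))

random.seed(0)
# A repetitive, patterned texture -- stand-in for a biologically generated fossil.
patterned = b"layer-band " * 1000
# Incompressible noise of the same length -- stand-in for an abiotic texture.
noisy = bytes(random.getrandbits(8) for _ in range(len(patterned)))

# The patterned data shrinks dramatically; the noise barely shrinks at all.
assert compression_ratio(patterned) > compression_ratio(noisy)
```

The interesting part is that gzip knows nothing about biology -- it is just a general-purpose redundancy detector, which is exactly why the result is suggestive.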
This experiment, and the others like it described in the Slashdot posting, make a deep and intuitive sort of sense. Data compression works by squeezing the redundancy out of a dataset, so the compressed output has higher entropy -- a greater apparent degree of randomness per symbol. The term "entropy" is used in information theory to mirror the analogous concept from thermodynamics, and the two are described by equations of the same form.
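The "same form" claim is literal: Shannon's entropy is H = -sum(p_i * log2 p_i), while Gibbs's thermodynamic entropy is S = -k_B * sum(p_i * ln p_i) -- only the constant and the base of the logarithm differ. A small sketch of the information-theoretic version, computing byte-level entropy:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: H = -sum(p_i * log2(p_i))."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A repetitive string carries little information per byte...
low = shannon_entropy(b"aaaaaaab")
# ...while maximally varied data approaches the 8 bits/byte ceiling.
high = shannon_entropy(bytes(range(256)))
assert low < 1.0 < high
```

A good compressor drives its output toward that ceiling, which is why "compresses well" and "low entropy" are two descriptions of the same fact.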
From the standpoint of information theory, life itself might be viewed as a kind of spontaneous ordering force in the universe, a sort of anti-entropy. Nearly all living things embody some form of survival-promoting symmetry or redundancy. So it amuses me intensely to see that idea -- that intuitive notion of life as an ongoing pattern-generating pattern, life as negative entropy -- made manifest in such a direct and compellingly mundane fashion as the comparison of image compression ratios.
What other useful intuitions about living things can we draw from an information-theoretic viewpoint?
More fun with compression algorithms
I don't know whether they were using gzip or another algorithm, but this article from the New York Times describes the work of some physicists who used compression algorithms to compute the "distance" between different languages, as well as to identify texts written by the same author (gzip == plagiarism detector).
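The trick generalizes: if concatenating text B onto text A barely increases the compressed size, A and B share structure. One common gzip-based formulation is the normalized compression distance (the physicists' exact recipe may have differed); the sample sentences below are invented for illustration:

```python
import gzip

def csize(data: bytes) -> int:
    return len(gzip.compress(data))

def ncd(a: bytes, b: bytes) -> float:
    """Normalized compression distance: small when a and b share structure."""
    ca, cb, cab = csize(a), csize(b), csize(a + b)
    return (cab - min(ca, cb)) / max(ca, cb)

english = b"the quick brown fox jumps over the lazy dog while the lazy dog watches the quick brown fox"
english2 = b"a lazy dog sleeps under the tree as the quick fox jumps over the brown dog again and again"
italian = b"la volpe veloce salta sopra il cane pigro mentre il cane pigro guarda la volpe veloce"

# The two English texts should measure as "closer" than English and Italian.
assert ncd(english, english2) < ncd(english, italian)
```

The compressor exploits shared substrings across the concatenation boundary, so same-language (or same-author) pairs compress together more tightly.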
Which invalidates the second law of thermodynamics...
thermodynamics != information theory
Well, there's information theory, and then there's thermodynamics. The one deals with the abstract notion of quantifiable information, the other with actual physical phenomena of our universe. For example, to the best of my knowledge, there's no conservation principle in information theory -- no notion that the amount of information in the universe must remain constant. So there are many ways in which information theory and thermodynamics differ, and to me that makes the analogy between the two sorts of "entropy" all the more striking.
Re: Which invalidates the second law of thermodynamics...
Life forms are not the closed systems to which the second law of thermodynamics applies. Living things create order within themselves by exporting entropy to their environment, in the form of heat generated by metabolic activity. The total entropy (ultimately, of the universe) always rises, which gets you into all sorts of interesting philosophical questions...
In my paper (arxiv.org/pdf/cs.it/0602023), I show that information is entropy, not "negative entropy", and that a compressed file is in a state of equilibrium, in accordance with the Clausius inequality. The Clausius inequality is the basic definition of equilibrium. Therefore information, like entropy, has a tendency to increase. The information increase in life is similar, thermodynamically, to the information increase produced by an antenna broadcasting to many receivers.