How does life help spread entropy?

Entropy and information: fuel of the universe

Physics and the love of the world

The dark days between the winter solstice and the turn of the year are the time of the fireplace and the fireworks, of tea and mulled wine, of reviewing the news and the events of the year, of reflecting on life, on possibilities and realities, on the valuable and the worthless, on good and bad, and on the passing of time itself.

From the perspective of physics, all of these things - from fire, fireworks and hot drinks, to news and events, to life, values, good and bad, and time itself - are inextricably linked to a concept whose power can hardly be overestimated: information, and its closest counterpart, entropy. In fact, entropy and information are like the two opposing faces of a Janus head, one laughing, one crying - or the two conflicting temperaments of Dr. Jekyll and Mr. Hyde.

As an abstract, non-visual concept, entropy is older than the now-omnipresent notion of information. It dates from the 19th century, the great age of the steam engine, and was first introduced by the German physicist Rudolf Clausius, who wrote in 1865 [1]:

“The energy of the world is constant. The entropy of the world strives towards a maximum.”

The second sentence of this wisdom is today called the “second law of thermodynamics”: it describes processes that take place in one direction only. Why, for example, does milk poured into hot tea heat up while the tea itself cools down a bit? If only energy conservation applied, the whole thing could just as well work the other way round: the milk would get even colder and the tea even hotter. The increase in entropy, however, requires the temperatures to equalize - at least as long as no external energy is used to increase the temperature difference. In this way, entropy determines what happens, i.e. which processes take place at all and in which direction they run.
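
The bookkeeping behind this can be sketched in one line (a standard textbook argument, not spelled out in the article): when a little heat flows from the hot tea into the cooler milk, the total entropy goes up.

```latex
% Entropy balance for a small amount of heat \delta Q flowing from the tea
% (temperature T_tea) to the cooler milk (temperature T_milk < T_tea):
\[
  \Delta S_{\mathrm{total}}
    = \frac{\delta Q}{T_{\mathrm{milk}}} - \frac{\delta Q}{T_{\mathrm{tea}}}
    > 0
\]
% The reverse process would lower the total entropy and therefore does not
% occur on its own.
```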

It was the Austrian physicist Ludwig Boltzmann who deciphered what entropy really is. He understood the laws of thermodynamics as the result of statistical distributions of atoms and molecules. Temperature is then nothing other than the average energy of these particles. The following applies: anyone who does statistics and works with averaged quantities does not possess or use complete information about the state of a physical system: instead of knowing the energy of each individual particle (the so-called “microstate”), one knows only the average energy (the so-called “macrostate”). The same average can be realized in many different ways: a series of numbers consisting of two 2's has an average of 2, just like a series consisting of a 1 and a 3. Entropy describes the number of possible microstates in a macrostate, or more precisely: entropy is defined as the (normalized) logarithm of the number of microstates that correspond to a macrostate. Since it can be assumed that all microstates are equally probable, the following applies: the size of the entropy tells us how probable a macrostate is. And in physical processes entropy increases because less probable macrostates evolve towards more probable ones.
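
Written out in the standard notation (my addition; the article itself stays in prose), Boltzmann's definition reads:

```latex
% Boltzmann's entropy formula: S is the entropy of a macrostate, W the number
% of microstates compatible with it, and the constant k_B (Boltzmann's
% constant) is the "normalization" mentioned above.
\[
  S = k_{\mathrm{B}} \ln W
\]
```

The more microstates W are compatible with a macrostate, the larger its entropy S - and, with all microstates equally probable, the more probable that macrostate is.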

This turns entropy into an information-theoretical concept: entropy is the information that is missing to identify a specific microstate within a macrostate. And it is precisely this understanding that explains the power of the concept: suddenly, entropy can be applied not only to gases and steam engines, but universally! Entropy also has a subjective component: whether a microstate is assigned to a macrostate of low or high entropy is a question of definition, or of the available knowledge.

Of course, this does not detract from the omnipresence of entropy. For even if entropy is subjective to a certain extent, our lives take place in macrostates: instead of atoms and elementary particles, we speak of people and weather, of news, laws and the economy. Or cars.

For example, why do working cars break down, while broken cars seldom repair themselves? The answer is: entropy! A working car is one in which every part is in its place and functions as intended. This can be achieved in exactly one way: the macrostate “working” corresponds to exactly one microstate, i.e. it has zero entropy.

A broken car, on the other hand, can be broken in many ways: maybe the left front wheel is loose? Has a fuel line or exhaust pipe come loose in the engine? Is the windshield splintered and lying on the asphalt? Is the steering wheel no longer connected to the wheels? These are all very different microstates of our car, which are lumped together under the macrostate “broken”. So cars break because the working macrostate is much less likely than the broken macrostate, which can be realized in many more ways (through many more microstates) and thus has higher entropy. And: anyone who shakes up a heap of screws, metal and rubber will hardly obtain a functioning car by chance. The other way round, if you drive your car over a precipice, it is likely to end up as a heap of screws, metal and rubber.

The same applies to the game of dice: when throwing two dice, the most likely total is “seven”. The reason is not that a certain number is more likely than another on a single die, but that there are more ways to roll a “seven” with two dice (namely six: “16”, “25”, “34”, “43”, “52”, “61”) than, say, a “twelve”, for which there is only one possibility (namely “66”). The total “seven” is therefore exactly six times as likely as the “twelve”. Here the total number of pips is the macrostate: the state of the dice as characterized by the properties that interest us (or that are perceptible or experimentally accessible). The macrostate “seven” corresponds to the six microstates “16”, “25”, “34”, “43”, “52”, “61”; the macrostate “twelve” corresponds to the single microstate “66”. Because it is much more likely to turn a “twelve” into a “seven” by rolling the dice than the other way round, the entropy normally increases. Low entropy corresponds to order (both dice neatly showing the same face, or every screw in the car in its place), high entropy to disorder (the dice in whatever state they assume with high probability).
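
A tiny counting sketch makes this bookkeeping explicit (a Python illustration of the argument above; the variable names are my own):

```python
from collections import Counter
from math import log

# Macrostate = total number of pips; microstate = ordered pair of die faces.
# Count how many microstates realize each macrostate.
microstates = Counter(a + b for a in range(1, 7) for b in range(1, 7))

print(microstates[7], microstates[12])   # 6 and 1: "seven" is six times as likely as "twelve"

# Boltzmann-style entropy (logarithm of the number of microstates, constant set to 1):
print(log(microstates[7]), log(microstates[12]))   # ln 6 ≈ 1.79 versus ln 1 = 0
```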

And from this also follows the connection between entropy and energy: in order to transform a chaotic, high-entropy state into an ordered, low-entropy state (e.g. to tidy a room, to repair a car, or to turn a large number of dice six-side up), one needs to expend energy, and that always means converting low-entropy energy into higher-entropy energy. For example, if you light the fireplace and heat the apartment, i.e. increase the temperature difference between the cold winter night and the cozy room, the entropy of room and winter night decreases; but you burn wood, oil or coal and disperse the energy of these fuels, which was stored in an ordered carbon structure, by mixing and combining it with the surrounding air: the total entropy increases. Fireworks, too, explode because doing so increases entropy.

Low entropy, or order, is thus the essence of resources, and wasting resources is always associated with an increase in entropy: coal, gas, wood, oil and black powder have low entropy; warm air and ash have high entropy.

Entropy is thus inextricably linked with ethics: there is no doubt that evil thrives particularly well in conflicts over resources. The waste of resources is therefore undoubtedly evil, while - loosely following Wilhelm Busch's “The good - this much is certain - is always the evil that one leaves undone” - avoiding pointless increases in entropy must be good. And most crimes, up to and including murder, are characterized by an increase in entropy:

For just as “being intact” has low entropy and “being broken” has high entropy, the following applies: being alive has low entropy, being dead has high entropy (simply because there are far fewer ways to be alive - with pretty much every organ in its place and intact - than to be dead: head torn off, heart torn out, blood drained from the body, vital vessels clogged, and so on). Ordered structures like life only work because biological organisms maintain their order by converting low-entropy energy - e.g. in the form of food - and almost every form of food production is ultimately based on photosynthesis. So life typically draws, in the end, on the sun's energy reserves (a possible alternative energy source is volcanism in the Earth's interior). When the legendary quantum pioneer Erwin Schrödinger reflected on the foundations of biology in his 1944 book “What is Life?” [2] - thereby inspiring researchers such as James Watson and Francis Crick, who were decisive for the development of molecular biology - he spoke of biological information as “negentropy”: negative entropy.

And this idea can be generalized: when, four years later, Claude Shannon, the founder of information theory, was looking for a name for his measure of the information lost during data transmission over telegraph lines, John von Neumann - universal genius, pioneer of quantum theory, the bomb, the computer and game theory, and author of the “Mathematical Foundations of Quantum Mechanics” - advised him:

“You should call it entropy, for two reasons. First, the concept was used under that name in statistical mechanics, so it already has a name. Second, and more importantly, nobody knows what entropy really is, so you will always have an advantage in a discussion.”

Shannon understood entropy as a measure of information and thereby combined thermodynamics with information theory [3]. If the entropy of a macrostate is characterized by the number of possible microstates, then the entropy corresponds to the information that is missing to fully describe the corresponding microstate. Information is then the difference between the entropy of a given macrostate and the entropy of the macrostate with the greatest possible entropy. Entropy thus becomes, as it were, a lack of information - and hence the antagonist of a concept that is indispensable for rational decisions, political transparency and democracy.
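
In the standard notation of information theory (again my addition; the article itself stays in prose), Shannon's measure and the information it leaves over read:

```latex
% Shannon entropy (in bits) of a macrostate whose microstates occur with
% probabilities p_i:
\[
  H = -\sum_i p_i \log_2 p_i
\]
% Information in the sense described above: the distance to the macrostate of
% maximal entropy (all W microstates equally likely, so H_max = log_2 W):
\[
  I = H_{\max} - H
\]
```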

Ultimately, the low entropy at the beginning of the universe is also the prerequisite for things to develop in any direction at all (namely in the direction of increasing entropy), for our being able to use resources, and for ordered structures and life being possible in the first place.

Yes, for anything to happen at all! Without low entropy at the beginning of the universe there would be only a chaotically mixed mass of particles. Or as the US cosmologist Sean Carroll put it: “We are all surfing on a wave of increasing entropy” [4].

And perhaps entropy even lies behind time itself: what we experience as time in everyday life is in fact the increase of entropy. This is easy to understand by comparing a film run forwards with one run backwards: if, in a film, you see a shattered coffee cup on the floor reassemble itself with the help of energy gained by cooling the surrounding air, gather up the spilled coffee and jump back onto the table, you can be sure you are watching the film in reverse. Many physicists therefore believe that time itself is merely a description of the increase of entropy.

Indeed, even what we consider reality - our everyday world, in which things are in definite places and have definite properties - is related to entropy. For this everyday world obeys the laws of classical physics, which is only a limiting case of quantum mechanics. In quantum mechanics, by contrast, a physical system has all properties that are physically possible until an observation or measurement decides which of these properties is “realized”. This measurement process can itself be described using the laws of quantum physics, as a phenomenon that bears the name decoherence and was discovered in 1970 by the Heidelberg physicist H. Dieter Zeh. Decoherence describes how, through the interaction during the measurement process, information is diverted into the environment of system and measuring device. Since this information is lost to a local observer, it can be shown that the observer can perceive only one of the possible realities, while the other realities correspond to the unobservable parallel universes of Hugh Everett's many-worlds interpretation. The loss of information during the measurement process, which leads to the emergence of a classical reality, is synonymous with an increase in entropy - the so-called von Neumann entropy.
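
For completeness, the von Neumann entropy has a compact standard definition (not given in the text): it is computed from the density matrix that describes the observer's remaining, local knowledge of the system.

```latex
% Von Neumann entropy of a quantum state described by the density matrix rho:
\[
  S(\rho) = -\,\mathrm{Tr}\left(\rho \ln \rho\right)
\]
% A perfectly known (pure) state has S = 0; as decoherence leaks information
% into the environment, the observer's local state becomes mixed and S grows.
```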

With this, of course, time and reality themselves become phenomena that depend on our perspective and are not fundamental properties of the universe itself. Philosophers call such concepts emergent. Whether this is really the case remains to be seen. There is no doubt, however, that entropy and information shape the world in which we live. And that they form a link between us and the world: they describe the connection between the fundamental universe and our perspective on it. We are, as it were, part of the universe and, with our perspective, determine how we experience the world. And:

The world is wonderful!

[1] Claus Kiefer: The Quantum Cosmos - From the Timeless World to the Expanding Universe, S. Fischer Verlag, Frankfurt am Main 2008.

[2] Erwin Schrödinger: What is Life?, Piper, Munich 1987.

[3] James Gleick: The Information, HarperCollins, London 2011.

[4] Sean Carroll: From Eternity to Here - The Quest for the Ultimate Theory of Time, Penguin Books, New York 2010.

Heinrich Päs is Professor of Theoretical Physics at TU Dortmund University and researches neutrinos, particle physics and cosmology. He has also tried his hand as a time-machine developer and as a philosopher, has written a book (“Neutrinos - The Perfect Wave”) and has inspired several science fiction novels. He was a postdoc in Hawaii. When he is not researching or reading, he enjoys nature, sailing, surfing, hiking, skiing or running. And he loves his wife Sara even more than the world.