Confusion about Entropy

The word entropy has entered popular culture. This may not be a good thing, because even scientists are confused about the definition of entropy. I was trained as a physicist, and I was confused about it for a long time. This was not entirely my fault; physics teachers are responsible for much of the confusion. I hope this post helps.

Main Reason for the Confusion

The main reason for the confusion is teachers’ failure to explain that all these fancy statements about entropy are valid only for isolated systems. Some scientists claim that the increasing entropy of the universe means the universe will end in a heat death. They assume the universe is an isolated system. I question that assumption.

Another reason for the confusion is that entropy has several definitions. In popular culture, entropy is understood as randomness or disorder. In science, however, there are multiple, more precise definitions.

Entropy as Dispersal of Energy

Let’s consider a glass of ice water brought into a room, and let’s assume the room is isolated from the rest of the house. Over time, the temperature of the glass and its contents and the temperature of the room become equal. At thermal equilibrium, the entropy of the system (the ice water and the room together) is at its maximum. For isolated systems, entropy never decreases. Once the isolated system reaches thermal equilibrium, you cannot extract any useful work from it. Useful work can be extracted only when there is an energy differential within the system.

When the system is far from equilibrium, its entropy is lower. When the isolated system reaches thermal equilibrium, the energy differentials are removed (energy is evenly dispersed). The quantitative measure of this degree of dispersal is known as entropy.
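
To make this concrete, here is a minimal numeric sketch (in Python, with made-up illustrative numbers) of the textbook relation ΔS = Q/T: when a small amount of heat flows from the warmer room into the colder ice water, the room loses some entropy, the ice water gains more, and the total entropy of the isolated system goes up.

    # Entropy change when heat Q flows from the warm room to the cold ice water.
    # Illustrative numbers only; both temperatures are treated as roughly constant
    # during the small heat transfer (a common textbook idealization).

    Q = 1000.0        # joules of heat transferred
    T_room = 293.0    # room temperature in kelvin (about 20 C)
    T_ice = 273.0     # ice-water temperature in kelvin (0 C)

    dS_room = -Q / T_room        # the room loses entropy
    dS_ice_water = +Q / T_ice    # the ice water gains entropy

    dS_total = dS_room + dS_ice_water
    print(f"Room:      {dS_room:+.3f} J/K")
    print(f"Ice water: {dS_ice_water:+.3f} J/K")
    print(f"Total:     {dS_total:+.3f} J/K  (positive: entropy increased)")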

Entropy as Disorder

In statistical mechanics, entropy is a measure of the number of ways in which a system may be arranged, that is, the number of microstates compatible with the same macroscopic state. This variability can be interpreted as disorder: the higher the variability, the higher the disorder.
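
As a toy illustration of “number of ways in which a system may be arranged”, the sketch below (Python, a hypothetical coin-flip system) counts the arrangements of 100 coins with a given number of heads and applies Boltzmann’s formula S = k_B ln W. The macrostate with the most arrangements (half heads, half tails) has the highest entropy, i.e. the most “disorder”.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, in J/K

    def multiplicity(n_coins, n_heads):
        """Number of distinct arrangements (microstates) with exactly n_heads heads."""
        return math.comb(n_coins, n_heads)

    def boltzmann_entropy(W):
        """S = k_B * ln(W)."""
        return k_B * math.log(W)

    n = 100
    for heads in (0, 25, 50):
        W = multiplicity(n, heads)
        print(f"{heads:3d} heads out of {n}: W = {W:.3e}, S = {boltzmann_entropy(W):.3e} J/K")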

Entropy as Uncertainty

Entropy is a measure of the intrinsic uncertainty of an isolated system. This intrinsic uncertainty is related to the “variability” concept mentioned above: if there are many ways to arrange the system, then you are less certain about its actual state.
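
In information-theoretic terms, if a system can be in any of W equally likely arrangements, the uncertainty about its actual state is log2(W) bits, so more arrangements means more uncertainty. A tiny Python sketch with illustrative numbers:

    import math

    # Uncertainty (in bits) about a system that can be in any of W equally likely states.
    for W in (2, 1024, 10**6):
        print(f"W = {W:>9d} arrangements -> uncertainty = log2(W) = {math.log2(W):6.2f} bits")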

Entropy as Randomness

Once again, the “variability” concept is the key. More variability means there are more ways to arrange the system, and in measurements this shows up as randomness. If there are many accessible states, each measurement may give a different result, and the randomness of the measurement results increases as the number of states increases.
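
One way to see the connection between the number of states and the randomness of measurements is to simulate it. The sketch below (Python, purely illustrative) draws repeated “measurements” from systems with more and more equally likely states; the Shannon entropy of the observed frequencies grows roughly like log2 of the number of states, so the outcomes become harder and harder to predict.

    import math
    import random
    from collections import Counter

    def empirical_entropy_bits(samples):
        """Shannon entropy (in bits) of the observed frequencies."""
        counts = Counter(samples)
        total = len(samples)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    random.seed(0)
    n_measurements = 10_000
    for n_states in (2, 4, 16, 64):
        samples = [random.randrange(n_states) for _ in range(n_measurements)]
        h = empirical_entropy_bits(samples)
        print(f"{n_states:3d} states: measured entropy ~ {h:.2f} bits (log2 = {math.log2(n_states):.2f})")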

Entropy as Information

The number of states is also related to the degrees of freedom. The information-holding capacity of a system increases as its degrees of freedom increase. You can turn this around and say that entropy is the amount of information required to describe the system: it gets harder to describe a system when the number of states is larger.

When we deal with random variables (signals, for example), we can say that the entropy of a random variable is a measure of the uncertainty of the random variable; it is a measure of the amount of information required, on average, to describe the random variable.
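
As a small sketch of that last statement (Python, made-up probabilities), the snippet below computes the Shannon entropy H(X) = −Σ p(x) log2 p(x) for a fair coin and a heavily biased coin. The biased coin is more predictable, so on average fewer bits per outcome are needed to describe it.

    import math

    def shannon_entropy_bits(probs):
        """H(X) = -sum(p * log2(p)): average number of bits needed to describe X."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    fair_coin = [0.5, 0.5]
    biased_coin = [0.9, 0.1]

    print(f"Fair coin:   H = {shannon_entropy_bits(fair_coin):.3f} bits per outcome")
    print(f"Biased coin: H = {shannon_entropy_bits(biased_coin):.3f} bits per outcome")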

Arrow of time

Isolated systems tend to progress in the direction of increasing entropy until they reach equilibrium. This process is irreversible, and the irreversibility is known as the “arrow of time”.
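
The toy simulation below (Python, a crude model, not a rigorous demonstration) illustrates this one-way tendency. Start with every particle in the left half of a box, let randomly chosen particles hop between the two halves, and track a Boltzmann-style entropy ln W, where W is the number of arrangements with the current left/right split. The entropy climbs toward its maximum at the 50/50 split and then merely fluctuates around it.

    import math
    import random

    random.seed(1)
    N = 1000       # particles in a two-compartment box
    n_left = N     # start "ordered": every particle on the left

    def entropy(n_left, N):
        """ln of the number of arrangements with n_left particles on the left."""
        return math.log(math.comb(N, n_left))

    for step in range(1, 5001):
        # Pick a random particle; if it is on the left, move it right, and vice versa.
        if random.randrange(N) < n_left:
            n_left -= 1
        else:
            n_left += 1
        if step % 1000 == 0:
            print(f"step {step:5d}: n_left = {n_left:4d}, entropy = {entropy(n_left, N):.1f}")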

Entropy and Aging

Biological systems are not isolated systems; we constantly interact with the environment. New matter and energy (food and drink) enter our bodies, and we extract useful energy from the environment all the time. Our bodies renew themselves constantly. Therefore, the common statements about entropy do not apply to biological systems. The aging process seems to be controlled by genetics; it is not really a consequence of entropy. I suppose we age because our bodies lose the ability to renew themselves. Many processes are involved, and entropy is only a small contributor to the aging process.

Relative Entropy

The so-called “relative entropy” (also known as the Kullback–Leibler divergence) is rather different from the definitions of entropy in physics and information theory.

  • Relative entropy is a measure of statistical distance. It tries to measure the “closeness” of two distributions (see the sketch after this list).
  • In a Bayesian context, relative entropy can be used to measure the “distance” between the prior and posterior distributions.
  • In the context of regression modeling, you minimize the relative entropy to find the best fit to the data.
  • In the context of derivative pricing, you minimize the relative entropy to come up with smooth volatility surfaces.
  • Some people use relative entropy to detect “surprises”. This is part of the Bayesian approach as well.
  • Relative entropy can be used in the context of “pattern matching”. This is why it is used in machine learning and artificial intelligence.
  • Relative entropy can be used as a “similarity” measure.
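
To make the “statistical distance” idea concrete, here is a small Python sketch (with made-up distributions) of relative entropy, D(P || Q) = Σ p log2(p/q). It is zero only when the two distributions match, grows as they diverge, and is not symmetric, which is why it is a distance-like measure rather than a true distance.

    import math

    def relative_entropy_bits(p, q):
        """D(P || Q) = sum(p * log2(p / q)), assuming q > 0 wherever p > 0."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    prior     = [0.25, 0.25, 0.25, 0.25]   # e.g. a flat prior over four outcomes
    posterior = [0.70, 0.10, 0.10, 0.10]   # e.g. a posterior after seeing some data

    print(f"D(posterior || prior) = {relative_entropy_bits(posterior, prior):.3f} bits")
    print(f"D(prior || posterior) = {relative_entropy_bits(prior, posterior):.3f} bits")
    print(f"D(prior || prior)     = {relative_entropy_bits(prior, prior):.3f} bits")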
