Entropy
- For other senses of the term, see entropy (disambiguation).

In thermodynamics, thermodynamic entropy (or simply entropy) is an important state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. Entropy is described from two complementary perspectives:
- In classical thermodynamics, entropy multiplied by a particular temperature can be understood as a measure of the amount of energy in a physical system at that temperature that cannot be used to do thermodynamic work, i.e., work mediated by thermal energy. More precisely, in any process where the system gives up energy ΔE, and its entropy falls by ΔS, a quantity at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat (TR is the temperature of the system's external surroundings). Otherwise the process will not go forward.
- In statistical thermodynamics, entropy is envisioned as a measure of the statistical "mixedupness" or "disorder" of the thermodynamic system, the amount of uncertainty that would remain about the exact microscopic state of the system, given a description of its macroscopic properties. It can be shown that this definition of entropy reproduces all of the properties of the entropy of classical thermodynamics.
An important law of physics, the second law of thermodynamics, states that the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value. Unlike almost all other laws of physics, this associates thermodynamics with a definite arrow of time.
History
The history of entropy begins with the work of the mathematician Lazare Carnot, who in his 1803 work Fundamental Principles of Equilibrium and Movement postulated that in any machine the accelerations and shocks of the moving parts all represent losses of "moment of activity". In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, in which he set forth the view that in all heat engines "caloric", or what is now known as heat, moves from hot to cold and that "some caloric is always lost". This lost caloric was a precursor of entropy loss as we now understand it. Though formulated in terms of caloric rather than entropy, this was an early insight into the second law of thermodynamics. In the 1850s, Rudolf Clausius gave this "lost caloric" a mathematical interpretation and called it entropy. Later, scientists such as Ludwig Boltzmann, Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis.
Overview
A typical example of entropy increase is the combustion of hydrogen with the release of energy. Here, energy is released from the hydrogen in the form of heat and of mechanical work, as the pressure of the hot gas forces it outward. While part of the released energy can be used to do work, part of it cannot: the heat produced in such combustion disperses into the surroundings and is no longer available to do work, and this dispersal corresponds to an increase in entropy. Taking the reacting gases and their surroundings together, the total entropy after combustion is greater than it was before.
Quantities of matter brought into contact tend to equalize their thermodynamic parameters, reducing the differences towards zero: pressure differences, density differences, and temperature differences all tend to even out. Entropy is a measure of how far this process of equalization has progressed, and it increases as the equalization advances. For example, the combined entropy of "a cup of hot water in a cool room" is less than the entropy of "the room and the water after the water has cooled (and warmed the room slightly)", because in the latter state the heat is more evenly distributed. The entropy of the room and the empty cup after the water has evaporated is higher still.
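The hot-water example can be checked with a rough calculation; the masses and temperatures below are illustrative assumptions only, not values taken from the article:

```python
import math

# Rough sketch: a cup of hot water cooling in a large, cool room.
m = 0.25          # kg of water (assumed)
c = 4186.0        # J/(kg*K), specific heat of liquid water
T_hot = 350.0     # K, initial water temperature (assumed)
T_room = 295.0    # K, room temperature (assumed constant: the room is large)

# Entropy change of the water as it cools from T_hot to T_room
dS_water = m * c * math.log(T_room / T_hot)

# Heat given to the room, absorbed at the (nearly constant) room temperature
Q = m * c * (T_hot - T_room)
dS_room = Q / T_room

print(f"dS_water = {dS_water:+.1f} J/K")            # negative: the water's entropy falls
print(f"dS_room  = {dS_room:+.1f} J/K")             # positive: the room's entropy rises
print(f"total    = {dS_water + dS_room:+.1f} J/K")  # net positive, as the second law requires
```

The water's entropy decreases, but the room gains more entropy than the water loses, so the combined entropy rises.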
Statistical mechanics explains entropy as the amount of uncertainty (or "mixedupness" in the phrase of Gibbs) which remains about a system after its observable macroscopic properties have been taken into account. For a given set of macroscopic quantities, such as temperature and volume, the entropy is a function of the probabilities with which the system occupies its various quantum states. The more states that are available to the system with appreciable probability, the greater the "disorder" and thus the greater the entropy.
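As a minimal sketch of this definition (the probabilities below are invented purely for illustration), the Gibbs entropy can be computed directly from the microstate probabilities:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i * ln p_i) over the microstate probabilities p_i."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Four microstates, all equally likely: maximal uncertainty for this macrostate.
uniform = [0.25, 0.25, 0.25, 0.25]
# Four microstates, but one dominates: much less uncertainty, lower entropy.
peaked = [0.97, 0.01, 0.01, 0.01]

print(gibbs_entropy(uniform))  # ~1.9e-23 J/K (equals k_B * ln 4)
print(gibbs_entropy(peaked))   # ~2.3e-24 J/K, noticeably smaller
```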
Here it is important to distinguish disorder in the context of entropy from disorder in everyday usage. In physics, "disorder" in this sense refers to a specific, well-defined quantity, whereas in everyday usage it is closer to disorganization. The two notions nevertheless line up: adding heat to a system, which increases its classical thermodynamic entropy, also increases the system's thermal fluctuations, thereby increasing our lack of information about the exact microscopic state of the system, i.e. its statistical mechanical entropy. This will be considered in more detail below.
The entropy of statistical mechanics can be regarded as a specific application of Shannon entropy, according to a viewpoint known as MaxEnt thermodynamics. Roughly speaking, Shannon entropy is proportional to the minimum number of yes/no questions that must be asked, on average, to determine the value of some unknown quantity. The statistical mechanical entropy is then proportional to the minimum number of yes/no questions that must be asked to determine the microstate, given knowledge of the macrostate.
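A small sketch of this "yes/no questions" picture, assuming for illustration a macrostate compatible with eight equally likely microstates:

```python
import math

def shannon_bits(probs):
    """Shannon entropy in bits: the average number of yes/no questions
    needed to pin down which outcome actually occurred."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumption for illustration: 8 equally likely microstates, so identifying
# the microstate takes log2(8) = 3 yes/no questions.
W = 8
probs = [1.0 / W] * W
H = shannon_bits(probs)        # 3.0 bits

# The statistical-mechanical entropy differs only by a constant factor:
K_B = 1.380649e-23             # J/K
S = K_B * math.log(2) * H      # equals k_B * ln(W)
print(H, S)
```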
Units and symbols
Entropy is a measure of disorder and a key physical variable in describing a thermodynamic system. The SI unit of entropy is the joule per kelvin (J·K⁻¹), which is also the unit of heat capacity, and entropy is said to be thermodynamically conjugate to temperature. Entropy depends only on the current state of the system, not on its detailed previous history; it is therefore a state function of the parameters, such as pressure and temperature, that describe the observable macroscopic properties of the system. Entropy is usually symbolized by the letter S.
There is an important connection between entropy and the amount of internal energy in the system which is not available to perform work. In any process where the system gives up an energy ΔE, and its entropy falls by ΔS, a quantity at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat. Otherwise the process will not go forward. (TR is the temperature of the system's external surroundings, which may not be the same as the system's current temperature T ).
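A worked example with invented numbers makes this bookkeeping concrete:

```python
# Illustrative numbers only (not from the article):
dE = 1000.0    # J, energy the system gives up
dS = 2.0       # J/K, decrease in the system's entropy
T_R = 300.0    # K, temperature of the surroundings

min_waste_heat = T_R * dS          # at least this much must leave as unusable heat
max_work = dE - min_waste_heat     # the rest is, at best, available as work

print(min_waste_heat, max_work)    # 600.0 J wasted, at most 400.0 J of work
```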
The second law
An important law of physics, the second law of thermodynamics, states that the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value; by implication, the entropy of the universe as a whole (i.e. a system together with its surroundings) tends to increase. We will consider the meaning of the second law further in a subsequent section. Two important consequences follow. First, heat cannot of itself pass from a colder to a hotter body: it is impossible to transfer heat from a cold to a hot reservoir without at the same time converting a certain amount of work into heat. Second, it is impossible for any device operating on a cycle to receive heat from a single reservoir and produce a net amount of work; it can produce useful work only if heat is at the same time transferred from a hot to a cold reservoir, as the sketch below illustrates. This rules out an isolated 'perpetuum mobile'. It also follows that reducing the entropy increase accompanying a given process, such as a chemical reaction, makes the process energetically more efficient.
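A minimal numerical sketch of this "two reservoirs" requirement, with purely illustrative temperatures and heats (none of the values are from the article):

```python
# Sketch: why a cyclic engine needs both a hot and a cold reservoir.
# Over one full cycle the engine returns to its initial state (its own entropy
# change is zero), so the second law requires the reservoirs' total entropy
# not to decrease:  Q_c / T_c - Q_h / T_h >= 0,  i.e.  Q_c >= Q_h * T_c / T_h.
T_h, T_c = 600.0, 300.0   # K, assumed reservoir temperatures
Q_h = 1000.0              # J, heat drawn from the hot reservoir per cycle

Q_c_min = Q_h * T_c / T_h      # least heat that must be rejected to the cold reservoir
W_max = Q_h - Q_c_min          # best possible work per cycle
efficiency = W_max / Q_h       # equals 1 - T_c/T_h (the Carnot limit)

print(Q_c_min, W_max, efficiency)   # 500.0 J rejected, 500.0 J of work, 0.5
```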
Thermodynamic definition
In the early 1850s, Rudolf Clausius put the concept of "energy turned to waste" on a differential footing. Essentially, he set forth the concept of the thermodynamic system and advanced the argument that in any irreversible process a small amount of heat energy dQ is incrementally dissipated across the system boundary. In 1876, the mathematical physicist Willard Gibbs, building on the work of Clausius and Hermann von Helmholtz, set forth the view that the "available energy" ΔG of a thermodynamic system could be accounted for mathematically by subtracting the "energy loss" TΔS from the total energy change of the system ΔH. These concepts were further developed by James Clerk Maxwell (1871) and Max Planck (1903).
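In modern notation these two relations are usually written as follows (δQ_rev denotes heat transferred reversibly; the Gibbs relation holds for a process at constant temperature and pressure):

```latex
% Clausius's incremental definition of entropy, and the Gibbs relation
% for the "available energy" described above:
\[
  \mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  \Delta G = \Delta H - T\,\Delta S .
\]
```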
Statistical interpretation
In 1877, Ludwig Boltzmann formulated a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the logarithm of the number of microstates such a gas could occupy. Henceforth, the essential problem of statistical thermodynamics has been, according to Erwin Schrödinger, to determine the distribution of a given amount of energy E over N identical systems.
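Boltzmann's relation, in the notation now standard (W is the number of microstates compatible with the macrostate and k_B is Boltzmann's constant):

```latex
% Entropy is proportional to the logarithm of the number of accessible microstates.
\[
  S = k_{\mathrm{B}} \ln W
\]
```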
The arrow of time
Entropy is the only quantity in the physical sciences that "picks" a particular direction for time, sometimes called an arrow of time. As we go "forward" in time, the Second Law of Thermodynamics tells us that the entropy of an isolated system can only increase or remain the same; it cannot decrease. Hence, from one perspective, entropy measurement is thought of as a kind of clock.
Entropy and cosmology
We have previously mentioned that the universe may be considered an isolated system. As such, it may be subject to the Second Law of Thermodynamics, so that its total entropy is constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source.
If the universe can be considered to have generally increasing entropy, then, as Roger Penrose has pointed out, gravity plays an important role in that increase: it causes dispersed matter to accumulate into stars, which eventually collapse into black holes. Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size, which makes them likely end points of all entropy-increasing processes.
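The result usually quoted is the Bekenstein–Hawking entropy, proportional to the area A of the black hole's event horizon:

```latex
% Bekenstein-Hawking entropy of a black hole with event-horizon area A.
\[
  S_{\mathrm{BH}} = \frac{k_{\mathrm{B}}\, c^{3} A}{4 \hbar G}
\]
```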
The role of entropy in cosmology remains a controversial subject. Recent work has cast extensive doubt on the heat death hypothesis and the applicability of any simple thermodynamic model to the universe in general. Although entropy does increase in an expanding universe, the maximum possible entropy rises much more rapidly and leads to an "entropy gap," thus pushing the system further away from equilibrium with each time increment. Complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.
Entropy in fiction
- Martin Amis's Time's Arrow, a novel written in reverse.
- Isaac Asimov's "The Last Question," a short science fiction story about entropy
- Thomas Pynchon, an American author who deals with entropy in many of his novels
- Diane Duane's Young Wizards series, in which the protagonists' ultimate goal is to slow down entropy and delay heat death.
- Gravity Dreams by L.L. Modesitt Jr.
- "The Arrow of Time," an essay by journalist K.C. Cole, takes the physical law of entropy and metaphorically applies it to real life.
- Jeremy Rifkin's Entropy: A New World View, a notorious misinterpretation of entropy [1]
- The Planescape setting for Dungeons & Dragons includes the Doomguard faction, who worship entropy.
- Arcadia, a play by Tom Stoppard, explores entropy, the arrow of time, and heat death.
- Stargate SG-1 and Atlantis, science-fiction television shows where a ZPM (Zero Point Module) is depleted when it reaches maximum entropy
- In DC Comics's series Zero Hour, entropy plays a central role in the continuity of the universe.
- "Time's Arrow," a two-part episode of Star Trek: The Next Generation
- H.G. Wells's The Time Machine, whose theme draws on entropy: instead of continuing to evolve, humanity has devolved into two separate species.
References
- Fermi, Enrico (1937). Thermodynamics. Prentice Hall. ISBN 048660361X.
- Kittel, Charles; Kroemer, Herbert (1980). Thermal Physics (2nd ed.). W. H. Freeman. ISBN 0716710889.
suggested) (help) - Penrose, Roger (2005). The Road to Reality : A Complete Guide to the Laws of the Universe. ISBN 0679454438.
- Reif, F. (1965). Fundamentals of Statistical and Thermal Physics. McGraw-Hill. ISBN 0070518009.
- Goldstein, Martin; Goldstein, Inge F. (1993). The Refrigerator and the Universe. Harvard University Press. ISBN 0674753259.
External links
- Entropy is Simple...If You Avoid the Briar Patches
- Dictionary of the History of Ideas:Entropy
- Entropy Is Simple, Qualitatively article by Frank Lambert on http://EntropySite.com/
- Entropy and the second law of thermodynamics Molecular approach to entropy.
- Entropy for students in general chemistry Simple approach to the second law, in Q and A form.
- Entropy as the Capacity Factor for Thermal Energy that is Hidden with Respect to Temperature