Introduction to entropy
From Wikipedia, the free encyclopedia
- This article provides an accessible introduction; for a more technical approach, see entropy.
The concept of thermodynamic entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. In a general sense the second law says that temperature differences between systems in contact with each other tend to even out, and that work can be obtained from these non-equilibrium differences, but that some energy is lost as heat, measured as an increase in entropy, when work is done. The concept of energy is central to the first law of thermodynamics, which deals with the conservation of energy: under the first law, heat transferred to a thermodynamic system increases its internal energy. Thermodynamic entropy provides a comparative measure of the amount of this increase in internal energy at a given temperature. A simpler, more concrete visualisation of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of spontaneous process: how much energy has flowed, or how widely it has become spread out, at a specific temperature.
Entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. Information entropy takes the mathematical concepts of statistical thermodynamics into areas of probability theory unconnected with heat and energy.
Origins and uses
Originally, entropy was named to describe the "waste heat", or more accurately energy losses, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work. Later, the term came to acquire several additional descriptions as more came to be understood about the behavior of molecules on the microscopic level. In the late 19th century the word "disorder" was used by Ludwig Boltzmann in developing statistical views of entropy using probability theory to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by Werner Heisenberg and those who followed. Descriptions of thermodynamic (heat) entropy on the microscopic level are found in statistical thermodynamics and statistical mechanics.
For most of the 20th century textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the motional energy of molecules. More recently there has been a trend in chemistry and physics textbooks to describe entropy in terms of "dispersal of energy". Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.
The mathematics developed in statistical thermodynamics was found to be applicable in other disciplines. In particular, the information sciences developed the concept of information entropy, in which a constant replaces the temperature that is inherent in thermodynamic entropy.
Heat and entropy
At a microscopic level motional energy of molecules is responsible for the temperature of a substance or a system. “Heat” is the motional energy of molecules being transferred: when motional energy is transferred from hotter surroundings to a cooler system, faster moving molecules in the surroundings collide with the walls of the system and some of their energy gets to the molecules of the system and makes them move faster.
- (Molecules in a gas like nitrogen at room temperature are, at any instant, moving at an average speed of nearly a thousand miles an hour, constantly colliding and therefore exchanging energy, so that their individual speeds are always changing: a molecule may even be motionless for an instant, if two molecules with exactly the same speed collide head-on, before another molecule hits them and they race off at speeds as high as 2500 miles an hour. At higher temperatures average speeds increase and motional energy becomes considerably greater.)
- Thus motional molecular energy (‘heat energy’) from hotter surroundings, like faster-moving molecules in a flame or violently vibrating iron atoms in a hot plate, will melt or boil a substance (the system) at the temperature of its melting or boiling point. The amount of motional energy from the surroundings that is required for melting or boiling is called the phase change energy, specifically the enthalpy of fusion or of vaporization, respectively. This phase change energy breaks bonds between the molecules in the system (not chemical bonds inside the molecules that hold the atoms together) rather than contributing to the motional energy and making the molecules move any faster – so it doesn’t raise the temperature, but instead enables the molecules to break free to move as a liquid or as a vapor.
- In terms of energy, when a solid becomes a liquid or a liquid a vapor, motional energy coming from the surroundings is changed to ‘potential energy’ in the substance (phase change energy, which is released back to the surroundings when the surroundings become cooler than the substance's boiling or melting temperature, respectively). Phase change energy increases the entropy of a substance or system because it is energy that must be spread out in the system from the surroundings so that the substance can exist as a liquid or vapor at a temperature above its melting or boiling point. When this process occurs in a ‘universe’ that consists of the surroundings plus the system, the total energy of the universe becomes more dispersed or spread out as part of the greater energy that was only in the hotter surroundings transfers so that some is in the cooler system. This energy dispersal increases the entropy of the 'universe'.
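The molecular speeds quoted above for room-temperature nitrogen can be checked against the kinetic theory of gases. A minimal sketch using the Maxwell–Boltzmann mean-speed formula (the choice of 293 K and the m/s-to-mph conversion are illustrative assumptions, not from the article):

```python
import math

R = 8.314       # molar gas constant, J/(mol*K)
M = 0.0280      # molar mass of N2, kg/mol

def mean_speed(T):
    """Maxwell-Boltzmann mean speed, sqrt(8RT / (pi*M)), in m/s."""
    return math.sqrt(8 * R * T / (math.pi * M))

v = mean_speed(293.0)               # room temperature, about 20 C
print(round(v), "m/s")
print(round(v * 2.23694), "mph")    # roughly a thousand miles an hour
```

The result, on the order of 470 m/s (about 1050 mph), agrees with the "nearly a thousand miles an hour" figure in the text.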
The important overall principle is: “Energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy (or better, entropy change) is the quantitative measure of that kind of spontaneous process: how much energy has been transferred/T, or how widely it has become spread out, at a specific temperature.”
Classical calculation of entropy
When entropy was first defined and used in 1865 the very existence of atoms was still controversial and there was no concept that temperature was due to the motional energy of molecules or that “heat” was actually the transferring of that motional molecular energy from one place to another. Entropy change, ΔS, was described in macro terms that could be measured, such as volume or temperature or pressure. The 1865 equation, which is still completely valid, is that ΔS = q (rev)/T. This can be expanded, part by part, in modern terms of how molecules are responsible for what is happening. Here is that equation expanded:
- ΔS = the entropy of a system (i.e., of a substance or a group of substances), after some motional energy (“heat”) has been transferred to it by fast moving molecules, minus the entropy of that system before any such energy was transferred to it. So, ΔS = S (final) – S (initial).
- Then, ΔS = S (final) – S (initial) = q, the motional energy (“heat”) that is transferred "reversibly" (rev) to the system from the surroundings (or from another system in contact with the first system), divided by T, the absolute temperature at which the transfer occurs = q (rev) / T.
- “Reversible” or “reversibly” (rev) simply means that T, the temperature of the system, has to stay (almost) exactly the same while any energy is being transferred to or from it. That’s easy in the case of phase changes, where the system absolutely must stay in the solid or liquid form until enough energy is given to it to break bonds between the molecules before it can change to a liquid or a gas. For example, in the melting of ice at 273.0 K, no matter what temperature the surroundings are – from 273.1 K to 500 K or even higher – the temperature of the ice will stay at 273.0 K until the last molecules in the ice are changed to liquid water, i.e., until all the hydrogen bonds between the water molecules in ice are broken and new, less-exactly fixed hydrogen bonds between liquid water molecules are formed. The amount of energy necessary for ice melting has been found to be 6008 joules per mole at 273 K. Therefore, the entropy change per mole is q(rev)/T = 6008 J/273 K, or 22 J/K.
- When the temperature isn't at the melting or boiling point of a substance no intermolecular bond-breaking is possible, and so any motional molecular energy (“heat”) from the surroundings transferred to a system raises its temperature, making its molecules move faster and faster. As the temperature is constantly rising, there is no longer a particular value of “T” at which energy is transferred. However, a "reversible" energy transfer can be measured at a very small temperature increase, and a cumulative total can be found by adding each of many small temperature intervals or increments. For example, to find the entropy change (q(rev)/T) from 300 K to 310 K, measure the amount of energy transferred at dozens or hundreds of temperature increments, say from 300.00 K to 300.01 K and then 300.01 K to 300.02 K and so on, dividing the q by each T, and finally adding them all.
- Calculus can be used to make this calculation easier if the effect of energy input to the system is linearly dependent on the temperature change, as in simple heating of a system at moderate to relatively high temperatures. Thus, the energy being transferred “per incremental change in temperature” (the heat capacity, Cp), multiplied by the integral of dT/T from T(initial) to T(final), gives ΔS = Cp ln(T(final)/T(initial)).
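Both calculations above can be sketched numerically: the phase-change case uses the article's own figures for melting ice, and the heating case shows the many-small-increments sum converging to the Cp ln(T(final)/T(initial)) closed form. The heat capacity value for liquid water is an assumption introduced here for illustration:

```python
import math

# Entropy of melting ice (phase change at constant T), as in the text:
q_fusion = 6008.0   # J/mol, enthalpy of fusion of ice
T_melt = 273.0      # K
dS_melt = q_fusion / T_melt
print(round(dS_melt), "J/(mol K)")   # about 22, matching the article

# Entropy of heating (no phase change): sum q/T over many small steps.
# Assumes a constant heat capacity Cp (~75.3 J/(mol K) for liquid water).
Cp = 75.3
T_i, T_f, steps = 300.0, 310.0, 1000
dT = (T_f - T_i) / steps
dS_sum = sum(Cp * dT / (T_i + (k + 0.5) * dT) for k in range(steps))

# Closed form from integrating Cp dT / T:
dS_exact = Cp * math.log(T_f / T_i)
print(round(dS_sum, 4), round(dS_exact, 4))  # the sum converges to Cp ln(Tf/Ti)
```

With 1000 increments the summed value agrees with the integral to four decimal places, which is the point of the incremental argument in the text.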
Technical descriptions of entropy
Traditionally, 20th century chemistry textbooks have described entropy as "a measurement of the disorder or randomness of a system".[citation needed] Modern chemistry textbooks have increasingly tended to say: "Entropy measures the spontaneous dispersal of energy... — at a specific temperature."[citation needed]
In thermodynamics, entropy is one of the three basic thermodynamic potentials U (internal energy), S (entropy) and F (Helmholtz free energy).
Entropy increase as energy dispersal
Where heat entropy involves a heat source, there is a continuous flow of that energy through the surrounding physical structure or space. From a technical standpoint, entropy is defined as a measure of the heat energy in a given place or in a defined space, at a given instant in time (identified as the quantity S). A change in this measurement is called ΔS. Where a heat-generating source (one that produces heat in excess of the rest of the defined space) is not involved, entropy within a defined space involves a movement of existing energy and/or particles towards a steady state, known as equilibrium.
The description of entropy as amounts of "mixedupness" or "disorder" and the abstract nature of statistical mechanics can lead to confusion and considerable difficulty for students beginning the subject.[1][2] An approach to instruction emphasising the qualitative simplicity of entropy has been developed.[3] In this approach, entropy increase is described as the spontaneous dispersal of energy: how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature. A cup of hot coffee in a room will eventually cool, and the room will become a bit warmer. The higher amount of heat energy in the hot coffee has been dispersed to the entire room, and the net entropy of the system (coffee and room) will increase as a result.
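The coffee-cup example can be made quantitative with the q/T definition from the previous section, applied to a small parcel of heat leaving the hot coffee and entering the cooler room. The temperatures and heat quantity here are illustrative assumptions, not figures from the article:

```python
# Entropy bookkeeping for the cooling coffee: one small parcel of heat q
# leaves the coffee and enters the room (temperatures taken as roughly
# constant over this small transfer).
q = 100.0         # J of heat transferred
T_coffee = 350.0  # K, hot coffee
T_room = 295.0    # K, room

dS_coffee = -q / T_coffee   # the coffee loses entropy
dS_room = +q / T_room       # the room gains more entropy than the coffee lost
dS_total = dS_coffee + dS_room
print(round(dS_total, 4), "J/K")  # positive: net entropy increases
```

Because the same q is divided by a smaller T on the receiving side, the total is always positive whenever heat flows from hot to cold, which is the second law in miniature.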
In this approach the statistical interpretation is related to quantum mechanics, and the generalization is made for molecular thermodynamics that "Entropy measures the energy dispersal for a system by the number of accessible microstates, the number of arrangements (each containing the total system energy) for which the molecules' quantized energy can be distributed, and in one of which – at a given instant – the system exists prior to changing to another."[4] On this basis the claim is made that in all everyday spontaneous physical happenings and chemical reactions, some type of energy flows from being localized or concentrated to becoming spread out — often to a larger space, always to a state with a greater number of microstates.[3] Thus in situations such as the entropy of mixing it is not strictly correct to visualise a spatial dispersion of energy; rather, the molecules' motional energy of each constituent is actually spread out in the sense of having the chance of being, at one instant, in any one of many more microstates in the larger mixture than each had before the process of mixing.[4]
The subject remains subtle and difficult, and in complex cases the qualitative relation of energy dispersal to entropy change can be so inextricably obscured that it is moot. However, it is claimed that this does not weaken its explanatory power for beginning students.[3] In all cases, however, the statistical interpretation will hold: entropy increases as the system moves from a macrostate which is very unlikely (for a system in equilibrium) to a macrostate which is much more probable.
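The statistical statement above can be illustrated with a toy microstate count. The sketch below uses a small Einstein-solid-style model (the oscillator and quanta numbers are arbitrary assumptions introduced here): distributing q indistinguishable energy quanta among N oscillators gives C(q + N − 1, q) microstates, and the macrostate with the most microstates is the one where energy is spread evenly.

```python
from math import comb

def microstates(q, N):
    """Ways to distribute q energy quanta among N oscillators
    (stars-and-bars count: C(q + N - 1, q))."""
    return comb(q + N - 1, q)

# Two small 'solids' of 10 oscillators each, sharing 20 quanta in total.
N = 10
total_q = 20
for qA in (0, 5, 10, 15, 20):
    W = microstates(qA, N) * microstates(total_q - qA, N)
    print(qA, W)
# The product of microstate counts peaks when the energy is spread
# evenly (qA = 10): the most probable macrostate.
```

Even at this tiny scale the even split has several times more microstates than the lopsided ones; for macroscopic numbers of molecules the preference becomes overwhelming, which is why the system is effectively never seen moving back to the unlikely macrostate.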
Notes and references
- ^ Frank L. Lambert, Disorder — A Cracked Crutch For Supporting Entropy Discussions
- ^ Frank L. Lambert, The Second Law of Thermodynamics (6)
- ^ a b c Frank L. Lambert, Entropy Is Simple, Qualitatively
- ^ a b Frank L. Lambert, Entropy and the second law of thermodynamics