Entropy of mixing
From Wikipedia, the free encyclopedia
The entropy of mixing (also known as configurational entropy) is the change in entropy, an extensive thermodynamic quantity, when two different chemical substances or components are mixed. This entropy change must be positive, since there is more uncertainty about the spatial locations of the different kinds of molecules when they are interspersed. We assume that the mixing process has reached thermodynamic equilibrium, so that the mixture is uniform and homogeneous. If the substances being mixed are initially at different temperatures and pressures, there will, of course, be an additional entropy increase in the mixed substance due to these differences being equilibrated. But if the substances being mixed are initially at the same temperature and pressure, the entropy increase will be entirely due to the entropy of mixing.
The entropy of mixing may be calculated by Gibbs' theorem, which states that when two different substances mix, the entropy increase upon mixing is equal to the entropy increase that would occur if the two substances were to expand alone into the mixing volume. (In this sense, the term "entropy of mixing" is a misnomer, since the entropy increase is not due to any "mixing" effect.) Nevertheless, the two substances must be different for the entropy of mixing to exist. This is the mixing paradox: if the two substances are identical, there is no entropy change, yet the slightest detectable difference between the two yields a considerable entropy change, namely the entropy of mixing. In other words, the entropy of mixing is not a continuous function of the degree of difference between the two substances.
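Gibbs' theorem can be checked numerically for ideal gases. The sketch below (with illustrative mole numbers) computes the entropy increase from each gas expanding alone into the total volume, n_i R ln(n/n_i), and confirms it equals the entropy of mixing computed from mole fractions; at equal temperature and pressure each initial volume is proportional to its mole number.

```python
from math import log, isclose

R = 8.314  # gas constant, J/(mol K)

# Two ideal-gas samples at the same temperature and pressure,
# so each initial volume is proportional to its mole number.
n1, n2 = 1.0, 3.0  # illustrative mole numbers
n = n1 + n2

# Gibbs' theorem: each gas expands alone from its own volume into
# the total volume; at equal T and P, V_i / V_total = n_i / n.
dS_expansion = n1 * R * log(n / n1) + n2 * R * log(n / n2)

# Entropy of mixing written in terms of mole fractions.
x1, x2 = n1 / n, n2 / n
dS_mixing = -R * (n1 * log(x1) + n2 * log(x2))

assert isclose(dS_expansion, dS_mixing)
print(dS_mixing)  # positive, as required for mixing
```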
Liquids
Assume that the molecules of two different substances are approximately the same size, and regard space as subdivided into a square lattice whose cells are the size of the molecules. (In fact, any lattice would do, including close packing.) This is a crystal-like conceptual model to identify the molecular centers of mass. If the two phases are liquids, there is no spatial uncertainty in each one individually. Everywhere we look in component 1, there is a molecule present, and likewise for component 2. After they are intermingled (assuming they are miscible), the liquid is still dense with molecules, but now there is uncertainty about what kind of molecule is in which location. Of course, any idea of identifying molecules in given locations is a thought experiment, not something one could do, but the calculation of the uncertainty is well-defined.
We can use Boltzmann's equation for the entropy change as applied to the mixing process

ΔS_mix = k_B ln Ω

where k_B is Boltzmann's constant. We then calculate the number of ways Ω of arranging N_1 molecules of component 1 and N_2 molecules of component 2 on a lattice, where

N = N_1 + N_2

is the total number of molecules, and therefore the number of lattice sites. Calculating the number of permutations of N objects, correcting for the fact that N_1 of them are identical to one another, and likewise for N_2,

Ω = N! / (N_1! N_2!)

After applying Stirling's approximation, the result is

ΔS_mix = -k_B [N_1 ln(N_1/N) + N_2 ln(N_2/N)]
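The quality of Stirling's approximation here can be checked directly. The sketch below (with illustrative particle numbers) compares the exact ln Ω, computed via the log-gamma function to avoid astronomically large factorials, against the Stirling form:

```python
from math import lgamma, log

def ln_omega_exact(N1, N2):
    # ln[N! / (N1! N2!)] via log-gamma, since lgamma(n + 1) = ln(n!)
    N = N1 + N2
    return lgamma(N + 1) - lgamma(N1 + 1) - lgamma(N2 + 1)

def ln_omega_stirling(N1, N2):
    # Stirling's approximation: -N1 ln(N1/N) - N2 ln(N2/N)
    N = N1 + N2
    return -N1 * log(N1 / N) - N2 * log(N2 / N)

N1, N2 = 10**6, 2 * 10**6  # illustrative particle numbers
exact = ln_omega_exact(N1, N2)
approx = ln_omega_stirling(N1, N2)
# the relative error shrinks as the particle numbers grow
print(exact, approx, abs(exact - approx) / exact)
```

For thermodynamic particle numbers (of order 10^23), the relative error is utterly negligible.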
This expression can be generalized to a mixture of r components, with

ΔS_mix = -k_B N Σ_i x_i ln x_i

where we have introduced the mole fractions x_i = N_i/N, which are also the probabilities of finding any particular component in a given lattice site.
A more direct and logically transparent derivation, not requiring Stirling's approximation, is to start with the Shannon entropy, or compositional uncertainty,

H = -Σ_i x_i ln x_i

The summation is over the various chemical species, so this is the uncertainty about which kind of molecule is in any one site. It must be multiplied by the number of sites N to get the uncertainty for the whole system. The entropy of mixing from above can be rearranged as

ΔS_mix = k_B N (-Σ_i x_i ln x_i)

The equivalence of the two follows immediately.
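The per-site compositional uncertainty is easy to compute for any composition; the helper below is an illustrative sketch, not taken from the article. For an equimolar binary mixture the uncertainty per site is ln 2, and for an equimolar ternary mixture it is ln 3:

```python
from math import log, isclose

def mixing_entropy_per_site(fractions):
    # -sum x_i ln x_i : compositional uncertainty per lattice site,
    # in units of k_B; pure components (x = 1) give zero
    assert isclose(sum(fractions), 1.0)
    return -sum(x * log(x) for x in fractions if x > 0)

# equimolar binary mixture: uncertainty ln 2 per site
H2 = mixing_entropy_per_site([0.5, 0.5])
# equimolar ternary mixture: uncertainty ln 3 per site
H3 = mixing_entropy_per_site([1/3, 1/3, 1/3])
assert isclose(H2, log(2)) and isclose(H3, log(3))
```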
Reverting to two components, we obtain

ΔS_mix = -nR (x_1 ln x_1 + x_2 ln x_2) = -R (n_1 ln x_1 + n_2 ln x_2)

where R is the gas constant, equal to k_B times Avogadro's number, n_1 and n_2 are the numbers of moles of the components, and n = n_1 + n_2 is the total number of moles. Since the mole fractions are necessarily less than one, the values of the logarithms are negative. The minus sign reverses this, giving a positive entropy of mixing, as expected.
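As a worked example of the two-component formula (with illustrative compositions), the molar entropy of mixing is positive for every composition, symmetric about the equimolar point, and maximal there with the value R ln 2:

```python
from math import log

R = 8.314  # gas constant, J/(mol K)

def molar_mixing_entropy(x1):
    # -R (x1 ln x1 + x2 ln x2) per mole of binary mixture
    x2 = 1.0 - x1
    return -R * sum(x * log(x) for x in (x1, x2) if x > 0)

# positive for any real mixture, largest for the equimolar case
for x in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"x1 = {x:4.2f}  dS = {molar_mixing_entropy(x):5.2f} J/(mol K)")
```

The maximum, R ln 2 ≈ 5.76 J/(mol K), is the familiar entropy of mixing for an equimolar binary mixture.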
Solutions
If the solute is a crystalline solid, the argument is much the same. A crystal has no spatial uncertainty at all, except for crystallographic defects, and a (perfect) crystal allows us to localize the molecules using the crystal symmetry group. The fact that volumes do not add when dissolving a solid in a liquid is not important for condensed phases. If the solute is not crystalline, we can still use a spatial lattice, as good an approximation for an amorphous solid as it is for a liquid.
The Flory-Huggins solution theory provides the entropy of mixing for polymer solutions, in which the macromolecules are huge compared to the solvent molecules. In this case, the assumption is made that each monomer subunit in the polymer chain occupies a lattice site.
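The Flory-Huggins combinatorial entropy replaces mole fractions by volume fractions. The sketch below assumes each chain occupies m lattice segments (the function name and numbers are illustrative); with m = 1 it reduces to the ideal mole-fraction result:

```python
from math import log

R = 8.314  # gas constant, J/(mol K)

def flory_huggins_dS(n_solvent, n_polymer, m):
    # Flory-Huggins combinatorial entropy of mixing,
    # -R (n_solvent ln phi_s + n_polymer ln phi_p),
    # with m the number of lattice segments per polymer chain
    sites = n_solvent + m * n_polymer
    phi_s = n_solvent / sites       # solvent volume fraction
    phi_p = m * n_polymer / sites   # polymer volume fraction
    return -R * (n_solvent * log(phi_s) + n_polymer * log(phi_p))

# m = 1 recovers the small-molecule (mole-fraction) formula
ideal = flory_huggins_dS(1.0, 1.0, 1)   # equals 2 R ln 2
# a long chain (m = 1000) gains far less entropy per unit of polymer
long_chain = flory_huggins_dS(1.0, 1.0, 1000)
print(ideal, long_chain)
```

The per-site entropy is much smaller for long chains because the monomers are constrained to be connected, which is why polymers are comparatively hard to dissolve.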
Note that solids in contact with each other also slowly interdiffuse, and solid mixtures of two or more components may be made at will (alloys, semiconductors, etc.). Again, the same equations for the entropy of mixing apply, but only for homogeneous, uniform phases.
Gases
In gases there is a lot more spatial uncertainty because most of their volume is merely empty space. We can regard the mixing process as simply conjoining the two containers. The two lattices which allow us to conceptually localize molecular centers of mass also join. The total number of empty cells is the sum of the numbers of empty cells in the two components prior to mixing. Consequently, that part of the spatial uncertainty concerning whether any molecule is present in a lattice cell is the sum of the initial values, and does not increase upon mixing.
Almost everywhere we look, we find empty lattice cells. But we do find molecules in those few cells which are occupied. For each one, there is a contingent uncertainty about which kind of molecule it is. Using conditional probabilities, it turns out that the analytical problem for the small subset of occupied cells is exactly the same as for mixed liquids, and the increase in the entropy, or spatial uncertainty, has exactly the same form as obtained previously. Obviously the subset of occupied cells is not the same at different times. But only when an occupied cell is found do we ask which kind of molecule is there.
See also: Gibbs Paradox, in which it would seem that mixing two samples of the same gas would produce entropy.
Notes
1. This is, of course, an approximation. Liquids have a “free volume”, which is why they are (usually) less dense than solids.
2. Claude Shannon introduced this expression for use in information theory, but similar formulas can be found as far back as the work of Ludwig Boltzmann and J. Willard Gibbs. Shannon uncertainty is completely unrelated to the Heisenberg uncertainty principle in quantum mechanics.