Talk:Introduction to entropy
The first paragraph
Feel free to revise the following!! I think the present version (9 am PST, 26 October) is good, but too dense, too many ideas per paragraph. Maybe this is a useful small-step-wise start: (FLL)
The concept of entropy is central to the second law of thermodynamics. A modern version of the second law is "Energy of all types spontaneously changes from being localized to becoming more dispersed or spread out, if it is not hindered". A simple example would be a hotter bar of iron touching a cooler bar. The energy of the hotter bar always spreads out to the cooler one until both are at the same temperature. Then, entropy (or better, entropy change, ΔS) is the quantitative measure of how much energy, q, has been dispersed in a process, divided by the temperature, T, at which it occurs. This yields the fundamental equation for entropy: ΔS = q(reversible)/T.
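A worked illustration may help here (the numbers are invented for the example, not taken from any measurement): if a small quantity of energy q = 100 J leaves a bar at 400 K and enters a bar at 300 K, the total entropy change is

\Delta S_{\text{total}} = \frac{q}{T_{\text{cold}}} - \frac{q}{T_{\text{hot}}} = \frac{100\ \text{J}}{300\ \text{K}} - \frac{100\ \text{J}}{400\ \text{K}} \approx +0.083\ \text{J/K},

which is positive, as the second law requires for a spontaneous process.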
(That "reversible" means that the process should be carried out at a temperature where each bar is almost the same -- the hotter only slightly warmer than the cooler so the energy would flow in reverse if there were only a slight change in the relative temperatures. When the temperature of one bar is considerably hotter than the other, as is most often the practical case, we can use calculus to 'paper-correct' the temperature difference. Our calculations simulate a series of reversible steps by treating the entropy change as though it occurred in tiny jumps or increments of temperature.)
There are two principal types of process involving energy change for which entropy measurement is especially valuable: heating (i.e., the transfer, or spreading out, of energy from hotter to cooler, raising the temperature of 'the cooler'), and allowing the internal energy of a substance to become more spread out by giving it more volume -- as in letting a gas expand or in mixing fluids (gases or liquids). A quantitative sketch of the second type follows below. Chemical reactions involve both types, owing to the release of energy (i.e., its dispersal to the surroundings) when new, stronger bonds are formed as old ones are cleaved. (Endothermic reactions are the opposite: energy flows from the surroundings to become more dispersed in a system because that energy, in breaking bonds in the system, yields more particles that can more widely spread out that energy.)
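For the second type, a standard textbook result (not part of the draft above, but consistent with it) quantifies the spreading-out on expansion: for n moles of an ideal gas expanding isothermally and reversibly from volume V1 to V2,

\Delta S = nR \ln\frac{V_2}{V_1},

so that, for example, 1 mol of gas doubling its volume gains ΔS = R ln 2 ≈ 5.8 J/K.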
The entropy of any substance at a given temperature, T, is given in tables of standard entropy as so many joules per kelvin (J/K) at 298 K. Actually, that tabulated value is a sum, and thus a ΔS, of very many individual measurements or calculations of q/T over very small ΔT changes from 0 K to T (see the formulation below). Thus, the number for the entropy in joules/K is not the total amount of joules of internal energy in the substance (because the many increments of energy have each been divided by their T values). However, the standard entropy, listed in tables at 298 K, is a useful rough comparison of the internal energy of every substance (a possible graphite vs. diamond example is sketched below). That is significant because it means a non-exact but qualitatively useful comparative value for the energy which a substance must have to exist at 298 K. (The phrase "energy which a substance must have to exist at 298 K" is the qualitative meaning of "energy that is unable to do work" or "energy that is unavailable for thermodynamic work". Of course, the internal energy of a substance A can be transferred to another that is colder, but then substance A can no longer exist at its initial temperature because its entropy has also decreased.) [Posted 26 Oct; revised 28 Oct FrankLambert 03:41, 28 October 2006 (UTC) ]
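In symbols (a standard formulation consistent with the description above), the tabulated value is

S^\circ(298\ \text{K}) \;=\; \int_0^{298\ \text{K}} \frac{C_p(T)}{T}\,dT \;+\; \sum_{\text{phase changes}} \frac{\Delta H_{\text{trans}}}{T_{\text{trans}}}.

And a possible worked version of the graphite vs. diamond example (values from standard tables, quoted here from memory and worth checking): S°(C, graphite) ≈ 5.7 J K⁻¹ mol⁻¹ versus S°(C, diamond) ≈ 2.4 J K⁻¹ mol⁻¹. On the reading proposed above, graphite requires more dispersed energy than diamond to exist at 298 K -- roughly, its less rigid, layered structure takes up more incremental q/T on warming from 0 K than diamond's stiff lattice does.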
Entropy as measure of loss of heat or energy
Only just found this. As asked on Talk:Entropy, where on earth do you get the idea that entropy is a measure of energy? They have different dimensions. It is like saying apples are a measure of oranges. It is completely unclear. However, it is a worthy start. I'll try to look at it, but I am very busy and my WP time has been diverted to something else. --Bduke 02:27, 28 October 2006 (UTC)
- To clarify. The question here is addressed to Kenosis as the main author of the article, not to Frank. --Bduke 04:13, 28 October 2006 (UTC)
- Dave_souza added that passage, so the question might reasonably be deferred to Dave. And I don't disagree with the passage strongly enough to jump in and change it, because the technical usage of the word "entropy" depends on how you're using it, given that virtually every field that uses the term seems to have its own historical and current perspectives. But... from a purely technical standpoint involving reasonable operational definitions in thermodynamics, this passage is a very reasonable expression of the concept. It is widely accepted that a small ΔS in a larger environment (where the entropy being described does not appreciably change the denominator reflecting the temperature of that environment) involves entropy expressed solely by the numerator. And even where the denominator changes significantly (such as, for instance, where an engine heats up or cools down), we are still talking about a measure of energy in the process of being dispersed to, or from, or within, a place, structure, set of molecules, part, sector, or other definable region. ... Kenosis 04:29, 28 October 2006 (UTC)
- I am very sorry -- I started this Talk:Introduction page and thought I had signed it. Thus Bduke, I think, was properly questioning me. I was finishing the following and about to post it when a 'conflict' notice appeared. Here is my response to Bduke:
- Bduke is certainly correct that entropy and energy have different dimensions. But the standard entropy of a substance is the result of q (energy) being dispersed to the substance (reversibly), divided by the temperature at which it is transferred, at a theoretically 'infinite' number of (i.e., very many) temperatures from 0 K to 298 K. Thus, energy input is explicitly involved in the standard entropy values of a substance A and a substance O.
- This is not at all like the impossibility of truly comparing apples to oranges -- it is roughly comparing an important ingredient in both, e.g., the amount of carbohydrate in apples (~15%) to oranges (~10%). I have long urged this as an important interpretation of what is in all elementary chemistry books but is never interpreted in approximate terms of the energy required for the substance to exist at 298 K. It is relatively obvious that liquids have higher entropies than the corresponding solids, polyatomic gases more than monatomic, etc. -- but WHY can be answered now, if one considers the approximate energy needed for each to exist (as shown by their standard entropy values): polyatomic gases need more energy for their rotation, liquids must have their enthalpy of fusion supplied in going from 0 K to 298 K, etc., etc. A remarkable article, strongly supporting my view by going quantitatively beyond it, has just appeared in the November JChemEducation. It would satisfy Bduke, because the author uses the physicist's dimensionless 'sigma entropy' (S/k_B) to compare a large number of elements and compounds for their ln W, energy-level occupancy -- a neat quantitative idea. So don't chop my simple but important qualitative view! JChemEduc, 2006, 83, 1686-1694. FrankLambert 05:04, 28 October 2006 (UTC)
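For readers unfamiliar with that dimensionless form (a standard relation, added here for context rather than taken from the comment above): by the Boltzmann equation, dividing the entropy by Boltzmann's constant leaves a pure number,

S = k_B \ln W \quad\Longrightarrow\quad \sigma \equiv \frac{S}{k_B} = \ln W,

which is why the sigma entropy can be compared directly with ln W, the count of accessible microstates (energy-level occupancy).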
OK, we seem to be clear about who the question is for! I do not disagree with the above. I disagree with the language. "Entropy is a measure of energy" suggests that they have the same dimension. This will confuse people. It is the word "measure" that is the problem. I'll look at the Nov JChemEd when it comes but that will not be for a while. --Bduke 06:13, 28 October 2006 (UTC)
- Apologies for the inadequacies of my phrasing: I've added a reference to temperature, which I think is the other main ingredient. Perhaps it would be better to say "contributes to a measure of"? .. dave souza, talk 10:21, 28 October 2006 (UTC)
Aims of the article
As was discussed on Talk:Entropy, the aim here is to give an introduction suited to beginners. I've added a "Heat and entropy" section adapted from FrankLambert's work, with the intention of making it more concise and giving it a more encyclopaedic tone; a section on Statistical thermodynamics should follow. I've tried to improve the intro a bit and made it shorter by moving content into "Origins and other uses" – the order of sections could be reviewed – but I haven't yet tackled the important point Frank makes in #The first paragraph above. In my opinion it's worth trying to move from a more traditional approach to the energy dispersal concept, but I agree that a simpler start will be better for beginners. .. dave souza, talk 10:34, 28 October 2006 (UTC)