
Talk:Entropy/Archive4



Increase of entropy is not necessarily dispersal of energy

I plan to remove all references to entropy increase as a dispersal of energy unless someone can explain how it relates to entropy of mixing. As far as "dispersion of energy" is concerned, that's wrong. In the case of the entropy of mixing, there is nothing that can be identified as a dispersion of energy, which shows that thinking of entropy increase as a dispersion of energy is mistaken.

If you have a box divided in two by a partition, with type A gas on one side, type B on the other, same mass per particle, same temperature and pressure, then there will be no change in the energy density when the partition is removed and the gases mix, but there will be an increase in entropy. You could say that the energy held by gas A gets spread out, as does the energy held by gas B, but I don't think it is productive to think of energy as being "owned" by a particular molecule type. Energy is being exchanged at every collision; it's not "owned" by any gas. It's the particles that are being spread out, not their energy. PAR 03:59, 28 September 2006 (UTC)
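To put a number on the example above, here is a minimal sketch of the standard ideal-gas mixing-entropy calculation (the one-mole amounts and the Python rendering are illustrative, not part of the discussion):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(n_a, n_b):
    # Ideal-gas entropy of mixing: dS = -R * (n_a*ln(x_a) + n_b*ln(x_b))
    n = n_a + n_b
    x_a, x_b = n_a / n, n_b / n
    return -R * (n_a * math.log(x_a) + n_b * math.log(x_b))

# One mole each of A and B at the same T and P, partition removed:
print(mixing_entropy(1.0, 1.0))  # ~11.5 J/K (= 2R ln 2), with no change in energy density
```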

Entropy as dispersal of energy is a well sourced viewpoint, and WP:NPOV requires that such viewpoints be shown. Your viewpoint should also be shown. You appear to be pointing to a kind of "entropy" in which there is no energy or temperature differential, so presumably this is not thermodynamic entropy, but an analogy to it. ...dave souza, talk 07:56, 28 September 2006 (UTC)
PAR, given that an increase in entropy is a reordering of molecules caused by an increase in the energy level, I wouldn't go removing the references. In your example, a reordering occurs (or may occur), but given your parameters in describing the two types of gas, that "reordering" is not thermodynamic in nature for the following reason: Thermodynamic entropy is a reordering in which the amount of activity exhibited by the molecules increases in response to an increase in energy. In the TypeA/TypeB instance, again given your own criteria, there is no reason to assume a change in the activity level, unless the act of removing the partition generates energy, thus changing the activity level and thus increasing the amount of entropy due to a dispersal of energy. •Jim62sch• 10:06, 28 September 2006 (UTC)
You are explaining that I am wrong by assuming that you are right. Thermodynamic entropy is clearly defined, and it is not defined as the degree of dispersion of energy. This dispersal must be proven, not assumed, which is what you are doing. To reiterate the point - by the thermodynamic definition of entropy, the entropy of mixing is a real, thermodynamic entropy, and it does not imply a dispersal of energy. PAR 13:21, 28 September 2006 (UTC)
Since we're not going to get anywhere going back and forth, humour me by providing citations for what you are saying. Another idea -- explain what happens to the energy and matter involved in the system when a hot pan is placed into a room. •Jim62sch• 14:53, 28 September 2006 (UTC)
I'm not sure what citations you need.
  • Citation for the definition of entropy - Do you really need this?
  • Citation for the definition of entropy of mixing - See Callen, "Thermodynamics and an Introduction to Thermostatistics" (the "Bible" of thermodynamics), pages 69, 108, 290. Also Wikipedia articles Entropy of mixing and Mixing paradox.
  • Citation stating that entropy is not dispersal of energy - I'm sure you are not asking for that (referencing a negative).
Regarding the hot pan in a cool room - The energy is dispersed, the matter is not. I am not saying that entropy increase is never accompanied by energy dispersal, I am saying that entropy increase does not automatically imply energy dispersal. I am using the entropy of mixing as an example of entropy increase without energy dispersal. In the entropy of mixing there happens to be particle dispersal associated with the increase of entropy, but no energy dispersal. PAR 18:17, 28 September 2006 (UTC)

Intriguing. The thought experiment envisages a situation where there's no change in available energy, hence no change in the amount of energy that cannot be used to do thermodynamic work, the factor that entropy provides a measure of in relation to a reference temperature. Yet because restoring the previous order is assumed to require energy, this is interpreted as increasing entropy. So there's an increase in entropy with no thermodynamic effect. An apparent paradox, but the lead to Entropy of mixing indicates an answer where it states that "This entropy change must be positive since there is more uncertainty about the spatial locations of the different kinds of molecules when they are interspersed." Note that "uncertainty" links to Information entropy#Formal definitions. In other words this conflates Shannon "information entropy" with thermodynamic uncertainty. There's a loss in "information" without a thermodynamic change. ...dave souza, talk 18:50, 28 September 2006 (UTC)

(edit conflict) PAR, unfortunately I missed the word "necessarily". Had I realised it was there I'd have phrased my answer differently. And thanks for the cites on mixing. (You were right, I didn't need the others). One other point: re the room, I hope I didn't imply that matter was dispersed; rather, it is reordered -- the molecules in the air become more active, the molecules in the material the pan is made of, less, although these are not equal reactions, of course.
However, see Disorder; "Entropy change in a number of other basic processes can be seen to be related to that in the expansion of a gas. The mixing of different ideal gases and of liquids fundamentally involves an expansion of each component in the phase involved. (This is sometimes called a configurational entropy change.) Of course the minor constituent is most markedly changed as its entropy increases because its energy is now more spread out or dispersed in the microstates of the considerably greater volume than its original state."
Needless to say, 2LOT (and entropy) is probably one of the most discussed, debated and misunderstood of the laws of science -- I've seen physicists who should know better refer to entropy as chaos, which is one hell of an anthropomorphic assumption. •Jim62sch• 20:51, 28 September 2006 (UTC)
I support PAR in saying that defining entropy as "dispersal of energy" is incorrect, as we already discussed several months ago, when Frank Lambert (the author of the reference) was here in Wikipedia and suggested these ideas. Yevgeny Kats 20:44, 28 September 2006 (UTC)
"these"? these ideas meaning? •Jim62sch• 20:51, 28 September 2006 (UTC)
BTW, since under WP:V, WP:RS and WP:NPOV we can support both definitions, both will have to be in the article, with essentially equal weight. •Jim62sch• 20:53, 28 September 2006 (UTC)
An additional point:
It seems to me that mixing gas would show an increase in entropy only for the simple reason that a physical action must occur to mix gases previously held separately, and as every physical action generates heat, entropy would increase. As I can't see any way to mix the gases other than via some physical action (turning a stopcock, removing a partition, breaking an ampule, etc.) I'd have to say that the mixing bit of increasing entropy without dispersing energy is merely a Gedankenexperiment. •Jim62sch• 22:50, 28 September 2006 (UTC)
A comment @Dave: the entropy of mixing is very real, and very thermodynamic, because if you were to try to restore the unmixed state, e.g. by forcing a semi-permeable membrane through the gas container, you would have to put in energy to do so. The effects of the changed entropy also show up in things like freezing point depression and boiling point elevation. @Jim The entropy of mixing is completely additional to any entropy increase caused by e.g. friction against the turning of the tap. The mixing entropy is quantifiable, and it's different. Jheald 23:29, 28 September 2006 (UTC)
Can you explain a bit more? Different how? Are there formulae that prove the additional entropy? What, other than energy, causes the gases to mix (assuming movement is energy)? Also, I'm not sure what your point is re freezing and boiling, as both are manifestations of applied energy. (BTW: this is a quite interesting discussion, it's nice to finally be in a discussion on wiki as opposed to a danged argument) •Jim62sch• 23:47, 28 September 2006 (UTC)
You might like to look at it this way. Dave said that mixing produces no change in the amount of energy that cannot be used to do useful work. But that's not true. Suppose you had, in the middle of your box of particles, a semi-permeable membrane that would allow particles of type A through (either way) but no particles of type B. And suppose the divider is mounted on a (friction free) slide, so it can move backwards and forwards. You would expect some particles of A to randomly diffuse through the membrane to the B side. This would create a slight overpressure on the B side, pushing on the divider to make the A side smaller. Eventually, the divider would get pushed all the way to the wall, no A side any more, and all the molecules of both types mixed on the B side. In principle, in the process, you could have attached the divider to a piston, and extracted a little useful work. This would exactly correspond to T times the entropy of mixing.
The usual entropy increase (if you haven't built this complicated apparatus) equal to the entropy of mixing arises because you're giving up this work you might have had. It's got nothing to do with friction losses in turning the tap -- instead it's to do with you throwing away this work you might have had, because you're just letting it go. Jheald 01:51, 29 September 2006 (UTC)
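For a rough figure on the work being given up, assuming one mole of each gas in equal volumes at 300 K (both numbers are illustrative):

```python
import math

R, T = 8.314, 300.0           # gas constant, J/(mol K); an illustrative temperature, K
dS_mix = 2 * R * math.log(2)  # mixing entropy for one mole each of A and B
W_max = T * dS_mix            # maximum work the membrane-piston arrangement could extract
print(W_max)                  # ~3458 J, forgone when the gases are simply allowed to mix
```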

Thanks, Jheald, for a really useful explanation of Entropy of mixing – would it be possible to add it to that article? Now it seems to me that you've shown a dispersal of energy as the particles diffuse and the pressure potential is lost. PAR stated at the outset that he didn't think it was productive to think of energy as being "owned" by a particular molecule type, but this illustration does just that. Looking now at Frank Lambert's pages, here he describes letting the motional energy of each gas spread out more widely into the larger volume, and here he describes the mixing of different ideal gases and of liquids as fundamentally involving an expansion of each component in the phase involved, and entropy increases because the energy [of each] is now more spread out or dispersed in the microstates of the considerably greater volume than its original state. He comments that the "Gibbs Paradox"... is no paradox at all in quantum mechanics where the numbers of microstates in a macrostate are enumerated, but doesn't treat that further. His entropysite lists over 15 textbooks that have deleted “entropy is disorder” and now describe the meaning of entropy in various terms of the spreading or dispersing of energy. I don't have access to these books, but it would be interesting to know how they tackle such issues. In terms of this article it's pretty clear that such a description is useful, particularly for non-experts or "beginning students". If there are significant problems with this approach, it should be possible to find critical appraisal of these textbooks. ...dave souza, talk 09:08, 29 September 2006 (UTC)

Yes, thanks Jheald, that is an excellent description of the "reality" of the entropy of mixing and deserves to be in one or more of the relevant articles. I do not, however, see how this illustration uses the idea of energy being "owned" by a particular particle type.
On a microscopic level, suppose you have an A and a B particle, each with an energy of 1 (arbitrary units). They collide, and after the collision, they still have energy 1. Has energy been dispersed? Was it the case that each particle retained its energy, or did they exchange energy? Or was only a fraction exchanged? This question is unanswerable. Suppose, after the collision, the B particle again collides with an A particle. If there is energy transfer from the B particle, did it transfer what was formerly A-energy, or did it transfer B-energy? Or maybe a fraction of each? I think you can see the emptiness of this line of reasoning.
I suppose there are two types of "ownership". First is the sum of the individual energies of the A particles (or B particles). This is a real, measurable quantity. The second is the energy "formerly owned" by A or B particles. This is not measurable. The problem with "energy dispersion" as a description of the entropy of mixing is that if you use the first definition of "ownership", you are just saying "the energy that happens to be owned by, say, the A particles at any particular point in time is spread out after mixing". But this is not dispersal. To say that the energy has been dispersed, you have to say that the energy formerly owned by the A particles is now spread out over the whole volume, and this is an unmeasurable statement.
The bottom line is that nobody, Frank Lambert included, ever quantitatively defines energy dispersal. Please, somebody, define it, or let's forget about it. It clearly cannot be defined as an equalization of variations in the energy density, since this never changes in the mixing example. Dispersal implies "measurably, something that used to be here is no longer" and "measurably, something is here that wasn't before." With regard to the particles, this dispersion is measurable; with regard to energy, it is not. PAR 12:58, 29 September 2006 (UTC)

I do appreciate that you're finding it difficult to think in terms of energy, but as you state, the displacement of the particles is measurable, so we have a measure of mass, of distance and of time, which are the dimensions required to measure energy. On a macro level, the example has the work and hence energy of the A particles moving the semi-permeable membrane by diffusing through that membrane. As is being stated, this work/energy is quantified as T times the entropy of mixing, which statistical mechanics can predict. This approach to a basic understanding of entropy for students exists, and should be explained in the article: saying "let's forget about it" is not what an encyclopedia is for. ...dave souza, talk 14:52, 29 September 2006 (UTC)

To put things a slightly different way, it's potentially troublesome to talk about energy dispersing when you open the tap between tank A and tank B, because at the end of the day both tanks will have contents with exactly the same energy they started with. What's really dispersing are the particles, not the energy.
At a more abstract (and more fundamental) level, the possible state of the overall system has become dispersed over a larger set of different microstates. This is what entropy is really all about.
As a footnote, note also that the most dispersed arrangement of energy of all, a completely even distribution, actually has very low entropy, because there is only one way to achieve it. In reality there are fluctuations - brief local concentrations of energy, deviations from complete dispersion - which give a much larger number of possible states, an overwhelmingly more probable total distribution.
It's true, an increase in entropy (for an overall fixed total energy) means there's less energy available hypothetically to do useful work. But insisting the energy has become more dispersed seems (at least to me) just not quite the right word. Jheald 15:24, 29 September 2006 (UTC)
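A toy count makes the point above about the even distribution concrete. Assuming distinguishable particles and integer energy quanta (a deliberately tiny model, not drawn from the discussion itself):

```python
from math import comb

N, q = 4, 8  # 4 distinguishable particles sharing 8 energy quanta

# Stars-and-bars: the number of ways to distribute q quanta over N particles.
total = comb(q + N - 1, N - 1)
print(total)  # 165 arrangements; the perfectly even split (2,2,2,2) is just 1 of them
```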
Of course, "useful work" is one of those human-only concepts that assumes everything (including energy/work) has a purpose or must be useful. ;) <--Just a comment, a random thought.
The more I think of it, dispersal may not be the right word, but what is the right word? If one is concerned with "useful work", I suppose you could say the energy was "wasted", but that's not really right either. Lost won't really do, because it's just wrong. "Differently distributed"? I don't know, but at issue for the article isn't what we say, but what reliable sources say. Lambert is a reliable source, so, as I said above, dispersal should be used in the article, but surely there are competing terms out there somewhere that can be contrasted with dispersed. •Jim62sch• 16:13, 29 September 2006 (UTC)

Do we want to include in this article a description of entropy increase which all of us agree is lacking in some sense? Do we want to include a description which none of us can explain clearly, nor defend? I'm sure we will all answer NO to this question. The question then remains whether entropy increase as energy dispersal fits the above description. If it does, then we get rid of it. If it does not, then a definition of energy dispersal will be forthcoming, as well as a defense of the description. Please, I really would like to hear it. What is the definition of energy dispersal? If we cannot answer this question, we must agree to eliminate energy dispersal as a description of entropy increase. PAR 17:41, 30 September 2006 (UTC)

The only real question here deals with the term "dispersal". However, the fact that we don't like it is irrelevant. What is relevant is WP:RS and WP:V. [1], [2], [3]. •Jim62sch• 15:47, 30 September 2006 (UTC)
I'm not sure I understand your point. The question is:

Do we want to include in this article a description of entropy increase which all of us agree is lacking in some sense? Do we want to include a description which none of us can explain clearly, nor defend?

Is your answer "we must include it, we have no choice"? If this is not your answer, what exactly is your answer? PAR 22:53, 30 September 2006 (UTC)
Well, PAR, my answer is that this description is evidently found useful in textbooks as a way of explaining entropy to students less gifted in the abstract than yourself, which category would include most users of Wikipedia. You appear to be in a rush to remove this useful description without giving consideration to the definitions supplied, which look rather familiar. This page relating entropy to the tendency of thermal energy to disperse as widely as possible refers back to this page which concludes with a set of key concepts about energy spreading, the fifth of which is "Spontaneous change is driven by the tendency of thermal energy to spread into as many microstates of the system and surroundings as are thermally accessible." In this example energy spreading appears to be equivalent to dispersal, though to me the latter better conveys the loss of non-equilibrium differences. And no, I don't agree that the description is lacking in some sense. As a simple introduction it provides a useful basis for further study. ...dave souza, talk 23:37, 30 September 2006 (UTC)
Ok, that's a good reference; it gives a clear (but nonsensical) definition of energy dispersal. First of all, it says that energy dispersal is NOT dispersal in space. That's a good point to remember. The sentence you wrote above summarizes what it does say very well.

Thermal energy spreads rapidly and randomly throughout the various energetically accessible microstates of the system.

The problem with the above sentence is that it gives a thoroughly wrong picture of energy and microstates. At any instant in time, a system is in one and only one microstate. The system jumps from microstate to microstate but it is never in more than one microstate. It only has a limited number of microstates available to it, each of which has the same energy. The system occupies a microstate. Energy does not occupy a microstate. A single value of energy is associated with each microstate, and as the system jumps from one microstate to the other, its energy remains the same. Energy does not spread throughout the available microstates; this statement is... I don't know how to say it kindly... It's gibberish. If you are interested in communicating the concept of entropy increase, it is counterproductive to offer an explanation which is gibberish. I know, it gives a warm feeling to say "entropy increase is equivalent to energy dispersal", it sounds so much more accessible and gives the sensation of understanding, but unfortunately it is senseless.
Your final sentence was "And no, I don't agree that the description is lacking in some sense." If it is not lacking, then my above discussion is wrong. Please point out where it is significantly in error. (Insignificant quantum-physical errors like "its energy remains the same" don't count). PAR 00:41, 1 October 2006 (UTC)
OK, now we're getting away from the point of the article. We have sources that are both reliable WP:RS and verifiable WP:V with which you disagree, in fact which you say are wrong. I hate to put too fine a point on it, but as I said above this isn't about the semantics of dispersal and whether or not we like the terminology (although I'll admit that "spreading out" is "better"), or even about whether or not you think it is "true" or "right" or accurate, it is about representing the sources we have -- anything else wanders into WP:OR territory.
Also, whether you think the definition is senseless or gibberish is also irrelevant. Unless you can come up with a verifiable and reliable source that specifically says that these cites are wrong, your distaste for the definition/verbiage is utterly irrelevant. In fact, even if you do find such a cite, it merely means that both cites have to be reported.
Dave also brings up a good point: you're obviously seeing "dispersal" as more of an abstract, although the problem with that is that abstracts are often hard to adequately define. It seems that your concern is with a dispersal in "space" (but not time?), although I'm not sure what exactly you're driving at, and while I am interested in understanding your point, I'm not sure it has any bearing on the article itself. Why? Because Wikipedia deals with verifiability and not truth per se. For example, an editor may know that relativity fails to explain phenomenon x, and that neither quantum mechanics nor M-theory adequately explains the phenomenon either, but unless there is at least one verifiable and reliable source backing him up, it just doesn't matter. •Jim62sch• 10:04, 1 October 2006 (UTC)

Ok - we've stated our positions for the record. PAR 13:33, 1 October 2006 (UTC)

Regarding the request to provide references which prove the above references wrong - this is a classic Wikipedia problem - asking for a negative reference. See this link for example. PAR 13:46, 1 October 2006 (UTC)

You're utterly missing the point. I've given you the links to WP:V (verifiability) and WP:RS (reliable sources) several times. Read them. No one gives a rat's behind whether you "think" the statement is "wrong", "nonsensical", "gibberish", etc., you, as the person making the assertion, need to prove it following the above guidelines.
Also, re a negative reference, if the statement and refs we are using are so bloody wrong, you'd think someone would have noted it somewhere by now. After all, we have thousands of articles chock full of references negating various "contentious" points. •Jim62sch• 15:37, 1 October 2006 (UTC)
The point about systems and microstates is addressed in this article published in the Journal of Chemical Education (October 2002, vol. 79, pp. 1241-1246):
"Some instructors may prefer “delocalization” to describe the status of the total energy of a system when there are a greater number of microstates rather than fewer, as an exact synonym for “dispersal” of energy as used here in this article for other situations in chemical thermodynamics. The advantage of uniform use of ‘dispersal' is its correct common-meaning applicability to examples ranging from motional energy becoming literally spread out in a larger volume to the cases of thermal energy transfer from hot surroundings to a cooler system, as well as to distributions of molecular energies on energy levels for either of those general cases. Students of lesser ability should be able to grasp what ‘dispersal' means in three dimensions, even though the next steps of abstraction to what it means in energy levels and numbers of microstates may result in more of a ‘feeling' than a preparation for physical chemistry that it can be for the more able.
Of course, dispersal of the energy of a system in terms of microstates does not mean that the energy is smeared or spread out over microstates like peanut butter on bread! All the energy of the macrostate is always in only one microstate at one instant. It is the possibility that the total energy of the macrostate can be in any one of so many more different arrangements of that energy at the next instant — an increased probability that it could not be localized by returning to the same microstate — that amounts to a greater dispersal or spreading out of energy when there are a larger number of microstates "
On the one hand we have this approach to education regarding entropy published, discussed and incorporated in textbooks. Against it we have the word of some anonymous self-proclaimed expert who is unable to explain with sources why the terms are in some way "a thoroughly wrong picture". There has been plenty of opportunity for this teaching approach to be criticised; it should not be too hard to research such criticism. ...dave souza, talk 14:29, 1 October 2006 (UTC)

Dave - I looked at the reference you gave. It says:

"some type of energy flows from being localized or concentrated to becoming spread out — often to a larger space, always to a state with a greater number of microstates.

I still don't agree. To "spread out" means to occupy a larger volume in some space. It is simply a nonsense statement until this space can be specified. I really believe that because energy dispersal in physical space is so often the signature of entropy increase, the proponents of this point of view are trying to use this intuitively accessible description to shoe-horn in all forms of entropy increase, with the idea that it is better to have a wrong idea than no idea at all. Sometimes that is true, but I think in this case we can arrive at an article of the same value without the warm fuzzy nonsensical statements.

I am in no way a self-proclaimed expert. I have had my head handed to me on this very page (or perhaps it was the second law page) by people who know this subject better than I and whom I expect are listening to us now. This was because of my insistence that I was right until it was shown to me with a logical argument that I was wrong. I will not ever resort to legal loopholes to insert a bunch of words that I don't understand and cannot defend into an article. Ever. And I expect the same of everyone else.

I am carrying on this discussion so that others can see exactly what our points are and make their decisions accordingly. If someone reverts my edit, as you have done, I will not revert it back. I leave that to someone who agrees with me, and if there is no such person, then so be it.

Meanwhile I will search for those negative references. PAR 16:51, 1 October 2006 (UTC)

For the moment I'm removing the statement
Entropy measures the spontaneous dispersal of energy: how much energy is spread out in a process or how widely spread out it becomes at a specific temperature. (Frank L. Lambert, A Student’s Approach to the Second Law and Entropy)
Where a statement like this has wide currency, WP:NPOV encourages us to discuss it and examine its applicability and its limitations in the article. It does not require us to quote such a statement without qualification in the lead as an absolute definition of fact. (Any more than followers of Frank Lambert would I imagine like to see "entropy is a measure of disorder" in the same place without further comment).
The fact is that the statement, taken on its own or taken as a fundamental statement of the meaning of entropy, is materially misleading as we have discussed above. When stated in this point blank way, it is over-simplified to the point of being false, and likely to give rise to lasting misconceptions for readers.
The statement has no place being quoted up in the lead, without analysis, as the interpretation of entropy, so I have removed it from there. If somebody wants to add a balanced, NPOV discussion of the statement's benefits and shortcomings further down the article, I leave that to them. Jheald 11:19, 2 October 2006 (UTC)

Entropy increase as energy dispersal

I have added a section on energy dispersal in the "statistical interpretation" section that I think we can all agree on. It states clearly that entropy increase is not necessarily equivalent to energy dispersion into physical space, which we all (including Lambert) agree is the case. There is no explicit attack on the idea of energy dispersal being applied to e.g. the entropy of mixing. PAR 13:36, 2 October 2006 (UTC)
Good idea. I've developed it a bit: note that we need a citation to support the assertions being made in this talk page that it's materially misleading. I've also complied with WP:LEAD by mentioning the existence of this as a teaching method, making it clear that it's not the interpretation of entropy. ...dave souza, talk 20:03, 2 October 2006 (UTC)

I have modified the sentence in the introduction to avoid the misconception that energy dispersal is exclusively a spatial dispersal. I have also removed the nonsense about energy dispersing into microstates. Dave, you have developed it to a point that we cannot all agree on. We have to be very clear on two points:

1. No one can dispute the fact that Lambert's "energy dispersal" as a description of entropy increase does not always mean spatial energy dispersal. This follows from the sentence I quoted above from the Entropy is simple page:

some type of energy flows from being localized or concentrated to becoming spread out — often to a larger space, always to a state with a greater number of microstates.

with emphasis on the word "often". This is the reference that was asked for and I have included.

2. The point that IS in contention is the statement that energy disperses into a greater number of microstates, or to a state with a greater number of microstates, or some variation thereof.

As long as we state point 1 I'm ok. When point 2 is stated as fact I am not ok. My revision avoided discussion of point 2 and I object to any revision stating point 2 as fact, unless a clear, more or less quantitative definition of non-spatial "energy dispersal" can be given. PAR 02:41, 3 October 2006 (UTC)

Point 2 was not being stated as "fact", but as the interpretation used in this model. I've put in an alternative statement of this position, and emphasised that this relates to the "dispersal" approach. The question of whether this approach is not applicable to the mixing example was still in contention on this page: I've added a more specific reference to it, including the point that it is not strictly correct to visualise a spatial dispersion of energy. From my reading of our discussions it appeared that we all agreed that there was spatial dispersal related to each of the two gases, but the question was whether the particles can be dissociated from their energy. Either way, we have a referenced statement on this issue seen from the "dispersal" approach. Any critical appraisal of this approach from a reliable source will of course be welcome. ...dave souza, talk 10:38, 3 October 2006 (UTC)
Helllooooo???!!!! It really doesn't fucking matter whether you agree with Lambert or not: his statement is sourced, your statements (PAR and JHeald) are OR -- OR is a no-no here. Which part of this are you having trouble with?
Dave, you really should not have agreed to the change, as the other two provided no sourcing for their arguments. •Jim62sch• 11:46, 3 October 2006 (UTC)
Look, I understand that you see this as a legal battle of us vs them. The reason I don't respond to you is because I reject that whole point of view. My main goal is to improve the article, and apparently Dave is trying to do the same. Nothing is cast in stone; if Dave has second thoughts, he can re-edit the article. He hasn't given up anything, hasn't lost any "points" in your "war". I worry myself that my edits are not strong enough in rejecting the offending sentence. The objection that I and others have is not original research. It is a request from anyone to clarify a statement that makes no sense. You cannot explain it, you cannot expand on it, you cannot give and take on it, you cannot discuss it rationally, you cannot defend it from criticism, you cannot argue the merits of the sentence back and forth. All we get are links to Lambert's pages which repeat the statement and legal threats. That's a pretty sorry situation for a concept that is supposed to be a teaching breakthrough. Nevertheless, you demand that it be included, because this is war and you want to win, and when the reasoned arguments begin to fly, all you can do is run for legal cover. Dave is at least stepping out of that hole, and engaging in a dialog about the meaning of the contended sentence. Please, for the sake of the article, let the people who are concerned with the article figure out how to deal with this problem. PAR 13:34, 3 October 2006 (UTC)
Jim, I appreciate that I'm bending over backwards here to accommodate the concerns of these people who still unfortunately seem to have difficulty in finding other sources to support their arguments. As further well sourced information comes to hand I'd hope that we'll be able to clarify and review these changes. PAR, Jim's quite right to point to the need to ensure that we don't stray beyond policy. Reasoned arguments are interesting, but at the end of the day we have to ensure that contentions are properly sourced and are not original research. ...dave souza, talk 16:48, 3 October 2006 (UTC)
No, PAR, it's not a matter of winning, it's a matter of following the rules of Wikipedia. If you don't like the rules, start a blog, write an article on entropy and get it peer-reviewed, publish a book on 2LOT, but keep your OR off the page. Sorry dude, but that's the way Wiki works -- it's not "legal cover", it's reality.
As for your implied suggestion that I back off -- not a chance. •Jim62sch• 01:38, 4 October 2006 (UTC)

Apology

Just yesterday, I looked at Talk:Entropy for the first time since late July and today I am terribly embarrassed to see this fire storm and even more because Leff (for an old ref, see end) in recent months has convinced me that I was absolutely wrong in limiting energy dispersal (in re earthly entropy) in 3-D space in any way. The following is a full explication that I hope will be helpful, but I apologize for not being right in 2002, nor correcting it before now on www.entropysite.com. FrankLambert 00:22, 4 October 2006 (UTC)

Spontaneous entropy increase always involves the dispersal of energy

“Energy of all types spontaneously changes from being localized to becoming more dispersed or spread out in space if it is not constrained.” This is a modern view of 2LOT.

Chemical processes deal with readily comprehensible examples because the kind of energy most commonly considered is that of molecular motion. Beginners in chemistry early learn about the average thousand-mile-an-hour molecules in a gas at room temperature. Such speeds and incessant elastic collisions whereby their individual energies change (ranging from zero to thrice average) dramatize the relationship of the total energy of a system to the kinetic energy of its molecules in gases and liquids, and vibrations in solids. (More precisely of course, such motion must be considered as momentum and include the PE of any phase change occurring in a process.)

“Entropy change is the quantitative measure of spontaneous increase in the dispersal of energy/T in two seemingly disparate kinds of process: how much energy becomes dispersed (as in thermal energy transfer between surroundings and system or the converse, or between two contiguous systems insulated from their surroundings), “thermal”; or by how widely spread out the initial energy within a system becomes (as in gas expansion into a vacuum, or by two or more fluids mixing), “positional”.”

In chemical reactions, both “how much” energy and “how widely” dispersed it becomes may be involved. (The failure of so many reactions to achieve complete conversion to products is due to the greater increase in entropy in formation of the equilibrium mixture – i.e., greater energy distribution in a very small quantity of reactants in much product -- than in solely forming pure product.)

Both kinds of process, “thermal” and “positional”, involve a change in the dispersal of energy from a smaller to a larger volume. Thus, fundamentally, the difference between them need not be mentioned to beginners. More important is their commonality: Spontaneous processes always involve energy dispersal in space. This is a unifying concept in presenting varieties of entropy change to beginners or non-scientists.

However, for processes of thermal energy transfer, the quantitative determination of “how much energy/T” has been spread from a warmer to a cooler part of the universe intrinsically includes any entropic dependence on volume change in the process. No matter what were the relative sizes of the warmer and cooler portions of the universe, the increase in entropy of the cooler is greater than the decrease in entropy of the warmer. Thus, spontaneous transfer of molecular motional energy from a warm system or surroundings to cooler surroundings or system in a (quasi) reversible process always results in an entropy increase.
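A one-line check of that claim, with illustrative numbers for the heat transferred and the two temperatures:

```python
q = 100.0                     # J of thermal energy transferred (illustrative)
T_hot, T_cold = 400.0, 300.0  # K (illustrative)

dS_total = -q / T_hot + q / T_cold
print(dS_total)  # +0.083 J/K: the cooler portion's entropy gain exceeds the warmer's loss
```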

The unusual value of viewing entropy change as a dispersal of molecular energy is its seamless application from a beginner’s view of molecules’ behavior to a qualitative or a quantitative Clausius calculation as well as to more sophisticated qualitative views of quantized energy distributions and quantitative Boltzmann calculations of the number of accessible microstates.

The relationship of the “dispersal of energy” in a system to the number of accessible microstates as calculated from ΔS = k ln(W_Final/W_Initial) comes directly from the modern version of the 2LOT stated at the start of this section. “Energy...spontaneously changes from...localized to...more dispersed...” The greater the number of microstates in the final state after a spontaneous process, the less localized is the total energy of a system, obviously corresponding to an increase in entropy. Of course, that energy is always only in one microstate at one instant, but the more microstates that are accessible the next instant, the more dispersed is the energy, in the “temporal dance” of physicist Harvey Leff (in his source article on the spreading and sharing of energy: Leff, H. S., Am. J. Phys., 1996, 64, 1261-1271).
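As a numerical illustration of that relation (the free-expansion scenario and the particle count here are assumptions for the example, not Lambert's):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
N = 6.022e23      # one mole of molecules (illustrative)

# If doubling the volume doubles each molecule's positional choices,
# W_Final / W_Initial = 2**N, so dS = k * ln(W_Final / W_Initial) = N * k * ln(2):
dS = N * k * math.log(2)
print(dS)         # ~5.76 J/K, i.e. R*ln(2) per mole
```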

In the traditionally named “positional” kind of entropy change, molecules that are constrained in a given volume are then allowed to move into an adjacent evacuated chamber. Qualitatively, their initial motional energy has become more spread out in the new total volume than it was in the smaller original chamber. That greater dispersal of the original energy of the gas in space should result in an increase in its entropy. The classical reversible reversal of the process proves that the work involved is indeed dependent on the volume change and related to a q(rev)/T for the original spontaneous expansion.

However, a modern view of the quantization of the motional energy of the molecules shows them with their energy (this is not a case of radiation or energy dissociated from particles) in a Boltzmann distribution on many energy levels in an initial volume at a given instant. If that volume is increased adiabatically, the total motional energy of the system is unchanged but the energy levels on which the molecules and/or their energies are distributed are closer together. Thus, with no change in total energy, there is a marked increase in the number of accessible distributions of the total motional energy due to the increase in available energy levels – only one distribution being occupied at any one instant. This is what is meant by a “dispersal of molecular motional energy for a given system”: an increase in the number of accessible energy distributions for the system.
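A one-dimensional cartoon of the level-spacing point, assuming a particle-in-a-box model (the box widths and mass are illustrative):

```python
h = 6.626e-34  # Planck's constant, J s
m = 6.6e-26    # roughly the mass of an argon atom, kg (illustrative)

def level(n, L):
    # Particle-in-a-box energy: E_n = n^2 * h^2 / (8 * m * L^2)
    return n**2 * h**2 / (8 * m * L**2)

for L in (1e-9, 2e-9):  # box width before and after an expansion
    print(L, level(2, L) - level(1, L))  # the n=1 to n=2 gap falls 4x when L doubles
```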

In no way does “a dispersal of energy” mean that the motional energy of a system is in any more than one microstate at one instant! Energy is quantized in one specific distribution on an enormous number of accessible energy levels in one microstate at one instant. “Dispersal” or “greater dispersal” of energy as a result of a process simply means that there are many more choices of different microstates (different distributions of the system’s energy) for the system at the next instant than there were for the system prior to the process that had fewer accessible microstates.

The same phenomenon of increased energy dispersal applies to allowing gas A with its initial motional energy in a given volume to mix with gas B with its initial energy in an equal or different volume. The total initial motional energy of each gas is unchanged but that initial energy of each is now more widely spread throughout the greater volume of the two; it has become more dispersed and thus, from our view of entropy as a quantitative measure of the spreading out of energy, there should be a distinct increase in entropy. Classical Gibbs calculation, or a combinatorial calculation of the number of possible cells containing A and B together with the Boltzmann entropy equation, leads to the same conclusion: ΔS = −R (n_A ln x_A + n_B ln x_B) = −n R (x_A ln x_A + x_B ln x_B), where n is the total number of moles, n_A and n_B are the moles of the components, and x_A and x_B are their mole fractions.

However, as just mentioned, the usual calculation of the entropy change in mixing by statistical mechanics has been via consideration of the numbers of possible cells containing the appropriate ratio of A and B. Such a combinatorial determination of the number of microstates is completed using the Boltzmann entropy equation. But this counting of cells has traditionally been considered a count of positions in phase space. The position of a real molecule in phase space is inseparable from its momentum, its motional energy. Therefore, the final ΔS for the mixing of A and B is properly a measure of the increased dispersal of the energy/momentum of the initial system and equal to k ln(W_Final/W_Initial), the increase in the number of microstates due to mixing and the quantitative measure of molecular energy dispersal.

[This equivalence of combinatorial locations of particles to the probable distribution of the final energy of a system is supported by Hanson, R. M., J. Chem. Educ., 2006, 83, 581-588 (pp. 586-587), and Kozliak, E. I., J. Chem. Educ., 2006, 83 (in press). The spreading of energy in space as a key to understanding entropy change is from Leff, H. S., Am. J. Phys., 1996, 64, 1261-1271.] FrankLambert 00:22, 4 October 2006 (UTC)
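A sketch of the cell-counting equivalence described above, assuming a simple lattice model with a 50:50 mixture (the cell count is illustrative): counting cell arrangements with the Boltzmann equation reproduces the Gibbs mixing formula.

```python
from math import lgamma, log

R = 8.314  # gas constant, J/(mol K)

def ln_W(n_cells, n_a):
    # ln of the number of ways to assign n_a cells to A among n_cells in total
    return lgamma(n_cells + 1) - lgamma(n_a + 1) - lgamma(n_cells - n_a + 1)

N = 100_000                           # lattice cells standing in for molecules
dS_count = (R / N) * ln_W(N, N // 2)  # S = k ln W, scaled so that N cells = one mole
dS_gibbs = -R * (0.5 * log(0.5) + 0.5 * log(0.5))
print(dS_count, dS_gibbs)             # both ~5.76 J/K for the 50:50 mixture
```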

Hello Frank Lambert - Thank you for joining the discussion. The problem we are having is that no one is able to clarify certain questions that have been raised about the meaning of "energy dispersal". It's clear that not all entropy increase is energy dispersal in physical space, and the main example of this is the entropy of mixing, so I am focusing on the last few paragraphs you have written in which energy dispersal for this case is elaborated.

This is what is meant by a “dispersal of molecular motional energy for a given system”: an increase in the number of accessible energy distributions for the system.

“Dispersal” or “greater dispersal” of energy as a result of a process simply means that there are many more choices of different microstates (different distributions of the system’s energy) for the system at the next instant than there were for the system prior to the process that had fewer accessible microstates.

Webster defines dispersal as

to distribute (as fine particles) more or less evenly throughout a medium.

among other possibilities, but this seems the most physical and relevant to this situation.
In order for dispersion to occur, there must be some "medium" or space in which this dispersal occurs, and there must be some extensive quantity - "fine particles" or "energy" - that is being dispersed. In this case it is energy. This space must have a concept of distance, so that "more or less evenly" can be quantified, i.e. you need to be able to say "more is over here than over there" (or not).
When it is said that the number of accessible energy states has increased, I cannot find the medium that is being referred to, nor can I see the concept of distance needed to define increased or decreased concentration, i.e. the concept of distance between greater and lesser concentrations of the energy being dispersed. As you mentioned, at any one instant of time, the system is represented by a single point in phase space, so it cannot be phase space we are talking about; there is no energy dispersal in phase space.
Can you please give a mathematical expression or at least a more quantitative expression for the degree of dispersal for these cases where the energy dispersal is not in physical space? Also, please identify the space in which the dispersal takes place, and the way in which differences in concentrations of energy may be conceptualized. Thanks. PAR 03:27, 4 October 2006 (UTC)
Another wee question from a less informed viewpoint: the entropy equation for mixed gases seems to involve moles and mole fractions for A and B. In the example at #Increase of entropy is not necessarily dispersal of energy the particles of A and B both have the same mass per particle, which did strike me as odd since I thought that molecules which were different tended to have different masses. Anyway, if moles for A and B are the same, will mixing still produce an increase in entropy? ...dave souza, talk 13:54, 4 October 2006 (UTC)
It's an IMPORTANT question. When I scanned that incredible statement, I thought PAR was making some kind of an inside joke or challenge to those with whom he was chatting, so I ignored it. In saying "same mass" (but hiding from you that it was a different structure), he might have been describing organic isomers, but why bring such a complex example to a general audience? Equal mass normally means identity. Total nonsense -- that is not at all supported by the good, though ancient, Callen text. (Why did PAR cite the page in Callen that discussed the separation of uranium isotopes? They certainly don't have the same mass -- or identical energetic collisions or no entropy change on mixing!) Somewhere in my files, that I cannot find (:-( ), is an article "ALL ENTROPY IS MIXING"! If not the truth, it's close to it. FrankLambert 17:02, 4 October 2006 (UTC)

So many wonderful questions!

Don't have time before tomorrow but let me respond to your first paragraph and then make a quick connection that may be behind all/most of your questions.

First, all entropy increase in chem molec motion is DUE to real physical molecules moving in 3-D space, of course -- thus ALL I'm/beginning chem is focused on is ent. increase in phys space and the most obvious example is ent. of mixing.

Second, the beauty of the concept of energy dispersal (to me!) is that we focus first on real molec movement and then directly go to its abstract meaning like this: Consider the fast colliding behavior of a mole of real molecules in real space (the medium)-- yes, of course, 'evenly distributed' at one instant -- one instant, then, is the time to take one FREEZE FRAME -- that can be translated to an ABSTRACTION: the plot of ONE Boltzmann distribution of the energy of each molecule (merely assuming that you and I could see the energy of each molec. in that freeze frame instant!). Then, in the next instant, in which perhaps only 1 or 1 trillion molec collisions occurred, we could snap another freeze frame, rapidly check each molecule and plot another.

Oh, if you believe in ergodicity, it would take many millennia times millennia times millennia to do a complete job -- but does that start to answer your question of whathehell energy dispersal means? It means literal spreading out of literal molecules in greater space than before (in mixing), and initially and then finally taking a myriad of freeze frames of all the possible B. distributions that they sequentially demonstrate in a few zillion eons or million seconds or seconds -- but our count, even though we're quick-eyed, won't be a bejeebers-worth compared to your ability as a skilled stat mech to count the number of microstates via your cell model of A and B and their relative moles -- that you now know is NOT a count of location "microstates"/"configurations", but honest to God microstates of the different ENERGY distributions for that particular mixture system!

It's all based on simple molecules hitting each other and our checking their energies and counting the different possibilities....That's a start. I'm sure you'll tell me what's wrong with it and then we can go from there :-) FrankLambert 05:37, 4 October 2006 (UTC)

No problem with your description of what is going on. No problem with what we both know is actually happening. The problem is the use of the term "energy dispersal".
First, can we confirm clearly and unambiguously that "energy dispersal" is not always dispersal of energy in physical space. From the "Entropy is simple" page I take the following quote:

some type of energy flows from being localized or concentrated to becoming spread out — often to a larger space, always to a state with a greater number of microstates.

Also can we confirm that the entropy of mixing example is just such a case - the internal energy densities are a fixed constant before and after mixing and therefore no dispersal of energy in physical space has occurred? ((PAR, I assume. (FLL)))


An apologetic note: That word "often" in the quotation is my stupid error. ALWAYS, energy of any type flows from being localized, if it is not constrained. I have asked my web expert to remove it immediately. Now to substance:


Thanks very much for sharpening the questions. At the outset, if you say you accept "what is going on" (that energetic molecules move in 3-D space, gases and liquids relatively freely), I don't understand what is ambiguous about "energy dispersal". It is a description of molecules moving from a relatively smaller 3-D space/situation/status/container, continually exchanging energy with ANY other molecule(s) they encounter, to a larger 3-D space.


Internal energy density? By that, do you mean the density in space of the motional energy of gas molecules of A, prior to their being admitted to a space occupied by gas molecules of B? Then, on the concrete/beginners level, the change from initial pure A to its mixing with B (and thus, in a larger final volume) is a radical change in its 'energy density' in space: fewer A molecules per cubic cm. in that final state. On a more sophisticated level of abstraction -- of energy levels (!) -- the 'energy density' (however you define it) is also radically changed because of the decreased spacing between energy levels common to any increase in volume. (Despite a possible confusion in terminology of the energy levels being considered "more dense" in some writing.)
That's WHY mixing occurs: the enabling factor of energetic molecules PLUS the actualizing factor of a process (in this case, allowing a fluid access to more mutual volume in the space of another fluid) that provides a larger number of accessible microstates -- an increase in entropy. FrankLambert 16:22, 4 October 2006 (UTC)
Ok - The "energy of the B molecules" at any instant in time is perfectly definable. Let's use shorthand and say it is "owned" by the B molecules at that instant. At some other instant, there is another value of energy "owned" by the B molecules. No problem.
The problem is that, for dispersal to be defined in a constant energy density situation, the energy must be "trackable". You must not only be able to say "this molecule owns this energy" but also that, at a later time, "that same energy is now owned by this molecule". I have two problems with the idea that the energy initially "owned" by, say, the B molecules has become dispersed by reason of the dispersal of the B particles themselves.
  • This dispersal must be able to be tracked on a molecular level. We must be able to say "look, the energy that was once here, owned by a B molecule, is now there, and there, and there." There is great difficulty here.
    • Suppose an A molecule and a B molecule collide, and after the collision have the same energies as before the collision. Has energy been unchanged throughout the collision, or has energy been exchanged, or maybe half unchanged and half exchanged?
    • Suppose particle A is at rest, and particle B strikes it, and after the collision, it is now particle B at rest and particle A has all the energy. Was energy unambiguously transferred from B to A? Whatever the answer is, consider the very same collision from the point of view of an inertial system moving with the center of mass of the two particles. To an observer in this frame, the two particles have equal energy before and after the collision. Whatever the answer is in the first frame of reference must be consistent with the second frame of reference (see the sketch below).
My point is, that at a molecular level, energy cannot be said to be "owned" by a particular particle and then that same energy be "owned" by another particle at a later time. The total energy of all B particles at a given instant in time can be defined, but how that very same energy is distributed at a later instant, particularly when the energy density is unchanging, cannot be defined.
  • On a macroscopic (and classical!) level, let's suppose that spatial energy dispersal can, in some way, be defined for the mixing case in which the energy density is constant. Let's suppose the A and B molecules have the same mass, diameter, etc. In other words, the difference between them is some esoteric thing that is not relevant to the energy range we are dealing with. OK, that's one case. Now consider another system in which each B molecule is replaced with an A molecule. All velocities and positions are the same. Now, as time goes by, each collision in the AB case has a corresponding collision in the AA case. The energy flows, however you wish to define them, are identical because the velocities, positions, and collisions are identical. Yet in the AB case there is an increase in entropy, and in the AA case there is not. The energy dispersal, even if you could define it, implies entropy increase in one case, not in the other.
In short - energy dispersal in a constant energy density situation cannot be unambiguously defined, and even if it could, an entropy increase could not be unambiguously assigned to that dispersal. PAR 17:43, 4 October 2006 (UTC)
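A minimal numerical sketch of the frame-of-reference point above, assuming equal masses in one dimension (where a head-on elastic collision simply swaps the two velocities):

```python
def elastic_1d(v1, v2):
    # Equal-mass, head-on 1-D elastic collision: the velocities are exchanged.
    return v2, v1

# Lab frame: B (moving) strikes A (at rest); all kinetic energy passes to A.
print(elastic_1d(0.0, 1.0))   # (1.0, 0.0)

# Center-of-mass frame (shift every velocity by -0.5): each particle keeps the
# same speed, hence the same energy, so no net energy is "transferred" at all.
print(elastic_1d(-0.5, 0.5))  # (0.5, -0.5)
```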


OK, we seem to be getting somewhere! I have two questions, answers to which would help me to understand your above statements better. One, of two parts, is from my prior post:

Internal energy density? By that, do you mean the density in space of the motional energy of gas molecules of A, prior to their being admitted to a space occupied by gas molecules of B? Then (1), on the concrete/beginners level, the change from initial pure A to its mixing with B (and thus, in a larger final volume) is a radical change in its 'energy density' in space: fewer energetic A molecules per cubic cm. in that final state. Now (2): on a more sophisticated level of abstraction -- of energy levels (!) -- the 'energy density' (however you define it) is also radically changed because of the decreased spacing between energy levels common to any increase in volume. (Despite a possible confusion in terminology of the energy levels being considered "more dense" in some writing.)

Two: When you consider a mixture of two completely different gas molecules, A and B, by your usual statmech routines of combinatorially counting them and inserting the result in a Boltz. entropy equation, followed by conversion to moles/mole fractions, what would you tell students or me that you have been counting?

Thx! (Maybe we should break this up to "2.Questions" or such, for readier editing?) FrankLambert 21:21, 4 October 2006 (UTC)

Re, "Suppose particle A is at rest, and particle B strikes it, and after the collision, it is now particle B at rest and particle A has all the energy"...why would particle B be at rest? That violates both the classical and quantum laws of physics. •Jim62sch• 00:31, 5 October 2006 (UTC)


Thanks, Jim, but let's hang in there until PAR responds to my two basic questions! Entropy and its relation to energy dispersal can only be considered for systems with, say, some billions of constantly and rapidly colliding particles. (I really don't know the limits, but conventional chemistry deals with a minimum of around a quadrillion.) Thus, his case is only pertinent if it applies to that level of system so far as chem is concerned. He may care intensely about 10 particles suspended in a vacuum in a laser beam -- and bully for him -- but that is trivial in our forging toward the goal of presenting entropy to Wikipedia readers generally! FrankLambert 01:20, 5 October 2006 (UTC)
Hello Frank - in response to your first question, I am thinking classically, and yes, in the mixing example, at any instant in time we can define a varying energy density of A molecules and/or B molecules as a function of position in space. Also, yes, let's keep it on a concrete/beginner's level, and concentrate on the discussion of whether energy dispersion is spatial in the mixing example. My point is that demonstrating that the A and B energy densities change in time during mixing does not demonstrate energy dispersal. To cut to the bottom line here, I'm saying you cannot measure dispersal microscopically or macroscopically. You can measure A densities and B densities varying in space, but you cannot measure or define dispersal using these quantities, nor any other measurable quantity. Let me ask: If energy dispersal exists in this case, how would you quantitatively define it and/or measure it?
In response to the second question - I'm not sure I'm interpreting your question the way you meant it. When you say "combinatorially counting them" I am taking that to mean counting the number of microstates available to a system containing these particles, assuming a fixed total energy and fixed numbers of A and B particles. Or, equivalently, counting the number of ways the particles can be distributed in the various energy levels of the system, again making sure that energy and particle numbers are conserved. (Correct "Boltzmann" counting assumed).
In response to Jim - it's like a billiard ball striking another ball head-on. The first ball stops dead; the second then moves away at the same speed in the same direction (assuming no spin effects). Energy is conserved, momentum is conserved, no laws are broken. PAR 02:09, 5 October 2006 (UTC)
PAR, thanks, but....the billiard ball does not stop. If you take freeze-frame pics, you'll see that it continues to move, forward, backward, laterally, etc. (see Heisenberg). (Also, when you note "assuming no spin effects" you introduce a quantum violation -- you can't assume an absence of motion effects.) Also, when discussing atoms or molecules, even if the atom or molecule were to truly stop, its point particles (or strings) would continue to vibrate. Thus, while its momentum, spatial motion, whatever-you-want-to-call-it might "appear" to stop, its dispersal of energy would not, due to the vibrations. •Jim62sch• 02:28, 5 October 2006 (UTC)
I was just speaking classically. By "no spin effects" I meant no english on the ball, a macroscopic classical spin, not a quantum effect. In classical mechanics, if the strike is dead-on and the masses are the same and the balls are not spinning, the first ball will stop, the second will continue on with the same velocity. PAR 02:45, 5 October 2006 (UTC)
True enough, in a sense; however, we're not really discussing billiard balls here (quite the leap from gas molecules to billiard balls). Also, the first ball will not stop at the nanosecond it strikes the second, but some time afterward. If one could snap pictures at a sufficiently fast rate, movement would still be seen. And the molecules, atoms, and particles or strings will continue to vibrate. In fact, were we to carry out this experiment in a frictionless, microgravity environment, we would see Newton's third law at work. •Jim62sch• 11:54, 5 October 2006 (UTC)

[edit] Conclusion --

PAR, we now have a well-coupled dialogue:


You said (about mixing A and B), “If energy dispersal exists in this case, how would you quantitatively define it and/or measure it?” (Previously, you said, “No problem with what we both know is actually happening” in re my 05:37 4 Oct. comments about ‘real molec movement'.)


Let's take a more obvious example than A and B 'anonymous gases' -- say, the molecules that are responsible for the color in a liquid food dye. In their bottle (which is mainly water) they really are speeding and colliding, and this kinetic activity (at an unchanged T) is their 'internal energy'. The molecules of the dye in one drop placed in a beaker of water at T, even with no stirring or convective currents, will very slowly and spontaneously move so that the whole beaker has a faint color. This, to me or any chemist, is only understandable as a dispersal of the internal energy (the kinetic movement, but also including any potential energy) of those dye molecules. Mainly and ceaselessly colliding with water molecules, they reach an equilibrium state of a greatly spread-out energy distribution (compared to their state in the bottle) in the space of the enormously greater volume of the beaker of water. The total of the internal energies of all those dye molecules has not changed a bit, but their distribution in space has become spread out perhaps a thousandfold. This is an example of spontaneous energy dispersal of the molecules, an example of a spontaneous entropy increase (but I will discuss examples whose entropy change is easier to calculate).


For over a century, this has been a quantitative technique for determining the amount of a colored substance in a solution, fundamentally based on how spread out the energetic molecules are in a particular sample. (It can be used equally successfully to determine the concentration of red-orange bromine vapors in air et sim., of course.)
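(The colorimetric technique referred to is the Beer-Lambert law, A = εlc; a minimal sketch in which the molar absorptivity, path length, and absorbance are assumed, illustrative values -- not data for any particular dye:)

    # Beer-Lambert law: absorbance A = epsilon * l * c.
    epsilon = 1.0e4   # L mol^-1 cm^-1, assumed molar absorptivity
    l = 1.0           # cm, assumed cell path length
    A = 0.25          # measured absorbance (illustrative)

    c = A / (epsilon * l)        # concentration in mol/L
    print(f"c = {c:.2e} mol/L")  # c = 2.50e-05 mol/L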


On the educationally lowest level, then, this is a quantitative definition and measure of molecular motional energy dispersal. Let’s move to a more fundamental level. My definition at the start of our discussion (with mixing as a principal case) was, “Entropy change is the quantitative measure of spontaneous increase in the dispersal of energy/T …”


Consider a mole of argon in a 24.5 L chamber, connected by a stopcock to an identical chamber with a mole of helium, the two mounted in a thermostat at 298 K. The stopcock is opened and the two gases spontaneously mix.


Obviously, the internal energy of the argon -- the motional energy of the total of all the molecules of argon -- that was concentrated in its 24.5 L is more spread out in the final 49 L. But at the next, more fundamental level is the entropy increase on mixing: ΔS = -R (nA ln xA + nB ln xB), where nA and nB are the moles of the components and xA and xB are their mole fractions. With the amounts in our example, the ΔS for either argon or helium comes out to be 5.76 J/K. This is the quantitation that you demand for the spreading out or dispersal of the motional energy of argon in this mixture.
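(A quick numerical check of the 5.76 J/K figure, assuming ideal mixing of one mole of each gas, per the formula just given:)

    import math

    R = 8.314                      # J K^-1 mol^-1
    n_Ar, n_He = 1.0, 1.0          # moles of argon and helium
    x_Ar = n_Ar / (n_Ar + n_He)    # mole fraction, 0.5
    x_He = n_He / (n_Ar + n_He)

    dS_Ar = -R * n_Ar * math.log(x_Ar)   # argon's contribution
    dS_He = -R * n_He * math.log(x_He)   # helium's contribution
    print(round(dS_Ar, 2), round(dS_He, 2))  # 5.76 5.76 (J/K each)
    print(round(dS_Ar + dS_He, 2))           # 11.53 J/K total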


But I have emphasized again and again, and assumed that readers would do the calculations, that the ultimate measure of dispersal of molecular motional energy in a process is the change in the number of accessible microstates as a result of that process. It is a trivial calculation thanks to Boltzmann's (and Planck's) genius: the W_Final from ΔS = k ln(W_Final/W_Initial).


Given that the standard entropy of argon is ~155 J/K mol (and, of course, its W_Initial at 0 K is 1), there are two questions. 1. How many microstates are accessible for a mole of argon at 298 K in its original chamber? 2. How many are accessible for the argon after the two gases have been mixed? (This question simply involves adding 5.76 J/K to argon's standard-state entropy and recalculating W.)


The answers are: 1. About 10 ^ 4,880,000,000,000,000,000,000,000. [This exponent is ≈ 4.88 x 10 ^ 24.] 2. The W for the argon in the mixture is about 10 ^ 5,060,000,000,000,000,000,000,000 [≈ 5.06 x 10 ^ 24].
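(The arithmetic behind these exponents, as a sketch: W itself far overflows any float, so only log10 of W is computed; the ~155 J/K figure is the one quoted above:)

    import math

    k = 1.380649e-23            # J/K, Boltzmann constant
    S_initial = 155.0           # J/K, standard entropy of 1 mol argon (approx.)
    S_mixed = S_initial + 5.76  # J/K, after mixing

    def log10_W(S):
        # S = k ln W  =>  log10(W) = S / (k ln 10)
        return S / (k * math.log(10))

    print(f"10^({log10_W(S_initial):.3g})")  # 10^(4.88e+24)
    print(f"10^({log10_W(S_mixed):.3g})")    # 10^(5.06e+24)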


The numbers of these accessible microstates are enormous to the level of incomprehensibility. They should be interpreted simply as the relative number of choices a system has, in one instant, of being in a different microstate the next: the larger the number, the smaller the chance of the system remaining in its present microstate -- i.e., the smaller the chance of being localized. Thus, in contrast to localization, the greater the number of accessible microstates, the greater the dispersal of the energy of the system. This is the definition of entropy change that I repeated in my 4 October summary here.


(A system never could explore ["in a temporal dance", as Harvey Leff says] even a tiny fraction of the calculated number of microstates in 'near-infinite' time, and in fact computer programs show a rapid narrowing down from the 'truly incredible 10^10^25' to the merely 'gigantic'! Nevertheless, the greater the entropy change, the greater the number of calculated accessible microstates.)


Spontaneous entropy increase always involves the dispersal of energy. FrankLambert 17:30, 5 October 2006 (UTC)


Thanks for your reply. Please - I am having trouble visualizing energy dispersal on a microscopic level. I would like to concentrate on the educationally lowest level (i.e. spatial energy dispersion in the entropy of mixing case) since if we can get that out of the way I believe the rest will be easy. You said:

The total of the internal energies of all those dye molecules has not changed a bit, but their distribution in space has become spread out perhaps a thousand fold. This is an example of spontaneous energy dispersal of the molecules, an example of a spontaneous entropy increase.

Suppose just after the dye is dropped into the water, a dye and a water molecule collide and the dye molecule comes to rest (assume they have the same mass). The dye molecule has no energy now. Then, the dye molecule is struck only by water molecules that have not collided with a dye molecule, and finally the dye molecule makes it to the edge of the glass. It carries none of its original energy to the edge of the glass - the energy it carries is only the energy it has picked up in its collisions with water molecules. The presence of a dye molecule some distance from where it originally was does not imply that its original energy has been carried along (i.e. dispersed) with it. Also, I still cannot measure energy dispersal. The measure you gave is:

With the amounts in our example, the Δ S for either argon or helium comes out to be 5.76 J/K. This is the quantitation that you demand for the spreading out or dispersal of the motional energy of argon in this mixture.

This Δ S is the amount of entropy change, not a measure of energy dispersal. PAR 19:21, 5 October 2006 (UTC)


Hey PAR -- this helps me enormously to understand your complex 17:43, 4 October post, from which I gained a glimmer but not enough. I believe that THIS is the key to our differences: You are, it seems, an expert physicist/math guy, and so, like Jim62sch, you focus well on individual particle collisions.
But that sort of detail is of NO concern ZERO zilch when you have 10^23 -- or even a teensy 10^14 -- molecules!!!! THIS is the kind of system chemists and most students deal with. Moving at an average of a thousand miles an hour, an N2 molecule at 1 atm and 298 K collides about 5 billion times a second; THAT'S why everything averages out and why any specific 'brand' (i.e., mass, possible rotational and vibrational modes) of molecule DOES retain its initial motional energy AS it disperses. (Of course, that's simple mixing with no change in temp -- it's a temp change that causes a different KIND of dispersion, but still a greater number of accessible microstates -- a change in the M-Boltz distribution so more molecs can be on the higher energy levels. Excuse me, I'm writing fast -- you know THIS sort of change VERY well!)
BUT, considering your last two lines, PLEASE reread my next-to-last paragraph -- just as the concrete beginning-student view sees molecular energies not left localized but dispersed in space in an entropy increase, entropy increase profoundly but abstractly means more accessible microstates: LESS LOCALIZATION of the system, in the sense of having more rather than fewer microstates in ONE of which the system might be the next instant -- THAT is what energy dispersal DOES/IS (!) in terms of entropy. Delta S IS the key to understanding, for concrete to high-level abstract thinkers, how energy dispersal is involved in spontaneity.
Excuse caps! Hastily, FrankLambert 20:42, 5 October 2006 (UTC)

[edit] Some observations (energy dispersal)

Some observations (made already, but I thought worth gathering together again) on "dispersedness of energy" as the be-all-and-end-all way of thinking about entropy:

1. The most straightforward interpretation of energy dispersal fails for entropy of mixing, because at the end of the day both originally separated parts of the system will have contents with exactly the same energy (and the same energy density) that they started with.

2. What happens if a structure with configurational entropy is cooled towards a limiting temperature of absolute zero? In the limit there's still a residual entropy, even though there's no energy.

3. The most dispersed arrangement of energy of all, a completely even distribution, actually has very low entropy, because there is only one way to achieve it (a small counting sketch follows this list). In reality, in equilibrium (maximum entropy), there are significant deviations from perfect dispersal.
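(A small counting sketch of point 3, using an Einstein-solid-style toy model chosen purely for illustration -- q energy quanta shared among N oscillators; the numbers are arbitrary:)

    from math import comb

    N, q = 100, 100   # oscillators and energy quanta (illustrative)

    # Total number of ways to distribute q quanta among N oscillators:
    # the stars-and-bars count C(q + N - 1, N - 1).
    total = comb(q + N - 1, N - 1)

    # The perfectly even arrangement -- every oscillator holding exactly
    # q/N = 1 quantum -- corresponds to a single microstate.
    print(f"all distributions: ~10^{len(str(total)) - 1}")  # ~10^58
    print("perfectly even distribution: just 1 way")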

To even start to make the "dispersal of energy" story work, it seems to me you have to start making non-obvious special definitions that are at least as radical as defining "disorderedness" in terms of multiplicity.

Fundamentally, the problem is that it's not energy being dispersed between different microstates - it's probability. Whichever microstate the universe ends up in, each of those microstates must contain all the energy of the original microstate. It's not energy being dispersed, it's certainty being spread out into more and more broadly spread distributions of probability.
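(A minimal sketch of the 'dispersal of probability' reading: the -∑ p ln p measure, in units of k, rises as probability spreads over more microstates, even though each microstate carries the full energy. The four-state toy distribution is purely illustrative:)

    import math

    def entropy_in_k(p):
        # -sum p ln p over the nonzero probabilities, in units of k (nats)
        return 0.0 - sum(pi * math.log(pi) for pi in p if pi > 0)

    sharp = [1.0, 0.0, 0.0, 0.0]   # certainty: all probability on one microstate
    spread = [0.25] * 4            # probability dispersed over four microstates

    print(entropy_in_k(sharp))     # 0.0
    print(entropy_in_k(spread))    # 1.386... = ln 4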

That's why ultimately, I fear tying entropy uniquely to the idea of energy dispersal can become a much bigger stumbling block to a good understanding of entropy even than the confusion which can be caused by the word "disorder". Jheald 23:44, 5 October 2006 (UTC).

Incidentally, I think we should also be careful with the phrase "accessible microstates". It can get quite confusing as to whether we're talking about the totality of ergodically accessible microstates in infinite time, or the "number of choices that a system has in one instant of being in a different microstate the next" (where I assume we're thinking about a Heisenberg-picture notion of states of the system, subject to "quantum jumps" from external perturbations). Better, usually, I think, to talk about the number of microstates compatible with the particular macroscopic and dynamical description we have in mind. Jheald 00:05, 6 October 2006 (UTC)


[edit] Thermodynamic Entropy

Thanks to Jheald for relegating older Talk:Entropy sections to an Archive. Skilled in information theory, Jheald has every right to express his understanding of that topic (e.g., as in Archive 3 at 21:41, 4 July). However, I chose to emphasize the word "Thermodynamic" in the heading of this section because his preceding comments may confuse readers who do not realize his divergent views about thermodynamic entropy.

Taking them in order and by number: 1. Here, Jheald implies that "energy of A / volume" exactly equals "energy of A / 2 volumes". I'm sure it is an innocent error. The second paragraph in my "Conclusion" above corrects this kind of misunderstanding. I would urge readers to check it and its explication.

2. I told Jheald (19:25, 30 June) about a ms. from a colleague concerning the "problem" of residual entropy and, to save the list complex trivia, offered to give full details via email. He didn't respond. The ms. has since been accepted for publication (E. I. Kozliak, J. Chem. Educ., "in press"; it will be published within 6 months; email me and he might be willing to release a copy of the ms.). It is perfectly covered by my emphasis on energy as the proper focus in evaluating entropy change: multiple Boltzmann distributions (of ENERGY/energetic molecs.!) being frozen in to solid CO, H2O, FClO3, etc.

3. This statement shows the confusion that arises when information 'entropy' ideas are misapplied to thermodynamic systems. All thermodynamic entropy numbers/calculations/evaluations are actually entropy changes -- from 0 K, or from an initial state to a final state [each of them ultimately relatable to the 0 K state]. There is only one situation in thermodynamics where any system can have "one way to achieve it", and that is a perfect crystal of a substance at 0 K. Thermodynamic equilibrium for any system indeed is a max distribution of energies, i.e., max dispersal, MaxEnt for that system under its particular equilib. constraints of P, V, T, etc.

The dispersal of energy evaluated by the Boltzmann entropy equation is coincident with the multiplicity of thermal physics. That's a major part of its value -- direct relationships that unite simple standard views of molecular behavior that then lead to stimulating new insights in stat mech and are consonant with QM.

JHeald: "the problem is that it's not energy being dispersed between different microstates - it's probability. Here we go again! (First, _I_ am not talkng about energy "being dispersed BETWEEN different microstates"!!! The energy of a system is ALWAYS and ONLY present in one microstate at one instant. If it stayed in that same microstate or had only a dozen or a million microstates TO WHICH IT COULD change in the next instant, that would be a more localized distribution of that energy than if there were the choice of any one of quadrillions or godzillions of microstates to which it could change. That is what 'energy dispersal' in a system means in terms of microstates.)

Then, JHeald's viewpoint is skewed toward info 'entropy' and diverts emphasis from thermodynamic entropy, the focus of this section of Wikipedia. Information theory, indeed, is half of thermodynamic entropy results; the probability measurement of energy's dispersal is the 'actualizing' half. It is important. (See 21:32 of 1 July for my presentation.) But the function of probability is in describing the max number of ENERGY distributions that are accessible under the constraints of the process being examined. I don't think that a roaring gas flame heating a huge boiler, or even a hot gas-stove burner boiling a pan of water, is well described by "certainty being spread out into distributions of probability". Thermodynamic entropy deserves a better press than that :-).

Energy dispersal a stumbling block? I don't know of evidence for that. Wish I could quiz the 50,000 or more students who were exposed to it instead of 'disorder' last year.

"Accessible microstates"? We are indeed talking about the totality of ergodically accessible microstates in infinite time which is better described to beginners in the terms of 'choices..etc.' NO WAY are those Heisenberg QM susceptible states! The 'jumps' occur because each microstate describes a particular set of molecular energies of the system that certainly ARE compatible with what JHeald urges. FrankLambert 04:51, 6 October 2006 (UTC)


There are important points in the following by Jheald, and I think my responses would be more readily related to them if, instead of a separate large reply, each of my comments were inset immediately after his statement. I hope that this meets his wishes; to avoid confusion I will add our initials each time: Jh and FLL. FrankLambert 04:55, 7 October 2006 (UTC)
I'm going to be away from the 'net for a bit, so forgive me if this is my last follow-up for a while. Jh
Firstly, information theory & thermodynamic entropy. As Frank says, there are two elements to thermodynamic entropy. One is the numerics - as far as this goes, thermodynamic entropy simply is an information entropy - it's the same fundamental formula, ∑ p ln p, arising for the same mathematical reasons, with the same interpretation: thermodynamic entropy is a measure of the uncertainty about the microstate of a system given a macroscopic description. The second element is that this is not just any old uncertainty; this measure has a particular physical significance. Jh
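(For readers following the numerics: in the equiprobable case, the ∑ p ln p formula Jh cites reduces to the Boltzmann form -- standard textbook algebra, added here only as a reading aid:)

    \[ S \;=\; -k \sum_{i=1}^{W} p_i \ln p_i \;\overset{p_i = 1/W}{=}\; -k \, W \cdot \frac{1}{W} \ln \frac{1}{W} \;=\; k \ln W \]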
I agree with virtually all that Jheald writes -- it's just that, to me, that paragraph is like a near-perfect replica of a human being at the instant of/after death. All the framework, all the systems, all the chemicals...but no energy flow. That's really all that separates us: I see the enabling of constant energetic molecular movement as SO vital, SUCH a dynamic activity that MIGHT make something happen, that when a process ALLOWS it to happen and it takes the course that probability predicts, that's a nice conclusion. Jheald, as I hear him, marvels at the truly near-infinite number of relationships describable by probability -- from encrypting messages to predicting the number of energy arrangements when a pot of water changes to steam -- and thus he sees thermodynamics as just something of 'particular physical significance'. FLL
According to Frank, that's because thermo entropy counts the different possible arrangements of energy. I think that's too much of a short-cut. I think it's more accurate to say that thermo entropy counts the different possible arrangements of the system. The relation to energy is that (spatial) dispersal of energy is one way entropy can increase, so if we want to "undisperse" energy from heat into useful work, that has to be paid for by some other increase in entropy (for example, by the entropy of mixing in the thought-experiment some sections above). Jh
Thermo entropy is defined in terms of an energy transfer/dispersal/spreading out: q(rev)/T -- q spreading out either to or from somewhere, depending on whether the 'somewhere' is a tiny bit cooler or warmer. Thus, we're really required (or I'd say privileged!) to see the commonality of spontaneous events as involving energy dispersal: whether we're looking at mixing (which we have beaten to death this week!) or thermal energy transfer, it is conceptually unifying that all involve energetic molecules moving. Jheald is right that traditional statistical mechanics has successfully counted the number of arrangements from seemingly stationary 'cells'/positions of the system's molecules in phase space and arrived at its entropy. However, in August, I had a ms. accepted for publication that states the obvious: in phase space, the location of a molecule and its momentum are coincident. Thus, the number of possible arrangements of the locations of molecules is precisely the same as the number of distributions of energy for a system. (As I have cited before, Hanson, J. Chem. Educ. 2006, 83, 581-588, foresaw this result in his calculations.) FLL
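(A one-step numerical illustration of q(rev)/T, with assumed values: heat q leaving a spot at 300 K and spreading into surroundings 'a tiny bit cooler' at 299 K gives a net entropy gain:)

    q = 100.0       # J, heat transferred (illustrative)
    T_hot = 300.0   # K
    T_cold = 299.0  # K, "a tiny bit cooler"

    # Gain by the cool side minus loss by the warm side.
    dS_net = q / T_cold - q / T_hot
    print(round(dS_net, 5))   # 0.00111 J/K > 0: net dispersal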
This second element, it seems to me, is not so much about what entropy is, but rather why this particular information entropy - ∑ p ln p applied to the thermodynamic setup - is so significant. Jh
Agreed. Both together constitute the what. (Disagreed (!) ONLY for emotional reasons! -- info theorists see their beloved domain as the most important; thermodynamicists see energy as fundamental. There'd be no 'why' if either were missing.) FLL
Now, dispersal. We can accurately say that an increase in entropy is associated with a dispersal of probability. That is something that is reliable, is simple, and always true. On the other hand, it seems to me that to insist that everything must be fitted into the language of "dispersal of energy" necessitates more and more complication - "oh well, you have to distinguish between the energy associated with molecules A and molecules B"; "oh well, it still sort of works even if there's no energy"; "oh well, it's not a perfectly even spatial dispersal - because that would have unphysically low entropy - it's some other sort of dispersal"; "oh well, it's not spatial dispersal, it's microstates"; "oh well, it's not energy dispersal between microstates, because they all have the same energy, it's dispersal of what microstate the energy might fall into"; ... -- Cui bono? Jh
Sorry. Don't want to put you down, but you've missed the simplicity of all that I have been writing about in mixing. Here are serious responses: we deal with so many molecules that A's molecules and B's different-mass molecules have no problem distinguishing themselves from the other guys; no, nothing works if there is no energy; you go to 0 K if you want to fool around with single-state non-dispersal!; it IS always spatial dispersal on a macro scale, just as in the abstract/theoretical it IS always the number of microstates. All the above? No sweat. Like shooting fish in a barrel IF you merely read what I've written in the past week. FLL
What's the advantage of being such a one-club golfer? Jh
Because you're playing the wrong game!! It's pool (on a 3-D or 6-D table :-)) with 10^23 balls and ONE FAST CUE, man!! FLL
Isn't it easier to say that (spatial) energy dispersal is one way entropy can increase -- and it's particularly important because it's what we need to counterbalance, if we want to turn some heat into useful work? Jheald 10:18, 6 October 2006 (UTC)
Yipe!! Please, please flip up to my 00:22 4 October detailed presentation of the two major modes of entropy increase -- that puts the types of thermal energy transfer and expansion/mixing in perspective. Thx! FLL FrankLambert 04:55, 7 October 2006 (UTC)