Greetings from Paris! Just checking in to do a bit of self-promotion, from which no blog-vacation could possibly keep me. I’ve written an article in this month’s Scientific American about the arrow of time and cosmology. It’s available for free online; the given title is “Does Time Run Backward in Other Universes?”, which wasn’t my choice, but these happenings are team events.
As a teaser, here is a timeline of the history of the universe according to the standard cosmology:
- Space is empty, featuring nothing but a tiny amount of vacuum energy and an occasional long-wavelength particle formed via fluctuations of the quantum fields that suffuse space.
- High-intensity radiation suddenly sweeps in from across the universe, in a spherical pattern focused on a point in space. When the radiation collects at that point, a “white hole” is formed.
- The white hole gradually grows to billions of times the mass of the sun, through accretion of additional radiation of ever decreasing temperature.
- Other white holes begin to approach from billions of light-years away. They form a homogeneous distribution, all slowly moving toward one another.
- The white holes begin to lose mass by ejecting gas, dust and radiation into the surrounding environment.
- The gas and dust occasionally implode to form stars, which spread themselves into galaxies surrounding the white holes.
- Like the white holes before them, these stars receive inwardly directed radiation. They use the energy from this radiation to convert heavy elements into lighter ones.
- Stars disperse into gas, which gradually smooths itself out through space; matter as a whole continues to move together and grow more dense.
- The universe becomes ever hotter and denser, eventually contracting all the way to a big crunch.
Despite appearances, this really is just the standard cosmology, not some fairy tale. I just chose to tell it from the point of view of a time coordinate that is oriented in the opposite direction from the one we usually use. Given that the laws of physics are reversible, this choice is just as legitimate as the usual one; nevertheless, one must admit that the story told this way seems rather unlikely. So why does the universe evolve this way? That’s the big mystery, of course.
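To make the reversibility point concrete, here is a minimal sketch (a toy example, not from the Scientific American article) of a reversible dynamical law in Python. A harmonic oscillator is stepped forward with the time-symmetric velocity-Verlet scheme; flipping the velocities and stepping again retraces the whole history, so nothing in the dynamics itself singles out a direction of time.

```python
def verlet(x, v, n_steps, dt=1e-3, omega=1.0):
    """Velocity-Verlet integration of a harmonic oscillator, a(x) = -omega**2 * x.
    The scheme is time-symmetric: reversing the velocities and integrating
    again retraces the trajectory exactly, up to floating-point round-off."""
    a = -omega**2 * x
    for _ in range(n_steps):
        v += 0.5 * dt * a   # half kick
        x += dt * v         # drift
        a = -omega**2 * x
        v += 0.5 * dt * a   # half kick
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = verlet(x0, v0, 10_000)    # evolve "forward in time"
x2, v2 = verlet(x1, -v1, 10_000)   # flip velocities, evolve again
print(x2, -v2)                     # ~ (1.0, 0.0): the initial state, recovered
```

On this view, the asymmetry we actually observe has to come from the boundary conditions (the low-entropy early universe), not from the equations of motion.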
Dear Sean,
Firstly, I want to say that I think your model is really interesting. I also think that you should be commended for taking on such a difficult issue. I think your model has some problems, though. Take the big bang. At this point, you posit that events evolve both towards the future and the past with entropy increasing. This is akin to simply treating time as being both positive and negative, and presumably this is justified by the fact that the second law predicts that entropy should increase equally in both the past and future directions.
However, quite apart from the following being clearly problematic, there doesn’t seem to be any real reason in your model why entropy shouldn’t equally increase into the past at all times during the future-directed evolution (rather than just at t = 0). Moreover, and more importantly, evolving something into the past only makes sense if one has a past to work with (i.e. something that has already happened); otherwise it wouldn’t, and couldn’t, be the past! Without one, the temporal orientation of the model’s so-called past-directed evolution would be indistinguishable from the so-called future-directed one. Of course, at t = 0 you do have a past to work with: the earlier universe. But for events to evolve into the past here, entropy would have to be decreasing in that direction, and that would be a no-go.
This, however, is not to say that entropy cannot increase in both time directions, as I certainly think it can. One needs a past to work with, however, and by definition this is set and has already taken place. Moreover, the reason entropy cannot ‘simultaneously’ increase in both time directions is that if one reverses the direction of evolution of a system with entropy increasing, entropy will be decreasing in that direction. The system already has a past which is set, and one cannot invent a new one for it.
Forgetting the above, though, I think your model has a more basic problem. If the past were infinite and the universe had no beginning point, it would be impossible for it to evolve, not only to where we are today, but at all (forward or back). Of course, the idea of a universe having a beginning at some finite time in the past is equally problematic. Neither can be correct. Although you are obviously far from alone in this respect, I think you have to face up to this.
Best wishes
Peter
I have a problem with the idea that the number of possible entropic microstates remains constant across the timeline from nearly empty to nearly empty. It seems to me that communication of individual perceptions carries information which can affect outcomes in a blend of eddies that reflect qualitative differences.
Yes, there is a fertile field at the extreme limit which one might refer to as quantum-gravitational microstates. Even so, I cannot see why this uniform and featureless blend of perfectly organized building blocks should not be viewed as unstable zero entropy, ready to burst with potential qualities expressed as conscious and democratic decisions by perceivers who act as individuals to qualitatively manipulate unfolding physics.
Am I way off course here? It’s kinda important to get this part of things straight.
Garrett
Garrett Connelly wrote (#102):
>
> Yes, there is a fertile field at the extreme limit which one might refer to as quantum-gravitational microstates. Even so, I cannot see why this uniform and featureless blend of perfectly organized building blocks should not be viewed as unstable zero entropy, ready to burst with potential qualities
Leaving aside the (sentient?) perceivers aspect, which seems entirely irrelevant, and assuming “unstable zero entropy” is supposed to mean something like “immanent or potential low entropy”, my model (summarized in #59 and #96) does precisely that.
Some overarching influence is needed to sweep those microstates into a configuration where the “potential qualities” you refer to can cohere into a new form of dynamics (which I argue is comparable to an early low-entropy evolutionary stage of the previous lot of microstates, and the process can be repeated).
Although the influence must have the appearance of being fairly uniform, one need not take the word “overarching” too literally: The uniformity could be a consequence of local interactions acting the same everywhere, rather as a large flock of starlings seems to swoop and soar as a unit even though this is a consequence of each bird simply tracking its nearest (seven I gather?) neighbours.
I think Peter Lynds is thinking along the right lines; but my impression, after a quick look at his paper, was that it doesn’t adequately address the issue of initial low entropy.
Interesting speculation, a bit out there, but interesting. Loop Quantum Gravity and BigCrash.org have some interesting and possibly more compelling speculations.
I do find the notion of dark energy very plausible. If space contains even a small amount of vacuum energy, that would undermine the arguments of some that Hawking radiation might require energy to be pulled from black holes, somehow across the event horizon, to balance the energy equation. That disputed, and now more frequently rejected, theory was based on several other dubious conjectures as well…
Speaking of conjecture, some are claiming that the safety arguments concerning the creation and capture of micro black holes on Earth are themselves nothing more than conjecture.
Any idea why some very eminent scientists, such as the inventor of chaos theory’s Rössler attractor, are EXTREMELY concerned that the risks from the Large Hadron Collider may be other than what is being told to the public? The danger might actually be a virtual certainty if some disputed assumptions are in fact, as some scientists believe, possible or likely.
Did you know that we currently have NO credible reason to exclude possible high or inevitable risk from stable, fast-growing micro black holes?
No good scientific reason to reasonably rule that possibility out. Really.
Start this learning process at LHCFacts.org.
John Ramsden: “I cannot see why this uniform and featureless blend of perfectly organized building blocks should not be viewed as unstable zero entropy, ready to burst with potential qualities”
It’s an “alternative” to progressing the view of what is actually taking place in RHIC and the LHC, keeping in mind what Sean is offering for perspective.
What is self-evident by way of inductive-deductive inference (which historically sits at the very derivation of a constitution by John Adams) is an exercise in coming to the next step: what is possible in an “other universe” is not possible in this one.
By phenomenology alone, and while there are experts here on this site, such leading experimental situations offer the continuance of this perspective that I offer to Sean’s position?
“A phase” in the collision process itself? Provides for the “relativity of expression” at this level?
Plato (#105), Those looked like my words. But that whole first “paragraph” was a quote from the preceding post by Garrett Connelly, to which I was replying.
Does it mean that our universe and all other universes are just bubbles in the ocean of eternal unchanging energy?
I have been fascinated by such references in ancient Indian mythology, but I am not sure if there is ever a way to prove it.
On the other hand, just the idea of things moving to higher and higher entropy for ever does not seem tenable either.
I think I read in Brian Greene’s book that if the laws of physics are symmetric with respect to time, then the Second Law of Thermodynamics applies in the time-reversed direction as well, which would imply that the present state is always the lowest-entropy state?
Paradoxical?
hitesh
My apologies John R Ramsden.
The reply still holds.
No offence – easily done. I’ll start using the ‘blockquote’ syntax, which I just noticed.
Maybe I’m being thick, but it wasn’t clear what you were trying to convey in #105.
It’s a continuation of #99.
For such “beginnings and ends” to occur in our universe, they must satisfy a “relativistic interpretation” even at such quantum perspectives as denoted in the microseconds of the early universe? I think “this point” is important, and one must push back perspective according to the experimental data?
From all the responses to Dr. Carroll’s article in the June 2008 issue of Sci Am, it is clear that it is well written and makes for interesting reading. Thanks to him, and perhaps his wife, Jennifer Ouellette, for making science reading fun and informative (#33). We need more such scientists to help educate us on science in our complex modern world. Thanks and keep up the good work.
Quoting from his Sci Am article: “Unfortunately, we do not fully understand entropy when gravity is involved” and, “In actuality, though, empty space has plenty of microstates: the quantum-gravitational microstates built into the fabric of space. We do not yet know what exactly these states are, any more than we know what microstates account for the entropy of a black hole.”
Perhaps in the evolution of a black hole the intense concentration of matter and energy leads to extremely high temperatures and matter “disappears”, the reverse of the situation in the Big Bang where, in the first fractions of a second or so, the universe cools and matter first appears. In both cases it seems that without matter, gravitational energy is minimal or absent, and in both cases we have a hot, dense concentration of energy. Just as in the Big Bang energy is thought to be homogeneous and in a low-entropy state, so too we might expect at least some black holes to evolve into a similar, rare, homogeneous microstate that reverses the entropy from medium or high to low, and poises the black hole for a Big Bang scenario and a new universe.
While this “bounce” violates the second law of thermodynamics, it occurs under conditions rarer and more extreme than any others in the universe, and thus may not follow the same physical laws. However, it could lead to a state similar to the one we know occurred at the time of the Big Bang.
How do you think you’re going to deal with ‘entropy’?
A good read though, thanks!
Thanks John R Ramsden. In relation to the issue of initial low entropy, my model (if correct) simply shows that there isn’t actually an issue and that the question itself is misconceived. It can be a tough one to get one’s head around, though.
Sean, I would be curious to hear how you would respond to my comments. Think of it as a friendly/collegial cosmological call out.
Best wishes
Peter
I want to, and try to, keep my keyboard in neutral and strive to look strictly through others’ eyes; yet, in this particular realm, is entropy made any more complicated or difficult to understand by gravity than by the emergence of sentient perceivers?
Brian Greene says the only thing expanding as fast as the cosmos is consciousness. Albert Einstein said our perception of ourselves as individuals is something of an “optical illusion” of consciousness.
These points of view have practical everyday implications that are difficult to bring up most of the time.
Cordially wondering,
Garrett
Pingback: Reality is what happens to you while you’re busy coming up with other theories.* « Communion Of Dreams
The arrow of time always points towards the future, because time is not just motion; forces are also part of time. One way to understand this is to consider two objects orbiting each other and imagine what would happen if time were slower, faster, or stopped for the orbiting pair. Both motion and forces would change, increase and decrease or stop completely, in the above scenario with a change in time.
At the quantum level, time seems to be symmetrical because we fail to take into account that forces are also part of time. Forces determine the arrow of time. That is why objects fall down but do not rise up from the ground unless a force is applied. The same applies to any other situation.
I think that the way we are understanding time is wrong. There can be two reasons why time is asymmetric: 1. we still don’t fully understand the way time works; 2. we are looking at time in such a way that it seems asymmetric.
ENTROPY AND “DISORDER”
Since my sophomore year some four decades ago, when I first met ENTROPY, I’ve been pondering the darned thing (not continuously, of course), desperately trying to make sense of it. My efforts certainly included Boltzmann and Gibbs, but not until I came across Shannon and informational entropy did I make any significant progress. I think I can now identify the culprit impeding my understanding: it was the culturally entrenched qualitative description of entropy as a measure of “disorder.”
The disorder explanation is widely used, especially when addressing scientists outside of physics, and the general public. In his SciAm article, Sean uses it several times, and with great stress. I claim that THE EXPLANATION OF ENTROPY AS A MEASURE OF DISORDER IS MISLEADING. But because it has become a culturally ingrained MEME that gets passed on uncriticized, even some accomplished physicists who were raised on it turn out to actually not understand entropy at all, let alone undergrads, engineers, non-physics scientists, science writers, the educated public and self-confident philosophers.
While entropy properly applies to a SYSTEM — the totality of all system states of non-zero probability, represented by an ENSEMBLE — I think most people intuit order and disorder, as I once did, as attributes of a state. Consider, for example, a rigid cubical vessel full of an ideal, monoatomic gas in thermodynamic equilibrium. For simplicity of discussion look at the gas particle locations only. In one PARTICULAR state, call it DG1, the particles are evenly distributed throughout the volume, but randomly, unlike in a crystal. To specify DG1 completely one must list all its particle coordinates — no data compression possible. This is what one would call disorder. In another PARTICULAR state, OG1, the particles are also evenly distributed, but periodically, as in a crystal, and conforming to the vessel walls. Few parameters are needed to specify OG1 and natural language would call it an ordered state. Yet both DG1 and OG1 are states of the same system and, being fully specified, are equally probable. But to neither of them does the concept of entropy apply. It does apply to the ensemble of all such states, is related to their number in the ensemble, and is divorced from their degree of order or disorder. (If most system states happen to possess a high degree of order, then, one may argue, their number must be relatively small and so must be the system entropy. While true, this roundabout argument has hardly any didactic value.)
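To make the DG1/OG1 contrast concrete, here is a small Python sketch (my own illustration, not Dov’s), using zlib-compressed length as a crude stand-in for description length. The ordered state compresses enormously while the random one barely compresses at all, yet both are single, fully specified microstates of the same ensemble.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # particle coordinates, each a cell index in 0..255

# "OG1": a perfectly periodic arrangement, like a 1-D crystal
og1 = np.tile(np.arange(0, 256, 8, dtype=np.uint8), N // 32)

# "DG1": the same number of particles, scattered uniformly at random
dg1 = rng.integers(0, 256, N, dtype=np.uint8)

for name, state in [("OG1 (ordered)", og1), ("DG1 (disordered)", dg1)]:
    print(name, "->", len(zlib.compress(state.tobytes(), 9)), "compressed bytes")
# OG1 shrinks to a few hundred bytes; DG1 stays near 100,000 bytes.
# The ensemble's entropy is indifferent to which of the two is realized.
```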
Consider now the example of the broken vs. whole egg that stars in Sean’s SciAm article. Again, for the sake of simplicity, consider a PARTICULAR whole egg named Humpty Dumpty (HD) and ignore its macroscopic mechanical motions (e.g., sitting on the wall) and its microscopic thermal motions. Under these restrictions there exists just one distinguishable object that qualifies as a member of the set {HD}. Consequently, {HD}’s entropy is zero and we are justified in praising HD as a neat and orderly dude. Next, let HD have a great fall. The mess thus created surely deserves to be called “disgustingly disordered.” To determine the entropy of the new situation we would usually define an ensemble, a new set called “Broken Humpty Dumpty,” {BHD}, and enlist in it all distinguishable objects that qualify bona fide as broken Humpty Dumpty. Since there is a vast number of such objects — the vast number of different ways HD might splinter and splatter and create Jackson Pollocks — the entropy of {BHD} is great and beyond the powers of all the KH and KM (the King’s Horses and King’s Men).
But notice that the above definition of the set {BHD} is quite arbitrary. In deterministic physics one can mean by {BHD} the actual configuration HD assumed after his historical fall on that fateful afternoon of 15 July 1648, the fall that was documented by notable eyewitness historians in the famous nursery rhyme. In that case, {BHD}, too, contains only a single member and, despite its remarkable messiness, its entropy still equals zero.
A system may have many possible states, most may be “disordered” and some may pass as “ordered.” But regardless of whether the system accidentally or fleetingly occupies an ordered state or a disordered one, ITS ENTROPY IS THE SAME! That’s because THE ENTROPY IS RELATED TO WHAT THE SYSTEM MAY BE, NOT TO WHAT IT IS.
(A plausible measure of state disorder may be the “algorithmic randomness”, a.k.a. Kolmogorov complexity, of the state — the size of the shortest algorithm that generates a specification of the state on a Turing machine. A theorem due to C. H. Bennett (I think; Int. J. Th. Phys. V21, N12, 1982, p.938) demonstrates that the entropy of a system is all but indistinguishable from the MEAN algorithmic randomness over all system states. This is instructive, because the mean of state-disorder taken over all system states does qualify as a system disorder measure. Nevertheless, as said above, this explanation is not at all transparent and hardly serves for clearing the entropy=disorder confusion.)
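Bennett’s point, that entropy matches the MEAN algorithmic randomness over the ensemble rather than the randomness of any one state, can be illustrated numerically. A hedged sketch (my own, using zlib as a computable upper-bound proxy for Kolmogorov complexity):

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)
n_bits = 8192  # each microstate: n_bits fair coin flips, all 2**n_bits equiprobable

# Entropy of the uniform ensemble, in bits: log2(W) = n_bits.
print("ensemble entropy:", n_bits, "bits")

# Mean compressed size over sampled microstates, a computable upper bound
# on the mean algorithmic randomness that Bennett's theorem refers to.
sizes = [
    8 * len(zlib.compress(np.packbits(rng.integers(0, 2, n_bits, dtype=np.uint8)).tobytes(), 9))
    for _ in range(50)
]
print("mean compressed size:", sum(sizes) / len(sizes), "bits")
# The two agree to within the compressor's overhead: entropy tracks the MEAN
# complexity over the ensemble, not the complexity of any particular state.
```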
I think it’s high time to exorcise the demon of disorder from academia and pop science alike. The wrong mental construct of entropy=state_disorder is very readily formed, in students as well as in the educated public, because people feel they understand, from everyday experience, what disorder is. But what they envision are disordered states, not disordered abstract ensembles. Once the wrong construct takes root, however, uprooting it is next to impossible, especially as it has become a meme. It played havoc with my own understanding of entropy and is surely detrimental to the understanding of all who have been exposed to it. [See Carson & Watson, Roy. Soc. Chem., 2002.] I was glad to find in Wikipedia’s “Entropy”, in the paragraph entitled Energy Dispersal, full support of my position and some research-based references. For the sake of sparing future generations that trouble, I hope that the new approach will become a trend and that Sean’s next SciAm article will contribute towards it.
Thanks, Dov Elyada for your entertaining and very instructive thoughts on entropy. If nothing else, it is reassuring for me and perhaps other educated people that entropy is not as simplistic as just “order-disorder”.
If you are so inclined, would you elaborate on your statement, “Entropy of a system is…the MEAN algorithmic randomness over all system states”?
Is “mean algorithmic randomness” quantifiable, perhaps as a measure of randomness? But isn’t entropy an energy term in a thermodynamic equation, and how does that square with “mean randomness”, which is more a statistical value?
Thanks again for your insights.
Rich Wilson
Dear Sean,
I take it that’s a “no comment”. In case you might find it interesting, this paper by John Moffat shares some features with your model (and, to a lesser extent, mine too).
Best wishes
Peter
Hi, Rich.
Please excuse the delay in my answer; I couldn’t find an earlier time.
That entropy is indeed statistical has been well known since 1877 and is, by now, universally accepted. This is due to Boltzmann’s profound discovery, expressed in his celebrated equation:
S = k log(W), where S = entropy, k = Boltzmann’s constant, and W = the number of different molecular microstates, assumed equiprobable, available to the gas in the macrostate corresponding to S.
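A worked example may help fix the scale (a toy system of my own choosing, not from the comment): for 100 independent two-state spins with all W = 2^100 microstates equiprobable,

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact in the 2019 SI)

W = 2**100             # microstates of 100 independent two-state spins
S = k_B * math.log(W)  # Boltzmann: S = k ln W = k * 100 * ln 2
print(S)               # ~9.57e-22 J/K
```

Macroscopic entropies (whole joules per kelvin) correspond to values of W so enormous that only the logarithm is a workable number, which is exactly what Boltzmann’s formula delivers.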
We are not speaking here of the level of statistics that other thermodynamic state variables — temperature, pressure, etc. — are subject to. Unlike these, entropy is not a particle-average of a conserved quantity, such as energy or momentum, that survives the averaging to become a global system variable; it is related to an average of a quantity which is itself statistical in nature. Indeed, there exists no physical property of a system’s micro-constituents whose average can be interpreted as the system’s entropy.
The bulk-thermodynamics facet of entropy — being a proper thermodynamic state variable and appearing in the 2nd Law and theorems derived from it — is, of course, fully consistent with the statistical view.
Since about 1948, due to C.E. Shannon, the concept of entropy has broken through the boundaries of strict thermodynamics to become a major player in information theory, as a measure of the amount of information. At first one might think that information entropy and thermodynamic entropy are just similar or analogous, or that the term has been borrowed by one from the other and is used concurrently with different meanings. But no, they are one and the same. Entropy is the amount of information not only in a phonebook, in a movie DVD, on a communications satellite beam or in the whole of Cyberspace; it is also the amount of information in a volume of hot gas pushing on a car engine piston.
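The dictionary between the two uses of “entropy” is just a change of units: a Shannon entropy of H bits corresponds to a thermodynamic entropy of k_B ln(2) × H joules per kelvin. A minimal sketch (the helper function is my own):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def shannon_bits(probs):
    """Shannon entropy H = -sum p*log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H = shannon_bits([0.5, 0.5])  # one fair coin flip: exactly 1 bit
S = k_B * math.log(2) * H     # the same quantity in thermodynamic units
print(H, "bit =", S, "J/K")   # 1 bit = ~9.57e-24 J/K
```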
Algorithmic randomness and its mean are indeed quantifiable. This is now a huge field and I cannot possibly do any justice to it — or to you — in a blog comment. But if I did succeed in arousing your curiosity enough to follow the lead I can assure you plenty of exciting reading and revelations.
Yours truly
Dov Elyada, Ph.D.
Haifa, Israel
In an ever-expanding, “de-cluttered” spacetime, the density decreases exponentially with the expansion; thus at a single point in an accelerating future time, the “big rip” appears out of the vacuum? At this moment there is entropic phase “flipping”: the Universe comes to be out of the almost-empty vacuum. The signal to those future beings looking backwards must be that of a hot big bang emerging out of a cold big crunch.
Now, interestingly, those forward-looking beings may also record data that registers them to be in existence prior to the big bang… if the current signal is Dark Energy?
Is our perceived arrow of time just like the archer’s arrow that has rebounded off its aimed target and gone, on the rebound, through a permeable target into an area (a vacuum) of less density? The low density of a cold crunch?
Pingback: What Do You Say? | Cosmic Variance
Pingback: Words in the Clouds | Cosmic Variance
Sean, I have one question:
I don’t understand why a social (or human) concept is involved in physics. What I’m wondering about is the example of the “broken egg”. It was given to show that a broken egg has higher entropy because there are countless forms of broken eggs but only one form of good egg. Here we involve a concept that separates the “good egg” from the “broken egg”, but this is from the point of view of human beings. If we evaluate this from the view of nature, the “good egg” is just one of the countless forms of egg. It has no more special meaning to nature than any form of “broken egg”. So when an egg is broken, why do we say that its entropy has increased? For example, from the point of view of a bird, maybe the broken egg is more meaningful to him/her. So to the bird, when the egg is broken, he/she may think the entropy of the egg has decreased.