Breaking radio silence here to report on some of the actual work I’ve been able to complete: a new paper with Heywood Tam.
Unitary Evolution and Cosmological Fine-Tuning
Authors: Sean M. Carroll, Heywood Tam
(Submitted on 8 Jul 2010)
Abstract: Inflationary cosmology attempts to provide a natural explanation for the flatness and homogeneity of the observable universe. In the context of reversible (unitary) evolution, this goal is difficult to satisfy, as Liouville’s theorem implies that no dynamical process can evolve a large number of initial states into a small number of final states. We use the invariant measure on solutions to Einstein’s equation to quantify the problems of cosmological fine-tuning. The most natural interpretation of the measure is that the flatness problem does not exist; almost all Robertson-Walker cosmologies are spatially flat. The homogeneity of the early universe, however, does represent a substantial fine-tuning; the horizon problem is real. When perturbations are taken into account, inflation only occurs in a negligibly small fraction of cosmological histories, less than 10^(-6.6×10^7). We argue that while inflation does not affect the number of initial conditions that evolve into a late universe like our own, it nevertheless provides an appealing target for true theories of initial conditions, by allowing for small patches of space with sub-Planckian curvature to grow into reasonable universes.
In English: our universe looks very unusual. You might think we have nothing to compare it to, but that’s not quite right; given the particles that make up the universe (or the quantum degrees of freedom, to be technical about it), we can compare their actual configuration to all the possible configurations they could have been in. The answer is, our observed universe is highly non-generic, and in the past it was even more non-generic, or “finely tuned.” One way of describing this state of affairs is to say that the early universe had a very low entropy. We don’t know why; that’s an important puzzle, worth writing books about.
Part of the motivation of this paper was to put some quantitative meat on some ideas I discussed in my book. The basic argument is an old one, going back to Roger Penrose in the late 1970’s. The advent of inflation in the early 1980’s seemed to change things — it showed how to get a universe just like ours starting from a tiny region of space dominated by “false vacuum energy.” But a more careful analysis shows that inflation doesn’t really change the underlying problem — sure, you can get our universe if you start in the right state, but that state is even more finely-tuned than the conventional Big Bang beginning.
We revisit this question, bringing to bear some mathematical heavy machinery developed in the 1980’s by Gary Gibbons, Stephen Hawking, and John Stewart. Previous discussions have invoked general ideas of entropy or reversibility, but we were able to do a relatively down-to-earth calculation using conventional cosmological models. And we tried our best to explicitly list all of the caveats of the argument, which is important in a context like this where we don’t know all the rules.
We find that inflation is very unlikely, in the sense that a negligibly small fraction of possible universes experience a period of inflation. On the other hand, our universe is unlikely, by exactly the same criterion. So the observable universe didn’t “just happen”; it is either picked out by some general principle, perhaps something to do with the wave function of the universe, or it’s generated dynamically by some process within a larger multiverse. And inflation might end up playing a crucial role in the story. We don’t know yet, but it’s important to lay out the options to help us find our way.
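The Liouville-theorem point at the heart of the argument (reversible dynamics cannot funnel many states into few) can be seen in a toy numerical experiment. This sketch is my illustration, not anything from the paper: it evolves a blob of pendulum initial conditions with a symplectic integrator and checks that the blob's phase-space area is conserved even as its shape is distorted.

```python
import math

# Pendulum Hamiltonian H = p^2/2 - cos(q).  Leapfrog is a symplectic
# integrator, so it preserves phase-space area (Liouville's theorem)
# up to floating-point round-off.
def leapfrog(q, p, dt, steps):
    for _ in range(steps):
        p -= 0.5 * dt * math.sin(q)   # half kick (force = -dH/dq = -sin q)
        q += dt * p                   # drift
        p -= 0.5 * dt * math.sin(q)   # half kick
    return q, p

def shoelace_area(points):
    # Polygon area of a closed boundary traced in (q, p) space.
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2.0

# Boundary of a small disk of initial conditions around (q, p) = (1, 0).
n_pts, r = 2000, 0.1
boundary = [(1 + r * math.cos(2 * math.pi * k / n_pts),
             r * math.sin(2 * math.pi * k / n_pts)) for k in range(n_pts)]

area0 = shoelace_area(boundary)
evolved = [leapfrog(q, p, dt=0.01, steps=500) for q, p in boundary]
area1 = shoelace_area(evolved)

print(area0, area1)
```

The blob shears and stretches, but its area never changes; that is the constraint that makes a low-entropy past something to be explained rather than something dynamics can manufacture.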
Concerning the 1980s, what is it that the 1980’s possessed?
From sci.physics.research : “Arrow of Time”
A nonlinear dynamical system is deterministic and fully causal, and yet not entirely predictable. It can go from quasi-classical behavior, including periodic behavior, into full or partial chaotic behavior, and back again.
An NLDS has a definite “arrow” and you can call it the arrow of time, or the arrow of determinism, or the arrow of causality. They are all different “facets of the same crystal”.
The key issue here is that you do not have to invent untestable hypothetical “multiverse” pipe-dreams in order to explain the arrow. If you have an NLDS on any scale, microscopic or macroscopic, then you have a local arrow for that system on that scale.
This is the reason you cannot unscramble your scrambled eggs. It has nothing remotely to do with the Big Bang, or pre-Big Bang physics.
Then the question is: how common are NLDS? My intuition and observations suggest that the answer is: highly ubiquitous.
I would ask: what well-studied, and observed at high resolution, physical systems are not NLDS?
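The commenter's point about nonlinear dynamical systems can be illustrated with the simplest example, the logistic map at r = 4. This sketch is my illustration, not from the thread: the dynamics is fully deterministic, yet two starting points differing in the twelfth decimal place become completely uncorrelated within a few dozen iterations, which is the practical irreversibility being described.

```python
# Logistic map x -> r x (1 - x) at r = 4: deterministic, yet chaotic.
def logistic_orbit(x0, r=4.0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.3)
b = logistic_orbit(0.3 + 1e-12)   # perturb the 12th decimal place

# The separation grows roughly like 2^n (Lyapunov exponent ln 2)
# until it saturates at order one.
sep = [abs(x - y) for x, y in zip(a, b)]
print(sep[0], sep[20], sep[-1])
```

Note, though, that this map is not reversible, so by itself it sidesteps rather than answers the Liouville-theorem argument in the paper, which is specifically about unitary (reversible) evolution.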
This fine-tuning stuff might not be needed – along with God, and Natural Selection in Physics, and the Anthropic Principle, and Multiverses. Suppose merely that the Algebraic Design of the world requires the construction of atoms – the whole collection of atomic structures. That forbids any kind of crummy tuning that would derail construction. You might not even need any “constants”. In fact, the structure would force the constants to be what they have to be for construction to be possible. Evidently Matter is just as much an issue of construction as Space being a collection of all its subspaces, in a multi-vector. Every atomic structure imposes constraints on the properties of forces. You might imagine fiddling with the properties of Hydrogen, but if you must account for Helium, then less fiddling is allowed if you expect consistency across all atoms. That is why I think it is a huge oversight for Physics to ignore Octonion algebra – because it is non-associative, it requires associations of building blocks. We are surrounded by and are made of associations of elementary things. Every atom and every molecule is an association.
I definitely prefer “Algebraic Design” to “God, Anthropic Rubbish, and Mix-Master Multiverse”.
But why not seek Geometric Principles for the unified modeling of an eternal multi-scaled cosmos, with one reasonably limited set of fundamental constants that apply on all scales?
It’s not that I’m against algebra, it’s just that it tends to be an approximation for underlying geometric fundamentals.
Discrete Scale Relativity offers a new paradigm of this type. It has passed 39 fundamental retrodictions and makes at least 10 definitive predictions. Its prediction for the radius of the proton is closer to the brand-new high-precision measurement than anything the standard hep model has barfed up with much straining. Discrete Scale Relativity does it with a simple calculation using the Kerr-Newman metric and the correct G value for the interiors of atomic scale systems.
Boffin – multivector algebra (i.e., Clifford algebra) already is as geometrical as it gets, as well as being as fundamental as it gets, although if we generate Clifford algebras by picking the number of vectors and the signature, it seems to depend on the heavy human hand. Luckily Complex Quaternions can go like 1 + x + y + z + xy + yz + zx + xyz for plain old three-dimensional spatial structure. It is also the same thing as Pauli algebra when expressed as 2×2 complex matrices. Talk about Dual Use Technology! Besides, it is the even subalgebra of complex octonions, which have −+++ and +−−− signatures, and no others, which looks like it gets Minkowski spacetime automatically, and cannot be otherwise.
So I do not see algebra being an approximation for geometrical fundamentals, and find the whole idea of getting rid of these infernal ‘constants’ an endlessly fascinating possibility.
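The identification mentioned above, the eight blades 1, x, y, z, xy, yz, zx, xyz realized as the Pauli algebra of 2×2 complex matrices, is easy to check numerically. A quick sketch (mine, not the commenter's) verifying the generator relations and the dimension count:

```python
import numpy as np

# Pauli matrices as the generators x, y, z of the Clifford algebra Cl(3,0).
I = np.eye(2, dtype=complex)
x = np.array([[0, 1], [1, 0]], dtype=complex)
y = np.array([[0, -1j], [1j, 0]], dtype=complex)
z = np.array([[1, 0], [0, -1]], dtype=complex)

# Generators square to +1 and anticommute, as a Euclidean Clifford algebra requires.
for g in (x, y, z):
    assert np.allclose(g @ g, I)
assert np.allclose(x @ y, -(y @ x))

# The eight basis blades: 1, x, y, z, xy, yz, zx, xyz.
blades = [I, x, y, z, x @ y, y @ z, z @ x, x @ y @ z]

# Over the reals these are linearly independent, so the algebra is
# 8-dimensional: exactly the real dimension of the 2x2 complex matrices.
real_vecs = np.array([np.concatenate([b.real.ravel(), b.imag.ravel()])
                      for b in blades])
rank = int(np.linalg.matrix_rank(real_vecs))
print(rank)
```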
Sean, I’m not quite sure I understand the nature of this question – to what extent does the cosmology community already recognise that the flatness problem is not really a problem? Has this paper just put some meat on the bones of an already-understood argument, or has the point not been properly made before (I’m afraid I don’t have access to those prior papers you cite).
I think the consensus in the cosmology community is that the flatness problem is real. Hawking and Page previously pointed out that the canonical measure suggests otherwise, but they didn’t really push it, and it certainly didn’t become conventional wisdom.
I do not see why a small probability for a state out of a near infinity of possible states surprises anyone. I do see value in comparing possible states as a way to uncover some fundamental operating phenomena; I just don’t see that “our universe looks very unusual.”
For example, I have not been hit by a meteor weighing 1.0 kg travelling 100 m/s.
I have also not been hit by a meteor weighing 1.01 kg travelling 100 m/s.
I have also not been hit by a meteor weighing 1.001 kg travelling 100 m/s.
…
I have also not been hit by a meteor weighing 1.0 kg travelling 100.1 m/s.
…
rotating meteors, iron vs. carbonaceous meteors, “just missing me” meteors and all the x,y values for that.
…
Wow there are a lot of ways for a meteor to hit or almost hit me! Since no meteor has yet done that, I must be very, very special indeed. ….Not! Actually, I am not surprised at all that a meteor has not startled me; it is not unusual.
I would solve the meteor problem by experimentally measuring the number of meteors striking the earth in a year with a measured size distribution, a little planetary astronomy to look at possible variations to that average number, and then calculating the probability a large enough object meets my criterion in some period of time, and most importantly — adding the error bars which are computed from each and every measurement and assumption I use.
What am I missing? Why is my gedanken description different?
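The calculation the commenter sketches is a standard Poisson estimate. With illustrative numbers (the flux and cross-section below are assumptions for the sake of the example, not measured values), it looks like this:

```python
import math

# Illustrative assumptions -- not real measurements:
meteorites_per_year = 5000.0   # assumed: >1 kg objects reaching the ground annually
earth_area_m2 = 5.1e14         # Earth's surface area
person_area_m2 = 0.5           # assumed target cross-section of one person
lifetime_years = 80.0

# Expected number of hits on one person over a lifetime.
rate = meteorites_per_year * (person_area_m2 / earth_area_m2)
expected_hits = rate * lifetime_years

# Poisson probability of zero hits over that lifetime.
p_no_hit = math.exp(-expected_hits)
print(expected_hits, p_no_hit)
```

Not being hit is overwhelmingly likely, which is the commenter's point: a vast space of unrealized alternatives is not by itself evidence that the realized outcome is special.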
Shouldn’t the paper be titled: “You can’t get to there from here”?
Sean: I would be interested in your perspective on a new theoretical paper that has gotten some buzz claiming that gravity is an entropic force. Thermodynamics is not my strong point, so it is hard for me to judge its credibility or importance.
NYTimes article: http://www.nytimes.com/2010/07/13/science/13gravity.html?_r=1
Original paper: http://arxiv.org/abs/1001.0785
Kevin– It’s an interesting idea, and Verlinde is certainly a smart and creative guy. It might turn out to be important, or to be relatively trivial. I haven’t written about it because I haven’t had the time to really digest the paper.
re #21: I’ve argued since the very month that Smolin put forth the idea of Cosmic Natural Selection that Population Genetics shows an advantage to sexually reproducing species, so we should deduce a multiverse cosmology where pairs of universes pass physical constants on to baby universes. Then the Genetic Algorithm kicks in, per the breakthrough book Adaptation in Natural and Artificial Systems, John H. Holland, 1975 (republished by The MIT Press, 1992), which I beta-tested in 1974 for Prof. Holland in preprint while I was in grad school.
Mr. Mayer might say,
Entropy
Is workin’
Against me
And entropy
Wants to
Bring
Me
Down
In case someone thinks I was blowing smoke above, see the last page of Robert Hermann’s “Spinors, Clifford and Cayley Algebras” (Math Sci Press, 1974, page 272). Well, he gets half of it.
Sean: considering that the argument I made above might blow a hole in your position, just wondering if you care to comment.
Sean and other inflation aficionados, what do you people think about this, which talks about torsion as an alternative to regular scalar field inflation?
Well, if it was God doing the fine-tuning then He did a shoddy job of it. Shoddy enough that He does not deserve to be worshiped for it. Unless our existence was not the point. In which case God should either be prosecuted for crimes against humanity, or sued for negligence. Much, much preferable, theologically, methinks, to choose a mindless multiverse.
@Shantanu #40
I don’t know if I’m an ‘inflation aficionado’, but torsion is a really heavy-handed way to fix the issues solved by inflation. In particular, in the presence of torsion, test particles no longer follow the same geodesics, and you get new spin-orbit coupling effects. If you’re going to add torsion to your theory, you have to be extremely careful about making sure that solar system tests of relativity aren’t violated. Not to mention things like the Hulse-Taylor pulsar, which looks pretty consistent with Einstein gravity, at least in the evolution of its orbital period.
The paper claims that the magnitude of the torsion tensor is small, so maybe this isn’t violated. But such things require care, and I would be skeptical about torsion being the answer until the issue of consistency with existing observations was settled.
bittergradstudent,
In the Hulse-Taylor binary pulsar you have 3 unknowns and 4 observables, which is how you can test GR. In an alternate theory of gravity you will have one more unknown, and I don’t see how you can test torsion theories with the binary pulsar. Also, I am not sure the PPN formalism deals with torsion.
Shantanu,
You’d have to actually do a calculation that shows me that the radiation reaction is the same. The Hulse-Taylor Pulsar has tiny, tiny error bars–if you change the radiation reaction one bit, you’re going to change the shape of that graph, and torsion changes everything. Same with the solar system tests of GR.
If you’re going to make me believe in torsion, you’re going to have to show me that it isn’t already disproven.
(sorry about the double post)
Or, put another way, an inflaton adds one degree of freedom to your theory, and an unknown potential function. Perhaps less than that if you can make it an emergent phenomenon of particle physics.
In D=4, a torsion tensor adds 24 independent tensor components (6 antisymmetric lower-index components, times four values of the upper index) to your theory, complete with unknown dynamics to generate those components. It’s a much less sparse explanation. Absent a reason other than not liking inflation, I don’t see much of a reason to want torsion over inflation. Of course, every theory should be explored and explained, but I’m unconvinced, and am skeptical that it’s even a possibility unless I can be shown that a torsion theory passes astrophysical tests while keeping a torsion tensor large enough to matter cosmologically. And of course you could deal with torsion in a PPN formalism–torsion just modifies the Christoffel symbols, and, in the end, PPN is really an expansion of the Christoffel symbols.
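The component count above (a torsion tensor T^c_{ab}, antisymmetric in its lower indices, carries 6 × 4 = 24 independent components in D = 4) can be checked by brute-force enumeration; a trivial sketch of mine:

```python
from itertools import product

D = 4  # spacetime dimensions

# T^c_{ab} with T^c_{ab} = -T^c_{ba}: the independent components are
# one free upper index times the strictly ordered lower pairs a < b.
independent = [(c, a, b) for c, a, b in product(range(D), repeat=3) if a < b]

# Compare: a single inflaton field adds just 1 new degree of freedom.
print(len(independent))
```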
bittergradstudent,
the radiation reaction rate depends upon the masses of the binary pulsar, distance, inclination, etc. All of these are determined using the post-Keplerian formalism developed by Damour and Deruelle. However, since the number of observables is greater than the number of unknowns, you can test the self-consistency of the theory, as the system is over-determined; whereas with the inclusion of torsion, the number of unknowns will equal the number of equations, and I don’t think you can test anything.
Even if you can’t get precise measurements of the parameter, you can get bounds on them. I wouldn’t be surprised if torsion made it so that stable bound orbits didn’t exist in the strong field case, for example. The shape of the radiation curve might change. It will also either violate gauge invariance or the equivalence principle, depending on whether the new Maxwell tensor is $latex \nabla_{[a}A_{b]}$ or $latex \nabla_{[a}A_{b]} - T^{c}{}_{ab}A_{c}$. The first one is no longer gauge invariant, while the second one is gauge invariant, but will treat the Maxwell field differently than ordinary matter and violate the equivalence principle.
There are a lot of unobserved effects that torsion predicts. I’m pretty sure that there is a strenuous lower bound on it. I just don’t know how low it is.
What if conditions were such in the very early universe that mass was impossible. No mass, no speed limit. I’ve got further thinking along this line, but that would take up space. BTW, I have no idea if I am a crank, because cranks are often unaware of their crankhood.
bittergradstudent, what do you think of this recent paper by BJ, which discusses the role of torsion in cosmology here? I am surprised there is no blog discussing this paper.