Lenny Susskind has a new book out: The Black Hole War: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics. At first I was horrified by the title, but upon further reflection it’s grown on me quite a bit.
Some of you may know Susskind as a famous particle theorist, one of the early pioneers of string theory. Others may know his previous book: The Cosmic Landscape: String Theory and the Illusion of Intelligent Design. (Others may never have heard of him, although I’m sure Lenny doesn’t want to hear that.) I had mixed feelings about the first book; for one thing, I thought it was a mistake to put “Intelligent Design” there in the title, even if it were to be dubbed an “Illusion.” So when the Wall Street Journal asked me to review the new book, I was a little hesitant; I have enormous respect for Susskind as a physicist, but if I ended up not liking the book I would have to be honest about it. Still, I hadn’t ever written anything for the WSJ, and how often does one get the chance to stomp about in the corridors of capitalism like that?
The good news is that I liked the book a great deal, as the review shows. I won’t reprint the thing here, as you are all well-trained when it comes to clicking on links. But let me mention just a few words about information conservation and loss, which is the theme of the book. (See Backreaction for another account.)
It’s all really Isaac Newton’s fault, although people like Galileo and Laplace deserve some of the credit. The idea is straightforward: evolution through time, as described by the laws of physics, is simply a matter of re-arranging a fixed amount of information in different ways. The information itself is neither created nor destroyed. Put another way: to specify the state of the world requires a certain amount of data, for example the positions and velocities of each and every particle. According to classical mechanics, from that data (the “information”) and the laws of physics, we can reliably predict the precise state of the universe at every moment in the future — and retrodict the prior states of the universe at every moment in the past. Put yet another way, here is Thomasina Coverley in Tom Stoppard’s Arcadia:
If you could stop every atom in its position and direction, and if your mind could comprehend all the actions thus suspended, then if you were really, really good at algebra you could write the formula for all the future; and although nobody can be so clever as to do it, the formula must exist just as if one could.
This is the Clockwork Universe, and it is far from an obvious idea. Pre-Newton, in fact, it would have seemed crazy. In Aristotelian mechanics, if a moving object is not subject to a continuous impulse, it will eventually come to rest. So if we find an object at rest, we have no way of knowing whether until recently it was moving, or whether it’s been sitting there for a long time; that information is lost. Many different pasts could lead to precisely the same present; whereas, if information is conserved, each possible past leads to exactly one specific state of affairs at the present. The conservation of information — which also goes by the name of “determinism” — is a profound underpinning of the modern way we think about the universe.
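The contrast between the two pictures can be made concrete with a toy sketch (my illustration, not anything from the book): a time-reversible integrator for a simple oscillator can be run forward and then backward to recover the initial state exactly, which is just the statement that the dynamics never throws information away.

```python
# Toy illustration of information conservation: a time-reversible
# (symplectic leapfrog) integrator for a harmonic oscillator. Running
# the dynamics forward and then with dt negated undoes the evolution,
# so the present state uniquely encodes the past -- Newton, not Aristotle.

def leapfrog(x, v, dt, steps, k=1.0):
    """Leapfrog steps for dx/dt = v, dv/dt = -k*x."""
    for _ in range(steps):
        v += -k * x * (dt / 2)   # half kick
        x += v * dt              # drift
        v += -k * x * (dt / 2)   # half kick
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = leapfrog(x0, v0, dt=0.01, steps=1000)    # evolve forward
x2, v2 = leapfrog(x1, v1, dt=-0.01, steps=1000)   # run time backward

# The initial state is recovered (up to floating-point roundoff).
print(abs(x2 - x0) < 1e-9, abs(v2 - v0) < 1e-9)
```

An Aristotelian "friction" term (say, `v *= 0.99` each step) would break this: many different pasts would funnel into the same resting state, and the backward run could no longer tell them apart.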
Determinism came under a bit of stress in the early 20th century when quantum mechanics burst upon the scene. In QM, sadly, we can’t predict the future with precision, even if we know the current state to arbitrary accuracy. The process of making a measurement seems to be irreducibly unpredictable; we can predict the probability of getting a particular answer, but there will always be uncertainty if we try to make certain measurements. Nevertheless, when we are not making a measurement, information is perfectly conserved in quantum mechanics: Schrödinger’s equation allows us to predict the future quantum state from the past with absolute fidelity. This makes many of us suspicious that this whole “collapse of the wave function” that leads to an apparent loss of determinism is really just an illusion, or an approximation to some more complete dynamics — that kind of thinking leads you directly to the Many Worlds Interpretation of quantum mechanics. (For more, tune into my Bloggingheads dialogue with David Albert this upcoming Saturday.)
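“Information is perfectly conserved” between measurements just means the evolution is unitary, and a unitary map can always be inverted. Here is a minimal sketch (again my illustration, not the book's): evolve a two-state system with a unitary matrix, then apply its conjugate transpose to recover the original state.

```python
# Minimal sketch of unitary (Schrodinger) evolution for a qubit.
# U = exp(-i*H*t) with H the Pauli-X matrix and t = 1, which works out
# to a rotation; unitarity means U-dagger exactly undoes U, so no
# information is lost between measurements.
import cmath

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def dagger(M):
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

c, s = cmath.cos(1.0), -1j * cmath.sin(1.0)
U = [[c, s], [s, c]]

psi0 = [1.0 + 0j, 0j]               # start in state |0>
psi1 = mat_vec(U, psi0)             # evolve forward
psi2 = mat_vec(dagger(U), psi1)     # evolve backward

norm = sum(abs(a) ** 2 for a in psi1)
print(abs(norm - 1.0) < 1e-12)                               # norm preserved
print(all(abs(a - b) < 1e-12 for a, b in zip(psi2, psi0)))   # state recovered
```

Measurement-induced collapse is precisely the step that is *not* of this invertible form, which is why it sits so awkwardly with the rest of the theory.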
In any event, aside from the measurement problem, quantum mechanics makes a firm prediction that information is conserved. Which is why it came as a shock when Stephen Hawking said that black holes could destroy information. Hawking, of course, had famously shown that black holes give off radiation, and if you wait long enough they will eventually evaporate away entirely. Few people (who are not trying to make money off of scaremongering about the LHC) doubt this story. But Hawking’s calculation, at first glance (and second), implies that the outgoing radiation into which the black hole evaporates is truly random, within the constraints of being a blackbody spectrum. Information is seemingly lost, in other words — there is no apparent way to determine what went into the black hole from what comes out.
This led to one of those intellectual scuffles between “the general relativists” (who tended to be sympathetic to the idea that information is indeed lost) and “the particle physicists” (who were reluctant to give up on the standard rules of quantum mechanics, and figured that Hawking’s calculation must somehow be incomplete). At the heart of the matter was locality — information can’t be in two places at once, and it has to travel from place to place no faster than the speed of light. A set of reasonable-looking arguments had established that, in order for information to escape in Hawking radiation, it would have to be encoded in the radiation while it was still inside the black hole, which seemed to be cheating. But if you press hard on this idea, you have to admit that the very idea of “locality” presumes that there is something called “location,” or more specifically that there is a classical spacetime on which fields are propagating. Which is a pretty good approximation, but deep down we’re eventually going to have to appeal to some sort of quantum gravity, and it’s likely that locality is just an approximation. The thing is, most everyone figured that this approximation would be extremely good when we were talking about huge astrophysical black holes, enormously larger than the Planck length where quantum gravity was supposed to kick in.
But apparently, no. Quantum gravity is more subtle than you might think, at least where black holes are concerned, and locality breaks down in tricky ways. Susskind himself played a central role in formulating two ideas that were crucial to the story — Black Hole Complementarity and the Holographic Principle. Which maybe I’ll write about some day, but at the moment it’s getting late. For a full account, buy the book.
Right now, the balance has tilted quite strongly in favor of the preservation of information; score one for the particle physicists. The best evidence on their side (keeping in mind that all of the “evidence” is in the form of theoretical arguments, not experimental data) comes from Maldacena’s discovery of duality between (certain kinds of) gravitational and non-gravitational theories, the AdS/CFT correspondence. According to Maldacena, we can have a perfect equivalence between two very different-looking theories, one with gravity and one without. In the theory without gravity, there is no question that information is conserved, and therefore (the argument goes) it must also be conserved when there is gravity. Just take whatever kind of system you care about, whether it’s an evaporating black hole or something else, translate it into the non-gravitational theory, find out what it evolves into, and then translate back, with no loss of information at any step. Long story short, we still don’t really know how the information gets out, but there is a good argument that it definitely does for certain kinds of black holes, so it seems a little perverse to doubt that we’ll eventually figure out how it works for all kinds of black holes. Not an airtight argument, but at least Hawking buys it; his concession speech was reported on an old blog of mine, lo these several years ago.
Aargh, I wrote: “but different initial states still evolve to the same final state,” which should of course be “but different initial states still evolve to different final states.”
Nice review, which I read earlier today in the dead-tree Wall Street Journal. All this reminds me of Steve Martin’s bit “My Uncle’s Metaphysics”, from his book Cruel Shoes. Here it is in its entirety:
My Uncle’s Metaphysics
My Uncle was the one who developed and expounded a system of cosmology so unique and unexpected, that it deserves to be written down; his papers were destroyed by fire. I am reconstructing his philosophy from memory as he told it to me on my birthdays and other such holidays. We would be sipping lemonade, perhaps, and he would begin to rock and peer at the sky on those cool afternoons, and with a slow drawl, begin to explain in the cleanest logic why the sky existed, why the universe was the total of all information yet unknown, and how each star in every galaxy could be plotted and predicted by a three dimensional number system. Then he would explain to me his numerical device called random mathematics, where any equation could be unbalanced for any reason that existed. With it, he predicted to the minute the gestation period of the white giraffe.
As the afternoon rolled on, he fluently spoke philosophy and lost all inhibitions of language, explaining complex ideas with gestures, it seemed. He expressed how sorry he was I had ever heard the word God, and then said something about M39. (Later I discovered that this was a method of numbering the galaxies.)
The loss of information in black holes is some form of encryption of quantum information. It is likely some form of entanglement phase of a vacuum state near or across the horizon, which is close to the null congruence of the vacuum which comes in from I^-. The entanglement of states inside and outside the black hole becomes complicated by its coupling with the vacuum later on (equivalently, further out). I think the process is similar to chaos theory: the underlying dynamics are completely deterministic, but we lack the data processing or accumulating ability to make the prediction. Similarly, quantum information is perfectly preserved, but it becomes “mixed up” or encrypted in a highly complex form.
An infalling observer will observe Hawking radiation. However, once the observer enters the region where the radiation is being produced, ~ 1/(8πM), the blackbody radiation distribution becomes modified as the long-wavelength modes disappear. Eventually, very close to the horizon, the observer sees no Hawking radiation and detects essentially a pure vacuum. This transition reflects a scaling where coherent states or entanglements further out are randomized or “recoded,” so that one can’t make unitary predictions.
It has to work this way. If black holes really destroy information it would seem that unification with particle physics may be impossible.
Lawrence B. Crowell
Pingback: Linkblogging for the last few days « Thoughts on music, science, politics and comics. Mostly comics.
If a well-defined theory is truly non-unitary, it’s not because probabilities don’t add up to one, it’s because the evolution isn’t invertible.
Even though I believe that the problem with information loss lies in the time evolution not being reversible (indicating that the problem is the singularity and not the horizon), which seems to agree with your point of view, I can’t help but wonder if there’s a proof for that claim that you have made. As I said in my first comment, it seems to me like a leap in argument where you’ve gone from determinism to unitarity. Best,
B.
Sean: It occurred to me that what I wrote in the previous comment is utterly unclear, sorry. What I was trying to say: You start out talking about determinism, then go on to talk about information being conserved in QM, which I interpret as talking about unitarity. My question is, if you had a well-defined deterministic evolution of the wave function in black hole collapse, how would you know it’s also unitary? I think you just don’t know, but I’d be more than happy if I was wrong.
B, wouldn’t any non-unitary effects in black hole evaporation show up in processes not involving black holes, via the effects of virtual black holes? So, perhaps this could be observed as a faster than expected decoherence rate…
Sean wrote:
Which seems to be exactly what happens, except that I wouldn’t characterize this as changing any rules, because it occurs in a regime explicitly outside the validity of Hawking’s calculations. And that’s the whole point: if you assume that the region where quantum corrections are important doesn’t solve the problem, then quantum corrections don’t appear to solve the problem.
Brett, you seem to imply that we know what happens in the last stages of evaporation of a (small) asymptotically AdS black hole, is that true? I’d be somewhat surprised, even more so if remnants were involved.
B, I don’t have a good idea of what the experimental consequences would be, I suspect that would depend on precisely what kind of well-defined non-unitary deterministic evolution you were talking about. I don’t know of any proposals along those lines, but it would presumably depend on the model.
As Sean mentioned, in MWI, there is no collapse. There’s just the appearance of collapse because the different components of the wave function lose the ability to affect one another significantly.
To attempt to illustrate this, consider a system that is in a superposition of two states, A and B. If we then make a measurement of the state, we will find that we only observe one state: either A or B. Why is this?
In the MWI, the perspective is this: what is the consequence of observing? In order to observe, our wave function must necessarily interact with the wave function we are attempting to observe. Because our wave function is this big, complex, messy beast, this interaction effectively prevents the combined wave function from interfering. That is to say, once the measurement has been performed, outcome “A” and outcome “B” can no longer communicate with one another. So, in the MWI, it’s not that the wave function collapses, it’s that we are part of that same quantum mechanical system, and the system loses the ability to interfere once we try to measure it.
This has the direct consequence of the wave function that makes up us splitting when we observe such a situation: our wave function becomes a superposition of two states, but because those two states can’t interfere, they can’t obtain information about one another, and thus we only ever observe ourselves as existing in one of the two states, while another self observes itself as existing in the other.
Hi Sean,
I didn’t have anything specific in mind, it was more a general question of reasoning. See, I would say, if there is no singularity then the evolution must be deterministic. The whole point of there being a singularity is that different initial states get crunched into the same infinity. If you now go and say, well, if it’s deterministic then it also has to be unitary, you come to conclude that removing the singularity removes the information loss paradox. Just that, as I was trying to say, as far as I can see that conclusion doesn’t hold, because you don’t get unitarity for free. Best,
B.
It might be good to try to understand this process by which black holes store information by conceptually looking at the situation in reverse. Rather than seeing everything collapsing into a black hole, let’s consider the process of (cosmic) black hole formation from the standpoint of invariant observing frames in the manifold…the coordinates of information and complexity stored there.
Everything in the universe- all information and complexity- stays in the same “place”. It is time and space which periodically cease to exist, and in that instant, as they do (cease to exist), the universe is only information…an ultimately low entropy object. Lawrence’s comments on this thread are very interesting…he refers to the entanglement of information, vacuums and related phenomena. If we view the universe as I just described it, we can appreciate both the significance of entanglement- and the possible reality of cosmic unitarity, which Sean has been discussing.
There are some interesting mathematical and geometric relationships here which strongly imply, when we compare them with the observed radius of the universe, for example, that understanding the universe in the above way is better than trying to understand how everything can be compressed into a black hole, yet continue to exist as information.
The big bang is confirmed by powerful field evidence. We even know how long ago the big bang occurred, and the present radius of the universe…as observed in the astronomical antipode, from our frame. However, the actual nature of the big bang, and the extent of the periodicity involved in the cosmic evolution, I would think, will eventually be determined by the investigation of the sub-microscopic universe, from the quark level on downward…matter/antimatter oscillations…things of that nature.
A thought provoking and interesting thread!
The intro to this book review caught my attention – never mind the book itself.
Are you seriously suggesting a viable hidden variables theory is going to appear out of thin air and rescue the Determinist cause? Did I read that correctly?
Dear Mr. Sam Cox,
Your insights are very interesting, and we here at Hahvahd have read your paper “Seven Dimensional (and up) Einsteinian Hyperspherical Universe” (which you linked to in your comment above), and we are prepared to offer you a faculty position as an Assistant Professor, with very generous benefits. Here at Hahvahd, we pride ourselves in recruiting the very best of the best, and your works have shown to us that you fit Hahvahd’s needs very nicely.
We especially like the non-mathematical nature of your work, as working through the precision of endless equations has gotten very tiresome over the years. Your future students will very much appreciate your verbose and non-mathematical research style.
We hope this research style will lead to the next big revolution in physics, and we expect that you’ll be the next Albert Einstein.
Welcome aboard, Sam Cox, and we’ll see you in September 2008!
Sincerely,
The CEO of Hahvahd University
P.S. Go Red Sox!
our wave function becomes a superposition of two states, but because those two states can’t interfere, they can’t obtain information about one another, and thus we only ever observe ourselves as existing in one of the two states, while another self observes itself as existing in the other.
The biggest question in MWI though, is why do you, as a macroscopic conscious object, find yourself in the particular state or branch that you’re in? It will do no good to say that there are many other instances of “you” in other branches. “You” are in this branch – that is what is manifestly real and observed. and therefore other “you’s” are not equivalent to this “you”, simply by virtue of what is being observed. How does MWI deal with the conscious identity problem?
(sorry, I messed up the reply – the first paragraph should be quoted)
Sam Cox: … Lawrence’s comments on this thread are very interesting…he refers to the entanglement of information, vacuums and related phenomena.
I like the analogue with chaos theory in a way. The event horizon carries the early vacuum with it, before the black hole was formed. Regions removed from it pertain to a later vacuum. So a vacuum mode very close to the event horizon may have an entanglement (or approximate one) with a mode inside the black hole, just on the other side of the event horizon. Yet if that mode propagates outward it pertains to later vacuum states which are not unitarily equivalent to the vacuum right near the horizon. The entanglement decoheres and the vacuum mode becomes “vacuum plus random radiation.” A chaos analogue might be the laminar flow of a gas from a series of small holes in a wall. The flow remains laminar up to a point, where the flow then begins to break up into vortices and turbulent flow. At this point predictability begins to fail due to the nonlinearities in the flow. Yet we know that on a particle level the dynamics are perfectly deterministic. In an analogous manner the gravity field is nonlinear, and it propagates vacuum modes into regions where entanglement phases become scrambled up. Again, on a quantum bit level things are perfectly deterministic, but on a coarse grained level things appear random and non-deterministic.
In Wolfram’s book “A New Kind of Science” he writes about the Hadamard matrix. These have a number of interesting properties: they are recursive, in that H_{2^n} is constructed from H_{2^{n-1}}, and they play a role in chaos and the “emergent complexity” Wolfram writes about. Hadamard matrices form the basis of Reed-Muller error correction codes. They are also the Weyl group for some sporadic group systems, such as the Leech lattice.
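[For concreteness, the recursion described above is Sylvester’s construction: H_{2^n} is assembled from four copies of H_{2^{n-1}}, with the bottom-right copy negated. A quick sketch, added for illustration:]

```python
# Sylvester's recursive construction of Hadamard matrices:
# H_{2^n} is built from four copies of H_{2^{n-1}}, the bottom-right
# copy negated. The rows come out mutually orthogonal, which is the
# property Reed-Muller error correction codes exploit.

def hadamard(n):
    """Return the 2^n x 2^n Sylvester-Hadamard matrix as nested lists."""
    if n == 0:
        return [[1]]
    H = hadamard(n - 1)
    top = [row + row for row in H]                    # [ H  H ]
    bottom = [row + [-x for x in row] for row in H]   # [ H -H ]
    return top + bottom

H4 = hadamard(2)
# Orthogonality check: H * H^T = 4 * I for the 4x4 case.
dot = lambda r, s: sum(a * b for a, b in zip(r, s))
print(all(dot(H4[i], H4[j]) == (4 if i == j else 0)
          for i in range(4) for j in range(4)))   # True
```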
It is not possible for me to go into great detail on this, but these sphere packing models, such as the Leech lattice with three E_8’s in a modular system, may be error correction codes for how entangled modes, or vacua, ultimately preserve all of the quantum bits. This would be the case even though on a coarse grained level things appear completely randomized.
This appears to be a trend in physics. The underlying dynamics, or how information is shuffled around, is perfectly deterministic, but in collective or complex systems there is a randomizing process which sets in. This might be due to nonlinearity, such as with gravity, chaos or hydrodynamics, or due to a large number of elements or atoms, such as thermodynamics. The underlying structure or fine grained physics is completely deterministic and information preserving, but systems at large exhibit complexity or chaos which makes this determinism impossible to ascertain without the appropriate fine grain data and the “error correction code.”
Lawrence B. Crowell
Lawrence Crowell said,
“This appears to be a trend in physics. The underlying dynamics, or how information is shuffled around, is perfectly deterministic, but in collective or complex systems there is a randomizing process which sets in. This might be due to nonlinearity, such as with gravity, chaos or hydrodynamics, or due to a large number of elements or atoms, such as thermodynamics. The underlying structure or fine grained physics is completely deterministic and information preserving, but systems at large exhibit complexity or chaos which makes this determinism impossible to ascertain without the appropriate fine grain data and the ‘error correction code.’”
It is interesting that even in the macroscopic world we see a shadow of determinism and predictability. We observe periodicity, stored information and complexity. We note time direction and process. The morphology of conception and birth is very different from death and decomposition. Yet, even within the biological world, at lower levels of scale, these “obvious” morphological differences become less obvious. The Paramecium is continuously living and dividing, unless the whole population perishes. Trees reproduce vegetatively as well as sexually.
If we accept that fine grained physics is completely deterministic, it would seem that we would be also inclined to accept the idea that cosmologically, that is the way things are. Our way of experiencing the universe, observing and measuring it at macroscopic scales, while real to us, is but a product of relativistic and quantum effects observed in the manifold of 3+1 dimensions…scale and time.
We were speaking to this idea on another thread. Relativistic and quantum effects are real. Accelerations and their resulting relativistic effects on 4D particulate event horizon surfaces, when remotely observed, are so real we can build technology on them…they are us and our world. Quantum mechanics is the basis for a whole new technology. Also, we know that all of these relativistic and quantum relationships can be mathematically described and compared with the results of experiment.
Your analogies are excellent, and you make it clear that a complete understanding of the process requires “appropriate fine grain data and the error correction code”. For some of the same reasons you have discussed, I’m inclined to see the universe as foundationally and cosmologically deterministic, yet observed (and only observed) differently. It seems to me there is a dichotomy between the way the universe actually exists, and the way we observe it…that there may be little or no actual randomness at all, anywhere in the system, except as we observe, define and experience it at our coordinates.
I honestly, however, do not believe that our world of choice and free will is a complete illusion. After all, it IS our world. We observe motion and change, and a unique reality which has a very advanced holographic quality about it. In some way, what we cumulatively do and experience influences the overall direction of informational entropy. While the universe is very deterministic, and is vast, and though it exists permanently, yet, being finite in mass (momentum), the cosmos rides a fine line between existence and non-existence. “Life” functions as the force which compensates for a slight increase in overall thermal entropy, by decreasing informational entropy (increasing complexity).
On the GR side, I picked up the following excerpt from Wikipedia which also applies to this discussion:
“Each solution of Einstein’s equation encompasses the whole history of a universe—it is not just some snapshot of how things are, but a whole, possibly matter-filled, spacetime. It describes the state of matter and geometry everywhere and at every moment in that particular universe. By this token, Einstein’s theory appears to be different from most other physical theories, which specify evolution equations for physical systems: if the system is in a given state at some given moment, the laws of physics allow extrapolation into the past or future. Further differences between Einsteinian gravity and other fields are that the former is self-interacting (that is, non-linear even in the absence of other fields), and that it has no fixed background structure—the stage itself evolves as the cosmic drama is played out.[145]”…
Thanks for your remarks! Sam Cox
PS Jeff, I think a partial answer to your question may lie in the universal geometry, and the relationship between the way the cosmos exists and the way it is observed, time process and irreversibility in the macroscopic etc. An extra 3-space and fixed coordinates of observation might explain why we only observe one of ourselves at a time…Your comments are very thought provoking.
Lawrence,
“The underlying structure or fine grained physics is completely deterministic and information preserving, but systems at large exhibit complexity or chaos which makes this determinism impossible to ascertain without the appropriate fine grain data and the ‘error correction code.’”
Isn’t that based on the assumption that there is a quantum fine grain? What if it isn’t ultimately digital, but analog? That process is cause and structure is effect, rather than the other way around. Yes, we can only measure structure at the microscopic level, but why is that proof that function follows form, when form follows function at every other level? Strings may just be vortices.
Dear Sam Cox, Lawrence Crowell, Jeff, John Merryman, and Qubit:
Go study some physics and shut up.
Pingback: Noticias GL » The Black Hole War: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics
reeree,
But I might get more confused!!!!!
I have enough trouble trying to make sense of this world and universe.
Listen, you clowns, cut it out. Go stalk Stephen Hawking for his autograph.
reeree,
Even though it is only out of sheer ignorance on your part, I do thank you for putting me on the same list as Lawrence.