Scott Aaronson has thrown down a gauntlet by claiming that theoretical computer science, “by any objective standard, has contributed at least as much over the last 30 years as (say) particle physics or cosmology to humankind’s basic picture of the universe.” Obviously the truth-value of such a statement will depend on what counts as our “basic picture of the universe,” but Scott was good enough to provide an explanation of the most important things that TCS has taught us, which is quite fascinating. (More here.) Apparently, if super-intelligent aliens landed and were able to pack boxes in our car trunks very efficiently, they could also prove the Riemann hypothesis. Although the car-packing might be more useful.
There are important issues of empiricism vs. idealism here. The kinds of questions addressed by “theoretical computer science” are in fact logical questions, addressable on the basis of pure mathematics. They are true of any conceivable world, not just the actual world in which we happen to live. What physics teaches us about, on the other hand, are empirical features of the contingent world in which we find ourselves — features that didn’t have to be true a priori. Spacetime didn’t have to be curved, after all; for that matter, the Earth didn’t have to go around the Sun (to the extent that it does). Those are just things that appear to be true of our universe, at least locally.
But let’s grant the hypothesis that our “picture of the universe” consists both of logical truths and empirical ones. Can we defend the honor of particle physics and cosmology here? What have we really contributed over the last 30 years to our basic picture of the universe? It’s not fair to include great insights that are part of some specific theory, but not yet established as true things about reality — so I wouldn’t include, for example, anomalies canceling in string theory, or the Strominger-Vafa explanation for microstates in black holes, or inflationary cosmology. And I wouldn’t include experimental findings that are important but not quite foundation-shaking — so neutrino masses don’t qualify.
With these very tough standards, I think there are two achievements that I would put up against anything in terms of contributions to our basic picture of the universe:
- An inventory of what the universe is made of. That’s pretty important, no? In units of energy density, it’s about 5% ordinary matter, 25% dark matter, 70% dark energy. We didn’t know that 30 years ago, and now we do. We can’t claim to fully understand it, but the evidence in favor of the basic picture is extremely strong. I’m including within this item things like “it’s been 14 billion years since the Big Bang,” which is pretty important in its own right. I thought of a separate item referring to the need for primordial scale-free perturbations and the growth of structure via gravitational instability — I think that one is arguably at the proper level of importance, but it’s a close call.
- The holographic principle. I’m using this as a catch-all for a number of insights, some of which are in the context of string theory, but they are robust enough to be pretty much guaranteed to be part of the final picture whether it involves string theory or not. The germ of the holographic principle is the idea that the number of degrees of freedom inside some region is not proportional to the volume of the region, but rather to the area of its boundary — an insight originally suggested by the behavior of Hawking radiation from black holes. But it goes way beyond that; for example, there can be dualities that establish the equivalence of two different theories defined in different numbers of dimensions (à la AdS/CFT). This establishes once and for all that spacetime is emergent — the underlying notion of a spacetime manifold is not a fundamental feature of reality, but just a good approximation in a certain part of parameter space. People have speculated about this for years, but now it’s actually been established in certain well-defined circumstances. (A rough back-of-the-envelope numerical sketch of both items, the cosmic inventory and the area scaling, appears just below.)
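To make those two items a bit more concrete, here is a minimal numerical sketch in Python. The Hubble constant and the density fractions are round illustrative values of the sort quoted above (not precision measurements), and the solar-mass black hole is simply a convenient example of the area scaling behind the holographic principle.

```python
import math

# --- The inventory (round numbers, as quoted in the post) ---
# Critical density rho_c = 3 H0^2 / (8 pi G), with an assumed H0 ~ 70 km/s/Mpc.
G = 6.674e-11                      # m^3 kg^-1 s^-2
Mpc = 3.086e22                     # meters
H0 = 70e3 / Mpc                    # s^-1
rho_crit = 3 * H0**2 / (8 * math.pi * G)
print(f"critical density ~ {rho_crit:.1e} kg/m^3")   # ~9e-27 kg/m^3, a few protons per cubic meter

for name, omega in [("ordinary matter", 0.05), ("dark matter", 0.25), ("dark energy", 0.70)]:
    print(f"  {name:15s}: {omega * rho_crit:.1e} kg/m^3 equivalent")

# --- The holographic area scaling ---
# Bekenstein-Hawking entropy S = k_B * A / (4 l_p^2): it grows with the horizon *area*,
# not with the enclosed volume.  For a solar-mass black hole:
c = 2.998e8                        # m/s
hbar = 1.055e-34                   # J s
M_sun = 1.989e30                   # kg
l_p2 = G * hbar / c**3             # Planck length squared, ~2.6e-70 m^2
r_s = 2 * G * M_sun / c**2         # Schwarzschild radius, ~3 km
A = 4 * math.pi * r_s**2           # horizon area
print(f"S / k_B for a solar-mass black hole ~ {A / (4 * l_p2):.1e}")   # ~1e77
```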
A short list, but we have every reason to be proud of it. These are insights, I would wager, that will still be part of our basic picture of reality two hundred years from now. Any other suggestions?
Joe Fitzsimons,
“1) Quantum computers can simulate quantum systems efficiently.”
True. How does this differ from actually performing an experimental observation on the quantum system in question? I am truly ignorant about this, so would be glad to know whether this is a dumb question or not.
In any case, I wonder how this qualifies as quantum information theory (or computer science, per Scott Aaronson), especially since such devices do not currently exist.
Rob, I just don’t get why you think, for example, that inflationary theory is unchallengeable when you have many reasonable physicists arguing just the opposite.
The standard interpretation is that the thermodynamic arrow of time necessarily requires low-entropy initial conditions, which, as Don Page pointed out, would be extremely improbable. Rather than solving this problem, inflationary theory further aggravates it, because the reheating or thermalization at the end of the inflationary era necessarily increases entropy, meaning that the initial state of the universe had to be even more orderly than in other Big Bang theories that don’t have an inflationary phase.
Lawrence Krauss pointed out that the amplitude of the quadrupole moment of the CMBR is unexpectedly low, and that the other low multipoles are observed to be preferentially aligned with the ecliptic plane. This is a signature of what is known as “non-Gaussianity,” which contradicts the simplest models of inflation, requiring more bandaids and cream.
If the microwave background at the multipoles is correlated with the geometry and direction of motion of the solar system, and the incoherence manifests via octopole and quadrupole components in a bound universe, then there should be a center of gravity at the center of the visible universe that correlates to the ecliptic.
Call me what you want when I can’t be reasoned with, but the physicists who taught me physics explained the flaws in any understanding with hard physics and facts, without prejudicial preference for any given cosmological model or theory until something had been definitively decided, and especially without labeling and dismissing you out of hand. I appreciate that Rob hasn’t done that.
Why are quantum fields in curved space limited to creating particles only during rapid inflation, when the mechanism works just fine to explain expansion without inflation?
You know where I’m going and I don’t get shot down… so I won’t repeat myself…
Jack, gauge theories have spacetimes just like gravity theories do. From the point of view of one side of the duality, the spacetime on the other side looks “emergent.” Which nobody thinks means “not important.”
And the idea that I am uncomfortable with GR is somewhat falsified by the fact that I wrote a book about it.
Sean,
“Emergent” in what sense? Could you elaborate on this? Given the context I get the sense that some people (at least) mean that, in light of the relevant dualities, curved/dynamic spacetime can be regarded as “emergent” from a more conventional gauge theory formulated in flat spacetime. (Of course, I’m not referring here to the old perturbative analysis of a spin-2 field in Minkowski spacetime.)
This is to be distinguished from another sense of “emergent”, namely that spacetime as a continuous manifold may emerge as an approximation to an underlying discrete dynamics, in somewhat the same sense that the apparent continuity of condensed matter emerges from the dynamics of its atomic or molecular constituents in spacetime. This point of view bears a somewhat closer similarity to certain alternative approaches to quantum gravity, although the analogy breaks down for the obvious reason that the discrete “constituents” in these approaches cannot be referred to a background; there is no background structure.
A former Student,
Well, essentially, a universal quantum computer can synthesise any Hamiltonian. You don’t need to have the Hamiltonian naturally occurring in your system.
You can construct a universal quantum computer out of a chain of spin-1/2 particles interacting via an Ising interaction, but it can happily simulate, say, an RKKY interaction, something the system does not have.
It is not the same as doing the experiment, since they are very different systems.
You use a quantum algorithm to perform the simulation, just as you use a classical algorithm to do simulations on a classical computer.
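To make “using a quantum algorithm to perform the simulation” concrete: the standard trick is Trotterization, where you alternate short evolutions under interactions you can generate in order to approximate evolution under a Hamiltonian you don’t natively have. Below is a minimal classical emulation in Python/NumPy; the 4-spin transverse-field Ising Hamiltonian and the parameter values are illustrative choices of mine, not anything specific from this discussion.

```python
import numpy as np
from scipy.linalg import expm

# Minimal classical emulation of first-order Trotter simulation:
# approximate exp(-i H t) by alternating evolutions under the two
# non-commuting pieces of H, and watch the error shrink with the step count.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_list(ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def op_on(site, op, n):
    # op acting on one site, identity elsewhere
    return kron_list([op if i == site else I2 for i in range(n)])

n, J, h, t = 4, 1.0, 0.7, 1.0          # illustrative parameters
H_zz = -J * sum(op_on(i, Z, n) @ op_on(i + 1, Z, n) for i in range(n - 1))
H_x  = -h * sum(op_on(i, X, n) for i in range(n))
H = H_zz + H_x

exact = expm(-1j * H * t)

def trotter(steps):
    dt = t / steps
    step = expm(-1j * H_zz * dt) @ expm(-1j * H_x * dt)
    return np.linalg.matrix_power(step, steps)

for steps in (1, 10, 100):
    err = np.linalg.norm(trotter(steps) - exact, 2)
    print(f"{steps:4d} Trotter steps: error {err:.2e}")
```

On an actual quantum computer each evolution factor would itself be decomposed into one- and two-qubit gates, but the convergence of the Trotter error with the number of steps is the same idea.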
Whether the devices exist or not does not bother theoretical computer scientists. You seem to be confusing them with programmers, etc. They are much more closely related to mathematicians (some would say they are mathematicians).
Also, who said quantum computers don’t exist? We’re just small scale at the moment.
In the sense, in our current understanding, that the CFT on the boundary is taken as the definition of quantum gravity in the “bulk” AdS space. In an appropriate limit, this definition is supposed to reduce to semiclassical supergravity. (And many calculations corroborate that it does.)
It didn’t have to work out this way. A priori, 4-dimensional N=4 Super Yang-Mills would seem to have little to do with General Relativity on the 10-dimensional spacetime AdS5 × S5. And, yet, the latter “emerges” in an appropriate limit.
To what Jacques just said, let me add that you shouldn’t think of “emergent” as necessarily referring to some underlying discrete structure. More generally, it refers to collective phenomena characteristic of the behavior of some apparently-different set of degrees of freedom. In particular, a holographic duality can’t be something so straightforward as a Planck-scale discretization of spacetime; it’s a different number of dimensions! In AdS/CFT, there isn’t any discretization — instead, the continuous local degrees of freedom in one description look quite non-local in the other.
This gets to the heart (at the risk of derailing the discussion yet again) of one reason why so many people are fond of string theory. Saying “maybe spacetime is discrete at the Planck scale” is easy to do, and people have been doing it for years, with various degrees of promise. The holographic duality of AdS/CFT seems like something much more profound, involving non-locality and different numbers of dimensions and UV/IR correspondences in a deep-rooted way. Nobody would just be sitting around in their armchair, thinking deep thoughts about the nature of spacetime, and say “Hey, maybe if we look at quantum gravity with anti-de Sitter boundary conditions, it will be dual to a large-N conformal field theory in Minkowski space.” You had to be led there, bit by bit, by struggling to understand the individual puzzles presented by different pieces of the theory along the way. And it paid off big-time.
[Sean]
Right. The distinct uses of the word have been quite evident to me for some time. String theorists seem generally anxious to distance themselves from the “spacetime probably has a discrete basis” outlook. Their use of the word “emergent” seems more closely allied to that of condensed matter field theorists, where “emergent phenomena” seem to be associated with somewhat similar correspondences between what are ostensibly very different (often field theoretic) descriptions of a system. Of course this has inspired people like Robert Laughlin to go rather far afield (no pun intended) from condensed matter, but let’s not get into that.
Count Iblis said:
I’m always puzzled how muddled expressions like this and its many variations arise, and they always lead to absurdities if carried to conclusion with the same lack of rigor. A concept of embedding always involves two objects, and a map from one into another preserving some kind of structure. In this case, then, we have on the one hand, an implied state of otherness (“we” are not of this universe) mediated by that embedding or map, but on the other hand, the structure that we are faithfully mapped onto already exists as a subset of that universe, so we have a kind of house-of-mirrors identity crisis. Ack! I think we definitely need to be speaking in a more rigorous framework.
On a lighter note, one of my favorite cartoons of all time was a single panel showing a woman removing laundry from a dryer. The caption on the panel said something like “Somewhere in a parallel universe” and the woman is shown exclaiming: “Well what do you know, extra socks again!”
When I ask how the holographic principle relates to empirical data, the response from Sean is
“the empirical data are the existence of gauge theories and gravity. We now know (from theoretical work, admittedly) that these two things are not entirely separate — a theory can look like gauge fields in flat spacetime in one limit, and like gravity in another. The laws of black hole mechanics speak strongly to the idea that something like this is happening in our real world”,
and onymous says
“Sean has given the zero-order answer: we know there is gravity, and we have good reasons to suspect that any gravitational theory is holographic. Unfortunately like most things in quantum gravity, it’s hard to find experiments that can test it.”
These responses illustrate how, given the way physics is now being done by the theoretical physics community, the old concept of proof by theoretical prediction and experimental confirmation is being eroded and replaced by something much weaker. Here “experimental proof” is just a statement that a new theory implies there is a link between two already well-tested, experimentally confirmed topics. So what is the new observation that will confirm this theoretical link? For example, the prediction of a new particle? Or a prediction of the mass of some known particle? Where is the hard data that such a link exists?
Just for the record, while the existence of astrophysical black holes is well established, the laws of black hole mechanics have not been experimentally confirmed. They remain well-based but unproven theoretical predictions, and so do not provide the needed empirical link.
If the holographic principle is one of the two major achievements of physics in recent decades, then physics is no longer a solidly empirically based subject.
George wrote:
“These responses illustrate how, given the way physics is now being done by the theoretical physics community, the old concept of proof by theoretical prediction and experimental confirmation is being eroded and replaced by something much weaker. Here ‘experimental proof’ is just a statement that a new theory implies there is a link between two already well-tested, experimentally confirmed topics.”
It’s not that the old concept of proof is being eroded, it’s that the experimental confirmation is difficult. Quantum gravity is just inherently hard to test, no matter what. If you don’t like the lack of experimental support, the options are (a) tell everyone to stop working on quantum gravity, or (b) be patient and hope that eventually a better theoretical understanding will lead to new ways of testing things. I won’t say choice (a) is inherently unreasonable, but we do know that we need quantum gravity in order to really make sense of the universe. So I think choice (b) is preferable: maybe someday someone will come up with a clever experimental test. It’s not that no one tries, it’s that quantum gravitational effects are inherently extremely suppressed in every situation we can probe. Cosmology offers hints, but for obvious reasons it’s not the ideal laboratory.
(There are other highly indirect ways to try to get experimental confirmation that theory is on the right track, like the viscosity bound from AdS/CFT that seems to be borne out by RHIC data. At the moment such things are fairly crude, but still nontrivial.)
If you think quantum gravity research should be abandoned because there’s no obvious hope of testing it in the near future, I can’t really argue, but I think it would be a mistake to give up so soon. We didn’t even know about the cosmological constant until quite recently. Maybe we’ll get more surprises.
To answer that question, we need to step back for a moment and ask what it means to confirm this conjectured link.
As I said, in a certain limit, quantum gravity in AdS becomes semiclassical. While we don’t currently have an independent definition of the full quantum gravity theory, we do understand the semiclassical theory.
So one thing we could do to check the conjecture is to compute something in the gauge theory, compute the same observable in the supergravity approximation and compare.
“Whoa, there!” says George. “I asked for an empirical check on the conjecture, not one of your fancy-pants computations.”
Well, OK. We can’t do quantum gravity experiments in AdS. But we can do experiments involving gauge theories. So, instead of comparing a calculation on the supergravity side with another calculation on the gauge theory side, we could compare it with a measurement on the gauge theory side.
For extra bonus points, we should make the observable something that no one knows how to calculate on the gauge theory side. If we succeed, we then not only add evidence that the conjecture is true, but also show that it is useful, in the sense that it allows us to calculate interesting quantities in the gauge theory that we otherwise would not be able to compute.
One such application is the computation of finite-temperature transport coefficients of strongly-coupled gauge theory plasmas. The relevant experiments are being done at RHIC.
As “onymous” says, those checks are rather crude, at present.
On the experimental side, the error bars are large, and the interpretation of the results is still open to question (some argue that we don’t even have definitive evidence that we’ve seen the quark gluon plasma). On the theoretical side, one needs to argue that the transport coefficients (or appropriate ratios) are relatively “universal,” and don’t much depend on the details of the gauge theory. (This, at least, can be checked by studying the same quantity in different AdS/CFT backgrounds.)
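For concreteness, the most famous such ratio is the shear viscosity to entropy density, which comes out as η/s = ħ/(4π k_B) for the whole class of theories with simple gravity duals. Here is a quick numerical sketch; the room-temperature-water comparison uses textbook values I have pulled in purely for illustration, not numbers from this thread.

```python
import math

hbar = 1.055e-34      # J s
k_B  = 1.381e-23      # J/K

# The AdS/CFT result for a wide class of strongly coupled plasmas:
# eta / s = hbar / (4 pi k_B), i.e. 1/(4 pi) in natural units.
eta_over_s_ads = hbar / (4 * math.pi * k_B)
print(f"eta/s from AdS/CFT: {eta_over_s_ads:.2e} K*s "
      f"(= {1/(4*math.pi):.3f} in units of hbar/k_B)")

# Room-temperature water, using textbook values (illustrative only):
eta_water = 1.0e-3                 # Pa s
s_water   = 1000.0 * 69.9 / 0.018  # J/(m^3 K): density * molar entropy / molar mass
print(f"eta/s for water:    {eta_water / s_water:.2e} K*s "
      f"(~{(eta_water / s_water) / eta_over_s_ads:.0f}x the AdS/CFT value)")
```

Estimates for the RHIC fireball land within a few times the AdS/CFT value, far below ordinary fluids, which is why people take the comparison seriously despite the caveats just mentioned.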
I’m sorry if this state of affairs doesn’t quite fit your schema for the way “good science” is supposed to operate. But it seems to be the best we can do, at present. We really are trying to do the best we can …
Can someone clarify the concepts of “emergence” and “evolving”:
http://en.wikipedia.org/wiki/Emergence
A specific situation in a specific model would be great, i.e., do different models use the same terminology for the same processes?
Sean: *of course* I wasn’t suggesting that you are uncomfortable with GR! OK, let’s forget the sociology stuff and get back to “emergence”. I take it that when people say that something is “emergent” they mean that it arises in some non-trivial way from something more basic. Color is an emergent property of things around us, in the sense that the color of an object is not a fundamental property. We *understand* the more fundamental things from which it “emerges”. I don’t think AdS/CFT is like this [yet]. Sure, it has given us a tremendously powerful *alternative way* of thinking about gravity [in certain cases]. That’s great. But has it really given us an understanding of some more fundamental something-or-other from which spacetime “emerges” at the Big Bang? If so, what is that? Somebody mentioned dS/CFT, which would be far more relevant if it could be made to work. But even there, you have an *equivalence* between a cosmological spacetime and some weird gauge theory on S^3, which would be great, but where is anything emerging? Nobody thinks that the theory on S^3 is more fundamental than the de Sitter dynamics, do they?
For the skeptics: all this is on-topic! We are trying to assess the importance of AdS/CFT. Is it a really important technical advance, or is it something even more important, a real advance in our understanding of the basic nature of spacetime? I say yes to the first, not yet to the second. [But I would like to be convinced.] In particular I’m not convinced that giving a *definition* of something allows us to claim that it is emergent. It’s great, it’s good, Juan well deserves his 4000 cites. But emergent spacetime? Where?
Joe Fitzsimons,
I am aware of the distinction between computer scientists and programmers; however, the point is that the proofs of algorithms and other mathematical statements are purely that until it is possible to implement them on an actual quantum computer. So describing these algorithms as having explicated fundamental physical concepts is rather stretching it, given that it is unclear whether quantum computers can be scaled beyond their present size due to things like decoherence. In short, as a physicist it does not help me much if a hypothetical device can solve my problem in polynomial time when such a device does not exist or cannot be made.
Secondly:
“Well, essentially, a universal quantum computer can synthesise any Hamiltonian. You don’t need to have the Hamiltonian naturally occurring in your system.”
If I am not wrong, this is mathematically equivalent to stating that one can write the Hamiltonian in a basis of qubits of the appropriate dimension. However, I am not sure that quantum systems can in general be simulated efficiently (e.g., with a polynomial number of quantum gates) by quantum computing algorithms, as you assert earlier. This may well be true, but to my knowledge there is no general demonstration of this fact, and I think the distinction is important. I may be wrong about this since I don’t have expertise in this area, and would be glad to know if such proofs exist.
onymous said:
If you think quantum gravity research should be abandoned because there’s no obvious hope of testing it in the near future, I can’t really argue, but I think it would be a mistake to give up so soon. We didn’t even know about the cosmological constant until quite recently. Maybe we’ll get more surprises.
I don’t think anyone is suggesting that quantum gravity research should be abandoned; of course it shouldn’t. It’s vital that it continue. But I agree with George Ellis that it doesn’t make sense to claim that exciting conjectures like the holographic principle should be elevated to the status of “most important empirical discoveries about the universe,” when there is as yet no experimental evidence for them — or even clear experimental predictions. Particularly when there are so many other examples of empirical discoveries, such as most of the things listed by Rob Knop.
(I’ll mention in passing that, given Scott Aaronson’s “past thirty years” cutoff, the discovery of the existence of dark matter — or whatever the hell is causing its effects — would certainly qualify, since really convincing evidence only started showing up in the mid/late-1970s.)
Former student: It’s perfectly conceivable that a fundamental reason will be discovered why quantum computers can never be built. It’s also perfectly conceivable that a physical system will be discovered whose Hamiltonian can’t be simulated by a quantum computer with polynomial overhead.
I desperately hope that one or the other of these things will happen, since it would be the biggest scientific thrill of my life! The former discovery would imply either that quantum mechanics is false, or else that there’s some fundamental process (maybe a gravitational decoherence process?) layered on top of quantum mechanics that’s unlike anything we can currently imagine. The latter discovery would imply a “failure of reductionism”: the existence of a physical system whose Hamiltonian can’t be decomposed into a reasonable number of “local” (e.g. 2- or 3-particle) interactions. (If a Hamiltonian is a sum of polynomially many local terms, then certainly one can simulate it efficiently on a quantum computer — see here for some pointers to the literature.)
Thus, either discovery would contribute much more to “explicating fundamental physical concepts” than the mere confirmation of our current belief: namely, that the class of functions that are feasibly computable in physical reality coincides with Bounded-Error Quantum Polynomial-Time.
Actually, I didn’t assert this. Hamiltonians with only two-particle interactions can be simulated efficiently (as can those with 3-, 4-, 5-particle interactions, etc.). An arbitrary Hamiltonian has an exponential number of free parameters, and so needs an exponential number of gates to simulate.
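To put rough numbers on that parameter counting, here is a quick sketch (the qubit counts are arbitrary examples of mine):

```python
from math import comb

# A completely general Hamiltonian on n qubits is a 2^n x 2^n Hermitian matrix,
# so ~4^n real parameters.  A 2-local Hamiltonian (at most two-qubit terms) has
# only O(n^2): 9 Pauli-Pauli couplings per pair, 3 single-qubit terms per qubit,
# plus an overall constant.
for n in (10, 20, 50):
    generic = 4**n
    two_local = 9 * comb(n, 2) + 3 * n + 1
    print(f"n = {n:2d} qubits: generic ~ {generic:.1e} parameters, 2-local = {two_local}")
```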
Scott, re: post #67: if you keep saying stuff like that, I’m gonna be forced to re-evaluate my idea that computer science is a subdiscipline of math, and start wondering if it’s physics instead. And you know that the last thing a good red-blooded American ever wants is to be forced to re-evaluate his preconceptions.
Of course, “quantum computing” has that quantum in its name already.
Re: Hamiltonians with arbitrary numbers of free parameters: is that really something realistic to worry about? In math and physics, when we get to large numbers of particles we go to continuum treatments. Thus, calculus; thus, field theory.
-Rob
Yeah, who cares about neutrinos? What are they good for anyways? Higher topoi are the real thing.
Richard,
We are in fact part of this universe. The question is why. I mean, the brain is a formally describable system, and therefore defines (in the Tegmark ensemble) a universe in its own right.
This is why “computer science” is more fundamental than physics. In physics you just postulate a universe and try to find the fundamental laws. If, on the other hand, you assume that an ensemble of all possible worlds exists, then many more questions can be raised that don’t make sense in traditional physics.
This is why “computer science” is more fundamental than physics.
Not to mention the dotcom revolution…
Pauli,
The dotcom revolution is nothing compared to what is coming 🙂
Scott,
Thanks for the link. I agree with you that either situation would be tremendously exciting and have fundamental physical implications. Personally, I am hoping for the first; I rather like the idea of decoherence as a limiting process. But they don’t fall into your 30-year timeline 🙂 . BTW, I am not sure that limitations due to decoherence would imply that quantum mechanics is false or that there is a fundamental source of decoherence. It may be a statistical consequence, like entropy in classical statistical systems. I think this alone would be of great importance.
Edward Witten: