Danger, Phil Anderson

[Update: Prof. Anderson was kind enough to reply in the comments.]

Another somewhat problematic response to Brockman’s World Question Center is given by Philip Anderson, one of the world’s leading condensed-matter theorists. Like fellow Nobel Laureate Robert Laughlin, Anderson takes a certain pleasure in tweaking the noses of his friends on the high-energy/astrophysics side of the department. We can all use a little tweaking now and then, but we should expect to get tweaked back in return.

Anderson talks about dark matter and dark energy; his piece is short enough that we can go through the whole thing.

Dark Energy might not exist

I hope this idea isn’t too dangerous, by the way, since certain of your favorite bloggers have been quite active in this area. Overall, one gets the impression in the World Question Center that these folks somewhat overestimate how dangerous they are really being.

Let’s try one in cosmology. The universe contains at least 3 and perhaps 4 very different kinds of matter, whose origins probably are physically completely different. There is the Cosmic Background Radiation (CBR) which is photons from the later parts of the Big Bang but is actually the residue of all the kinds of radiation that were in the Bang, like flavored hadrons and mesons which have annihilated and become photons. You can count them and they tell you pretty well how many quanta of radiation there were in the beginning; and observation tells us that they were pretty uniformly distributed, in fact very, and still are.

All true, although the “you can count them” bit is a little confusing — I think the “them” he’s referring to is the photons. The basic idea is that the total number of photons hasn’t changed much since the extremely early universe, which is basically right; it may have increased by a factor of 100 or so during phase transitions when other stuff annihilates into photons, but by cosmological standards that’s not a big change.

Next is radiant matter — protons, mostly, and electrons.

I think by “radiant” he means “not the non-baryonic dark matter.” Neutrons would also count.

There are only a billionth as many of them as quanta of CBR, but as radiation in the Big Bang there were pretty much the same number, so all but one out of a billion combined with an antiparticle and annihilated. Nonetheless they are much heavier than the quanta of CBR, so they have, all told, much more mass, and have some cosmological effect on slowing down the Hubble expansion.

Not even “much” more mass — a factor of 10² or 10³, but okay, now we’re nit-picking.
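
As a sanity check on that factor, here is a back-of-the-envelope estimate; the inputs are round order-of-magnitude values chosen for illustration, not precise measurements.

```python
# Order-of-magnitude check: baryons outweigh CMB photons in energy density
# by roughly a factor of a thousand, despite being a billion times rarer,
# because each proton carries vastly more energy than a microwave photon.

ETA = 6e-10            # baryon-to-photon number ratio ("one out of a billion")
M_PROTON_EV = 938e6    # proton rest energy in eV
KT_CMB_EV = 2.35e-4    # CMB temperature (2.725 K) expressed as kT, in eV
E_PHOTON_EV = 2.7 * KT_CMB_EV   # mean energy of a thermal photon, about 2.7 kT

# energy-density ratio = (number ratio) * (energy-per-quantum ratio)
ratio = ETA * M_PROTON_EV / E_PHOTON_EV
print(f"baryon-to-photon energy density ratio ~ {ratio:.0f}")
```

The answer comes out in the high hundreds, which is the factor of 10² or 10³ at stake in the nit-pick.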

There was an imbalance — but what caused that? That imbalance was generated by some totally independent process, possibly during the very turbulent inflationary era.

Yes; that’s baryogenesis. Maybe it happened during inflation; that’s not a leading candidate, but certainly a plausible one. So far, just a slightly idiosyncratic retelling of the conventional story.

In fact out to a tenth of the Hubble radius, which is as far as we can see, the protons are very non-uniformly distributed, in a fractal hierarchical clustering with things called “Great Walls” and giant near-voids. The conventional idea is that this is all caused by gravitational instability acting on tiny primeval fluctuations, and it barely could be, but in order to justify that you have to have another kind of matter.

Now we’re getting into a bit of trouble. This statement would have been perfectly reasonable, if somewhat alarmist, fifteen or so years ago. These days we know a lot more about the distribution of matter on very large scales, from the microwave background as well as large-scale structure surveys. It’s not a fractal in any interesting sense on very large scales; certainly the density fluctuations on those scales are quite tiny. And when he says “barely could be,” I think he means “fits the data remarkably well.” The Cold Dark Matter model has some issues with the structure of individual galaxies and clusters, but for the overall distribution it’s a fantastic fit.

So you need — and actually see, but indirectly — Dark Matter, which is 30 times as massive, overall, as protons but you can’t see anything but its gravitational effects. No one has much clue as to what it is but it seems to have to be assumed it is hadronic, otherwise why would it be anything as close as a factor 30 to the protons?

That’s just a mistake. “Hadronic” means “made of quarks”; almost nobody thinks the dark matter is hadronic, and in fact it would be extremely difficult to reconcile that idea with primordial nucleosynthesis. The fact that it’s close to the density of protons is certainly interesting, and we don’t know why.

But really, there is no reason at all to suppose its origin was related to the other two, you know only that if it’s massive quanta of any kind it is nowhere near as many as the CBR, and so most of them annihilated in the early stages. Again, we have no excuse for assuming that the imbalance in the Dark Matter was uniformly distributed primevally, even if the protons were, because we don’t know what it is.

I’m not sure what “no excuse” means. If he means “no data support the assumption,” that’s wrong; the idea that fluctuations are adiabatic (correlated fluctuations in dark matter, photons, and baryons) has been pretty well tested, and agrees with the CMB very well. There is some room for a bit of variation (known as “isocurvature perturbations”), but the limits are pretty constraining. Perhaps a real cosmologist could chime in. If he means “we have no idea why the distributions are correlated,” that’s also false; in the simplest models of inflation, it’s exactly what you would expect, as the energy density from the inflaton decays into everything with some fixed amplitudes. Again, there are ways around it, and we don’t know that inflation is correct, but it’s by no means inexplicable.

Finally, of course there is Dark Energy, that is if there is. On that we can’t even guess if it is quanta at all, but again we note that if it is it probably doesn’t add up in numbers to the CBR.

Well, we actually guess that it is not quanta (i.e., particles) — if it were, the number density of particles would presumably dilute away as the universe expands, decreasing the density of dark energy, which isn’t what we observe. The dark energy is nearly constant in density, which is why most people imagine that it’s vacuum energy or the potential of some very light field, not particle excitations.
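
The dilution argument is just the standard scaling of energy density with the scale factor; a minimal sketch, with the normalization and scale factors chosen arbitrarily for illustration:

```python
# How an energy density scales with the cosmic scale factor a for a
# component with equation-of-state parameter w: rho ~ a**(-3*(1+w)).
# Matter (w = 0) dilutes as a^-3, radiation (w = 1/3) as a^-4, while
# vacuum energy (w = -1) stays constant -- hence "probably not quanta."

def density(rho0, a, w):
    """Energy density at scale factor a, normalized to rho0 at a = 1."""
    return rho0 * a ** (-3 * (1 + w))

for a in (1.0, 2.0, 4.0):
    print(f"a={a}: matter={density(1, a, 0):.4f}  "
          f"radiation={density(1, a, 1/3):.4f}  vacuum={density(1, a, -1):.4f}")
```

Anything made of particles falls into the diluting columns; the observed dark energy behaves like the constant one.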

The very strange coincidence is that when we add this in there isn’t any total gravitation at all, and the universe as a whole is flat, as it would be, incidentally, if all of the heavy parts were distributed everywhere according to some random, fractal distribution like that of the matter we can see — because on the largest scale, a fractal’s density extrapolates to zero.

This “very strange coincidence” is of course a prediction of inflation: that the universe is spatially flat. The bit about the random fractal distribution manages to somehow be simultaneously wrong and ill-defined. Again, we know what the distribution looks like on large scales, from CMB fluctuations, and it’s incredibly smooth. If it were wildly fluctuating on scales much larger than our Hubble radius (as it certainly is locally), then most of the universe would have a large amount of spatial curvature, which is not what we observe. Not that it’s very clear what such a distribution would actually look like in general relativity.

That suggestion, implying that Dark Energy might not exist, is considered very dangerously radical.

Well, not so much “radical” as “incorrect.” Anderson doesn’t mention the fact that the universe is accelerating, which is curious, since that’s the best evidence for dark energy. His offhanded proposal that density fluctuations are somehow responsible is similar in spirit to the original proposal of Kolb, Matarrese, Notari, and Riotto, that ultra-large-scale inhomogeneities could mimic the effects of dark energy. Everyone now agrees that this idea doesn’t work, although the authors are trying again with small-scale fluctuations. While that hasn’t been cleanly ruled out, it’s a long shot at best; most folks agree that we need either dark energy or some modification of gravity.

Anderson then goes on to argue against any particular conception of God, on the basis of Bayesian probability theory. I’m not a big God booster, but he probably didn’t run this idea by anyone in the Religious Studies department, any more than he ran his dark-energy ideas by any of the local cosmologists (I understand that Princeton has one or two). I think it’s great when smart people step outside their areas of expertise to make interesting suggestions about other fields (if I didn’t, the blogging thing would be kind of indefensible). But we shouldn’t forget that there are smart people in other parts of the university, and have some respect for their expertise. Or is that another one of those dangerous ideas?

Thought experiments

You are offered a deal in which you are asked to flip a coin ten times. If any one of the flips comes up tails, you are swiftly and painlessly killed. If it comes up heads ten times in a row, you are given a banana. Do you take the deal?

For the purposes of this thought experiment, we may assume it is a perfectly fair coin, and that you like bananas, although not any more so than would generally be considered healthy. We may also assume for simplicity that your life or death is of absolutely no consequence to anyone but yourself: you live in secret on a deserted island, isolated from contact with the outside world, where you have everything you need other than bananas. We may finally assume that we know with certainty that there is no afterlife; upon death, you simply cease to exist in any form. So, there is an approximately 99.9% chance that you will be dead, which by hypothesis implies that you will feel no regrets or feelings of disappointment. And if you survive, you get a banana. What do you think?
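
For the record, the arithmetic behind that 99.9% figure is just elementary probability:

```python
from fractions import Fraction

# Ten independent fair flips must all come up heads for you to survive.
p_survive = Fraction(1, 2) ** 10     # = 1/1024
p_die = 1 - p_survive                # = 1023/1024

print(p_survive, float(p_die))
```

One chance in 1024 of a banana; a 99.9% chance of nothing at all, not even regret.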

Now change the experiment a little. Instead of flipping a coin, you measure the x-component of the spin of an electron that has been prepared in an eigenstate of the y-component of the spin; according to the rules of quantum mechanics, there is an even chance that you will measure the x-component of the spin to be up or down. You do this ten times, with ten different electrons, and are offered the same wager as before, with spin-up playing the role of “heads” for the coin. The only difference is that, instead of a classical probability, we are dealing with branching/collapsing wavefunctions. I.e., if you believe in something like the many-worlds interpretation of quantum mechanics, there will always be a branch of the wavefunction of the universe in which you continue to exist and now have a banana. Do you take the deal?

No reasonable definition of reality could be expected to permit this

A thousand years from now, the twentieth century will be remembered as the time when we discovered quantum mechanics. Forget wars, computers, bombs, cars and airplanes: quantum mechanics is a deep truth that will continue to be a part of our understanding of the universe into the foreseeable future.

So it’s kind of embarrassing that we still don’t understand it. Unlike relativity, which seems complicated but is actually quite crystal clear when you get to know it, quantum mechanics remains somewhat mysterious despite its many empirical successes, as Dennis Overbye writes in today’s New York Times.

Don’t get me wrong: we can use quantum mechanics quite fearlessly, making predictions that are tested to the twelfth decimal place. And we even understand the deep difference between quantum mechanics and its predecessor, classical (Newtonian) mechanics. In classical mechanics, any system is described by some set of quantities (such as the position and velocity), and we can imagine careful experiments that measure these quantities with arbitrary precision. The fundamentally new idea in quantum mechanics is that what we can observe is only a small fraction of what really exists. We think there is an electron with a position and a velocity, because that’s what we can observe; but what exists is a wavefunction that tells us the probability of various outcomes when we make such a measurement. There is no such thing as “where the electron really is,” there is only a wavefunction that tells us the relative likelihood of observing it to be in different places.

What we don’t understand is what that word “observing” really means. What happens when we observe something? I don’t claim to have the answer; I have my half-baked ideas, but I’m still working through David Albert’s book and my ideas are not yet firm convictions. It’s interesting to note that some very smart people (like Tony Leggett) are sufficiently troubled by the implications of conventional quantum mechanics that they are willing to contemplate dramatic changes in the basic framework of our current picture. The real trouble is that you can’t address the measurement problem without talking about what constitutes an “observer,” and then you get into all these problematic notions of consciousness and other issues that physicists would just as soon try to avoid whenever possible.

I feel strongly that every educated person should understand the basic outline of quantum mechanics. That is, anyone with a college degree should, when asked “what’s the difference between classical mechanics and quantum mechanics?”, be able to say “in classical mechanics we can observe the state of the system to arbitrary accuracy, whereas in quantum mechanics we can only observe certain limited properties of the wave function.” It’s not too much to ask, I think. It would also be great if everyone could explain the distinction between bosons and fermions. Someday I will write a very short book that explains the major laws of modern physics — special relativity, general relativity, quantum mechanics, and the Standard Model of particle physics — in bite-sized pieces that anyone can understand. If it sells as many copies as On Bullshit, I’ll be quite happy.

Raoul Bott, 1923-2005

Sad to hear that Raoul Bott passed away this week (via Peter Woit; see also Jacques and Luboš). Bott was one of the leading mathematicians of his time, but he was also an inspirational teacher and a warm human being. When we were grad students at Harvard, Ted Pyne and I would try to attend whatever class he was giving, even though they were invariably at 8:30 in the morning, a time that was probably chosen intentionally since they were always so popular. He had a joyful sense of humor, and was kind enough to help me out with some geometry questions relevant to a paper I was writing. A truly great man.

Update: Perhaps one story will give a flavor for Bott’s personality. In class one morning he was in the midst of explaining the Atiyah-Singer index theorem (one of the most important results in modern topology), when he paused and looked reflective. Then he said something like, “The first I heard of this kind of thing was at a party at Princeton. Just talking with one of those physicists, it may have been Wigner. He was explaining this idea, saying that something like this ought to be true. Unfortunately, I had had a few drinks, and I didn’t follow him so well. But Atiyah was standing next to me, and he was perfectly sober!” And he laughed at his own story with a sense of open delight.

The universe is the poor man's particle accelerator

One thing I wanted to add to Mark’s post about the New Views conference. The conference as a whole was dedicated to the memory of David Schramm, whose 60th birthday would have been this year; he died while piloting his own airplane in 1997. Schramm was an enormously influential figure in contemporary cosmology, one of the prime movers in bringing together particle physics and astrophysics in the study of the early universe. In particular, he was a pioneer in the use of Big-Bang Nucleosynthesis as a way to understand both particle physics and cosmology.

Between a few seconds and a few minutes after the Big Bang, the universe was a nuclear reactor, converting nucleons (neutrons and protons) into nuclei of helium, lithium, and deuterium. At very high temperatures the nucleons can’t bind together without being knocked apart; at low temperatures they would like to be bound into their lowest-energy state, which would be iron nuclei. But the universe is rapidly expanding, so we get a competition: as the temperature declines and it’s possible to form nuclei, the density is also falling, making reactions less frequent. We end up with several light nuclei, but don’t have enough time to make anything heavier.

The relic abundances of these nuclei depend on everything about physics when the universe was one minute old: particle physics parameters that govern the reaction rates, the number of species that governs the energy density, and the laws of general relativity that govern the expansion of the universe. (For example, if the universe were expanding a little bit faster, the reactions would happen a little bit earlier, implying that fewer neutrons would have decayed, allowing for the production of more helium.) Miraculously, the observed abundances fit precisely onto the predictions that come from extrapolating what we know about physics here and now all the way back to a minute after the Big Bang. The helium abundance provided the first empirical evidence that there were only three families of matter particles, long before Earth-based particle accelerators verified the result. And BBN assures us that Einstein’s general relativity works without modification in the very early universe; in particular, we know that Newton’s constant of gravitation had the same value then as it does now to within about twenty percent.
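
The helium part of that story fits in a one-line estimate. The relation below is the standard back-of-the-envelope one (essentially every neutron that survives to nucleosynthesis ends up in helium-4), and the neutron-to-proton ratio of about 1/7 is the usual rough value at that epoch:

```python
# If every available neutron pairs off into helium-4, the primordial
# helium mass fraction is Y_p = 2(n/p) / (1 + n/p).

def helium_mass_fraction(n_over_p):
    return 2 * n_over_p / (1 + n_over_p)

print(helium_mass_fraction(1 / 7))   # about 0.25, close to what we observe

# A faster expansion means earlier freeze-out and fewer decayed neutrons,
# so n/p comes out higher and more helium is produced -- the sensitivity
# mentioned in the parenthetical above:
print(helium_mass_fraction(1 / 6))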

Personally, I find the success of BBN to be one of the most impressive feats in all of modern science. Here we are, 7,000,000,000,000,000 minutes after the Big Bang, making quantitative statements about what was going on 1 minute after the Big Bang — and it’s a perfect fit. I’ll never cease to be amazed that we know exactly what the universe was doing when it was one minute old.

Susskind interview

While we’re getting the multiverse out of our system, let me point to this interview with Leonard Susskind by Amanda Gefter over at New Scientist (also noted at Not Even Wrong). I’ve talked with Amanda before, about testing general relativity among other things, and she was nice enough to forward the introduction to the interview, which appears in the print edition but was omitted online.

Ever since Albert Einstein wondered whether the world might have been different, physicists have been searching for a “theory of everything” to explain why the universe is exactly the way it is. But one of today’s leading candidates, string theory, is in trouble. A growing number of physicists claim it is ill-defined, based on crude assumptions and hasn’t got us any closer to a theory of everything. Something fundamental is missing, they say (see New Scientist, 10 December, p 5).

The main complaint is that rather than describing one universe, the theory describes some 10⁵⁰⁰, each with different kinds of particles, different constants of nature, even different laws of physics. But physicist Leonard Susskind, who invented string theory, sees this huge “landscape” of universes not as a problem, but as a solution.

If all these universes actually exist, forming a huge “multiverse,” then maybe physicists can explain the way things are after all. According to Susskind, the existence of a multiverse could answer the most perplexing question in physics: why the value of the cosmological constant, which describes how rapidly the expansion of the universe is accelerating, appears improbably fine-tuned to allow life to exist. A little bigger and the universe would have expanded too fast for galaxies to form; a little smaller and it would have collapsed into a black hole. With an infinite number of universes, says Susskind, there is bound to be one with a cosmological constant like ours.

The idea is controversial, because it changes how physics is done, and it means that the basic features of our universe are just a random luck of the draw. He explains to Amanda Gefter why he’s defending it, and why it’s a possibility we simply can’t ignore.

Is our universe natural?

Hey, has anyone heard about this string theory landscape business, and the anthropic principle, and some sort of controversy? Hmm, I guess they have. Perhaps enough that whatever needs to be said has already been thoroughly hashed out.

But, hey! It’s a blog, right? Hashing stuff out is what we like to do. So I’ll modestly point to my own recent contribution to the cacophony: Is Our Universe Natural?, a short review for Nature. To give you an idea of the gist:

If any system should be natural, it’s the universe. Nevertheless, according to the criteria just described, the universe we observe seems dramatically unnatural. The entropy of the universe isn’t nearly as large as it could be, although it is at least increasing; for some reason, the early universe was in a state of incredibly low entropy. And our fundamental theories of physics involve huge hierarchies between the energy scales characteristic of gravitation (the reduced Planck scale, 10²⁷ electron volts), particle physics (the Fermi scale of the weak interactions, 10¹¹ eV, and the scale of quantum chromodynamics, 10⁸ eV), and the recently-discovered vacuum energy (10⁻³ eV). Of course, it may simply be that the universe is what it is, and these are brute facts we have to live with. More optimistically, however, these apparently delicately-tuned features of our universe may be clues that can help guide us to a deeper understanding of the laws of nature.

The article is not strictly about the anthropic principle, but about the broader question of what kinds of explanations might account for seemingly “unnatural” features of the universe. The one thing I do that isn’t common in these discussions is to simultaneously contemplate both the dynamical laws that govern the physics we observe, and the specific state in which we find the universe. This lets me tie together the landscape picture with my favorite ideas about spontaneous inflation and the arrow of time. In each case, selection effects within a multiverse dramatically change our naive expectation about what might constitute a natural situation.

About the anthropic principle itself (or, as I much prefer, “environmental selection”), I don’t say much that I haven’t said before. I’m not terribly fond of the idea, but it might be right, and if so we have to deal with it. Or it might not be right. The one thing that I hammer on a little is that we do not already have any sort of “prediction” from the multiverse, even Weinberg’s celebrated calculation of the cosmological constant. These purported successes rely on certain crucial simplifying assumptions that we have every reason to believe are wildly untrue. In particular, if you believe in eternal inflation (which you have to, to get the whole program off the ground), the spacetime volume in any given vacuum state is likely to be either zero or infinite, and typical anthropic predictions implicitly assume that all such volumes are equal. Even if string theorists could straightforwardly catalogue the properties of every possible compactification down to four dimensions, an awful lot of cosmological input would be necessary before we could properly account for the prior distribution contributed by inflation. (If indeed the notion makes any sense at all.)

I was asked to make the paper speculative and provocative, so hopefully I succeeded. The real problem is that draconian length constraints prevented me from making arguments in any depth — there are a lot of contentious statements that are simply thrown out there without proper amplification. But hopefully the main points come through clearly: calculating probabilities within an ensemble of vacua may some day be an important part of how we explain the state of our observed universe, but we certainly aren’t there yet.

Here’s the conclusion:

The scenarios discussed in this paper involve the invocation of multiple inaccessible domains within an ultra-large-scale multiverse. For good reason, the reliance on the properties of unobservable regions and the difficulty in falsifying such ideas make scientists reluctant to grant them an explanatory role. Of course, the idea that the properties of our observable domain can be uniquely extended beyond the cosmological horizon is an equally untestable assumption. The multiverse is not a theory; it is a consequence of certain theories (of quantum gravity and cosmology), and the hope is that these theories eventually prove to be testable in other ways. Every theory makes untestable predictions, but theories should be judged on the basis of the testable ones. The ultimate goal is undoubtedly ambitious: to construct a theory that has definite consequences for the structure of the multiverse, such that this structure provides an explanation for how the observed features of our local domain can arise naturally, and that the same theory makes predictions that can be directly tested through laboratory experiments and astrophysical observations. Only further investigation will allow us to tell whether such a program represents laudable aspiration or misguided hubris.

Where the dark matter is

Dark Matter Map Just because you can’t see the dark matter doesn’t mean you can’t take a picture of it. Via Universe Today, here’s a press release from Johns Hopkins announcing a beautiful new image of the reconstructed dark matter density in cluster CL 0152-1357 by Jee et al. (I couldn’t find the paper online, but you can get a higher-resolution version of the picture at Myungkook Jee’s home page.) The dark matter is in purple, the galaxies are in yellow.

How do you do that? It’s not because we’ve detected some form of light coming from the dark matter. Rather, we’ve detected (once again) its gravitational field — this time, via the tiny distortions in the shapes and positions of background galaxies (weak lensing). This is a form of gravitational lensing that is so subtle you could never detect it happening to a single galaxy — it would be impossible to distinguish between lensing and the intrinsic shape of the galaxy. But if you have a large number of background galaxies (which the universe is kind enough to provide us), you can use statistics to reconstruct the gravitational field through which the light travels, and hence figure out where the dark matter must be.
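
The statistical idea is easy to illustrate with a toy simulation. Everything here is made up for illustration (real weak-lensing analyses measure shape distortions, not a single number per galaxy), but the cancellation principle is the same:

```python
import random

# Each background galaxy has a large random intrinsic ellipticity plus a
# tiny common distortion ("shear") imprinted by the foreground mass.
# One galaxy tells you nothing; averaging many cancels the random part
# and leaves the lensing signal.

random.seed(42)
TRUE_SHEAR = 0.01        # hypothetical lensing signal
INTRINSIC_SCATTER = 0.3  # typical shape noise, dwarfing the signal
N_GALAXIES = 100_000

measured = [TRUE_SHEAR + random.gauss(0.0, INTRINSIC_SCATTER)
            for _ in range(N_GALAXIES)]
estimate = sum(measured) / N_GALAXIES
print(f"recovered shear ~ {estimate:.3f}")
```

With enough galaxies the noise averages down as one over the square root of the sample size, which is why the universe being "kind enough" to provide many background galaxies matters.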

Of course we’re still trying to detect the dark matter, both directly (in ground-based experiments) and indirectly (looking for high-energy radiation produced by annihilating dark matter particles), not to mention using particle accelerators to actually produce candidate dark matter particles. Over the next ten or twenty years, probing the properties of dark matter is going to be one of the top priorities at the particle/astrophysics interface.

Live-blogging from the lab

Hopefully Mark’s post explains why there hasn’t been much content from this occasional blog lately — at least three of us are distracted by the New Views symposium (about which I also hope to say something substantive soon). While you’re all waiting for our ungrounded speculations about the universe to return, why not cleanse the palate with some real experimental physics? Chad Orzel at Uncertain Principles has just completed a week’s worth of blogging about the work in his lab. Check out the entries to see the unpredictable hazards of hands-on research. (For a theorist like me, a typical unpredictable hazard is when the barista uses 2% instead of whole milk in my latte.)

How many dimensions are there?

When the fall quarter started, there were six papers that I absolutely had to finish by the end of the term. Three have been completed, two are very close, and the last one — sadly, I think the deadline has irrevocably passed, and it’s not going to make it. So here’s the upshot.

About a year ago I gave a talk at the Philosophy of Science Association annual meeting in Austin. The topic of the session was “The Dimensions of Space,” and my talk was on “Why Three Spatial Dimensions Just Aren’t Enough” (pdf slides). I gave an overview of the idea of extra dimensions, how they arose historically and the role they currently play in string theory.

But in retrospect, I didn’t do a very good job with one of the most basic questions: how many dimensions does spacetime really have, according to string theory? The answer used to be easy: ten, with six of them curled up into a tiny manifold that we couldn’t see. But in the 1990s we saw the “Second Superstring Revolution,” featuring ideas about D-branes, duality, and the unification of what used to be thought of as five distinct versions of string theory.

One of the most important ideas in the second revolution came from Ed Witten. Ordinarily, we like to examine field theories and string theories at weak coupling, where perturbation theory works well (QED, for example, is well-described by perturbation theory because the fine-structure constant α = 1/137 is a small number). Witten figured out that when you take the strong-coupling limit of certain ten-dimensional string theories, new degrees of freedom begin to show up (or more accurately, begin to become light, in the sense of having a low mass). Some of these degrees of freedom form a series of states with increasing masses. This is precisely what happens when you have an extra dimension: modes of ordinary fields that wrap around the extra dimension will have a tower of increasing masses, known as Kaluza-Klein modes.

In other words: the strong-coupling limit of certain ten-dimensional string theories is an eleven-dimensional theory! In fact, at low energies, it’s eleven-dimensional supergravity, which had been studied for years, but whose connection to string theory had been kind of murky. Now we know that 11-d supergravity and the five ten-dimensional string theories are just six different low-energy weakly-coupled limits of some single big theory, which we call M-theory even though we don’t know what it really is. (Even though the 11-d theory can arise as the strong-coupling limit of a 10-d string theory, it is itself weakly coupled in its own right; this is an example of strong-weak coupling duality.)
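
The Kaluza-Klein tower itself is simple enough to write down. In natural units, a field of mass m0 wrapping an extra dimension of radius R appears to a lower-dimensional observer as a tower of states of mass sqrt(m0^2 + (n/R)^2); the radius below is an arbitrary illustrative value, not anything from string theory:

```python
import math

# Kaluza-Klein tower: momentum around a compact dimension of radius R is
# quantized in units of 1/R, so each mode looks like a particle of mass
# m_n = sqrt(m0**2 + (n/R)**2) from the lower-dimensional point of view.
# (Natural units, hbar = c = 1; the radius is an arbitrary example.)

def kk_mass(n, radius, m0=0.0):
    return math.sqrt(m0 ** 2 + (n / radius) ** 2)

tower = [kk_mass(n, radius=2.0) for n in range(5)]
print(tower)   # evenly spaced masses in steps of 1/R when m0 = 0
```

Spotting exactly this kind of evenly spaced tower among the new light states at strong coupling is what signaled the hidden eleventh dimension.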

So … how many dimensions are there really? If one limit of the theory is 11-dimensional, and others are 10-dimensional, which is right?

I’ve heard respected string theorists come down on different sides of the question: it’s really ten-dimensional, it’s really eleven. (Some have plumped for twelve, but that’s obviously crazy.) But it’s more accurate just to say that there is no unique answer to this question. “The dimensionality of spacetime” is not something that has a well-defined value in string theory; it’s an approximate notion that is more or less useful in different circumstances. If you look at spacetime a certain way, it can look ten-dimensional, and another way it can look like eleven. In yet other configurations, thank goodness, it looks like four!

And it only gets worse. According to Juan Maldacena’s famous gravity-gauge theory correspondence (AdS/CFT), we can have a theory that is equally well described as a ten-dimensional theory of gravity, or a four-dimensional gauge theory without any gravity at all. It might sound like the degrees of freedom don’t match up, but ultimately infinity=infinity, so a lot of surprising things can happen.

This story is one of the reasons for both optimism and pessimism about the prospects for connecting string theory to the real world. On the one hand, string theory keeps leading us to discover amazing new things: it wasn’t as if anyone guessed ahead of time that there should be dualities between theories in different dimensions, it was forced on us by pushing the equations as far as they would go. On the other, it’s hard to tell how many more counterintuitive breakthroughs will be required before we can figure out how our four-dimensional observed universe fits into the picture (if ever). But it’s nice to know that the best answer to a seemingly-profound question is sometimes to unask it.
