Science

A Bit of Physics History: Ed Witten Introduces M-Theory

The Second Superstring Revolution was, like most revolutions, a somewhat messy affair, with a number of pivotal steps along the way: understanding the role of membranes in 11-dimensional supergravity, the discovery of dualities in supersymmetric gauge theories, Polchinski’s appreciation of D-branes as dynamical extended objects in string theory, and of course Maldacena’s formulation of the AdS/CFT correspondence. But perhaps the high point was Ed Witten’s formulation of M-Theory in 1995. And I just noticed that Witten sharing it with the world was captured on video.

Here is Witten’s paper:

String Theory Dynamics In Various Dimensions
Edward Witten

The strong coupling dynamics of string theories in dimension d≥4 are studied. It is argued, among other things, that eleven-dimensional supergravity arises as a low energy limit of the ten-dimensional Type IIA superstring, and that a recently conjectured duality between the heterotic string and Type IIA superstrings controls the strong coupling dynamics of the heterotic string in five, six, and seven dimensions and implies S duality for both heterotic and Type II strings.

Before this result, we knew about five different kinds of string theory, each living in ten dimensions: Type I, two different Type II’s, and two different “heterotic” theories. Then there was the most symmetric form of supergravity, living in 11 dimensions, which some people thought was interesting but others thought was a curiosity that had been superseded by string theory. To everyone’s amazement, Witten showed that all of these theories are simply different limiting cases of a single underlying structure. Nobody knows what that underlying theory really is (although there are a few different formulations that work in some contexts), but we know what to call it: M-theory.

[Image: the five string theories and 11-dimensional supergravity depicted as limiting cases of a single underlying M-theory.]

Now Amanda Gefter, author of the new book Trespassing on Einstein’s Lawn (and a recent guest-blogger at Cocktail Party Physics), takes to Twitter to point out something I wasn’t aware of: a video record of Witten’s famous 1995 talk at USC. (I’m pretty sure this is the celebrated talk, but my confidence isn’t 100%.) [Update: folks who should know are actually saying it might be a seminar soon thereafter at Stony Brook. Witten himself admits that he’s not sure.] It’s clearly a recording by someone in the audience, but I don’t know who.

Most physics seminars are, shall we say, not all that historically exciting. But this one was recognized right away as something special. I was a postdoc at MIT at the time, and not in the audience myself, but I remember distinctly how the people who were there were buzzing about it when they returned home.

Nature giveth, and Nature taketh away. The 1995 discovery of M-theory made string theory seem more promising than ever, to the extent that there was now just a single theory rather than five or six. Then the 1998 discovery that the universe is accelerating made people take more seriously the idea that there might be more than one way to compactify those extra dimensions down to the four we observe — and once you have more than one, you sadly end up with a preposterously high number (the string theory landscape). So even if there is only one unifying theory of everything, there seem to be a bajillion phases it can be in, which creates an enormous difficulty in trying to relate M-theory to reality. But we won’t know unless we try, will we?


Guest Post: Katherine Freese on Dark Matter Developments

[Photo: Katherine Freese]

The hunt for dark matter has been heating up once again, driven (as usual) by tantalizing experimental hints. This time the hints are coming mainly from outer space rather than underground laboratories, which makes them harder to check independently, but there’s a chance something real is going on. We need more data to be sure, as scientists have been saying since the time Eratosthenes measured the circumference of the Earth.

As I mentioned briefly last week, Katherine Freese of the University of Michigan has a new book coming out, The Cosmic Cocktail, that deals precisely with the mysteries of dark matter. Katie was also recently at the UCLA Dark Matter Meeting, and has agreed to share some of her impressions with us. (She also insisted on using the photo on the right, as a way of reminding us that this is supposed to be fun.)


Dark Matter Everywhere (at the biennial UCLA Dark Matter Meeting)

The UCLA Dark Matter Meeting is my favorite meeting, period. It takes place every other year, usually at the Marriott Marina del Rey right near Venice Beach, but this year on the UCLA campus. Last week almost two hundred people congregated, both theorists and experimentalists, to discuss our latest attempts to solve the dark matter problem. Most of the mass in galaxies, including our Milky Way, is not made of ordinary atomic matter, but instead of as yet unidentified dark matter. The goal of dark matter hunters is to resolve this puzzle. Experimentalist Dave Cline of the UCLA Physics Department organizes the meeting, with talks often running from dawn till midnight. Every session goes way over, but somehow the disorganization leads to lots of discussion, interaction between theorists and experimentalists, and even more cocktails. It is, quite simply, the best meeting. I am usually on the organizing committee, and cannot resist sending in lots of names of people who will give great talks and add to the fun.

Last week at the meeting we were treated to multiple hints of potential dark matter signals. To me the most interesting were the talks by Dan Hooper and Tim Linden on the observations of excess high-energy photons — gamma-rays — coming from the Central Milky Way, possibly produced by annihilating WIMP dark matter particles. (See this arxiv paper.) Weakly Interacting Massive Particles (WIMPs) are to my mind the best dark matter candidates. Since they are their own antiparticles, they annihilate among themselves whenever they encounter one another. The Center of the Milky Way has a large concentration of dark matter, so that a lot of this annihilation could be going on. The end products of the annihilation would include exactly the gamma-rays found by Hooper and his collaborators. They searched the data from the FERMI satellite, the premier gamma-ray mission (funded by NASA and DoE as well as various European agencies), for hints of excess gamma-rays. They found a clear excess extending to about 10 angular degrees from the Galactic Center. This excess could be caused by WIMPs weighing about 30 GeV, or 30 proton masses. Their paper called these results “a compelling case for annihilating dark matter.” After the talk, Dave Cline decided to put out a press release from the meeting, and asked the opinion of us organizers. Most significantly, Elliott Bloom, a leader of the FERMI satellite that obtained the data, had no objection, though the FERMI team itself has as yet issued no statement.
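For the quantitatively inclined, the logic of such searches can be packed into one formula. The predicted gamma-ray flux from annihilating dark matter factorizes into a particle-physics piece and an astrophysics piece; here is a schematic version for self-annihilating WIMPs, with ⟨σv⟩ the annihilation cross section, m_χ the WIMP mass, and dN_γ/dE the photon spectrum per annihilation (conventions differ between papers by factors of two):

\[
\frac{d\Phi_\gamma}{dE} \;=\;
\underbrace{\frac{\langle\sigma v\rangle}{8\pi\, m_\chi^2}\,\frac{dN_\gamma}{dE}}_{\text{particle physics}}
\;\times\;
\underbrace{\int_{\Delta\Omega}\! d\Omega \int_{\rm l.o.s.}\! \rho_\chi^2(\ell)\, d\ell}_{\text{astrophysics (the ``$J$-factor'')}}
\]

The flux scales as the square of the dark matter density along the line of sight, which is why the Galactic Center, where the density is highest, is the natural place to look.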

Many putative dark matter signals have come and gone, and we will have to see if this one holds up. Two years ago the 130 GeV line was all the rage — gamma-rays of 130 GeV energy that were tentatively observed in the FERMI data towards the Galactic Center. (Slides from Andrea Albert’s talk.) This line, originally proposed by Stockholm’s Lars Bergstrom, would have been the expectation if two WIMPs annihilated directly to photons. People puzzled over some anomalies of the data, but with improved statistics there isn’t much evidence left for the line. The question is, will the 30 GeV WIMP suffer the same fate? As further data come in from the FERMI satellite we will find out.
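The appeal of a line, by the way, is simple kinematics: if two slow-moving WIMPs annihilate directly into a pair of photons, each photon carries off an energy essentially equal to the WIMP mass:

\[
\chi\chi \rightarrow \gamma\gamma \quad\Longrightarrow\quad E_\gamma \simeq m_\chi .
\]

So a line at 130 GeV would have pointed directly at a 130 GeV particle, with none of the astrophysical modeling that the broad Galactic Center excess requires.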

What about direct detection of WIMPs? Laboratory experiments deep underground, in abandoned mines or underneath mountains, have been searching for direct signals of astrophysical WIMPs striking nuclei in the detectors. At the meeting the SuperCDMS experiment hammered on light WIMP dark matter with negative results. The possibility of light dark matter, which was so popular recently, remains puzzling. Dark matter of about 10 GeV seemed to be detected in many underground laboratory experiments: DAMA, CoGeNT, CRESST, and in April 2013 even CDMS in their silicon detectors. Yet other experiments, XENON and LUX, saw no events, in drastic tension with the positive signals. (I told Rick Gaitskell, a leader of the LUX experiment, that I was very unhappy with him for these results, but as he pointed out, we can’t argue with nature.) Last week at the conference, SuperCDMS, the most recent incarnation of the CDMS experiment, looked to much lower energies and again saw nothing. (Slides from Lauren Hsu’s talk.) The question remains: are we comparing apples and oranges? These detectors are made of a wide variety of types of nuclei, and we don’t know how to relate the results. Wick Haxton’s talk surprised me by discussing nuclear physics uncertainties I hadn’t been aware of, which in principle could reconcile all the disagreements between experiments, even DAMA and LUX. Most people think that the experimental claims of 10 GeV dark matter are wrong, but I am taking a wait-and-see attitude.
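To see why the comparison is apples and oranges, it helps to know the assumption being made. In the simplest spin-independent analysis, every experiment translates its result into a WIMP-nucleon cross section σ_p using a coherence factor that grows as the square of the atomic mass number A (a schematic relation, with μ denoting the reduced mass of the relevant two-body system):

\[
\sigma_A \;\simeq\; \sigma_p\, A^2\, \frac{\mu_A^2}{\mu_p^2},
\qquad \mu_A = \frac{m_\chi m_A}{m_\chi + m_A}.
\]

Comparing sodium iodide (DAMA), germanium (CoGeNT, CDMS), and xenon (XENON, LUX) means trusting this scaling across very different nuclei — and the nuclear-physics uncertainties Haxton discussed are corrections to exactly this step.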

We also heard about hints of the detection of a completely different dark matter candidate: sterile neutrinos. (Slides from George Fuller’s talk.) In addition to the three known neutrinos of the Standard Model of particle physics, there could be another one that doesn’t interact through any Standard Model forces. Yet its decay could lead to X-ray lines. Two separate groups found indications of lines in data from the Chandra and XMM-Newton space telescopes that would be consistent with a 7 keV neutrino (about 7 millionths of a proton mass). Could it be that there is more than one type of dark matter particle? Sure, why not?
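The X-ray connection is straightforward kinematics: a sterile neutrino can decay into an active neutrino plus a photon, and for a decay (nearly) at rest each daughter carries half the parent’s rest energy:

\[
\nu_s \rightarrow \nu + \gamma, \qquad E_\gamma = \frac{m_s}{2} \approx 3.5\ \mathrm{keV}
\quad \text{for } m_s \approx 7\ \mathrm{keV}.
\]

Hence a 7 keV sterile neutrino would show up as a line at about 3.5 keV in the X-ray data.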

On the last evening of the meeting, a number of us went to the Baja Cantina, our favorite spot for margaritas. Rick Gaitskell was smart: he talked us into the $60.00 pitchers, of high enough quality that the 6 AM alarm clocks the next day (which got many of us out of bed and headed to flights leaving from LAX) didn’t kill us completely. We have such a fun community of dark matter enthusiasts. May we find the stuff soon!


Effective Field Theory and Large-Scale Structure

Been falling behind on my favorite thing to do on the blog: posting summaries of my own research papers. Back in October I submitted a paper with two Caltech colleagues, postdoc Stefan Leichenauer and grad student Jason Pollack, on the intriguing intersection of effective field theory (EFT) and cosmological large-scale structure (LSS). Now’s a good time to bring it up, as there’s a great popular-level discussion of the idea by Natalie Wolchover in Quanta.

So what is the connection between EFT and LSS? An effective field theory, as loyal readers know, is a way to describe what happens at low energies (or, equivalently, long wavelengths) without having a complete picture of what’s going on at higher energies. In particle physics, we can calculate processes in the Standard Model perfectly well without having a complete picture of grand unification or quantum gravity. It’s not that higher energies are unimportant; it’s just that all of their effects on low-energy physics can be summed up in their contributions to just a handful of measurable parameters.
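Schematically, all of the unknown high-energy physics at some heavy scale Λ gets packaged into an expansion in local operators built from the light fields (a sketch of the standard EFT logic, not anything specific to our paper):

\[
\mathcal{L}_{\rm eff} \;=\; \mathcal{L}_{\rm low} \;+\; \sum_i \frac{c_i}{\Lambda^{\,d_i - 4}}\,\mathcal{O}_i ,
\]

where the operators O_i have dimension d_i and the dimensionless coefficients c_i are precisely that handful of measurable parameters.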

In cosmology, we consider the evolution of LSS from tiny perturbations at early times to the splendor of galaxies and clusters that we see today. It’s really a story of particles — photons, atoms, dark matter particles — more than a field theory (although of course there’s an even deeper description in which everything is a field theory, but that’s far removed from cosmology). So the right tool is the Boltzmann equation — not the entropy formula that appears on his tombstone, but the equation that tells us how a distribution of particles evolves in phase space. However, the number of particles in the universe is very large indeed, so it’s the most obvious thing in the world to make an approximation by “smoothing” the particle distribution into an effective fluid. That fluid has a density and a velocity, but also has parameters like an effective speed of sound and viscosity. As Leonardo Senatore, one of the pioneers of this approach, says in Quanta, the viscosity of the universe is approximately equal to that of chocolate syrup.
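To make the smoothing step a bit more concrete: the starting point is the collisionless Boltzmann (Vlasov) equation for the phase-space density f of the particles, and taking momentum moments of it produces fluid equations whose effective stress is parameterized by exactly those quantities. Schematically (the precise construction, with all its caveats, is what the EFT-of-LSS literature works out):

\[
\frac{\partial f}{\partial t} + \frac{\vec p}{m a^2}\cdot\nabla_{\vec x} f
- m\,\nabla_{\vec x}\phi\cdot\nabla_{\vec p} f = 0,
\qquad
\tau^{ij} \;\sim\; \bar\rho \left[ c_s^2\,\delta\,\delta^{ij}
- \frac{c_{\rm vis}^2}{H}\,\delta^{ij}\,\nabla\!\cdot\!\vec v + \cdots \right].
\]

Integrating f over momentum gives the density, the first moment gives the velocity, and the hierarchy of higher moments never closes on its own; truncating it is where the effective sound speed and viscosity come in.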

So the goal of the EFT of LSS program (which is still in its infancy, although there is an important prehistory) is to derive the correct theory of the effective cosmological fluid. That is, to determine how all of the complicated churning dynamics at the scales of galaxies and clusters feeds back onto what happens at larger distances where things are relatively smooth and well-behaved. It turns out that this is more than a fun thing for theorists to spend their time with; getting the EFT right lets us describe what happens even at some length scales that are formally “nonlinear,” and therefore would conventionally be thought of as inaccessible to anything but numerical simulations. I really think it’s the way forward for comparing theoretical predictions to the wave of precision data we are blessed with in cosmology.

Here is the abstract for the paper I wrote with Stefan and Jason:

A Consistent Effective Theory of Long-Wavelength Cosmological Perturbations
Sean M. Carroll, Stefan Leichenauer, Jason Pollack

Effective field theory provides a perturbative framework to study the evolution of cosmological large-scale structure. We investigate the underpinnings of this approach, and suggest new ways to compute correlation functions of cosmological observables. We find that, in contrast with quantum field theory, the appropriate effective theory of classical cosmological perturbations involves interactions that are nonlocal in time. We describe an alternative to the usual approach of smoothing the perturbations, based on a path-integral formulation of the renormalization group equations. This technique allows for improved handling of short-distance modes that are perturbatively generated by long-distance interactions.

As useful as the EFT of LSS approach is, our own contribution is mostly on the formalism side of things. (You will search in vain for any nice plots comparing predictions to data in our paper — but do check out the references.) We try to be especially careful in establishing the foundations of the approach, and along the way we show that it’s not really a “field” theory in the conventional sense, as there are interactions that are nonlocal in time (a result also found by Carrasco, Foreman, Green, and Senatore). This is a formal worry, but doesn’t necessarily mean that the theory is badly behaved; one just has to work a bit to understand the time-dependence of the effective coupling constants.

Here is a video from a physics colloquium I gave at NYU on our paper. A colloquium is intermediate in level between a public talk and a technical seminar, so there are some heavy equations at the end but the beginning is pretty motivational. Enjoy!

[Video: Colloquium, October 24, 2013 — Effective Field Theory and Cosmological Large-Scale Structure]


Dept. of Energy Support for Particle Theory: A “Calamity”

One of the nice things that governments do is support basic scientific research — work that might help us better understand how the world works, but doesn’t have any direct technological or economic application. Particle physics and cosmology are great examples. In the U.S., much of the funding for these fields comes from the Office of High Energy Physics within the Office of Science at the Department of Energy (DOE).

Now that support is crumbling — drastically. In the last couple of years, the DOE has radically changed how it carries out reviews of different university theory groups, to decide how much grant support each will get. All for ostensibly good reasons — leveling the playing field and all that. But, without much fanfare, the actual result has been a significant drop in funding for almost every major theory group in the country.

Laurence Yaffe of the University of Washington, a respected particle and nuclear theorist, just released an analysis he informally carried out after serving a temporary assignment at the DOE. Here is his abstract (emphasis mine):

Impacts of Recent Comparative Review Cycles on DOE-funded High Energy Theory
L.G. Yaffe, University of Washington
February 19, 2014

A summary is presented of data obtained from a grass-roots effort to understand the effects of the FY13 and FY14 comparative review cycles on the DOE-funded portion of the US high energy theory community and, in particular, on graduate students and postdoctoral researchers who are beginning their careers. For a sample comprised of nearly all of the larger groups undergoing comparative review, total funding declined by an average of 23%, with numerous major groups receiving reductions in the 30–55% range. Funding available for postdoc or graduate student support declined over 30%, with many reductions in the 40–65% range. The total number of postdoc positions in this large sample of theory groups is declining by over 40%. The impacts on young researchers raise grave concerns regarding continued U.S. leadership in high energy theory.

An average cut of 23% in funding in one year is kind of a big deal. A picture is worth a thousand words, so here are two of them; overall funding changes for all the different groups:

[Figure: total funding changes for the theory groups under comparative review]

and changes specifically in support for graduate students and postdocs:

[Figure: changes in funding for graduate student and postdoc support]

Obviously this is unsustainable, unless as a society we make the decision that particle physics just isn’t worth doing. But hopefully things can be rectified at least a bit, to restore some of that money. Everyone I know is bemoaning the cuts, complaining that they have been turning away prospective grad students and postdocs more than ever before. I’m not necessarily against decreasing the number of postdocs (as opposed to grad students); the pipeline has to narrow somewhere, and there’s a sensible argument to be made to do it at that point. But we should do it deliberately and after thinking and talking about it, not as the haphazard result of some new bureaucratic procedures. It would be a shame to destroy our future prospects in this centrally important area of science.


The Many Worlds of Quantum Mechanics

Greetings from Sihanoukville, Cambodia, or at least the waters immediately off. I’m here as part of Bright Horizons 19, a two-week cruise on the Holland America ship Volendam, in collaboration with Scientific American. We started in Hong Kong and have been working our way south, stopping a few times in Vietnam, and after this we’ll briefly visit Thailand before finishing in Singapore. A fascinating, once-in-a-lifetime experience, even if two weeks is an amount of time I can’t honestly afford to be taking off. Been getting a touch of work done here and there, but not as much as I would have liked, in between dashes ashore to sample the local cuisine. Although the local cuisine has been pretty spectacular, I have to admit.

My job here is to give a few talks about physics and cosmology to the folks who signed up for the package — a public audience, but the kind of people whose idea of a good time while sailing the South China Sea is hearing talks about molecular biology or world history. Mostly my talks are variations of themes I’ve spoken on frequently before — the Higgs boson, the arrow of time, dark matter and dark energy. But to spice things up I decided to throw in something new, so I wrote up a talk on The Many Worlds of Quantum Mechanics.

And here it is — the slides, at least. The content is roughly based on my explanation in From Eternity to Here, with a few improvements thrown in.

Two basic goals here. One is to introduce QM to people who don’t know much more about it than a vague notion of “uncertainty” or “fluctuations.” And in particular, to focus on the conceptual foundations, rather than any of the other perfectly legitimate angles one could take: the historical development, the calculational basics, the experimental evidence, the role in modern technology, and so on. Hey, it’s my talk, I might as well concentrate on the parts I’m most fascinated by. So there’s a discussion of entanglement and decoherence that is a bit more specific and detailed than one would often get in a talk of this type, even if it is enlivened by silly pictures of cats and dogs.

The second goal was to give a subtle sales pitch for the Many-Worlds interpretation. Really more damage control than full-on hard sell; the very idea of many worlds is so crazy-sounding and counterintuitive that my job is more to let people know that it’s actually quite a natural implication of the formalism, rather than a bit of ad hoc nonsense tacked on by theorists who have become unmoored from reality. I’m happy to bring up the outstanding issues with the approach, but I do want people to know it should be taken seriously.

Comments welcome, especially since I’ve never tried this approach in a talk before. Of course by only seeing the slides you miss all the witty asides, but the basic substance should come through.


Reality, Pushed From Behind

“Teleology” is a naughty word in certain circles — largely the circles that I often move in myself, namely physicists or other scientists who know what the word “teleology” means. To wit, it’s the concept of “being directed toward a goal.” In the good old days of Aristotle, our best understanding of the world was teleological from start to finish: acorns existed in order to grow into mighty oak trees; heavy objects wanted to fall and light objects to rise; human beings strove to fulfill their capacity as rational beings. Not everyone agreed, including my buddy Lucretius, but at the time it was a perfectly sensible view of the world.

These days we know better, though the knowledge has been hard-won. The early glimmerings of the notion of conservation of momentum supported the idea that things just kept happening, rather than being directed toward a goal, and this view seemed to find its ultimate embodiment in the clockwork universe of Newtonian mechanics. (In technical terms, time evolution is described by differential equations fixed by initial data, not by future goals.) Darwin showed how the splendid variety of biological life could arise without being in any sense goal-directed or guided — although this obviously remains a bone of contention among religious people, even respectable philosophers. But the dominant paradigm among scientists and philosophers is dysteleological physicalism.

However. Aristotle was a smart cookie, and dismissing him as an outdated relic is always a bad idea. Sure, maybe the underlying laws of nature are dysteleological, but surely there’s some useful sense in which macroscopic real-world systems can be usefully described using teleological language, even if it’s only approximate or limited in scope. (Here’s where I like to paraphrase Scott Derrickson: The universe has purposes. I know this because I am part of the universe, and I have purposes.) It’s okay, I think, to say things like “predators tend to have sharp teeth because it helps them kill and eat prey,” even if we understand that those causes are merely local and contingent, not transcendent. Stephen Asma defends this kind of view in an interesting recent article, although I would like to see more acknowledgement made of the effort required to connect the purposeless, mechanical underpinnings of the world to the purposeful, macroscopic biosphere. Such a connection can be made, but it requires some effort.

Of course loyal readers all know where such a connection comes from: it’s the arrow of time. The underlying laws of physics don’t work in terms of any particular “pull” toward future goals, but the specific trajectory of our actual universe looks very different in the past than in the future. In particular, the past had a low entropy: we can reconcile the directedness of macroscopic time evolution with the indifference of microscopic dynamics by positing some sort of Past Hypothesis. All of the ways in which physical objects behave differently toward the future than toward the past can ultimately be traced to the thermodynamic arrow of time.

Which raises an interesting point that I don’t think is sufficiently appreciated: we now know enough about the real behavior of the physical world to understand that what looks to us like teleological behavior is actually, deep down, not determined by any goals in the future, but fixed by a boundary condition in the past. So while “teleological” might be acceptable as a rough macroscopic descriptor, a more precise characterization would say that we are being pushed from behind, not pulled from ahead.

The question is, what do we call such a way of thinking? Apparently “teleology” is a word never actually used by Aristotle, but invented in the eighteenth century based on the Greek télos, meaning “end.” So perhaps what we want is an equivalent term, with “end” replaced by “beginning.” I know exactly zero ancient Greek, but from what I can glean from the internet there is an obvious choice: arche is the Greek word for beginning or origin. Sadly, “archeology” is already taken to mean something completely different, so we can’t use it.

I therefore tentatively propose the word aphormeology to mean “originating from a condition in the past,” in contrast with teleology, “driven toward a goal in the future.” (Amazingly, a Google search for this word on 3 February 2014 returns precisely zero hits.) Remember — no knowledge of ancient Greek, but apparently aphorme means “a base of operations, a place from which a campaign is launched.” Which is not a terribly bad way of describing the cosmological Past Hypothesis when you think about it. (Better suggestions would be welcome, especially from anyone who actually knows Greek.)

We live in a world where the dynamical laws are fundamentally dysteleological, but our cosmic history is aphormeological, which through the magic of statistical mechanics gives rise to the appearance of teleology in our macroscopic environment. A shame Aristotle and Lucretius aren’t around to appreciate the progress we’ve made.


Searching for the Science of Self

[Image: Me, Myself, and Why book cover]

Book release day! Not by me — I’ve gone on quasi-hiatus from book-writing, and for that matter from blogging, while I am happily getting some actual science done. But the brilliant and talented Jennifer Ouellette has come out with her best book yet — Me, Myself, and Why: Searching for the Science of Self.

Jennifer’s last book was The Calculus Diaries: How Math Can Help You Lose Weight, Win in Vegas, and Survive a Zombie Apocalypse. The idea behind that one stemmed from her conviction that, despite having been an English major who did badly in math, it was important that she learn the basics of calculus in order to appreciate the way it alters how we experience the world. But in doing the research for that book, she discovered something surprising: according to her high-school transcripts, she hadn’t done badly in math at all. In fact she got all A’s. But she left school with a conviction that she was bad in math.

Where did that conviction come from? Was it society’s fault, man? Or did it come from her parents? And since she was adopted, which set of parents should be blamed? Clearly there was a science question here: what were the crucial influences that made her the person she eventually became?

Thus, the new book. Here Jennifer traces various scientific strands that weave together to make us the people that we are. Starting with some of the obvious strategies — genome sequencing, brain scans — and working up to some more (literally) brain-bending ideas about sexual identity, addiction, virtual reality, and the origins of consciousness.

[Video: Me, Myself, and Why book trailer]

Jennifer even convinced her innocent, straight-arrow husband to experiment briefly with hallucinogenic substances, in order to better understand how a temporary alteration in brain chemistry affects the self/other boundary. See Chapter Seven for the scandalous details.


What Scientific Ideas Are Ready for Retirement?

Every year we look forward to the Edge Annual Question, and as usual it’s a provocative one: “What scientific idea is ready for retirement?” Part of me agrees with Ian McEwan’s answer, which is to unask the question, and argue that nothing should be retired. Unasking is almost always the right response to questions that beg other questions, but there’s also an argument to be made in favor of playing along, so that’s what I did.

My answer was “Falsifiability.” More of a philosophical idea than a scientific one, but an idea that is bandied about by lazy scientists far more than it is invoked by careful philosophers. Thinking sensibly about the demarcation problem between science and non-science, especially these days, requires a bit more nuance than that.

Modern physics stretches into realms far removed from everyday experience, and sometimes the connection to experiment becomes tenuous at best. String theory and other approaches to quantum gravity involve phenomena that are likely to manifest themselves only at energies enormously higher than anything we have access to here on Earth. The cosmological multiverse and the many-worlds interpretation of quantum mechanics posit other realms that are impossible for us to access directly. Some scientists, leaning on Popper, have suggested that these theories are non-scientific because they are not falsifiable.

The truth is the opposite. Whether or not we can observe them directly, the entities involved in these theories are either real or they are not. Refusing to contemplate their possible existence on the grounds of some a priori principle, even though they might play a crucial role in how the world works, is as non-scientific as it gets.

I’m also partial to Alan Guth’s answer: “The universe began in a low-entropy state.” Of course we all know that our observable universe had a relatively low entropy at the Big Bang; Alan is making the point that the observable universe might not be the whole thing, and the Big Bang might not have been the beginning, so it’s completely possible that the universe as a whole was never in what one might call a “low-entropy” state. Instead, starting from a generic state, entropy could increase in both directions, leading to a two-sided arrow of time. This has been one of my favorite ideas for a while now, and Alan and I are writing a paper with Chien-Yao Tseng that examines toy models with such behavior.

Here are some other interesting/provocative answers, picked unsystematically out of over 100,000 words overall. Remember that the titles are what the person wants to retire, not something they’re in favor of.


Buchalter Cosmology Prize

Ari Buchalter is one of the many people who have successfully made the transition from graduate student and researcher in physics (Columbia PhD, Caltech postdoc) to the business world, where he is currently the CEO of MediaMath. But he never lost his interest in theoretical cosmology, which is completely appropriate — how our universe works is something everyone should be interested in, no matter what their day job might be.

In order to promote innovative thinking in cosmology (experimental as well as theoretical), Ari has founded the Buchalter Cosmology Prize, which was just announced at the meeting of the American Astronomical Society. It will be an annual award, given to the best cosmology papers to have appeared on the arxiv, as decided by a panel of esteemed judges. (I’m one of the esteemed judges, which is a mixed blessing — should be a lot of fun, but it means I can’t win.) Any PhD or current graduate student in physics or astronomy is eligible to submit papers for consideration; this year’s deadline is 30 September. The winner will walk away with $10,000, and even third place will bag you $2,500.

Currently, cosmology is in a situation where the dominant theoretical framework (Big Bang, Hubble expansion, dark energy and dark matter, possibly primordial inflation) is pretty darn good at fitting the data, but nevertheless has some worrisome conceptual issues. (Was there really inflation? Is there a multiverse? Is the dark energy a cosmological constant, and is the dark matter a WIMP? Why is the vacuum energy so small? Etc.) Hopefully a prize like this will help spur people to be just a tiny bit more bold and imaginative in tackling these issues than they would otherwise be.


Neutrinos From the Sky

It’s been hard to find time for blogging, but there’s one story I don’t want to let slip by before the end of the year: the observation by IceCube of neutrinos from beyond the Solar System.

It was my own bad sense of timing to blog about IceCube mere days before they announced this result — but just to mention the fun fact that they confirmed the existence of the Moon. And, like noticing the Moon, there’s a sense in which we shouldn’t be too surprised — we were pretty confident that neutrinos were in fact raining down upon us from the sky all the time. But that’s a bad attitude, because this is a big deal. It’s a new way of looking at the universe, and historically new ways of looking at the universe have always brought us surprises and new insights of one form or another.

The actual process by which IceCube determined that they had found cosmic neutrinos is a bit convoluted, so let’s go through it. For one thing, the detector doesn’t “see” neutrinos directly. It sees Cherenkov radiation, which is emitted when a charged particle moves through a medium at a speed faster than the velocity of light in that medium. (Nothing moves faster than light moves in vacuum, but the speed of light in ice is lower than in vacuum.) Neutrinos, you may have figured from the name, are neutral particles, not charged ones. So what you’re actually seeing are events where a neutrino bumps into one of the water molecules in the ice and creates some charged particles.
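For reference, the Cherenkov condition and the angle of the emitted light cone both follow from the refractive index n of the medium (for ice, n ≈ 1.3, so the threshold is about three-quarters of the vacuum speed of light):

\[
v > \frac{c}{n}, \qquad \cos\theta_c = \frac{1}{n\beta}, \qquad \beta \equiv \frac{v}{c}.
\]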

But most of the neutrinos you detect by this method are not really cosmic. They’re byproducts of cosmic rays — mostly charged particles flying through space at enormous energies, which smash into Earth’s atmosphere, creating neutrinos (and various other particles) along the way. So a cosmic ray interacts with the atmosphere, creating a neutrino, which then interacts with the ice to make charged particles we can observe. IceCube sees these “atmospheric neutrinos” all the time; indeed, it makes maps of them. And that’s great, and certainly helps teach us something about cosmic rays. But it would still be cool to find some neutrinos that have themselves made the long journey across the desolate cold of interstellar space. And that’s not easy; even if the detector finds some, they are likely to be swamped by the bountiful atmospheric beasts.
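The production chain, schematically (the standard pion/muon decay sequence; kaons contribute as well):

\[
p + N \rightarrow \pi^\pm + X, \qquad
\pi^\pm \rightarrow \mu^\pm + \nu_\mu(\bar\nu_\mu), \qquad
\mu^\pm \rightarrow e^\pm + \nu_e(\bar\nu_e) + \bar\nu_\mu(\nu_\mu).
\]

Each charged pion ultimately yields two muon-type neutrinos for every electron-type one, a ratio that serves as a handy consistency check on the atmospheric sample.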

Enter Bert and Ernie.

[Image: Bert and Ernie, the two high-energy neutrino events]

Those are the colorful names given to two events observed over the last couple of years by IceCube. What makes them remarkable is their very high energies: about a thousand trillion electron volts (1 PeV) apiece. (Francis Halzen, doyen of the experiment, “takes no responsibility” for the whimsical names.) That’s a lot more than you would expect from atmospheric neutrinos, but right in line for the most energetic cosmic neutrinos we predicted. But it’s only two events; the finding was announced earlier this year, but like good cautious scientists the collaboration didn’t quite say they were sure the events were cosmic in origin. (Note that a “cosmic neutrino” is one that traveled across the cosmos by itself, not one that was produced by a cosmic ray — sorry for the confusing nomenclature, it’s a cosmic world out there.)

Now we can do better. In November, right after my blog post about the Moon, IceCube announced that they had more data, and were able to identify another twenty-six events at very high energies. They put the confidence that these are truly cosmic neutrinos at four sigma — perhaps not quite the five-sigma gold standard we would like to reach, but pretty darn convincing (especially where anything astrophysical is concerned).

This result opens up a new era in astronomy. We can now look at the universe with neutrino eyes. Previously we had discovered neutrinos from the Sun, as well as the lucky few from Supernova 1987A, but now we apparently have a persistent source of these elusive particles from very far away. Perhaps from the center of our galaxy, or perhaps from hyper-energetic events in galaxies well outside our own. At the very least this kind of work should teach us something about the origin of cosmic rays themselves, and who knows what else.

I’m not sure whether to feel happy or sorry for Bert and Ernie themselves. Born in a cosmic cataclysm half a universe away, they sped through billions of light-years of empty space, witnessing untold astronomical wonders, only to come crashing into the ice on a fairly run-of-the-mill planet. But at least they brought more than a little joy to the hearts of some curious scientists, which is more than most particles can say.
