Annotated Publications
Here is a description of the research I've done over the years, grouped loosely into topics, including links to all of the individual papers. See also my research summary, CV, or publications from inSPIRE or Google Scholar.
Quantum Spacetime
Everyone wants to quantize gravity. The usual approach is to start with some classical-sounding system -- curved spacetime, strings, loops, triangles, networks -- and "quantize" it. Nature, presumably, doesn't work that way; the world is quantum from the start, and we limited human beings perceive a classical reality in some appropriate limit. With Charles Cao and Spiros Michalakis, I took a small step toward finding an emergent notion of space inside a quantum wave function. Our starting point was to decompose Hilbert space (the space of all quantum states) into a product of individual factors, and to use the entanglement between different factors (in certain very special states) to define a notion of distance -- highly entangled factors are nearby, less entangled factors are far away. We can even begin to see something like a precursor to Einstein's equation of general relativity emerge when we perturb our states a little bit. But this is all at an early stage; we haven't even put in time, much less recovered subtle features of the world like Lorentz invariance. See this blog post for more discussion.
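As a rough illustration of the kind of dictionary involved (schematic notation, not necessarily the exact definitions used in the papers): the entanglement between two factors A and B can be quantified by their mutual information,

\[
I(A:B) = S_A + S_B - S_{AB},
\]

and the emergent distance is taken to be a decreasing function of it (e.g. \(d(A,B) \sim -\log\left[I(A:B)/I_{\rm max}\right]\) times some fiducial length scale), so that strongly entangled factors sit close together and weakly entangled ones end up far apart.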
- C. Cao, S.M. Carroll, and S. Michalakis, 2016, "Space from Hilbert Space: Recovering Geometry from Bulk Entanglement," arxiv:1606.08444. [arXiv, inSPIRE]
- C. Cao and S.M. Carroll, 2018, "Bulk Entanglement Gravity without a Boundary: Towards Finding Einstein's Equation in Hilbert Space," arxiv:1712.02803. [arXiv, inSPIRE]
- N. Bao, S.M. Carroll, and A. Singh, 2017, “The Hilbert Space of Quantum Gravity is Locally Finite-Dimensional," arxiv:1704.00066. [arXiv, inSPIRE]
An intriguing approach to quantum gravity is to imagine that the gravitational force arises from essentially thermodynamic underpinnings. This idea has been investigated by Jacobson, Verlinde, and Padmanabhan, among others. Grant Remmen and I asked ourselves how specific we could be about what this purported "entropy" actually was -- where is it located, what degrees of freedom does it refer to, etc? We found that the best answer related to a 2015 paper by Jacobson, where he derives Einstein's equation by assuming equilibrium between infrared and ultraviolet contributions to the entropy in small causal diamonds. Grant wrote a guest blog post to explain it all.
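For orientation, a schematic paraphrase (not the papers' precise statements): Jacobson's original 1995 derivation imposes the Clausius relation across local Rindler horizons,

\[
\delta Q = T\,\delta S,
\]

with T the Unruh temperature seen by accelerated observers, while the 2015 version asks that the total entanglement entropy of a small causal diamond be stationary at fixed volume, roughly \(\delta S_{\rm UV} + \delta S_{\rm IR} = 0\); either condition, imposed on all such regions, yields Einstein's equation.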
On somewhat related lines, Aidan Chatwin-Davies and I proved a version of something I had long suspected: that the cosmic no-hair theorem, saying that a universe with a positive cosmological constant will asymptote toward de Sitter space, can be thought of entropically. Entropy increases, and the approach to de Sitter is a kind of equilibration. We used the "Q-screen" formalism of Bousso and Engelhardt.
Evidence has gradually been accumulating that quantum gravity is not simply the theory of quantized curved spacetime. There is something nonlocal, holographic, and emergent about it. The most explicit piece of evidence in this direction is the AdS/CFT correspondence, which links gravity in an Anti-de Sitter spacetime to quantum field theory on a flat-spacetime boundary. But there are still things about AdS/CFT that we don't fully understand. One extremely intriguing idea, due to Brian Swingle and others, is that the bulk gravity theory could arise from a tensor network, starting with quantum bits on the boundary. I studied this idea with a gaggle of collaborators, concluding in the end that we could put stringent constraints on the most straightforward ways that one might try to implement such a proposal. I remain hopeful that some non-straightforward approach will make the idea work in greater detail. See also this blog post.
It's harder to be precise when moving away from AdS and toward the real universe, where the cosmological constant is positive. But we can try, and there are some intriguing results that come out of the effort. In one paper we modeled the expansion of the universe as a quantum circuit, suggesting that spacetime is knitted together out of entangled degrees of freedom that were initially unentangled at early times. In another we looked at empty de Sitter space, arguing that features such as the cosmic no-hair theorem could be naturally understood from the quantum-circuit perspective.
- N. Bao, C. Cao, S.M. Carroll, A. Chatwin-Davies, N. Hunter-Jones, J. Pollack, G.N. Remmen, 2015, "Consistency Conditions for an AdS/MERA Correspondence," arxiv:1504.06632. [arXiv, inSPIRE]
- N. Bao, C. Cao, S.M. Carroll, and L. McAllister, 2017, "Quantum Circuit Cosmology: The Expansion of the Universe Since the First Qubit," arxiv:1702.06959. [arXiv, inSPIRE]
- N. Bao, C. Cao, S.M. Carroll, and A. Chatwin-Davies, 2017, "De Sitter Space as a Tensor Network: Cosmic No-Hair, Complementarity, and Complexity," arxiv:1709.03513. [arXiv, inSPIRE]
Time and the Universe
Microscopic laws of physics are essentially time-reversal invariant, but macroscopic thermodynamics exhibits a profound time-asymmetry; entropy typically increases in closed systems. This intriguing feature of the real world has a cosmological origin: the entropy of the early universe was fantastically small. After a century of effort, it has been difficult to explain this arrow of time without assuming time-asymmetric boundary conditions. Jennifer Chen and I have suggested a simple scenario in which increasing entropy is natural, based on the idea that the entropy can increase without bound (there is no equilibrium state) and that the way entropy increases is by creating universes like our own. In our picture, any generic state first evolves to an empty de Sitter phase; the small temperature of de Sitter allows for fluctuations into a proto-inflationary configuration, which grows and reheats into a conventional Big-Bang spacetime. The same thing happens in the far past, but with a reversed arrow of time. On ultra-large scales, therefore, entropy is growing without bound in the asymptotic future and past. You can read more in this Scientific American article.
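(The "small temperature" here is the de Sitter temperature, which in units with \(\hbar = c = k_B = 1\) is set by the Hubble parameter,

\[
T_{\rm dS} = \frac{H}{2\pi},
\]

so an eternally expanding vacuum never quite reaches zero temperature, and arbitrarily rare fluctuations remain possible.)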
- S.M. Carroll and J. Chen, 2004, "Spontaneous Inflation and the Origin of the Arrow of Time'', hep-th/0410270. [arXiv, inSPIRE]
- S.M. Carroll and J. Chen, 2005, "Does Inflation Provide Natural Initial Conditions for the Universe?," gr-qc/0505037. [arXiv; inSPIRE]
- S.M. Carroll, 2008, "What if Time Really Exists?" arxiv:0811.3772. [arXiv, inSPIRE]
If you wait long enough, a classical system in thermal equilibrium can undergo a fluctuation to a lower-entropy state. Anthony Aguirre, Matt Johnson and I looked at the creation of an inflationary universe via up-tunneling from a low-energy vacuum to a high-energy one ("true" to "false" vacua). We found that the most likely trajectory is simply the time-reverse of the ordinary evolution of the universe. Kim Boddy and I looked at the future of the universe, and worried about the creation of Boltzmann Brains. We suggested that BB's might be avoided by decay of the Higgs to a new vacuum at larger field values. However, this only works if either the top quark has a slightly larger mass than most people believe (around 178 GeV), or we choose a particular kind of cosmological measure.
In 2014, Kim, Jason Pollack, and I realized that Boltzmann Brains need not occur in thermal states. If a quantum system settles down to a truly stationary state (which can happen if the Hilbert space is infinite-dimensional), there are no true dynamical "fluctuations." Our intuition to the contrary is shaped by our experience making observations, but there are no external observers when the wave function of the whole universe is stationary. This eliminates the Boltzmann Brain problem in a wide class of models, and may have important implications for eternal inflation.
There are still those who think that Boltzmann Brains aren't a problem at all; their attitude is roughly "I'm not one of them, and that's all I need to know, even if they proliferate elsewhere in the universe." I think that's cheating, and wrote a paper explaining why. But the ultimate reason isn't that I would probably be a BB if I lived in such a universe; it's that such universes are cognitively unstable, and in them I shouldn't be sure about anything.
- A. Aguirre, M. Johnson, and S.M. Carroll, 2011, "Out of equilibrium: understanding cosmological evolution to lower-entropy states," Journal of Cosmology and Astroparticle Physics 1202, 024; arxiv:1108.0417. [arXiv, inSPIRE]
- K.K. Boddy and S.M. Carroll, 2013, "Can the Higgs Boson Save Us From the Menace of the Boltzmann Brains?", arxiv:1308.4686. [arXiv, inSPIRE]
- K.K. Boddy, S.M. Carroll, and J. Pollack, 2014, "De Sitter Space Without Quantum Fluctuations," arxiv:1405.0298. [arXiv, inSPIRE]
- K.K. Boddy, S.M. Carroll, and J. Pollack, 2015, "Why Boltzmann Brains Don't Fluctuate Into Existence From the De Sitter Vacuum," arxiv:1505.02780. [arXiv, inSPIRE]
- S.M. Carroll 2017, “Why Boltzmann Brains Are Bad," to appear in Current Controversies in the Philosophy of Science, S. Dasgupta and B. Weslake, eds.; arxiv:1702.00850. [arXiv, inSPIRE]
The early universe is in a very special state -- it has an incredibly low entropy, as Roger Penrose has often emphasized. It is often asserted that this fine-tuning can be explained by cosmological inflation. But inflation carries with it its own fine-tuning problems; not just getting the right potential, but getting the right initial conditions for inflation to begin. Heywood Tam and I investigated quantitatively the degree of tuning that is required, using a rigorous measure on cosmological spacetimes invented by Gibbons, Hawking, and Stewart. Building on this work, Grant Remmen and I showed how the informal idea of a "cosmological attractor solution" can be reconciled with the mathematical fact that Hamiltonian systems (like scalar-field cosmology) don't actually have attractors. We showed how to derive a measure on flat universes, and used this to calculate how many e-folds of inflation one should expect in representative models. Based on these insights, I wrote a paper arguing that discussions of cosmological fine-tuning should be based on the space of trajectories rather than on the traditional horizon and flatness problems.
- S.M. Carroll and H. Tam, 2010, "Unitary Evolution and Cosmological Fine-Tuning," arXiv:1007.1417. [arXiv; inSPIRE]
- G.N. Remmen and S.M. Carroll, 2013, "Attractor Solutions in Scalar-Field Cosmology," Phys. Rev. D 88, 083518; arXiv:1309.2611. [arXiv, inSPIRE]
- G.N. Remmen and S.M. Carroll, 2014, "How Many e-Folds Should We Expect from High-Scale Inflation?" Phys. Rev. D, in press; arxiv:1405.5538. [arXiv, inSPIRE]
- S.M. Carroll, 2014, "In What Sense Is the Early Universe Fine-Tuned?" to appear in a volume commemorating David Albert's Time and Chance, B. Loewer, E. Winsberg and B. Weslake, eds.; arxiv:1406.3057. [arXiv, inSPIRE]
Is it possible to travel backwards in time? Embarrassingly, we don't know the answer to that question nearly as well as we should. I worked with Edward Farhi, Alan Guth and Ken Olum on obstacles to constructing time machines in (2+1)-dimensional gravity, a possibility first suggested by Richard Gott (PRL server). We showed that (2+1)-dimensional open universes could be classified into two types: those that inevitably contained a Gott time machine, and those that could never contain one. The case of closed universes was solved by 't Hooft, who showed that any attempt to build a Gott time machine in a closed universe would be foiled by collapse to a singularity before the time machine could arise. Our work was briefly considered newsworthy.
- S.M. Carroll, E. Farhi and A.H. Guth, 1992, "An Obstacle to Building a Time Machine,'' Phys. Rev. Lett. 68, 263; Erratum: 68, 3368. [pdf file; inSPIRE]
- S.M. Carroll, E. Farhi, A.H. Guth and K.D. Olum, 1994, "Energy-Momentum Restrictions on the Creation of Gott Time Machines,'' Phys. Rev. D 50, 6190; gr-qc/9404065. [arXiv; pdf; inSPIRE]
Foundations of Quantum Mechanics
The Everett, or Many-Worlds, formulation of quantum mechanics has a very simple structure: there is a quantum state living in Hilbert space, and it evolves unitarily according to the Schrödinger equation. All of the usual bothersome parts of textbook quantum mechanics -- identifying observables with self-adjoint operators, collapse of the wave function, probabilities given by amplitudes squared -- are supposed to be derived, rather than postulated. Of these, deriving the Born Rule for probabilities is philosophically the trickiest. Charles ("Chip") Sebens and I have proposed a way to do it using the idea of "self-locating uncertainty." After the wave function branches, but before the observer knows which branch they are on, we argue that there is a uniquely rational way to assign credences to different branches as long as we accept one simple assumption: the Epistemic Separability Principle, which says that what happens far away shouldn't affect your beliefs locally. See also this blog post, or this intriguing book.
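The target of the derivation is the Born Rule itself (notation mine): if an observer's branch is about to split into branches labeled i with amplitudes \(\psi_i\), the rational credence to assign to finding yourself on branch i is

\[
P(i) = \frac{|\psi_i|^2}{\sum_j |\psi_j|^2}.
\]

Roughly speaking, the argument first shows that equal-amplitude branches must receive equal credence, and then handles unequal amplitudes by using the environment to subdivide branches into equal-amplitude pieces.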
- C.T. Sebens and S.M. Carroll, 2014, "Self-Locating Uncertainty and the Origin of Probability in Everettian Quantum Mechanics," British Journal for the Philosophy of Science, in press; arxiv:1405.7577. [arXiv, inSPIRE]
- S.M. Carroll and C.T. Sebens, 2013, "Many Worlds, The Born Rule, and Self-Locating Uncertainty," in Quantum Theory: A Two-Time Success Story, Yakir Aharonov Festschrift, D.C. Struppa, J.M. Tollaksen, eds. (Springer-Verlag), p. 157; arxiv:1405.7907. [arXiv, inSPIRE]
Mad-Dog Everettianism is the name Ashmeet Singh and I gave to the idea that quantum theories are defined simply by the spectrum of their Hamiltonians. Such a program faces the problem of starting from a meager amount of data (a list of energy eigenvalues, maybe some initial quantum state) and constructing the whole world out of it. One step has been taken by Cotler et al., who show that a notion of "locality" can be fixed purely from the spectrum. Ashmeet and I have suggested a "quantum mereology" algorithm to pick out useful factorizations (e.g. into system and environment) in otherwise bare Hilbert spaces.
- S.M. Carroll and A. Singh, 2018, “Mad-Dog Everettianism: Quantum Mechanics at Its Most Minimal," arxiv:1801.08132. In What Is Fundamental?, ed. A. Aguirre, B. Foster, and Z. Merali (Springer), p. 95. [arXiv, inSPIRE]
- A. Singh and S.M. Carroll, 2017, “Quantum Decimation in Hilbert Space: Coarse-Graining without Structure," arxiv:1709.01066, Phys. Rev. A 97, 032111. [arXiv, inSPIRE]
- A. Singh and S.M. Carroll, 2018, “Modeling Position and Momentum in Finite-Dimensional Hilbert Spaces via Generalized Clifford Algebra," arxiv:1806.10134. [arXiv, inSPIRE]
- S.M. Carroll and A. Singh, 2020, “Quantum Mereology: Factorizing Hilbert Space into Subsystems with Quasi-Classical Dynamics," arxiv:2005.12938. [arXiv, inSPIRE]
- S.M. Carroll, 2021, “Reality as a Vector in Hilbert Space," to appear in Quantum Mechanics and Fundamentality: Naturalizing Quantum Theory between Scientific Realism and Ontological Indeterminacy, ed. V. Allori. (Synthese Library); arxiv:2103.09780. [arXiv]
If you think that the wave function is a complete description of a system (i.e. no hidden variables), you can define an energy for a quantum state by taking the expectation value of the Hamiltonian in that state. This energy is conserved under the Schrödinger equation, but it's seemingly not conserved if you think that wave functions "collapse" when a measurement is performed. You might think that's just because we were sloppy and didn't account for transfers of energy from the system to the measuring apparatus or the rest of the world. Jackie Lodman and I argue that this thought is not correct; the change in energy of the system is not compensated by a change in the energy of the rest of the world, so the energy truly changes. From an Everettian perspective, this makes perfect sense: the energy of the entire wave function is conserved, but it is distributed unequally between different branches when decoherence occurs. We also propose an experimental protocol (not completely realistic, but promising) for observing this effect.
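In equations (this is standard quantum mechanics, not anything specific to the paper): under unitary evolution with a time-independent Hamiltonian, the expectation value

\[
\langle E \rangle = \langle \psi | \hat{H} | \psi \rangle, \qquad
\frac{d}{dt}\langle E \rangle = \frac{1}{i\hbar}\,\langle \psi | [\hat{H},\hat{H}] | \psi \rangle = 0,
\]

is exactly constant, whereas replacing \(|\psi\rangle\) by a post-measurement state \(|\psi_i\rangle\) generically changes it to \(\langle \psi_i | \hat{H} | \psi_i \rangle \neq \langle \psi | \hat{H} | \psi \rangle\).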
Philosophy and/of Science
Karl Popper famously proposed to solve the problem of demarcating the difference between science and non-science via his falsifiability criterion: a theory is scientific only if it could, in principle, be shown to be false if appropriate experimental data were gathered. Popper himself had a nuanced view about such things, but many modern scientists hold an over-simplified version of this principle as if it were established truth. I argue that the situation is a bit more complicated than that, and in particular that falsifiability is not the best way to think about the status of the modern cosmological multiverse.
- S.M. Carroll, 2018, "Beyond Falsifiability: Normal Science in a Multiverse," arxiv:1801.05016. To appear in Epistemology of Fundamental Physics: Why Trust a Theory?, R. Dawid, R. Dardashti, and K. Thébault, eds. (Cambridge). [arXiv]
Cosmologists study the nature and evolution of the universe; they are less often concerned with the question of why it exists at all. When they do deign to tackle that issue, they generally cheat by turning it into a simpler question they can attempt to answer. I argue that the difficulty here is that the existence of the universe isn't something for which we should expect a "reason why." The universe can simply exist.
- S.M. Carroll, 2018, "Why Is There Something Rather than Nothing?," arxiv:1802.02231. To appear in The Routledge Companion to the Philosophy of Physics, E. Knox and A. Wilson, eds. (Routledge). [arXiv]
There's a lot of physics we don't know, from the nature of the dark matter to the right explanation for high-temperature superconductivity. But if we exclude extreme energies and purely astrophysical phenomena (thereby putting aside dark matter, etc.), and think only about the underlying quantum-field-theory level of description (rather than emergent descriptions of superconductors and other materials), we have good reasons to think we completely understand the ingredients and laws in that regime. I discuss what those reasons are, and why this represents a challenge for non-physical theories of consciousness and other phenomena.
- S.M. Carroll, 2021, "The Quantum Field Theory on Which the Everyday World Supervenes." Invited contribution to Levels of Reality: A Scientific and Metaphysical Investigation (Jerusalem Studies in Philosophy and History of Science). arxiv:2101.07884. [arXiv, inSPIRE]
- S.M. Carroll, 2021, "Consciousness and the Laws of Physics," Journal of Consciousness Studies 28, pp. 16-31. [philarchive]
People sometimes argue over whether there is any relationship between science and religion. I think there is; there are many aspects to religion, but its claims are conventionally grounded in certain beliefs about the fundamental nature of reality, and those claims can and should be judged by the methods of science. Here, in a few different ways, I explain why thinking scientifically leads one to reject the existence of God, and what consequences that has.
- S.M. Carroll, 2005, “Why (Almost All) Cosmologists Are Atheists”, Faith and Philosophy 22, p. 622. [preprint]
- S.M. Carroll, 2012, “Does the Universe Need God?," in The Blackwell Companion to Science and Christianity, ed. J.B. Stump and A.G. Padgett (Wiley-Blackwell: West Sussex, UK), p. 185. [preprint]
- S.M. Carroll, 2018, “Purpose, Freedom, and the Laws of Physics," in Neuroexistentialism: Meaning, Morals, and Purpose in the Age of Neuroscience, ed. G. Caruso and O. Flanagan (Oxford University Press), p. 298.
Black Hole Information
Way back in the 1970s, Jacob Bekenstein and Stephen Hawking showed that black holes have an entropy proportional to the area of their event horizon. As one of the high points of the Second Superstring Revolution, Strominger and Vafa showed that string theory offered a microscopic understanding of the space of states implied by that entropy, at least in certain special cases. But there are still unanswered questions, including how information is encoded in the outgoing radiation. In 1994, Hawking, Horowitz and Ross used Euclidean quantum gravity to make a surprising claim: the entropy of an extremal black hole, one with a charge equal to its mass, is exactly zero, despite the fact that the area does not vanish. Most people simply think this result is not right, but there remain some puzzles about how to reconcile the various approaches. Matt Johnson, Lisa Randall, and I argued that the extremal limit of a non-extremal black hole is discontinuous in an interesting way: the region of spacetime in between the inner and outer event horizons does not shrink to zero size, but blows up into a completely separate spacetime (anti-de Sitter space times a two-sphere). We suggest that the approaches to calculating the entropy of extremal black holes can be reconciled if the entropy is associated with that spacetime, rather than with the black hole.
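For reference, the Bekenstein-Hawking entropy is

\[
S_{\rm BH} = \frac{k_B\, c^3 A}{4 G \hbar},
\]

one quarter of the horizon area in Planck units; the puzzle is that the Euclidean calculation assigns zero entropy to extremal black holes even though their horizon area does not vanish.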
If information is encoded in the outgoing radiation from a black hole, we would like to have an algorithm for actually reconstructing it. Aidan Chatwin-Davies, Adam Jermyn and I have proposed a very baby step in that direction: how to use quantum teleportation to recover a single qubit that you have dropped into a black hole. Sadly, our method does not generalize to two or more qubits, so at present it's a curiosity, but an intriguing one. See Aidan's blog post here.
The possibility that black hole event horizons are not just empty space, but instead are "firewalls" of ultra-high-energy radiation, was first raised by the AMPS collaboration in 2012. They argued that it is impossible to simultaneously believe that 1) the rules of local quantum field theory hold far from the black hole, 2) information is preserved in black-hole evaporation, and 3) quantum fields near the horizon appear to an infalling observer to be close to their vacuum state. The basic reason is that outgoing Hawking particles should be entangled with other outgoing particles in order for information to escape, but they need to be entangled with ingoing particles in order to look like the vacuum near the horizon, and those two requirements can't simultaneously hold because entanglement is monogamous. We point out that this argument contains a loophole: entanglement between outgoing particles is a property of the global wave function, but the entanglement near the horizon need only hold on specific branches of that wave function, and entanglement is branch-dependent. So there is no logical requirement that black holes have firewalls.
Entropy, Information, and Complexity
Since Boltzmann we have known that the Second Law, which says that entropy increases or stays constant in closed systems, is only a statistical relation: there can be rare fluctuations in which entropy decreases. Recent interest in this phenomenon has led to a great deal of work on "fluctuation theorems," including results from Christopher Jarzynski, Gavin Crooks, and others. In this paper with Tony Bartolotta, Stefan Leichenauer, and Jason Pollack, we investigated what happens when you perform a measurement on a statistical system, then use that information to update both the original and the final statistical distribution describing that system. We are able to derive a Bayesian version of the Second Law, which relates the change in the cross-entropy of the original and updated distributions to the flow of heat into or out of the system.
Everyone knows that entropy increases in a closed system over time. But complexity is something more subtle -- we don't even know how to define it in the best way. Still, we have a feeling that we know it when we see it. The universe, for example, starts in a state of very low complexity near the Big Bang, evolves through a state of high complexity (now), and will eventually relax back into simplicity once all the stars and galaxies are scattered to the four winds by cosmological expansion. With Scott Aaronson and Lauren Ouellette (no relation to my wife Jennifer), I argued that this general behavior -- complexity first increasing, then decreasing again -- is quite robust in interacting systems. We studied a simple automaton that mimicked the behavior of cream mixing into coffee. User alert: there was a bug in our code, and we haven't fixed it yet! We think our general principles are true, but we are working on improving the numerical results.
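To make the setup concrete, here is a minimal toy sketch in Python -- my own illustration, not the (buggy!) code from the paper, and with cruder dynamics and complexity measures than we actually used. "Cream" pixels random-walk into "coffee" on a grid; the compressed size of the fine-grained state stands in for entropy, and the compressed size of a coarse-grained version stands in for apparent complexity.

```python
# Toy "coffee automaton" sketch (illustrative only, not the paper's code).
# Cream (1) and coffee (0) pixels mix via random nearest-neighbor swaps.
import zlib

import numpy as np

rng = np.random.default_rng(0)
N = 64
grid = np.zeros((N, N), dtype=np.uint8)
grid[: N // 2, :] = 1  # cream in the top half, coffee in the bottom half

MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def compressed_size(arr):
    """Length of the zlib-compressed bytes -- a crude proxy for information content."""
    return len(zlib.compress(arr.tobytes()))

def coarse_grain(arr, block=8):
    """Average over block x block cells and quantize to a few gray levels."""
    h, w = arr.shape
    means = arr.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return np.round(means * 16).astype(np.uint8)

for sweep in range(2001):
    # One sweep = N*N random nearest-neighbor swaps (particle-conserving mixing).
    for _ in range(N * N):
        x, y = rng.integers(0, N, size=2)
        dx, dy = MOVES[rng.integers(4)]
        x2, y2 = (x + dx) % N, (y + dy) % N
        grid[x, y], grid[x2, y2] = grid[x2, y2], grid[x, y]
    if sweep % 200 == 0:
        # Print the entropy proxy (fine-grained) and complexity proxy (coarse-grained).
        print(sweep, compressed_size(grid), compressed_size(coarse_grain(grid)))
```

The interesting question, which the paper explores, is under what kinds of dynamics the complexity proxy shows a pronounced rise and fall rather than staying small throughout.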
Extra Dimensions
Many popular models in physics invoke the existence of extra dimensions of space. A great deal of work has gone into the subject of "compactification" -- how are these dimensions hidden? Less work has gone into the question of why they are hidden -- what is the dynamical mechanism that made them that way? Brandenberger and Vafa imagined that the universe began with all dimensions compact, and proposed a reason why some of them would start growing. With Matt Johnson and Lisa Randall, I looked at the opposite possibility: we started with uncompactified dimensions, and some of them curled up. This sounds hard, but we found that it happens automatically in a very simple theory, six-dimensional Einstein-Maxwell theory with a positive cosmological constant. Our construction is closely related to the idea of spontaneous decompactification, but backwards.
If space has extra compact dimensions (as predicted, for example, by string theory), the gravitational dynamics of our four-dimensional world can be altered in startling ways. If the dimensions are large (as popularized by Arkani-Hamed, Dimopoulos and Dvali), we might expect classical general relativity to apply. James Geddes, Mark Hoffman, Bob Wald and I studied extra dimensions which were completely smooth, without including the effects of branes. We found that, using only positive energy densities, the extra dimensions should be positively curved (spherical rather than toroidal or hyperbolic) in order to be stabilized. Monica Guica and I then studied a similar problem in the presence of explicit brane sources. We found that the branes deform a sphere into the shape of an American football, with the resulting four-dimensional cosmological constant given as a function of the brane tension and bulk fields. This exact solution can now be used to study cosmology and particle physics in factorizable brane models. A related idea is that of "self-tuning'' branes, proposed by Arkani-Hamed et al. and Kachru et al. In that picture, there is a single extra dimension and a bulk scalar field, and the geometry becomes insensitive to the brane tension, but only at the cost of a naked singularity in the extra dimension. Laura Mersini and I have shown that, from a cosmology perspective, the reason this happens is that the scale factor responds to the combination (energy + pressure) rather than just the energy density; for a cosmological constant the pressure is minus the energy density, and the universe is not forced to expand.
- S.M. Carroll and L. Mersini, 2001, "Can We Live in a Self-Tuning Universe?", Phys. Rev. D 64, 124008; hep-th/0105007. [arXiv; pdf; SPIRE]
- S.M. Carroll, J. Geddes, M.B. Hoffman, and R.M. Wald, 2002, "Classical Stabilization of Homogeneous Extra Dimensions," Phys. Rev. D 66, 024036; hep-th/0110149. [arXiv; pdf; inSPIRE]
- S.M. Carroll and M.M. Guica, 2003, "Sidestepping the Cosmological Constant with Football-Shaped Extra Dimensions," hep-th/0302067. [arXiv; pdf; inSPIRE]
Miscellaneous Quantum Gravity
For a while I worked on quantizing 2-dimensional Euclidean gravity via dynamical triangulations, with Miguel Ortiz and Wati Taylor. Four papers made it into the public domain: Paper One deals with some general formalism for free-variable loop equations, while Paper Two applies it all to duality of the Ising model coupled to gravity. The exciting Paper Three and Paper Four consider the Ising model with a boundary magnetic field, and compute the magnetization on the boundary and in the bulk. The behavior of the magnetization as a function of the boundary field leads to some insights about the behavior of the geometry in 2D gravity.
- S.M. Carroll, M.E. Ortiz and W. Taylor IV, 1996, "A Geometric Approach to Free Variable Loop Equations in Discretized Theories of 2D Gravity,'' Nucl. Phys. B468, 383; hep-th/9510199. [arXiv; pdf; inSPIRE]
- S.M. Carroll, M.E. Ortiz and W. Taylor IV, 1996, "Spin/Disorder Correlations and Duality in the c=1/2 String,'' Nucl. Phys. B468, 420; hep-th/9510208. [arXiv; pdf; inSPIRE]
- S.M. Carroll, M.E. Ortiz and W. Taylor IV, 1996, "The Ising Model with a Boundary Magnetic Field on a Random Surface,'' Phys. Rev. Lett. 77, 3947; hep-th/9605169. [arXiv; pdf; inSPIRE]
- S.M. Carroll, M.E. Ortiz and W. Taylor IV, 1998, "Boundary Fields and Renormalization Group Flow in the Two-Matrix Model,'' Phys. Rev. D 58, 046006; hep-th/9711008. [arXiv; pdf; inSPIRE]
General relativity is a theory of the dynamics of geometry, as described by the metric tensor. In addition to the metric, there is another important geometric object, the connection, which in GR is defined in terms of the metric. So-called connection-dynamic theories of gravity take the connection as an independent variable, and give rise to a set of fields called the torsion tensor. George Field and I studied what happens when you allow these extra fields to propagate, and described the experimental constraints on such theories. They turn out not to be very good, since there is nothing to stop the torsion fields from having very large masses: large enough to remove them from the possibility of observational constraint. A brief introduction to torsion is part of John Baez's general relativity tutorial.
Supersymmetry is a hypothesized (but as yet unobserved) relationship between particles of different spin. Supergravity, then, adds to the usual spin-2 graviton of general relativity a new particle, the spin-3/2 gravitino. There was a claim by Peter D'Eath (hep-th/9304084) that physical solutions in quantum supergravity could be found that involved only the graviton, without the gravitino. This would have had important consequences for quantizing the theory, but Dan Freedman, Miguel Ortiz, Don Page and I argued that the result was not correct, and in fact any physical state would have to have an infinite gravitino number. Csordas and Graham went on to suggest an exact solution to the constraints of quantum N=1 supergravity (gr-qc/9507008). Paulo Vargas Moniz has a nice summary of supersymmetry and supergravity, and one of supersymmetric quantum cosmology.
- S.M. Carroll, D.Z. Freedman, M.E. Ortiz, and D.N. Page, 1994, "Physical States in Canonically Quantized Supergravity,'' Nucl. Phys. B423, 661; hep-th/9401155. [arXiv; pdf; inSPIRE]
- S.M. Carroll, D.Z. Freedman, M.E. Ortiz, and D.N. Page, 1994, "Bosonic physical states in N=1 supergravity?'' Talk given at 7th Marcel Grossmann Meeting on General Relativity (MG 7), Stanford, CA, 24-30 Jul 1994; gr-qc/9410005. [arXiv; pdf; inSPIRE]
Dark Energy and Modified Gravity
The idea that most of the universe is a mysterious form of dark energy provides an excellent fit to cosmological observations, but seems unnatural. It is therefore worth pursuing alternatives, even if they seem equally unpalatable at first. One possibility is that there is no dark energy, but rather a modification of gravity kicking in on large scales. Different versions of this idea have been suggested by Deffayet, Dvali, and Gabadadze, Freese and Lewis, and Dvali and Turner. In work with Vikram Duvvuri, Mark Trodden and Michael Turner, we investigated a very simple four-dimensional theory that implements this idea: adding a term 1/R to the conventional term R in the gravitational action, where R is the curvature scalar. The model has a new tachyonic degree of freedom, and unfortunately seems inconsistent with solar-system tests of gravity. (I am not responsible for the goofy title.) Later work by others was able to develop more sophisticated models of "f(R) gravity" that are consistent with observation; see for example Hu and Sawicki. Later we welcomed aboard Antonio De Felice and Damion Easson, and investigated cosmological solutions to models with more baroque curvature modifications.
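The action we considered was, schematically (with \(M_{\rm Pl}\) the reduced Planck mass and \(\mu\) a new mass scale comparable to the present Hubble parameter),

\[
S = \frac{M_{\rm Pl}^2}{2}\int d^4x\, \sqrt{-g}\,\left( R - \frac{\mu^4}{R} \right) + S_{\rm matter},
\]

so the new term is negligible when the curvature is large but comes to dominate at late times, driving accelerated expansion.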
- S.M. Carroll, V. Duvvuri, M. Trodden, and M.S. Turner, 2003, "Is Cosmic Speed-Up Due to New Gravitational Physics?'' astro-ph/0306438. [arXiv; pdf; inSPIRE]
- S.M. Carroll, A. De Felice, V. Duvvuri, D.A. Easson, M. Trodden, and M.S. Turner, 2004, "The Cosmology of Generalized Modified Gravity Models,'' astro-ph/0410031. [arXiv; pdf; inSPIRE]
If the acceleration of the universe is due to modified gravity rather than dark energy, we may be able to experimentally detect such a modification by tests of general relativity in the ultra-low-density regime. The obvious phenomenon to consider in this regime is the formation of large-scale structure. With Ignacy Sawicki, I studied perturbation theory in a promising model of modified gravity proposed by Dvali, Gabadadze, and Porrati. DGP gravity imagines a brane embedded in an infinite Minkowski bulk, with separate Ricci curvature terms on the brane and in the bulk. We then collaborated with Alessandra Silvestri and Mark Trodden on another theory, dubbed Modified-Source Gravity, in which there are no new propagating degrees of freedom.
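Schematically (my summary, up to numerical factors), the DGP action contains an Einstein-Hilbert term in the five-dimensional bulk plus an induced one on the brane,

\[
S \sim M_5^3 \int d^5x\, \sqrt{-g^{(5)}}\, R^{(5)} \;+\; M_4^2 \int d^4x\, \sqrt{-g^{(4)}}\, R^{(4)},
\]

so gravity behaves four-dimensionally on scales below the crossover radius \(r_c \sim M_4^2/M_5^3\) and five-dimensionally above it, which is how the model modifies cosmology at very late times.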
- I. Sawicki and S.M. Carroll, 2005, "Cosmological Structure Evolution and CMB Anisotropies in DGP Braneworlds,'' astro-ph/0510364. [arXiv; pdf; inSPIRE]
- S.M. Carroll, I. Sawicki, A. Silvestri, and M. Trodden, 2006, "Modified-Source Gravity and Cosmological Structure Formation,'' astro-ph/0607458. [arXiv; pdf; inSPIRE]
One way of characterizing dark energy is through its equation-of-state parameter w=p/rho, where p is the pressure and rho is the energy density. For ordinary matter, w = 0; for radiation, w = 1/3; and for vacuum energy, w = -1. The lower (more negative) w is, the more slowly the dark energy density decreases; for w = -1 it is strictly constant, while for w < -1 the energy density actually increases as the universe expands. I have helped out the High-Z Supernova Search Team in their exploration of what kinds of dynamical energy are consistent with their results. My contribution was to provide a good reason why w < -1 could be ignored -- namely, that it violates the dominant energy condition, which is what guarantees stability of the vacuum. This issue was revisited in work with Mark Hoffman and Mark Trodden, where we considered models with w < -1, obtained by giving a negative kinetic energy to a scalar field (as proposed by Caldwell). In these models vacuum instability arises because the scalar has negative-energy excitations, and the vacuum can decay into positive- and negative-energy particles. We found that an effective theory might be phenomenologically acceptable, but only if there is a very low cutoff on its scale of validity. Mark T. and I then worked with Antonio De Felice to ask whether we could be tricked into thinking that w was less than -1 if the Friedmann equation were modified. In the specific context of the scalar-tensor theories we examined, it could only happen if the scalar potential were extremely fine-tuned.
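The statement about how the density evolves follows from the continuity equation: for constant w, the energy density scales with the scale factor a as

\[
\rho \propto a^{-3(1+w)},
\]

so matter (w = 0) dilutes as a^-3, radiation (w = 1/3) as a^-4, vacuum energy (w = -1) stays constant, and anything with w < -1 actually grows as the universe expands.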
- P.M. Garnavich, ... and S.M. Carroll [21 authors], 1998, "Supernova Limits on the Cosmic Equation of State,'' Astrophys. J. 509, 74; astro-ph/9806396. [arXiv; abstract from ADS; full article from ADS; inSPIRE]
- S.M. Carroll, M. Hoffman, and M. Trodden, 2003, "Can the dark energy equation-of-state parameter w be less than -1?,'' astro-ph/0301273. [arXiv; pdf; inSPIRE]
- S.M. Carroll, A. De Felice, and M. Trodden, 2004, "Can we be tricked into thinking that w is less than -1?,'' astro-ph/0408081. [arXiv; pdf; inSPIRE]
A popular model for dynamical dark energy is a slowly-rolling scalar field, sometimes called "quintessence." Scalar-field models are able to reproduce all of the empirical successes of a standard cosmological constant, but introducing dynamics also introduces new ways to constrain such fields. For example, the field can couple directly to standard-model particles, even if only through nonrenormalizable higher-order terms. Such a field would induce a long-range "fifth force", as well as make the constants of nature appear time-dependent. The absence of such couplings requires additional fine tunings in quintessence models. We can suppress these couplings by introducing an approximate global symmetry; this mechanism leaves open a possible pseudoscalar coupling, which might be detectable in polarization measurements. This turns quintessence into an axion, such as the ones predicted by string theory; see comments by Witten and models by Choi and Kim and Nilles.
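The coupling in question is the axion-like pseudoscalar interaction (with M some high mass scale; factors are convention-dependent)

\[
\mathcal{L} \supset \frac{\phi}{4M}\, F_{\mu\nu}\tilde{F}^{\mu\nu},
\]

which rotates the plane of linear polarization of light propagating through a slowly rolling \(\phi\) background -- the same "cosmological birefringence" effect that appears under Lorentz Violation below.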
I was fortunate enough to become an expert on the cosmological constant even before we discovered it in 1998. The first two papers here are review articles, but there are some smatterings of original thought scattered throughout. Research-wise, it has been long recognized (e.g. Abbott) that a four-form gauge field strength Fμνρσ has an energy-momentum tensor that is equivalent to that of a cosmological constant. In a conventional theory, the actual value of that energy is non-dynamical, and simply set by an initial condition. If only there were some way to get it to exactly cancel the rest of the vacuum energy! Grant Remmen and I proposed one such way to do this, by introducing a Lagrange multiplier outside the classical action.
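The basic fact being used is standard: a four-form field strength proportional to the spacetime volume element,

\[
F_{\mu\nu\rho\sigma} = c\,\epsilon_{\mu\nu\rho\sigma},
\]

solves its equation of motion for any constant c, and its energy-momentum tensor takes the vacuum-energy form \(T_{\mu\nu} = -\rho_F\, g_{\mu\nu}\) with \(\rho_F \propto c^2\), so it contributes to the effective cosmological constant an amount set by an integration constant rather than by parameters in the Lagrangian.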
- S.M. Carroll, W.H. Press and E.L. Turner, 1992, "The Cosmological Constant," Ann. Rev. Astron. Astrophys. 30, 499. [pdf; inSPIRE]
- S.M. Carroll, 2000, "The Cosmological Constant," Liv. Rev. Relativity 4, 1. [arXiv, inSPIRE, Living Reviews]
- S.M. Carroll and G.N. Remmen, 2017, “A Nonlocal Approach to the Cosmological Constant Problem," arxiv:1703.09715. [arXiv, inSPIRE]
Dark Matter
In a universe where 96% of the energy density is in a dark sector (dark matter and dark energy), it's worth keeping an open mind about what kinds of physics may be lurking therein. One possibility is a long-range fifth force coupled to dark matter. If a massless scalar field is responsible for such a force, and the dark matter couples to the SU(2)L weak interactions of the Standard Model, quantum effects will induce a fifth force between ordinary particles. With Sonny Mantry, Michael Ramsey-Musolf, and Chris Stubbs, I considered constraints on such a scenario from both astrophysical observations and laboratory experiments. If instead the force is mediated by a new U(1) gauge boson -- the "dark photon" -- the coupling to ordinary matter can be negligible, but there are interesting new effects in cosmological dark-matter dynamics. With Lotty Ackerman, Matt Buckley, and Marc Kamionkowski, I explored the constraints on such models from relic abundance calculations and primordial nucleosynthesis, and found limits on the strength of dark electromagnetism from the requirement that the dark matter be nearly collisionless. More on scalar forces here, and on dark electromagnetism here.
- S.M. Carroll, S. Mantry, M.J. Ramsey-Musolf, and C.W. Stubbs, 2008, "Dark-Matter-Induced Weak Equivalence Principle Violation," arxiv:0807.4363. [arXiv; pdf; inSPIRE]
- S.M. Carroll, S. Mantry, and M.J. Ramsey-Musolf, 2009, "Implications of a Scalar Dark Force for Terrestrial Experiments," arxiv:0902.4461. [arXiv; pdf; inSPIRE]
- L. Ackerman, M.R. Buckley, S.M. Carroll, and M. Kamionkowski, 2008, "Dark Matter and Dark Radiation," arxiv:0807.5126. [arXiv; pdf; inSPIRE]
The existence of dark matter is well-established, but its properties remain largely unknown. Various aspects of dark-matter physics and cosmology could be very different if the dark matter's mass and coupling were different in the early universe than they are today, for example due to the evolution of a cosmological scalar field. Not long before the acceleration of the universe was discovered, Greg Anderson and I proposed a model in which the dark matter consists of particles whose mass increases as the universe expands -- variable-mass particles, or "vamps". The underlying physics of our idea later served crucial roles in some cool ideas like chameleon fields and mass-varying neutrinos. More recently, Kim Boddy and Mark Trodden and I worked out a model in which the dark matter's self-interaction cross-section varied with time, allowing us to obtain the right thermal relic abundance even for particles that interact relatively strongly today.
Early Universe Cosmology
Given that the best-fit model for our universe requires so much fine tuning, it is natural to wonder whether we aren't missing some truly profound difference between our conventional cosmological model and the real world. For example, in general relativity the expansion rate is related to the energy density of the universe by the Friedmann equation; but in alternative models, including those with extra dimensions, this crucial equation might be modified. So Manoj Kaplinghat and I began to wonder about the empirical evidence in favor of this equation. If you have some well-defined alternative theory of gravity, there are all sorts of tests to which you can subject it; however, the only model-independent test of the expansion rate comes from Big-Bang Nucleosynthesis. When the universe was about a minute old, free protons and neutrons combined into light elements (mostly helium [and hydrogen, of course, for those protons which didn't combine], but also deuterium and lithium). The amounts produced depend sensitively on the expansion history, and so probe the Friedmann equation. Introducing a simple two-parameter family of possible expansion histories, we find that a one-dimensional space of possibilities is consistent with the data. Thus, a generic modification of general relativity will be ruled out, but there is still some room for very different universes.
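The equation being tested is the ordinary Friedmann relation between the Hubble parameter and the energy density,

\[
H^2 = \frac{8\pi G}{3}\,\rho
\]

(ignoring spatial curvature for simplicity); the question is how far the right-hand side can be deformed -- for instance by a different power of \(\rho\) or an overall rescaling during nucleosynthesis -- before the predicted light-element abundances conflict with observations.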
I worked with Jing Shu on putting Lorentz-violating vector fields to work to help with baryogenesis. The standard Sakharov conditions for baryogenesis include a departure from thermal equilibrium as well as violations of baryon number, C, and CP. However, once we violate Lorentz invariance, a chemical potential can arise even in equilibrium, loosening one of the most stringent requirements of baryogenesis scenarios. We had to fool around a bit to get the Lorentz-violating field to eventually go away, but otherwise the models are fairly robust.
One of the lesser-known unsolved problems of cosmology is the origin of magnetic fields in galaxies. There is no consensus on how such fields evolve with time, but there are reasons to believe that the fields observed today may have originated in the early universe. A primary concern in this game is how to take the small-scale fields from the early universe and stretch them to cosmologically interesting lengths. In the context of magnetohydrodynamics, this can happen via an inverse cascade, but only if the fields have a large amount of magnetic helicity (or Chern-Simons number, to you particle theorists). George Field and I have been examining how much inverse cascade can occur in the presence of helicity; the results are intriguing but not definitive. Our work builds on earlier investigations by Cornwall and Son. One encouraging point is that there are mechanisms to get primordial fields with substantial helicity: George and I considered one in collaboration with Dan Garretson, and a more effective scenario has been suggested by Joyce and Shaposhnikov.
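Magnetic helicity is the volume integral

\[
H_B = \int d^3x\, \mathbf{A}\cdot\mathbf{B}, \qquad \mathbf{B} = \nabla\times\mathbf{A},
\]

which measures the twisting and linking of field lines; it is very nearly conserved in a highly conducting plasma, and that approximate conservation is what pushes magnetic power from small scales to large in an inverse cascade.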
- W.D. Garretson, G.B. Field and S.M. Carroll, 1992, "Primordial Magnetic Fields from Pseudo-Goldstone Bosons,'' Phys. Rev. D 46, 5346; hep-ph/9209238. [arXiv; pdf; inSPIRE]
- S.M. Carroll and G.B. Field, 1998, "Primordial Magnetic Fields that Last?'', in 33rd Rencontres de Moriond: Fundamental Parameters in Cosmology, 17-24 January 1998, Les Arcs, France; astro-ph/9807159. [arXiv; pdf; inSPIRE]
- G.B. Field and S.M. Carroll, 2000, "Cosmological Magnetic Fields from Primordial Helicity'', Phys. Rev. D 62, 103008; astro-ph/9811206. [arXiv; pdf; inSPIRE]
Once inflation gets going, it can be hard to stop. In any one region of space, the classical motion of the inflaton will generally be down its potential, but it's always possible that "quantum fluctuations" could kick it back up. If that happens, the resulting region will expand faster than its neighbors; as a result, rapidly-inflating regions come to dominate the physical volume of space, and inflation lasts eternally. However, that conventional story relies on a crypto-Copenhagen view of quantum mechanics, where wave functions spontaneously collapse. Kim Boddy, Jason Pollack and I did a proper Everettian analysis, asking when decoherence occurs and the field actually takes on a classical value. For what it's worth, we found that the conventional answer is actually very close to correct.
Late Universe Cosmology
The idea behind "effective field theory" is to collect everything that happens at small scales and describe it in terms of an effective theory of large-scale phenomena. This is an attractive approach with which to tackle the problem of cosmological large-scale structure. With Stefan Leichenauer and Jason Pollack, I investigated the underpinnings of this program. We argue that the right cosmological theory isn't really a "field" theory at all, since temporal non-localities are an important part of the description. We also suggest an alternative foundation for the program based on the renormalization group.
The cosmic microwave background provides a wealth of information, all of which can be accounted for by a fairly simple underlying model: isotropic, nearly scale-free, Gaussian, adiabatic perturbations. Studying deviations from those assumptions is of crucial importance in verifying that we are on the right track, not to mention a potential avenue for making big discoveries. The one assumption that has rarely been loosened is that of statistical isotropy: the statistics of CMB perturbations should be the same in every direction. With Lotty Ackerman and Mark Wise, I studied the simplest possible deviation: a violation of rotational invariance, which we argued would show up first in a quadrupole power asymmetry. Mark and I later worked with Chien-Yao Tseng on violations of translational invariance in addition to rotational invariance. My paper with Lotty and Mark was explained in a series of blog posts: one, two, three.
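Schematically, the model modifies the primordial power spectrum by a quadrupolar dependence on the direction of the wavevector,

\[
P(\mathbf{k}) = P_0(k)\left[1 + g_*\,(\hat{\mathbf{k}}\cdot\hat{\mathbf{n}})^2\right],
\]

where \(\hat{\mathbf{n}}\) is the preferred direction and \(g_*\) measures the amount of rotational-invariance violation.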
- L. Ackerman, S.M. Carroll, and M.B. Wise, 2007, "Imprints of a Primordial Preferred Direction on the Microwave Background,'' astro-ph/0701357. [arXiv; pdf; inSPIRE]
- S.M. Carroll, C.-Y. Tseng, and M.B. Wise, 2008, "Translational Invariance and the Anisotropy of the Cosmic Microwave Background,'' arXiv:0811.1086. [arXiv; pdf; inSPIRE]
Inflationary cosmology predicts a very specific kind of primordial density perturbations: nearly scale-free, nearly Gaussian, nearly adiabatic. But that's kind of boring, so it's fun to look for anomalies that might provide a clue towards what really went on. One such anomaly is a claimed hemispherical power asymmetry -- the amplitude of CMB temperature perturbations seems just a bit higher (by about 10%) in one direction on the sky than in the opposite direction. Adrienne Erickcek, Marc Kamionkowski and I have taken a stab at explaining this feature of the data by imagining that a pre-inflationary supermode tilts the universe, as explained in this blog post. There are a number of interesting features of the idea, including that it doesn't really work in simple single-field slow-roll inflation, as that model predicts unacceptably large temperature anisotropies on very large scales. But we were able to fit everything by considering a curvaton model, in which the field responsible for inflating ("the inflaton") is different from the field responsible for the perturbations ("the curvaton"). Subsequent research has shown that it's hard to make a model like this consistent with large-scale structure observations, but it's not impossible, and it's still arguably the best model on the market.
- A. Erickcek, M. Kamionkowski, and S.M. Carroll, 2008, "A Hemispherical Power Asymmetry from Inflation'', arXiv:0806.0377. [arXiv; pdf; inSPIRE]
- A. Erickcek, S.M. Carroll, and M. Kamionkowski, 2008, "Superhorizon Perturbations and the Cosmic Microwave Background'', arXiv:0808.1570. [arXiv; pdf; inSPIRE]
The celebrated Sachs-Wolfe effect is the imprinting of temperature fluctuations on the cosmic microwave background radiation by gravitational perturbations in the universe. In this work, Ted Pyne and I considered the Sachs-Wolfe effect at second order in the perturbations, deriving complete formulae for the anisotropies induced by arbitrary metric fluctuations. Ahead of our time as usual, our work was eventually rediscovered (by Komatsu and Spergel and Hu and Cooray, among others). To learn about the CMB, hurry to Wayne Hu's site.
Lorentz Violation
My first ever paper was about violating Lorentz invariance -- long before it was cool. Roman Jackiw had helped pioneer the idea of Chern-Simons electromagnetism in 3 spacetime dimensions. My graduate advisor George Field, being a practical sort, wondered how it might work in four dimensions. It wouldn't, Roman replied, because it would violate Lorentz invariance. But LI is something that should be tested, not simply assumed -- so we figured out how to constrain it observationally. A four-dimensional Chern-Simons coupling to electromagnetism causes "cosmological birefringence." That means that a photon traveling through empty space will have its polarization vector gently rotated along the way; we were able to put stringent limits on such an effect. Years later, a paper by Nodland and Ralston claimed to find evidence for anisotropic effects in the propagation of polarized radio waves through the universe. The data used to support this claim are the same as those investigated years ago by our earlier paper. George and I were therefore moved to look at the data ourselves, to see if we agreed with this provocative new result. Unfortunately, we did not.
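The modification is a Chern-Simons-like term added to ordinary electromagnetism, schematically (factors depend on conventions)

\[
\mathcal{L} = -\frac{1}{4}F_{\mu\nu}F^{\mu\nu} + \frac{1}{2}\,p_\mu A_\nu \tilde{F}^{\mu\nu},
\]

where \(p_\mu\) is a fixed background four-vector. A nonzero \(p_\mu\) makes left- and right-circularly polarized light travel at slightly different speeds, so linear polarization rotates by an angle that accumulates with distance -- which is what the radio-galaxy data constrain.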
- S.M. Carroll, G.B. Field and R. Jackiw, 1990, "Limits on A Lorentz and Parity-Violating Modification of Electrodynamics,'' Phys. Rev. D 41, 1231. [pdf file; inSPIRE]
- S.M. Carroll and G.B. Field, 1991, "The Einstein Equivalence Principle and Polarization of Radio Galaxies,'' Phys. Rev. D 43, 3789. [pdf file; inSPIRE]
- S.M. Carroll and G.B. Field, 1997, "Is There Evidence for Cosmic Anisotropy in the Polarization of Distant Radio Sources?'', Phys. Rev. Lett. 79, 2397; astro-ph/9704263. [arXiv; pdf; inSPIRE]
A simple way to violate Lorentz invariance is to imagine a tensor field with a nonzero vacuum expectation value. There has been a great deal of investigation of particle-physics theories coupled to such fields, but less on their gravitational effects. Eugene Lim and I studied the simplest possible case of a timelike "aether" vector field in two contexts: Robertson-Walker cosmology, and the static Newtonian limit (applicable to the Solar System). We found that in both cases the primary effect of the aether was to renormalize the value of Newton's gravitational constant, but in different ways; the observable consequence is that the universe expands more slowly than you would otherwise expect. Heywood Tam and I studied a spacelike aether field, with a twist: pointing into an extra dimension. If other fields couple to the vector field, they can pick up additional mass associated with their extra-dimensional momentum, making them harder to detect. Unfortunately you have to wildly tune some numbers to make very big dimensions, but the physical effect is still interesting. Heywood and I later collaborated with Ingunn Wehus on the possibility of emergent gravity from Lorentz violation, in which the graviton is a Goldstone boson associated with a fixed-norm tensor field.
- S.M. Carroll and E.A. Lim, 2004, "Lorentz-Violating Vector Fields Slow the Universe Down," hep-th/0407149. [arXiv; pdf; inSPIRE]
- S.M. Carroll and H. Tam, 2008, "Aether Compactification," arXiv:0802.0521. [arXiv; pdf; inSPIRE]
- S.M. Carroll, H. Tam, and I. Wehus, 2009, "Lorentz Violation in Goldstone Gravity," arXiv:0904.4680. [arXiv; pdf; inSPIRE]
Since Lorentz-violating aether fields have so many fun uses, it's important to verify that the theories are well-behaved. With Tim Dulaney, Moira Gresham, and Heywood Tam, I investigated perturbations in the aether. We found that the results of a naive stability analysis depend sensitively on which Lorentz frame you are looking in -- in a boosted frame, a purportedly stable model begins to look unstable. One exception was what we called "sigma-model aether," so we looked at the empirical constraints on that model. Our stability results have subsequently been challenged by Donnelly and Jacobson, who argue that everything can be fixed if you choose boundary conditions carefully.
The idea that spacetime may be intrinsically noncommutative --- that a product of functions f(x)g(x) may not equal g(x)f(x) [here's a review] --- has been around for a while, and enjoyed a resurgence in popularity following a big paper by Seiberg and Witten on the connection to string theory (essentially, that gauge theories on branes in a background antisymmetric tensor field are automatically non-commuting). It is natural to ask whether the real world might be noncommuting, and in particular what bounds we can place on the noncommutativity parameter (an antisymmetric two-index tensor). Jeff Harvey, Alan Kostelecky, Charles Lane, Takemi Okamoto and I considered how to obtain bounds from the fact that non-commutativity necessarily violates Lorentz invariance, and we already have good bounds on the various ways that Lorentz violation can be manifested in particle and atomic physics (Alan has a nice FAQ if you want to know more about such things). We found that the mass scale characteristic of non-commutativity must be larger than about 10 TeV. It has been subsequently claimed that infrared effects allow for a much more stringent bound.
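Noncommutativity is usually encoded in a relation of the form

\[
[x^\mu, x^\nu] = i\,\theta^{\mu\nu},
\]

where \(\theta^{\mu\nu}\) is the constant antisymmetric two-index tensor mentioned above; a constant tensor picks out preferred directions in spacetime, which is why Lorentz invariance is necessarily broken, and the bounds amount to saying that the associated mass scale \(|\theta|^{-1/2}\) must exceed roughly 10 TeV.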
Topological Defects
Extended objects play an important role in string theory (where they are known as "branes") and in field theory (where they are known as "topological defects" or "solitons"). Exact solutions representing such objects can be hard to come by, but there is sometimes a special set of static solutions, known as "Bogomolny" or "BPS" depending on context, which minimize the energy given certain boundary conditions and often have other nice properties (such as preserving supersymmetry).
Simeon Hellerman, Mark Trodden and I have shown that one example of such a BPS state is a junction of domain walls. We've looked at N=1 supersymmetric theories in four dimensions with a finite number of discrete vacua, and argue that wall-junction configurations exist which preserve precisely one supercharge. The same conclusion was reached independently (and a couple of days earlier, if you want the truth) by Gibbons and Townsend.
One application of such configurations is to the suggestion by Randall and Sundrum that our world could be a domain wall embedded in a (noncompact) five-dimensional space. Their scenario seems to work most straightforwardly if there is a single extra dimension, but Arkani-Hamed, Dimopoulos, Dvali and Kaloper pointed out that more large dimensions could be accommodated when one considers wall junctions. We therefore derived equations describing such junctions in a supersymmetry-inspired theory of scalars coupled to gravity, in the spirit of similar work on single walls by Behrndt and Cvetic, Skenderis and Townsend, and DeWolfe, Freedman, Gubser and Karch.
- S.M. Carroll, S. Hellerman, and M. Trodden, 1999, "Domain Wall Junctions are 1/4-BPS States'', Phys. Rev. D 61, 65001; hep-th/9905217. [arXiv; pdf; inSPIRE]
- S.M. Carroll, S. Hellerman, and M. Trodden, 1999, "BPS Domain Wall Junctions in Infinitely Large Extra Dimensions'', Phys. Rev. D, 61, 044049; hep-th/9911083. [arXiv; pdf; inSPIRE]
A prominent role in recent developments in string theory has been played by Dirichlet branes (D-branes for short), which are higher-dimensional membranes on which fundamental strings can end. Given the similarities between fundamental strings and stringlike solitons (vortices or cosmic strings) in field theories, it is natural to ask whether models exist of scalar fields with configurations in which one soliton can end on another of equal or higher dimensionality --- a Dirichlet topological defect. Mark Trodden and I have succeeded in constructing a set of such models in 3+1 dimensions: walls ending on walls, strings ending on walls, and strings ending on strings. Our strings ending on walls are conceivably related to supersymmetric QCD strings, which can end on QCD walls (as discussed by Witten).
When a field theory with symmetry group G is spontaneously broken to a subgroup H, the order parameter (typically a set of scalar fields) takes values within a vacuum manifold isomorphic to the quotient set G/H. There is a lovely relationship between the homotopy groups of this vacuum manifold and the existence of topological defects in the broken field theory. If the homotopy group π0 is nontrivial, there will be domain walls; if π1 is nontrivial, there will be cosmic strings; and if π2 is nontrivial, there will be monopoles. Of course it's very natural to ask about π3. In a gauge theory, nothing interesting happens, as configurations that appear to have nontrivial π3 are actually gauge-equivalent to the identity. But when it's a global symmetry that is broken, you can get cosmic texture, an unstable field configuration that collapses and unwinds to a trivial configuration. (Back when I was your age, people hoped that such events might be relevant to accounting for large-scale structure in cosmology, but nowadays we know better.) It's not hard to find this or that theory that leads to texture, but we wanted to be more systematic. So Jim Bryan, Ted Pyne, and I went through a large number of possible symmetry-breaking theories, calculating the relevant homotopy groups for all of them. Then we joined with Andrew Sornborger to numerically simulate the dynamics of exotic textures, which turned out to be surprisingly rich.
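The canonical example: break a global SU(2) symmetry completely, so the vacuum manifold is SU(2) itself, which as a manifold is the three-sphere. Then

\[
\pi_3\big(SU(2)\big) = \pi_3\big(S^3\big) = \mathbb{Z},
\]

and field configurations can wrap the vacuum manifold nontrivially over space, giving texture even though \(\pi_0\), \(\pi_1\), and \(\pi_2\) are all trivial.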
Miscellaneous
Here are some general review articles on cosmology and dark energy. See also the entry for "The Cosmological Constant" under "Dark Energy and Modified Gravity."
- S.M. Carroll, 1999. "TASI Lectures: Cosmology for String Theorists." [arXiv; inSPIRE]
- S.M. Carroll, 2001. "Dark Energy and the Preposterous Universe." [arXiv; inSPIRE]
- D.S. Akerib, S.M. Carroll, M. Kamionkowski, and S. Ritz, 2002. "Particle astrophysics and cosmology: Cosmic laboratories for new physics (Summary of the snowmass 2001 P4 working group)." [arXiv; inSPIRE]
As an undergraduate I became involved in a number of projects involving variable stars. At Villanova I worked on observations and modelling of the well-known eclipsing binary Epsilon Aurigae. We found that the invisible companion in this system is most likely a large semi-transparent disk, possibly a protoplanetary system. In 2010 Epsilon Aur went into eclipse again; Brian Kloppenborg, Bob Stencel and others led an effort to use interferometry to obtain some amazing images of the star in eclipse. At the CfA I played a small role in the HK Project, a long-term effort to track the chromospheric activity on a large number of stars. This activity is related to starspot cycles, and has implications for the behavior of the Sun.
- S.M. Carroll, E.F. Guinan, G.P. McCook and R.A. Donahue, 1991, "Interpreting Epsilon Aurigae," Astrophys. J. 367, 278. [abstract from ADS; full article from ADS; pdf]
- S.L. Baliunas et al., 1995, "Chromospheric Variations in Main-Sequence Stars. II,'' Astrophys. J. 438, 269. [abstract from ADS; full article from ADS; pdf]
- B. Kloppenborg et al., 2010, "In the Shadow of the Transiting Disk: Imaging epsilon Aurigae in Eclipse," Nature 464, 870-872. [arXiv; pdf; inSPIRE]