I have a long-percolating post that I hope to finish soon (when everything else is finished!) on “Why String Theory Must Be Right.” Not because it actually must be right, of course; it’s an hypothesis that will ultimately have to be tested against data. But there are very good reasons to think that something like string theory is going to be part of the ultimate understanding of quantum gravity, and it would be nice if more people knew what those reasons were.
Of course, it would be even nicer if those reasons were explained (to interested non-physicists as well as other physicists who are not specialists) by string theorists themselves. Unfortunately, they’re not. Most string theorists (not all, obviously; there are laudable exceptions) seem to not deem it worth their time to make much of an effort to explain why this theory with no empirical support whatsoever is nevertheless so promising. (Which it is.) Meanwhile, people who think that string theory has hit a dead end and should admit defeat — who are a tiny minority of those who are well-informed about the subject — are getting their message out with devastating effectiveness.
The latest manifestation of this trend is this video dialogue on Bloggingheads.tv, featuring science writers John Horgan and George Johnson. (Via Not Even Wrong.) Horgan is explicitly anti-string theory, while Johnson is more willing to grant that it might be worthwhile, and to admit that he’s not really qualified to pass judgment. But you’ll hear things like “string theory is just not a serious enterprise,” and see it compared to pseudoscience, postmodernism, and theology. (Pick the boogeyman of your choice!)
One of their pieces of evidence for the decline of string theory is a recent public debate between Brian Greene and Lawrence Krauss about the status of string theory. They seemed to take the very existence of such a debate as evidence that string theory isn’t really science any more — as if serious scientific subjects were never to be debated in public. Peter Woit agrees that “things are not looking good for a physical theory when there start being public debates on the subject”; indeed, I’m just about ready to give up on evolution for just that reason.
In their rush to find evidence for the conclusion they want to reach, everyone seems to be ignoring the fact that having public debates is actually a good thing, whatever the state of health of a particular field might be. The existence of a public debate isn’t evidence that a field is in trouble; it’s evidence that there is an unresolved scientific question in which many people are interested, which is wonderful. Science writers, of all people, should understand this. It’s not our job as researchers to hide away from the rest of the world until we’re absolutely sure that we’ve figured it all out, and only then share what we’ve learned; science is a process, and it needn’t be an especially esoteric one. There’s nothing illegitimate or unsavory about allowing the hoi polloi the occasional glimpse at how the sausage is made.
What is illegitimate is when the view thereby provided is highly distorted. I’ve long supported the rights of stringy skeptics to get their arguments out to a wide audience, even if I don’t agree with them myself. The correct response on the part of those of us who appreciate the promise of string theory is to come back with our (vastly superior, of course) counter-arguments. The free market of ideas, I’m sure you’ve heard it all before.
Come on, string theorists! Make some effort to explain to everyone why this set of lofty speculations is as promising as you know it to be. It won’t hurt too much, really.
Update: Just to clarify the background of the above-mentioned debate. The original idea did not come from Brian or Lawrence; it was organized (they’ve told me) by the Smithsonian to generate interest and excitement for the adventure of particle physics, especially in the DC area, and they agreed to participate to help achieve this laudable purpose. The fact, as mentioned on Bloggingheads, that the participants were joking and enjoying themselves is evidence that they are friends who respect each other and understand that they are ultimately on the same side; not evidence that string theory itself is a joke.
It would be a shame if leading scientists were discouraged from participating in such events out of fear that discussing controversies in public gave people the wrong impression about the health of their field.
Oh, sorry – I meant to close the italics after “a posteriori”. BTW you can get Λ by typing this: &Lambda;
Mark, I’m thinking about it, but my initial reaction is that if you know nothing about Λ then you are not entitled to expand in powers of E/Λ.
On Renormalization (posts 446-468)
Renormalization is an active area in mathematical physics,
with a long history (starting with Glimm & Jaffe’s phi^4 theory, via constructive field theorists e.g. Rivasseau, Froehlich, and more recently Fredenhagen & Brunetti). Ideally one should ask one of these people to discuss this issue on this blog.
Until one of them arrives, here is my take on it.
If one accepts that a quantum theory is a representation
of some algebra of observables (usually C*), then:
1) Whilst for a canonical quantum theory with finitely many
degrees of freedom there is essentially only one
irreducible representation (cf. the Von Neumann uniqueness
theorem), for canonical quantum field theory – which has
infinitely many degrees of freedom – there is a vast number
of inequivalent representations.
2) It is known from a range of models that even mild interactions
cannot be unitarily implemented in the Fock representation, i.e.
time evolution moves you out of it
(cf. e.g. J.Math.Phys. 26(6) p1264, 1985 and others)
3) By Haag’s theorem the Hamiltonian of an interacting QFT together with an invariant vacuum vector cannot exist in the representation of the free field (hence the interaction picture only exists if there is no interaction, cf. this survey paper in Erkenntnis 64 p305, 2006)
So one knows from the outset that when you construct a perturbation series for the dynamics of an interacting QFT in the usual Fock representation, the mathematics will not allow it; something will go wrong, unless you have a procedure for leaving the Fock representation and going to another, inequivalent representation. That is where renormalization comes in.
Here’s a sketch of how it works. Start, for example, from the situation in Mark Srednicki’s post #452: an infinite lattice of quantum oscillators with, say, bounded nearest neighbour interactions. Then make a chain of regions in the lattice, increasing to the full lattice, and for each region R choose a ground state v_R for the operator
h_R := H_R – E_R
where H_R is the “partial Hamiltonian” of R, i.e. it involves only the lattice points in R, and E_R is its smallest eigenvalue. These vectors v_R through their expectation values define states on the algebra
w_R(A) := (v_R, A v_R)
and as the set of states is w*-compact, the net of states w_R has a w*-accumulation point w, which can therefore be written as a limit along a subsequence of regions:
w(A) = lim_n w_{R_n}(A) for all A
The state w defines a new representation for the algebra
(GNS-construction) in which there will be a selfadjoint
Hamiltonian producing the desired dynamics, and with a ground
state. This is interpreted as renormalisation because, when lim_n E_{R_n} = infinity, it looks like an infinite subtraction procedure. But the point is that you have left your original representation for this new one.
So one can look at renormalisation as a way of moving out of the Fock representation to a different representation where one has a well-defined dynamics and vacuum.
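To see the subtraction at work numerically, here is a toy sketch in the spirit of the lattice model above (the parameters are illustrative choices for this example: a tridiagonal coupling matrix with on-site term 1+2g and nearest-neighbour term -g, not anything from the thread). The total ground-state energy E_R grows without bound as the region grows, while the energy per site converges, which is exactly the situation where subtracting E_R looks like an infinite subtraction:

```python
import numpy as np

def ground_energy(n, g=0.5):
    """Ground-state energy of n coupled harmonic oscillators,
    H = sum_i p_i^2/2 + (1/2) x^T K x, with a positive-definite
    tridiagonal K (diagonal 1+2g, off-diagonal -g, open chain):
    E_0 = (1/2) sum_k omega_k, where omega_k^2 are eigenvalues of K."""
    K = (1.0 + 2.0 * g) * np.eye(n)
    K -= g * (np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
    return 0.5 * np.sqrt(np.linalg.eigvalsh(K)).sum()

for n in (10, 100, 1000):
    e = ground_energy(n)
    print(f"E_{n} = {e:10.3f}   per site: {e / n:.5f}")
```

E_R diverges linearly with region size, so the "subtraction of an infinite constant" is just the removal of an extensive ground-state energy; the per-site quantity is perfectly finite.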
__________________________________________________________
Whilst there is a good interpretation for (some) renormalization
procedures, there still remain serious problems in the
mathematics of perturbation series, for instance, the renormalised
perturbation expansions of QFT in general do not converge
(they are ‘asymptotic series’: the successive terms shrink up
to some order, but beyond that the partial sums diverge).
So it still does not define a time evolution mathematically.
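The behaviour of an asymptotic series can be illustrated with Euler’s classic textbook example (not specific to QFT): the integral I(x) = int_0^inf e^{-t}/(1+xt) dt has the asymptotic expansion sum_n (-1)^n n! x^n, whose partial sums first approach the true value and then run away:

```python
import math
import numpy as np

x = 0.1

# "True" value of Euler's integral I(x) = int_0^inf e^{-t}/(1+x*t) dt,
# via a brute-force midpoint rule (the tail beyond t = 60 is negligible).
dt = 1e-3
t = (np.arange(60000) + 0.5) * dt
true_val = float(np.sum(np.exp(-t) / (1.0 + x * t)) * dt)

# Partial sums of the asymptotic expansion sum_n (-1)^n n! x^n:
# the error shrinks up to an optimal order ~ 1/x, then blows up,
# because n! eventually beats x^n for any fixed x.
errors, s = [], 0.0
for n in range(25):
    s += (-1) ** n * math.factorial(n) * x ** n
    errors.append(abs(s - true_val))

best = min(range(25), key=errors.__getitem__)
print(f"best truncation order: {best}, error there: {errors[best]:.2e}")
print(f"error at order 24: {errors[24]:.2e}")
```

The optimal truncation gives a small but irreducible error; pushing to higher orders only makes things worse, which is the sense in which the series "does not converge" while still being numerically useful.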
Of course there is also the problem of what the algebra of
the interacting QFT should be, and how to interpret the
pointwise products of distributions which occur in many expressions
(some recent advances by Fredenhagen & Brunetti on that one).
Chris wrote:
Mark, I’m thinking about it, but my initial reaction is that if you know nothing about Λ then you are not entitled to expand in powers of E/Λ.
There might be two issues that are being conflated here. The first is that there might be a physical cutoff scale, Λ, at which there are new particles, etc. For instance, in the SM this might be TeV-ish. In that case you can expect relatively large contributions from higher-dimension operators, but (see above) experimentally these aren’t seen, so it seems to be safe to use a larger cutoff for most things.
The other point is that there is an unphysical renormalization scale, μ, which is not physically meaningful and in fact does not enter the predictions because of the Callan-Symanzik equations. (If you truncate perturbation theory at a small order, μ can have an important effect, but computing loops will reduce this.) This scale μ can get transmuted into a real, physical cutoff, Λ. In abstract QED coupled to no other physics, this is the Landau pole, and you can only make predictions up to corrections in powers of (E/Λ). This Λ is not arbitrary, but physical. Of course in real-world QED the cutoff is lower — it’s where the rest of the SM starts to matter — so your predictions are valid up to corrections suppressed by that cutoff. In QCD, the physical scale Λ is small, and you can make corrections for high-energy processes that are undetermined in perturbation theory up to powers of (Λ/E). (Nonperturbatively, since QCD is asymptotically free we expect that it exists and that these power corrections are actually determined.)
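A sketch of the Landau-pole point: in pure one-loop QED with only the electron in the loop (a deliberate simplification; the real-world running of alpha involves all charged particles, and the numbers below are only indicative), the coupling grows with scale and 1/alpha hits zero at a finite scale:

```python
import math

alpha_me = 1.0 / 137.036   # fine-structure constant at the electron mass scale
m_e = 0.511e-3             # electron mass in GeV

def alpha_qed(mu_gev):
    """One-loop running coupling of QED with a single electron:
    1/alpha(mu) = 1/alpha(m_e) - (2/(3*pi)) * ln(mu/m_e)."""
    inv = 1.0 / alpha_me - (2.0 / (3.0 * math.pi)) * math.log(mu_gev / m_e)
    return 1.0 / inv

# Scale where 1/alpha(mu) reaches zero: the one-loop Landau pole.
landau = m_e * math.exp(3.0 * math.pi / (2.0 * alpha_me))
print("alpha(91 GeV) ~", alpha_qed(91.0))
print("log10(Landau pole / GeV) ~", math.log10(landau))
```

The pole sits at roughly 10^277 GeV in this toy version, vastly above the Planck scale, which is why the cutoff on abstract QED is physical but in practice irrelevant compared to the scale where the rest of the SM kicks in.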
The idea of universality is really key here, and you really should try to understand it instead of arrogantly insisting that you understand these issues better than the large community of people who work with them every day.
Hendrik,
I do not believe that these complicated constructs are necessary to get around Haag’s theorem. AFAIC the theorem means that we have to abandon hope of doing perturbation theory in QFT in the same way that we did for QM, but this is not the end of the world. One may construct interacting fields from sums of tensor products of free fields, and obtain non-zero matrix elements for real processes that agree with experiment because of the interference between free fields at different orders. A contribution to e+ e- → γγ, for example, can arise from a product of free fields (γ cr)(γ cr)(e+ annih) in the expansion of the interacting e- field. This matrix element cannot be reproduced by a system where H = H_0 + V, except as an approximation, which is what one would expect on account of Haag’s theorem.
Anon.,
I am certainly not insisting, arrogantly or otherwise, that I understand the issues in building effective field theories better than the people who work with them full-time. I do not want to build effective field theories. I would rather see it all derived from first principles.
The problem with trying to find a rigorous mathematical construction of quantum field theory (in four dimensions) is that there is considerable evidence that such a construction does not exist: with the exception of nonabelian gauge theories, all 4d quantum field theories are believed to be “trivial”, that is, they become free-field theories in the limit that the cutoff is removed. (See, e.g., “On the Triviality of Textbook Quantum Electrodynamics” by S. Kim, J. B. Kogut, and M.-P. Lombardo, hep-lat/0009029.)
As explained by Hendrik, Haag’s theorem (that a free field cannot be turned into an interacting field by a unitary transformation) is also an artifact of the infinite-cutoff limit. Since one must do perturbation theory with a cutoff in place, Haag’s theorem does not invalidate the conventional interaction-picture approach.
And constructing 4d nonabelian gauge theory rigorously is thought to be a very hard problem; you can get a million dollars from the Clay Foundation if you solve it.
Even within pure mathematics, it is often useful to assume unproved results, and proceed. A huge amount of work has been done in analytic number theory assuming that the Riemann hypothesis is true. But that’s unproven, and another hard problem that you can get a million dollars for solving.
Hendrik,
Although interacting fields are not unitarily equivalent to some Fock representation, all need not be lost. A key lesson from CFT in 2D (key = important to me) is that non-Fock representations may be constructed as factor spaces: a minimal model is a Fock rep modulo singular vectors. What seems to remain true is that energy is bounded from below also in interacting models. Put bluntly, the position and momentum representations are wrong in field theory, because energy is not bounded from below there. Unlike the Bargmann rep.
The lowest-energy property is inherited by symmetry groups acting on the Hilbert space. This is why we are interested in lowest-energy reps in quantum theory. Non-trivial unitary lowest-energy reps of diffeomorphism and gauge algebras are necessarily anomalous, and a quantized observer jumps right out of their representation theory.
You are correct: I am assuming that the high energy theory respects translation invariance. There is again an interesting general lesson: I can also make the opposite assumption and see where it leads. The effect would be that the low energy effective theory would also violate this symmetry. So I can conclude that the second alternative is false. I can do the same thing for Lorentz invariance, and conclude that the high energy theory respects Lorentz invariance (or some symmetry that reduces to it at low energy, but none such is known). That is, Planck-scale breaking will feed into the low energy theory. So approaches to quantum gravity that give up Lorentz invariance at the start are, I believe, doomed.
On the subject of axiomatic quantum field theory, I note that this has largely gone over to the effective field theory point of view, where quantum fluctuations are considered scale-by-scale. See for example the review math-ph/9902023 by Rivasseau. In fact, if the only issue in the Clay prize were the UV limit of non-Abelian gauge theory, Tadeusz Balaban would probably already have won the million (see references in above review), using renormalization group methods. However, to get the Clay money one also has to prove a mass gap, which is a much harder problem.
> The effect would be that the low energy effective theory would also violate this symmetry.
But what about lattice-QCD etc. (simulated in the Euclidean sector). Rotation symmetry is broken, but restored at large distances.
“You say, “Subtracting infinity from infinity is not what’s done”, but that is what is in the text books.”
Chris Oakley on Apr 19th, 2007 at 6:06 pm
That is only a reflection of the fact that many textbooks are still written for the purpose of introducing students to the calculational tools of the subject, and do not explain the revolution in the conceptual understanding of the subject resulting from the work of Ken Wilson (for which he won a Nobel prize). An outstanding exposition of the modern understanding of renormalization and quantum field theory is the text:
“Quantum Field Theory and Critical Phenomena” by Zinn-Justin [Oxford, 4th ed, 2002]
A simpler exposition of the same ideas is in part 2 [“Renormalization”] of the text:
“An Introduction to Quantum Field Theory” by Peskin and Schroeder [Harper-Collins 1995]
An older article (“from the horse’s mouth”) is the review:
“The Renormalization Group and The Epsilon Expansion” by Wilson and Kogut
[Phys.Rept. 12, 1974, p.75-200]
As noted by Joe Polchinski above, modern constructive quantum field theory is done by these methods as well. As well as the Rivasseau review quoted, an older reference is:
“Quantum Physics: A Functional Integral Point of View” by Glimm and Jaffe [out of print].
Good luck
On another matter, you stated above that:
“Since none of this jiggery-pokery is needed for classical mechanics or electrodynamics, I do not see how you can say that such theories are no different.”
Actually this is not correct. The classical theory of charged point electrons interacting with the classical electromagnetic field is completely pathological. There is a divergent self-energy for the electron due to the energy of its Coulomb field, which diverges much faster (power law) in the classical case than in the quantum case (logarithmically), considered as a function of the cutoff on the “size” of the electron.
Attempting to remove this by putting in a bare mechanical mass that adds to the electrostatic self-energy (for finite electron size) to give the physical mass, and including the internal stresses in the electron needed to maintain the finite size (with hopes of taking the size to zero at the end of the calculation), was done by Lorentz. The resulting theory is pathological: the electron motion obeys a third order ordinary differential equation with solutions that reduce to the standard solutions of the naive second order differential equation experiencing an acausal “preacceleration” *before* the application of the force producing the acceleration. This is explained in the last chapter of the text “Classical Electrodynamics” by Jackson [second edition; that’s the one I used in graduate school when I learned this…I don’t know if/where this is in later editions].
This is in addition to the impossibility of forming stable atoms in the classical theory, and the liberation of an infinite amount of energy as an electron falls in to a (pointlike) classical proton, destroying the observable universe with its radiation… It’s really classical mechanics and electrodynamics that are seriously pathological (though we largely hide this from our undergraduates), and quantum theories that ameliorate the situation (cf. the much more gentle growth of the electron self-energy as a function of the cutoff electron radius, as it goes to zero, in the quantum as opposed to the classical theory).
Misplaced nostalgia should not blind us to how much better behaved are our quantum theories than the classical theories that they replaced.
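The preacceleration Diogenes describes can be checked directly. For the Abraham-Lorentz equation m*(a - tau*da/dt) = F(t) with a step force F0*theta(t), the standard non-runaway solution accelerates *before* the force switches on; the finite-difference check below (illustrative unit values of m, tau, F0, not physical ones) confirms that it solves the equation for t < 0 even though the force vanishes there:

```python
import numpy as np

m, tau, F0 = 1.0, 0.1, 1.0  # illustrative units, not physical values

def a(t):
    """Non-runaway solution of m*(a - tau*da/dt) = F0*theta(t):
    the acceleration ramps up exponentially *before* t = 0
    (preacceleration), then is constant once the force is on."""
    return np.where(t < 0.0, (F0 / m) * np.exp(t / tau), F0 / m)

t = np.linspace(-0.5, -1e-3, 1000)        # check on the t < 0 branch
h = 1e-6
adot = (a(t + h) - a(t - h)) / (2.0 * h)  # central finite difference
residual = m * (a(t) - tau * adot)        # should equal F(t) = 0 for t < 0
print("max |residual| for t < 0:", np.abs(residual).max())
print("a(-3*tau) / a(0-) =", float(a(np.array(-3.0 * tau)) / (F0 / m)))
```

The residual vanishes to numerical precision, so the electron really is accelerating at t = -3*tau (at about e^-3 of its final value) before any force acts: the acausality is in the equation itself, not in a sloppy solution.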
Wolfgang #435,
I maintain that time is the reading of a clock. If time is a c-number parameter rather than a quantum operator, it must mean that clocks in QM are classical and thus macroscopic. In contrast, clocks (or test particles) in GR are assumed light, so we can ignore their self-interaction with gravity.
This gives a fresh angle on why QM and GR are incompatible: clocks are macroscopic in QM and microscopic in GR. Hence clocks in QG should be mesoscopic; a clock’s position and velocity do not commute.
> This gives a fresh angle on why QM and GR are incompatible
I think this is indeed pretty much the core problem, but not really “fresh”. Pauli’s argument is from the 1920s and Wigner and Salecker discussed clocks in QM in the 1950s…
Mark (#480)
Concerning your point:-
“The problem with trying to find a rigorous mathematical construction of quantum field theory (in four dimensions) is that there is considerable evidence that such a construction does not exist: with the exception of nonabelian gauge theories, all 4d quantum field theories are believed to be “trivial”, that is, they become free-field theories in the limit that the cutoff is removed.”
This argument is of course not a general mathematical proof, so there is still hope. As far as I see, it only says that 4d-QFT cannot be obtained by a limit procedure from lattice QFTs in a very specific form. There are many possible frameworks in which one can try to model QFT rigorously, and I cannot see how they can all be excluded by the lattice QFT approach. For instance, the approach which I follow – through C*-algebras and their representations – can solve many problems of QFT (e.g. constraint problems – see my posts #204 and #355 above) by using the more extensive representation theory available to you. If the limit of lattice QFTs takes place in one specific representation, then it excludes some of these representations. On the other hand, lattice QFT cannot approximate the C*-algebras in the operator norm because the lattice algebras are separable whilst the CCR-algebra is nonseparable.
I also think this is a strange position to want to defend;- it seems to say that there is no logically consistent theory underlying 4d-QFT. In other words, the framework is contradictory in an essential way. Given the strong connection of the Standard Model with experiment, I would say that that is strong evidence for some consistent underlying mathematical theory for 4d-QFT, or at least for the more observable parts of it. Of course that is an article of faith in the consistency of nature.
“Haag’s theorem (that a free field cannot be turned into an interacting field by a unitary transformation) is also an artifact of the infinite-cutoff limit.”
I don’t think I understand this – it looks rather the other way round to me: Haag’s theorem forces you to go to other representations for interacting QFT, and the infinite-cutoff limit takes you to another representation. (That’s what happened in the toy model of my last post #477)
“Even within pure mathematics, it is often useful to assume unproved results, and proceed.”
Certainly, but all this work will disappear at the blink of an eye if the result gets disproved. In the case of physics, one requires both experimental support as well as logical (mathematical) consistency. The first often takes precedence over the latter, but eventually the logic must be sorted out. Especially when there are clear contradictions. Moreover, when you extrapolate an inconsistent theory into an unfalsifiable domain, (such as for some first-instant cosmology, or possibly string theory) there can be no justification for your conclusions.
_____________________________________________________________
Thomas (#481)
“Although interacting fields are not unitarily equivalent to some Fock representation, all need not be lost. A key lesson from CFT in 2D (key = important to me) is that non-Fock representations may be constructed as factor spaces: a minimal model is a Fock rep modulo singular vectors.”
I don’t know the details of your construction, but perhaps it is related to the standard way of constructing a representation from a given state on a C*-algebra (GNS-construction) which is also a factor space construction. But the main lesson remains, for interacting QFT one needs to leave the Fock representation.
… or it means that the interacting theory in 4d is built from free field states, which satisfy the axioms.
As for Haag’s theorem, one of the ingredients is Lorentz invariance. Since this is violated by introducing a cutoff, Haag’s theorem probably does not apply in this instance. So what?
But the connection between quantum clocks and anomalies is fresh. In projective representations of the diffeomorphism algebra, the Virasoro-like extension is a functional of the clock’s worldline (in 1D there is only one worldline and the extension becomes central). That’s why we cannot formulate this kind of diff anomaly in QFT proper, where the observer and her clock are not in the picture.
What are Fock representations?
Chris #479 and #488
“I do not believe that these complicated constructs are necessary to get around Haag’s theorem…”
Actually, in the toy model in which I described renormalisation, Haag’s theorem does not apply – the model need not even be translation invariant w.r.t. the lattice (e.g. by adding different bounded potentials to each point). The need for renormalisation came from the fact that the total ground state energy of an infinity of quantum oscillators is infinite in the original representation (infinite tensor product of Schroedinger reps of the oscillators), as pointed out by Mark Srednicki in his post #452 , and hence in this representation we do not have an invariant vacuum vector. One needs to go to a different representation to get such a vacuum, and this is what the limit of renormalisation accomplishes.
“AFAIC the theorem means that we have to abandon hope of doing perturbation theory in QFT in the same way that we did for QM,”
Structurally, the problem is this:- We have an algebra of observables together with an automorphic action of time evolutions on it (and possibly some symmetry groups). There is an initial representation of the algebra given (in which it is defined). Then one demands for a physical representation that the time evolution be unitarily implemented, that the generator of this unitary group be bounded below, and that the lowest point in the spectrum be discrete (i.e has an eigenvector=vacuum). For interacting theories there is no guarantee that the original representation has these properties, and a range of examples – including my toy example above – show that this is not the case. In fact even unitary implementability can fail. Haag’s theorem proves that under some natural general assumptions for interacting 4d-QFT, if the original representation is the Fock representation then it does not have the required physical properties. However, the general problem is more general than Haag’s theorem, and in fact is a separate issue from perturbation theory.
The (mathematical) techniques of renormalisation are methods for finding new representations with the correct physical properties from the data of the original representation. In my toy model, the dynamics can be defined perfectly well without any perturbation theory, however, renormalization is still necessary.
“One may construct interacting fields from sums of tensor products of free fields, and obtain non-zero matrix elements for real processes that agree with experiment because of the interference between free fields at different orders.”
I had a quick look at your paper;- if you can make your claims mathematically rigorous, you would have solved one of the biggest outstanding problems of mathematical physics. May be worth a try.
“or it means that the interacting theory in 4d is built from free field states, which satisfy the axioms.”
I don’t know what you mean by “is built from free field states”. Since from an irreducible set of normal operators I can construct any other normal operator on that Hilbert space, in a sense I can already build any interacting field (once they are properly defined as operator-valued distributions) from free fields, but this notion is too general to be useful.
“As for Haag’s theorem, one of the ingredients is Lorentz invariance. Since this is violated by introducing a cutoff, Haag’s theorem probably does not apply in this instance. So what?”
It depends what you want;- if you regard Poincare transformations as physical (on Minkowski space) I would have thought in your final QFT that you require it. Else, what do you mean by a relativistic theory? Anyway, as I argued above, it is not just Haag’s theorem which produces problems for interacting theories – the representational problem is more general.
Hi Hendrik,
OK, but also I am making no distinction between the free and interacting vacuums. In the presence of interactions, the annihilation parts of the free field operators still annihilate the vacuum. This means, of course, that the Hamiltonian of the interacting theory is bounded below in the same way as in the free field theory, so this at least is not an issue.
It is useful because (i) knowing the free-field (anti-)commutators enables one to read off matrix elements directly & (ii) the expansions of the interacting field in terms of free fields are determined by the equations of motion. This gives one a full calculational framework.
The basic ideas of this approach have been around since 1934, and I think that – in most ways – it is simpler than the interaction-picture based perturbation theory in the text books.
In regard to mathematical rigour, the main issue, as I see it, is this: local field equations always seem to result in infinities. One can remove them by normal ordering at each order of a power-series expansion in the coupling, but this still seems like a cheat. It could well be that there is a way of getting to the quasi-local, infinity-free expansion by solving the spacelike commutativity/anticommutativity requirement directly, but if there is, I have not found it yet.
Hendrik,
I think so.
I might add that honesty/transparency is much the best policy. As long as the mathematical chicanery in QFT as currently practised is regarded as a lovable foible, there will not be the will to defeat it. Apart, of course from a small minority, which includes yourself.
Hendrik, I referred to a standard construction in CFT due to Feigin and Fuks. The Virasoro algebra acts on a fermionic Fock space and preserves certain subspaces generated by singular vectors; this means that the factor modules are well defined. There is a discrete set of factor spaces where two independent singular vectors have been modded out. The unitary ones are physically relevant in statistical physics.
These minimal models are not Fock spaces, but they do have the property that there is an energy L_0 bounded from below.
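For concreteness, the discrete series of unitary minimal models that Thomas mentions can be tabulated from standard textbook formulas (the Kac formula below is generic CFT material, not anything specific to the Feigin-Fuks construction): c = 1 - 6/(m(m+1)), with m = 3 reproducing the Ising model:

```python
from fractions import Fraction

def central_charge(m):
    """c = 1 - 6/(m(m+1)) for the unitary minimal model M(m, m+1)."""
    return Fraction(1) - Fraction(6, m * (m + 1))

def kac_weight(m, r, s):
    """Kac conformal weight h_{r,s} = ((r(m+1) - s*m)^2 - 1) / (4m(m+1)),
    with 1 <= r <= m-1 and 1 <= s <= m."""
    return Fraction((r * (m + 1) - s * m) ** 2 - 1, 4 * m * (m + 1))

# m = 3 is the Ising model: c = 1/2, primaries h = 0, 1/16, 1/2.
m = 3
print("c =", central_charge(m))
weights = {kac_weight(m, r, s) for r in range(1, m) for s in range(1, m + 1)}
print("primary weights:", sorted(weights))
```

All the weights are non-negative rationals, consistent with the point that L_0 is bounded from below in these non-Fock factor modules.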
Hendrik,
I agree that it may be possible to construct phi^4 theory in 4d; my views on this are the same as those Weinberg expresses in his QFT book. If it is possible to construct it, I think it would be very interesting to see how the scattering amplitudes behave above the Landau scale.
But since the Standard Model does not include gravity, and QFT of gravity has obvious problems, I feel that there is no physical reason to expect a rigorous construction of the Standard Model to succeed. What we really need is a formalism that (1) includes gravity, (2) includes particles and fields of the usual sort (scalar, fermion, vector), and (3) is renormalizable or (better!) finite.
Hmmm … where might we find such a formalism, I wonder? If such a formalism could be found, it would probably trigger intense worldwide interest for decades …
Dear all, certainly one of the fascinating aspects of the very interesting discussion on renormalization is the connection between mathematics and physics, and the non-rigorous nature of current physics theories as well as of the method of renormalization.
One thing that I learned in these (and earlier weblog) discussions and did not know before is that there is a whole spectrum of levels of non-rigorous mathematical constructions and computations used in physics. Trying to improve the level of rigor is on physicists’ minds even if full mathematical rigor is not in sight.
One possibility regarding this difficulty of incomplete rigor is that, as the physics theory is developed, it will provide the framework, insights, and even technical tools for the rigorous mathematics, which will then follow. Lubos Motl discussed this view in several very interesting posts on his blog. His view, if I understood it correctly, is (in short) that a successful string theory, once completed, may well settle all (or most) mathematical difficulties along the way, and perhaps also other famous problems in mathematics.
This sounded very nice to me, if a little fantastic, but one has to say that there are examples in this direction: e.g. the Seiberg-Witten story (nicely described in Peter’s book). Another example is mentioned in comment #484 of Diogenes: according to Diogenes, quantum physics resolved, or at least mellowed, difficulties in the mathematical foundation of classical physics. (I find this very beautiful.)
Still, in spite of these examples, the idea that humanity will have to “wait” for a complete physics theory of everything to start understanding rigorously old, very solid special cases like QED still sounds a little unreasonable. (Chris’s idea that humanity has to wait for full mathematical rigor before proceeding with the physics is even harder to accept.)
According to Mark, in order to have a mathematically rigorous theory of QED, string theory or "something like string theory" is needed. What is not clear to me is whether it is really necessary to study non-abelian gauge theories to settle the case of QED. Isn't it possible to develop (or perhaps it has already been developed) "something like string theory" just for the special case of the group U(1)? So why not solve the mathematical difficulties for QED, or at least soften them without solving them completely, with a little "something like string theory" for U(1), forgetting all the other gauge groups? (Maybe this has already been done or tried.)
And what does "something like string theory" mean? Sean referred to "something like string theory" in his original post, referring to his promised and much-hoped-for future post, and so did Mark. String theory looks rather developed and rather "pointed," so when talking about "something like string theory" it is not clear what Mark and Sean are willing to give up in the current theory.
Herbert referred to "moving away from Fock representations." I did not find a Wikipedia entry on "Fock representations" (it is waiting to be written, guys), but there is one on Fock spaces, which gives some vague idea of what it is about. Should we regard the "moving away from Fock representations" that Herbert talks about as a synonym for the "something like string theory" that Mark and Sean talk about? Or perhaps "moving away from Fock representations" is related to theories that do not have perturbative versions at all (like LQG?).
Another question: is it really the case that the easy half of finding a mathematically rigorous QCD theory was fully settled, as Joe (#482) pointed out? This sounds like big news!
Mark asked: “The question I always come back to is, what if the landscape is essentially correct? That is, what if string theory is the correct theory of the world, that it has zillions of metastable vacua, and that inflation creates zillions of ‘pocket universes’ (our universe being one of them), each with its own metastable vacuum?”
This possibility, and related issues like the role of the anthropic principle, was discussed quite substantially by Lee. Somehow, while seeing nothing wrong with the landscape issue, or even with considering the anthropic principle, these aspects never strike me as being as interesting as many other aspects of this debate. However, I have a little remark.
Lee regards his approach to the landscape, and his questions and suggestions about time, as being in deep contrast with the anthropic principle.
I am not convinced. It may well be the case that the two approaches are very similar if not identical. (Let us unify Smolin and Susskind! This may be a Grandiose Unification indeed 🙂 )
The anthropic explanation asserts that we should regard our universe and its rules as something (random) conditioned on life being possible. More precisely (and less contentiously), conditioned on sufficient irregularity in the structure of matter (irregularity that would allow life, if you wish).
Lee strongly rejects this idea and says (among other things) something like: probability requires a notion of time. Lee (like various other physicists) asks whether time itself is an emergent parameter, and sees high prospects for understanding what time is. (Mark pointed out that a new understanding of "time" may well be positive or neutral for string theory, and not necessarily negative.)
So we can revise the anthropic principle and demystify it by saying that we should condition our universe's rules not on life but on time, and then the A.P. reduces to something rather mundane.
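In symbols (my own schematic notation, nothing more than ordinary conditional probability), the anthropic explanation conditions the probability of a universe u on some event E holding in it:

```latex
% Schematic form of the anthropic conditioning (my own notation):
% the probability of universe u, given that event E holds in it.
P(u \mid E) \;=\; \frac{P(E \mid u)\, P(u)}{\sum_{u'} P(E \mid u')\, P(u')}
```

The usual anthropic principle takes E = "life is possible"; the revision just replaces it with E = "time emerges," with everything else unchanged.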
A sort of recreational version of Lee's question is this: given a picture (of the universe), guess a formula for the emergent time! So we need a formula for the time (or rather for dT) in terms of the picture. Any suggestions?
(You can let the picture be described in any way you wish, e.g. like a TV picture by a large array of pixels, and if you feel more comfortable considering a quantized picture, that is fine too.)
The principle should be: “when it is interesting time flies quicker.” (And maybe first identify cases where you want dT to be zero.)
A similar question can be asked about Hollywood movies: given a frame in the movie, find a formula estimating the movie-story time that passes from the frame one minute before (in actual movie time) to the frame one minute after. (I suspect the Hollywood version will be more difficult than the universe version, as the Hollywood pictures are probably more "interesting.") Maybe in geology and archaeology one can find similar problems.
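A toy version of the "when it is interesting time flies quicker" rule can be sketched in a few lines, under my own (purely illustrative) assumption that "interestingness" is proxied by how much the picture changes between consecutive frames; the frames, shapes, and numbers below are all made up for the example:

```python
# Toy sketch of an "emergent time" increment dT for a sequence of pictures.
# Assumption (mine, not anything proposed in the thread): interestingness
# is proxied by the mean absolute pixel change between two frames, so
# identical frames ("nothing happens") give dT = 0, as the post suggests.

def dT(frame_a, frame_b):
    """Emergent-time increment between two frames.

    Each frame is a 2D list of pixel values with the same shape.
    Returns the mean absolute per-pixel change.
    """
    total, count = 0, 0
    for row_a, row_b in zip(frame_a, frame_b):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            count += 1
    return total / count

# Usage with two tiny 2x2 "pictures": a static pair contributes no time.
still = [[0, 0], [0, 0]]
moved = [[0, 4], [4, 0]]
print(dT(still, still))  # 0.0 (dT is zero when nothing changes)
print(dT(still, moved))  # 2.0
```

This is just the "guess a formula" game made concrete; any monotone function of the change (or something far cleverer) would do equally well as a candidate.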
oops, I meant Hendrik not Herbert. Sorry!
Gina wrote,
“According to Mark in order to have a mathematical rigorous theory for QED, string theory or ‘something like string theory’ is needed.”
Actually, as I said in #495, I'm agnostic on this. For QED, there are probably an infinite number of technicolor-like theories that could provide an ultraviolet completion.
It’s worth noting that standards of mathematical rigor are much lower in other areas of theoretical physics, where it is well understood that all theories are effective. In condensed-matter theory, people will do things that they cheerfully admit don’t make any fundamental sense, solely because they give the right answer. (“Right” means: agrees with experiment.)
In particle theory, only the final theory has to be mathematically rigorous.
500th comment!
Do I get a prize?