Theoretical physics and cosmology find themselves in a strange place. Scientific theories have since the seventeenth century been held tight by an experimental leash. In the last twenty years or so, both string theory and theories of the multiverse have slipped the leash. Their owners argue that this is no time to bring these subjects to heel.

It is this that is strange.

Misery Loves Company

The multiverse is a collection, set, or ensemble of possibly disjoint universes. They are possible because no one knows whether they exist. Max Tegmark has contemplated four different kinds of multiverse, and Brian Greene, nine. Nothing exceeds like excess.

The cosmologist, Martin Rees, has argued that our universe cannot plausibly end at the visual horizon. “[T]his shell,” he writes, “has no more physical significance than the circle that delineates your horizon if you’re in the middle of the ocean.” There must be further domains beyond what we can see. Christopher Columbus made a similar argument, after all, and he was right.

Alan Guth, Andrei Linde, and Alexander Vilenkin have embraced the multiverse because its existence follows from some current cosmological theories. A period of extremely rapid exponential inflation occurred in the very early universe, before the Hot Big Bang era and almost directly after the Big Bang itself. Some varieties of inflation are likely to have led to many different universes in which cosmological parameters varied widely.

The multiverse appears both in cosmology and particle physics. String theory, Leonard Susskind has argued, is the correct theory of quantum gravity. Different string vacua are sufficiently distinct that their physics varies, or may vary. Each of them, in principle, embodies a universe. They may be related to inflationary universes. Since no one can determine which one is real, Susskind is disposed to accept them all.

Roger Penrose, Lee Smolin, Paul Steinhardt, and Neil Turok have all proposed that the multiverse arises in time rather than space. If the constants of nature are different in each new expanding universe, as Smolin has suggested, the result is endless variety; if they are the same, as Penrose suggested, the result is a form of eternal return.

Sean Carroll, David Deutsch, Max Tegmark, and David Wallace have all claimed that the quantum wave function of the universe splits into multiple branches every time a measurement is made. Each branch is a universe. This idea was originally advanced by Hugh Everett III in his Princeton dissertation. Multiple worlds emerge as branches of the wave function, and having branched, the various branches remain in a state of superposition, entirely subordinate to the linear and deterministic Schrödinger equation. The wave function never collapses. The Born rule is not needed on Everett’s scheme. But it is the Born rule that establishes that the squared amplitude of the wave function is a measure of probability. Something must take its place or do the same work. Some physicists argue that the many worlds of quantum mechanics and the many worlds of the multiverse are one and the same. The multiverse, some physicists claim, is necessary to give exact operational meaning to probabilistic predictions from quantum mechanics.
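The gap that the Born rule must fill can be made concrete with a toy two-branch state (my illustration, not Everett's). For the normalized state √(1/3)|0⟩ + √(2/3)|1⟩, naive branch counting assigns each outcome probability 1/2, while the Born rule assigns the squared amplitudes 1/3 and 2/3:

```python
# Toy illustration: branch counting versus the Born rule for a two-branch state.
amplitudes = [(1 / 3) ** 0.5, (2 / 3) ** 0.5]   # a normalized two-branch state

born = [a ** 2 for a in amplitudes]             # Born rule: P = |amplitude|^2
counting = [1 / len(amplitudes)] * 2            # one branch, one vote

assert abs(sum(born) - 1.0) < 1e-12             # Born weights are normalized
print(born)                                     # [0.333..., 0.666...]
print(counting)                                 # [0.5, 0.5]
```

The two assignments disagree whenever the amplitudes are unequal, which is why something must do the Born rule's work on Everett's scheme.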

David Lewis and Dennis Sciama have argued for a strong form of modal realism. Whatever is possible is real. A possible world may be identified with a maximally consistent set of sentences. A world in which Lewis is the emperor of the Antarctic and a world in which he is not differ by at least one sentence, but they are otherwise ontologically identical. The multiverse is the set of all possible worlds.

This raises the intriguing question whether the multiverse is a coherent scientific object. What is the set of sentences that are true in the multiverse? It cannot be the union of sentences true in some universe, because the union is inconsistent. And it cannot be their intersection either, because that would leave only the truths of logic and mathematics.
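The point can be checked in miniature. In the sketch below (a toy model of my own, not the essay's), possible worlds are truth assignments over two atomic sentences, and each world's theory is the set of literals true in it. The union of the theories is inconsistent, and their intersection is empty, since no literal is true in every world:

```python
# A finite toy model of possible worlds as truth assignments over two atoms.
from itertools import product

atoms = ["p", "q"]
worlds = list(product([True, False], repeat=len(atoms)))  # 4 possible worlds

def theory(world):
    # the literals (atom or its negation) true at this world
    return {a if v else "not " + a for a, v in zip(atoms, world)}

union = set().union(*(theory(w) for w in worlds))
intersection = set.intersection(*(theory(w) for w in worlds))

assert "p" in union and "not p" in union   # the union is inconsistent
assert intersection == set()               # nothing non-trivial is true everywhere
```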

Not to be outdone, Tegmark has argued that every consistent mathematical structure exists in some disconnected universe. Tegmark also believes that nothing else exists beyond the consistent mathematical structures. Tegmark is himself nothing more than a consistent mathematical structure. This is a view that assigns to mathematical structures a degree of agency that they are not otherwise thought to possess.

The constants of physics are finely tuned. Life could not exist in a universe beyond the range of temperate tuning. Martin Rees has identified six physical constants whose precise values are necessary for life. They are N, the ratio of the electromagnetic force to the gravitational force; ε, a measure of the efficiency of hydrogen to helium fusion; Ω, the ratio of the mass density of the universe to its critical density; Λ, the fabled cosmological constant; Q, the ratio of the gravitational energy required to pull a galactic cluster apart to its equivalent mass energy; and D, the number of spatial dimensions. Rees argues that if the set C = ⟨N, ε, Ω, Λ, Q, D⟩ of dimensionless physical parameters were slightly different, life could not have formed.

Weinberg, Susskind, Carroll, Tegmark, Stephen Hawking, Leonard Mlodinow, and Rees himself, have all argued that what is improbable in one universe is necessary in the multiverse as a whole.

These arguments have not ignited a firestorm of assent, perhaps because if C is necessary for the existence of life, the existence of life would itself be sufficient for C to have the values that it does.

Out of Sight

Any direct observation of the universe is bounded by our visual horizon. The Hot Big Bang phase of the universe ended approximately 13.8 billion years ago. Before that time, the universe was an opaque plasma; light could not pass through it. Because the universe has continued to expand while that light has been in transit, the visual horizon corresponds to galaxies that are now about 3 × 13.8 = 41.4 billion light years away; the universe’s rate of expansion is not constant, hence the factor 3. Whatever lies beyond this distance simply cannot be observed.
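The factor of three can be checked by hand. In a matter-dominated universe with a(t) ∝ t^(2/3) (a simplifying assumption of mine, not a claim of the essay), the proper distance to the particle horizon today works out to exactly 3ct₀:

```python
# Numerical check: in a matter-dominated (Einstein-de Sitter) universe the
# proper distance to the horizon is a(t0) * Integral[c dt / a(t)] = 3 * c * t0.
t0 = 13.8          # age of the universe in billions of years
c = 1.0            # light years per year, so distances come out in Gly

n = 1_000_000
horizon = 0.0
for i in range(1, n + 1):                    # midpoint rule for the integral
    t = (i - 0.5) * t0 / n
    horizon += c * (t0 / n) / (t / t0) ** (2 / 3)

# a(t0) = 1, so `horizon` is the proper distance today, in Gly
print(horizon)     # close to 3 * t0 = 41.4
```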

It is possible that we live in a universe small enough to lie inside the visual horizon. Were this the case, we would see multiple images of galaxies, and identical circles in the cosmic microwave background (CMB). A small universe would exclude many kinds of multiverse. If the universe is not small, questions about the multiverse remain open.

The particle horizon measures the greatest distance a particle could have travelled from the Big Bang, or any comparable time t = 0, to the present. In a static universe, the particle horizon could be defined as the product of the time elapsed since t = 0, and the speed of light. The universe is expanding, hence the particle horizon must be assessed as the product of the speed of light and the conformal time. The conformal time is ordinary time rescaled by the expansion of the universe; measured in conformal time and comoving distance, light always travels at unit speed. Having no observational or causal connection with what, if anything, lies beyond the particle horizon, we cannot directly test any conjecture about its nature.
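The particle-horizon distance can also be evaluated numerically for an expanding universe. The sketch below is my own, with assumed Planck-like parameters (H₀ = 67.7 km/s/Mpc, Ωm = 0.31, Ω_Λ = 0.69, a trace of radiation); it computes the comoving distance light has covered since last scattering at z ≈ 1100 as an integral over the scale factor:

```python
# Comoving distance to last scattering in flat LCDM: c * Int da / (a^2 H(a)).
import math

H0 = 67.7 * 1.0e3 / 3.0857e22    # assumed Hubble constant, in 1/s
c = 2.998e8                       # speed of light, m/s
Om, OL, Orad = 0.31, 0.69, 9.0e-5  # assumed density parameters

def E(a):                         # H(a)/H0 for flat LCDM
    return math.sqrt(Om / a**3 + Orad / a**4 + OL)

a_rec, n = 1.0 / 1100.0, 200_000
chi = 0.0
for i in range(n):                # midpoint rule over the scale factor
    a = a_rec + (i + 0.5) * (1 - a_rec) / n
    chi += (c / H0) * ((1 - a_rec) / n) / (a**2 * E(a))

gly = chi / 9.461e24              # meters per billion light years: 9.461e24
print(gly)                        # roughly 46 Gly
```

The rule-of-thumb figure of 41.4 billion light years lands in the same ballpark as this more careful estimate.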

Some physicists have suggested that given string-theoretic assumptions, our own universe must be open and negatively curved. Any observation that our universe is positively curved must, by contraposition, be evidence against these string-theoretic assumptions, and so evidence against one derivation of the multiverse. Still other physicists have argued that collisions between universes might leave observational traces in the microwave background sky. If these traces were ever measured, they could support some models of the multiverse.

Writing in the Monthly Notices of the Royal Astronomical Society, Tom Shanks and his graduate student, Ruari MacKenzie, claimed that cold regions in the CMB sky might be evidence of such a collision. But the cold spot might be the result of a statistical fluctuation.

The inference to the multiverse is not obligatory.

Blow Up

The inflationary theory of the early universe affirms that between the Big Bang itself and the Hot Big Bang era, the universe underwent an incredibly short, but highly accelerated period of expansion. This seems to be supported by the data. The theory was devised by Alan Guth to address problems in classical Big Bang cosmology that were widely known, but not widely adverted to. The universe we observe is uniform in temperature on large scales, the cosmic background radiation homogeneous to one part in 10⁵. On classical Big Bang cosmology, spatial patches separated by more than the particle horizon could not have reached thermal equilibrium.
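The trouble can be quantified with a rough estimate (mine, using a matter-dominated approximation in which conformal time grows as the square root of the scale factor): the causal horizon at last scattering subtends only a degree or two on today's sky, yet the CMB is uniform across all of it.

```python
# Angular size of the causal horizon at last scattering, matter-era toy model.
import math

a_rec = 1.0 / 1100.0                        # scale factor at recombination
eta_ratio = math.sqrt(a_rec)                # conformal time then / conformal time now
theta = eta_ratio / (1 - eta_ratio)         # horizon angle on the sky, radians
deg = math.degrees(theta)
print(deg)                                  # ~1.8 degrees
```

Patches of sky separated by more than this angle should never have equilibrated, which is the horizon problem in numerical form.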

Inflationary cosmology provides an answer. It is the same answer that evolutionary biologists offer in explaining why two species are alike: they share a common ancestor. Particles in thermal equilibrium have a common origin in the same spatial patch. Inflation provoked a dramatic and accelerating expansion in the size of the universe; traced backward toward the Big Bang, the same history describes a dramatic contraction. Two points that could never have been in causal contact under standard Big Bang cosmology can, under inflation, find a common origin in the same spatial patch.

The resulting smoothness is precisely what we see in the CMB.

Inflation proceeded because the scale factor of the universe was positively accelerated. The simplest route to expansion is by means of a positive scalar field potential, V(φ). In the classical case of the Friedmann–Robertson–Walker universe, the scalar field is equivalent to a perfect fluid—of all things! Inflation comprises a broad family of models. There is old inflation, new inflation, R² inflation, SUGRA (supergravity) inflation, double and power-law inflations, natural and hybrid inflation, extended and assisted inflations, both SUSY (supersymmetry) F-term and D-term inflations, brane inflation, something called supernatural inflation, SUSY P-term inflation, K inflation, warped brane inflation, tachyon inflation, and roulette inflation.
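Most of these models share a potential V(φ) flat enough to sustain sixty-odd e-folds of expansion. A minimal slow-roll computation (my sketch, for the quadratic potential V = m²φ²/2 in reduced Planck units, with an assumed initial field value) illustrates the arithmetic, using the slow-roll formula N = ∫ (V/V′) dφ:

```python
# Slow-roll e-folds for V = m^2 phi^2 / 2, in reduced Planck units (M_pl = 1).
phi_end = 2 ** 0.5        # slow roll ends where epsilon = 2 / phi^2 reaches 1
phi_start = 15.5          # an assumed initial field value

n_steps = 100_000
d = (phi_start - phi_end) / n_steps
N = 0.0
for i in range(n_steps):      # midpoint rule for Int (phi / 2) dphi
    phi = phi_end + (i + 0.5) * d
    N += (phi / 2) * d        # V / V' = phi / 2 for the quadratic potential

print(N)                      # ~60 e-folds
```

Sixty e-folds is roughly what is needed to stretch a single causal patch over the whole observable sky.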

The ASPIC library of fast Fortran routines in cosmology makes use of over seventy inflationary models. The common assumption that the inflationary field is a quantized particle known as the inflaton is not obviously a great help. Unless the inflaton proves to be the Higgs boson, we may never be in a position to identify it further, because there are limits to the energies reached in collider experiments. To reach still higher energies would require observations of cosmic rays, but they have not been related to any particle that might be associated with the inflaton. We can put limits on the nature of V(φ) by using CMB data. Susskind has defended Coleman–de Luccia tunneling as one means to inflation, but the mathematical viability of eternal inflation is open to question.

The Anthropic Bound

Regarding an expanding universe with some philosophical distaste, Albert Einstein introduced the cosmological constant, Λ, into the field equation of general relativity because no static solution to the equation would be accessible without it. If Λ = 0, the result is Einstein’s original field equation. There is a static solution to the field equation if Λ > 0. The result is a spherical dust-filled universe with a mass density ρ = Λ/4πG.

Einstein regarded the cosmological constant as having a deforming effect on his original equation. In this, he was surely correct. The static solution that he embraced proved to be unstable. When, in the 1920s, Edwin Hubble provided striking evidence that the universe was expanding, Einstein came to regret the inclusion of a cosmological constant. His original equation was compatible with an expanding universe. “If there is no quasi-static world,” he remarked gaily to Hermann Weyl, “then away with the cosmological term.”
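The instability is easy to exhibit numerically. In the toy integration below (my construction, in units where G = c = 1 and Λ = 1, so that the static radius is a = 1 and the dust density is ρ = Λ/4πG = 1/4π), a perturbation of one part in a thousand runs away:

```python
# Instability of the Einstein static universe. With rho scaling as 1/a^3,
# the acceleration equation reduces to a'' = -(1/3)/a**2 + a/3, which
# vanishes at the static radius a = 1 but amplifies any departure from it.
a, v, dt = 1.001, 0.0, 1e-3      # start 0.1% off the static solution
for _ in range(20_000):          # semi-implicit Euler integration to t = 20
    acc = -(1.0 / 3.0) / a**2 + a / 3.0
    v += acc * dt
    a += v * dt

print(a)                         # far from the static radius a = 1
```

The nudge grows exponentially, just as the linearized analysis predicts; the static universe sits on a knife's edge.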

The cosmological constant nonetheless has appeared and reappeared in general relativity. Lorentz invariance indicates that an effective cosmological term Λ_eff can appear in the field equations. This cosmological constant contributes its mite to the total effective vacuum energy:

ρ_vac = Λ_eff/8πG.

Cosmological observations indicate that the absolute value of the total vacuum energy density, ρ_vac, is roughly 10⁻⁴⁷ GeV⁴. And therein lies a problem. Among cosmologists, the cosmological constant is usually expressed as the ratio, Ω_Λ, between the energy density due to the cosmological constant and the critical density of the universe. Data from the Planck satellite indicate that Ω_Λ ≈ 0.6911 ± 0.0062. The energy density this represents is tiny by particle-physics standards, but it is not zero either. “Our knowledge of the present expansion rate of the Universe,” Weinberg has observed, “indicates that the effective value Λ of the cosmological constant is vastly less than what would be produced by quantum fluctuations in any known realistic theory of elementary particles.”
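The 10⁻⁴⁷ figure is a short calculation away. The sketch below (my arithmetic, assuming H₀ = 67.7 km/s/Mpc alongside the Planck value Ω_Λ ≈ 0.6911) converts the critical density implied by the expansion rate into particle-physics units:

```python
# Back-of-envelope: the observed vacuum energy density in GeV^4.
import math

H0 = 67.7 * 1.0e3 / 3.0857e22          # assumed Hubble constant, 1/s
G = 6.674e-11                          # Newton's constant, m^3 kg^-1 s^-2
c = 2.998e8                            # speed of light, m/s

rho_crit = 3 * H0**2 / (8 * math.pi * G)      # critical density, kg/m^3
energy_density = rho_crit * c**2 * 0.6911     # vacuum part, J/m^3

# convert J/m^3 to GeV^4: 1 GeV = 1.602e-10 J, and hbar*c = 1.973e-16 GeV*m,
# so 1/m corresponds to 1.973e-16 GeV and 1/m^3 to (1.973e-16)^3 GeV^3
rho_vac_gev4 = (energy_density / 1.602e-10) * (1.973e-16) ** 3

print(rho_vac_gev4)                    # of order 1e-47 GeV^4
```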

Quantum field theory is obviously no good guide to the value of Λ: naive estimates of the vacuum energy exceed the observed value by some 120 orders of magnitude.

If not quantum theory, then what? “Perhaps Λ must be small enough,” Weinberg added, “to allow the Universe to evolve to its present nearly empty and flat state, because otherwise there would be no scientists to worry about it.” Weinberg formalized his reasoning to give a prediction of the value of Λ, but his account contained a number of modal qualifiers: perhaps, must, allow, otherwise. Just why must Λ be small or large enough to allow anything at all? No fundamental theory explains why Λ should be able to take different values.

Physicists such as Rees and Carroll endorsed Weinberg’s work because it gave content to the idea of an anthropic bound: “[T]he necessary and sufficient anthropic condition on the cosmological constant is that it should not be so large as to prevent the appearance of gravitationally bound states.” This consideration squeezes Λ_eff from above and from below. “For a large positive Λ_eff,” Weinberg observed, “the universe very early enters an exponentially expanding de Sitter phase, which then lasts forever.” This cannot be said to be a good thing if one wants life to appear. For a negative value of Λ_eff, the universe collapses into a singularity at a time that may be too short for life to evolve.

To determine the upper bound, Weinberg uses the simple spherical infall model of Peebles to follow the nonlinear growth of inhomogeneities in the matter density. Weinberg imagines the early universe evolving according to standard Big Bang cosmology. The cosmological constant is Λ and the background spatial curvature, k, is zero. The evolving universe is subject to homogeneous but nonlinear perturbations. These are needed as a route to gravitational clumping and galaxy formation. Cosmic evolution is governed by the uniform excess density Δρ(t) of the universe, its positive curvature constant Δk > 0, and the scaling factor a(t). The perturbed model evolves according to the Friedmann equation:

H² = (8πG/3)(ρ(t) + Δρ(t)) − Δk/a(t)² + Λ/3,

where by definition H = ȧ/a.

The perturbation strength, Δ(t), is by definition

Δ(t) = Δρ(t)/ρ(t).

Weinberg then considers whether recollapse will occur and lead to condensation. The anthropic upper bound on Λ,

Λ ≤ 8πG × (500/729) ρ(t) Δ(t)³,

emerges on the assumption that the universe gives rise to large-scale structures.
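The mechanics behind the bound can be checked in a toy model (mine, in units where G = 1): a spherical overdensity of mass M and curvature excess Δk recollapses only if its energy equation permits the expansion to halt, and minimizing that equation yields a critical value of Λ in closed form.

```python
# Toy recollapse criterion: an overdensity recollapses only if
# f(R) = 2*M/R - dk + Lambda*R**2/3 reaches zero for some radius R > 0.
# Minimizing f shows the critical value is Lambda_crit = dk**3 / (9 * M**2).
M, dk = 1.0, 1.0

def recollapses(lam):
    # scan radii and ask whether the expansion ever halts
    for i in range(20_000):
        r = 0.01 + i * 0.001
        if 2 * M / r - dk + lam * r**2 / 3 <= 0:
            return True
    return False

lam_crit = dk**3 / (9 * M**2)            # analytic threshold: 1/9 here
assert recollapses(0.9 * lam_crit)       # below the bound: recollapse
assert not recollapses(1.1 * lam_crit)   # above it: expansion forever
```

Too large a Λ, and no overdensity ever turns around; nothing condenses, and no galaxies form.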

This argument is purely anthropic. It provides a bound on the cosmological constant. It is the introduction of the multiverse that allows Weinberg to specify its expected value in terms of the mean of values suitable for life across the multiverse. The mean and expected values of Λ make sense only if Λ is allowed to vary. Since it cannot very well vary in one universe at any given time t, then it must vary in many of them. To talk of the mean value of Λ is to talk of its mean value across a set or ensemble of universes 𝒰 = {U₁, U₂, …, Uₙ, …}.

What else could it mean?

The absolute value of the vacuum energy density in Uₖ must be less than the mass density of that universe over the time required for astronomers to evolve. If positive, the vacuum energy density only has to be less than the mass density of the universe at the time of galaxy formation. In a paper with Hugo Martel and Paul Shapiro, Weinberg went on to derive a number of “likely values” for the cosmological constant. At the heart of the argument is the assumption that the values of Λ defined over the ensemble form a probability distribution proportional to the “fraction of matter that is destined to condense out of the background into mass concentrations large enough to form observers.” This fraction is computed by an appeal to standard cosmological theories about density fluctuations at the time of recombination. A comparison of the likely values of Λ with observational bounds on the cosmological constant demonstrates that a small, positive value of Λ is reasonably likely—even if all values of Λ have the same a priori probability.
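The recipe can be imitated in a few lines (a toy of my own construction, not the Martel–Shapiro–Weinberg calculation itself): give every value of the vacuum density the same prior, weight each by a Press–Schechter-style collapsed fraction for Gaussian fluctuations, and take the mean. For fluctuations of rms σ, the collapsed fraction is erfc(δ_c/(√2 σ)), with a collapse threshold that grows as the cube root of the vacuum density:

```python
# Toy anthropic weighting of the vacuum density, in arbitrary units.
import math

rho_matter, sigma = 1.0, 1.0        # assumed background density and rms fluctuation

def weight(rho_vac):
    # collapse threshold ~ (729/500)^(1/3) * (rho_vac / rho_matter)^(1/3)
    delta_c = (729 / 500) ** (1 / 3) * (rho_vac / rho_matter) ** (1 / 3)
    return math.erfc(delta_c / (math.sqrt(2) * sigma))

# posterior mean of rho_vac under a flat prior on [0, 10]
xs = [i * 0.001 for i in range(10_000)]
norm = sum(weight(x) for x in xs)
mean = sum(x * weight(x) for x in xs) / norm

print(mean)     # positive, but pulled well below the flat-prior mean of 5
```

Large vacuum densities are penalized because almost nothing condenses in them; small positive values survive the weighting, which is the shape of the published result.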

The conclusion of Weinberg’s argument, it must be stressed, is not that the multiverse exists, for this is among its assumptions. It is, in fact, nothing more than a limited consistency test. Weinberg’s calculation considered variations in the value of Λ and variations in Λ alone. When other constants are varied at the same time, the result is different. Glenn Starkman and Roberto Trotta observed that different ways of assigning probabilities to universes lead to different anthropic predictions:

[A]nthropic reasoning within the framework of probability as frequency is ill-defined and that in the absence of a fundamental motivation for selecting one weighting scheme over another the anthropic principle cannot be used to explain the value of Λ, nor, likely, any other physical parameters.

This recalls Everett’s measure problem. In any case, a more sophisticated analysis of the vacuum energy by William Unruh et al. suggests one can derive the value observed for Λ from quantum field theory without any need for a multiverse.

Bubble Up

How many universes are there in the multiverse? So long as universes are allowed to bubble up without limit, as they are in theories of eternal inflation, the answer must be that eventually they become infinitely numerous. There is a classical distinction in the philosophy of mathematics between potential and actual infinities. The natural numbers 1, 2, 3, …, when defined in terms of the successor function S(n) = n + 1, remain forever potential. They are finite to any given n. The set of all natural numbers is otherwise. It has the cardinality ℵ₀. Following Georg Cantor, set theorists think it entirely real. There is nothing potential about it. Physicists have long been skeptical about actual infinities, and with every good reason. The Hilbert Hotel contains infinitely many rooms, and all of them, let us suppose, are filled. A new room may nonetheless always be made available if the occupant of each of the other rooms is shifted by means of the function f : n → n + 1. This is not a logical paradox because infinity is not a number, but physicists have never found themselves comfortable with the idea that the Hilbert Hotel could be embodied in any kind of physical object.
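The shift is easily exhibited on a finite window of rooms (a toy of mine): the map f(n) = n + 1 never sends two guests to the same room, and it leaves room 1 empty, even though every guest still has a room.

```python
# The Hilbert Hotel shift f(n) = n + 1, restricted to a finite window of rooms.
rooms = range(1, 1001)                        # a finite prefix of the hotel
new_room = {guest: guest + 1 for guest in rooms}

assert len(set(new_room.values())) == len(rooms)   # no two guests collide
assert 1 not in new_room.values()                  # room 1 is now free
```

On any finite hotel the trick fails, since the last guest has nowhere to go; it is precisely the actual infinity of rooms that makes it work.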

On almost all inflationary scenarios, universes form without end. If there are infinitely many of them, they are at any moment only potentially infinite. But the multiverse is the set of all of them. If nature is creating them without end, as it creates the natural numbers by succession, then it follows that the set of them altogether has the cardinality ℵ₀. The set of all subsets of the multiverse has the cardinality 2^ℵ₀.

This is the sort of physical object that most physicists do not wish to see.

If the multiverse is scientifically problematic, it is always open to philosophers to rescue the multiverse by expanding the margins of science. A theory, so the argument runs, need not be confirmed by empirical evidence. Richard Dawid has argued as much in a paper entitled “The Significance of Non-Empirical Confirmation in Fundamental Physics.” “In the absence of empirical confirmation,” he writes, “scientists may judge a theory’s chances of being viable based on a wide range of arguments.”

Nothing, Dawid argues, succeeds like success. Theories that satisfy a certain set of conditions have worked well in the past. This increases the probability that a new theory satisfying the same conditions is apt to work well in the future. This argument embodies the triumph of hope over experience. In 1974, Howard Georgi and Sheldon Lee Glashow proposed an exquisite grand unified theory, one that was supposed to unite the strong and electroweak forces. It predicted that, as the result of spontaneous symmetry breaking, protons would decay. Such was the hope. So far as experiments can determine, protons do not decay. Such is the experience.

If meta-induction has proven a stern teacher, there is always, Dawid observes, the argument of unexpected explanatory interconnections. A theory is developed in order to solve a specific problem. Once developed, physicists discover that the theory also explains a range of quite different problems. Dawid takes this as an indication of the theory’s viability. This is not a bad argument if connections are drawn between established physical theories playing over a world of established physical entities. But it does not support theories of physics or cosmology in which those connections remain purely mathematical, or in which some aspects of the theory are tweaked to give the desired additional results. Amazing mathematical relations are not necessarily realized in physical terms.

If all else fails, a theory, Dawid argues, becomes better than nothing if nothing is better than it. It is, of course, very hard to know when it is appropriate, in daily life, or in physics, to conclude that there is no alternative. A lack of imagination may be at work, or too narrow a range of models. The thesis is, in any case, not very plausible. If a theory is true, the fact that it has no alternatives is supererogatory, and if it is false, irrelevant. In the case of the multiverse, there is an alternative: no multiverse exists, and the value of Λ is either explained by the mechanism of Unruh et al., which I have already mentioned, or just happened to be set at that value, with gravity being a unimodular theory.

Carlo Rovelli has responded to Dawid:

Scientists have always relied on non-empirical arguments to trust theories. They choose, develop and trust theories before finding empirical evidence. The entire history of science witnesses for this. … Dawid uses a Bayesian paradigm to describe how scientists evaluate theories. Bayesian confirmation theory employs the verb “confirm” in a technical sense which is vastly different from its common usage by lay people and scientists. In Bayesian theory, “confirmation” indicates any evidence in favor of a thesis, however weak. … For lay people and scientists alike, “confirmation” means something else: it means “very strong evidence, sufficient to accept a belief as reliable” … The distinction between reliable theories and speculative theories may not always be perfectly sharp, but is an essential ingredient of science. … The very existence of reliable theories is what makes science valuable to society… Dawid’s merit is to have emphasized and analyzed some of the non-empirical argument that scientists use in the “preliminary appraisal” of theories. His weakness is to have obfuscated the crucial distinction between this and validation: the process where a theory becomes reliable, gets accepted by the entire scientific community, and potentially useful to society. The problem with Dawid is that he fails to say that, for this, only empirical evidence is convincing.

Hear, hear.