“Es ist immer angenehm, über strenge Lösungen einfacher Form zu verfügen.” (It is always pleasant to have exact solutions in simple form at your disposal.) -Karl Schwarzschild

If you knew, from first principles, what the laws of physics were everywhere and at all times in our Universe, that still wouldn’t be enough for you to come up with the prediction that the Universe as we see it ought to exist. Because while the laws of physics set the rules for how a system evolves over time, it still needs a set of initial conditions to get started. This week’s Ask Ethan comes courtesy of a submission from Andreas Lauser, who asks:

While I don’t have many doubts that the theory of the Big Bang (™) is correct (or as you would probably say, a pretty good approximation of what happened), there is a thing I have been wondering when it comes to this part of cosmology for a while: Is there any explanation why the whole Universe did not become a black hole immediately? I suppose that its close-to-initial density was quite a bit above the Schwarzschild limit.

We’ve taken this topic on before, but you deserve more detail and a better answer than I gave last time. Let’s go back to the birth of our most successful theory of gravity — general relativity — some 100 years ago.

Image credit: Phil Medina / Mr. Sci Guy, via http://www.mrsciguy.com/Physics/Newton.html.

Prior to Einstein, Newton’s Law of Universal Gravitation was the accepted theory of gravity. His theory described all the gravitational phenomena in the Universe, from the acceleration of masses on Earth to the orbits of moons around their planets to the planets themselves revolving around the Sun. Objects exerted equal-and-opposite gravitational forces on one another, they accelerated in inverse proportion to their mass, and the force obeyed an inverse-square law. By the time the 1900s rolled around, it had been incredibly well-tested, and there were no exceptions. Well, with thousands upon thousands of successes to its credit, there were almost none, at any rate.
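Newton’s force law is simple enough to check with a few lines of arithmetic. As a quick sketch (the constants below are standard textbook values, not figures from this article), here is the Sun’s inverse-square pull on the Earth:

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
# Illustrative check: the Sun's pull on Earth, and Earth's resulting
# acceleration (which is the force divided by Earth's own mass).

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30     # mass of the Sun, kg
M_earth = 5.972e24   # mass of the Earth, kg
r = 1.496e11         # Earth-Sun distance (1 AU), m

F = G * M_sun * M_earth / r**2   # inverse-square force, N
a = F / M_earth                  # acceleration, inversely proportional to mass

print(f"Force on Earth: {F:.2e} N")            # ~3.5e22 N
print(f"Earth's acceleration: {a:.2e} m/s^2")  # ~5.9e-3 m/s^2
```

Doubling the distance quarters the force, which is the "inverse square" behavior that held up against centuries of observation.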

Image credit: Curt Renshaw, via http://renshaw.teleinc.com/papers/simiee2/simiee2.stm.

But to the astute, and to those who paid great attention to detail, there were a couple of problems:

At very fast speeds — that is, at speeds approaching the speed of light — Newton’s ideas about absolute space and absolute time no longer held. Radioactive particles lived longer, distances contracted, and “mass” didn’t appear to be the fundamental source of gravitation: that honor looked like it went to energy, of which mass is only one form.

In the strongest gravitational fields — which is why Mercury, the innermost planet in orbit around the Sun, is special among our Solar System’s planets — the Newtonian prediction for the gravitational behavior of objects is slightly but noticeably off from what we observe. It’s as though, when you get very close to a very massive source, there’s an extra attractive force that Newtonian gravity doesn’t account for.
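The “radioactive particles lived longer” effect is easy to quantify. As an illustrative sketch using standard textbook numbers for a cosmic-ray muon (these values are assumptions of mine, not figures from this article), special relativity’s time-dilation factor works out to:

```python
import math

# Special-relativistic time dilation: a particle's lifetime in the lab
# frame is gamma = 1/sqrt(1 - v^2/c^2) times its rest-frame lifetime.
# Muon numbers below are standard textbook values, used for illustration.

c = 2.998e8           # speed of light, m/s
tau_rest = 2.2e-6     # muon rest-frame lifetime, s
v = 0.998 * c         # a typical cosmic-ray muon speed (assumed)

gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
tau_lab = gamma * tau_rest   # dilated lifetime as seen from the ground

print(f"gamma = {gamma:.1f}")                      # ~15.8
print(f"lab-frame lifetime = {tau_lab:.2e} s")     # ~3.5e-5 s
```

A factor of ~16 in lifetime is exactly the kind of departure from Newtonian absolute time that demanded a new framework.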

In the aftermath of this, there were two developments that paved the way for a new theory to supersede Newton’s brilliant, but centuries-old, conception of how the Universe worked.

The first major development was that space and time, previously treated as a separate three-dimensional space and a linear quantity of time, were united in a mathematical framework that created a four-dimensional “spacetime.” This was accomplished in 1907 by Hermann Minkowski:

“The views of space and time which I wish to lay before you have sprung from the soil of experimental physics, and therein lies their strength. […] Henceforth space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.”

This worked only for flat, Euclidean space, but the idea was incredibly powerful mathematically, as it led to all the laws of special relativity as an inevitable consequence. When this idea of spacetime was applied to the problem of Mercury’s orbit, the Newtonian prediction under this new framework came a little closer to the observed value, but still fell short.

Image credit: Martín Fernández de Córdova, via https://martinfdc.wordpress.com/2012/10/08/grid/.

But the second development came from Einstein himself, and it was the idea that spacetime was not flat at all, but was curved. And the very thing that determined the curvature of spacetime was the presence of energy in all of its forms, including mass. Published in 1915, Einstein’s framework was incredibly difficult to calculate in, but presented scientists everywhere with the tremendous potential to model physical systems to a new level of accuracy and precision.

Minkowski’s spacetime corresponded to an empty Universe, or a Universe with no energy or matter of any type.

Einstein was able to find a solution where you had a Universe with one single, solitary point mass source in it, and with the stipulation that you were outside of that point. This reduced to the Newtonian prediction at great distances, but gave a stronger attraction at closer distances. These results not only agreed with the observations of Mercury’s orbit that Newtonian gravity failed to reproduce, but also made new predictions about the deflection of starlight that would be visible during a total solar eclipse, predictions that were confirmed during the solar eclipse of 1919.

Images credit: New York Times, 10 November 1919 (L); Illustrated London News, 22 November 1919 (R).

But there was another solution — a surprising and interesting one — that came out just weeks after Einstein published his general theory of relativity. Karl Schwarzschild had worked out further details of what happens to a configuration with a single, solitary point mass of arbitrary magnitude, and what he found was remarkable:

At large distances, Einstein’s solution held, reducing to Newton’s results in the far-field limit.

But very close to the mass — at a very specific distance (of R = 2M, in natural units) — you reach a point where nothing can escape from it: an event horizon.

Moreover, inside that event horizon, everything that enters inevitably collapses towards a central singularity, which is unavoidable as a consequence of Einstein’s theory.

And finally, any initial configuration of stationary, pressureless dust (i.e., matter that has zero initial velocity and does not interact with itself), regardless of the shape or density distribution, will inevitably collapse down to a stationary black hole.

This solution — the Schwarzschild metric — was the first complete, non-trivial solution to general relativity ever discovered.
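Schwarzschild’s R = 2M, restored to ordinary units, reads R_s = 2GM/c². As a sketch of what that distance looks like for familiar masses (standard reference values, assumed here purely for illustration):

```python
# Schwarzschild radius R_s = 2*G*M/c^2: the event-horizon distance for a
# point mass M. In natural units (G = c = 1) this is the R = 2M of the text.

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8      # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Event-horizon radius, in meters, for a mass in kilograms."""
    return 2.0 * G * mass_kg / c**2

M_sun = 1.989e30     # kg
M_earth = 5.972e24   # kg

print(f"Sun:   {schwarzschild_radius(M_sun):.0f} m")            # ~2950 m
print(f"Earth: {schwarzschild_radius(M_earth) * 1000:.1f} mm")  # ~8.9 mm
```

The Sun would need to be squeezed inside roughly 3 km, and the Earth inside a marble, for either to form an event horizon — which is why black holes stayed a mathematical curiosity for so long.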

Image credit: Dwight Vincent of U. Winnipeg, via http://ion.uwinnipeg.ca/~vincent/4500.6-001/Cosmology/Black_Holes.htm.

So with that background firmly in our minds, let’s come now to the meat of Andreas’ question: what about the hot, dense, early Universe, where all the matter-and-energy presently strewn across some 92 billion light-years worth of space was contained in a volume of space no bigger than our own Solar System?
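As a rough order-of-magnitude sketch of why the question is so natural (the density and radius below are assumed round numbers for today’s Universe, not figures from this article), you can compare the observable Universe’s size to the Schwarzschild radius of its own mass-energy:

```python
import math

# Order-of-magnitude sketch with assumed round numbers: estimate the
# mass-energy inside today's observable Universe, then compare its
# Schwarzschild radius to the Universe's actual size.

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
R_obs = 4.4e26    # observable-Universe radius (~46 billion ly), m (assumed)
rho = 8.6e-27     # mean density, roughly the critical density, kg/m^3 (assumed)

M = rho * (4.0 / 3.0) * math.pi * R_obs**3   # enclosed mass-energy, kg
R_s = 2.0 * G * M / c**2                     # its Schwarzschild radius, m

print(f"Enclosed mass: {M:.1e} kg")
print(f"Schwarzschild radius {R_s:.1e} m vs. actual radius {R_obs:.1e} m")
# R_s comes out *larger* than R_obs: naively, the Universe sits inside
# its own Schwarzschild radius, which is exactly the puzzle being asked.
```

And the puzzle only sharpens at early times, when all of that mass-energy occupied a far smaller volume. The resolution, as the rest of this article explains, is that the Schwarzschild solution simply isn’t the right spacetime for the problem.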

Image credit: me.

The thing you must wrap your mind around is that, much like Minkowski’s spacetime, Schwarzschild’s solution is a static one, meaning that the metric of space does not evolve as time progresses. But there are plenty of other solutions — de Sitter space, for one, and the Friedmann-Lemaître-Robertson-Walker metric, for another — that describe spacetimes that either expand or contract.

Image credit: Richard Powell, via http://www.atlasoftheuniverse.com/redshift.html.

If we had started off with the matter-and-energy our Universe had in the early stages of the Big Bang, but in a static Universe rather than a rapidly expanding one, and one where none of the particles had pressure or a non-zero velocity, all of that energy would have formed a Schwarzschild black hole in extremely short order: practically instantaneously. But general relativity has another important caveat in it: not only does the presence of matter and energy determine the curvature of your spacetime, but the properties and evolution of everything in your space determine the evolution of that spacetime itself!

Image credit: NASA, retrieved from Pearson Education / Addison Wesley.

What’s most remarkable about this is that we know, from the moment of the Big Bang onwards, that our Universe seems to have only three possible options, dependent on the matter-and-energy present within it and the initial expansion rate:

The expansion rate could have been insufficiently large for the amount of matter-and-energy present within it, meaning that the Universe would have expanded for a (likely brief) time, reached a maximum size, and then recollapsed. It’s incorrect to say that it would collapse into a black hole (although this is a tempting thought), because space itself would collapse along with all the matter-and-energy, giving rise to a singularity known as the Big Crunch.

On the other hand, the expansion rate could have been too large for the amount of matter-and-energy present within it. In this case, all the matter and energy would be driven apart at a rate too rapid for gravitation to ever bring the components of the Universe back together, and in most models, the Universe would expand too quickly to ever form galaxies, stars, planets, or even atoms or atomic nuclei! A Universe where the expansion rate was too great for the amount of matter-and-energy contained within it would be a desolate, empty place indeed.

Finally, there’s the “Goldilocks” case, or the case where the Universe is right on the bubble between recollapsing (which it would do if it had just one more proton) and expanding into oblivion (which it would do if it had one fewer proton), and instead just asymptotes to a state where the expansion rate drops to zero, but never quite turns around to recollapse.
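The dividing line between these cases can be made quantitative: the Friedmann equation gives a critical density, ρ_c = 3H²/(8πG), for any given expansion rate H. As a sketch using an assumed round value of the Hubble rate (70 km/s/Mpc; my assumption, not a figure from this article):

```python
import math

# Critical density from the Friedmann equation: rho_c = 3 * H^2 / (8*pi*G).
# Denser than this -> recollapse; less dense -> eternal expansion;
# exactly this -> the "Goldilocks" case described in the text.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
Mpc = 3.086e22       # meters per megaparsec
H0 = 70e3 / Mpc      # assumed Hubble rate of 70 km/s/Mpc, converted to s^-1

rho_c = 3.0 * H0**2 / (8.0 * math.pi * G)

print(f"Critical density today: {rho_c:.2e} kg/m^3")   # ~9e-27 kg/m^3
```

That works out to only a few hydrogen atoms per cubic meter — the knife-edge on which recollapse and runaway expansion are balanced.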

As it turns out, we live almost in the Goldilocks case, with just a tiny bit of dark energy thrown in the mix, making the expansion rate just slightly larger, and meaning that eventually all the matter that isn’t gravitationally bound together already will be driven apart into the abyss of deep space.

Image credit: Russell Lavery of Imperial College, via http://spaces.imperial.edu/russell.lavery/.

What’s remarkable is the amount of fine-tuning that needed to occur so that the Universe’s expansion rate and matter-and-energy density matched well enough that we neither recollapsed immediately nor failed to form even the basic building-blocks of matter: something like one part in 10^24. That’s kind of like taking two human beings, counting the number of electrons in each, and finding that the counts are identical to within a single electron. In fact, if we go back to a time when the Universe was just one nanosecond old (measured since the Big Bang), we can quantify how finely-tuned the density and the expansion rate needed to be.

Image credit: David P. Bennett of Notre Dame, via http://bustard.phys.nd.edu/.

A pretty unlikely story, if you ask me! (Which you did!)

And yet, that very much describes the Universe we have, which didn’t collapse immediately and which didn’t expand too rapidly to form complex structures, and instead gave rise to all the wondrous diversity of nuclear, atomic, molecular, cellular, geologic, planetary, stellar, galactic and clustering phenomena we have today. We’re lucky enough to be around right now, to have learned all we have about it, and to engage in the enterprise of learning even more: science.

Image credit: NASA; ESA; G. Illingworth, D. Magee, and P. Oesch, University of California, Santa Cruz; R. Bouwens, Leiden University; and the HUDF09 Team.

Thanks for a great question, Andreas, and if you have a question or suggestion you’d like to see featured on Ask Ethan, go ahead and submit it. Who knows? The next column could be yours!