Some of the most perplexing topics in physics revolve around quantum theory. The quandary is seen most famously in the Schrödinger’s cat question and the issue of information loss in black hole evaporation. Richard Feynman said, “I think that I can safely say that nobody understands quantum mechanics.” Most physicists have just gotten used to it. There’s no doubt quantum theory is successful at the practical level. But when considering it as more than a tool for calculating probabilities for possible outcomes of experiments in the laboratory, and taking it as a fundamental description of the “world out there,” it faces serious conceptual problems.
The basic problem is that quantum theory seems to be about what we measure and not about what is out there in the world. One might think this is just fine, as the theory represents just “our information” about the world. But that would make sense only if there were something about the world that we can be informed of, which must be, in general situations, specified by the theory. Understanding how to deal with this conceptual problem requires us to look at the theory in more detail.
According to quantum theory, in a generic state of a system, attributes such as a particle’s position or velocity do not have well-defined values. That indefiniteness is known as “quantum uncertainty,” and, unfortunately, also as “quantum fluctuation.” The quantum theory presented in standard textbooks involves two distinct rules for the evolution of the state of a physical system. The first, referred to by Roger Penrose as the U-process, is represented by the Schrödinger equation: given the state of the system at present, it allows the precise determination of the state at any future time (deterministic prediction) or at any time in the past (complete retrodiction). But this rule holds only as long as the system is not subjected to an “observation.”
The second rule, which comes into play when some attribute of the system is observed or measured, is a stochastic rule, referred to by Penrose as the R-process. According to this rule, as a result of the measurement, the state jumps into one of the states where the attribute in question has a well-defined value. This rule does not allow, in general, a precise prediction of which state that would be, nor the retrodiction of the state previous to the measurement or observation. One can use it to accurately predict probabilities, and predict the average value that would emerge from a large number of repetitions of the experiment, as well as the statistical dispersion of the results, a quantity that coincides, in numerical value, with the level of indefiniteness mentioned above.
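The relation between the Born-rule probabilities, the predicted average, and the statistical dispersion can be sketched in a few lines. This is a minimal illustration with hypothetical amplitudes, not a model of any particular experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical qubit state a|+1> + b|-1> for an observable A with
# eigenvalues +1 and -1; the weights are chosen only for illustration.
a2, b2 = 0.2, 0.8            # Born-rule probabilities |a|^2 and |b|^2

# R-process: each run of the experiment yields +1 or -1 at random.
outcomes = rng.choice([1.0, -1.0], size=200_000, p=[a2, b2])

# Quantum predictions: the expectation value <A> and the indefiniteness
# (dispersion) sqrt(<A^2> - <A>^2) of the state before measurement.
mean_qm = a2 - b2                       # <A> = -0.6
disp_qm = np.sqrt(1.0 - mean_qm**2)     # dispersion = 0.8

print(outcomes.mean())  # close to -0.6
print(outcomes.std())   # close to 0.8: sample spread matches the indefiniteness
```

The sample average and sample spread converge to the quantum expectation value and the quantum indefiniteness, which is exactly the numerical coincidence noted above.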
One of the problems is that quantum theory is obscure (to say the least) regarding what it claims about the nature of the world when no one is looking. Is the involvement of a consciousness required for the theory to make sense, and if so, does that include a mouse’s or a fly’s? In particular, the specification of what constitutes a measurement is irreparably vague. Perhaps all that’s needed is a large enough apparatus. But what’s large enough? And what happens at the boundary? These issues are referred to as the measurement problem. Such conceptual difficulties are usually ignored by practicing physicists.
One exception is provided by David Bohm, who rediscovered a proposal (originally considered by Louis de Broglie) giving a different characterization of the theory, with point-like particles that at all times have definite positions and velocities, while the quantum state simply guides them in their time evolution (and a cat is never simultaneously dead and alive). Another notable exception is exemplified by the proponents of modifications of the theory that would unify the U and R processes into a single law, removing the need to introduce the notion of “measurement” at the fundamental level. In that case, Schrödinger’s unfortunate pet would be either dead or alive, even if no one is looking.
This approach has formed the basis of “spontaneous collapse” theories,1 which are characterized by invoking something akin to a collection of miniature versions of the R process occurring spontaneously to all particles throughout space and time; that is, without the need for a measurement to take place. Further out on the frontier is the many-worlds theory (pioneered by Hugh Everett), in which every measurement is tied to a bifurcation (or multifurcation) of reality into something like parallel coexisting worlds.
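How a succession of small, random, measurement-like kicks can drive a superposition to a definite outcome while reproducing Born-rule statistics can be shown with a toy caricature. This is not the actual GRW/CSL dynamics, and every parameter below is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def collapse_trial(p0=0.3, eps=0.02):
    """Let p, the weight |amplitude|^2 of one branch (say, 'cat alive'),
    perform an unbiased random walk until it is driven to 0 or 1.
    This mimics many tiny spontaneous R-processes acting on the state;
    it is a caricature, not the real collapse equations."""
    p = p0
    while 0.0 < p < 1.0:
        p += eps if rng.random() < 0.5 else -eps
    return round(p)  # 1: collapsed onto this branch, 0: onto the other

# Every trial ends with a definite outcome; the frequency of each
# outcome matches the initial Born-rule weight p0.
trials = [collapse_trial() for _ in range(2000)]
fraction = sum(trials) / len(trials)
print(fraction)  # close to p0 = 0.3
```

Because the walk is unbiased, the probability of being absorbed at 1 equals the starting weight, which is why such dynamics can recover the standard quantum probabilities without invoking a measurement.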
A careful analysis shows that these are essentially the three logical avenues that might be taken to deal with the issue2: modify the theory by adding something beyond the quantum state (the hidden-variable route exemplified by the de Broglie-Bohm approach), modify the theory’s rules of evolution by having measurement-like events occur all the time (as in the spontaneous collapse theories), or remove the R process altogether (which takes us down the many-worlds path).
Many quantum physicists are convinced that the issue at large, or the approach one might take in its regard, is of no relevance to the challenges in their fields. I, among a small group of colleagues, hold a dramatically different view, and maintain that spontaneous collapse is the most promising route to addressing some of the most serious difficulties faced by our current understanding of the laws of the universe, in particular those situations where gravitation and quantum theory must be used together.
A central feature of cosmology, as we commonly understand it, is an epoch known as inflation, thought to have taken place in the first fractions of a second after the Planck epoch, itself a mysterious regime. In the Planck epoch, quantum gravity should rule, and the very notion of spacetime would probably cease to be relevant or useful. (Quantum gravity refers to a theory that would harmoniously combine the basic principles of general relativity, our best theory of gravitation, and quantum theory.) In the inflationary regime, by contrast, the usual concepts of spacetime are supposed to be adequate. In addition, gravitation is thought to be well described by general relativity, and matter explained by the same type of theories we use in ordinary particle physics situations (such as those explored empirically in places like CERN or in studies of high-energy cosmic rays).
The main difference is that the type of matter thought to be dominant in the inflationary epoch is a field known as the inflaton. This is a little bit like the electromagnetic field, but far simpler, because it lacks intrinsic directionality or spin. The main feature of that epoch is that the universe expands, as a result of the gravitational effects of the inflaton field, in an extremely fast and accelerated manner (by a total expansion factor of at least a million trillion trillion times; i.e., a factor of 10³⁰). As a result, the spatial curvature of the universe is driven to zero, and all deviations from perfect homogeneity and isotropy are almost completely diluted (the remaining deviations being of order 10⁻⁹⁰, so small that, for simplicity, I will take them to be zero).
The inflationary epoch ends when the inflaton field decays, filling the universe with all the matter we find in it today: the usual matter from which you, the chair you are sitting on, and the solar system are made; the slightly more exotic type of matter we are able to produce for fractions of a second with powerful particle accelerators like CERN’s; and even the elusive dark matter that seems to constitute the overwhelming part of galaxies and galactic clusters. In other words, the end of the inflationary epoch is supposed to lead into the regime described by the older, traditional, and empirically successful Big Bang cosmology, describing an expanding universe filled with extremely hot plasma composed of all the variety of particles, with their respective abundances basically ruled by thermodynamic considerations. A universe that cooled down as it expanded, leading to the formation of light nuclei (when the temperature dropped to about a billion kelvin), and much later, to the formation of the first atoms (at about 3,000 kelvin). This latter stage is the one in which the photons corresponding to the cosmic microwave background radiation are emitted.
In the small variations of the temperature patterns of the cosmic microwave background radiation, we can see the imprint of the primordial deviations from homogeneity and isotropy that would continue to grow to the present to make up the galaxies, stars, and planets that populate our current universe. The point is that the universe is not homogeneous and isotropic now, and has not been for quite some time. According to inflation, on the other hand, the universe’s violent expansion completely diluted all inhomogeneities (differences in conditions between different places) and anisotropies (differences among different directions). That situation is described in terms of a spacetime and an inflaton field in states that are completely homogeneous and isotropic.
Where do the inhomogeneities that led to the formation of all cosmic structure, and whose imprint we see in the cosmic microwave background, come from? According to the current cosmological orthodoxy, they arose out of “quantum fluctuations” of the inflaton and the spacetime metric during the inflationary epoch. In fact, inflation comes together with a recipe for the quantum state of fields in the inflationary epoch, the so-called Bunch-Davies vacuum. That state, just like the vacuum state in flat spacetime, has the property of being 100 percent homogeneous and isotropic, and yet somehow we are supposed to regard its quantum uncertainties as containing the seeds of present-day cosmic inhomogeneities.
Most cosmologists see no problem here because they readily interchange “quantum indefiniteness” and “statistical dispersion” (a conceptual error often obscured by the fact that the word fluctuation is used in both contexts). But that interchange would be justified only if a measurement were involved. The point is that a measurement might indeed change the state of the system, according to the R-process, leading to a state that, unlike the initial one, is no longer homogeneous and isotropic.
But what could count as a measurement in the early universe, well before galaxies, planets, and conscious beings ever formed? Some cosmologists would answer that, today, with our satellites, we are making the required measurements. A moment’s reflection shows how problematic such a posture is: We, with our measuring devices, would be responsible for the breakdown of the perfect homogeneity prevailing in the early universe, a change that led to the formation of cosmic structure, including galaxies, stars, and planets, which in turn was necessary for the emergence of the conditions where life and (self-proclaimed “intelligent”) creatures like ourselves would be possible! We would be, in part, the cause of our own existence! I can’t help but be reminded of the old country song, “I’m My Own Grandpa.”
After considering existing paths to address the grandpa problem, Alejandro Perez, Hanno Sahlmann, and I proposed3 adding a new ingredient to the mix: the spontaneous collapse of the quantum state of the inflaton field. This is a version of the R-process, taking place constantly, which in general induces small and random changes in the quantum state of the field. The randomness of such a process would be able to account for the breakdown of homogeneity and isotropy in the early universe, without having to invoke any observer or measuring device. Moreover, if the spontaneous collapse satisfies some simple requirements, the resulting predictions regarding these inhomogeneities can reproduce the characteristics of the distribution of the temperature variations that are seen in the cosmic microwave background.4
At first, the new approach did not seem to lead to any important departures from the standard predictions. But there’s at least one aspect of the story in which the predictions differ dramatically. It turns out that, according to the standard treatment, the predictions for the generation of inhomogeneities in the density of matter in the universe come inseparably attached to similar predictions for the generation of so-called primordial gravitational waves. These would be similar to the gravitational waves so spectacularly detected by the LIGO and Virgo detectors, arising from collisions of black holes and/or neutron stars. But unlike those, the primordial ones would be so feeble now that their presence is expected to be detectable only through a certain type of anisotropy in the polarization of the cosmic microwave background radiation.
The search for those has been intense, as they are taken as the main possible confirmation of the correctness of inflation. The fact that they have not so far been detected is considered a serious problem for inflationary cosmology, with the simplest and most attractive models already ruled out by the failure of the expected detection. In our approach,5 the predictions regarding the generation of primordial gravitational waves are so dramatically reduced that they would be undetectable by current methods and detector sensitivities. The calculations show they might be detectable only with substantially improved sensitivities and with a change of focus from the very small to the very large angular scales in the sky (two unfortunately rather difficult things to do). Thus, quite unexpectedly, and as a result of the type of conceptual considerations we set out to face, a concrete prediction of inflationary cosmology was dramatically changed, with the new prediction in better accordance with the existing empirical evidence.
The conceptual difficulties of quantum theory are also tied to the topic of black holes. The theory of general relativity predicts that once black holes are formed, a singularity—a region where geometrical quantities would nominally acquire the value infinity—will develop in their interior, with the curvature diverging as that region is approached. The nature of such singularities has given rise to wild speculations, including the notion that they represent the emergence of even more exotic objects, or even portals to other universes. But what they really indicate is the presence of a regime where the theory of general relativity fails to apply. (Not too exciting, sorry folks!)
That is, if we want to rely on the theory of general relativity, we must do so only up to some boundary that excludes the region where those singularities are supposed to appear.
Physicists are generally convinced that our current theory ought to be superseded by a deeper one, which encapsulates both general relativity and quantum mechanics, joined in a smooth and self-consistent manner: a theory of quantum gravity. Such a theory is expected to “cure” those singularities and remove the need to include a boundary in discussions involving black holes. The least speculative notions do not involve anything like portals to other universes or wildly exotic objects appearing in their stead.
One feature of black holes, first noted by physicist Jacob Bekenstein and taken as a fundamental clue, is that their energy exchanges with the exterior are ruled by laws that seem identical to those of thermodynamics. In particular, as shown by Stephen Hawking, they lose energy via the emission of thermal radiation, and they have an entropy of roughly one Boltzmann constant (ubiquitous in all of thermodynamics) for each “tile” of Planck-length side needed to cover the area of the black hole. This has attracted intense interest in recent decades, as physicists started considering a variety of approaches toward the construction of a theory of quantum gravity. Surely, such a theory should be able to account for the expression for the black hole entropy. And readily enough, in a relatively short time, and within slightly different, but always rather limited, contexts, proponents of quantum gravity found accounts that came up with the right answer.
But the fact that this analysis, starting from Hawking’s discovery, involves quantum theory has raised another question that continues to baffle physicists. It’s the focus of intense debates and disagreements and goes by the name of the black hole information “paradox.”
The usual account goes like this: According to quantum theory, the quantum state of an isolated physical system provides a complete description of such a system. That state evolves according to an evolutionary law that allows for the exact prediction of the corresponding state at any other time in the future, or the retrodiction of the state of the system in the past. On the other hand, a black hole of a certain mass and angular momentum could have been formed in a large number of ways. If the black hole evaporates completely, leaving only the thermal radiation, which is fully characterized in very simple ways, there seems to be no way in which it might encode all the information needed to retrodict, with precision, the exact quantum state of the matter that gave rise to the black hole in the first place. Thus, from the details of the final state, it would be impossible to retrodict the detailed state from which the black hole was initially formed, in conflict with what is expected, given the characteristics of the evolution laws of quantum theory. This, to many people, indicates that we face a “paradox.”
A closer look into the problem reveals that things are not so straightforward (and explains why I put the word paradox in quotation marks). The point is that the claim that according to quantum theory we should be able to retrodict the detailed state from which the black hole was initially formed is just false. Such a conclusion would only follow if one just focuses on the U-process and completely ignores the R process. Thus, it is natural to tie the consideration of the issues arising in connection with black hole evaporation and the fate of information with the resolutions of the measurement problem.6
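The asymmetry between the two processes can be made concrete in a few lines of linear algebra. This is a schematic qubit example; the rotation U and the initial state are arbitrary choices made for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# A random initial qubit state (standing in for the detailed state of
# the matter that formed the black hole)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

# U-process: unitary evolution is invertible, so the past state can
# always be recovered exactly from the present one.
theta = 0.7  # arbitrary evolution parameter
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)
recovered = U.conj().T @ (U @ psi)
print(np.allclose(recovered, psi))  # True: perfect retrodiction

# R-process: a measurement collapses the state to |0> or |1>.
# Infinitely many distinct initial states map to the same outcome,
# so psi cannot be reconstructed from the result alone.
probs = [abs(psi[0])**2, abs(psi[1])**2]
outcome = rng.choice([0, 1], p=probs)
collapsed = np.eye(2)[outcome].astype(complex)
print(np.allclose(collapsed, psi))  # False for a generic psi
```

Retrodiction fails exactly where the R-process acts, which is why invoking it dissolves the apparent conflict with quantum theory’s evolution laws.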
One of the most attractive solutions to the measurement problem is provided by spontaneous collapse. Starting in 2015, my colleagues and I have considered7 and analyzed in detail, with the help of simplified models,8 whether the use of such theories, in the context of black hole evaporation, could fully address the issue. Our analysis so far indicates that the answer is yes, provided the spontaneous collapse rate increases with the curvature of spacetime. If that is the case, then the small level of information erasure normally associated with spontaneous collapse becomes efficient enough, due to the increasing curvature in the deep interior of the black hole, to account for all the information that seems to be erased when the black hole evaporates completely.
The work must continue to sort out the open issues and the details of the exact form of the theory, and to find other situations where these ideas could be put to the test. Although things are not yet settled, the possibility exists that a collective resolution of problems as diverse as that of Schrödinger’s cat, the black hole information issue, and the puzzling aspects of inflationary cosmology might result from the consideration of spontaneous collapse. We have recently found other issues where this approach might be of help, including the possibility of accounting for the very low entropy of the initial state of the universe,9 and a path to understanding the nature and magnitude of dark energy.10 The use of spontaneous collapse theories in situations involving gravitation seems to be a very promising and exciting research path indeed.
Daniel Sudarsky has held visiting positions at the University of Chicago, Penn State University, the University of Buenos Aires in Argentina, the University of Marseille in France, and NYU. He is currently a member of the Board of Directors of the John Bell Institute for the Foundations of Physics and professor at the Institute for Nuclear Sciences of the National Autonomous University of México.
1. Ghirardi, G.C., Rimini, A., & Weber, T. Unified dynamics for microscopic and macroscopic systems. Physical Review D 34, 470-491 (1986); Pearle, P. Combining stochastic dynamical state-vector reduction with spontaneous localization. Physical Review A 39, 2277-2289 (1989); for a relatively recent review see Bassi, A. & Ghirardi, G. Dynamical reduction models. Physics Reports 379, 257-426 (2003).
2. Maudlin, T. Three measurement problems. Topoi 14, 7-15 (1995).
3. Perez, A., Sahlmann, H., & Sudarsky, D. On the quantum mechanical origin of the seeds of cosmic structure. Classical and Quantum Gravity 23, 2317-2354 (2006).
4. Planck Collaboration (Akrami, Y., et al.) Planck 2018 results. X. Constraints on inflation. arXiv:1807.06211 (2019).
5. León, G., Kraiselburd, L., & Landau, S.J. Primordial gravitational waves and the collapse of the wave function. Physical Review D 92, 083516 (2015); Majhi, A., Okón, E., & Sudarsky, D. Reassessing the link between B-modes and inflation. Physical Review D 96, 101301 (2017); León, G., Majhi, A., Okón, E., & Sudarsky, D. Expectation of primordial gravity waves generated during inflation. Physical Review D 98, 023512 (2018).
6. In this, we have been strongly influenced by considerations in this regard made over three decades ago in works such as Penrose, R. Time asymmetry and quantum gravity. In Isham, C.J., Penrose, R., & Sciama, D.W. (Eds.) Quantum Gravity II (1981); Wald, R.M. Quantum gravity and time reversibility. Physical Review D 21, 2742 (1980).
7. Okón, E. & Sudarsky, D. The black hole information paradox and the collapse of the wave function. Foundations of Physics 45, 461-470 (2015); Okón, E. & Sudarsky, D. Losing stuff down a black hole. Foundations of Physics 48, 411 (2018).
8. Modak, S., Ortiz, L., Peña, I., & Sudarsky, D. Non-paradoxical loss of information in black hole evaporation in collapse theories. Physical Review D 91, 124009 (2015); Bedingham, D., Modak, S.K., & Sudarsky, D. Relativistic collapse dynamics and black hole information loss. Physical Review D 94, 045009 (2016).
9. Okón, E. & Sudarsky, D. A (not so?) novel explanation for the very special initial state of the universe. Classical and Quantum Gravity 33 (2016).
10. Josset, T., Perez, A. & Sudarsky, D. Dark energy as the weight of violating energy conservation. Physical Review Letters 118, 021102 (2017); Perez, A. & Sudarsky, D. Dark energy from quantum gravity discreteness. Physical Review Letters 122, 221302 (2019).
Lead image: local_doctor / Shutterstock