
It is hard to overstate the anticipation that preceded the opening of the Large Hadron Collider (LHC) 10 years ago. Smashing protons together at energies well above those produced at any previous particle accelerator, the LHC seemed capable of vindicating the most fanciful speculations of theoretical physicists, from curled-up extra dimensions to microscopic black holes to a hidden realm of new particles mirroring the particles that we know.

A decade on, particle physicists find themselves in what some at the time called the “nightmare scenario”: discovery of the Higgs boson and nothing else. The triumphant discovery of the Higgs in 2012 confirmed theoretical notions about the generation of particle masses introduced in the 1960s with the Standard Model of particle physics, which describes three of the four fundamental forces of nature (gravity being the exception). The absence of new physics at the LHC so far comes as a snub to many of the speculative ideas for physics beyond the Standard Model that have been advanced since the 1960s and ’70s. This development (which still could be overturned by future analyses at the LHC) has invigorated discussion about the status of a central idea in modern elementary particle physics called the naturalness principle, which served as the basis for the prediction that “new physics”—experimental hints of more fundamental patterns beyond the Standard Model—would be found at the LHC.


As with most mathematical theories in science, the predictions of the Standard Model depend on the values of certain fixed quantities known as the theory’s parameters. If we change the parameter values, we typically change the predictions of the theory. In elementary particle physics, naturalness is most commonly understood as a prohibition against “fine tuning,” the contrived, unexplained tweaking of a theory’s parameters to accommodate unexpected observations. Naturalness restricts the permissible values of the parameters of the Standard Model by restricting the amount of fine tuning that they can exhibit. If a theory requires a lot of fine tuning to agree with observation, it is considered unnatural.


Debates over naturalness have a long history in quantum field theory (QFT), the mathematical and conceptual framework within which the Standard Model is formulated. In QFT, a pervasive background field such as the electromagnetic field has no definite value at any point in space; it exists only in a superposition that reflects the probabilities of different field values. While the energy of vibrations in a classical field (like a sound wave) can be as large or as small as we like, the energy of vibrations in a quantum field has a minimum value. These minimal disturbances in the quantum field are what we know as elementary particles.
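In standard textbook terms (a detail added here for concreteness, not something the argument below depends on), each vibration mode of frequency ω in a quantum field behaves like a quantum harmonic oscillator, whose allowed energies are:

```latex
% Allowed energies of a single field mode of frequency \omega
% (the standard quantum harmonic oscillator spectrum):
E_n = \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad n = 0, 1, 2, \ldots
```

The smallest possible step above the lowest-energy state is ℏω, and that indivisible quantum of vibration is what registers in a detector as a particle.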

Unfortunately, when we naively use QFT to calculate the probability of some process—for example, the production of a Higgs boson—it generates nonsensical, infinite predictions. To address this problem, the inventors of QFT devised a clever but mathematically dubious trick for extracting finite predictions from the theory, known as renormalization. The trick is to recognize that in QFT we have a choice between finite parameters that generate infinite predictions, as in naive applications of the theory, and infinite parameters that generate finite predictions. By choosing to make these parameters, rechristened as “bare” parameters, infinite, we can extract finite predictions from QFT that agree remarkably well with experiment. However, renormalization was widely viewed as a temporary hack lacking a sound mathematical basis, and many of the inventors of QFT continued to regard it with suspicion. Because they were infinite, bare parameters were originally seen as a mere mathematical device that did not describe anything in nature.
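As a rough illustration of that trade-off, here is a toy numerical sketch. The relation between the “bare” and “physical” couplings below is invented for this article and is not real QFT; the point is only how the bookkeeping works: the physical prediction is held fixed at its measured value while the bare parameter is pushed toward infinity along with the regulator.

```python
import math

# Toy cartoon of renormalization (an invented relation, not real QFT):
# pretend a measurable coupling g_phys is related to a "bare" coupling g0
# by g_phys = g0 - C * log(R), where R is a regulator that the original
# formulation of QFT requires us to push to infinity at the end.
# Renormalizing means choosing g0, for each value of R, so that g_phys
# always equals the number experiments actually report.

C = 0.1            # strength of the "quantum corrections" (arbitrary)
G_MEASURED = 0.5   # the experimentally measured value (arbitrary)

def bare_coupling(regulator):
    """Bare parameter needed to reproduce the measured coupling."""
    return G_MEASURED + C * math.log(regulator)

for regulator in (1e2, 1e8, 1e32):
    g0 = bare_coupling(regulator)
    g_phys = g0 - C * math.log(regulator)
    print(f"regulator = {regulator:.0e}   bare = {g0:7.3f}   physical = {g_phys:.3f}")

# The physical prediction stays finite and fixed, while the bare parameter
# grows without bound as the regulator is removed: "infinite parameters
# that generate finite predictions."
```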


In the 1970s, Kenneth Wilson formulated a new approach to renormalization that removed the infinities from QFT. His approach was inspired by condensed matter physics, which concerns complex many-particle systems like crystals and semiconductors. Condensed matter systems can often be described in a way that closely resembles the QFTs of particle physics, since the vibrations of the atoms in a solid can be collectively described as a field. However, there is an upper limit to the energy of vibrations that can propagate in these materials, since the distance between neighboring atoms sets a minimum wavelength. Wilson suggested that the QFT models of elementary particle physics likewise be defined to include only vibration energies up to some high-energy “cutoff.” Since the infinities of QFT come from attempts to describe vibrations of arbitrarily high energy, introducing a cutoff allows for finite values of both the theory’s predictions and its bare parameters.
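To see concretely how a lattice enforces such a cutoff, here is a small sketch using the textbook vibration spectrum of a one-dimensional chain of atoms (the spacing, mass, and spring constant are arbitrary placeholder values, not those of any real material):

```python
import math

# Vibrations of a 1-D chain of atoms with spacing a, atomic mass m, and
# spring constant k between neighbors (all set to 1 in arbitrary units).
# The textbook dispersion relation is omega(q) = 2*sqrt(k/m)*|sin(q*a/2)|:
# because the sine is bounded, no vibration of the chain can exceed
# omega_max = 2*sqrt(k/m). Wavelengths shorter than the atomic spacing,
# which would be needed for higher frequencies, simply do not exist.

a, m, k = 1.0, 1.0, 1.0

def omega(q):
    """Vibration frequency at wavenumber q."""
    return 2.0 * math.sqrt(k / m) * abs(math.sin(q * a / 2.0))

omega_max = 2.0 * math.sqrt(k / m)
for q in (0.1, 1.0, 2.0, math.pi / a):
    print(f"q = {q:5.3f}   omega = {omega(q):.3f}   (cutoff: {omega_max:.3f})")
```

Wilson’s proposal treats the QFTs of particle physics as if they, too, have some analogous maximum energy built in, even if we do not know what plays the role of the atomic spacing.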


One interpretation of Wilson’s work takes the analogy between particle physics and condensed matter theory relatively literally. According to this interpretation, there is a single, true set of values for the bare parameters of a QFT model, known as the QFT’s “fundamental parameters,” which provides the correct “microscopic” description of the quantum fields, much as a detailed description of the inter-atomic interactions in a piece of silicon provides the true microscopic description of that system. By contrast, quantities measured in accelerators, such as the Higgs boson mass or the probability of producing a Higgs boson, belong to a coarser, “macroscopic” level of description, somewhat the way quantities like temperature and density provide a coarser macroscopic description of a solid.

But this particular way of understanding Wilson’s approach to QFT posed a problem with the 125 GeV value of the Higgs boson mass that was later measured at the LHC. The problem was originally articulated by Leonard Susskind in 1979, well before the Higgs mass was measured in 2012. In the Standard Model, the physical, experimentally measurable value of the Higgs mass is calculated as the sum of the bare Higgs mass parameter—one of the “fundamental parameters” of the Standard Model—and so-called quantum corrections that capture the effect of the Higgs’ interactions with itself and other particles. Because these corrections grow much more quickly with increasing values of the cutoff than do the quantum corrections to other Standard Model parameters, recovering the measured value of the Higgs mass requires an unusually delicate cancellation between the bare Higgs mass and its quantum corrections. If we imagine selecting values for the fundamental bare parameters randomly from the set of all possible values, it is highly unlikely that the chosen values will exhibit the required cancellation; only an improbable, fine-tuned set of parameter values will work. The larger the cutoff, the more fine tuning is needed.
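To get a feel for the numbers, here is a deliberately schematic version of the arithmetic, in which the quantum correction is simply taken to be the square of the cutoff with its coefficient set to one (the real calculation involves loop factors and several competing contributions, so treat this only as an order-of-magnitude cartoon):

```python
# Schematic version of the Higgs fine-tuning arithmetic. The measured
# mass squared is written as m_phys^2 = m_bare^2 + delta^2, where the
# quantum correction delta^2 is taken to grow like the cutoff squared
# (its coefficient is set to 1 here purely for illustration).

M_PHYS = 125.0  # GeV, the measured Higgs boson mass

for cutoff in (1e3, 1e6, 1e16):       # GeV; 1e16 is a typical "high" scale
    delta_sq = cutoff ** 2            # schematic quantum correction
    bare_sq = M_PHYS ** 2 - delta_sq  # what the bare parameter must equal
    precision = delta_sq / M_PHYS ** 2
    print(f"cutoff = {cutoff:.0e} GeV: bare mass^2 and corrections must "
          f"cancel to about 1 part in {precision:.0e}")
```

Nothing in this cartoon pins down the actual coefficients, but it shows why pushing the cutoff higher makes the required cancellation rapidly more delicate.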


It’s as if we haphazardly tossed a handful of sand onto a dark surface and found the grains ordered into the famous image of Albert Einstein sticking his tongue out. Since the probability of such a configuration occurring by chance is astronomically small, we would expect a deeper underlying explanation—for example, perhaps the grains contain iron and there is a magnet below the surface. The more grains there are in the handful, the less likely such an image is to arise by random chance, since arrangements resembling Einstein make up an ever smaller fraction of all the possible arrangements of the grains. Thus, the appearance of Einstein’s image in the sand calls out more urgently for explanation the more grains there are. By analogy, a measured value of the Higgs mass that requires delicate cancellations between the bare Higgs mass and its quantum corrections calls out more urgently for explanation the larger the cutoff of the Standard Model is.


In order to avoid the need for fine tuning, physicists had predicted that this cutoff, the energy scale at which new physics should appear, would lie somewhere in the relatively low range of energies probed by the LHC. Ten years later, the continuing absence of any such new physics places the Standard Model cutoff somewhere above 1000 GeV, which already requires more fine tuning than many physicists find acceptable. The need for fine tuning is now thought by many to imply that the Standard Model is unnatural, in the sense that it relies on a contrived, fine-tuned choice of values for its fundamental parameters.

One view of the present situation is that results from the LHC require us to accept the need for fine tuning in the Standard Model, which in turn demands explanation by a more fundamental, as yet unknown theory. This is the position of Gian Giudice, head of the CERN theory group, and many other members of the particle physics community, for whom fine tuning remains a real problem in need of resolution by theories beyond the Standard Model.


Another possibility is that the naturalness principle, understood as a prohibition against fine tuning of bare parameters, should be abandoned, and that attempts to avoid or explain fine tuning of the bare Higgs mass should be much less of a focal point in the exploration of physics beyond the Standard Model. This view has been defended by a small minority of prominent voices, including the physicists Christof Wetterich, Eugenio Bianchi, Carlo Rovelli, and Sabine Hossenfelder, and the mathematician Peter Woit. This more controversial suggestion implies that much work in theoretical particle physics over the last four decades has been premised on shaky metaphysical speculations, which might have held less sway were the particle physics community less beholden to the pronouncements of a few leading figures. Notwithstanding the radical implications of this view, the absence of new physics at the LHC leaves it open as a strong possibility.


If naturalness is misguided, where does it go astray? One potential point of weakness, recently underscored by Hossenfelder, is the assumption that some Standard Model parameter values are more “likely” than others. She argues that since the parameters of the Standard Model are fixed and given to us only once, we cannot estimate this probability in the way that we estimate the probability, say, of a coin landing heads—by flipping it lots of times and seeing how many times it turns up heads. Since there is nothing to tell us how the probability distribution over the fundamental bare parameter space is determined, the worry that the parameters of the Standard Model are somehow unlikely or fine tuned can’t be formulated. Thus, Hossenfelder suggests that we shouldn’t worry about fine tuning of the Higgs mass, or pursue naturalness in physical theories.
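Her point can be made concrete with a toy calculation. The range and the “suspiciously small” threshold below are invented purely for illustration; the lesson is that the answer to “how improbable is a small value?” is fixed entirely by the probability distribution we choose to impose, and the theory itself supplies none.

```python
import math

# Toy illustration of the measure problem. Ask: how improbable is it that
# a parameter confined to the range [1e-6, 1e6] comes out smaller than 1?
# (The range and threshold are invented; the point is only that the answer
# depends entirely on the probability distribution we choose to impose.)

LOW, HIGH, THRESHOLD = 1e-6, 1e6, 1.0

# Uniform prior: every value in [LOW, HIGH] is equally likely.
p_uniform = (THRESHOLD - LOW) / (HIGH - LOW)

# Log-uniform prior: every order of magnitude is equally likely.
p_log_uniform = (math.log(THRESHOLD) - math.log(LOW)) / (math.log(HIGH) - math.log(LOW))

print(f"uniform prior:     P(x < 1) = {p_uniform:.1e}")     # about 1e-6
print(f"log-uniform prior: P(x < 1) = {p_log_uniform:.2f}")  # exactly 0.5

# The same range, two defensible priors, and answers that differ by six
# orders of magnitude. Nothing in the theory tells us which one to use,
# so "the measured value is improbable" has no well-defined meaning.
```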

Another concern about naturalness was articulated in the 1980s by Wetterich, a pioneer of renormalization theory. In Wetterich’s view, the delicate cancellation between the Higgs bare mass and quantum corrections is an artificial by-product of how we choose to calculate the Higgs boson mass, rather than a reflection of any mysterious coincidence or underlying conspiracy. Wetterich argues that other conventions could be chosen in which the cancellation is absent, and that we therefore shouldn’t worry about it. More recently, Bianchi and Rovelli have advocated a similar attitude with regard to the famous cosmological constant problem, which is closely related to the Higgs naturalness problem.

Since 2016, my collaborator Robert Harlander and I have been studying the debate around naturalness and fine tuning. What we’ve found is that Wetterich’s argument seems implicitly to dispense with the assumption, made in many interpretations of Susskind’s argument, that there are real physical values of the bare parameters. Instead, Wetterich’s argument assumes that there are many distinct, physically equivalent choices of values for the bare parameters, all generating exactly the same predictions, and that there is no matter of fact about what the “true” or “fundamental” bare parameters of the theory are.

From this perspective, the delicate cancellation between bare Higgs mass and quantum corrections appears more closely akin to the cancellation that occurs, say, when we calculate the distance from the Museum of Natural History to the Metropolitan Museum of Art in New York by taking the difference between their distances to the “Hollywood” sign in Los Angeles. Here, as in the case of the Higgs, the difference between two large numbers yields a much smaller number. However, there is no coincidence or fine tuning. The delicate cancellation is merely an artificial by-product of an inconvenient reference point, and can easily be removed by an alternative choice, such as the middle of Central Park. In the case of the Higgs mass, the delicate cancellations between the bare Higgs mass and quantum corrections can similarly be removed by choosing a different reference point, associated with the value of a certain arbitrarily chosen, unphysical scale parameter.
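To put rough numbers on the analogy, here is a sketch of both versions of the calculation. The coordinates are approximate, and the Central Park point is an arbitrary nearby reference chosen for illustration.

```python
import math

def haversine_km(p, q):
    """Great-circle distance in km between two (latitude, longitude) points."""
    R = 6371.0  # mean Earth radius in km
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Approximate coordinates (illustrative only).
AMNH = (40.7813, -73.9740)          # American Museum of Natural History
MET = (40.7794, -73.9632)           # Metropolitan Museum of Art
HOLLYWOOD = (34.1341, -118.3215)    # Hollywood sign
CENTRAL_PARK = (40.7803, -73.9686)  # an arbitrary point between the museums

d_amnh = haversine_km(AMNH, HOLLYWOOD)
d_met = haversine_km(MET, HOLLYWOOD)
difference = abs(d_amnh - d_met)
cancellation = 100.0 * (1.0 - difference / max(d_amnh, d_met))

print(f"AMNH to Hollywood sign: {d_amnh:9.2f} km")
print(f"Met  to Hollywood sign: {d_met:9.2f} km")
print(f"difference:             {difference:9.2f} km "
      f"(a {cancellation:.3f}% cancellation)")
print(f"AMNH to Central Park:   {haversine_km(AMNH, CENTRAL_PARK):9.2f} km")
print(f"Met  to Central Park:   {haversine_km(MET, CENTRAL_PARK):9.2f} km")

# With the faraway reference point, the sub-kilometer answer emerges only
# from a near-total cancellation of two ~4,000 km numbers; with the nearby
# reference point, no delicate cancellation ever appears.
```

The point is not the particular numbers, which are only approximate, but that the delicacy of the cancellation tracks the choice of reference point rather than anything about the museums themselves.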


The validity of naturalness-based arguments that invoke the notion of fundamental parameters rests on questions about what is physically real in QFT—about which parts of the mathematics of QFT “latch on” to genuine features of the physical world, and which do not. The main argument for abandoning the notion that a single set of values for the Standard Model’s bare parameters provides the “true” underlying description of the quantum fields is that this notion is unnecessary both for generating the successful predictions of the Standard Model and for defining the theory in a mathematically sound way.

In one sense, the choice to abandon the notion of fundamental parameters parallels the choice in Einstein’s special theory of relativity to abandon the assumption that there is an absolute matter of fact about which objects are at rest and which are in motion. In both cases, we shed the idea that there is a physical matter of fact about which reference point or coordinate system is physically correct or real, in favor of the notion that there is a plurality of equally valid descriptions associated with different arbitrarily chosen reference points. What counts as real are quantities that don’t depend on the arbitrary reference point.

Supersymmetry, long a leading candidate among theories describing physics beyond the Standard Model, has frequently been defended on the basis of its ability to cure the problem of delicate cancellations in the calculation of the Higgs mass. The extent to which this counts as a virtue of supersymmetry depends on the extent to which these delicate cancellations were ever in especially urgent need of resolution to begin with. Shedding the notion of fundamental parameters raises the possibility that they were not.

What’s at stake in debates about the validity of the naturalness principle is not the notion that physicists should continue the search for deeper, more universal theories—this is widely accepted—but rather the notion that the delicate cancellations associated with the Higgs mass constitute a mysterious coincidence, and that explaining these cancellations should be a main focus in the search for physics beyond the Standard Model. To the extent that the particle physics community is restless to extricate itself from the familiar confines of the Standard Model, the absence from LHC data of the new physics predicted by the naturalness principle is disappointing and anticlimactic. However, given the many difficulties that continue to afflict the mathematical and conceptual foundations of quantum field theory, it may be that the road to progress lies partly in the quest for a deeper understanding of the theories we already know.


Joshua Rosaler is a research fellow at the Institute for Theoretical Particle Physics and Cosmology at RWTH Aachen University and a member of the “Epistemology of the LHC” collaboration. He holds a doctorate from the University of Oxford, where he was a Clarendon Scholar. 

The author wishes to acknowledge the Epistemology of the LHC Research Unit, of which he is a member.

Lead photocollage: general-fmv / iLab / Shutterstock.
