Chicago-bound motorists passing mile marker 121 on Interstate-88 through Aurora, Illinois, on Sept. 30, 2011, at 3:00 p.m. likely noticed nothing that seemed particularly remarkable. To their right was a scene of humdrum office parks, and to the left was a low-slung sprawl of buildings, fences, and trees fringed with the first yellow edges of fall color. In the distance, the skyscrapers of the Loop would soon materialize from the afternoon haze.
Appearances, however, were deceptive. Unbeknownst to that heavy stream of Friday traffic, the drivers were threading, both physically and metaphorically, through a moment of profound and potentially far-reaching transition.
Just north of I-88, at the Fermi National Accelerator Laboratory, the Tevatron Collider had, at 3:00 p.m., completed the last minute of its final day of operation, a victim of budget cuts. For a final few moments, in an awe-inspiring 1.24-mile-wide ring of superconducting magnets whose footprint is visible from space, protons and antiprotons had raced in opposite directions at full speed, executing nearly 50 laps per millisecond, colliding head-on in exquisitely staged, almost vanishingly fleeting microscopic disasters where temperatures had exceeded a quadrillion degrees, revisiting, in a quiet corner of suburbia, the extreme state of nature that held sway during the first second of the universe’s existence.
That afternoon’s shuttering of the Tevatron closed the books on decades of United States leadership in particle physics. The momentum from the success of the Manhattan Project that had spurred the construction of ever larger, ever more powerful, and ever more expensive machines had effectively just come to an end.
Just south of the Tevatron, across the six lanes of I-88, and a few hundred yards from the frontage road, stood a bland white building that, at casual glance, resembled any of the countless distribution warehouses in the Chicago exurbs. This building, however, had no signs or markings, and a careful observer might well have wondered at the profusion of security cameras, the sturdy iron fence, the manned guardhouse, and the handful of carefully tended pockets of wetlands-inspired landscaping. All of these touches, to the few in the know, provided subtle hints of what was going on inside. The building, recently completed, was a state-of-the-art data center built and owned by the Chicago Mercantile Exchange. For 23 hours per day, five days a week, in a rack of industrial-grade computer servers, the direction of the entire U.S. financial market was being thrashed out in a frenzied electronic blur that makes the action in the trading pits and exchange floors of old seem like the slow advances and retreats of glaciers.
In racks of co-located equipment, rented by mysterious firms at exorbitant expense, field programmable gate arrays were executing hundreds of billions of floating point operations per second, all with the goal of divining where the next fluctuations in price would land. In what can be described as a stylized, spontaneously emergent choreography (or, to employ the usual cliché, a high-stakes game of chicken), the machines flashed tens of millions of dollars of bids and offers, challenging one of their number to “cross the spread.”
The coincidental intersection in space and time on that fateful Friday in Chicago was a microcosm of a larger transition. Here was the dawn of the information age inscribed onto the interstices of suburban America, without pomp or circumstance. Humanity is progressing from the manipulation of matter to the manipulation of bits, and we are starting to take our world with us. For the first time in the history of our planet, we humans are producing a computational load that is beginning to faintly register in comparison to the staggering sum total of the Earth’s biological computation—and critically, our rate of computation is climbing. As it continues to do so, we are not only changing the nature of our daily lives, but the physical appearance of our planet itself. If current trends continue, our planet, as seen from outer space, will begin to take on the characteristics of an enormous computing machine. And if it’s happening to us, can it have happened to another civilization among the stars?
The information age merges human-directed activity into the planet’s 4 billion-year-old information economy. Earth’s biosphere, when considered as a whole, constitutes a global, self-contained infrastructure for copying the digital information encoded in strands of DNA. Every time a cell divides, roughly a billion base pairs are copied, with each molecular transcription entailing the equivalent of about 10 bit operations. Using the rule of thumb that the mass of a cell is a nanogram, and the estimate that the Earth’s yearly wet biomass production is 10¹⁸ grams, we arrive at a biological computation of 3×10²⁹ bit operations per second. This is close to the amount of artificial computation we carry out within the span of a year. While the production of biomass has stayed roughly constant for hundreds of millions of years, artificial computation is increasing exponentially.
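The arithmetic behind this estimate can be checked in a few lines of Python. The inputs are the article’s own rules of thumb, not measured constants:

```python
# Back-of-envelope check of the biosphere's computational throughput,
# using the article's rules of thumb.

SECONDS_PER_YEAR = 3.15e7

wet_biomass_per_year_g = 1e18   # Earth's yearly wet biomass production, grams
cell_mass_g = 1e-9              # ~1 nanogram per cell
base_pairs_per_division = 1e9   # ~a billion base pairs copied per cell division
bit_ops_per_base_pair = 10      # ~10 bit operations per molecular transcription

cell_divisions_per_year = wet_biomass_per_year_g / cell_mass_g
bit_ops_per_year = (cell_divisions_per_year
                    * base_pairs_per_division
                    * bit_ops_per_base_pair)
bit_ops_per_second = bit_ops_per_year / SECONDS_PER_YEAR

print(f"{bit_ops_per_second:.1e} bit operations per second")  # ~3.2e29
```

The product of these four order-of-magnitude guesses lands within rounding distance of the 3×10²⁹ figure quoted above.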
From the shuttering of Fermilab’s Tevatron to the rise of electronic markets, so begins the transition from the four-letter biological alphabet of base pairs to a basis of zeros and ones. The trend will accelerate as computation migrates to locations where electricity is cheap, and where cooling is effectively free. Google has sited a major data center along the Columbia River in Oregon, and large-scale bitcoin mining has migrated to Iceland, where the hydroelectric power is inexpensive, data privacy regulations are favorable, and arctic air howls in abundance. Currently, global energy expenditure on computation is somewhat less than 1 percent of global energy use, with the data centers themselves consuming about 2 percent of the world’s electricity production. As computational devices—cell phones, sensors, tablets—proliferate, these numbers will grow.
A little-discussed consequence of our exponentially growing computational appetite is its effect on how we search for extraterrestrial life. The 1997 movie Contact tells the story of radio telescopes picking up a signal from a distant, alien civilization. That civilization had detected the first television transmissions from Earth, and rebroadcast them back to us, along with instructions to build a machine. But the idea that we might be discovered by (or discover) an alien species through stray electromagnetic emissions is being made increasingly unlikely by the information revolution.
Distant civilizations that happen to glance in our direction will likely not sense the faint traces of our radio emissions or our television shows. From a purely energetic perspective, Earth’s signal leakage into space is declining as transmissions become ever-higher bandwidth, and less isotropic, and as broadcasts migrate from the air to the growing worldwide mesh of fiber optic cables. While the total waste heat from the world’s current data centers could, in theory, be detected against a black background at distances well beyond the outer reaches of the solar system, we could not (as of now) detect any persistent artificial signal emanating from Earth if we were located at the distance of Alpha Centauri, the nearest stellar system. As we become more expert at communicating amongst ourselves, our waste signature will continue to decline.
Eventually, however, if we continue on our present exponential increase in bit operations, a second possibility will present itself: Our planet’s physical surface will entirely rework itself in the service of computation, and the consequences, from waste heat to other possible signatures such as atmospheric transformation, will generate a footprint that increasingly becomes astronomically observable.
To see why, consider how we compute. Our computations occur in a thin film on the Earth’s surface, whereas the bulk of the Earth’s mass is buried, mute and inert, in the deep interior. The data centers of the world are carrying out 3×10²⁹ bit operations per year, and are consuming the equivalent of all of the sunlight absorbed by a patch of ground roughly half the size of Rhode Island—a millionth of the Earth’s surface area. If we continue to expand our computational resources at the present rate, and if the energy efficiency of our computation does not increase, we will be computationally competitive with the biomass in slightly more than 20 years. To effect this transformation, the planet would likely have to entirely change the appearance of its surface, and such a change could potentially be visible from vast distances. Crudely put, it might appear to another species peering at us through a telescope as if our planet were entirely sheathed not by forests and grassland, but by black photovoltaic panels. Rather than searching for extremely faint radio emissions, might distant civilizations be on the lookout for shrill, astronomically observable consequences befalling worlds that transition from one form of computation to another?
Of course, the assumption that the energy efficiency of our computation will stay constant is wrong. In fact, energy per bit operation has been falling by a factor of two every five years. But even at this rapid rate of improvement, we will increase our energy consumption by a factor of 1 million in less than 50 years. In addition, while there is, in principle, no limit to our computational desires, there is a limit to the maximum efficiency with which computations can be performed. Landauer’s Principle posits that the minimum possible energy required to carry out a single bit operation is E = kT ln(2), where k is Boltzmann’s constant and T is the temperature.
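Both numbers in this paragraph can be sketched directly. The Landauer energy follows from Boltzmann’s constant at room temperature; the growth figure follows if one assumes, as an illustration, a Moore’s-law-style 1.5-year doubling time for total computation set against the five-year halving time for energy per bit quoted above:

```python
import math

# Landauer limit: minimum energy for one bit operation at temperature T.
k_B = 1.380649e-23                    # Boltzmann's constant, J/K
T = 300.0                             # room temperature, K
e_min = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {e_min:.2e} J per bit op")  # ~2.87e-21 J

# Net growth of computing's energy appetite, assuming (hypothetically) that
# total computation doubles every 1.5 years while energy per bit operation
# halves every 5 years.
compute_doubling_yr = 1.5
efficiency_doubling_yr = 5.0
net_doublings_per_yr = 1.0 / compute_doubling_yr - 1.0 / efficiency_doubling_yr
years_to_millionfold = math.log2(1e6) / net_doublings_per_yr
print(f"Energy use grows a millionfold in ~{years_to_millionfold:.0f} years")
```

Under those assumed doubling times, the millionfold increase arrives in roughly 43 years, consistent with the “less than 50 years” figure above.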
The “Fermi Paradox” refers to the puzzling disconnect between the apparent ease with which life and intelligence developed on Earth, and the complete lack of evidence that similar chains of events happened elsewhere. We haven’t detected any alien civilizations, and we don’t seem to have any alien visitors. There is no shortage of proposed solutions to the paradox: Inhabited planets might be incredibly rare. Civilizations might destroy themselves as soon as they become technically capable. We might have been placed under galactic quarantine. The list of possibilities, most of them interesting, is long.
But as we look for civilizations among the stars, it might be useful to keep our own ongoing computational transformation in mind. We are reaching the threshold where the quantity of artificial computation impinges on its biological analog. If an alien species were another few decades or centuries along this trajectory, it would leave behind a distinct heat signature from computation.
Indeed, it might even expand its computational load to the point where it uses the entire energy output of its parent star to generate and process information. The inevitable result, a star’s worth of waste-heat glow, already has a name: a Dyson Sphere. Such a system, operating at the room temperature Landauer limit, and consuming the entire output of a sun-like star, would be capable of a staggering 10²⁵ times more computation than we are currently doing here on Earth. Nevertheless, if Moore’s Law (or one of its generalizations) continues to hold, we will arrive at this state of affairs in a mere 125 years or so—barely longer than a human life span.
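These two figures follow from the Landauer limit and the solar luminosity. A rough check, assuming a Moore’s-law doubling time of 1.5 years for total computation:

```python
import math

# Rough check of the Dyson Sphere comparison. Solar luminosity and the
# doubling time are standard figures; the rest are the article's estimates.
k_B = 1.380649e-23
T = 300.0
e_min = k_B * T * math.log(2)          # Landauer limit, J per bit op

L_sun = 3.8e26                         # solar luminosity, W
dyson_ops_per_s = L_sun / e_min        # ~1.3e47 bit ops/s

current_ops_per_s = 3e29 / 3.15e7      # 3e29 bit ops/yr, in bit ops/s
ratio = dyson_ops_per_s / current_ops_per_s
print(f"Dyson Sphere vs. current Earth: ~{ratio:.0e}")        # ~1e+25

doubling_time_yr = 1.5                 # assumed Moore's-law pace
years = math.log2(ratio) * doubling_time_yr
print(f"Time to close the gap at that pace: ~{years:.0f} years")  # ~125
```

Closing a factor-of-10²⁵ gap requires about 83 doublings, which at one doubling every 1.5 years takes roughly 125 years.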
Dyson sphere computers, shining in infrared light, would be visible from across the galaxy, thereby presenting a much more realistic signpost of life than stray communications emissions. If such things exist, they have already been spotted by NASA’s recently completed WISE Mission, which performed an infrared survey of the whole sky in four different infrared bands. In fact, its data show a galactic disk teeming with potential Dyson spheres. The problem, however, lies with making an unambiguous identification. Extraordinary claims require extraordinary evidence, and there are many mundane objects, such as soot-obscured red giants and dust-enshrouded protostars, that outwardly resemble what one would expect a Dyson Sphere to look like.
We might draw a clue from the kind of computation that we are engaging in. The transition that occurred on either side of Interstate-88 was part of a general arms race in the service of gaining financial advantage. Could it be that this also happened to other civilizations? Could there be a galactic exchange? If so, what does one trade, and how does one connect?
Gregory Laughlin is a professor of astronomy and astrophysics at the University of California, Santa Cruz. He is co-author of The Five Ages of the Universe—Inside the Physics of Eternity, and he blogs at oklo.org.