Waste looks like the pile of plastic, Styrofoam, and cardboard that your tiny new earphones came in. It looks like plastic bottles overflowing in the waste bin beside a water fountain. It looks like a Hummer occupied by a single person. But few people think about waste when they post kitty photos on Facebook. The clusters of computers at data centers that power websites live far away from the people who use them. As a result, the energy they consume remains out of sight and out of mind.

Until recently, the executives of most companies served by data centers did not give much thought to energy either. High electricity bills were taken as a sign of thriving business. And by that measure, business is great. Facebook users upload 350 million photos, send more than 10 billion messages, and click nearly 5 billion likes every day. All of that activity requires enough electricity to power a city of 155,000 people. Google’s data centers could power a metropolis of 750,000, larger than Boston, Seattle, or Denver. Globally, power consumption at data centers rose by 63 percent last year, to 38 gigawatts, a demand on par with the electricity usage of New Zealand or Colombia.1

But the costs of processing data have now become so high that an efficiency advantage can translate into a significant competitive edge, leading many high-tech firms to start relatively low-tech development programs focused on one thing: leaner data centers.

The goal is not an easy one. A New York Times investigation found that only 6 to 12 percent of the electricity used to power servers went to computation. Much of the rest went to keeping servers idling in case a rush of activity arrived, and to backup systems and air conditioning. Even momentary service interruptions can be expensive. Sears, for example, stated that it lost $1.6 million after electrical problems shut its data center for five hours during the post-holiday rush on Jan. 3, 2012. A mere five-minute outage at Google in 2013 cost the company $545,000. And an Amazon outage the same year cost $1,100 in sales per second, according to BuzzFeed.
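
Put on a common footing, those three outages differ widely in how fast they burned money. Here is a quick back-of-the-envelope comparison in Python; the dollar figures come straight from the reports above, and only the per-minute arithmetic is added:

```python
# Normalize the outage costs quoted above to dollars per minute.
outages = {
    "Sears ($1.6M over a 5-hour outage)": 1_600_000 / (5 * 60),
    "Google ($545K over a 5-minute outage)": 545_000 / 5,
    "Amazon ($1,100 in sales per second)": 1_100 * 60,
}
for label, per_minute in outages.items():
    print(f"{label}: ~${per_minute:,.0f} per minute")
```

By this yardstick, the Google and Amazon outages drained revenue more than an order of magnitude faster than the Sears one.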

With so much to lose, the price of excess energy seems worth it. Operators add redundant fleets of servers to their data centers, which chew up power but do nothing, waiting to take over in case a server fails. This is about as efficient as running the air conditioner full blast while on vacation. But in the data center world, duplication is not waste, it is reliability. “The risk of something going down is not worth the savings,” says Dennis Symanski, a senior project manager at the Electric Power Research Institute (EPRI), a think tank for power utilities. “They want their facilities running 99.995 percent of the time.”

There is waste, too, in cooling and maintenance services. Thousands of computers running in a single room cause temperatures to rise and computers to fail. To prevent this, giant air conditioners push cold air through floor vents and directly through the servers themselves, where it is heated and passed out as hot air.

All of this means only a fraction of energy goes to the computing services that power the websites people actually use. A commonly used measure of this fraction is power usage effectiveness, or PUE. A PUE of 1.0 means that all of a data center’s energy goes to in-use computing equipment; a PUE of 2.0 would mean that only half of it does. When the Uptime Institute, a New York-based data center industry group, first polled members in 2007, it found the average PUE was 2.5. In other words, members burned 1.5 kilowatt-hours of electricity for every 1 kilowatt-hour they used for computing.
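
To make the arithmetic concrete, here is a minimal sketch in Python; the 2.5 figure is the survey average above, and the 1,000 kilowatt-hour workload is hypothetical:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy."""
    return total_facility_kwh / it_equipment_kwh

def overhead_kwh(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Energy spent on everything except computing: cooling, idling, backups."""
    return total_facility_kwh - it_equipment_kwh

# At the 2007 survey average of 2.5, a hypothetical 1,000 kWh of computing
# drags along 1,500 kWh of overhead.
it = 1_000.0
total = it * 2.5
print(pue(total, it))           # 2.5
print(overhead_kwh(total, it))  # 1500.0
```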

Energy expenditures finally got under Walt Otis’ skin two years ago. Otis manages the data center of a professional services firm with offices around the world. Business was up, and Otis wanted to expand, but his 1,500 servers had pushed his cooling system to the max. Adding more would have overloaded it entirely.

This led Otis to look at his data center in a different way. He noticed, for example, that the air conditioning vents were positioned just above the areas where the computers vented hot air, so the exhaust warmed the chilled air before it could cool the servers. The problem was simple enough to fix. He moved the air conditioning vents away from the computer exhaust and hung plastic drapes over the servers and at the ends of aisles to keep chilled air from mixing with warm air.

With this simple change, Otis was able to dial up his thermostats by 7 degrees Fahrenheit. He did it one evening when no one was around. “The techs would have freaked out if I even talked about it because they were afraid the servers would heat up and fail,” says Otis, “but nobody even noticed because the temperature was still cooler than the heated air that was reaching the servers before.” The move cut Otis’ electric bill by nearly half, and he expanded his data center.

At Facebook, technicians also isolated server exhaust from cooler air. Then they made a more radical change: They threw away their air conditioners. The lore among technicians was that servers must operate between 64 and 70 degrees Fahrenheit to prevent overheating. It took years to realize that this was a holdover from earlier times. Newer servers can run as hot as 95 degrees Fahrenheit, says Jay Kyathsandra, Intel’s manager of data center efficiency solutions. That means that blowing in ambient outdoor air can replace air conditioning.
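
The decision logic behind such “free cooling” is simple. The sketch below assumes the 95-degree inlet ceiling cited above; the 5-degree safety margin and the function names are illustrative, not any vendor’s actual controls:

```python
# Air-side economizer sketch: use filtered outdoor air whenever it is
# cold enough to keep server inlets below the allowed maximum.

MAX_INLET_F = 95.0     # ceiling for newer servers, per the Intel figure above
SAFETY_MARGIN_F = 5.0  # hypothetical buffer for sensor lag and hot spots

def can_use_free_cooling(outdoor_temp_f: float) -> bool:
    """True if ambient air alone can hold server inlets under the limit."""
    return outdoor_temp_f <= MAX_INLET_F - SAFETY_MARGIN_F

for temp in (55.0, 88.0, 96.0):
    mode = "free cooling" if can_use_free_cooling(temp) else "mechanical cooling"
    print(f"{temp:5.1f} F outdoors -> {mode}")
```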

Lawrence Berkeley National Lab’s supercomputer center in California has made this change, but relies on water instead of outside air. It starts by letting warm water cascade through large towers, where some of the water evaporates and lowers the temperature of the remaining water. The water is then used to cool the hot air in the equipment room.
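
A rule of thumb for this evaporative step: a cooling tower can chill water to within a few degrees, called the approach, of the outdoor wet-bulb temperature. Here is a minimal sketch, with a typical textbook 7-degree approach assumed rather than the lab’s actual figure:

```python
def tower_leaving_water_f(wet_bulb_f: float, approach_f: float = 7.0) -> float:
    """Approximate temperature of water leaving an evaporative cooling tower."""
    return wet_bulb_f + approach_f

# On a dry 90 F day the wet-bulb temperature might sit near 65 F, so the
# tower can still deliver roughly 72 F water to cool the equipment room.
print(tower_leaving_water_f(65.0))  # 72.0
```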

The idea can also be run in reverse: At the National Renewable Energy Laboratory’s data center in Boulder, Colo., warm air wafting off the servers is funneled into adjacent buildings to heat rooms, and also onto nearby sidewalks and parking lots to melt snow in the winter. 

Modern data centers are also reducing server redundancy. Rather than have each server handle a single operation, like streaming video or email, today’s servers can act as multiple independent servers each handling a different operation. “We can use the same piece of hardware to run 5, 10, or 15 applications instead of one [application],” says John Pflueger, Dell’s representative to Green Grid, an organization founded to improve data center efficiency.
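
The savings come from packing many small workloads onto each machine. The toy sketch below uses first-fit-decreasing bin packing to consolidate fifteen lightly loaded applications; the loads are hypothetical, and the heuristic is a standard illustration, not Dell’s or Green Grid’s method:

```python
def consolidate(loads: list[int], capacity: int = 100) -> list[list[int]]:
    """Pack loads (percent of one server's capacity) onto as few servers
    as the first-fit-decreasing heuristic allows."""
    servers: list[list[int]] = []
    for load in sorted(loads, reverse=True):
        for server in servers:
            if sum(server) + load <= capacity:
                server.append(load)
                break
        else:
            servers.append([load])  # nothing has room; bring up another server
    return servers

# Fifteen applications that once ran on fifteen separate servers:
loads = [10, 5, 20, 15, 8, 12, 30, 7, 25, 5, 10, 18, 9, 6, 11]
packed = consolidate(loads)
print(f"{len(loads)} applications fit on {len(packed)} servers")
# -> 15 applications fit on 2 servers
```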

These measures are having an effect. In 2012, the Uptime Institute found that average PUEs had fallen from 2.5 to below 1.9, and the largest data centers have shown some of the most dramatic improvements. Facebook, for example, had an average PUE of 1.09 in 2012. Google’s most efficient server farm checked in at 1.06. In the data center world, energy savings have become as visible as a garbage can brimming with trash, and companies are cleaning up their act.
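
In overhead terms, the improvement is stark. Simple arithmetic on the PUE figures above, using only the numbers in the text:

```python
# Overhead burned per kilowatt-hour of useful computing at each reported PUE.
for pue in (2.5, 1.9, 1.09, 1.06):
    print(f"PUE {pue}: {pue - 1.0:.2f} kWh of overhead per kWh of computing")
```

At the 2007 average, every kilowatt-hour of computing carried 1.5 kilowatt-hours of overhead; at Facebook’s 1.09, the same work carries less than a tenth of one.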

Alan Brown is a freelance writer focusing on the intersection of science and engineering.
