Facebook's state-of-the-art data center houses awesome amounts of computing power, but the biggest technical challenge has been the air handlers.
The company said today that its Prineville, Ore., data center received LEED Gold certification from the U.S. Green Building Council. The facility's power usage effectiveness (PUE) rating varies between 1.06 and 1.1, meaning it consumes roughly half the energy of a data center built simply to code.
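PUE is the ratio of total facility power to the power actually delivered to the computing gear; a rating of 1.0 would mean zero overhead. A back-of-the-envelope calculation (with illustrative numbers, not Facebook's published figures) shows how a PUE near 1.07 translates into the "about half" savings claim against a conventional facility with a PUE around 2.0:

```python
def pue(total_facility_kw, it_load_kw):
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_load_kw

# Illustrative case: a 10 MW IT load.
it_load = 10_000  # kW
prineville_total = it_load * 1.07  # midpoint of the reported 1.06-1.1 range
code_built_total = it_load * 2.0   # assumed PUE for a typical built-to-code facility

savings = 1 - prineville_total / code_built_total
print(f"Facility draw: {prineville_total:,.0f} kW vs {code_built_total:,.0f} kW")
print(f"Energy savings: {savings:.0%}")  # a bit under half
```

The PUE of 2.0 for the comparison building is an assumption chosen for round numbers; industry surveys of the era put typical facilities in that neighborhood.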
The significance of the Prineville center could go beyond lower energy bills and lower emissions for its owner, though. Facebook started the Open Compute consortium to share--and improve upon--its data center designs in an open-source-like effort to advance efficient data center techniques across the industry.
To use power more efficiently, the building's power supply sends direct current to servers, saving energy by cutting down on conversions between the alternating current from the grid and the direct current the hardware uses. The servers themselves are custom-made to tolerate higher temperatures, which reduces the need for air cooling.
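Each power conversion stage wastes a few percent, and the losses multiply along the path from the grid to the server. The sketch below uses hypothetical stage efficiencies (not Facebook's published numbers) to show why removing conversion stages matters:

```python
from functools import reduce

def chain_efficiency(stages):
    """Overall efficiency of a power path is the product of its stage efficiencies."""
    return reduce(lambda acc, eff: acc * eff, stages, 1.0)

# Illustrative stage efficiencies (assumptions for the sake of the example):
conventional = chain_efficiency([0.94, 0.97, 0.92])  # UPS, PDU/transformer, server PSU
simplified   = chain_efficiency([0.99, 0.94])        # fewer conversions, efficient PSU

print(f"Conventional path: {conventional:.1%} delivered to the hardware")
print(f"Simplified path:   {simplified:.1%} delivered to the hardware")
```

Even a few percentage points of delivered power add up at data center scale, since every watt lost in conversion must also be removed by the cooling system.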
But one of the biggest breaks with conventional design is an airflow system built around evaporative cooling of outdoor air. Cooling normally accounts for about half of a data center's energy use.
Rather than using a raised floor to pump air-conditioned air toward servers, Facebook built an elaborate airflow system that takes advantage of Oregon's relatively cool air. That outdoor air, with the help of sprayed water, does all the cooling.
A penthouse above the server racks takes in outdoor air, which is filtered and then cooled by a bank of water misters. That cooled, humidified air is fed down onto the data center's concrete floor to cool the racks.
Hot air is collected and exhausted on warm days. But in the winter, heat from the data center gear is mixed with incoming outdoor air to warm the room.
One of the challenges, according to a blog post today, was a buildup of condensation, which in June caused a power supply to overheat and servers to shut down. The root of the problem was the control system that adjusts the air handlers during rapid temperature swings, which Facebook tweaked to address the issue.
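The post doesn't detail the fix, but a common way to keep a control loop from chasing rapid weather swings is to limit how far a setpoint may move per control cycle. The sketch below is purely illustrative of that rate-limiting idea, not a description of Facebook's actual controls:

```python
def rate_limited(target, previous, max_step=0.5):
    """Move a setpoint toward its target by at most max_step per control cycle,
    so brief temperature swings can't whipsaw the air handlers."""
    delta = target - previous
    delta = max(-max_step, min(max_step, delta))
    return previous + delta

# A sudden 5-degree demand change gets applied gradually, half a degree per cycle.
value = 20.0
for _ in range(4):
    value = rate_limited(25.0, value)
print(round(value, 1))  # 22.0 after four 0.5-degree steps
```

Damping the controller's response this way trades a slower reaction for stability, which matters when overshooting toward the misters can push humidity into condensation territory.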
The building itself uses several other efficiency tricks, including a networked LED lighting system with motion sensors, and a 110-kilowatt solar array that provides power to the offices.
Facebook has touted its super-efficient data center in response to complaints from advocacy group Greenpeace, which launched a campaign to pressure Facebook to lower its reliance on coal power.