Facebook Data Center

Prineville, Oregon
Region: ENR California

Project Team

Owner Facebook, Palo Alto, Calif.

Architect Sheehan Partners, Chicago

General Contractor DPR Construction, Redwood City, Calif.; Fortis Construction, Portland, Ore.

Engineers Alfa Tech, San Jose, Calif.; WHPacific, Bend, Ore.; Peoples Associates Structural Engineers, Milpitas, Calif.

Consultant Brightworks Sustainability Advisors, Portland, Ore.

Subcontractor Rosendin Electric, San Jose, Calif.

The Facebook Data Center is a project of many firsts. It was Facebook's first real estate purchase. It was the first data center to be designed, owned and operated by the social networking software giant. It is Facebook's first LEED Gold-certified building and one of the most energy-efficient data centers ever built.

In another first, Facebook is giving away the project's design secrets. In a nod to its hacker roots and its embrace of open-source software, Facebook last year launched Opencompute.org, a repository of technical specifications and CAD drawings for the sustainable data center's mechanical and electrical systems, battery cabinets, rack systems and servers.

A key decision in making the 320,000-sq-ft project more sustainable than Facebook's leased facilities was to integrate a new server design with the center's overall design, says Tom Furlong, director of site operations. To reduce power consumption, servers were custom built to eliminate unneeded components usually found in off-the-shelf equipment, such as video cards and multiple network interfaces.

In another deviation from the norm, designers cut out a stage of power transformers and used a higher voltage in the facility. "If you want to focus on efficiency in the electrical system, you want to limit those power conversions or make them with the most efficient components you can," Furlong says. In typical data centers, up to 25% of power is lost in these conversions; the rate was reduced to just 7% at Prineville.
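These savings compound because each conversion stage multiplies its loss into the whole chain. The sketch below uses hypothetical per-stage efficiencies, chosen only to illustrate the 25% and 7% totals cited above rather than any published breakdown, to show how eliminating a transformer stage changes end-to-end efficiency:

# Illustrative only: end-to-end efficiency of a power chain is the product
# of each conversion stage's efficiency, so dropping a stage or using a more
# efficient component compounds through the whole chain. Stage values are
# hypothetical, picked to land near the 25% and 7% loss figures in the text.

def chain_efficiency(stages):
    """Multiply per-stage efficiencies to get the delivered fraction of power."""
    total = 1.0
    for eff in stages:
        total *= eff
    return total

# Hypothetical stages: utility transformer, UPS, PDU transformer, server PSU.
typical = [0.97, 0.94, 0.90, 0.915]
# Hypothetical simplified chain: fewer conversions at a higher distribution voltage.
prineville_style = [0.98, 0.995, 0.954]

for name, stages in [("typical", typical), ("Prineville-style", prineville_style)]:
    eff = chain_efficiency(stages)
    print(f"{name}: {eff:.0%} delivered, {1 - eff:.0%} lost in conversion")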

Much of the project's energy savings results from an innovative yet simple mechanical system that relies on evaporative cooling. Most data centers are noticeably chilly, some as cool as 55° F. At Facebook's high-desert site in central Oregon, the cold aisles are designed to operate at much higher temperatures. "In 2009, when we were doing this data center's design, the ASHRAE standard was changed to 80.6° F inlet temperature, but no one wanted to step up and be the first to adopt that new code," says Jay Park, the project's design engineer. "We took a big step and got there, proving that the [new] standard works—and we went even further." Prineville's second phase, along with Facebook's recently completed North Carolina data center, will operate using 85° F inlet air.

To cool the air, the mechanical system simply injects water mist as needed after the outside air is drawn through a massive bank of dust filters. "Prineville's high-desert location means they are able to use outside air routinely most times of the year, with evaporative cooling used during the summer months," says Eric Lamb, executive vice president at Redwood City, Calif.-based DPR Construction, which built the project in a joint venture with Portland, Ore.-based Fortis Construction. Variable-frequency-drive-equipped fans work in concert with onboard server fans to draw air over the server components at the ideal rate.
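The cooling principle itself is simple physics: evaporating mist into dry desert air pulls the air temperature down toward its wet-bulb temperature. A minimal sketch of direct evaporative cooling, using illustrative numbers rather than Facebook's design data:

# Direct evaporative cooling: supply temperature approaches the outdoor
# wet-bulb temperature; "effectiveness" models how close the misting section
# gets to full saturation. All numbers below are illustrative assumptions.

def evaporative_supply_temp(dry_bulb_f, wet_bulb_f, effectiveness=0.9):
    """T_supply = T_dry_bulb - effectiveness * (T_dry_bulb - T_wet_bulb)."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# Hypothetical hot, dry summer afternoon in the high desert.
outdoor_dry_bulb = 95.0   # deg F
outdoor_wet_bulb = 62.0   # deg F

supply = evaporative_supply_temp(outdoor_dry_bulb, outdoor_wet_bulb)
print(f"Supply air: {supply:.1f} F")   # roughly 65 F, well below an 85 F inlet limit

Because desert air is dry, even a hot afternoon leaves a wide gap between dry-bulb and wet-bulb temperatures, which is why misting alone can keep inlet air within the facility's limits for most of the year.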