Thursday, August 4, 2011

Social Desert: Entrepreneurial Platform Youths Can Grasp

Why did Facebook build its first wholly owned server-operations facility in what may quite reasonably be called the middle of nowhere? In the first of a series of posts from the social network's desert redoubt, Babbage will try to find an answer.

The centre opened in April 2011, and is the model for one being built in North Carolina and scheduled to open in 2012. The Prineville operation is nearing completion of a second phase, doubling its server capacity by filling the second half of its first building. During Babbage's visit, the company told the city it would shortly begin work on a second building.

Facebook did not pick the spot for the view, though the landscape doubtless holds a desolate allure. Nor was the choice about inexpensive power. As the crow flies, Prineville, Oregon, is a full 150km (93 miles) south of the Columbia River, where Google opened a data centre a few years ago. Google was an early settler along the Columbia, drawn in part by cheap (and appealingly green) hydroelectric power, though availability of tax incentives, land and labour also played a part. Other companies, including Dell, Intuit, Microsoft and Yahoo!, set up shop in Quincy, upriver in Washington state, for similar reasons.

Meanwhile, Prineville's electrical utility, Pacific Power, derives most of its electricity from coal—a whopping 63% across its multi-state system (although it plans to halve that in the next decade, and has invested heavily in wind power). The coal component is not unusual. America as a whole generated 45% of its power from coal in 2009, according to the US Energy Information Administration. Firms like Google may trumpet centres placed along a scenic river—the information superwaterway—but most server farms burn a good amount of coal in their power mix. Facebook could have jumped on the hydro bandwagon, paid less for power, and burnished its green credentials to boot. Why didn't it?

Labour was certainly not the deciding factor. Prineville, a town of 10,000 or so, has an unemployment rate of 16%, well above the national average of around 9%. But Facebook had to bring in its own specialists, and needed to hire only around 45 local workers, just some of whom required extensive training. The firm could have done this anywhere. The same is true of the daily average of 250 construction workers (1,400 in total so far) the firm has employed during the building phases; idle builders are not exactly in short supply in America at the moment. Tax incentives certainly played a part—the state will forgo $2m a year in taxes for 15 years to lure the firm—but similar perks are on offer elsewhere, and Facebook will still get a hefty local tax bill.

The reason, the centre's boss Ken Patchett explains, is the weather. At first blush, this seems odd. Temperatures in Prineville routinely drop to -5°C (23°F) in the winter and climb to 32°C (90°F) in the summer. And the desert clime means that swings of 28°C (50°F) between day and night are not unheard of around this time of year. That ought to make keeping the servers at a steady 20–25°C (68–77°F) and 40–55% relative humidity an arduous task.

In fact, the Prineville plant is a leading exponent of a new style of data-centre management. It does away with expensive air-conditioning "chillers". Instead, air is brought in from outside. For this approach to work, however, the desert is key. For much of the year outside air is actually cool enough to keep the servers from overheating. At the lowest temperatures, only the gentlest of breezes needs to be brought inside. And, this being the desert, nights are chilly irrespective of the season, so even in the summer additional cooling is needed only during the hottest times of day.
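
As a rough illustration of how such a decision might work, here is a minimal Python sketch assuming the 20–25°C inlet band quoted above; the thresholds, the mode names, the exhaust-blending detail and the cooling_mode function are hypothetical rather than Facebook's actual control logic.

```python
# Illustrative sketch only: thresholds and mode names are assumptions, not
# Facebook's control system. The 20-25 degC band is the server-inlet target
# quoted in the article.

TARGET_LOW_C = 20.0    # lower bound of the desired server-inlet temperature
TARGET_HIGH_C = 25.0   # upper bound of the desired server-inlet temperature

def cooling_mode(outside_temp_c: float) -> str:
    """Pick a (hypothetical) operating mode from the outside-air temperature."""
    if outside_temp_c < TARGET_LOW_C:
        # Cold desert air: only a trickle of outside air is needed, blended
        # with warm exhaust so the inlet does not fall below the target band.
        return "trickle of outside air, blended with warm exhaust"
    if outside_temp_c <= TARGET_HIGH_C:
        # Outside air alone already sits inside the target band.
        return "outside air only"
    # Hot afternoons: the evaporative misting ("swamp cooler") stage kicks in.
    return "outside air plus evaporative misting"

for t in (-5, 10, 22, 32):
    print(f"{t:>3} degC outside -> {cooling_mode(t)}")
```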

This is provided through a "swamp cooler". A fine mist of water, much like artificial fog, is sprayed in the direction of the air flow. Heat is leached from the air by the water as it evaporates, and the cooled air passes through the servers. Water that does not evaporate is captured and reused. Babbage admired the ventilation system—reminiscent in scale and appearance of one Mr Wonka's fizzy-lifting drink chambers—on a day when outside temperatures hovered around 26°C. Only two of the seven misting systems were fired up, yet the inside of the ventilation area and the ground-floor data centre were both quite comfortable.
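
A back-of-the-envelope way to see how much headroom the misting buys is to note that the outlet of a direct evaporative cooler approaches the wet-bulb temperature of the incoming air. The sketch below assumes an 80% cooler effectiveness and a 14°C wet-bulb temperature for dry desert air; both figures are illustrative, not Prineville measurements.

```python
# Back-of-the-envelope estimate of direct evaporative ("swamp") cooling: the
# outlet temperature approaches the wet-bulb temperature of the incoming air.
# The 80% effectiveness and the 14 degC wet-bulb figure are assumptions.

def evaporative_outlet_temp(dry_bulb_c: float,
                            wet_bulb_c: float,
                            effectiveness: float = 0.8) -> float:
    """Outlet temperature of a direct evaporative cooler.

    effectiveness = (T_in - T_out) / (T_in - T_wet_bulb)
    """
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

# A 26 degC afternoon, as on Babbage's visit, with dry desert air assumed to
# have a wet-bulb temperature of about 14 degC:
print(evaporative_outlet_temp(26.0, 14.0))   # -> 16.4, well inside the band
```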

The servers themselves are unusual, too, and require less coddling. They are designed and tested to work across a wider range of ambient temperature and humidity than their forebears: -5°C to 45°C (23°F to 113°F) and anywhere between 10% and 90% relative humidity. Unlike most mass-produced rack-mounted computers meant for corporate server rooms and larger data centres, Facebook's kit is naked at the front and top, with bigger fans and a smarter motherboard layout. All connections, including power, are at the front, leaving the entire rear free for outflow. Everything is done to ensure the best airflow across circuits, drives and power supplies. To reduce the amount of power lost as heat, servers are run at a higher voltage than is typical (277 volts instead of 110 or 220), requiring less power-conversion gear. Elements of these designs are becoming commonplace in large-scale server facilities, but Facebook appears to be using all of them at once—and crowing about it.
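
For concreteness, a check against that rated envelope amounts to nothing more than the range test below; the helper is hypothetical and the limits are simply the figures quoted above.

```python
# Trivial check of ambient conditions against the rated envelope quoted above
# (-5 to 45 degC, 10-90% relative humidity). A hypothetical helper, not
# Facebook's monitoring code.

def within_server_envelope(temp_c: float, rel_humidity_pct: float) -> bool:
    """True if the conditions fall inside the servers' rated operating range."""
    return -5.0 <= temp_c <= 45.0 and 10.0 <= rel_humidity_pct <= 90.0

print(within_server_envelope(32.0, 40.0))   # True: a hot Prineville afternoon
print(within_server_envelope(50.0, 40.0))   # False: too hot even for this kit
```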

As in all modern data centres, racks of gear are organised into cold and hot rows, in which the fronts of servers face each other across aisles through which technicians stroll. Their nether regions vent air across narrower back-to-back gaps. Open ceilings rise above the hot rows and into the ventilation system, allowing for some passive convection. The server racks are nearly silent, and their internal fans whirr almost imperceptibly. The only exceptions are the network switches, which, Facebook's staff note, are perversely designed by even the biggest firms to vent air out of their sides. As a result, they run loud and hot—and are openly sworn at.

This approach leads to dramatically more efficient use of the power that enters the facility. Mr Patchett says—and industry sources agree—that many data centres have a relatively high power usage effectiveness (PUE), a ratio obtained by dividing all the power brought into a building by the amount consumed by the servers and other computing equipment themselves. Typical ratios may be as high as two, meaning that for every watt used to run the servers an extra watt is employed to maintain the right running conditions. Internet firms have been pushing that number down through better hardware and building design, but most have seen only small gains.
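
In other words, the ratio is a single division. The kilowatt figures in the snippet below are invented purely to reproduce the watt-for-watt example just described.

```python
# Power usage effectiveness is simply total facility power divided by the
# power drawn by the computing gear itself. The kilowatt figures are made up
# to reproduce the watt-for-watt example above.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = all power entering the building / power used by the IT gear."""
    return total_facility_kw / it_equipment_kw

# A "typical" centre: 1,000 kW of computing plus another 1,000 kW of cooling
# and other overhead gives a PUE of 2.0.
print(pue(total_facility_kw=2000.0, it_equipment_kw=1000.0))   # -> 2.0
```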

Mr Patchett, who is passionate about airflow, boasts that his centre operates at a PUE of about 1.07, although Facebook has come up with the figure itself—and flaunts it in the lobby as part of a constantly updated set of charts and graphs rotating through a large high-definition display. Google says its dozens of data centres achieved a weighted average PUE of 1.16 over the 12 months ending in March, which means that some of its centres are substantially more efficient than others. Cutting the ratio by 0.01 might mean annual savings of hundreds of thousands of dollars. As a consequence, Facebook reckons that finding the ideal climate will, in the long run, prove more economical than locating close to cheap power.
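
To put rough numbers on that last claim: the saving from a 0.01 cut depends on the computing load and the price of electricity, neither of which the article gives. The sketch below assumes 25MW of IT load and $0.05 per kWh, purely for illustration.

```python
# Rough arithmetic behind "cutting the ratio by 0.01". The 25 MW of IT load
# and the $0.05 per kWh price are assumptions for illustration only.

def annual_savings_usd(it_load_kw: float,
                       pue_drop: float,
                       price_per_kwh: float) -> float:
    """Yearly saving from shaving pue_drop off the PUE at a given IT load."""
    hours_per_year = 24 * 365
    saved_kw = it_load_kw * pue_drop           # overhead power no longer drawn
    return saved_kw * hours_per_year * price_per_kwh

print(annual_savings_usd(it_load_kw=25_000.0, pue_drop=0.01,
                         price_per_kwh=0.05))  # -> about $110,000 a year
```

At those assumed figures a 0.01 cut is worth roughly $110,000 a year; a larger load, a higher tariff or a bigger cut soon reaches the hundreds of thousands mentioned above.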

There are risks to locating in ultima Thule. Facebook even documented some of them in a frank discussion of its mechanical systems as part of the Open Compute Project it announced at the centre's formal opening. Dust storms, grass fires or a volcanic eruption might, for instance, require shuttering the air intakes to protect the servers. With no cooling or airflow, the centre would be forced to go offline until the threat passed. So, just in case, data are redundantly stored across leased centres for now, and across leased and wholly owned facilities in the future.

http://www.economist.com/blogs/babbage/2011/08/data-centres?fsrc=scn/fb/wl/bl/socialdesert
