The Green Data Center
Greening the data center -- and cashing in on the resulting energy savings -- is an achievable goal. Here, two different approaches.
- By Jennifer Grayson
- 06/01/09
EVERY TIME Utah State University remodels a building, the IT planners there ask themselves, "How can we make it greener?" says David Tidwell, team coordinator for IT physical infrastructure at the university. The case was no different when the team took a good hard look at its aging data center, which was in desperate need of an overhaul.
According to Eric Hawley, associate VP for IT, the UPS infrastructure that was in place to provide backup power for the data center was very old, very unstable, and needed to be replaced. The cooling system -- 20-year-old Liebert units that ran on glycol and used an inefficient under-floor air conditioning method -- needed to go as well. "When you have to look at replacing the entire electrical and air conditioning infrastructures, you essentially have an opportunity to rebuild the whole data center," he offers, adding, "Recognizing that we had that opportunity, we set a goal to make it as green as possible."
Hawley, who was the visionary for the project, explains that in addition to its reliability issues, the data center also had reached the limits of its power and cooling capacity; what's more, every inch of the 2,500-square-foot center was being used. "We wanted to increase capacity, but not suck more power off the grid," he explains. Over the years, administrators had become aware of the instabilities in the old data center, especially when the network would fail or the power would drop. Needless to say, everyone at the university was "very, very interested" in understanding what it would take to get improved uptime and greater availability in the data center, and throwing green into the mix was well received by the administration. Fortunately, the renovation of the data center had been planned before the economic downturn, so it was not necessary to increase tuition to fund the project, says Hawley.
Utah State: The Green Overhaul
While building a data center from the ground up can in some cases take two to three years, Utah State's green remodel took a mere six months to implement (Tidwell estimates that the planning stage took eight months), during which time the data center stayed live, serving the university's enterprise system.
One lucky break for the initiative was the existence of the university's chilled water plant, which now feeds the data center's new APC in-row cooling solution. The data center's renovation eliminated all CFC refrigerants, and a large portion of the center's overall green costs are actually attributable to the installation of the chilled water plant four years earlier. In addition, the chilled water facility initiative included the installation of a natural gas-fired co-generation facility that allows Utah State to generate some of its own electricity, and the university also has its own hydroelectric plant.
The plant itself is highly efficient, with economizers that use outside air to chill the water when the weather drops below 50 or 60 degrees. The APC system, too, maximizes efficiency by designating hot and cold aisles in the data center. "In the old system," says Tidwell, "we essentially were trying to cool the entire room, kind of like a meat locker. But with this new system, we're only cooling where we need it." Further enhancing the energy savings is an APC switched rack power distribution unit (PDU), which allows IT to remotely monitor, power down, and hard-boot the equipment. "You can log in from anywhere in the world and turn off one outlet," boasts Tidwell. The 40 percent decrease in power consumption that resulted from all of these measures allowed the university to move more services into the data center, including a large-scale, enterprise-wide e-mail system for faculty and staff (student e-mail is outsourced to Google). Hawley explains that the 40 percent reduction was achieved even while increasing server processing power by 20 percent.
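Tidwell's "turn off one outlet" remark refers to the remote-control capability typical of switched rack PDUs, most of which expose outlet control over SNMP. The following is a minimal sketch of that idea in Python using the pysnmp library; the host name, community string, and outlet number are placeholders, and the OID shown is the rPDUOutletControlOutletCommand object from APC's PowerNet MIB as commonly documented -- verify all of it against your own unit's MIB before use.

```python
# Illustrative sketch: switching off one outlet on an APC switched rack PDU
# over SNMP. Host, community string, outlet number, and OID are assumptions,
# not details from Utah State's deployment.
from pysnmp.hlapi import (
    setCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity, Integer,
)

PDU_HOST = "pdu.example.edu"   # hypothetical PDU address
COMMUNITY = "private"          # SNMP write community (placeholder)
OUTLET = 4                     # outlet number to control
# PowerNet-MIB rPDUOutletControlOutletCommand.<outlet>: 1=on, 2=off, 3=reboot
OID = f"1.3.6.1.4.1.318.1.1.12.3.3.1.1.4.{OUTLET}"

error_indication, error_status, _, _ = next(
    setCmd(
        SnmpEngine(),
        CommunityData(COMMUNITY),
        UdpTransportTarget((PDU_HOST, 161)),
        ContextData(),
        ObjectType(ObjectIdentity(OID), Integer(2)),  # 2 = immediate off
    )
)
if error_indication or error_status:
    print("SNMP set failed:", error_indication or error_status.prettyPrint())
else:
    print(f"Outlet {OUTLET} switched off")
```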
Although it was not included in the overall budget, server virtualization (which was achieved on a piece-by-piece basis and which allows multiple virtual machines to run concurrently on the same physical server) allowed the university to lower energy costs further. The team chose the VMware ESX server virtualization platform, as well as a Dell 10th-generation blade server. Through Dell, the university also procured a large storage area network from EMC.
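To see why consolidation pays, consider the back-of-envelope arithmetic below: a sketch in which every figure (per-server wattage, consolidation ratio, host wattage) is an illustrative assumption, not Utah State's actual data.

```python
# Back-of-envelope consolidation math. All figures are illustrative
# assumptions; only the 230-workload count echoes the article.
PHYSICAL_SERVERS = 230     # workloads that become virtual machines
WATTS_PER_SERVER = 350     # assumed draw of a standalone 1U server
VMS_PER_HOST = 20          # assumed consolidation ratio per blade host
WATTS_PER_HOST = 500       # assumed draw of a fully loaded blade

hosts_needed = -(-PHYSICAL_SERVERS // VMS_PER_HOST)  # ceiling division
before_kw = PHYSICAL_SERVERS * WATTS_PER_SERVER / 1000
after_kw = hosts_needed * WATTS_PER_HOST / 1000

print(f"{hosts_needed} hosts replace {PHYSICAL_SERVERS} servers")
print(f"IT load: {before_kw:.1f} kW -> {after_kw:.1f} kW "
      f"({1 - after_kw / before_kw:.0%} reduction, before cooling savings)")
```

The savings compound, because every watt removed from the racks is also a watt the cooling system no longer has to remove.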
The reduced number of servers means less cooling, and virtualization leaves more room to grow. "We would have had to put up a new building with a new data center, to be able to add more servers," says Hawley. He estimates the cost of a new data center building at $1,000 to $1,200 per square foot -- significantly more costly than the $1.8 million allocated for Utah State's remodel. The size of the data center now: under 1,000 square feet. Hawley estimates that without accounting for high-performance computing, which will probably be managed in collaboration with other universities across the state, Utah State will be set for another 10 years of data center growth. And with more physical space, IT has been able to encourage university groups (any academic, research, administrative, or operational areas or units) to discard the energy-gobbling "renegade" server/window air conditioner combos scattered across the campus, and move their server needs to the center. There are 150 buildings on the Utah State campus, and all of them have fiber feeding back to the data center. Today, the more compact data center translates into a direct management savings: Hawley notes that the IT department is able to manage all of the virtual server infrastructure -- over 230 virtual servers at last count -- with only two full-time employees.
UCSD and Stanford: Modular Green
Not every school has the funding or time to take on such a large-scale green project. "There are a lot of business models where people want smaller data centers located close to clusters of users," says Jud Cooley, senior director of engineering for the Sun Modular Datacenter (Sun MD). Touted as a fully functional data center housed in a 20-foot standard shipping container, the Sun MD takes up only 160 square feet, and comes with eight standard 19-inch racks into which any kind of server equipment can be placed. (Similar products are offered by Rackable Systems and IBM.)
Essentially, says Cooley, "The Sun MD is a box that you hook up water, electricity, and your network connection to; everything else is inside." That includes lighting; the cooling needed for the servers (although cold water must be provided from an external source such as a chilled water facility); a redundant set of electrical wiring to draw power from multiple sources (which also allows for a backup energy source in case the utility goes down); and fire suppression systems to protect people and computer assets. Also built in are instrumentation (sensors that monitor the internal environment and the fire alarm system) and an industrial control system whose software processes that sensor data and controls the cooling. There are, for instance, 40 different points inside the data center where temperature is measured to determine the heat profile of the equipment, and every server and rack can report its energy usage.
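As a rough illustration of what that instrumentation enables, the sketch below aggregates per-rack temperature readings into a simple heat profile and flags hot spots. The rack names, readings, and alarm threshold are invented for illustration; they are not taken from the Sun MD's actual control software.

```python
# Sketch of the kind of monitoring loop a containerized data center's
# control system performs: roll 40 measurement points (here, 5 sensors
# per rack across 8 racks) into a per-rack heat profile.
from statistics import mean

readings = {                 # degrees Fahrenheit, hypothetical values
    "rack-1": [68, 71, 74, 77, 80],
    "rack-2": [67, 70, 72, 75, 79],
    # ... racks 3 through 8 would follow the same shape
}
ALARM_F = 85                 # assumed hot-spot threshold

for rack, temps in readings.items():
    profile = {"min": min(temps), "avg": round(mean(temps), 1), "max": max(temps)}
    status = "HOT" if profile["max"] >= ALARM_F else "ok"
    print(f"{rack}: {profile} {status}")
```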
It's this energy measurement capability that the University of California, San Diego, arguably one of the greenest universities in the nation, is using to its fullest advantage. With a robust alternative energy program on campus, UCSD is a member of The Green Grid, a worldwide consortium of companies, organizations, government agencies, and academic institutions dedicated to advancing energy efficiency in data centers. UCSD has selected the Sun MD as a research tool to examine the most efficient methods of using alternative energy for computing. Three departments in the UCSD School of Medicine took part in the first deployment of the Sun MD on campus, which served as a consolidation point for the researchers, allowing them to share their data on energy efficiency and creating an environment where they could examine how to get the most bang for the buck out of every watt spent on computing.

For years, the emphasis in data centers was on reliability; now, data center managers are focused on the cost of energy, as evidenced by the August 2007 EPA Energy Star Program paper, Report to Congress on Server and Data Center Efficiency, Public Law 109-431, which revealed that the assets in a data center are only used about 5 to 15 percent of the time. The rest of the time, those servers are just sitting there, eating up expensive power.

But energy usage monitoring is not the only route to cost savings: Sun's recent lab studies with the US Department of Energy's (DoE) Lawrence Berkeley National Laboratory at UC-Berkeley revealed that the MD eliminated about 40 percent of the cooling costs associated with a data center. And the modularity inherent in the product is said to offer a "pay as you grow" solution for facilities planners with tight budgets.
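The Green Grid's signature efficiency metric, power usage effectiveness (PUE), makes the "cost of every watt" concrete: total facility power divided by the power that actually reaches the IT equipment. A quick example with hypothetical meter readings:

```python
# Power usage effectiveness (PUE), The Green Grid's data center
# efficiency metric. A PUE of 1.0 would mean every watt drawn from the
# utility reaches a server; the readings below are hypothetical.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

total_kw = 180.0   # utility feed: servers plus cooling, UPS losses, lights
it_kw = 100.0      # power actually consumed by the racks

print(f"PUE = {pue(total_kw, it_kw):.2f}")                 # 1.80
print(f"Overhead per IT watt: {pue(total_kw, it_kw) - 1:.2f} W")
```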
Yet what if your institution has the dollars to build a green data center, but just doesn't have the time? Such was the case with another DoE national laboratory, the SLAC National Accelerator Laboratory (formerly the Stanford Linear Accelerator Center), operated by Stanford University (CA). The laboratory had the grants and programs in place to purchase servers, but not the time to construct the building. So Stanford brought in the modular Sun product as a turnkey solution, with all the servers and software already loaded and ready to go.
In-House Overhaul, or Market Solution?
In the end, whether you opt for a data center retrofit such as Utah State's, or one of the market's up-and-coming modular data center solutions like those at UCSD or Stanford, the end goal is savings -- both for the environment and the bottom line. What's more, higher ed institutions need to abandon the fear that green is costly to implement. Not so, if planned intelligently, say the pundits. Utah State's Hawley puts it plainly enough for any data center manager to accept: "Green did not cost us more."