Served Up Virtually

A Catholic liberal arts college in Vermont is significantly reducing the number of its physical servers by going virtual. The effort is not only paying off in energy efficiency and cost savings; it has also allowed the college to establish a second data center dedicated to disaster recovery.

Three years ago, Saint Michael's College in Burlington, VT, had an IT infrastructure still common in higher education today: a setup of 64 physical servers that had outlived its useful life. The school was ready to switch over to a more efficient and effective strategy.

"Our data center was a complete mess at the time," recalled Bill Anderson, CIO for the 2,000-student Catholic liberal arts college. "We were out of space, low on cooling capacity, and running out of power. We also had a long list of applications to support, so we knew that we had to do something about it."

Anderson and his team started looking around for options and pinpointed server virtualization as one of the school's best choices. The approach uses a software layer, known as a hypervisor, to divide a single physical server into multiple isolated environments, each known as a virtual machine, or "guest." Using this strategy, universities can stretch their server resources without investing in additional equipment.
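
For readers unfamiliar with the mechanics, the guests sharing a host can be inspected programmatically. Here's a minimal sketch using the open-source libvirt Python bindings (not the VMware tooling Saint Michael's chose; the connection URI is a placeholder):

```python
# Minimal sketch: enumerate the "guests" sharing one physical host.
# Uses the open-source libvirt Python bindings (pip install libvirt-python);
# the URI below is a placeholder, not the college's actual setup.
import libvirt

conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
try:
    for dom in conn.listAllDomains():
        state = "running" if dom.isActive() else "stopped"
        # dom.maxMemory() is reported in KiB
        print(f"{dom.name():20s} {state:8s} {dom.maxMemory() // 1024} MiB")
finally:
    conn.close()
```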

Whittling Down the Hardware
Sold on the idea, Saint Michael's started virtualizing its servers in 2007 using software from VMware. "Microsoft had a [virtualization] product at the time that was not really comparable," said Anderson of the school's vendor choice. He said the school started with the low-hanging fruit first--those areas that would be easiest to virtualize--and over the last few years has reduced its number of servers by 62 percent.

The college also built a second data center dedicated to disaster recovery--something it previously lacked. "Our disaster recovery process basically involved opening up a notebook and making phone calls," said Anderson. "There was no way we could have duplicated our original 64 servers in a second location, but now that we've whittled that number down to four, we've been able to replicate the servers in a secure location."

During the switch to virtualized servers, Anderson and his team followed a slow, deliberate process of moving applications over to the new setup and retiring the physical equipment. "As we moved applications off of the old servers, most of which were past their useful lives anyway, we put them on a pile and waited for the recycler to come and get them," recalled Anderson. The pace at which the servers were removed varied depending on "how busy the IT department was at the time," he added. "If we were occupied with other projects, it slowed things down a bit."

Moving the applications themselves onto the virtualized servers also slowed the process, particularly when vendors had claimed upfront that their programs would be compatible. In some cases, those claims proved partially or entirely wrong.

"We had one vendor that was sure its application would run on our virtualized environment," said Anderson, "but it turns out that due to copyright protections, the application still had to run on a dedicated box (thus reducing the value of the virtualization itself)." The problem has since been fixed, according to Anderson, who told the vendor, "We don't want to use your application if it doesn't run on our virtualized servers."

Virtualization Challenges: Complexity and 'Mysterious' Failures
Because virtualized server environments are more complex and more heavily networked, they have also posed challenges for Anderson's IT department, which has had to work through several "mysterious hardware failures" over the last few years. With multiple layers of hardware, software, and networking connections to sift through, troubleshooting such issues consumes time and resources.

"We've had a few recurring problems with network interface cards that required a lot of our resources--both our own, and those of our vendors--to figure out," said Anderson.

The Payoff: Energy Efficiency, Ease of Setup
The payoff has been significant for Saint Michael's, whose server consolidation efforts garnered the attention of Efficiency Vermont, an organization that provides technical advice, financial assistance and design guidance to help make homes, farms and businesses in the state more energy efficient.

Using the Efficiency Vermont Web site, Anderson tracked the school's power characteristics, comparing the old server setup to the new, virtualized environment. The college received a one-time, $1,500 rebate for its first 12 months as a virtualized institution. During that period, a total of 27 old servers were discarded in favor of the new setup.
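
The article doesn't report per-server wattage, but the shape of the savings is easy to estimate. A back-of-the-envelope sketch, where the 27 retired servers come from Efficiency Vermont's tracking period and the wattage and electricity rate are purely assumed figures:

```python
# Back-of-the-envelope energy estimate for retiring physical servers.
# The 27 retired servers come from the article; the 400 W average draw
# and the electricity rate are assumptions for illustration only.
RETIRED_SERVERS = 27
AVG_WATTS = 400            # assumed average draw per server, W
HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.12        # assumed $/kWh

kwh_saved = RETIRED_SERVERS * AVG_WATTS * HOURS_PER_YEAR / 1000
print(f"~{kwh_saved:,.0f} kWh/year, roughly ${kwh_saved * RATE_PER_KWH:,.0f}/year")
# (Cooling savings come on top of this, since every watt of server
# load also has to be removed as heat.)
```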

The school has reaped a few other benefits from its virtualization strategy. Where a solid business continuity plan was previously unachievable, Anderson can now rest easier, knowing that in case of disaster, Saint Michael's data is safe and retrievable. "In the past, we would have been down for weeks," said Anderson. "Now we have a recovery point objective every 15 minutes, and a time objective of four hours for all but one critical system. We're pretty comfortable with that."
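
A 15-minute recovery point objective means the replica at the second data center should never lag production by more than 15 minutes. A minimal sketch of the kind of check that monitors this, with hypothetical snapshot timestamps rather than the college's actual replication tooling:

```python
# Sketch: verify that the most recent replication snapshot satisfies a
# 15-minute recovery point objective (RPO). The snapshot timestamp is a
# placeholder; a real check would query the replication software.
from datetime import datetime, timedelta, timezone

RPO = timedelta(minutes=15)

def rpo_met(last_replicated: datetime, now: datetime | None = None) -> bool:
    """Return True if the newest replica is within the RPO window."""
    now = now or datetime.now(timezone.utc)
    return now - last_replicated <= RPO

# Example: a snapshot taken 10 minutes ago still meets the objective.
snapshot_time = datetime.now(timezone.utc) - timedelta(minutes=10)
print("RPO met:", rpo_met(snapshot_time))
```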

Provisioning new virtual servers is also easier than setting up new physical machines, said Anderson, with the former taking just a few minutes rather than hours or even days. "We just click, 'build me a new one' and the virtualization software creates a server for us," said Anderson. "That's a huge improvement over how we handled it in the past."
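
That "build me a new one" click boils down to a define-and-start call against the hypervisor's API. A minimal sketch of the same idea, again using the libvirt bindings with a bare-bones placeholder guest definition:

```python
# Sketch: define and start a new guest programmatically, the API-level
# equivalent of clicking "build me a new one". The XML is a minimal
# placeholder definition; real templates also carry disks, NICs, and more.
import libvirt

GUEST_XML = """
<domain type='kvm'>
  <name>demo-guest</name>
  <memory unit='MiB'>2048</memory>
  <vcpu>2</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
</domain>
"""

conn = libvirt.open("qemu:///system")
try:
    dom = conn.defineXML(GUEST_XML)  # register the guest with the host
    dom.create()                     # power it on
    print(f"Started guest '{dom.name()}' in seconds, not hours.")
finally:
    conn.close()
```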

Anderson, who said application virtualization could be in Saint Michael's future, advised that colleges looking to emulate his institution's success start by taking inventory of their current applications and identifying those that can (and cannot) operate in the new environment. Next, create a plan that incorporates important points like consolidation ratios (the number of physical servers you have now versus the number of virtualization hosts that will remain when the project is completed).
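
Using the article's own figures as a worked example, consolidating 64 physical servers onto the four machines mentioned earlier works out to a 16:1 consolidation ratio:

```python
# Consolidation ratio from the article's own figures:
# 64 physical servers consolidated onto 4 virtualization hosts.
physical_before = 64
hosts_after = 4
ratio = physical_before / hosts_after
print(f"Consolidation ratio: {ratio:.0f}:1")  # -> 16:1
```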

The next step is to get your backend storage squared away, said Anderson, since all of the applications have to be able to access their respective databases. Conduct a pilot program, and use it to test the connections between servers, software, and storage. "Once the pilot is successful," said Anderson, "just lay out a plan, commit to the change, and start decommissioning the old servers."
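
For the pilot itself, even a simple reachability check from a guest to its backend storage and database endpoints catches the most common misconfigurations. A minimal sketch, with placeholder hostnames and ports:

```python
# Sketch: a pilot-phase reachability check from a guest to its backend
# storage and database endpoints. Hostnames and ports are placeholders.
import socket

ENDPOINTS = [
    ("storage.example.edu", 3260),  # iSCSI storage target (assumed)
    ("db.example.edu", 1521),       # database listener (assumed)
]

for host, port in ENDPOINTS:
    try:
        with socket.create_connection((host, port), timeout=3):
            print(f"OK    {host}:{port}")
    except OSError as exc:
        print(f"FAIL  {host}:{port} ({exc})")
```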
