How to Get From 97 Data Centers Down to 8

The University of Wisconsin-Madison is expecting major savings and improved performance from a massive effort to aggregate its 97 data center locations into as few as eight. Will users who are accustomed to their own local services be willing to buy in?


Steve Krogull is on a mission. As director of system engineering and operations in the Division of Information Technology at the University of Wisconsin-Madison, he and his team are reaching out to the people behind 97 different data centers on campus to persuade them to consider a new way of managing their data.

The effort is the next chapter of a saga that has required nearly two years of extensive background work intended to identify administrative processes that could be revamped for overall efficiency. In this "Administrative Excellence" initiative, an assessment by the Huron Consulting Group found that "lots of different groups on campus had responsibility for lots of different data domains," said Krogull. The result is a server/data center model on campus described in the data center aggregation business case as "inefficient, resulting in duplication and overspending in areas including: hardware purchases, utility costs for power and cooling, labor and facilities." To put it mildly, a lot of different groups were doing the same stuff using separate gear and software, some better than others.

Bruce Maas, vice provost for IT and CIO, anticipates replacing what in some cases are not much more than servers in closets that lack sufficient power or cooling with "about eight facilities on the campus that we know are engineered properly." Those "preferred" data centers will have virtualized servers and storage. In some cases, existing data centers will be designated as aggregation points; in other cases dedicated set-ups won't be touched — they'll remain as stand-alone operations. Beyond that, public cloud services will also be available on the menu.

Once the dust settles, the overall project could save a substantial amount of money, although the financial and efficiency tallies are still to be determined. The question is how Krogull, manager of the new service, and his newly renamed Data Center Aggregation (DCA) team will approach the work of convincing users on campus that there are compelling reasons to use a central service instead of the "local" ones they're accustomed to.

Here are six guiding principles driving their actions.

1) Become the One-Stop Shop
Krogull described the new approach as a "portfolio" or "hybrid" model that follows two mantras. The first is, "Allow people to move their data and their applications, not their stuff. If you plug it in here, plug it in there — you're still plugging it in." In other words, users can manage their own data regardless of where the servers reside. The centralized data centers aren't being designed simply to take possession of existing servers — they're meant to fundamentally transform how data is managed.

The second mantra is, "Let the nature of the data drive where things get housed." That includes using external resources when needed. UW-Madison is participating in Internet2's NET+ initiative, a cross-institution effort focused on developing a portfolio of cloud services, which will allow the university to extend its centralized data services beyond the campus-based resources.

The virtualized environments maintained on campus will provide a mix of facilities including high-availability, high-reliability options for data that requires certain kinds of security restrictions, access controls, encrypted drives, audit trails and the like. But when the need arises to "burst out to the cloud," noted Krogull, those services will be in place too, using infrastructure from Amazon, Microsoft or other providers.

"What that really does for us is say, OK, we have some data here that is very low risk, very low use, maybe archival. Then we have this other data set that's restricted and has a lot of intellectual property issues around it," explained Krogull. "The nature of the data will help guide us in locating these things to aggregate them into the right kind of facility."

That's attractive to users for multiple reasons. For instance, the local component of the hybrid model offers the possibility of a "co-op" approach that could appeal to some researchers. By aggregating the price tag across a wide pool of users, said Krogull, the university can "manage the cost and help faculty who may have lost a grant or had a change in funding weather those transitions." If those researchers were to go straight to the cloud, on the other hand, "the minute you put it there, the bill starts mounting up," he pointed out. And the meter keeps running. Keeping that data on campus "lets us help buffer some of the ebbs and flows of day-to-day work in our environment."
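
The "let the nature of the data drive where things get housed" rule amounts to a simple decision table. Here is a minimal sketch of such a placement policy; the tier names and criteria below are illustrative assumptions, not UW-Madison's actual classification scheme:

```python
# Hypothetical sketch of a "nature of the data drives placement" rule.
# Tier names and thresholds are invented for illustration.

def place_dataset(risk, access_frequency, has_audit_requirements):
    """Suggest a hosting tier for a dataset.

    risk: "low", "moderate", or "restricted"
    access_frequency: "archival", "occasional", or "hot"
    has_audit_requirements: True if audit trails / access controls apply
    """
    if risk == "restricted" or has_audit_requirements:
        # Sensitive or audited data stays in an on-campus
        # high-availability facility with encrypted storage.
        return "campus-secure"
    if risk == "low" and access_frequency == "archival":
        # Low-risk, rarely touched data can sit in inexpensive
        # public cloud archive storage.
        return "public-cloud-archive"
    # Everything else lands on campus virtualized infrastructure,
    # with the option to burst to cloud providers under load.
    return "campus-virtualized"

print(place_dataset("low", "archival", False))    # public-cloud-archive
print(place_dataset("restricted", "hot", True))   # campus-secure
```

The point of encoding the rule, even informally, is that placement decisions become repeatable across 97 data centers instead of being renegotiated case by case.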

2) Make the Decision a Financial No-Brainer
Maas has set a specific goal for pricing the centralized data services to be offered: "The price point is going to be close to what Best Buy charges, even though it's not the same caliber of equipment." After all, he said, "That is what resourceful people do — they hop around online and look at what they can get storage for."

While he may be halfway joking here, the point is a good one: Campus users with choices need a compelling reason to make the "right" decision, especially when it could cost them more.

It's expected that university customer pricing will be based on consumption only, not the cost of maintaining the infrastructure. Under the new philosophy, data management falls into the realm of being a "common good service" for the entire campus.

"What we have right now," Maas said, "is a chargeback at full cost. What we're going to end up with will actually be a subsidized chargeback."
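
The difference between the two chargeback models is easy to see with toy numbers. All dollar figures below are invented for illustration; the article does not state UW-Madison's actual rates:

```python
# Toy comparison of full-cost vs. subsidized chargeback.
# All dollar figures are made up for illustration.

def full_cost_chargeback(tb_used, rate_per_tb, infra_overhead_share):
    """Old model: customer pays consumption plus a share of the cost
    of maintaining the infrastructure."""
    return tb_used * rate_per_tb + infra_overhead_share

def subsidized_chargeback(tb_used, rate_per_tb):
    """New model: customer pays consumption only; the campus absorbs
    infrastructure costs as a 'common good service'."""
    return tb_used * rate_per_tb

# A department storing 10 TB at a hypothetical $50/TB/year rate,
# with a $2,000 share of infrastructure overhead under the old model:
print(full_cost_chargeback(10, 50, 2000))   # 2500
print(subsidized_chargeback(10, 50))        # 500
```

Under the subsidized model, the fixed costs that used to show up on each department's bill are carried centrally, which is what makes the central price competitive with commodity retail storage.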

3) Address Privacy Concerns Head-On
Data privacy is a big concern for people on campus, who wonder whether allowing their data to be maintained in a public or even private cloud service will somehow make it more vulnerable to breaches.

"The way I look at it is, there's nothing that automatically says that having our data on campus is better than having it in the cloud. It's just the way we've always done it," said Maas. He acknowledged that privacy concerns are legitimate and in no way "helped" by the recent disclosures of the National Security Agency "and what they have been doing with regard to working with the private sector in accessing data."

But from his vantage point, contractual protections are "as good as having that level of protection in your own data center." That includes dealing with issues such as an exit strategy to cope with a cloud provider that gets acquired or goes out of business.

It's a moving target, he said. "Cloud is becoming gradually acceptable as a place where you can protect privacy, but you still have to have the right protections contractually to do that."

And then there are times, he added, "especially with regards to HIPAA data, where you just need to take a little more conservative approach until people are ready."

4) Get Your Timing and Processes Right
Typically, advised Maas, the best period for persuading a school, college or department to adopt a new data service is when its equipment or software is up for renewal and taking the centralized route will avoid a very large capital expenditure.

But the intent is not to come in and do a large takeover. The way the DCA team is approaching the work is first to send people into a department to do an assessment. In one engagement begun a couple of months ago, for example, a system administrator and a business process analyst performed a walk-through with a department's local IT staff to find out how many servers they had; how those were configured; what operating systems were being run; what the hardware and software product lifecycles were; what the nature of the data was; whether it had particular audit requirements, special sensitivity or other characteristics; and what the business and "data management stewardship" drivers were.
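
The walk-through checklist above amounts to a structured inventory, one record per system. A minimal sketch of what such an assessment record might capture follows; the field names and triage rule are assumptions for illustration, not the DCA team's actual instrument:

```python
# Sketch of a per-server assessment record like the one gathered in
# the DCA walk-through. Field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ServerAssessment:
    hostname: str
    os: str
    config_summary: str
    hardware_lifecycle: str      # e.g. hardware end-of-life year
    software_lifecycle: str      # e.g. OS/app end-of-support year
    data_description: str
    audit_requirements: bool
    sensitive: bool
    business_drivers: list = field(default_factory=list)

    def aggregation_risk(self):
        """Rough triage: low-risk systems are candidates to move first."""
        if self.sensitive or self.audit_requirements:
            return "high"
        return "low"

rec = ServerAssessment(
    hostname="dept-web-01", os="Linux", config_summary="2 vCPU, 8 GB RAM",
    hardware_lifecycle="2015", software_lifecycle="2016",
    data_description="public department website",
    audit_requirements=False, sensitive=False,
    business_drivers=["reduce local admin load"],
)
print(rec.aggregation_risk())   # low
```

Triage of this kind matches the approach described next: departments hand over a few low-risk systems first to test the relationship.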

After the team has a fundamental understanding about what the situation is, they can "put a matrix around that" to figure out the opportunities, explained Krogull.

In that situation, the department was willing to let the DCA take over a few systems "that were less risky to try out the relationship," he said. "Can we be responsive? Can we offer the performance and accountability metrics that are needed? Can we engage effectively with their team? Can we provide 24/7 response in emergencies? They started off with a few systems to test the waters."

5) Add Value
While Krogull's organization has reorganized itself to accommodate the new focus on centralized data management, the university's schools, colleges and departments have also made some interesting changes on the IT front as they start to use the new centralized data resources.

"They're leveraging our sys admins and DBAs, which is freeing up their staff time to add value back into the department," Krogull said. The departments are retooling some staff to work more closely with users — the faculty, staff and students. "What I'm hearing," he added, "is that they're not really saving any money, but what they are doing is redirecting staff toward pent-up demand for other services. It's not changed their personnel or IT costs as much as it's allowed them to realize a lot of programmatic value."

The low-hanging fruit right now is shared storage. "Everybody needs shared storage," proclaimed Krogull. However, the details of how shared storage should work are up for debate. "Is it high-performance? Is it archival? Rather than having us go out and just buy a bunch of storage and then sell it as a commodity service, we're actually trying to figure out where we can add value as a central resource and then where there are unique opportunities," he said.

The bigger issue, he added, is that "no one service can be all things to everybody. Let's define the exceptions as a business need rather than a just-because-we-can need. That's a very different kind of conversation at a campus level than we usually have."

6) Develop Trust and a Shared Vision
With 43,000 students, nearly 22,000 faculty and staff, and 13 schools and colleges, UW-Madison is a large, distributed campus. Finding a way to bring people together around a shared vision is expected to be a major challenge. Figuring out the common threads, said Krogull, becomes "a very complicated conversation just because there are so many players."

Yet to Krogull, those conversations may just be the most interesting part of the entire project. "Everybody that fired up a 'room' did so for a reason." His team's job is to find out what that reason is and whether the decentralized server is truly unique enough to stay in place and — if not — to discuss "how to move forward in a more collaborative way."

The transition is going to take time, Krogull predicted. "What we're hearing from other schools is that's a five-, six-, seven-, eight-year endeavor by the time you really understand what people are doing and you implement some things that are not disruptive."

Along the way, the new data organization must earn bankable trust among potential users. Maas' philosophy is not to move any faster than users' "comfort level" can handle.

Krogull seconds that. "There are some things we're finding that may not move for quite a while, because there's so much local value or so much history around how a particular IT service is running inside the department. That's OK. We're actually doing it very, very gradually and doing it in a way that makes sense."

He recognizes that not all 97 data center locations are truly ripe for aggregation — but enough are. "The tail end of that — from spot 50 on out — are very, very tiny operations. There are a lot of efficiencies there that I think we can do fairly rapidly — if we get past the trust issue."
