Centralizing Software at Duke

Centralizing software on a large campus is never simple -- especially when the institution is a Research I university and the software in question is central to the research process itself. Yet that's exactly what Duke University undertook in 2010 and 2011. Just two years later, migration to the new system -- used for research surveys -- is almost complete, and the Durham, NC-based university is looking at a summer decommissioning of the legacy program.

According to Evan Levine, assistant director of academic services within Duke's Office of Information Technology (OIT), in an institution with nearly 50,000 students, staff, and faculty members (including the medical center), "When we started this evaluation, someone somewhere was using just about every survey tool you could think of."

The primary one was ViewsFlash by Cogix, a web application that lets users create questionnaires for surveys, assessments, data collection, quizzes, tests, and polls. The university had been using ViewsFlash since 2006 and doing self-support. The service desk provided basic help -- how to create a survey, how to log in -- but OIT set up an online mailing list for a support community to handle more complex questions. "People were amazingly willing to help each other," Levine notes. But it was time, he added, to reassess the central survey offering.

Why Centralize Software

Moving users onto a single enterprise research application is typically driven by a number of goals, and security is a major one. "People are collecting all sorts of information and storing it in these survey tools," Levine explains. "So whether it's something we host here centrally or is cloud-hosted, centralizing around a platform lets us have some insight into and control over how secure that tool is and how that data is stored."

Also, having a single application would "provide the opportunity for people to share knowledge," he points out -- a far more efficient approach than expecting them to learn different tools depending on where at Duke they worked.

Likewise, Levine says, centralizing would allow OIT to come up with Duke-branded templates for its surveys.

And, of course, there was the expectation of cost reduction -- "being able to pick a tool, work with a vendor on pricing, and have all these departments and schools work together to pool their resources allowed us to get a better deal on a single product."

The Assessment Process

The assessment process for any enterprise software at Duke starts by compiling the requirements. Overall, says Levine, in the case of survey software, "We needed something that would be secure, have good ease of use, and allow for custom branding."

More specifically, on the security front the program would need to provide Shibboleth authentication. Shibboleth is an open source project that provides single sign-on capabilities and allows organizations to set up authorization of user access in multiple ways. Duke wanted it for users setting up surveys and as an option for those taking them. The requirement specified that the system could pull a user's affiliation from Duke's identity management system and automatically place that person into the right type of account based on whether he or she was a student, staff member, or faculty member.
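In practice, that kind of provisioning usually amounts to reading an affiliation attribute released by the identity provider at login and mapping it to an account type. The Python sketch below illustrates the idea using the standard eduPersonAffiliation attribute; the role names and precedence order are assumptions for illustration, not Duke's actual configuration.

    # Sketch: map a Shibboleth-released affiliation attribute to a survey-tool
    # account type at first login. Role names and precedence are illustrative,
    # not Duke's actual setup.
    AFFILIATION_TO_ACCOUNT_TYPE = {
        "faculty": "faculty",
        "staff": "staff",
        "employee": "staff",
        "student": "student",
    }

    def account_type_for(shib_attributes: dict) -> str:
        """Choose an account type from the eduPersonAffiliation values
        passed along by the Shibboleth service provider."""
        affiliations = shib_attributes.get("eduPersonAffiliation", [])
        # If a person holds several affiliations, prefer the most privileged one.
        for affiliation in ("faculty", "staff", "employee", "student"):
            if affiliation in affiliations:
                return AFFILIATION_TO_ACCOUNT_TYPE[affiliation]
        return "generic"  # default, unprivileged account

    # Example: someone listed as both staff and student is provisioned as staff.
    print(account_type_for({"eduPersonAffiliation": ["student", "staff"]}))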

The survey application also needed an export capability -- such as an XML conversion function -- since not all users do their analysis within the program; they'd need a quick way to move the data into other tools.

To help develop a list of criteria, Levine's Software Licensing group involved another on-campus organization, the Duke Initiative on Survey Methodology, which is part of the Social Science Research Institute. This initiative aims to enhance research and teaching in survey research methods at the university.

"When we do an assessment for any new software, we're lucky to be able to involve groups like that in the assessment," notes Levine.

The overall evaluation process also involved the campus' Office of Institutional Research, assessment staff within some of the larger schools, the Center for Instructional Technology, faculty, and the software licensing committee, which has representation from each school.

Levine's team gathered feedback from everybody and identified the vendors considered the most likely candidates. In total, hundreds of people gave some sort of input to one or another of the representatives from those groups; direct reporting came from 20 to 30 people.

Once the finalists were identified, the Software Licensing team talked to the individual vendors about licensing and pricing and other contractual matters. The latter included the need for a "business associate agreement" that would allow Duke's medical center to use the software with protected health information, a feature the legacy survey software couldn’t provide.

The "conversation" part of the evaluation process ran for six to nine months; the formal assessment process ran for another three months.

The winner: the Qualtrics Research Suite from Qualtrics. The software, Levine says, met each of the requirements, but another factor in the final decision was the company's willingness to work with Duke on further developing its product in ways that would likely make it attractive to other higher education prospects as well.

"We had a couple of requirements that no vendor was necessarily going to have to begin with," Levine explains. That included the automated Shibboleth integration for account creation, optional authentication for survey respondents, and small tweaks to the tools internal reporting functionality. Beyond that, Qualtrics would also allow OIT to give each school a custom skin, an interface that would appear on surveys put out by the members of its individual community, and to let each school assign a survey administrator if desired.

Qualtrics wasn't totally new to Duke. It was already being used by the Fuqua School of Business, which has one of the highest rates of survey software usage on campus.

Persuading People to Migrate

In 2011 the new survey application was made available campuswide with the goal of allowing people "to migrate at their own convenience."

The IT organization teamed up with the Duke Initiative on Survey Methodology to run hour-long training sessions in a computer lab, some of which were actually wait-listed until OIT could schedule additional classes. Levine estimates that over 200 people went through those workshops. And the Initiative offered itself to any department that wanted on-demand training not just on the software itself, but also on how to build better surveys.

Most people moved to Qualtrics "pretty quickly," Levine says. But there were a few holdouts here and there -- users waiting for some new function to be added or wanting to finish a long-term survey before making the move.

The migration process was fairly basic: Users were told to export their data via XML if they wanted to keep it, a procedure most people knew already because they were using applications such as SAS or SPSS for their survey analysis. If they needed help, Academic Services created instructions on export and prepared its service desk people in case users had questions. "But ultimately, a lot of power users were already doing that," Levine notes.
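The article doesn't document the legacy tool's export schema, but the basic workflow -- export responses as XML, then reshape them for a statistics package -- can be sketched in a few lines of Python. The element names below (a response element with one child tag per question) are assumptions for illustration only:

    # Sketch: flatten an XML export of survey responses into a CSV file that
    # SAS or SPSS can read. Element names are assumed, not ViewsFlash's schema.
    import csv
    import xml.etree.ElementTree as ET

    def xml_to_csv(xml_path: str, csv_path: str) -> None:
        responses = ET.parse(xml_path).getroot().findall("response")
        if not responses:
            return
        # Collect every question tag seen across responses so the header covers
        # cases where some respondents skipped questions.
        fields = sorted({child.tag for resp in responses for child in resp})
        with open(csv_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=fields)
            writer.writeheader()
            for resp in responses:
                writer.writerow({child.tag: (child.text or "") for child in resp})

    xml_to_csv("legacy_survey_export.xml", "survey_responses.csv")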

For the most part there was no way to move over survey content itself: certain types of questions could be copied and pasted from one survey tool to the other, but anything more complex than a simple multiple-choice question had to be recreated. Academic Services pitched that as a benefit of migrating: "This is a great opportunity for you to recreate your survey. Go back and look: Are these questions still current? Are there things you'd like to add or change? Are there new types of functionality that are now available in survey software that you could be utilizing?" As Levine points out, "A lot of people were open to the idea that this was a great time to renovate their survey."

Overall, he says, "There was no real incentive [to migrate], other than moving to a more modern and secure platform."

And it worked. Currently, 1,700 students, 2,300 staff, and 360 faculty have accounts in the Qualtrics system. "It's a large group," Levine reports. "It's roughly 10 percent of each population, which I would consider a pretty high adoption rate for a specialized service like this. We consider it a highly successful centralization."

Among the heaviest users on the academic side: the School of Nursing, the School of Business, the Department of Psychology & Neuroscience, and the Duke Clinical Research Institute. On the institutional research side, Duke's Office of Institutional Research and Trinity College's Office of Assessment are major users, as is Residence Life, which does a lot of surveying and polling among students.

Also, Qualtrics is becoming useful for Duke's MOOC work. The university has already run 11 courses on Coursera's massive open online course platform and plans to offer more. The Center for Instructional Technology has been working with faculty to use Qualtrics to poll their MOOC students. One of those polls, says Levine, had over 50,000 responses.

Then there are the other oddball applications for survey software that users have come up with. "It gets used for registration forms, for events, for polling people on anything and everything," explains Levine. In other words, it's becoming the campus go-to application for putting together quick polls and simple web forms.

The Secret of Enterprise Software Adoption at Duke

The support community created for the old software is still going strong, but now it's becoming even more effective. With almost everybody using the same tool, Levine says, at least one person among the 5,000 users will be able to answer just about any question. "There's almost always another user who can help and can make a recommendation on best practices or share what they've done, not only regarding functional use of the tool, which Qualtrics already supports, but regarding use specific to Duke or related to quality survey methodologies. That just saves so much administrative time on everybody's part. You're not duplicating effort here and there. You're sharing your resources internally and learning from each other."

At this point, the holdouts are few. The last time Levine checked a report showing logins to the legacy survey system, about 20 users showed up on the list. "I think this has gone well," he notes. "We've done a good job centralizing. We've met those goals." But, he adds, those 20 users -- and others who still might choose to use alternative software -- still count: "It'd be foolish for me to say that any one tool meets everyone's unique needs, especially that many people in this large of an environment. We've accomplished those goals through centralization, but we also don't disallow people to go out and have smaller licenses of other products if needed."

The secret to success: engagement with all of the campus experts. "It's a real shame to make a decision strictly from a technology standpoint. If you look at the groups we involved in an assessment like this, you've got actual experts in using these types of platforms, who know what they need. Involve your students, involve faculty, involve staff. Really use an assessment process that leverages everything that we have at an educational institution."
