
Preparing Pilots for Takeoff

When schools and vendors work together, a pilot project can be a win-win proposition. Here are six tips on how to get pilots flying.

The term "test pilot" conjures up an image of Chuck Yeager strapped into a supersonic fighter, hair on fire, hurtling over the dry sands of the Mojave Desert. In educational technology, the term is a tad more prosaic, but there are similarities: In both cases, the payoff for success can be huge--a cutting-edge product that performs as it was envisaged--but it's also easy to spin out of control, crash, and burn.

So why would schools consider partnering with a vendor to operate a pilot? Why not just wait until the final product is released? For starters, pilots provide schools with a golden opportunity to get an early look at the software, take it for a test flight, and ask for changes tailored to their operating environment and business needs. In some cases, too, there is a financial benefit, including free or discounted software, assistance in installation, or credit applied to an annual contract.

This story appeared in the September 2012 digital edition of Campus Technology.

Tim Flood, a technology consultant for higher ed institutions including Stanford University and the University of California, San Francisco, believes more schools should work with vendors in the development stage. He sees a primary role for campus IT departments in "facilitating the uptake of technology into the institutional bloodstream. It should be much more in the consciousness of institutions to work effectively with vendors." The challenge is to make the partnership work for all parties. To ensure a smooth flight, CT offers these six tips for schools embarking on pilot projects.

1) Work With People You Know
The best insurance for pilot success is to team up with a vendor or developer with which you already have a good relationship. "It's really important to have the right partners," says Patricia Summers, VP of marketing at CollegeNET, a software-as-a-service provider. "We usually go to our existing customer base to find people we know will give us the feedback we need."

Indeed, it's difficult to overstate the importance of mutual trust. "I'd be more skeptical with a company that just approached us out of the blue, because there would be a lot more concern about the integrity of our data," explains Justin Schlenker, director of admissions operations at Tiffin University (OH).

But choosing a known partner can be far more than a defensive measure. Not only does it reduce the level of risk, but it provides better opportunities to share insights and work together to improve the product. A case in point is Transylvania University (KY), which was invited by Datatel (now known as Ellucian after merging with SunGard HE) to be a beta tester for its Recruiter 2.5 product based on the school's history with previous versions. "We were already a Recruiter customer and had made several enhancement requests that we thought would make it a better product," notes Jason Whitaker, the university's VP for information technology.

2) Choose Your Pilot Team With Care
Nothing can derail a pilot project faster than a bad team. Before you select its members, consider who the key stakeholders and end users are, who has the drive and attitude to see the project through, and who has the cred to eventually champion the product on campus.

But whatever you do, don't allow too many people into the tent. In Flood's view, a pilot team should have between three and seven members. It's also not uncommon for vendors to want to enlist only end users for pilots. Just say no.

"IT folks bring to the table the whole idea of integration, making sure the product we come up with works with the systems on campus," says Summers, who is an advocate of a mix of IT professionals and end users.

Transylvania's Whitaker wholeheartedly agrees. Even if the product will be hosted offsite or in the cloud, he considers IT involvement to be absolutely critical. "We help users understand the business processes involved," he explains. "We can help them evaluate technologies and better integrate them into their processes, as well as make improvements that streamline operations and result in savings of time and money."

For the Recruiter 2.5 pilot, for example, Whitaker put together a group of five that included two IT staffers who supported the existing version of Recruiter, two admissions staffers who were "power users" of the product, and himself.

Flood emphasizes one other key attribute of team members--flexibility. "Oftentimes when we're piloting, we're trying to do some level of innovation," he says. "You want people who aren't attached to the old way of doing things and aren't going to stifle every creative urge a developer has."

3) Test Products That Your School Really Needs
Working on a pilot entails extra time and effort on the part of already busy people. And if the pilot is a success, adoption will require even more people to learn the new product or process. As a result, it's vital to show that the final product will save your institution time, money, and effort.

"Make sure the pilot makes sense for you," encourages Whitaker. "Does it have features that will benefit your users? Most likely, the draw to participate will be new features and functions that are important to your institution. Once you are live, make sure you change your processes to take advantage of the new software or features."

Laying the groundwork before the pilot starts is just as important. End users--the people who stand to benefit most from the product--should understand what problems it's intended to fix and how. For Amy Wood, Tiffin's director of degree completion, admissions, and student services, this was easy when it came to piloting Hobsons' Virtual Orientation. "This is something my staff has wanted to implement for about two years," she notes. "It was going to be more time-efficient for them, so I already had buy-in from my staff."

She urges pilot sponsors to educate as many people in the institution as possible about the product's possibilities, "so they can be part of the champion team." But there's another, more practical reason to do outreach: to avoid unforeseen glitches.

"Make sure people on campus know that you're rolling out this product," advises Wood, "so you find out beforehand if there's anything that may impact what you're trying to do." For example, changes to registration procedures could require a revision of orientation content, which could in turn delay or derail the pilot.

It's an approach that CollegeNET has baked into its pilot projects. "We go to the campus after certain milestones and show everybody who would be a user or an IT person what we have so far and get feedback from them," notes Summers. "Everybody has a chance to see it, make comments, give suggestions. We want to make sure they know they have a voice, and that we are trying to make something easier and more efficient for them."

4) Build Feedback Mechanisms and Realistic Timelines
Whether you use a web-based application such as WebEx or SharePoint, e-mail, or periodic phone conferences, communication between pilot and developer teams should be easy and efficient.

Flood is an advocate of the Usain Bolt school of pilot projects: a short, very fast sprint to the line, with a relatively small pilot team giving feedback as features are added or tweaked. "In general, pilots should be short--I'd say one- to three-month efforts," he says. If the product is very complex, the pilot project should be broken up into multiple, short segments. "That allows campus staff to see and react very quickly. It's good for the vendor and the campus, because it enhances the dialogue and establishes buy-in."

CollegeNET typically begins with a customer committee comprising several schools that works with the development team to come up with product specs in a series of highly collaborative, in-person sessions. "As we build, we show it to our committee for feedback," explains Summers. "We release version 1 as a pilot, and then the schools continue to give us feedback on what we should be tweaking or changing."

Regardless of how long the pilot runs, the timing has to mesh with the school calendar and other campus cycles. At Tiffin, Schlenker says, "We had a defined deadline for the start of our January classes, so we kind of backfilled from there to make the timeline work."

The Virtual Orientation pilot had three separate trials, which allowed testers to learn the program, revise and refine the way they used it, and provide feedback to Hobsons. The first rollout was in January 2012, with 52 online students. This was followed by trials with larger groups on campus for the spring and summer semesters.

The longer the pilot runs, the more potential for calendar conflicts and churn within the team. "Don't sign up for the pilot unless you've got the people lined up and they have the time to test and give the feedback required," advises Whitaker.

The Recruiter pilot, for instance, had to take into account the workload of both IT and admissions team members. The late spring timing worked well, because it's a quiet time in the admissions office. "If we had got into late June or early July with this beta still dragging on, that would have been an issue, because the admissions folks might not have had the time," notes Whitaker.

In Summers' experience, delays tend to arise when team members drop out because the pilot extends into a new budget cycle--with attendant staff cuts or reassignments--or if the IT component is an outsourced service whose contract is not renewed. 

5) Get It in Writing
"The alignment of purpose, goals, and the expected outcomes among the business partners is paramount," says Flood. The best way to ensure this alignment is a written agreement that explicitly states the commitments and responsibilities of both parties. Every contract should have a statement of work, including deliverables, milestones, and a timeline. Other provisions to consider include nondisclosure agreements, compliance with the school's security and confidentiality policies, financial or other inducements for participation, and an exit clause if specific provisions are not met. "Fuzzy ground rules and unclear goals undermine agreements that everybody originally thought were going to be clear," says Flood.

6) Keep Expectations Realistic
It's essential to understand that perfection is not the goal of a pilot. There will be glitches. "Not only are you testing the software, but you are testing the installation instructions and documentation," notes Whitaker. "Anyone going into a pilot should understand that, and be prepared for some problems. Part of the beta is to try to force those out and get them fixed."

As important as it is to have defined goals, it's equally important to be flexible about how those goals are reached. "You need to be ready to change directions several times along the way," cautions Schlenker. "You don't want to have too many preconceived notions about how things should work, because you might find that the way it does work is actually better than what you were expecting."

And just because your school is involved in the pilot, remember that the vendor's goal is not to create a product customized to your specific needs alone. "As a vendor, probably one of the more difficult parts is making sure that the product is generalized enough that it can be used by many customers," says Summers.

Flood asserts that it's also in the interest of pilot schools that vendors don't customize the product just for them. "We don't want one-offs--we want to use the functionality that is part of the base product," he explains. "If a campus gets saddled with a one-off, they don't have what everybody else gets, so they've really dug themselves into a hole: If they want to change anything, they've always got to pay for it."
