Course Evaluation | Feature
Many Happy Returns
Moving course evaluations online saves schools money, time--even trees. But it doesn't mean more students will complete the surveys. Here's how to bump up those response rates.
- By Jennifer Grayson
When was the last time you wrote something by hand? Perhaps it was a scribbled grocery list, or last winter's holiday cards. Chances are it wasn't anything in-depth. Anyone used to typing on a laptop or hammering out e-mails on a BlackBerry knows how foreign a pen can feel--after two minutes your fingers start to cramp. So why are some schools still asking their students to complete course evaluations on paper?
For as long as anyone can remember, students at the University of Oklahoma were given in-class surveys for as many as five classes a semester. Not surprisingly, many loathed the pencil-and-paper process. "The students would write incomplete sentences, even phrases, and just dash off," recalls Paul Bell, dean of the College of Arts and Sciences. "Some wouldn't even write anything."
There was also the problem of profanity: "This class sucks" was an all-too-common response in the comments section of surveys. In one memorable instance, in response to a question about how an instructor could improve the quality of the class, a student wrote, "Die!"
So when the university decided to move its course evaluations completely online in 2009, Bell and the rest of the faculty and administration were pleased to see the quality of the answers improve.
Bell is the first to admit, though, that the move online was not prompted by a desire to tap into the more thoughtful recesses of student brains. It was done to save money--$100,000 a year in Scantron forms alone.
But you take your breaks where you can get them. The new system saves students from having to fill out the evaluations at the end of class, which is probably the worst time to capture students' attention. (Forcing them to write something at that point is cruel and unusual.) The school's web-based eValuate system, which was developed in-house, lets students log in with their student ID and password to answer questions on their own time.
"When they're sitting at home evaluating the class online, they're much more likely to write something that actually says something," explains Bell. "They're more comfortable writing online; they're used to writing online. You get much, much better feedback."
There is one problem: While the information is qualitatively better, it is not quantitatively so. Response rates have dipped, albeit slightly. Whereas 60 percent of students filled out the paper evaluations, online participation now averages in the high 50s.
Bribery Works Well
This downward shift was even more dramatic at the University of Miami (FL). In 2006, the school switched to ConnectEdu, a hosted online system for course evaluations. The aim was to eliminate the task of scanning 100,000-plus faculty surveys every year--not to mention lessen the school's impact on the environment. While the school achieved these goals, the rate at which students evaluated courses slipped from 75 percent for the old paper-based evaluations to between 50 and 60 percent for the online forms.
As at OU, the responses were definitely more informative, says David Wiles, executive director of testing and evaluation services, but the school nevertheless wanted to see a response rate comparable to that of the old system.
So a course-evaluations committee was formed, with faculty representatives from each of the schools, along with members of the faculty senate and student government. The committee decided to try a time-tested method: bribery. Students who completed an online course evaluation would be allowed to view their final grade for that class early.
Since course evaluations close only two days before all grades are posted, participating students don't get that much of a jump on everyone else. But the method has proven a powerful motivator nonetheless.
In just the first semester after adopting the incentive, the university's schools saw student participation climb to 74 percent. Now that figure is hovering around 80 percent--higher than the rate from the old paper-based days.
The Role of Faculty
OU also offers an incentive--every student who completes the forms is entered into a drawing to win an iPad or the like--but Bell doesn't put much stock in that approach. In his view, there's a better, often-overlooked way to boost participation.
After Bell noticed that online return rates varied significantly from one faculty member to another, he decided to do some sleuthing. It turns out that professors with high student participation do something surprisingly low-tech: They ask their students to complete the surveys, which are sent out via e-mail.
"The single influencing factor is the faculty member," says Bell. "If a faculty member tells the students it's important--that he actually pays attention to what they say--he gets a much higher response rate than faculty who say nothing."
However, some faculty members may be reluctant to point students toward an online survey if they're not on board with the system themselves. One concern is security, says Bell: "Faculty members are always worried about who's got access to confidential personnel information."
In the past, paper forms were kept on file with each department or given to the faculty member to file. With the online system, data are now stored in a central database. Not everyone has access to all the information, of course (a department chair, for instance, can see only the evaluations of his own faculty), but some faculty still have concerns. As a result, some faculty members choose not to let their evaluation results be viewed--and very likely don't put much effort into trying to get students to participate.
Concerns about privacy go beyond faculty, however. Students, too, are sometimes hesitant to use the system. "We get students each semester who write to us, asking, 'Is this really going to be anonymous?'" says Wiles, who reassures them that names and e-mail addresses are not attached to any survey data.
But hesitant faculty and students are a minority, and should not be considered the major impediment to participation. Convenience, more than anything, probably lies at the heart of student participation rates. With that in mind, OU's Bell is exploring other ways to bump up response rates, including a smartphone version of the survey, developed by Aaron Biggs, computer network manager for Arts and Sciences and author of eValuate.
The new version will be available for the iPhone and iPad. Bell believes that the go-anywhere capabilities of mobile devices provide the best of both worlds: Students in class can be encouraged to complete the survey right then and there, while others might fill it in while they're on the bus, or even in the waiting room at the doctor's office.
"It's part of an ongoing effort," says Bell. "We want feedback from students, we want to communicate with students. Students, by and large, don't want to be communicated with, yet they have a vested interest in helping make things better. So it's a constant dialogue to see what it is that they will actually respond to."