Popular Feedback Devices Involve Students in Learning

At Ohio’s University of Akron, a pilot program introduced last year is successfully using wireless feedback devices to increase student involvement in the learning process. The relatively simple technology uses portable infrared receivers connected to faculty laptop computers and small infrared “clicker” devices, one for each student in class.

According to Dr. David A. McConnell, a professor of geology at the university who spearheaded the effort while he was interim co-director of the Institute for Teaching and Learning, last year’s success has encouraged the university to make the program available campus-wide this fall.

About 15 percent of the university’s 23,000-plus students used the clickers in the pilot, along with more than 50 instructors teaching courses across seven colleges. “We believe that UA has some of the widest and most extensive experience with this technology,” McConnell says.

The system, called the Classroom Performance System, or CPS, comes from eInstruction Corp., a Denton, Texas-based company. A number of companies offer similar technologies; they’re generally known as classroom or wireless response systems.

The devices can be used during class as often as an instructor chooses; McConnell says the general recommendation is to gather feedback several times during the teaching session, perhaps after a distinct concept has been covered. eInstruction suggests on its Web site “as a general guideline, that 8-12 appropriate objective questions be integrated into every one hour learning activity.”

McConnell’s research suggests that the devices are most effective when used two to four times per class. “Some use them more, some less... We recommend [a question] every 10-15 minutes. Cover a small section, ask a question, move on.” Questions should be answered correctly by one-third to two-thirds of the class, he adds; the core idea is to “get a conversation going with the students themselves; to use peer instruction.”
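
That rule of thumb is simple enough to express in a few lines of code. The Python sketch below is purely illustrative; the function name and the exact one-third to two-thirds band are assumptions made for the example, not part of the CPS software.

```python
# Illustrative sketch of McConnell's rule of thumb: a question works best
# as a peer-instruction prompt when it splits the class, i.e., when
# roughly one-third to two-thirds of students answer it correctly.
def good_discussion_prompt(correct: int, total: int) -> bool:
    """True when the correct-answer rate falls in the 1/3 to 2/3 band."""
    rate = correct / total
    return 1 / 3 <= rate <= 2 / 3

print(good_discussion_prompt(45, 100))  # True: class is split, worth discussing
print(good_discussion_prompt(95, 100))  # False: too easy, move on
```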

The software works by displaying pre-programmed questions to the class; students respond by pressing a button on their individual devices. For example, instructors can display a multiple-choice question to check whether students understand the material just covered. Responses are tallied and displayed instantly by the system, and the instructor can then tailor a discussion based on the results. Results can be saved for benchmarking and exported to programs such as Microsoft Word and Excel, or to PDF.
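
The tally-and-export loop at the heart of such systems can be pictured with a short sketch. The Python below is a minimal illustration, not eInstruction’s actual software; the sample responses and the output file name are assumptions.

```python
from collections import Counter
import csv

# Each clicker press is modeled as a (student_id, choice) pair received
# over the infrared link. The data here is made up for illustration.
responses = [
    ("s01", "A"), ("s02", "C"), ("s03", "B"), ("s04", "C"),
    ("s05", "C"), ("s06", "A"), ("s07", "C"), ("s08", "B"),
]

# Tally the answers and display the distribution, as the system does
# instantly after a question closes.
tally = Counter(choice for _, choice in responses)
for choice in sorted(tally):
    print(f"{choice}: {tally[choice]} ({tally[choice] / len(responses):.0%})")

# Save the tally so it can be opened later in Excel for benchmarking.
with open("question_results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["choice", "count"])
    writer.writerows(sorted(tally.items()))
```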

McConnell stresses that the devices aren’t about technology; they’re about helping instructors find better ways to teach and new ways to engage students and encourage them to participate in the learning process. “We came at it from a pedagogical perspective. How can we help our students learn? We wanted them to get to the point of understanding why an answer is wrong.”

He didn’t want to introduce a complex program that would require extensive training and organization, McConnell says. “We wanted to do something that was a small step, that didn’t ask too much.”

Instructors were given a three-hour workshop on using the devices, including how to formulate questions to generate discussion. After last year’s pilot, McConnell recalls, several faculty said the instant feedback they got from students made them better at writing questions. “Sometimes, students just don’t understand the question… It’s not just a matter of finding out what the students know—you can also find out what they think. They can offer opinions anonymously.”

McConnell suggests that questions work best when they are conceptual rather than factual. “Base questions around a key question you’re trying to teach.” The answer choices should include popular student misconceptions, he adds, so that a discussion can follow about why students picked a certain answer.

A survey of 1,600 students after last year’s pilot program asked whether and how this type of teaching helped with various aspects of learning. The reactions, McConnell says, were strongly positive, and the vast majority of students said the system increased their understanding of the subject. “[Students] told us it helped them to know their level of understanding, and they overwhelmingly recommended that we use it in other classes at UA.”

Perhaps the best part about the program, McConnell concludes, is that it’s been completely faculty-driven. “Students and faculty reported back that this was effective,” he says. “Faculty are stepping up and doing this voluntarily.”
