Education Research | News
Change the Homework, Improve Student Achievement
A new study from Rice University and Duke University researchers identified a relatively non-invasive approach to improving student achievement — one that doesn't involve gutting the curriculum or reinventing pedagogy. The researchers found that subtle, technology-based changes to homework improved student performance on tests.
The study, "Integrating Cognitive Science and Technology Improves Learning in a STEM Classroom," published this week in Educational Psychology Review, focused on a single upper-level engineering course (40 students) using methods that the researchers said "could easily be applied across disciplines and grade levels with minimal cost and disruption."
The changes included the adoption of a software tool developed at Rice called OpenStax Tutor. According to the researchers, the software is similar to other tools on the market that fall into the broad category of cognitive science-based digital tutors, tools that are designed to differentiate instruction based on the needs of individual students.
The researchers split the class into two groups and alternated the homework conditions weekly: one group received the cognitive science-based homework while the other received traditional homework, and the groups swapped the following week, so every student experienced both conditions.
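The alternating within-subjects design can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual code; the function name and group labels are assumptions.

```python
# Hypothetical sketch of the crossover design: two groups, conditions
# alternate week to week, so each student sees both homework types.

def weekly_condition(group: int, week: int) -> str:
    """Return the homework type for a group (0 or 1) in a given week.

    Conditions flip every week, and the two groups are always in
    opposite conditions, counterbalancing topic order across the class.
    """
    return "intervention" if (group + week) % 2 == 0 else "traditional"

for week in range(4):
    print(week, weekly_condition(0, week), weekly_condition(1, week))
```

In any single week the two groups are in opposite conditions, which lets the researchers compare performance on intervention-taught versus traditionally-taught material within the same students.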
The cognitive science intervention included three key practices:
- Providing immediate feedback on homework, with students required to read the feedback in order to receive credit;
- Spacing problems over three weeks rather than assigning them all in a single week; and
- Assigning follow-up problems in subsequent homework sets ("repeated retrieval practice").
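The spacing and repeated-retrieval practices above amount to a scheduling rule: each topic's problems appear the week the topic is introduced and then reappear in the next two weekly homework sets. Here is a minimal sketch of that rule; the topic names, the three-week window, and the function are illustrative assumptions, not details from the study.

```python
# Hypothetical sketch of spaced, repeated retrieval practice:
# each topic appears as new problems in its intro week, then as
# follow-up ("retrieval") problems in the next two weekly sets.

def build_schedule(topics, total_weeks):
    """Map week number -> list of (topic, kind) homework items."""
    schedule = {week: [] for week in range(1, total_weeks + 1)}
    for intro_week, topic in enumerate(topics, start=1):
        for offset, kind in enumerate(["new", "retrieval", "retrieval"]):
            week = intro_week + offset
            if week <= total_weeks:
                schedule[week].append((topic, kind))
    return schedule

schedule = build_schedule(["circuits", "signals", "filters"], total_weeks=5)
for week, items in schedule.items():
    print(week, items)
```

Note how the middle weeks mix new problems on the current topic with retrieval problems on earlier topics, which is what spreads practice out over time instead of concentrating it in one assignment.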
"The results exceeded everyone's expectations," said report co-author Richard Baraniuk, the instructor of the Rice U course used in the experiment, in a statement released to coincide with the report's publication. "These simple changes produced a larger effect than the average improvement for classroom interventions that require a complete overhaul of curricula and/or teaching methods."
According to the researchers, the intervention approach resulted in a 7 percent improvement on short-answer questions on the final exam and a 5 percent improvement on multiple-choice questions.
"The results of the experiment were clear; the combination of small, but important changes to a small part of standard practice boosted student learning and retention in the course," according to the report. "Students performed better on the exam problems about material learned via the intervention than they did on problems about material learned through standard practice."
"Giving students multiple opportunities to practice retrieving and applying their knowledge on new problems is a very powerful way to promote learning, especially when this practice is spaced out over time," said study co-author Elizabeth Marsh, associate professor of psychology and neuroscience at Duke, also in a statement released to coincide with the report's publication. "Feedback also is critical to learning, and previous studies have shown that students will often skip looking at feedback."
According to lead author Andrew Butler, a postdoctoral researcher at Duke University, the technology was not fundamental to the experiment, but it made the implementation simpler.
"We could have implemented these same principles in the classroom without technology, but the digital tutor made it much easier," Butler said. "Moreover, technology has the potential to implement these principles in a more powerful way by providing personalized instruction to each student."
The report's authors will discuss their findings at the Personalized Learning Workshop, to be held April 2 in Duncan Hall on Rice University's Houston campus. The research came out of Rice's Center for Digital Learning and Scholarship and Duke's Department of Psychology and Neuroscience.