Change the Homework, Improve Student Achievement

A new study from Rice University and Duke University researchers identified a relatively non-invasive approach to improving student achievement — one that doesn't involve gutting the curriculum or reinventing pedagogy. The researchers found that implementing subtle, technology-based changes to homework resulted in improvements in student performance on tests.

The study, "Integrating Cognitive Science and Technology Improves Learning in a STEM Classroom," published this week in Educational Psychology Review, focused on a single upper-level engineering course (40 students) using methods that the researchers said "could easily be applied across disciplines and grade levels with minimal cost and disruption."

The changes included the adoption of a software tool developed at Rice called OpenStax Tutor. According to the researchers, the software is similar to other tools on the market that fall into the broad category of cognitive science-based digital tutors, tools that are designed to differentiate instruction based on the needs of individual students.

The researchers split the class into two groups and alternated the homework type each week: one group completed the cognitive science-based homework while the other completed traditional homework, and the two groups swapped conditions the following week. As a result, in any given week half the class worked under the intervention and half under standard practice.
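For readers who want a concrete picture of that counterbalanced crossover design, the short sketch below shows one way such a week-by-week assignment could be generated. The group labels and the number of weeks are illustrative assumptions, not details reported in the study.

```python
# Illustrative sketch of a counterbalanced crossover design (assumed details,
# not taken from the study): two groups alternate between cognitive
# science-based homework and traditional homework each week.

def assign_conditions(num_weeks):
    """Return a week-by-week map of group -> homework condition."""
    schedule = []
    for week in range(1, num_weeks + 1):
        if week % 2 == 1:  # odd weeks: Group A gets the intervention
            schedule.append({"week": week,
                             "Group A": "cognitive science",
                             "Group B": "traditional"})
        else:              # even weeks: the conditions swap
            schedule.append({"week": week,
                             "Group A": "traditional",
                             "Group B": "cognitive science"})
    return schedule

for row in assign_conditions(4):
    print(row)
```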

The cognitive science intervention included three key practices:

  1. Providing immediate feedback on homework, with students required to read the feedback in order to receive credit;
  2. Spacing problems out over three weeks rather than assigning them all in a single week; and
  3. Assigning follow-up problems in subsequent homework ("repeated retrieval practice").
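To make those three practices concrete, here is a minimal, hypothetical sketch of how a digital tutor might schedule them for a single topic. The function names, the three-week spacing window, and the credit rule are assumptions made for illustration; they are not drawn from OpenStax Tutor or from the study itself.

```python
from datetime import date, timedelta

# Hypothetical sketch (not OpenStax Tutor's actual behavior) of the three
# practices: feedback gated on reading, spacing over three weeks, and
# repeated retrieval in later assignments.

def schedule_topic(topic, start, problems):
    """Spacing: spread a topic's problems across three weekly assignments."""
    assignments = [{"due": start + timedelta(weeks=i), "topic": topic, "problems": []}
                   for i in range(3)]
    for i, problem in enumerate(problems):
        assignments[i % 3]["problems"].append(problem)
    return assignments

def grade_response(correct, feedback_read):
    """Immediate feedback: credit is only awarded after the feedback is read."""
    if not feedback_read:
        return 0.0
    return 1.0 if correct else 0.25  # assumed partial credit for engaging with feedback

def add_retrieval_practice(next_assignment, prior_topic, followups):
    """Repeated retrieval: fold follow-up problems on an earlier topic into later homework."""
    next_assignment["problems"] += [f"{prior_topic}: {p}" for p in followups]
    return next_assignment

# Example usage with made-up topic names and problem IDs.
plan = schedule_topic("beam deflection", date(2014, 1, 13), [f"P{i}" for i in range(1, 7)])
plan[1] = add_retrieval_practice(plan[1], "shear stress", ["P3", "P5"])
print(plan[0]["due"], plan[0]["problems"])
print(grade_response(correct=True, feedback_read=False))  # 0.0 until feedback is read
```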

"The results exceeded everyone's expectations," said report co-author Richard Baraniuk, the instructor of the Rice U course used in the experiment, in a statement released to coincide with the report's publication. "These simple changes produced a larger effect than the average improvement for classroom interventions that require a complete overhaul of curricula and/or teaching methods."

According to the researchers, the intervention approach resulted in a 7 percent improvement on short answer questions on the final exam and a 5 percent improvement on multiple choice questions.

"The results of the experiment were clear; the combination of small, but important changes to a small part of standard practice boosted student learning and retention in the course," according to the report. "Students performed better on the exam problems about material learned via the intervention than they did on problems about material learned through standard practice."

"Giving students multiple opportunities to practice retrieving and applying their knowledge on new problems is a very powerful way to promote learning, especially when this practice is spaced out over time," said study co-author Elizabeth Marsh, associate professor of psychology and neuroscience at Duke, also in a statement released to coincide with the report's publication. "Feedback also is critical to learning, and previous studies have shown that students will often skip looking at feedback."

According to lead author Andrew Butler, a postdoctoral researcher at Duke University, the technology was not fundamental to the experiment, but it made the implementation simpler.

"We could have implemented these same principles in the classroom without technology, but the digital tutor made it much easier," Butler said. "Moreover, technology has the potential to implement these principles in a more powerful way by providing personalized instruction to each student."

The report's authors will discuss their findings at the Personalized Learning Workshop, to be held April 2 in Duncan Hall on Rice University's Houston campus. The research came out of Rice U's Center for Digital Learning and Scholarship and Duke's Department of Psychology and Neuroscience.

About the Author

David Nagel is the former editorial director of 1105 Media's Education Group and editor-in-chief of THE Journal, STEAM Universe, and Spaces4Learning. A 30-year publishing veteran, Nagel has led or contributed to dozens of technology, art, marketing, media, and business publications.

He can be reached at [email protected]. You can also connect with him on LinkedIn at https://www.linkedin.com/in/davidrnagel/.
