Cornell Researchers Use AI to Understand Students' Math Struggles

Researchers at Cornell University are working on software to help math teachers understand the thinking that led their students to incorrect answers.

Erik Andersen, assistant professor of computer science at Cornell, said that teachers spend a lot of time grading math homework because grading is more complicated than just marking an answer as right or wrong.

"What the teachers are spending a lot of time doing is assigning partial credit and working individually to figure out what students are doing wrong," Andersen said in a prepared statement. "We envision a future in which educators spend less time trying to reconstruct what their students are thinking and more time working directly with their students."

To help teachers get through their grading and understand where students need more help, Andersen and his team have been building an algorithm that reverse engineers the way students arrived at their answers.

They began with a dataset of addition and subtraction problems solved — or not — by about 300 students and tried to infer what the students had done right or wrong.

"This was technically challenging, and the solution interesting," said Andersen in a news release. "We worked to come up with an efficient data structure and algorithm that would help the system sort through an enormous space of possible things students could be thinking. We found that 13 percent of these students made clear systematic procedural mistakes, and our algorithm learned to replicate 53 percent of these mistakes in a way that seemed accurate. The key is that we are not giving the right answer to the computer — we are asking the computer to infer what the student might be doing wrong. This tool can actually show a teacher what the student is misunderstanding, and it can demonstrate procedural misconceptions to an educator as successfully as a human expert."
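The article does not describe Cornell's actual algorithm, but the general idea — searching a space of candidate procedures to find one that reproduces a student's wrong answers — can be sketched in a toy form. In this hypothetical example, a student's subtraction answers are matched against a small set of candidate procedures, including the classic "subtract the smaller digit from the larger" bug:

```python
# Illustrative sketch only; all function names and candidate procedures
# here are invented for this example, not Cornell's implementation.

def correct_sub(a, b):
    return a - b

def smaller_from_larger(a, b):
    # Classic student bug: in each column, subtract the smaller digit
    # from the larger, ignoring borrowing entirely.
    result, place = 0, 1
    while a > 0 or b > 0:
        da, db = a % 10, b % 10
        result += abs(da - db) * place
        a, b, place = a // 10, b // 10, place * 10
    return result

CANDIDATES = {
    "correct": correct_sub,
    "smaller-from-larger": smaller_from_larger,
}

def infer_procedure(worked_problems):
    """Return the candidate procedure that explains the most answers.

    worked_problems: list of ((a, b), student_answer) pairs.
    """
    scores = {
        name: sum(proc(a, b) == ans for (a, b), ans in worked_problems)
        for name, proc in CANDIDATES.items()
    }
    return max(scores, key=scores.get)

# A student who consistently subtracts the smaller digit from the larger:
answers = [((52, 38), 26), ((41, 19), 38), ((60, 27), 47)]
print(infer_procedure(answers))  # "smaller-from-larger"
```

A real system would search a far larger space of candidate procedures, which is where the efficient data structures Andersen mentions come in.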

Eventually the researchers hope to develop a program that can give teachers reports on learning outcomes to improve instruction and differentiation. For now the tool works only with addition and subtraction problems, but the team plans to expand it to algebra and more complicated equations.

For more information, go to cs.cornell.edu.

About the Author

Joshua Bolkan is contributing editor for Campus Technology, THE Journal and STEAM Universe. He can be reached at [email protected].