Cornell Researchers Use AI to Understand Students' Math Struggles

Researchers at Cornell University are working on software that will help math teachers understand the thinking that led their students to incorrect answers.

Erik Andersen, assistant professor of computer science at Cornell, said that teachers spend a lot of time grading math homework because grading is more complicated than just marking an answer as right or wrong.

"What the teachers are spending a lot of time doing is assigning partial credit and working individually to figure out what students are doing wrong," Andersen said in a prepared statement. "We envision a future in which educators spend less time trying to reconstruct what their students are thinking and more time working directly with their students."

To help teachers get through their grading and understand where students need more help, Andersen and his team have been building an algorithm that reverse engineers the way students arrived at their answers.

They began with a dataset of addition and subtraction problems solved — or not — by about 300 students and tried to infer what the students had done right or wrong.

"This was technically challenging, and the solution interesting," said Andersen in a news release. "We worked to come up with an efficient data structure and algorithm that would help the system sort through an enormous space of possible things students could be thinking. We found that 13 percent of these students made clear systematic procedural mistakes, and the researchers' algorithm learned to replicate 53 percent of these mistakes in a way that seemed accurate. The key is that we are not giving the right answer to the computer — we are asking the computer to infer what the student might be doing wrong. This tool can actually show a teacher what the student is misunderstanding, and it can demonstrate procedural misconceptions to an educator as successfully as a human expert."

Eventually, the researchers hope to develop a program that can give teachers reports on learning outcomes to improve instruction and differentiation. For now, the tool works only with addition and subtraction problems, but the team plans to expand it to algebra and more complicated equations.

For more information, go to cs.cornell.edu.

About the Author

Joshua Bolkan is contributing editor for Campus Technology, THE Journal and STEAM Universe. He can be reached at [email protected].
