Teaching and Learning

U Michigan Large-Course Writing Toolkit to Add Automated Text Analysis

A team of educators at the University of Michigan is getting closer to releasing a new version of a writing program intended to implement writing-to-learn pedagogies in large enrollment courses. The enhanced edition of M-Write, expected in fall 2017, includes the use of automated text analysis to identify the strengths and weaknesses of a student's writing submission. By 2021, program leaders hope to reach more than 10,000 students with the digital toolkit under development.

M-Write is the brainchild of two faculty members: Anne Gere, a professor of education and of English language and literature and director of the campus's Sweetland Center for Writing, and Ginger Shultz, an assistant professor of chemistry. Their original goal was to help instructors of large-enrollment courses embed writing into the assessment process; the traditional barrier has been getting the essays or papers graded. The methods developed by Gere and Shultz integrate several approaches — asking students to explain what they know, having them interact with each other through peer review and helping them build better writing skills through a revision process. The heart of M-Write is an automated peer review platform for creating, assigning, distributing and tracking writing prompts in those large courses.

Eventually, the project was subsumed into the Digital Innovation Greenhouse (DIG), a division of U Michigan's Office of Digital Education & Innovation that works with the campus community to help grow and scale their education innovations. In 2015, M-Write was awarded a $1.9 million, five-year grant to help introduce the methods into more large, introductory courses and to embed technology into the process. One aspect of that automation was to build in the use of ECoach, another DIG program initially introduced for physics courses, that delivers personalized messages to large-class learners.

This fall the automated text analysis will be tested in one course — Statistics 250 — to predict the overall score from a student's piece of writing. This same course has been using earlier iterations of M-Write. When a student submits his or her essay, the automated text analytics will assess it, looking for those qualities of good writing that have been designed into the algorithm. The technology will generate a numeric score for each component of a rubric, all of which will contribute to a summary or "predicted score."
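The article does not disclose how M-Write actually combines rubric scores, so the following is only an illustrative sketch of the general idea: each rubric component receives a numeric score, and those component scores are combined (here, by a hypothetical weighted sum with made-up component names and weights) into a single summary "predicted score."

```python
# Illustrative sketch only — the rubric components, weights and the
# weighted-sum rule below are assumptions, not M-Write's actual algorithm.

RUBRIC_WEIGHTS = {
    "claim": 0.40,         # hypothetical component: quality of the claim
    "evidence": 0.35,      # hypothetical component: use of evidence
    "organization": 0.25,  # hypothetical component: essay organization
}

def predicted_score(component_scores: dict[str, float]) -> float:
    """Combine per-component rubric scores into one summary score."""
    total = sum(
        weight * component_scores[component]
        for component, weight in RUBRIC_WEIGHTS.items()
    )
    return round(total, 2)

# Example: an essay scored 3, 4 and 2 on the three components.
print(predicted_score({"claim": 3.0, "evidence": 4.0, "organization": 2.0}))
# prints 3.1
```

A summary score like this could then be passed along to ECoach and reviewed by a writing fellow, as the article describes.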

The details of that score will be shuttled to ECoach and then verified by a human — specifically a "writing fellow." This is a student who has previously excelled in the class and who will help the writers develop initial drafts, make revisions and guide them in giving feedback to others and using the feedback they receive.

"The writing fellow will serve as a checkpoint between [the text analytics] and the students," said Dave Harlan, principal developer of M-Write at DIG, in a campus article about the project.

For its part, ECoach will be used to send students messages about what makes a good peer review and a good revision of an essay, and to add next steps to a to-do list built into the coaching program.

Initially, the text analytics will be used to identify students who need help as early as possible. But the DIG development team is already looking beyond that goal. The combination of human and automated assessment is "very interesting" to them because it will have two outcomes: providing "direct feedback" on the algorithm's development, allowing them "to create a better one," said Chris Teplovs, lead developer at DIG, and giving "human graders a moment to pause and reconsider their assessments." The result could be better teaching, he noted.

In one test of the automated text analysis technology, the doctoral students analyzing a set of essays noticed a jump in essay quality between semesters for one of the course prompts. Based on a comparison of both sets of essays, the professor for that course changed his teaching approach for certain topics.

"Overall our goal is to improve student learning and a corollary of that is improving teaching," Teplovs said.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
