Viewpoint

Analytics at Scale

MOOCs should be the Holy Grail of student data, but they aren't there yet.

One of the great promises of massive open online courses, besides making education more accessible for more students, is the treasure trove of student data collected on a grand scale.

Large amounts of data are exactly what higher education needs to stay relevant in this era of disruptive change, as Arizona State University's Adrian Sannier pointed out in his keynote at last year's Campus Technology annual conference. The only way to make sure colleges and universities are continually boosting student success, he said, is evidence-based pedagogy. And that requires scale: "You can't take evidence one class at a time, one person at a time — it takes too long, you don't get a broad enough sample…. I'm not sure you can do it at a university, at a single institution. You may not have enough scale, you may not have enough size."

Yet scale can be both a blessing and a curse, as evidenced by the preliminary data analysis (released in late January) from MIT and Harvard's first year of edX MOOCs. The data sets are massive: 841,687 course registrations from 597,692 unique users, generating about 340 GB of total data. But the conclusions the researchers were able to draw are not nearly as interesting as the ones they couldn't.

One might easily assume that a large set of MOOC data would reveal patterns in student behavior and suggest some key indicators for predicting student success. It turns out the MOOC universe is not so easy to parse, perhaps because of the large percentage of students who treat MOOCs as Web content to surf rather than as courses to complete, pointed out Andrew Ho, associate professor at the Harvard Graduate School of Education and a lead researcher for the MITx/HarvardX study. As a result, he said, "Everything predicts MOOC performance, because doing anything in this space separates you from the thousands of people who are doing relatively little — thus doing anything predicts doing anything else." (Read our interview with Ho, "Inside the First-Year Data From MITx and HarvardX," in our March issue.)

In fact, Ho said, the study took care to call MOOC participants "registrants" rather than students, emphasizing that they could not be viewed through the conventional analytic lens of higher education.

That's not to say that predictive behavior patterns don't exist for MOOC students — it's just that, ironically, we don't have enough data to separate out what's meaningful at MOOC scale.

More research is needed, and it's up to higher ed institutions to work together to achieve the scale needed to generate actionable data. As University of Michigan CIO Laura Patterson put it in this month's cover story, "We have to be more open to collaboration across the industry as we look at the changes driven by research and teaching and learning at scale — learning analytics particularly…. We are in an era in which no single university is going to meet the challenges alone. Universities working together to meet demands of the future will be a critical part of our success."

This story appears in the March 2014 digital edition of Campus Technology.

About the Author

Rhea Kelly is editor in chief for Campus Technology, THE Journal, and Spaces4Learning. She can be reached at [email protected].
