
Viewpoint

Down the Analytics Rabbit Hole

The search for data that impacts student success may lead to more questions than answers.


When we first sketched out a topic for this month's feature on the quest for data that impacts student success, we had a slightly different story in mind. We wanted to come up with a handful of data types that any higher ed institution pursuing learning analytics should have on its list — say, five key data points that really impact student outcomes.

Of course, boiling analytics down to five types of data would be an oversimplification, but we hoped it could generate some interesting discussion. And with projects like the PAR Framework and Open Academic Analytics Initiative making great inroads in the field — generating predictive models with a significant degree of accuracy across various types of institutions — our goal seemed attainable. Surely the experts would agree that certain basic data — lecture capture usage, LMS activity, etc. — universally contributed to academic performance.

In reality, what we encountered was something of an analytics rabbit hole. Take lecture capture usage, for example: The University of Saskatchewan found that students in second-year STEM classes who accessed lecture capture systems weekly earned higher grades than students who did not regularly integrate lecture capture into their study habits. But when the model was applied to a first-year social sciences class, it "didn't hold up very well at all," as researcher Christopher Brooks told us. "There are so many different variables," he said — differences in the course, the students, the domain or the instructor could all skew the results.

Similarly, researchers at Brigham Young University (UT) determined that basic LMS data — students' time spent within the system and the number of pages they view — is not enough to predict student performance. By taking a more granular approach and categorizing specific types of page visits, they were able to uncover some insights into student learning, but those findings still require further exploration.

It's a realization that I imagine many institutions pursuing learning analytics must have upon delving into the data: The closer you look, the more questions you discover.

That doesn't mean higher ed should stop searching for answers in big data; rather, I think it highlights the need for continued research, and more important, the need to share findings and collaborate across institutions. And here's one place to start: At our Campus Technology Fall Forum this Nov. 4–5 in Chicago, University of Michigan professor Perry Samson will speak on "Predicting Student Success in Hybrid Courses Based on Student Participation," offering his lessons learned from linking student participation data with student outcomes in a large lecture class at U-M. Don't miss it!

About the Author

Rhea Kelly is editor in chief for Campus Technology, THE Journal, and Spaces4Learning. She can be reached at [email protected].
