Data Analytics | Feature
Big Data Helps Underperforming Students Succeed
- By Bridget McCrea
In their book, Big Data: A Revolution That Will Transform How We Live, Work, and Think, authors Viktor Mayer-Schönberger and Kenneth Cukier examine society's ability to harness information in new ways to produce useful insights or goods and services of significant value.
Respectively a professor of Internet governance and regulation at the Oxford Internet Institute at Oxford University and data editor for The Economist, the authors define "big data" as something that one can do at a large scale that cannot be done at a smaller one. It's about extracting new insights or creating new forms of value, they write, in ways that change markets, organizations, the relationship between citizens and governments, and more.
Big data -- and the analytical systems used to dissect it -- is also changing college campuses, where underperforming students who once went unnoticed for semesters or even years are now identified, and offered interventions, more quickly. Generated by the myriad information systems used on campus, big data can be collected, analyzed, and compared to data across the entire student population. Administrators and professors can detect important trends and make quick decisions about specific students. Here's how two colleges are tapping into their own big data to help improve student success.
The Big Picture
Four years ago, the leadership team at Rio Salado College in Tempe, AZ, started discussing new ways to use the data being generated by the school's proprietary learning management system (LMS), known as "RioLearn." Out of those discussions, Rio Progress and Course Engagement (RioPACE) was born. In use since April 2010, the proprietary system pulls information from RioLearn and allows the college to quickly ascertain students' progress in specific courses using a color-coded rating system.
"It was only available for instructors but then in 2011 we opened the system up for students to see as well," explained Jennifer Freed, dean of instruction and academic affairs. The system's code has been tweaked several times over the last few years, namely by "institutional researchers who wanted to make the analytics more accurate," Freed said. When the latest version of RioPACE is rolled out in the fall of 2013, both academic advisors and the instructional help desk staff will also gain access to the data.
Freed said opening the analytics system up to a wider audience will make the solution even more useful when it comes to intervening with underperforming students. "It's great for an instructor to see that Susan is at risk in her chemistry class, but how is Susan doing in English?" said Freed. "Our new setup will provide a fuller, more complete picture of the individual student across departments – rather than just on a course-by-course basis."
According to Freed, RioPACE focuses only on variables that are the result of student behavior and that can be affected by instructor or support staff interventions. The three key data points are students' RioLearn login frequency, their site engagement (how often they interact with the site and what they do there), and their pace in a specific course (measured by points earned versus total possible points). Current performance is then compared to historical data, Freed explained, and given a green (good), yellow (possible trouble ahead), or red (warning!) rating.
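To make the idea concrete, a rating system of this kind might be sketched as follows. This is purely illustrative: the weights, thresholds, and function names are assumptions, not Rio Salado's actual RioPACE algorithm, which the college has not published.

```python
# Hypothetical sketch of a RioPACE-style rating. The three inputs mirror
# the article's data points (login frequency, site engagement, course
# pace); the weights and thresholds below are invented for illustration.

def course_pace(points_earned: float, points_possible: float) -> float:
    """Pace as the article describes it: points earned vs. total possible."""
    return points_earned / points_possible if points_possible else 0.0

def rate_student(logins_per_week: float, engagement_score: float,
                 points_earned: float, points_possible: float,
                 historical_average: float) -> str:
    """Blend the three behaviors into a single green/yellow/red flag."""
    pace = course_pace(points_earned, points_possible)
    # Weighted blend of the three signals (weights are assumptions).
    score = (0.3 * min(logins_per_week / 5.0, 1.0)
             + 0.2 * engagement_score
             + 0.5 * pace)
    # Compare current performance against historical data, as Freed describes.
    if score >= historical_average:
        return "green"     # good
    if score >= 0.75 * historical_average:
        return "yellow"    # possible trouble ahead
    return "red"           # warning!
```

Note how this framing reproduces the behavior Freed mentions: a student can be earning a B+ (a strong pace score) yet still land in yellow if logins and engagement lag behind historical norms.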
Receiving a color-based rating doesn't always go over well with students, particularly when a "yellow" pops up in a course where they're making acceptable progress. "A student can have a B+ in a course and still be in the yellow zone," said Freed, whose team has spent much time educating both students and instructors on the three zones and exactly what they mean. "Yellow doesn't mean a student is failing, but it is a sign that there are things he or she can be doing to increase the odds of doing well."
Freed said Rio Salado College students have benefitted from their school's data-based early intervention program, which boasts an 80-90 percent accuracy rate and has led to a 40 percent decline in drop rates when combined with instructor e-mails and phone calls. Additionally, the school has learned that general education students who log into RioLearn on their first day of school succeeded in class 21 percent more often than those who didn't. "It's about getting them in early," said Freed, "and then carefully tracking their success from day one right through to graduation."
Bridging the Gap
K. Sata Sathasivan, senior lecturer for the School of Biological Sciences at the University of Texas at Austin, knows that university professors aren't always thinking in terms of the "big picture" when it comes to student success. In many cases, according to Sathasivan, instructors are focused on teaching a single course and ensuring that as many students as possible pass the associated final exam. This type of tunnel vision has existed in the university setting for decades, he adds, but is currently changing with the help of big data and analytics.
"As instructors, each of us is thinking in terms of pieces of a whole puzzle and not about the puzzle itself," said Sathasivan, "and that can take the focus off student performance." To help bridge that gap, the University of Texas has been beta-testing a system that it's calling "virtual tracking of real-time assessment" or vTRAC. A classroom response system, vTRAC lets students use their own devices (laptops, tablets, or smartphones) to enter their answers. The system tracks where students are sitting within the class and how they answered a given question. This allows Sathasivan and his teacher assistants to immediately identify students who are struggling and provide assistance on the spot.
Sathasivan said vTRAC picks up where traditional classroom response systems or "clickers" leave off by digging deeper into the information and interconnecting it with historical classroom data. "We not only want to know how many students answered A, B, or C," said Sathasivan, "but we also want to be able to track student performance in real-time, and whether students misunderstood the information or if they just hit the wrong response accidentally."
According to Sathasivan, the cumulative information generated by vTRAC helps to identify students who are at risk of dropping out of classes and alerts instructors to the need for early intervention, based on each student's classroom location, which is entered into the system as students arrive. "We don't give them the answers, per se, but we do help the individual students better understand the material," said Sathasivan, who is already seeing positive results after two semesters of beta-testing vTRAC.
"We've been seeing increases in student engagement, participation, and engagement since we started using the system," said Sathasivan, who estimates that about 95 percent of students now make it through to the final exam stage. "I can now count on one hand the number of students who get Ds or Fs in my classes. That's pretty significant in a challenging subject area like science."
Sathasivan said he's also been able to raise the bar when it comes to developing exam questions and course expectations, and to create a classroom environment where students do more than just memorize biology notes. "Rather than relying on PowerPoint presentations," he said, "I can actually teach classrooms full of 120 students and then watch as their performance and knowledge are enhanced."