Student Success

Developing Better Interventions for At-Risk Students

A million-dollar grant is helping John Carroll University fine-tune a targeted intervention and early alert system that helps boost student learning and retention.

In 2015, the federal government awarded $60 million to 18 colleges and universities, chosen from a pool of hundreds, to develop innovative approaches for supporting at-risk students. John Carroll University, a four-year liberal arts institution in Ohio with around 3,000 students, was among those chosen, receiving a $1.3 million "First in the World" grant from the U.S. Department of Education. The question at the heart of our project was simple: What processes and procedures could we put in place to improve the first-year experience at John Carroll — an experience inherently linked to broader goals like student learning and retention?

Coming down the homestretch of the four-year grant, we've learned a lot about developing and testing a new first-year experience model. In particular, we've discovered a great deal about the value of multidimensional data visualizations for better predicting which students may be the most at risk.

Linked Learning for At-Risk Students

At the time we were developing the First in the World project, John Carroll was already implementing a new integrative core curriculum for its students, including linked courses, also known as student co-enrollment. But students couldn't co-enroll until sophomore year.

We decided to build on this approach for incoming students. Instead of putting higher-risk first-year students into developmental or remedial courses, we placed them directly into integrated learning communities, a concept that had shown promise at the secondary school level. Linked courses were one aspect of this: One course was typically content-based, like science or math, while the other was an application course, like writing or speech. We also hosted faculty development workshops, organized discussions with guest speakers and experts, and offered service learning and advanced student advising. Overall, we hoped the communities would help students both stay in school and achieve better academic outcomes.

One way to test this approach would have been to conduct a randomized controlled trial, but the trouble with that methodology is that many students who needed help wouldn't get it. We didn't want to select a random group of students; we wanted to proactively identify students who were at risk.

Regression Discontinuity

We opted for an approach called regression discontinuity. We relied on the College Student Inventory survey, which uses non-cognitive indicators like academic motivation and receptivity to counseling to predict which students may encounter the most academic hurdles. Students took the survey upon accepting their offer to attend John Carroll and again at the midterm of the second semester. We used a stanine scale, which runs from 1 to 9; students who scored 5 or higher on predicted academic difficulty were eligible for the integrated learning community outlined above. This was the "gold group." For other students, it was business as usual — they comprised the "blue group."
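
To make the mechanics concrete, here is a minimal sketch, in Python, of how a sharp regression discontinuity design like ours can be expressed in code. The data are simulated and the variable names are purely illustrative, not our actual data schema; the essential ideas are the assignment rule at the stanine cutoff and the estimated jump in outcomes at that threshold.

import numpy as np
import statsmodels.api as sm

# A sharp regression discontinuity in miniature. All data here are
# simulated and the names (stanine, retained, CUTOFF) are illustrative.
CUTOFF = 5  # stanine score at or above which students got the intervention

rng = np.random.default_rng(0)
stanine = rng.integers(1, 10, size=500)      # 1-9 predicted-difficulty scores
gold = (stanine >= CUTOFF).astype(int)       # the assignment rule: gold vs. blue
# Simulated retention: a baseline trend in risk plus a bump for treatment
retained = (rng.random(500) < 0.85 - 0.03 * stanine + 0.10 * gold).astype(int)

# Fit a linear model with separate slopes on each side of the cutoff;
# the coefficient on `gold` estimates the jump in outcomes at the threshold.
centered = stanine - CUTOFF
X = sm.add_constant(np.column_stack([centered, gold, centered * gold]))
print(sm.OLS(retained, X).fit().params)      # index 2 = discontinuity estimate

The point of the design is that students just above and just below the cutoff are nearly identical, so comparing outcomes right at the threshold isolates the effect of the intervention itself.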

We chose a relatively low cut-off for the intervention because we wanted to ensure we were supporting students who fall in the "murky middle," as the academic literature calls it. These students, hovering between a 2.0 and a 3.0 GPA, for instance, aren't usually offered support teams or an emergency response; those approaches are reserved for the extremes. Yet mid-level students still have a higher chance of withdrawing from school. Their GPAs might be okay, but something else could be causing challenges. We wanted to proactively identify these students and those challenges.

At this point, it's important to note that the goal of our project wasn't to prove integrated learning communities represent the best first-year experience for every institution. Instead, we sought to demonstrate a useful methodology — a way schools that wanted to implement a program aligned with their own goals could identify which students to enroll in the program, and then track those students' progress.

Developing Predictive Analytics

The learning intervention itself is only one component of our grant. We are also working to better identify factors predictive of student success and, in turn, to develop predictive analytics capabilities to proactively identify especially at-risk students.

To do so, we must be able to see and understand a significant amount of data. We've now had three cohorts of first-year, first-time students, with more than 600 variables in each data set. While the College Student Inventory survey was used to determine which students got the intervention, we also used two other surveys throughout the course of our research: the EQ-i, an emotional intelligence questionnaire, and the Thriving Quotient, a holistic student survey developed at Azusa Pacific University in California. Of course, we also have standard outcome data, like credit accumulation and GPA, along with standard demographic information.
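
For institutions starting down this path, the data-wrangling step is mundane but essential. A minimal sketch, with hypothetical file and column names rather than our actual systems, might join the survey instruments and registrar records on a shared student ID:

import pandas as pd

# Hypothetical file and column names; the real instruments are the
# College Student Inventory, EQ-i, Thriving Quotient and registrar records.
csi = pd.read_csv("csi_survey.csv")
eqi = pd.read_csv("eqi_survey.csv")
thriving = pd.read_csv("thriving_quotient.csv")
registrar = pd.read_csv("registrar_outcomes.csv")  # GPA, credits, demographics

# Join everything on a shared student ID so each row is one student
# with all of his or her ~600 variables side by side.
cohort = (
    registrar
    .merge(csi, on="student_id", how="left")
    .merge(eqi, on="student_id", how="left")
    .merge(thriving, on="student_id", how="left")
)
print(cohort.shape)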

During the second half of our grant, we started working with GlyphEd, which makes a unique tool that brings all of this data together and displays it in a way that's far more intuitive than SPSS printouts. GlyphEd uses "glyphs" — three-dimensional collections of geometric objects and data points (academic, extracurricular and personal) representing each student. Through the glyphs' different colors, shapes, sizes and positions, we can compare students, spot anomalies and identify patterns that are crucial to student success and retention.
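
GlyphEd's tool is proprietary, but the underlying idea of encoding many variables at once can be sketched with ordinary plotting libraries. The example below is a stand-in, not GlyphEd's software: it assumes the merged cohort frame from the earlier sketch and hypothetical column names, and maps five variables onto position, size and color in a single view.

import matplotlib.pyplot as plt

# A stand-in for the glyph idea: map several student variables onto
# position, size and color in one 3D view. Column names are hypothetical
# and assume the merged `cohort` frame from the earlier sketch.
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(
    cohort["csi_motivation"],      # x: academic motivation
    cohort["eqi_total"],           # y: emotional intelligence score
    cohort["gpa"],                 # z: first-year GPA
    s=cohort["credits_earned"],    # marker size: credit accumulation
    c=cohort["aid_perception"],    # color: perception of financial aid
    cmap="coolwarm",
)
ax.set_xlabel("Motivation")
ax.set_ylabel("EQ-i")
ax.set_zlabel("GPA")
plt.show()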

For instance, with our first cohort of students, we noticed a surprising trend: Students' perceptions of financial aid appeared to be a significant predictor of withdrawal. Those who ended up leaving the university scored very low on this measure. Such insights can drive targeted implementation of additional support services.
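
Once a candidate predictor like this surfaces visually, it can be checked statistically. A minimal sketch, again with hypothetical column names continuing from the merged cohort frame, might fit a logistic regression of withdrawal on aid perception alongside standard controls:

import statsmodels.api as sm

# Hypothetical column names, continuing from the merged `cohort` frame.
# A negative coefficient on aid_perception would match the pattern we
# saw: students scoring low on that measure withdrew more often.
y = cohort["withdrew"]                         # 1 = left the university
X = sm.add_constant(cohort[["aid_perception", "gpa", "credits_earned"]])
print(sm.Logit(y, X).fit().summary())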

This use case is highly replicable for other universities. Most have similar survey and demographic data on hand. The challenge is reconciling and understanding it — and multidimensional data visualization can help.

Lessons from First in the World

While we're in the fourth and final year of the First in the World grant, in many ways this is just the beginning. We are still working to clean and recode all of the data we have, and are still working alongside GlyphEd to develop a uniquely simple tool that can bring predictive analytics to more universities. We're also finding new uses for glyphs that go well beyond uncovering new indicators for at-risk students; they can also be used to manage and eliminate bias during the enrollment process, for instance.

But, in my opinion, the key learning point from our First in the World research is its very premise: We wanted to deliver a learning intervention — one aligned with John Carroll's broader institutional goals — to those students who really needed it. Predictive analytics may be our finish line, but regression discontinuity was the starting line — and it's a methodology that any university can apply when designing first-year experiences and targeted interventions.
