
Student Retention | Feature

Shining a Light on Retention

Three institutions created their own customized programs to keep at-risk freshmen in school, with positive results.


American higher education is suffering from a dropout epidemic. About 30 percent of freshmen at four-year colleges don't return for their sophomore year, according to a 2010 report by the American Institutes for Research. Such a high failure rate threatens to make a mockery of President Obama's goal for the US to have the highest proportion of college graduates in the world by 2020. It also hits employers and taxpayers hard. The report, Finishing the First Lap: The Cost of First-Year Student Attrition in America's Four-Year Colleges and Universities, estimates that, between 2003 and 2008, states and the federal government spent $9.1 billion in appropriations and grants on students who dropped out after freshman year.

With stakes this high, colleges and universities are pushing to understand the dropout issue and find solutions to keep students in school. And just as there are myriad reasons why students drop out, institutions are discovering different ways to address the problem. Here, CT takes a look at the efforts of three universities that are making headway.

Carroll University (WI)
Five years ago, Douglas Hastad, the incoming president of Carroll University, made it clear that, in his view, the most important indicators of the university's success were its graduation and retention rates. Sure enough, soon after the new president's arrival, Jim Wiseman, VP of enrollment, was tasked with finding a way to improve the student retention rate. Wiseman, who had enjoyed great success using predictive modeling in the school's enrollment process, decided to see if the same approach could work for retention.

Wiseman immediately focused on the transition from freshman to sophomore year, when attrition is highest. His first challenge was determining which data would best identify at-risk freshmen.

"A lot of people associate 'at-risk' with academics," notes Wiseman, "but that's just one factor in why a student may be driven to leave school." In addition to academic data, Wiseman wanted to utilize data from the admissions office, the financial aid office, the registrar, the athletics department, student life -- basically any office that compiles student data. He then planned to weight each type of data according to its historical impact on student retention. There was one big problem, though: Each department had its own data management systems, and housed its data in individual "silos" across campus. Wiseman had no way to access all of the data from a central location.

Wiseman knew that he needed to bring in a technology powerhouse to break down these silos and securely access and analyze their data. In 2008, Carroll University partnered with Jenzabar to develop Jenzabar's Retention Management Solution (RMS). The university already had a strong relationship with Jenzabar: The company was Carroll's ERP provider and had recently become its portal provider, too.

"Our relationship with Jenzabar has been wonderfully symbiotic," says Wiseman. "They launch the product; we test it on campus; we give feedback; and then they make changes. We're on our third version of the RMS, and the features just keep improving."

Data on freshman students feeds directly from the university's Jenzabar ERP system into the RMS. Using a mathematical model derived from the school's historical retention data, the RMS analyzes the data nightly to predict how likely each student is to drop out.

"We began with almost 200 data points," explains Wiseman. "As we started running them through the model, we threw out those that were revealed to be redundant, unreliable, or irrelevant." Among the data analyzed are high school transcripts, historical retention rates sorted by major, out-of-pocket tuition payments, grades, the assessment of late fees, campus employment earnings, open holds, student alert forms, involvement records, and parent- and student-survey information.

"These are the data points that worked for our university," notes Wiseman. "I've helped other schools with their retention systems, and every school tends to be a little different. Some data points overlap, but every school seems to have its own unique factors that affect student retention."

Reports on at-risk students are generated by the system's dashboard, which uses a customized formula to categorize students according to the probability of their leaving. This allows the university to proactively intervene in a way that's appropriate for each student's situation. "The dashboard functions almost like a stock market ticker," explains Wiseman. "It actually shows arrows going up or down for each individual student." Students are classified as safe, at-risk, or critical, depending on the trajectory of their data. When a student is identified as being in trouble, his profile is used to identify an adviser, coach -- any faculty member with whom he has established trust -- who can contact the student in an effort to resolve any issues.
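
The dashboard's bucketing could boil down to something like the following sketch. Since Carroll's customized formula isn't public, the thresholds and the trend rule are invented purely to illustrate the safe/at-risk/critical categories and ticker-style arrows Wiseman describes.

```python
# Invented thresholds and trend rule; Carroll's customized formula is not
# public. Probabilities come from a nightly scoring run like the one above.
def classify(prob_today: float, prob_yesterday: float) -> tuple[str, str]:
    if prob_today < 0.25:
        status = "safe"
    elif prob_today < 0.60:
        status = "at-risk"
    else:
        status = "critical"
    # Ticker-style arrow: is this student's risk rising or falling?
    arrow = "up" if prob_today > prob_yesterday else "down"
    return status, arrow

print(classify(0.42, 0.35))  # ('at-risk', 'up')
```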

With 725 freshmen to monitor, Carroll University also decided to create a new department, the Office of Student Success, whose sole function is to increase student retention rates. "It's a marriage between the high-tech approach of using predictive modeling to identify students that need attention and the hands-on approach of having supportive staff who can intervene quickly to work with those students," says Wiseman. "This two-pronged approach is what makes our system work."

In the year after Carroll instituted its system, the freshman-to-sophomore retention rate jumped 2 percentage points, and the university has maintained that higher retention rate in the years since. "In this down economy, we're noticing more students with financial issues," says Wiseman, "so the fact that we've maintained that initial 2 percent increase is a strong indicator of our system's success."

Illinois Institute of Technology
Rather than partnering with an outside vendor as Carroll University did, IIT decided to build its own student retention system. It didn't hurt that project leader Matthew Bauer, the director of undergraduate advising since 2007, had taught computer science at the school for 15 years. Bauer used his expertise to develop the IIT Early Warning System in PHP and MySQL.

"Developing it in-house gave me more freedom," explains Bauer, "because the system was within our portal and thus not limited by our data center's development restrictions. And because I was one of them, the faculty gave me a lot of leeway. They understood that it wasn't going to be a clean, fully developed system on day one. They also knew how easily I could update and improve the system, and I encouraged them to suggest improvements."

By creating the system in-house, Bauer was also able to develop a predictive model that would work within IIT's unique academic environment. It's not uncommon for a student to change his major during his first year of college, but IIT's focus on engineering, science, and architecture means that a student who suddenly realizes he'd rather study social science is likely to leave at the end of the semester.

"The driving goal of the system was to get information about any issues with the students from the teachers as early as possible -- as early as the first or second week of the semester," says Bauer. "That way, you can have an adviser talk to the student as early as the second or third week of the semester to find out what the issue is. Is it a standard maturity issue? Is it time management? Is it study skills? Does he want to change his major from computer science to architecture? Or, does he want more of a liberal arts education?"

One of Bauer's biggest challenges was to persuade teachers to submit student attendance and performance data on a regular basis. The first iterations involved paper attendance sheets: "The amount of data entry that needed to be done and the amount of paper moving around were unbelievable," laughs Bauer. And then there was the online system that required teachers to log on and enter each student's academic and attendance information, which "was way too much extra work," he says.

Bauer then configured the Early Warning System to automatically generate a customized e-mail, containing course-specific lists of students (with both their names and student ID numbers), that is sent weekly to every teacher with freshman and sophomore courses. In their replies, the teachers put notes next to the names of any students with attendance or academic problems: "missed one class," "poor exam," or "not doing homework," for example. Each reply is sent to a dummy mailbox that parses the e-mail and loads the information into the system's database. "The teachers only have to type in information for the students who they feel may need help," says Bauer. "It's a system that probably wouldn't scale up to meet the needs of a big university, but for a small engineering school like ours, it works." More than 50 percent of the IIT faculty now submit student data on a weekly basis.
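
The parsing step Bauer describes might look roughly like the sketch below, written in Python rather than IIT's actual PHP/MySQL. The message format, student-ID pattern, and table schema are all guesses based on the article's description, not the real system.

```python
# A rough sketch of the "dummy mailbox" step: parse a teacher's reply and
# load each flagged student into a database. Format and schema are assumed.
import re
import sqlite3
from email import message_from_string

raw_reply = """From: prof@iit.example
Subject: Re: CS 115 weekly attendance check

A123456 Jane Doe - missed one class
A234567 John Roe - not doing homework
"""

db = sqlite3.connect("early_warning.db")
db.execute("""CREATE TABLE IF NOT EXISTS alerts
              (student_id TEXT, name TEXT, note TEXT,
               reported_on TEXT DEFAULT CURRENT_DATE)""")

body = message_from_string(raw_reply).get_payload()
# Lines like "A123456 Jane Doe - missed one class": ID, name, free-text note.
for match in re.finditer(r"^(A\d+)\s+(.+?)\s*-\s*(.+)$", body, re.MULTILINE):
    db.execute("INSERT INTO alerts (student_id, name, note) VALUES (?, ?, ?)",
               match.groups())
db.commit()
```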

On Monday mornings, the Early Warning System sends customized e-mail reports to each of the school's advisers, listing any of their students who had more than one absence or issue reported by their professors during the previous week. Reports are also sent to the school's athletic coaches, ROTC unit leaders, and the school's disability office. "Basically, anyone who has a just cause for interest in the academics of a student will be sent a report every Monday that lists any students with whom they should follow up," explains Bauer.
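
Continuing the hypothetical schema above, the Monday report could be a simple query for students with more than one issue logged in the past week -- again, a sketch of the idea, not IIT's actual code.

```python
# Flag any student with more than one issue reported during the previous
# week, using the invented "alerts" table from the sketch above.
import sqlite3

db = sqlite3.connect("early_warning.db")
rows = db.execute("""
    SELECT student_id, name, COUNT(*) AS issues
    FROM alerts
    WHERE reported_on >= date('now', '-7 days')
    GROUP BY student_id, name
    HAVING COUNT(*) > 1
""").fetchall()

for student_id, name, issues in rows:
    print(f"Follow up with {name} ({student_id}): {issues} issues last week")
```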

Factoring in these advisers, coaches, and other mentors, Bauer estimates that between 70 and 80 percent of IIT's faculty and staff are involved in the Early Warning System. And that involvement is paying off. Before the Early Warning System, IIT's freshman-to-sophomore retention rate was around 85 percent; it's shot up to 93 percent in the three years since the system was implemented. In a school with a freshman class of 400 to 500, that means IIT is retaining 30 to 40 more students each year than before.

"You know those cases where we realize at the end of the semester that a student was never attending class?" says Bauer. "That just doesn't happen anymore. We find those problems earlier. And because we find them earlier, we can do things to help the student. We can get them out of these situations before it becomes a financial headache for them. The students, in general, are happier about their experience at IIT, and even if they end up leaving, they're not leaving mad."
           
Purdue University (IN)
Both Carroll University and IIT developed systems that alert university officials when to intervene with an at-risk student. While Purdue University's retention system also notifies advisers, it puts primary responsibility for a student's success squarely on the shoulders of the one person who has the most control over that success -- the student himself.

In 2004, when Purdue began developing Signals, as its student retention system is known, it made one other decision that was outside the norm: The system would be designed around a course-specific algorithm rather than following an across-the-board institutional model. As a result, a student's at-risk status would be based on statistics tied to each course rather than aggregating the data into a single big picture. 

Signals, which currently focuses on freshman-level courses, mines data from the university's Blackboard Vista CMS and SunGard Higher Education Banner. "We look at three points of data," explains John Campbell, associate VP for information technology at Purdue. "First, we look at the academic preparation of the student, such as standardized test scores, which are pretty influential historically in the first year of college. Then we look at the effort the student's putting forth in class: How often is he interacting with that course's page within the CMS? And then we look at his performance in terms of grades."

After these three points of data are run through the course-specific algorithm, the student's status is displayed in the form of a traffic signal whenever he logs into a course's home page on the CMS. A green light indicates the student is doing well; a yellow light indicates that he is potentially at risk; and a red light indicates that he should get help immediately. "We want to give the data right to the students," says Campbell. "We want our students to become self-guided learners, to become aware of where they are in their learning."
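
Purdue hasn't published the Signals algorithm, but combining the three data sources into a traffic light might look schematically like this sketch. The weights and cutoffs are invented; in the real system the algorithm is specific to each course.

```python
# Schematic only: the weights and cutoffs are invented, and Purdue's real
# algorithm is tuned per course rather than fixed like this.
def signal(test_percentile: float, effort_vs_peers: float,
           grade_percent: float) -> str:
    """Combine preparation, effort, and performance into one color."""
    score = (0.2 * test_percentile    # academic preparation
             + 0.3 * effort_vs_peers  # CMS activity relative to classmates
             + 0.5 * grade_percent)   # performance in the course so far
    if score >= 70:
        return "green"   # doing well
    if score >= 50:
        return "yellow"  # potentially at risk
    return "red"         # get help immediately

print(signal(test_percentile=60, effort_vs_peers=40, grade_percent=55))
```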

Because freshman-level courses tend to have multiple sections throughout the week, and the algorithm compares the student's performance with that of students in each section of a course, the Signals system is updated only once a week. This way, the fact that a student with a Tuesday section contributed heavily to the course's Blackboard discussions on Sunday and Monday nights won't negatively affect the status of a student whose section falls on a Thursday.
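
A toy illustration of that section-relative comparison, with invented numbers: every student's effort is measured against the same frozen weekly snapshot, so the timing of activity between section meetings can't skew the result.

```python
# Invented numbers: everyone is scored against the same weekly cutoff,
# so posting activity between sections can't skew the comparison.
interactions = {  # CMS interactions counted up to the weekly snapshot
    "student_in_tuesday_section": 24,
    "student_in_thursday_section": 19,
    "student_in_friday_section": 7,
}

course_avg = sum(interactions.values()) / len(interactions)
for student, count in interactions.items():
    # 1.0 means exactly average effort for the course this week.
    print(f"{student}: {count / course_avg:.2f}x course average")
```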

When students click on the traffic signal, they're given specific guidance from the course instructors, such as reminders of when office hours are scheduled, or information on the math help desk. As the semester goes on, these guidance messages increase in intensity; faculty are encouraged to be blunt and direct. "We're finding that the tone of the message -- increasing that intensity -- is an important factor in how you reach out to the student," says Campbell. "We're trying to produce what we call 'actionable intelligence.' It's not enough to just identify the students at risk. We want the students to take action."

Since Purdue began piloting its retention system in 2007, more than 11,000 students have experienced Signals. For each course that participated in the pilot, Purdue divided the course sections into two groups: an experimental group, in which students used the Signals system, and a control group. Among students using Signals, Purdue has consistently seen a 10 percent increase in the number of A and B grades, and 30 percent fewer failing grades. It has also seen a 65 percent increase in the number of students who seek help when encouraged to do so by the Signals system. Furthermore, data from the pilots show that Signals students seek help earlier in the semester.

"Throughout the process, we've done student focus groups," says Campbell. "When we ask students what kind of grade they think they'll get in a class, they give a wishy-washy reply. However, when I ask them what color their signal is, they can tell me immediately, without hesitation. Students are paying attention."

Purdue recently partnered with SunGard Higher Education to develop Signals for cross-platform use with a variety of course management and administrative systems, and the underlying algorithm is being adapted for use by other schools. Purdue implemented its new and improved version on campus in January. Updates include increased automation, which allows the system to process data on an increased number of students, and a snapshot view that allows students and their advisers to see the signals for each of the students' courses on a single page in Blackboard Vista.
