Community Colleges | Feature

An 'Ensemble Model' for Student Analytics: Driving Academic Success in the Community College

South Orange County Community College District Vice Chancellor Bob Bramucci explains how SOCCCD is expanding its toolchest for student success, why that's more crucial now than ever, and what the institution is doing to help other colleges do the same.

South Orange County Community College District is expanding the scope and reach of its software toolbox for student success. While known most widely for Sherpa, the recommendation engine, SOCCCD has an entire software suite of tools to assist students, including MAP (the My Academic Plan software), which has been integrated with Sherpa. More recent efforts have included work on the addition of an analytics module, and funding has been secured for development of a student success dashboard. CT asked Vice Chancellor of Technology and Learning Services Bob Bramucci for an update as the district constructs its plans to offer the software to other institutions as a cloud-based service.

Mary Grush: You've stated that SOCCCD would like to make its tools for student success available to other institutions. What steps have you taken toward this?

Bramucci: First, we have been meeting with several consultancies that are our development partners or potential partners, including Neudesic (Irvine, CA) and Microsoft, to investigate the question: How do we take all these things that we have created, such as MAP (the My Academic Plan software) and Sherpa (the recommendation engine), and provide them for other institutions to use as a cloud-based service? This effort is ongoing, but we are now beginning to find out just what would have to be done to stand our software up once at a statewide or national level rather than having individual colleges run separate instances in their own server rooms.

A second significant event is that the California Community College Chancellor's Office has released three relevant RFAs: one for a statewide assessment database and a system of common testing; another for online programs such as the California Virtual Campus; and a third for academic planning.

The RFAs provide a perfect vehicle to make the State Chancellor's Office aware that we would like to offer our suite of student success software royalty-free for California community colleges to use system-wide. (That doesn't mean "cost-free," because there would still be integration, maintenance, and cloud hosting expenses, but California's community colleges would not have to pay the software license fees they would pay a commercial vendor.) So, we will respond to the RFAs, probably not to become the fiscal agent that runs all those programs, but rather to indicate our willingness to work with whichever institution ultimately does.

Grush: The predictive analytics project has been your most recent development. Could you describe where you are with that, as well as what progress you have made towards creating a student dashboard?

Bramucci: We are just finishing up the research phase and entering the product phase of our predictive analytics project; and we have secured the funding ($0.5 million to get started) for our student success dashboard project — work will begin on that this fall.

Grush: Could you describe your approach to the analytics project?

Bramucci: We worked with three mathematicians. In the very beginning stages, we wanted to keep them apart so they wouldn't influence or limit each other's creativity. By doing that, we benefited from their different strategies in specific ways. For example, Sandeep Jayaprakash, working with Josh Baron at Marist College, took an approach of determining whether a student would be successful in any or all of their classes. That more global measure, I think, will be very useful for our dashboard.
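The "global" approach described above, predicting whether a student will succeed across their classes, is in essence a binary classification. The interview doesn't specify the technique used at Marist, so the following is only a minimal illustrative sketch using a logistic model; the features, weights, and threshold are all invented for illustration:

```python
import math

# Hypothetical sketch of a "will this student be successful?" predictor.
# The logistic model, feature names, and weights are invented examples,
# not the actual Marist/SOCCCD model.

def success_probability(features, weights, bias):
    # Squash a weighted sum of student features into a 0-1 probability.
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# e.g., [normalized prior GPA, LMS activity level, course-load factor]
student = [0.8, 0.6, 0.5]
weights = [2.0, 1.5, -0.5]   # illustrative, not fitted to real data
p = success_probability(student, weights, bias=-1.0)
at_risk = p < 0.5            # a flag a dashboard could surface
```

A single probability like this is easy to roll up into a dashboard indicator, which is why a global success measure is attractive for that purpose.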

Dr. Padhraic Smyth at the Center for Machine Learning and Intelligent Systems at the University of California, Irvine, instead predicts actual grades for each course that a student might take. He also evaluated several machine learning techniques that we might turn to. From these we adopted an ensemble model, which combines several techniques, with the idea that several together work better than any single technique alone.
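The core idea of an ensemble model, as described above, is that combining the predictions of several independent techniques tends to work better than relying on any one of them. The sketch below illustrates the simplest form of this, a weighted average of per-course grade predictions; the individual model functions and student features are stand-ins invented for illustration, not SOCCCD's actual components:

```python
# Illustrative ensemble: average the grade predictions (0.0-4.0 scale)
# of several independent techniques. Every model here is a made-up
# stand-in for a real fitted model.

def linear_model(features):
    # Stand-in for a regression-based grade predictor.
    w = [0.8, 1.5]  # illustrative coefficients
    return sum(x * c for x, c in zip(features, w))

def nearest_peer_model(features):
    # Stand-in for a similarity-based predictor (grades of similar students).
    return 3.0 if features[0] > 0.5 else 2.0

def rule_based_model(features):
    # Stand-in for a simple heuristic predictor.
    return 2.5 + 0.5 * features[1]

def ensemble_predict(features, models, weights=None):
    """Weighted average of each model's predicted grade."""
    weights = weights or [1.0 / len(models)] * len(models)
    return sum(w * m(features) for w, m in zip(weights, models))

student = [0.7, 0.9]  # e.g., normalized prior GPA and attendance
models = [linear_model, nearest_peer_model, rule_based_model]
predicted_grade = ensemble_predict(student, models)
```

In practice the component models would be trained on historical enrollment data and the ensemble weights chosen by validation, but the combining step itself is just this kind of weighted vote.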

Ted Younglove of Cerritos College in California concentrated on a linear regression math model.
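For readers unfamiliar with the linear regression approach mentioned above, it amounts to fitting a straight line that maps a predictor (such as prior GPA) to an expected course grade. The sketch below shows ordinary least squares on a single predictor; the data points are fabricated for illustration and are not Cerritos College's model or data:

```python
# Minimal sketch of simple linear regression: fit grade = a + b * x
# by ordinary least squares. All numbers are illustrative only.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# e.g., prior GPA vs. grade earned in a target course (made-up data)
prior_gpa = [2.0, 2.5, 3.0, 3.5, 4.0]
course_grade = [1.8, 2.2, 2.9, 3.3, 3.8]
a, b = fit_line(prior_gpa, course_grade)
prediction = a + b * 3.2  # predicted grade for a 3.2-GPA student
```

A model this simple is transparent and easy to explain to students and advisers, which is one reason linear regression remains a common baseline in this kind of work.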

We are now at the end of our research, and we have mathematical models that are predictive. The task now is to focus on how to get that information out to the student. We think important opportunities exist in the academic planning process and in the registration process; those are two prime places to put academic recommendations in front of students.

The overarching goal is that we want students to be making informed decisions whenever they are choosing classes. Of course, that doesn't mean recommending the "sure thing" or "easy classes" to increase the chance of higher scores! That is not our intention at all. What we want is to identify the classes that are most important for the student's program and for the student to be successful.

You know, there's been a sea change in community colleges, from a sole emphasis on providing access to education for all, to placing a much greater, additional emphasis on the success of the students you have already admitted. We're really moving from looking at input to looking at output. So with our student success software we are trying to provide good advice to students, while still preserving that traditional mandate for access. The systems we are building — the predictive analytics, the dashboard, and all the tools for student success — are created with all that in mind.

About the Author

Mary Grush is Editor and Conference Program Director, Campus Technology.
