


Powering Student Success with Systemwide Data

The University System of Georgia (USG) has 31 public institutions, including powerhouses Georgia Tech, the University of Georgia, Georgia Regents and Georgia State, with individual budgets running up to a billion dollars. Many of the schools that make up the extensive university system rely on major IT systems -- registration, student information, virtual libraries, applications supporting the physical libraries, financials, student advising software and plenty of others -- that are run by Information Technology Services and centrally provisioned from USG's private clouds hosted in data centers around the state.

Now many of those same institutions are working with USG to introduce predictive analytics into their operations. The initiative is turning out to be one of the largest experiments in higher ed to apply predictive analytics to the ongoing challenge of retaining students and helping them to achieve their education goals. At the heart of this endeavor is USG's learning management system.

Investing in the LMS

In 2011, following an extensive evaluation process, a USG task force recommended Desire2Learn as its next learning management system. That recommendation was accepted by the state's Board of Regents, and eventually 30 of 31 institutions within the system moved over.

According to Vice Chancellor and CIO Curt Carver, two schools, which were moving faster than the rest of the system, requested to have the vendor host their implementations. However, 28 others adopted the version hosted on the private cloud run by the system IT organization. That represents 260,000 students generating about 50 million hits a day, he reported, for all types of courses -- online, traditional, flipped and MOOC. They consume about 42 terabytes of storage just for that one enterprise system. Students and faculty access that content through a variety of means, including mobile devices.

Carver noted that the amazing buy-in among institutions for the centrally hosted version of Desire2Learn resulted from the combination of a "good funding model" and "flawless" deployment. "The first year of system deployment, the availability was 100 percent," he said. "We never went down. Since then we've run at 99.99 percent system availability. It very seldom goes down. Because it always works, the faculty and the staff count on it. Then they can innovate. Once you've got that running flawlessly, people will start to invest in it and take advantage of it."

From "Body Count" to Intervention

Now, a new use is being written for the information maintained within that LMS: powering predictive analytics, the use of data to monitor and predict behaviors and recommend actions to achieve preferred outcomes. For example, analytics may indicate that a student who fails to show up two times during the first two weeks of class has a far greater chance of withdrawing or failing altogether. Knowing that, a school can put strategies in place to change the potential outcome.
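As a rough illustration of the kind of rule Carver describes, the sketch below flags students whose early absences cross a threshold. The field names, the threshold and the AttendanceRecord structure are hypothetical, invented for illustration rather than drawn from USG's actual model.

```python
# Minimal sketch of an early-warning rule, assuming hypothetical attendance
# fields; the threshold of two absences mirrors the example in the text.
from dataclasses import dataclass

@dataclass
class AttendanceRecord:
    student_id: str
    absences_first_two_weeks: int

def flag_at_risk(records, absence_threshold=2):
    """Return IDs of students whose early absences suggest elevated risk."""
    return [r.student_id for r in records
            if r.absences_first_two_weeks >= absence_threshold]

# Example: two absences in the first two weeks triggers a follow-up.
students = [AttendanceRecord("S001", 0), AttendanceRecord("S002", 2)]
print(flag_at_risk(students))  # ['S002']
```

In practice the flag would feed an intervention workflow (an adviser call, a nudge e-mail) rather than simply printing a list.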

That's a far cry from the way schools are used to running their operations. "I'm an old military officer," Carver likes to say. "The older systems we had were great 'body count' systems. They could tell us how many students were dead at the end of the semester. You'd go run something in the [student information system] and figure out what happened in retrospect. What we'd rather do is figure out as it's going on and then be able to intervene so that students can successfully complete the course."

Carver likened this latest objective to a Gartner chart: "The dots are your students, and the dots move during the semester. The idea is to get all the kids up into the 'Magic Quadrant' and let them complete the course successfully."

Predictive analytics, he explained, "allows the instructors and support staff to intervene and do the advising, communications and support necessary to figure out what's going on with that individual student, so we can enable student success."

Success Models

The shift from managing student data and courses to predicting outcomes has been a few years in the making in Georgia. The members of the system come together on a regular basis to share what's working and what's not. One recent workshop offered a tech showcase in which three campuses -- from both within and outside the Georgia system -- talked about what they were doing to intervene and enable student success, recalled Carver.

First, Georgia State talked about its work with Education Advisory Board (EAB), a research and consulting firm that works with higher ed executives to address major performance challenges. According to a document about the project, in 2012 the institution went live with a Web-based "Graduation and Progression Success" (GPS) advising system that tracks 30,000 students nightly on more than 700 markers that identify academic risk factors. Some of those markers apply to all students; some to specific majors, each alerting advisers when a student has taken some action that puts him or her at risk. Since then the school has begun adding a parallel group of markers based on financial risk factors. The impact overall has been dramatic. In 2013, as a result of the GPS advising system and a number of other initiatives, the institutional graduation rate climbed 5.1 points over the previous two years, even as the student population grew in size, diversity and economic disadvantage.
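The article doesn't describe how the GPS markers are implemented, but a nightly sweep of rule-like markers might look roughly like the sketch below. Every marker, student field and threshold here is invented for illustration; only the overall pattern (markers that apply to all students or to specific majors, each raising an adviser alert) comes from the description above.

```python
# Hypothetical nightly "marker" sweep: each marker is a named predicate that
# applies either to all students (major=None) or to one major.
from typing import Callable, NamedTuple, Optional

class Student(NamedTuple):
    student_id: str
    major: str
    gpa: float
    core_course_grade: str

Marker = tuple[str, Optional[str], Callable[[Student], bool]]

MARKERS: list[Marker] = [
    ("low_gpa", None, lambda s: s.gpa < 2.0),                  # all students
    ("weak_core_grade", "nursing",                             # one major only
     lambda s: s.core_course_grade in ("D", "F")),
]

def nightly_sweep(students: list[Student]) -> list[tuple[str, str]]:
    """Return (student_id, marker_name) pairs that should alert an adviser."""
    alerts = []
    for s in students:
        for name, major, predicate in MARKERS:
            if (major is None or major == s.major) and predicate(s):
                alerts.append((s.student_id, name))
    return alerts

print(nightly_sweep([Student("S001", "nursing", 1.8, "C"),
                     Student("S002", "history", 3.4, "B")]))
# [('S001', 'low_gpa')]
```

A production system would evaluate hundreds of such markers against tens of thousands of records each night and route the alerts to the right advisers.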

Second, Valdosta State shared details of its partnership with Oracle to build a "more traditional" business intelligence approach that culls data from numerous systems into a data warehouse to generate automatic alerts, triggers and events. For example, if a student is absent or withdraws from a course, the adviser and the academic success center will be notified to follow up. More recent efforts have turned to information discovery, pulling data from multiple non-traditional sources -- including social media posts and blogs, verbatim text taken from surveys and evaluations, engagement with campus Web sites and data from government agencies -- to create a complete picture of the student.
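The alert-and-trigger pattern described above could be sketched as a simple event handler. The event types, recipients and function names below are hypothetical stand-ins, not Oracle's actual tooling.

```python
# Illustrative sketch: an enrollment event (e.g. a withdrawal) recorded in the
# warehouse notifies both the adviser and the academic success center.
def on_enrollment_event(student_id: str, course_id: str, event_type: str,
                        notify) -> None:
    """Route withdrawal/absence events to the people who should follow up."""
    if event_type in ("withdrawal", "absence"):
        notify(recipient="adviser", student=student_id, course=course_id,
               reason=event_type)
        notify(recipient="academic_success_center", student=student_id,
               course=course_id, reason=event_type)

# A stand-in notifier; a real system would send e-mail or open a case.
def print_notify(**kwargs):
    print("ALERT:", kwargs)

on_enrollment_event("S002", "MATH-1101", "withdrawal", print_notify)
```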

Third, Daytona State College detailed its work with Desire2Learn on the deployment of the vendor's analytics product, Desire2Learn Insights. The Florida school has been using the company's LMS since 2003 and launched a framework for measuring and reporting on outcomes in 2011.

Several Approaches for Doing Analytics

The motivation behind the discussions was to get possible buy-in from institutions to forge bigger pilots and test out the efficacy of each approach. The Oracle approach was the toughest sell, despite the fact that the technology was already in use within the university system. "We already own all the licensing," noted Carver. "It's not an issue of the licensing or even the hardware." The biggest challenge there: "We'd have to hire some people that know the Oracle business intelligence solution very well. The problem is those folks are just very expensive. We have not had additional institutions move in that direction yet."

However, there was ready acceptance for the other approaches. In fact, schools saw EAB and Desire2Learn as complementary. "EAB works by pulling information out of the SIS, and you get a certain level of granularity with that data. There's other data that's just going to be in the LMS," Carver pointed out. "If you're going in and you're trying to look at how much time they're spending accessing the content or interacting with other students via the discussion boards and with the instructor, you're only going to get that out of the LMS."
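A toy example of why the two sources complement each other: SIS-style records carry enrollment and grade facts, while LMS logs carry engagement signals such as time on content and discussion posts. All field names and values below are made up for illustration.

```python
# Hypothetical SIS and LMS extracts for the same students; neither view alone
# gives the full risk picture Carver describes.
sis_records = {
    "S001": {"credits_earned": 45, "gpa": 3.1},
    "S002": {"credits_earned": 30, "gpa": 2.2},
}
lms_activity = {
    "S001": {"minutes_on_content": 420, "discussion_posts": 12},
    "S002": {"minutes_on_content": 35, "discussion_posts": 0},
}

def merged_profile(student_id):
    """Combine SIS and LMS views of one student into a single profile."""
    profile = dict(sis_records.get(student_id, {}))
    profile.update(lms_activity.get(student_id, {}))
    return profile

print(merged_profile("S002"))
# {'credits_earned': 30, 'gpa': 2.2, 'minutes_on_content': 35, 'discussion_posts': 0}
```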

What has been "interesting" to watch is the dynamic of how institutions are making the decision regarding adoption of the EAB and Desire2Learn combination, he said. "What they've done is calculate the total cost of deploying the system, and gone back and said, 'How many students do we have to save to recoup that cost?' It's actually a pretty small number. It's been a fairly compelling case. And that's really the focus of the conversation among provosts: 'How many students do we have to save to break even?' When we do that calculation for their institution, they come back with, 'Well, how can we afford not to do this? It seems like a fairly low risk approach.'"
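The article doesn't give the underlying figures, but the provosts' break-even question reduces to simple arithmetic; the dollar amounts in this sketch are invented purely to show the shape of the calculation.

```python
# Break-even sketch: how many retained students cover the deployment cost?
# All numbers below are hypothetical.
from math import ceil

def students_to_break_even(deployment_cost: float,
                           revenue_per_retained_student: float) -> int:
    """Smallest number of retained students whose revenue covers the cost."""
    return ceil(deployment_cost / revenue_per_retained_student)

# Example with made-up numbers: a $200,000 deployment and $8,000 in tuition
# and fees retained per student who stays enrolled -> 25 students.
print(students_to_break_even(200_000, 8_000))
```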

The EAB experience at Georgia State already has "hard results," said Carver, but the Desire2Learn analytics piece is still gelling. "We're at the point where we've seen it work at Daytona State. We've talked with other customers. We've stood up the hardware," he noted. In March, seven institutions were expected to begin participating in a pilot to test the analytics module. By July, the system will bring in additional institutions that are interested in using that model or the combination.

Typically, in these kinds of engagements, Carver's organization negotiates pricing with the vendor with multiple "tiers" of discounts based on how many colleges and universities participate. As the number of participants goes up, the cost goes down for everybody.
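The actual discount schedule isn't published, but the tiered-pricing idea can be shown with a toy table: as more institutions sign on, everyone qualifies for a deeper discount. The tiers below are hypothetical.

```python
# Hypothetical tier table: (minimum participating institutions, discount).
TIERS = [
    (1, 0.00),
    (5, 0.10),
    (10, 0.20),
    (20, 0.30),
]

def discount_for(participants: int) -> float:
    """Return the best discount whose minimum participant count is met."""
    return max(d for minimum, d in TIERS if participants >= minimum)

print(discount_for(12))  # 0.2 -> every participant pays 20% below list price
```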

The integration of multiple analytics programs makes sense to Desire2Learn CEO John Baker: "Probably the most confusing piece in the marketplace today is the whole concept of people thinking there's one analytical package that's going to answer all questions. We used to think that way too 10 years ago. That's just not the case."

Now it's becoming more common for institutions to combine multiple analytics solutions across campus: some that support learning, some that support the "administrative side of the house" and a dashboard for senior administration. With that combination, he said, "I think we're going to see analytics really start to take off."

Changing the Formula

The initiative to improve student achievement didn't come out of left field. The institutions are responding to a state initiative called "Complete College Georgia," put forth by Governor Nathan Deal in August 2011, which aims to produce 250,000 college graduates by 2020 to meet workforce needs. The effort lays out a series of steps for increasing access, retention and completion of degree programs in both the university system and the technical college system.

A concise report issued in December 2012 by a governor-appointed state commission, which included a number of institutional executives as well as members of the state legislature, recommended adoption of a new funding formula to replace the existing one. Under the previous formula, in place for 30-plus years, the systems received funding when students enrolled, whether or not those students were successful afterwards. Under the new structure, schools are funded based on student achievement. Performance funding, as it's called, is under way in numerous states in response to pressure, in part, to prove that public dollars are being well spent.

The performance-based formula is being phased in gradually, starting with the fiscal year that begins July 1, 2014; the plan takes full effect the following year.

Making performance funding work requires access to timely and reliable data, along with the technology and business process improvements needed to improve student outcomes continually. That requires Carver and his IT organization to work as part of a much larger team, including other state agencies, institutional representatives and system office staff. By summertime they expect to understand the "institutional reaction" to the model, and the team will make adjustments so that Governor Deal's vision can be realized.

"Change management is one of the hardest parts of this," said Carver. "The key to that is not some new technology, but instead constant communications and building relationships and building trust around a common goal." From there, he declared, it's full steam ahead to "see what we can do to set up early success."
