The Intersection of Learning Analytics and Openness

A Q&A with Josh Baron on the Open Learning Analytics summit

Learning analytics is on the cusp of becoming a mainstream technology at many of our institutions. With the expectation that it may become commonplace on campuses within the next 3-5 years, a group of campus leaders in learning analytics recently organized an Open Learning Analytics summit — a new networking vehicle to explore the future of learning analytics, particularly in relation to open technologies and strategies.

The first summit meeting was held this spring in Indianapolis, following LAK 2014. One of the facilitators of the summit, Marist College Senior Academic Technology Officer Josh Baron, shared his biggest concern as institutions reach for effective learning analytics: "If we remain on the path we're on right now, where openness isn't really a top priority within the learning analytics field, I do wonder if we will indeed realize that goal." How close learning analytics comes to ubiquity on our campuses — and the impact it will have on learning — depends on openness, Baron maintains. Without it, he says, we may see some pockets of interesting things going on; with a culture of openness, we can realize the full potential of widespread learning analytics.

Mary Grush: How did the Open Learning Analytics summit come about? What was the thinking behind organizing a summit that would consider openness in the emerging field of learning analytics?

Josh Baron: Groups like Apereo and SoLAR — and many others — have been looking at the importance of openness within learning analytics. The summit was a result of a number of people, within several of these groups, realizing that we've all been having separate conversations and we really need to come together to connect with each other.

Grush: What were the key priorities for the summit?

Baron: First, to define and better understand what we mean by openness in learning analytics, and then to begin to collaborate across communities in areas where we might see mutual benefit.

Grush: Who stepped up to spearhead the first summit meeting?

Baron: SoLAR, the University of Wisconsin, and Marist College helped fund this first 2-day summit, with George Siemens (University of Texas-Arlington), Kim Arnold (University of Wisconsin-Madison), and myself doing much of the initial organization and facilitation. But the three of us didn't "run" the summit — it was truly a collaboration across different communities.

Grush: Do we really need another group to form in the learning analytics space? What sets this one apart?

Baron: One of the big motivators for creating this summit was the realization that we already have so many groups, and there are indeed many siloed activities going on. We certainly don't want to create another silo. So, rather than forming another organization, our goal is to create a network between existing organizations, trying to connect these siloed efforts, pockets of innovation, and other research work — so that we are all better aware of and can leverage each other's work. And we'd like to put together more of a unified front around the importance of openness within learning analytics.

Grush: What were some of the discussion points that came out of this first summit meeting?

Baron: We focused on refining our thinking about what we really mean by open learning analytics, and we tried to identify the domains that fall under that umbrella. In many of our minds, open learning analytics represents the intersection between learning analytics and open learning, open technologies, and open research. So it's the intersection between learning analytics and several other fields that have existed for upwards of a decade within higher education.

Open technologies — specifically things like open standards and APIs, and open source software — are a good example of an area where people see real importance. If we ultimately want to improve teaching and learning by leveraging big data, then we need big data. So we need standardized, secure ways to capture that data from a whole range of systems and bring it together in a single repository; only then can we tap into the full potential that learning analytics represents. On the other hand, if all the data in higher education remains in silos — in the learning management system, in the student information system, in the library system — that will vastly complicate the work of bringing it together. But if we can agree upon some common standards — which are already emerging, like the IMS Caliper project or ADL's Experience API and Learning Record Store (LRS) — then we can bring data into a single repository, and that would be really powerful.
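To make the standards point concrete, here is a minimal sketch, in Python, of what capturing one learning event with ADL's Experience API (xAPI) might look like: a single activity statement posted to a Learning Record Store. The LRS endpoint, credentials, and activity IDs are hypothetical placeholders, not a real institutional setup.

```python
# Minimal sketch: recording one learning event in an LRS via xAPI.
# The endpoint, credentials, and IDs below are hypothetical placeholders.
import requests

LRS_ENDPOINT = "https://lrs.example.edu/xapi/statements"

# An xAPI statement is a simple actor/verb/object JSON document, which is
# what lets very different systems (LMS, SIS, library) emit events in one
# common, repository-ready format.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://lms.example.edu/courses/bio101/quiz/3",
        "definition": {"name": {"en-US": "BIO 101 Quiz 3"}},
    },
}

response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),  # placeholder credentials
)
response.raise_for_status()
print("Stored statement ID:", response.json()[0])  # the LRS returns a list of statement IDs
```

Because every system emits the same statement format, a repository can aggregate events from the LMS, the assessment platform, and the library without per-system translation code.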

Grush: When you mention a "single repository," do you mean some grand repository of everything?

Baron: No. I think the idea that floated around a few years ago, of having an international repository containing data from all institutions, is no longer considered very practical — most people have moved away from that specific goal. The standards emerging right now are instead poised to support institutional repositories — holding formal and informal data about student learning experiences. The other thing these emerging standards aim to facilitate is sharing data between those institutional repositories. For example, we have had conversations in the learning analytics community about having a research cloud data repository, to share data in a secure, anonymized way, to support learning research. The repository could include data from a wide range of institutions, offering researchers a vastly larger data set [than what they would have at their own institution].
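As a rough illustration of the "secure, anonymized" sharing Baron describes, the sketch below pseudonymizes student identifiers with a salted one-way hash before a record leaves the institution. The field names and salt are hypothetical, and real de-identification for a shared research repository would involve considerably more (for example, suppressing rare values that could re-identify a student); this shows only the basic idea.

```python
# Simplified sketch of pseudonymizing records before contributing them to a
# shared research repository. Field names and the salt are hypothetical.
import hashlib

INSTITUTION_SALT = "replace-with-a-secret-per-institution-value"

def pseudonymize(student_id: str) -> str:
    """Replace a student ID with a salted one-way hash, so records from one
    institution remain linkable to each other but not to a named person."""
    return hashlib.sha256((INSTITUTION_SALT + student_id).encode()).hexdigest()

record = {"student_id": "S0012345", "course": "BIO101", "outcome": "pass"}
shared_record = {**record, "student_id": pseudonymize(record["student_id"])}
print(shared_record)
```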

Grush: I think you were about to mention another example of the summit discussions…

Baron: Another good example is the whole area of open data sets and open data models. There hasn't been as much work in this area, but there are examples. You may be familiar with the work John Stamper and others have done at the Pittsburgh Science of Learning Center's DataShop Project, which makes anonymized data sets available to researchers to advance the learning analytics field. And the Open Academic Analytics Initiative (OAAI), led here at Marist, was one of the first to release a predictive model for our early alert system under an open license. Everyone can view how that model works, enhance the model, and contribute those enhancements back.

So, while there hasn't yet been as much work [in open data sets and open data models] as in, say, open education software, open data sets and models are equally important to openness in learning analytics. If our data models remain proprietary to vendors, and we can't peek into them, that creates a barrier to the research needed to further the field, and it creates practical problems for the institution in understanding how its systems work — imagine trying to answer a student's question about why they received an alert if the institution doesn't have full visibility into the data model! A lack of openness is just not going to cut it for institutions in the future: Being able to understand how these models work will become very important to institutions that will be making significant decisions based on them.
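To illustrate why that visibility matters, here is a hedged sketch (emphatically not the actual OAAI model) of the kind of open, interpretable early-alert predictor Baron describes: a logistic regression whose weights can be read directly, so an adviser can explain exactly why a student was flagged. The features, training data, and student record are invented for illustration.

```python
# Illustrative sketch only, not the actual OAAI model. It shows how an open,
# inspectable model lets an institution explain why an early alert fired.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-student features an early-alert model might use:
FEATURES = ["lms_logins_per_week", "assignments_submitted", "grade_fraction"]

# Tiny made-up training set: 1 = student needed intervention, 0 = did not.
X = np.array([
    [1, 2, 0.45], [8, 9, 0.88], [2, 3, 0.52], [9, 10, 0.91],
    [0, 1, 0.30], [7, 8, 0.83], [3, 2, 0.48], [6, 9, 0.79],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Because the model is open, anyone can inspect how each feature is weighted,
# the transparency a closed vendor model would not provide.
for name, coef in zip(FEATURES, model.coef_[0]):
    print(f"{name}: weight = {coef:+.3f}")

# Explaining one alert: show each feature's contribution to the risk score.
student = np.array([[1, 2, 0.40]])  # hypothetical flagged student
risk = model.predict_proba(student)[0, 1]
print(f"Predicted risk of needing intervention: {risk:.2f}")
for name, value, coef in zip(FEATURES, student[0], model.coef_[0]):
    print(f"  {name}={value}: contributes {value * coef:+.3f}")
```

With the weights in the open, the answer to "why did I get an alert?" becomes a readable list of feature contributions rather than a black box, which is precisely the institutional visibility Baron argues for.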

Grush: Is vendor software going to create a problem in itself?

Baron: That's a good point to bring up. Nobody involved in this work is anti-commercial. In fact, one of the statements coming out of the summit was that everyone saw great value in having a commercial ecosystem grow up around providing learning analytics. The concern is around any potential lack of openness, for reasons I've mentioned.

Grush: Broadly, how can the work of the Open Learning Analytics summit benefit the student learning experience in the future?

Baron: When you look across the learning analytics landscape today, you'll see analytical tools being built into different systems. Your learning management system might have some analytics built into it, your student information system might use some analytics, your assessment platform may use analytics. But the data may be siloed within those different systems and pretty much locked up for good if there is no openness.

In the future, institutional repositories built on open standards will facilitate kinds of learning analytics that simply aren't taking place today. If the OLA "movement" is a success, we will unlock the full potential learning analytics holds — not just to improve learning outcomes and reduce costs, but to truly transform education in ways that empower learners to innovate their own learning experiences.
