

The State of AI in Higher Education

Both industry and higher ed experts see opportunities and risk, hype and reality with AI for teaching and learning.


Matthew Rascoff, associate vice provost for Digital Education and Innovation at Duke University, views the state of artificial intelligence in education as a proxy for the "promise and perils of ed tech writ large." As he noted in a recent panel discussion during the 2020 ASU+GSV conference, "On the one hand, you see edX getting more engagement using machine learning-driven nudges in courses, which is pretty amazing. But on the other hand, we have all these concerns about surveillance, bias and privacy when it comes to AI-driven proctoring."

Rascoff identified "something of a conflict between the way this stuff is built and the way it's implemented." In his role at Duke, he noted, "It's really hard to distinguish [in AI] what's real and what's not."

Rascoff joined other panelists in one of two sessions held back-to-back, examining the place of AI in learning: "AI Transforming Higher Learning" and "AI in Education, Hype vs. Reality."

"Robots Are Going to Rule the World"

Fears abound, noted several of the speakers. "People think that AI agents are coming for teachers' jobs, that robots are going to rule the world, and they're going to teach our kids, that the kids will love the robots more than humans. There's a lot of sci-fi out there," summed up Bethanie Maples, product manager for Google AI. "In reality, digital agents are a really interesting way to supplement learning. We need more tutors. And if there aren't human tutors, there's a place for machine tutors to help individual learners progress. There [are] really cool things we can do with personalization using machine learning and especially with adaptive assessment."

Job fear is a biggie, acknowledged Stephanie Butler, product manager at ed tech company Turnitin. "We've moved from this world where technology takes basic rote work and automates it to a world in which technology is starting to encroach on intellectual work, on intellectual life and creative work. And you see that in many different places, not just education."

Likewise, on the teacher side, Butler added, especially in the Western tradition of higher education, there's a fear that AI could encroach on academic freedom: how you "spend your time researching, what you publish, how you publish it, who you work with, and, of course, your content, the topics you decide are relevant for your students, how you craft the learning experience for them."

On the learner side, there are fears that AI systems could "track a learner from a very young age and have this personalized lifelong learning experience for them," suggested Maples. "The risk there is that they get put into a track that they can't escape. That's something that's very near and dear to all of us, especially to the American spirit, the concept of 'rebirth.' We believe that you can start over and that you can fail and then wake up the next day and shrug it off and maybe go found another company or start a new job." The problem arises, she explained, when learners can't delete their data "and say, 'I'm a new person today.'" For that reason, "it's extremely important that we allow for user data control, because we all change, and we have to build the allowance for change into this assessment."

Bias Fears

Fears about bias also can't be minimized, emphasized Sherry Coutu, founder of Workfinder, a UK-based company that matches young people with work projects. Anybody involved in an AI project needs to understand what assumptions are being made by the model in use. "There have been examples where you trained your algorithms on the wrong thing, [introducing] unintended bias so that if you were from a lower socioeconomic background, you would score lower."

Countering unintended bias requires bringing "disparate skills around the table, looking at it from a legal perspective, looking at it from a parent's perspective, looking at it from gender or other perspectives," Coutu said. "If I'm an investor, I'd want to know that they signed up to one of the standards there are for ethics, and that they're at least thinking about it. And you can tell in two seconds if anybody's thought about it or whether or not they're just sort of BSing. If we're going to use [AI], we have a moral obligation to make sure that we use it responsibly."

Sometimes AI Is the Answer — Except When It Isn't

Kurt VanLehn, the chair for effective education in STEM in the School of Computing, Informatics and Decision Systems Engineering at Arizona State University, knows how challenging it can be for people to come up with examples of effective AI in education. Why? "Because learning is complicated. It occurs in a context," he said. "Oftentimes, when people import a new technology to that context where it's been successful, [it may not be] successful in a new one because there are a lot of things that have to come along with it."

VanLehn pointed to adaptive learning as an example of AI in action, which plenty of institutions, including Arizona State, have been experimenting with for some time. "Sometimes it works and sometimes it doesn't," he said. Even though his university has tried to make all of the freshman year courses adaptive, the problem is that while slow learners may "seem to prosper in an adaptive learning setting," that only happens "under certain conditions," which aren't necessarily "part of the advertising package that comes along with the software."

As VanLehn explained, while adaptive classes allow people to work on their studies at their own pace, for slow learners, the pace may be too slow to finish the class in the semester. "It creates a problem, which is that by the end of the semester, they've only gotten through two-thirds of the material," he said. To counteract that, the university introduced the concept of the "stretch class," allowing students to enroll in the same class again the next semester for free and finish it at their own pace, which is typically about two weeks. When the student is done, it's recorded in the transcript as finishing the previous course. "It all sounds good," VanLehn noted, until you find out that those same students can't enroll in the second course in the sequence before they've finished their stretch course. That immediately puts them behind. "The only way this is gonna work is we make the whole bloody university adaptive. Then it would work."

AI as Curator

Nuno Fernandes, president and CEO of Ilumno, an ed tech company in Latin America, isn't ready to count adaptive learning out yet, if only because adaptivity has worked in other industries, such as the recommendation engines of platforms like Netflix and Amazon, which identify what could work best for the user based on previous activities and preferred formats of content.
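The kind of preference-driven curation Fernandes describes can be sketched very simply. The snippet below is an illustrative toy, not Ilumno's or anyone's actual system: it assumes hypothetical activity data in which each completed lesson is tagged with a format, and it suggests an unseen lesson in the learner's most-engaged format.

```python
from collections import Counter

def recommend_next(history, catalog):
    """Suggest an unseen lesson in the learner's most-used format.

    history: list of (lesson_id, format) pairs the learner completed.
    catalog: dict mapping unseen lesson_id -> format.
    """
    # Find the format the learner has engaged with most often.
    preferred = Counter(fmt for _, fmt in history).most_common(1)[0][0]
    # Prefer an unseen lesson in that format.
    for lesson, fmt in catalog.items():
        if fmt == preferred:
            return lesson
    # Otherwise fall back to any unseen lesson.
    return next(iter(catalog), None)

history = [("intro", "video"), ("loops", "video"), ("lists", "text")]
catalog = {"dicts": "text", "functions": "video"}
print(recommend_next(history, catalog))  # "functions"
```

Production recommenders use far richer signals (collaborative filtering, embeddings), but the core idea of ranking content against observed preferences is the same.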

That course content curation example would appeal to Anthony Salcito, vice president of education at Microsoft. "When you drive a car and AI is helping you navigate, avoid collision, and get you to the nearest gas station as you run out of gas," that isn't so different, he suggested, from helping educators to "check charts, graphs and make some assumptions on how to pull and curate content. AI should power a learner in a dynamic way that's not connected to a user intervention necessarily. That's not where we are currently. That's where we need to get to."

Current Uses for AI in Learning

Getting teachers to adopt AI rather than fear it, suggested Duke's Rascoff, is a matter of making sure they understand that the technology is there to augment their work, not replace them.

Rascoff offered three examples that have found adoption in higher ed.

One is Elsa Speak, an accent improvement app built by immigrants who found it hard to lose their accents. The program listens to the speaker's accent and gives improvement advice and suggestions. Company co-founder Vu Van was a Vietnamese business school student who knew perfect English but had trouble pronouncing it as spoken in the United States. Van built the app around her own challenge.

"That's the opposite of Alexa and Siri, which are trying to normalize everybody's accent and hear just one canonical speech. This is AI that's actually differentiating everybody based on their accents and trying to look for the differences and the variances between what you're saying and standard colloquial English," said Rascoff. "Elsa is a really powerful tool for augmenting the instructor. Improving accents is not something you typically get in a language class at the university. That's something a tutor might be able to give you at diverse points. But in a large class, you're not going to have that chance for practicing your accent and improving over and over again."

Another, Protopia, is a tool in use at Duke that's being piloted with its alumni. The software uses machine learning to read the profiles of students and alums. "Students can ask a Blue Devil alum for advice on any topic or professional interest, extracurricular or something they want to pursue," said Rascoff. "We mine the corpus of everything we know about our alumni, to find them the perfect connection to make a mentor match." It all runs through e-mail, so the AI behind the scenes is invisible. Currently, he said, "it's doing automated e-mail introductions for [180,000] alumni a month who are spread around the world [and] who have interests and expertise that cut across all pursuits." The use of AI enables those kinds of connections to be "democratized at scale."
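The matching step Rascoff describes, comparing a student's request against a corpus of alumni profiles, can be illustrated with a minimal sketch. This is not Protopia's actual method; it assumes hypothetical profiles reduced to keyword sets and ranks alumni by Jaccard overlap with the student's stated interests.

```python
def jaccard(a, b):
    """Overlap between two keyword sets, 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def match_mentor(student_interests, alumni_profiles):
    """Return the alum whose profile keywords best match the student's request.

    student_interests: set of keywords from the student's question.
    alumni_profiles: dict mapping alum name -> set of profile keywords.
    """
    return max(alumni_profiles,
               key=lambda alum: jaccard(student_interests, alumni_profiles[alum]))

student = {"startups", "biotech", "fundraising"}
alumni = {
    "Alex": {"law", "policy", "government"},
    "Sam": {"biotech", "startups", "clinical trials"},
}
print(match_mentor(student, alumni))  # "Sam"
```

A real system would use learned text embeddings rather than raw keyword overlap, but the ranking structure (score every candidate, introduce the best match) is the same.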

Then there are the mundane uses for AI, such as handling the workflow of grading. Rather than having teaching assistants struggle to grade the whole exam for every student in a large-enrollment class or forcing faculty to only include Scantron-type multiple-choice questions, Rascoff observed, AI can tackle the automated grading, enabling the TAs to focus, as an example, on the short-form answers.
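The division of labor here is a simple triage: machine-gradable items are scored automatically, and everything else is queued for a human. A minimal sketch, with hypothetical question IDs and an answer key covering only the multiple-choice items:

```python
def triage_exam(responses, answer_key):
    """Auto-grade multiple-choice items; queue free-text answers for TAs.

    responses: dict mapping question_id -> student answer.
    answer_key: dict mapping question_id -> correct choice (MC items only).
    """
    auto_score = 0
    for_ta = []
    for qid, answer in responses.items():
        if qid in answer_key:
            # Multiple-choice: machine-gradable by exact comparison.
            auto_score += int(answer == answer_key[qid])
        else:
            # Short answer: route to a human grader.
            for_ta.append(qid)
    return auto_score, for_ta

responses = {"q1": "B", "q2": "C", "q3": "Photosynthesis converts light into..."}
key = {"q1": "B", "q2": "A"}
print(triage_exam(responses, key))  # (1, ['q3'])
```

Real auto-grading systems also attempt short-answer scoring with NLP models, but the workflow benefit Rascoff points to, freeing TA time for the answers that need human judgment, comes from exactly this kind of split.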

"That vision is clear to many of us from the field," he said. "The questions are, how do you get there, how do you take the steps and do it in a way that's consistent with our values, consistent with the things that we want to hold on to in higher education?"

Transformation — but Not in Our Lifetimes

As Ilumno's Fernandes asserted, AI won't "substitute for faculty in any of our lifetimes. What it will do is give us tools to work better and to complement what is being done by humans."

Of course, a challenge with adoption of AI in higher education is that the segment moves slowly. "It's a highly regulated environment, and that's fine. It's just what it is. And because it's so regulated, it's not that easy to transform," Fernandes said. "But I do believe that AI will eventually transform the learning experience, the teaching experience and, eventually, the economics of higher education." His timeframe for that to happen? "Maybe decades."
