Transcript

Campus Technology Insider Podcast July 2024

Listen: New ED Guidelines for Designing Trustworthy AI Tools in Education

Rhea Kelly  00:08
Hello and welcome to the Campus Technology Insider podcast. I'm Rhea Kelly, editor in chief of Campus Technology, and your host.

The United States Department of Education recently released a new report called "Designing for Education with Artificial Intelligence: An Essential Guide for Developers." The guide seeks to inform ed tech developers as they create AI products and services for use in education — and help them work toward AI safety, security, and trust. For this episode of the podcast, I spoke with Kevin Johnstun, education program specialist in ED's Office of Educational Technology, about the ins and outs of the report and what it means for education institutions. Here's our chat.

Hi Kevin, welcome to the podcast.

Kevin Johnstun  01:00
Thanks for having me.

Rhea Kelly  01:02
So to start off, I thought the best thing to do would be to have you introduce yourself and your role at the Department of Education.

Kevin Johnstun  01:08
Yeah, happy to. So I'm Kevin Johnstun. I have been with the department for about five years, and I work in our Office of Educational Technology, which is kind of a cool shop in the department because we get to write reports about the future of teaching and learning. Right now, I am co-leading our AI team, which means I get to spend all day, every day reading and writing about AI, which is a lot of fun.

Rhea Kelly  01:39
Yeah, I actually feel like I also spend all day reading and writing about AI. So ED recently came out with a report called Designing for Education with Artificial Intelligence. Could you provide a top-level overview of the guide: what it's all about, what it's for, anything like that?

Kevin Johnstun  01:58
Yeah, very happy to. So we're very excited about this report. We think it really speaks to a core constituency and a core issue right now, which is, obviously, a lot of people are really excited about AI. They're trying to figure out how they can use it to improve their products, and they're trying to figure out how they can use it to enhance education opportunity, right? And so we're obviously picking up on that, and we wanted to write a report that would help guide developers: What does it mean to channel that excitement into responsible innovation, into ed tech products that, you know, embody the values that we have in an education system while still using this cutting-edge technology? When we envisioned our core audience here, we were really thinking about people who were managing product development teams. So that could be the CEO of a small business who has, like, one product that they're trying to bring to market, or it could be a product manager in a much larger firm who's working with a development team. We really wanted to write in a way that they could see the recommendations as kind of the big rocks that they needed to build into their design and development process, and help their team conceptualize around that.

Rhea Kelly  03:17
I know that a few years ago, the Department of Education released a first report that was sort of an overview of all things AI in education. So how does this build on that guide?

Kevin Johnstun  03:28
Yeah, thanks for asking. So that report, it was so funny. It came out in May of 2023, and people were like, "Whoa, this is so timely. How did the department come up with this so quickly?" And the answer is, we started about four years earlier. The department had been tracking AI in education for a long time, and we had been, you know, thinking about it and writing about it. What that report really did was go all the way back to, like, the 1960s and 1980s, to AI in terms of automated decision-making making its way into the classroom, and then trace that all the way up to the present moment. It had several core recommendations. And so what this new guide is saying is, hey, we had that report that traces all this history and helps people really situate themselves. Now let's center ourselves in this moment and say, how do we move forward in a responsible way? That AI report had a central metaphor: AI should be an electric bicycle and not a robot vacuum. Which is, this should help you do what you want to do faster and better and, you know, more efficiently, but with you completely in the driver's seat all the way. And so with the tech developers guide, we're saying, not only should it be an electric bicycle, it should be a safe one, one that people have assurances about the quality of, and that, you know, follows local road laws, and things like that. We're really trying to push that metaphor a little further for people and say, how do we do this the right way?

Rhea Kelly  05:07
It strikes me that you started that first report before the big explosion in generative AI. What was it like for that to throw a wrench into everything you were thinking and writing about?

Kevin Johnstun  05:19
Well, I think it helped us to situate the explosion in a trajectory, right? We had been tracking and seeing the ways in which, you know, branching had given way to deep learning, and then, you know, LLMs are in a similar kind of family, so we could see the ways in which one thing had led to another. But it also, I think, helped us to help the community think outside of just LLMs, because they were not the only technology that has seen tremendous progress in the last several years, and we knew that because we had been tracking the whole family of technologies that exist under AI.

Rhea Kelly  06:09
That's super interesting. So in the new report, one of the things that comes through really strongly is an emphasis on shared responsibility for building trust in AI tools. Can you talk more about the key issues there?

Kevin Johnstun  06:24
Yeah, so I think it's really important to think about an ed tech tool as entering an ecosystem. That is, it's not just about dropping it into a classroom; those classrooms are supported by a whole bunch of figures, including educators, but also district administrators and state folks and even federal folks. And so what we wanted to say was, ultimately, there's a shared responsibility amongst all of those members of the ecosystem to make sure that this, at the end of the day, delivers a great service to kids, or to learners in the higher ed setting. And so we really wanted to push developers' thinking on: What are my responsibilities, and what are the responsibilities that I share? And in cases where I share those responsibilities, how am I supporting people in executing on them? So one of the things is, are you building AI literacy among the people you're working with? Because even though this might not be new to you if you're an experienced AI or ML developer, it's going to be new to them. So how are you helping them understand, how are you helping them make informed decisions as they approach the implementation of AI technology?

Rhea Kelly  07:44
Could you maybe walk through the main recommendations for developers, sort of at a high level?

Kevin Johnstun  07:51
Yeah, happy to. So in the report, we have this kind of circles diagram. It's gonna be hard to describe on a podcast, but that's a good reason for why you should go look at the report, so you can see the circles. It has a series of overlapping circles, and the first one says, ultimately, these tools have to be designed for education. As we know, AI is a broad-use technology; in fact, it may be one of the most broad-use technologies there is. And so what we want to think about is, how are you really making sure that whatever AI system you're using is purpose-built for an education context? That means a lot of things, but one thing I would particularly draw attention to is that it means working with educators in the design and development of education materials. These platforms have to be able to integrate into classrooms. Educators are still going to ride their e-bike, right? They're still going to be in charge. So we've got to work with them to make sure that it can work and that it's built for the right kind of classroom environment. And then also, we really wanted to make sure that people knew and were grounded in the literature around modern learning principles. We're not just expecting AI systems to drop knowledge into learners' heads; learning is socially situated, it's context driven, and, you know, it's best when it's authentic, when you have opportunities for low-stakes formative assessment, all that kind of stuff. And so we really wanted to say, hey, if you're a new entrant especially, take some time to familiarize yourself with the education literature so that you can build that into your tool, rather than just doing what's perhaps the most convenient thing with the set of technologies.

Then in the middle, there are these three overlapping pieces that we think are the core of how you actually get to earning trust. The first is providing evidence. That's both the evidence of why you think this is going to work, meaning what pilot studies or other studies you're drawing on to say this is the right way to go, but also that you have a plan for how you're going to build evidence as you move along: How are you going to help people know that this product is, in fact, improving learning experiences, and for whom and under what conditions is it doing that?

Then there's the safety and security piece, which I think is just absolutely crucial for developers to understand. And this is evolving; we understand that it's evolving. As new tools emerge, they have different vulnerabilities, and so it's really important that developers stay on top of those, and that they can give folks assurances that these products are going to be safe and reliable and aren't going to have, you know, vulnerabilities, both for the larger infrastructure and also in terms of things like toxic outputs.

And then there's the third piece, which is advancing and protecting civil rights. Anyone who's followed the history of AI knows that these systems can absolutely have algorithmic bias programmed into them, and we have to make sure that we're clear that civil rights laws absolutely apply to education settings and that people are thinking about that front and center.
I think it's important to note here that we're not offering any new rules. This is simply helping people think about some of the things that already exist, and helping them start the process of asking, what does this mean? But absolutely, it's crucial that we protect civil rights and that we advance equity with these tools. And then at the end of this process, once you've designed for education and done the middle three things, we think you have a chance to really be transparent about what you've done and to ultimately earn trust in doing that. That means working closely with your stakeholders so that they can know what's going to happen and, you know, how their interests were secured. And then ultimately, you can have a kind of trust throughout the ecosystem.

Rhea Kelly  12:17
What do you think the state of the industry is in terms of meeting all those recommendations? Are companies making progress toward AI trust, or do you feel like they have a long way to go?

Kevin Johnstun  12:30
So I think the industry is changing. It's hard to answer this question in terms of, like, what is the state of the industry, because we're seeing a lot of new entrants into the ed tech space with this generative AI moment. And so it's hard to know for sure exactly where things stand, but we are seeing examples of folks who are putting out things that show the ways in which they're trying to be transparent about what they're doing and how they're building in safety and, you know, other kinds of considerations, such as evidence. But yeah, I think right now it's really big in the K-12 space — we have outside data indicating that, you know, the average district is using over a thousand different ed tech tools. And so it's a lot of different actors, and it's hard to figure out exactly where they're all at.

Rhea Kelly  13:28
Yeah, I have heard from districts where, you know, they're struggling to standardize, or at least cull down the number of, you know, tools for the same purpose that are in use, because it's hard for IT to support so many different things at once.

Kevin Johnstun  13:43
Yeah, absolutely. And it's hard to get quality assurances on each one of those different things.

Rhea Kelly  13:49
Yeah, yeah, and cybersecurity issues as well. So although this guide is geared toward ed tech developers, are there things that educational institutions can learn from the recommendations? Because to me, it seems like it could translate well into, let's say, questions that any university should ask of their ed tech providers, or even just a set of expectations to guide their technology decision-making.

Kevin Johnstun  14:18
Yeah, for sure. So you can think about this, and what comes next from the Office of Ed Tech, as kind of a classic supply-and-demand piece. We've talked to developers about what they need to do, but of course there's a demand side of that, which is also kind of mirrored in what we're saying to developers. And we're actually in the process of building an educator toolkit and setting up some guidance for higher education institutions as well through a higher ed brief. And so those will clarify specifically what is in the purview of education institutions or districts or schools. But it's not hard to see some of the parallels. So if we're asking developers to make sure they have a plan for how they're going to build evidence, then schools are going to need a plan for how they can help developers get the data they need to build evidence. And likewise, if it's safety and security that we're thinking about, schools need to be prepared to ask the right questions of developers so that they can get the assurances they need in terms of safety and security. And then on civil rights, very much so, we're saying, developers, you need to be aware of civil rights responsibilities. But in many cases, those civil rights responsibilities fall on the schools to, ultimately, you know, protect the civil rights of their students. So you can see all of the kind of mirrors that are happening here. And that's why one of the shared responsibilities I mentioned is the chance for ed tech developers to work with schools to build capacity, you know, helping to bring them along, so that both of them can ultimately work together better down the line.

Rhea Kelly  16:21
What are some next steps for your department? I mean, I'm kind of wondering what your approach is in terms of updating these reports. Is there a timeline? Because the technology changes, it seems like, by the second. So, you know, what are you guys looking at doing to keep up?

Kevin Johnstun  16:38
Yeah, so I think we have a couple of things. So one, in this particular report, we've tried to write at just a high enough level that we're not getting undercut by the next, you know, model release. Things like safety and evidence apply no matter what kind of model you're using. Some of the stuff tries to speak to what we're seeing in terms of trends in the ecosystem, and so that's a place where we might need to be thinking about how to update on those trends. And then I think one of the biggest things is we want to take this show on the road. We want to get to places where we can talk with developers and use the framework that we have in place to hear from them, to figure out what it is that they need, and then we can continue to release, you know, either addendums or companion resources or things like that. It's really important for us, and even more important, that we get in spaces where we have both developers and institution-side folks together, so we can hear from both of them together. That really is a powerful mix for us in terms of being able to provide technical assistance and resources that meet the needs of the field. So if you're a developer or an institution-side person and you want to hear from us or want to work with us, you know, we're all for it. That's one of our main responsibilities in this office: to do technical assistance and to reach out and help convene folks and bring them together. And so over the next several months, a big part of the rollout of this guide is going to be getting in spaces where we can do that.

Rhea Kelly  18:27
Yeah. I would love to see some events, you know, out there at different universities, to have both sides come together and talk about these important issues.

Kevin Johnstun  18:36
Yeah, for sure.

Rhea Kelly  18:38
So you mentioned a couple of new resources that are, you know, on the horizon, including a guide for higher ed. Can you give me any more of a sneak preview on those, or, you know, what the timeline might be for those?

Kevin Johnstun  18:52
Yeah. I mean, I think one of the things that's really exciting about these resources, and this is something that OET does with many of its resources (in fact, we did it with the developer's guide too), is that we gathered extensive input from the community in order to inform our recommendations. With the developer's guide, we held a public listening session with over 200 developers on the line. We met with small groups of developers of all sizes, from small businesses to people working in research or institutional, kind of higher ed, settings, and really tried to get them engaged in helping us understand where they were headed and what they were doing. And what's exciting about the guide that's coming is we've done the same kind of thing specifically for higher ed. We had roundtables with about 40 different institutions, with leaders from centers of teaching and learning, faculty representatives, and union leaders coming together to inform that. And then we had a very interesting event where we actually convened at the White House with about 10 university presidents and representatives from another 30 institutions, as well as, you know, major nonprofits in the higher ed sector. All of that input is being taken and put into that brief, along with a kind of literature scan, looking at what the academic literature says about what is most effective and what some real opportunities are. And so obviously we're still in the process of working through it and clearing it, but I'm very excited for the input that went into it, and I'm hoping that it will reflect that back to the field and move the conversation forward.

Rhea Kelly  20:37
That's exciting. I can't wait to see the fruits of all of that labor.

Kevin Johnstun  20:44
Yeah, it'll be great.

Rhea Kelly  20:48
Thank you for joining us. I'm Rhea Kelly, and this was the Campus Technology Insider podcast. You can find us on the major podcast platforms or visit us online at campustechnology.com/podcast. Let us know what you think of this episode and what you'd like to hear in the future. Until next time.
