Campus Technology Insider Podcast July 2025

What Students Really Think About AI: Insights from WGU Labs

Rhea Kelly  00:00
Hello and welcome to the Campus Technology Insider podcast. I'm Rhea Kelly, editor in chief of Campus Technology, and your host. And I'm here with Stephanie Reeves, Senior Research Scientist at WGU Labs, to talk about a new survey revealing how Western Governors University students really feel about AI. Stephanie, welcome to the podcast!

Stephanie Reeves  00:29
Thank you so much. I'm excited to be here.

Rhea Kelly  00:33
So I guess let's start off, could you just give a brief overview of this survey, kind of what your goals were and your methodology?

Stephanie Reeves  00:40
Yeah, definitely. So the main goal of this survey was to understand how students are engaging with AI in their learning experience. We wanted to know if they're using AI tools, how they feel about them, what they trust AI to do or what they're skeptical of, and also whether there were any equity gaps in how students are using and experiencing the tools. We surveyed over 4,500 students in January and February of 2025. All of our students were from Western Governors University, and they were part of our Student Insights Council, which is a standing research panel designed to reflect the diversity of our student body and give us access to research participants and insights into the student experience. Students at WGU are often working adults juggling jobs and family responsibilities, so I think they bring somewhat of a different perspective than some of the surveys we see at traditional brick-and-mortar institutions. I just mention that to provide some context for the survey results. We analyzed the survey overall, and then we also looked at some important demographic differences, like gender, race and ethnicity, what program of study students were enrolled in, and whether students were the first in their families to attend college. So overall, the goal is to give us a clearer picture of how students are experiencing and using AI, and to use those insights to inform how we design AI-powered learning experiences for students.

Rhea Kelly  02:26
Did you find that students were pretty eager to share their thoughts about AI in general?

Stephanie Reeves  02:30
Yeah. We got really good engagement on the survey, and we found that students were really excited to answer these questions and engage with these topics. I think it's something students really want to share their perspectives on. So that was great to see.

Rhea Kelly  02:48
So one of the first key takeaways in the survey report identifies a gender gap in students' confidence level using AI tools. Can you talk a bit more about that? Sounds very interesting.

Stephanie Reeves  03:00
Yeah, definitely. And I think this was one of the most striking findings for us. It's consistent with findings we've seen in other research, but it was still concerning to us, because we know that AI skills are becoming increasingly important in the workplace, so this kind of equity gap really stood out. We found that men were about 12 percentage points more likely than women to say they felt confident using AI, and they were also more likely to say they were using AI in their studies. We looked into why this might be, and we wondered if it could just be due to the program of study they're in. At WGU, our STEM and IT fields tend to have more men enrolled, whereas our healthcare and education programs tend to have more women. But we actually controlled for program of study, and we still found evidence of this gap. We also saw some other patterns in our data. Women were a little bit more skeptical of AI's benefits, and they were more likely to express concerns about the ethics. That aligns with other research showing that women are, by and large, a little less trusting of AI and more concerned about things like data security compared to men. So I think it speaks to a need to build programs that increase AI literacy for all students, especially women, and also to focus on building trust in AI through things like transparency and AI ethics.

Rhea Kelly  04:46
That's really interesting. I had actually wondered if it was reflecting sort of a STEM gender gap, but you're saying that, no, it's really something else.

Stephanie Reeves  04:56
So I think that's part of it. We controlled for their program of study, and we saw that it explained some of the gap, but it didn't explain the full picture. So I think it speaks to a need to really dig into that and think about ways we can address that gap, especially as we know AI skills are becoming more important in the workplace.
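The kind of check Reeves describes, asking whether a gender gap survives once program of study is held constant, can be sketched with a simple stratified comparison. Everything below is synthetic and illustrative: the program names, the simulated effect sizes, and the stratification approach are assumptions for the sketch, not WGU Labs' actual data or analysis (which would more likely use regression on real survey responses).

```python
# Illustrative sketch: does a gender gap in AI confidence persist when you
# compare men and women WITHIN each program of study? All data is synthetic.
import random
from collections import defaultdict

random.seed(0)
# Hypothetical baseline confidence rates by program (assumed, not real data)
PROGRAMS = {"IT": 0.60, "Business": 0.45, "Healthcare": 0.35, "Education": 0.35}
GENDER_EFFECT = 0.12  # simulated extra confidence probability for men

# Simulate 4,500 survey respondents, mirroring the survey's sample size
students = []
for _ in range(4500):
    program = random.choice(list(PROGRAMS))
    gender = random.choice(["man", "woman"])
    p = PROGRAMS[program] + (GENDER_EFFECT if gender == "man" else 0.0)
    students.append((program, gender, random.random() < p))

# Tally confidence by (program, gender) cell
counts = defaultdict(lambda: [0, 0])  # (program, gender) -> [confident, total]
for program, gender, confident in students:
    counts[(program, gender)][0] += confident
    counts[(program, gender)][1] += 1

def rate(program, gender):
    confident, total = counts[(program, gender)]
    return confident / total

# Averaging the man-minus-woman gap within each program "controls" for
# program of study in the stratified sense: a nonzero average means the
# gap is not explained by program composition alone.
gaps = [rate(p, "man") - rate(p, "woman") for p in PROGRAMS]
within_program_gap = sum(gaps) / len(gaps)
print(f"average within-program gap: {within_program_gap:.3f}")
```

Because the simulation builds in a residual gender effect on top of the program differences, the within-program gap stays clearly positive, which is the pattern the survey found: program of study explains part of the gap, but not all of it.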

Rhea Kelly  05:19
So what did you learn about students' comfort level with specific uses of AI?

Stephanie Reeves  05:26
So we saw that students were broadly pretty positive about AI in general, but their comfort levels varied depending on how it was used. What students seemed most comfortable with was the ways AI might be used to provide a more personalized learning experience, often using students' academic data to provide that: analyzing their academic data and suggesting learning support or resources to help them improve, or looking at their career interests and offering guidance based on their career data. But we saw that they were less comfortable with uses of AI around things like providing social-emotional support, so things like counseling or mentoring, and also less comfortable with AI taking on more evaluative roles like grading. For example, 32% of students in our survey said they believed AI would be beneficial for emotional support or mental health guidance, and on the evaluation piece, just 35% said they would trust AI to accurately evaluate their work. So overall, I think it shows that students are really enthusiastic about applications of AI that provide a more personalized, effective learning experience, things that tailor the experience to their unique needs by recommending resources, tailoring instruction, and providing guidance. But they're a little more hesitant toward some of the other use cases I mentioned.

Rhea Kelly  07:16
So you could say they want, like, help me but don't judge me?

Stephanie Reeves  07:20
Yeah, I think that's a good way to sum it up. They want the human oversight in some of the more high-stakes decisions that happen in the learning experience.

Rhea Kelly  07:32
Yeah. So the report talks about a difference in students' perception of AI-generated feedback versus AI grading. Can you talk about that difference?

Stephanie Reeves  07:44
Yeah, so that was a really clear distinction in the data. We saw, for example, that 58% were comfortable with getting feedback from AI tools, and 66% were comfortable with receiving real-time feedback on their assignments. I think part of this might be due to the fact that it's really consistent with many of the ways our students are already using AI. They use tools like Grammarly to get feedback on their writing, or they might put early drafts of their work into ChatGPT to get feedback. So it's probably pretty consistent with those uses that students are already taking advantage of. But when we asked about some of the higher-stakes decisions I referenced a little earlier, their trust in AI for those uses dropped quite a bit. Like I said, just 35% of students trusted AI to grade their work, and only 36% believed that AI-generated assessments would evaluate their skills fairly. So to me, I think this says that students are open to AI augmenting their learning and helping them improve, but they really want the human oversight in the higher-stakes evaluative decisions like grading. I think that was a really interesting distinction that we saw.

Rhea Kelly  09:14
I know it wasn't part of the survey, but it would be interesting to find out how students' perceptions of AI grading compare to instructors' perceptions of AI grading.

Stephanie Reeves  09:23
Yeah, we have not sought out our instructors' perspectives on these things, but I think that would be a really interesting comparison.

Rhea Kelly  09:36
Put it on the to-do list.

Stephanie Reeves  09:37
Yeah, definitely.

Rhea Kelly  09:40
So I wanted to ask about AI chat bots, because it's something I feel like I hear about everywhere. And anecdotally, there seems to be general agreement that students prefer interacting with a chat bot to get questions answered, as opposed to, like, calling a university office. Is that something that maybe was borne out in the survey?

Stephanie Reeves  10:03
Well, we did see that students seem really comfortable with AI chat bots providing support: 65% were comfortable using chat bots for tutoring and learning support, and 59% for career guidance. But their comfort seems to depend somewhat on how these tools are framed. When we asked if they'd be comfortable with an AI tutor, they were less comfortable with that; only about a third said they would be open to it. So there does seem to be a framing difference. The way we interpreted this is, when AI is presented to them as a tool or service, like a chat bot, they're really open to that. But when it's framed as more of a replacement for human support, like an AI tutor or AI mentor, they seem less comfortable. So when it comes to whether they prefer the chat bot versus calling the university office, I think for those quick questions or on-demand support, they're really open to that, but they might be less comfortable with AI taking on the more traditionally human roles. I think it highlights the need for institutions to be thoughtful in how they frame and roll out AI in student support. Students want those human relationships for mentorship or more complex support, but for ways to augment that support, or to answer the more transactional questions, it seems like they're really excited about the chat bot. Another thing we saw is that students really like the idea of on-demand support; if you're relying just on humans, it's harder to provide 24/7 support, and I think that's where AI chat bots can really be beneficial.
But I will say that we found the findings around student support, chat bots, and things like mentorship and guidance really interesting. It's something we want to dig more into in a second survey that'll be coming out in about a month, where we get more into the nuances of how students want AI to show up in their support experiences.

Rhea Kelly  12:32
Looking forward to that. So I was struck by the importance of transparency for students, that the vast majority, I think it was in the 90% range if I remember correctly, want to know when they are interacting with AI. Could you talk more about that?

Stephanie Reeves  12:50
Yeah, that was, I think, one of the biggest consensus points in our survey. 92% of students who responded said it's important to know when they are interacting with an AI. The majority also said they want the ability to opt out of AI-supported experiences, they want to know how to access human support when they need it, and they want AI-generated content to be clearly labeled. And I think this is also really important to note: when we asked students whether they knew when AI was being used in their learning experiences, almost half of our student sample said they weren't sure. To me, that spoke to a bit of a disconnect. On one hand, students seem to be telling us that transparency is really important to them, but right now they might not be getting the level of transparency that they want. So I think transparency is going to be really important for building trust in AI-powered learning experiences, which will be essential for successful AI integration and adoption.

Rhea Kelly  14:11
So were there any findings in the survey that surprised you?

Stephanie Reeves  14:15
Yeah, honestly, I thought students would be a little more skeptical or cautious around some of the applications of AI that use their personal data, but we saw that they were actually pretty comfortable with those use cases. Those would be the applications that use their academic data to provide learning materials or personal support, and our students were, by and large, pretty comfortable with those. I think this could just reflect how comfortable people in general are becoming with technology that uses our data. We know things like social media platforms use our data to show targeted ads, and other technology uses our data to provide personalized recommendations. So I think it could just be the level of comfort we've established with uses of our data in general, and that comfort was definitely reflected in our results. The other finding that surprised me is just the pace of adoption we've seen in our AI surveys. This one was our most recent survey, but we've been tracking student adoption of AI since the spring of 2023. In our first survey, we actually saw that less than half of our student sample had even heard of ChatGPT, and less than 10% said they had been using it in their learning. In this survey, nearly all of our students had heard of ChatGPT and other AI tools, and over 75% had used AI in their studies. That's a pretty rapid pace of adoption, and I think it speaks to how fast things are changing with AI as it gets integrated into education. And it also speaks to a need to continuously track student experiences with AI, to keep a pulse on how things are evolving over time.

Rhea Kelly  16:31
Yeah. I mean, it seems like it's likely to change very quickly. So what is your schedule on the surveys? Like, how often do you do it?

Stephanie Reeves  16:39
So we've been trying to do about four surveys per year on different aspects of the learning experience and how AI shows up in it. We just established our Student Insights Council this past December, so this was our first survey conducted with that council. We hope to track this on a really regular basis and continue to keep a pulse on how things are changing, because I think it's really critical that we listen to our students and what they're saying about how they want AI to show up and how they're using it, and also identify potential equity issues that are emerging.

Rhea Kelly  17:23
Has the survey led to any sort of internal recommendations at WGU, like, are you all taking actions based on the results?

Stephanie Reeves  17:32
So we put out some recommendations for higher education leaders and institutions more generally in our report, and our leaders are aware of these findings. At WGU Labs, we're using the insights from these surveys to think about how we design student-centered AI learning experiences, and also where we want to conduct more research in the future.

Rhea Kelly  18:01
What do you think should be on basically every institution's to-do list around AI?

Stephanie Reeves  18:08
Yeah. A big one for me is to focus on closing that gender gap in confidence and usage of AI. I think programs should include courses and workshops to build AI literacy, and that should be across all programs, not just the STEM or tech fields. Having more opportunities to engage with the tools will help build that confidence, and I think focusing on the trust piece will also help close some of those gaps. We did see that women were a bit more skeptical of the benefits of AI, so I think focusing on transparency and AI ethics will be really helpful for building that trust. We also saw that students were asking for uses of AI that provide personalization in their learning experience, so that should be a really big focus area for institutions. Prioritizing applications like tailored resource suggestions, proactive support, and personalized learning materials will be really helpful. I think it's also really critical for institutions to use AI to expand, but not replace, traditional student support services. That's an area where we can think about how to use AI to free up time for staff members in those student support roles, so they can focus on the higher-impact, relational support that comes from interacting with students, versus spending a lot of time answering the more transactional questions. And the last one is just getting back to that transparency piece: students really want transparency in how AI is used in the learning experience.
So I think it's important for institutions to be clear and upfront about when AI is used, and make sure students are aware of how they can get human help when they need it. And that transparency piece, as I said, is going to be really helpful for building trust and making sure students are comfortable using the AI tools that are available to them.

Rhea Kelly  20:48
Thank you for joining us. I'm Rhea Kelly, and this was the Campus Technology Insider podcast. You can find us on the major podcast platforms or visit us online at campustechnology.com/podcast. Let us know what you think of this episode and what you'd like to hear in the future. Until next time.
