AI: Familiar Territory or Alien World?
A Q&A with Mark Frydenberg
Sometimes we feel right at home with AI; sometimes it seems like an alien world. (Image generated with AI; ChatGPT 4o.)
We've been here before. Each of us with a career in higher education has lived through at least one, and possibly several, periods when educators had to sort out how to move forward during the onset of a disruptive technology: Google, social media in its many forms, Wikipedia, and others — even the hand-held calculator!
But now we have AI. Generative AI is different, if in no other way than in the sheer speed with which it has crept into almost every crevice of the ways we learn, work, and live. How can higher education keep up with, or preferably stay ahead of, the significant changes that come with AI? Can we stay on familiar ground while AI makes some of us feel like we're in an alien world?
Mark Frydenberg is a distinguished lecturer of Computer Information Systems at Bentley University and founding director of Bentley's CIS Learning and Technology Sandbox. As a frequently featured speaker and widely published author, he's explored virtually every aspect of new and emerging technologies for teaching and learning. Here, CT talks with Frydenberg to hear his latest thinking and practical advice on teaching with AI and surviving our latest and perhaps greatest disruption.
Mary Grush: Is there any one historical moment you've seen with technology change that is closest to the scale and nature of disruption we're seeing now with AI?
Mark Frydenberg: I remember teaching an introductory information technology course at Bentley where students used pocket PCs to access their e-mail, keep their calendars, and surf the web. That was back in 2004, and there were no iPhones. Facebook and other social media were in their infancy. Search engines were around and popular, albeit much more limited in functionality than what we have today.
Grush: What was the biggest disruption during that time?
Frydenberg: Well, prior to that time you always had to go to your computer to access e-mail, calendars, and the Internet. But finally in 2004 my students could use those resources anywhere they could find wireless Internet access. I remember thinking that this was a game changer for education, as my students could access the world's information from a device they kept in their pockets. And of course, a few years later Apple introduced the iPhone, and soon pretty much everyone was carrying the Internet in their pocket.
Grush: So, is that scenario relatable to what we have now with AI?
Frydenberg: What we're seeing with generative AI is different. It's one thing to be able to look up information online; it's another to have AI generate knowledge by interpreting that information for you.
Grush: It sounds like now, with AI, we're talking about more than widespread access: At the same time that we have mobile access, the services that our students access are possibly several orders of magnitude more advanced than previous technologies.
Frydenberg: You could say that.
Grush: Still, are the issues surrounding AI today similar to your historical example?
Frydenberg: Many of the issues are the same — just the technologies are different.
Grush: So, continuing with your example from the past, what was the response to those issues?
Frydenberg: As the Internet became much more mainstream, educators focused on teaching digital literacy skills — how to use online resources effectively and responsibly, how to determine the reliability of information found online, as well as how the information we provide will be used or kept private.
Grush: Can we extend that model into the present, with AI issues?
Frydenberg: We can and we do. Now we teach students to develop these same literacy skills in the context of AI: a bit of how AI works, how to create effective prompts, what to use it for and when it's appropriate to do so — and, more importantly, how to recognize when its results are biased or inaccurate. AI literacy is yet another skill to develop along a path that began with computer literacy in the 1950s. Later we expanded the skill set to include information literacy and digital literacy, beginning with the widespread use of the World Wide Web, and technology literacy as our devices connected us to the Internet and to each other.
Grush: So, can learning from our experiences really help us out, or might we be truly stranded in an alien world?
Frydenberg: I don't think we're stranded in an alien world. We just need to learn how to navigate it better. And learning from our prior experiences with disruptive technologies will help.
Grush: How would you characterize the rate of student adoption of AI tools?
Frydenberg: ChatGPT is a good example. It was introduced in November 2022, and the following January, at the start of the spring semester, we asked first-year students at our university how they use it. Nearly half of the students hadn't heard of it in January 2023; by September of that year, only 7 percent hadn't heard of it, and by January 2024, only 1.5 percent. The percentage of students using it for homework assignments also increased, from 3 percent in January 2023 to just under 40 percent in January 2024. All this highlights how rapidly college students have adopted AI technologies.
Grush: How can you tell whether students are learning with the use of AI tools? How do they benefit?
Frydenberg: One of the best ways to tell if students are learning in the ChatGPT era is to change the way we assess their knowledge. Reliance on multiple-choice quizzes to evaluate learning must be put behind us now that the Internet and AI are at our fingertips. Educators today need to embrace project-based learning and develop assignments that ChatGPT can't — yet — solve fully or easily. Educators may also talk more one-on-one with students to get a sense of what they are learning. In an introductory Python class I'm teaching in the fall, I'll replace two mid-semester quizzes with three or four short "check-in interviews" in which I ask students about code they wrote or a project they completed. If they can't explain their work, they haven't fully grasped the concepts needed to complete it. By being able to explain concepts, students are able to tackle more challenging problems.
Grush: If an important goal is to use AI to help students learn more, we're focusing on serving students. Serving our students well means knowing them well. How would you characterize students in relation to technology?
Frydenberg: An article I read last year by Antonios Karampelas on Medium.com classified learners as analog, digital, and now, AI natives. Many of us who went to college before the era of PCs and cell phones purchased and carried printed textbooks around! Our teachers primarily lectured, and all our work was done by hand, on typewriters, and later, on computer terminals. We were analog natives, with limited exposure to technology.
Today's students are different. Marc Prensky famously calls learners who grew up with electronic devices and the Internet "digital natives." They'll go to Google to do research and use Excel to analyze data. They learn through watching videos, and most of their resources are digital.
With generative AI, we're seeing a new generation of students — AI natives, who are relying on AI tools not only to find information, but to help analyze it. They go to ChatGPT rather than Google to get information. The skill set is much different — these students need to know the right questions to ask, as well as how to analyze whether the results that AI generates are accurate.
I can imagine a time soon when AI native students will be so comfortable with the technology that it becomes their go-to tool, similar to the way today's digital natives immediately turn to search engines.
Three generations of students are solving the same problems using different tools. First, there are those who work all by hand; then those who use technology to create their own solutions; and now, those who use technology to evaluate solutions created for them. In each case students are learning, but the skills they are developing are different.
Grush: What do we need to do or to manage, in order to make digital transformation at scale work, and survive the AI disruption? Who can help?
Frydenberg: Practically everyone involved in higher education is caught up in evaluating the new frontier opened by the sudden rise of generative AI. Colleges and universities need to provide ongoing workshops and training so faculty can learn the best ways to integrate AI tools into their classrooms. At the same time, institutions and educators need to be transparent about how they use AI and how students can use it in their assignments (though policies and guidelines will likely differ across courses). By mixing opportunities to use AI with traditional teaching methods and leveraging AI's abilities to provide personalized learning for students, higher education can keep up with the rapidly changing developments as the uses and capabilities of AI continue to evolve.
Grush: Are there some good "go-to" resources or leadership programs appearing that could help institutions come on board with AI?
Frydenberg: The generative AI tools we commonly use have been generally available for only about 18 months now, and practically every institution is grappling with how to incorporate their use for teaching and learning. I'll suggest a few resources that can help get you started.
To begin, creating clear policies on syllabi and assignments will help set appropriate expectations. Several examples are available online. In particular, these examples may help you understand issues related to data privacy so you can raise awareness about how these tools use information supplied to them.
Several universities and companies have made online courses about different aspects of AI available at no cost. Check out the MIT Open Courseware offering on AI; Microsoft's AI education link and their AI for Beginners course; and Google's AI link on Coursera. These are just a few examples of what's out there. Search online and you'll find more, including specialized courses on AI in Healthcare and other disciplines.
Two colleagues at Bentley University host an AI in Academia Podcast where they invite faculty and administrators to discuss topics that shape teaching, learning, and research related to artificial intelligence.
Grush: Those are good resources for educators to check out while they ponder their own AI journeys. Meanwhile, AI is finding its way into higher education…
Frydenberg: Yes, AI is finding its way into higher education teaching and learning! Here are just a few highlights:
Learners can converse with AI tools as if they were chatting with a tutor or their instructor about discipline-specific knowledge, and the output that students receive is often well-structured and specific. The downside is that AI doesn't always know what the student doesn't know, and may suggest solutions using techniques that students haven't learned yet in class, so be careful!
AI is good at creating outlines and summarizing information. Recently I was recording a short video for an online class I'm teaching. After I completed the recording, I pasted the transcript into ChatGPT and asked it to generate an outline with at most five topics and four items under each, which I could use as the basis for a PowerPoint presentation to accompany my video. This gave me something to start with as I edited the slides and added images and diagrams.
AI is also good at creating lesson plans once you identify the topic, time allowed, types of activities to include, and so forth. It's also pretty good at creating discussion questions and prompts, and examples that can be used for teaching. Best results occur when prompts are specific: Tell ChatGPT who you are, who your students are, what the course is, concepts they might know already, and what your objectives are for the lesson.
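To make the advice above concrete, here's a minimal sketch of how those pieces of context might be assembled into a single specific prompt. The function and field names are illustrative assumptions, not part of any particular tool:

```python
def build_lesson_prompt(instructor, audience, course,
                        known_concepts, objectives, topic, minutes):
    """Assemble a specific lesson-plan prompt from its parts.

    A prompt that states who you are, who your students are, and what
    they already know tends to get better results than a bare request.
    All parameter names here are illustrative.
    """
    known = ", ".join(known_concepts)
    goals = "; ".join(objectives)
    return (
        f"You are helping {instructor}, who teaches {course} to {audience}. "
        f"The students already know: {known}. "
        f"Create a {minutes}-minute lesson plan on {topic}, "
        f"with discussion questions and worked examples, that meets "
        f"these objectives: {goals}."
    )

# Example: a lesson-plan prompt for an introductory Python course.
prompt = build_lesson_prompt(
    instructor="an introductory programming instructor",
    audience="first-year business students",
    course="an introductory Python course",
    known_concepts=["variables", "if statements"],
    objectives=["write a simple for loop", "trace loop output by hand"],
    topic="for loops",
    minutes=50,
)
print(prompt)
```

The resulting string can then be pasted into ChatGPT (or any comparable tool) as-is; the structure simply guarantees that none of the context gets left out.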
I've read reports and spoken to some educators who have used AI tools to grade student homework. I'm not quite ready to do that, although I have tried creating a rubric and asking AI tools to evaluate term papers based on it after I graded them first. For the handful that I chose at random, I had AI grade each paper three times. I took the average, and in most cases, the results were within a half-grade level of mine.
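The averaging check described above is easy to make concrete. A minimal sketch, assuming a 4.0-style numeric grade scale where half a grade level is 0.5 points (both assumptions for illustration):

```python
def average_ai_grade(ai_scores):
    """Average several AI grading passes over the same paper."""
    return sum(ai_scores) / len(ai_scores)

def within_half_grade(instructor_score, ai_scores, threshold=0.5):
    """Check whether the averaged AI grade falls within half a grade
    level of the instructor's grade (0.5 on a 4.0-style scale,
    assumed here for illustration)."""
    return abs(average_ai_grade(ai_scores) - instructor_score) <= threshold

# Three AI passes over one paper the instructor graded 3.3 (a B+):
print(within_half_grade(3.3, [3.0, 3.7, 3.3]))  # average is about 3.33
```

Running each paper through the AI several times and averaging smooths out the run-to-run variation these tools show, which is why a single pass is a less reliable comparison than the average.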
Grush: Those are pretty inspiring highlights. But there is so much in those scenarios that you do as a teacher, that I'm thinking probably only you can do. Or, could you be replaced by AI? Assisted, maybe, but certainly not replaced…
Frydenberg: AI tools won't replace teachers, but they will require teachers to think creatively about how they teach in partnership with AI. AI tools can save teachers time in managing and creating course materials and provide students with more individualized learning experiences.
Given all that, educators must have the skills to use AI tools effectively and recognize the capabilities that AI provides, as they create new learning experiences for students.