AI Tools in Education: Doing Less While Learning More

A Q&A with Mark Frydenberg

What do students need to know to use AI tools effectively? How does AI affect assessment? What new skills will our graduates carry with them to their future employment? Here, Mark Frydenberg, a Distinguished Lecturer of Computer Information Systems and director of the CIS Sandbox at Bentley University, shares his experiences using AI in education.

Image generated by Dall-E

"AI tools can do things for our students, but, importantly, these tools can at the same time help students learn more." —Mark Frydenberg

Mary Grush: There are so many AI tools in use now… Are educators stepping up to try them in the classroom?

Mark Frydenberg: By now most of us have tried at least a few AI tools — such as ChatGPT, Dall-E, Codex, Microsoft's Copilot, and more — and we've been amazed with the results. These tools can write anything, from Python to poetry. They design artwork and write letters and reports in a fraction of the time that it would take us to do it ourselves. So yes, we are stepping up to try AI in the classroom.

Grush: As with many popular new technologies, students often "get there first." How are educators meeting the challenge of working with students who may be employing AI tools already?

Frydenberg: Many students have quickly figured out how AI tools can help them, and they have come to rely on these tools when working on resumes, cover letters, and assignments, often as a quick way to get their homework done.

As a result, we as educators find ourselves needing to create assignments in which students can use ChatGPT and other AI tools as a means — not an end — to doing their assigned work.

AI tools can do things for our students, but, importantly, these tools can at the same time help students learn more. I try to craft assignments for my students that will help them learn more and build upon topics they may already know something about. From there, we soon see that even the simplest of tasks require developing digital literacy and the critical thinking skills necessary to evaluate the results that AI tools provide.

Even the simplest of tasks require developing digital literacy and the critical thinking skills necessary to evaluate the results that AI tools provide.

Grush: So students begin to see that using AI tools in completing their assignments is more than just a "quick fix"?

Frydenberg: Yes.

Grush: Could you give me some real-life examples of assignments in what we might call the "do it for us" and the "learn more" categories?

Frydenberg: Sure. As an example of a "do things for us" assignment, I asked students to use ChatGPT to find three articles for a research paper on a given topic and prepare a short summary of each. The results may look promising and authentic, but do the journal articles actually exist? Did ChatGPT make them up? Students need the information literacy skills necessary to use library resources to validate their results.
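[Editor's note: the verification step can even be partially automated. Below is a minimal Python sketch that asks Crossref's public REST API for the closest real match to a claimed article title; the API choice and the sample title are illustrative assumptions, since the assignment itself had students use library resources.]

```python
# Minimal sketch: check whether an article ChatGPT cited actually exists.
# Queries Crossref's public REST API; this endpoint and the sample title
# are assumptions for illustration -- the assignment used library resources.
import requests

def find_closest_match(title: str):
    """Return Crossref's best bibliographic match for a title, or None."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": title, "rows": 1},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    return items[0] if items else None

if __name__ == "__main__":
    # A title as ChatGPT might have cited it -- possibly invented.
    claimed = "Digital Literacy Outcomes of AI-Assisted Research Instruction"
    match = find_closest_match(claimed)
    if match is None:
        print("No indexed article found; the citation may be fabricated.")
    else:
        print("Closest real article:", match.get("title", ["(untitled)"])[0])
        print("DOI:", match.get("DOI"))
        print("Compare the titles and authors before trusting the citation.")
```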

In a Python coding course, for a "help us learn more" assignment, I asked students to use ChatGPT to generate a solution to a programming problem they had completed the previous week (on their own). They then had to write a report comparing the two different solutions. Which solution takes fewer lines of code? What are the test cases? Can you run the code to verify independently that the results are correct? What features of the programming language can you learn from the AI-generated code?
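[Editor's note: a minimal Python sketch of that comparison exercise follows. The problem (summing the even numbers in a list) and both solutions are hypothetical stand-ins for the actual assignment.]

```python
# Hypothetical stand-in for the comparison assignment: the same problem
# solved two ways, checked against shared test cases.

def sum_evens_student(numbers):
    """The kind of solution a student might write on their own."""
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n
    return total

def sum_evens_ai(numbers):
    """A more compact solution of the kind ChatGPT often produces."""
    return sum(n for n in numbers if n % 2 == 0)

# Running both versions against the same inputs lets students verify
# independently that the AI-generated code is correct.
test_cases = [
    ([], 0),
    ([1, 3, 5], 0),
    ([2, 4, 6], 12),
    ([1, 2, 3, 4], 6),
    ([-2, -1, 0], -2),
]

for numbers, expected in test_cases:
    assert sum_evens_student(numbers) == expected
    assert sum_evens_ai(numbers) == expected
print("Both solutions pass all", len(test_cases), "test cases.")
```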

When used correctly, AI tools can improve critical thinking skills by taking the "grunt work" out of some tasks and allowing students to do more original work on their own.

Grush: As an educator, what was one of the projects using AI at Bentley that you found most interesting?

Frydenberg: This past spring we held a ChatGPT hackathon at Bentley. [www.bentley.edu/news/cashing-chatgpt] Students used the AI application to create code that generated currency trading strategies, and then used TradingView, an application that runs their AI-generated code to visualize the models they described. They had to figure out how well their models performed, and then tweak their prompts (or the code) to improve the results. This activity showed how to use ChatGPT as a tool to accomplish a complex task. Students didn't need to know how to write the code themselves; they just needed to describe the models they wanted to evaluate in language detailed enough to generate code they could test.
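[Editor's note: TradingView strategies are written in Pine Script, but the evaluate-and-tweak loop described here can be illustrated in Python. The moving-average crossover strategy and the synthetic price series below are hypothetical stand-ins for the models students actually built; rerunning with different windows mirrors the tweak-and-retest cycle the hackathon rewarded.]

```python
# Hypothetical stand-in for an AI-generated trading strategy: hold the
# asset whenever a fast moving average is above a slow one, and compare
# the result with buy-and-hold. Prices are synthetic, not market data.
import random

def moving_average(prices, window):
    """Simple moving average over the trailing `window` prices."""
    return [
        sum(prices[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(prices))
    ]

def crossover_return(prices, fast=5, slow=20):
    """Total return from holding only while the fast MA leads the slow MA."""
    fast_ma = moving_average(prices, fast)
    slow_ma = moving_average(prices, slow)
    offset = slow - fast  # aligns the two series on the same price bar
    value = 1.0
    for i in range(1, len(slow_ma)):
        # Signal comes from the prior bar; capture that bar's return.
        if fast_ma[i + offset - 1] > slow_ma[i - 1]:
            value *= prices[i + slow - 1] / prices[i + slow - 2]
    return value - 1.0

random.seed(42)
prices = [100.0]
for _ in range(250):  # one synthetic "year" of daily closes
    prices.append(prices[-1] * (1 + random.gauss(0.0005, 0.01)))

print(f"Strategy return: {crossover_return(prices):+.1%}")
print(f"Buy-and-hold:    {prices[-1] / prices[0] - 1:+.1%}")
```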

Creating these kinds of opportunities that use AI to augment the learning process can allow students to accomplish, and learn from, tasks that previously were not readily possible. And this can have a long-lasting impact on the use of AI in higher education.

Grush: What about assessment, given an instructional environment where AI is present? How has AI affected assessment strategies that have been in place for years? Do we need to control when and where students are allowed to use AI tools?

Frydenberg: Colleges and universities have different policies around the use of ChatGPT and other AI tools, which generally range from "they are forbidden, don't use them ever" to "you can use them as long as you document their use." The reality is that AI is here to stay, and students need to have the skills to use it to succeed in the workforce. And so we choose to embrace it, with reasonable restrictions, such as during exams or other graded assignments.

The reality is that AI is here to stay, and students need to have the skills to use it to succeed in the workforce.

We know that students are using AI tools. A recent study of 2,000 college students and 2,000 instructors ["More than Half of Students Will Use AI Writing Tools Even if Prohibited by Their Institution," Campus Technology] showed that more than half of students will use AI writing tools even if their institutions prohibit them, while 71 percent of instructors hadn't even tried the tools yet. The challenge is to find ways students can still use these tools legitimately for learning without risking academic integrity violations.

AI tools have required educators to rethink how to assess students to make sure they are really learning, and not just artificially intelligent. Students can become so reliant on ChatGPT that they never develop the core competencies they need to succeed in college or demonstrate mastery of course concepts. Online exams, multiple-choice questions, and take-home assignments or quizzes no longer measure learning reliably when students can use AI tools to complete them.

Successful assessment in a world driven by AI will see a return to project-based learning, where AI tools can be contributors, but the student will still need to synthesize information gained from these tools to determine what is relevant and accurate, and incorporate that knowledge into their final learning products.

Successful assessment in a world driven by AI will see a return to project-based learning.

All this may require educators to shift from giving online tests or take-home exams (where ChatGPT is readily available) to shorter in-person quizzes, one-on-one conversations about students' work, or short videos in which students describe their approach to solving a problem or the lessons they learned. Instead of grading the homework, grade a short in-class activity based on it, to see what students have learned.

And beyond ChatGPT, AI is at the heart of adaptive environments that can provide personalized content based on individualized needs and skills. Students can ask questions and get answers in real time by interacting with chatbots that serve as personalized tutors. While this can't take the place of the one-on-one teacher-student or peer tutor-student relationship, it can provide additional support for learners needing individualized attention. One way to assess student learning is not by the number of correct answers, but by the time, effort, and quality of their conversations or prompts.

AI is at the heart of adaptive environments that can provide personalized content based on individualized needs and skills.
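[Editor's note: a minimal sketch of such a tutoring chatbot, written with OpenAI's Python SDK (openai>=1.0). The model name and the tutoring prompt are assumptions for illustration; the closing lines hint at assessing the conversation itself rather than counting correct answers.]

```python
# Minimal tutoring chatbot sketch using OpenAI's Python SDK (openai>=1.0).
# The model name and system prompt are illustrative assumptions; requires
# the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()
history = [
    {"role": "system",
     "content": "You are a patient tutor. Guide the student with questions "
                "and hints; do not hand over complete answers."},
]

print("Ask a question (blank line to quit).")
while True:
    question = input("You: ").strip()
    if not question:
        break
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; substitute any chat model
        messages=history,
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print("Tutor:", answer, "\n")

# The transcript itself becomes assessable: not how many answers were
# right, but the time, effort, and quality of the student's prompts.
turns = sum(1 for m in history if m["role"] == "user")
print(f"Session ended after {turns} student prompts.")
```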

Grush: Is the use of AI tools in the classroom a factor in preparing your students for the workforce?

Frydenberg: Being able to list ChatGPT proficiency on a resume can only set today's students apart from others. Developing their skills in prompt engineering, chatbot design, and generative AI techniques is now necessary to prepare the next generation of information workers for success in the workplace.

Success will depend on having both technical skills and the qualities that distinguish humans from machines. Students who can read, write, communicate, think creatively, adapt, evaluate, and synthesize information will be leaders in the new AI-driven economy.

[Editor's note: Image generated by Dall-E from the prompt "students learning with artificial intelligence tools — show nodes and connecting lines, colorful, watercolor style, futuristic, and smart." Courtesy Mark Frydenberg and Bentley University.]
