Campus Technology Insider Podcast May 2025

Listen: Leadership in the Age of AI: Tech Tactics in Education

Rhea Kelly  00:00
Hello and welcome to the Campus Technology Insider podcast. I'm Rhea Kelly, editor in chief of Campus Technology, and your host.

As AI reshapes the world at an unprecedented pace, what does that mean for technology leadership in education? Does AI redefine what it takes to be a great leader, or does it amplify timeless leadership principles? David Weil, Vice President and CIO for Information Technology & Analytics at Ithaca College, recently explored these questions and more in the opening keynote for our May 2025 Tech Tactics in Education conference. For this episode of the podcast, we're presenting the recording of his session. I'll let Dave take it away from here.

David Weil  00:56
So, I want to talk about leadership, and leadership in particular in the age of AI. So it's sort of bringing together two areas that I'm really excited about. And so let's talk about this. You know, a lot about leadership is about moments, moments in which we have an opportunity to help guide conversations, to set direction, to really connect with one another. And so if we think about our journey as a series of moments, some of the moments are small. You know, we're having a moment right now. Thank you for joining me for this moment. But you have moments that are good, some moments that may not be as good, and, you know, just our time is filled with these moments. And as we think about the moments, some are really significant and personal, like when I got married to my wife, had our boys, just hanging out on a lake and having some fun, when my younger son graduated college, or spending time with a pet, or even going to the dentist. These are all moments that we have, and they're opportunities for leadership, especially if you think about the moments that occur in our work environment. And so, you know, there could be one-on-one meetings, we could have a promotion, or we have a pandemic, and all of these impact us and those that we work with. Well, I want to focus on this particular moment, which was fairly significant and impacted a lot of people. So this moment was late November 2022, and you may remember it. It was the start of, well, not really the start, but it was when ChatGPT 3.5 came on the scene and sort of changed the way we think about AI, and in many ways, ushered in this new age of artificial intelligence. And if we think about artificial intelligence, and you know, this period in which we are living, it's really a period of rapid changes, lots of opportunities and risks. There's a good degree of uncertainty.
You know, that fear of the unknown of what's happening, and separating fact from fiction, as we'll talk about in a few minutes. You know, especially back in November 2022, there was a lot of, you know, fiction or fact, and, you know, things were happening. Also, the hype cycle. You know, a lot was said, AI is changing the world or will impact us this way, that way, and a wild west of options. So leadership in this age is really about, how do we guide people, and how do we navigate this and help our institutions? And that's what I want to explore for the next 30 or 40 minutes with you all. So I'm going to organize a lot of these thoughts as six lessons, so to speak. And for these lessons, I will talk about some of the AI journey here at Ithaca College as a framework so that we can talk about this. So that's what we're going to do. I hope that's something that you're interested in, because we're going to spend a little bit of time together on this. So with that in mind, let's dive right into what I'm calling the first lesson about leadership in the age of AI. So if we think back to November 2022, and maybe your, you know, first exposure to ChatGPT 3.5 in this new era that we were entering into, you know, your reaction might have been, hmm, this sounds interesting. I first learned about it through this New York Times article that came out December 5th, called "The Brilliance and Weirdness of ChatGPT." And that was really the first time that I started learning about it and experimenting with it. And this was being shared by my leadership team. They were passing the article around and they were saying, hey, you need to try this out and see what's there.
So we use Teams, but whatever instrument you use, you know, you may have had this happen where you work, where people are sort of sharing articles, and hey, I was experimenting with it on this computer, and I was able to do this type of thing, or, I wonder what it means for our students in computer science, this could do the homework for them. Someone, you know, just asked it, well, what are the ethical considerations and stuff? So there's a lot of chatter going on about ChatGPT and its implications. So within our division, like many divisions, you know, we have regular meetings. And so about 20 days later, we were having our regular December meeting, and that's where we really started talking about ChatGPT, and what this meant. And we showed examples of how we had been experimenting with it to sort of get people to start thinking about it. And this is a division of about 65 people. And again, you may have been doing this on your campus. So what I did after these conversations is I charged my team. I said, we need to better understand this new tool, and I wanted everyone to spend time playing and experimenting with it, and also encouraging their colleagues at the institution to do the same thing. Then I wanted to get people together to compare notes and see what we learned and discovered. Well, what I was doing really is what I'm calling this first leadership lesson, which is really exploring the power of play and exploration. And I think it's particularly important as we are in this era of AI. So what does this look like? So I was setting the expectation that everyone in the division should spend time playing and experimenting with ChatGPT. Some people don't like my use of the word playing because it sounds frivolous, but it's not, and I intentionally use that word because I think it's important for us to get joy and explore this in a positive way. So let's play with it. And we would learn from it.
We created regular meetings, or we called them AI gatherings, to share what people were doing, what they learned, and updates about what we were hearing. And remember, there's not a lot of direction to this. It's just experiment with it and see what it can do. We also participated in national conversations about AI and higher ed. And because these activities were happening across the institution, I wanted a way to coordinate that, so I was able to use some salary savings to create an 18-month AI coordinator position to centralize our efforts and sort of understand how people were experimenting with it. And I established a fund to buy licenses for experimentation, again, encouraging people to play, explore, experiment. And we even set up a space for this. So we created an AI exploration lab. We staffed it with some students who, who became really knowledgeable in AI, and it was a place that faculty, staff, and students could go to explore and learn about this tool. So as we think about leadership, I really think the power of play and exploration is so important, especially in the age of AI. So my question to you is, how are you harnessing the power of play in your leadership?

David Weil  08:25
So as you think about that, I'm going to segue now to the second lesson here regarding leadership in the age of AI, and it's a little different. It builds upon the play. And so this is about, you know, as we think about the future of higher education, and, you know, there's a lot being written about AI and the impact on your campus. Some will say, well, ChatGPT is going to be the plague upon education. Or, you know, how is it changing education and what we're doing. So as we're thinking about this, people would start saying things like, we should not be using AI, or we should be using AI. And there were others that were saying, well, our staff doesn't know anything about this. People might say, I do not want my students going anywhere near AI. It's going to be really bad for their education. While others said, I do want my students to learn about AI. They need to know this. Concerns about the data. And, you know, some people are saying, well, AI will have a huge impact, and others not. So you're getting all these different opinions and thoughts about what's happening and the impact of AI. But how do you pull it together, especially as a leader, and how do you think about this? And so what really evolves from this is, what are the questions that we should be thinking about as we think about AI. And so as we gave that some thought, we developed these seven questions that ended up really being a core for how we approached our work to understand AI. So I'll share with you those questions. The first one was, how do we provide the skills and build the critical thinking competencies that students need to be prepared to enter a workplace that will use AI in key processes and tools? So this is really about the education that we provide our students, and how are we preparing them for the workplace that they're going to be entering into. It's a really important question for higher education.
And then, thinking sort of about the students' experience on our campuses, how can AI enhance our students' experience?

David Weil  10:43
The third framing question that we came up with was, how can AI improve the effectiveness and efficiencies of our processes and support our strategic success? So how does it make us better and make us be able to provide services more effectively and do our work more effectively. And speaking of doing our work, the fourth question was, what is the impact of AI on the skills, roles, and organizational structures of our workforce? So do we have to have different jobs, changing job descriptions? Are there new jobs that we need? I created an AI coordinator position, for example. Or are there jobs that are no longer needed? And how do we retrain and educate our workforce? The fifth question is all about our policies and our processes at the institution. So like any institution, we have dozens and dozens of policies. Well, some of those policies need to be reviewed and rethought, because AI can have an impact on that. Just as an example, we have a harassment policy, an anti-harassment policy — well, now people can use AI to do deepfakes and other things, so we may want to review that policy to see how AI might create new ways for harassment to occur. Just as an example, it impacts all areas of contract review and everything there. So, you know, what's the AI's impact on that? The next question is probably one of my favorite questions, because it really gets to the heart of our institution. And so how does AI and its use align with our institution's philosophy, approach, and core values? At Ithaca, we are, you know, a smallish institution. We're about 4,500 or 5,000 students. We really put a lot of emphasis on the human-to-human connection. So how do we use AI in a way that enhances that, as opposed to using AI to replace human-to-human? Another institution might be looking at AI in a very different way. So it has to align with your values. And finally, the last question is really about sort of doing a SWOT analysis.
What are the opportunities, the risks, the threats that come from AI for our institution and how we do our work, and even for higher ed? So we took these seven questions, and I actually turned them into an article that got published in Inside Higher Ed, to use this to frame our work here at Ithaca College. And I know other institutions are doing the same. And if we look now, taking these questions, you can see they map to specific activities that then were taking place. So for the first question, how do we provide the skills and build the critical thinking competencies for our students? Well, we have faculty working groups that are looking into that. We created AI mini grants, I think I talk about that a little later, for faculty to explore using AI in the classroom. There's a really phenomenal collaboration occurring between some of our IT staff and our Academic Affairs Center for Faculty Excellence team, looking at ways in which we can enhance teaching and learning. Other questions, like how can AI enhance the student experience and how can we be more effective and efficient with our processes, spun up pilot projects, which I'll talk about a little later. Impact on the workforce: We've had a lot of conversations with HR and are doing some really interesting things using AI to help with performance reviews, for example, or reviewing college policies. So question five led to the review of 160 policies for the impact of AI. And then aligning with our values: We actually developed some guiding principles for use of AI at Ithaca. But a lot of these really were wrapped together and were part of a charge to a presidential working group on AI to really look at these in a systematic way and to come up with the responses. The presidential working group explicitly did not focus on the curriculum and the use of AI in the classroom, because that is the prerogative of the faculty. But there were other groups that looked at that.
So this is the way that we use questions to drive clarity and to help people understand sort of where we want to go.

David Weil  15:22
So this is the second lesson of leadership in the age of AI. And so questions really give our exploration conversations focus. They provide a tangible way to rally people and involve them in processes, and also have that sense of shared ownership and more inclusive decision-making. Questions signal what matters most to you and to the organization, and they inspire curiosity, not just compliance. So as we were thinking about the questions, we wanted to use them to create a space for reflection and to invite diverse perspectives, to build trust, and to provide a non-threatening way of focusing people on issues that matter or should be considered. So leadership lesson number two, questions. Questions are a powerful way to bring clarity and focus. So when you have a moment, you know, reflect on how you use questions in your leadership. Okay, we talked about questions. We talked about the power of play and exploration. But even with those activities happening and creating like an AI exploration lab and stuff like that across campus, you know, a lot of people were starting to explore this and started to have these questions. So like Student Affairs would say, well, what does this mean for us? Or how can we use it in the library? Or how can I tell if my students are using ChatGPT? Or, you know, our friends in Legal Affairs want to use it with contract reviews. Or as a student, what should I know about it? Or how will you protect my data? So everyone across campus was starting to really explore this and come up with these questions. And, you know, thinking through this, it was creating a lot of good thinking, and a lot of questions were coming up, and some concerns as well. And so as this was starting to unfold, my leadership team and I, we wanted to connect with people, and so we held a series of meetings with division leaders across the institution.
We talked with the leaders of Student Affairs or the leaders of Legal Affairs or Academic Affairs, and we tailored each conversation with examples of how AI might impact that area or can be used as a tool to help them be more successful. We also held conversations with our student government council, and we also had open conversations about ChatGPT for the institution, and those were done in partnership with the provost, the Center for Faculty Excellence, the Computer Science Department, and others. We met with different groups. This is from a "Let's Talk Tech" presentation to administrative assistants about how AI can work for them. So these are all examples of building personal connections, and that's powerful because it allows you to have conversations with people and to allay fears and hear about their concerns. So we had discussions with faculty, staff, and students. We had meetings with the president's cabinet, Board of Trustees, open school-wide meetings, department meetings, and, you know, we engaged with the student newspaper. The discussions helped shape our approach. Sustainability came up as an issue in multiple places. So that helped us think about, how can we use AI in a sustainable way? It helped us understand concerns and provided us with a chance to create additional opportunities based upon feedback and concerns that we were hearing. At the end of the day, these discussions and these conversations with people built trust and created partnerships, and that, I think, is a powerful lesson. And the third lesson in leading in the age of AI is creating opportunities to connect, and when you do so, you provide avenues to address issues and demystify the technology. So as you think about your leadership, you know, think about how you create opportunities for connecting and helping people understand.

David Weil  19:42
So now we're going to talk again from a leader's perspective of setting direction. So as a leader, you know, we are often faced with a choice. Do we head this way? Like maybe, do we buy a solution? Or do we head over here and we build our own solution? And some of these decisions that we have to make are huge and will set the direction of the institution for the next 10 years, and some are very small. But as you think about this and setting direction, it's as important to say where you're not going as it is to say where you are going. So when we set direction, saying we're going east, not west, tells people we're not going to head to California, we're going to the East Coast. And that provides some clarity. So taking this metaphor a step further. So if we're going to go east, that establishes the general direction. So saying that, for example, we're going to buy a solution, we're not going to build our own. But where on the East Coast are we heading? Are we going to Boston, New York City, Washington, DC? And so, from a leadership perspective, you set the general direction, but then you use the team to help set the actual destination. You know, which product are you going to buy. So that they have involvement and, you know, feel a part of it. And then how are we going to get there? So let's say we decide we're going to Washington, DC. You have the team help develop and set that process. Are we going to fly? Are we going to drive? Are we going to take a train? Don't know. So this setting direction, I think, is a very powerful leadership tool, and really important. And let me show you how we did this at Ithaca College. So the president came out with a statement: We need to understand AI, its impact on our students, and how it can help our institution. So she set the general direction.
She could have said, we don't want to have anything to do with AI, or AI is just going to be the realm of our math and computer science departments, or just IT. But no, she set the general direction that we need to understand it. And so then, where are we going to go from there? And so to identify the specific destinations, New York, Boston, Washington, she created the presidential working group with the six questions that she asked the group to consider. And the answers to these questions really help us identify where we want to go. So, for example, question four is to identify five to 10 strategic AI initiatives, and that gives us guidance as to where we should be heading, and then how are we going to get there? So, you know, this example of creating these AI mini grants allows faculty to experiment and to try to see, you know, what's the best use of AI, and how do you want to approach it? So it wasn't prescribed from above, but allows people to explore and find what works best for them and helps them be part of the conversation. So I'd say that's leadership lesson four. By setting a general direction, you establish the parameters for the team to explore and then develop the details. And I encourage you to think about how you set direction for your teams and allow them room for buy-in and refinement.

David Weil  23:34
So now I want to talk about a word. There's a not-so-secret word that I feel works wonders, and it really comes into play as we are thinking about leadership in the age of AI. And it's a word that starts with the letter P. And so the word is pilot. It may not have been the word you were thinking of, but pilot is an incredibly powerful word, and it allows for early successes and failures. And as we think about AI, this is really important, so let me explain. So pilots allow you to experiment. They give permission for something to not be perfect. They help people become comfortable with ideas and concepts, and pilots provide opportunities to learn. So let me show you how we use pilots here at Ithaca College. So this sort of maps out some of the AI development that we've been doing here. So we've been experimenting with AI. We use Copilot. You know, people are using ChatGPT. It's used in a lot of different ways. This is really looking at how we are going to either build or buy AI applications to help enhance the student experience. And so, as you can see from this, we have two things that were only piloted, never made it to production. But then we had a very successful product that we call Nebula, which I'll talk about, that is actually in production. And now we are looking at pilots and prototypes of things for where we want to go from here, but it all comes back to the pilot. So we started early on with something that we called Ithaca Insights, which now a lot of institutions have, and there are a lot of products out there, but at the time, this was something that wasn't that prevalent, and we wanted to experiment. So we developed our own Ithaca Insights tool, which is a tool that, at least in our thinking, prospective students could use to ask questions about Ithaca College and the Ithaca, New York area. And we trained it on publicly available data.
So we trained it on the Ithaca College website, we trained it on the course catalog, we trained it on the Visit Ithaca New York website, and things like that. And we also trained it on how to respond to questions. And it worked. And so these are some of the types of questions that a prospective student could ask. For example, they could say, I'm nervous that I won't be successful. How will Ithaca College help me? And it comes up with this response, which gives Ithaca College specific responses to the questions there in a nice, friendly way. But we never went live with this. It was just a pilot, but we learned a lot. We learned how to tailor AI responses to have the tone that we want, to always follow up with a question at the end, and to reference our language for things. We don't call Career Services Career Services. We call it the Center for Career Exploration. And other things like that. We also learned how to train AI on our data and our information, very powerful. Another lesson was the importance of getting stakeholder buy-in. So we set out and did this pilot, and we were showing people, and some offices were like, oh, I'm not sure I'm ready to do that, or we're not sure this fits the culture of the institution. And so in this particular case, we took a step back, and we're like, okay, we're not going to push this forward right now, we're going to learn from it. But the lesson was that if we're really going to try to move something to production, we want to make sure we get the stakeholder buy-in early. And, you know, we learned the power of these tools and how they can provide responses beyond what we expected. So there were times that some of the responses from these tools actually even surprised us. Not in a bad way, it was just like, wow. So we learned a lot from this pilot.
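The approach Dave describes, grounding an assistant on public institutional content and on rules for how to respond, is essentially a retrieval-augmented pattern. Here is a minimal sketch; the snippets, the keyword-overlap retriever, and the prompt rules below are illustrative stand-ins, not Ithaca College's actual implementation:

```python
# A tiny "knowledge base" of publicly available snippets (illustrative data).
DOCS = [
    ("careers", "The Center for Career Exploration helps students find internships and jobs."),
    ("housing", "First-year students live in residential communities on campus."),
    ("ithaca", "Ithaca, New York offers gorges, waterfalls, and a lively downtown."),
]

def retrieve(question: str) -> str:
    """Pick the snippet with the most word overlap with the question.
    A real system would use embeddings; keyword overlap keeps the sketch simple."""
    q_words = set(question.lower().split())
    scored = [(len(q_words & set(text.lower().split())), text) for _, text in DOCS]
    return max(scored)[1]

def build_prompt(question: str) -> str:
    """Wrap the retrieved context in a prompt that encodes tone and
    terminology rules before handing it to an LLM."""
    context = retrieve(question)
    return (
        "You are a friendly assistant for prospective students.\n"
        "Rules: say 'Center for Career Exploration', never 'Career Services'; "
        "always end your reply with a follow-up question.\n"
        f"Context: {context}\n"
        f"Question: {question}"
    )

print(build_prompt("How will you help me find internships and jobs?"))
```

The point of the sketch is that the "training" Dave mentions can live in two places: the retrieved institutional content, and the standing rules about tone and preferred names that travel with every request.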
So that then led to our second pilot, where I said, okay, you know, we ran into a problem with this in terms of going live, because a lot of offices needed to get involved in that and be part of the decision process. Well, hey, I'm the vice president for IT. I'm responsible for the IT service desk, so I don't have to get the buy-in from others. Let's move forward with creating a service desk tool. And I wanted to take it to the next level. I didn't want the service desk tool to simply provide answers. I wanted it to be able to take an action if it couldn't solve the problem for the individual. And this was, again, a few years ago, before the development of agentic AI, where that type of thing is becoming more commonplace. So we created this IT service desk agent, and it works. A student or someone could come up and say, I'm having trouble connecting to WiFi on campus. They would come to this through an authenticated source, so they would come to it through a portal or something, so we would know who the student is, and so we would know things about them. And the AI agent would provide them with guidance and stuff, and then if it couldn't solve the problem, it would offer to enter a service request on their behalf. And again, we learned a lot from this. This is not in production now, and I'll tell you why in a second, but it is something that we do hope to put into production this fall. So what did we learn from this pilot?
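The agent Dave describes follows a common pattern: answer from a knowledge base when it can, and otherwise take an action by filing a ticket for the authenticated user. A minimal sketch, with an invented knowledge base and an in-memory stand-in for the ticketing system (a real version would call an LLM and a ticketing API):

```python
# Illustrative knowledge base mapping issue keywords to answers.
KNOWLEDGE_BASE = {
    "wifi": "Connect to the campus network and sign in with your college account.",
    "password": "Reset your password at the account portal.",
}

TICKETS = []  # stand-in for a real service-request system

def file_ticket(user: str, issue: str) -> str:
    """The 'action' tool: open a service request on the user's behalf."""
    ticket_id = f"SR-{len(TICKETS) + 1:04d}"
    TICKETS.append({"id": ticket_id, "user": user, "issue": issue})
    return ticket_id

def service_desk_agent(user: str, question: str) -> str:
    """Answer if a KB article matches; otherwise take action and file a ticket.
    The user is assumed to arrive already authenticated, as in the pilot."""
    for keyword, answer in KNOWLEDGE_BASE.items():
        if keyword in question.lower():
            return answer
    ticket_id = file_ticket(user, question)
    return f"I couldn't solve that, so I've opened service request {ticket_id} for you."

print(service_desk_agent("student1", "I'm having trouble connecting to WiFi on campus"))
print(service_desk_agent("student2", "The projector in room 201 is broken"))
```

The design choice worth noting is the fallback: instead of ending with "sorry, I don't know," the agent's last resort is itself an action, which is what distinguishes it from a plain Q&A bot.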

David Weil  29:27
We learned that having clean data is critical. And we sort of knew this, and we thought the data that it was using was clean, but it wasn't AI clean. So what do I mean by that? Well, we pointed this tool to our knowledge base. We use a service request system with a knowledge base there, and we'd been using that knowledge base to train our service desk students and other staff to provide answers. So we thought the information there was good. The problem is, it turns out that the humans were using the knowledge base, but they were still adding a lot of value from themselves. And so they would add a context, or they would add a nuance that wasn't fully captured in the knowledge base, and that was tripping up the AI tool. And so it really was a learning experience for us about, you know, clean data and what you need to have to get to the level that's needed for this type of tool. It also really showed the importance of thorough testing. So, you know, on the surface, we're like, oh, this is great, but when we really dug in and had all these different test cases, it really, you know, showed us that. But on a positive front, it showed us the ability to have the agents take an action and connect more deeply to our systems. So again, the power of a pilot is really there where we can learn from it. And so these were not wasted efforts at all, and they actually led to a very successful service that is in production, and it's something that we call Nebula. So we have a group on campus. They're called ICare, and they work with students that are having some distress. So it's not mental health counseling, per se, and it's not academic advising. It's sort of in the middle. It's like social work. And so if anyone on campus observes a student who may be struggling a little bit, we can make a referral to the ICare team.
The ICare team would triage those referrals, and if it reached a certain level of concern, they would set up a time to meet with the student and have a nice conversation and see how we could help. Well, before they met with the student, they would need to do research, and that research would take 30 to 60 minutes for each student. So the ICare team came to us and said, hey, we don't have enough staff to meet with every student that we really need to meet with. Can you help us be more efficient so we can meet with more students? So again, I talked about values earlier. It was really important that we didn't create a tool that would replace the human-to-human. But in fact, we created a tool that took care of the mundane to create more time for human-to-human interactions. And this tool is very successful. So it took us about 80 hours to design and develop; we are using APIs, we deal with privacy, and it really costs us less than $25 a month to run this tool. But the return on investment is significant because we save the counselors 45 to 60 minutes a day, which allows them to see up to 150 more students in an academic year. So it really created much richer and many more opportunities for that human-to-human interaction than was possible before this, and we were successful with this based upon the pilots that we developed. So that's where we are today, and now we're piloting and thinking about how we use agentic AI to really take things to the next level in terms of taking other actions and things like that. And so we're working on pilots with that. So that's our journey here, and sort of the power of pilots to allow people to learn and become comfortable with technology and experiment, while giving permission for things not to be entirely perfect. So I encourage you to make use of this concept of pilots in your leadership and your development efforts.

David Weil  33:55
So this is taking us now to the last lesson that I want to share with you, which is lesson six, and it's a lot less technical, and it really is focusing on our why. And I referred back to values. I referred back to, you know, that connection, and it's just so important. It's important with leadership in general, and it's important with leadership in the age of AI, to focus on our why. And I always try to create opportunities to build those connections. So these are pictures of students interacting with IT staff at a retreat, and they just try to remind people of why we are here and why we are making the decisions that we are. It's to serve our students. In our offices we have posters of students that have worked for us over the years and what they're doing now and how they've moved on. It's a reminder of why we do what we do, and how we touch lives in that. And I think that's so important at any time, but it's especially important in the age of AI, where AI is viewed at times as this dehumanizing, depersonalizing thing. But I think we can use it in ways that actually enhance that. So connecting back to our why can be done in many different ways. I showed you some through pictures and examples there. We also have developed guiding principles for the use of AI at Ithaca College, and they really reflect who we are. Ithaca College is an institution dedicated to empowering people through theory, practice, and performance. That is core to our mission. And if you look at the guiding principles here, the very first one says we center on people, not technology, and then we seek to promote digital inclusion and commit to lifelong learning. So these principles really provide a bedrock for us to base our decisions on and our approach to leveraging AI. So the last lesson is, you know, keeping our why and values at the forefront is a very powerful motivator and guide, and just be thinking about how you incorporate that into your leadership.
So I spent, you know, some time talking about six lessons of leadership in the age of AI: the importance of play and exploration, the value of asking questions, leaning in, making those personal connections to learn and to help allay concerns, the power of setting a direction, pilots, early successes and failures, and learning from them, and then connecting with our why and our values. So these lessons, I think, impact all of those moments that we have as we're making decisions or influencing or connecting with people. But at the end of the day, leadership is about people. We can talk about technology. We don't lead technology, we don't lead projects, we don't lead strategies. We lead people. And so we often think we lead these things, but what we actually lead are the people who design the technology, the people who execute the projects, the people who create change, and the people who bring organizations to life. So leadership in the age of AI is still all about leading people, even though we have rapid changes and lots of opportunities and risks and uncertainty, and we're riding this hype cycle, you know, at the end of the day, leadership is about people.

Rhea Kelly  37:49
Thank you for joining us. I'm Rhea Kelly, and this was the Campus Technology Insider podcast. You can find us on the major podcast platforms or visit us online at campustechnology.com/podcast. Let us know what you think of this episode and what you'd like to hear in the future. Until next time.
