Campus Technology

Campus Technology Insider Podcast September 2025

Listen: Human Connection in the Age of AI

Rhea Kelly  00:00
Hello and welcome to the Campus Technology Insider podcast. I'm Rhea Kelly, editor in chief of Campus Technology, and your host.

How did you get where you are, and who helped you get there? At our September 2025 Tech Tactics in Education conference, keynote speaker Julia Freeland Fisher began her presentation with that question as a way of emphasizing the importance of personal connection in all of our lives. The director of education research at the Clayton Christensen Institute, Julia discussed the ways that AI companions and productivity tools are impacting social ties in her talk titled, "Human Connection in the Age of AI: Disruption, Dystopia, or Discovery?" For this episode of the podcast, we're presenting the recording of her keynote. I'll let Julia take it away from here.

Julia Freeland Fisher  01:02
Thank you so much, and thank you all for joining, whether it's early morning for you on the West Coast or, like me, approaching midday on the East Coast. I'm Julia Freeland Fisher, and I'm here to share a little bit of my research. And I want to say, you know, the title of this can sound a little doom and gloom, but what I urge you to consider over the next 45 minutes is not necessarily all the ways that AI could displace student-teacher relationships, advisor-student relationships, student-to-student relationships, but the ways that, as you adopt this new technology that's really becoming the operating system of our economy and, increasingly, our schools, we can center human connection as one of the things we want to grow and nurture using the tech, not replace with it. So today, here's what I want to cover. I'm going to define social capital, which is a concept of human connection that I think is really important for all of you thinking about students' long-term trajectories to opportunity. It explains why relationships matter, not just for healthy development right now, but for long-term access to opportunity and economic mobility. I then want to talk about the trends that we're seeing in the space around how AI is impacting our social connectedness and social capital, and provide a little bit of a case study of research we conducted last year on the rise of chatbots in college and career guidance. Lastly, though, I want to give you some considerations, some guardrails, and some opportunities to think big about how to scale human connection as this technology becomes more and more commonplace.

I start every talk with this question: how did you get where you are today, and who helped you? Some of you may be thinking about deep mentoring relationships, family members, people who were always in your corner. Others of you might be thinking about chance encounters, someone you just happened to meet on a train or in a program at a school, who helped direct you onto the path you're now on. The reason why I ask this question is that opportunity, particularly in America's labor market, is a social phenomenon. We spend a lot of time in education, rightfully so, thinking about what students do and don't know, and measuring that. But what we have to start paying more attention to is the right side of this equation: whom our students know, who they have exposure to, who's willing to invest in them, take a bet on them, and open doors. And that's because opportunity sits at the intersection of these two.

An estimated half of jobs and college internships come through personal connections. So we can talk about skills, credentials, and knowledge all day long, but if we don't attend to the social side of opportunity, we're not setting our students up to live choice-filled lives. I study this through the lens of a concept called social capital. If there are any sociologists on the call, I'm sure you could school me on this. I am not a sociologist. I play one on TV, though, sometimes. But this concept of social capital is pretty simple. It's the idea that our networks contain value, and in our work we define it as young people's access to relationships and their ability to mobilize those relationships to support their goals. Now, although it's simple, social capital, I think, can be a little bit elusive compared to the other forms of capital that we talk about a lot. Financial capital is the dollars and cents in your pocket. We all know those numbers. We can look at a bank statement and know how much money we have. Even human capital, what you know and what the labor market will pay you to know and do, is easier to wrap our heads around. The reason why social capital can feel a little bit squishy is that it takes many, many forms. It can take the form of, perhaps, that person you thought of who helped you get where you are giving you emotional support. Maybe it was someone who gave you rides to your first job. Maybe it was someone who gave you information, and maybe it was someone who was even a reference, a referral, or the person who hired you in your first job. That's all social capital in action, and it can therefore be so broad that it can be hard to measure.

That being said, it's something that we can't afford to ignore right now. We know that the state of young people's social capital is not as strong as it needs to be to ensure that all young people are thriving. This is data just out this week from Hopelab, which studies mental health and youth well-being. And what you're seeing here is that among young people who report fair or poor mental health (which is not all young people, I want to clarify; the report sort of dispels the myth that every young person today is in a mental health crisis), loneliness and disconnection are leading factors. Right? Our young people are not connecting at the rates they need to in order to be well and to be thriving. We also know from research at the Search Institute, whose developmental relationships framework I urge you to check out if you don't know it, that we as adults don't always have a perfect grasp on the quality of our relationships with the young people in our lives. This data shows, along a variety of dimensions like expressing care, providing support, and expanding possibilities, that adults tend to overestimate the rates at which they're doing those things with the young people in their lives, whether their children or young people at the schools or youth-serving programs where they work. Students, on the other hand, report accessing these things, yes, but at lower rates. So we know there's a disconnect in terms of the lens that we bring to the relational lives of students.

What I also often hear in my work, and I know we have a mix of K-12 and higher ed folks on this call, is that college is where you build a network. The reality, though, based on Strada's research and others', is that networking skills and opportunities are not as widespread as we might hope on college campuses. In fact, among the leading career skills measured on the NSSE survey, networking with alumni and career professionals to make connections is where students feel least confident. We also see this bear out in some of the mobilization data. Think about what percent of college graduates you'd guess found their alumni network helpful. If you consider the glossy pamphlets with which we sell college as a chance to build a network, the numbers don't line up with that representation. Only 9% of college graduates overall report that their alumni network was helpful, and only 18% of alumni of more selective schools. In other words, this social side of opportunity is often left to chance or goes ignored in the structures inside our education system.

Now this, in general, is something that we should pay attention to, I'd argue, if we care about mental health, well-being, academic success, and career success. But it takes on a new urgency with the rise of AI. Because with the rise of this new technology that I know all of you are exploring, experimenting with, and probably bringing to your classrooms in really inspiring ways, there's a hidden risk: experience and connections and social skill are all going to command a growing premium, even though we may not be paying attention to those social sides of opportunity in how we're guiding students. What does this look like? Well, we know that technical skills are something AI is increasingly adept at, and therefore what people are increasingly calling, quote unquote, "human skills" are becoming more important, more in demand in the labor market. We also know that entry-level work is starting to get disrupted and displaced. Right? When ChatGPT first came out a couple years ago, it was framed as sort of an intern in your pocket. It could do intern-level work. Fast forward a couple years, and with tools like Deep Research it can now do something much closer to PhD-level work, at least on some tasks. And as a result, we see employers backing away from hiring entry-level talent that has no work experience, that has no connection to doing real work in the context of real professional settings. And lastly, there's a real irony to the efficiency that AI appears to lend to the job search process. We see young people increasingly using AI to create résumés and complete their applications to entry-level jobs, while at the same time, employers are using screening systems that screen out a lot of that early talent and are overwhelmed by the sheer number of applications they're getting. That is pushing the labor market back toward a "who you know"-based hiring process. And so technology, yes, is lending productivity, but when we look at the social layer of what's happening at the interface of education and work, we see human connection rising in importance.

At the same time as it's growing more valuable, I argue human connection is also becoming much more vulnerable with the rise of what are called AI companions. And I want you to think of AI companions in terms of their function, not just their form. By that I mean, we know that there are a bunch of, I would argue, somewhat creepy applications emerging in the consumer market that are selling AI boyfriends and girlfriends, and that people are falling in love with their ChatGPTs. And these headlines can sound really extreme. We also know that half of young people report experimenting with AI companion tools in the consumer market on a regular basis, meaning at least a couple times a month.

Julia Freeland Fisher  10:08
However, this is also a form that's starting to emerge in education, not for intimate partnership or even friendship, but really as a companion that can help students make progress in their college journeys, their career journeys, and their understanding of the pathway that they're on. And the fact that these tools take this companion-like form, that they are anthropomorphic in nature, that they are sort of ever there, 24/7, poses real questions about how young people are going to relate to AI in the coming years, and how that's going to impact how they relate to one another and to the adults in their lives. What I think is really important to understand about this companionship category, or function, in the market is that it's sticky. This is research from Andreessen Horowitz, a VC firm that is betting really big on this companion market, putting millions and millions of dollars into startups in the space. And what they've seen is that when people, not specifically young people, but users in general, are using these companion apps, they are spending significantly more time on them than on other types of tools. You'll see ed tech at number two on here, far lagging those engagement rates. And so we have to pay attention to this trend, not just because it has the potential to shape young people's social lives, but because it's actually way out ahead in terms of stickiness, I would argue potential addiction, but also just sheer usage rates among the user base, including young users. We also know that users of these apps lean younger. Character.AI, one of the fastest growing of these strictly companion apps, selling friendship, entertainment, and, based on some of the research, I would argue, intimate and sexual chatbots, has a user base that is over half people under 24.

So we have to pay attention to this: even though it's not strictly ed tech, it's going to have an outsized impact on young people right now. And the question I've been asking in my research, which studies disruptive innovation across the education sector, is, how is AI poised to potentially disrupt human connection? And I want to nuance that term, disrupt. It's a very specific term of ours. If you meet any entrepreneur these days, they'll tell you they're disrupting whatever industry they work in. But disruptive innovations follow a very specific trajectory, both inside and far beyond education. They start off not as good as existing offerings, and they tend to sell to what we call non-consumers, people who are otherwise shut out of the market, who can't afford or access a given product or good. Over time, they tend to get better and satisfy more and more demanding customers, and eventually they get good enough to displace mainstream offerings. A concrete example of this outside of education is what Sony did to disrupt RCA. This is back in the day when Sony started building transistor radios. They were pretty crummy in terms of sound quality, but they were far cheaper than the state-of-the-art tabletop radios RCA was selling, the kind you picture in a 1950s living room. But with this affordable, slightly crappy technology, Sony could sell to teenagers who were delighted to spend a little bit of pocket change on something that allowed them to take a radio outside their house and listen to rock and roll. Over time, Sony's technology got better and better, eventually displacing RCA, and that's the process of disruptive innovation. Again, it starts off not looking particularly impressive, but it's that very attribute that gives it, over time, a competitive advantage to displace industry incumbents.

So what are we seeing right now in this market of AI companionship? It's that bots are better than loneliness, right? If your alternative is nothing at all, a bot does not seem like a crummy alternative to human connection. And the trajectory we see these taking is that, after initially getting a foothold among users who are seeking out connection and not finding it in the real world, these technologies are getting better and better at responding, at appearing empathetic, at connecting with users, starting to displace the outer rings of our networks and then eventually coming for our more intimate ties. That's the trajectory we see in the consumer market. I want to emphasize, and I mentioned this already with folks like Andreessen Horowitz, there is a ton of money going into this market. So even if those of you listening right now think an AI boyfriend or girlfriend is ridiculous, or play with this technology and find it not particularly compelling, that reaction is rational, but it doesn't line up with where investors see immense potential in this market. There's a lot of money being poured into it. And this is from an Andreessen Horowitz memo called "It's Not a Computer, It's a Companion"; this was the introductory quote they used. It's a user of Caryn AI, another of these intimate consumer apps, saying, "The great thing about AI is that it's constantly evolving," right? We know this. We're seeing it in the productivity world, where we get better and better versions of ChatGPT and Claude on a frequent basis. But the way that users experience this socially and emotionally is that one day it will be better than the real girlfriend. One day the real one will be the inferior choice. That is textbook disruptive innovation. And again, I want you to maybe edit out the word girlfriend here and think about friend, companion, mentor, coach, advisor, all of the ways that relationships show up in our lives, and the ways in which this technology is not just a breakthrough in productivity, but is starting to be a breakthrough in lending human support.

Now that's a consumer market trend. I think we all have to be wide awake to it in education, because just like social media has shaped the worlds that our young people bring into the classroom today, this new technology, I think, is going to do that on steroids. But there's a different question we have to start to ask inside of ed tech, inside of the decisions you're making every day when you integrate AI into your workflow, into your classrooms, and into your organizations. And that's the fact that AI does have the potential to disrupt what we think of as the social capital advantage by lending students personalized support. Whether you successfully navigate the college application process shouldn't depend on knowing someone who went to college. The zip code where you grow up shouldn't determine whether you have access to high-quality instruction and instructional materials. AI could be a game changer in all of those ways. What we have to hold in tension, or in balance, with that, however, is the fact that as we use AI to supplement or replace those things that historically have traveled through human connections, networks, coaching, and instruction, there's a real potential that we're disrupting human-to-human connection with simulated companions, with anthropomorphic AI. And so the question we have to ask today inside of education is not "Should we not use AI?" Right? I certainly want to keep my own kids away from some of these consumer apps, but I want to see their classrooms using this in all sorts of smart ways. The question we have to ask is, can we build it to foster rather than replace human connection, and can we build learning models that, given all of these dynamics in the labor market, understand and integrate the social side of opportunity into how students go through their learning journey, ideally brokering the connections and networks they need to get the jobs they want? We asked this question last year in a study we conducted in the college and career navigation market. I want to acknowledge that I suspect many of you are coming at this work from a classroom standpoint: how can we integrate AI into teaching and learning? And while that's not what we studied, I actually think our findings are really applicable to the teacher-student-AI triangle that I think we're all going to be navigating in the coming years, because it is a fundamental question. As we looked at this market, we were asking, how are students going to be accessing support and information, content and skills, in the age of AI, and how will that impact human connection? And I want to share a little bit about what we found, which is, I think, a story of both risk and hope.

Julia Freeland Fisher  18:12
The reason we set out to study this market is that, as I looked at the rise of AI companions and this idea of human-like coaches and mentors that GPTs and other tools can provide, I thought we were going to see the fastest rise of these where our ratios were the most broken. And I don't have to tell you guys this, but counseling and student support is one of those places, right? Nationally, we have an average of 385 students to one high school counselor. It's really hard to get into deep relational work when that's the ratio you're operating within. We also know that there's a lot of non-consumption, or lack of access, when it comes specifically to career guidance. Our counselors are spending a lot of time on a whole slew of things, ranging from scheduling and academic advising to college access, and the idea of helping students explore careers is just hard to fit in. So a limited amount of time is spent on that. And then we look to the higher ed space, where ostensibly many students are going to college in hopes of getting a job, in hopes of achieving the American dream, but our ratios are even more broken when it comes to access to career support. All of this creates a ripe space where AI could be a game changer, and we're seeing a whole bunch of companies modernize in this space and start to integrate chatbots and generative AI tools to be there for students 24/7 to lend support. We're also seeing a ton of startup activity in this space. So we wanted to understand what's actually happening when students are engaging with these tools. We interviewed 30 providers on the supply side of the market. So this was not student-facing research; this was trying to understand how these tools are being built, and what sort of division of labor between humans and bots is coded into them. And one of the first things we asked about was, what are the types of support that students are getting, either from humans or from bots? We adapted, and I don't want to get too wonky with the theory here, something called social support theory, which is the idea that our relationships provide myriad forms of support. So students need esteem support, someone who's in their corner, cheerleading them, saying that they believe in them, but they also need things like tangible support, right? They need cash and rides to school. I won't belabor each of these categories, but you can imagine, in your own life, and ideally in your own work, how these different forms of support show up in how you interact with students. And this wasn't just an esoteric exercise in naming the different types of support. We wanted to ask about the human-bot division of labor. So based on how these tools are being designed, which forms of support are humans versus chatbots providing? I know this is a little bit of a complex diagram, but I'll just say that what we heard at first blush was exactly what I would expect, and exactly what everyone is putting in slide decks and strategy decks right now in this market: the idea that if bots, if this AI technology, can start to do a lot of the administrative, informational, and even skills-based work with students in a highly personalized way, we could suddenly free humans up to do the quote-unquote "human stuff": emotional support; esteem support, again, expressing belief in students; and motivational support, helping students imagine future possible selves through conversations, through exposure, through brokered opportunities. So the breakdown of what we heard is exactly what I would expect to hear.

But what I want to underscore for you is that this idea we have in education of the quote-unquote "human stuff" is getting increasingly fragile as the technology gets more sophisticated. Leaders of these companies and tools definitely asserted that humans are still preferred for things like esteem and emotional support. But what we heard is that bots are starting to catch up, that bots are actually really good at motivational interviewing, that students sometimes feel much more seen by a bot than by the advisor or educator they're supposed to turn to, that this emotional support on a 24/7 basis is meeting students' needs in ways that our existing emotional support structures are not. Here are some quotes just to give you a sense of how this is actually being programmed in on the supply side of the market to reach students. This is Christine from Handshake; Coco is actually no longer a tool they're using, but this is just an example of how Handshake, a leading provider of job access in the higher ed space, was starting to train a chatbot to sound like a quote-unquote "cool older cousin." Bottom Line is a college access and success organization working nationally that had built, or uses, I should say, Blu, a bot that was trained to be a cheerleader. And we even heard from some providers, and this was the exception, not the rule, allusions to the idea that students were starting to form social-emotional bonds with these bots, that they were becoming very connected with them.

And so the question we have to ask is, given that AI is moving in this socially and emotionally adept direction, and given that tools are engineered to drive toward engagement, which often means exhibiting these companion-like behaviors, how much social-emotional support should bots be providing to young people? That is a key question that I don't hear enough school districts and colleges asking right now, but it's right under the surface of where a lot of these tools are headed. And to answer that question, we needed to understand from providers, well, what's driving students to these bots in the first place? What we heard is this interesting mix of logistics and psychology. On the one hand, logistical factors are very real. Students may not have access to an advisor, period. Right? If you're a commuter student at a community college and you don't actually know your advisor, you don't have someone to turn to; that's an example. If you're a student growing up in a rural district where you're actually sharing an advisor across multiple high schools and across a whole region, access is a real challenge. We also know that students are not always filling out the FAFSA at 2pm at their allotted time to meet with their advisor, right? In fact, at 2am they might be trying to do the tasks we're hoping they do on their journey to postsecondary planning. And so lack of on-demand support, in the hours that students actually need the support rather than when that support is available during the school day, is another big factor. But we also heard about psychological factors that are real: students had questions or concerns or anxieties that they felt shame turning to their advisors about, and they may have just wanted to avoid human connection altogether. In the digital age, we know that 40% of Gen Z actually say they prefer digital to in-person interaction, so there's a cultural factor here around the preferred modality for seeking and receiving support. This is a quote from an advisor at College Advising Corps, which was likewise deploying a chatbot, about this idea that there are questions, concerns, and anxieties that students will bring to these safer online spaces that they won't bring to a human. And what I want to point out is not that that is bad. It is totally rational behavior, and it's a way to start to reach students that may have historically been underserved by the system and hard to reach. But the question we have to constantly ask is whether that's a feature or a bug, right? These behaviors, if taken too far, lean in the direction of antisocial and isolating behaviors. And if we go back to that statistic, that an estimated half of jobs and internships come through personal connections, we have to pay attention to this, not just as an indicator of near-term isolation, but as a cost to accessing long-term opportunity. So that begs this much bigger question, and this is the tension, again, that I would urge you to wrestle with in your classrooms, in your advising, and in student support systems across your institutions: when are chatbots and AI tools in general expanding access, and when are they making isolation more convenient and comfortable for our students? When are they actually taking the worst of what I would argue two decades of addictive tech in the consumer market have engineered into our hearts and minds, and making it even worse?

The good news, though, is that what we heard time and again is that there is a real possibility that using these tools could unlock a whole new layer of human support and access to opportunity in our education systems. One example comes from Georgia State's student success work, where they took resources that were being saved and accrued by using AI and poured them back into hiring more student advisors, which meant they saw a huge increase in students going to their advisors as a result of nudging via chatbots. The other thing that we heard a lot about is that if advisors, counselors, and educators are not in the business of answering FAQs, they could suddenly start to be in the business of being opportunity brokers. And we heard a lot about this concept of warm introductions, which is something that, although some tech builders in Silicon Valley would disagree with me, I would argue remains a uniquely human skill. Technology can always provide you with a list of people you should talk to or reach out to, or tell you to go build a LinkedIn profile. But AI can't do that warm introduction of saying, I know that you would benefit from connecting with my student, and here are all the reasons you should talk with them. So that idea, that we free up time and move it into deepening and diversifying students' networks, came up time and again in our interviews, and I think it's where there's a lot of potential to shift our focus away from AI simply in service of increasing learning and persistence, toward increasing learning, persistence, and access to human connection.

Julia Freeland Fisher  28:09
So I want to take a final few minutes here to consider some of the safeguards and opportunities around what human connection can look like in the age of AI. And again, I'm using this "in the age of AI" term a little bit loosely here, because I want to honor that these considerations, while we came up with them in the context of advising and the college and career conversations that students are having with chatbots and with humans, can apply also to the classroom. The first is that if we're going to take seriously both the threats in the consumer market and the opportunities in the education market, we have to start measuring human connection with more rigor and regularity. We have to pay attention to people indicators, not just progress indicators. We have a whole report on our website, whoyouknow.org, called "The Missing Metrics," that provides example survey items and other ways to capture data on how well-connected students feel. The Search Institute, which I mentioned earlier, has a wonderful survey around students' access to developmental relationships, and also a great tool called the Relationship Check tool, where adults can reflect on the degree to which they're engaging in developmental relationships with their students and advisees. This piece is critical, because otherwise we're flying blind, and our traditional measures are not going to tell us the degree to which students are either building their muscle to be in relationship, or whether that muscle is atrophying as these self-help bots become more commonplace. The next thing is that we need to prompt AI to prompt relationship-building. So those of you who are using Playlab or other proprietary AI tools in your systems, think about ways that those tools can have this human relationship theme threaded throughout, so that when a student goes to them for academic or college and career help, the tool is not just giving them answers, but encouraging them to reach out to the people around them and building their confidence to do so.

That relates to the next recommendation, which is that there are tools starting to emerge, one called Goldi by an organization called Climb Together, that are really saying, let's take this anthropomorphic AI and not just give students better instruction, but give them a chance to practice telling their story to potential employers and potential admissions officers, and actually chip away at what I would argue is a real crisis of help-seeking aversion. So we can build AI to do that. It's not the default setting of ChatGPT; ChatGPT wants to be your self-help coach in perpetuity, right? But we can build tools that actually move us in this human-centered direction. I already mentioned this idea that if we're using AI as an efficiency play, it becomes even more critical to reinvest those saved resources back into staff capacity. Now, I want to acknowledge it's 2025; the idea that there are abundant resources lying around is a joke. So I want to acknowledge that this is directionally where I think we need to be setting strategy over the next five years: when we're saving time, when we're saving resources, rather than that netting out to just more resources to pour into more technology, those resources should actually be directed toward human connections.

And then lastly, I think we have to start paying real attention, you know, not just to escalating cases where students express intent to harm themselves or others on these tools, but also to escalating when there are signals that students are isolating, that they're not meeting with their advisors or their educators, that they're spending too much time on these tools in a way that's taking away from their real-life interactions. And these are all things that we've actually seen start to crop up among the technology tools that we studied. So I want to give just a glimpse of what this can look like for those of you who are either building or procuring technology tools, so you can start to get specific about, okay, what are the social features and functionalities that we could elevate alongside the student-facing efficiencies and personalization that AI affords? One example that I absolutely love comes from Uprooted Academy, which is a nonprofit college access platform doing a great job of integrating not just better college guidance and advice for students who are first in their family to go to college, but a whole bunch of mental health and psychological supports for students going through that process. And one of the things they recognized early on was, yes, we can give students better information and a better experience, but we can also tap into the existing assets in their lives, which traditional college guidance doesn't always do. And again, I would echo that this is true in academics as well. If you're thinking about how we move a student through a curriculum, how we ensure that they're on grade level or advancing at the rate we want, we have to start enlisting families and communities in that process if we want it to be sustainable and effective. So what Uprooted Academy does is have students identify what it calls your tribe of five: the five people in your life, whether peers, near peers, family members, or other members of your community, that you think you can turn to on this college journey. It collects their contact information, and every few weeks, it automatically texts those individuals with an update on the student's progress and tips on how to help them. Suddenly, using this technology not just in a student-facing or educator-facing capacity, but in a community-facing capacity, is unlocking what we think of as latent social capital in young people's lives. And again, I think this is happening in ad hoc ways in our school systems, but the idea that we could do it efficiently for each and every student is where the real unlock comes in. Another example of this is using AI to catalyze conversations that otherwise might not happen. You know, I started this research years ago. I've been writing about relationships and networks for almost a decade. And about a year ago, I realized I wish I had started my research writing about conversations rather than relationships, because conversations are the building block of relationships. And right now, conversational generative AI is what we see starting to potentially disrupt human connection as we know it. So here's a tool that does a really great job of this. And I'm not selling you on these tools, necessarily; you should check all of them out. I'm trying to give you a sense of the technological designs that undergird them.
Protopia is a tool that sells into alumni engagement offices in higher ed with a very simple but powerful premise: it can cull through alumni directories and find the best-positioned alumni to answer students' career questions. It's not a separate app; it's basically white-labeled. Students can go on their career services website and ask a question, and it'll cull through that alumni directory and send an e-mail directly to the alum who's the best fit, asking, are you willing to answer this student's question? If they're not willing or they don't respond, it goes to the next best alum. So you have two dynamics at play there. First of all, every student question gets answered by a human. Right now, whether you like it or not, students are turning to ChatGPT and these other tools to get their questions answered, so we're starting to live in a world where questions are getting answered at a much higher rate than connections are getting formed. Protopia flips that script and actually makes this an opportunity to plant a seed of connection. It also, though, I think, does what we as schools sometimes struggle to do, which is that it takes a latent network, in this case alumni, but think of that latent network as also your employer partners, your community organizations, your mentoring partners, your families. It takes all this latent social capital and starts to mobilize it using AI as the infrastructure. So it's a very different use case than what I think we're typically hearing about with AI, but it actually could be a game changer in addressing social capital gaps in access. A couple other tools before we close out here. There's a tool called Let's Get Ready. This is another college access program, sort of a virtual college advising program, and they've decided not to use AI in a student-facing capacity at all. They call it brain on, mouth off. So AI is listening and observing interactions between students and advisors, and then supporting advisors to have even more highly effective, evidence-based conversations with advisees. That's, again, another example of an infrastructure layer that's really there to strengthen a relationship, not replace it. I already mentioned Climb Together. This is the tool that is helping students start to tell their story, to answer what Nitzan Pelman, the founder there, would call the sort of impossible-to-answer question at the beginning of any job interview or coffee chat or informational interview, which is, tell me about yourself, right?

Julia Freeland Fisher  36:46
Which is a very well-hidden test of, like, prove that you are impressive, right? And it's code that sometimes young people don't have the chance to decode and practice. And so they've built a bot called Goldi to help students tell their story and start to make an ask in terms of gaining access to job opportunities and internships. And then lastly, I already shared what Georgia State University has managed to do in retaining students at a higher rate, using nudges to push them to advisors, and how that has led to a reinvestment in more advisors. It's trickier to pull that off in the K-12 space, where retaining students does not get you more dollars, but I think it's again a good example of AI in service of human-to-human relationships and connections. What I want to end with here is that I show you these examples not, again, to sell you on particular tools, but to warn that what I see happening as I look at trends across the ed tech market is that there's a lot of rhetoric about using technology to free up time to build human connection, but there's not a lot of technology being built that fits that purpose. And we in the education market, if you're working at a school or institution, have to shape demand for that. Time and again, I'm talking to tech providers who are not hearing from the market that this idea of human connection is actually what we're demanding. And as a result, we're getting tools that are better at productivity, better at content delivery, better at assessment, but not actually designed for human connection. If you're listening to this and you're on the provider side, actually building tools, these are examples of not totally rethinking your tool, but actually just adding a prosocial layer to the tech stack, so that we're living in that balance between personalizing content and support for students without doing away with their access to human connection, and without letting their muscle and confidence to seek out help in the real world atrophy. So I'd urge you to take this content and go back to your teams, go back to your schools, and have conversations about this. Have conversations about where you see the most promise in AI's potential to grow your students' social capital, to make sure they have the networks they need to get the jobs they want down the line, and also be honest about the risks that you're actually grappling with as you integrate AI into your classrooms and into your administrative workflow. I'll end there with just a reminder that, really, you know, I think that we can forget the social side of opportunity, and we can really focus, in this particular moment, on all the ways that AI could supercharge and scale learning, and that is so critical. But if we don't hold alongside that the importance of scaling connection, we are not setting our students up to live choice-filled lives and access opportunity in the future. So thank you so much for coming. I'm so excited that you guys have the rest of the conference and all of the amazing sessions, and I'll hand it back to our host. I would love to get in touch over LinkedIn, if anyone wants to go deeper. Thank you so much.

Rhea Kelly  40:00
Thank you for joining us. I'm Rhea Kelly, and this was the Campus Technology Insider podcast. You can find us on the major podcast platforms or visit us online at campustechnology.com/podcast. Let us know what you think of this episode and what you'd like to hear in the future. Until next time.