Campus Technology Insider Podcast November 2022

Listen: Balancing Data Insight with Data Governance, Privacy, and Transparency

00:08
Rhea Kelly: Hello, and welcome to the Campus Technology Insider podcast. I'm Rhea Kelly, editor in chief of Campus Technology, and your host.

Ravi Pendse is passionate about data privacy. As vice president for information technology and chief information officer at the University of Michigan, he has worked to ensure that privacy is a part of every technology decision on campus. At the same time, he is committed to fostering a robust data culture that democratizes the use of data to inform decision-making. At the center of that culture is transparency: making sure students, faculty, and staff know exactly what types of data are collected, and how that data is stored, accessed, managed, and shared. In this episode of the podcast, we talk about creating a data-aware, privacy-aware ecosystem, data governance challenges, making data visible to students, and more. Here's our chat.

Hi Ravi, welcome to the podcast.

01:11
Ravi Pendse: Thank you so much. It's a privilege and honor to be with you.

01:16
Kelly: So at the University of Michigan, you've created a guide to personal data for students called ViziBlue. I'm wondering how that came about.

01:27
Pendse: First of all, I'm very appreciative of your interest in the subject of privacy and specifically ViziBlue. But what I wanted to do is take a step back and share with you that I joined the University of Michigan in 2018, so I'm in my fifth year now. When I joined, working with all the stakeholders across campus, our students, faculty, and staff, we set out five high-level goals or priorities that we wanted to focus on. And they're all tied to this; that's why I'm sharing them. One of them was really enabling data-informed decision-making across campus. Our goal was to democratize the use of data, and anytime you're talking about data, you have to think about security and privacy. Another goal was elevating the customer experience. We wanted to be the very best for our customers; we wanted to meet them where they were. And when you talk about elevating the customer experience, often it's informed by data, because customers tell us whether we're doing well or not, and that data becomes important. Again, then, the privacy of that data becomes critical. We were also focused on research: the University of Michigan is one of the top research institutions in the world. We do about $1.6 billion of research every year; it's the number one public research university. And one of the key factors there is delivering intuitive research computing solutions. When our researchers work on major projects, including vaccine development, sending missions to Mars, and all of these interesting things, guess what drives that: a tremendous amount of data, and a tremendous amount of data movement. So you're talking about data at rest and data in motion, and that's again tied to the privacy and security of the data. We were also focused on building a transformational network, because we wanted high-speed networks so our researchers could move their data from point A to point B or share that data with researchers across the world. And last but not least, tying all this together is how we secure our open society.
And when we talk about security, we are tying together security and privacy. So as we thought about all five of these top goals, an important thread throughout is privacy: respect for privacy, privacy of data, transparency around data, governance around that data, and how that data is stored, accessed, managed, and shared. The concept behind ViziBlue actually started as an ITS internship project driven by students. Our students were interested in designing a site that would give them feedback on what data we, meaning the University of Michigan, collect on students. And so we created ViziBlue, but the initiative came from students. ViziBlue, frankly, helps students understand how their data is collected: for example, when they apply for admission and financial aid, when they interact with teaching and learning tools such as Canvas, when they use their MCard, the ID card they swipe to enter a building or library, and many more. How is this data collected? How is it stored? How does this data support university operations and life on campus? How do we use this data to enhance the life of a student, whether in the classroom or outside of it? In addition to all of this information, ViziBlue also provides links and resources where students can actually see their data and update it. For example, when they apply, the address on file may be their home address, but when they move here, the address may be different. Once they see that mail may be going to their home address, they can update that data; or maybe their banking information has changed, and they can update that, and on and on. So the goal is also to educate them, provide them the information, and allow them to interact with that data so that they can change it if needed. The work on ViziBlue actually began at the start of the COVID-19 pandemic, and the first iteration of the tool was released in August of 2020 to all our students across campus.

05:35
Kelly: I was wondering about the timing. Did COVID and the need to track COVID-related data have anything to do with the launch of the project? I think that brings the concept of data privacy closer to home for students, because it's a real personal, like, health data issue.

06:00
Pendse: So while COVID did not drive it (the tool was already in development; the actual development started back in 2018 as a student internship project), COVID, frankly, brought the privacy of data further into the limelight. Like, if we are collecting your health and wellness data, who is this data being shared with? Where is the data stored? How long are you planning to keep this data? All of these are very relevant questions. And the University of Michigan frankly encourages these questions and discussions, because we want people to ask them. So it's very helpful when we have a vibrant debate on this topic. Because I'm a privacy hawk; I actually believe that the right to privacy should be as fundamental as the right to vote. This is just my personal opinion. And so I was delighted when students, faculty, and staff were asking these questions. Really, the goal of ViziBlue was to show in a very clear way what personal data U-M collects and how it is used, like I shared with you before.

07:06
Kelly: It's so interesting that the idea originally came from students, because I'm always wondering how students feel about the use of their data. Do they readily opt in to having their data collected? Do they care about privacy concerns, or do they just want to be aware of it? What's the level of awareness among students, do you think?

07:32
Pendse: I think depending on who you speak to across different institutions on this specific topic, the opinions may vary. But in my case, and based on my experience with University of Michigan students, I have found our students to be extremely savvy and very privacy conscious and aware when it comes to the use of their data. They certainly trust an academic institution much more readily than, say, a Facebook of the world, in terms of data trust and so on. But they do care about how their data is used and who has access to it. There was an Educause survey done in 2020, a report called the Student Technology Report, that similarly showed that students generally place trust in academic institutions far more readily than, say, some private organizations. But frankly, that puts the responsibility on us to ensure that we respect their trust and provide the kind of transparency they absolutely deserve. And a tool like ViziBlue also helps raise that awareness about privacy, and sets an example for what they can look for when they leave our campus and enter the workforce or become policy advocates in Washington, DC, or beyond. Because we want our lawmakers and others to really understand our ecosystem of data, the dependencies that datasets have, and the consequences that connecting a bunch of datasets can result in, and to be really aware of it. And we feel that ViziBlue is a tool that does that. I often have students question us: Who has access to certain data? Why is that data collected in the first place? And we welcome all of those questions. For example, I'll give you one situation from early in my tenure. Some students wanted to know, when they would send a copy to print to a network printer, say in another location, remote printing, how long the data stays in the printer after they pick up their printout. It's a question that I had not even thought about, to be candid with you.
So that question came from students, to give an example of how interested students are in privacy aspects and how savvy they are. And our goal with ViziBlue was to proactively address these questions, to serve as a go-to resource for these and other routine queries, and to use this as a dialogue to teach them, learn from them, and make us a much more data-aware, privacy-aware ecosystem at the University of Michigan.

10:11
Kelly: I also wonder, if students inherently trust the university to take good care of their data, are they aware of the technologies involved, like education technology? I mean, the institution is intertwined with third parties that will also be able to access data in some ways. Is that an area that students are savvy about?

10:36
Pendse: So I think there are nuanced things there, right? For example, we use Canvas as our learning management system, so inherently, all of the learning management data is in Canvas. Canvas is a third-party provider, right? And so the question is, how can we ensure that Canvas is following all the rules that it's supposed to? That's where we typically have agreements with all these third-party providers to ensure that all State of Michigan laws and all our requirements are followed. So all third parties are vetted very, very carefully. And we share all of that information with our campus community through our Safe Computing website, because these questions come up all the time, like, can Google read my email? Michigan uses Gmail, so, can Google read my email? That's where we have deployed tools. One of the tools we have deployed is from a company called Virtru. It's spelled V, I, R, T, R, U. And Virtru allows you to encrypt your Gmail messages, for example. We wanted that tool to be available to our faculty, staff, and students so they can get an even further sense of privacy, and the tool allows you to do many other things as well. So we deployed it to make it available to students, faculty, and staff, because, going back to what I said, it's our responsibility to provide them the right tools and make them aware of what's out there. That's why we made a tool like Virtru available to our campus community.

12:09
Kelly: Can you give an overview of all the information presented to students in ViziBlue? Because I think it's really interesting to understand how thorough it is, across all parts of the institution.

12:21
Pendse: One of the things we also did with ViziBlue when it was rolled out was to make it not just about what information we provide, but a living, developing information site that continues to grow. As I shared with you, we did that to make sure, first, that information is presented in a very clear way. So we take input from students: Is this clear? Are there areas you would like to change? Quick focus group work that we do with students. Then, for example, when they apply, there is admission data, and that information is available to them; there is financial aid data that's available; when they swipe into buildings, that information is available. And when, for example, they connect to our WiFi, let's say they're connected to WiFi in building A and then later they connect to WiFi in building B, we know that they moved from building A to building B. Not that we are tracking them, but that information is available. So all of the WiFi data throughout the day, as they move around and connect to WiFi, is also available to them. And these are just some examples. So really, the data encompasses admissions, academics, financial aid, finances, WiFi location, and videoconferencing. How many Zoom video calls did you join today? Was it for a class? All of that information is also available. Each time, our goal is to incrementally add new information and then share those results back with students. And we continue to do small focus groups. During the COVID years, doing focus groups was harder, but now that we are back in person since last year, we will continue to enhance that. These are just some of the examples. Along the way, ViziBlue also has places for students to update their data if they feel something's wrong, and there are places and links available for more education in terms of data literacy.
So those links are also available. And as we get feedback, we continue to enhance this type of tool, because one important thing to know is that it's not about rolling out the tool, celebrating, and stopping. It's a growing, developing, evolving tool, and we want it to be as useful as possible for our student community.

14:39
Kelly: One of the things I was impressed with is how much detail ViziBlue goes into about what types of data are collected, but then also exactly how that data is used. And it makes it look like there's a ton of data-driven decision-making going on across U-M. So I'm wondering what it took on the back end, in terms of things like data governance and infrastructure, to really make the most of all that data you're collecting.

15:10
Pendse: So first of all, at Michigan, one of the phrases I coined was data-informed decision-making. I encourage people to stay away from the words "data-driven," because the engineer in me knows that if it's data-driven, I can often eliminate the human being in the loop and automate everything. Our goal is not to do that here. The goal is to keep human beings in the loop, and basically to enhance and inform our colleagues to make decisions based on data and the information that data is providing. So we talk in terms of data-informed decision-making. And you're absolutely right. In fact, I have shared with the campus community that our goal is data democratization: we want to make sure we are empowering our users to be able to use the data that's available to them, in a just-in-time format. The other thing is, as we think about data, I feel that for any institution to have a successful data ecosystem, you can think of it as a three-legged stool. The first leg is, of course, the wisdom and the information that you derive from data for data-informed decision-making. The second leg of the stool, which is very, very important, to your point, is data governance. We established robust data governance several years ago, and there's actually a data governance advisory committee that weighs in on all data-related decisions; it's advisory to myself and our provost. So we have terrific data governance in place now, and that work still continues. So data governance is the other important leg of the stool. And the third leg, obviously, is the privacy of the information and transparency around what information is being collected. I believe that these three legs have to be in balance for your data culture and your data ecosystem to be successful.
Because, for example, if you have very bureaucratic, extreme data governance, such that nobody can do anything, people are going to ignore the whole thing and do their own thing. If you don't have proper data governance, and you're just collecting data and sharing it with no rules and regulations, that can turn into a very, very bad situation. And if you are not careful about privacy and transparency, you may end up losing the trust that people place in you. That's a responsibility, and we would never, ever want to compromise the importance we place on privacy and how we value the trust that the campus community places in us. That's why these three legs, the data governance, the use of the data, and of course the privacy and transparency of the data, have to be in balance for a successful data culture and data ecosystem.

17:52
Kelly: That seems like a fine line between too much data governance and too little.

17:58
Pendse: Absolutely. That's where they have to be in balance, right? Because if you try to govern data too tightly, then on a campus where there are incredibly brilliant innovators everywhere, they're going to find a way around it. So it's a balancing act, and that's why we have that responsibility. And I'm proud to say that I have an incredible team of data governance experts, and the committee comprises many, many faculty members and others who take their jobs very seriously. And we respect it when they opine on things.

18:32
Kelly: Can you share any challenges you may have worked through, or maybe are even still facing with data governance?

18:39
Pendse: Well, there are always challenges, right? One of the challenges we had early on was, for example, how do you define data stewards or data owners? For example, if you are the registrar of the university, you should typically be responsible for student data, right? And for human resources data, the VP of HR should be responsible. So we had to work initially to educate everybody and get people to understand the different roles and so on, and that was challenging. The other part was that the University of Michigan, like most institutions, is a decentralized organization. So we had to remind people and units and colleges that it's not your data or my data, it's our data. Just because you're a college or a school within the university, it's still the University of Michigan's data. It's still our data, and we become stronger and can do more with our data when we share it, not when we hold it. That was an important lesson for all of us, but we all came together. I often like to say it's like the five fingers of your hand: they are different sizes and different shapes, but if you want to get things done, they have to come together. And when I talk about different shapes and sizes, I'm talking about the diversity of the individuals involved in this decision-making and the diversity of the data that we collect all across campus. Yet all of that data and those individuals have to come together to get things done on a wonderful campus like ours. And that's where trust is critical. So we were very methodical about it, very thorough, very patient, because you can't rush. You can't just say to people, "Oh, just trust me, and life will be good." You have to earn that trust.
You earn that trust by showing the results of the data that you have and what that data is doing. So it was not a 100-meter dash; it was a marathon, and it's still going on. We continue to earn the trust, we continue to partner across campus, convincing people why sharing data is what gives us our power as an institution, and it's working really, really well. People are coming together. But it wasn't easy at all, and I don't think it's ever easy anywhere, because these conversations are hard. A researcher may say, "What do you mean, you can't open the port? Just give me the data, it's important to my research." Well, if it's patient-related data, there are rules we have to follow before I can let you access any patient-related data, right? Somebody could say, "What do you mean, you can't share the vaccine data with me? It's important for my research." Well, there are rules around vaccine data, so we have to work with people to educate them. And once you do, people understand. If you show them the way, people will come around and actually become your partners. And we've had great success there. That's because of, first of all, the culture of cooperation that exists across the University of Michigan. We also have a culture of very vibrant debate. So we debate these things, and that's good. We question each other, which is great. That's what a higher education institution is about, right? We question each other, we learn. I sometimes jokingly like to call it constructive irreverence, which is great.

Kelly: I like that.

Pendse: And so, you know, it's all good, and we're working through that. We believe that we are in a great situation, and our plans this coming year are actually to release a ViziBlue equivalent for our faculty and staff. We're working on that now. We would like to have a similar presence for our faculty and staff; again, it's a different type of data, right? I remember I was in a meeting at the business school where one of the departments wanted to know, why do we even bother to collect any of their WiFi data? Which is a fair question to ask, right? And so we had a discussion and debate about that. But we want our faculty to be aware of what data, if any, is there, what's collected, and how long it's kept. I'll give you an example. Initially, when COVID started and we were collecting COVID symptomatic data, where people had to report symptoms, if any, we told campus at the get-go, based on advice from our medical experts, that that data was going to be kept indefinitely. The reason was that our medical professionals were convinced, and they were right, that COVID was not a one-year thing. They felt it was going to be a multi-year thing, and having data from two years ago in the fall would inform us of how we're doing this fall, comparing that data, right? If there was a cluster happening in place A, was there a cluster in place A two years ago? Having that comparative data was going to be very, very useful. So really, it's a credit to the foresight of our physicians and doctors, who said, "No, no, no, we want you to keep this data." So we were very explicit with the campus, letting them know that anonymized, not named, but anonymized data was going to be kept indefinitely, and that we would keep the campus informed. That's called managing expectations: being very clear and transparent about it.

23:44
Kelly: When you find people need to be educated about some use of data, is that something you do in the moment, or do you have any formal communications or trainings going on?

23:57
Pendse: So, obviously, we don't have explicit training in, say, data privacy specifically. But overall, we are working hard to create a culture of data and data awareness across campus, and tied to that is, of course, the privacy and security of the data. For example, if you're somebody who is responsible for handling HIPAA data, then you are required to go through a certain amount of training. There are others who may be dealing with sensitive student data, or what's called FERPA data, and everyone who deals with FERPA data has to go through certain required training. So there are all those things, but in addition, periodically we will have interesting speakers who come to our campus and engage with us. For example, on September 28 of this year, in a few weeks, we have Toomas Hendrik Ilves, the former president of Estonia, coming virtually to talk about overall digital transformation and the privacy of data. If you recall, Estonia is a country that was the target of multiple cyber attacks, allegedly by actors from Russia, and it's one of the most digitally advanced countries. So we're going to have a conversation, and participating will be many, many colleagues from across the campus: faculty, staff, and students. We'll all learn together and all be richer for it. We also take advantage of cybersecurity awareness month in October and Data Privacy Day and month in January to have multiple events and speakers across campus focused on data privacy or data security, and to have these conversations and dialogue. And that enriches everyone.

25:43
Kelly: So to change gears slightly, I'm also interested, you know, how does U-M's commitment to data privacy and transparency impact the technology decisions that you're making, let's say when you're procuring or deploying new technologies?

25:58
Pendse: Great question. Both data privacy and transparency are critical factors in how we make decisions about the services we provide. When we procure new technologies, as I shared with you, we do very, very thorough vendor assessments that help us ensure that the service providers we are doing business with have the capabilities, resources, and commitment to, frankly, keep U-M data secure and respect the privacy of our constituents. That's why I gave the example of Gmail before. And if a potential vendor does not meet our security or data protection standards, we simply don't do business with them. We are really explicit about that. Tied to that, I might also mention that while we're doing this kind of evaluation, we also evaluate the accessibility of the technology tools. If we feel that a technology tool is not accessible, or if accessibility is not on the roadmap for that vendor, we will not do business with them either. So we're very thorough about these things, because we are just very committed to the privacy of the information. We also seek out tools, as I shared with you before, to help our faculty, staff, and students be, frankly, more secure and privacy aware in their everyday activities. One example was the tool Virtru, which we made available to the entire community back in fall of 2020. That came in very handy during COVID times; people were concerned, people were working from home, so having an encryption tool for those who wanted to use it was terrific. We also added a tool from a company called CrowdStrike that, I should say, added more enhanced security and additional tools our campus community could use to keep themselves secure when working remotely.
So things like that. We're always open to looking at tools that enhance security, that provide the right kind of privacy, that are, frankly, user friendly, and on and on, and making those available to our campus community.

28:14
Kelly: Yeah, if something's not user friendly, then people just aren't going to use it, right?

28:18
Pendse: I know. The iPhone changed our perspective completely, right? Now we expect information at our fingertips; we expect to push a button and have things happen. And frankly, it has consumerized … I've jokingly always said that ever since the iPhone came, every human being became an IT expert. They know exactly what needs to happen. And so we as IT colleagues have to step up and deliver; that's the expectation. So we have to work very hard and really meet our users where they are. Back in the old days, you could say, oh, hey, I'm the IT shop here, come over. No. Now you have to be very user centric. That's why, if you notice, one of our high-level goals was service excellence, in terms of providing the right kind of services to the user in the most user-centric way. If I'm talking to a researcher, then it needs to be researcher centric. If I'm talking about a student, it needs to be student centric. We need to meet students where they are. And that's the right thing to do, frankly.

29:17
Kelly: Do you ever run into times when faculty or students want to use a particular tool for learning or collaboration, etc., but it doesn't live up to U-M's privacy policies?

29:31
Pendse: Literally every other day. Because when you have 19 schools and colleges, 46,000 students, 9,000-plus faculty members, and a budget of $11 billion and counting, people are innovative. People are always pushing the envelope of the next great thing they want to develop. Many, many technologies that internet users rely on today were developed at the University of Michigan. So we absolutely don't want to stifle that innovation, but we also want to keep ourselves secure. That's why our security paradigm is about securing our open society: how do we continue to provide the openness while securing our society? That's what we talk about, and it's always a challenge. So we work with faculty members. What we have done is, we never say, "No, you can't do that." We always say, well, let's work together to figure out a way that you can do this securely. Because we do not want to stifle innovation; if we try to do that, people will simply bypass you, they'll ignore you, and then it puts the entire university at risk. Our goal is to work with you, if you're the researcher, to say: how can we ensure you get where you need to go, while at the same time keeping our campus secure? That does require us to be constantly engaged with the campus community, to work with them, to provide them the tools, and to provide them an environment where they can try their experiment without exposing the entire university. So we routinely work with our faculty colleagues, graduate students, and undergraduate students to create those environments. And that's a work in progress, because innovation has no boundaries; it occurs everywhere. And we want to be there to support it.

31:11
Kelly: Do you have any final advice for institutions on basically how to balance, you know, all of these things, data governance, privacy, transparency, and also being able to use data to inform decision-making?

31:23
Pendse: I think the example I shared about the three-legged stool comes in very, very handy. It's about that balance. It's about creating the data culture and building trust with the community. If one can do that, then the overall aspects of privacy, data, data use, and data-informed decision-making can all come together really, really well for the benefit of the campus. And one great thing about the University of Michigan is we are very happy to share how we have done these things. So if anybody wants additional information or the help of our staff colleagues, we are very open to supporting any and every educational institution out there. We're happy to help.

32:04
Kelly: That's great. Well, thank you so much for coming on.

32:07
Pendse: I appreciate the time, and it's an honor to represent this wonderful institution. Really, the credit for all of this goes to my incredible staff colleagues, our really, really brilliant students who inspire us every day, and our faculty members who are just out of this world. Amazing colleagues.

32:27
Kelly: Thank you for joining us. I'm Rhea Kelly, and this was the Campus Technology Insider podcast. You can find us on the major podcast platforms or visit us online at campustechnology.com/podcast. Let us know what you think of this episode and what you'd like to hear in the future. Until next time.
