Protecting Privacy while Leveraging Opportunity

A Q&A with Kent Wada

"Even bigger data can mean bigger opportunities but may also present bigger risks. We are constantly balancing these factors." — Kent Wada

The concepts of privacy and information security often become intertwined when people mention one or the other in casual conversation. Many people don't make, or even seem to be aware of, a distinction between the two. Yet they are different, and while you might not find universally established ways of treating each in higher education, you'll see that most institutions do differentiate between them.

Years ago the University of California constructed a useful model for thinking about privacy and information security: The university's technology assets that we want to protect, such as devices, networks, servers, and e-mail systems, along with the data associated with those assets, all fall under information security. The subset of that data that pertains to people falls under privacy.

Today, IT leaders and privacy officers face growing privacy and information security issues along with an expanding list of compliance requirements. Many institutions, like UCLA, have established a "privacy officer" role, separate from but complementary to the information security function. At the same time, advancing technology is presenting higher education institutions with compelling new opportunities that must be weighed against risk.

Here, UCLA Chief Privacy Officer Kent Wada talks about the differences between privacy and information security in higher education along with the need to consider opportunities as well as risks.

Mary Grush: How is privacy different from information security, in the context of higher education institutions?

Kent Wada: Let's start with the assumption that privacy is about protecting people. Without wanting to put words into the mouths of my information security colleagues, one could characterize information security as ensuring the confidentiality, integrity, and availability (CIA) of data, and by extension, the systems, infrastructure, and services necessary to do so.

In this framework, information security is concerned with safeguarding confidential or sensitive information, whether or not it's about people — for example, a list of campus labs housing toxic chemicals or information about computer security configurations.

Where information about people is involved, privacy and information security work hand-in-hand to help institutions be good stewards of information entrusted to us by our extended communities of students, faculty, staff, patients, volunteers, donors, and others.

Roughly speaking, one might think of information security as responsible for the administrative and technical controls that safeguard data, whereas privacy works to ensure compliance with laws and other institutional obligations, and is responsible for communicating why data about people is being sought and how it will be used, for obtaining consent, and the like.

Grush: It sounds like those who work for the privacy of information really do complement the IT staff who work for the security of information.

Wada: Certainly. On the privacy side, protecting information held about people against unauthorized disclosure and use can be captured by the term "information privacy", reinforcing the natural alliance with information security. Both these functions become ever more critical: Who wants a breach?

Grush: Does privacy also cover concerns over surveillance?

Wada: Yes. We also use the word privacy to mean safeguards against the monitoring of behavior, or surveillance; to use a sound bite, "Big Brother". Social media companies are seemingly always in the news for violations, whether perceived or actual, of this kind of privacy.

Grush: Do you have a separate term for privacy issues addressed in light of the potential for surveillance?

Wada: A report by a University of California task force examining the appropriate balance between privacy and information security used "autonomy privacy". While information privacy is about the protection of personal data, autonomy privacy addresses the potential for surveillance and the monitoring of behavior.

And while information privacy is an institutional function that protects all personal data, autonomy privacy is most relevant to academic freedom, scholarship, and research.

Grush: For individuals, is real privacy "over"? Or, can we still expect to enjoy the ability to keep ourselves from unwanted exposure or surveillance? Do we each need to work towards some type of balance?

Wada: I often hear "I don't care… I have nothing to hide". The implication is, I suppose, that if you want some personal privacy, you must be doing something illicit, immoral, or shameful. But have you ever wanted to grow tall shrubbery because of a nosy neighbor? Or to be let alone in a moment of personal grief? This kind of privacy is not meant to be absolute, but exists in balance with a constellation of values and obligations of our institutions, and of society.

From the perspective of higher education, this form of privacy is related to the First Amendment, academic freedom, and intellectual freedom. An open letter written by the University of Wisconsin-Madison's previous chancellor about the need to protect communications between scholars from premature disclosure provides an elegant argument for academic freedom, and positions privacy as one of its underpinnings. Whether or not you agree with the argument, and however much we might disagree about what expectations of personal privacy our society should set, I hope we can agree that no privacy at all will not serve us well.

And this is where privacy can be in conflict with information security: Security often depends on identifying every access, seeing everything, logging everything, forever; whereas autonomy privacy resists such total, indelible knowledge.

In a different realm, would we be better served if we abolished the use of cash in favor of reliably traceable credit card transactions? There are good arguments on both sides, but I also know: "I don't care… I have nothing to hide".

For higher education, we need to find a way to balance these two imperatives: one protects the mission from the skyrocketing consequences of failure (fines, reputational damage, litigation, not to mention countless hours of internal effort), and the other enables our mission of teaching, research, and, for public institutions, service.

Grush: Do you have any wisdom or insights into finding a balance?

Wada: I need to partly reframe what we're balancing. Privacy and information security, yes. But in some sense, both push to lock down access to data, the lifeblood of our institutions, in order to prevent harm to individuals and to the institution. And this comes at a time when virtually every opportunity in teaching, research, and the very success of our institutions depends in some way on data and our ability to harness insight from it: to create, consume, collect, steward, and transact with even larger troves of data than we already do. Still, even bigger data can mean bigger opportunities but may also present bigger risks. We are constantly balancing these factors.

We've long recognized that certain types of data are valuable assets, whether intellectual property created by faculty or patient data used in precision medicine. We're now recognizing the value of other data assets.

So the other critical balance is between two institutional imperatives: preventing problems and facilitating the mission, or perhaps innovation. Privacy and security, and probably legal, audit, compliance, risk management, and others, all tend toward the former, which can translate into limiting access to data and exerting greater control over our data assets. Meanwhile, the divergence from those who need data to soar is widening rapidly. These challenges are only sharpened by the fact that the data border at the edges of our campuses has all but vanished, given the global reach of our institutions, our faculty, and our third-party service providers.

Grush: What are your thoughts on how to sort out risks from benefits?

Wada: I can make a personal observation: It's common to make risk-based analyses these days. I would say we also need to consider explicitly: What's the opportunity analysis? Put crudely, one asks "Can we afford the possible consequences?" whereas the other asks "How badly do we want to do this?" They may sound like two sides of the same coin, but I think it's important that each side put forth its best arguments, and that they be looked at side-by-side to make a decision — in some sense akin to the importance of having a minority report, or dissenting opinion, to understand the full range of arguments.

It's also important to keep in mind that trying to address every problem before taking a first step is no better than rushing headlong without any consideration of possible consequences.

Grush: Are you watching, or have special concerns about student data — perhaps data used for academic alerts or advising?

Wada: There are privacy issues we are watching related to learning analytics. Who owns the metadata about student behavior gathered by third-party service providers, who gets to use it, and for what purposes? What if the third party uses that metadata to create a new service, and a new revenue stream, without the agreement of the institution or of the students whose behavior was monitored? For what purposes is predictive analytics by an institution appropriate when it comes to directing a student's future? Should the student have a stake in whatever value is created by the use of this data? Is there a model in which we aren't all trying to assert sole ownership of the value of a data asset, but instead find a way to share equitably in that value? I suspect we're going to have to resolve these questions sooner rather than later.

Some of this can be addressed contractually with third parties, but the more important issues will require institutional attention to complex questions around use of data.

Grush: Do you define some privacy issues in terms of requirements that come to the university from the outside?

Wada: Higher education is already one of the most highly regulated sectors, and new compliance obligations, including privacy obligations, are only multiplying. Higher education and the private sector alike continue to work on compliance with the European Union's General Data Protection Regulation (GDPR). The U.S. Department of Education's office of Federal Student Aid is renewing and evolving its requirements around breaches of financial aid data. California just passed (on June 28) the California Consumer Privacy Act of 2018, parts of which have even broader reach than the GDPR (though the implications for higher education institutions and academic medical centers still need analysis). Even as we make sure we comply with each distinct set of requirements in every area of concern, I think it is important that the privacy officer's responsibilities be viewed holistically, from a university-wide perspective. Soon it may be impossible to do otherwise, given these expanding obligations.

Grush: If a campus recognizes the distinction between information security and privacy, what does that mean for the organization? I'm sure there are several different approaches…

Wada: I expect we'd find every conceivable model somewhere. Chief information security officers often report to the CIO, though sometimes they are peers. Privacy often lives within the compliance office of an institution (especially for academic medical centers), or with legal. Having privacy within IT is generally disfavored within privacy circles. And that probably makes sense, given that the privacy function goes well beyond IT, even while privacy issues are often tied to technology and what it can do.

On the other hand, institutions that are just starting a privacy function often hand the task to the information security officer, making them the institution's de facto first privacy officer. It's terrific when an institution recognizes the privacy need. The person wearing both hats has a tricky path, however: Knowing that there are circumstances in which privacy and information security can be in conflict, it's important to consider each issue from both viewpoints explicitly — even if it's resolved internally as a mental exercise, so that the more familiar path doesn't reign by default (that's the need for an explicit "minority report" again).

Grush: As a chief privacy officer, what's one of your biggest concerns if there's a breach?

Wada: If we were to suffer a data breach, of course we would comply with applicable laws that prescribe what we must do. But we can find ourselves in situations where we are afforded a certain degree of discretion, say because circumstances are such that we can't know for certain what actually did or didn't occur. Part of my thinking is to ask what I'd want the institution to do if it were my own data involved… It may only be an informal opinion, but I feel it's an important aspect of my role to give voice to the people who won't be at the table during our deliberations.

Grush: Are there any areas where the technology is so new or advanced that it's difficult or even impossible to plan safeguards against privacy violations?

Wada: Yes, though part of the problem is even being aware of new technologies (I think it helps if you have kids!). Some are in early stages where applicability is exploding, such as blockchain or facial recognition. Wearable tech has all sorts of implications for our student athletes, incredible opportunities, and possible consequences as well. As serious as the chief privacy officer role is, these new areas always keep the job interesting.

