
Privacy & Compliance: Better Safe Than Sorry

Here’s what’s happening with privacy and compliance legislation—and why it’s in your institution’s best interest to keep up.

Two thousand. 145,000. 380,000. 600,000. What do these numbers have in common? Each relates to a security breach within the past year in which a computer holding sensitive personal information in California was compromised; each represents the number of people whose personal data was potentially exposed in that incident. Though these are only a sampling of such incidents, together they put more than a million people at a higher risk of identity theft, at least potentially. How could this be allowed to happen? Why aren’t companies keeping data secure?

[Photo] Sun Chairman Scott McNealy delivers his famous “Get over it” speech in ’99, and the battle for stricter privacy legislation is on.

These are the questions that typically come to mind when a letter arrives notifying a computer user that his data may have been compromised due to a computer security breach. No doubt, the recipient of that notice also experiences the fear that an unscrupulous person may now be stealing his identity [see “The Power of Who,” January issue], not to mention the accompanying anger with those responsible for allowing the privacy debacle to happen in the first place. It’s a natural reaction: Identity theft is now the fastest-growing crime in the nation, and the damage to credit and reputation can take months or years to clean up. So why is more care not being taken to protect privacy?

Motivating disclosure. The truth is, a lot of care is being taken to protect personal data by the organizations that collect it for one reason or another. Colleges and universities are no exception: We do take great care to protect the personal data we are responsible for, whether it is social security numbers during registration for classes, or credit card data for purchases at the bookstore. Nevertheless, most of the incidents referred to previously were computer security breaches that occurred anyway at institutions of higher education in California. We know about these breaches because they eventually made it to the media, and importantly, that’s because a new law in this state requires disclosure of security breaches of computers containing personal information of California residents. Notifying people whose personal information may have been compromised helps to alert them to the possibility of identity theft.

Sen. Feinstein (D-CA) pushes for federalization of privacy breach notification.

But organizations responsible for disclosing breaches of personal information in California now have new reasons to do this well: If they do, they may avoid remediation costs and negative media attention. And in the future, these same incentives for action may spread nationwide, as the principles of this legislation form the basis of a federal counterpart (S. 1350) being proposed by Senator Dianne Feinstein (D-CA). Yet, this is only one of the new reasons colleges and universities have to concern themselves with protecting privacy.

“You have no privacy. Get over it.”

Sun Microsystems Chairman and CEO Scott McNealy uttered these (in?)famous words in 1999. Was he prescient or merely cynical? And should we just “get over it”? After all, VISA already knows what you’ve bought; marketers track your surfing across popular Web sites; and your cell phone company will soon know exactly where you are at any given time.

Yet, with incidents of identity theft skyrocketing, concerns about homeland security, and increasing appreciation for how vulnerable our widespread data is, we see these worries reflected by legislative activity in the area of privacy.

Legislation soup. Beyond individual state legislation, there is an alphabet soup of federal legislation that colleges and universities must comply with. HIPAA, the Health Insurance Portability and Accountability Act, defines privacy and security standards for the protection of personally identifiable medical information, among other things. GLBA, the Gramm-Leach-Bliley Act, creates obligations to protect customer financial information. Then there is FERPA, the Family Educational Rights and Privacy Act, protecting student information, already well known to higher education.

New California law helps prevent identity theft

A new law in California (effective July 1, 2003) basically requires that individuals be notified if their personal data, kept on a computer, is somehow compromised (e.g., someone steals a laptop containing personal data or breaks into a computer via the Internet and grabs the data). According to this new law, the organization responsible for the data and computer is also responsible for the notification. The idea is that if personal data is somehow stolen, the individual should be alerted so that she can take proactive steps to protect herself against identity theft. (The law doesn’t specify what it means by “personal” data.)

With this new legislation in place, there are now significant incentives for any organization keeping personal data to protect it well. Consider the cost of notifying 100,000 people: 100,000 envelopes, sheets of pre-printed paper, and stamps; probably a hotline with trained operators; media preparation for a large incident; and staff time.
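A rough back-of-envelope calculation makes the point. All of the unit costs below are illustrative assumptions, not figures from any actual incident:

```python
# Back-of-envelope cost of notifying 100,000 people of a breach.
# Every unit cost here is an assumed, illustrative number.
people = 100_000

postage_per_letter = 0.37    # assumed first-class stamp
materials_per_letter = 0.10  # assumed envelope + pre-printed sheet
hotline_cost = 50_000        # assumed trained-operator hotline for the incident
media_and_staff = 75_000     # assumed media preparation and staff time

mailing = people * (postage_per_letter + materials_per_letter)
total = mailing + hotline_cost + media_and_staff
print(f"Mailing alone: ${mailing:,.0f}; total estimate: ${total:,.0f}")
```

Even with conservative assumptions, the mailing alone runs to tens of thousands of dollars before any hotline, media, or staff costs are counted.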

Things aren’t always straightforward, either.

A computer with personal data could be infected by a computer worm that allows anyone to enter the computer over the Internet via a “back door.” The computer user would know that the computer was compromised, but wouldn’t know if anyone actually entered the back door and touched the data. Whether the organization notifies or not (a decision with large consequences) often comes down to an educated guess.

Some individuals in California have already received two—or even three—notices indicating their data may have been compromised. Will this result in a loud cry for sustained change in protecting personal information? Or will people begin ignoring these notices if they see them too frequently? At this point, the answer to these questions is anyone’s guess.

Marketscore may have changed everything

In October 2004, a particularly worrisome situation was observed and discussed by IT security folks in colleges and universities across the nation. Spyware put out by Marketscore was found to be redirecting individuals’ Web traffic—including secured (SSL) connections for secure transactions, say to a bank or credit card company—to a server registered by Marketscore. At that point, Marketscore had the ability to monitor everything those people did via the Web, although precisely what the company was up to still isn’t known. But as a Web user, consider the implications of working with confidential data over what you thought was a secure connection!

As it happened, one version of the spyware was bundled into the download for iMesh, a popular Internet file-sharing program. Another offered raffles and prizes in return for registration. Apparently, everything the software did or could do was in the fine print to which the user agreed by clicking “I Agree”—including the right to download any software onto his computer, without notice.

Defenses varied across institutions. Many blocked access from their campus to the Marketscore servers to the extent possible, and/or redirected such access to a local page giving information about this software. Others notified their constituents directly. The bigger problem is: When Marketscore redirected secured Web traffic to its own servers, yet another new challenge to privacy was born. We only need to wait and see what comes next.

FERPA, however, is also one of more than a dozen statutes amended by the USA PATRIOT Act of 2001 (Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism), to give law enforcement greater access to confidential information. The PATRIOT Act also creates the mandate for SEVIS, the Student and Exchange Visitor Information System used by the INS to track foreign students in US schools. In these cases, colleges and universities must of course comply with the law, but must also consider how to protect the privacy of our students, faculty, and staff to the greatest extent possible within the framework of the law. This may entail something as simple as ensuring data is kept only as long as it’s needed, and no longer. Whether it’s realistic or not, there is certainly an expectation that privacy will be protected.

Campus IT security measures. Of course, through its IT security programs, higher education is already doing much to protect systems. For example, antivirus and patch management are two basics of IT security that keep at bay the proliferation of worms and viruses that seek to compromise our systems (and information). And good security awareness is another basic that addresses the more insidious trend in social engineering attacks such as “phishing,” in which the nuisance spammers (who send out thousands or millions of e-mails) join hands with the serious scammers, who use tricks to gain monetary advantage. Ditto for spyware and adware, which come bundled unannounced with other software that users download, monitoring where they surf on the Web, installing software on their systems without permission, and reporting information about those systems back to the author. Bank account “warning” e-mails with company logos (which many of us have received in recent months) are a prime example of the chilling pseudo-authenticity of such messages. But perhaps the greatest challenge we face is simply the ubiquity of data.

Access to God

In June 2003, Alan Cohen, a VP of the Wi-Fi (wireless) provider Airespace, was quoted in a New York Times op-ed piece saying, “God is wireless, God is everywhere and God sees and knows everything. Throughout history, people connected to God without wires. Now, for many questions in the world, you ask Google, and increasingly, you can do it without wires, too.”

Privacy Perspective

What exactly is privacy? Turns out, that’s actually not an easy question to answer.

US Supreme Court Justice Louis Brandeis wrote in 1928 about “the right to be left alone.” Yet our notions of privacy—the right to be “left alone”—have changed remarkably over time: Consider the short space of years that has elapsed between telephone calls made in the privacy of enclosed booths, and calls we now make via cell phones on buses, happily blabbing our most intimate details. Technology (the Internet in particular) has often been behind these evolving social expectations.

In fact, different cultures approach privacy differently. In the US, the federal government has created a patchwork quilt of privacy legislation: laws that protect personal health records, financial information, student data, and so forth. There are also laws like the Electronic Communications Privacy Act, which protect a type of communication rather than a type of information. But other forms of personal information are not covered and thus are “vulnerable,” though many states have defined their own specific privacy legislation.

The European Union has taken a different, overarching approach in its Privacy Directive, requiring members to protect the “fundamental rights and freedoms of natural persons, and in particular their right to privacy with respect to the processing of personal data.” Some of its provisions include requirements to specify up front what personal data is to be collected and why; that such data must be kept accurate and up-to-date; and that individuals have a right to know about data collected about them, and a right to correct any inaccuracies.

We expect information at our fingertips, and often it is—including our personal information. It’s collected by many companies, government offices, and other organizations such as colleges and universities; it may be outsourced to companies in other countries, and may be kept on many different servers and computers in each of these places. Services like Google blindly index anything they come across, whether intended for public consumption or not (consider how many budget spreadsheets can be found by searching for budget.xls). Individually, we keep our banking and other financial transactions on our laptops. Data is everywhere, and it’s hard to protect something so vast and diffuse.

Do organizations need privacy, too?

Did you know that Dan Brown’s The Da Vinci Code is one of the most-often purchased books by folks at UCLA? Go to Amazon’s “Purchase Circles” page, and you can find out who’s buying what at your favorite company, university, or city. While Amazon only compiles aggregate statistics that do not reveal the activities of individuals, Purchase Circles can reveal the activities of an organization. So, here’s the question: Is the privacy of an organization—that is, institutional privacy, which in some cases may translate to institutional reputation—a matter for concern?

Well, maybe. What if instead of The Da Vinci Code, Purchase Circles revealed that a book about evading taxes (or pick your own favorite shady topic) suddenly became the number-one seller at UCLA? Who might take an interest? What if it turned out to be for instructional purposes? Why is this information anyone’s business? Finally, should we care at all?

These questions have been raised at UCLA because of the possibility of research projects requiring the capture of traffic content that is flowing over UCLA’s networks (for example, a project wants to look at which Web sites are most visited from inside UCLA). There are many safeguards to protect individual privacy in research, particularly in research involving human subjects which must satisfy federal privacy requirements. Generally, any data captured would have to be “anonymized” to ensure it could not be used to track the behavior of any individual. But even if data can’t be traced back to an individual, the aggregate data can still reveal behavior patterns of the UCLA community as a whole, whether meaningful or not. In other words, if we have the data, we can be compelled to disclose it. Isn’t this a matter for concern? While we’re pondering this question, maybe we also need to consider the cultural expectations we have about surveillance: Could a shift in privacy policy have a chilling effect on inquiry? Although these concerns are well-founded, it is also true that there are plenty of legitimate research projects that would benefit from this type of data collection.

Not surprisingly, UCLA is forming a Privacy Board to deal with precisely these types of new privacy challenges; challenges that intertwine legal mandates, cultural expectations about surveillance, and the values of academic freedom. It’s likely to be a busy board.

Putting cart behind horse. Technical security measures we have in place are only part of the solution, particularly in a higher education context where the very security measures that protect privacy can also intrude upon it and can be seen to infringe upon the open exchange of ideas and information that underlies the academy. In fact, leading with technology solutions is really putting the cart before the horse.

A Common Sense, 4-Step Approach

With so much need for privacy (whether it’s for regulatory compliance or for other reasons) and so many different ways privacy can be compromised, what are we in higher education to do? One straightforward way to look at the challenge is to consider the following four-step process, building on what privacy and security infrastructure we already have in place:

  1. Inventory. Know where all of the sensitive information is in your institution, and how it’s used.
  2. Minimize. Ensure that confidential information is kept and used only where necessary, and question whether all of such data is actually needed (for example, using only the last four digits of a social security number, instead of the entire number). Take into special consideration whether confidential data should be permitted at all on portable devices—laptops, PDAs, USB flash drives—which are so easily lost or stolen.
  3. Protect. Implement logical, technical, and physical security controls to safeguard those systems that still contain confidential information—and the people who use those systems. For example, from a policy standpoint, consider setting minimum standards for devices that connect to your network.
  4. Educate. Create awareness that protecting sensitive data is everyone’s responsibility: Not all such data may reside in well-protected central databases. For example, it has long been the practice of faculty who write letters of recommendation for their students to include the student’s social security number in order to avoid confusion with other students with the same name.
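The “minimize” step can be as simple as masking identifiers before they are stored or displayed. Here is a minimal sketch; the helper name and output format are illustrative assumptions, not a standard API:

```python
def mask_ssn(ssn: str) -> str:
    """Keep only the last four digits of a social security number,
    replacing the rest with X's (illustrative helper, not a standard API)."""
    digits = "".join(ch for ch in ssn if ch.isdigit())
    if len(digits) != 9:
        raise ValueError("expected a 9-digit SSN")
    return "XXX-XX-" + digits[-4:]

print(mask_ssn("123-45-6789"))  # XXX-XX-6789
```

Applying a transformation like this at the point of collection means the full number never needs to live on a departmental spreadsheet or a laptop at all.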

Policies and plans. None of these steps is necessarily easy. Even cataloging sensitive information in a decentralized environment can be daunting. But you’re building on what security and privacy infrastructure you already have, not starting from scratch. In particular, some of the most important elements of your infrastructure to consider include your policies on security and records retention, and your incident response plan.
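A first pass at that inventory can be partially automated, for example by flagging files that contain SSN-like strings. The sketch below is a crude illustration under stated assumptions (a naive regex, plain-text files only); a real audit needs far more care:

```python
import os
import re

# Naive SSN-like pattern; real scanners need more sophisticated matching.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def find_suspect_files(root: str):
    """Yield paths of files under `root` whose text contains SSN-like strings.
    A crude first-pass inventory aid, not a substitute for a real audit."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if SSN_PATTERN.search(f.read()):
                        yield path
            except OSError:
                continue  # unreadable file: skip it

# Example usage: for path in find_suspect_files("/shared/docs"): print(path)
```

Even a rough tool like this surfaces the surprising places confidential data accumulates, which is the whole point of the inventory step.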

Privacy & Compliance resources you should know

Educause Security Resources

“Scale the Solution to the Problem,” Cedric Bennett, Educause Quarterly, November 2004

“IT Security for Higher Education: A Legal Perspective,” Kenneth D. Salomon, Peter C. Cassat, Briana E. Thibeau, Educause/Internet2 Computer and Network Security Task Force, March 20, 2003

“Principles to Guide Efforts to Improve Computer and Network Security for Higher Education,” Educause/Internet2 Computer and Network Security Task Force, August 2002

And then there are some principles to consider… The following principles may be helpful in any of the above undertakings, whether or not you have an IT security officer role at your institution.

First, the principle of not reinventing the wheel applies not only to building on what you already have, but also to what others have already developed. Take advantage of the many resources available through professional organizations and the Internet.

Second, to manage privacy, cement partnerships with other key organizations and individuals across the institution. Those partnerships could be forged with your controller, registrar, legal counsel, or police department. After all, privacy and compliance are certainly not “just” IT problems!

Finally, start an institutional dialog about what role privacy values play at your college or university, if there isn’t one in existence already. As the institutional privacy example in the box at left (“Do organizations need privacy, too?”) shows, you will need understanding and buy-in from all parts of the campus.

In the end, it’s all about keeping up with the challenge in order to achieve the best balance for your institution. Yes, technologists, higher education, and others have spent decades making online access to information ubiquitous. The grand challenge now lies in managing that access to meet societal expectations not just about access, but about privacy as well.

Kent Wada is Director, Information Technology Policy, for the University of California-Los Angeles. He also serves as UCLA’s designated agent for the Digital Millennium Copyright Act and is AIS manager of Planning.
