Who's Watching Me?

Bob Blakley’s reflections from the Digital ID World conference

A principal analyst for identity and privacy strategies at the Burton Group, Bob Blakley gave a talk this month at Digital ID World in Santa Clara, CA, titled “What is Privacy, Really?” In a separate interview, Campus Technology recorded Blakley’s comments on privacy and technology in higher education:

[Photo: Bob Blakley ponders privacy and identity]

Conventionally, people think of privacy in terms of secrecy. They think that privacy means the obligation to protect information that we have observed, maybe in the course of our job, about other people. The point that I made [in my talk] today was that there’s another part of privacy that we don’t speak of so often, which is equally important, or perhaps more important – and that is our obligation not to pry into other people’s affairs; to avert our eyes or close our ears if we come across something that is obviously private. And [I spoke about] our obligation as a society to censure people who don’t fulfill that obligation, because that kind of behavior – voyeurism and gossip – is destructive to civil society. I made an argument that there may be a way to construct a right to privacy that doesn’t currently exist in our Constitution on these grounds, if it continues to be the case that lots of private information is exposed.

The academic community has, in a certain sense, been a leader in the development of technology [related to this]. When I was the first editor of the OASIS SAML specification, we had a lot of discussions about the relationship between privacy and anonymity and the ability to identify attributes of people without identifying the people themselves. The higher education community needed to be able to comply with the FERPA regulation, and needed to be able to, for example, enroll a student from one campus at another campus without revealing that student’s identity to the second campus. And what was developed on top of SAML was the protocol called Shibboleth – that was one of the initiatives that formed some of my thinking on this topic.
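To make the Shibboleth idea concrete, here is a minimal sketch, in Python using only the standard library, of the kind of assertion an identity provider can release: attributes about a user (for example, “this person is a student”) under a transient, pseudonymous handle rather than a real identity. The SAML 2.0 namespace and name-format URIs are real, but the issuer name, the attribute choice, and the overall structure are illustrative only; a production Shibboleth deployment adds signatures, audience restrictions, federation metadata, and attribute release policies.

    # Illustrative sketch only: an unsigned, SAML-style attribute assertion
    # with a transient (pseudonymous) subject identifier. The relying campus
    # learns an attribute ("student"), not who the student is.
    import secrets
    import xml.etree.ElementTree as ET

    SAML = "urn:oasis:names:tc:SAML:2.0:assertion"
    ET.register_namespace("saml", SAML)

    def q(tag: str) -> str:
        """Qualify a tag name in the SAML assertion namespace."""
        return f"{{{SAML}}}{tag}"

    def build_attribute_assertion(issuer: str, affiliation: str) -> ET.Element:
        assertion = ET.Element(q("Assertion"))

        iss = ET.SubElement(assertion, q("Issuer"))
        iss.text = issuer

        # Transient NameID: a random, one-time handle instead of a real name.
        subject = ET.SubElement(assertion, q("Subject"))
        name_id = ET.SubElement(
            subject, q("NameID"),
            Format="urn:oasis:names:tc:SAML:2.0:nameid-format:transient",
        )
        name_id.text = secrets.token_urlsafe(16)

        # Attribute statement: the only thing the relying party needs to know.
        attr_stmt = ET.SubElement(assertion, q("AttributeStatement"))
        attr = ET.SubElement(
            attr_stmt, q("Attribute"),
            Name="urn:oid:1.3.6.1.4.1.5923.1.1.1.1",  # eduPersonAffiliation
            FriendlyName="eduPersonAffiliation",
        )
        value = ET.SubElement(attr, q("AttributeValue"))
        value.text = affiliation
        return assertion

    if __name__ == "__main__":
        a = build_attribute_assertion("https://idp.home-campus.example.edu", "student")
        print(ET.tostring(a, encoding="unicode"))

Run as-is, this prints a small assertion whose subject is a random token, which is the essential privacy move: the second campus can grant access on the strength of the affiliation attribute alone, without ever learning the student’s identity.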

Higher ed is a microcosm of [privacy] regulations. The HIPAA regulation applies to the campus medical clinics, and financial privacy regulations apply to universities because they often, for example, issue ID cards to students that allow them to purchase meals at a commissary or to purchase textbooks. So higher education encounters privacy regulations in many of its operations: health center operations, financial operations, and the administration of both disciplinary records and academic records.

[In terms of related technology application areas, the example] that I’ve been most personally acquainted with, through some conversations with the Association of American Medical Colleges (AAMC) at the time the HIPAA regulation was promulgated, is telemedicine. There are a number of universities that are pioneering the use of telemedicine, especially through distribution of medical imagery over very high-bandwidth connections like Internet2. For a long time this was the exclusive province of the academic community, and it created a set of electronic health record considerations that hadn’t been dealt with by anybody else in the [medical] community. So certainly, academic medical centers are really pioneering both the technology and the social context that’s going to be embedded there.

And one of the big problems in the online world is this: if for every piece of data there were someone who was recognizably responsible for it, it would be so much easier to enforce privacy regulations. But the fact is, there’s a large amount of data out there that is, in a certain sense, orphaned: [not necessarily abandoned], but nobody has any statutory responsibility for it. The fact that it’s out there and it doesn’t have to be accurate and nobody’s accountable gives the landscape a kind of wild-west aspect that contributes to the difficulty of solving the privacy problem.

Bob Blakley is the principal analyst for identity and privacy strategies at the Burton Group.
