Report: Need for Privacy Isn't New, but the Need for Help in Sorting It Out Is

Contrary to what you may have heard, the need for privacy is as old as Adam and Eve. But figuring out how to achieve it appropriately in an information age is complex, and we don't always know how to navigate the privacy questions we're asked. In fact, privacy behaviors vary across cultures and contexts, making appropriate responses even more complicated. For example, while Americans have a reputation for being more open about sexual matters, the Chinese are more open about financial details, such as income and what they've paid for goods. Could it be that individuals can't be counted on to make the best decisions for themselves, since they don't always know the tradeoffs?

That's one of the suggestions in an article written by Carnegie Mellon researchers and published recently in Science, the journal of the American Association for the Advancement of Science.

"Privacy and Human Behavior in the Age of Information" examines research on privacy behavior from a multitude of sources and offers three themes that help to organize and draw connections among them. The first theme encompasses "people's uncertainty about the nature of privacy trade-offs, and their own preferences over them." The second theme is the "context-dependence of privacy preferences"; the same person may respond quite differently to privacy questions depending on the situation. The third theme is the "malleability of privacy preferences"; privacy preferences change based on influence from outside forces that possess "greater insight" about the level of risk.

All three themes are closely connected, the researchers found. Context-dependence is tied to uncertainty. When people don't know how to respond to questions regarding privacy, they "cast around for cues to guide their behavior." Privacy choices are changeable and subject to influence "in large part because they are context-dependent and because those with an interest in information divulgence are able to manipulate context to their advantage."

"Privacy is not a modern invention, but a historically universal need," said lead author Alessandro Acquisti, a professor of information technology and public policy at the university. "In certain situations, individuals will care for privacy quite a lot and act to protect it, but advances in technology and the acceleration of data collection challenge our ability to make self-interested decisions in the face of increasingly complex tradeoffs."

For example, the authors explained, social networks may offer users granular control over how their information is shared; yet in customizing what is and isn't made public, users may, without realizing it, end up sharing more about themselves than those not given such control. Likewise, users may accept a Web site's default privacy options because it's more convenient or because the defaults are perceived as "implicit recommendations."

"Although control is the cornerstone of most policies designed to protect privacy, giving people more control increases trust and leads individuals to lower their guard and disclose more," said Laura Brandimarte, a postdoctoral fellow.

Ultimately, the researchers recommend, privacy policy shouldn't rely "exclusively" on allowing individuals to make their own privacy decisions, since sites and services may lack the transparency and control needed for informed choices. Better, they argue, to develop policies with a "baseline framework of protection."

"People need assistance and even protection to aid in navigating what is otherwise a very uneven playing field," the report stated. "A goal of public policy should be to achieve a more even equity of power between individuals, consumers and citizens on the one hand and, on the other, the data holders such as governments and corporations that currently have the upper hand."

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
