Carnegie Mellon Gives Privacy Grade to Android Apps

Google Maps gets an A. The free version of Angry Birds gets a C. And My ABCs by BabyBus gets a D. The letters assigned to each of these Android apps are grades, and while A is great, D means failure — in privacy, that is.

Those grades and a million others were assigned by a scanning application that combines automated analysis with crowdsourcing to capture an app's behavior and measure the gap between how people expect the app to behave and how it actually behaves. For example, people expect an app such as Google Maps to use location data from the smartphone. But there's little reason for a game like Angry Birds or an educational app such as My ABCs to read phone status and location and gain network access, unless the goal is to identify users for market and customer analysis and deliver targeted advertising.
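The expectation-gap idea behind those letter grades can be pictured as a simple scoring heuristic. This is purely an illustrative sketch: the permission names, expectation values, and grade thresholds below are invented, and PrivacyGrade's actual model is crowdsource-driven and more sophisticated.

```python
# Illustrative sketch of an "expectation gap" privacy score.
# In a real system, the expectation values would come from crowdsourced
# surveys; the formula and thresholds here are invented for illustration.

def privacy_grade(permissions_used, expectation):
    """Grade an app by how surprising its permission use is.

    permissions_used: set of permissions the app actually requests
    expectation: dict mapping a permission to the fraction of users (0..1)
                 who expect this kind of app to use it
    """
    if not permissions_used:
        return "A"
    # The penalty for each permission grows as its use diverges from
    # what users expect; average the penalties across all permissions.
    gap = sum(1.0 - expectation.get(p, 0.0) for p in permissions_used)
    avg_gap = gap / len(permissions_used)
    if avg_gap < 0.25:
        return "A"
    elif avg_gap < 0.5:
        return "B"
    elif avg_gap < 0.75:
        return "C"
    return "D"

# A mapping app using location surprises few users...
print(privacy_grade({"LOCATION"}, {"LOCATION": 0.9}))  # prints A
# ...while a children's game reading location and phone state surprises many.
print(privacy_grade({"LOCATION", "PHONE_STATE"},
                    {"LOCATION": 0.1, "PHONE_STATE": 0.05}))  # prints D
```

The key design point is that the same permission can earn different grades in different apps: the score depends not on what data is accessed, but on how expected that access is for that kind of app.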

That's why a research team at Carnegie Mellon University has launched PrivacyGrade, a Web site that publishes privacy summaries highlighting each app's most unexpected behaviors. The goal is to help smartphone users manage their privacy more thoughtfully.

"These apps access information about a user that can be highly sensitive, such as location, contact lists and call logs, yet it often is difficult for the average user to understand how that information is being used or who it might be shared with," said Jason Hong, associate professor in the Human-Computer Interaction Institute and principal investigator for the project in the Computer Human Interaction: Mobility Privacy Security (CHIMPS) Lab. "Our privacy model measures the gap between people's expectations of an app's behavior and the app's actual behavior."

PrivacyGrade also examines which third-party code libraries make use of the resources culled by the app. If the app accesses location data, the program checks to see if it's used by a library such as Google Maps, suggesting it is simply being used for mapping, or if it is being used by an advertising library, an indication that it will be used for targeted ads.
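That library check can be pictured as a simple lookup from bundled library packages to likely purposes. This is a hypothetical sketch: the package names and categories below are examples for illustration, not PrivacyGrade's actual data or method.

```python
# Hypothetical sketch: infer why an app's location access might be used,
# based on which third-party libraries appear in the app package.
# Library names and categories are illustrative, not PrivacyGrade's real lists.

LIBRARY_CATEGORIES = {
    "com.google.android.gms.maps": "mapping",
    "com.unity3d.player": "game engine",
    "com.example.adnetwork": "advertising",  # placeholder ad-library package
}

def explain_location_use(app_libraries):
    """Return the likely purposes of location access for an app,
    given the third-party library packages bundled in it."""
    purposes = {LIBRARY_CATEGORIES[lib]
                for lib in app_libraries if lib in LIBRARY_CATEGORIES}
    return purposes or {"unknown"}

# A navigation app bundling a maps SDK suggests benign use of location...
print(explain_location_use(["com.google.android.gms.maps"]))
# ...while a game bundling an ad library suggests targeted advertising.
print(explain_location_use(["com.unity3d.player", "com.example.adnetwork"]))
```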

The site doesn't currently include paid apps; the presumption is that because their developers earn income from sales, they're less likely to sell user data to other companies. If funding permits, the CHIMPS team may eventually expand the site to cover apps for iOS, Windows Mobile and BlackBerry.

The work was funded through a National Science Foundation grant, as well as by the Army Research Office, NQ Mobile and Google through its faculty award program.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
