TaintDroid Research Exposes Risks of Mobile Apps

Research by two universities and a corporate R&D lab has uncovered a consistent pattern: many Android apps transmit private information without explicitly informing the smartphone user. The joint research by Penn State, Duke University, and Intel Labs produced TaintDroid, a real-time monitoring service that works as an extension of Google's Android platform and analyzes how private information is obtained and released by the apps users have downloaded and installed on their phones.
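
At its core, TaintDroid performs dynamic taint tracking: it tags ("taints") data as it leaves a sensitive source such as the GPS, follows the tag as values are copied and combined inside the app, and raises an alert when tainted data reaches a network sink. The sketch below is a loose, application-level illustration of that idea, not TaintDroid's actual mechanism, which lives inside Android's Dalvik virtual machine; every class and method name here is invented.

    // Minimal sketch of variable-level dynamic taint tracking, the core idea
    // behind TaintDroid. The real system implements this inside the Dalvik VM;
    // all names below are invented for illustration.
    final class Tainted<T> {
        final T value;
        final boolean tainted; // one taint bit here; TaintDroid uses a 32-bit taint vector

        Tainted(T value, boolean tainted) {
            this.value = value;
            this.tainted = tainted;
        }

        // Propagation rule: anything derived from tainted input stays tainted.
        static Tainted<String> concat(Tainted<String> a, Tainted<String> b) {
            return new Tainted<>(a.value + b.value, a.tainted || b.tainted);
        }
    }

    final class Sinks {
        // Sink check: flag tainted data at the point it would leave the device.
        static void send(Tainted<String> payload, String host) {
            if (payload.tainted) {
                System.err.println("ALERT: tainted data sent to " + host);
            }
            // ... actual network transmission would happen here ...
        }
    }

    public class TaintDemo {
        public static void main(String[] args) {
            // Source: a location fix is tainted the moment it is read.
            Tainted<String> location = new Tainted<>("40.79,-77.86", true);
            Tainted<String> label = new Tainted<>("loc=", false);
            // The derived string inherits the taint, so the sink flags the
            // leak however many intermediate variables the value passes through.
            Sinks.send(Tainted.concat(label, location), "ads.example.invalid");
        }
    }

Because the taint bit travels with the data rather than with any particular variable, the alert fires even when an app launders the value through string formatting, buffers, or helper methods before transmitting it.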

The researchers tested 30 popular Android applications that use location, camera, or microphone data and flagged 105 instances in which those applications transmitted tainted data. "Of the 105, we determined that 37 were clearly legitimate," the researchers reported in "TaintDroid: An Information-Flow Tracking System for Realtime Privacy Monitoring on Smartphones." TaintDroid also revealed that 15 of the 30 applications reported users' locations to remote advertising servers; seven collected the device ID and, in some cases, the phone number and the SIM card serial number. "In all, two-thirds of the applications in our study used sensitive data suspiciously," the authors wrote. "Our findings demonstrate that TaintDroid can help expose potential misbehavior by third-party applications."
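
For a sense of how little effort that collection takes: on the Android releases current at the time of the study, each of those identifiers is a single call on the platform's TelephonyManager once an app holds the READ_PHONE_STATE permission. The snippet below is an illustrative sketch, not code recovered from any of the studied apps.

    // Illustrative only: how an app granted READ_PHONE_STATE can assemble the
    // identifiers the study saw transmitted (device ID, phone number, SIM serial).
    import android.content.Context;
    import android.telephony.TelephonyManager;

    public class DeviceIds {
        public static String fingerprint(Context ctx) {
            TelephonyManager tm =
                    (TelephonyManager) ctx.getSystemService(Context.TELEPHONY_SERVICE);
            // getDeviceId() returns the IMEI/MEID on the Android versions
            // current when the study was conducted.
            return tm.getDeviceId() + "|"
                    + tm.getLine1Number() + "|"
                    + tm.getSimSerialNumber();
        }
    }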

Although the study focused on the Android platform, the researchers believe that other platforms should be examined for "tainted" apps as well.

A video on the TaintDroid project site demonstrates the tool exposing the covert transmission of private data when a user downloads and runs an Android wallpaper app.

"We were surprised by how many of the studied applications shared our information without our knowledge or consent," said William Enck, a Penn State graduate student of computer science and engineering. "Often, Smartphone applications have obvious user interface changes when they use information like your physical location. These cases usually occur in response to the user pressing a button with clear implications. The cases we found were suspicious because there was no obvious way for the user to know what happened or why."

After an app is downloaded but before it is fully installed, users frequently see a screen asking whether they will allow certain information to be accessed, presumably so the app can function properly. For example, local sensors on the phone, such as the GPS receiver, feed data to a remote Web service in order to deliver new kinds of information to the user, such as the route a bike ride has followed. If the user declines that data access during installation, the app won't be installed.
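
As a hedged sketch of that bike-route example: once the user has accepted the location permission at install time, the app can read a GPS fix and forward it to its own server, and nothing in the platform tells the user where that request goes. The class name and URL below are invented for illustration.

    // Hypothetical sketch of the bike-route scenario. Requires
    // <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
    // in the app's manifest, which the user accepts during installation.
    import android.content.Context;
    import android.location.Location;
    import android.location.LocationManager;

    public class RouteUploader {
        public static String lastFixAsQuery(Context ctx) {
            LocationManager lm =
                    (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);
            Location fix = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
            if (fix == null) return null;
            // The coordinates leave the device in an ordinary HTTP request;
            // the user sees only the resulting route map.
            return "https://bike-routes.example/route?lat=" + fix.getLatitude()
                    + "&lon=" + fix.getLongitude();
        }
    }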

The apps rarely provide privacy policies that state how a user's sensitive information will be used, and users have no way of knowing where applications send the information given to them. As a result, the report states, "users must blindly trust that applications will properly handle their private data."

"Many of these applications access users' personal data such as location, phone information and usage history to enhance their experience," said Patrick McDaniel, an associate professor at Penn State. "But users must trust that applications will only use their privacy-sensitive information in a desirable way."

The researchers say TaintDroid will eventually be released as a downloadable program in its own right. "Our findings demonstrate the effectiveness and value of enhancing smartphone platforms with monitoring tools such as TaintDroid," they concluded. The project Web site is currently collecting the names of people interested in working with an open source version of the TaintDroid code.

The National Science Foundation is a financial supporter of the project.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
