TaintDroid Research Exposes Risks of Mobile Apps

Research by two universities and a corporate R&D lab has uncovered a consistent pattern among many Android apps of transmitting private information without explicitly informing the smartphone user. The joint research by Penn State, Duke University, and Intel Labs resulted in the development of TaintDroid, a real-time monitoring service that works as an extension of Google's Android platform to analyze how private information is obtained and released by the apps users have downloaded and installed on their phones.
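Conceptually, TaintDroid attaches a "taint" tag to data as it leaves a sensitive source, such as the GPS receiver, propagates that tag as the data moves through the app, and raises a flag if tagged data reaches a network interface. The Java sketch below illustrates the general idea at the application level; the class and method names, and the TaintedValue type, are hypothetical and invented only for illustration. TaintDroid itself performs this tracking inside the platform's Dalvik virtual machine, transparently to the apps it watches.

// Minimal sketch of the dynamic taint-tracking idea behind TaintDroid.
// Hypothetical classes for illustration only; the real system does this
// inside the virtual machine, not in application code.
import java.util.BitSet;

public class TaintSketch {

    // Each taint source gets its own bit so tags from different sources can combine.
    static final int TAINT_LOCATION = 0;
    static final int TAINT_IMEI     = 1;
    static final int TAINT_MIC      = 2;

    // A value paired with a taint tag that travels with it.
    static class TaintedValue {
        final String data;
        final BitSet taint;

        TaintedValue(String data, BitSet taint) {
            this.data = data;
            this.taint = taint;
        }

        // Propagation rule: combining two values yields the union of their tags.
        TaintedValue concat(TaintedValue other) {
            BitSet merged = (BitSet) taint.clone();
            merged.or(other.taint);
            return new TaintedValue(data + other.data, merged);
        }
    }

    // Taint source: data read from the GPS is tagged TAINT_LOCATION.
    static TaintedValue readLocation() {
        BitSet tag = new BitSet();
        tag.set(TAINT_LOCATION);
        return new TaintedValue("40.79,-77.86", tag);
    }

    // Taint sink: a network send checks the tag and reports a leak.
    static void sendOverNetwork(String host, TaintedValue value) {
        if (value.taint.get(TAINT_LOCATION)) {
            System.out.println("ALERT: location data sent to " + host);
        }
        // ... actual transmission would happen here ...
    }

    public static void main(String[] args) {
        TaintedValue location = readLocation();
        TaintedValue message = new TaintedValue("pos=", new BitSet()).concat(location);
        sendOverNetwork("ads.example.com", message);  // fires the alert
    }
}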

The researchers tested 30 popular Android applications that use location, camera, or microphone data and flagged 105 instances in which these applications transmitted tainted data. "Of the 105, we determined that 37 were clearly legitimate," the researchers reported in "TaintDroid: An Information-Flow Tracking System for Realtime Privacy Monitoring on Smartphones." TaintDroid also revealed that 15 of the 30 applications reported users' locations to remote advertising servers; seven applications collected the device ID and, in some cases, the phone number and the SIM card serial number. "In all, two-thirds of the applications in our study used sensitive data suspiciously," wrote the authors. "Our findings demonstrate that TaintDroid can help expose potential misbehavior by third-party applications."

Although the study focused on the Android platform, the researchers believe that other platforms should be examined for "tainted" apps as well.

A video on the TaintDroid research project site demonstrates the application exposing the covert transmission of private data when the user downloads and uses an Android wallpaper app.

"We were surprised by how many of the studied applications shared our information without our knowledge or consent," said William Enck, a Penn State graduate student of computer science and engineering. "Often, Smartphone applications have obvious user interface changes when they use information like your physical location. These cases usually occur in response to the user pressing a button with clear implications. The cases we found were suspicious because there was no obvious way for the user to know what happened or why."

After an app is downloaded but before it is fully installed, users frequently see a screen asking whether they will allow certain information to be accessed, presumably so the app can work properly. For example, local sensors on the phone, such as the GPS receiver, feed data to a remote Web service in order to deliver new forms of information to the user, such as the route a bike ride has followed. If the user declines to allow that data access during installation, the app won't be installed.
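To make that mechanism concrete, here is a minimal Java sketch of how an Android app of that era would read the user's location once the corresponding install-time permission (ACCESS_FINE_LOCATION, declared in the app's manifest) had been granted. The LocationReader class and its method name are hypothetical, chosen for illustration; the point is that after this call returns, the platform places no further restriction on where the app sends the data, which is the gap TaintDroid is designed to expose.

// Minimal sketch: reading the last known GPS fix on Android.
// Requires this line in AndroidManifest.xml (granted at install time):
//   <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
import android.content.Context;
import android.location.Location;
import android.location.LocationManager;

public class LocationReader {

    // Returns the last GPS fix, or null if none is available. Once this
    // value is in the app's hands, only a tool like TaintDroid can tell
    // the user whether it is later sent off the device.
    public static Location lastKnownFix(Context context) {
        LocationManager lm =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        return lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
    }
}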

The apps rarely provide privacy policies that state how a user's sensitive information will be used, and users have no way of knowing where applications send the information given to them. As a result, the report states, "users must blindly trust that applications will properly handle their private data."

"Many of these applications access users' personal data such as location, phone information and usage history to enhance their experience," said Patrick McDaniel, an associate professor at Penn State. "But users must trust that applications will only use their privacy-sensitive information in a desirable way."

The researchers say that TaintDroid eventually will be released as a downloadable program itself. "Our findings demonstrate the effectiveness and value of enhancing smartphone platforms with monitoring tools such as TaintDroid," they concluded. The project Web site is currently collecting names of people who are interested in working with an open source version of the code for TaintDroid.

The National Science Foundation is a financial supporter of the project.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
