Research Finds that Anonymized Mobile Data Still Leads to Privacy Risks


When you allow an app to identify your current location through your mobile device, is the result used to optimize your experience or to put your private data at risk? That's the question behind a study undertaken by researchers at MIT and Imperial College London, who recently published their findings in IEEE Transactions on Big Data.

According to MIT's Daniel Kondor, Behrooz Hashemian and Carlo Ratti and Imperial College's Yves-Alexandre de Montjoye, massive anonymized datasets detailing people's movement patterns through their location stamps can be used for "nefarious purposes." Given just a few randomly selected points from such a mobility dataset, an attacker could identify the individuals behind the data and learn sensitive information about them. The attack works by merging the anonymized dataset with a non-anonymized one that shares the same location stamps, revealing the identities the first dataset was meant to hide.
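To make the idea concrete, here is a minimal, hypothetical sketch of such a linkage attack. The data, column names and matching rule are all assumptions for illustration, not anything drawn from the study itself:

```python
# Hypothetical linkage-attack sketch: none of this data or these column
# names come from the study; they are assumptions for illustration only.
import pandas as pd

# "Anonymized" mobility records: pseudonymous IDs with location stamps.
anon = pd.DataFrame({
    "pseudo_id": ["u1", "u1", "u2"],
    "time":      ["2011-04-01 08:00", "2011-04-01 18:30", "2011-04-01 08:00"],
    "cell":      ["C17", "C02", "C44"],
})

# Identified records (e.g., a public check-in feed) that happen to share
# the same time/place quasi-identifiers.
known = pd.DataFrame({
    "name": ["Alice", "Alice"],
    "time": ["2011-04-01 08:00", "2011-04-01 18:30"],
    "cell": ["C17", "C02"],
})

# Joining on the shared location stamps links the pseudonym to the name:
# pseudo_id u1 co-occurs with Alice at every point, so u1 is likely Alice.
linked = anon.merge(known, on=["time", "cell"])
print(linked[["pseudo_id", "name"]].drop_duplicates())
```

Even though neither dataset alone names the user, the overlap of a handful of time-and-place points is usually enough to single one person out.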

The researchers proved their point by estimating user "matchability" across two large datasets generated in Singapore in 2011: one from a mobile network operator, containing 485 million records with timestamps and geographic coordinates from 2 million users, and one from a local transportation system, containing 70 million timestamped records of individuals moving around the city.

The researchers applied statistical modeling to the location stamps of users in both datasets to estimate the probability that data points in the two sets originated from the same individual. With one week of data, the model could expect to match about 17 percent of individuals; after four weeks, the expected match rate rose above 55 percent, and with 11 weeks of data it reached roughly 95 percent. The main determinant of matchability was the expected number of co-occurring records in the two datasets.
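A rough intuition for why co-occurring records drive matchability can be sketched with a toy scoring function. This is an assumption-laden simplification, counting shared time-and-place stamps between user pairs, not a reimplementation of the paper's statistical model:

```python
# Toy matchability score: count co-occurring (time bucket, location) stamps
# between every user pair across two datasets. A simplified illustration,
# not the study's actual model.
from collections import defaultdict

def cooccurrence_scores(dataset_a, dataset_b):
    """Each dataset maps user -> set of (time_bucket, location) stamps.
    Returns pairwise counts of shared stamps; higher counts make it more
    plausible that a pair of records belongs to the same individual."""
    scores = defaultdict(int)
    for ua, stamps_a in dataset_a.items():
        for ub, stamps_b in dataset_b.items():
            scores[(ua, ub)] = len(stamps_a & stamps_b)
    return scores

# Hypothetical phone-operator and transit-system traces.
phone = {"p1": {("08:00", "C17"), ("18:30", "C02")},
         "p2": {("08:00", "C44")}}
transit = {"t9": {("08:00", "C17"), ("18:30", "C02"), ("12:00", "C08")}}

# p1 shares two stamps with t9 while p2 shares none; as more weeks of data
# accumulate, the best-scoring pairing becomes increasingly unambiguous.
print(max(cooccurrence_scores(phone, transit).items(), key=lambda kv: kv[1]))
```

Each additional week of data adds more potential co-occurrences, which is consistent with the study's finding that match rates climb from 17 percent to roughly 95 percent as the observation window grows.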

"As researchers, we believe that working with large-scale datasets can allow discovering unprecedented insights about human society and mobility, allowing us to plan cities better," said Kondor, a postdoc in the Future Urban Mobility Group at the Singapore-MIT Alliance for Research and Technology (SMART). "Nevertheless, it is important to show if identification is possible, so people can be aware of potential risks of sharing mobility data."

Ratti, a professor of the practice in MIT's Department of Urban Studies and Planning and director of MIT's Senseable City Lab, offered an example: "I was at Sentosa Island in Singapore two days ago, came to the Dubai airport yesterday and am on Jumeirah Beach in Dubai today. It's highly unlikely another person's trajectory looks exactly the same. In short, if someone has my anonymized credit card information and perhaps my open location data from Twitter, they could then deanonymize my credit card data."

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
