Research Finds that Anonymized Mobile Data Still Leads to Privacy Risks
- By Dian Schaffhauser
- 12/18/18
When you allow an app to identify your current location through your mobile device, is the result being used to optimize your experience or to put your private data at risk? That's the question behind a study undertaken by researchers at MIT and Imperial College London, who recently published their findings in IEEE Transactions on Big Data.
According to MIT's Daniel Kondor, Behrooz Hashemian and Carlo Ratti and Imperial College's Yves-Alexandre de Montjoye, massive, anonymized datasets that detail people's movement patterns through location stamps can be used for "nefarious purposes." Given just a few randomly selected points from a mobility dataset, someone could identify the individuals behind the data and learn sensitive information about them. This is accomplished by merging the anonymized dataset with another dataset that is not anonymized, revealing what the anonymization was meant to hide.
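The sketch below illustrates the kind of linkage attack the researchers describe, using small, hypothetical pandas DataFrames. The column names (pseudonym, user_id, timestamp, lat, lon) and the toy records are illustrative assumptions, not data or code from the study.

```python
import pandas as pd

# "Anonymized" mobility records: pseudonymous IDs paired with location stamps.
anonymized = pd.DataFrame({
    "pseudonym": ["u1", "u1", "u2"],
    "timestamp": pd.to_datetime(
        ["2011-04-01 08:00", "2011-04-01 18:30", "2011-04-01 09:15"]),
    "lat": [1.3521, 1.2494, 1.3000],
    "lon": [103.8198, 103.8303, 103.8500],
})

# Identified records from a second source (e.g., a public check-in feed).
identified = pd.DataFrame({
    "user_id": ["alice", "alice"],
    "timestamp": pd.to_datetime(["2011-04-01 08:00", "2011-04-01 18:30"]),
    "lat": [1.3521, 1.2494],
    "lon": [103.8198, 103.8303],
})

# Merge on matching location stamps; a pseudonym that repeatedly co-occurs
# with one real identity is effectively de-anonymized.
matches = anonymized.merge(identified, on=["timestamp", "lat", "lon"])
print(matches.groupby(["pseudonym", "user_id"]).size())
```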
The researchers demonstrated the risk by testing user "matchability" across two large datasets generated in Singapore in 2011: one from a mobile network operator containing 485 million records with timestamps and geographic coordinates from 2 million users, and one from the local transportation system containing 70 million timestamped records of individuals moving around the city.
The researchers applied statistical modeling to the location stamps of users in both datasets to estimate the probability that data points in each set originated from the same individual. With one week of data, the model could be expected to match about 17 percent of individuals; after four weeks, it could match more than 55 percent, and with data compiled over 11 weeks the estimate rose to about 95 percent. The main determinant of matchability was the expected number of co-occurring records in the two datasets.
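The following is a rough sketch of co-occurrence-based matching, the quantity the researchers identify as the main driver of re-identification. The 30-minute window, the threshold, the helper names and the toy traces are illustrative assumptions, not the study's actual model or parameters.

```python
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=30)  # records this close in time at the same place "co-occur"

def co_occurrences(records_a, records_b):
    """Count pairs of records at the same location within WINDOW of each other."""
    count = 0
    for loc_a, t_a in records_a:
        for loc_b, t_b in records_b:
            if loc_a == loc_b and abs(t_a - t_b) <= WINDOW:
                count += 1
    return count

def match_users(dataset_a, dataset_b, threshold=2):
    """Pair each user in dataset_a with the user in dataset_b sharing the most
    co-occurring records, provided that count meets the threshold."""
    matches = {}
    for uid_a, recs_a in dataset_a.items():
        scores = {uid_b: co_occurrences(recs_a, recs_b)
                  for uid_b, recs_b in dataset_b.items()}
        best = max(scores, key=scores.get)
        if scores[best] >= threshold:
            matches[uid_a] = best
    return matches

# Toy example: mobile-operator traces vs. transit-system traces.
t = datetime(2011, 4, 1, 8, 0)
operator = {"m1": [("stn_A", t), ("stn_B", t + timedelta(hours=9))]}
transit = {"t7": [("stn_A", t + timedelta(minutes=5)),
                  ("stn_B", t + timedelta(hours=9, minutes=10))],
           "t8": [("stn_C", t)]}
print(match_users(operator, transit))  # {'m1': 't7'}
```

As in the study's finding, the more co-occurring records two traces share over a longer observation period, the more confidently they can be linked to the same person.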
"As researchers, we believe that working with large-scale datasets can allow discovering unprecedented insights about human society and mobility, allowing us to plan cities better," said Kondor, a postdoc in the Future Urban Mobility Group at the Singapore-MIT Alliance for Research and Technology (SMART). "Nevertheless, it is important to show if identification is possible, so people can be aware of potential risks of sharing mobility data."
Ratti, a professor of the practice in MIT's Department of Urban Studies and Planning and director of MIT's Senseable City Lab, offered an example: "I was at Sentosa Island in Singapore two days ago, came to the Dubai airport yesterday and am on Jumeirah Beach in Dubai today. It's highly unlikely another person's trajectory looks exactly the same. In short, if someone has my anonymized credit card information and perhaps my open location data from Twitter, they could then deanonymize my credit card data."
About the Author
Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.