UCSB Security Researchers To Help Too-Trusting Smartphone App Users

There's little assurance right now that the app you've just downloaded to your Android phone is safe. It could be the gateway through which a cybercriminal pulls important pieces of data about you and your contacts off the device to build profiles worth selling to other criminals. Little is known about the "trust relationships" that exist among users, the smartphone platform and the surrounding ecosystem, including smartphone apps and the app markets. But a research team at the University of California, Santa Barbara has received a $1.1 million grant from the National Science Foundation to research the topic.

"The victims of these types of malware and scams could be counted in the hundreds of millions," said Giovanni Vigna, a professor of computer science who will be the principal investigator on the project. "The thing we'll be seeing more and more are attempts to violate trust assumptions."

Vigna, who is also the director of the Center for CyberSecurity in the College of Engineering, will be working with Computer Science Professor Christopher Kruegel to develop a framework for understanding trust relationships in the smartphone ecosystem and identifying its weaknesses. Those include situations in which trust is misplaced as well as points where trust vulnerabilities exist.

For example, an app page may use icons to suggest the authenticity of the site or the security of the app file; or recognizable logos from trusted organizations may appear on the site or app without an actual connection to the trusted brand.

"People use their phones to click on the Facebook icon, for instance, and the Facebook application starts, and they inherently assume that it's Facebook running on their phone," Vigna said. He and his team have discovered that users will also click on an icon that feels familiar but leads to a faux application intended to do harm.

The relationships the researchers expect to examine include those between the malware writer and the app store that publishes his or her app; the user who trusts the app store enough to download the app; and the developer who relies on a particular ad framework to display ads through the app, which then begins including links to additional malware. "Where's the trust there? How do you control this trust? How can you be assured that the ad network is going to perform as stated?" said Vigna.

The researchers also hope to develop techniques to prevent, detect and mitigate trust violations. Initially, the group will focus on Android apps in particular, but they insist that the results will be general and applicable to other smartphone platforms as well.

"Android is a wonderful open platform that allows anybody to do anything--including hacking the cellphones of unsuspecting Android users," said Vigna. He added that Apple iOS is less vulnerable.

The team may also develop an app that lets users analyze the behavior of other apps and report their flaws or potential untrustworthiness.
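
As an illustration of what such an analysis app might look at, the sketch below (Kotlin, Android) lists installed packages and flags any that request a combination of permissions often abused by data-harvesting malware. The particular permission set and threshold are assumptions chosen for the example, not the researchers' actual criteria.

    import android.Manifest
    import android.content.pm.PackageManager

    // Permissions often requested together by data-harvesting apps (illustrative choice).
    private val SENSITIVE = setOf(
        Manifest.permission.READ_CONTACTS,
        Manifest.permission.READ_SMS,
        Manifest.permission.ACCESS_FINE_LOCATION
    )

    // Returns package names that request two or more of the sensitive permissions.
    // Note: on Android 11+ this also needs QUERY_ALL_PACKAGES or a <queries> declaration.
    fun suspiciousPackages(pm: PackageManager): List<String> =
        pm.getInstalledPackages(PackageManager.GET_PERMISSIONS)
            .filter { info ->
                val requested = info.requestedPermissions?.toSet() ?: emptySet<String>()
                requested.count { it in SENSITIVE } >= 2
            }
            .map { it.packageName }

A permission scan like this only shows what an app is allowed to do, not what it actually does, which is why the researchers' focus on observed behavior and trust relationships goes further than a static check.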

Until the research is done, Vigna offers several recommendations:

  • Stick to the "better known app markets," and stay away from other third-party sites (a short code sketch after this list shows how an app's install source can be checked);
  • Before downloading an app, consider the number of downloads it has; millions is a more trustworthy count than hundreds or a few thousand;
  • If the app doesn't work when you've downloaded it, it could turn out to be a bit of malicious code sucking up user information. Uninstall apps that don't work;
  • Carefully check that you're getting what you want. "Angry Bords" isn't from Rovio, and the results from installing it may be far more harmful than egg-stealing pigs.
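
The first recommendation can also be checked programmatically. Below is a minimal sketch, in Kotlin for Android, that reports which store installed each app; "com.android.vending" indicates Google Play, while a null or unfamiliar installer suggests side-loading or a lesser-known market. This is an illustration only, not part of the UCSB project.

    import android.content.pm.PackageManager

    // Maps each installed package to the package name of whatever installed it.
    fun installSources(pm: PackageManager): Map<String, String?> =
        pm.getInstalledPackages(0).associate { info ->
            // getInstallerPackageName is deprecated on API 30+ (getInstallSourceInfo
            // replaces it) but remains the simplest call that works across versions.
            info.packageName to pm.getInstallerPackageName(info.packageName)
        }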

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
