CMU Research Team Analyzes Internet 'Miscreants'

A team led by Carnegie Mellon computer science researchers has developed computer tools capable of tracking the operations of electronic black markets for viruses, stolen data, and attack services.

Adrian Perrig, a CMU associate professor of electrical and computer engineering and public policy, led the team, which developed automated techniques to measure the activities of spammers, virus writers, and identity thieves. In addition to Perrig, the team included Jason Franklin, a Ph.D. student in computer science; Vern Paxson of the International Computer Science Institute; and Stefan Savage of the University of California, San Diego.

The researchers estimated that more than $37 million in software tools for malicious programming were available for sale during their seven-month study period. During that time, more than 80,000 potential credit card numbers were available through "illicit underground Web economies," Franklin told the CMU press office.
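
The article does not describe the team's measurement tools in detail. As a rough, hypothetical illustration only (not the researchers' actual code), the sketch below shows one way an automated monitor could flag strings that look like credit card numbers in scraped market-channel logs, using the standard Luhn checksum; the regular expression and the `count_candidate_cards` helper are assumptions made for this example.

```python
# Hypothetical sketch -- not the CMU team's tool. Flags strings in scraped
# chat/market logs that look like payment card numbers and pass the Luhn check.
import re

def luhn_valid(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# 13-16 digits, optionally separated by spaces or hyphens
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def count_candidate_cards(log_lines):
    """Count distinct Luhn-valid, card-like numbers seen in the log lines."""
    seen = set()
    for line in log_lines:
        for match in CARD_PATTERN.finditer(line):
            digits = re.sub(r"[ -]", "", match.group())
            if 13 <= len(digits) <= 16 and luhn_valid(digits):
                seen.add(digits)
    return len(seen)

if __name__ == "__main__":
    sample = ["selling cvv2 4111 1111 1111 1111 $5 each", "icq me for fresh dumps"]
    print(count_candidate_cards(sample))  # prints 1 (the Visa test number)
```

A real monitor would also need to deduplicate numbers across channels and filter out test or fabricated values; the point here is only that such counting can be automated.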

The researchers found that buyers of malicious software tools and services would normally contact black market vendors using e-mail or instant messaging. Money generally changed hands through non-bank payment services such as e-gold, making the criminals difficult to track.

"These troublesome entrepreneurs even offer tech support and free updates for their malicious creations that run the gamut from denial of service attacks designed to overwhelm Web sites and servers to data stealing Trojan viruses," said Perrig.

The researchers proposed approaches to thwart black marketers, including slander attacks designed to undercut a vendor's reputation in the black market. "Just like you need to verify that individuals are honest on eBay, online criminals need to verify that they are dealing with 'honest' criminals," Franklin said.

In a slander attack, an attacker discounts the verified status of a buyer or seller through false defamation. "By eliminating the verified status of the honest individuals, an attacker establishes a 'lemon' market where buyers are unable to distinguish the quality of the goods or services," Franklin said.

Perrig's team also developed a technique to establish fake verified-status identities that are difficult to distinguish from other verified-status sellers, which makes it hard for buyers to tell honest verified-status sellers from dishonest ones.

"So, when the unwary buyer tries to collect the goods and services promised, the seller fails to provide the goods and services. Such behavior is known as 'ripping.' And it is the goal of all black market site's verification systems to minimize such behavior," said Franklin.

"We believe these black markets are growing, so we will have even more incidents to monitor and study in the future," Perrig said.

About the Author

Paul McCloskey is a contributing editor of Syllabus.
