Cornell and Google Research How to Block Fake Social Engagement

If you've ever watched a crummy video on YouTube with thousands of views and wondered how it generated such positive attention, you may have been the victim of "fake engagement activities." According to a joint Cornell University and Google research team, these are ploys undertaken by "bad actors" posting fake content or artificially inflating the number of YouTube engagements through automated means or by paying people to "like" the content or add comments. The goal is to game the system by inflating engagement metrics in order to obtain better rankings for videos.

And the problem of fake engagement isn't limited to YouTube. It also surfaces on all of the major social sites: Twitter with fake followers, Amazon with fake reviews, and Facebook with fake likes. As "In a World That Counts: Clustering and Detecting Fake Social Engagement at Scale," a paper recently presented at the 25th International World Wide Web Conference in Montreal, explained, this kind of spam activity is "all buyable by the thousand online."

The team set out to develop a way to distinguish fake activities from legitimate ones. The method the researchers developed, called "Local Expansion at Scale" (LEAS), analyzes engagement behavior patterns between users and YouTube videos. Accounts posting fake hits or comments, the researchers found, show a "stronger lockstep behavior pattern": groups of users act together, commenting on the same videos at around the same time.

The work was begun by Cornell graduate student Yixuan Li while he was interning at Google. The research was continued under the guidance of John Hopcroft, a professor of engineering and applied mathematics in Cornell's department of computer science, as well as three Google researchers.

LEAS creates a map, an "engagement relationship graph," that weights the connection between two accounts by how frequently they share engagement activities within a short period of time. The engagement graph allows the researchers "to detect orchestrated actions by sets of users which have a very low likelihood of happening spontaneously or organically."
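The paper does not publish LEAS's implementation, but the core idea of an engagement relationship graph can be sketched as follows. The event format, time window, and threshold below are illustrative assumptions, not the authors' actual parameters:

```python
from collections import defaultdict
from itertools import combinations

def build_engagement_graph(events, window=600.0):
    """Build a weighted graph over accounts: each edge weight counts how many
    times two accounts engaged with the same video within `window` seconds of
    each other. `events` is a list of (user, video, timestamp) tuples.
    (Illustrative sketch only, not the LEAS code.)"""
    by_video = defaultdict(list)           # video_id -> [(timestamp, user), ...]
    for user, video, ts in events:
        by_video[video].append((ts, user))

    weights = defaultdict(int)             # (user_a, user_b) -> co-engagement count
    for hits in by_video.values():
        for (t1, u1), (t2, u2) in combinations(sorted(hits), 2):
            if u1 != u2 and abs(t2 - t1) <= window:
                weights[tuple(sorted((u1, u2)))] += 1
    return dict(weights)

def suspicious_pairs(weights, min_shared=3):
    """Pairs of accounts whose lockstep co-engagement on multiple videos
    has a low likelihood of happening organically (threshold is illustrative)."""
    return [pair for pair, w in weights.items() if w >= min_shared]
```

Two accounts that each comment on the same three videos within minutes of one another would be flagged as a suspicious pair, while accounts that happen to meet once on a popular video would not.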

To evaluate the system's accuracy, humans manually reviewed postings from accounts LEAS had flagged as spammers on YouTube. Even though some of those accounts had been created recently, they had quickly run up long lists of postings. Their comments were often short snippets such as "good videos," "very cool," "nice," "oh" or "lol." The researchers also found a few accounts posting comments under popular songs whose content was irrelevant to the given video but which requested views and subscriptions. Several other "spammy" accounts posted comments containing malicious URLs and advertisements.
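The reviewers' observations suggest simple heuristic checks for such accounts. This is a hedged sketch based only on the patterns described above; the word-count, account-age, and post-count thresholds are invented for illustration and are not from the paper:

```python
import re

# Generic filler comments of the kind the reviewers saw on flagged accounts.
GENERIC = {"good videos", "very cool", "nice", "oh", "lol"}
URL_RE = re.compile(r"https?://\S+")

def looks_spammy(comment, account_age_days, post_count):
    """Flag a posting that matches the reviewers' observations: very short
    generic text, an embedded URL, or a brand-new account with an outsized
    posting history. All thresholds here are illustrative assumptions."""
    text = comment.strip().lower()
    if text in GENERIC or len(text.split()) <= 2:
        return True
    if URL_RE.search(comment):
        return True
    if account_age_days < 30 and post_count > 500:
        return True
    return False
```

A filter like this alone would produce false positives on terse but genuine comments, which is why, as described above, LEAS leans on lockstep group behavior rather than comment text.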

LEAS now runs "regularly" at Google as one of multiple tools for detecting fake engagement activities. When fakes are discovered, they may simply be removed the same day they're detected, or the offending accounts may be deleted altogether. According to the paper, LEAS has "greatly" expanded the volume of fake engagement taken down on YouTube.

The research work was supported by the United States Army Research Office.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
