
Facebook Fraud

Veritasium · 6 min read

Based on Veritasium's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Facebook’s ad-based “Get more likes” can still deliver low-quality, disengaged followers that behave like click-farm fans.

Briefing

Facebook’s “legitimate” ad system for gaining page likes can still produce the same kind of fake-fan problem as outright click-farms—leading to inflated follower counts, suppressed engagement, and a pay-to-reach trap that benefits the platform.

The story begins with Virtual Bagel, a Facebook page that promised a joke business model: “we send you bagels via the Internet.” In 2012, BBC technology correspondent Rory Cellan-Jones created the page and bought likes to test what a like is actually worth. Two routes exist. One is the prohibited route: buying likes from sites such as BoostLikes.com, which rely on click-farms in countries including India, the Philippines, Nepal, Sri Lanka, Egypt, Indonesia, and Bangladesh. Workers there are reportedly paid about $1 per thousand clicks. The other route is the “allowed” route: paying Facebook for page promotion via ads.

Cellan-Jones paid $100 to Facebook and targeted the ad to the UK and the US—yet the likes poured in fastest from developing countries. Within a day, the page gained over 1,600 likes, mostly from places like Egypt, Indonesia, and the Philippines. The suspicious signs didn’t stop at geography. Many new followers looked like classic click-farm artifacts: for example, a Cairo-based account named Ahmed Ronaldo filled its profile almost entirely with Cristiano Ronaldo images and had liked thousands of pages, while showing little real engagement with the Virtual Bagel page.

Facebook later reported deleting 83 million fake accounts in August 2012, about 9% of its total at the time, and some celebrities saw noticeable drops. But the deletion didn’t remove the underlying issue of fake or low-quality likes. The transcript’s author describes receiving Facebook emails offering $50 in free promotion for a modest page. After using the offer, likes tripled within days and kept rising by the thousands per day—eventually reaching around 70,000. Yet engagement on posts didn’t improve; it even declined.

The reason: fake likes behave differently from real followers. Facebook initially distributes each post to a small slice of page likes to gauge reactions. When those likes are fake or disengaged, Facebook sees low interaction (likes, comments, shares) and stops expanding distribution. That creates a paradox where follower counts rise while reach falls. The platform then earns twice: first from ad spend to acquire fans, and again when low engagement forces page owners to pay to promote posts.
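The gating mechanism described above can be made concrete with a small simulation. This is an illustrative sketch of the logic as the transcript describes it, not Facebook's actual algorithm; the sample size, threshold, and follower numbers are assumptions chosen for the example.

```python
# Illustrative simulation of engagement-gated distribution: a post is
# first shown to a small sample of followers, and reach expands only
# when that sample interacts. All parameters here are hypothetical.

def simulate_reach(total_followers, engaged_fraction,
                   sample_size=100, engagement_threshold=0.05):
    """Return how many followers ultimately see a post.

    engaged_fraction: share of followers who would interact if shown.
    The sample's engagement rate gates whether distribution expands.
    """
    shown = min(sample_size, total_followers)
    interactions = shown * engaged_fraction
    if interactions / shown >= engagement_threshold:
        return total_followers  # engagement detected: distribute widely
    return shown                # low engagement: distribution stops here

# A small page with genuine fans outreaches a large page padded with
# disengaged likes, matching the paradox described in the text:
real_page = simulate_reach(total_followers=2_000, engaged_fraction=0.30)
padded_page = simulate_reach(total_followers=70_000, engaged_fraction=0.01)
print(real_page, padded_page)  # → 2000 100
```

Under this model, buying disengaged likes actively harms reach: the padded page's 70,000 followers see less of its content than the small page's 2,000 genuine fans.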

The transcript adds a practical limitation: fake likes can’t be deleted in bulk; page owners can only target around them. Attempts to “solve” the problem by excluding click-farm-heavy countries also fail. A test with Virtual Cat—a deliberately awful page—still attracted likes quickly even when ads were targeted only to the US, Canada, Australia, and the UK. The accounts liking the page appeared to like enormous numbers of unrelated items (everything from major telecoms to random household products), suggesting that ad clicks and likes can be generated without genuine interest.

A proposed hypothesis ties it together: click-farms may click ads for free, then “launder” their activity by liking other pages so their behavior blends into normal-looking patterns. The transcript’s conclusion is blunt: Facebook has little incentive to remove fake likes at scale because doing so would expose ad revenue tied to non-genuine clicks and would undermine the engagement suppression that keeps advertisers paying to reach audiences that aren’t really there.

Cornell Notes

Facebook’s ad-based “Get more likes” can generate follower counts that look real but behave like click-farm fans. A test using Virtual Bagel showed that even when ads were targeted to the UK and US, likes arrived disproportionately from developing countries and came from suspicious accounts with little engagement. The transcript’s author then used Facebook’s legitimate promotion and saw likes surge while engagement stayed flat or fell, because Facebook expands post distribution only when early recipients interact. Fake or disengaged likes therefore shrink organic reach, forcing additional paid promotion. Attempts to exclude certain countries don’t fully solve it, since low-quality likes can still appear from targeted regions via click-farm behavior that’s designed to evade detection.

What’s the core difference between buying likes “illegally” and using Facebook’s “legitimate” ads—and why does it still end up similar?

Illegally buying likes means paying third-party sites that use click-farms; workers are reportedly paid about $1 per thousand clicks. Facebook’s legitimate route means paying Facebook to promote a page via ads. But the transcript argues that the outcome can still be similar: ad-targeted likes can come from the same kinds of low-quality accounts (disengaged, high-volume likers) because the click-farm infrastructure can also interact with ads, producing fake-like behavior even when the advertiser pays Facebook directly.

How does fake or disengaged liking reduce a page’s reach even when follower counts rise?

Facebook distributes each post to a small fraction of a page’s likes to test reactions. If those early recipients don’t engage through likes, comments, or shares, Facebook doesn’t expand distribution to more followers. So a page can gain many new “likes” quickly, but if most of those new likes don’t interact, the algorithm interprets low engagement and limits reach—creating a drop in visibility despite growth in follower numbers.

What evidence suggests that likes gained through ads weren’t coming from genuinely interested fans?

The transcript’s author compares engagement by country using a bubble chart. Western countries like Canada and the US show engagement rates around 30%, while some other Western groups are higher (e.g., Germans over 40%, Austrians near 60%). In contrast, countries such as Egypt, India, the Philippines, Pakistan, Bangladesh, Indonesia, Nepal, and Sri Lanka contribute large like counts but under 1% engagement. Together, those regions account for roughly 75% of the author’s likes before a later video—yet almost none of the likes interact, indicating the followers weren’t meaningfully interested.
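The drag that a large disengaged segment puts on overall engagement can be checked with simple arithmetic. The like counts below are hypothetical; only the per-region engagement rates echo the transcript's figures.

```python
# Hypothetical like counts paired with engagement rates in the spirit
# of the transcript's bubble chart: a huge like volume at near-zero
# engagement collapses the page's blended engagement rate.

likes_by_region = {
    "US/Canada": (5_000, 0.30),                # (likes, engagement rate)
    "Austria": (500, 0.60),
    "Egypt/India/Philippines": (50_000, 0.005),  # big volume, ~0% engagement
}

total_likes = sum(n for n, _ in likes_by_region.values())
engaged = sum(n * r for n, r in likes_by_region.values())
print(f"blended engagement: {engaged / total_likes:.1%}")  # → 3.7%
```

Even though two of the three regions engage at 30–60%, the blended rate lands under 4%, which is the signal a distribution algorithm would act on.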

Why can’t page owners simply delete fake likes in bulk?

The transcript says there’s no bulk deletion option for fake likes. The only practical control is targeting posts around the problematic audience segments, which doesn’t remove the underlying engagement problem and doesn’t restore organic reach the way genuine followers would.

Why does excluding click-farm-heavy countries fail as a complete solution?

A test with Virtual Cat—an intentionally awful page—was advertised only to cat-lovers in the US, Canada, Australia, and the UK. Although major click-farm countries were excluded and few likes were expected, the page spent its budget quickly and received 39 likes in 20 minutes. The accounts liking the page appeared to like hundreds or thousands of unrelated things, including major telecoms, car brands, and random household products—suggesting click-farm behavior can still produce likes inside targeted regions.

What hypothesis ties together ad-driven fake likes and the platform’s incentives?

The transcript proposes that click-farms may click ads for free. To avoid detection, they may “launder” their activity by liking other pages so their behavior looks organic rather than geographically or temporally suspicious. The conclusion is that Facebook benefits from maintaining the status quo: fake or low-quality likes suppress engagement, which reduces organic reach and pushes advertisers to pay again to reach those audiences—while also allowing Facebook to collect ad revenue from non-genuine interactions.

Review Questions

  1. How does Facebook’s post distribution mechanism turn low engagement from fake followers into reduced organic reach?
  2. What patterns in country-level engagement and follower behavior distinguish genuine fans from click-farm-like accounts?
  3. Why might Facebook have incentives not to remove fake likes at scale, according to the transcript’s reasoning?

Key Points

  1. Facebook’s ad-based “Get more likes” can still deliver low-quality, disengaged followers that behave like click-farm fans.

  2. Fake likes inflate follower counts while suppressing engagement because Facebook expands post distribution only after early interactions.

  3. Geography can be a red flag: large like volumes from regions with near-zero engagement suggest non-genuine followers.

  4. Fake likes can’t be removed in bulk; page owners can only target around them, leaving reach problems in place.

  5. Excluding certain countries doesn’t fully solve the issue because click-farm activity can appear even in targeted regions.

  6. A proposed mechanism is that click-farms may click ads for free and “launder” their behavior by liking many other pages to evade fraud detection.

  7. The platform’s incentives may align with keeping the system intact: suppressed organic reach increases the need for paid promotion.

Highlights

  • A $100 “legitimate” like campaign for Virtual Bagel produced most likes from developing countries and attracted suspicious accounts with little engagement.
  • Likes can triple quickly while engagement stays flat or drops because Facebook’s algorithm limits distribution when early recipients don’t interact.
  • Country-level engagement patterns show the mismatch: tens of thousands of likes from certain regions with under 1% engagement.
  • A deliberately awful page (Virtual Cat) still gained likes rapidly even when ads were targeted only to the US, Canada, Australia, and the UK.
  • The transcript’s hypothesis: click-farms may click ads for free and disguise activity by liking many unrelated pages.

Topics

  • Facebook Ads
  • Fake Likes
  • Click-Farms
  • Engagement Algorithms
  • Ad Fraud

Mentioned

  • Rory Cellan-Jones