Moscow’s Mission: Inside the Troll Farm

How Russia built fake communities to turn Americans against each other

Voice & Vision | Moscow’s Mission, Part 2

This builds on Part 1, which you can read here if you’d like the context: Voice & Vision | Moscow’s Mission, Part 1

The Factory

The operation didn’t need to feel foreign. It needed to feel familiar. Russian operators built whole lives online, complete with names, faces, slang, and local concerns, then used those personas to steer conversations that were already hot. Race, immigration, religion, guns, patriotism, veterans, policing, and protest were the main fuel, repeated in friendly memes and urgent posts that looked like neighbors talking to neighbors. (Oxford Internet Institute; Mueller Report)

Many accounts were not “political pages” at first glance. They posed as community groups, ministries, veteran clubs, or local news feeds. They shared relatable content to earn trust, then turned that trust into reach when it mattered. That is why the pages felt real. They started with barbecue photos and church quotes, then graduated to wedge issues and calls to action. (Oxford Internet Institute)

And Americans flocked to their cause with reckless abandon.

Some properties became large hubs. “Blacktivist,” “Being Patriotic,” “Heart of Texas,” and “United Muslims of America” are four that show up over and over in the record. They were used to promote rallies, to flood comment threads, and to seed storylines that traditional media later chased. Their content was not sophisticated. It was familiar, emotional, and fast. (House Intelligence Facebook Ads Archive; Oxford Internet Institute)

The Russian Internet Research Agency (IRA) even organized two rival events on the same day in Houston, promoting one through “Heart of Texas” and a counter-protest through “United Muslims of America,” pushing the two crowds toward each other. The goal was not persuasion in a classroom sense. It was heat. Put enough people in the same area, add volatile slogans, and let cameras do the rest. (House Intelligence Facebook Ads Archive; Mueller Report)

The mechanics were simple. First, create or buy pages. Second, post daily, mostly soft content that fit the audience. Third, pay for small ad boosts to find look-alike followers. Fourth, cross-share between properties to create the illusion of independent agreement. Fifth, escalate during news spikes with petitions, events, and “share if you agree” prompts. Platforms now describe this at scale as “coordinated inauthentic behavior.” (Meta/Facebook disclosures; Oxford Internet Institute)

“Coordinated inauthentic behavior” is a mouthful, but the idea is straightforward: a network of accounts hides who runs it while acting together to mislead people about the source and goal of the content. Platforms removed many such clusters after 2016 and published samples, but takedowns happened after the fact. By the time labels appeared, screenshots and narratives had already spread to legitimate accounts. (Meta/Facebook disclosures; Wired overview)

Again, Americans conveniently chose to ignore, and then forget, that they had been duped.

A second piece of the factory model was credibility laundering. Once a page gathered a real audience, Americans did most of the work. Real people commented, argued, and shared, which pushed posts into friend networks where the Russian origin no longer mattered. Researchers who studied the IRA archives found that the content’s power came from how closely it mirrored existing American discourse, not from any new ideology it invented. (Oxford Internet Institute)

The record also shows how platforms, press, and law enforcement adjusted. Companies set up threat teams and transparency archives. Reporters learned to ask who registered a domain and who controlled an event page. Government bulletins warned about ongoing activity, including fabricated videos timed to domestic narratives. The patterns evolved, but the factory floor stayed busy. (ODNI assessments; Meta/Facebook disclosures)

The lesson is not that a hidden army changed every mind. The lesson is that a small, disciplined shop can hijack attention if it looks local, speaks our dialect, and rides the built-in incentives of the feed. Once a post feels like “us,” we’ll happily do all the heavy lifting for any bad actor. (Oxford Internet Institute; Mueller Report)

If this pulled back the curtain on the “factory,” share it with one reader who still thinks these were just a few trolls with memes. Next in Part 3, we follow what happens after the click, and why the biggest casualty was shared reality.

Sources

Oxford Internet Institute (Computational Propaganda Project), The IRA, Social Media and Political Polarization in the United States, 2012–2018 (2018).

Special Counsel Robert S. Mueller III, Report on the Investigation into Russian Interference in the 2016 Presidential Election, Vol. I (2019).

U.S. House Permanent Select Committee on Intelligence, Exhibits: Russia-linked Facebook ads and pages (2018).

Meta/Facebook, public explanations and takedowns regarding “coordinated inauthentic behavior” (2018–2020).

Office of the Director of National Intelligence (ODNI), declassified assessments on foreign influence in U.S. elections (2017–2021).

*(Parenthetical references in the text point to the source that anchors the paragraph’s core claim.)*