When was the last time you were reading the comments on a major article and then came across That Guy? You know, the unhinged one who unleashes a blistering tirade — about the subject or writer — that includes all sorts of playground insults, things we were told in kindergarten not to say.
Have you ever considered that the person wasn’t really crazy? And in fact may not even be “real”?
Most of us like to assume the best about people, even people we only encounter on the internet. But a disturbingly large number of hateful comments and posts on social media come from “trolls” — not the kind in the “Lord of the Rings” movies or the miniature creatures with colorful hair, but people who are paid to manipulate public opinion. (Actually, the word troll in this context is a derivative of “troller” and has been used since the 1990s to describe people fishing for an argument on the internet.)
Few appreciate the scope of the “troll farms” run by Russia and at least 29 other governments worldwide. These “keyboard armies” operate for different reasons within each country, but they all do at least two things: spread propaganda favorable to the agenda of those in power and attack anyone raising questions about that agenda.
In Russia, this has been going on via social media for nearly 10 years, according to a U.S. Senate Intelligence Committee report. But it was only after the 2016 election that the general public started paying attention. A Justice Department indictment filed in 2018 suggests that hundreds of paid Russian trolls operate these campaigns thanks to an annual budget in the millions. And leading up to the 2020 election, their work reached an estimated 140 million Americans a month.
Writing about the most detailed study to date of internet trolls operating in China, Ryan Fedasiuk, a research analyst at Georgetown University’s Center for Security and Emerging Technology, says the effort there is “much larger than previously reported” — with 2 million paid employees who publish nearly 450 million posts each year.
These are large-scale efforts to “distract the public and change the subject” from serious questions, according to a 2017 report by researchers at Harvard, Stanford and the University of California.
Of course, legitimate questions have been raised about similar kinds of pressures in our own country — though different in scale and substance — especially as we learn more about how the U.S. government influenced moderation standards on social media during the pandemic.
What’s clear, however, is that some governments are not content to only influence their own citizens. Twitter said it shut down tens of thousands of troll accounts in 2020 alone, along with even more “amplifier” accounts that seek to widen the troll accounts’ influence.
Of course, governments have always sought influence using whatever medium is popular at the time, and this practice isn’t always unethical. The first Voice of America broadcasts, for example, were designed to combat Nazi propaganda during World War II. And some internet trolls are exactly what they appear to be: angry people letting off steam when they read something they don’t like. But Fedasiuk and others contend that there is something more sinister at play on a large scale — even, in Fedasiuk’s words, “a strategy to seize international discourse power.”
What trolls do
Independent researchers estimated in 2015 that the Russian “Internet Research Agency” had about 400 staff members working 12-hour shifts, with 80 trolls dedicated to disrupting the U.S. political system alone. This happens on every social media platform and in the comment threads of major news sites — with every imaginable form of disinformation, including “fake fact-checking videos.”
According to a former worker, these efforts are carefully managed by supervisors who are “obsessed” with page views, posts, clicks and traffic. Lyudmila Savchuk described going undercover at a Russian troll factory that attracted young workers with pay even higher than doctors’. She recollected work shifts during which she was required to meet a quota of five political posts, 10 nonpolitical posts, and 150 to 200 comments on other trolls’ postings. Employees were given English grammar lessons and encouraged to watch American media. Since each troll can create and monitor many different accounts, it becomes a numbers game to see which one will grow the largest and have the most influence.
Most of us see social media as a place to share family photos, inspiring quotes or cat memes. Just imagine for a moment that you quit your job and dedicated yourself full time to spreading distrust and sowing discord in a rival nation.
What could you — just you alone — accomplish?
What trolls want
When the Russian assault on Ukraine began, Russian troll farms shifted their focus there. As reported by ProPublica, one troll account shared a video of “someone standing in front of rows of dark gray body bags that appeared to be filled with corpses. As he spoke to the camera, one of the encased bodies behind him lifted its arms to stop the top of the bag from blowing away.”
What viewers don’t realize is that this originally came from a climate change demonstration in Vienna, Austria. But the troll tweeted, “Propaganda makes mistakes too, one of the corpses came back to life right as they were counting the deaths of Ukraine’s civilians.”
Right on cue, another account tweeted the same video with the caption “I’M SCREAMING!” Two more shared it with similar histrionics: “Ukrainian propaganda does not sleep.”
TikTok appears to be especially fertile ground for trolls, according to an analysis by Clemson researchers and ProPublica that found more than 250 million views on posts promoting Russian state media and disparaging President Joe Biden.
Wherever they are located, troll farms all work to advance their funder’s preferred narrative, and to undercut the viability and credibility of competing points of view. That means harassing researchers, journalists and citizens daring to raise a dissenting voice — or even simply drawing attention to the trolls themselves.
Finnish investigative journalist Jessikka Aro was harassed online after she published a story based on interviews with workers at a troll factory in St. Petersburg. Three people were later convicted by a court in Helsinki on charges of defamation and negligence.
In addition to ensuring the dominance of their preferred narrative, these efforts also typically aim for other outcomes: expanding fear, eroding trust in institutions, sowing discord and inciting unrest.
How to spot a troll
Now to the hardest question of all: How do you spot a troll?
In all the great spy novels, the plot depends on the double agent keeping their cover; that’s critical for trolls, too. But there are some telltale signs you’re witnessing a troll at work.
According to an analysis by Clemson University and ProPublica, troll posts appear at defined times consistent with the IRA workday; they drop off during Russian holidays and on weekends, reflecting the regularity of a work schedule. In addition, nearly identical text, photos and videos often appear across various accounts and platforms.
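For readers curious how researchers operationalize signals like these, here is a minimal, purely illustrative sketch — the account names and posts are invented, and the thresholds are arbitrary assumptions, not the actual methodology of the Clemson/ProPublica analysis. It flags two of the patterns described above: near-identical text across different accounts, and posting concentrated in a fixed “office hours” window.

```python
from difflib import SequenceMatcher

# Hypothetical post records: (account, hour posted in UTC, text).
# Purely illustrative data, not drawn from any real dataset.
POSTS = [
    ("acct_a", 9,  "Ukrainian propaganda does not sleep!!"),
    ("acct_b", 10, "Ukrainian propaganda does not sleep"),
    ("acct_c", 11, "ukrainian propaganda does not sleep."),
    ("acct_d", 22, "Here are some photos from my garden"),
]

def near_duplicate_pairs(posts, threshold=0.9):
    """Return account pairs whose post text is almost identical.

    The 0.9 similarity threshold is an assumption chosen for this example.
    """
    pairs = []
    for i in range(len(posts)):
        for j in range(i + 1, len(posts)):
            a, b = posts[i], posts[j]
            ratio = SequenceMatcher(None, a[2].lower(), b[2].lower()).ratio()
            if ratio >= threshold:
                pairs.append((a[0], b[0]))
    return pairs

def workday_concentration(posts, start=8, end=18):
    """Fraction of posts that fall inside a fixed workday window."""
    hours = [hour for _, hour, _ in posts]
    return sum(start <= h < end for h in hours) / len(hours)

print(near_duplicate_pairs(POSTS))   # the three near-identical accounts pair up
print(workday_concentration(POSTS))  # 3 of 4 posts fall in "office hours"
```

A real investigation would of course work at far larger scale and with richer features (timestamps by day of week, link targets, follower graphs), but the underlying idea is the same: coordination leaves statistical fingerprints that individual accounts cannot hide.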
Still, if you have a hard time picking up on this, don’t feel bad. Even our spy agencies struggle. Earlier this summer, the U.S. State Department announced large rewards — up to $10 million — for informants willing to come forward with information.
In the absence of a more obvious smoking gun, I believe we need to trust some of the more intuitive patterns. For example, much as you might spot a hit piece in journalism, ask yourself: Does this person sound completely unhinged? Are they sharing a “screed” rather than a thoughtful comment — that is, something designed to enrage? If so, they might be a genuinely grumpy American, but there’s also a chance they live somewhere else entirely and are getting paid to post.
Another obvious one: Is this post railing against something the large majority of reasonable human beings would agree upon — civility, kindness, basic justice? Or, for that matter, against attention to troll farms themselves?
I’ve been fascinated to read the comment threads in the growing number of stories about systematic troll operations. Every once in a while, you come across someone strangely bugged that any attention is being given to the subject at all — and ready with a clever quip that is supposed to convince the rest of us that the entire inquiry isn’t worth the time or attention.
But of course it is — especially if we care about the health of our public discourse. And maybe it’s time to think more seriously about who is behind that especially mean comment you see.
Wake up, America. It’s time to stop getting played.
Jacob Hess is the editor of Public Square Magazine and a former board member of the National Coalition of Dialogue and Deliberation. He has worked to promote liberal-conservative understanding since the publication of “You’re Not as Crazy as I Thought (But You’re Still Wrong)” with Phil Neisser. With Carrie Skarda, Kyle Anderson and Ty Mansfield, Hess also authored “The Power of Stillness: Mindful Living for Latter-day Saints.”