Illustration by Nate Kitch

The Extremism Machine

Online disinformation poses a danger to society. Researchers at U of T’s Citizen Lab are tracking it – and trying to figure out how to stop it

By Sadiya Ansari

On January 6, 2021, while then-U.S. President Donald Trump urged supporters to “fight like hell” against his election loss, many of his followers started to make their way toward the Capitol. Thousands had travelled to Washington to take part in Trump’s “Save America” rally, and to support his “Stop the Steal” campaign after Joe Biden won the presidential election in November. Twenty minutes before Trump even finished speaking, rioters clashed violently with police outside the Capitol and burst through barricades. A little over an hour later, some protesters smashed a window in the Capitol, and hundreds poured into the building as lawmakers were in the process of certifying the election. Police officers were beaten, offices were vandalized and looted, and five people died.

Like many others, John Scott-Railton, a senior researcher at U of T’s Citizen Lab, watched events unfold with increasing horror. “I saw an image of a guy with zip-tie restraints and my heart dropped,” says Scott-Railton. “I thought, oh my God, is there an intent to kidnap legislators?”

In his research, Scott-Railton examines technological threats against civil society. (Generally, people affiliated with the Citizen Lab work at the intersection of communication technologies and human rights.) Often, he says, these threats come from authoritarian governments targeting activists or protesters in their own countries. But lately he has been spending more time tracking a different kind of problem – the evolution of online disinformation campaigns in democracies, such as the “Stop the Steal” movement, which grew rapidly in the weeks following the November election.

Scott-Railton says the attack on the U.S. Capitol is one of the most alarming instances of how toxic online culture translates into real – and highly damaging – offline consequences. While he watched the movement grow online and suspected that many people were underestimating the danger it posed, the reality of the attack still unnerved him. “I was taken aback by the physical manifestation of violent rhetoric,” he says.

In the months since the attack, Scott-Railton has been using a variety of online investigation techniques to identify and understand the people who participated in the riot, and law enforcement has gotten involved. But he says criminal prosecution of the rioters alone won’t fix the problem. “This is the clearest example we have of a societal problem that people have been warning us about for a generation.”

The problem is that online disinformation is contributing to polarization in political speech, at times veering into outright extremism. The phenomenon is global. And there are numerous reasons for concern.

“Jan. 6 was an opportunity for right-wing extremist organizations. … But we also see lots of people who were in it for the ride and were willing to take directions. And to me, that’s very concerning.”

– John Scott-Railton, senior researcher at U of T’s Citizen Lab

In Canada, Alexandre Bissonnette read anti-Muslim lies and hate online before fatally shooting six people at a Quebec City mosque in 2017. In India, the 2019 election was rife with misleading information – driven largely by nationalism and spread primarily by political parties through WhatsApp. And worldwide, according to the non-profit Center for Countering Digital Hate, multiple anti-vaccination campaigns have ramped up during the pandemic, spreading false information about the safety of COVID-19 vaccines.

The solutions will not be easy.

While disinformation has always existed, there is “no doubt” that social media propels it, says Ron Deibert, a professor of political science at the Munk School of Global Affairs and Public Policy and the director of the Citizen Lab. To him, the cause is clear: a business model based on collecting as much data as possible about users by capturing – and hanging onto – their attention. Simple messages that draw emotional responses are more apt to be liked and shared. The recommendation algorithms on YouTube and Facebook nudge users toward extreme material because it elicits a stronger emotional reaction – watching yoga videos can lead down the alternative-health rabbit hole to anti-vaccination videos, which can then lead to QAnon content. “That’s the algorithm working as intended,” says Deibert.
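
To make the mechanism concrete, consider a toy ranking function that scores content purely by the attention it is predicted to capture. The sketch below is a hypothetical illustration in Python – the items, signals and weights are all invented, and it reflects no real platform’s code – but it shows how an objective that rewards only engagement will surface the most inflammatory material without anyone asking it to.

    # A toy, hypothetical engagement-maximizing ranker.
    # Items, signals and weights are invented for illustration only;
    # this is not any real platform's recommendation code.
    from dataclasses import dataclass

    @dataclass
    class Item:
        title: str
        watch_minutes: float  # predicted time a viewer keeps watching
        share_rate: float     # predicted probability the viewer shares it

    def engagement_score(item: Item) -> float:
        # Score only the expected attention captured. Content that provokes
        # strong emotional reactions tends to rate higher on both signals,
        # so it rises to the top regardless of its accuracy.
        return item.watch_minutes + 10.0 * item.share_rate

    feed = [
        Item("Gentle yoga for beginners", 4.0, 0.01),
        Item("What doctors won't tell you", 9.0, 0.12),
        Item("THE TRUTH they are hiding", 11.0, 0.25),
    ]

    # The most inflammatory item ranks first - the algorithm "working as intended."
    for item in sorted(feed, key=engagement_score, reverse=True):
        print(f"{engagement_score(item):5.1f}  {item.title}")

Nothing in this objective measures truthfulness; that omission, rather than any deliberate malice in the code, is the dynamic Deibert describes.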

Citizen Lab associate Gabrielle Lim researches media manipulation and disinformation. While disinformation can come from anywhere, she says, the important thing is to note how it takes hold. Lim examines media-manipulation case studies as part of a group at Harvard Kennedy School’s Shorenstein Center. The group studies how the information ecosystem allows disinformation to thrive, looking at how falsehoods – such as the idea that COVID-19 was invented in a Chinese lab – reach a large audience. Researchers pick apart how an idea gains traction: from Twitter to a blog, to mainstream media and then to a politician, for example. The bigger the platform, the more likely the idea will be seen as legitimate, says Lim.

“If an idea gets on Tucker Carlson [Tonight on Fox News], then that’s a whole new world – you might have reached a few hundred thousand people before, but now you’re reaching millions of people,” says Lim. “It’s one of the more common strategies we see: we call it trading up the chain.”

The Stop the Steal movement benefited from the ultimate amplifier, says Lim – Trump himself. But the problem with disinformation is not just that bad actors deliberately deceive people, pushing falsehoods further up the chain. It is also that political and media elites benefit from spreading false or misleading information, and that individuals – because of various cognitive, cultural and sociopolitical factors – are inclined to believe and share it. Analysis from the University of Chicago shows that a large majority – 89 per cent – of 193 people charged with crimes related to the Capitol riot had “no connection to existing far-right militias, white-nationalist gangs or other established violent organizations.” While organized extreme right-wing groups were present – the Proud Boys, Oath Keepers and Three Percenters – the analysis found they represented only about 10 per cent of those arrested.

“Jan. 6 was an opportunity for right-wing extremist organizations,” says Scott-Railton. “We’re learning that the Proud Boys and Oath Keepers clearly had specific plans and executed on them. But we also see lots of people who were in it for the ride and were willing to take directions. And to me, that’s very concerning.” What worries Scott-Railton, in other words, is that thousands of people showed up to the Capitol unaffiliated with any extremist group and yet took violent action all the same.

Illustration by Nate Kitch

Megan Boler, a professor in the department of social justice education at the Ontario Institute for Studies in Education (OISE), investigates how social media influences emotions in the current “post-truth” moment, in which emotions have replaced facts. “Emotion drives politics more than ever,” says Boler.

As part of a three-year research project, she and her team are examining social media posts related to politically polarizing events, such as Justin Trudeau’s blackface fallout, or inflammatory statements from People’s Party of Canada leader Maxime Bernier, to track reactions from all parts of the political spectrum.

In her work, Boler has found that the right has been more successful than the left at engaging people online, partly because it is more coordinated. Like other scholars, Boler employs the concept of ressentiment, drawing on philosopher Friedrich Nietzsche. Ressentiment refers to a sense of resentment rooted in a particular understanding of history, says Boler. In the contemporary context, she has observed it in “white people claiming their whiteness as part of their victim identity.” In the case of Trump supporters, it’s clear, says Boler, that they feel the America they “counted on” – an America where white people had privilege – has been lost.

“There are a few scholars who [apply this concept to] the rise of Trump and the particular way that he situates his supporters,” says Boler. “It’s such a great description of the right in terms of an identity as someone who is victimized, one who is virtuous because of the victim identity, and one who wants a particular kind of revenge.”

While her research shows clear similarities in how emotions are targeted across the political spectrum – the focus on issues such as identity politics and freedom of speech – ressentiment is what separates the two sides. “When the right speaks about what they want, they invoke a kind of nostalgia for how things were – make America great again,” says Boler. “Whereas on the left, there’s less desire to move backward.”

Maxime Bernier is an example of a Canadian politician who has tapped the same vein of narratives as the right in the U.S., says Boler, echoing a desire to “return” Canada to a simpler time by focusing on immigrants with “Canadian values” and proposing to cut immigration to less than half the level under the Trudeau government.

Disinformation has helped prop up his campaign. In a campaign speech ahead of the 2019 federal election, for instance, Bernier asked, “Are Canadians happy to subsidize 74 per cent of our current immigrants?” A thorough fact-check by the CBC found the claim to be false, built on “cherry-picked data” that did not reflect immigrants’ contribution to the Canadian economy.

Fact-checking is one tool used against disinformation, especially during elections. FactCheck.org, based at the University of Pennsylvania, has been assessing claims in American politics since 2003. Many news outlets fact-check specific events, such as presidential debates, but also take on fact-checking as a broader enterprise. The Washington Post, like many other outlets, dedicated resources to verifying all of Trump’s statements (it tallied 30,573 false or misleading claims over four years).

But political fact-checking in particular may not be the best tool to fight disinformation, says Lim. She notes that the practice can be highly partisan; CNN and the Daily Caller, a right-wing website, will produce very different fact-checks. “They’re going to be skewed on what they choose to fact-check, and also how they choose to present it,” Lim says. Another major issue is that checking happens after a piece of information has already reached an audience. “It’s very hard to get that fact-check back into the hands of those who saw that original piece of content,” says Lim.

Deplatforming could be more effective. This occurs when social media companies delete or suspend accounts for spreading disinformation or hate, or when fringe platforms are forced to shut down. Twitter permanently suspended Donald Trump’s account two days after the Capitol storming. And the “free speech” social media network Parler went offline for a month following the riots because Apple and Google booted it from their app stores and Amazon refused to host it on its web servers. “Deplatforming can work,” says Lim, “especially when it removes the financial resources for an individual or group to continue their work and takes their content out of the wider circulation of information.” While that content may move onto smaller platforms or messaging apps, Lim says, “growing an audience takes time and can make recruiting more difficult.”

Many people called Trump’s Twitter ban a violation of free speech. But Twitter CEO Jack Dorsey defended the decision in a tweet, saying, “Offline harm as a result of online speech is demonstrably real.” He added that preventing real harm is “what drives our policy and enforcement [of our terms of service] above all.”

Ultimately, a broad solution is needed for a problem that has metastasized as a result of a tech environment that has gone unchecked for decades, says Deibert. “We’ve got an existing ecosystem that is highly insecure, invasive by design, poorly regulated and prone to abuse, and suddenly now we’re relying on it more than ever,” he says. This essentially has created “a giant data-manipulating machine that is bringing out the worst of us.”

Despite this gloomy diagnosis that he characterizes as a “social and political sickness,” Deibert is actually optimistic about being able to turn it around. “We’ve created these feedback loops and unintended consequences, principally around technology, that risk our collective ruin,” says Deibert. “We created them but we can also manage them – if we get our act together, we can do something about it.”

“There’s no substitute for investing in the type of training and education that goes into what it means to be a human as part of a collective – a citizen.”

– Ron Deibert, political science professor and the director of the Citizen Lab

In his 2020 book Reset: Reclaiming the Internet for Civil Society, Deibert lays out a plan to manage this ecosystem better. His main argument is for pursuing solutions guided by the principle of restraint. For governments and companies, this means pulling back on what they can do with the powers of surveillance that information technology has enabled, and managing the risk of bad actors exploiting technology. And there’s a takeaway for individuals. “We’ll need personal restraints, too: restraints on our endless appetite for data, restraints on our emotions and anger as we engage online in the absence of the physical cues that normally help contain them,” writes Deibert.

There’s also a larger idea that Deibert believes will help us mitigate the ills of technology in the future: civic virtue. He points to how the environmental movement has taken a collective, long-term view to advocate restraint in consumption. This type of thinking, he says, has been neglected in favour of science, technology, engineering and math. And while those subjects have an important place in our society, the arts and humanities are what will ultimately help us manage these larger problems.

“There’s no substitute for investing in the type of training and education that goes into what it means to be a human as part of a collective – a citizen,” says Deibert. “Maybe during the pandemic some of us now recognize the importance of that and are starting to adjust accordingly. I hope so.”

13 Responses to “The Extremism Machine”

  1. Stephen Kahnert says:

    My teenage sons will be up by noon this Sunday, and they'll find I've texted them this article highlighting "trading up the chain ... our collective ruin ... restraint." I hope my thanks go viral, as they say these days.

  2. Don Riddle says:

    "Mitigate the ills of technology"? As a member of the scientific community, I take exception to this characterization. Many of the people abusing social media are inciting a culture of anti-science. Anti-vaxxers are only the tip of the iceberg, and it is members of the scientific community we should rely on for fact-checking. Politicians and the media often fail to understand the science, which makes it even easier for those who manipulate social media.

  3. Marillene Allen says:

    Both this article and "Clearing the Air" are amazingly well thought-out and bring hope and direction. They need wider circulation.

  4. Dick Swenson says:

    If I could, I would send you copies of two articles written by Lee McIntyre in regard to his book Post-Truth, from the MIT Essential Knowledge Series. They discuss cognitive dissonance and the difficulty of negating a lie. Using the idea of an ecosystem is good. This is not just a single problem, but a social one that must be handled on a broad front.

  5. Joe says:

    This is a biased article. Unfortunately, there is disinformation and misinformation on both sides. When one side dominates and controls the narrative, there is a loss of trust in society's institutions.

  6. Maddy says:

    Excellent article, thank you!

  7. Thomas Verduyn says:

    The way to deal with the disinformation is not to deplatform people. Deplatforming enrages and entrenches people. The solution to misinformation is open debate. That is the Canadian way: laws are not passed without parliamentary debates; criminals are not convicted without a chance to defend themselves; scientific papers are not published until they have been peer reviewed. Even religious organizations such as the Christian church have a long history of calling councils to debate issues. Of course, debates do not always end up supporting the truth, for people are only human. Furthermore, not every individual is won to the side of the majority. But open debate remains our best avenue to stop misinformation. Sadly, how rare it is for two opposing sides to meet publicly and discuss an issue rationally.

  8. GG says:

    This article is evidence of the bias that exists within universities and media. It does not bode well for society when a very small privileged group (academia and media) use their power and influence to stifle dissenting views about what "the truth" is. There's a presumption in this article that all smart U of T alumni must agree on one "truth." No counter viewpoint is presented.

  9. Edward Wedler says:

    Fighting disinformation is tough, especially in this retweet, repost, BCC world where anonymity and unaccountability thrive. Trust is turning to distrust. We do need to reexamine the algorithms that can nurture the spread of falsehoods. While artificial intelligence is being explored to identify disinformation, I suggest we look at ways to "slow down" the spread; maybe using tools from blockchain technology or employing multi-step transaction/authentication in social media. If we cannot stop the spread of disinformation, by slowing it down we can give time to rethink our online actions -- especially if we are unwitting, innocent partners to the problem.

  10. Karen E Gough says:

    Very timely and informative!

  11. Leslie says:

    I am a proponent of building resilience and critical-thinking skills, which to me is much more sensible and rewarding, with a lot more upside, than doing what we typically do in a liberal democracy: play "whack-a-mole" or "fight." This is silly.

    "Fighting" disinformation is not necessary if we confer the critical-thinking skills on our youth through education. Any university that has graduated people who believe and spread disinformation needs to be asking and answering, "Why and how are we failing?"

    But, as we well know, the process needs to start in youth.

  12. University of Toronto Magazine says:

    From Will Steeves Mancini (BA 1991 UC):

    I was surprised to read that the author considers the belief that the current pandemic was invented in a Chinese lab to be not only extreme, but just plain false. While I can't prove that this belief is true, the author can't simply declare it a falsehood either. There actually is a military laboratory in Wuhan, and the possibility of an accidental release can't simply be brushed aside.

    Similarly, if a pandemic were ever to appear to originate in north-central Maryland, the possibility that the pandemic was an accidental release from the U.S. Army Medical Research Institute of Infectious Diseases in Frederick, Maryland, could not simply be dismissed as implausible.

    With this in mind, I have to ask: In our zeal to root out online extremism and falsehoods, who gets to decide what's true and what's false? Or, what's extreme or moderate? In light of the upcoming passage of Bill C-10, I'm not sure I like where this slippery slope is headed.

  13. Dennis Kung says:

    This article only provides sources of what is perceived as "right-wing" falsehoods. Where are the left-wing examples? The article indicates that about 90 per cent of those charged with crimes from January 6 did not have any connections to right-wing organizations, but the event is still characterized as "right-wing." Gabrielle Lim promotes "deplatforming" and thus censorship. Giving the readers the facts and letting them make their own judgment is preferable.