Banning Trump from Social Media Makes Sense. But Beware the Downside

President Donald Trump addressed his supporters from the Ellipse at the White House on Wednesday, January 6, before thousands from the crowd breached the Capitol Building. The attack was largely planned and discussed on social media and in extreme online communities, according to one BU researcher. Photo by Bill Clark/CQ Roll Call via AP Images
When online hate speech moves off Facebook and Twitter, it migrates to smaller, extreme, fringe platforms, BU researcher explains
After a shocking day in American history when a violent mob, incited by President Trump, stormed and breached the Capitol Building, Facebook and Twitter temporarily banned the president from using their platforms. On Thursday morning, Facebook founder Mark Zuckerberg went a step further, announcing Trump will be banned from Facebook’s social media platforms until at least the end of his term on January 20.
The bans were a direct reaction to claims that Trump has repeated on social media and at a rally Wednesday before the attack on the Capitol—baseless claims of election fraud and of the election being stolen from him after his loss to Joe Biden in November.
“I think that banning [Trump’s] account is the right call for social networks, but it might have unforeseen consequences,” says Gianluca Stringhini, a Boston University College of Engineering assistant professor of electrical and computer engineering. He has been studying online disinformation, hate speech, and radicalization for years, and recently earned a National Science Foundation CAREER award to develop tools to rapidly identify coordinated cyber mobs.
Much of the online activity that led to these days events happens on independent communities created by users after their communities on Reddit were banned. We study the effect of bans on user activity in our recent 📜https://t.co/dmUUQ8svxx
— Gianluca Stringhini (@gianluca_string) January 7, 2021
In a recent paper, Stringhini and his collaborators studied what can unfold after radical online communities are banned from platforms. The researchers analyzed online posts made between 2017 and 2020 on r/The_Donald and r/Incels, two communities that were banned from Reddit and subsequently moved to stand-alone websites. They found that the bans significantly decreased posting activity overall, reducing the number of posts, active users, and newcomers.
But r/The_Donald users who migrated to an independent website called thedonald.win showed signs of increased toxicity and radicalization. According to Stringhini, the findings paint a nuanced picture of the effects of platform moderation and should help inform the decisions that platforms, and government officials, make when it comes to dealing with false and hateful messages.
The Brink caught up with Stringhini to discuss Wednesday’s events, and what impact a ban on Twitter and Facebook could have on Trump and his followers.
Q&A
With Gianluca Stringhini
The Brink: Can you explain how Wednesday’s events at the Capitol were fueled by online communities? Do you see a clear connection?
Stringhini: In the past five years, my collaborators and I observed a tendency for people to move to polarized online communities where they could discuss their political—and often extreme—views without fearing censorship. We found that relatively small communities like 4chan’s Politically Incorrect board, Reddit’s r/The_Donald subreddit, and the websites Gab, Voat, and Parler became fertile grounds for conspiracy theories, disinformation, and online hate. This nefarious content then makes its way to mainstream social networks, where millions of users see it. Oftentimes, this migration of content from fringe communities to mainstream ones is facilitated by popular figures. For example, President Trump’s Twitter account has repeatedly posted conspiratorial and hateful content that first appeared in one of these small communities.
With respect to Wednesday’s events, conspiracy theories undermining the democratic process have spread like wildfire in these polarized communities over the past months. For example, the QAnon conspiracy theory is fueled by messages that come from “Q” and are posted on platforms like 4chan and 8chan; these messages then get interpreted, discussed, and evolved inside these echo chambers until they make their way to mainstream social media. [Wednesday’s] protests were largely discussed and organized on these platforms.
The Brink: Do you think it is a good idea for platforms like Twitter and Facebook to ban President Trump?
Stringhini: I think that in the short term the ban will help reduce the spread of conspiracy theories on Twitter and Facebook. As a side effect, however, many supporting Trump will feel like they are being censored. Without mentioning the incitement of violence, Donald Trump’s Twitter account has been sharing disinformation and conspiracy theories for years, often coming from those same polarized communities that I previously mentioned. The amplification effect that his account has in spreading this false information is staggering, because millions of people take his posts at face value. I think that banning his account is the right call for social networks, but it might have unforeseen consequences.
The Brink: What did your research find about bans and their impact on limiting false information and radicalization?
Stringhini: In our work we provided two case studies of communities that, after being banned from Reddit, migrated to their own independent websites—in this case, thedonald.win and incels.co. We found that these migrations resulted in lower activity by users, possibly because the limited range of topics available on the new platform made it less appealing to users who previously could post on any subreddit. We did, however, find that the users who migrated and remained active on thedonald.win showed increases in signals associated with toxicity and radicalization. This paints a nuanced picture of the effect of platform moderation actions: while fewer users are exposed to toxic content, those who are [exposed to toxic content] become decidedly more extreme, which could lead to more virulent online activity or even real-world violence.
The Brink: Now that Trump is banned, even if it’s temporary, do you think it’s likely that he or his followers will move to a different platform?
Stringhini: If Trump moves to an alternative platform, a large fraction of his supporters will likely follow him. We have been observing large migrations of users to Parler as a reaction to the 2020 presidential election, and [in response] to Twitter flagging claims of election fraud as disputed.
Interview has been edited and condensed for clarity.