Drew Angerer / Getty Images
A new defamation trial for conspiracy theorist Alex Jones, which began this week, may offer insight into the effectiveness of “deplatforming” — booting unwanted accounts from social media sites.
The trial, in Connecticut, is the second of three that Jones faces for promoting the lie, on his streaming TV show and the Infowars website, that the 2012 Sandy Hook Elementary School shooting was a hoax. The families of the victims, whom Jones called “crisis actors,” have faced harassment, threats and psychological abuse. In August, a Texas jury awarded $45.2 million in damages to family members, though Jones says he wants to appeal the ruling.
Jones, a serial conspiracy theorist and fabulist, was removed from nearly all major internet and social media platforms in 2018 after threatening then-special counsel Robert Mueller, who was investigating then-President Donald Trump’s ties to Russia. Initially, a round of media coverage pointed to flagging traffic to Jones’ websites as evidence that “deplatforming works.” However, disclosures from Jones’ defamation trials may point to the existence of a rare class of extreme internet personalities who are better insulated against attempts to cut off access to their content.
In the Connecticut trial, a corporate representative for Jones’ companies testified that Infowars may have generated anywhere from $100 million to $1 billion in revenue in the years since the Sandy Hook massacre. Testifying during the earlier trial in Texas, Jones told the court that Infowars generated nearly $70 million in revenue in the most recent fiscal year, up from an estimated $53 million in 2018, the year Infowars was widely deplatformed.
The difference between Jones and many other right-wing actors is that Infowars had an existing infrastructure outside of social media, says political scientist Rebekah Tromble, who directs George Washington University’s Institute for Data, Democracy and Politics.
According to court filings, Infowars, the largest of Jones’ nine private companies, makes about 80% of its revenue selling products, mostly dietary supplements. Jones grew his talk radio audience aided by an early partnership with a sympathetic distributor, and he now owns his own radio network and an independent video-streaming site.
A growing body of research suggests that removing toxic actors or online communities typically reduces their audience significantly, with a caveat: the smaller audience tends to migrate to less regulated platforms, where rhetoric can grow more extreme and the possibility of violence increases.
Assessing the effectiveness of deplatforming is complicated, because the term can refer to different things, says Megan Squire, a computer scientist who analyzes extremist online communities for the Southern Poverty Law Center.
“Losing your site infrastructure, losing your social media, losing your banking — those are like the big three, I’d say,” Squire says. Each has different effects, she says, depending on the specific case.
Squire’s research shows that traffic to Jones’ online Infowars store remained stagnant for about a year and a half after he was removed from major social media sites. Then, in the lead-up to the 2020 presidential election and during its violent aftermath, the Infowars store saw a massive spike in traffic, bigger than anything Jones had seen in the two years before his deplatforming.
Jones’ resilience is more the exception than the rule, says Squire. She points to the case of Andrew Anglin, founder of the neo-Nazi website The Daily Stormer. After the violent Unite the Right rally in Charlottesville, Va., in 2017, he lost his web domain and had to cycle through more than a dozen replacements, losing traffic each time. Squire says Anglin is on the run from various lawsuits, including a judgment ordering him to pay $14 million in damages for terrorizing a Jewish woman and her family.
Post-deplatforming survival strategies
Even after being banned from social media, conspiracists like Jones find workarounds. Squire says it’s common for other users to host banned personalities on their own channels or simply repost the banned person’s content. People may rebrand, or they may direct their audience to an alternative platform. After being banned by companies including YouTube and PayPal, white supremacist livestreamer Nick Fuentes eventually created his own video-streaming service, where he encouraged his audience to kill lawmakers in the lead-up to the Jan. 6 Capitol riot.
Other internet communities have shown similar resilience. A popular pro-Trump message board known as The Donald was banned from Reddit, re-formed as its own website and was later shut down by its owner following the Capitol riot, yet its successor remains more active than ever, according to Squire. When Trump himself was banned from Twitter, Squire watched the messaging app Telegram take in thousands of new users. It remains a thriving online space for right-wing celebrities and hate groups.
And even if extremists are completely cut off from the credit card companies and financial institutions that process donations, they can always turn to cryptocurrency to raise funds.
“100% of these people are in crypto,” says Squire, which, she notes, is not necessarily easy to live off of. Its value is volatile, and redeeming it is not always straightforward. Nevertheless, Squire and her colleagues have found anonymous donors using crypto to funnel millions of dollars to Jones and Fuentes.
“We live in a capitalist society. And who’s to say entrepreneurs can’t also be on the conspiracy side of things?” says Robert Goldberg, a professor of history at the University of Utah. He explains that conspiracy peddlers have always been “incredibly knowledgeable” about whatever new technology is available to them.
“The Klan’s Atlanta, Ga., headquarters would sell hoods and robes and all this merchandise, this signage, this bling, if you will, to the 5 to 6 million people who joined the Ku Klux Klan in the 1920s,” he says. But outside of the KKK’s heyday, Goldberg says, selling conspiratorial material about the Kennedy assassination, UFOs or the 9/11 terrorist attacks has generally been far less lucrative.
Power and lies
A big question for researcher Shannon McGregor of the University of North Carolina’s Center for Information, Technology and Public Life is what conspiracy entrepreneurs hope to gain from their reach.
“Why are these people doing this in the first place? What are they getting out of this? And in many cases in this country, especially in this very moment, it’s about seizing power,” McGregor says. Fringe communities have always been present in a democracy, she says, but what should matter is their proximity to power.
She rejects a “both sides” framing of the issue, identifying it as a primarily right-wing phenomenon that dates back decades. “Since the Nixon era, at least, this right-wing, ultraconservative media ecosystem has been aligned with political power, making it highly unlikely that it will actually go away,” McGregor says.
She argues that deplatforming and punitive defamation lawsuits are less a solution than a form of harm reduction. When an individual conspiracist or conspiracy site loses its audience, replacements quickly emerge. That does not mean, McGregor and other experts agree, that efforts to stop the spread of extremist or antidemocratic narratives should be abandoned altogether.
“I think, overall, [social media] platforms would love it if the conversation became, ‘Oh, well, deplatforming doesn’t work, does it? … So, you know, it’s not our responsibility anymore,’” Tromble says.
Squire says there’s no doubt it’s worth doing anything that makes it harder for toxic conspiracists to operate smoothly or spread their message. Deplatforming denies them a safe haven and reinforces the societal norm that harassment and hate speech carry consequences.