An assassination attempt for the social media age

On a dark day for democracy, the rules of the 2020 internet no longer apply


The attempted assassination of former President Trump on Saturday marked a grim day in the country’s history. It also left in its wake an information vacuum that endures more than two days later. What was the shooter’s motive? What shaped his ideology? Was he working alone?

Social media abhors a vacuum, and over the past 48 hours users of every big platform have raced to fill it with news, commentary, analysis, conspiracy theories, fabrications and jokes. But where once platforms attempted to intervene swiftly to prevent obvious falsehoods from spreading, this weekend found the role of trust and safety teams largely diminished. 

This was most obvious on X, formerly Twitter, where top trending spots after the shooting were held by hashtags promoting the idea that the event had been staged. (There is no evidence to support this.) In 2020, Twitter led its peers in labeling the former president’s tweets about mail-in ballots as misleading. But after Elon Musk bought the company and rebranded it as X, the company laid off most of its trust and safety workers and stopped adding labels to posts from elected officials.

Musk has since courted right-wing users by hosting audio events with Republican candidates, warning users that “cisgender” is considered a slur on X, and continually posting Republican talking points about immigration and other issues. Given that ideological shift, it was notable that the conspiracy theories spreading on X over the weekend came from the left: terrified that the shooting would give Trump an insurmountable advantage in the election, liberals began posting that Trump’s campaign itself must have been behind the incident.

But the right had conspiracy theories of its own to offer, and they played out across Trump-backed Truth Social and other online spaces friendly to conservatives. Here’s Taylor Lorenz in the Washington Post:

On X, Trump’s Truth Social and the pro-Trump message board Patriots.win, the shooting was portrayed without evidence as a failed execution attempt by shadowy Democrats or an “inside job” by the “deep state” to protect its grip on Washington. Some right-wing posters with millions of online followers shared theories that the Secret Service’s failure to stop the attack was preplanned, or that the agency had been weakened or distracted by diversity initiatives. Musk himself questioned whether the error was “deliberate.”

Right-wing influencers and provocateurs, including Trump’s longtime confidant Roger Stone, shared names and photos alleging that the shooter was in fact an anti-Trump protester, an “antifa extremist” or — in an odd turn — an Italian soccer journalist. They also widely shared a video from an online troll who said he fired the bullets because he hates Republicans, and that he got away with the attack. Conservative conspiracy theorist Mike Cernovich also alleged that the shooting was part of an FBI plot to inspire “copycat attacks.”

It’s tempting to lay all the blame for this at platforms’ feet. During the Trump administration, platforms faced enormous pressure from lawmakers, regulators, and journalists to restrict the spread of theories like these, and for a time they largely did.

And yet of all the takes that landed over the weekend, none stuck with me more than this post from comedian Josh Gondelman. “I know people are saying not to spread conspiracy theories right now,” he wrote on Saturday in a viral X post, “but I would like to read them.”

Gondelman’s post, which has received nearly 70,000 likes, resonated because it speaks to a hugely important and rarely discussed aspect of the misinformation problem: the huge consumer demand for it. The rise of social media and parallel decline of mainstream journalism have enabled us to create what researcher Renee DiResta calls “bespoke realities”: custom versions of the truth that reflect what we already want to believe. As David French wrote last year in the New York Times: “We’re misinformed not because the government is systematically lying or suppressing the truth. We’re misinformed because we like the misinformation we receive and are eager for more.”

This is particularly true for the attempted assassination of Trump, which instantly became the world’s biggest news story despite the fact that very little was known about what had happened. Unlike many mass shooters, Trump’s would-be assassin apparently left no manifesto or trail of social media posts. He had a Discord account that was mostly dormant and does not appear to have been used to plan the attack. He wore a shirt branded with the name of a pro-gun YouTube channel named Demolition Ranch, but the channel itself largely eschews politics in favor of making viral videos of tanks shooting things.

Given the importance of the story and the near-total lack of information about the events leading up to the shooting, it was inevitable that people would speculate wildly. And especially in the immediate aftermath, it’s not clear that platforms should have done much to stop them. Citizen reports about mass shootings and other major news events have often turned out to be true, after all, and asking platforms to divine the truth about what happened in real time and stop potentially false hashtags from being promoted seems just as likely to suppress true speech as falsehoods.

Platforms shouldn’t take a totally hands-off approach, of course. If the shooting had been falsely attributed to some minority group, for example, and a platform’s users were attempting to foment violence against that group, platforms should have intervened. But in this case, however wrong users may have been in their conspiracy theorizing, ultimately nothing promoted on X’s trending page was any crazier than the ideas promoted on Fox News every day. And it seems worth noting that an earlier conspiracy theory about the Kennedy assassination was nominated for eight Academy Awards.

As theories about the shooting flew, tech CEOs including Mark Zuckerberg, Sundar Pichai, Tim Cook, Satya Nadella, and Andy Jassy condemned the violence. (Musk went a step further, endorsing Trump and donating to his campaign.) It was a welcome affirmation of the rule of law, and a reminder of the role tech platforms often played in supporting democratic ideals during the Trump administration.

Whatever happens in the weeks ahead, it’s clear that those ideals will soon be tested again and again. And unlike in 2020, platforms showed this weekend that they are increasingly comfortable sitting on the sidelines of contentious news stories, content to let users seek out whichever versions of the truth most appeal to them.

On Monday, Trump named Ohio Sen. J.D. Vance, a former venture capitalist, as his running mate. Vance is an investor in Rumble, the right-wing YouTube alternative where conspiracy theories thrive. Vance, like Trump, has also endorsed the repeal of Section 230 of the Communications Decency Act, the law that grants platforms legal immunity in most cases for what their users post. 

Vance has said that smaller platforms should retain their legal immunity to help them compete against larger companies. If his view becomes law, the internet and its bespoke realities would splinter once again. Among large platforms, new legal liabilities would prompt them to restrict more speech for fear of getting sued. On smaller ones, continued immunity would allow them to keep hosting and promoting a much broader range of speech — including the conspiracy theories that so many Americans cherish.

The day before the attempted assassination, Meta said it would roll back the remaining restrictions on Trump’s Facebook and Instagram accounts, which his campaign is now using heavily. Trump had seen his accounts suspended for two years after he led an insurrection at the Capitol in which several people died and 174 police officers were injured.

From roughly 2017 to 2023, a broad consensus held among social platforms that tech policy could and should be used to promote high-quality information and support democratic principles. By this summer, though, that consensus had broken down. The trust and safety era has peaked and is now in decline. Tech executives seem increasingly resigned to the idea that Trump will once again be president. And whatever ultimately happens in the 2024 election, it’s now clear that how it plays out online will look very different than it did in 2020.

Sponsored

Extremely Hardcore Sale

The e-book version of Extremely Hardcore: Inside Elon Musk's Twitter is on sale this week for $1.99. Buy your copy today! It's the inside story of how Twitter ceased to exist, as told by the people who worked there.


Those good posts

We're going to be staying away from posts about the assassination here.

Talk to us

Send us tips, comments, questions, and an end to political violence: casey@platformer.news and zoe@platformer.news.