Trump, social media and the way to responsibility

President Donald Trump (AP Photo/Susan Walsh, File)

Following the storming of the US Capitol, internet companies and social media platforms have clamped down on the accounts of President Donald Trump, as well as those of others linked to pro-Trump violence.

It began with Twitter, Snapchat and Facebook suspending President Trump’s accounts. Twitter then decided to ban @realDonaldTrump permanently from its network, sparking a heated debate between Democrats and Republicans over freedom of speech.

Further actions quickly spread across the tech sector, with sites such as Reddit, TikTok, YouTube and Pinterest restricting content seen as inciting violence and limiting hashtags related to topics such as voter-fraud claims.

Parler, the self-billed “free speech” social media platform popular with supporters of President Trump, was removed from Apple’s and Google’s app stores and dropped by Amazon’s hosting service for failing to moderate posts.

On Sunday, Apple issued a statement saying:

“We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity. Parler has not taken adequate measures to address the proliferation of these threats to people’s safety. We have suspended Parler from the App Store until they resolve these issues.”

One can wonder where Trump will go now that he has lost his main communication megaphones. He could join the “Trumpnet”, Rumble, MeWe, DLive or other platforms favourable to the US president, the Los Angeles Times explains. But the series of social media bans raises an even more important question, one that affects not just one country but democracy itself: who is responsible for what goes online?

A Geneva perspective. In early October 2020, when President Trump tested positive for Covid-19, we discussed information manipulation and the responsibility of social media companies for regulating the content they carry with Graduate Institute Professor Michael Kende.

One of the issues is whether these platforms should be treated as tech companies or as publishers, which would subject them to tougher regulation. Following the latest series of crackdowns, Kende shares his analysis of the situation:

“In the United States, responsibility for content posted on platforms rests with the platforms, for two reasons. First, the First Amendment right to free speech strictly limits the role of the government in this domain, and second, Section 230 of a 1996 law allows platforms such as Twitter to publish - and restrict - content from others without liability.

Everyone, including President Trump, is given broad leeway to post what they choose on social media, subject to the terms and conditions of each platform. But leaders such as President Trump are actually given extra leeway because of the public interest in what they have to say, says Kende – a freedom that President Trump abundantly exercised, and that, as a private citizen, he was set to lose when he left office later that month.

“The platforms are clearly in a difficult position. Ban President Trump too early, and be accused of putting a heavy thumb on the political scales – but ban him too late, and be accused of being an accomplice to the incitement of violence,” Kende says. He continues:

“Ironically, as his tweets began to accumulate warning labels, Trump sought to punish the platforms by seeking a repeal of Section 230. As a punishment it clearly would have worked – Section 230 is often credited with creating the Internet as we know it – but it also would have backfired on Trump. Without the protections of Section 230, the platforms would have been liable for his words like any other publisher, and he might have been warned or banned earlier, for saying less than he did.

Section 230 is clearly under pressure, and will remain so post-Trump, because of the enormous responsibility it places on a few select platforms – a responsibility that has been highlighted over and over. However, Winston Churchill noted that ‘it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time…’ and the same may be said about how to regulate platforms within a democracy. Until someone figures out how to guide the platforms without violating the First Amendment, Section 230 may be the best that we can do.”

The pressure is also felt by Professor Jacques de Werra, director of the Digital Law Center, who shares his view of the issues at stake:

“The ban of Trump’s accounts on social media platforms is another striking example of the huge pressure put on social media platforms to act as arbiters of public discourse that have to decide what online content is acceptable or not. Each social media platform decides independently on the basis of its own set of self-defined rules: each platform adopts and enforces its own rules and thus acts both as a regulator and as a judge. This is the (wild) era of Twitterocracy, Facebookracy (and other socialmediacracy) that calls for policy action.”

As calls for action and policy are heard across the world, one thing is certain: the Washington insurrection shows the concrete, harmful impact of information shared on internet platforms. Will the international community seize the momentum and lay the foundations for a clear path forward while preserving democracy? The answer might just be a click away.