Calls for tougher regulation and guidance to tackle harmful social media content are constantly growing. One of the most recent campaigns, #StopHateForProfit, launched in July 2020, aims to hold social media companies responsible for hate speech on their platforms. Major corporations such as Coca-Cola and Patagonia paused their ads on Facebook in protest against the platform valuing profit over tackling hate, violence, and misinformation. In September, celebrities joined the movement by staging an “Instagram freeze” to fight hate speech: with millions of followers between them, Kim Kardashian West, Katy Perry and Leonardo DiCaprio protested by not posting anything on Instagram for 24 hours.
Although symbolic and short-lived, the freeze is another expression of the pressure on Facebook and social media platforms in general.
Behind the scenes, many projects have been launched to address the growing problem of hate speech. One initiative comes from the Israel Democracy Institute and Yad Vashem. It was discussed last week during a webinar organized by the University of Geneva Digital Law Center, with Professor Yuval Shany, vice-president of the Israel Democracy Institute, who led and coordinated the initiative, and other panelists from the private sector (Facebook), academia and international organizations.
Recommendations for future action. The Israeli proposal sets out 16 recommendations for social media platforms to reduce hate speech and protect fundamental human rights while safeguarding freedom of expression. It highlights the legal and ethical responsibility of social media companies for the harm caused by hate speech on their platforms, and encourages them to revise their definitions of what constitutes hate speech in accordance with international human rights law standards. Shany insists on the importance of some specific recommendations:
“Regulating hate speech should not be a binary matter (keep/remove content), but rather include nuanced responses (such as limits on virality and counter-messaging) and pre-emptive interventions (such as reminding users of community rules when offensive phrases are used). Complaint procedures must be improved and rendered more accessible and independent, and companies should engage in broader efforts of consultation with stakeholders and be more transparent about the application of their hate speech policies.”
The recommendations aim to mobilise governments, chief security officers, intergovernmental organisations and consumers to make specific demands of online platforms about their policies on the matter. They also aim to encourage social media companies to revise their policies in accordance with international law norms and institutions.
A challenging task. The importance of these measures is underscored by the challenges they face. As a member of the steering committee for the Israeli proposal, the director of the Digital Law Center, Professor Jacques de Werra, identifies two main issues:
“It is very complex to assess what should be considered as hate speech or not. It may depend on circumstances and may not even be expressed in words but in the form of a picture or a video. Moreover, hate speech is a very serious offense. It is not only about violating an individual’s right. Hate speech can trigger criminal sanctions. It affects not only the victim but society as a whole.”
This complexity and seriousness imply finding ways to help social media platforms set up internal or external mechanisms to control the filtering of content, while balancing freedom of expression against the need to curb abusive messages.
A global commitment. Even if the platforms themselves self-regulate and take down content that they consider to violate their hate speech rules, this is not enough. The international system needs to engage in the definition of regulatory mechanisms, according to Shany:
“The international system needs to develop more specific norms delineating the scope of protected and prohibited speech, given the unique features of online speech, and develop accountability procedures for victims targeted by online hate speech (which will also address illegitimate restrictions on free speech). Ultimately, international human rights law cannot focus only on the responsibility of governments and it must engage in a paradigmatic shift to regulate the policies and practices and responsibility of global platforms.”
It is also up to users to continue to raise awareness and report illegal content. Celebrities such as Sacha Baron Cohen and Demi Lovato, for example, chose to begin posting educational messages aimed at young people after the Instagram freeze. One thing is certain: expectations are high, and a global discussion is indispensable.