In 2020, Meta de-platformed a leader of India’s ruling party, the BJP, for repeatedly violating its hate-speech policy. To date, the legislator has been arrested and criminally charged for hate speech at least eight times, particularly ahead of elections and during rallies. But although he himself could no longer post, his followers continued to share his hate speech across social media platforms. Ahead of India’s general elections this year, one thing is clear: social media corporations are not ready for the flood of hate speech and incitement to violence that will hit their platforms.

Beyond India, more than two billion people will go to the polls this year in over 65 elections worldwide. For the more than 4.59 billion people who use social media, news feeds will be a major source of information. Social media corporations must therefore demonstrate how they will guarantee equitable human rights protection for people everywhere, before, during and after elections.

Taking human rights seriously

Scathing whistle-blower testimonies have drawn attention to the failure of social media corporations such as Meta, Alphabet and X to prevent the spread of illegal content and incitement to violence on their platforms. In the European Union, the flagship Digital Services Act (DSA) now requires them to take crucial steps, especially on risk assessments and transparency, to safeguard electoral integrity in EU countries holding elections. The DSA’s provisions are set to enter into force by February 2024, well in time for the European Parliament elections in June. Yet there are serious concerns that the corporations may simply ignore the binding obligations of the DSA, as they have previously done with voluntary commitments.

For the rest of the world, Meta, Alphabet, X and others have given no indication that they are prepared to address the risks of egregious human rights violations facilitated by their platforms. While Meta published an announcement on ‘How Meta is preparing for elections in 2024’ in November 2023, there is a major catch: the plan applies only in the United States. What measures are in place for the other 64 elections remains a mystery, as does any justification for applying different human rights standards in different parts of the world.

Despite proven links between hatred stoked online and offline violence, social media corporations failed to tackle disinformation campaigns in the recent elections in the Democratic Republic of the Congo. Two opposition candidates, for instance, were depicted as candidates backed by Western countries, which damaged their candidacies. And although polarising and heated online debates on WhatsApp and X over electoral irregularities and fraud in the presidential and legislative elections sparked real threats of offline violence, the corporations did not act to de-escalate them.

Social media corporations claim to respect human rights globally, as Meta does in its flagship annual Human Rights Report. Yet it appears this commitment only applies in some countries.

Need for improvement

To take their devastating impact on human rights and democracy seriously, social media corporations must urgently strengthen, expand and resource partnerships with fact-checkers, independent media, civil society and other bodies that protect electoral integrity. This contrasts sharply with current practice. X, for instance, has blatantly refused to engage with civil society: when Indian fact-checkers from BOOM Live reached out to X (then Twitter) for comment, they received a ‘poop’ emoji in response.

Civil society partnerships, although necessary, are insufficient on their own to hold social media corporations accountable. As an alliance of over 150 civil society organisations under the umbrella of the Global Coalition for Tech Justice has recommended, social media corporations must also address the most urgent threats with the necessary dedication. To do so, they must commit the resources needed to deploy the full spectrum of available tools and measures.

Yet social media corporations have taken active steps that reduce the resources at their disposal for effective election preparedness. In 2021, former Meta employee and whistle-blower Frances Haugen revealed that an internal company study had warned senior leadership about targeted hate speech in India, but the leadership refused to allocate more resources to the problem and instead reportedly further fuelled online divisions. When Meta was summoned to testify before Delhi’s Peace and Harmony Committee following the revelations, it failed to provide satisfactory answers about its resources.

In the years since, social media corporations have laid off staff en masse: Twitter (now X) reportedly disbanded its ethical AI team in November 2022, laying off all but one of its members along with 15 per cent of its trust and safety department. Meta, meanwhile, has stated that it has over ’15 000 reviewers [...] in over 50 languages’, thereby acknowledging that it neglects most of the world’s languages. In India alone, Meta provides content moderation in only 20 languages, while more than 120 languages are spoken in the country. And even where it employs reviewers, the systems show deep flaws: in Kenya, 184 content moderators sounded the alarm and sued Meta over their working conditions in 2023.

This year’s elections are a formidable challenge to the very foundations of democracy, and social media corporations must treat them as such. Especially in countries that have experienced political coups and are struggling with ‘foreign interference’ in their politics, it is necessary to examine carefully how social media corporations’ behaviour may strain or undermine social cohesion. In Burkina Faso, recent coups have raised questions about whether the planned general election will proceed. Parliamentary elections in Chad have been rescheduled several times, and Mali’s presidential election, already on its second attempt, has again been postponed.

Neglecting manifestations of online violence, not only in fragile countries but in all countries going to the polls, would pose a major threat to human rights protection in the 2024 elections. In India, where elections are scheduled for April, political parties have a track record of mobilising around communal violence ahead of polls. The stakes for peace and stability are high, and social media corporations are critical vectors in deciding which turn history takes in these countries.

Thousands of fact-checkers, trusted flaggers and human rights defenders have systematically raised these problems, yet this has not made a visible dent in the choices social media corporations make about their platforms. While the corporations may make lofty claims of concern for human rights and democracy, the problems on their platforms are inherently tied to their very design. As the Council on Technology and Social Cohesion, a coalition of technologists, academics, policy influencers and peacebuilders, has flagged, social media corporations will need to explore design-focused approaches to truly mitigate online harms and foster healthier societies, implementing changes at the core of how their platforms are built.

Ahead of these elections, social media corporations must now demonstrate that they will radically depart from business as usual. It is critical that they not only moderate their platforms with conflict sensitivity and all resources at their disposal but also make choices at the design level to demonstrate equitable protection of human rights for everyone, everywhere. By embracing these transformative measures, social media corporations can foster a digital landscape that not only upholds human rights and democratic values but also inspires unity, understanding and positive engagement among users worldwide in this decisive election year.