
The Algorithm Increases Harmful Content’s Reach, Raising Exposure and the Risk of Real-world Harm

CIVICUS speaks with Mercy Mutemi, a Kenyan tech lawyer at Nzili and Sumbi Advocates and legal counsel for petitioners challenging Facebook’s role in hate and violence in Ethiopia.

On 3 April, Kenya’s High Court allowed a landmark lawsuit against Meta to proceed. The lawsuit, valued at US$2.4 billion, was filed in Kenya because Meta’s regional content moderation was carried out by a contractor based there. It was brought by two Ethiopian nationals and a Kenyan civil society organisation who accuse Facebook of promoting hate and incitement during Ethiopia’s civil war. One of the claimants’ fathers was killed after being doxed on the platform. The case seeks a restitution fund, improved content moderation and systemic reform. Meta continues to challenge the court’s jurisdiction, but digital rights groups believe the lawsuit could establish a global precedent for platform accountability for online-incited violence.

How have Kenyan courts established jurisdiction over Meta?

Three separate cases have challenged Meta and its outsourcing partners, Samasource and Teleperformance (formerly Majorel), in Kenyan courts. That Kenyan courts successfully asserted jurisdiction over all of them marks a significant shift in global tech accountability.

Meta initially challenged the courts’ jurisdiction, arguing it did not directly employ moderators or operate locally. However, this strategy failed across all three cases. In September 2024, the Court of Appeal ruled that Kenyan courts have jurisdiction to hear allegations of human rights violations linked to Meta. Similarly, the Constitutional and Human Rights Court ruled in April that, since content moderation decisions were made in Kenya, Kenyan courts could assess whether human rights were upheld.

These rulings break new ground. Unlike other jurisdictions that grant tech companies broad safe harbour protections, Kenyan courts have relied on constitutional rights to pierce that immunity and allow claims to proceed. This could reshape how global platforms are held to account worldwide.

What are the cases and their focus?

The cases fall into two categories: labour rights and algorithmic harm. The two labour cases focus on exploitative working conditions: former Facebook content moderators claim they were trafficked into jobs that severely damaged their mental health. These are the first cases seeking to hold a major tech company accountable for labour-related human rights abuses occurring outside the USA, even though the work was carried out through outsourcing firms. They could establish global standards for how tech companies must treat their remote workforce and clarify where responsibility lies within complex supply chains.

The third case, in which I represent one of the plaintiffs, tackles a different problem: algorithmic amplification of hate speech during Ethiopia’s conflict. Despite Meta’s claim that the impact was felt in Ethiopia, not Kenya, the court ruled that Kenyan jurisdiction applies because the moderation decisions originated there.

This case will determine whether social media platforms can be held accountable for human rights violations resulting from AI-driven systems, whether Meta’s algorithms demonstrate bias or discrimination against African users and whether national courts have jurisdiction over digital decisions with cross-border impacts.

How can Facebook’s algorithm amplify violence?

The evidence presented reveals a disturbing pattern. Facebook’s algorithm, which prioritises ‘meaningful’ and ‘rewarding’ social interactions, is designed to amplify content that provokes strong reactions, including inflammatory, polarising and hateful content. Rather than being a bug, this appears to be a feature of the system.

The consequences in conflict zones are devastating. Instead of limiting harmful content, the algorithm increases its reach, raising exposure and the risk of real-world harm. During Ethiopia’s conflict, hate speech, war propaganda and incitement to violence spread unchecked. Critically, threats made online were not just rhetoric. They were carried out in real life. One claimant’s father was killed after being targeted on the platform.

Meanwhile, Facebook consistently failed to invest in meaningful content moderation in Africa. The platform employed too few moderators, subjected them to exploitative working conditions and failed to adjust its algorithms and community standards to local contexts. This created a perfect storm for violence.

What changes are you demanding?

Our demands target both immediate harms and systemic problems.

For immediate relief, we seek a restitution fund to compensate victims of algorithmic amplification and failed content moderation, plus transparent and accessible mechanisms to escalate moderation of content violating human rights or constitutional protections.

For long-term change, we demand algorithms be redesigned to reflect the unique risks and needs of different African communities, with linguistic and cultural equity ensured by training systems with diverse, localised data.

There’s a broader opportunity here. Africa plays a central role in the global tech ecosystem, particularly in training algorithms. Many data workers and moderators shaping these systems are based on the continent, often working through outsourcing firms. African governments and civil society must leverage this position to push for greater equity in algorithm training and demand more inclusive AI systems.

Success requires strengthening worker protections and increasing supply chain transparency. These are essential steps towards lasting tech accountability and safer digital environments across Africa.



