“Judge Rules Meta Must Face Lawsuit from Former Facebook Moderator”
A Kenyan labor tribunal says a worker who claims reviewing graphic social media posts harmed his mental health can sue Facebook owner Meta.
Daniel Motaung claims he was paid around $2.20 (£1.80) an hour to review posts including beheadings and child molestation.
He is also suing his former employer, Sama, the outsourcing firm Meta contracted to carry out content moderation.
Meta argued that the court had no jurisdiction as the company was not based in Kenya, Reuters reported.
But the court disagreed, finding that Meta and Sama were “proper parties” in the case.
Meta declined to comment, but rights campaign group Foxglove said it expected the company to appeal.
In 2020, Meta paid $52 million to settle a case brought by US content moderators over mental health issues they allegedly developed at work.
The Kenyan case is being supported by Foxglove, whose director Cori Crider said a key point had been made: “Daniel’s victory today should send a message to Facebook and, by proxy, to all the big tech companies operating in Africa.
“The Kenyan judiciary is on par with any tech giant, and the giants would do well to wake up and respect the Kenyan people – and their law.”
Flashbacks
Facebook employs thousands of moderators to review content flagged by users or by artificial intelligence systems, determine whether it violates the platform’s community standards, and remove it if it does.
Mr Motaung said the first graphic video he saw was “a live video of someone being decapitated”.
He told the BBC in May 2022 that he suffered from flashbacks of imagining he was the victim.
Mr Motaung, who says he was diagnosed with post-traumatic stress disorder, believes his colleagues also struggled with the content they had to watch.
“I saw people leaving the production floor to cry, you know, something like that,” he said.
Mr Motaung was recruited from South Africa to work for Sama in Nairobi, where much of the moderation for East and Southern Africa was carried out.
He claims the moderators’ support was insufficient.
Sama has called the allegations “both disappointing and inaccurate”, arguing that it provided all members of its workforce with a competitive wage, benefits, upward mobility and a robust mental health and wellbeing programme.
The company has since ended its moderation work for Meta.
Meta has previously declined to comment directly on the legal action, but has said it requires partners to “provide industry-leading pay, benefits and support.”
Image caption: Kenyan lawyer Mercy Mutemi (centre), who is working on both cases against Meta, pictured in court.
Activists say Monday’s court ruling could have implications for other cases they are trying to bring.
In December, a case, also supported by Foxglove, was launched in Kenya alleging that Facebook’s algorithm helped fuel the viral spread of hate and violence during Ethiopia’s civil war.
The case asks the court to require the creation of a $2 billion fund for victims of Facebook hate and changes to the platform’s algorithm.
Meta has said hate speech and incitement to violence are against the platform’s rules, and that it invests heavily in moderation and technology to remove such content.
It told the BBC: “We employ staff with local knowledge and expertise and continue to develop our skills to detect harmful content in the country’s most commonly spoken languages, including Amharic, Oromo, Somali and Tigrinya.”