Facebook Permitted Posts Spurring Ethiopian Civil War: Suit


Photo: Getty Images

Facebook and its parent company Meta are being sued for allegedly allowing toxic, violence-inciting content to flourish in communities in Ethiopia, where a civil war has left hundreds of thousands dead in recent years.

The lawsuit, filed by two Ethiopian researchers, accuses the tech giant of helping to fuel violence in the region through indifference, namely a lack of effective content moderation controls. The suit claims that the tech giant’s recommendation systems (the algorithmic encouragement for users to engage with certain kinds of content) fueled the sharing of hateful posts in the region. The lawsuit asks a court to force Meta to take steps to halt the spread of violent content, including hiring additional regional moderation staff, adjusting its algorithms to demote such content, and setting up restitution funds of some $2 billion to help victims of violence “incited on Facebook,” Reuters reports.

“Not only does Facebook allow such content to be on the platform, they prioritise it and they make money from such content. Why are they allowed to do that?” Mercy Mutemi, the attorney for the researchers, asked during a recent press conference.

One of the researchers behind the suit, Abrham Meareg, has a personal connection to the ethnic violence. In November of 2021, Meareg’s father was shot to death, just one month after the elder man had been the subject of death threats and ethnic slurs in Facebook posts, the lawsuit claims. Meareg says that prior to the murder, he contacted Meta and asked the company to take the content down, but the company did not respond quickly, nor did it end up taking down all of the posts about his father. The researcher now says that he holds Meta “directly responsible” for his father’s death.

Meta’s content moderation has been a source of ongoing litigation in East Africa and beyond. Facebook has been accused of letting its most toxic content flourish in Kenya after it approved pro-genocide advertisements, which nearly got the social network banned from the country entirely. Facebook previously faced a $150 billion lawsuit filed by Rohingya war refugees who accused the tech giant of fueling the genocide in Myanmar. Amnesty International concluded that the company had, in fact, contributed to the ethnic cleansing in the country. Additionally, the company has been accused of similar dysfunction in countries like Cambodia, Sri Lanka, and Indonesia.

Gizmodo reached out to Meta for comment on the most recent lawsuit and will update this story if it responds. In a statement provided to Reuters, company spokesperson Erin Pike defended the company, saying: “We invest heavily in teams and technology to help us find and remove this content…We employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages” in Ethiopia.

