Facebook owner Meta Platforms said on Thursday that it was reviewing the feasibility of commissioning an independent human rights assessment of its work in Ethiopia, after its oversight board recommended a study of how Facebook and Instagram have been used to spread content that heightens the risk of violence there.
The board makes binding rulings on a small number of challenging content moderation cases and provides non-binding recommendations on the company's content policies.
Meta is under scrutiny from lawmakers and regulators around the world over its handling of abuses on its platforms, particularly in countries where such speech could fuel violence.
Thousands have died and millions have been displaced by a year-long conflict between the Ethiopian government and rebel forces in the northern Tigray region.
Meta's statement came in response to the board's recommendations in a December case involving content posted in the country.
In that case, the oversight board upheld Meta's original decision to remove a post alleging that Tigrayan civilians were complicit in atrocities in Ethiopia's Amhara region. Because Meta had restored the post after the user appealed to the board, the company had to remove the content again.
Meta said on Thursday that it had taken the post down, but that it disagreed with the board's reasoning that it should be removed because it was an "unverified rumor" that significantly increased the risk of imminent violence. Such a standard, the company said, would impose a journalistic publishing standard on ordinary people.
A spokesman for the oversight board said in a statement that the board's recommendations were aimed at ensuring Meta's policies are applied effectively in conflict situations, adding that in this case, rumors alleging that an ethnic group is complicit in atrocities can lead to grave harm to people.
The board had recommended that Meta commission an independent human rights assessment of its work in Ethiopia, to be completed within six months.
However, the company said that not all elements of the recommendation might be feasible in terms of timing, data science or approach. It said it would continue its existing human rights due diligence and should have an update on whether it could act on the board's recommendation in the coming months.
Reuters has previously reported on how Facebook has struggled to monitor content across the world's many languages, including in Myanmar and other countries. In 2018, UN human rights investigators said the use of Facebook had played a key role in spreading hate speech that fueled violence in Myanmar.
Meta has said it was too slow to act on misinformation and hate speech in Myanmar, and that it has since stepped up its efforts in countries at risk of conflict and violence.
The board also recommended that Meta rewrite its statement of values on safety to reflect that online speech can pose a risk to people's physical security and right to life. The company said it would make changes to this value, partially implementing the recommendation.
(Reporting by Elizabeth Culliford; Editing by Nick Zieminski)