Internal memos show Facebook admits core parts of its platform are hardwired for misinformation

Facebook has admitted that core parts of its platform are well suited to spreading misinformation and divisive content, according to new internal documents that also show the social media company struggling to contain hate speech in the developing world and reluctant to censor rightwing US news outlets.

An internal memo warned that Facebook's "core product mechanics" – the basic workings of the platform – had allowed hate speech and misinformation to grow on the site. The memo added that Facebook's basic functions are "not neutral".

"We also have compelling evidence that our core product mechanics, such as virality, recommendations and optimising for engagement, are a significant part of why these types of speech flourish on the platform," the memo said.

Referring to Facebook's integrity teams, the document adds: "If integrity takes a hands-off stance for these problems, whether for technical or philosophical reasons, then the net result is that Facebook, taken as a whole, will be actively (if not necessarily consciously) promoting these types of activities. The mechanics of our platform are not neutral."

The document is part of a wave of stories published on Monday, including by the New York Times. The NYT stories and others are based on disclosures made to the Securities and Exchange Commission – the US financial regulator – and provided to Congress in redacted form by the legal counsel of Frances Haugen, a former Facebook employee turned whistleblower. The redacted versions were obtained by a consortium of news organisations including the NYT, Bloomberg, Politico and the Washington Post.

The documents were initially obtained by the Wall Street Journal, which has been publishing damaging exposés about Facebook since last month.

Other stories published on Monday detail Facebook's inability to curb hate speech and harmful content outside the US. According to a number of the documents, incitement to hatred and misinformation is substantially worse for non-English speakers. Much of Facebook's moderation infrastructure is under-resourced for languages other than English, and its software struggles to understand certain Arabic dialects, the Associated Press (AP) reports.

The company's automated moderation tools detected just 0.2% of harmful content in Afghanistan, according to an internal report from earlier this year cited by Politico. Because the company had no moderators fluent in Pashto or Dari, the country's main languages, the rest of the harmful content was supposed to be flagged through user reports. But the tools for reporting harmful material in the country were available only in English, a language not widely spoken in Afghanistan.

According to the AP, Apple threatened two years ago to remove Facebook and Instagram from its app store, citing concerns that the platforms were being used to trade in domestic servants. The threat was dropped after Facebook shared details of its efforts to address the problem.

The documents disclosed by Haugen also contain testimony from dozens of Facebook employees frustrated by the company's failure to either acknowledge the harm it causes or to properly support efforts to mitigate or prevent that harm.

"We are FB, not some naive startup. With the unprecedented resources we have, we should do better," wrote one employee quoted by Politico in the wake of the January 6 2021 attack on the US Capitol.

Another wrote: "Never forget the day Trump rode down the escalator in 2015 and called for a ban on Muslims entering the US. There is a straight line from that day to today, one of the darkest days in the history of democracy. History will not judge us kindly."

Elsewhere in the documents, a memo seen by the Financial Times stated that a Facebook public policy team blocked decisions to take down posts "when they see that they could harm powerful political actors". According to the memo, moves to take down content from repeat offenders against Facebook's guidelines, such as rightwing publishers, were often reversed because the publishers might retaliate.

The memo cited publishers such as Breitbart and PragerU.

A Facebook spokesperson said: "At the heart of these stories is a premise which is false. Yes, we are a business and we make a profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands where our own commercial interests lie. The truth is we have invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook."