In February 2019, a Facebook researcher created a new account to experience the social network as a user living in India.
For the next three weeks, the account followed a simple rule: accept all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site.
The result was a flood of hate speech, misinformation and celebrations of violence, documented in an internal Facebook report published later that month.
“I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life,” the Facebook researcher wrote.
The report is one of dozens of studies and memos written by Facebook employees grappling with the platform’s effects on India. They provide stark evidence for one of the most serious criticisms that human rights activists and politicians have leveled at the company: that it expands into a country without fully understanding its potential impact on local culture and politics, and fails to commit the resources to act on problems once they occur.
With 340 million people using Facebook, India is the company’s largest market. And Facebook’s problems on the subcontinent are an amplified version of the issues it has faced throughout the world, made worse by a lack of resources and a lack of expertise in India’s 22 officially recognized languages.
The internal documents, obtained by a consortium of news organizations that included The New York Times, are part of a larger cache of material called the Facebook Papers. They were collected by Frances Haugen, a former Facebook product manager who became a whistle-blower and recently testified before a Senate subcommittee about the company and its social media platforms. References to India were scattered among the documents Ms. Haugen filed with the Securities and Exchange Commission in a complaint earlier this month.
The documents include reports on how bots and fake accounts tied to the country’s ruling party and opposition figures were wreaking havoc on national elections. They also detail how a plan championed by Mark Zuckerberg, Facebook’s chief executive, to focus on “meaningful social interactions,” or exchanges between friends and family, led to more misinformation in India, particularly during the pandemic.
Facebook did not have enough resources in India and was unable to grapple with the problems it had introduced there, including anti-Muslim posts, according to the documents. Eighty-seven percent of the company’s global budget for classifying misinformation is earmarked for the United States, while only 13 percent is set aside for the rest of the world, even though North American users make up only 10 percent of the social network’s daily active users, according to a document describing Facebook’s allocation of resources.
Andy Stone, a Facebook spokesman, said the figures were incomplete and did not include the company’s third-party fact-checking partners, most of whom are outside the United States.
That lopsided focus on the United States has had consequences in numerous countries besides India. Company documents show that Facebook installed measures to demote misinformation during the November election in Myanmar, including disinformation shared by the Myanmar military junta.
The company rolled back those measures after the election, even though research showed they had lowered views of inflammatory posts by 25.1 percent and of photo posts containing misinformation by 48.5 percent. Three months later, the military carried out a coup. Facebook said that after the coup, it implemented a special policy to remove praise and support of violence in the country, and later banned the Myanmar military from Facebook and Instagram.
In Sri Lanka, people were able to automatically add hundreds of thousands of users to Facebook groups, exposing them to violence-inducing and hateful content. In Ethiopia, a nationalist youth militia successfully coordinated calls for violence on Facebook and posted other inflammatory content.
Facebook has invested significantly in technology to find hate speech in various languages, including Hindi and Bengali, two of the most widely used, Mr. Stone said. He added that Facebook had reduced the amount of hate speech that people see globally by half this year.
“Hate speech against marginalized groups, including Muslims, is on the rise in India and globally,” Mr. Stone said. “So we are improving enforcement and are committed to updating our policies as hate speech evolves online.”
“There is definitely a question about resourcing” for Facebook outside the United States, said Katie Harbath, who spent 10 years at Facebook as a director of public policy and worked directly on securing India’s national elections. She said Facebook needed to find a solution that could be applied to countries around the world.
Facebook employees have run various tests and field studies in India for several years. That work increased ahead of India’s 2019 national elections. In late January of that year, a handful of Facebook employees traveled to the country to meet with colleagues and speak with dozens of local Facebook users.
According to a memo written after the trip, one of the key requests from users in India was that Facebook “take action on types of misinfo that are connected to real-world harm, specifically politics and religious group tension.”
Ten days after the researcher opened the fake account to study misinformation, a suicide bombing in the disputed border region of Kashmir set off a spike in accusations, misinformation and conspiracies between Indian and Pakistani nationals.
After the attack, anti-Pakistan content began to circulate in the Facebook-recommended groups the researcher had joined. Many of the groups, she noted, had tens of thousands of users. A separate Facebook report, published in December 2019, found that Indian Facebook users tended to join large groups, with the country’s median group size at 140,000 members.
Graphic posts, including one showing the beheading of a Pakistani national and dead bodies wrapped in white sheets on the ground, circulated in the groups she followed.
After the researcher shared her case study with co-workers, her colleagues commented on the posted report that they were concerned about misinformation about the upcoming elections in India.
Two months later, after India’s national elections had begun, Facebook put in place a series of steps to stem the flow of misinformation and hate speech in the country, according to an internal case study of the election.
The case study painted an optimistic picture of Facebook’s efforts, including adding more fact-checking partners (the third-party network of outlets with which Facebook works to outsource fact-checking) and increasing the amount of misinformation it removed. It also noted how Facebook had created a “political whitelist to limit PR risk,” essentially a list of politicians who received a special exemption from fact-checking.
The study did not note the immense problem the company faced with bots in India, nor issues like voter suppression. During the election, Facebook saw a spike in bots, or fake accounts, linked to various political groups, as well as efforts to spread misinformation that could have affected people’s understanding of the voting process.
In a separate report produced after the elections, Facebook found that over 40 percent of top views, or impressions, in the Indian state of West Bengal were “fake/inauthentic.” One inauthentic account had amassed more than 30 million impressions.
A report published in March 2021 showed that many of the problems cited during the 2019 elections persisted.
In the internal document, called Adversarial Harmful Networks: India Case Study, Facebook researchers wrote that there were groups and pages “replete with inflammatory and misleading anti-Muslim content” on Facebook.
The report said there were a number of dehumanizing posts comparing Muslims to “pigs” and “dogs,” and misinformation claiming that the Quran, the holy book of Islam, calls for men to rape their female family members.
Much of the material circulated in Facebook groups promoting Rashtriya Swayamsevak Sangh, an Indian right-wing nationalist group. The groups took issue with an expanding Muslim minority population in West Bengal and near the Pakistani border, and published posts on Facebook calling for the ouster of Muslim populations from India and promoting a Muslim population control law.
The report noted that Facebook knew such harmful posts proliferated on its platform and that it needed to improve its “classifiers,” the automated systems that detect and remove posts containing violent and inciting language. Facebook also hesitated to designate the R.S.S. as a dangerous organization because of “political sensitivities” that could affect the social network’s operation in the country.
Of India’s 22 officially recognized languages, Facebook said it had trained its A.I. systems on five. (It said it had human reviewers for some of the others.) But in Hindi and Bengali, it still did not have enough data to adequately police the content, and much of the content targeting Muslims “is never flagged or actioned,” the Facebook report said.
Five months ago, Facebook was still struggling to efficiently remove hate speech against Muslims. Another company report detailed efforts by Bajrang Dal, an extremist group linked to the Hindu nationalist Bharatiya Janata Party, to publish anti-Muslim narratives on the platform.
Facebook is considering designating the group as a dangerous organization because it is “inciting religious violence” on the platform, the document showed. But it has not yet done so.
“Join the group and help to run the group; increase the number of members of the group, friends,” said one post on Facebook seeking recruits to spread Bajrang Dal’s messages. “Fight for truth and justice until the unjust are destroyed.”
Ryan Mac, Cecilia Kang and Mike Isaac contributed reporting.