Facebook did not contribute to the January 6 riot, memo says

“We will continue to face scrutiny – some of it fair and some of it unfair,” he said in the note. “But we should also continue to hold our heads up high.”

Here is the full text of Mr. Clegg’s note:

Our position on polarization and elections

You will have seen the series of articles about us in the Wall Street Journal in recent days, and the public interest they have provoked. This Sunday night, the former employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the piece is likely to assert that we contribute to polarization in the United States, and that the extraordinary steps we took for the 2020 elections were relaxed too soon and contributed to the horrific events of January 6 in the Capitol.

I know some of you – especially those of you in the United States – are going to get questions from friends and family about these things, so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these crucial areas.

Facebook and Polarization

People are understandably anxious about the divisions in society and are looking for answers and ways to fix them. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it is natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization is not supported by the facts – as Chris and Pratiti set out in their note on this issue earlier this year.

The rise of polarization has been the subject of serious academic research in recent years. In truth, there is not a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.

The increase in political polarization in the United States predates social media by decades. If it were true that Facebook were the chief cause of polarization, we would expect to see it rising wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use even as it has risen in the United States.

Specifically, we expect the reporting to suggest that a change to Facebook’s News Feed ranking algorithm was responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI), so that you would see more content from friends, family and groups you are part of in your News Feed. This change was heavily driven by internal and external research showing that meaningful engagement with friends and family on our platform was better for people’s wellbeing, and we have refined and improved it over time, as we do with all ranking metrics. Of course, everyone has a rogue uncle or an old school classmate who holds strong or extreme views we disagree with – and the change meant you were more likely to come across their posts too. Even so, we have developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.

But the simple fact remains that changes to the algorithmic ranking systems of one social media platform cannot explain wider societal polarization. Indeed, polarizing content and misinformation are also present on platforms that have no algorithmic ranking at all, including private messaging apps such as iMessage and WhatsApp.

Elections and Democracy

There is perhaps no other topic we have been more vocal about as a company than our work to dramatically change the way we approach elections. Starting in 2017, we began building new defenses, bringing in new expertise, and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.

Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts – identifying almost all of them before anyone flagged them to us. And from March through Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the United States for violating our voter interference policies.

Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented so-called “break glass” measures – and spoke publicly about them – well before January 6, in order to respond to specific and unusual signals we were seeing on our platform and to keep potentially violating content from spreading before our content reviewers could assess it against our policies.

These measures were not without trade-offs: they are blunt instruments designed to deal with specific crisis scenarios, like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood. In implementing them, we know we impacted significant amounts of content that did not violate our rules, in order to prioritize people’s safety during a period of extreme uncertainty. For example, we limited the distribution of live videos that our systems predicted might relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also affected a lot of entirely normal and reasonable content, including some that had nothing to do with the election. We would not take this kind of crude, catch-all measure in normal circumstances, but these were not normal circumstances.

We only rolled back these emergency measures, based on careful data-driven analysis, when we saw a return to more normal conditions. We left some of them in place for a longer period, through February of this year, and others, such as not recommending civic, political or new groups, we have decided to retain permanently.

Fighting hate groups and other dangerous organizations

I want to be absolutely clear: we work to limit hate speech, and we have clear policies prohibiting content that incites violence. We do not profit from polarization; in fact, just the opposite. We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms. And we remove content that praises or supports hate groups, terrorist organizations and criminal groups.

We have been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election. And our work to crack down on these hate groups was years in the making. We took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original #StopTheSteal group, and removed references to Stop the Steal in the run-up to the inauguration. In 2020 alone, we removed more than 19 million pieces of content that violated our policies on terrorism and organized hate. We designated the Proud Boys as a hate organization in 2018, and we continue to remove praise, support and representation of them. Between August of last year and January 12 of this year, we identified nearly 900 militia organizations and removed thousands of pages, groups, events, Facebook profiles and Instagram accounts associated with these groups.

This work will never be complete. There will always be new threats and new problems to address, in the United States and around the world. That is why we remain vigilant and alert – and always will.

That is also why the suggestion sometimes made that the violence on January 6 would not have happened were it not for social media is so misleading. To be clear, the responsibility for those events rests squarely with the perpetrators of the violence and with those who actively encouraged them. Mature democracies in which social media use is widespread hold elections all the time – for instance, Germany’s election last week – without violence. We actively share with law enforcement any material we can find on our services related to these traumatic events. But reducing the complex causes of polarization in America – or the insurrection specifically – to a technological explanation is woefully simplistic.

We will continue to face scrutiny – some of it fair and some of it unfair. We will continue to be asked difficult questions. And many people will continue to be skeptical of our motives. That is what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. We are not perfect and we do not have all the answers. That is why we do the sort of research that has been the subject of these stories in the first place. And we will keep looking for ways to respond to the feedback we hear from our users, including testing ways to make sure political content does not take over their News Feeds.

But we should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people’s lives. And you have every reason to be proud of that work.
