Facebook wrestles with the features that define social networking

SAN FRANCISCO – In 2019, Facebook researchers began a new study of one of the social network's foundational features: the Like button.

They looked at what people would do if Facebook removed the distinct thumbs-up icon and other emoji reactions from posts on Instagram, its photo-sharing app, according to company documents. The buttons sometimes caused Instagram's youngest users "stress and anxiety," the researchers found, especially if posts didn't get enough likes from friends.

But the researchers discovered that when the Like button was hidden, users interacted less with posts and ads. At the same time, hiding likes did not ease teenagers' social anxiety, and younger users did not share more photos as the company had hoped, making for a mixed bag of results.

Mark Zuckerberg, Facebook's chief executive, and other managers discussed hiding the Like button for more Instagram users, according to the documents. In the end, a larger test was rolled out in only a limited way, in part to "build a positive press narrative" around Instagram.

The research on the Like button was an example of how Facebook has questioned the fundamental features of social networking. As the company has confronted crisis after crisis over misinformation, privacy and hate speech, a central issue has been whether the basic way the platform works is at fault: essentially, the features that made Facebook so popular.

Along with the Like button, Facebook has examined its share button, which lets users instantly spread content posted by other people; its groups feature, which is used to form digital communities; and other tools that shape how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underlines how often the company has grappled with what it created.

What the researchers found was often negative. Time and again, they determined that people misused key features or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook's "core product mechanics," meaning the basics of how the product functioned, that had let misinformation and hate speech flourish on the site.

"The mechanics of our platform are not neutral," they concluded.

The documents, which include slide decks, internal discussion threads, charts, memos and presentations, do not show what actions Facebook took after receiving the findings. In recent years, the company has changed some features, making it easier for people to hide posts they do not want to see and turning off political group recommendations to reduce the spread of misinformation.

But the core way Facebook operates, as a network where information can spread rapidly and where people can accumulate friends and followers and likes, ultimately remains largely unchanged.

Some current and former executives said many significant changes to the social network had been blocked in the service of growth and keeping users engaged. Facebook is worth more than $900 billion.

"There is a gap at Facebook as an employee," said Brian Boland, a Facebook vice president who left last year. "Actually making changes is much harder."

The company documents are part of the Facebook Papers, a cache provided to the Securities and Exchange Commission and to Congress by a lawyer representing Frances Haugen, a former Facebook employee turned whistle-blower. Ms. Haugen had earlier given the documents to The Wall Street Journal. This month, a congressional staff member supplied them to more than a dozen other news organizations, including The New York Times.

In a statement, Andy Stone, a Facebook spokesman, criticized articles based on the documents, saying they rested on a "false premise."

"Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or well-being misunderstands where our own commercial interests lie," he said. He added that Facebook had invested $13 billion and hired more than 40,000 people to work on keeping users safe.

In a post this month, Mr. Zuckerberg said advertisers did not want to buy ads on a platform that spreads hate and misinformation.

"At the most basic level, I think most of us just don't recognize the false picture of the company that is being painted," he wrote.

When Mr. Zuckerberg founded Facebook 17 years ago in his Harvard dorm room, the site's purpose was to connect people on college campuses and bring those with common interests and locations together in digital groups.

Growth exploded in 2006 when Facebook introduced the News Feed, a stream of photos, videos and status updates posted by users' friends. Over time, the company added more features to keep people coming back and spending time on the platform.

In 2009, Facebook introduced the Like button. The tiny thumbs-up icon, a simple indicator of people's preferences, became one of the social network's most important features. The company also let other websites adopt the Like button so users could share their interests back to their Facebook profiles.

That helped Facebook understand people's activities and sentiments outside its own site, so it could better target them with advertising. Likes also signaled what users wanted to see more of in their News Feeds, so that people would spend more time on Facebook.

Facebook also added groups, private communication channels people could join to talk about specific interests, and pages, which let businesses and celebrities amass large fan bases and broadcast messages to those followers.

Another innovation was the share button, which people used to quickly repost others' photos, videos and messages to their own News Feed or elsewhere. An automated recommendation system also suggested new groups, friends or pages for people to follow, based on their previous online behavior.

But the features had side effects, according to the documents. Some people used likes to compare themselves with others. Others exploited the share button to spread information quickly, so false or misleading content could go viral in seconds.

Facebook has said its internal research is meant partly to identify issues that can be fixed to make its products safer. Adam Mosseri, the head of Instagram, has said that research on users' well-being led to investments in anti-bullying measures on Instagram.

But Facebook cannot simply tweak itself into being a healthier social network when so many of the problems trace back to its core features, said Jane Lytvynenko, a senior fellow at the Harvard Kennedy School's Shorenstein Center, who studies social networks and misinformation.

"When we talk about the Like button, the share button, the News Feed and their power, we're essentially talking about the infrastructure that the network is built on top of," she said. "The crux of the problem here is the infrastructure itself."

As Facebook's researchers dug into how its products worked, the worrisome results piled up.

In a July 2019 study of groups, researchers traced how members of those communities could be targeted with misinformation. The starting point, the researchers said, were people known as "invite whales," who sent invitations to others to join a private group.

These people were effective at getting thousands to join new groups, so that the communities ballooned almost overnight, the study found. It also found that the invite whales could then spam the groups with posts promoting ethnic violence or other harmful content.

Another 2019 report looked at how some people accrued large followings on their Facebook pages, often using posts about cute animals and other innocuous topics. But once a page had grown to tens of thousands of followers, its founders sometimes sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the study.

As researchers studied the Like button, the documents show, executives also considered hiding the feature on Facebook itself. In September 2019, the company removed likes from users' Facebook posts in a small experiment in Australia.

The company wanted to see whether the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.

But people did not share more posts once the Like button was removed, and Facebook chose not to roll the test out more broadly.

Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button, along with the News Feed's automatically generated clusters of posts already shared by people's friends, was "designed to attract attention and encourage engagement."

But left unchecked, those features could "serve to amplify bad content and sources," such as bullying and borderline posts, the researcher said.

That is because the features made people less hesitant about sharing posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from those reshare units, the researcher said.

One post that spread widely that way was an undated message from an "Angry Patriot" account. The post notified users that people were targeting a police station in Portland, Ore. Hundreds of hate-filled comments flooded the reshares. It was an example of "hate bait," the researcher said.

A common theme in the documents is Facebook employees arguing for changes to how the social network works, and often blaming executives for standing in the way.

In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests pages and groups for people to follow, writing that it "can very quickly lead users down the path to conspiracy theories and groups."

"Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms," the researcher wrote. "During the time that we've hesitated, I've seen folks from my hometown go further and further down the rabbit hole" of conspiracy movements like QAnon and anti-vaccination and Covid-19 conspiracy theories.

"It has been painful to observe," the researcher added.

Reported by Davey Alba, Sheera Frenkel, Cecilia Kang and Ryan Mac.
