Facebook “tears our societies apart”: key excerpts from a whistleblower

Frances Haugen’s interview with the US news program 60 Minutes contained a litany of damning statements about Facebook. Haugen, a former Facebook employee who joined the company to help it fight disinformation, told CBS that the tech company prioritizes profit over safety and “is tearing our societies apart.”

Haugen will testify in Washington on Tuesday, as political pressure intensifies on Facebook. Here are some key excerpts from Haugen’s interview.

Prioritizing profit over the public good

Haugen’s sharpest words echoed what is becoming a regular refrain from politicians on both sides of the Atlantic: that Facebook puts profit before the well-being of its users and the public.

“What I saw over and over at Facebook was that there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

She also accused Facebook of endangering public safety by reversing changes to its algorithm once the 2020 presidential election was over, allowing misinformation to spread on the platform again. “And as soon as the election was over, they turned them [the safety systems] back off, or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me.”

Facebook’s approach to safety compared with others

During a 15-year career as a tech professional, Haugen, 37, worked for companies including Google and Pinterest, but said Facebook had the worst approach to restricting harmful content. She said: “I’ve seen a bunch of social networks, and it was substantially worse at Facebook than anything I’d seen before.” Referring to Mark Zuckerberg, Facebook’s founder and chief executive, she said: “I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side effects of those choices are that hateful and polarizing content gets more distribution and more reach.”

Instagram and mental health


The most impactful of the leaked documents was a series of research slides showing that Facebook’s Instagram app was harming the mental health and wellbeing of some teenage users, with 32% of teenage girls saying it made them feel worse about their bodies.

She said: “And what’s super tragic is Facebook’s own research says that as these young women start consuming this eating-disorder content, they get more and more depressed. And that actually encourages them to use the app more. And so they find themselves in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it’s not just that Instagram is dangerous for teens, that it harms teens; it’s that it is significantly worse than other forms of social media.”

Facebook described the Wall Street Journal reports on the slides as a “misrepresentation” of its research.

Why Haugen leaked the documents

Haugen said “person after person” had tried to tackle Facebook’s problems but had been ground down. “Imagine you know what’s going on inside Facebook and you know no one on the outside knows. I knew what my future would look like if I continued to stay inside Facebook, which is person after person has tackled this inside Facebook and ground themselves down.”

Having joined the company in 2019, Haugen said she decided to act this year and began copying tens of thousands of documents from Facebook’s internal systems, which she says show that, despite public comments to the contrary, Facebook is not making significant progress in the fight against online hate and misinformation. “At some point in 2021, I realized, ‘OK, I’m going to have to do this systemically, and I have to get out enough that no one can question that this is real.’”

Facebook and violence


Haugen said the company had contributed to ethnic violence, a reference to Burma. In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence” in the country. Speaking on 60 Minutes, Haugen said: “When we live in an information environment that is full of angry, hateful, polarizing content, it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

Facebook and the Washington riot

The January 6 riot, when crowds of right-wing protesters stormed the Capitol, came after Facebook disbanded the civic integrity team of which Haugen was a member. The team, which focused on election-related issues around the world, was dispersed to other Facebook units after the US presidential election. “They told us, ‘We’re dissolving civic integrity.’ Like, they basically said, ‘Oh good, we made it through the election. There weren’t riots. We can get rid of civic integrity now.’ Fast forward a couple of months, we had the insurrection. And when they got rid of civic integrity, that’s when I was like, ‘I don’t trust that they’re willing to invest what needs to be invested to keep Facebook from being dangerous.’”

The 2018 algorithm change

In 2018, Facebook changed the algorithm behind its news feed – the platform’s central feature, which gives users a personalized stream of content such as friends’ photos and news – to prioritize content that increases user engagement. Haugen said this gave controversial content greater prominence.

“One of the consequences of how Facebook picks out that content today is that it optimizes for content that gets engagement, or reaction. But its own research shows that content that is hateful, that is divisive, that is polarizing – it’s easier to inspire people to anger than to other emotions.” She added: “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on fewer ads, and Facebook will make less money.”

Haugen said European political parties had reached out to Facebook to say the news feed change was forcing them to take more extreme political positions in order to win users’ attention. Describing the politicians’ concerns, she said: “You are forcing us to take positions that we don’t like, that we know are bad for society. We know that if we don’t take those positions, we won’t win in the social media marketplace.”

In a statement to 60 Minutes, Facebook said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true. If any research had identified an exact solution to these complex challenges, the tech industry, governments and society would have solved them a long time ago.”

