Frances Haugen spent 15 years working for some of the biggest social media companies in the world, including Google, Pinterest and, until May, Facebook.
Haugen left Facebook of her own free will and walked away with thousands of pages of internal research and communications that she shared with the Securities and Exchange Commission. 60 Minutes obtained the documents from a Congressional source.
In her first interview, aired Sunday, Haugen spoke to 60 Minutes correspondent Scott Pelley about what she called “systemic” issues with the platform’s ranking algorithm that led to the amplification of “angry content” and division. The proof, she said, is in the company’s own internal research.
“Facebook’s mission is to connect people around the world,” Haugen said. “When you have a system that you know can be hacked with anger, it’s easier to anger people. And publishers say, ‘Oh, if I do more angry, polarizing, and divisive content, I earn more money.’ Facebook has a system of incentives in place that separates people.”
Haugen said Facebook changed its algorithm in 2018 to promote what it calls “meaningful social interactions” through “engagement-based ranking.” She explained that content that draws engagement, such as reactions, comments and shares, is distributed more widely.
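At a high level, engagement-based ranking of the kind Haugen describes can be sketched as a weighted score over a post’s interaction signals, with heavier interactions (comments, shares) counting more than lightweight ones (reactions). The weights, field names, and numbers below are illustrative assumptions for the purpose of the sketch, not Facebook’s actual formula.

```python
# Illustrative sketch of engagement-based ranking.
# The weights and post fields are hypothetical, not Facebook's real MSI formula.

def engagement_score(post, weights=None):
    """Score a post by weighting its engagement signals.

    `post` is a dict of engagement counts; comments and shares are
    weighted more heavily than reactions, mirroring the idea that
    content which provokes interaction gets distributed more widely.
    """
    weights = weights or {"reactions": 1, "comments": 5, "shares": 10}
    return sum(weights[k] * post.get(k, 0) for k in weights)

def rank_feed(posts):
    """Order candidate posts by descending engagement score."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "a", "reactions": 100, "comments": 2, "shares": 0},   # score 110
    {"id": "b", "reactions": 10, "comments": 12, "shares": 6},   # score 130
]
ranked = rank_feed(posts)
```

Under this toy scoring, post “b” outranks post “a” despite far fewer reactions, because comments and shares dominate the score — the dynamic Haugen argues rewards provocative content.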
Haugen said some of Facebook’s own research has found that “angry content” is more likely to elicit engagement, which content producers and political parties are aware of.
“One of the most shocking pieces of information I pulled from Facebook, that I think is critical to this disclosure, is that political parties have been quoted, in Facebook’s own research, as saying, ‘We know you’ve changed the way you choose the content that goes into the home feed,’” Haugen said. “‘And if we don’t post angry, hateful, polarizing, divisive content — crickets. We don’t get anything. And we don’t like it. We know our constituents don’t like it. But if we don’t do these stories, we don’t get distributed. And so before, we used to do very little of it, and now we have to do a lot, because if we don’t get traffic and engagement, we’re going to lose our jobs.’”
Facebook declined an on-camera interview with 60 Minutes. The company told 60 Minutes it conducted internal and external research before changing its algorithm.
“The goal of the meaningful social interactions ranking change is in the name: to improve people’s experience by prioritizing posts that inspire interactions, especially conversations, between family and friends — which research shows is better for people’s well-being — and by deprioritizing public content,” said Lena Pietsch, Facebook’s director of policy communications, in a statement to 60 Minutes. “Research also shows that polarization has been increasing in the United States for decades, long before platforms like Facebook even existed, and that it is decreasing in other countries where internet and Facebook use has increased. We have our part to play and will continue to make changes consistent with the goal of making people’s experiences more meaningful, but blaming Facebook ignores the deeper causes of these problems — and the research.”
The company said it continues to make changes to its platform in an attempt to “make people’s experience more meaningful” and is conducting “new tests to reduce political content on Facebook based on research and feedback.”
Haugen, a 37-year-old data scientist with an MBA from Harvard, is due to testify before Congress this week.
THE FOREIGN IMPACT OF FACEBOOK
Facebook is one of the largest internet platforms in the world. It has 2.8 billion global users who represent about 60% of the people connected to the Internet on earth.
Despite its massive reach, former employee-turned-whistleblower Frances Haugen told 60 Minutes that the company does not offer the same security systems for every language on the platform or country where Facebook is used.
“It’s really important to remember that Facebook makes a different amount of money in each country in the world,” Haugen said. “Every time Facebook expands into one of these new language areas, it costs as much, if not more, to build the safety systems for that language as it did for English or French. Because each new language costs more money, but there are fewer and fewer customers. And so the economics just don’t make sense for Facebook to be safe in a lot of these parts of the world.”
Facebook told 60 Minutes that it works with 80 independent third-party fact-checkers who review content in 60 languages.
“Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately bad for our business,” said Pietsch of Facebook. “Our motivation is to provide a safe and positive experience for the billions of people who use Facebook. That is why we have invested so much in safety and security.”
BAD INFORMATION AND HATE CONTENT
In August, Facebook touted its enforcement of rules against disinformation and hate speech related to COVID-19. The company released a public report saying it had deleted 3,000 accounts, pages and groups for violating its rules against spreading COVID-19 disinformation. Facebook also said it removed 20 million pieces of false information about COVID-19 from the platform, and that removal of hate speech content has increased 15-fold since the company began reporting it.
Former employee Frances Haugen believes Facebook isn’t telling the whole story in its transparency reports.
“We don’t have independent transparency mechanisms that allow us to see what Facebook is doing internally,” Haugen told Scott Pelley. “And we’ve seen things like the community standards enforcement report, where, when Facebook is allowed to grade its own homework, it chooses metrics that are in its own interest. And the implication is that they can say we catch 94% of hate speech, while their internal documents say we catch 3-5% of hate speech. We can’t govern that.”
In a statement to 60 Minutes, Facebook said it has devoted many resources to the safety of people. The company says it has 40,000 people working on safety and security and has invested $13 billion in such measures over the past six years.
“We have invested heavily in people and technology to keep our platform secure, and have made tackling disinformation and providing authoritative information a priority,” Facebook’s Pietsch told 60 Minutes. “If any research had identified an exact solution to these complex challenges, the tech industry, governments and society would have solved them a long time ago. We take input from experts and organizations to inform changes to our apps.”
Haugen: Facebook must declare “moral bankruptcy”
Before leaving Facebook, Frances Haugen worked in the social media platform’s Civic Integrity unit, which she said was responsible for making sure the company was “a good force in society.”
Haugen described her work as part of an understaffed counterintelligence team combating foreign actors using the platform for malicious purposes.
“Our team at one point could only work on a third of the cases we had,” Haugen told 60 Minutes. “A third of them. Like, we literally had to sit down and make a list and say, ‘Who are we really going to protect?’ And there’s no reason we had to do that. We could have had, you know, two, three, ten times as many people. And we intentionally didn’t build detection systems, because we already couldn’t handle the cases we had.”
Facebook told 60 Minutes that it is constantly improving to eliminate and suppress hate speech.
“Every day, our teams must balance protecting the right of billions of people to speak out openly with the need to keep our platform a safe and positive place,” said Pietsch of Facebook. “We continue to make significant improvements to combat the spread of disinformation and harmful content. To suggest that we promote bad content and do nothing is just not true.”
Haugen said she believes the social media giant should declare “moral bankruptcy” and level with the public about its past failures.
“The reason I came forward is because Facebook is struggling,” Frances Haugen told 60 Minutes. “They hid information… And we don’t have to solve the problems alone, we have to solve them together. And that’s why I came forward.”