A former Facebook employee turned whistleblower urged Congress on Tuesday to regulate the social media giant because she said its products harm children and democracy.
“The company’s leadership knows ways to make Facebook and Instagram safer and won’t make the necessary changes because they have put their astronomical profits before people,” said Frances Haugen, a former Facebook data scientist who has provided reams of internal company records to news organizations and regulators.
“Congressional action is needed. [Facebook] cannot solve this crisis without your help … Facebook has not earned our blind faith,” Haugen testified during her highly anticipated testimony before the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security.
The subcommittee is examining allegations that Facebook’s own research revealed that Instagram — its photo-sharing platform — generated intense peer pressure on young users, particularly girls. As a result, those users suffered from serious mental health problems, with some reporting that Instagram intensified suicidal thoughts and eating disorders. The Wall Street Journal last month first reported on the company’s research, which it obtained from Haugen.
Haugen also alleged at the hearing that the company concealed internal research that shows the platform knowingly amplifies political unrest, misinformation and hate.
Facebook executives have disputed how media organizations have characterized the research, and have said the company works hard to ensure its platforms are safe and don’t spread misinformation.
The whistleblower’s testimony came as lawmakers and regulators have begun taking a harder look at how Facebook and other social media companies stoke division and propagate misinformation. Republicans have argued for years that Facebook and other social media companies suppress conservative views. The whistleblower’s appearance on Capitol Hill followed Monday’s massive, global outage of Facebook and two of its other platforms, Instagram and WhatsApp.
Tuesday’s hearing was a rare event on Capitol Hill where Democrats and Republicans seemed to agree on the extent of a problem, with both sides praising Haugen for coming forward to expose Facebook’s questionable practices.
Facebook is “facing a Big Tobacco moment, a moment of reckoning,” said Sen. Richard Blumenthal (D-Conn.), chairman of the subcommittee, referencing how tobacco companies were called to account for hiding research that proved their products were dangerous.
“Big Tech now faces that Big Tobacco jaw-dropping moment of truth,” said Blumenthal, who has called on the U.S. Federal Trade Commission and the Securities and Exchange Commission to investigate Haugen’s allegations.
The subcommittee’s ranking Republican, Sen. Marsha Blackburn of Tennessee, joined Blumenthal in sharply criticizing Facebook, insisting its executives “knew what they were doing. They knew what the violations were.”
“It is clear that Facebook prioritizes profits over the well-being of children and [all users],” Blackburn said.
Sen. Edward J. Markey (D-Mass.) took aim at Mark Zuckerberg, Facebook’s co-founder and chief executive, promising that lawmakers would take action to rein in the social media platform.
“Your time of invading our privacy, promoting toxic content and preying on children and teens is over,” he said. “Congress will be taking action. You can work with us or not work with us. But we will not allow your company to harm our children, and our families and our democracy any longer.”
Haugen testified that Facebook shut off tools designed to combat violent incitement and misinformation following President Biden’s election victory late last year. Turning off these safeguards contributed to the Jan. 6 insurrection at the U.S. Capitol, she said.
“Facebook changed those safety defaults in the run-up to the election because they knew they were dangerous,” Haugen said. “And, because they wanted that growth back, they wanted the acceleration of the platform back after the election, they returned to their original defaults.”
One of the safeguards required users to click on links before sharing them. Other platforms, such as Twitter, have found that this measure “significantly reduces misinformation” without violating users’ free speech, Haugen testified.
“No one is censored by being forced to click on a link before re-sharing it,” said Haugen, 37, who joined Facebook in 2019.
Nick Clegg, Facebook’s vice president of policy and public affairs, said Sunday on CNN’s “Reliable Sources” that “even with the most sophisticated technology, which I believe we deploy, even with the tens of thousands of people that we employ to try and maintain safety and integrity on our platform, we’re never going to be absolutely on top of this 100% of the time.”
Much of the hearing focused on how Instagram damages teens’ mental health and body image. For example, a key document shared by Haugen found that 32% of teen girls said that “when they felt bad about their bodies, Instagram made them feel worse.”
And despite feeling bad, it’s hard for these children to leave the app because they want the dopamine rush that comes from users liking their pictures, Haugen said.
“I feel a lot of pain for those kids,” Haugen said. “Imagine you’re in this relationship where every time you open the app it makes you feel worse. But you also fear isolation if you don’t” keep using it.
Last month, Facebook paused work on a controversial version of Instagram that targeted children ages 10 to 12. In a tweet on Sept. 27, Instagram head Adam Mosseri said he believed that building such an app is the “right thing to do” but the company needed “more time to speak with parents and experts working out how to get this right.”
One way to better regulate companies like Facebook, Haugen said, would be for lawmakers to amend Section 230 of the 1996 Telecommunications Act, which grants such websites broad legal protections and freedom to moderate user-generated content. Haugen said the section should be modified to enhance oversight of platforms’ use of algorithms that determine what fills users’ feeds.
Facebook “shouldn’t get a free pass on that because they’re paying for their profits right now with our safety,” Haugen testified.
Haugen laid the blame for Facebook’s policies on Zuckerberg.
“In the end, the buck stops with Mark,” she said. “There is no one currently holding Mark accountable but Mark himself.”
A Facebook spokeswoman, Lena Pietsch, said after the hearing that Haugen worked at the tech giant for less than two years and “had no direct reports, never attended a decision-point meeting with C-level executives.” Pietsch added that the company does not agree with “her characterization of the many issues she testified about.”
“Despite all this, we agree on one thing: It’s time to begin to create standard rules for the internet,” Pietsch said.