Facebook Whistleblower Frances Haugen Testifies Before Parliament

[Photo: House of Lords and Big Ben in London, United Kingdom]

On October 25, Facebook whistleblower Frances Haugen testified before the UK Parliament’s Joint Committee on the Draft Online Safety Bill. During the hearing, Haugen reaffirmed her allegations that Facebook has consistently made choices that prioritize profits over safety. She fielded questions from Members of Parliament about the draft bill, Facebook’s algorithm, children’s safety online, and the culture inside the company.

Background

Haugen testified in front of the U.S. Senate Subcommittee on Consumer Protection, Product Safety, and Data Security on October 5. The testimony followed an interview Haugen gave to CBS News on October 4 in which she provided information about Facebook’s handling of dangerous content on its platforms and its prioritization of profit over the safety of its users.

Haugen’s testimony rocked the world and continues to make headlines. She alleged that Facebook has continually made choices that keep its platforms less safe for users — especially for children and teens — and has failed to pursue non-content-based solutions that tackle issues intrinsic to Facebook’s systems. “The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won’t solve this crisis without your help,” Haugen told the Senate Subcommittee. She urged Congress to act and regulate Facebook and highlighted how the secrecy of Facebook’s own research prevents the public from knowing exactly how the algorithm works and understanding its dangers. “As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. Until the incentives change, Facebook will not change,” Haugen warned. Read a recap of Haugen’s Senate testimony here.

Facebook executives and founder and CEO Mark Zuckerberg have maintained that Haugen’s allegations are a misrepresentation of the truth. After Haugen’s October 5 testimony, Facebook’s Director of Policy Communications Lena Pietsch said, “We don’t agree with her characterization of the many issues she testified about.”

Zuckerberg posted a lengthy response to Haugen’s allegations the night of her Senate testimony. “Many of the claims don’t make any sense,” Zuckerberg said in the post, which he wrote to everyone at Facebook. He questioned Haugen’s critiques of Facebook’s conduct and flatly denied the claims that the company prioritizes profit over “safety and well-being.” He also contended that “[t]he argument that we deliberately push content that makes people angry for profit is deeply illogical.”

On October 18, another Facebook whistleblower, Sophie Zhang, testified before Parliament’s Joint Committee on the Draft Online Safety Bill. In her testimony, she discussed what she witnessed while working at Facebook and offered the Committee suggestions about the draft bill. Zhang worked on the fake engagement team within Facebook’s Site Integrity division, which focused on inauthentic activity on the platform, before being fired in 2020. In an internal exit memo, “Zhang alleged that while she worked at Facebook, she found multiple instances of governments in other countries attempting to sway public opinion and mislead their citizens using fake accounts,” previous WNN reporting states. Zhang’s memo “claimed that Facebook had a delayed reaction to addressing the inauthentic activity, and in a July interview with WNN, Zhang described how she tried to make changes within the system before leaving the company.”

Haugen’s Testimony Before Parliament

Each of the Members of Parliament expressed gratitude for Haugen’s appearance in front of the Committee. Damian Collins, the Committee Chair, told Haugen that he “respect[s] the personal decision you’ve taken to speak out on these matters with all the risks incumbent with speaking out against a multibillion-dollar corporation.”

“Part of why I came forward is that I am extremely, extremely worried about the condition of our societies, our condition of the Global South, and the interaction of the choices Facebook has made and how it plays out more broadly,” Haugen told the Committee. Throughout her testimony, she advocated for mandatory risk assessments and mandatory remediation strategies in order to hold Facebook and other tech companies accountable.

“If I were writing standards on risk assessments, a mandatory provision I would put in there is you need to do segmented analysis because the median experience on Facebook is a pretty good experience. The real danger is that 20% of the population has a horrible experience, or an experience that is dangerous,” Haugen explained. She advised the Committee that there should be a process “where it’s not just Facebook articulating harms, it’s also the regulator going out and collecting harms from other populations, and then turning back to Facebook and saying you need to articulate how you’re going to solve these problems.”

Another theme Haugen returned to multiple times during her testimony was language diversity in relation to Facebook’s safety systems. “I’m deeply concerned about their underinvestment in non-English languages, and how they mislead the public that they are supporting them. So Facebook says things like we support 50 languages, when in reality most of those languages get a tiny fraction of the safety systems that English gets,” Haugen said. She noted linguistic differences even between British English and American English, and two Members gave examples of times they had experienced hate and threats on Facebook in which differences in regional slang meant that Facebook initially did not flag the threatening phrases as dangerous. “Facebook should have to disclose dialectical differences,” Haugen advised.

“I came forward now because now is the most critical time to act…right now, the failures of Facebook are making it harder for us to regulate Facebook,” Haugen told the Committee. One of the Members asked about the January 6 attack on the U.S. Capitol and whether the way content is moderated on the site could precipitate more events like it. Haugen said she has “no doubt that the events we’re seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters, because engagement-based ranking does two things. One, it prioritizes and amplifies divisive, polarizing, extreme content and two, it concentrates it.”

Haugen also discussed how she tried to raise concerns within Facebook, as numerous Members were curious about the company’s internal culture. “I flagged repeatedly when I worked on civic integrity that I felt that critical teams were understaffed, and I was told at Facebook, we accomplish unimaginable things with far fewer resources than anyone would think possible. There is a culture that lionizes kind of a start-up ethic that is in my opinion irresponsible.” Haugen also stated that “Facebook is overwhelmingly full of conscientious, kind, empathetic people. Good people who are embedded in systems with bad incentives are led to bad actions. And there is a real pattern of people who are willing to look the other way are promoted more than people who raise alarms.”

The Members of Parliament asked clarifying questions about how the algorithm functions, and Haugen talked about how Facebook’s algorithms reward extreme content that induces anger instead of content that induces feelings of compassion. “The current system is biased towards bad actors and biased towards people who push people to the extremes,” she said.

When some Members pointed out that the current draft bill omits societal harm and paid-for advertising, Haugen expressed concern over those omissions. “We have to care about societal harm, not just for the global south but for our own societies,” Haugen said. Additionally, she said that she is “extremely concerned about paid-for advertising being excluded, because engagement-based ranking impacts ads as much as it impacts organic content.”

Other topics the Members of Parliament asked about included Facebook groups, safety measures for children online, end-to-end encryption, and the power of legislation and a regulator in incentivizing Facebook to make decisions that are more aligned with the public good. “I am incredibly excited and proud of the UK for taking such a world-leading stance with regard to thinking about regulating social platforms,” she told the Committee.

When Collins asked if Facebook was making hate worse, Haugen said, “Unquestionably it’s making hate worse.” Later in her testimony, she stated, “We’re not trying to kill Facebook, we’re trying to make the digital world better and safer for its users.”

Aftermath

After her testimony, Haugen tweeted that “[t]he past month has been a powerful, wild, and overwhelming experience. I am beyond grateful for the support of my community in navigating this complex conversation. Thank you for all your messages of encouragement and care.”

She continued, “As the chaos settles, I hope to use Twitter to speak in greater detail about the documents, their meaning, and my experience at Facebook. Please see me as a resource and perspective on the inside of an organization that has been quite opaque.”

The third tweet in the thread read, “To start, we must demand:

1) Privacy-concious [sic] mandatory transparency from Facebook

2) A reckoning wih [sic] the dangers of engagement-based ranking

3) Non-content-based solutions to these problems — we need tools other than censorship to keep the world safe.”

Watch Haugen’s testimony here.

Read about Haugen’s testimony to the U.S. Senate here.

Read more Facebook news on WNN here.
