Facebook Whistleblower Sophie Zhang Testifies Before Parliament


On October 18, Facebook whistleblower Sophie Zhang testified before the UK Parliament’s Joint Committee on the Draft Online Safety Bill. In her testimony, she described what she witnessed while working at Facebook and offered the committee suggestions on the draft bill.

Background on Zhang’s Story

In September 2020, BuzzFeed News reported on the internal exit memo Zhang wrote after being fired by Facebook. In the memo, she detailed what she found through her work on Facebook’s Site Integrity fake engagement team. Zhang alleged that while she worked at Facebook, she found multiple instances of governments in other countries using fake accounts to sway public opinion and mislead their own citizens. Her memo claimed that Facebook was slow to address the inauthentic activity, and in a July interview with WNN, Zhang described how she tried to make changes within the system before leaving the company.

In the wake of Facebook whistleblower Frances Haugen’s allegations about the tech giant’s handling of dangerous content on its platforms, Zhang’s story has once again caught the public’s attention. On October 11, about a week after Haugen testified before a Senate Subcommittee, Zhang wrote an article in The Guardian about her experience as a Facebook whistleblower.

In her article, Zhang also provided advice to other tech employees who might find themselves in a position to blow the whistle or raise concerns. “…whistleblowing is never straightforward. When I was deciding whether to speak out, I struggled to find guidance on the best way to go about it. If you’re in that position now, here’s my best advice on how to navigate the complicated path to becoming a whistleblower,” Zhang wrote.

Zhang’s appearance before Parliament on October 18, which took place via video call, centered on the UK’s Draft Online Safety Bill. The draft bill “established a new regulatory framework to tackle harmful content online,” according to the committee’s webpage. The 12-member Joint Select Committee is tasked with scrutinizing the draft bill. On October 14, the committee’s Twitter account announced Zhang’s upcoming testimony in a statement from its chair, Damian Collins MP.

Zhang’s Testimony

In her testimony, Zhang talked about what her work entailed as a data scientist at Facebook. She discussed the nature of inauthentic activity and how the relationship between fake accounts, misinformation, and hate speech is often misconstrued. Zhang also talked about the changing nature of how ideas spread: with social media, more opinions can be amplified than before, when news outlets acted as “gatekeepers.”

Zhang emphasized that social media algorithms, like the ones Haugen talked about in her testimony, weren’t in her area of expertise, but noted that these algorithms “create an incentive for people to write discussions that are sensationalist or attention drawing, or emotion grabbing. And one of the easiest ways to do that, sadly, is making bold claims that fall into the realm of misinformation or hate speech or the like.” One suggestion Zhang offered to the committee is “considering areas to decrease virality, such as requiring social media companies to use chronological newsfeeds, or potentially limiting the number of reshares, so that if someone on Facebook reshares a post, and then you look at the shared version and then share it as well, maybe after that you have to go to the original post to share it.”

Zhang also elaborated on the ways in which she tried to make change from within Facebook, stating that she “personally briefed” Facebook vice president of integrity Guy Rosen on a system of inauthentic accounts that were being used to influence citizens in Honduras. “The general trend that I would describe is that everyone agreed that the situation was terrible, but people were not convinced that this was worth giving more priority to whether Facebook should act, et cetera,” Zhang said of Facebook’s reaction to her concerns. “There was mostly agreement that this was terrible, but no agreement on what actions should be taken and how much of a priority it should be.”

Additionally, Zhang talked about Facebook’s use of artificial intelligence in moderating content and commented on the company’s priorities. “…I think it’s important to remember that Facebook is ultimately a company, its goal is to make money. And to the extent that it cares about protecting democracy, it’s because the people at Facebook are human and need to sleep at the end of the night. And also because if democracy is negatively impacted, that can create news articles, which impact Facebook’s ability to make money,” Zhang remarked.

Zhang provided suggestions about how Facebook should be regulated, such as requiring companies “to apply [their] policies consistently.” Another suggestion was “to try and independently verify” each platform’s efficacy in catching “bad activity” by having experts conduct tests on said platforms. She also proposed that companies be required “to provide data access to trusted researchers and provide funding for such researchers to have more independent verification,” while recognizing that this idea creates “some privacy risks.” Zhang was also asked to comment on the current proposed bill and the challenge of walking the line between protecting free speech and taking down dangerous content. In response, she circled back to her own work at Facebook and the way inauthentic activity can make a political figure look more popular than they actually are.

“Is Facebook being used as a tool by authoritarian governments in those countries? Yes, it is. Is Facebook used by the opposition in those countries to get their voices out? Yes, it also is,” Zhang said in a response to a question asking about Facebook’s role in undermining democracies.

“It was an honor to testify today to the @OnlineSafetyCom regarding the UK #OnlineSafetyBill,” Zhang tweeted after her appearance before Parliament. “One area I didn’t have a chance to cover is the possible banning of E2EE platforms by the bill for failing to prevent harm. I am strongly opposed to banning encrypted platforms in Britain[.]” E2EE, or end-to-end encryption, is a system in which only the sender and intended recipient of a message can read its contents.
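For readers unfamiliar with the concept, the sketch below illustrates the basic idea of end-to-end encryption. It is a minimal, hypothetical example using the PyNaCl library (chosen here only for illustration; it is not a tool Zhang referenced): the platform relaying the message only ever sees ciphertext, because the private keys needed to decrypt it never leave the users’ devices.

```python
# Minimal conceptual sketch of end-to-end encryption using PyNaCl (illustrative only).
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys stay on their own devices.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"Meet at noon.")

# Anything in the middle (e.g., the messaging platform) sees only ciphertext.
# Only the recipient, holding their private key, can decrypt it.
receiving_box = Box(recipient_key, sender_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"Meet at noon."
```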

Watch Zhang’s testimony or read a transcript here.

Read Zhang’s interview on WNN here.
