Facebook Whistleblower Sophie Zhang Shares Her Story

In September 2020, BuzzFeed News reported on a former Facebook employee’s whistleblowing memo about the company’s handling of artificial accounts. Sophie Zhang worked as a data scientist on Facebook’s fake engagement team for three years. In her memo, she described finding “evidence of coordinated campaigns of varying sizes to boost or hinder political candidates or outcomes,” according to BuzzFeed. Zhang alleged that fake accounts were being used to sway public opinion about political candidates in several countries, including “India, Ukraine, Spain, Brazil, Bolivia, and Ecuador.”

On April 12, 2021, Zhang spoke with the Guardian about her attempts to address the issue internally, the fake account activity she witnessed on Facebook, and the aftermath of her disclosures and her work on the fake engagement team. “A lot of the time it felt like trying to empty the ocean with an eyedropper,” Zhang said of her work on the team.

“What we have seen is that multiple national presidents believe that this activity is sufficiently valuable for their autocratic ambitions that they feel the need to do it so blatantly that they aren’t even bothering to hide,” Zhang said in her interview with the Guardian. In the article, she details how she first discovered fake account activity linked to governments around the world and tried to raise the issue internally. “I tried to fix this problem within Facebook … I spoke to my manager, my manager’s manager, different teams, and everyone up to a company vice-president in great detail,” Zhang said. “I repeatedly tried to get people to fix things… I offered to stay on for free after they fired me, and they said no. I hoped that when I made my departure post it might convince people to change things, but it hasn’t.”

Zhang first noticed fake activity on Facebook in 2018, about six months after she started working at the company. She saw that content posted by Honduran President Juan Orlando Hernández “was amassing large numbers of fake likes” and discovered that the fake engagement stemmed from fake Pages designed to look like user accounts, complete with “names, profile pictures and job titles.” The Guardian reports: “One individual was the administrator for hundreds of those fake Pages, as well as for the official Pages of both Hernández and his late sister, who had served as communications minister.”

Zhang saw that fake engagement was being directed at political, or “civic,” targets, as Facebook called them. “The most blatant example was Hernández, who was receiving 90% of all the known civic fake engagement in Honduras as of August 2018,” the article states. This kind of fake engagement “fell into a serious loophole in the company’s rules.” Facebook requires users to use their “real” names on their accounts, but Facebook Pages, “which can perform many of the same engagements that accounts can, including liking, sharing and commenting,” are not bound by that rule. According to the article, Facebook “initially resisted” labeling the Honduran activity “coordinated inauthentic behavior” (CIB) because of this loophole. Facebook had coined the term CIB to ban the kind of behavior seen during the 2016 U.S. election, when Russia’s Internet Research Agency “set up Facebook accounts purporting to be Americans and used them to manipulate individuals and influence political debates.”

Zhang thought that “once she alerted the right people to her discovery, the Honduras network would be investigated and the fake Pages loophole would be closed.” However, messages from various teams led her to believe that “Honduras was simply not a priority.” A manager working on the civic integrity team told Zhang, “I don’t think Honduras is big on people’s minds here,” the Guardian reports.

Faced with the company’s inaction, Zhang posted about Hernández and the fake engagement in a group for Facebook’s “election integrity core team” in March 2019, stating that Facebook was knowingly allowing the conduct to continue. Investigators found merit in Zhang’s suspicions in June 2019, and a month later Facebook announced it was taking down “181 accounts and 1,488 Facebook Pages that were involved in domestic-focused coordinated inauthentic activity in Honduras.”

Zhang felt encouraged by the action Facebook took: the day after the announcement, “she filed an escalation within Facebook’s task management system to alert the threat intelligence team to a network of fake accounts supporting a political leader in Albania.” Over the course of her time at Facebook, Zhang continued finding evidence of fake activity and filing escalations “for suspicious networks in Azerbaijan, Mexico, Argentina and Italy.” Later, “she added networks in the Philippines, Afghanistan, South Korea, Bolivia, Ecuador, Iraq, Tunisia, Turkey, Taiwan, Paraguay, El Salvador, India, the Dominican Republic, Indonesia, Ukraine, Poland, and Mongolia.”

The article chronicles Zhang’s ongoing fight to push Facebook to act against Pages engaging in this activity, up until her termination in mid-August 2020. According to the Guardian, Zhang “was fired for poor performance … a result of spending too much time focused on uprooting civic fake engagement and not enough time on the priorities outlined by management.”

“It shouldn’t have been my job, but at the end of the day, I was the only one who was effectively making any decisions regarding these cases,” she said. “Whether a network was taken down or not was effectively based on how much I chose to push it, how much I chose to yell at other people about it.”

Zhang still grapples with the work she was trying to do at Facebook and its real-world implications. “I still have trouble sleeping at night, sometimes,” she told the Guardian. “It was just very overwhelming and frustrating because, frankly, I should never have had this much responsibility and power.” On her last day at the company, she left notes about more “suspicious accounts” in numerous countries, hoping that other employees “would pick up her work after her departure.”

Facebook and Whistleblowers

Whistleblower advocates have long pressured Facebook to amend its policies regarding content posted on the site. In February, the Alliance to Counter Crime Online (ACCO) published an open letter “urging Attorneys General from 48 states to expand antitrust lawsuits against Facebook to address the platform’s role in facilitating online crime,” according to previous WNN reporting. ACCO provided the Attorneys General with whistleblower complaints filed with the U.S. Securities and Exchange Commission (SEC) that “document allegations that Facebook fails to properly police an assortment of illegal activity on its site, including the sale of opioids and the spread of terror and hate content.”

The whistleblower nonprofit National Whistleblower Center (NWC) has also addressed online hate content and Facebook’s lack of action. In June 2020, NWC helped a whistleblower file a supplement to a petition alleging that Facebook was not removing “content from hate and terrorist groups” and instead “was assisting such groups through its algorithms and its auto-generation of web pages for such groups, effectively assisting them with networking and recruiting.” NWC is advocating for Facebook to stop misleading shareholders about terror and hate content on the site.

Facebook clearly has an issue with regulating harmful content on its site, but its failure to listen to whistleblowers is serious, too. Zhang tried to alert individuals inside Facebook and address the real-world consequences of fake engagement, but was fired for her work. Facebook needs to change the way it treats employees who bring important issues to light, as well as the way it manages illicit content on its site.

Read the Guardian’s article featuring Zhang here.

Read BuzzFeed’s article about Zhang’s memo here.

Read WNN’s coverage of Zhang’s memo here.
