Facebook Whistleblower Details Meta’s Disregard for Teen Distress


Arturo Bejar, a former senior engineer and product leader at Facebook, testified before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law on November 7, discussing teens’ experiences with online harassment, insufficient reporting mechanisms, and recommendations for how to pressure Meta to reform.

His testimony was grounded in his time as an employee and in what he has seen as the father of a teenage girl.

He implored Congress: “It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse.”

From 2009 to 2015, Bejar ran Facebook’s Protect & Care team, overseeing engineering, product, user research, data, and design. The team focused on three core issues:

  1. Site integrity: stopping attacks and malicious behavior
  2. Security infrastructure: compliance improvement and engineered systems
  3. Care: user interface and internal customer care tools

Bejar reported to the CTO, who reported directly to Meta CEO Mark Zuckerberg.

The team built and tested response surveys for users reporting inappropriate content and harassment, and designed a reporting flow that teens genuinely engaged with, giving the team a clearer picture of negative user experiences.

When he returned to Instagram in 2019 as an independent consultant on “well-being,” he found that the tools his team had built for teenagers to get support for bullying and harassment were no longer available: “Almost all of the work that I and my colleagues had done during my earlier stint at Facebook through 2015 was gone.”

His new team had gathered troubling evidence that teens were experiencing distress and abuse on the platform.

He found that 51% of Instagram users surveyed reported having a negative experience on the platform each week. Only 1% of those users reported the offending content, and only 2% of those reports resulted in the content being taken down.

The internal research team documented staggering levels of abuse experienced weekly by teens aged 13-15.

“Looked at over time, it is likely the largest-scale sexual harassment of teens to have ever happened,” he told the Committee, “one that demands action.”

He reported this information to Mark Zuckerberg, Adam Mosseri, and Sheryl Sandberg via email on October 5, 2021. Sandberg expressed sympathy, Mosseri asked for a follow-up, and Zuckerberg did not respond at all. This was out of character for Zuckerberg, who had usually responded to Bejar or asked for follow-ups.

Two years after this issue was brought to the executives’ attention, there is still no way for a minor to flag a conversation on Instagram as containing unwanted sexual advances.

Bejar’s observations reflect a practice noted by prior whistleblowers: the repeated public misrepresentation of harm that users experience.

Repeated examples of harm enabled by Meta and other companies have been publicized over the past few years through whistleblowing.

Whenever such reports emerge, Meta’s response is to talk about ‘prevalence’ and its investment in moderation and policy, as if those were the only relevant issues. But there is a material gap between Meta’s narrow definition of prevalence and the actual distressing experiences that its products enable.

Managers, Zuckerberg included, have repeatedly ignored mechanisms of harm brought to their attention, instead downplaying published findings and the results of internal research. They have a record of obfuscating the situation by quoting statistics that are irrelevant to the issues at hand.

Bejar argues that alongside disregarding the distress of users, the way Meta responds to these problems often makes them worse, normalizing harmful behavior and encouraging unwanted contact and content.

Social media companies are not going to start addressing the harm they cause on their own. They must be pressured and compelled by regulators and policymakers to be transparent about these harms and how they are addressing them.

To catalyze action, Bejar recommends that public earnings calls include a report on the percentage of teenagers who experienced unwanted sexual advances in that quarter; having to disclose such high numbers would expose how little accountability currently exists.

Having to publicize the information themselves would ideally galvanize the platform into introducing features that enable users to better address those harms.

“My goal in all this, as a father and as an engineer who has worked on these problems for many years, is to help regulators, policymakers, academics, journalists, and the public better understand how companies think about these problems and what it would take to address them. I also want to create a meaningful increase in support for integrity and trust and safety workers at the companies.”

Bejar sees his recommendations as a necessary start to the work that is needed to create a safer, more accountable online environment.

“I believe the only way this can happen is through regulation imposed from the outside. Meta has consistently demonstrated that it will not address these issues on its own.”
