It’s Time for Facebook to be Sanctioned for Misleading Shareholders and the Public About Terror and Hate Speech on its Website
The Securities and Exchange Commission (SEC) now has all the information it needs to sanction Facebook for its dishonesty about terror and hate content on its website, thanks to a petition filed by a whistleblower working with the National Whistleblower Center (NWC). Today, the Associated Press published an explosive story describing and confirming the petition's key findings:
- Terror and hate speech and images are proliferating on Facebook
- Contrary to its assurances, Facebook has no meaningful strategy for removing this terror and hate content from its website
- Facebook is generating its own terror and hate content, which is being “Liked” by individuals affiliated with terrorist organizations
- Facebook is providing a powerful networking and recruitment tool to terrorist and hate groups
- Facebook has argued it is not a content provider, failing to disclose that it is generating terror and hate content
This anonymous whistleblower petition, first filed in January 2019 and updated in late April, raises important ethical and legal questions. First, what are the responsibilities of Facebook, with over 2.2 billion monthly users worldwide, for terrorist and hate group organizing and recruiting on its website? Second, why are whistleblowers, and whistleblower protections, needed to hold Facebook accountable for its failure to live up to these responsibilities?
Democracy at Risk
In the past six months, we have all become increasingly concerned about the future of democratic governance as attackers have targeted Jews worshipping in Pittsburgh and San Diego, Christians in Sri Lanka, and Muslims in New Zealand. Deadly terrorist attacks appear to be growing more frequent, and in virtually every case social media plays a prominent role in connecting terrorists with one another and helping them spread their propaganda of hate. A technology created to foster human connectedness is now spinning out of control, with democratically elected governments looking increasingly hapless in their efforts to stop the madness.
Recent history teaches us that as extremists gain ground and create an atmosphere of fear, authoritarian leaders, with easy pledges of stability, begin weakening democratic institutions.
The Central Role of Facebook
Any discussion of terror and hate content on social media must begin with Facebook, the world’s largest social networking website. Facebook serves as the primary communications vehicle for residents in many countries. For criminals seeking to perpetrate extreme violence and destabilize governments, Facebook is invaluable – it provides an instant messaging platform and a virtually limitless supply of new recruits. Moreover, because Facebook designs its algorithms to reward users whose posts increase engagement, those who post terror and hate content receive especially favorable treatment. As shown with the anti-Muslim riots in Sri Lanka last year, spurred by Facebook-based misinformation and hate speech, nothing provokes engagement like videos and images brimming with anger and fear.
Facebook is well aware that its website is now being used by terror and hate groups to achieve their destructive goals. It has apparently concluded that claiming it bears no responsibility for terror and hate content is no longer tenable. Thus, it has recently begun assuring shareholders and the public that it is hard at work removing this content. These assurances are false, as the petition filed by the whistleblower working with NWC shows.
Contrary to Assurances, Facebook is Facilitating Recruiting by Terrorist and Hate Groups
Facebook regularly states that it is blocking 99% of the activity of targeted terrorist groups on its website, without the need for users to flag it, using a combination of Artificial Intelligence (AI) and human reviewers. Yet, after researching the Facebook activity of those professing support for terrorist groups from August through December 2018, our whistleblower found that far more extremist content remains on the platform than is removed. Of the content from over 3,000 Friends of terrorist groups in the study, less than 30% had been removed from the website by the end of the five-month study period. Focusing on the smaller subset of 317 Friends displaying symbols of terrorist groups, the whistleblower found that only 38% had been removed.
More importantly, the whistleblower found that Facebook was generating its own terror and hate content by repurposing materials created by those professing allegiance to terrorist and hate groups.
This last point deserves special attention: Facebook’s own auto-generated content is emblazoned with the symbols of terrorist and hate groups such as al-Qaeda and the American Nazi Party. These materials are generating thousands of Likes that these groups can use in their recruiting efforts. A page auto-generated by Facebook for the American Nazi Party goes so far as to provide a link to the hate group’s website.
Facebook’s Ethical Duty
From an ethical standpoint, it is obviously wrong for corporate leaders to facilitate violent criminal activity. It is also deeply wrong for them to cover up these misdeeds by issuing false and misleading statements. All concerned citizens should appeal to the basic humanity of Mark Zuckerberg, Sheryl Sandberg, and the other leaders of Facebook and encourage them to spend the funds necessary to rectify the situation. For a company that earned $59 billion in the year ending with the first quarter of 2019 (a whopping 32 percent increase over the prior year), the obstacle is a lack of commitment, not a lack of resources.
I am not optimistic that Facebook will suddenly develop a sense of corporate social responsibility and take corrective action voluntarily. After entering a 2011 consent decree with the Federal Trade Commission (FTC) to resolve charges of massive privacy violations and agreeing to corrective measures, Facebook proceeded to violate the decree repeatedly. Facebook is now reportedly budgeting $3 to $5 billion for an anticipated fine from the FTC. Although this would be the largest fine for a privacy violation in the FTC's history, from Facebook's perspective it is the equivalent of a parking ticket.
An ethical reckoning and a rethinking of the company's business model do not appear to be underway.
Facebook's Legal Responsibility
Facebook's explosive growth into the world's largest social networking website has occurred, in significant part, because it has persuaded regulators that it is merely a platform for content produced by others, not a publisher of its own content. This assertion has been central to its ability not only to avoid regulation but also to maintain immunity from liability in civil litigation.
In lawsuits filed by those who believe Facebook is legally responsible for harm caused by activity on its website, such as families of victims of terror attacks, Facebook has successfully wielded Section 230 of the Communications Decency Act. Section 230 states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Now that our whistleblower has shown that terror and hate content is being generated by Facebook itself and not by "another information content provider," Facebook's ability to claim Section 230 immunity in cases filed by families of terror victims has been called into question. Plaintiffs in these cases will still carry the heavy burden of proving that Facebook's content contributed to the acts of terrorism that harmed their family members. But Facebook may now face the loss of a key legal defense and the prospect of having to argue causation before a jury. This will presumably cause major consternation among Facebook shareholders.
The Importance of Whistleblowers, and Whistleblower Protections, to Exposing Corporate Deception
Until now, Facebook's shareholders have not known that Facebook generates its own terror and hate content or that it is falling far short of its stated goal of removing 99% of the content of targeted terrorist groups. All relevant statements from Facebook have falsely suggested that the company is implementing an aggressive strategy for removing terror and hate content from its website. A whistleblower was key to making this information known.
Petitions filed by other whistleblowers supported by NWC reveal that Facebook is also issuing misleading statements about its handling of illegal trafficking of endangered wildlife on its website. Not once in its statements about wildlife has Facebook ever acknowledged that it is profiting from the sale of advertisements on the pages of illegal traffickers.
This high-value information has come forward because of the Dodd-Frank Act of 2010, which authorizes financial rewards to whistleblowers who supply "original information" to the SEC and thereby enable the SEC to secure civil or criminal penalties from a publicly traded company for misleading its shareholders and the public. Original information is defined broadly, so that anyone, not just an employee of the company, can provide useful information that would otherwise be unknown to the SEC.
Shareholders depend on accurate information to guide their decisions on whether to invest in a company. As owners, they are in a unique position to force management to adopt corporate social responsibility practices that are essential to maintaining the company’s social license and long-term profitability. Thanks to the whistleblower protections in the Dodd-Frank amendments, the SEC is now fully aware of Facebook’s deceptive practices and is well-positioned to act to protect shareholders and the public from continued deception.
The Urgent Need for a Meaningful Sanction by the SEC
The question now is whether the SEC will take the action necessary to achieve meaningful changes in Facebook's behavior. The Securities Exchange Act empowers the SEC to pursue civil and criminal penalties when publicly traded companies make false and misleading statements about matters that materially affect share value. The Dodd-Frank amendments allow the SEC to impose civil penalties through its own administrative proceedings rather than having to go to court, so the SEC can move quickly.
It is time for the SEC to act. Only through a substantial civil or criminal penalty can the SEC ensure that Facebook’s deceptive practices come to an end and enable shareholders and the public to bring pressure to bear on the company regarding its handling of terror and hate content.
News Networks Reporting on Facebook's Auto-Generated Terror and Hate Content
- Breaking AP Exclusive Story about Facebook Whistleblower Petition
- BBC News – Facebook 'auto-generated' extremist video
- ABC News KCJT8 – Facebook auto-generates videos celebrating extremist images