Asked by CNN about Sandberg’s quote and whether the company stood by it, a Facebook spokeswoman pointed to the larger context around the remark, noting that the organizing ahead of January 6 took place largely online, including, but not limited to, on Facebook’s platforms.
One of Haugen’s main allegations about the company centers on the attack on the Capitol. In an SEC disclosure, she alleges that “Facebook has misled investors and the public about its role in perpetuating disinformation and violent extremism regarding the 2020 election and the January 6 insurrection.”
Facebook disputes the premise of Haugen’s findings and says she cherry-picked documents to present an unfair portrayal of the company.
“The responsibility for the violence that took place on January 6 lies with those who attacked our Capitol and those who encouraged them. After Trump prematurely declared victory, we suspended new political ads and removed the original #StopTheSteal group in November,” Facebook spokesman Andy Stone told CNN on Friday.
“After the violence on Capitol Hill erupted and we saw continued attempts to organize events to challenge the presidential election result, we removed content with the phrase ‘Stop the Steal’ under our Coordinating Harm policy and suspended Trump from our platform.”
“Our enforcement was piecemeal”
“Hindsight is 20/20,” write the author(s) of the analysis, who are not identifiable from the documents provided. “At the time, it was very unclear whether what we were seeing was a coordinated effort to delegitimize the election, or whether it was protected free speech from users who were frightened and confused and deserved our empathy. But hindsight being 20/20 makes it all the more important to look back and learn what we can about the growth of the election delegitimizing movements that grew, spread conspiracy, and helped incite the Capitol insurrection.”
The analysis revealed that the policies and procedures Facebook had in place were simply not up to the task of slowing, much less stopping, the “meteoric” growth of Stop the Steal. For example, those behind the analysis noted that Facebook was treating each piece of content and each person or group within Stop the Steal individually, rather than as part of a whole, with disastrous results.
“Almost all of the fastest growing FB groups were Stop the Steal during their growth peak,” the analysis says. “Because we were looking at each entity individually, rather than as a cohesive movement, we were only able to remove individual groups and pages once they exceeded a violation threshold. We were unable to act on simple objects like posts and comments, as they individually tended not to violate, even when they were surrounded by hate, violence and misinformation.”
This approach eventually changed, according to the analysis, but only after it was too late.
“After the Capitol insurrection and a wave of Storm the Capitol events across the country, we realized that the individual delegitimizing groups, pages and slogans made up a cohesive movement,” the analysis says.
That wasn’t the only way in which Facebook failed to anticipate something like Stop the Steal, or in which its response fell short.
Facebook has long had a policy prohibiting “coordinated inauthentic behavior” on its platforms. That ban allows it to take action against, for example, the Russian troll operation that attempted to interfere in the 2016 US election through accounts and pages set up to appear American. But the company had little policy around “coordinated authentic harm”: that is, little to stop people organizing openly under their real names, without hiding their intention to get the country to reject the results of the election.
The Stop the Steal and Patriot Party groups “were not directly mobilizing offline harm, nor directly promoting militarization,” the analysis said. “Instead, they amplified and normalized misinformation and violent hate in a way that delegitimized a free and fair democratic election. The harm existed at the network level: an individual’s speech is protected, but as a movement it normalized delegitimization and hate in a way that resulted in offline harm and harm to the norms underpinning democracy.”
The analysis notes, however, that once Facebook saw the results of Stop the Steal on January 6 and took action, it was able to roll out measures that hampered the growth of the Stop the Steal and Patriot Party groups.
Facebook’s Stone told CNN, “Facebook has taken extraordinary steps to address harmful content and we will continue to do our part. We also worked closely with law enforcement, both before January 6 and in the days and weeks that followed, with the goal of ensuring that information linking those responsible for January 6 to their crimes is available.”
Pulling the levers
Haugen began collecting evidence on the company before finally leaving the tech giant last May. To reduce the risk of being caught taking screenshots of Facebook’s internal systems, she used her phone to take pictures of her computer screen.
As the insurrection unfolded in Washington and Facebook scrambled to get the situation under control, Haugen took photos documenting the company’s response.
One of the documents she captured, titled “Capitol Protest BTG [Break the Glass] Response,” was a table of actions Facebook could take in response to the January 6 attack. The table appears to have been prepared in advance; by the time Haugen photographed it, just under two hours after the first breach of the Capitol, the company had implemented some of the measures while others were still under consideration. Among the potential actions listed in the table was the demotion of “content deemed likely to violate our community standards in the areas of hate speech, graphic violence, and violence and incitement.”
The page labeled them as “US2020 Levers, previously rolled back.”
These “levers,” as Facebook calls them, are measures (safeguards) the company put in place ahead of last year’s presidential election in an attempt to slow the spread of hate speech and misinformation on the platform. Facebook has not been clear in its public statements about which measures it rolled back after the election, or why it did so at a fraught moment when the sitting president was disputing the results of the vote.
But according to the “Capitol Protest BTG Response” document, safeguards reinstated by Facebook on January 6 included reducing the visibility of posts likely to be reported and freezing “commenting on posts in Groups that start to have a high rate of hate speech and violence & incitement comments,” among others.
In the SEC disclosure, Haugen alleges that these levers were only restored “after the insurrection began.”
Asked about the decisions to roll back the levers and then restore them, Stone said, “In phasing in and then adjusting additional measures before, during and after the election, we took into account specific on-platform signals and information from our ongoing, regular engagement with law enforcement. When those signals changed, so did the measures.”
A line crossed
When Facebook executives posted messages publicly and internally condemning the riot, some employees pushed back, even suggesting that Facebook might bear some of the blame.
“There were dozens of Stop the Steal groups active until yesterday, and I doubt they minced words about their intentions,” wrote an employee in response to a post from Mike Schroepfer, Facebook’s chief technology officer.
Another wrote: “With all due respect, haven’t we had enough time to figure out how to manage speech without enabling violence? We’ve been fueling this fire for a long time, and we shouldn’t be surprised that it’s now out of control.”
Other Facebook employees went further, saying decisions made by company executives over the years helped create the conditions that paved the way for an attack on the U.S. Capitol.
In response to Schroepfer’s post, a staff member wrote that “leadership overrides research-based policy decisions to better serve people like the groups inciting violence today. Rank-and-file workers have done their part to identify changes to improve our platforms but have been actively held back.”
Another staff member, referring to years of controversial and questionable decisions by Facebook’s leadership around political speech, concluded: “History will not judge us kindly.”