New York (CNN Business) Facebook has confronted whistleblowers, PR firestorms and Congressional inquiries in recent years. But now it faces a combination of all three at once in what could be the most intense and wide-ranging crisis in the company's 17-year history.
On Friday, a consortium of 17 US news organizations began publishing a series of stories — collectively called "The Facebook Papers" — based on a trove of hundreds of internal company documents which were included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen's legal counsel. The consortium, which includes CNN, reviewed the redacted versions received by Congress.
CNN's coverage includes stories about how coordinated groups on Facebook (FB) sow discord and violence, including on January 6, as well as Facebook's challenges moderating content in some non-English-speaking countries, and how human traffickers have used its platforms to exploit people.
The reports from CNN, and the other outlets that are part of the consortium, follow a month of intense scrutiny for the company. The Wall Street Journal previously published a series of stories based on tens of thousands of pages of internal Facebook documents leaked by Haugen. (The consortium's work is based on many of the same documents.)
The publication of the Journal's "Facebook Files," which raised concerns about the impact of Instagram on teen girls, among other issues, prompted a Senate subcommittee hearing with Facebook head of global safety Antigone Davis. Haugen herself then testified before the same subcommittee, telling lawmakers she believes that "Facebook's products harm children, stoke division, and weaken our democracy."
There's currently no end in sight for Facebook's troubles. Members of the subcommittee have called for Facebook CEO Mark Zuckerberg to testify. And on Friday, another former Facebook employee anonymously filed a complaint against the company with the SEC, making allegations similar to Haugen's.
Facebook has dealt with scandals over its approach to data privacy, content moderation and competitors before. But the vast trove of documents, and the many stories surely still to come from it, touch on concerns and problems across seemingly every part of its business: its approach to combatting hate speech and misinformation, managing international growth, protecting younger users on its platform and even its ability to accurately measure the size of its massive audience.
All of this raises an uncomfortable question for the company: Is Facebook actually capable of managing the potential for real-world harms from its staggeringly large platforms, or has the social media giant become too big not to fail?
Facebook tries to turn the page
Facebook, for its part, has repeatedly tried to discredit Haugen, and said her testimony and reports on the documents mischaracterize its actions and efforts.
"At the heart of these stories is a premise which is false," a Facebook spokesperson said in a statement to CNN. "Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands where our own commercial interests lie."
In a tweet thread last week, the company's Vice President of Communications, John Pinette, called the Facebook Papers a "curated selection out of millions of documents at Facebook" which "can in no way be used to draw fair conclusions about us." But even that response is telling: if Facebook has more documents that would tell a fuller story, why not release them? (During her Senate testimony Facebook's Davis said Facebook is "looking for ways to release more research.")
Instead, Facebook is now reportedly planning to rebrand itself under a new name as early as this week, as the wave of critical coverage continues. (Facebook previously declined to comment on this report.) The move appears to be a clear attempt to turn the page, but a fresh coat of paint won't fix the underlying issues outlined in the documents — only Facebook, or whatever it may soon be called, can do that.
Take the example of a report published by the Journal on September 16 that highlighted internal Facebook research about a violent Mexican drug cartel, known as Cártel Jalisco Nueva Generación. The cartel was said to be using the platform to post violent content and recruit new members under the acronym "CJNG," even though it had been designated internally as one of the "Dangerous Individuals and Organizations" whose content should be removed. Facebook told the Journal at the time that it was investing in artificial intelligence to bolster its enforcement against such groups.
Despite the Journal's report last month, CNN last week identified disturbing content linked to the group on Instagram, including photos of guns, and photo and video posts in which people appear to have been shot or beheaded. After CNN asked Facebook about the posts, a spokesperson confirmed that multiple videos CNN flagged were removed for violating the company's policies, and at least one post had a warning added.
Haugen has suggested Facebook's failure to fix such problems is in part because it prioritizes profit over societal good, and, in some cases, because the company lacks the capacity to put out its many fires at once.
"Facebook is extremely thinly staffed ... and this is because there are a lot of technologists that look at what Facebook has done and their unwillingness to accept responsibility, and people just aren't willing to work there," Haugen said in a briefing with the "Facebook Papers" consortium last week. "So they have to make very, very, very intentional choices on what does or doesn't get accomplished."
Facebook has invested a total of $13 billion since 2016 to improve the safety of its platforms, according to the company spokesperson. (By comparison, the company's annual revenue topped $85 billion last year and its profit hit $29 billion.) The spokesperson also said Facebook has "40,000 people working on the safety and security on our platform, including 15,000 people who review content in more than 70 languages working in more than 20 locations all across the world to support our community."
"We have also taken down over 150 networks seeking to manipulate public debate since 2017, and they have originated in over 50 countries, with the majority coming from or focused outside of the US," the spokesperson said. "Our track record shows that we crack down on abuse outside the US with the same intensity that we apply in the US."
Still, the documents suggest that the company has much more work to do to eliminate the many harms they outline, and to address the unintended consequences of Facebook's unprecedented reach and integration into our daily lives.
An uncertain future
In the meantime, the company appears to be quickly losing trust — not only among some of its users and regulators, but internally, as well.
Several of the internal documents point to concerns among Facebook employees about the company's actions, including one December 2020 post on Facebook's internal site about attrition on the company's integrity team in which an employee notes in a comment, "Our recent Pulse results show confidence in leadership has declined across the company." (Pulse surveys are often used by companies to gauge employee sentiment on certain topics.)
The internal post came after Facebook's Civic Integrity team was broken up following the 2020 presidential election and its staff assigned to other roles within the company, a move that Haugen criticized but that Facebook Vice President of Integrity Guy Rosen has said was done "so that the incredible work pioneered [by the team] for elections could be applied even further ... their work continues to this day."
And on Thursday, Facebook's independent oversight board accused the company of not being "fully forthcoming" on the details of its Cross-Check program that reportedly shielded millions of VIP users from the social media platform's normal content moderation rules. (A Facebook spokesperson said in a statement that the company had "asked the board for input into our Cross-Check system, and we will strive to be clearer in our explanations to them going forward.")
The good news for Facebook: Haugen, and the team supporting her, aren't aiming to shut down or break up the company. During her Senate testimony, Haugen repeatedly told lawmakers that she was there because she believes in Facebook's potential for good, if the company is able to address its serious issues. Haugen even said she would work for Facebook again, if given the chance. She suggested that Congress give the company the chance to "declare moral bankruptcy and we can figure out how to fix these things together."
"The most interesting thing I discovered as I read these documents is how extraordinary the company is," Lawrence Lessig, a Harvard Law School professor and strategic legal adviser to Haugen, told CNN. "The company is filled with thousands of thousands of Frances Haugens ... who are just trying to do their job. They are trying to make Facebook safe and useful and the best platform for communication that they can."
What remains to be seen is how much Facebook will change in response to the revelations from current and future whistleblowers, especially if its advertising-fueled business continues to chug along unimpeded, as it has so far. Will it agree to the kind of transparency and cooperation that Haugen, regulators and others have called for? Or will it simply continue with business as usual under a new name?
This article is part of a CNN series published on "The Facebook Papers," a trove of over ten thousand pages of leaked internal Facebook documents that give deep insight into the company's internal culture, its approach to misinformation and hate speech moderation, internal research on its newsfeed algorithm, communication related to Jan. 6, and more. You can read the entire series here.