
What’s in the Facebook Papers and what it means for the company

Facebook — make that Meta — is battling serious issues on multiple fronts.

Facebook (and now, Meta) might just be experiencing its most sustained and intense bout of bad press ever, thanks to whistleblower Frances Haugen and the thousands of documents she spirited out of the company.

The Wall Street Journal was the first publication to report on the contents of the documents, which have also been turned over to the Securities and Exchange Commission. Since then, the documents have made their way into the hands of more than a dozen publications that formed “a consortium,” much to the dismay of Facebook’s PR department.

There have now been more than a hundred stories based on the documents. While many of those stories draw on the same material, the details they surface are significant. They also add up to a dizzying amount of information. There are detailed documents written by the company's researchers, free-form notes and memos, as well as comments and other posts in Workplace, the internal version of Facebook used by its employees.

This mix of sources, together with the fact that the consortium has not released most of the documents to researchers or other journalists, makes the Facebook Papers difficult to parse. Gizmodo has been publishing some of the underlying documents, but new revelations could be trickling out for weeks or months as the material becomes more widely distributed.

But amid all that noise, a few key themes have emerged, many of which have also been backed up by prior reporting on the company and its policies. This article details Haugen’s disclosures and the additional revelations that have arisen from reporting on the Facebook Papers. We'll continue to update it as fresh allegations emerge.

Facebook allowed politics to influence its decisions

This likely won’t be a surprise to anyone who has followed Facebook over the last five years or so, but the Facebook Papers add new evidence to years-long allegations that Mark Zuckerberg and other company leaders allowed politics to influence their decisions.

One of the first stories to break from Haugen’s disclosures (via The Wall Street Journal) included details about Facebook’s “cross check” program, which allowed politicians, celebrities and other VIPs to skirt the company’s rules. The initial motivation for the program? To avoid the “PR fires” that might occur if the social network were to mistakenly remove something from a famous person’s account. In another document, also reported by The Journal, a researcher on Facebook's integrity team complained that the company had made “special exceptions” for right-wing publisher Breitbart. The publication, part of Facebook’s official News Tab, also had “managed partner” status, which may have helped the publisher avoid consequences for sharing misinformation.

At the same time, while Facebook’s policies were often perceived internally as putting a thumb on the scale in favor of conservatives, Zuckerberg has also been accused of shelving ideas that could have been perceived as benefiting Democrats. The CEO was personally involved in killing a proposal to put a Spanish-language version of the company’s voting information center in WhatsApp ahead of the 2020 presidential election, The Washington Post reported. Zuckerberg reportedly said the plan wasn’t “politically neutral.”

Facebook has serious moderation failures outside the US and Europe

Some of the most damning revelations in the Facebook Papers relate to how the social network handles moderation and safety issues in countries outside of the United States and Europe. The mere fact that Facebook is prone to overlook countries that make up its “rest of world” metrics is not necessarily new. The company's massive failure in Myanmar, where Facebook-fueled hate helped incite a genocide, has been well documented for years.

Yet a 2020 document noted the company still had “significant gaps” in its ability to detect hate speech and other rule-breaking content on its platform. According to Reuters, the company’s AI detection tools — known as “classifiers” — aren’t able to identify misinformation in Burmese. (Again, it’s worth pointing out that a 2018 report on Facebook’s role in the genocide in Myanmar cited viral misinformation and the lack of Burmese-speaking content moderators as issues the company should address.)

Unfortunately, Myanmar is far from the only country where Facebook’s under-investment in moderation has contributed to real-world violence. CNN notes that Facebook’s own employees have been warning that the social network is being abused by “problematic actors” to incite violence in Ethiopia. Yet Facebook lacked the automated tools to detect hate speech and other inciting content there, even though it had determined Ethiopia to be among the countries most “at risk.”

Even in India — Facebook’s largest market — there’s a lack of adequate language support and resources to enforce the platform’s rules. In one document, reported by The New York Times, a researcher created a test account as an Indian user and began accepting Facebook’s automated recommendations for accounts and pages. It took just three weeks for the new account’s feed to become flooded with “hate speech, misinformation and celebrations of violence.” At the end of the experiment, the researcher wrote: “I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life.” The report was not an outlier. Facebook groups and WhatsApp messages are being used to “spread religious hatred” in the country, according to The Wall Street Journal’s analysis of several internal documents.

Facebook has misled authorities and the public about its worst problems

Lawmakers, activists and other watchdogs have long suspected that Facebook knows far more about issues like misinformation, radicalization and other major problems than it publicly lets on. But many documents within the Facebook Papers paint a startling picture of just how much the company’s researchers know, often long before issues have boiled over into major scandals. That knowledge is often directly at odds with what company officials have publicly claimed.

For example, in the days after the Jan. 6 insurrection, COO Sheryl Sandberg said that rioters had “largely” organized using other platforms, not Facebook. Yet a report from the company’s own researchers, which first surfaced in April, found that the company had missed a number of warning signs about the brewing “Stop the Steal” movement. Though the company had spent months preparing for a chaotic election, including the potential for violence, organizers were able to evade Facebook’s rules by using disappearing Stories and other tactics, according to BuzzFeed.

Likewise, Facebook’s researchers were internally sounding the alarm about QAnon more than a year before the company banned the conspiracy movement. A document titled “Carol’s Journey to QAnon” detailed how a “conservative mom” could see QAnon and other conspiracy theories take over her News Feed in just five days simply by liking Pages that Facebook’s algorithms recommended. “Carol’s” experience was hardly an outlier. Researchers ran these types of experiments for years, and repeatedly found that Facebook’s algorithmic recommendations could push users deeper into conspiracies. But much of this research was not acted on until “things had spiraled into a dire state,” one researcher wrote in a document reported by NBC News.

The documents also show how Facebook has misleadingly characterized its ability to combat hate speech. The company has long faced questions about how hate speech spreads on its apps, and the issue sparked a mass advertiser boycott last year. According to a document cited by Haugen, the company’s own engineers estimate that the company is taking action on “as little as 3-5% of hate” on its platform. That’s in stark contrast to the statistics the company typically showcases.

Similarly, the Facebook Papers indicate that Facebook’s researchers knew much more about vaccine and COVID-19 misinformation than they would share with the public or officials. The company declined to answer lawmakers’ questions about how COVID-19 misinformation spreads even though, according to The Washington Post’s reporting, “researchers had deep knowledge of how covid and vaccine misinformation moved through the company’s apps.”

Facebook has misled advertisers and shareholders

These are the allegations that could end up being some of the most consequential because they show serious problems affecting the company’s core business — and could tie into any future SEC action.

Instagram has long been viewed as a bright spot for Facebook in terms of attracting the teen and younger users the company needs to grow. But teens and young adults are increasingly spending their time, and creating more content, in competing apps like TikTok. The problem is even more stark for the Facebook app itself, where “teen and young adult DAU [daily active users] has been in decline since 2012/2013,” according to a slide shared by Bloomberg.

The story points out another issue that could get Facebook into hot water with the SEC: the company may be overcharging advertisers and misrepresenting the size of its user base because of duplicate accounts. Though this is hardly the first time the issue has been raised, Facebook’s own reports suggest it “undercounts” the metric, known as SUMA (single user, multiple accounts), according to Bloomberg.

Zuckerberg prioritized growth over safety

While the Facebook Papers are far from the first time the company has faced accusations that it puts profit ahead of users’ wellbeing, the documents have shed new light on many of those claims. One point that’s come up repeatedly in the reporting is Zuckerberg’s obsession with MSI, or meaningful social interaction. Facebook retooled its News Feed around the metric in 2018 as a strategy to combat declining engagement. But the changes, meant to ensure users were seeing more content from friends and family, also made the News Feed angrier and more toxic.

Once Facebook optimized for “engagement,” publishers and other groups learned they could effectively game the company’s algorithms by, well, pissing people off. Politicians, too, found they could reach more people by posting more negative content, according to The Wall Street Journal. Publishers also complained that the platform was incentivizing more negative and polarizing content. Yet when Zuckerberg was presented with a proposal showing that reducing the amount of some re-shared content could curb misinformation, the CEO “said he didn’t want to pursue it if it reduced user engagement.”

That wasn’t the only time a Facebook leader was unwilling to make changes that could have a detrimental effect on engagement, even if they would address other serious issues like misinformation. Several documents detail research and concerns about Facebook’s “like” button and other reactions.

Because the News Feed algorithm weighted a “reaction” more heavily than a like, it boosted content that received the “angry” reaction, even though researchers flagged that such posts were much more likely to be toxic. “Facebook for three years systematically amped up some of the worst of its platform, making it more prominent in users’ feeds and spreading it to a much wider audience,” The Washington Post wrote. The company finally stopped giving extra weight to “angry” last September.
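Facebook has never published its ranking formula, but a minimal sketch can illustrate the dynamic the documents describe. Everything below is hypothetical: the weights and sample posts are invented for illustration, loosely mirroring reporting that an emoji reaction once counted for several times as much as a plain like.

```python
# Purely illustrative: Facebook's actual News Feed ranking is not public.
# The weights and posts here are hypothetical, chosen only to show how
# weighting reactions above likes can surface "angry" content.
REACTION_WEIGHTS = {"like": 1, "angry": 5, "love": 5}

def engagement_score(post):
    """Toy engagement score: a weighted sum of the post's reaction counts."""
    return sum(REACTION_WEIGHTS.get(kind, 1) * count
               for kind, count in post["reactions"].items())

posts = [
    {"id": "calm_post", "reactions": {"like": 500}},
    {"id": "outrage_post", "reactions": {"like": 100, "angry": 120}},
]

# Sorting by this score ranks the angrier post first (700 vs. 500),
# even though the calmer post drew far more total interactions.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["id"], engagement_score(post))
```

Under these assumed weights, a post that draws 120 “angry” reactions outranks one with five times as many likes, which is the kind of amplification researchers reportedly flagged.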

Facebook slow-walked, and in some cases outright killed, proposals from researchers about how to address the flood of anti-vaccine comments on its platform, the AP reported.

The company has also been accused of downplaying research that found Instagram can exacerbate mental health issues for some of its teen users. The documents, which were some of the first records to emerge from Haugen’s disclosures, forced Facebook to “pause” work on an Instagram Kids app that had already drawn the attention of 44 state Attorneys General. The research also prompted the first Congressional hearing stemming from Haugen's whistleblowing.

Similarly, the company's researchers have extensively studied "problematic use," such as "compulsive use of social media that impacts their sleep, work, parenting or relationships," The Wall Street Journal reported. The term could describe as many as one in eight Facebook users, according to the report, which was based on findings by a "wellbeing" team at the company. The company later rolled out a "quiet mode" feature, but researchers worried it was "buried" in the app. (Facebook has defended its research, writing in a response that it has added "nearly 10 tools since 2018" meant to make it easier for people to control how much time they spend in its apps.)

What does all this mean for Facebook (now Meta)?

While the Facebook Papers contain a dizzying amount of details about Facebook’s failures and misdeeds, many of the claims are not entirely new allegations. And if there’s one thing Facebook’s history has taught us, it’s that the company has never let a scandal affect its ability to make billions of dollars.

But, there are some signs that Haugen’s disclosures could be different. For one, she has turned over the documents to the SEC, which has the authority to conduct a wide-ranging investigation into the company’s actions. As many experts have pointed out, it’s not clear what could actually come from such an investigation, but it could at the very least force Facebook’s top executives to formally answer detailed questions from the regulator.

And though Haugen has said she is not in favor of antitrust action against the social network, the FTC has reportedly begun to take a look at the disclosures. (The FTC is already in the midst of a legal battle with Facebook.) Facebook itself already seems to be reacting: the company has asked employees to preserve documents going back to 2016, The New York Times reported this week. There are other, more practical issues too. The company is reportedly struggling to recruit engineering talent, according to documents reported by Protocol.

The constant scandals and internal roadblocks have also taken a toll on existing employees. For as much scrutiny as the company has faced externally, the Facebook Papers paint a picture of a company whose employees are at times deeply divided and frustrated. The events of January 6th in particular sparked a heated debate about Facebook’s role, and how it missed opportunities to recognize the threat of the “Stop the Steal” movement. But fundamental disagreements between researchers and other staffers on one side, and Facebook’s leaders on the other, have been simmering for years.

As Wired points out, the Facebook Papers are full of “badge posts” — Facebook speak for the companywide posts employees write upon their departure from the social network — from “dedicated employees who have concluded that change will not come, or who are at least too burned out to continue fighting for it.”

Update 11/5: This story was updated with details on Facebook's "wellbeing" research.