Researchers: Platforms like Facebook have played ‘major role’ in fueling polarization


Social media platforms like Facebook “have played a major role in exacerbating political polarization that can lead to such extremist violence,” according to a new report from researchers at New York University’s Stern Center for Business and Human Rights.

That may not seem like a surprising conclusion, but Facebook has long tried to downplay its role in fueling divisiveness. The company says that existing research shows that “social media is not a primary driver of harmful polarization.” But in their report, NYU’s researchers write that “research focused more narrowly on the years since 2016 suggests that widespread use of the major platforms has exacerbated partisan hatred.”

To make their case, the authors highlight numerous studies examining the links between polarization and social media. They also interviewed dozens of researchers, as well as at least one Facebook executive: Yann LeCun, the company’s top AI scientist.


While the report is careful to point out that social media is not the “original cause” of polarization, the authors say that Facebook and others have “intensified” it. They also note that Facebook’s own attempts to reduce divisiveness, such as de-emphasizing political content in News Feed, show the company is well aware of its role. “The introspection on polarization probably would be more productive if the company’s top executives were not publicly casting doubt on whether there is any connection between social media and political divisiveness,” the report says.

“Research shows that social media is not a primary driver of harmful polarization, but we want to help find solutions to address it,” a Facebook spokesperson said in a statement. “That is why we continually and proactively detect and remove content (like hate speech) that violates our Community Standards and we work to stop the spread of misinformation. We reduce the reach of content from Pages and Groups that repeatedly violate our policies, and connect people with trusted, credible sources for information about issues such as elections, the COVID-19 pandemic and climate change.”

The report also raises the issue that these problems are difficult to address “because the companies refuse to disclose how their platforms work.” Among the researchers’ recommendations is that Congress force Facebook and Google/YouTube “to share data on how algorithms rank, recommend, and remove content.” Both the platforms releasing that data and the independent researchers who study it should be legally protected as part of that work, they write.

Additionally, Congress should “empower the Federal Trade Commission to draft and enforce an industry code of conduct” and “provide research funding” for alternative business models for social media platforms. The researchers also propose several changes that Facebook and other platforms could implement directly, including adjusting their internal algorithms to further de-emphasize polarizing content and making those changes more transparent to the public. The platforms should also “double the number of human content moderators” and make them all full employees, in order to make moderation decisions more consistent.