Facebook is going after misinformation superspreaders

Users who repeatedly share misinformation will have their posts down-ranked in News Feed.


Facebook says it will penalize individuals who repeatedly share misinformation. The company introduced new warnings that will notify users that repeatedly sharing false claims could result in “their posts moved lower down in News Feed so other people are less likely to see them.”

Until now, the company’s policy has been to down-rank individual posts that are debunked by fact checkers. But posts can go viral long before fact checkers review them, and users had little incentive not to share them in the first place. With the change, Facebook says it will warn users about the consequences of repeatedly sharing misinformation.

Pages considered repeat offenders will display pop-up warnings when new users try to follow them, and individuals who consistently share misinformation will receive notifications that their posts may be less visible in News Feed as a result. The notifications will also link to the fact check for the post in question and give users the opportunity to delete the post.

Facebook will warn individuals who share misinformation. (Facebook)

The update comes after a year in which Facebook struggled to control viral misinformation about the coronavirus pandemic, the presidential election and COVID-19 vaccines. "Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps," the company wrote in a blog post.


Facebook didn’t indicate how many posts it would take to trigger the reduction in News Feed reach, but the company has used a similar “strike” system for pages that share misinformation. (That policy became a source of controversy after reports that Facebook officials removed “strikes” from popular conservative pages last year.)

Researchers who study misinformation have pointed out that it’s often the same individuals behind the most viral false claims. For example, a recent report from the Center for Countering Digital Hate found that the vast majority of anti-vaccine misinformation was linked to just 12 individuals.