Facebook could slow down sharing

The social network confirmed several 'temporary steps' meant to stop election-related misinformation.

Facebook could soon slow down users’ ability to share posts in an effort to curb the spread of misinformation and conspiracy theories about the election, The New York Times reports. The changes “could be rolled out as soon as Thursday,” and would add “friction” to the social network’s sharing features.

It wasn’t immediately clear what exactly Facebook would change about its sharing features, but it could involve “an additional click or two,” according to The New York Times. Twitter has also introduced updates to slow the spread of viral tweets and combat election misinformation. Until now, Facebook has relied on its voting information center and labels within its app, but these notices are easily dismissed and don’t prevent users from sharing inflammatory posts.

Facebook is reportedly worried about the potential for violence and wants users to “cool down.” Separately, BuzzFeed News reported that an internal metric Facebook uses to track potential for violence had sharply risen over the last day. According to the report, it “indicates that the company's own internal metrics have found Facebook posts are contributing to an unstable situation around the counting of ballots in the US presidential election as President Donald Trump and his supporters attempt to inject unfounded doubts into the process.”

There are other signs that Facebook may be more attuned to activity on its platform spilling over into potentially violent circumstances. Earlier in the day, the company shut down a group with hundreds of thousands of members due to “worrying calls for violence.” Facebook said the move was “in line with the exceptional measures that we are taking during this period of heightened tension.”

Prior to the election, Facebook officials confirmed there were policies in place should there be violence following the election. Nick Clegg, the company’s head of global affairs, said Facebook might consider “pretty exceptional measures to significantly restrict the circulation of content on our platform.”

Update 11/5 10:06pm ET: In a statement, a Facebook spokesperson confirmed the company is taking additional “temporary steps” to keep false information from spreading more widely. Facebook will now direct users to its voting information center when they try to share election-related posts, and the company is down-ranking some posts in users’ Facebook and Instagram feeds. Additionally, Facebook will throttle some live videos “that may relate to the election.” The spokesperson didn’t indicate how long these measures would be in place.

“As vote counting continues, we are seeing more reports of inaccurate claims about the election. While many of these claims have low engagement on our platform, we are taking additional temporary steps, which we’ve previously discussed, to keep this content from reaching more people. These include demotions for content on Facebook and Instagram that our systems predict may be misinformation, including debunked claims about voting. We are also limiting the distribution of Live videos that may relate to the election on Facebook. In addition, on Facebook and Instagram, when people try to share a post that features an informational election label, they will see a message encouraging them to visit the Voting Information Center for reliable election information.”