TikTok removed almost 350,000 videos for spreading election misinformation

The company also shared new stats on coronavirus misinformation.


TikTok is offering a new glimpse into just how much misinformation is on its platform. Between July and December of last year, the app removed hundreds of thousands of videos for breaking its rules around misinformation about the 2020 presidential election and the coronavirus pandemic. Details of the takedowns were released as part of the company’s latest transparency report.

Unsurprisingly, election misinformation was the most prevalent. The company removed 347,225 videos for sharing election misinformation or manipulated media, according to the report. An additional 441,000 clips were removed from the app’s recommendations because their content was “unsubstantiated.” (Like Facebook, TikTok works with third-party fact-checking organizations; the company also warns users when videos contain “unverified” claims.)

During the same period, TikTok took down 51,505 videos for sharing misinformation about COVID-19. In its report, TikTok notes that 87 percent of these clips were removed within 24 hours of being posted, and that 71 percent had “zero views” at the time they were removed.


The new stats come after TikTok tightened its misinformation policies ahead of the election. In the lead-up to the 2020 vote, the company introduced rules barring deepfakes and expanded its work with fact-checking organizations to debunk false claims. It also added in-app notices directing users to credible information; TikTok says those PSAs were viewed more than 73 billion times.

In its report, TikTok says it was well-prepared for the election, and that much of the misinformation was from domestic sources within the United States. “We prepared for 65 different scenarios, such as premature declarations of victory or disputed results, which helped us respond to emerging content appropriately and in a timely manner,” TikTok writes. “We also prepared for more domestic activity based on trends we’ve observed on how misleading content is created and spread online. Indeed, during the US 2020 elections, we found that a significant portion of misinformation was driven by domestic users –– real people.”

The company also notes that misinformation and disinformation represent only a fraction of the total content TikTok removes. The app took down more than 89 million videos that broke its rules, according to the report. As with its previous report, the biggest categories of takedowns were “minor safety” (36 percent of removals) and adult nudity (20.5 percent). “Integrity and authenticity,” which covers misinformation as well as things like bots and fake accounts, accounted for 2.4 percent of TikTok’s takedowns.

But even though misinformation makes up a relatively small share of removals, TikTok has at times struggled to contain it once it goes viral. In the days after the election, viral videos spreading debunked conspiracy theories about voter fraud racked up hundreds of thousands of views before they were removed, according to Media Matters.