Facebook reveals how influence campaigns changed between the 2016 and 2020 elections

Facebook is better at finding fake accounts, but they're coming from more sources than before.


Facebook has published a new report on the state of influence operations on its platform. The report, which covers the period between 2017 and 2020, sheds new light on the company’s efforts to prevent election interference, and how attempts to manipulate its platform have evolved.

The report notes influence campaigns have changed a lot since 2016, when Russia’s Internet Research Agency used fake accounts to great effect. While Facebook is still uncovering IRA activity, its tactics are changing. For example, last year Facebook and Twitter uncovered an IRA scheme that involved US-based journalists who were unwittingly conned into authoring articles for a fake news site meant to prop up their influence campaigns. (Both Facebook and Twitter said at the time that the fake accounts were found before they could reach a large audience.)

“Threat actors basically faced an empty field in 2016, and the world is very different today,” Facebook’s Head of Security Policy, Nathaniel Gleicher, said during a call with reporters. “But also… there are more actors who are using these techniques today than was the case in 2016.”


Another major difference between now and 2016 is that “inauthentic behavior” is more frequently coming from within the country being targeted, not just from foreign actors. In the report, Facebook notes that it took down an equal number of CIB (coordinated inauthentic behavior) networks from Russia, Iran and the US itself.

Facebook found just as many networks of fake accounts in the US as from Russia and Iran. (Facebook)

Of those originating in the US, “more than half were campaigns operated by conspiratorial and fringe political actors that used fake accounts to amplify their views and to make them appear more popular than they were.”

Also complicating matters is that these networks often relied on real people to spread their message or run fake accounts. And though the report doesn’t specifically name Donald Trump, it notes that the “then-US President” was among those “promoting false information amplified by IO [influence operations] from various countries including Russia and Iran.”

Of course, Facebook dealt with numerous other issues surrounding the election besides fake accounts. The company was slow to address QAnon and other extremists prior to the election, and these groups were able to spread conspiracy theories relatively unchecked for months. Following the election, Facebook failed to prevent the “Stop the Steal” movement, which fueled the violence on Jan. 6. An internal report from Facebook suggested that the company’s focus on looking for fake accounts may have blinded it to the dangers posed by legitimate accounts spreading conspiracy theories.

“The US 2020 election campaign brought to the forefront the complexity of separating bad actors behind covert influence operations from unwitting people they co-opt or domestic influencers whose interests may align with threat actors,” the report says.