Instagram is working on 'nudity protection' technology for messages


Unsolicited nude photos are a massive problem on social media, but Instagram is reportedly working on a tool that could help. An early screengrab tweeted by researcher Alessandro Paluzzi indicates that "Nudity protection" technology "covers photos that may contain nudity in chat," giving users the option to view them or not. Instagram parent Meta confirmed to The Verge that it's in development.

Meta said the aim is to help shield people from nude images or other unsolicited messages. As further protection, the company said it can't view the images itself nor share them with third parties. "We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive," a spokesperson said. It plans to share more details in the coming weeks ahead of any testing.

The new feature is akin to the "Hidden Words" tool launched last year, Meta added. That feature allows users to filter abusive messages in DM requests based on keywords. If a request contains any filter word you've chosen, it's automatically placed in a hidden folder that you can choose to never open — though it's not completely deleted.

The feature is welcome but long overdue: social media companies largely ignored unwanted nude photos even as they became a pervasive problem. A 2020 study by University College London found that of 150 young people aged 12-18, 75.8 percent had been sent unsolicited nude images.

Sending unwanted nude photos, also known as "cyberflashing," has been targeted by multiple jurisdictions including California and the UK. In the UK, it could become a criminal offense if the Online Safety Bill is passed by parliament. California didn't go quite that far, but last month both houses of the state legislature voted unanimously to allow users to sue over unsolicited nude photos and other sexually graphic material.