This site tells you if photos of you were used to train the AI

Deepfakes, AI-generated porn and a thousand more innocent uses: there's been a lot of news lately about neural network-generated images. It makes sense that people are getting curious. Were my photos used to train the robots? Are pictures of me in the image-generation training sets? A brand-new site tries to give you an answer.

Spawning AI creates image-generation tools for artists, and the company just launched Have I Been Trained?, which you can use to search a set of 5.8 billion images used to train popular AI art models. When you run a search, the site surfaces the closest-matching images from the LAION-5B training data, which is widely used to train AI image-generation models.
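
Spawning doesn't publish the site's internals, but search over LAION-scale data is typically built on CLIP-style embeddings: each image and each query (text or photo) is mapped into the same vector space, and the "closest matches" are simply the nearest vectors. Here's a minimal sketch of that idea using Hugging Face's transformers CLIP model; the model name and file paths are illustrative assumptions, not what Have I Been Trained? actually uses.

```python
# Sketch: rank candidate images by CLIP similarity to a text query,
# the embedding-space trick LAION-scale search tools generally rely on.
# Assumes: pip install torch transformers pillow
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

query = "couple"  # the search term from the screenshot at the top of this post
paths = ["photo1.jpg", "photo2.jpg"]  # placeholder local candidate images
images = [Image.open(p) for p in paths]

inputs = processor(text=[query], images=images, return_tensors="pt", padding=True)
with torch.no_grad():
    out = model(**inputs)

# logits_per_image holds the scaled cosine similarity of each image to the query.
scores = out.logits_per_image.squeeze(-1)
for path, score in sorted(zip(paths, scores.tolist()), key=lambda t: -t[1]):
    print(f"{path}: {score:.2f}")
```

At the scale of 5.8 billion images, a production system would precompute the image embeddings once and use an approximate nearest-neighbor index rather than scoring candidates on the fly, but the matching principle is the same.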

It's a fun tool to play with, and it may offer a glimpse into the data that these AI models use as the basis for their own creations. The photo at the top of this post is a screenshot of results for the search term "couple". Try putting your own name in and see what happens... I also ran a search for "Obama," which I won't be sharing a screenshot of here, but suffice it to say that these training sets can be... problematic.

An Ars Technica report this week reveals that private medical record photos, possibly thousands of them, are among the many images sitting in LAION-5B with questionable ethical and legal status. Removing them is exceptionally difficult, because LAION isn't a collection of files itself but merely a set of URLs pointing to images hosted elsewhere on the web.
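
That URL-only structure is easy to see in LAION's public metadata, which ships as tables of links and captions rather than image files. Below is a minimal sketch of inspecting one metadata shard with pandas; the file name is a placeholder, and the column names (URL, TEXT) are assumptions based on LAION's published metadata schema.

```python
# Sketch: LAION metadata is a table of links, not pictures.
# "Removing" an image only drops its URL from the list; the file
# itself stays wherever it's hosted. Assumes: pip install pandas pyarrow
import pandas as pd

# Placeholder path to one downloaded metadata shard (LAION publishes many).
df = pd.read_parquet("laion_shard_00000.parquet")

# Column names assumed from LAION's released schema.
print(df[["URL", "TEXT"]].head())

# A removal at this layer is just filtering rows out of the table;
# the underlying images on the web are untouched.
df = df[~df["URL"].str.contains("example-hospital.org", na=False)]
df.to_parquet("laion_shard_00000.filtered.parquet")
```

This is why a record can keep resurfacing: anyone with an older copy of the URL list, or a crawler of their own, can still reach the original image.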

In response, technologists like Mat Dryhurst and Holly Herndon are spearheading efforts such as Source+, a standard that aims to let people exclude their work or likeness from AI training datasets. But these standards are voluntary, and will likely remain so, limiting their potential impact.

Via DIY Photography / PetaPixel