How to keep your art out of AI generators

AI-generated imagery feels inescapable. It's in the video games you play and the movies you watch, and it has flooded social media platforms. It's even been used to promote the physical hardware that real, human artists use to create digital paintings and illustrations, to the immense frustration of those who already feel displaced by the technology.

The pervasive nature of it seems especially egregious to creators who are fighting to stop their works from being used, without consent or compensation, to improve the very thing that threatens to disrupt their careers and livelihoods. The data pools that go into training generative AI models often contain images that are indiscriminately scraped from the internet, and some AI image generator tools allow users to upload reference images they want to imitate. Many creative professionals need to advertise their work via social media and online portfolios, so simply taking everything offline isn’t a viable solution. And a lack of legal clarity around AI technology has created something of a Wild-West environment that’s difficult to resist. Difficult, but not impossible.

While the tools are often complicated and time-consuming, several AI companies provide creators with ways to opt their work out of training. And for visual artists who want broader protections, there are tools like Glaze and Kin.Art, which alter images in ways designed to disrupt AI training on them. Here's how to navigate the best solutions we've found so far.

Generative AI models depend on training datasets, and the companies behind them are motivated to avoid restricting those potential data pools. So while they often do allow artists to opt their work out, the process can be crude and labor-intensive, especially if you have a sizable catalog of work.

Opting out typically requires submitting a request to an AI provider, either via a dedicated form or directly via email, along with copies and written descriptions of the images you want to protect. Additionally, if you've agreed to let third parties license your images, the terms may include a license for AI training. It's worth scanning the user agreements for any platforms hosting your work to check what rights they hold over it. But different AI tools' policies vary; here's how to opt out of some popular ones.

OpenAI started allowing creators to remove their work from its training data when it introduced its DALL-E 3 image generator last September, and it's one of the easier processes to follow. Content creators or owners just need to submit a form to OpenAI requesting that the work be excluded from future training datasets, along with a copy of the image, a description of it, and a ticked checkbox confirming that they hold the rights to it.
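That form handles images one at a time, though. If your portfolio lives on a website you control, OpenAI also documents a broader, crawler-level opt-out: adding its GPTBot user agent to the site's robots.txt file tells OpenAI's web crawler not to collect those pages for training future models. As a minimal sketch, a robots.txt entry blocking GPTBot from an entire site looks like this:

    User-agent: GPTBot
    Disallow: /

Keep in mind this only affects future crawling by OpenAI's own bot; it won't remove images that have already been scraped, and it does nothing about other companies' crawlers.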