Oct 24, 2023

This new data poisoning tool lets artists fight back against generative AI

Posted in categories: futurism, robotics/AI

The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models.

A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways.
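To make the idea concrete, here is a minimal sketch of what an imperceptible, bounded pixel change looks like in code. This is not Nightshade's method: Nightshade computes perturbations specifically optimized to mislead a model's training, whereas this example only adds small random noise clamped to a tight per-pixel budget. The function name, file paths, and the EPSILON budget are all illustrative assumptions.

```python
import numpy as np
from PIL import Image

# Hypothetical perturbation budget: the maximum a pixel channel may change
# (out of 255). Small enough that the edit is hard to see by eye.
EPSILON = 4

def perturb(path_in: str, path_out: str, seed: int = 0) -> None:
    """Add small random noise, clamped so each channel moves by at most EPSILON.

    Illustrative only: a real poisoning tool like Nightshade would optimize
    the perturbation against a model, not draw it at random.
    """
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = rng.integers(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

perturb("artwork.png", "artwork_perturbed.png")
```

The point of the sketch is the budget: because every channel moves by at most a few values out of 255, the output looks unchanged to a viewer, yet the pixel data a scraper collects is no longer the original.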

The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creator’s permission. Using it to “poison” this training data could damage future iterations of image-generating AI models, such as DALL-E, Midjourney, and Stable Diffusion, by rendering some of their outputs…
