Nov 9, 2023

New Tool Lets Artists “Poison” Their Work to Mess Up AI Trained on It

Posted in category: robotics/AI

Many artists have griped about their original work being used to train these AI models, a use they never opted into and for which they're not compensated.

But what if artists could “poison” their work with a tool that alters it so subtly that the human eye can’t tell, while wreaking havoc on AI systems that try to digest it?

That’s the idea behind a new tool called “Nightshade,” which its creators say does exactly that. As laid out in a yet-to-be-peer-reviewed paper spotted by MIT Technology Review, a team of researchers led by University of Chicago professor Ben Zhao built the system to generate prompt-specific “poison samples” that scramble the digital brains of image generators like Stable Diffusion, screwing up their outputs.
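For readers curious what "poisoning" an image can look like in practice, here is a minimal, hypothetical sketch of the general idea: add a small, bounded perturbation that a human can barely see but that shifts the image's machine-readable features toward a different concept. This is not Nightshade's actual algorithm; the ResNet-18 feature extractor, the loss, and the hyperparameters below are stand-in assumptions chosen purely for illustration.

```python
# Illustrative sketch only -- NOT Nightshade's algorithm.
# General idea: find a tiny perturbation (bounded in L-infinity norm) that
# leaves the image looking unchanged to a person but pulls its features
# toward a different "target" concept under some feature extractor.
import torch
import torchvision.models as models

def poison_image(image, target_image, steps=100, epsilon=8 / 255, step_size=1 / 255):
    """Return `image` plus a perturbation with ||delta||_inf <= epsilon whose
    features resemble those of `target_image` (e.g., a dog photo whose
    features look like a car). Inputs: 1 x 3 x H x W tensors in [0, 1]."""
    # Stand-in feature extractor; Nightshade targets text-to-image models,
    # not an ImageNet classifier. Preprocessing/normalization omitted for brevity.
    extractor = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    extractor.fc = torch.nn.Identity()  # keep penultimate-layer features
    extractor.eval().requires_grad_(False)

    with torch.no_grad():
        target_feat = extractor(target_image)

    delta = torch.zeros_like(image, requires_grad=True)
    for _ in range(steps):
        feat = extractor(image + delta)
        # Pull the poisoned image's features toward the target concept.
        loss = torch.nn.functional.mse_loss(feat, target_feat)
        loss.backward()
        with torch.no_grad():
            delta -= step_size * delta.grad.sign()            # descend on the loss
            delta.clamp_(-epsilon, epsilon)                   # keep it imperceptible
            delta.copy_((image + delta).clamp(0, 1) - image)  # stay a valid image
        delta.grad.zero_()
    return (image + delta).detach()

# Usage (hypothetical tensors):
# poisoned = poison_image(dog_photo, car_photo)
```

The general intuition is that a change which is tiny in pixel space can be large in a model's feature space, which is why such perturbations can be nearly invisible to people yet disruptive to the models trained on them.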
