Jan 21, 2024

Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use

Posted in categories: internet, robotics/AI

The Glaze/Nightshade team, for its part, denies that it is seeking destructive ends, writing: “Nightshade’s goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.”

In other words, the creators want AI model developers to pay artists for uncorrupted training data rather than scrape it without permission.
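To make the idea concrete, here is a minimal, illustrative sketch of the general mechanism: embedding a small, bounded pixel perturbation in an image before posting it, so that the published file differs subtly from the original. This is not Nightshade’s actual algorithm, which computes carefully optimized, prompt-targeted perturbations rather than random noise; the epsilon budget, function name, and file paths below are all assumptions for illustration.

```python
# Toy sketch: add a small, visually subtle perturbation to an image
# before publishing it. NOT the Nightshade algorithm -- Nightshade
# optimizes targeted perturbations; this only shows that tiny pixel
# changes can be embedded while the image stays visually unchanged.
import numpy as np
from PIL import Image

def perturb_image(path_in: str, path_out: str,
                  epsilon: int = 4, seed: int = 0) -> None:
    """Add bounded pseudo-random noise (at most +/- epsilon per channel)."""
    rng = np.random.default_rng(seed)
    # Load as int16 so adding signed noise cannot overflow uint8.
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    # Clamp back to the valid 0-255 range and save the perturbed copy.
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Example usage (file names are placeholders):
# perturb_image("artwork.png", "artwork_shaded.png")
```

A per-channel budget of a few intensity levels keeps the change imperceptible to viewers; the real tool’s contribution is choosing *which* perturbation to apply so that models trained on the image learn the wrong concept associations.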

How did we get here? It all comes down to how AI image generators have been trained: by scraping data from across the web, including original artworks posted by artists who had no prior knowledge of, or say in, the practice, and who argue that the resulting AI models threaten their livelihoods by competing with them.
