Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use
(venturebeat.com)
It poisons the training data so that models scraping it learn the wrong associations, degrading the images they generate in the future. It's not about protecting a single image; that's what Glaze is for. This is about making the AI worse at generating new images.
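To make the general idea concrete (this is not Nightshade's actual code, just a minimal sketch of feature-space poisoning): you nudge an image's embedding toward a different "anchor" concept while keeping the pixel change small, so a model trained on the image associates it with the wrong concept. The `encoder` here is a stand-in for any pretrained image feature extractor; all names and parameters are illustrative assumptions.

```python
import torch

def poison(image, anchor_image, encoder, steps=200, lr=0.01, budget=0.03):
    """Sketch: add a small perturbation so image's features resemble anchor_image's."""
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        target_feat = encoder(anchor_image)          # features of the "wrong" concept
    for _ in range(steps):
        feat = encoder((image + delta).clamp(0, 1))  # features of the perturbed image
        loss = torch.nn.functional.mse_loss(feat, target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)            # keep the change visually subtle
    return (image + delta).clamp(0, 1).detach()
```

The key point is that the perturbed image still looks normal to a person, but a model that trains on many such images starts linking the artist's style or subject to the anchor concept, which is what makes future generations worse.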