Nightshade, the AI data poisoning tool, is now available. In a recent article, The Register says, “Nightshade poisons image files to give indigestion to models that ingest data without permission. It’s intended to make those training image-oriented models respect content creators’ wishes about the use of their work.”
Artists and photographers can use this defensive tool to deter their work from being used without permission to train AI models. It’s free and easy to use, but it does require a large chunk of disk space and at least 30 minutes per image to achieve the desired effect. The processed images look almost identical to the originals, though. Attached below are four of my own images, each processed through Nightshade’s default algorithm for about 30 minutes.
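To make that “almost identical” claim concrete, here is a toy Python sketch of the general idea that a small, bounded per-pixel change can be invisible to a human viewer. To be clear, this is not Nightshade’s actual algorithm, which computes optimized, model-targeted perturbations; the file names and the epsilon value here are hypothetical, purely for illustration.

```python
# Conceptual illustration only, NOT Nightshade's algorithm: shift each
# pixel by a small random amount bounded by +/- epsilon (out of 255).
# The result is visually near-identical to the original image.
import numpy as np
from PIL import Image

def perturb(in_path: str, out_path: str, epsilon: int = 4) -> None:
    """Save a copy of the image with bounded per-pixel noise added."""
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(out_path)

perturb("original.jpg", "shaded.jpg")
```

With epsilon at 4, no pixel moves more than about 1.5% of its range, which is why a perturbed image and its original are so hard to tell apart by eye.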
Even if you are fine with having your own images used to train these large datasets without your explicit permission, I hope you can understand that other folks aren’t. I think we all have an obligation to use whatever tools we can to fight for the rights of artists and push OpenAI, Stability AI, Midjourney, and other companies to use only ethically sourced data.
Unsurprisingly, ChatGPT is more concerned with the changes artists make to their own images than with how OpenAI uses those same images without permission.