New Tool Lets Artists "Poison" Their Work to Mess Up AI Trained on It
The advent of AI-powered image generators that can whip up an image in any style from a text prompt has shaken many human artists to their core.
In particular, many have griped over their original work being used to train these AI models — a use they never opted into, and for which they're not compensated.
But what if artists could "poison" their work with a tool that alters it so subtly that the human eye can't tell, while wreaking havoc on AI systems that try to digest it?
That's the idea behind a new tool called "Nightshade," which its creators say does exactly that. As laid out in a yet-to-be-peer-reviewed paper spotted by MIT Technology Review, a team of researchers led by University of Chicago professor Ben Zhao built the system to generate prompt-specific "poison samples" that scramble the digital brains of image generators like Stable Diffusion, screwing up their outputs.
In early experiments with Nightshade, Zhao and his team found that it took just 50 poisoned images to get an otherwise unmodified version of Stable Diffusion to create weird, demented pictures when asked to draw a dog. And just 300 poisoned samples caused the machine-learning model to spit out images that looked more like cats than dogs.
Best of all, Nightshade isn't technically limited to poisoning prompts like "dog." Because of how AI image generators work, it also infects tangentially related prompts, like "puppy" and "husky."
"Surprisingly, we show that a moderate number of Nightshade attacks can destabilize general features in a text-to-image generative model, effectively disabling its ability to generate meaningful images," the paper reads.
Artists are reveling in the opportunity to fight back.
"I’m just really grateful that we have a tool that can help return the power back to the artists for their own work," artist Autumn Beverly told MIT Tech.
Nightshade builds on a different tool called Glaze, also developed by Zhao and his team. According to the project's official website, Glaze is designed to "protect human artists by disrupting style mimicry" by making a "set of minimal changes to artworks, such that it appears unchanged to human eyes, but appears to AI models like a dramatically different art style." The team is hoping to eventually integrate Nightshade into Glaze, giving content creators an additional line of defense against AI models.
But the tool is still little-tested in the real world. Mainstream image-generating tools are trained on billions of samples, which means it's still unclear just how useful tools like Nightshade will realistically be as a "last defense for content creators against web scrapers that ignore opt-out/do-not-crawl directives," as the researchers write in their paper.
"We don’t yet know of robust defenses against these attacks," Vitaly Shmatikov, a Cornell professor who studies AI models, who was not involved in the research, told MIT Tech Review. "We haven’t yet seen poisoning attacks on modern [machine learning] models in the wild, but it could be just a matter of time."
The question of unpaid labor being used to train AI models is a hot one. Earlier this year, for instance, a group of artists sued the creators of image generators Stable Diffusion and Midjourney, arguing that their livelihoods were at stake.
That threat is very real, with AI image generators already starting to take work away from artists and illustrators.
Some AI companies have started offering artists new ways to have their work excluded from AI training data. For instance, late last month, OpenAI announced new ways for artists to "opt out" and have their artwork not appear in DALL-E data sets.
To many artists, however, these kinds of efforts are far too little, too late.
Nightshade and Glaze, however, could give these embattled artists some of their agency back — and a way to retaliate.
Take Eva Toorenent, an illustrator and artist who has used Glaze. She told MIT Tech that Nightshade will "make [AI companies] think twice, because they have the possibility of destroying their entire model by taking our work without our consent."
More on AI image generators: Disney Has No Comment on Microsoft's AI Generating Pictures of Mickey Mouse Doing 9/11