Tuesday, October 24, 2023

AI = Amplify Ignorance. Fighting back with Nightshade

These two images were making the rounds today...


My first thought on seeing these was to take them seriously -- posts like these confirm that AI image generators amplify ignorance. Erode creative rights. Train the public to reinforce their own misconceptions about artists and the creative process. Reward instant gratification. Rob our collective soul.

It’s more important than ever to educate fans. Empower them to become advocates. Protect what we all value. These tools aren’t going away. We need to forge the playground rules. 

The comment threads on the posts were full of outrage from artists -- and plenty of suggestions that everyone was falling for a troll just trying to push buttons.

Yes, this particular example may indeed be a troll. But I wouldn’t be surprised if many people playing around with Midjourney really do think they are improving the art they are playing with. We all have access to a vast library of images online -- but there are not enough opportunities for the public to learn about art. The history. The process. The people who make it possible.

AI image generator tools are here to stay. What we need are clear rules that require the companies to credit and compensate people who elect to have their images and content scraped to train the AI.

So far, the companies have built these platforms on the assumption that all content online was available for their use. They failed to ask permission or to establish licensing, credit, or compensation arrangements -- because just taking the work was easiest for them. That puts the burden on the rest of us to decipher what we see online. Not to mention forcing artists and others to file lawsuits so the law can catch up with protecting creative rights.

Only when “opt in” is the default will artists -- and everyone else who posts anything online -- be protected from IP theft. With opt-in as the default, companies must assume all content is excluded from scraping UNLESS the copyright owner elects to opt in to be included (and, hopefully, credited and compensated, though those issues are still contentious). The companies want to keep their current stolen treasure trove, which they call their "foundation models." Artists are pushing lawmakers to force the companies to scrap the current data sets (via "data disgorgement") and start over.
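For anyone who wants the distinction in concrete terms, here is a rough sketch in Python of what an opt-in-by-default scraping filter could look like. Every name in it is hypothetical -- it illustrates the rule, not any company's actual pipeline:

    # Hypothetical sketch: opt-in as the DEFAULT for training-data scraping.
    # All names (ImageRecord, opted_in, scrapeable) are made up for illustration.
    from dataclasses import dataclass

    @dataclass
    class ImageRecord:
        url: str
        owner: str
        opted_in: bool = False  # the default: excluded unless the owner says yes

    def scrapeable(records):
        # Only explicitly opted-in work may be used for training --
        # the opposite of today's opt-out regimes, where silence means yes.
        return [r for r in records if r.opted_in]

    catalog = [
        ImageRecord("https://example.com/a.png", "artist_a"),                 # never asked: excluded
        ImageRecord("https://example.com/b.png", "artist_b", opted_in=True),  # consented: included
    ]

    for record in scrapeable(catalog):
        print("OK to train on:", record.url, "| owner:", record.owner)

Under opt-out, that one default would be flipped -- and the burden of saying "no" would fall on every artist, for every platform, forever.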

Some people may be fine with having their content scraped to build data sets. Especially if they might get credit and compensation for it. Or even if they don't.

Many artists want to exclude their artwork from the data sets that build AI image generation platforms. They are not interested in giving up what has taken them a lifetime to master.

Links to more resources on Copyright and AI basics are here: https://stuartngbooks.blogspot.com/2023/10/copyright-trademark-ai-image-generator.html


Artists are fighting back with tech tools too. First with Glaze https://glaze.cs.uchicago.edu/index.html#whatis

Now with Nightshade

https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/

(Art by STEPHANIE ARNETT/MITTR | RIJKSMUSEUM, ENVATO)

Excerpt:

"A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways.

The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creator’s permission. Using it to “poison” this training data could damage future iterations of image-generating AI models, such as DALL-E, Midjourney, and Stable Diffusion, by rendering some of their outputs useless—dogs become cats, cars become cows, and so forth. MIT Technology Review got an exclusive preview of the research, which has been submitted for peer review at computer security conference Usenix.  

AI companies such as OpenAI, Meta, Google, and Stability AI are facing a slew of lawsuits from artists who claim that their copyrighted material and personal information was scraped without consent or compensation. Ben Zhao, a professor at the University of Chicago, who led the team that created Nightshade, says the hope is that it will help tip the power balance back from AI companies towards artists, by creating a powerful deterrent against disrespecting artists’ copyright and intellectual property. Meta, Google, Stability AI, and OpenAI did not respond to MIT Technology Review’s request for comment on how they might respond.

Zhao’s team also developed Glaze, a tool that allows artists to “mask” their own personal style to prevent it from being scraped by AI companies. It works in a similar way to Nightshade: by changing the pixels of images in subtle ways that are invisible to the human eye but manipulate machine-learning models to interpret the image as something different from what it actually shows.

The team intends to integrate Nightshade into Glaze, and artists can choose whether they want to use the data-poisoning tool or not. The team is also making Nightshade open source, which would allow others to tinker with it and make their own versions. The more people use it and make their own versions of it, the more powerful the tool becomes, Zhao says. The data sets for large AI models can consist of billions of images, so the more poisoned images can be scraped into the model, the more damage the technique will cause. ...

... Junfeng Yang, a computer science professor at Columbia University, who has studied the security of deep-learning systems and wasn’t involved in the work, says Nightshade could have a big impact if it makes AI companies respect artists’ rights more—for example, by being more willing to pay out royalties.

AI companies that have developed generative text-to-image models, such as Stability AI and OpenAI, have offered to let artists opt out of having their images used to train future versions of the models. But artists say this is not enough. Eva Toorenent, an illustrator and artist who has used Glaze, says opt-out policies require artists to jump through hoops and still leave tech companies with all the power.

Toorenent hopes Nightshade will change the status quo.

“It is going to make [AI companies] think twice, because they have the possibility of destroying their entire model by taking our work without our consent,” she says.

Autumn Beverly, another artist, says tools like Nightshade and Glaze have given her the confidence to post her work online again. She previously removed it from the internet after discovering it had been scraped without her consent into the popular LAION image database.

“I’m just really grateful that we have a tool that can help return the power back to the artists for their own work,” she says."
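For readers curious about the mechanics: tools like Glaze and Nightshade rely on "adversarial perturbations" -- tiny, carefully chosen pixel changes that are invisible to us but push a model's math toward the wrong answer. Here is a minimal Python sketch of that core idea against a toy linear classifier. It is only an illustration of the principle -- Nightshade's actual method targets large diffusion models and is far more sophisticated, and every name and number below is made up:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for an image classifier: a linear scorer over 64x64 "pixels".
    # (Real targets like Stable Diffusion are vastly more complex; the principle
    # -- gradient-guided, imperceptible pixel changes -- is the same.)
    d = 64 * 64
    w = rng.normal(size=d) / np.sqrt(d)   # the "model"

    def predict(x):
        return "dog" if x @ w > 0 else "cat"

    image = rng.uniform(size=d)           # a toy "image", pixels in [0, 1]
    score = image @ w

    # Smallest uniform per-pixel change that flips a linear score,
    # padded 20% so clipping back into [0, 1] cannot undo the flip.
    epsilon = 1.2 * abs(score) / np.abs(w).sum()

    # Nudge every pixel slightly against the gradient of the score.
    poisoned = np.clip(image - np.sign(score) * epsilon * np.sign(w), 0.0, 1.0)

    print(predict(image), "->", predict(poisoned))   # the label flips
    print(f"max per-pixel change: {epsilon:.4f} of the 0-1 pixel range")

The label flips -- dog becomes cat, echoing the article -- while the per-pixel change stays tiny (the script prints it). Per the excerpt above, Nightshade's advance is making such perturbations poisonous at training time, so models that scrape the images learn the wrong associations.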

