Photo: Olivier26/Depositphotos

Throughout history, the nightshade plant was used to poison kings and emperors.

So it’s only fitting that a new tool used to poison AI art generators is named Nightshade.

Nightshade allows artists to add invisible, pixel-level changes to their artwork before they upload it online.
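To make "invisible, pixel-level changes" concrete: Nightshade computes its perturbation by optimizing against an image model, so the sketch below is only a toy stand-in, with random noise in place of the real signal and hypothetical file names. It shows how a change can be kept small enough that the eye cannot see it:

```python
# Toy illustration only: Nightshade's real perturbation is optimized
# against a model, not random. This just shows what an "imperceptible
# pixel-level change" means mechanically.
import numpy as np
from PIL import Image

EPSILON = 2  # max change per color channel (out of 255)

def perturb(path_in: str, path_out: str) -> None:
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    # Random noise stands in for the carefully optimized Nightshade signal.
    noise = np.random.randint(-EPSILON, EPSILON + 1, size=img.shape)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

perturb("artwork.png", "artwork_shaded.png")  # hypothetical file names
```

A shift of at most two intensity levels per channel is far below what the human eye can notice, which is why a treated image looks identical to the original.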

So what happens when an image is poisoned with Nightshade?

Based on tests by the developers, the poisoned data manipulates the AI models trained on it, for example leading a model to produce images of cats when prompted for dogs.
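Nightshade's real attack hides its signal inside images that still look normal, but the core idea of poisoning can be sketched with invented numbers. In the toy Python example below (every figure and name is hypothetical), a "model" that simply averages feature vectors per caption sees its "dog" concept drift toward "cat" once mislabeled samples are mixed into the training set:

```python
# Toy illustration of data poisoning, not Nightshade's actual method.
# The "model" here is just the average feature vector learned per caption.
import numpy as np

rng = np.random.default_rng(0)

# 100 clean training images captioned "dog" (features cluster near +1)...
clean = rng.normal(loc=1.0, scale=0.1, size=(100, 8))
# ...plus 20 poisoned images also captioned "dog", whose features
# actually sit in the "cat" region of feature space (near -1).
poison = rng.normal(loc=-1.0, scale=0.1, size=(20, 8))

features = np.concatenate([clean, poison])
learned_dog = features.mean(axis=0)  # what the model now thinks "dog" means

# The learned concept has drifted toward "cat": roughly 0.67 instead of 1.0.
print(round(float(learned_dog.mean()), 2))
```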

According to a recent tweet, their efforts have led to 78 million artworks being opted out of AI training.

This is what makes the efforts of Ben Zhao, the University of Chicago professor leading the Nightshade team, so intriguing.

The tool works by injecting invisible, pixel-level changes into the artwork.

In short, Nightshade is a new tool that performs a data poisoning attack against generative AI image models.

Poisoning is not new.

Poisoning genAI models at scale is new.

You've got the option to read the MIT TR article for the high-level story.

Smaller companies, individuals, and anyone else without a reputation or legal liability can scrape your data without regard.

The Nightshade paper is public.

Anyone is free to read it, understand it, and do follow-up research.

It is a technique, and it can be applied in other domains.

You know who you are.

Artists, the best tool we could have asked for to fight this unprecedented exploitation of our labor is here!

No, we aren't trying to somehow stuff poison into your AI model datasets ourselves.

We poison our own work.

And if YOU choose to scrape our work, YOU poison your own dataset.

It's retaliation for you offending first.