Artists fight back against AI with poison
A new tool can ruin image models if they train on artists' work
Hello Surfers!
Midjourney threw open the doors to their newly renovated, ultra-chic website; however, image generation remains exclusive to Discord for now. After reading this newsletter, they may have other challenges to address beyond perfecting their site.
Here's your two minutes of AI news for the day:
ONE PIECE OF NEWS
Artists fight back against AI with poison
Much like dodos strolling the sunlit beaches in the 1600s, artists were oblivious to the dangers that lay ahead when the sails of AI advancement appeared on last year's horizon. However, unlike the extinct dodo, artists are adapting to face the threat. Their newest weapon? Poison.
Enter Nightshade: a revolutionary tool that lets artists subtly alter their artworks. The changes are invisible to the naked eye, but they wreak havoc on any AI model trained on the altered images. A touch of this "poison", and suddenly the AI might mistake a dog for a cat, or a car for a cow.
Major players like OpenAI, Midjourney, and Stability AI have indiscriminately harvested web content, often overlooking artists' rights, to train their AI models. But with Nightshade in play, that practice just became much riskier. A mere 50 tampered dog images were enough to skew a Stable Diffusion model into producing distorted dogs, and around 300 effectively destroyed its concept of a dog altogether.
And Nightshade doesn't just affect a specific term: if "fantasy art" is disrupted, so are related terms like "dragon".
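To make the "invisible poison" idea concrete, here's a minimal sketch of a small, bounded pixel perturbation that nudges an off-the-shelf image classifier toward a different label. This is not Nightshade's actual algorithm (which targets the training data of text-to-image models); the classifier, the file name, the target class, and the step sizes are all assumptions made for illustration.

```python
# Illustrative only: a tiny targeted perturbation against a pretrained classifier.
# NOT Nightshade's method; it just shows that pixel changes too small to see
# can change what a model "sees" in an image.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
to_tensor = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),                      # pixel values in [0, 1]
])
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])

img = to_tensor(Image.open("dog.jpg").convert("RGB")).unsqueeze(0)  # hypothetical file

target = torch.tensor([281])                    # ImageNet class 281: "tabby cat"
epsilon = 4 / 255                               # max per-pixel change (imperceptible)
alpha = 1 / 255                                 # step size per iteration

perturbed = img.clone()
for _ in range(20):
    perturbed.requires_grad_(True)
    loss = F.cross_entropy(model(normalize(perturbed)), target)
    grad, = torch.autograd.grad(loss, perturbed)
    with torch.no_grad():
        # Step toward the target label, then keep the total change within epsilon
        # and the pixels within the valid [0, 1] range.
        perturbed = perturbed - alpha * grad.sign()
        perturbed = img + (perturbed - img).clamp(-epsilon, epsilon)
        perturbed = perturbed.clamp(0, 1)

print("top-1 class after perturbation:", model(normalize(perturbed)).argmax().item())
```

Nightshade applies the same idea of small, targeted pixel changes to images that end up in a generative model's training set, which is why a few hundred poisoned pictures can have such an outsized effect.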
This is a potential power shift from tech behemoths back to the art community. The developers of Nightshade are eager to open-source it, bolstering its potency. In simple terms, the more "poisoned" images AI systems inhale, the more chaos ensues.
Detecting these tampered images in the vast libraries containing billions of training photos is like finding a needle in a haystack. So getting rid of them could cost AI companies huge sums of money.
However, this tool is a double-edged sword. In the wrong hands, Nightshade could become AI's worst nightmare; it's like handing pranksters a can of spray paint next to a blank wall. It opens a fresh avenue for attacks and creates a pressing need for defenses against them.
The big hope? Nightshade may push AI corporations to truly value and perhaps even compensate artists. Many in the art world view Nightshade as a glimmer of hope. The underlying message is clear: with their foundational operations at risk, AI companies might finally exercise caution, respecting artistic consent.
ONE MORE THING
This is how to make ChatGPT help you think out an idea for a novel.
yall, you must try this
1. ChatGPT app
2. give the prompt "I want you to help me flesh out a story about X. Ask me questions about the worldbuilding and plot to help think. Do not offer suggestions unless asked."
3. turn on speaking mode, put in headphones, and go for a walk
stoop kid (@StoopMensch)
1:48 AM • Oct 23, 2023
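If you'd rather run the tweet's prompt from a script than from the ChatGPT app's voice mode, a minimal sketch using OpenAI's Python SDK might look like the following. The model name and the example story topic are assumptions, and you'd need your own API key.

```python
# A minimal sketch of the tweet's prompt as a terminal chat loop.
# The model name and the story topic ("a haunted lighthouse") are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "I want you to help me flesh out a story about a haunted lighthouse. "
    "Ask me questions about the worldbuilding and plot to help me think. "
    "Do not offer suggestions unless asked."
)

history = [{"role": "system", "content": SYSTEM_PROMPT}]

while True:
    user_input = input("you: ").strip()
    if not user_input:
        break                                   # empty line ends the session
    history.append({"role": "user", "content": user_input})
    response = client.chat.completions.create(model="gpt-4o", messages=history)
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print("gpt:", answer)
```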
If you have one more minute
Apple will bring AI to Apple Music, Pages, Keynote, and more
Sam Altman talks about the future of AI at the WSJ conference
How generative AI could radically reshape gaming - CNN
Quiz: AI images vs. history's most iconic photos – can you tell the difference?
A brain-inspired chip from IBM is more than 20 times as fast as current AI chips
6 retail leaders innovating the customer experience with AI - Insider
AI Art of the day
Giving ChatGPT the instruction "create in unusual styles from deep art history" yields amazing results.
That's it folks!
If you liked it, please share this hand-crafted newsletter with a friend and make this writer happy!