University of Chicago researchers publicly release Nightshade, a tool intended to “poison” images so that generative models trained on them are degraded
submitted by /u/Alphyn