University of Chicago researchers release Nightshade to the public, a tool intended to "poison" images in order to disrupt generative models trained on them

by AI Generated Robotic Content, in Image — posted on January 20, 2024

submitted by /u/Alphyn