Nightshade, a project from the University of Chicago, gives artists some recourse by "poisoning" image data, rendering it useless or disruptive to AI model training. Ben Zhao, the computer science professor who led the project, compared Nightshade to "putting hot sauce in your lunch so it doesn't get stolen from the workplace fridge." Read more at TechCrunch.
![Nightshade](https://www.coshorts.com/wp-content/uploads/2024/01/Nightshade.jpg)