Tuesday 24 October 2023

digitalis (11.073)

A new data-poisoning tool, Nightshade, lets artists fight back against generative AI by making invisible alterations to the pixels of their work, so that when the images are scraped for training (without consent or compensation) the models trained on them veer off in chaotic directions. These subtle changes could have significant downstream effects on later iterations of what has become largely recursive machine learning. With the industry facing numerous lawsuits over this unauthorised sampling, the tool's creator hopes that the method, which reminds me of trap streets on maps, fake entries in dictionaries and other honeypots, will act as a deterrent against such infringement.
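
This isn't Nightshade's actual algorithm (which optimises its perturbations against specific text-to-image models), but a minimal sketch of the underlying idea: a pixel change bounded tightly enough to be invisible to a human viewer while still altering every value a scraper ingests. The epsilon bound, filenames and Pillow/NumPy usage here are my own illustrative choices.

```python
# Illustrative only: a tiny, bounded pixel perturbation in the spirit of
# data-poisoning tools like Nightshade. The real tool computes targeted
# noise; here we simply show that a +/- 2 shift per channel (out of 255)
# is imperceptible to people yet changes the data a scraper collects.
import numpy as np
from PIL import Image

EPSILON = 2  # max change per channel, out of 255 (assumed, illustrative)

def perturb(path_in: str, path_out: str, seed: int = 0) -> None:
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = rng.integers(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

if __name__ == "__main__":
    # Hypothetical filenames, just to show the call.
    perturb("artwork.png", "artwork_shaded.png")
```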