Has anyone looked into ways to prevent bots from scraping a pod's catalogue and using the data for AI training without the creators' consent?
One approach I recently came across is a method called “poison-pilling with adversarial noise”: if I understand it correctly, inaudible noise is merged into a track before publication, which renders the file's contents unusable for AI training while human listeners won't hear any difference.
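To illustrate the basic idea, here is a minimal, purely conceptual sketch. Real adversarial-noise schemes compute the perturbation by optimizing it against a target model; this toy version just overlays low-amplitude random noise to show what an "imperceptible overlay" on normalized audio samples could look like (the function name and amplitude bound are made up for the example):

```python
import numpy as np

def add_inaudible_perturbation(samples: np.ndarray, amplitude: float = 1e-4) -> np.ndarray:
    """Overlay a tiny perturbation onto audio samples normalized to [-1, 1].

    NOTE: a real poison-pilling scheme would derive the perturbation from a
    model-specific adversarial optimization; uniform random noise here is
    only a stand-in to illustrate the concept, not actual protection.
    """
    rng = np.random.default_rng(seed=0)
    perturbation = rng.uniform(-amplitude, amplitude, size=samples.shape)
    # keep the result in the valid sample range
    return np.clip(samples + perturbation, -1.0, 1.0)

# one second of a 440 Hz sine tone at 44.1 kHz, normalized to [-1, 1]
t = np.linspace(0, 1, 44100, endpoint=False)
track = 0.5 * np.sin(2 * np.pi * 440 * t)
poisoned = add_inaudible_perturbation(track)

# the waveform barely changes: deviation stays within the amplitude bound
print(np.max(np.abs(poisoned - track)) <= 1e-4)  # → True
```

The point of the sketch is just that the perturbation's amplitude is far below anything audible, so playback is unaffected while the file's bits differ from the original.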
Apparently at least one research paper has already claimed that similar algorithms applied to digital images are not a reliable layer of protection. But even if the reliability turns out to be well below 100%, I'd still use the technique to pose at least a small obstacle to scraping bots that attempt to violate my rights.
If proven effective, would it make sense to implement something similar in Funkwhale?
—moontan