Comment on: This new data poisoning tool lets artists fight back against generative AI

ayaya@lemdro.id ⁨1⁩ ⁨year⁩ ago

Obviously this is exploiting some bug or weakness in the existing training process, so couldn’t they just patch the mechanism being exploited?

Or at the very least you could take a bunch of images, purposely poison them, and now you have a set of poisoned images and their non-poisoned counterparts allowing you to train another model to undo it.
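A toy sketch of that counter-attack idea: in reality a tool like Nightshade computes image-specific perturbations, so a real "undo model" would be a learned image-to-image network trained on the paired data. Purely for illustration, this assumes the simplest possible case where the poison is one fixed perturbation pattern, which the paired clean/poisoned set lets you estimate directly (all names here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def poison(images, pattern):
    # Stand-in for the poisoning tool: add a fixed perturbation.
    # (Real tools craft a different perturbation per image.)
    return images + pattern

# Build the paired "training set": clean images and their poisoned twins.
clean = rng.random((100, 8, 8))
pattern = 0.05 * rng.standard_normal((8, 8))
poisoned = poison(clean, pattern)

# "Train" the undo model: with pairs available, the average per-pixel
# difference between poisoned and clean recovers the pattern.
estimated_pattern = (poisoned - clean).mean(axis=0)

# Apply it to a poisoned image the model has never seen.
new_clean = rng.random((8, 8))
restored = poison(new_clean, pattern) - estimated_pattern

print(float(np.abs(restored - new_clean).max()))  # near-zero residual
```

Against a real per-image poison the subtraction step would be replaced by a trained denoiser, but the paired-data setup is the same.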

Sure, you’ve set up a speed bump, but this is hardly a solution.
