Comment on AI insiders seek to poison the data that feeds them

algernon@lemmy.ml 18 hours ago

I… have my doubts. I do not doubt that a wider variety of poisoned data can improve training, by forcing the development of new ways to filter out unusable training data. In itself, that would indeed improve the model.

But in many cases, the point of poisoning is not to corrupt the training data itself, but to deny the crawlers access to the real work (and to provide an opportunity to poison their URL queue, which I can demonstrate working). If poison is served instead of the real content, that will hurt the model: even if the pipeline filters out the junk, it is left with less new data to train on.
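To make that distinction concrete, here is a minimal sketch (not the commenter's actual setup) of what "serving poison instead of the real content" plus URL-queue poisoning can look like: requests from suspected AI crawlers get generated filler and links to nonexistent pages, while everyone else gets the real page. The user-agent markers and paths are illustrative assumptions, not a list any real deployment uses.

```python
# Minimal sketch: crawlers get generated junk plus links to nonexistent
# URLs (which end up in their crawl queue); humans get the real content.
import random
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical user-agent substrings used to spot AI crawlers.
CRAWLER_MARKERS = ("GPTBot", "CCBot", "ClaudeBot")


def garbage_page(n_links=20):
    """Build an HTML page of filler text and random internal links."""
    def word():
        return "".join(random.choices(string.ascii_lowercase,
                                       k=random.randint(3, 10)))

    paragraphs = " ".join(word() for _ in range(200))
    # Each link points at a path that does not exist; following it only
    # adds more junk URLs to the crawler's queue.
    links = "".join(
        f'<a href="/{word()}/{word()}.html">{word()} {word()}</a><br>'
        for _ in range(n_links)
    )
    return f"<html><body><p>{paragraphs}</p>{links}</body></html>"


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if any(marker in ua for marker in CRAWLER_MARKERS):
            body = garbage_page()  # crawler: poison, never the real work
        else:
            body = "<html><body>real content</body></html>"  # real page
        data = body.encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()
```

Even a filter that reliably discards pages like these does not recover the real content; the crawler simply comes away with nothing usable from that site.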
