Comment on This new data poisoning tool lets artists fight back against generative AI

MxM111@kbin.social ⁨11⁩ ⁨months⁩ ago

As I understand it, healthy people hallucinate all the time, but in a different, non-psychiatric sense. A healthy brain simply has an extra filter that rejects hallucinations that do not correspond to the signals coming from reality; that is, our brain constantly performs extra checks. But we often get fooled when those checks go wrong. For example, you can think you saw an animal when it was just a shadow. There is even a view that our perception of the world is a “controlled hallucination”: we mostly imagine the world, then best-fit that model to minimize the error against external stimuli.

Of course, current ANNs do not have such extensive error checking, so they are more prone to these “hallucinations”. But fundamentally they are very similar to the “generative suggestions” our own brains produce.
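The “controlled hallucination” idea can be sketched as a toy loop: keep an internal guess and repeatedly nudge it toward the incoming stimulus to shrink the prediction error. The function name, learning rate, and step count here are all made-up illustration, not anyone’s actual model:

```python
# Toy sketch of perception as error-corrected prediction:
# the system starts from a prior guess and partially corrects it
# toward the stimulus on each step, rather than copying it raw.

def perceive(stimulus, prior_guess, learning_rate=0.3, steps=20):
    """Iteratively reduce the error between an internal guess and a stimulus."""
    guess = prior_guess
    for _ in range(steps):
        error = stimulus - guess        # prediction error
        guess += learning_rate * error  # partial correction each step
    return guess

# The percept converges near the stimulus, but its path depends on the prior:
percept = perceive(stimulus=1.0, prior_guess=0.0)
print(round(percept, 3))
```

With a weak correction (low learning rate) or few steps, the “percept” stays closer to the prior than to the stimulus, which is one way to picture mistaking a shadow for an animal.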
