Comment on Judge dismisses authors' copyright lawsuit against Meta over AI training

tabular@lemmy.world 3 weeks ago

“hallucination refers to the generation of plausible-sounding but factually incorrect or nonsensical information”

Is an output a hallucination when the training data involved in the output included factually incorrect data? Suppose my input is “is the world flat” and then an LLM, allegedly, accurately reproduces a flat-earther’s writings saying it is.
