ramirezmike
@ramirezmike@programming.dev
- Comment on Anthropic apologizes after one of its expert witnesses cited a fake article hallucinated by Claude in the company's legal battle with music publishers 2 days ago:
That would be lying: knowingly making something up. The AI doesn’t know what it’s saying, so it’s not lying. “Hallucinating” isn’t a perfect word, but it’s much more accurate than “lying.”