LLMs hallucinate and are generally willing to go down rabbit holes, so if you have some crazy theory you're more likely to get a false positive from ChatGPT.
So I think it exacerbates things more than the alternatives do.