Comment on I totally missed the point when PeerTube got so good
Crozekiel@lemmy.zip 2 days ago
Why the fuck do people ask ChatGPT for shit like this? ChatGPT doesn’t know facts. It’s a magic 8-ball with more words.
eronth@lemmy.world 2 days ago
Asking ChatGPT can be super useful to get info. I just don’t understand why people don’t try to verify what it says before re-posting it as fact.
Taldan@lemmy.world 2 days ago
For basic fact checking like this, it’s basically useless. You’d have to go look it up to verify anyway, so it’s just an extra step. There are use cases for it, but this isn’t one of them.
Ulrich@feddit.org 2 days ago
Explain AI in 10 words or less:
bigfondue@lemmy.world 2 days ago
If you are just going to verify the info, why not just find out yourself and save yourself some time?
nulluser@lemmy.world 2 days ago
It depends on what info you’re trying to find.
I was recently trying to figure out the name of a particular uncommon type of pipe fitting. I could describe what it looked like, but had no idea what it was called. I described it to chatgpt, which gave me a name, which I could then search for with a normal search engine to confirm that the name was correct. Sure enough, search results took me to plumbing supply companies selling it, with pictures that matched what I described.
But asking it when a particular feature got added to a piece of software? There’s no additional information in the answer that would help you confirm whether it’s correct.
Ulrich@feddit.org 2 days ago
You should use something like Perplexity instead, which actually provides links to where it found the information.
eronth@lemmy.world 1 day ago
Sometimes it’s nice to know where you even start, then verify from there.
Ulrich@feddit.org 2 days ago
Why bother even using CGPT when you have to go elsewhere to verify everything it says anyway?
oantolin@discuss.online 2 days ago
It depends on the type of facts, but sometimes it’s much easier to verify an answer than to get the answer in the first place. For example, sometimes the LLM will mention a keyword that you didn’t know or didn’t remember, and that makes googling much easier.
chiliedogg@lemmy.world 2 days ago
The only thing it’s useful at is shit that isn’t necessary.
We had a P&Z member at the city I work at get butthurt because we corrected him at a meeting, so the city manager asked me to write an apology letter to him.
That was the one time I loved ChatGPT. It was bullshit that didn’t need to happen that I didn’t care about and achieved nothing, so I let the fucking bot write it.