Who gets their news from ChatGPT lol
Comment on BBC will block ChatGPT AI from scraping its content
porkins@sh.itjust.works 1 year agoI’d rather have ChatGPT know about news content than not. I appreciate the convenience. The news shouldn’t have barriers.
Apollo@sh.itjust.works 1 year ago
FlyingSquid@lemmy.world 1 year ago
A disturbing number of people.
spez_@lemmy.world 1 year ago
I do
Apollo@sh.itjust.works 1 year ago
Why?
prashanthvsdvn@lemmy.world 1 year ago
It’s funny seeing Apollo and spez_ fighting on a topic regarding ChatGPT.
abhibeckert@lemmy.world 1 year ago
Because ChatGPT doesn’t do clickbait headlines or auto-play videos?
Touching_Grass@lemmy.world 1 year ago
You don’t get your news from it, but building tools can be useful. Scraping news websites to measure different articles for things like semantic analysis, or to identify media tricks that manipulate readers, is a fun practice. You can use an LLM to identify propaganda much more easily. I can see why media would be scared that regular people can run these tools on their propaganda machine easily.
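A minimal sketch of that idea, using only the Python standard library. The paragraph extraction is deliberately crude and the prompt wording is hypothetical, not taken from any real analysis pipeline; a real tool would fetch the page and send the prompt to an LLM API:

```python
from html.parser import HTMLParser

class ParagraphExtractor(HTMLParser):
    """Collects text inside <p> tags - a crude stand-in for a real article scraper."""
    def __init__(self):
        super().__init__()
        self.in_p = False
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_p = True
            self.paragraphs.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_p = False

    def handle_data(self, data):
        if self.in_p:
            self.paragraphs[-1] += data

def build_analysis_prompt(html: str) -> str:
    """Turn scraped article HTML into a prompt asking an LLM to flag manipulation tricks."""
    parser = ParagraphExtractor()
    parser.feed(html)
    article = "\n".join(p.strip() for p in parser.paragraphs if p.strip())
    return (
        "Identify any loaded language, appeals to emotion, or framing tricks "
        "in the following article, quoting each instance:\n\n" + article
    )

sample = "<html><body><p>Critics slammed the shocking decision.</p></body></html>"
print(build_analysis_prompt(sample))
```

The interesting work happens on the LLM side; the sketch just shows that the scraping-and-prompting plumbing is simple enough for anyone to run.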
C4d@lemmy.world 1 year ago
The pure ChatGPT output would probably be garbage. The dataset will be full of all manner of sources (together with their inherent biases), others with spin, untruths and outright parody, and it’s not apparent that there is any kind of curation or quality assurance on the dataset (please correct me if I’m wrong).
I don’t think it’s a good tool for extracting factual information. It does seem to be good at synthesising prose and helping with writing ideas.
I am quite interested in things like this where the output from a “knowledge engine” is paired with something like ChatGPT - but it would be for, e.g., writing a science paper rather than news.
Touching_Grass@lemmy.world 1 year ago
I don’t think it’s generating news. Sounds like people are using it to reformat articles already written, to remove all the bullshit propaganda from the news. Like taking a Fox News article and just pulling out the key information.
netchami@sh.itjust.works 1 year ago
But ChatGPT often takes correct and factual sources and adds a whole bunch of nonsense and then spits out false information. That’s why it’s dangerous. Just go to the fucking news websites and get your information from there. You don’t need ChatGPT for that.
echodot@feddit.uk 1 year ago
So they have automated Fox then.
netchami@sh.itjust.works 1 year ago
Yeah, pretty much.
guacupado@lemmy.world 1 year ago
More data fixes that flaw, not less.
netchami@sh.itjust.works 1 year ago
Not too long ago, ChatGPT didn’t know what year it was. You’re telling me it needs more data than it already has to figure out the current year? I like AI for certain things (mostly some programming/scripting stuff), but you definitely don’t need it to read the news.
ours@lemmy.film 1 year ago
Yes. The LLM doesn’t know what year it currently is; it needs to get that info from a service and then answer.
It’s a Large **Language** Model, not an actual sentient being.
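That “get it from a service” step is trivial to illustrate. A hypothetical sketch of how a chat frontend might inject the current date into the prompt, so the model never has to “know” it (the function name and prompt framing are made up for illustration):

```python
from datetime import date

def with_current_date(question: str) -> str:
    """Prepend today's date to a user question so the model can answer
    time-sensitive queries without any built-in notion of 'now'."""
    return f"Today's date is {date.today().isoformat()}.\n\n{question}"

print(with_current_date("What year is it?"))
```

The model then just reads the year out of its own context window, which is exactly why it seems to “know” the date in some interfaces and not others.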
CurlyMoustache@lemmy.world 1 year ago
It is not “a flaw”, it is the way large language models work. They try to replicate how humans write by guessing based on a language model. They have no knowledge of what is a fact and what is not, and that is why using LLMs to do research, or using them as a search engine, is both stupid and dangerous.
Touching_Grass@lemmy.world 1 year ago
How would it hallucinate information from an article you gave it?
Natanael@slrpnk.net 1 year ago
It’s not more data; the underlying architecture isn’t designed for handling facts.
Touching_Grass@lemmy.world 1 year ago
You just described news