C4d@lemmy.world 11 months ago
Comment on BBC will block ChatGPT AI from scraping its content
netchami@sh.itjust.works 11 months ago
Kinda late
Exactly. The data harvest has been years in the making.
porkins@sh.itjust.works 11 months ago
I’d rather have ChatGPT know about news content than not. I appreciate the convenience. The news shouldn’t have barriers.
netchami@sh.itjust.works 11 months ago
But ChatGPT often takes correct and factual sources and adds a whole bunch of nonsense and then spits out false information. That’s why it’s dangerous. Just go to the fucking news websites and get your information from there. You don’t need ChatGPT for that.
echodot@feddit.uk 11 months ago
So they have automated Fox then.
netchami@sh.itjust.works 11 months ago
Yeah, pretty much.
guacupado@lemmy.world 11 months ago
More data fixes that flaw, not less.
netchami@sh.itjust.works 11 months ago
Not too long ago, ChatGPT didn’t know what year it is. You’re telling me it needs more data than it already has to figure out the current year? I like AI for certain things (mostly some programming/scripting stuff) but you definitely don’t need it to read the news.
CurlyMoustache@lemmy.world 11 months ago
It is not “a flaw”, it is the way large language models work. They try to replicate how humans write by guessing the next word based on a language model. They have no knowledge of what is a fact and what isn’t, and that is why using LLMs to do research, or as a search engine, is both stupid and dangerous.
Natanael@slrpnk.net 11 months ago
It’s not about more data; the underlying architecture isn’t designed for handling facts.
Touching_Grass@lemmy.world 11 months ago
You just described news
Apollo@sh.itjust.works 11 months ago
Who gets their news from ChatGPT lol
FlyingSquid@lemmy.world 11 months ago
A disturbing number of people.
spez_@lemmy.world 11 months ago
I do
Apollo@sh.itjust.works 11 months ago
Why?
Touching_Grass@lemmy.world 11 months ago
You don’t get your news from it, but building tools with it can be useful. Scraping news websites to measure different articles for things like semantic analysis, or to identify media tricks that manipulate readers, is a fun practice. You can use an LLM to identify propaganda much more easily. I can see why the media would be scared that regular people can run these tools on their propaganda machine so easily.
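A minimal sketch of that kind of analysis, with a crude hand-made keyword lexicon standing in for the LLM step (the lexicon and the scoring heuristic here are assumptions for illustration, not a real propaganda detector):

```python
import re

# Tiny illustrative lexicon of "loaded language" terms; a real pipeline
# would use an LLM or a curated propaganda-technique dataset instead.
LOADED_TERMS = {
    "outrage", "shocking", "disaster", "radical", "destroy",
    "elites", "scheme", "invasion", "chaos", "betrayal",
}

def loaded_language_score(text: str) -> float:
    """Fraction of words in the text that appear in the loaded-terms lexicon."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in LOADED_TERMS)
    return hits / len(words)

article = ("Radical elites scheme to destroy the economy, "
           "causing shocking chaos for ordinary families.")
neutral = ("The committee voted 7-2 to approve the budget "
           "after a two-hour public hearing.")

print(loaded_language_score(article) > loaded_language_score(neutral))  # True
```

Scale the same idea up by replacing the lexicon lookup with a call to a language model and you get roughly the workflow the comment describes.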
C4d@lemmy.world 11 months ago
The pure ChatGPT output would probably be garbage. The dataset will be full of all manner of sources (together with their inherent biases), along with spin, untruths and outright parody, and it’s not apparent that there is any kind of curation or quality assurance on the dataset (please correct me if I’m wrong).
I don’t think it’s a good tool for extracting factual information. It does seem to be good at synthesising prose and helping with writing ideas.
I am quite interested in setups where the output from a “knowledge engine” is paired with something like ChatGPT, but that would be for e.g. writing a science paper rather than news.
Touching_Grass@lemmy.world 11 months ago
I don’t think it’s generating news. It sounds like people are using it to reformat articles that are already written, to strip out all the bullshit propaganda. Like taking a Fox News article and just pulling out the key information.
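That “pull out the key information” step can be sketched without an LLM at all, e.g. with simple frequency-based extractive ranking (the scoring rule below is an assumption for illustration; it is not how ChatGPT works internally):

```python
import re
from collections import Counter

def key_sentences(text: str, k: int = 2) -> list:
    """Return the k sentences whose words are most frequent overall --
    a crude extractive stand-in for LLM-based key-point extraction."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> float:
        tokens = re.findall(r"[a-z']+", sentence.lower())
        # Average word frequency, so long sentences aren't favoured unfairly.
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    return sorted(sentences, key=score, reverse=True)[:k]

text = ("The budget passed. The budget vote on the budget was close. "
        "Weather was nice.")
print(key_sentences(text, 2))  # the two budget sentences, not the weather one
```

An LLM does this far better, of course, which is the commenter’s point: the interesting use is post-processing existing articles, not generating news.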