Comment on I totally missed the point when PeerTube got so good
deranger@sh.itjust.works 1 day ago
I’d argue that it’s very important, especially since more and more people are using it. Wikipedia is generally correct, and people, myself included, edit incorrect things. ChatGPT is a black box with no user feedback. It’s also stupid to waste resources running an inefficient LLM on something a regular search and a few minutes of time, along with about a bite of an apple’s worth of energy, could easily handle. After all that, you’re going to need to check all the sources ChatGPT used anyway, so how much time is it really saving you? At least with Wikipedia I know other people have looked at the same things I’m looking at, and a small percentage of those people will actually correct errors.
agamemnonymous@sh.itjust.works 1 day ago
From what I can tell, running an LLM isn’t really all that energy intensive; it’s the training that takes loads of energy. And it’s not like regular searches are free either — search engines use loads of energy to index web results in the first place.
And this also ignores the gap between having a question and knowing how to search for the answer. You might not even know where to start. Maybe you can search a vague question, but then you’re essentially hoping that somewhere in the first few results is a relevant discussion to get you on the right path. GPT, I find, is more efficient at getting from vague questions to more directed queries.
I find this attitude much more troubling than responsible LLM use. You should not be trusting tertiary sources, no matter how good their track record; you should be checking the sources used by Wikipedia too. You should always be checking your sources.
That’s beyond the scope of my argument, and not really much worse than pasting directly from any tertiary source.