Because the tech industry hasn’t had a real hit of its favorite poison, “private equity,” in too long.
The industry has run the same playbook since at least 2006. Likely before, but that’s when I personally started seeing it. My take is that they got addicted to the dotcom bubble and decided they could and should recreate the magic every 3-5 years or so.
This time it’s AI, last time it was crypto, and before that we’ve had web 2.0, web 3.0, and a few others I’m likely missing.
But yeah, it’s sold as a panacea every time, when really it’s revolutionary for maybe a handful of tasks.
sugar_in_your_tea@sh.itjust.works 1 month ago
Exactly! LLMs are useful when used properly, and terrible when not used properly, like any other tool. Here are some things they’re great at:

- getting a quick intro to an unfamiliar topic
- finding new keywords to use in a traditional search
- getting a broad summary of a subject
- looking up facts when you have trouble remembering them
Some things they’re terrible at:
I use LLMs a handful of times a week, and pretty much only when I’m stuck and need a kick in a new (hopefully right) direction.
spankmonkey@lemmy.world 1 month ago
I used to be able to use Google and other search engines to do these things before they went to shit in the pursuit of AI integration.
sugar_in_your_tea@sh.itjust.works 1 month ago
Google search was pretty bad at each of those, even when it was good. Finding new keywords to use is especially difficult the more niche your area of search is, and I’ve spent hours trying different combinations until I found a handful of specific keywords that worked.
Likewise, search is bad for getting a broad summary, unless someone has bothered to write one up on a blog. Most of what you find goes way too deep, and you still need multiple sources to piece the summary together.
Fact lookup is one of the better uses for search, but again, I usually need to remember which source had what I wanted, whereas the LLM can usually pull it out for me.
I use traditional search most of the time (usually DuckDuckGo), and LLMs if I think it’ll be more effective. We have some local models at work that I use, and they’re pretty helpful most of the time.
jjjalljs@ttrpg.network 1 month ago
It is absolutely stupid, stupid to the tune of “you shouldn’t be a decision maker”, to think an LLM is a better choice for “getting a quick intro to an unfamiliar topic” than reading an actual intro to that topic. For most topics, Wikipedia is right there, complete with sources. For obscure things, an LLM is just going to lie to you.
As for “looking up facts when you have trouble remembering it”, using the lie machine is a terrible idea. It’s going to say something plausible, and you tautologically are not in a position to verify it. And, as above, you’d be better off finding a reputable source. If I type in “how do i strip whitespace in python?” an LLM could very well say “it’s your_string.strip()”. That’s wrong. Just send me to the fucking official docs.
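To be concrete about why that kind of one-liner can mislead (the string below is just an illustrative example, not anything from the docs): `str.strip()` only touches leading and trailing whitespace, which may or may not be what the question was even asking for.

```python
s = "  hello   world  "

# str.strip() only removes leading and trailing whitespace
print(s.strip())            # "hello   world"

# Removing *all* whitespace is a different operation entirely
print("".join(s.split()))   # "helloworld"

# Or collapse runs of whitespace down to single spaces
print(" ".join(s.split()))  # "hello world"
```

The official docs spell that difference out; the confident one-line answer doesn’t.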
There are probably edge or special cases, but for general search on the web? LLMs are worse than search.
spankmonkey@lemmy.world 1 month ago
No search engine or AI will be great with vague descriptions of niche subjects, because by definition niche subjects are too uncommon to have a “close enough” pattern to match against.
LePoisson@lemmy.world 1 month ago
I will say I’ve found LLMs useful for code writing, but I’m not coding anything real at work. Just bullshit like SQL queries, Excel macro scripts, or Power Automate crap.
It still fucks up, but if you can read code and have a feel for it, you can walk it to where it needs to be (and see where it screwed up).
sugar_in_your_tea@sh.itjust.works 1 month ago
Exactly. Vibe coding is bad, but generating code for something you don’t touch often but can absolutely understand is totally fine. I’ve used it to generate SQL queries for relatively odd cases, such as CTEs for improving performance for large queries with common sub-queries. I always forget the syntax, and LLMs are great at generating something reasonable that I can tweak for my tables.
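For anyone who hasn’t seen the syntax, here’s a minimal sketch of a CTE (the `WITH` clause). The table, columns, and numbers are made up, and it only shows the basic shape, not the performance tuning:

```python
import sqlite3

# Made-up in-memory table so the sketch actually runs
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 10.0), (1, 25.0), (2, 5.0)],
)

# A CTE names a sub-query once up front so the main query can
# reuse it instead of repeating the same sub-query inline
query = """
WITH customer_totals AS (
    SELECT customer_id, SUM(total) AS lifetime_total
    FROM orders
    GROUP BY customer_id
)
SELECT customer_id, lifetime_total
FROM customer_totals
WHERE lifetime_total > 20
"""

for row in conn.execute(query):
    print(row)  # (1, 35.0)
```

That `WITH ... AS (...)` shape is the part I always blank on; the LLM fills in the boilerplate and I swap in my own tables and columns.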
LePoisson@lemmy.world 1 month ago
Me with literally every bit of code I touch, always and forever.