End of coding? Microsoft framework makes devs AI supervisors
Submitted 7 months ago by floofloof@lemmy.ca to technology@lemmy.world
Comments
AdamEatsAss@lemmy.world 7 months ago
Lol. Humans are just moving up the stack. I’m sure some people were upset that we wouldn’t need electrical engineers anymore once digital circuits were invented. AI is a tool, and without a trained user a tool is almost useless.
vanderbilt@lemmy.world 7 months ago
I use Claude to write plenty of the code we use, but it comes with the huge caveat that you can’t blindly accept what it says. Ever hear newscasters talk about some hacker thing and wonder how they got it so wrong? Same thing with AI code sometimes. If you can code, you can tell what it got wrong.
ptz@dubvee.org 7 months ago
Is that why Windows 11 sucks so much?
tsonfeir@lemm.ee 7 months ago
Bugs. Bugs. Bugs.
AI is fine as an assistant or for brainstorming ideas, but don’t let it run wild or take control.
CarbonatedPastaSauce@lemmy.world 7 months ago
I write automation code for devops stuff. I’ve tried to use ChatGPT several times for code, and it has never produced anything of even mild complexity that would work without modification. It loves to hallucinate functions, methods, and parameters that don’t exist.
It’s very good for pointing you in the right direction, especially for people just learning. But at the level it’s at now (and with articles already saying we’re seeing diminishing returns from LLMs), it won’t be replacing any but the worst coders out there any time soon.
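The hallucinated-parameter problem described above can be cheaply screened for before running anything. A minimal sketch (my own illustration, not something from the thread; `pretty` is a made-up kwarg of the kind an LLM might invent for `json.dumps`): check generated calls against the real module with `inspect` first.

```python
import inspect
import json

# Suppose an LLM suggested: json.dumps(data, pretty=True)
# "pretty" is a hallucinated parameter; the real one is "indent".

def call_exists(module, name):
    """Check that a function the generated code references actually exists."""
    return hasattr(module, name)

def param_exists(func, param):
    """Check that a keyword parameter the generated code uses actually exists."""
    return param in inspect.signature(func).parameters

print(call_exists(json, "dumps"))          # True: the function is real
print(param_exists(json.dumps, "pretty"))  # False: hallucinated kwarg
print(param_exists(json.dumps, "indent"))  # True: the real kwarg
```

It won’t catch logic bugs, but it does catch the “parameter that doesn’t exist” failure mode in generated code before it blows up at runtime.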
QuadratureSurfer@lemmy.world 7 months ago
It’s great for pseudocode. But I prefer to use a local LLM that’s been fine-tuned for coding. It doesn’t seem to hallucinate functions/methods/parameters anywhere near as much as ChatGPT did… but admittedly I haven’t used ChatGPT for coding in a while.
I don’t ask it to solve the entire problem; I mostly just work with it to come up with bits of code here and there. Basically, it can partially replace Stack Overflow. It can save time in some cases for sure, but companies are severely overestimating LLMs if they think they can replace coders in their current state.
Pantherina@feddit.de 7 months ago
Could you recommend one?
tal@lemmy.today 7 months ago
I can believe that they manage to get useful general code out of an AI, but I don’t think it’s gonna be as simple as just training an LLM on an English-to-code mapping. Like, part of the job is identifying edge conditions, and that can’t be derived from the English description alone, or even from a lot of code. It has to have some kind of deep understanding of the subject matter it’s working on.
Might be able to find limited-domain tasks where you can use an LLM.
But I think that a general solution will require more than just the English task description and a lot of code.
Cryan24@lemmy.world 7 months ago
It’s good for doing the boilerplate code for you, but that’s about it. You still need a human to do the thinking on the hard stuff.
TimeSquirrel@kbin.social 7 months ago
Context-aware AI is where it's at. One that's integrated into your IDE and can see your entire codebase and offer suggestions with functions and variables that actually match the ones in your libraries. GitHub Copilot does this.
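The “sees your entire codebase” part can be sketched in a toy way (my own illustration, not how Copilot actually works): index the names a file really defines, so suggestions can be checked against identifiers that exist rather than invented ones.

```python
import ast

# A stand-in for a file in "your codebase" (hypothetical names).
source = '''
def fetch_user(user_id):
    pass

def save_user(user):
    pass

MAX_RETRIES = 3
'''

# Collect the names actually defined in the code, so a completion
# engine can rank real identifiers above hallucinated ones.
tree = ast.parse(source)
defined = {node.name for node in ast.walk(tree)
           if isinstance(node, ast.FunctionDef)}
defined |= {t.id for node in ast.walk(tree) if isinstance(node, ast.Assign)
            for t in node.targets if isinstance(t, ast.Name)}

print(sorted(defined))  # ['MAX_RETRIES', 'fetch_user', 'save_user']
```

Real tools build far richer indexes (types, imports, whole-project symbols), but the basic win is the same: suggestions grounded in names that actually exist.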
7heo@lemmy.ml 7 months ago
The thing is, devops is pretty complex and pretty diverse. You’ve got at least 6 different solutions among the popular ones alone.
Last time I checked just the list of available provisioning software, I counted 22.
Sure, some like cdist are pretty niche, but still: when you apply to a company, even though the platform is going to be AWS (mostly), Azure, GCE, Oracle, or some run-of-the-mill VPS provider with extended cloud features (S3 work-alikes based on MinIO, “cloud LAN”, etc.), and you are likely going to use Terraform for host provisioning, the most relevant thing to check is which software they use. Packer? Or dynamic provisioning like Chef? Puppet? Ansible? Salt? Or one of the “lesser ones”?
And the thing is, even among successive versions of compatible stacks, the DSL evolved, and the way things are supposed to be done changed. For example, before Hiera, Puppet was an entirely different beast.
And that’s not even throwing Docker (or rkt, appc) into the mix. Then you have k8s, podman, helm, etc.
The entire ecosystem has considerable overlap too.
So, on one hand, you have pretty clean and usable code snippets on Stack Overflow, GitHub gists, etc. So much so that entire tools emerged around them… And then, the very second LLMs were able to produce any moderately usable output, they were trained on that data.
And on the other hand, you have devops. An ecosystem with no clear boundaries, no clear organisation, not much maturity yet (in spite of the industry being more than a decade old), and so organic that keeping up with developments is a full-time job of its own. There’s no chance in hell LLMs can be properly trained on that dataset before it cools down. Not a chance. Never gonna happen.