
New Junior Developers Can’t Actually Code.

0 likes

Submitted 1 year ago by Cat@ponder.cat to technology@lemmy.world

https://nmn.gl/blog/ai-and-learning

source

Comments

  • phoenixz@lemmy.ca ⁨1⁩ ⁨year⁩ ago

    To be fair, most never could. I’ve been hiring junior devs for decades now, and all the ones straight out of university barely had any coding skills.

    It’s why I stopped looking at where they studied; I always first check their hobbies. If one of their hobbies is something nerdy and “useless”, like tinkering with a Raspberry Pi, that indicates to me it’s someone who loves coding and is probably already reasonably good at it.

    source
    • UnderpantsWeevil@lemmy.world ⁨1⁩ ⁨year⁩ ago

      Never mind how cybersecurity is a niche field that can vary by use case and environment.

      At some level, you’ll need to learn the security system of your company (or the lack thereof) and the tools used by your department.

      There is no class you can take that’s going to give you more than broad theory.

      source
  • RamenJunkie@midwest.social ⁨1⁩ ⁨year⁩ ago

    I am not a professional coder, just a hobbyist, but I am increasingly digging into Cybersecurity concepts.

    And even as an “amateur cybersecurity” person, everything about what you describe, and LLM coders, terrifies me, because that shit is never going to have any proper security methodology implemented.

    source
    • phlegmy@sh.itjust.works ⁨1⁩ ⁨year⁩ ago

      On the bright side, you might be able to cash in on some bug bounties.

      source
  • endeavor@sopuli.xyz ⁨1⁩ ⁨year⁩ ago

    I’m in uni learning to code right now, but since I’m a boomer I only spin up oligarch bots every once in a while to check an issue that I would otherwise have to ask the teacher about. It’s far more important for me to understand fundies than it is to get a working program. But that is only because I’ve gotten good at many other skills and realize that fundies are fundamental for a reason.

    source
  • socsa@piefed.social ⁨1⁩ ⁨year⁩ ago

    This isn't a new thing. Dilution of "programmer" and "computer" education has been going on for a long time. Everyone with an IT certificate is an engineer these days.

    For millennials, a "dev" was pretty much anyone with reasonable intelligence who wanted to write code - it is actually very easy to learn the basics and fake your way into it with no formal education. Now we are even moving on from that, to where a "dev" is anyone who can prompt an AI. "Prompt Engineering."

    source
    • frezik@midwest.social ⁨1⁩ ⁨year⁩ ago

      “Prompt Engineer” makes a little vomit appear in the back of my mouth.

      source
  • Phoenicianpirate@lemm.ee ⁨1⁩ ⁨year⁩ ago

    I could have been a junior dev that could code. I learned to do it before ChatGPT. I just never got the job.

    source
  • Matriks404@lemmy.world ⁨1⁩ ⁨year⁩ ago

    No wonder open source software is becoming more efficient than proprietary software.

    source
  • drathvedro@lemm.ee ⁨1⁩ ⁨year⁩ ago

    This post is literally an ad for AI tools.

    No, thanks. Call me when they actually get good. As it stands, they only offer marginally better autocomplete.

    I should probably start collecting dumb AI suggestions and gaslighting answers to show the next time I encounter this topic…

    source
    • finitebanjo@lemmy.world ⁨1⁩ ⁨year⁩ ago

      It’s actually complaining about AI, tho.

      source
      • drathvedro@lemm.ee ⁨1⁩ ⁨year⁩ ago

        There are at least four links leading to AI tools in this page. Why would you link something when you complain about it?

        source
  • frequenttimetraveler@lemmy.world ⁨1⁩ ⁨year⁩ ago

    That’s the point of being junior. Then problems show up and force them to learn to solve them.

    source
  • LibreHans@lemmy.world ⁨1⁩ ⁨year⁩ ago

    All I hear is “I’m bad at mentoring”

    source
    • Saleh@feddit.org ⁨1⁩ ⁨year⁩ ago

      There is only so much mentoring can do though. You can have the best math prof. You still need to put in the exercise to solve your differential equations to get good at it.

      source
      • endeavor@sopuli.xyz ⁨1⁩ ⁨year⁩ ago

        You get out of education what you put into it. You won’t be an artist from the best art school if you do the bare minimum to pass. You can end up as a legend of the industry coming from a noname school.

        source
    • Valmond@lemmy.world ⁨1⁩ ⁨year⁩ ago

      And some sort of “no one wants to work any more”.

      I know young brilliant people, maybe they have to be paid correctly?

      source
  • filister@lemmy.world ⁨1⁩ ⁨year⁩ ago

    The problem is not only the coding but the thinking. The AI revolution will give birth to a lot more people without critical thinking and problem solving capabilities.

    source
    • OrekiWoof@lemmy.ml ⁨1⁩ ⁨year⁩ ago

      Apart from that, learning programming went from something one does out of a calling to something one does to get a job. The percentage of programmers who actually like coding is going down, so on average they’re going to be worse.

      source
    • commander@lemmings.world ⁨1⁩ ⁨year⁩ ago

      That’s the point.

      Along with censorship.

      source
  • froggycar360@slrpnk.net ⁨1⁩ ⁨year⁩ ago

    I could barely code when I landed my job, and now I’m a senior dev. It’s like saying a plumber’s apprentice can’t plumb - you learn on the job.

    source
    • FlyingSquid@lemmy.world ⁨1⁩ ⁨year⁩ ago

      You’re not learning anything if Copilot is doing it for you. That’s the point.

      source
      • froggycar360@slrpnk.net ⁨1⁩ ⁨year⁩ ago

        That’s true, it can only get you so far. I’m sure we all started by Frankenstein-ing Stack Overflow answers together until we had to actually learn the “why”.

        source
      • Mr_Dr_Oink@lemmy.world ⁨1⁩ ⁨year⁩ ago

        100% agree.

        I’m not saying there’s no place for AI as an aid to help you find the solution, but I don’t think it’s going to help you learn if you just ask it for the answers.

        For example, yesterday I was trying to find out why a policy map on a Cisco switch wasn’t re-activating after my RADIUS server came back up. Instead of throwing my map at the AI and asking what’s wrong, I asked it details about how a policy map is activated, what mechanism the switch uses to determine the status of the RADIUS server, and how a policy map can leverage that to kick into gear again.

        Ultimately, the AI didn’t have the answer, but it put me on the right track, and I believe I solved the issue. It seems the switch didn’t count me adding the RADIUS server to the running config as a server coming back alive, but if I put in a fake server and then altered the IP to a real server, the switch saw this as the server coming back alive and authentication started again.

        In fact, some of the info it gave me along the way was wrong. Like when it tried to give me CLI commands that I already knew wouldn’t work, because I was using the newer C3PL AAA commands and it was mixing them up with the legacy commands and combining them together. Even after I told it that was a made-up command and why it wouldn’t work, it still tried to give me the command again later.

        So, I don’t think it’s a good tool for producing actual work, but it can be a good tool to help us learn things if it is used that way. To ask “why” and “how” instead of “what”.

        source
  • pls@lemmy.plaureano.nohost.me ⁨1⁩ ⁨year⁩ ago

    Of course they don’t. Hiring junior devs for their hard skills is a dumb proposition. Hire for their soft skills, intellectual curiosity, and willingness to work hard and learn. There is no substitute for good training and experience.

    source
  • zerofk@lemm.ee ⁨1⁩ ⁨year⁩ ago

    As someone who’s interviewed candidates for developer jobs for over a decade: this sounds like “in my day everything was better”.

    Yes, there are plenty of candidates who can’t explain the piece of code they copied from Copilot. But guess what? A few years ago there were plenty of candidates who couldn’t explain the code they copied from StackOverflow. And before that, there were those who failed at the basic programming test we gave them.

    We don’t hire those people. We hire the ones who use the tools at their disposal and also show they understand what they’re doing. The tools change, the requirements do not.

    source
    • filister@lemmy.world ⁨1⁩ ⁨year⁩ ago

      But how do you find those people solely based on a short interview, where they can use AI tools to perform better if the interview is not held in person?

      And mind you, SO was better because you needed to read a lot of answers there and try to understand what would work in your particular case. Learn how to ask smartly, do your homework, and explain the question properly so as not to get gaslit, etc. That is all gone now.

      source
      • nossaquesapao@lemmy.eco.br ⁨1⁩ ⁨year⁩ ago

        That’s simple. They use an LLM to find the right people for the job /s

        source
      • major_jellyfish@lemmy.ca ⁨1⁩ ⁨year⁩ ago

        Pretty easy to come up with problems that ChatGPT is useless at. You can test it pretty easily. Throw enough constraints at it and the transformer starts to lose attention and forget vital parts.

        With a bit of effort you can make problems where ChatGPT will actually give a misleading answer and candidates have to think critically.

        Just like in the past it was pretty easy to come up with problems that weren’t easily found on SO.

        Same landscape. If you put in the time and the effort to have a solid recruitment process, you get solid devs. If you have a lazy and shitty process, you get shitty devs.

        source
      • tinkling4938@lemmynsfw.com ⁨1⁩ ⁨year⁩ ago

        Evil me: ask questions that have no solution, but that ChatGPT will happily give incorrect solutions to, running itself in circles trying to answer correctly as you feed it error messages.

        source
    • uranibaba@lemmy.world ⁨1⁩ ⁨year⁩ ago

      I think LLMs just made it easier for people who want to know without learning. Reading all those posts all over the internet required you to understand what you pasted together if you wanted it to work (not always, but the bar was higher). With ChatGPT, you can just throw errors at it until you have the code you want.

      While the requirements never changed, the tools sure did and they made it a lot easier to not understand.

      source
      • major_jellyfish@lemmy.ca ⁨1⁩ ⁨year⁩ ago

        Have you actually found that to be the case in anything complex though? I find it just forgets parts to generate something. Stuck in an infuriating loop of fucking up.

        It took us around 2 hours to run our coding questions through chatgpt and see what it gives. And it gives complete shit for most of them. One or two questions we had to replace.

        If a company cannot invest even a day to go through their hiring process and AI proof it, then they have a shitty hiring process. And with a shitty hiring process, you get shitty devs.

        And then you get people like OP, blaming the generation while if anything its them and their company to blame… for falling behind. Got to keep up folks. Our field moves fast.

        source
  • maximilian@lemmy.ml ⁨1⁩ ⁨year⁩ ago

    Has anyone else clicked the chat.com url in the article …

    source
  • 7fb2adfb45bafcc01c80@lemmy.world ⁨1⁩ ⁨year⁩ ago

    I feel like this is a problem perpetuated by management. I see it on the system administration side as well – they don’t care if people understand why a tool works; they just want someone who can run it. If there’s no free thought, the people are interchangeable and easily replaced.

    I often see it farmed out to vendors when actual thought is required, and it’s maddening.

    source
    • icmpecho@lemmy.ml ⁨1⁩ ⁨year⁩ ago

      I always found this upsetting as an IT tech at a former company - when a network or server had an issue and I was sent to resolve it, it was a “just reboot it” fix, which never kept the problem from recurring and bringing the server down at 07:00 the next Monday.

      The limitations on the questions I could ask hurt that SLA more than any network switch’s memory leak ever did, and I felt as if my expertise meant nothing as a result.

      source
  • Evotech@lemmy.world ⁨1⁩ ⁨year⁩ ago

    They never could

    source
    • invertedspear@lemm.ee ⁨1⁩ ⁨year⁩ ago

      Exactly, the jr dev who could write anything useful is a rare gem. Boot camps cranking out jr devs by the dozens every couple of months didn’t help the issue. Talent needs cultivation, and since every tech company has been cutting back lately, they stopped cultivating and started sniping talent from each other. Not hard, given the amount of layoffs lately. So now we have jr devs either unable to find a place to refine them, or getting hired by people who just want to save money and don’t know that you need a senior or two to wrangle them. Then ChatGPT comes along and gives the illusion of sr dev advice, telling them how to write the wrong thing better, with no one to teach them which tool is the right one for the job.

      Our industry is in kind of a fucked state and will be for a while. Get good at cleaning up the messes that will be left behind and that will keep you fed for the next decade.

      source
      • Evotech@lemmy.world ⁨1⁩ ⁨year⁩ ago

        Not that this is very unique to the field, but junior anything usually needs at least 6 months to get to a productive level.

        source
  • GreenKnight23@lemmy.world ⁨1⁩ ⁨year⁩ ago
    [deleted]
    source
    • WagyuSneakers@lemmy.world ⁨1⁩ ⁨year⁩ ago

      People who would have gone into finance or received an MBA have been going to tech for a decade now. Every one of them pushes out someone who would have been a real developer.

      I’ve also had the pleasure of watching a lot of the generation who’s now complaining as they grew through their journey as developers. I think a lot of them are sugar-coating their own abilities. I struggled alongside many a now-illustrious developer while they banged their head against the wall for hours.

      source
  • nexguy@lemmy.world ⁨1⁩ ⁨year⁩ ago

    Stack Overflow and Google were once the “AI” of the previous generation. “These kids can’t code, they just copy what others have done”

    source
    • lightnsfw@reddthat.com ⁨1⁩ ⁨year⁩ ago

      As someone who can’t code (not a developer) but occasionally needs to dip my toes in it: I’ve learned quite a bit from using ChatGPT and then picking apart whatever it shat out to figure out why it’s not working. It’s still better than me starting from scratch on whatever it is I’m working on, because usually I don’t even know where to begin.

      source
    • Feathercrown@lemmy.world ⁨1⁩ ⁨year⁩ ago

      Yeah, and copy-pasting SO answers with no thought is just as bad.

      source
      • embed_me@programming.dev ⁨1⁩ ⁨year⁩ ago

        And when copy-pasting didn’t work, those who dared to rise above and understand it became better. Same with AI: those of the new generation who see through the slop will learn. It’s the same as it has always been. Software engineering is more accessible than ever; say what you will about the current landscape of software engineering, but that fact remains undeniable.

        source
  • JustJack23@slrpnk.net ⁨1⁩ ⁨year⁩ ago

    Very “back in my day” energy.

    I do not support AI but programming is about solving problems and not writing code.

    If we’re concentrating on the tools, well, no developer uses punch cards anymore either. Is that a bad thing?

    source
    • maniclucky@lemmy.world ⁨1⁩ ⁨year⁩ ago

      You’re right in that the goal is problem solving, you’re wrong that inability to code isn’t a problem.

      AI can make a for loop and do common tasks but the moment you have something halfway novel to do, it has a habit of shitting itself and pretending that the feces is good code. And if you can’t read code, you can’t tell the shit from the stuff you want.

      It may be able to do it in the future but it can’t yet

      Source: data engineer who has fought his AI a time or two.

      source
      • JustJack23@slrpnk.net ⁨1⁩ ⁨year⁩ ago

        Of course. I use it on a daily basis for coding as well, and AI is shit.

        Again, I in no way support AI, I just think that the argument made in the article is also not good.

        source
  • barsoap@lemm.ee ⁨1⁩ ⁨year⁩ ago

    Not in any way a new phenomenon, there’s a reason fizzbuzz was invented, there’s been a steady stream of CS graduates who can’t code their way out of a wet paper bag ever since the profession hit the mainstream.

    Actually fucking interview your candidates, especially if you’re sourcing candidates from a country with for-profit education and/or a rote-learning culture, both of which suck when it comes to failing people who didn’t learn anything. No BS coding tests; go for “explain this code to me” kind of stuff. Worst case, they can understand code but suck at producing it - that’s still prime QA material right there.
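    For anyone who hasn’t seen it, the fizzbuzz screen mentioned above really is this small - a minimal Python version:

```python
def fizzbuzz(n: int) -> str:
    """Classic screening problem: multiples of 3 -> "Fizz",
    multiples of 5 -> "Buzz", multiples of both -> "FizzBuzz",
    anything else -> the number itself."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# Print the first fifteen values, covering every branch.
for i in range(1, 16):
    print(fizzbuzz(i))
```

    The point of the screen isn’t the puzzle itself; it’s that a surprising share of applicants can’t produce even this.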

    source
    • sugar_in_your_tea@sh.itjust.works ⁨1⁩ ⁨year⁩ ago

      We do two “code challenges”:

      1. Very simple, many are done in 5 min; this just weeds out the incompetent applicants, and 90% of the code is written (i.e. simulate working in an existing codebase)
      2. Ambiguous requirements, the point is to ask questions, and we actually have different branches depending on assumptions they made (to challenge their assumptions); i.e. simulate building a solution with product team

      The first is in the first round, the second is in the technical interview. Neither are difficult, and we provide any equations they’ll need.

      It’s much more important that they can reason about requirements than code something quick, because life won’t give you firm requirements, and we don’t want a ton of back and forth with product team if we can avoid it, so we need to catch most of that at the start.

      In short, we’re looking for actual software engineers, not code monkeys.
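      A sketch of what a “90% of the code is written” first-round exercise could look like (the task and all names here are hypothetical, not the actual challenge):

```python
# Hypothetical first-round exercise: the scaffolding is provided,
# and the candidate fills in only the function marked as theirs.
def load_readings(raw: str) -> list[float]:
    """Provided scaffolding: parse a comma-separated string of readings."""
    return [float(x) for x in raw.split(",") if x.strip()]

def median(values: list[float]) -> float:
    """The one function the candidate writes."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:          # odd count: middle element
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2  # even count: average

print(median(load_readings("3, 1, 4, 1, 5")))
```

      The scaffolding simulates working inside an existing codebase, so the candidate only supplies the one missing piece.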

      source
      • gothic_lemons@lemmy.world ⁨1⁩ ⁨year⁩ ago

        Sounds nice. What type of place do you work at? I’m guessing not a big corp.

        source
      • mindaika@lemmy.dbzer0.com ⁨1⁩ ⁨year⁩ ago

        Most hiring managers are looking for unicorns

        source
      • barsoap@lemm.ee ⁨1⁩ ⁨year⁩ ago

        Those are good approaches. I would note that the “90% is written” one is mostly about code comprehension, not writing (as in actually architecting something), and the requirements thing is something you should, IMO, learn as a junior - it’s not a prerequisite. It needs a lot of experience, and often domain knowledge new candidates have no chance of having. But then, throwing such stuff at them and judging them by their approach, not the end result, should be fair.

        The main question I ask myself, in general, is “can this person look at code from different angles?” Somewhat like rotating a cube in your mind’s eye, if you get what I mean. It might even be that they’re no good at it with code yet, but they demonstrate the ability when talking about coffee making - people who don’t get lost when you’re talking about cash registers with a common queue having better overall latency than cash registers with individual queues. Just as a carpenter would ask someone “do you like working with your hands”, the question is “do you like to rotate implication structures in your mind”.

        source
  • FarceOfWill@infosec.pub ⁨1⁩ ⁨year⁩ ago

    Junior devs could never code, yes, including us.

    source
    • jballs@sh.itjust.works ⁨1⁩ ⁨year⁩ ago

      Oddly enough, on my first development project I was paired with a “senior dev” who turned out just to be a guy in his 60s who had never actually coded before, so… just a senior.

      I ended up doing 100% of the coding, but the guy managed to keep his job for a few months.

      source
    • sugar_in_your_tea@sh.itjust.works ⁨1⁩ ⁨year⁩ ago

      Agreed. I was hired for my first job due to an impressive demo, and making that demo became my job. I got there, but I produced a ton of tech debt in the process.

      source
    • patatahooligan@lemmy.world ⁨1⁩ ⁨year⁩ ago

      Agreed. A few years back, the devs looking for quick fixes would go over to StackOverflow and just copy answers without reading the explanations. This caused the same type of problems that OP is talking about. That said, the ease of AI might be making things even worse.

      source
      • vatlark@lemmy.world ⁨1⁩ ⁨year⁩ ago

        Hell, I would copy the question sometimes :P

        source
  • spark947@lemm.ee ⁨1⁩ ⁨year⁩ ago

    What are you guys working on where chatgpt can figure it out? Honestly, I haven’t been able to get a scrap of working code beyond a trivial example out of that thing or any other LLM.

    source
    • uranibaba@lemmy.world ⁨1⁩ ⁨year⁩ ago

      ChatGPT is perfect for learning Delphi.

      source
    • sugar_in_your_tea@sh.itjust.works ⁨1⁩ ⁨year⁩ ago

      Same. It can generate credible-looking code, but I don’t find it very useful. Here’s what I’ve tried:

      • describe a function - takes longer to read the explanation than grok the code
      • generate tests - hallucinates arguments, doesn’t do proper boundary checks, etc
      • looking up docs - mostly useful to find search terms for the real docs

      The second was kind of useful since it provided the structure, but I still replaced 90% of it.

      I’m still messing with it, but beyond solving “blank page syndrome,” it’s not that great. And for that, I mostly just copy something from elsewhere in the project anyway, which is often faster than going to the LLM.
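      For contrast, the boundary checks the generated tests tended to skip are the easy part to write by hand (`clamp` here is a hypothetical function under test, not one from the commenter’s codebase):

```python
def clamp(x: float, lo: float, hi: float) -> float:
    """Hypothetical function under test: restrict x to the range [lo, hi]."""
    return max(lo, min(hi, x))

# The boundary cases a human reaches for first:
assert clamp(5, 0, 10) == 5      # interior point
assert clamp(-1, 0, 10) == 0     # below the range
assert clamp(11, 0, 10) == 10    # above the range
assert clamp(0, 0, 10) == 0      # exactly on the lower bound
assert clamp(10, 0, 10) == 10    # exactly on the upper bound
```

      It’s exactly these on-the-edge cases, not the happy path, where hallucinated tests tend to be missing or wrong.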

      source
    • CeeBee_Eh@lemmy.world ⁨1⁩ ⁨year⁩ ago

      I’ve been using (mostly) Claude to help me write an application in a language I’m not experienced with (Rust). Mostly with helping me see what I did wrong with syntax or with the borrow checker. Coming from Java, Python, and C/C++, it’s very easy to mismanage memory in exactly the ways Rust disallows.

      That being said, any new code it generates for me I end up having to fix 9 times out of 10. So in a weird way I’ve been learning more about Rust from having to correct code that’s been generated by an LLM.

      I still think that for the next while, LLMs will be mostly useful as a hyper-spellchecker for code, not for generating new code. I often find that I would have saved time if I had just tackled the problem myself and not tried to rely on an LLM. Although sometimes an LLM can give me an idea of how to solve a problem.

      source
    • 0x0@programming.dev ⁨1⁩ ⁨year⁩ ago

      I’m forced to use Copilot at work, and as far as code completion goes, it gets it right 10-15% of the time… the rest of the time it just suggests random (credible-looking) noise or hallucinates variables and shit.

      source
      • expr@programming.dev ⁨1⁩ ⁨year⁩ ago

        Forced to use copilot? Wtf?

        I would quit, immediately.

        source
    • Thorry84@feddit.nl ⁨1⁩ ⁨year⁩ ago

      Agreed. I wanted to test a new config in my router yesterday, which is configured using scripts. So I thought it would be a good idea for ChatGPT to figure it out for me, instead of 3 hours of me reading documentation and trying tutorials. It was a test scenario, so I thought it might do well.

      It did not do well at all. The scripts were mostly correct, but often in the wrong order (referencing a thing before actually defining it). Sometimes the syntax would be totally wrong, and it kept mixing version 6 syntax with version 7 syntax (I’m on 7). It also makes mistakes, and when I point out a mistake it says “Oh, you are totally right, I made a mistake”, then goes on to explain what mistake it made and outputs new code. However, more often than not the new code contains the exact same mistake. This is probably because of a lack of training data, where it is referencing only one example and that example just had a mistake in it.

      In the end I gave up on ChatGPT, searched for my test scenario, and it turned out a friendly dude on a forum had put together a tutorial. So I followed that and it almost worked right away. A couple of minutes of tweaking and testing and I got it working.

      I’m afraid of a future where forums and such don’t exist and sources like Reddit get fucked and nuked. In an AI-driven world, the incentive for creating new original content is way lower. So when the AI doesn’t know the answer, you are just hooped and have to re-invent the wheel yourself. In the long run this will destroy productivity and not deliver the gains people are hoping for at the moment.

      source
      • baltakatei@sopuli.xyz ⁨1⁩ ⁨year⁩ ago

        It’s like useful information grows as fruit from trees in a digital forest we call the Internet. However, the fruit spoils over time (becomes less relevant) and requires fertile soil (educated people being online) that can be eroded away (not investing in education or infrastructure) or paved over (intellectual property law). LLMs are like processed food created in factories that lack key characteristics of more nutritious fresh ingredients you can find at a farmer’s market. Sure, you can feed more people (provide faster answers to questions) by growing a monocrop (training your LLM on a handful of generous people who publish under Creative Commons licenses like CC BY-SA on Stack Overflow), but you also risk a plague destroying your industry like how the Panama disease fungus destroyed nearly all Gros Michel banana farming (companies firing those generous software developers who “waste time” by volunteering to communities like Stack Overflow and replacing them with LLMs).

        There’s some solar punk ethical fusion of LLMs and sustainable cultivation of high quality information, but we’re definitely not there yet.

        source
    • daniskarma@lemmy.dbzer0.com ⁨1⁩ ⁨year⁩ ago

      I used it a few days ago to translate a math formula into code.

      Here is the formula: wikimedia.org/…/126b6117904ad47459ad0caa791f296e6…

      It’s not the most complicated thing. I could have done it. But it would take me some time. I just input the formula directly, the desired language and the result was well done and worked flawlessly.

      It saved me some time.
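      The wikimedia link above is truncated, so the actual formula isn’t recoverable here; as a stand-in illustration of the same formula-to-code workflow, here is a well-known formula (the haversine great-circle distance) translated into Python:

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Stand-in formula (NOT the one behind the truncated link):
    great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

print(haversine_km(52.52, 13.405, 48.8566, 2.3522))  # Berlin -> Paris
```

      This is the kind of mechanical translation where an LLM tends to do fine: the formula is fully specified, so there’s nothing to hallucinate.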

      source
    • theterrasque@infosec.pub ⁨1⁩ ⁨year⁩ ago

      When I had to get up to speed on a new language, it was very helpful. It’s also great for writing low-to-medium complexity scripts in Python, PowerShell, and Bash, and for making Ansible tasks. That said, I’ve been programming for ~30 years and could have done those things myself if I needed to, but it would take some time (a lot of it looking up documentation and writing boilerplate code). It’s also nice for writing C# unit tests.

      However, the times I’ve been stuck on my main languages, it’s been utterly useless.

      source
      • prettybunnys@sh.itjust.works ⁨1⁩ ⁨year⁩ ago

        I love asking AI to generate a framework / structure for a project that I then barely use and then realize I shoulda just done it myself

        source
      • MagicShel@lemmy.zip ⁨1⁩ ⁨year⁩ ago

        ChatGPT is extremely useful if you already know what you’re doing. It’s garbage if you’re relying on it to write code for you. There are nearly always bugs and edge cases and hallucinations and version mismatches.

        It’s also probably useful for looking like you kinda know what you’re doing as a junior in a new project. I’ve seen some shit in code reviews that was clearly AI slop. Usually from exactly the developers you expect.

        source
  • TsarVul@lemmy.world ⁨1⁩ ⁨year⁩ ago

    I’m a little defeatist about it. I saw with my own 3 eyes how a junior asked ChatGPT how to insert something into an std::unordered_map. I tell them about cppreference. The little shit tells me “Sorry unc, ChatGPT is objectively more efficient”. I almost blew a fucking gasket, mainly cuz I’m not that god damn old. I don’t care how much you try to convince me that LLMs are efficient, there is no shot they are more efficient than opening a static page with all the info you would ever need. Not even considering energy efficiency. Utility aside, the damage we have dealt to developing minds is irreversible. We have convinced them that thought is optional. This is gonna bite us in the ass. Hard.

    source
    • nossaquesapao@lemmy.eco.br ⁨1⁩ ⁨year⁩ ago

      Might sound a bit unrelated, but have you been noticing an apparent rise in ageism too? Social media seems to be fueling it for some reason.

      source
    • tigeruppercut@lemmy.zip ⁨1⁩ ⁨year⁩ ago

      Make the junior put it to the test, John Henry style: you code something while they use GPT, and see who comes up with a working version first.

      source
    • celia@lemmy.blahaj.zone ⁨1⁩ ⁨year⁩ ago

      I work at a software development school, and ChatGPT does a lot of damage here too. We try to teach that using it as a tool to help learning is different from using it as a “full project code generator”, but the speed advantage it provides makes it irresistible from many students’ perspective. I lost many students last year because they couldn’t pass a simple code exam (think FizzBuzz difficulty level) when they had no access to the internet and had to code in Emacs. We also can’t block access to it, because that starts an endless game where they always find a way to access it.

      source
      • TsarVul@lemmy.world ⁨1⁩ ⁨year⁩ ago

        Damn, I forgot about the teaching aspect of programming. Must be hard. I can’t blame students for taking shortcuts when they’re almost assuredly swamped with other classwork and sleep-deprived, but still. This is where my defeatist comment comes in, because I genuinely think LLMs are here to stay. Like autocomplete, but dumber. Just gotta have students recognize when ChatGPT hallucinates solutions, I guess.

        source
  • Naich@lemmings.world ⁨1⁩ ⁨year⁩ ago

    Poisoning AI with backdoored code is surely a real risk now? I can see this getting quite nasty.

    source
  • ryven@lemmy.dbzer0.com ⁨1⁩ ⁨year⁩ ago

    Recently my friend was trying to get me to apply for a junior dev position. “I don’t have the right skills,” I said. “The biggest project I ever coded was a calculator for my Java final, in college, a decade and a half ago.”

    It did not occur to me that showing up without the skills and using an LLM to half-ass it was an option!

    source
  • corsicanguppy@lemmy.ca ⁨1⁩ ⁨year⁩ ago

    I’ve said it before, but this is a 20-year-old problem.

    After Y2K, all those shops that over-porked on devs began shedding the most pricey ones; worse in ‘at will’ states.

    Who were those devs? Mentors. They shipped less code, closed fewer tickets, cost more, but their value wasn’t in tickets and code: it was investing in the next generation. And they had to go because #numbersGoUp

    And they left. And the first gen of devs with no mentorship joined and started their careers. No idea about edge cases, missing middles or memory management. No lint, no warnings, build and ship and fix the bugs as they come.

    And then another generation. And these were the true ‘lost boys’ of dev. C is dumb, C++ is dumb, perl is dumb, it’s all old, supply chain exploits don’t exist, I made it go so I’m done, fuck support, look at my numbers. It’s all low-attention span, baling wire and trophies because #numbersGoUp.

    And let’s be fair: they’re good at this game, the new way of working where it’s a fast finish, a head-pat, and someone else’s problem. That’s what the companies want, and that’s what they built.

    They say now that relying on AI means one never really exercises critical thought and problem-solving, and I see it when I’m forced to write fucking YAML for fucking Ansible. I let the GPTs do that for me, without worrying that I won’t learn to code YAML for Ansible. Coding YAML for Ansible is NEVER going to be on my list of things I want to remember. But we’re seeing people do that with actual work, with Go and Rust code, and yeah, no concept of why we want to check for completeness, let alone a concept of how.

    One day these new devs will proudly install a patch in the RTOS flashed into your heart monitor and that annoying beep will go away. Sleep tight.

    source
  • avidamoeba@lemmy.ca ⁨1⁩ ⁨year⁩ ago

    Unless AI dramatically improves from where LLMs are today, I’m looking forward to the drastic shortage of experienced senior devs in a few years time.

    source
  • rottingleaf@lemmy.world ⁨1⁩ ⁨year⁩ ago

    One can classify approaches to progress in at least four popular ways:

    The most dumb, clueless jerks think that it’s replacing something known with something known and better. Progress enthusiasts who don’t know a single thing about the areas they’re enthusiastic about are usually here.

    The careful and kinda intellectually limited people think that it’s replacing something known with something unknown. They can sour the mood, but are generally safe for those around them.

    The idealistic idiots think that it’s replacing something unknown with something known; these are the “order bringers” and revolutionaries. Everybody knows how revolutionaries do things; anyone who doesn’t can look at Musk and DOGE.

    The only sane kind think that it’s replacing something unknown with something unknown. That is, when replacing one thing with another, you are breaking more than just what you could see and had listed for replacement. Because nature doesn’t fscking care what you want to see.

    source