
The Great Software Quality Collapse: How We Normalized Catastrophe

215 likes

Submitted 13 hours ago by onehundredsixtynine@sh.itjust.works to technology@lemmy.world

https://techtrenches.substack.com/p/the-great-software-quality-collapse


Comments

  • ms_lane@lemmy.world 23 minutes ago

    Software quality collapse

    That started happening years ago.

    The developers of .NET should be put on trial for crimes against humanity.

  • odama626@lemmy.world 1 hour ago

    Accurate, but ironically written by ChatGPT.

  • vane@lemmy.world 3 hours ago

    Quality in this economy? We need to fire some people to cut costs, and use telemetry to make sure everyone that’s left uses AI, to pay AI companies, because our investors demand it, because they invested all their money in AI and they see no return.

  • IrateAnteater@sh.itjust.works 12 hours ago

    I think a substantial part of the problem is the employee turnover rate in the industry. It seems to be just accepted that everyone is going to jump to another company every couple of years (usually because companies don’t give adequate raises). This leads to a situation where, consciously or subconsciously, no one really gives a shit about the product. Everyone does their job (and only their job, not a hint of anything extra), but they’re not going to take on major long-term projects, because they already have one foot out the door, looking for the next job. Shitty middle management, of course, drastically exacerbates the issue.

    I think that’s why there’s a lot of open source software that’s better than the corporate stuff. Half the time it’s just one person working on it, but they actually give a shit.

    • MotoAsh@piefed.social 11 hours ago

      Definitely part of it. The other part is that soooo many companies hire shit idiots out of college. Sure, they have a degree, but in many cases they’ve barely understood the concept of deep logic for four years, and they have virtually zero experience with ANY major framework or library.

      Then dumb management puts them on tasks they’re not qualified for, add on that Agile development means “don’t solve any problem you don’t have to” for some fools, and… the result is an entire industry full of functional idiots.

      It’s the same problem as late-stage capitalism… Executives focus on money over longevity, and the economy becomes way more tumultuous. The industry focuses way too hard on “move fast and break things” instead of making quality things, and… here we are.

      • Croquette@sh.itjust.works 5 hours ago

        My hot take: lots of projects would benefit from a traditional project management cycle instead of trying to force Agile on every project.

      • sp3ctr4l@lemmy.dbzer0.com 9 hours ago

        Shit idiots with enthusiasm could be trained, mentored, and molded into assets for the company.

        À la an apprenticeship structure or something similar, like how you need X years before you’re a journeyman in many hands-on trades.

        But uh, nope. The C-suite could order something like that to be implemented at any time.

        They don’t, though.

        Because that would make next quarter’s projections not look as good.

        And because that would require actual leadership.

        This used to be how things largely worked in the software industry.

        But, as with many other industries, finance now runs everything, and they’re trapped in a system of their own making… except they’re not really trapped, because they’ll still get a golden parachute no matter what happens. Everyone else suffers, so that’s fine.

      • WanderingThoughts@europe.pub 10 hours ago

        That’s “disrupting the industry” or “revolutionizing the way we do things” these days. The “move fast and break things” slogan has too much of a stink to it now.

  • panda_abyss@lemmy.ca 11 hours ago

    I’ve been working at a small company where I own a lot of the code base.

    I got my boss to accept slower initial work that was more systemically designed, and now I can complete projects that would have taken weeks in a few days.

    The level of consistency and quality you get by building a proper foundation and doing things right has an insane payoff. And users notice too when they’re using products that work consistently and with low resources.

    • Telorand@reddthat.com 10 hours ago

      This is one of the things that frustrates me about my current boss. He keeps talking about some future project that uses a new codebase we’re currently writing, at which point we’ll “clean it up and see what works and what doesn’t.” Meanwhile, he complains about my code and how it’s “too Pythonic,” what with my docstrings, functions for code reuse, and type hints.

      So I secretly maintain a second codebase with better documentation and optimization.

      • panda_abyss@lemmy.ca 9 hours ago

        How can your code be too Pythonic?

        Also, type hints are the shit. Nothing better than hitting shift-tab and getting completions and documentation.

        Even if you’re planning to migrate to a hypothetical new code base, getting a bunch of documented modules for free is a huge time saver.

        Also migrations fucking suck, you’re an idiot if you think that will solve your problems.
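
        For illustration, the “too Pythonic” style being complained about amounts to something like this sketch (names invented); the type hints and docstring are exactly what the editor turns into completions and inline docs:

        ```python
        def apply_discount(price: float, rate: float) -> float:
            """Return price reduced by rate (a fraction between 0.0 and 1.0).

            The signature's type hints drive editor completions; this
            docstring is what pops up as inline documentation.
            """
            if not 0.0 <= rate <= 1.0:
                raise ValueError("rate must be between 0 and 1")
            return price * (1.0 - rate)
        ```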

  • neclimdul@lemmy.world 7 hours ago

    “AI just weaponized existing incompetence.”

    Daamn. Harsh but hard to argue with.

    • PattyMcB@lemmy.world 4 hours ago

      Weaponized? Probably not. Amplified? ABSOLUTELY!

      • _stranger_@lemmy.world 4 hours ago

        It’s like taping a knife to a crab. Redundant and clumsy, yet strangely intimidating.

  • chunes@lemmy.world 9 hours ago

    Software has a serious “one more lane will fix traffic” problem.

    Don’t give programmers better hardware or else they will write worse software. End of.

    • fluckx@lemmy.world 9 hours ago

      This is very true. You don’t need a bigger database server, you need an index on that table you query all the time that’s doing full table scans.
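
      A toy sqlite3 sketch of exactly this (table and column names invented): the same query goes from a full table scan to an index search once the index exists.

      ```python
      import sqlite3

      # Toy table: customer_id is the column queried all the time.
      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
      con.executemany(
          "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
          [(i % 1000, i * 1.5) for i in range(10_000)],
      )

      query = "SELECT * FROM orders WHERE customer_id = ?"

      # Without an index, SQLite scans every row:
      print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())
      # e.g. [(2, 0, 216, 'SCAN orders')]

      con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

      # The same query now uses the index instead:
      print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())
      # e.g. [(3, 0, 59, 'SEARCH orders USING INDEX idx_orders_customer (customer_id=?)')]
      ```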

      • PattyMcB@lemmy.world 4 hours ago

        Or sharding on a particular column
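
        For instance, a minimal hash-sharding sketch (shard count and key column invented): every row for a given key lands on the same, much smaller shard.

        ```python
        NUM_SHARDS = 4

        def shard_for(customer_id: int) -> int:
            """Route all rows for a customer to the same database shard."""
            # hash() is stable for ints; use a real hash for string keys.
            return hash(customer_id) % NUM_SHARDS

        print(f"customer 42 -> shard {shard_for(42)}")
        ```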

      • GenosseFlosse@feddit.org 5 hours ago

        You’ve never worked on old code. It’s never that simple in practice, when you have to make changes to existing code without breaking or rewriting everything.

        Sometimes the client wants a new feature that can’t be implemented easily and needs a lot of different DB lookups that you can’t do in a single query. Sometimes your controller loops over 10,000 DB records and calls a function three levels down that suddenly must spawn a new DB query each time it’s called, but you can’t change the parent DB query.

  • geoff@midwest.social 6 hours ago

    Anyone else remember a few years ago when companies got rid of all their QA people because something something functional testing? Yeah.

    The uncontrolled growth in abstractions is also very real and very damaging, and now that companies are addicted to the pace of feature delivery that this whole slipshod situation has made normal, they can’t give it up.

    • shalafi@lemmy.world 4 hours ago

      That was M$, not an industry thing.

      • geoff@midwest.social 3 hours ago

        It was not just MS. There were those who followed that lead and announced that it was an industry thing.

    • PattyMcB@lemmy.world 4 hours ago

      I must have missed that one

  • PattyMcB@lemmy.world 4 hours ago

    Non-technical hiring managers are a bane for developers (and probably bad for any company). Just saying.

  • themaninblack@lemmy.world 1 hour ago

    Being obtuse for a moment, let me just say: build it right!

    That means minimalism! No architecture astronauts! No unnecessary abstraction! No premature optimisation!

    Lean on opinionated frameworks so as to focus on coding the business rules!

    And for the love of all that is holy, have your developers sit next to the people that will be using the software!

    All of this will inherently reduce runaway algorithmic complexity, prevent the sort of artisanal work that causes leakiness, and speed up your code.

  • afk_strats@lemmy.world 7 hours ago

    Accept that quality matters more than velocity. Ship slower, ship working. The cost of fixing production disasters dwarfs the cost of proper development.

    This has been a struggle my entire career. Sometimes the company listens. Sometimes it doesn’t. It’s a worthwhile fight, but it’s a systemic problem caused by management and short-term profit-seeking over healthy business growth.

    • dual_sport_dork@lemmy.world 7 hours ago

      “Apparently there’s never the money to do it right, but somehow there’s always the money to do it twice.”

      Management never likes to have this brought to their attention, especially in a Told You So tone of voice. One would think that if this bothered the pointy-haired types so much, they might learn from their mistakes once in a while.

      • ozymandias117@lemmy.world 6 hours ago

        We’ll just set up another retrospective meeting and have a lessons learned.

        Then we won’t change anything based on the findings of the retro and lessons learned.

      • tehn00bi@lemmy.world 6 hours ago

        Twice? Shiiiii

    • HertzDentalBar@lemmy.blahaj.zone 6 hours ago

      That applies in so many industries 😅 like you want it done right… Or do you want it done now? Now will cost you 10x long term though…

      Welp, now it is, I guess.

      • PattyMcB@lemmy.world 4 hours ago

        You can have it fast, you can have it cheap, or you can have it good (high quality), but you can only pick two.

    • ryathal@sh.itjust.works 3 hours ago

      There are levels to it. True quality isn’t worth it, but absolute garbage costs a lot too. Some level that mostly works is the sweet spot.

    • Blackmist@feddit.uk 4 hours ago

      The sad thing is that velocity pays the bills. Quality, it seems, doesn’t matter a shit, and when it does, you can just patch up the bits people noticed.

      • _stranger_@lemmy.world 4 hours ago

        This is survivorship bias. There’s probably uncountable shitty software that never got adopted. Hell, the E.T. video game was famous for it.

  • panda_abyss@lemmy.ca 11 hours ago

    Fabricated 4,000 fake user profiles to cover up the deletion

    This has got to be a reinforcement learning issue; I had this happen the other day.

    I asked Claude to fix some tests, so it fixed the tests by commenting out the failures. I guess that’s a way of fixing them that nobody would ever ask for.

    Absolutely moronic. These tools do this regularly. It’s how they pass benchmarks.

    Also you can’t ask them why they did something, they have no capacity of introspection, they can’t read their input tokens, they just make up something that sounds plausible for “what were you thinking”.
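
    The pattern, reduced to a toy pytest example (function names invented). The suite goes green; the bug stays:

    ```python
    import pytest

    def apply_discount(price: float, rate: float) -> float:
        return price  # the actual bug: the discount is never applied

    # What "fixing the test" amounted to: the failure is disabled,
    # not repaired.
    @pytest.mark.skip(reason="'fixed' by disabling the failing assertion")
    def test_discount_applied():
        assert apply_discount(100.0, 0.1) == 90.0
    ```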

    • MelodiousFunk@slrpnk.net 11 hours ago

      Also you can’t ask them why they did something, they have no capacity of introspection, (…) they just make up something that sounds plausible for “what were you thinking”.

      It’s uncanny how it keeps becoming more human-like.

      • MotoAsh@piefed.social 10 hours ago

        No. No it doesn’t. ALL human-like behavior stems from its training data… which comes from humans.

    • FishFace@lemmy.world 9 hours ago

      The model we have at work tries to work around this by including some checks. I assume they get farmed out to specialised models and receive the output of the first stage as input.

      Maybe it catches some stuff? It’s better than pretend reasoning but it’s very verbose so the stuff that I’ve experimented with - which should be simple and quick - ends up being more time consuming than it should be.

      • panda_abyss@lemmy.ca 7 hours ago

        I’ve been thinking of having a small model, like a long-context Qwen 4B, run a quick code review to check for these issues, then just correct the main model.

        It feels like a secondary model that only exists to validate that a task was actually completed could work.
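
        A rough sketch of that idea, with call_model() as a stand-in for whatever completion API is in use (model names and prompts are all hypothetical):

        ```python
        def call_model(model: str, prompt: str) -> str:
            """Stand-in for a real LLM API call (hypothetical)."""
            raise NotImplementedError

        def complete_task(task: str) -> str:
            diff = call_model("main-coder", f"Implement as a unified diff:\n{task}")
            verdict = call_model(
                "small-validator",  # e.g. a long-context 4B model
                "Does this diff actually complete the task, without deleting "
                "or skipping tests? Answer PASS or FAIL with a reason.\n"
                f"Task: {task}\nDiff:\n{diff}",
            )
            if verdict.startswith("PASS"):
                return diff
            # One retry: feed the validator's objection back to the main model.
            return call_model(
                "main-coder",
                f"Task: {task}\nYour diff was rejected: {verdict}\nFix it.",
            )
        ```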

  • Pika@sh.itjust.works 12 hours ago

    I’m glad they added CrowdStrike to that article, because it adds a whole extra level of incompetency in the software field. The CrowdStrike disaster should never have happened in the first place, if Microsoft had properly enforced the stance they claim to hold on driver security and the kernel.

    The entire reason CrowdStrike was able to create that systemic failure was that they were (still are?) abusing the system MS has in place for signing kernel-level drivers. The process dodges MS review by shipping a standalone certified driver that then live-patches itself, instead of having every update reviewed and certified. That setup allowed a live update to directly modify the kernel via the already-certified driver. Remote injection of uncertified code into a secure location should never have been allowed in the first place. It was a failure on every level, for both MS and CrowdStrike.

  • BroBot9000@lemmy.world 10 hours ago

    Don’t give clicks to Substack blogs. Fucking Nazi enablers.

  • cygnus@lemmy.ca 13 hours ago

    I wonder if this ties into our general disposability culture (throwing things away instead of repairing them, etc.).

    • anamethatisnt@sopuli.xyz 13 hours ago

      That, and also man-hour costs versus hardware costs. It’s often cheaper to buy some extra RAM than it is to pay someone to make the code more efficient.

      • Sxan@piefed.zip 11 hours ago

        Sheeeit… we haven’t been prioritizing efficiency, much less quality, for decades. You’re so right about throwing hardware at problems. Management makes mouth-noises about quality, but when the budget hits the road, it’s clear where the priorities are. If efficiency were a priority, much less quality, vibe coding wouldn’t be a thing. Low-code/no-code wouldn’t be a thing. People building applications on SAP or Salesforce wouldn’t be a thing.

    • MotoAsh@piefed.social 10 hours ago

      Yes, if you factor in the source of disposable culture: capitalism.

      “Move fast and break things” is the software equivalent of focusing solely on quarterly profits.

    • ininewcrow@lemmy.ca 12 hours ago

      Planned obsolescence… designing things for a short lifespan so that they always break and people are always forced to buy the next thing.

      It all originated with light bulbs 100 years ago… inventors did design incandescent bulbs that could last for years, but the company owners realized a ten-year bulb wasn’t economically attractive, because too few people would keep buying bulbs. So they conspired (the Phoebus cartel) to engineer a bulb with a limited life: long enough to please people, short enough to keep them buying often.

      • Tehdastehdas@piefed.social 29 minutes ago

        Not the light bulbs. They improved light quality and reduced energy consumption by increasing filament temperature, which reduced bulb life. Net win for the consumer.

        You can still make an incandescent bulb last a long time by undervolting it to a dim orange, but it’ll be bad at illuminating, and it’ll consume almost as much electricity as when glowing yellowish white (the standard).

      • MotoAsh@piefed.social 10 hours ago

        Edison was DEFINITELY not unique or new in how he was a shithead looking for money more than inventing useful things… Like, at all.

  • _NetNomad@fedia.io 12 hours ago

    i think about this every time i open outlook on my phone and have to wait a full minute for it to load and hopefully not crash, versus how it worked more or less instantly on my phone ten years ago. gajillions of dollars spent on improved hardware and improved network speed and capacity, and for what? machines that do the same thing in twice the amount of time, if you’re lucky.

    • socialsecurity@piefed.social 12 hours ago

      Well obviously it has to ping 20 different servers from 5 different mega corporations!

      • snoons@lemmy.ca 12 hours ago

        And verify your identity three times, for good measure, to make sure you’re you and not someone that should be censored.

  • balsoft@lemmy.ml 8 hours ago

    I’ve not read the article, but if you actually look at old code, it’s pretty awful too. If you try using Windows 95 or something, you will cry and weep. Linux used to be so much more painful 20 years ago too; anyone remember the “plasma doesn’t crash” proto-memes? So the “BEFORE QUALITY” thing is absolute bullshit.

    What is happening today is that more and more people can do stuff with computers, so naturally you get “chaos”: a lot of software that does things, perhaps not in the best way possible, but does them nonetheless. You still have professional developers doing their thing and building great software.

    What I can agree with more is that capitalism doesn’t reward good quality software in general, so the quality trend for anything vaguely commercial is going to be slightly down; see enshittification (once again, an old concept).

  • ThePowerOfGeek@lemmy.world 11 hours ago

    I don’t trust some of the numbers in this article.

    Microsoft Teams: 100% CPU usage on 32GB machines

    I’m literally sitting here right now on a Teams call (I’ve already contributed what I needed to), looking at my CPU usage, which is staying in the 4.6% to 7.3% CPU range.

    Is that still too high? Probably. Have I seen it hit 100% CPU usage? Yes, rarely (but that’s usually a sign of a deeper issue).

    Maybe the author is going with the worst-case scenario. But in that case, he should probably qualify the examples more.
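
    For anyone who wants to check for themselves, a small psutil snippet (the name match is naive, and per-process values can exceed 100% on multicore machines):

    ```python
    import psutil

    # Sample the CPU share of any process whose name mentions Teams.
    for proc in psutil.process_iter(["pid", "name"]):
        name = proc.info["name"] or ""
        if "teams" in name.lower():
            # First call primes the counter; interval=1.0 measures over one second.
            print(proc.info["pid"], name, proc.cpu_percent(interval=1.0), "%")
    ```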

    • MotoAsh@piefed.social 10 hours ago

      Well, it’s also stupid to use RAM size as an indicator of a machine’s CPU load capability…

      Definitely sending off some tech illiterate vibes.

      • panda_abyss@lemmy.ca 7 hours ago

        Most software shouldn’t saturate either RAM or CPU on a modern computer.

        Yes, Photoshop, compiling large codebases, video encoding, and things like that should make use of all the performance available.

        But an app like Teams or Discord should basically never be hitting limits (I’ll excuse running a 4K stream, but most screen sharing is actually 720p).

    • OmegaSunkey@ani.social 10 hours ago

      I haven’t really checked, but CPU usage in Teams while just sitting in a call is low; using the camera with filters clearly uses more. Checking CPU temps only gives you a rough idea of how much CPU a program is using. So it’s clearly the worst-case scenario: using the camera with filters on top.

      My issue with Teams is that it uses a whole GB of RAM on my machine just by existing. It’s like it loads the entire .NET runtime in the browser or something. IDK if it uses C# on the frontend.

      • boonhet@sopuli.xyz 4 hours ago

        IDK if it uses C# on the frontend.

        Pretty sure it’s a webview app, so probably all JavaScript.

      • FishFace@lemmy.world 9 hours ago

        RAM usage today is insane, because there are two types of app on the desktop today: web browsers, and things pretending not to be web browsers.

    • justlemmyin@lemmy.world 10 hours ago

      Naah bro, Teams is a trash resource hog. What you’re saying is essentially “it works on my computer”.

  • WanderingThoughts@europe.pub 9 hours ago

    That’s been going on for a lot longer. We’ve replaced systems that ran on a single computer less powerful than my phone, but that could switch screens in the blink of an eye and update their information several times per second, with new systems running on several servers with all the latest gadgets that take ten seconds to switch screens and update information once a second at best. Yeah, those layers of abstraction start adding up over the years.

  • goatinspace@feddit.org 10 hours ago

    [image]
