Consumer hardware is no longer a priority for manufacturers
Submitted 2 weeks ago by throws_lemy@lemmy.nz to technology@lemmy.world
https://www.xda-developers.com/consumer-hardware-is-no-longer-a-priority-for-manufacturers/
Comments
Ilixtze@lemmy.ml 2 weeks ago
AMERICAN manufacturers, just wait until the Chinese industries swoop in to fill the gap. I seriously feel America just wants to kneecap itself.
foodandart@lemmy.zip 2 weeks ago
Wants to kneecap itself?
Dude, the US is going full seppuku and we’re going to gut ourselves on the floor.
boogiebored@lemmy.world 2 weeks ago
“banned for security concerns”
Ilixtze@lemmy.ml 2 weeks ago
Not a problem for me; I’m not in America, I own a Huawei phone and a Huion Tablet.
errer@lemmy.world 2 weeks ago
Hard to swoop in with massive tariffs. The few players that remain will just charge a lot more…it’ll become the rich lucky few who can afford their own hardware.
tja@sh.itjust.works 2 weeks ago
The US is not the only place to sell to
Ilixtze@lemmy.ml 2 weeks ago
The rest of the world will be fine.
brucethemoose@lemmy.world 2 weeks ago
I mean, I’d kill for a Chinese GPU. But lock-in for your Steam back catalog is strong.
Also, have you been watching all the Chinese GPU announcements? They’re all in on machine learning ASICs too.
Ilixtze@lemmy.ml 2 weeks ago
There is already a lot of good Chinese DDR5 memory on the market and it's only a matter of time before Chinese GPUs and CPUs proliferate. I remember people in the global north were sceptical about the viability of Chinese electric cars just 5 years ago; Elon even laughed at the possibility. Tables turn fast when you have industrial capacity and central planning.
BackgrndNoize@lemmy.world 2 weeks ago
Do you think they won’t just ban the Chinese products lol, this ain’t a democracy bud
Ilixtze@lemmy.ml 2 weeks ago
I’m not American so I ain’t part of your non-democracy.
IratePirate@feddit.org 2 weeks ago
America doesn’t. The Russian asset in the White House and its brainwashed minions do.
mattyroses@lemmy.today 2 weeks ago
You guys voted him in, twice, with the popular vote the second time. Don’t pretend you don’t own him.
Ilixtze@lemmy.ml 2 weeks ago
Sorry to break it to you, bud, but America has a plutocracy problem. It's not a question of Putin running the show, but of the American legal system being unable to prosecute crimes and corruption when the perpetrators happen to be billionaires. So in essence the system is compromised.
UnspecificGravity@piefed.social 2 weeks ago
I'm looking forward to cheap Chinese video cards that outperform Nvidia shit for 1/4 the price.
Lfrith@lemmy.ca 2 weeks ago
Hopefully Linux supported. That's the main selling point of AMD GPUs for me right now, since there are fewer problems getting stuff like HDR running on them than on NVIDIA.
I wonder why China is still for the most part ignoring Linux in favor of Windows. Like to update 8bitdo controllers they only provide a Microsoft program and no Linux version.
You’d think they’d be rushing towards pushing Linux adoption.
jbloggs777@discuss.tchncs.de 2 weeks ago
That’s capitalism for you. But also Linux, where it’s typical to upstream hardware support and rely on existing ecosystems rather than release addon drivers or niche supporting apps.
China has made some strategic investments in Linux over the years though – often domestically targeted, like Red Flag Linux, and drivers for Chinese hardware, etc.
captain_aggravated@sh.itjust.works 2 weeks ago
China has no need for open source because they steal everything anyway.
UnspecificGravity@piefed.social 2 weeks ago
Most of the handheld consoles they sell run Linux of some variety. It’s just a question of what is marketable.
Psythik@lemmy.world 2 weeks ago
I hope you’re right because Intel and AMD still can’t compete with high end Nvidia cards, and that’s how we ended up with a $5000 5090.
muusemuuse@sh.itjust.works 2 weeks ago
And AMD can already beat Nvidia at the price tiers most people actually buy at, and Intel is gaining ground way faster than anyone expected.
But outside of the GPU shakeup, I could give a shit about Intel. Let China kill us. We earned this.
drev@lemmy.dbzer0.com 2 weeks ago
FIVE THOUSAND?!
Jesus nun-fucking Christ, what an absolute scam. I bought a 1070 for $220 in the first few months after release. Guess I’ll just have to hope it can run for another 10 years…
JohnEdwa@sopuli.xyz 2 weeks ago
We also partly ended up with the $5k 5090 because it's just the TITAN RTX of the 50xx generation - the absolute top-of-the-line card where you pay 200% extra for that last +10% performance.
nVidia just realized a few generations back that naming those cards the xx90 gets a bunch more people to buy them, because they always desperately need to have the shiniest newest xx90 card, no matter the cost.
EndlessNightmare@reddthat.com 2 weeks ago
Nvidia cards are more powerful, but even if others never catch up they could still be solidly "good enough" for gaming. I have a newer Nvidia card and my computer feels wildly overbuilt. The only thing I wish I had more of is SSD space, but that's a different problem.
Unless you’re a professional competitive gamer, in which case this is actual work equipment, the difference in performance between medium-tier and upper-echelon is probably not worth it for the average consumer.
dhork@lemmy.world 2 weeks ago
This is yet another thing I blame on American Business sacrificing itself on the altar of Shareholder Value. It’s no longer acceptable for a public business to simply make a profit. It has to grow that profit, every quarter, without fail.
So, simply having a good consumer product division that makes money won’t be enough. At some point some executive will decide that he can’t possibly get his bonus if that’s all they do, and decide they need to blow it all up to chase larger profits elsewhere.
Maybe we need a small, private company to come along and start making good consumer hardware. They still need components, though, so will have to navigate getting that from public companies who won’t return their calls. And even once they are successful, the first thing they will do is cash out and go public, and the cycle starts again.
SoleInvictus@lemmy.blahaj.zone 2 weeks ago
Maybe we need a small, private company to come along and start making good consumer hardware.
I’ve always wanted to start a business like this. “Generic Brand” household goods. Not fancy, just solidly functional base models but with modular upgradability. Wish you bought the WiFi capable washer? Buy the module for $30. Everything would be fully user serviceable and upgradable (within reason), so parts sales ensure sustained income once market saturation is reached.
dhork@lemmy.world 2 weeks ago
It’s not totally out of the realm of possibility. Michael Dell did it, after all, but he did it in a different time.
And Dell is actually a good case study for all this. It went public rather quickly after it started growing, but grew a bit stagnant by the 2000s. So much so that in 2013, Michael Dell orchestrated a leveraged buyout of his own company (with the help of venture capital) to take it private again. He pretty much admitted that the changes he wanted to make to the company would be impossible while it was still public. It stayed private for a while, but went public again as part of some deal made after it acquired the parent company of VMware.
Another notable thing is that Carl Icahn owned a large chunk of Dell, both in its first public incarnation and in its private incarnation. When Dell tried to take it private, Icahn challenged the plan, and thought about putting in his own bid, only to back off when he decided it wasn't worth the effort to revive the company. Still, he was publicly against Dell's buyout plan but was outvoted by other shareholders. Yet he must have still held a part of the private company, because Icahn also protested its second plan to go public, and forced Dell to increase its terms to the private holders.
Michael Dell is no saint, but I conclude that he realized the company meant more than a spreadsheet and needed a purpose to justify its existence. He also realized that in order to sustain a business over the long term, having to constantly hit quarterly numbers may be counterproductive. I think Carl Icahn, on the other hand, only cares about Number Go Up, and doesn't care at all about how the company makes that happen. Over the long term, that will never be sustainable, but fuck you all, he got his bag already.
DevotedShitStain69@lemmy.world 2 weeks ago
Imma remember what Crucial and others are doing, so when the AI bubble pops I'll skip all their products.
neclimdul@lemmy.world 2 weeks ago
Kind of makes sense really when you think about it. The vast majority of consumers have had all their wealth eroded over decades to the point no one can buy anything. Better to let the AIs buy everything now.
Korkki@lemmy.ml 2 weeks ago
The silver lining is that hardware performance gains have been so minor from generation to generation that upgrading isn't really that important anymore. Like, if I upgraded to the next generation's equivalent GPU it would give me maybe 8% more fps… and it costs like 1.5k… No thanks.
cmnybo@discuss.tchncs.de 2 weeks ago
You used to get a fairly significant upgrade every few years for about the same cost as the old hardware. Transistors aren't really getting much smaller anymore, so more performance needs a bigger die and costs more money.
amorpheus@lemmy.world 2 weeks ago
Probably also a big reason why it's less profitable - consumers are upgrading more and more slowly, in part because the performance gains are smaller, in part because a lot of things are getting more expensive. In that way it's a self-fulfilling prophecy.
varjen@lemmy.world 2 weeks ago
I see a future where all computing is done in the cloud and home computers are just dumb terminals. An incredibly depressing future. Customers not users is the goal.
Buelldozer@lemmy.today 2 weeks ago
This has been predicted and worked towards since the 90s.
lavander@lemmy.dbzer0.com 2 weeks ago
ChromeOS was meant for exactly that.
Atropos@lemmy.world 2 weeks ago
They can pry my hardware from my cold, dead hands.
Also my car with knobs. I will continue to maintain that car for as long as possible.
varjen@lemmy.world 2 weeks ago
Cars without knobs shouldn’t be allowed. Knobs are awesome.
umbrella@lemmy.ml 2 weeks ago
off to sell it cheaper to companies, so they can rent it back to us.
tal@lemmy.today 2 weeks ago
For some workloads, yes. I don’t think that the personal computer is going to go away.
But it also makes a lot of economic and technical sense for some of those workloads.
Historically — like, think up to about the late 1970s — useful computing hardware was very expensive. And most people didn’t have a requirement to keep computing hardware constantly loaded. In that kind of environment, we built datacenters and it was typical to time-share them. You’d use something like a teletype or some other kind of thin client to access a “real” computer to do your work.
What happened at the end of the 1970s was that prices came down enough and there was enough capability to do useful work to start putting personal computers in front of everyone. You had enough useful capability to do real computing work locally. They were still quite expensive compared to the great majority of today’s personal computers:
en.wikipedia.org/wiki/Apple_II
The original retail price of the computer was US$1,298 (equivalent to $6,700 in 2024)[18][19] with 4 KB of RAM and US$2,638 (equivalent to $13,700 in 2024) with the maximum 48 KB of RAM.
But they were getting down to the point where they weren’t an unreasonable expense for people who had a use for them.
At the time, telecommunications infrastructure was much more limited than it is today, so using a "real" computer remotely from many locations was a pain, which also made the PC make sense.
From about the late 1970s to today, the workloads that have dominated most software packages have been more-or-less serial computation. While “big iron” computers could do faster serial compute than personal computers, it wasn’t radically faster. Video games with dedicated 3D hardware were a notable exception, but those were latency sensitive and bandwidth intensive, especially relative to the available telecommunication infrastructure, so time-sharing remote “big iron” hardware just didn’t make a lot of sense.
And while we could — and to some extent, did — ramp up serial computational capacity by using more power, there were limits on the returns we could get.
However, what AI stuff represents has notable differences in workload characteristics. AI requires parallel processing. AI uses expensive hardware. We can throw a lot of power at things to get meaningful, useful increases in compute capability.
- Just like in the 1970s, the hardware to do competitive AI stuff for many things that we want to do is expensive. Some of that is short term, like the fact that we don't have the memory manufacturing capacity in 2026 to meet demand, so prices will rise until enough buyers are priced out that the available chips go to the highest bidders. That'll resolve itself one way or another, like via a buildout in memory capacity. But some of it is also that the quantities of memory involved are still pretty expensive. Even at pre-AI-boom prices, if you want the kind of memory that it's useful to have available — hundreds of gigabytes — you're going to be significantly increasing the price of a PC, and that's before whatever the cost of the computation hardware is.
- Power. Currently, we can usefully scale out parallel compute by using a lot more power. Under current regulations, a laptop that can go on an airline in the US can have a 100 Wh battery and a 100 Wh spare, separate battery. If you pull 100 W on a sustained basis, you blow through a battery like that in an hour. A desktop can go further, but is limited by heat and cooling, is going to start running into a limit for US household circuits at something like 1800 W, and at that point is dumping a very considerable amount of heat into the house. Current Nvidia hardware pulls over 1 kW. A phone can't do anything like any of the above. The power and cooling demands range from totally unreasonable to at least somewhat problematic. So even if we work out the cost issues, I think it's very likely that the power and cooling issues will be a fundamental bound.
In those conditions, it makes sense for many users to stick the hardware in a datacenter with strong cooling capability and time-share it.
Now, I personally really favor having local compute capability. I have a dedicated computer, a Framework Desktop, to do AI compute, and also have a 24GB GPU that I bought in significant part for that. I'm not at all opposed to doing local compute. But at current prices, unless that kind of hardware can provide a lot more benefit than it currently does, most people are probably not going to buy local hardware.
If your workload keeps hardware active 1% of the time — and use as a chatbot might well look like that — then it is something like a hundred times cheaper in terms of hardware cost to have the hardware timeshared. If the hardware is expensive — and current Nvidia hardware runs tens of thousands of dollars, too rich for most people's taste unless they're getting Real Work done with the stuff — it looks a lot more appealing to time-share it.
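A rough back-of-the-envelope for that utilization point (the card price and duty cycle below are made-up assumptions, not figures from this thread):

```python
# Sketch: effective hardware cost per user for dedicated vs. time-shared accelerators.
# All numbers are illustrative assumptions, not real prices.

CARD_PRICE = 30_000   # assumed datacenter-class accelerator, USD
UTILIZATION = 0.01    # a chatbot-style user keeps it busy ~1% of the time

def dedicated_cost_per_user(card_price: float) -> float:
    """Everyone buys their own card, however idle it sits."""
    return card_price

def shared_cost_per_user(card_price: float, utilization: float) -> float:
    """One card is time-shared; each user only 'pays for' the slice they use."""
    users_per_card = 1 / utilization  # ~100 users fit on one card at a 1% duty cycle
    return card_price / users_per_card

print(dedicated_cost_per_user(CARD_PRICE))            # 30000.0
print(shared_cost_per_user(CARD_PRICE, UTILIZATION))  # 300.0 -> roughly 100x cheaper per user
```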
There are some workloads for which there might be constant load, like maybe constantly analyzing speech, doing speech recognition. For those, then yeah, local hardware might make sense. But…if weaker hardware can sufficiently solve that problem, then we’re still back to the “expensive hardware in the datacenter” thing.
Now, a lot of Nvidia's costs are going to be fixed, not variable. And assuming that AMD and so forth catch up, prices in a competitive market will come down — with scale, one can spread fixed costs out, and only the variable costs will place a floor on hardware costs. So I can maybe buy that, if we hit limits that mean that buying a ton of memory isn't very interesting, price will come down. But I am not at all sure that the "more electrical power provides more capability" aspect will change. And as long as that holds, it's likely going to make a lot of sense to use "big iron" hardware remotely.
What you might see is a computer on the order of, say, a 2022 computer on everyone’s desk…but that a lot of parallel compute workloads are farmed out to datacenters, which have computers more-capable of doing parallel compute there.
Cloud gaming is a thing. I'm not at all sure that the cloud will dominate there, even though it can leverage parallel compute. There, latency and bandwidth are real issues. You'd have to put enough datacenters close enough to people to make that viable and run enough fiber. And I'm not sure that we'll ever reach the point where it makes sense to do remote compute for cloud gaming for everyone. Maybe.
But for AI-type parallel compute workloads, where the bandwidth and latency requirements are a lot less severe, and the useful returns from throwing a lot of electricity at the thing significant…then it might make a lot more sense.
I’d also point out that my guess is that AI probably will not be the only major parallel-compute application moving forward. Unless we can find some new properties in physics or something like that, we just aren’t advancing serial compute very rapidly any more; things have slowed down for over 20 years now. If you want more performance, as a software developer, there will be ever-greater relative returns from parallelizing problems and running them on parallel hardware.
umbrella@lemmy.ml 2 weeks ago
I don't think it'd make so much financial sense for us once they charge by the minute of compute.
Amazon AWS pricing is only the tip of the iceberg of what could be coming.
floquant@lemmy.dbzer0.com 2 weeks ago
Just a reminder that “consumer” means human. They’re fucking over everyone in favour of “corporations” (aka a few select humans)
ivanafterall@lemmy.world 2 weeks ago
Human…capital?
brucethemoose@lemmy.world 2 weeks ago
Also, this has been the case (or at least planned) for a while.
Pascal (the GTX 1000 series) and Ampere (the RTX 3000 series) used the exact same architecture for datacenter/gaming. The big gaming dies were dual use and datacenter-optimized. This habit goes all the way back to ~2008, but Ampere and the A100 are really where datacenter came first.
AMD announced a plan to unify their datacenter/gaming architectures a while ago.
Intel wanted to do this, but had some roadmap trouble.
melfie@lemy.lol 2 weeks ago
I've been looking into self-hosting LLMs, and it seems a $10k GPU is kind of a requirement to run a decently-sized model and get a reasonable tokens/s rate. There's CPU and SSD offloading, but I'd imagine it would be frustratingly slow to use. I find cloud-based AI like GH Copilot to be rather annoyingly slow. Even so, GH Copilot is like $20 a month per user, and I'd be curious what the actual costs are per user considering the hardware and electricity cost.
What we have now is clearly an experimental first generation of the tech, but the industry is building out data centers as though it’s always going to require massive GPUs / NPUs with wicked quantities of VRAM to run these things. If it really will require huge data centers full of expensive hardware where each user prompt requires minutes of compute time on a $10k GPU, then it can’t possibly be profitable to charge a nominal monthly fee to use this tech, but maybe there are optimizations I’m unaware of.
brucethemoose@lemmy.world 2 weeks ago
This is not true. I have a single 3090 + 128GB CPU RAM (which wasn’t so expensive that long ago), and I can run GLM 4.6 350B at 6 tokens/sec. I can run sparser models like Stepfun 3.5, GLM Air or Minimax 2.1 much faster, and these are all better than the cheapest API models.
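For a sense of why a single GPU plus a pile of system RAM can still be usable with sparse MoE models: decoding is roughly memory-bandwidth-bound, so a crude estimate looks like the sketch below. The parameter count, quant size, and bandwidth figures are assumptions for illustration, not measurements of GLM; a real setup split across GPU and RAM lands somewhere between the two extremes.

```python
# Rough decode-speed estimate for a memory-bandwidth-bound model.
# All figures are assumptions for illustration, not benchmarks.

def est_tokens_per_sec(active_params_b: float, bytes_per_weight: float,
                       mem_bandwidth_gb_s: float) -> float:
    """Decoding one token has to stream the active weights once,
    so throughput is roughly bandwidth / bytes touched per token."""
    bytes_per_token_gb = active_params_b * bytes_per_weight  # GB read per token
    return mem_bandwidth_gb_s / bytes_per_token_gb

# Hypothetical sparse MoE: ~32B active parameters at a ~4-5 bit quant (~0.55 bytes/weight).
print(est_tokens_per_sec(32, 0.55, 80))   # ~4.5 tok/s if everything streamed from dual-channel DDR5
print(est_tokens_per_sec(32, 0.55, 900))  # ~51 tok/s if it all fit in GPU VRAM
```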
melfie@lemy.lol 2 weeks ago
Appreciate all the info! I did find this calculator the other day, and it’s pretty clear the RTX 4060 in my server isn’t going to do much though its NVMe may help.
apxml.com/tools/vram-calculator
I’m also not sure under 10 tokens per second will be usable, though I’ve never really tried it.
I'd be hesitant to buy something just for AI that doesn't also have RTX cores because I do a lot of Blender rendering. RDNA 5 is supposed to have more competitive RTX cores along with NPU cores, so I guess my ideal would be a SoC with a ton of RAM. Maybe when RDNA 5 releases, the RAM situation will have blown over and we will have much better options.
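For a rough idea of what calculators like the one linked above are doing under the hood, here is a minimal sketch. The model config is hypothetical, and real tools also account for activations and runtime overhead.

```python
# Sketch of the usual VRAM estimate: quantized weights + KV cache.
# Formulas are the standard rough ones; the example model config is made up.

def weights_gb(n_params_b: float, bits_per_weight: float) -> float:
    return n_params_b * bits_per_weight / 8  # billions of params * bytes per weight

def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                context_len: int, bytes_per_elem: int = 2) -> float:
    # 2x for keys and values, one entry per layer per cached token
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem / 1e9

# Hypothetical 8B dense model: 4-bit quant, 32 layers, 8 KV heads, 128-dim heads, 16k context
total = weights_gb(8, 4) + kv_cache_gb(32, 8, 128, 16_384)
print(f"{total:.1f} GB")  # ~6.1 GB -> fits an 8GB card, with little headroom for activations
```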
WhyJiffie@sh.itjust.works 2 weeks ago
You can’t just do “ollama run” and expect good performance, as the local LLM scene is finicky and highly experimental. You have to compile forks and PRs, learn about sampling and chat formatting, perplexity and KL divergence, about quantization and MoEs and benchmarking. Everything is moving too fast, and is too performance sensitive, to make it that easy, unfortunately.
how do you have the time to figure all these out and keep being up to date? do you do this at work?
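On the "perplexity and KL divergence" part of the quoted advice, a minimal sketch of how KL divergence gets used to check how far a quantized model's next-token distribution drifts from the full-precision one (the logits below are invented):

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def kl_divergence(p: np.ndarray, q: np.ndarray, eps: float = 1e-12) -> float:
    """KL(P || Q): extra surprise from using the quantized distribution Q
    when tokens actually follow the reference distribution P."""
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

# Hypothetical logits over a tiny 5-token vocabulary at one position
ref_logits   = np.array([2.0, 1.0, 0.5, -1.0, -2.0])   # full-precision model
quant_logits = np.array([1.9, 1.1, 0.4, -0.8, -2.2])   # quantized model

# Averaged over many positions, a small value means the quant tracks the original closely
print(kl_divergence(softmax(ref_logits), softmax(quant_logits)))
```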
Xenny@lemmy.world 2 weeks ago
AI failed and now they are doing this to capture the compute market, then make their profit back through unscrupulous means.
hector@lemmy.today 2 weeks ago
As I am told, there is no way these LLMs ever make their investments back. It's like Tesla at this point. Whoever is paying the actual money to build this stuff is going to get hosed if they can't offload it onto some other sucker. That ultimate sucker probably being the US taxpayer.
Analog@lemmy.ml 2 weeks ago
Can run decent size models with one of these: …minisforum.com/…/minisforum-ms-s1-max-mini-pc
For $1k more you can have the same thing from Nvidia in their DGX Spark. You can use high-speed fabric to connect two of 'em and run 405B-parameter models, or so they claim.
Point being that’s some pretty big models in the 3-4k range, and massive models for less than 10k. The nvidia one supports comfyui so I assume it supports cuda.
It ain’t cheap and AI has soooo many negatives, but… it does have some positives and local LLMs mitigate some of the minuses, so I hope this helps!
melfie@lemy.lol 2 weeks ago
Nice, though $3k is still getting pretty pricey. I see mini PCs with an AMD Ryzen AI Max+ 395 and 96GB of RAM can be had for $2k, or even $1k with less RAM: gmktec.com/…/amd-ryzen™-ai-max-395-evo-x2-ai-mini…
I’m looking for something that also does path tracing well if I’m going to drop that kind of coin. It sounds like this chip can be on par with a 4070 for rasterization, but it only gets a benchmark score of 495 for Blender rendering compared to 3110 for even a RTX 4060. RDNA 5 with true RTX cores should drastically change the situation of chips like this, though.
Clam_Cathedral@lemmy.ml 2 weeks ago
Honestly just jump in with whatever hardware you have available and a small 1.5b/7b model. You’ll figure out all the difficult uncertainties as you go and try to improve things.
I'm hosting a few lighter models that are somewhat useful and fun without even using a dedicated GPU - just a lot of RAM and fast NVMe so the models don't take forever to spin up.
Of course I’ve got an upgrade path in mind for the hardware and to add a GPU but there are other places I’d rather put the money atm and I do appreciate that it all currently runs on a 250w PSU.
Bakkoda@sh.itjust.works 2 weeks ago
Consumer sales are very very trackable. Off channel bulk sales can be very hard to verify and I’m sure that’s not being used to prop up valuation. Not at all.
Grandwolf319@sh.itjust.works 2 weeks ago
Even as consumer revenue remains sizable and maintains steady year-on-year growth, it finds itself competing against segments that grow exponentially faster and earn more per unit.
So it has nothing to do with people having less money. It honestly gives me hope, things could change with a bubble burst.
forrgott@lemmy.zip 2 weeks ago
The point is to make the bubble bigger; then they'll pretend that's why they have to exit the consumer market.
The goal is the so-called "thin client" - i.e. absolutely everything in the cloud.
4am@lemmy.zip 2 weeks ago
Bingo! And they’re doing it to enterprises too. Why do you think copilot is shoved into everything? Why do you think Recall is creeping towards being mandatory? Why do you think OneDrive isn’t optional anymore?
They don't just want your data for advertising. They want to watch the entire capital machine in real time to make sure there aren't any gaps, any dissent, and most of all, anyone getting the jump on something new and big. OneDrive to rule them all.
adespoton@lemmy.ca 2 weeks ago
At what point can AI companies play the “too big to fail” card though, like the banks?
Bubble bursts, and the government uses our taxes to bail out the companies. Again.
WhatAmLemmy@lemmy.world 2 weeks ago
The nazis will bail out everyone who bends the knee and submits a bribe. It’s not like they’re spending their own money…
CosmoNova@lemmy.world 2 weeks ago
Yeah no shit?
swade2569@lemmy.world 2 weeks ago
Probably because the hardware is going into systems that eliminate jobs and we become broke. All that gear is gonna sit on the shelf if we can’t afford it.
RizzRustbolt@lemmy.world 2 weeks ago
You can make a lot of money selling imaginary products to nonfunctional industries.
dantheclamman@lemmy.world 2 weeks ago
This will not stop until the ultra-rich are destroyed as a class. They have constructed a parallel economy, and we are all their serfs. History shows this situation can’t last and the question is whether they can be parted with their wealth peacefully or not
Jason2357@lemmy.ca 2 weeks ago
Generally speaking, the consumer market has been entirely eclipsed by business to business sales. The only entities with expendable cash are businesses.
anon_8675309@lemmy.world 2 weeks ago
With the exception of a few demanding tasks, most modern hardware should last a while doing everyday work. If you're due for an upgrade, do it and then get off the consumerism train for a while. If you've got something from the past two or three years you're set for a while.
BeardededSquidward@lemmy.blahaj.zone 2 weeks ago
I thought I needed to upgrade my core components, but since I'm no longer chasing detailed graphics like I was, it's not as important. My rig is fairly old at this point but it's doing me just fine.
Mrkawfee@lemmy.world 2 weeks ago
In early 2022, consumer GPUs [Nvidia] accounted for 47% of Team Green’s total revenue; by early 2026, that share had fallen to just 7.5%. Over the same period, data center revenue surged to $51.2 billion, representing roughly 90% of the company’s earnings.
Wow, that’s a complete wipeout of GPUs for home computing.
I wonder if the diminishing returns in gaming graphics have something to do with it as well.
MonkderVierte@lemmy.zip 2 weeks ago
Eh, maybe a chance for an architectural redesign too, one that caters more to desktop than to server.
Draconic_MEO@programming.dev 2 weeks ago
Steal it from the assholes trying to charge you an arm and a leg.
SnotFlickerman@lemmy.blahaj.zone 2 weeks ago
Part of this has been a long-standing move by every industry to prioritize business-to-business sales as opposed to consumer sales simply because businesses have money and consumers don’t, because businesses are pocketing all the profits and refusing to pay their employees (consumers) a living wage, let alone a thriving wage.
It's been a long time coming for the PC industry too; it's been a general trend ever since the late 90s, as sales to business have become more profitable than consumer sales.
antonim@lemmy.dbzer0.com 2 weeks ago
This is an important point in general. The old story of “voting with your wallet” is now more and more obviously mathematically absurd.
realitista@lemmus.org 2 weeks ago
You can only vote with your wallet if there’s something in it.
nforminvasion@lemmy.world 2 weeks ago
Top 10% owns 93% of stocks and accounted for 55% of market activity in early 2025, before tariffs, mass layoffs, and an unnamed recession. They're probably at 60-65% of revenue now. You're absolutely right. The rich have been working to remove us from the equation for decades - in our buying power, and, with AI, in our labor power.
tomiant@piefed.social 2 weeks ago
I always hated that fucking saying. No, you vote with YOUR VOTE. If you vote with your dollars the man with a billion dollars has a billion votes more than you do.
nforminvasion@lemmy.world 2 weeks ago
I am so glad to see someone else talking about this. Yeah, We’re going back to feudalism… And only the upper ranks of society will be able to afford goods and be able to engage in trade.
SnotFlickerman@lemmy.blahaj.zone 2 weeks ago
I mean, it’s very arguable that we’ve just been doing “feudalism with extra steps” for a very long time anyway.
To be less US/Europe-centric than my original post: the majority of the world has been in the "priced out of anything but bare subsistence" basket for most of the history of modern capitalism, because only the citizens of the Imperial Democracies of the Western world were benefiting, while the majority of the Southern and Eastern hemispheres were simply locked out from being beneficiaries, either through trade embargoes or outright exploitation via not paying foreign workers the home-country equivalent and instead paying them a much lower "localized" rate.
It's really that the Imperial Boomerang has finally made its way home to the citizens of the West.
tomiant@piefed.social 2 weeks ago
Yet another lie of capitalism - that demand dictates supply. If people want something, capitalism will solve the problem, because greed will solve every problem, right? WRONG. It doesn't take into consideration that scale and logistics and infrastructure and mass production all collude and interact in ways that can and will easily create isolated and unexpected consequences, just like this one.
Capitalism itself is such a dumb and evil creation that the only thing that keeps it running is to have laws that limit its destructive power- unfortunately, since money is power, capitalism quickly supersedes rule of law and democracy, creating a system of government where feudal lords rule over what basically amounts to serfdom. Surely a more comfortable serfdom than in the 1300’s, but most certainly lacking the basic freedoms of a democratic society, and only so long as the lords so wish and comfort isn’t taken away at their whims due to opposition or otherwise.
Fuck this shit.
verdi@tarte.nuage-libre.fr 2 weeks ago
Eat the rich!
CosmoNova@lemmy.world 2 weeks ago
The whole economy reminds me more and more of the decline of the Roman Empire. Their biggest problem was that there were no consumers left to keep money in circulation and the economy afloat. You either owned lots of land and slaves or you were a slave, meaning the only ones you could trade with were merchants from outside the empire, but as the empire expanded, those became harder to reach. War expenses spiraled out of control while the economy declined until it ceased to exist.
Now mega corps only trade with each other and threaten to replace all workers with AI and robots. Meanwhile the economy becomes stale, people buy less while politicians around the globe cut down the social sector, meaning people will have even less money to spare. Money won‘t circulate as much, slowing things down even further.
There are ways out of this spiral of decline but billionaires won‘t give up so easily. You can say many things about them but they are persistent as hell.
SnotFlickerman@lemmy.blahaj.zone 2 weeks ago
That persistence is a type of sociopathy, though. It's an antisocial personality disorder. Sure, they're persistent, but they're persistent at pursuing absolutely terrible things for personal gain that effectively means nothing considering they already have enough power and money to make Solomon blush. It's a mental disorder where they need more and more and more while they already have more than they could ever use in their entire lifetimes, or in their grandchildren's grandchildren's lifetimes.