2024 might be the breakout year for efficient ARM chips in desktop and laptop PCs.
Can’t wait. I recently bought a firewall that gets noticeably warm at idle, even in a small case with a heat sink. We need more energy-efficient PCs.
Submitted 10 months ago by corbin@infosec.pub to technology@lemmy.world
https://www.spacebar.news/arm-desktop-2024/
“The most exciting tech isn’t the thing that currently exists and is being improved and integrated daily, it’s this other thing we don’t even know for sure will maybe happen.”
"Forget about the possibility that we may finally have developed machines that think, that comprehend the world in a way similar to how humans do and can communicate with us on our level. This new chip design might end up with comparable capabilities to the existing chip design!"
Yeah, there was no need to try to hype this up as the biggest thing ever.
“Forget about the possibility that we may finally have developed machines that think, that comprehend the world in a way similar to how humans do and can communicate with us on our level.”
That isn’t what’s happening with “AI” right now.
This is just a repeat of the same old pro-RISC myths from decades ago. There is very little performance difference between x86 and any RISC-based CPU, at least as far as the ISA itself is concerned. Apple merely has the advantage of far more resources for CPU development than its competitors.
Modern x86 is a CISC outer layer around a RISC inner core. It didn’t hang on this long by ignoring RISC, but by assimilating it.
One of the hurdles to ARM is that you need to recompile and maintain a separate version of every piece of software for the different processors.
This is a much easier task in a tightly controlled ecosystem like the Mac than in the Windows ecosystem, with its many different suppliers. You can run non-native software through some form of emulation, but at the cost of the efficiency you were hoping to gain.
Another OS variation also adds a big cost and burden for enterprise customers, who need to manage patches, security, etc.
I would expect to see more inroads in non-corporate areas following Apple’s success, but not any sort of explosion.
Microsoft has spent the last few years rebuilding their stuff to work on ARM. No idea how far they’ve come, but you will absolutely see Windows on ARM in the enterprise.
Apple has the benefit of having done architecture transitions a few times already. Microsoft has been trying to get everyone out of the “Program Files (x86)” directory for over a decade.
On the other hand, a completely open ecosystem works well too: Linux on ARM feels exactly like Linux on x86-64 in my experience. Granted, this is for headless stuff (an RPi and an Orange Pi, both ARM, both running Debian), but really the only difference is the bootloader situation.
I really hope so... Those x86 architecture chips are killing me.
Does anyone else worry that the rise of personal computers using highly custom SoCs is going to hurt our ability to build our own machines?
Omg, the porting of games would be awful.
Wake me up when RISC-V has performance parity and more software.
Ah yes, let’s welcome the one-device, one-operating-system model to the desktop. Welcome the expiration date on computers called “years of software support,” and welcome overall unfriendliness toward alternative systems.
Performance and efficiency are one side of the coin. But let me remind you that Qualcomm (along with Google) is the reason we cannot have lifetime updates for our phones, ROM builds need to be specific to each model, and making a phone run anything but Android is nearly impossible.
I’ll take ARM over x86, but I’ll take AMD/Intel over Qualcomm a thousand times over.
AI is currently limited in application. I think when we start seeing it in things like document ecosystems like Google Workspace or Microsoft Office, or in operating systems like Windows 12 and Android, that’s when we’ll start seeing what it’s really capable of.
Things like creating helper bots that aid in troubleshooting, or “assistants” that can draft/send emails, create calendar events, answer questions based on emails, etc.
But yeah, in its current state it is mostly just a glorified search engine.
Oh yeah, 2024 will definitely be the year AI gets integrated into all those products.
But one detail we cannot forget is that with the spread of the ARM architecture to PCs and laptops, we will probably see an increase in fully locked-down hardware. We don’t need the expansion of ARM on PCs if it doesn’t come with hardware and software freedom.
blazera@kbin.social 10 months ago
The article says a few times that x86 is decades old… but so is ARM? I don’t know what’s supposed to be game-changing about it.
abhibeckert@lemmy.world 10 months ago
I’m guessing you’ve never used an ARM Mac.
They don’t look all that fast on GeekBench (more on that further down), but in real-world usage they are incredibly fast. As in: an entry-level 13″ school-homework laptop will have performance on par with a high-end gaming PC with a thousand-watt PSU.
I’m able to play games at 3K resolution with good settings while running on battery power, and it lasts several hours on battery. Not a big battery either; it’s about half the size/weight of a typical gaming-laptop battery. I’m also able to compile software nice and quickly, I can run Docker with a dozen containers open at the same time without breaking a sweat (which is particularly impressive on the Mac version of Docker, which runs in a virtual machine instead of directly on the host), and Stable Diffusion generates images in about 20 seconds with typical generation settings.
The best thing, though, is I can do all of that on a tiny battery that lasts almost an entire day under heavy load and multiple days under normal load. I’ve calculated that the average power draw with typical use is somewhere around 3 watts for the entire system, including the screen. It’s hard to believe, especially considering how fast it is.
On the modest GeekBench scores Apple’s ARM processors get: it’s critical to understand that GeekBench is designed to test very short bursts and avoid thermal throttling. Intel’s recent i9 processors, even with good cooling, will thermally throttle after about 12 seconds, and GeekBench is designed to stay under that limit by running much shorter bursts. Apple’s processors not only take far longer to thermally throttle, they also “throttle” by dropping to barely below full speed.
But even worse than that: one of the ways Apple achieves incredible battery life is that they don’t run the processor at high clock rates for short bursts. The CPU starts slow and only ramps up to full speed when you keep it under sustained load. So something quick, like loading a webpage, won’t run at full speed, and neither will GeekBench.
A third difference, and probably the biggest one, is that Apple’s processors have very fast memory and massive memory caches, which are even faster. Again, that often doesn’t show up on a CPU benchmark, because it isn’t really measuring compute power. But real-world software spends a massive amount of time just reading from and writing to memory, and those operations are fast on Apple’s ARM processors.
You really can’t trust the benchmarks when you’re comparing completely different processors. You need to try real world usage, and the real world usage difference is game changing.
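The memory-versus-compute point above is easy to demonstrate with a toy microbenchmark, sketched below using only the Python standard library. One loop is arithmetic-bound; the other chases randomly ordered indices through a large list so that caches and memory latency dominate. Absolute numbers will vary wildly by machine, and an interpreted language adds its own overhead; the point is only that the two loops stress different parts of the chip, which is why a compute-focused benchmark can miss memory-system advantages.

```python
import random
import time

def compute_bound(n: int) -> float:
    """Tight arithmetic loop: stresses the ALU, barely touches memory."""
    start = time.perf_counter()
    acc = 0
    for i in range(n):
        acc += i * i
    return time.perf_counter() - start

def memory_bound(data: list, order: list) -> float:
    """Reads scattered across a large list: stresses caches and RAM."""
    start = time.perf_counter()
    acc = 0
    for i in order:
        acc += data[i]
    return time.perf_counter() - start

if __name__ == "__main__":
    n = 1_000_000
    data = list(range(n))
    order = random.sample(range(n), n)  # random order defeats the prefetcher
    print(f"compute-bound: {compute_bound(n):.3f}s")
    print(f"memory-bound:  {memory_bound(data, order):.3f}s")
```

On a machine with fast memory and large caches, the second loop closes much of the gap to the first; on one with slow memory it falls far behind, and a bursty compute benchmark would never show the difference.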
0ddysseus@lemmy.world 10 months ago
Haha “entry level school homework Mac” Hahahahaha Sure thing Richy Rich
soren446@lemmy.world 10 months ago
Don’t even dare saying anything positive on Lemmy about Apple, even if your comment is constructive and adds to the conversation. Lemmy’s tech community is somehow worse than Reddit unless you’re into Linux. Your comment doesn’t deserve the barrage of downvotes you got.
I use arch btw.
joshhsoj1902@lemmy.ca 10 months ago
I work on an ARM Mac, it’s fine. If you’re just doing light work on it, it works great! Like any other similarly priced laptop would.
Under load, or doing work outside what it is tuned for, it doesn’t perform spectacularly.
It’s a fine laptop, the battery life is usually great. But as soon as you need to use the x86 translation layer, performance tanks, battery drains, it’s not a great time.
Things are getting better, and for a light user it works great, but I’m much more excited about modern x86 laptop processors for the time being.
frezik@midwest.social 10 months ago
x86 has an incredible amount of cruft built up to support backwards compatibility all the way back to the 8086. ARM isn’t free of cruft, but it’s nowhere near the same level. Most of that isn’t directly visible to customers, though.
What is visible is that more than three companies can license and manufacture ARM chips. The x86 market has one company that owns the ISA, another that licenses it but owns the 64-bit extensions, and a third that technically exists but is barely worth talking about. x86 is also incredibly difficult to optimize, and the people who know how already work for one of the two main companies (arguably only one at this point). Even if you could legally license it as a fourth player, you couldn’t hire people who could design an x86 core that’s worth a damn.
Conversely, ARM cores are designed by CS students all the time. That’s the real advantage to end users: far more companies who can produce designs. If one of them fails the way Intel has of late, we’re not stuck with just one other possibility.