Sam_Bass@lemmy.world 2 days ago
Doesn’t confuse me, just pisses me off trying to do things I don’t need or want done. Creates problems to find solutions to.
Gsus4@mander.xyz 2 days ago
Can the NPU at least stand in as a GPU in case you need it?
Sam_Bass@lemmy.world 2 days ago
Nope. Don’t need it
UsoSaito@feddit.uk 2 days ago
No, as it doesn’t process graphical information; it’s solely for running computations for “AI stuff”.
Gsus4@mander.xyz 2 days ago
GPUs aren’t just for graphics. They speed up all kinds of vector operations, including those used in “AI stuff”. I’d just never heard of NPUs before, so I imagine they may be hardwired for the structure of neural nets instead of general linear algebra.
JATtho@lemmy.world 1 day ago
Initially, x86 CPUs didn’t have an FPU. It cost extra and was delivered as a separate chip.
Later, the GPU emerged as basically an overgrown SIMD FPU.
An NPU is a specialized GPU that operates on low-precision floating-point numbers and mostly does matrix multiply-and-add.
There is zero actual neural processing going on here. That would mean a chip operating on bursts of encoded analog signals, within a power budget of about 20 W, able to adjust its weights on the fly, without a few datacenters spending an excessive amount of energy to update the model’s weights.
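To make the “low-precision matrix multiply-and-add” point concrete, here’s a minimal Python sketch of what an NPU’s core loop amounts to: quantize values to int8-like precision, do multiply-accumulate (MAC) with a wide accumulator, then dequantize. The `quantize`/`matmul_int8` helpers and the symmetric-scaling scheme are illustrative assumptions, not any vendor’s actual API.

```python
# Illustrative sketch: the multiply-accumulate (MAC) core that NPU-style
# hardware repeats millions of times, with inputs quantized to low precision.
# Assumption: symmetric linear quantization of floats in [-1, 1] to int8 range.

def quantize(x, scale=127.0):
    """Crudely round a float in [-1, 1] to an int8-like integer."""
    return max(-127, min(127, round(x * scale)))

def matmul_int8(A, B, scale=127.0):
    """Multiply two small float matrices via int8-like values,
    accumulating in full precision, then dequantizing the result."""
    qA = [[quantize(a, scale) for a in row] for row in A]
    qB = [[quantize(b, scale) for b in row] for row in B]
    n, k, m = len(A), len(B), len(B[0])
    out = []
    for i in range(n):
        row = []
        for j in range(m):
            acc = 0  # wide accumulator, as real NPUs typically use
            for t in range(k):
                acc += qA[i][t] * qB[t][j]  # the MAC operation
            row.append(acc / (scale * scale))  # dequantize back to float
        out.append(row)
    return out

A = [[0.5, -0.25], [0.1, 0.9]]
B = [[0.2, 0.4], [-0.6, 0.3]]
print(matmul_int8(A, B))  # close to the exact product [[0.25, 0.125], [-0.52, 0.31]]
```

The point is that nothing here is “neural”: it’s plain linear algebra, just done with cheap narrow numbers, which is why the silicon can be so small and power-efficient.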