NPUs yes, TPUs no (or not yet). Hailo is meant to be releasing a TPU “soon” that accelerates LLMs.
RobotToaster@mander.xyz 1 week ago
Do NPUs/TPUs even work with ComfyUI? That’s the only “AI PC” I’m interested in.
SuspciousCarrot78@lemmy.world 1 week ago
Fermiverse@gehirneimer.de 1 week ago
https://github.com/patientx/ComfyUI-Zluda
Works with the 395+
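Haven’t tried that exact APU myself, but since ZLUDA exposes the AMD GPU to PyTorch as a CUDA device, a quick sanity check after following the repo’s setup looks something like this (illustrative, assumes torch was installed per the ComfyUI-Zluda instructions):

```python
# Sanity check: with ZLUDA working, the AMD GPU shows up as a "CUDA" device
# to PyTorch, so ComfyUI can use its normal CUDA code path.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    # Typically reports the AMD card (via ZLUDA) rather than an NVIDIA GPU.
    print("Device:", torch.cuda.get_device_name(0))
```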
L_Acacia@lemmy.ml 1 week ago
Support for custom nodes is bad, and NPUs are fairly slow compared to GPUs (expect 5x to 10x longer generation times than 30xx+ GPUs in best-case scenarios). NPUs are good at running small models efficiently, not large LLM / image models.
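For what it’s worth, the usual way those small models get dispatched to an NPU is through ONNX Runtime execution providers rather than the normal GPU path. A rough sketch below; the QNN provider and `model.onnx` are illustrative assumptions (the right provider/package depends on your hardware, e.g. onnxruntime-qnn for Qualcomm NPUs), not something ComfyUI does out of the box:

```python
# Rough sketch: run a small ONNX model on an NPU via ONNX Runtime,
# falling back to CPU if no NPU execution provider is available.
# "model.onnx" and the QNN provider are illustrative assumptions.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
)
print("Active providers:", session.get_providers())

# Feed a dummy input matching the model's first input, using 1 for dynamic dims.
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
dummy = np.zeros(shape, dtype=np.float32)
outputs = session.run(None, {inp.name: dummy})
print("Output shapes:", [o.shape for o in outputs])
```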