Comment on: Is it possible to run an LLM on a mini-PC like the GMKtec K8 and K9?
1rre@discuss.tchncs.de 3 months ago
Intel Arc also works surprisingly well and consistently for ML if you use llama.cpp for LLMs or AUTOMATIC1111 for Stable Diffusion; in terms of usability it's definitely much closer to Nvidia than to AMD.
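For reference, a minimal sketch of what "llama.cpp on Intel Arc" looks like in practice, assuming the oneAPI toolkit is installed and using llama.cpp's SYCL backend (the model path and prompt here are placeholders, not from the comment):

```shell
# Enable the Intel oneAPI environment (compilers + SYCL runtime)
source /opt/intel/oneapi/setvars.sh

# Build llama.cpp with the SYCL backend for Intel GPUs
cmake -B build -DGGML_SYCL=ON \
      -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx
cmake --build build --config Release

# Run a GGUF model, offloading all layers to the Arc GPU (-ngl 99)
./build/bin/llama-cli -m /path/to/model.gguf -ngl 99 -p "Hello"
```

llama.cpp also ships a Vulkan backend (`-DGGML_VULKAN=ON`) that runs on Arc without the oneAPI stack, at some cost in speed; either way the GGUF quantized formats keep memory needs low enough for mini-PC-class hardware.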
TheBigBrother@lemmy.world 3 months ago
Would you suggest the K9 instead of the K8?