Comment on "Self-host a local AI stack and access it from anywhere" | Tailscale Blog

30p87@feddit.org, 1 week ago

You mean an A4000? My 1070 can run Llama 3.1 comfortably (the low-B versions), and my 7800XT answers instantly.
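As a minimal sketch of what "running Llama 3.1 locally" can look like, here's a query against Ollama's REST API on its default port. This is an assumption for illustration: the commenter doesn't say which runner they use, and `llama3.1:8b` (the smallest Llama 3.1 variant) stands in for their "low-B" model.

```python
# Minimal sketch: query a locally served Llama 3.1 8B via Ollama's REST API.
# Assumes Ollama is running on its default port (11434) and that the model
# has already been fetched with `ollama pull llama3.1:8b`.
import json
import urllib.request

def ask(prompt: str, model: str = "llama3.1:8b") -> str:
    """Send a single non-streaming generate request and return the reply text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask("Why is the sky blue?"))
```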
