Comment on "Self-host a local AI stack and access it from anywhere" | Tailscale Blog
30p87@feddit.org · 1 week ago
You mean an A4000? My 1070 can run Llama 3.1 comfortably (the low-B versions). My 7800XT answers instantly.
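For context on why a low-B Llama 3.1 fits on an 8 GiB card like a GTX 1070, here is a back-of-the-envelope VRAM estimate. This is a sketch only: it counts weight memory for an assumed 8-billion-parameter model at 4-bit quantization and ignores KV cache, activations, and runtime overhead, which add to the real footprint.

```python
def weight_vram_gib(params_billion: float, bits_per_weight: int) -> float:
    """Rough VRAM for model weights alone (excludes KV cache and overhead)."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 2**30  # bytes -> GiB

# Assumed figures: ~8B parameters, 4-bit quantized weights.
# Weights alone come to under 4 GiB, leaving headroom on an 8 GiB GPU.
print(round(weight_vram_gib(8, 4), 2))
```

At 16-bit precision the same model would need roughly four times as much, which is why quantized builds are what make these cards viable.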