Comment on "Self-host a local AI stack and access it from anywhere" | Tailscale Blog
BigMikeInAustin@lemmy.world 4 weeks ago
[deleted]
30p87@feddit.org 4 weeks ago
You mean an A4000? My 1070 can run Llama 3.1 comfortably (the low-B-parameter versions, e.g. 8B). My 7800 XT answers instantly.
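For anyone wanting to try this on similar hardware, here's a minimal sketch of querying a locally hosted Llama 3.1 8B through Ollama's HTTP API (the stack the linked blog post is about). It assumes Ollama is running on its default port 11434 and that you've already pulled the model with `ollama pull llama3.1:8b`; the prompt is just a placeholder.

```python
# Minimal sketch: ask a local Llama 3.1 8B a question via Ollama's /api/generate.
# Assumes Ollama is serving on localhost:11434 and llama3.1:8b has been pulled.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3.1:8b",        # the "low B" variant; fits quantized on ~8 GB cards
    "prompt": "Why is the sky blue?",
    "stream": False,               # return a single JSON object instead of a stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

With Tailscale in front of the box, the same request works from anywhere on your tailnet by swapping `localhost` for the machine's tailnet hostname.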