Comment on The first GPT-4-class AI model anyone can download has arrived: Llama 405B
MonkderVierte@lemmy.ml 3 months ago
Well, yes and no. See the other comment: 64 GB of RAM at the lowest setting, and the 70B running slowly even with a modern CPU and 32 GB of RAM.
admin@lemmy.my-box.dev 3 months ago
Oh, sure. For the 405B model it’s absolutely infeasible to host it yourself. But for the smaller models (70B and 8B), it can work.
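As rough arithmetic for why the 405B is infeasible but the smaller models can work: a minimal sketch of the usual back-of-envelope RAM estimate (the 20% overhead factor for the KV cache and runtime buffers is an assumption; real usage varies with context length and runtime):

```python
def estimate_ram_gb(params_billions: float, bits_per_weight: int,
                    overhead: float = 0.2) -> float:
    """Approximate RAM needed to hold the weights, plus a fudge factor
    for KV cache and runtime buffers (assumed ~20%, not an exact figure)."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# Compare full-precision (16-bit) vs. a common 4-bit quantization:
for params in (8, 70, 405):
    for bits in (16, 4):
        print(f"{params}B @ {bits}-bit: ~{estimate_ram_gb(params, bits):.0f} GB")
```

By this estimate the 8B fits comfortably on a consumer machine at 4-bit (~5 GB), the 70B is borderline (~40+ GB, hence the slow 32 GB experience), and the 405B needs hundreds of gigabytes even quantized.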
I was mostly replying to the part where they claimed Meta can take it away from you at any point, which is simply not true.