Submitted 1 year ago by 2tone@lemmy.world to technology@lemmy.world
j4k3@lemmy.world 1 year ago
Llama-2 because you can run it on your own hardware. For the big GPU on a rented instance: Falcon 70b. OpenAI and Google can have turns playing proprietary asshat jack in the box.
procrastinator@lemmy.world 1 year ago
How expensive would it be to run on a rented server?
Stubborn9867@lemmy.jnks.xyz 1 year ago
For the bigger ones you could do it for under $0.50/hr. I run Llama 2 13B at 8-bit on my 3090 no problem, and a 3090 can be rented for around $0.20/hr.
vast.ai/#pricing
Some of the lowest pricing I've seen.
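For anyone wondering why a 13B model fits on a 3090: here's a rough back-of-the-envelope VRAM estimate. The parameter count and the ~20% overhead factor for KV cache and activations are my own assumptions, not numbers from this thread:

```python
# Rough VRAM estimate for Llama 2 13B at 8-bit quantization.
# Assumptions (mine, not from the thread): ~13e9 parameters,
# 1 byte per weight at 8-bit, ~20% overhead for KV cache/activations.
params = 13e9
bytes_per_weight = 1       # 8-bit quantization
overhead = 1.2             # assumed headroom for cache and activations
vram_needed_gb = params * bytes_per_weight * overhead / 1e9
rtx_3090_vram_gb = 24
print(f"~{vram_needed_gb:.1f} GB needed vs {rtx_3090_vram_gb} GB on a 3090")
```

By the same math, an unquantized fp16 copy would need roughly double that, which is why 8-bit (or 4-bit) quantization is what makes single-consumer-GPU inference practical.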