Comment on Beating GPT-4 on HumanEval with a Fine-Tuned CodeLlama-34B
ChrisLicht@lemm.ee 1 year ago
Dumb question: Does one install the Python model locally, or access it online?
L_Acacia@lemmy.one 1 year ago
The best way to run a Llama model locally is through Text Generation Web UI. The model will most likely be quantized to 4/5-bit GGML / GPTQ soon, which will make it possible to run on a “normal” computer.
Phind might also make it accessible on their website soon, but that doesn’t seem to be the case yet.
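If you’d rather skip the web UI entirely, here’s a minimal sketch of loading a quantized build directly with llama-cpp-python once a quantized file is published. The filename and parameters below are placeholders, not the actual release artifacts:

```python
# Minimal sketch: run a quantized Llama model locally with llama-cpp-python.
# The model_path is hypothetical -- point it at whatever quantized
# GGML/GGUF file you actually download.
from llama_cpp import Llama

llm = Llama(
    model_path="phind-codellama-34b.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,        # context window; adjust to what your RAM allows
    n_gpu_layers=0,    # raise this if you have a GPU with spare VRAM
)

output = llm(
    "Write a Python function that reverses a string.",
    max_tokens=256,
    stop=["\n\n"],
)
print(output["choices"][0]["text"])
```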
ChrisLicht@lemm.ee 1 year ago
You are awesome; thanks for the clue-in!