Comment on Faster Ollama alternative

theunknownmuncher@lemmy.world 3 weeks ago

Ummm… did you try /set parameter num_ctx # and /set parameter num_predict #? And are you using a model that actually supports the context length you want…?
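
For anyone who hasn't used these before, here's a rough sketch of what that looks like inside an `ollama run` session. The model name and numbers are just placeholders, pick whatever fits your model and hardware:

```
$ ollama run llama3.1          # example model name, substitute your own
>>> /set parameter num_ctx 8192        # context window size (tokens)
>>> /set parameter num_predict 512     # max tokens to generate per response
```

If you want it to stick, the equivalent Modelfile lines would be something like `PARAMETER num_ctx 8192` and `PARAMETER num_predict 512`. Setting num_ctx higher than the model was trained for won't magically give you a longer context, and bigger contexts eat noticeably more VRAM.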
