Comment on Does anyone else have experience with koboldcpp? How do I make it give me longer outputs?
fhein@lemmy.world 4 months ago
Is max tokens different from context size?
Might be worth keeping in mind that the generated tokens go into the context, so if you set it to 1k with a 4k context you only get 3k left for the character card and chat history. I think I usually have it set to around 400 tokens, and use TGW’s continue button in case a long response gets cut off.
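The budget math above can be sketched like this (a minimal illustration; the variable names and numbers are hypothetical, not koboldcpp settings):

```python
# Token budget illustration: generated tokens come out of the context window.
context_size = 4096   # total context window (the "4k" above)
max_generate = 1024   # tokens reserved for the model's reply (the "1k" above)

# Whatever remains is all that's left for the character card + chat history.
prompt_budget = context_size - max_generate
print(prompt_budget)  # 3072
```

So the larger you make the generation length, the less room the model has to actually see your prompt.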
tal@lemmy.today 4 months ago
No, it’s the same thing. If you hover over the question mark next to “Max Tokens” in the Kobold AI Web UI, it says:
“Max number of tokens of context to submit to the AI for sampling. Make sure this is higher than Amount to Generate. Higher values increase VRAM/RAM usage.”