fishynoob@infosec.pub 5 days ago

Thanks for the edit. That’s an intriguing idea: a second LLM running in the background that maintains a summary of the conversation plus the static context could improve performance considerably. I don’t know whether anyone has implemented it, or how you could DIY it with Kobold/Ollama.
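For what it’s worth, a minimal sketch of the idea might look like the class below. Everything here is illustrative, not a real library API: `summarize` stands in for a call to the background model (in practice it could POST to a local Kobold or Ollama HTTP endpoint with a "summarize this conversation" instruction), and the class/method names are made up for this sketch.

```python
class SummaryMemory:
    """Rolling-summary memory: a background model folds old turns into a
    short summary so the main model's prompt stays small.

    All names here are hypothetical; `summarize` is any callable that
    takes (previous_summary, list_of_turns) and returns a new summary
    string, e.g. backed by a local model served via Kobold or Ollama.
    """

    def __init__(self, summarize, max_turns=6, keep_recent=2):
        self.summarize = summarize      # background "second LLM" call
        self.max_turns = max_turns      # compress once history exceeds this
        self.keep_recent = keep_recent  # newest turns kept verbatim
        self.summary = ""
        self.turns = []                 # list of (role, text) tuples

    def add(self, role, text):
        self.turns.append((role, text))
        if len(self.turns) > self.max_turns:
            self._compress()

    def _compress(self):
        # Fold everything except the newest turns into the summary.
        old = self.turns[:-self.keep_recent]
        self.turns = self.turns[-self.keep_recent:]
        self.summary = self.summarize(self.summary, old)

    def build_prompt(self, static_context, user_msg):
        # Prompt = static context + rolling summary + recent verbatim turns.
        parts = [static_context]
        if self.summary:
            parts.append("Conversation so far (summarized): " + self.summary)
        parts += [f"{role}: {text}" for role, text in self.turns]
        parts.append(f"user: {user_msg}")
        return "\n".join(parts)
```

The main model only ever sees the static context, the compressed summary, and the last couple of turns, so the context window stops growing with conversation length; the summarizer runs off the hot path, which is where the performance win would come from.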
