Comment on What exactly is a self-hosted small LLM actually good for (<= 3B)
MTK@lemmy.world 4 days ago
RAG is basically like telling an LLM “look here for more info before you answer” so it can check out local documents to give an answer that is more relevant to you.
Just search “open web ui rag” and you’ll find plenty of explanations and tutorials.
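Rough sketch of the idea in Python, assuming a local Ollama server and the sentence-transformers package (model names and sample docs are just placeholders); Open WebUI basically wires this up for you behind its UI:

```python
# Minimal RAG sketch: embed local docs, grab the closest ones, stuff them into the prompt.
# Assumes Ollama is running on localhost:11434; model names are examples only.
import numpy as np
import requests
from sentence_transformers import SentenceTransformer

docs = [
    "Our VPN config lives in /etc/wireguard/wg0.conf.",
    "Backups run nightly at 02:00 via restic to the NAS.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = embedder.encode(docs, normalize_embeddings=True)

def answer(question: str, top_k: int = 2) -> str:
    q_vec = embedder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vecs @ q_vec  # cosine similarity, since vectors are normalized
    context = "\n".join(docs[i] for i in np.argsort(scores)[::-1][:top_k])
    prompt = f"Use this context to answer:\n{context}\n\nQuestion: {question}"
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3.2:3b", "prompt": prompt, "stream": False},
    )
    return r.json()["response"]

print(answer("When do backups run?"))
```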
iii@mander.xyz 3 days ago
I think RAG will be surpassed by LLMs with tool calling (aka agents), with search being one of the tools.
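Something like this sketch, against a local OpenAI-compatible endpoint (llama.cpp and Ollama both expose one); the model name and the local_search() stub are placeholders:

```python
# Tool-calling sketch: the model decides whether to call a "search" tool before answering.
import json
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")

def local_search(query: str) -> str:
    # Placeholder: swap in SearXNG, a web search API, or a local index here.
    return f"(stub results for '{query}')"

tools = [{
    "type": "function",
    "function": {
        "name": "local_search",
        "description": "Search for up-to-date information.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

messages = [{"role": "user", "content": "What changed in the latest llama.cpp release?"}]
resp = client.chat.completions.create(model="qwen2.5:3b", messages=messages, tools=tools)
msg = resp.choices[0].message

if msg.tool_calls:  # the model chose to search first
    messages.append(msg)
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": local_search(args["query"]),
        })
    resp = client.chat.completions.create(model="qwen2.5:3b", messages=messages, tools=tools)

print(resp.choices[0].message.content)
```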
interdimensionalmeme@lemmy.ml 3 days ago
LLMs that train LoRAs on the fly, then query themselves with the LoRA applied.
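Very rough sketch of what that could look like with transformers + peft, if you squint (model name, target modules and hyperparameters are placeholders, not a recipe):

```python
# Speculative sketch: train a tiny LoRA on a few local docs, then query the model
# with the adapter still applied.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "Qwen/Qwen2.5-1.5B-Instruct"  # any small causal LM
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                  target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

docs = ["Internal note: the staging server was renamed to athena in March."]
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

model.train()
for _ in range(3):  # a few quick passes over the docs
    for text in docs:
        batch = tokenizer(text, return_tensors="pt")
        loss = model(**batch, labels=batch["input_ids"]).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# Query the same model with the freshly trained adapter applied.
model.eval()
inputs = tokenizer("What is the staging server called now?", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```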