Comment on Decentralized AI book reading server

Cyberflunk@lemmy.world 1 week ago

What are you talking about? RAG is a method you use. It only has the limitations you design in. Your datastore can be whatever you want it to be, and the LLM performs a tool use YOU define. RAG isn’t one thing: you can build a RAG system out of flat files or a huge vector datastore, and you determine how much data is returned to the context window. Python and ChromaDB easily scale to gigabytes on consumer hardware, completely suitable for local RAG.
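
For instance, a minimal local sketch with ChromaDB might look like this (the collection name, store path, and sample text are illustrative placeholders, and it assumes ChromaDB’s default embedding function):

```python
# Minimal local RAG sketch using ChromaDB's persistent client.
# Assumes `pip install chromadb`; all names below are hypothetical examples.
import chromadb

# Persistent store on local disk -- scales fine on consumer hardware.
client = chromadb.PersistentClient(path="./rag_store")
collection = client.get_or_create_collection(name="books")

# You decide what goes into the datastore: flat-file chunks, book pages, etc.
collection.add(
    ids=["chunk-1", "chunk-2"],
    documents=[
        "Chapter 1: the protagonist arrives in the city.",
        "Chapter 2: a letter changes everything.",
    ],
)

# You decide how much data comes back into the context window (n_results).
results = collection.query(query_texts=["What happens in chapter 2?"], n_results=1)
print(results["documents"][0][0])
```

The retrieved text is whatever YOU stored and however much YOU asked for; the LLM only ever sees what you pass it.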
