Deflated0ne@lemmy.world 1 week ago
And an LLM that you could run locally on a flash drive will do most of what it can do.
ckmnstr@lemmy.world 1 week ago
Probably not a flash drive, but you can get decent mileage out of 7B models that run on any old laptop for tasks like text generation, shortening, or summarizing.
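To make that concrete, here's a minimal sketch of what running a 7B model on a laptop can look like, assuming llama-cpp-python is installed and you've downloaded a quantized GGUF file yourself (the model filename below is just an example):

```python
# Minimal sketch: shortening/summarizing text with a quantized 7B model
# on CPU via llama-cpp-python (`pip install llama-cpp-python`).
# The model path is hypothetical; substitute whatever GGUF you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,  # context window; 4k is plenty for short summaries
)

article = "..."  # the text you want shortened

out = llm(
    f"Summarize the following text in two sentences:\n\n{article}\n\nSummary:",
    max_tokens=128,   # cap the reply length
    temperature=0.2,  # low temperature keeps summaries focused
)
print(out["choices"][0]["text"].strip())
```

A 4-bit quantized 7B model like this fits in a few GB of RAM, which is why it runs on ordinary laptop hardware at all.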
Tikiporch@lemmy.world 1 week ago
What do you use your USB drive LLM for?
outhouseperilous@lemmy.dbzer0.com 1 week ago
Porn. Obviously.
lefixxx@lemmy.world 1 week ago
Can you give an example?
EncryptKeeper@lemmy.world 1 week ago
I mean, no, not at all, but local LLMs are a less energy-reckless way to use AI.
Corkyskog@sh.itjust.works 1 week ago
Why not for the ignorant, such as myself?
EncryptKeeper@lemmy.world 1 week ago
AI models require a LOT of VRAM to run. Failing that, they need some serious CPU power, but it'll be dog slow.
Even a consumer model with only a small fraction of the capability of the latest ChatGPT model would require at least a $2,000+ graphics card, if not more than one.
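To put rough numbers on that, here's a back-of-the-envelope sketch (my own arithmetic, assuming weight memory is roughly parameter count times bytes per parameter, ignoring the KV cache and runtime overhead):

```python
# Rough VRAM estimate for just the model weights.
# bytes ~= parameters * bits_per_param / 8; quantization sets the bit width.
def weight_vram_gb(params_billions: float, bits_per_param: float) -> float:
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9  # decimal GB

for name, params, bits in [
    ("7B model, FP16", 7, 16),
    ("7B model, 4-bit quant", 7, 4),
    ("70B model, FP16", 70, 16),
    ("70B model, 4-bit quant", 70, 4),
]:
    print(f"{name}: ~{weight_vram_gb(params, bits):.0f} GB of VRAM")

# A 24 GB consumer card comfortably fits a 4-bit 7B model, but a 70B model
# needs multiple high-end GPUs even when heavily quantized.
```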
Corkyskog@sh.itjust.works 1 week ago
How slow?