Comment on We don't need AI
amino@lemmy.blahaj.zone 1 week ago
offline Wikipedia on a phone vs llama, which needs a PC:
wiki needs:
- 20-30 GB for the text version
- can fit on most phones, otherwise use an SD card or USB storage
- isn’t full of AI slop
Llama 3.1 8B Requirements:
- CPU: Modern processor with at least 8 cores
- RAM: Minimum of 16 GB recommended
- GPU: NVIDIA RTX 3090
- doesn’t run on Android or iOS
also keep in mind that most people outside of tech/gamer bros can’t afford a PC that can run it; most people can afford an Android phone
untakenusername@sh.itjust.works 1 week ago
my PC is a refurbished second-hand dinosaur from 2019 and it can do this fine, no GPU. And you don’t need good specs at all other than enough RAM to load the model into memory, which in this case is 5 GB.
You sure about Androids?
Anyway, if Llama 3.1 is too big, just use qwen3:0.6b or something small.
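e.g. here’s a minimal sketch with the Ollama Python client, assuming the Ollama app is installed and you pulled the model while you still had internet (the model tag and the question are just examples):

```python
# quick offline Q&A with a small local model via the Ollama Python client
# assumes: `pip install ollama`, the Ollama daemon is running, and
# `ollama pull qwen3:0.6b` was done earlier while online
from ollama import chat

response = chat(
    model='qwen3:0.6b',  # tiny model, fits in a couple of GB of RAM, no GPU needed
    messages=[{'role': 'user', 'content': 'What year did the Berlin Wall fall?'}],
)
print(response['message']['content'])
```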
Also, Wikipedia doesn’t contain all the knowledge on the internet, but this stuff was trained on everything.
amino@lemmy.blahaj.zone 1 week ago
at least Wikipedia won’t waste your time with misinformation. Llama could be trained on the entirety of human history for all I care; it doesn’t matter one bit if it can’t provide accurate sources and facts.
untakenusername@sh.itjust.works 1 week ago
When you don’t have internet access, which is the use case I was talking about, you don’t have sources other than what you’ve downloaded. If you can’t check the sources, then effectively there are none.
amino@lemmy.blahaj.zone 1 week ago
let’s say I’m reading an article on communism: the sources provided by Wikipedia come from Marx, while Llama regurgitates answers from US conservatives on Reddit telling me that communism is when no iPhone and state dictatorship.
even if I can’t access the source in the moment, I can still tell that Wikipedia at least attempts to follow the scientific method. just knowing the name of the source tells me what useless junk to skip. oh, and I could go to the library and ask them if they have my source available.