Comment on AI one-percenters seizing power forever is the real doomsday scenario, warns AI godfather
jcdenton@lemy.lol 1 year ago
No one can fucking run it locally right now; only people with 1%er money can run it.
SupraMario@lemmy.world 1 year ago
Uhh what? You can totally run LLMs locally.
MooseBoys@lemmy.world 1 year ago
Inference, yes. Training, no. Derived models don’t count.
Jeremyward@lemmy.world 1 year ago
I have Llama 2 running on localhost. You need a fairly powerful GPU, but it can totally be done.
SailorMoss@sh.itjust.works 1 year ago
I’ve run one of the smaller models on my i7-3770 with no GPU acceleration. It’s painfully slow, but not unusably slow.
jcdenton@lemy.lol 1 year ago
To get the same level as something like ChatGPT?
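The disagreement above mostly comes down to model size and precision. A rough back-of-envelope sketch of VRAM needs makes both sides' points concrete: a quantized 7B model fits on a consumer GPU, while a GPT-3-scale model at fp16 does not. The overhead factor and bytes-per-parameter figures below are illustrative assumptions, not measurements.

```python
def vram_gb(params_billion, bytes_per_param, overhead=1.2):
    """Rough memory estimate for running inference on an LLM.

    overhead=1.2 is an assumed fudge factor for KV cache and
    activations; real numbers vary with context length and runtime.
    """
    return params_billion * bytes_per_param * overhead

# 7B parameters at 4-bit quantization (~0.5 bytes/param):
# roughly 4 GB -- within reach of a consumer GPU or even CPU RAM.
print(round(vram_gb(7, 0.5), 1))

# 175B parameters (GPT-3-scale) at fp16 (2 bytes/param):
# hundreds of GB -- multiple datacenter GPUs, not a home machine.
print(round(vram_gb(175, 2.0), 1))
```

So both claims can be true at once: local inference on smaller or quantized models is very doable, while matching ChatGPT-scale models at home is out of reach.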