Comment on Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec

Soundhole@lemm.ee ⁨9⁩ ⁨months⁩ ago

That’s already here. Anyone can run AI chatbots similar to, though not as intelligent as, ChatGPT or Bard.

Llama.cpp and koboldcpp let anyone run models locally, even on just a CPU if there’s no dedicated graphics card available. And there are numerous open-source models to choose from.
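For example, here’s a minimal sketch of local inference using the llama-cpp-python bindings to llama.cpp. The model filename is a placeholder for whatever GGUF model you’ve downloaded, and the parameters are just illustrative defaults.

```python
# Minimal local-inference sketch via llama-cpp-python (Python bindings for llama.cpp).
# The model path is hypothetical -- point it at any GGUF model file you have on disk.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder local file
    n_ctx=2048,  # context window; CPU-only inference works, it's just slower
)

result = llm(
    "Q: Name the planets in the solar system. A:",
    max_tokens=64,
    stop=["Q:", "\n"],
)
print(result["choices"][0]["text"])
```

No server, no API key: the whole thing runs on your own hardware once the model file is downloaded.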

Hell, you can even run llama.cpp on Android phones.

This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for a mobile Internet connection when it comes to looking up information.

source