Comment on The therapy I can afford

Captain_Stupid@lemmy.world ⁨1⁩ ⁨week⁩ ago

The smallest models that I run on my PC take about 6-8 GB of VRAM and would be very slow if I ran them purely on the CPU. So it is unlikely that your phone has enough RAM and enough cores to run a decent LLM smoothly.
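As a rough back-of-the-envelope check of those numbers (a sketch only; a real runtime also needs memory for the KV cache and activations), the weight memory is roughly parameter count times bytes per weight:

```python
def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate memory (decimal GB) needed just to hold the model weights."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B-parameter model at 16 bits per weight needs ~14 GB just for weights;
# 4-bit quantization brings that down to ~3.5 GB, which is why only small,
# heavily quantized models are even a candidate for phone hardware.
print(model_memory_gb(7, 16))  # 14.0
print(model_memory_gb(7, 4))   # 3.5
```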

If you still want to use self-hosted AI on your phone:

You can now use self-hosted AI with your phone

source