Comment on Hello GPT-4o

abhibeckert@lemmy.world 6 months ago

you can run locally some small models

Emphasis on “small” models. The large ones need about $80,000 worth of RAM.
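The gap between "small" and "large" here is mostly just weight storage: memory scales linearly with parameter count. A rough back-of-the-envelope sketch (the parameter counts below are illustrative assumptions, not figures from the comment or from any vendor):

```python
# Rough sketch: RAM needed just to hold model weights in memory.
# Ignores KV cache, activations, and runtime overhead, which add more.

def weights_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weight storage in GB (fp16 = 2 bytes per parameter)."""
    return n_params * bytes_per_param / 1e9

# A small model that fits on a consumer machine (assumed 7B params):
print(f"7B model: ~{weights_gb(7e9):.0f} GB")    # ~14 GB, runs on a laptop

# A hypothetical frontier-scale model (assumed 1T params):
print(f"1T model: ~{weights_gb(1e12):.0f} GB")   # ~2000 GB, server territory
```

Quantizing to 4 bits (`bytes_per_param=0.5`) cuts these figures by 4x, which is why small models are practical locally while the largest ones still need server-class hardware.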

source