Chinese company trained GPT-4 rival with just 2,000 GPUs — 01.ai spent $3M compared to OpenAI's $80M to $100M
Submitted 2 days ago by BrikoX@lemmy.zip to technology@lemmy.zip
With fewer resources, you have to maximize their efficiency.
jonathan@lemmy.zip 2 days ago
I’d like to hear it expressed in terms of watts rather than a number of GPUs running for an indeterminate amount of time.
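The commenter's point can be sketched as a back-of-the-envelope energy calculation: a GPU count alone says little without per-GPU power draw and training duration. A minimal sketch, where the 700 W per GPU and 30-day run are hypothetical assumptions, not figures reported in the article:

```python
def training_energy_mwh(num_gpus: int, watts_per_gpu: float, hours: float) -> float:
    """Total training energy in megawatt-hours: GPUs x watts x hours."""
    return num_gpus * watts_per_gpu * hours / 1e6

# E.g. 2,000 GPUs at an assumed 700 W each for an assumed 30 days:
energy = training_energy_mwh(2000, 700.0, 30 * 24)
print(f"{energy:.0f} MWh")  # 2000 * 700 * 720 / 1e6 = 1008 MWh
```

The same GPU count at half the power draw, or half the runtime, halves the energy, which is why watt-hours are the more comparable unit.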