JohnDClay@sh.itjust.works 4 months ago
I’m surprised it’s only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even the regular searches take more energy than I thought.
31337@sh.itjust.works 4 months ago
Same. I think I’ve read that a single GPT-4 instance runs on a 128-GPU cluster, and ChatGPT can still take something like 30 s to finish a long response. An H100 GPU has a TDP of 700 W. Hard to believe that uses only 10x more energy than a search that takes milliseconds.
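A rough back-of-envelope with those numbers, assuming (purely for illustration) that a classic search ties up comparable hardware for ~100 ms, lands well above 10x:

```python
# Back-of-envelope using the figures in the comment above.
# Assumptions (not authoritative): 128 GPUs per GPT-4 instance,
# 700 W TDP each, a 30 s generation, and a hypothetical ~100 ms
# of equivalent hardware time for a classic search.

gpus = 128
tdp_w = 700             # H100 TDP in watts
gen_seconds = 30        # long ChatGPT response
search_seconds = 0.1    # hypothetical "milliseconds" search

cluster_w = gpus * tdp_w                       # 89,600 W
llm_wh = cluster_w * gen_seconds / 3600        # ~747 Wh per response
search_wh = cluster_w * search_seconds / 3600  # ~2.5 Wh per search

print(f"LLM response: {llm_wh:.0f} Wh")
print(f"Search:       {search_wh:.2f} Wh")
print(f"Ratio:        {llm_wh / search_wh:.0f}x")
```

With those assumptions the ratio comes out to roughly 300x, which is why only 10x seems surprisingly low.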