Comment on The Irony of 'You Wouldn't Download a Car' Making a Comeback in AI Debates

Hackworth@lemmy.world 2 months ago

Just taking GPT-3 as an example: yes, its raw training set was about 45 terabytes, but that set was filtered and processed down to roughly 570 GB, and GPT-3 was actually trained only on that 570 GB. The model itself is about 700 GB. Much of an LLM's generalized capability comes from abstracting patterns in that training data to other contexts.
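For scale, here's a quick back-of-the-envelope check of those numbers. The 175B parameter count is GPT-3's published figure; storing weights as 32-bit floats is an assumption that happens to line up with the ~700 GB model size above:

```python
# Rough sanity check of the sizes in the comment above.
# Assumes GPT-3's 175B parameters stored as fp32 (4 bytes each).

raw_corpus_tb = 45        # raw training data before filtering, TB
filtered_corpus_gb = 570  # data actually trained on, GB
params = 175e9            # GPT-3 parameter count
bytes_per_param = 4       # fp32 assumption

retained = filtered_corpus_gb / (raw_corpus_tb * 1000)
model_gb = params * bytes_per_param / 1e9

print(f"Filtered corpus is ~{retained:.1%} of the raw data")  # ~1.3%
print(f"Model weights at fp32: ~{model_gb:.0f} GB")           # ~700 GB
```

So the model ends up larger than the data it was trained on, which is the point: what's stored isn't the text itself.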

source