Comment on OpenAI's GPT Trademark Request Has Been Denied

NevermindNoMind@lemmy.world ⁨9⁩ ⁨months⁩ ago

I don’t know enough to know whether or not that’s true. My understanding was that Google invented the transformer architecture with their paper “Attention Is All You Need.” A lot, if not most, LLMs use a transformer architecture, though you’re probably right that a lot of them are based on the open models OpenAI made available. The “generative” part just describes the model generating outputs (as opposed to doing classification and the like), and “pre-trained” just refers to the training process.

But again I’m a dummy so you very well may be right.
