Submitted 7 months ago by hisao@ani.social to technology@lemmy.world
https://openai.com/index/introducing-gpt-oss/
It’s awesome but that 128k context window is a throwback to Llama3 days
I bet the closed $ource model has like 2MB context
Finally, their company name is starting to make SOME sense
I tried the 20b on my PC with Ollama and Open WebUI, and I have to say that for some of my use cases it performs similarly to the online version. I was impressed.
What card are you running it on?
Nvidia RTX 3060, the 12GB VRAM one
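For anyone wanting to reproduce this setup, a minimal sketch of the Ollama commands involved — the exact model tag `gpt-oss:20b` is an assumption here, so check the Ollama model library for the current name before pulling:

```shell
# Hypothetical sketch: running the 20B model locally with Ollama.
# The model tag "gpt-oss:20b" is an assumption -- verify it with the
# Ollama library or `ollama list` on your machine.
ollama pull gpt-oss:20b   # download the quantized weights

# One-off prompt from the command line:
ollama run gpt-oss:20b "Summarize what a 128k context window means."

# Open WebUI (as used above) can then point at the local Ollama
# server, which listens on http://localhost:11434 by default.
```

On a 12GB card the quantized 20B weights should fit, though long contexts will push KV-cache memory use up quickly.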
Crazy but sure ig
fubarx@lemmy.world 7 months ago
Ran the 20B on a Mac under LMStudio. Pretty zippy and did OK on basic coding tasks.