Comment on People were mad because they lost their AI boyfriend after GPT-4o deprecation
rozodru@lemmy.world 6 days ago
I tried GPT-5 last night and I don’t know if it was just me, but these people are going to be in shambles if they try to recreate their “boyfriend”.
It would forget previous prompts within the same conversation. It felt like each response was the start of a new chat. I gave it a very basic prompt: “walk me through the steps of building my own one page website in basic HTML and CSS”. When I asked a couple of follow-up questions to clarify something or have a step explained another way, it would forget what we were trying to accomplish (building a one page website). And if I told it “something didn’t work” so it would try to fix the problem, it would again forget what we were even trying to do.
At some points it was almost outright dismissive of the problem, and it felt like it was trying to make me go away.
Again, maybe it was just me, but it felt like a massive step backwards.
Forgets what you were talking about. Need to include step-by-step directions to get what you want. Gets distracted easily.
This is just an ADHD gamer boyfriend with extra steps.
So, those things are going backwards.
brucethemoose@lemmy.world 5 days ago
This is a common pattern, unfortunately. Big LLMs are benchmaxxing coding and one-shot answers, and multi-turn conversation is taking a nosedive.
arxiv.org/abs/2504.04717
Restructure your prompts, or better yet try non-OpenAI LLMs. I’d suggest z.ai, Jamba, and Gemini Pro for multi-turn. Maybe Qwen Code, though it’s pretty deep-fried too.
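By "restructure your prompts" I mean: stop relying on the model's multi-turn memory at all, and restate the goal plus accumulated context in every single turn. A minimal sketch of that idea (the goal/history strings and the `build_turn` helper are made up for illustration; plug the resulting prompt into whatever chat API you actually use):

```python
def build_turn(goal, history, question):
    """Fold the standing goal and prior progress into one self-contained prompt,
    so each turn works even if the model remembers nothing from before."""
    context = "\n".join(f"- {step}" for step in history)
    return (
        f"Goal: {goal}\n"
        f"Progress so far:\n{context}\n"
        f"Current question: {question}"
    )

# Example matching the one-page-website conversation above:
goal = "Build a one-page website in basic HTML and CSS"
history = [
    "Created index.html with a basic skeleton",
    "Added a <style> block for the page layout",
]
prompt = build_turn(goal, history, "Centering the header didn't work; how do I fix it?")
print(prompt)
```

Tedious, but it sidesteps the "every reply starts a new chat" behavior, because each prompt carries its own context.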
lichtmetzger@discuss.tchncs.de 5 days ago
Wasn’t that a ringtone service in the 90s?