Comment on Study finds that ChatGPT will cheat when given the opportunity and lie to cover it up later.
canihasaccount@lemmy.world 11 months ago
GPT-4 will. For example, I asked it the following:
What is the neighborhood stranger model of fluid mechanics?
It responded:
The “neighborhood stranger model” of fluid mechanics is not a recognized term or concept within the field of fluid mechanics, as of my last update in April 2023.
Now, obviously, this is a made-up term, but GPT-4 didn’t confidently give an incorrect answer. Other LLMs will. For example, Bard says,
The neighborhood stranger model of fluid mechanics is a simplified model that describes the behavior of fluids at a very small scale. In this model, fluid particles are represented as points, and their interactions are only considered with other particles that are within a certain “neighborhood” of them. This neighborhood is typically assumed to be a sphere or a cube, and the size of the neighborhood is determined by the length scale of the phenomena being studied.
Cannacheques@slrpnk.net 11 months ago
Yeah, sounds like something that needs to be tested, could be total bullshit
butterflyattack@lemmy.world 11 months ago
Interestingly, the answer from Bard sounds like it could be true. I don’t know shit about fluid dynamics, but it seems pretty plausible.
Socsa@sh.itjust.works 11 months ago
Because it is describing a real numerical solver method that is reasonably well evoked by that particular made-up phrase. In a way, I can see how there is value to this, since in engineering and science there are often a lot of names for the same underlying model. It would be nice if it did both, tbh: admit that it doesn’t recognize the specific term while offering the real, adjacent terminology. Like, if I slightly misremember a technical term, it should be able to figure out what I actually meant by it.
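The idea Bard’s answer describes, particles interacting only with other particles inside a cutoff radius, is real and underlies particle methods like smoothed-particle hydrodynamics (SPH). A minimal Python sketch of that neighbor-search idea, assuming a brute-force scan and a toy linear weighting kernel; the cutoff value and all names here are illustrative, not any particular solver’s API:

```python
# Sketch of a "neighborhood"-limited particle interaction, the idea
# behind SPH-style methods: each particle only interacts with particles
# within a cutoff radius. Cutoff, kernel, and names are illustrative.
import math
import random

CUTOFF = 0.2  # "neighborhood" radius; particles farther apart are ignored

def neighbors(points, i, cutoff=CUTOFF):
    """Indices of particles within `cutoff` of particle i (brute force)."""
    xi, yi = points[i]
    return [
        j
        for j, (xj, yj) in enumerate(points)
        if j != i and math.hypot(xj - xi, yj - yi) < cutoff
    ]

def local_density(points, i, cutoff=CUTOFF):
    """Toy density estimate: distance-weighted count of neighbors."""
    xi, yi = points[i]
    total = 0.0
    for j in neighbors(points, i, cutoff):
        xj, yj = points[j]
        r = math.hypot(xj - xi, yj - yi)
        total += 1.0 - r / cutoff  # linear kernel: 1 at r=0, 0 at r=cutoff
    return total

if __name__ == "__main__":
    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(200)]
    print("particle 0 neighbors:", neighbors(pts, 0))
    print("particle 0 local density:", round(local_density(pts, 0), 3))
```

Real solvers replace the brute-force O(n²) scan with cell lists or spatial hashing, but the “only interact with your neighborhood” structure is exactly what Bard’s answer paraphrased.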