I wonder if it can be used legally against the company behind the model, though. I doubt that it’s possible, but having a “your own model says it effed up my data” admission could add some beef to a complaint. Or at least to a request to get a refund on the fees.
Comment on Vibe coding service Replit deleted production database
balder1991@lemmy.world 1 day ago
All I see is people chatting with an LLM as if it were a person. “How catastrophic from 0 to 100?” You’re just tweeting to get some random answer based solely on whatever context is being fed into the input, the extent of which you probably don’t know.
Trying to make the LLM “see its mistakes” is a pointless exercise.
andallthat@lemmy.world 1 day ago
6nk06@sh.itjust.works 16 hours ago
How bad is this on a scale of sad emoji to eggplant emoji?
Children are replacing us, it’s terrifying.
cyrano@lemmy.dbzer0.com 1 day ago
Yeah, the interactions are a pure waste of time, I agree. Make it write an apology letter? WTF! To me it looks like a fast-track way to learn environment segregation and secret segregation. Data is lost, learn from it; there are already tools in place, like git and Alembic, for proper development.
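The environment-segregation point above can be sketched in a few lines. This is a minimal, hypothetical illustration (the environment names and `DATABASE_URL_*` variables are my own, not anything from Replit or the article): each environment resolves its own connection string from its own secret, so a dev or AI-agent session never holds production credentials by default.

```python
import os

def database_url(environment: str) -> str:
    """Resolve the database URL for a named environment.

    Each environment reads a separate environment variable
    (e.g. DATABASE_URL_DEV vs DATABASE_URL_PROD), so prod
    secrets are never hard-coded or reachable from dev code.
    """
    allowed = {"dev", "staging", "prod"}
    if environment not in allowed:
        raise ValueError(f"unknown environment: {environment!r}")
    var = f"DATABASE_URL_{environment.upper()}"
    url = os.environ.get(var)
    if url is None:
        # Fail loudly rather than falling back to some other database.
        raise RuntimeError(f"{var} is not set; refusing to guess a database")
    return url
```

Combined with a migration tool like Alembic (versioned, replayable schema changes) and git for the code itself, an accidental deletion in dev stays in dev and is recoverable by replaying migrations.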
UntitledQuitting@reddthat.com 1 day ago
the apology letter(s) is what made me think this was satire. using shame to punish “him” like a child is an interesting troubleshooting method.
the lying robot hasn’t heel-turned, any truth you’ve gleaned has been accidental.
cyrano@lemmy.dbzer0.com 1 day ago
It doesn’t look like satire, unfortunately.