A New Zealand supermarket experimenting with using AI to generate meal plans has seen its app produce some unusual dishes – recommending customers recipes for deadly chlorine gas, “poison bread sandwiches” and mosquito-repellent roast potatoes.
The app, created by supermarket chain Pak ‘n’ Save, was advertised as a way for customers to creatively use up leftovers during the cost of living crisis. It asks users to enter various ingredients they have at home, then auto-generates a meal plan or recipe along with cheery commentary. It initially drew attention on social media for some unappealing recipes, including an “oreo vegetable stir-fry”.
When customers began experimenting with entering a wider range of household shopping list items into the app, however, it began to make even less appealing recommendations. One recipe it dubbed “aromatic water mix” would create chlorine gas. The bot recommends the recipe as “the perfect nonalcoholic beverage to quench your thirst and refresh your senses”.
“Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.
New Zealand political commentator Liam Hehir posted the “recipe” to Twitter, prompting other New Zealanders to experiment and share their results on social media. Recommendations included a bleach “fresh breath” mocktail, ant-poison and glue sandwiches, “bleach-infused rice surprise” and “methanol bliss” – a kind of turpentine-flavoured french toast.
A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”. In a statement, they said that the supermarket would “keep fine tuning our controls” of the bot to ensure it was safe and useful, and noted that the bot has terms and conditions stating that users should be over 18.
A notice appended to the meal planner warns that the recipes “are not reviewed by a human being” and that the company does not guarantee “that any recipe will be a complete or balanced meal, or suitable for consumption”.
“You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot,” it said.
DeltaTangoLima@reddrefuge.com 1 year ago
Oh fuck. Right. Off. Don’t blame someone for trivially showing up how fucking stupid your marketing team’s idea was, or how shitty your web team’s implementation of a sub-standard AI was. Take some goddamn accountability for unleashing this piece of shit onto your customers like this.
Fucking idiots. Deserve to be mocked all over the socials.
MagicShel@programming.dev 1 year ago
For now, this is the fate of anyone exposing an AI to the public for business purposes. AI is currently a toy. It is, in limited aspects, a very useful toy, but a toy nonetheless and people will use it as such.
ScrivenerX@lemm.ee 1 year ago
He asked for a cocktail made out of bleach and ammonia, the bot told him it was poisonous. This isn’t the case of a bot just randomly telling people to make poison, it’s people directly asking the bot to make poison. You can see hints of the bot pushing back in the names, like the “clean breath cocktail”. Someone asked for a cocktail containing bleach, the bot said bleach is for cleaning and shouldn’t be eaten, so the user said it was because of bad breath and they needed a drink to clean their mouth.
It sounds exactly like a small group of people trying to use the tool inappropriately in order to get “shocking” results.
Do you get upset when people do exactly what you ask for and warn you that it’s a bad idea?
Karyoplasma@discuss.tchncs.de 1 year ago
Isn’t getting upset when facing the consequences of your own actions the crux of modern society?
DeltaTangoLima@reddrefuge.com 1 year ago
Lol. They fucked up by releasing a shitty AI on the internet, then act “disappointed” when someone tested the limits of the tech to see if they could get it to do something unintended, and you somehow think it’s still ok to blame the person who tried it?
First day on the internet?
kungen@feddit.nu 1 year ago
Why are you so upset that the store said that it’s inappropriate to write “sodium hypochlorite and ammonia” into a food recipe LLM? And “unleashing this piece of shit onto your customers”? Are we reading the same article, or how is a simple chatbot on their website something that has been “unleashed”?
DeltaTangoLima@reddrefuge.com 1 year ago
I’m annoyed because they’re taking no accountability for their own shitty implementation of an AI.
As a supermarket, you’d think they could add a simple taxonomy of items that are valid recipe ingredients so - you know - people can’t ask it to add bleach.
Yes, they unleashed it. They offered this up as a way to help customers save during a cost of living crisis, by using leftovers. At the very least, they’ve preyed on people who are under financial pressure, for their own gain.
Dave@lemmy.nz 1 year ago
Consider that they probably knew this would happen, and getting global news coverage is pretty much the point.
Steeve@lemmy.ca 1 year ago
Haha what? Accountability? If you plug “ammonia and bleach” into your AI recipe generator and you get sick eating the suggestion that includes ammonia and bleach that is 100% your fault.
Sabata11792@kbin.social 1 year ago
Let me add bleach to the list... and I'm banned.