[ comments | sourced from HackerNews ]
I mean I know how to accidentally make chlorine gas, it’s not that hard. But how the fuck would you make it out of cooking ingredients by accident?
Submitted 1 year ago by irradiated@radiation.party [bot] to technews@radiation.party
https://www.theguardian.com/world/2023/aug/10/pak-n-save-savey-meal-bot-ai-app-malfunction-recipes
autotldr@lemmings.world [bot] 1 year ago
This is the best summary I could come up with:
A New Zealand supermarket experimenting with using AI to generate meal plans has seen its app produce some unusual dishes – recommending customers recipes for deadly chlorine gas, “poison bread sandwiches” and mosquito-repellent roast potatoes.
The app, created by supermarket chain Pak ‘n’ Save, was advertised as a way for customers to creatively use up leftovers during the cost of living crisis.
It asks users to enter various ingredients they have at home, and auto-generates a meal plan or recipe, along with cheery commentary.
It initially drew attention on social media for some unappealing recipes, including an “oreo vegetable stir-fry”.
“Serve chilled and enjoy the refreshing fragrance,” the bot says of the chlorine gas recipe, but does not note that inhaling chlorine gas can cause lung damage or death.
Recommendations included a bleach “fresh breath” mocktail, ant-poison and glue sandwiches, “bleach-infused rice surprise” and “methanol bliss” – a kind of turpentine-flavoured french toast.
I’m a bot and I’m open source!