The dubious and sometimes dangerous cooking tips of this app
A New Zealand supermarket experimented with generative AI to provide meal ideas in an app. The problem lies in the recommendations themselves: many are disgusting, and some are downright deadly.
This story of generative (and somewhat degenerate) AI comes to us via the Guardian. It took place in New Zealand, at a local supermarket chain that had the idea of testing an AI to advise its customers.
The next Paul Bocuse is not an app
The discount chain in question is called Pak'nSave, and it presented its app as a tool for using up leftovers. Named "Savey Meal-bot" (the "meal saver robot"), it also aims to limit food waste. The app is based on OpenAI's GPT-3.5 language model, the same one that powers the free version of ChatGPT.

The bot is available online, and anyone can use it to create meal ideas. You enter the ingredients you have on hand, and the bot then generates a recipe. It does not limit itself to those ingredients, though: it can add others, which conveniently encourages you to go shopping.

While one might expect the artificial intelligence to be more precise as more ingredients are added, the opposite seems to happen. Some customers report (particularly on social media) that the more items they list, the worse the recommendations get. Worse still: some are dangerous.
Dangerous recipes recommended by the AI
One generated recipe, called "aromatic water mix", actually produces chlorine gas, yet is advertised as "the perfect non-alcoholic beverage to quench your thirst and refresh your senses". The problem is that inhaling chlorine gas can cause lung damage and can even be fatal. This is not the only toxic mixture the bot invented: there is also a non-alcoholic "fresh breath" cocktail made with bleach, ant-poison sandwiches, and methanol French toast.

Before each generation, however, the application warns users with a message: users must be at least 18 years old. It clarifies that Savey Meal-bot "uses generative artificial intelligence to create recipes, which are not verified by a human being."
The supermarket chain disclaims all responsibility "as to the accuracy, appropriateness or reliability of the content of the recipes generated, including that the portion size will be appropriate for consumption, that a recipe will constitute a complete or balanced meal, or that it will be suitable for consumption". Pak'nSave also calls on users' own judgment to determine whether a recipe is edible or not.
Faced with criticism of the Savey Meal-bot, a spokesperson for the chain said they were disappointed to see that "a small minority have tried to use the tool inappropriately and not in accordance with its intended purpose". The chain said it intends to keep fine-tuning the AI to ensure its usefulness and reliability, pointing back to its terms of use. This is not the first time a GPT-3.5-based AI has proved dangerous: My AI, Snapchat's assistant, for example, gave dangerous advice to young teenagers.