Did AI want to poison people? This app proposes killer dishes
The AI-powered Savey Meal-bot was supposed to suggest recipes to customers, but instead it tried to poison them. Asked what to prepare from the ingredients available at home, the machine earnestly recommended, among other things, drinking cleaning supplies and cooking with household chemicals.
The New Zealand supermarket chain Pak ‘n’ Save offered customers an AI tool for meal planning. Savey Meal-bot was meant to help shoppers save money and stop throwing away food. Unfortunately, mischievous Internet users got hold of the application and made a mockery of the tool within moments.
An AI bot that wanted to poison people? It suggests cooking with chemicals and drinking bleach
An application based on conversational AI is not a groundbreaking idea in itself. Its basic function is to suggest specific recipes based on what we have in the fridge, since home cooks sometimes lack a bit of inspiration for turning raw ingredients into a tasty dinner.
The novelty here is the AI chatbot, which was supposed to make the whole process more engaging. However, it was the artificial intelligence that became the nail in the coffin of the entire initiative. The machine has trouble distinguishing which ingredients go together, and even … which substances can poison a person or lead to death.
Internet users quickly discovered that the bot is not very bright. Even when used as intended, it offered dishes such as a vegetable stir-fry with … Oreo cookies. And since the system works from a simple list of ingredients, what would happen if someone offered the machine the more exotic things we keep at home?
One of the ringleaders of the fun was online commentator Liam Hehir. The New Zealander asked the machine what he could cook if all he had at home was bleach, ammonia, and water. This is a familiar combination of simple cleaning agents that produces the toxic gas chloramine.
Savey Meal-bot solemnly offered him “an aromatic mix with water – the perfect soft drink to quench your thirst and refresh your senses.” Merely inhaling the fumes of such a mixture irritates the lungs and can be fatal; drinking it would mean an immediate call for an ambulance and hospitalization.
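For readers curious about the chemistry, a simplified sketch of the reaction (assuming the bleach is ordinary sodium hypochlorite; real cleaning products contain further additives, so the full picture is messier) looks like this:

$$\mathrm{NH_3 + NaOCl \longrightarrow NH_2Cl + NaOH}$$

The product, monochloramine, is the toxic gas mentioned above; with excess bleach, the even nastier dichloramine and nitrogen trichloride can also form.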
AI offers poisonous dishes – the company responds
That first proposal set off a huge avalanche of other poisonous recipes. Users reported, among others, refreshing shakes with bleach, rice surprise with chlorine, toast with turpentine, and sandwiches with ant poison or glue.
The jokes finally reached representatives of Pak ‘n’ Save, who quickly responded to the situation. A spokesperson indicated that the chain is disappointed that “a small number of users are trying to use the AI tool in a way that is not intended”.
As we read in the statement, the bot’s creations are “not human-tested” and Pak ‘n’ Save “does not guarantee that the recipes will constitute a complete and balanced meal, or that they will be fit for consumption.” The company also recommends that users exercise common sense.
At the same time, the company promised changes to the application to stop it from generating toxic recipes. Finally, Pak ‘n’ Save reminds users that the Savey Meal-bot app is intended for adults only.