
Taste of the future: Robot chef learns to 'taste as you go'

Date:
May 4, 2022
Source:
University of Cambridge
Summary:
A robot 'chef' has been trained to taste food at different stages of the chewing process to assess whether it's sufficiently seasoned.
FULL STORY

A robot 'chef' has been trained to taste food at different stages of the chewing process to assess whether it's sufficiently seasoned.

Working in collaboration with domestic appliances manufacturer Beko, researchers from the University of Cambridge trained their robot chef to assess the saltiness of a dish at different stages of the chewing process, imitating a similar process in humans.

Their results could be useful in the development of automated or semi-automated food preparation by helping robots to learn what tastes good and what doesn't, making them better cooks.

When we chew our food, we notice a change in texture and taste. For example, biting into a fresh tomato at the height of summer will release juices, and as we chew, releasing both saliva and digestive enzymes, our perception of the tomato's flavour will change.

The robot chef, which has already been trained to make omelettes based on human tasters' feedback, tasted nine different variations of a simple dish of scrambled eggs and tomatoes at three different stages of the chewing process, and produced 'taste maps' of the different dishes.

The researchers found that this 'taste as you go' approach significantly improved the robot's ability to quickly and accurately assess the saltiness of the dish compared with other electronic tasting technologies, which only test a single homogenised sample. The results are reported in the journal Frontiers in Robotics and AI.

The perception of taste is a complex process in humans that has evolved over millions of years: the appearance, smell, texture and temperature of food all affect how we perceive taste; the saliva produced during chewing helps carry chemical compounds in food to taste receptors mostly on the tongue; and the signals from taste receptors are passed to the brain. Once our brains are aware of the flavour, we decide whether we enjoy the food or not.

Taste is also highly individual: some people love spicy food, while others have a sweet tooth. A good cook, whether amateur or professional, relies on their sense of taste, and can balance the various flavours within a dish to make a well-rounded final product.

"Most home cooks will be familiar with the concept of tasting as you go -- checking a dish throughout the cooking process to check whether the balance of flavours is right," said Grzegorz Sochacki from Cambridge's Department of Engineering, the paper's first author. "If robots are to be used for certain aspects of food preparation, it's important that they are able to 'taste' what they're cooking."

"When we taste, the process of chewing also provides continuous feedback to our brains," said co-author Dr Arsen Abdulali, also from the Department of Engineering. "Current methods of electronic testing only take a single snapshot from a homogenised sample, so we wanted to replicate a more realistic process of chewing and tasting in a robotic system, which should result in a tastier end product."

The researchers are members of Cambridge's Bio-Inspired Robotics Laboratory, run by Professor Fumiya Iida of the Department of Engineering, which focuses on training robots to tackle so-called 'last metre' problems: tasks which humans find easy, but robots find difficult. Cooking is one of these tasks: earlier tests with their robot 'chef' have produced a passable omelette using feedback from human tasters.

"We needed something cheap, small and fast to add to our robot so it could do the tasting: it needed to be cheap enough to use in a kitchen, small enough for a robot, and fast enough to use while cooking," said Sochacki.

To imitate the human process of chewing and tasting in their robot chef, the researchers attached a conductance probe, which acts as a salinity sensor, to a robot arm. They prepared scrambled eggs and tomatoes, varying the number of tomatoes and the amount of salt in each dish.

Using the probe, the robot 'tasted' the dishes in a grid-like fashion, returning a reading in just a few seconds.
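The article does not include the control or sensing code, but the sampling loop it describes is simple to sketch. The following Python snippet is purely illustrative: `move_probe_to` and `read_conductance` are invented placeholders for the robot-arm and conductance-probe drivers (the 'reading' here is just a random number), and the grid size and spacing are assumptions rather than values from the study.

```python
import numpy as np

# Hypothetical placeholders for the real robot-arm and probe drivers,
# which the article does not describe.
def move_probe_to(x_mm: float, y_mm: float) -> None:
    """Position the conductance probe at (x, y) over the dish."""
    pass  # a real implementation would command the robot arm here

def read_conductance() -> float:
    """Return a conductance reading; a random value stands in for the sensor."""
    return float(np.random.uniform(0.1, 1.0))

def scan_dish(nx: int = 5, ny: int = 5, spacing_mm: float = 20.0) -> np.ndarray:
    """'Taste' the dish on an nx-by-ny grid, probing each point in turn."""
    readings = np.zeros((ny, nx))
    for row in range(ny):
        for col in range(nx):
            move_probe_to(col * spacing_mm, row * spacing_mm)
            readings[row, col] = read_conductance()
    return readings
```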

To imitate the change in texture caused by chewing, the team then put the egg mixture in a blender and had the robot test the dish again. The different readings at different points of 'chewing' produced taste maps of each dish.
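Continuing the sketch above, a 'taste map' in this sense is simply the set of grid readings collected at each 'chewing' stage. The stage labels and array layout below are assumptions made for illustration, not details taken from the paper.

```python
# Hypothetical continuation of the scan_dish() sketch above.
import numpy as np

STAGES = ["unmixed", "blended_once", "blended_twice"]  # illustrative labels only

def build_taste_map(scans_per_stage: dict) -> np.ndarray:
    """Stack one grid scan per chewing stage into a (stage, row, col) array."""
    return np.stack([scans_per_stage[s] for s in STAGES])

# Scan the dish, 'chew' it in the blender, scan again, and so on.
scans = {stage: scan_dish() for stage in STAGES}
taste_map = build_taste_map(scans)   # e.g. shape (3, 5, 5)
```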

Their results showed a significant improvement in the robot's ability to assess saltiness compared with other electronic tasting methods, which are often time-consuming and only provide a single reading.
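The reported advantage is in classification: readings taken at several 'chewing' stages separate salt levels more reliably than one pooled reading. The paper's actual classifier is not described in this article, so the snippet below is only a minimal nearest-centroid illustration of that comparison, run on made-up data; every number in it is synthetic.

```python
import numpy as np

def loo_nearest_centroid_accuracy(features: np.ndarray, labels: np.ndarray) -> float:
    """Leave-one-out accuracy of a nearest-centroid classifier (illustrative only)."""
    correct = 0
    for i in range(len(labels)):
        train = np.delete(np.arange(len(labels)), i)
        centroids = {c: features[train][labels[train] == c].mean(axis=0)
                     for c in np.unique(labels[train])}
        pred = min(centroids, key=lambda c: np.linalg.norm(features[i] - centroids[c]))
        correct += int(pred == labels[i])
    return correct / len(labels)

# Synthetic stand-in data: 9 dishes, 3 salt levels, taste maps of shape (3 stages, 5, 5).
rng = np.random.default_rng(0)
labels = np.repeat([0, 1, 2], 3)
maps = rng.normal(loc=labels[:, None, None, None] * 0.3, scale=0.2, size=(9, 3, 5, 5))

multi_stage = maps.reshape(9, -1)                    # keep every per-stage, per-point reading
single_sample = maps.mean(axis=(1, 2, 3))[:, None]   # one 'homogenised' reading per dish

print("multi-stage accuracy:", loo_nearest_centroid_accuracy(multi_stage, labels))
print("single-sample accuracy:", loo_nearest_centroid_accuracy(single_sample, labels))
```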

While their technique is a proof of concept, the researchers say that by imitating the human processes of chewing and tasting, robots will eventually be able to produce food that humans will enjoy and that could be tweaked according to individual tastes.

"When a robot is learning how to cook, like any other cook, it needs indications of how well it did," said Abdulali. "We want the robots to understand the concept of taste, which will make them better cooks. In our experiment, the robot can 'see' the difference in the food as it's chewed, which improves its ability to taste."

"Beko has a vision to bring robots to the home environment which are safe and easy to use," said Dr Muhammad W. Chughtai, Senior Scientist at Beko plc. "We believe that the development of robotic chefs will play a major role in busy households and assisted living homes in the future. This result is a leap forward in robotic cooking, and by using machine and deep learning algorithms, mastication will help robot chefs adjust taste for different dishes and users."

In future, the researchers are looking to improve the robot chef so it can taste different types of food, and to broaden its sensing capabilities so it can detect sweet or oily food, for example.

The research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.


Story Source:

Materials provided by University of Cambridge. The original text of this story is licensed under a Creative Commons License. Note: Content may be edited for style and length.


Journal Reference:

  1. Grzegorz Sochacki, Arsen Abdulali, Fumiya Iida. Mastication-Enhanced Taste-Based Classification of Multi-Ingredient Dishes for Robotic Cooking. Frontiers in Robotics and AI, 2022; 9 DOI: 10.3389/frobt.2022.886074

