
Robots that perceive the world like humans

Date:
October 18, 2012
Source:
Basque Research
Summary:
Perceive first, act afterwards. The architecture of most of today’s robots is underpinned by this control strategy. The eSMCs project has set itself the aim of changing the paradigm and generating more dynamic computer models in which action is not a mere consequence of perception but an integral part of the perception process. It is about improving robot behavior by means of perception models closer to those of humans.
FULL STORY

Perceive first, act afterwards. The architecture of most of today's robots is underpinned by this control strategy. The eSMCs project has set itself the aim of changing the paradigm and generating more dynamic computer models in which action is not a mere consequence of perception but an integral part of the perception process. It is about improving robot behaviour by means of perception models closer to those of humans. Philosophers at the UPV/EHU-University of the Basque Country are working to improve robots' perception systems by applying human models.

"The concept of how science understands the mind when it comes to building a robot or looking at the brain is that you take a photo, which is then processed as if the mind were a computer, and a recognition of patterns is carried out. There are various types of algorithms and techniques for identifying an object, scenes, etc. However, organic perception, that of human beings, is much more active. The eye, for example, carries out a whole host of saccadic movements -- small rapid ocular movements -- that we do not see.Seeing is establishing and recognising objects through this visual action, knowing how the relationship and sensation of my body changes with respect to movement," explains XabierBarandiaran, a PhD-holder in Philosophy and researcher at IAS-Research (UPV/EHU) which under the leadership of Ikerbasque researcher Ezequiel di Paolo is part of the European project eSMCs (Extending Sensorimotor Contingencies to Cognition).

Until now, the belief has been that sensations are processed, perception is created, and this in turn leads to reasoning and action. As Barandiaran sees it, action is an integral part of perception: "Our basic idea is that when we perceive, what is there is active exploration, a particular co-ordination with the surroundings, like a kind of invisible dance that makes vision possible."

The eSMCs project aims to apply this idea to the computer models used in robots, improve their behaviour and thus understand the nature of the animal and human mind. For this purpose, the researchers are working on sensorimotor contingencies: the regular relationships that exist between actions and the changes in sensation associated with those actions.
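To make the contrast concrete, the minimal Python sketch below (purely illustrative, not code from the eSMCs project) shows an agent whose "perception" exists only inside its action-sensation loop: instead of taking a snapshot and then deciding, it probes its surroundings with small movements, somewhat like saccades, and lets the way sensation changes as a result of those movements drive the next move. The light-source environment, probe size and step size are invented for the example.

```python
# A minimal, purely illustrative sketch (not the eSMCs project's code) of
# perception arising inside a closed action-sensation loop. The agent never
# builds an internal picture of its surroundings: it probes with small
# movements, akin to saccades, and the way sensation changes *because of*
# those movements is what guides it.

def light_intensity(position, source=5.0):
    """Environment: what the agent senses depends on where its 'eye' is."""
    return 1.0 / (1.0 + (position - source) ** 2)

def run_sensorimotor_loop(position=0.0, probe=0.1, steps=60):
    for _ in range(steps):
        here = light_intensity(position)            # sensation before the probe
        there = light_intensity(position + probe)   # sensation produced by probing
        change = there - here                       # the sensorimotor contingency
        position += 0.2 if change > 0 else -0.2     # act on the contingency directly
    return position

print(round(run_sensorimotor_loop(), 1))  # settles near the light source at ~5.0
```

There is no stored image of the light anywhere in the loop; the regularity between moving and the resulting change in sensation is all the agent exploits.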

An example of this kind of contingency is drinking water while speaking, almost without realising it. The interaction with the surroundings takes place "without any need to internally represent that this is a glass and then compute needs and plan an action," explains Barandiaran. "Seeing the glass draws one's attention; it is coordinated with thirst, while the presence of the water itself on the table is enough for me to coordinate the visual-motor cycle that ends up with the glass at my lips." The same thing happens with the robots in the eSMCs project: "they are moving the whole time, they don't stop to think; they think in the act, using the body and the surroundings," he adds.

The researchers in the eSMCs project maintain that actions play a key role not only in perception, but also in the development of more complex cognitive capacities. That is why they believe that sensorimotor contingencies can be used to specify habits, intentions, tendencies and mental structures, thus providing the robot with a more complex, fluid behaviour.

One of the experiments involves a robot simulation (developed by Thomas Buhrmann, who is also a member of this team at the UPV/EHU) in which an agent has to discriminate between what we could call an acne pimple and a bite or lump on the skin. "The acne has a tip, the bite doesn't. Just as people do, our agent stays with the tip and recognises the acne, and when it goes on to touch the lump, it ignores it. What we are seeking to model and explain is that moment of perception that is built through the active exploration of the skin, when you feel 'ah! I've found the acne pimple' and you go on sliding your finger across it," says Barandiaran. The model tries to identify what kind of relationship is established between the cycles of movement and sensation and the neurodynamic patterns simulated in the robot's "mini brain."
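The Python sketch below is a much-simplified stand-in for that simulation; it is not Buhrmann's model, which couples the agent's body to simulated neurodynamics. It only illustrates the core point that the pimple-versus-bite verdict falls out of how tactile sensation changes while the finger actively slides, rather than from a static tactile image. The skin profiles, step size and sharpness threshold are all invented for the illustration.

```python
# A much-simplified illustration (not Thomas Buhrmann's actual model, which
# couples the simulated body to neurodynamics in a "mini brain") of telling a
# pimple from a bite through active exploration: the verdict comes from how
# tactile sensation changes while the finger slides, not from a static image.

import math

def skin_height(x, has_tip):
    """Toy skin profile: a broad lump, plus a sharp tip only on the pimple."""
    lump = math.exp(-((x - 0.5) ** 2) / 0.08)
    tip = 0.5 * math.exp(-((x - 0.5) ** 2) / 0.0002)
    return lump + (tip if has_tip else 0.0)

def explore(has_tip, step=0.005):
    """Slide a simulated fingertip across the skin, tracking how fast sensation changes."""
    x, previous, sharpest = 0.0, skin_height(0.0, has_tip), 0.0
    while x < 1.0:
        x += step                                     # the exploratory action
        current = skin_height(x, has_tip)             # the sensation it produces
        sharpest = max(sharpest, abs(current - previous) / step)
        previous = current
    return sharpest

for label, has_tip in (("pimple", True), ("bite", False)):
    sharpness = explore(has_tip)
    verdict = "tip found, stay with it" if sharpness > 5.0 else "no tip, ignore it"
    print(f"{label}: sharpest change {sharpness:.1f} -> {verdict}")
```

Only while the fingertip is moving does the sharp tip produce its characteristic burst of sensory change; a motionless sensor reading the same skin would register nothing distinctive.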

In another experiment, Puppy, a robot dog built at the Artificial Intelligence Laboratory of the University of Zurich, is capable of adapting to and "feeling" the texture of the terrain on which it is moving (slippery, viscous, rough, etc.) by exploring the sensorimotor contingencies that arise as it walks.
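Again as an illustration rather than the Zurich lab's actual controller, the toy Python sketch below shows how a walker could distinguish slippery, viscous and rough ground purely from the mismatch between the gait command it issues and the movement it senses back, the kind of sensorimotor contingency Puppy exploits. The terrain models and thresholds are invented for the example.

```python
# A toy illustration (not the actual controller of Puppy, the robot dog built
# at the Artificial Intelligence Laboratory of the University of Zurich): the
# same leg command yields different sensory traces on different surfaces, and
# those differences alone are enough to tell the surfaces apart.

import math
import random

def stride_trace(terrain, steps=200):
    """Simulated foot-speed readings while the same gait command is executed."""
    random.seed(0)
    command = [math.sin(0.3 * t) for t in range(steps)]
    sensed = []
    for c in command:
        if terrain == "slippery":
            sensed.append(c if random.random() < 0.5 else 0.0)  # intermittent loss of grip
        elif terrain == "viscous":
            sensed.append(0.4 * c)                              # motion damped by the ground
        else:  # "rough"
            sensed.append(c + random.gauss(0.0, 0.3))           # high-frequency vibration
    return command, sensed

def classify(command, sensed):
    gap = [s - c for s, c in zip(sensed, command)]              # sensation vs. action
    jitter = sum(abs(b - a) for a, b in zip(gap, gap[1:])) / (len(gap) - 1)
    attenuation = sum(abs(s) for s in sensed) / sum(abs(c) for c in command)
    if jitter < 0.2:
        return "viscous"            # smooth but weakened response
    return "slippery" if attenuation < 0.8 else "rough"

for terrain in ("slippery", "viscous", "rough"):
    print(terrain, "->", classify(*stride_trace(terrain)))
```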

The work of the UPV/EHU research team focuses on the theoretical side of the models to be developed. "As philosophers, what we mostly do is define concepts. Our main aim is to be able to define technical concepts like the sensorimotor habitat, or that of the pattern of sensorimotor co-ordination, as well as that of habit or of mental life as a whole." Defining concepts and giving them a mathematical form is essential so that scientists can apply them to specific experiments, not only with robots, but also with human beings. The partners at the University Medical Centre Hamburg-Eppendorf, for example, are studying, in dialogue with the theoretical development of the UPV/EHU team, how the perception of time and space changes in Parkinson's patients.


Story Source:

Materials provided by Basque Research. Note: Content may be edited for style and length.


Cite This Page:

Basque Research. "Robots that perceive the world like humans." ScienceDaily. ScienceDaily, 18 October 2012. <www.sciencedaily.com/releases/2012/10/121018100131.htm>.
