Computers to read your body language?
- Date: October 15, 2010
- Source: ICT Results
- Summary: Can a computer read your body language? A consortium of European researchers thinks so, and has developed a range of innovative solutions, from escalator safety to online marketing.
The keyboard and mouse are no longer the only means of communicating with computers. Modern consumer devices will respond to the touch of a finger and even the spoken word, but can we go further still? Can a computer learn to make sense of how we walk and stand, to understand our gestures and even to read our facial expressions?
The EU-funded MIAUCE project (http://www.miauce.org/) set out to do just that. "The motivation of the project is to put humans in the loop of interaction between the computer and their environment," explains project coordinator Chaabane Djeraba, of CNRS in Lille.
"We would like to have a form of ambient intelligence where computers are completely hidden," he says. "This means a multimodal interface so people can interact with their environment. The computer sees their behaviour and then extracts information useful for the user."
It is hard to imagine a world where hidden computers try to anticipate our needs, so the MIAUCE project has developed concrete prototypes of three kinds of applications.
Escalator accidents
The first is to monitor the safety of crowds at busy places such as airports and shopping centres. Surveillance cameras are used to detect situations such as accidents on escalators.
"The background technology of this research is based on computer vision," says Djeraba. "We extract information from videos. This is the basic technology and technical method we use."
It's quite a challenge. First, the video stream must be analysed in real time to extract a hierarchy of three levels of features. At the lowest level, this is a mathematical description of shapes, movements and flows. At the next level, this basic description is interpreted in terms of crowd density, speed and direction. At the highest level, the computer is able to decide when activity becomes 'abnormal', perhaps because someone has fallen on an escalator and caused a pile-up that needs urgent intervention.
It is with the second level and the third, 'semantic', level of interpretation that MIAUCE has been most concerned.
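As a rough illustration of how such a hierarchy might be wired together, the sketch below (Python with OpenCV, not the project's own code) uses background subtraction and dense optical flow as the low-level description, aggregates them into crowd density, speed and direction at the middle level, and applies a simple rule at the semantic level: a sudden drop in average speed while the scene stays dense is flagged as a possible pile-up. The thresholds and algorithm choices are illustrative assumptions.

```python
# Illustrative three-level analysis of a video stream (assumed approach,
# not MIAUCE's published implementation).
import cv2
import numpy as np

def analyse_stream(video_path, density_floor=0.2, speed_drop_ratio=0.3):
    cap = cv2.VideoCapture(video_path)
    bg = cv2.createBackgroundSubtractorMOG2()        # assumed background model
    ok, frame = cap.read()
    if not ok:
        return
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    baseline_speed = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Level 1: low-level description - foreground shapes and motion vectors.
        fg = bg.apply(frame)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        prev = gray

        # Level 2: crowd density, average speed and dominant direction.
        density = np.count_nonzero(fg) / fg.size
        speed = np.linalg.norm(flow, axis=2).mean()
        direction = np.arctan2(flow[..., 1].mean(), flow[..., 0].mean())

        # Level 3: semantic decision - motion collapsing in a dense scene
        # could indicate a fall or pile-up that needs urgent intervention.
        if baseline_speed is None:
            baseline_speed = speed
        baseline_speed = 0.95 * baseline_speed + 0.05 * speed
        if density > density_floor and speed < speed_drop_ratio * baseline_speed:
            yield density, speed, direction
```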
One of the MIAUCE partners is already working with an escalator manufacturer to augment existing video monitoring systems at international airports, where there may be hundreds of escalators. If a collapse can be detected automatically, the seconds saved in responding could save lives.
But safety is only one possible kind of application where computers could read our body language.
Face swapping
A second could be in marketing, specifically to monitor how customers behave in shops. "We would like to analyse how people walk around in a shop," Djeraba says, "and the behaviour of people in the shop, where they look, for example."
The same partner is developing two products. One is a 'people counter' to monitor pedestrian flows in the street outside a shop; it is expected to appeal particularly to fashion stores that want to draw passers-by inside. The other is a 'heat map generator' to track the movements of people inside the store, so that the manager can see which parts of the displays are attracting the most attention.
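To illustrate what a 'heat map generator' might involve, here is a minimal sketch (again Python/OpenCV, an assumed approach rather than the partner's product): it accumulates foreground pixels, i.e. where moving people are detected, over the whole recording and renders the totals as a colour map, so the busiest areas show up as hot spots.

```python
# Minimal 'heat map' sketch: accumulate detected-person pixels over time
# (assumed method, not the commercial product's implementation).
import cv2
import numpy as np

def build_heat_map(video_path):
    cap = cv2.VideoCapture(video_path)
    bg = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    heat = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        fg = bg.apply(frame)                     # rough mask of moving people
        if heat is None:
            heat = np.zeros(fg.shape, dtype=np.float32)
        heat += (fg > 0).astype(np.float32)      # each occupied pixel gains 'heat'

    # Normalise and colour the accumulated map so hot spots stand out.
    heat = cv2.normalize(heat, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    return cv2.applyColorMap(heat, cv2.COLORMAP_JET)

# Example use: cv2.imwrite("shop_heatmap.png", build_heat_map("store_camera.mp4"))
```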
The third application addressed by MIAUCE is interactive web television, a technology of growing interest in which viewers can select what they want to see. As part of the project, the viewer's webcam is used to track their face and work out which part of the screen they are looking at.
This information could be used to feed the user further content based on what they have shown an interest in. Project partner Tilde, a software company in Latvia, is commercialising this application.
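As a toy illustration of mapping a webcam image to a coarse screen region, the sketch below uses only the horizontal position of the detected face as a stand-in for gaze direction. Real gaze estimation needs eye and head-pose models; everything here (the Haar-cascade detector, the three-way split) is an assumption for illustration, not the project's method.

```python
# Toy 'which part of the screen' estimate from a webcam frame
# (illustrative only; not real gaze tracking).
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def screen_region(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    centre = (x + w / 2) / frame.shape[1]   # normalised horizontal face position
    if centre < 0.33:
        return "left"
    if centre > 0.66:
        return "right"
    return "centre"
```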
MIAUCE has also developed a related technology of 'face swapping' in which the viewer's face can replace that of a model. This could be used for trying out hairstyles and clothing.
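A simplified sketch of the face-swapping idea is shown below: detect a face in each image, resize the viewer's face to the model's face region and blend it in with Poisson (seamless) cloning. The detector and the blending step are generic OpenCV techniques chosen for illustration, not the project's published pipeline.

```python
# Simplified face swap: paste the viewer's face over the model's face region
# (assumed approach for illustration).
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_face(img):
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.3, 5)
    return faces[0] if len(faces) else None

def swap_face(viewer_img, model_img):
    v, m = first_face(viewer_img), first_face(model_img)
    if v is None or m is None:
        raise ValueError("need a detectable face in both images")
    vx, vy, vw, vh = v
    mx, my, mw, mh = m
    face = cv2.resize(viewer_img[vy:vy + vh, vx:vx + vw], (int(mw), int(mh)))
    mask = 255 * np.ones(face.shape[:2], dtype=np.uint8)
    centre = (int(mx + mw // 2), int(my + mh // 2))
    # Poisson blending matches the pasted face to the model's lighting.
    return cv2.seamlessClone(face, model_img, mask, centre, cv2.NORMAL_CLONE)
```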
Ethics and anonymity
These are all ingenious applications, but are there not ethical and legal worries about reading people's behaviour in this way?
Djeraba acknowledges that the project team took such issues very seriously, and several possible applications of their technology were ruled out on those grounds.
They worked to some basic rules, such as placing cameras only on private premises and always with a warning notice, but the fundamental principle was anonymity. "We have to anonymise people," he says. "What we are doing here is analysing user behaviour without any identification; this is a fundamental requirement for such systems."
They also took account of whether the applications would be acceptable to society as a whole. No one would reasonably object to the monitoring of escalators, for example, if the aim was to improve public safety. But the technology must not identify individuals or even such characteristics as skin colour.
"Generally speaking, anonymity is the critical point. If we anonymise it's OK, if we don't anonymise it's not OK," Djeraba says.
MIAUCE received funding from the Sixth Framework Programme for research.
Story Source: Materials provided by ICT Results.