Putting A Face On Android Science By Exploring An Uncanny Valley
- Date:
- July 21, 2006
- Source:
- Indiana University
- Summary:
- We might be more responsive to robots designed to look human rather than mechanical, but other factors may determine what causes us to accept or shun these virtual humans.
"Recent evidence indicates that androids are better able to elicit human norms of interaction than less humanlike robots or animated characters," said Karl F. MacDorman, associate professor at the Indiana University School of Informatics. "However, there's a heightened sensitivity to defects in near humanlike forms -- an uncanny valley in what is otherwise a positive relationship between human likeness and familiarity."
The so-called "uncanny valley" theory was proposed in 1970 by robotics pioneer Masahiro Mori. It holds that the more realistic and humanlike a robot appears, the more positively a human will react to it, but only up to a point. When the resemblance becomes nearly, but not quite, human, the theory suggests, the reaction reverses, producing a sense of repulsion or eeriness -- or perhaps even the beholder's grim realization of human mortality.
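Mori's proposal is qualitative, but the shape of the hypothesized relationship is easy to visualize. The short Python sketch below plots a purely illustrative curve of affinity against human likeness, with an arbitrary dip standing in for the uncanny valley; the function and parameter values are assumptions chosen for illustration, not data from Mori's paper or from MacDorman's studies.

```python
# Illustrative sketch only: a made-up function whose shape mimics Mori's
# hypothesized uncanny valley (affinity rises with human likeness, dips
# sharply near "almost human," then recovers at full humanness).
import numpy as np
import matplotlib.pyplot as plt

likeness = np.linspace(0.0, 1.0, 500)  # 0 = fully mechanical, 1 = fully human

# Baseline: affinity grows with human likeness.
affinity = likeness ** 1.5

# Subtract a Gaussian "valley" centered near, but short of, full human likeness.
valley_center, valley_width, valley_depth = 0.85, 0.05, 1.2
affinity -= valley_depth * np.exp(-((likeness - valley_center) ** 2) / (2 * valley_width ** 2))

plt.plot(likeness, affinity)
plt.axhline(0.0, color="gray", linewidth=0.5)  # below this line: eeriness or repulsion
plt.xlabel("Human likeness (mechanical to human)")
plt.ylabel("Affinity / familiarity (arbitrary units)")
plt.title("Hypothetical uncanny valley curve (illustration, not data)")
plt.show()
```

As the study described below suggests, ratings of real robots need not trace a single curve of this shape.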
An expert in human-computer interaction at the school's Indiana University-Purdue University Indianapolis campus, MacDorman aims to break new ground in uncanny valley research, building on previous and ongoing studies in which he has been involved. He believes there is no single explanation for the phenomenon.
One recent study MacDorman was part of sought to determine whether the uncanny valley is a necessary property of near-humanlike forms. The 56 participants -- young adult Indonesian college students, professionals and government workers -- were presented with 14 short video clips depicting different kinds of robotic devices engaged in various activities in different settings. The range of devices included a mechanical arm, walking humanoid robots, and android heads and full-bodied androids engaging in social interaction. They also viewed a clip of a human female.
The participants were individually asked to rate the video clips on scales gauging mechanical versus humanlike, strange versus familiar, and eeriness. "Contrary to an earlier experiment with morphed robot-to-human images, what we found does not indicate a single uncanny valley for a particular range of human likeness," MacDorman said.
"Rather the results suggest human likeness is only one of perhaps many factors influencing the extent to which a robot is perceived as being strange, familiar or eerie," he added. "This is important because it implies that factors other than human likeness could be manipulated to overcome the uncanny valley. This is also what David Hanson [a well-known creator of android faces and CEO of Hanson Robotics Inc.] found in an experiment using still images."
The California-born MacDorman and Hiroshi Ishiguro, with whom he collaborates at Osaka University, suggest such factors might include facial and bodily proportions; movement quality, such as fluidity or jerkiness; and contingency and timing -- whether the robot can closely attune its voice, gestures and gaze to the people it is interacting with, without too many pregnant pauses or rapid-fire reactions.
Why then is there a need for continuing robotics research on the development of more humanlike androids?
"Android science has great potential to help cognitive neuroscientists, and social and cognitive scientists understand human beings as well as improving medical training," MacDorman said. "We might be using androids, but what we're really studying is ourselves -- what motivates us and how we interact with one another as humans."
MacDorman and Ishiguro are organizing a symposium on July 26 at the 28th Annual Conference of the Cognitive Science Society in Vancouver, Canada. "Toward Social Mechanisms of Android Science" brings together some of the world's leading experts in robotics and the behavioral sciences. Information about the session, along with the full text of MacDorman's paper on the robot video clips, is available at www.androidscience.com.
Three other IU Bloomington researchers are among those scheduled to present at the session: Hui Zhang, a graduate student in the School of Informatics' Department of Computer Science; Chen Yu, assistant professor; and Chancellor's Professor Linda B. Smith, both of the Department of Psychology and Brain Sciences. The three have devised an interactive virtual reality platform for studying human interaction in the context of language learning.
In addition to the android science symposium, MacDorman is organizing sessions with University of Washington psychology professor Peter Kahn on psychological benchmarks of human-robot interaction at the 15th IEEE Symposium on Robot and Human Interactive Communication, held at the University of Hertfordshire in the United Kingdom.
Story Source:
Materials provided by Indiana University. Note: Content may be edited for style and length.