Last Update: 19-07-2011  
Related category(ies):
Innovation  |  Information society  |  Research policy  |  Science in society

 


Robots bridging gap between technology and society

Improving our knowledge of how humans perceive faces could give researchers the help they need to develop the next generation of life-changing software and robots. Driving this effort are scientists from Queen Mary, University of London, University College London and the University of Oxford in the United Kingdom, who are examining whether robots and computers can perceive faces in the same way people do. The research is an outcome of the LIREC ('Living with robots and interactive companions') project, which is backed with EUR 8.2 million under the 'Information and communication technologies' (ICT) Theme of the EU's Seventh Framework Programme (FP7). The researchers presented their work at the Royal Society's annual Summer Science Exhibition in London from 5 to 10 July.

Emys, a LIREC robot © LIREC

When you interact with other people, your brain processes a number of small and subtle cues about faces. At the exhibition, visitors saw how the human brain makes sense of faces, including how motion can be transferred from one person's face to another and what their own faces would look like if they switched gender. The visitors also saw sophisticated computer vision systems capable of recognising facial expressions.

'We will be showing some of the latest research from the EU-funded LIREC project, which aims to create socially aware companion robots and graphical characters,' said Professor Peter McOwan from the School of Electronic Engineering and Computer Science at Queen Mary, University of London, before the exhibition. 'There will be the opportunity for those attending to see if our computer vision system can detect their smiles, watch the most recent videos of our robots in action and talk to us about the project.'

Being able to break down the movement of faces into basic facial actions, and to understand how these actions differ from person to person, will enable computer scientists to analyse facial movement and build realistic motion into avatars. This, in turn, should make people more accepting of avatars as channels of communication.
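The article does not describe the internals of the LIREC vision system, so purely as an illustration of what detecting a smile in an image can involve, the sketch below uses OpenCV's stock Haar cascades in Python. The cascade files, parameters and input file name are assumptions made for the example, not the project's actual method.

    # Minimal smile-detection sketch using OpenCV's bundled Haar cascades.
    # Illustrative only: this is NOT the LIREC vision system; cascade files
    # and detection parameters here are assumptions for the example.
    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    smile_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_smile.xml")

    def detect_smiling_faces(image_path):
        """Return bounding boxes (x, y, w, h) of faces that appear to smile."""
        image = cv2.imread(image_path)
        if image is None:
            raise FileNotFoundError(image_path)
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        results = []
        # Find faces first, then look for a smile inside each face region.
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            face_roi = gray[y:y + h, x:x + w]
            smiles = smile_cascade.detectMultiScale(face_roi, 1.7, 20)
            if len(smiles) > 0:
                results.append((x, y, w, h))
        return results

    if __name__ == "__main__":
        print(detect_smiling_faces("visitor.jpg"))  # hypothetical input image

A system of the kind the researchers describe would go well beyond such a single-frame classifier, tracking basic facial actions over time so that the motion itself, not just a static expression, can be analysed and transferred to an avatar.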

Said Professor McOwan: 'Robots are going to increasingly form part of our daily lives — for instance, robotic aids used in hospitals or, much later down the road, sophisticated machines that we will have working in our homes. Our research aims to develop software, based on biology, that will allow robots to interact with humans in the most natural way possible — understanding the things we take for granted like personal space or reacting to an overt emotion such as happiness.'

Commenting on the research and work involved, Professor Alan Johnston from the Psychology and Language Sciences Division at University College London said: 'A picture of a face is just a frozen sample drawn from a highly dynamic sequence of movements. Facial motion transfer onto other faces or average avatars provides an extremely important tool for studying dynamic face perception in humans as it allows experimenters to study facial motion in isolation from the form of the face.'

For her part, co-researcher Cecilia Heyes from All Souls College at the University of Oxford noted that this type of technology can lead to the creation of great spin-offs. 'We're using it to find out how people imitate facial expressions, which is very important for rapport and cooperation, and why people are better at recognising their own facial movements than those of their friends — even though they see their friends' faces much more often than their own.'


See also

Queen Mary, University of London
LIREC
'Robotic hand made more natural'




