Robots learn to express emotions
A realistic goal for robot technology in the near future is the ability to “understand” four or five basic human emotions. Such robots will be able to decipher complex human facial expressions, recognise emotional states from body language and respond appropriately. The first steps towards this goal are currently being taken.
The European project Feelix Growing (FEEL, Interact, eXpress: a Global appRoach to develOpment With INterdisciplinary Grounding) is developing robots capable of adapting to new and changing environments. The hope is that these robots will be able to react to the emotional behaviour of their users.
One of the many robot models at the University of Hertfordshire, north of London, is designed to copy people's movements, similar to how new-born animals mimic their mothers. The robot can analyse a person's walking pattern with cameras and sensors in order to identify them. The robot then follows the person, varying its speed and direction as needed. Since the robot recognises the person by their walk, it will walk away if the person suddenly adopts a different walking style. And because people's walking patterns can change with mood, the next step in the robot's development is to enable it to adapt not only to different people but also to different emotional states of the same person.
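The article does not describe the project's actual recognition algorithm, but the idea of matching an observed gait against stored profiles can be illustrated with a minimal sketch. The profile names, the two gait features (stride length and cadence) and the distance threshold below are all hypothetical stand-ins:

```python
import math

# Hypothetical gait profiles: (stride length in m, cadence in steps/min).
known_profiles = {
    "alice": (0.72, 110.0),
    "bob": (0.85, 96.0),
}

def identify_walker(observed, threshold=5.0):
    """Return the closest known profile, or None when no profile is
    near enough -- a stand-in for the robot walking away from an
    unrecognised gait."""
    best_name, best_dist = None, float("inf")
    for name, profile in known_profiles.items():
        dist = math.dist(observed, profile)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

print(identify_walker((0.70, 109.0)))  # close to "alice"
print(identify_walker((1.40, 60.0)))   # matches no one: None
```

Adapting to moods, as the researchers intend, would amount to storing several profiles per person rather than one.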
Another robot, a small robotic dog, will sense when it is in a situation that requires assistance. The robotic dog will start barking when it doesn't recognise its current environment. It will then calm down again when someone talks to it, pets it or when the dog sees its user.
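The dog's behaviour, as described, is essentially a small state machine: bark when the surroundings are unfamiliar, calm down when the user interacts. A toy sketch, with environment recognition simulated as membership in a set of known places (the class and place names are invented for illustration):

```python
class RoboDog:
    """Toy state machine for the barking behaviour described above."""

    def __init__(self, known_places):
        self.known_places = set(known_places)
        self.barking = False

    def observe(self, place):
        # Bark when the current surroundings are unfamiliar.
        self.barking = place not in self.known_places
        return self.barking

    def soothe(self):
        # Talking to, petting, or being seen by the user calms the dog.
        self.barking = False

dog = RoboDog(known_places={"living room", "kitchen"})
print(dog.observe("garage"))  # True: unfamiliar, starts barking
dog.soothe()
print(dog.barking)            # False: calmed down
```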
Human recognition is also being used for robots designed to show various emotions. For example, a robot that expresses happiness whenever it sees its developer, and then sadness when the developer goes away. The ability to display emotions is the result of complex algorithmic calculations and artificial neural networks.
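The project's networks are far more elaborate than anything shown here, but the smallest possible version of the mapping just described is a single artificial neuron with hand-picked weights standing in for trained ones (the input encoding and weights below are invented for illustration):

```python
def step(x):
    # Threshold activation of a single artificial neuron.
    return 1.0 if x > 0 else 0.0

def emotion(developer_visible):
    """Map one binary input (developer visible: 1.0 or 0.0) to an
    emotional display, via a one-neuron 'network' with a fixed
    weight of 2.0 and bias of -1.0."""
    happy = step(2.0 * developer_visible - 1.0)
    return "happy" if happy else "sad"

print(emotion(1.0))  # happy: the developer is in view
print(emotion(0.0))  # sad: the developer has gone away
```

A real system would learn such weights from data and drive many facial actuators, not a single label.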
Lola Cañamero, coordinator of Feelix Growing, does not want the results of the project to be just more pet or service robots. The robots should play an active social role. For example, if someone is depressed, the robot could try to cheer them up; if someone is stressed, the robot could try to calm them down. Strong potential lies in roles somewhere between companion and care provider.
A medical application for such robots could be assisting in the treatment of people who have difficulty with social interaction. At a laboratory in Paris, scientists are studying how children with and without autism respond to robots. Various tests with autistic children have shown positive results. Children with severe autism have shown signs of sadness when the robot has a sad face and have reacted happily when the robot smiles. Furthermore, they have been able to say which emotion the robot is expressing.
Other participants in the Feelix Growing project include engineers at Aldebaran Robotics in Paris. They are currently developing NAO, a humanoid service robot designed to handle various tasks, be it serving drinks or providing first aid.
Such a high integration of robots into our everyday lives leads to concerns as to how and whether the robots will be accepted. Ethical and philosophical issues soon follow once a robot begins caring for the sick. However, before any of these issues need to be faced, there are still many technical challenges to be solved.