Emotion robots learn from people
Feelix Growing is a research project involving six countries and 25 roboticists, developmental psychologists and neuroscientists.
Co-ordinator Dr Lola Canamero said the aim was to build robots that "learn from humans and respond in a socially and emotionally appropriate manner".
The 2.3m euro scheme will last for three years.
"The human emotional world is very complex but we respond to simple cues, things we don't notice or we don't pay attention to, such as how someone moves," said Dr Canamero, who is based at the University of Hertfordshire.
The robots exhibit imprinted behaviour - following the 'mother' around
Dr Canamero likens the robots to babies that learn their behaviour from the patterns of movement and emotional state of the world around them.
The robots themselves are simple machines - and in some cases they are off-the-shelf machines. The most interesting aspect of the project is the software.
Dr Canamero said: "We will use very simple robots as the hardware, and for some of the machines we will build expressive heads ourselves.
"We are most interested in programming and developing behavioural capabilities, particularly in social and emotional interactions with humans."
The robots will learn from the feedback they receive from humans.
"It's mostly behavioural and contact feedback.
"Tactile feedback and emotional feedback through positive reinforcement, such as kind words, nice behaviour or helping the robot do something if it is stuck."
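The idea of shaping a robot's behaviour through kind words and helpful gestures can be sketched as a simple reinforcement loop. This is an illustrative assumption, not the project's actual algorithm: the behaviour names, reward values and update rule below are all invented for the example.

```python
# Hypothetical sketch: a robot weighting candidate behaviours by the
# positive or negative feedback it receives from a human.

class FeedbackLearner:
    def __init__(self, behaviours, learning_rate=0.2):
        # Start with no preference among the behaviours.
        self.values = {b: 0.0 for b in behaviours}
        self.learning_rate = learning_rate

    def reinforce(self, behaviour, reward):
        # Nudge the behaviour's value toward the reward signal:
        # +1 for kind words or help, -1 for a negative reaction.
        old = self.values[behaviour]
        self.values[behaviour] = old + self.learning_rate * (reward - old)

    def best_behaviour(self):
        # Prefer the behaviour with the strongest positive history.
        return max(self.values, key=self.values.get)

learner = FeedbackLearner(["approach", "wait", "retreat"])
learner.reinforce("approach", +1.0)   # human responds warmly
learner.reinforce("retreat", -1.0)    # human seems displeased
print(learner.best_behaviour())       # -> approach
```

Over many interactions the values converge toward how each behaviour is typically received, which is the essence of learning from "positive reinforcement, such as kind words".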
The university's partners are building different robots focusing on different emotional interactions.
The robots will get feedback from simple vision cameras, audio, contact sensors, and sensors that can work out the distance between the machine and the humans.
"One of the things we are going to use to detect expressions in faces and patterns in motion is an (artificial) neural network."
Artificial neural networks are being used because they are very useful for adapting to changing inputs - in this case detecting patterns in behaviour, voice, movement etc.
"Neural networks learn patterns from examples of observation," said Dr Canamero.
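"Learning patterns from examples of observation" can be illustrated with the simplest possible neural network, a single perceptron. The two input features (say, movement speed and proximity) and the labels below are invented for illustration; the project's networks for faces and motion sequences would be far richer.

```python
# Minimal perceptron sketch: learn a pattern from labelled examples.

def train_perceptron(samples, epochs=20, lr=0.1):
    # Two weights and a bias, all starting at zero.
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            # Predict 1 if the weighted sum crosses the threshold.
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            # Adjust weights in the direction of the error.
            err = target - pred
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Invented labels: fast, close movement = "agitated" (1),
# slow, distant movement = "calm" (0).
samples = [((0.9, 0.8), 1), ((0.8, 0.9), 1),
           ((0.1, 0.2), 0), ((0.2, 0.1), 0)]
w, b = train_perceptron(samples)

def classify(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

print(classify((0.85, 0.9)))  # -> 1 ("agitated")
print(classify((0.1, 0.1)))   # -> 0 ("calm")
```

The key property, and the reason such networks suit this project, is that nothing about "agitated" is hand-coded: the boundary between the two patterns is learned entirely from the examples, so the same code adapts if the examples change.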
One of the areas the robots will be learning from is human movement.
"Motion tells you a lot about your emotional state.
"The physical proximity between human and robot, and the frequency of human contact - through those things we hope to detect the emotional states we need."
The robots will not be trying to detect emotional states such as disgust but rather will focus on states such as anger, happiness and loneliness: emotions that affect how the robot should behave.
"It is very important to detect when the human user is angry and the robot has done something wrong or if the human is lonely and the robot needs to cheer him or her up.
"We are focusing on emotions relevant to a baby robot that has to grow and help humans with everyday life."
One of the first robots built in the project is exhibiting imprinted behaviour - which is found among birds and some mammals when born.
"They get attached to the first object they see when born.
"It is usually the mother and that's what makes them follow the mother around.
"We have a prototype of a robot that follows people around and can adapt to the way humans interact with it.
"It follows closer or further away depending on how the human feels about it."
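The "closer or further away" adaptation can be sketched as a robot that keeps a preferred following distance and shifts it according to how comfortable the human seems. The comfort signal, step size and distance bounds here are assumptions for illustration, not details of the prototype.

```python
# Hypothetical sketch of distance-adaptive following.

class Follower:
    def __init__(self, preferred_distance=1.5, step=0.25,
                 min_d=0.5, max_d=3.0):
        # Distances in metres; bounds keep the robot at a safe range.
        self.preferred_distance = preferred_distance
        self.step = step
        self.min_d = min_d
        self.max_d = max_d

    def update(self, human_comfortable):
        # A comfortable human lets the robot edge closer;
        # an uncomfortable one makes it back off, within bounds.
        if human_comfortable:
            self.preferred_distance = max(
                self.min_d, self.preferred_distance - self.step)
        else:
            self.preferred_distance = min(
                self.max_d, self.preferred_distance + self.step)
        return self.preferred_distance

robot = Follower()
print(robot.update(human_comfortable=True))   # -> 1.25
print(robot.update(human_comfortable=False))  # -> 1.5
```

Repeated updates let the preferred distance settle wherever a particular person is happiest, mirroring the quoted behaviour of following "closer or further away depending on how the human feels about it".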
Dr Canamero says robots that can adapt to people's behaviours are needed if the machines are to play a part in human society.
At the end of the project two robots will be built that integrate the different aspects of the machines being developed across Europe.
The other partners in this project are the Centre National de la Recherche Scientifique, Universite de Cergy Pontoise, Ecole Polytechnique Federale de Lausanne, University of Portsmouth, Institute of Communication and Computer Systems, Greece, Entertainment Robotics, Denmark and SAS Aldebaran Robotics, France.
Article from: http://news.bbc.co.uk/2/hi/technology/6389105.stm