A European study advances the mechanisms of face perception by computers

An EU-funded study is analysing whether robots and computers can perceive faces as humans do, and is developing sophisticated computer vision systems capable of recognising facial expressions. Progress in computer recognition of facial expressions will help develop the next generation of life-changing software and robots, opening the way to socially aware companion robots and graphical characters.

Scientists from Queen Mary, University of London, together with colleagues from University College London and Oxford University in the United Kingdom, have developed a computer vision system that can detect smiles, show how motions transfer from one person's face to another, and show what faces look like when they switch gender. The research is an outcome of the LIREC ('Living with robots and interactive companions') project, which is backed with €8.2 million under the 'Information and communication technologies' (ICT) Theme of the EU's Seventh Framework Programme (FP7).

With the design of new sophisticated computer vision systems capable of recognising facial expressions, researchers will be able to break the movement of faces down into basic facial actions. Understanding how these actions differ between people will enable computer scientists to analyse facial movement and develop realistic motion for avatars. Transferring facial motion onto other faces or onto average avatars provides an extremely important tool for studying dynamic face perception in humans, as it allows experimenters to study facial motion in isolation from the form of the face. The scientists noted that this type of technology could lead to valuable spin-offs.
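The idea of separating facial motion from facial form can be illustrated with a minimal sketch: if a face is represented as a set of landmark points, an expression can be captured as displacements from a neutral pose and then reapplied to a different face's neutral landmarks. This is a simplified illustration of the general principle, not the project's actual method; the landmark coordinates and function names below are hypothetical.

```python
# Illustrative sketch: transferring facial motion between faces by
# treating an expression as landmark displacements from a neutral pose.
# All landmark coordinates are made up for demonstration.

def motion_vectors(neutral, expression):
    """Displacement of each landmark from neutral pose to expression."""
    return [(ex - nx, ey - ny)
            for (nx, ny), (ex, ey) in zip(neutral, expression)]

def apply_motion(neutral, vectors):
    """Apply the captured displacements to another face's neutral landmarks."""
    return [(nx + dx, ny + dy)
            for (nx, ny), (dx, dy) in zip(neutral, vectors)]

# Face A: neutral pose and a smile (mouth corners move outward and up).
face_a_neutral = [(30.0, 60.0), (70.0, 60.0)]   # left, right mouth corner
face_a_smile   = [(27.0, 56.0), (73.0, 56.0)]

# Face B has a different form (a wider mouth) but receives the same motion,
# so the expression is studied independently of the face it came from.
face_b_neutral = [(25.0, 65.0), (75.0, 65.0)]

smile = motion_vectors(face_a_neutral, face_a_smile)
face_b_smiling = apply_motion(face_b_neutral, smile)
print(face_b_smiling)  # [(22.0, 61.0), (78.0, 61.0)]
```

Because only the displacement vectors are transferred, face B keeps its own shape while adopting face A's motion, which is what lets experimenters study the motion in isolation.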