Design & Emotion Blog

New breakthroughs in the development of emotion recognition software

“Faces reveal emotions, and researchers in fields as disparate as psychology, computer science, and engineering are joining forces under the umbrella of ‘affective computing’ to teach machines to read expressions. If they succeed, your computer may one day ‘read’ your mood and play along. Machines equipped with emotional skills could also be used in teaching, robotics, gaming, sales, security, law enforcement, and psychological diagnosis.”

Lupsa goes on to describe research done at the University of Amsterdam, where researchers created emotion recognition software and used it to analyze faces in images and photographs. Some nice examples: the Mona Lisa was found to be 83 percent happy and 9 percent disgusted, and Che Guevara was mostly sad in the iconic photo used on so many t-shirts.

Computers are getting close to reading and interpreting emotions

Computers can now analyze a face from video or a still image and infer the emotion it displays almost as accurately as humans, or better. It generally works like this:

  1. The computer isolates the face and extracts rigid features (movements of the head) and nonrigid features (expressions and changes in the face, including texture);
  2. The information is classified using codes that catalog changes in features;
  3. Then, using a database of images exemplifying particular patterns of motions, the computer can say a person looks as if they are feeling one of a series of basic emotions – happiness, surprise, fear – or simply describe the movements and infer meaning.
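
To make the classification step a bit more concrete, here is a minimal Python sketch of steps 2 and 3. It assumes step 1 (isolating the face and measuring feature movements) has already happened, and the feature names, numeric values, and prototype patterns are purely illustrative; they are not taken from the research described in the article.

    from math import dist

    # Step 2 (illustrative): hypothetical "codes" describing measured changes in
    # facial features, e.g. how strongly the brows are raised or the jaw drops.
    observed = {"brow_raise": 0.8, "lip_corner_pull": 0.1, "jaw_drop": 0.7, "lip_press": 0.0}

    # Step 3 (illustrative): a tiny stand-in for the database of images exemplifying
    # particular patterns of motion, reduced here to one prototype vector per emotion.
    prototypes = {
        "happiness": {"brow_raise": 0.2, "lip_corner_pull": 0.9, "jaw_drop": 0.3, "lip_press": 0.0},
        "surprise":  {"brow_raise": 0.9, "lip_corner_pull": 0.1, "jaw_drop": 0.8, "lip_press": 0.0},
        "fear":      {"brow_raise": 0.7, "lip_corner_pull": 0.0, "jaw_drop": 0.5, "lip_press": 0.6},
    }

    def closest_emotion(features, patterns):
        """Return the emotion whose prototype pattern is nearest to the observed features."""
        keys = sorted(features)
        vec = [features[k] for k in keys]
        return min(patterns, key=lambda e: dist(vec, [patterns[e][k] for k in keys]))

    print(closest_emotion(observed, prototypes))  # -> "surprise" for the values above

Real systems of course work with far richer feature codes and learned classifiers rather than a handful of hand-picked prototypes, but the basic idea of matching observed movements against stored emotion patterns is the same.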

Research done at MIT by affective-computing pioneer Rosalind Picard was used to help autistic people get feedback on the emotions other people are feeling, because they cannot read those cues themselves. Picard says she learned a broader lesson from this research: if you can teach a person when to be sensitive to others, you could probably teach a machine to do so as well.

Even though I was aware that this type of research was already advanced, this article made me realize how quickly it is progressing. At the moment I am involved in developing a tool that measures emotion during website interaction, and this type of software could resolve a very large problem we are facing: we need a way for participants to indicate their emotions, and so far that means self-report. A limitation of self-report is the interpretation process participants go through before answering, not to mention the delay between the experience and the answer (it may be only seconds, but it can have a big impact).

The tool we are developing is an expert tool that can help designers improve web and interaction design, but with these developments I see a bright future of websites that might even adapt to the emotion you are feeling at that particular moment: “You look frustrated, let me help you and give you this other option.”
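
In code, such an adaptive response could boil down to something as simple as the following sketch; the emotion labels and responses are made up for illustration and are not part of the tool described here.

    # Hypothetical mapping from a detected emotion to a design response.
    adaptations = {
        "frustration": "Offer a simpler alternative path or live help.",
        "confusion": "Show a short explanation of the current step.",
        "happiness": "Keep the current flow; no intervention needed.",
    }

    def adapt(detected_emotion: str) -> str:
        """Pick a response for the emotion inferred from the visitor's face."""
        return adaptations.get(detected_emotion, "No adaptation defined for this emotion.")

    print(adapt("frustration"))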

Interesting stuff.
