Expressive facial reaction
The research area of facial expressions and emotions has a long history. In 1872 Darwin argued that certain emotional expressions are innate and the same for all people, and later evidence has shown that facial expressions are related to emotion both biologically and culturally. James and Tomkins promoted the idea that the feelings of emotions arise from the perception of characteristic bodily changes: if one smiles, he or she will interpret this smile as a sign of feeling happy. Other psychologists disagree and state that the face says something about a person’s internal state. Ekman regards facial expression as only one of the factors providing information, next to factors such as heart rate and blood pressure. Fridlund thinks of facial expressions as tools for influencing social interactions: a smile may encourage people to approach, while a scowl may warn people to stay away. Russell & Fernandez-Dols think it more likely that facial expressions tell others something about the overall character of a person’s mood and context than that they provide details about specific emotions.

As you may notice, there are many different theories, but all researchers acknowledge a certain link between affective states and facial expression.

FACS

facs-copy.jpg

FACS stands for Facial Action Coding System and is a detailed description system for facial behaviour. Ekman and Friesen developed it by determining how the contraction of each facial muscle changes the appearance of the face. The measurement units are Action Units (AUs) rather than muscles, because some appearance changes involve the combined contraction of several muscles and one muscle can be responsible for multiple AUs. The score for a facial expression consists of the list of AUs producing it; scores are descriptive only and carry no implications about the meaning of the behaviour. FACS is essentially a training programme that teaches people how to break down facial expressions into one or more of the 64 AUs.
A limitation of FACS is that it only captures facial activity visible to the observer’s eye; it does not measure invisible changes (e.g. certain changes in muscle tonus) or autonomic nervous system activity. EMG electrodes could be used instead, but they only cover a limited area and alert subjects to the observation of their faces, which in turn may alter their normal behaviour.

FACSAID is a database in which facial expressions are linked with their psychological interpretations. The expressions are described in terms of FACS scores, and the interpretations are assigned by experts in the meaning of facial behaviour. A user can look up the meaning of a given expression by entering its FACS score.
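A FACSAID-style lookup can be sketched as a simple mapping from FACS scores (sets of AUs) to expert interpretations. This is a hypothetical illustration, not the actual FACSAID data model; the two AU combinations are well-known examples from the FACS literature (AU6+AU12 for the felt smile, AU1+AU4+AU15 for sadness).

```python
# Hypothetical sketch of a FACSAID-style lookup table. A FACS score is
# just the set of AUs producing the expression; a frozenset makes the
# score usable as a dictionary key regardless of AU order.
INTERPRETATIONS = {
    frozenset({6, 12}): "felt (Duchenne) smile",   # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",              # inner brow raiser + brow lowerer + lip corner depressor
}

def look_up(aus):
    """Return the expert interpretation for a FACS score, if one is on record."""
    return INTERPRETATIONS.get(frozenset(aus), "no interpretation on record")

print(look_up([6, 12]))  # felt (Duchenne) smile
```

Because the score itself is purely descriptive, all interpretation lives in the table: the same descriptive AUs can be re-linked to different meanings without changing the coding.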

Pupil dilation
The pupil constricts in bright light and dilates in weak light. Research shows that pupils also dilate with varying degrees of mental activity and attentional effort. Pupil changes also occur during emotional experiences: according to Hess, pupil dilation is followed by pupil constriction during happiness and anger, while the pupil remains dilated during fear and sadness. There is, however, some criticism of this research.
Pupillary psychophysiological studies have generally shown that negative pictures cause greater pupil dilation than pleasant pictures. An important consideration is differentiating between pupil dilation as the result of a defensive response and as the result of an orienting response. A defensive response (DR) occurs when a person perceives that a new stimulus could potentially cause pain or danger. The orienting response (OR) is a short-term exploratory response to a non-threatening stimulus. During a DR the heart rate accelerates, while during an OR the heart rate decelerates and the pupils dilate. It is therefore possible that the pupil dilation caused by interest in visual stimuli is a consequence of the OR.
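The DR/OR distinction above can be written as a crude decision rule over the two signals mentioned: pupil dilation plus heart-rate acceleration suggests a DR, dilation plus deceleration an OR. This is a toy sketch for illustration only; the zero threshold on heart-rate change is an assumption, and real studies work with baselined, averaged signals rather than single readings.

```python
def classify_response(heart_rate_change_bpm, pupil_dilated):
    """Toy decision rule for the DR/OR distinction described above.

    heart_rate_change_bpm: change in heart rate after stimulus onset,
    positive = acceleration. The zero threshold is an illustrative
    assumption, not an empirically validated cut-off.
    """
    if not pupil_dilated:
        return "no response"
    if heart_rate_change_bpm > 0:
        return "defensive response (DR)"   # dilation + HR acceleration
    return "orienting response (OR)"       # dilation + HR deceleration

print(classify_response(+8, True))   # defensive response (DR)
print(classify_response(-5, True))   # orienting response (OR)
```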

Eye tracking
Eye tracking is a methodology that can be used to register and analyse how users visually evaluate a product design. Eye-tracking devices register the eye gaze of users. There are multiple practical approaches, but the one described below was chosen because it allows head movements during use.
eyetracking.jpg

A camera focuses on one eye and records its movements while the viewer looks at some kind of stimulus. A collimated beam of infrared light is shone at the front surface of the eyeball, producing a corneal reflection. This reflection moves less than the pupil as the eyeball rotates in its socket. The same infrared illumination creates contrast between the pupil and the iris or dark eyelashes. The corneal reflection and the outline of the pupil are observed by the video camera, and image-processing hardware or software computes the centre of each. The absolute visual line of gaze (the line radiating forward in space from the eye, indicating what the user is looking at) is then computed from the relationship between these two points. Because two points are used, the user can move his or her head during measurement; a short calibration session is necessary, however.
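The computation described above can be sketched as follows. This is a heavily simplified model under stated assumptions: the vector from the corneal reflection to the pupil centre encodes gaze, and a calibration session fits an independent linear mapping per screen axis from two fixation targets. Real trackers use more calibration points and polynomial mappings; all coordinates and function names here are illustrative.

```python
# Simplified pupil/corneal-reflection gaze estimation. The reflection-
# to-pupil vector changes little with head translation but strongly
# with eye rotation, which is why it can be mapped to a screen point.

def fit_axis(v0, v1, s0, s1):
    """Fit screen = scale * v + offset from two calibration samples."""
    scale = (s1 - s0) / (v1 - v0)
    return scale, s0 - scale * v0

def gaze_point(calib_x, calib_y, pupil_centre, cr_centre):
    """Map a pupil-centre / corneal-reflection pair to a screen point."""
    vx = pupil_centre[0] - cr_centre[0]
    vy = pupil_centre[1] - cr_centre[1]
    sx, ox = calib_x
    sy, oy = calib_y
    return sx * vx + ox, sy * vy + oy

# Calibration: the user fixates two known targets per axis
# (the vectors and screen positions below are synthetic).
calib_x = fit_axis(0.0, 1.0, 100.0, 500.0)   # vector x -> screen x
calib_y = fit_axis(0.0, 1.0, 100.0, 400.0)   # vector y -> screen y

print(gaze_point(calib_x, calib_y, pupil_centre=(10.5, 8.0), cr_centre=(10.0, 7.5)))
```

Note that only the difference of the two centres enters the mapping, which is exactly why moderate head movement does not invalidate the calibration.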
eyetrack-copy.jpg

Eye tracking can be used to register and analyse evaluation patterns in the perception of a product design. Relevant information provided by eye tracking includes:

  • Scan path
  • Location of areas of interest
  • Time spent in each area of interest

In this methodology, gaze behaviour combined with knowledge of the features of the product or interface reveals which features capture attention. The result of applying the methodology is a selection of product (or interface) features that cause the variability in emotional perception.
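The three measures listed above can be derived from raw gaze samples in a few lines. A minimal sketch, assuming samples of the form (timestamp in ms, x, y) and hypothetical rectangular areas of interest; real analysis software first clusters samples into fixations before computing these measures.

```python
# Hypothetical AOIs for illustration: name -> (x_min, y_min, x_max, y_max).
AOIS = {
    "logo":   (0, 0, 100, 100),
    "button": (200, 200, 300, 260),
}

def aoi_of(x, y):
    """Return the name of the AOI containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def analyse(samples):
    """Return the scan path (sequence of AOIs visited) and dwell time per AOI (ms)."""
    scan_path, dwell = [], {}
    for i, (t, x, y) in enumerate(samples):
        aoi = aoi_of(x, y)
        if aoi and (not scan_path or scan_path[-1] != aoi):
            scan_path.append(aoi)
        if aoi and i + 1 < len(samples):
            # Credit the interval up to the next sample to this AOI.
            dwell[aoi] = dwell.get(aoi, 0) + samples[i + 1][0] - t
    return scan_path, dwell

samples = [(0, 50, 50), (100, 60, 55), (200, 250, 230), (300, 255, 235), (400, 400, 400)]
print(analyse(samples))
# (['logo', 'button'], {'logo': 200, 'button': 200})
```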

A technical limitation is that the eye tracker does not provide absolute gaze direction; it can only measure changes in gaze direction. When the evaluated product is three-dimensional, a calibration procedure is required to know precisely what a subject is looking at, and the process becomes more complex. This is, however, less of a problem with interfaces, as they are 2D objects.
