What is the Facial Action Coding System (FACS)?
Part Three in our Guide to Paul Ekman's Universal Emotions
The Facial Action Coding System (FACS) stands as a monumental contribution to the objective measurement and understanding of human facial movements. Developed by Paul Ekman and Wallace V. Friesen, it provides a granular, anatomically based framework that transcends subjective interpretation, allowing for precise analysis of how emotions are expressed on the face.
What FACS Is: A tool for objectively measuring facial movements.
FACS is recognized globally as a comprehensive, anatomically based system designed to taxonomize and objectively measure the full range of human facial movements. Its origins trace back to a system initially developed by the Swedish anatomist Carl-Herman Hjortsjö, which was later adopted, refined, and formally published by Paul Ekman and Wallace V. Friesen in 1978. A substantially revised edition published in 2002 incorporated contributions from Joseph C. Hager, further solidifying its methodology.
A key distinction of FACS is its departure from traditional methods that categorize expressions into broad, prototypical emotions like happiness or sadness. Instead, FACS meticulously encodes the movements of individual facial muscles, enabling the analysis of even ambiguous, subtle, or non-emotional facial actions. This rigorous, objective approach makes it the most widely acknowledged, comprehensive, and credible system for studying visually discernible facial movements. By focusing on the underlying muscular actions rather than inferred emotional states, FACS remains free from theoretical biases, offering a method to analyze facial movement in its entirety and providing the potential to uncover novel muscular configurations associated with various states.
How It Works: Action Units (AUs) corresponding to muscle movements.
FACS operates by breaking down complex facial expressions into fundamental, discrete components known as Action Units (AUs). Each AU represents the observable change on the face produced by the contraction or relaxation of specific facial muscles. The system defines 44 AUs, 30 of which correspond directly to the contraction of specific facial muscles: 12 in the upper face and 18 in the lower face.
AUs can occur individually or in combination. Combinations can be "additive," where the appearance of each constituent AU remains unchanged, or "non-additive," where the combined action alters the appearance of the individual AUs. For example, AU 4 on its own draws the brows together and lowers them; in the combination AU 1 + 4, however, the brows are drawn together yet raised by the action of AU 1. Another non-additive case involves AU 2: alone it raises the outer brow, but it often pulls the inner brow up as well, so AU 2 by itself can look very similar to the combination AU 1 + 2. Beyond identifying which AUs are active, the system records each action's intensity on a five-point scale (from A, a trace, to E, maximum), as well as its duration and symmetry. Over 7,000 distinct AU combinations have been observed, highlighting the intricate complexity of human facial expression. FACS scoring itself involves meticulous frame-by-frame analysis of video or photographic facial events, producing a precise record of muscular movements over time.
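To make the coding scheme concrete, here is a minimal Python sketch (not part of the official FACS materials) of how a scored facial event might be represented: each active AU carries an intensity on the A-to-E scale plus onset and offset frames, and a simple check flags the non-additive AU 1 + 4 combination described above.

```python
from dataclasses import dataclass

# FACS scores each action's intensity on a five-point scale, A (trace) to E (maximum).
INTENSITY_LEVELS = ("A", "B", "C", "D", "E")

@dataclass
class AUEvent:
    """One scored Action Unit within a facial event."""
    au: int            # Action Unit number, e.g. 4 = Brow Lowerer
    intensity: str     # one of "A".."E"
    onset_frame: int   # frame where the movement begins
    offset_frame: int  # frame where the face returns toward neutral

    def __post_init__(self) -> None:
        if self.intensity not in INTENSITY_LEVELS:
            raise ValueError(f"intensity must be one of {INTENSITY_LEVELS}")

# A scored facial event is simply the set of AUs active at the same time.
event = [
    AUEvent(au=1, intensity="C", onset_frame=120, offset_frame=160),
    AUEvent(au=4, intensity="B", onset_frame=122, offset_frame=158),
]

# Non-additive combinations change each other's appearance: AU 1 + 4 together
# produce brows that are drawn together yet raised, not simply lowered.
active = {e.au for e in event}
if {1, 4} <= active:
    print("Non-additive combination AU 1+4 present")
```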
Example Combinations
- Happy: AU6 (Cheek Raiser) + AU12 (Lip Corner Puller)
- Sad: AU1 (Inner Brow Raiser) + AU4 (Brow Lowerer) + AU15 (Lip Corner Depressor)
- Surprise: AU1 + AU2 + AU5 + AU26
- Fear: AU1 + AU2 + AU4 + AU5 + AU20 + AU26
- Anger: AU4 + AU5 + AU7 + AU23
- Disgust: AU9 + AU10 + AU17
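These example combinations lend themselves to a simple lookup. The sketch below is only an illustration, not an official EMFACS procedure: it matches a set of observed AUs against the prototype combinations listed above and reports any emotion whose full prototype is present.

```python
# Prototype AU combinations taken from the example list above. Real FACS-based
# emotion coding recognizes many variants; this lookup is purely illustrative.
PROTOTYPES = {
    "Happiness": {6, 12},
    "Sadness":   {1, 4, 15},
    "Surprise":  {1, 2, 5, 26},
    "Fear":      {1, 2, 4, 5, 20, 26},
    "Anger":     {4, 5, 7, 23},
    "Disgust":   {9, 10, 17},
}

def match_emotions(observed_aus: set[int]) -> list[str]:
    """Return emotions whose full prototype AU set appears in the observation."""
    return [emotion for emotion, aus in PROTOTYPES.items() if aus <= observed_aus]

print(match_emotions({6, 12}))          # ['Happiness']
print(match_emotions({1, 4, 15, 17}))   # ['Sadness']
```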
Using FACS to Recognize Emotions
FACS has proven instrumental in substantiating the existence of the seven universally recognized facial expressions of emotion. Researchers using FACS have found that specific AUs, or combinations of AUs, consistently correspond to particular emotional states. For instance, Anger is typically coded as a combination of AUs 4, 5, 7, and 23 (with AU 23 being a particularly reliable indicator); Disgust involves AUs 9, 10, and 17 (AU 10 being reliable); Surprise is characterized by AUs 1, 2, 5, 25, and 26/27; Fear by AUs 1, 2, 4, 5, 7, 20, 25, and 26 (AUs 1, 2, and 4 being reliable); Sadness by AUs 1, 4, and 15 (AUs 1 and 15 being reliable); Happiness, or enjoyment, by AUs 6 and 12 (AU 6 being reliable); and Contempt by R14 (a reliable unilateral action). In the reference table at the end of this article, these reliable AUs are marked with asterisks: they denote muscle movements that are difficult to voluntarily control or fake.
Beyond identifying basic emotions, FACS possesses the unique capability to differentiate between genuine and fabricated expressions. A classic example is distinguishing a Duchenne smile, which signifies genuine enjoyment and involves the involuntary contraction of muscles around the eyes (AU 6), from a social or polite smile, which typically only engages the mouth muscles. The absence of movement in the outer part of the orbicularis oculi muscle (AU 6) is a key indicator that a happiness expression may be false. To streamline the analysis of emotionally significant expressions, EMFACS (Emotion FACS) was developed as a selective application of FACS scoring. In EMFACS, coders focus specifically on behaviors that are likely to carry emotional meaning, by scanning video for core combinations of AUs known to suggest certain emotions. This method, while still yielding FACS codes, saves considerable time by not requiring the coding of every single facial movement.
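As a small illustration of this distinction, the following sketch (with hypothetical helper names, not Ekman's own method) uses the presence or absence of AU 6 alongside AU 12 to separate a Duchenne smile from a social one.

```python
def classify_smile(observed_aus: set[int]) -> str:
    """Separate a Duchenne smile (AU 6 + AU 12) from a social smile (AU 12 alone)."""
    if 12 not in observed_aus:          # AU 12: Lip Corner Puller
        return "no smile"
    # AU 6 (outer portion of orbicularis oculi) is the marker of genuine enjoyment.
    return "Duchenne (felt) smile" if 6 in observed_aus else "social (polite) smile"

print(classify_smile({6, 12}))  # Duchenne (felt) smile
print(classify_smile({12}))     # social (polite) smile
```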
Training Resources and Certification
Mastery of the Facial Action Coding System requires significant dedication and rigorous training. The FACS manual is designed for self-instruction, a process that typically demands between 50 and 100 hours of diligent study and practice, often spread over several months, to achieve proficiency. Paul Ekman himself recommends group training, as collaborative learning helps in absorbing the large volume of complex material.
Certification as a proficient FACS coder is achieved by passing a final test. This assessment involves coding a series of video segments, and proficiency is demonstrated by achieving an agreement level of 0.70 or higher with a set of criterion codes established by experts. Successful completion of this certification indicates reliability and consistency in coding, qualifying an individual to apply FACS in research contexts. However, it is crucial to understand that passing the certification test signifies foundational knowledge of the system, not necessarily expertise or the qualification to train others. The Paul Ekman Group offers the FACS manual and the final certification test for purchase. Additionally, specialized workshops, such as the five-day FACS workshop taught by Erika Rosenberg, are available to guide students through the entire manual and prepare them for certification.
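For a sense of what the 0.70 threshold means in practice, here is a hypothetical sketch. It assumes agreement is computed as twice the number of AUs scored by both the trainee and the criterion coder, divided by the total number of AUs each scored, a ratio commonly used in FACS reliability work; the actual certification scoring procedure is defined by the Paul Ekman Group.

```python
def agreement_ratio(coder_aus: set[int], criterion_aus: set[int]) -> float:
    """Twice the number of AUs scored by both coders, divided by the total
    number of AUs scored by either (an assumed agreement index)."""
    agreed = len(coder_aus & criterion_aus)
    total = len(coder_aus) + len(criterion_aus)
    return 2 * agreed / total if total else 1.0

# Hypothetical example: a trainee's codes for one video segment vs. the criterion codes.
trainee = {1, 2, 5, 26}
criterion = {1, 2, 5, 25, 26}
print(round(agreement_ratio(trainee, criterion), 2))  # 0.89, above the 0.70 threshold
```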
Applications in Security, Therapy, Acting, and Negotiation
The objective and detailed nature of FACS has led to its diverse application across numerous professional domains.
In security and law enforcement, FACS serves as a valuable tool for detecting deception and assessing the credibility of suspects and witnesses. It aids in identifying "hot spots"—subtle deviations from an individual's behavioral baseline that signal emotional or cognitive stress, providing crucial clues during interrogations. FACS research has empirically supported its utility in deception detection, with studies demonstrating that combining vocal and facial measures can achieve high accuracy rates in classifying truthfulness.
Within therapy and mental health, FACS is employed to evaluate emotional impairment in neuropsychiatric disorders and to analyze subtle expressions associated with conditions such as depression. It is also applied in therapeutic interventions for individuals on the autism spectrum, helping them to better understand and express emotions, thereby addressing difficulties in social interaction and communication. Furthermore, FACS has been utilized in research to predict coping mechanisms following traumatic loss.
In the realm of acting and character development, Ekman's work, including FACS, has found direct practical application in Hollywood. He famously served as a scientific advisor for the television series "Lie to Me," where he meticulously analyzed scripts and provided video clip-notes of facial expressions for actors to imitate. This scientific understanding of how facial muscles create specific expressions (AUs) empowers actors to portray emotions with greater accuracy, nuance, and realism, contributing to more compelling character development. His consultation on the animated film "Inside Out" further highlights the influence of his research in popular culture and the artistic portrayal of emotions.
For negotiation and conflict resolution, the ability to understand non-verbal cues through FACS significantly enhances communication and improves outcomes. Negotiators can leverage this skill to assess an opponent's true emotional state, identify inconsistencies between their verbal and non-verbal communication, and ultimately make more effective strategic decisions. While detecting lies in negotiation remains challenging, understanding these subtle cues allows negotiators to adapt their strategies, build trust, and foster a more collaborative atmosphere. Training videos based on Ekman's work are specifically designed to improve negotiation outcomes and facilitate conflict resolution.
Key FACS Action Units (AUs) and Associated Emotions
| AU Code(s) | Description of Muscle Movement/Facial Appearance | Associated Universal Emotion(s) | Notes |
|---|---|---|---|
| 4 + 5 + 7 + 23* | Brows lowered & drawn together, upper eyelid raised, lower eyelid tensed, lips pressed firmly. | Anger | AU 23 is a reliable indicator. |
| 9 + 10* + 17 | Upper lip raised, nose bridge wrinkled, cheeks raised, chin boss raised. | Disgust | AU 10 is a reliable indicator. |
| 1 + 2 + 5 + 25 + 26/27 | Brows raised high, horizontal forehead wrinkles, eyes wide open (upper lid elevated, lower lid relaxed), jaw drops open. | Surprise | |
| 1* + 2* + 4* + 5 + 7 + 20 + 25 + 26 | Brows raised & drawn together, upper eyelids raised, lower eyelids tensed, mouth stretched, jaw dropped. | Fear | AUs 1, 2, 4 are reliable indicators. |
| 1* + 4 + 15* | Inner brows raised, brows drawn together, mouth corners lowered. | Sadness | AUs 1, 15 are reliable indicators. |
| 6* + 12 | Cheeks raised, crow's-feet wrinkles at the eye corners, lip corners pulled up. | Happiness / Enjoyment | AU 6 is a reliable indicator (Duchenne marker). |
| R14* | Corner of lip tightened and raised slightly on one side of the face (unilateral smirk). | Contempt | R14 is a reliable indicator (unilateral). |
Note: The asterisk (*) indicates muscle movements that are considered highly reliable and difficult to voluntarily control or fake, as per Ekman's research.
Simply Put
The development of FACS provided a scientific, granular, and unbiased method for studying facial behavior, moving beyond intuitive or culturally biased interpretations. Its anatomical basis makes it a universal language for describing facial movement, which is crucial for research across disciplines and for developing automated emotion recognition systems. This objectivity is precisely what lends FACS its power in applications such as lie detection, where subtle, involuntary movements are key indicators. However, the extensive time and effort required to master FACS underscore that its effective application is a specialized expertise, not a casual skill. Even as automated tools emerge, this highlights the enduring value of trained human judgment in interpreting complex non-verbal cues.
In Part Four we will explore Microexpressions and Deception Detection.