Updated Emotion Test and Some Unreal Engine Work

 

Hello. I have an updated emotion test, along with some in-engine work.

romtest3.gif

First, here is my updated emotion test in Maya. The hair is a placeholder that I plan to replace later, and there are no wrinkle normal maps or other shader features in this video; it is just the Maya test. My goal in the video below was to push the rig further than in the older test. I’ve now added corrective blend shapes, which bring more life to the face and give the lips more deformation so they better simulate real lips as they move. I also changed how the eyes focus: there is now an option to follow either the head movement or the neck movement, as well as a control for the distance the eyes focus at. I’ll have another post later showing the difference it makes.

In the engine, I’ve been redesigning the way some of the blueprints work. The interface here is just a test to get all of the individual animation pieces controlled properly. I’m currently designing another GUI for that, but I need to learn a little more about Unreal’s UMG before I show it.

After sending the animations to Unreal (I will have a post about that later), I separate out parts of the head in the character’s animation blueprint and combine them back together with a Layered Blend Per Bone node. Currently, I have the mouth and brow sections of the face separated out.

anim_blueprint.PNG
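As a side note for anyone doing this in C++ rather than in the AnimGraph directly: the values behind this setup could be exposed from an AnimInstance subclass, which the Layered Blend Per Bone node and the blendspaces then read. This is only a rough sketch; the class and property names below are placeholders, not my actual blueprint.

```cpp
// FacialAnimInstance.h - a hypothetical AnimInstance exposing the values the
// AnimGraph reads (per-region blend weights and blendspace inputs).
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "FacialAnimInstance.generated.h"

UCLASS()
class UFacialAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Weight of the mouth layer in the Layered Blend Per Bone node (0 = base pose).
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Face|Regions")
    float MouthBlendAlpha = 1.0f;

    // Weight of the brow layer.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Face|Regions")
    float BrowBlendAlpha = 1.0f;

    // Current inputs for the Happy/Surprise blendspace axes.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Face|Emotions")
    float Happy = 0.0f;

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Face|Emotions")
    float Surprise = 0.0f;

    // Targets set from gameplay code or the test GUI; eased toward each frame.
    float TargetHappy = 0.0f;
    float TargetSurprise = 0.0f;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override;
};
```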

All of the animations are also put together with a blendspace. Here is an example of a Happy/Surprise blendspace.

blendspace.PNG
blendspace2.PNG
blendspace3.PNG
blendspace4.PNG
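Continuing the hypothetical sketch above, the blendspace inputs could be eased toward target values each frame so expression changes don’t pop:

```cpp
// FacialAnimInstance.cpp
#include "FacialAnimInstance.h"

void UFacialAnimInstance::NativeUpdateAnimation(float DeltaSeconds)
{
    Super::NativeUpdateAnimation(DeltaSeconds);

    // Smoothly interpolate toward the requested emotional state. These are the
    // values the Happy/Surprise blendspace samples along its two axes.
    Happy    = FMath::FInterpTo(Happy,    TargetHappy,    DeltaSeconds, 5.0f);
    Surprise = FMath::FInterpTo(Surprise, TargetSurprise, DeltaSeconds, 5.0f);
}
```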

In the shader for the character’s skin, I have scalar parameters that drive additional normal maps, which are blended with the default normal map for the brow, eye, and mouth line wrinkles.
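For illustration, here is roughly how those scalar parameters could be set from code through a dynamic material instance. The parameter names (“BrowWrinkle” and so on) are placeholders and not necessarily the ones in my material:

```cpp
#include "Components/SkeletalMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Drives the wrinkle-map blend amounts on the skin material. In practice the
// dynamic material instance would be created once and cached.
void SetWrinkleWeights(USkeletalMeshComponent* Mesh, float Brow, float Eye, float Mouth)
{
    // Element 0 is assumed to be the skin material slot.
    if (UMaterialInstanceDynamic* SkinMID = Mesh->CreateDynamicMaterialInstance(0))
    {
        SkinMID->SetScalarParameterValue(TEXT("BrowWrinkle"), Brow);
        SkinMID->SetScalarParameterValue(TEXT("EyeWrinkle"), Eye);
        SkinMID->SetScalarParameterValue(TEXT("MouthWrinkle"), Mouth);
    }
}
```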

 

wrinklemaps.PNG

Normal map blending for the wrinkles in the face.

 

HighresScreenshot00016.png

No wrinkles on a neutral face, compared with wrinkles on a worried face.

 

 

 

Variables then drive each animation. This is my test GUI for getting all of the animations into the engine and testing all of the combinations.
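For anyone curious how a control like this hooks up in code, a slider in a UMG widget could simply write into the anim instance from the earlier sketch. Everything here is hypothetical naming, not the actual test GUI shown below:

```cpp
// FacialTestWidget.h - a minimal UMG widget with one slider driving "happy".
#pragma once

#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/Slider.h"
#include "FacialAnimInstance.h"
#include "FacialTestWidget.generated.h"

UCLASS()
class UFacialTestWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    // Assigned when the widget is created for a given character.
    UPROPERTY(BlueprintReadWrite, Category = "Face")
    UFacialAnimInstance* FaceAnim = nullptr;

    // Slider laid out in the widget designer; bound here by name.
    UPROPERTY(meta = (BindWidget))
    USlider* HappySlider = nullptr;

protected:
    virtual void NativeConstruct() override
    {
        Super::NativeConstruct();
        if (HappySlider)
        {
            HappySlider->OnValueChanged.AddDynamic(this, &UFacialTestWidget::OnHappyChanged);
        }
    }

    UFUNCTION()
    void OnHappyChanged(float Value)
    {
        if (FaceAnim)
        {
            // The 0..1 slider value becomes the Happy blendspace target.
            FaceAnim->TargetHappy = Value;
        }
    }
};
```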

BugScreenShot_00000

Neutral face.

BugScreenShot_00001

A smile.

BugScreenShot_00002

The same smile with the eye wrinkles.

 

BugScreenShot_00003

Disgust and anger.

BugScreenShot_00006

Disgust.

BugScreenShot_00004

Fear.

BugScreenShot_00005

Sadness.

 

In addition to the Paul Ekman research (Unmasking the Face by Paul Ekman and Wallace V. Friesen), I’ve also been using Anatomy of Facial Expression by Uldis Zarins and Sandis Kondrats as a reference.

 

Test Emotions

The main skinning and head controls are complete. I’ve started on some corrective blend shapes and wrinkle maps in ZBrush, as well as getting some of this into Unreal. My next step is to create a GUI in Unreal that lets you control the various emotional states.

A range of motion test in Maya for the first head model.

I also did research into the Facial Action Coding System, reading Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues by Paul Ekman and Wallace V. Friesen. I split the animation segments of my character into brow, eyes, nose, and mouth, similar to how the book breaks down the facial features. I was also able to make a few blends between emotions.

 


 

 

 

emotions.png

Emotion “blends” using different facial pose combinations.

I also made wrinkle maps in ZBrush based on the most extreme poses. I then imported the low-poly versions of these models into Maya, where I used them to aid areas of the face rig that weren’t able to reach the correct shape. For instance, the eyebrows have a corrective blend shape that helps bulge the inner brow when it is lowered to an extreme.

 

browcorrective.jpg

An example of one of the corrective blend shapes used.

 

head_wrinkles.png

Wrinkle maps in ZBrush.

My next update will include some Unreal Engine work.

 

 

 

Minor Update

 

 

rig2

Character Rig Controls

 

I’m just making a quick post to say I’m still working on my thesis. My first character’s texturing is almost done. I’ll include finished concept art and rigging/skinning updates next.

 

Lia

The character rendered in Substance Painter’s Iray.

 

 

screenshot001.png

The eye texture rendered in Marmoset 2.

 

Mission Statement

Improve facial animation in games by studying and applying personality traits and the emotional spectrum, and by demonstrating how these expressions could be procedurally animated.

Issues in facial animation that I would like to address:

  1. Over-exaggerated expressions used on high-fidelity models.
  2. Lack of expression, where faces are overly stiff and characters look like dolls with dead eyes.
  3. No variation between characters with different personalities.

Paul Ekman and “Microexpressions”

Paul Ekman is a psychologist who studied the faces of people around the world and concluded that facial expressions of emotion are universal rather than socially learned.

He coined the term “microexpression” for the involuntary expressions our faces make in response to stimuli. He categorized six basic emotional states: sadness, anger, happiness, disgust, surprise, and fear, with contempt later proposed as a seventh.

We only think about expression when we want to use our body for communication on a conscious level. And a lot of the time we aren’t very good at faking internal states. If someone is playing a role in a social situation, it’s often expected of them, but much of the time we aren’t fooled by the performance. Which I think is one of the reasons why great actors are fascinating. – Paul Ekman

FACS

Naughty Dog applied FACS (Facial Action Coding System) to their animations in The Last of Us. The muscle and anatomy considerations built into the model and animations were used to improve their rigging pipeline.

See Judd Simantov’s presentation on character rigging and modeling in The Last of Us: here.

Variation

A player can be taken out of a game when they begin to notice the same animations showing up over and over again. With facial animation, two characters with vastly different personalities can end up using the same animation for “sad” or “angry”. Some teams take the time to craft these individual animations, and it pays off; Telltale Games and Naughty Dog are good examples.

However, when teams are smaller or there is a limit on the number of animations created for characters, especially custom-made player characters, repeated animations can take a player out of the experience. Examples include smaller games like Life Is Strange, as well as hugely scoped games with so many characters that sharing animations is a requirement, such as Fallout, The Witcher, or Dragon Age.

Procedural Traits

I want to simplify some of that animation work by combining it with procedurally driven animations that modify the character’s expression based on their emotional state and personality.
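As a rough sketch of the idea (the trait names and numbers here are placeholders, not a final design): a personality profile scales and biases the raw emotional state before it reaches the facial blendspaces, so two characters in the same emotional state still read differently.

```cpp
#include "CoreMinimal.h"

// Raw emotional state coming from gameplay, on a 0..1 scale per emotion.
struct FEmotionState
{
    float Happy    = 0.0f;
    float Sad      = 0.0f;
    float Angry    = 0.0f;
    float Surprise = 0.0f;
};

// Per-character personality traits used to modify that state.
struct FPersonalityProfile
{
    float Expressiveness = 1.0f; // how much the face shows overall
    float Cheerfulness   = 0.5f; // biases happy poses up or down
    float Irritability   = 0.5f; // biases angry poses up or down
};

// Produces the expression actually sent to the rig for this character.
FEmotionState ApplyPersonality(const FEmotionState& Raw, const FPersonalityProfile& P)
{
    FEmotionState Out;
    Out.Happy    = FMath::Clamp(Raw.Happy * P.Expressiveness * (0.5f + P.Cheerfulness), 0.0f, 1.0f);
    Out.Angry    = FMath::Clamp(Raw.Angry * P.Expressiveness * (0.5f + P.Irritability), 0.0f, 1.0f);
    Out.Sad      = FMath::Clamp(Raw.Sad * P.Expressiveness, 0.0f, 1.0f);
    Out.Surprise = FMath::Clamp(Raw.Surprise * P.Expressiveness, 0.0f, 1.0f);
    return Out;
}
```

The output of something like ApplyPersonality would then feed the target values on the anim instance, so the same “sad” input produces a restrained expression on a stoic character and a much bigger one on an expressive character.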

Project Goals

My first goal is to study facial expression and apply some subtlety to character faces by considering the character’s involuntary “microexpressions” along with their unique personality.

My second goal is to use a character trait system that will alter the expression the character is displaying.

Challenges

1. Aesthetics: Establishing a style between exaggerated and realistic in order to focus on the methods and avoid the uncanny valley.

2. Application: Studying and appropriately applying psychology and acting.

3. Technical: Avoiding a bloated, buggy, or slow rig; applying traits and emotional states in the engine.

4. Relevance: Convincing animators that I’m not attempting to make them obsolete, and making this approach viable in a production environment.

Links

Psychology

Kaszowska, Aleksandra. “Am I in Trouble? Interpreting Facial Expressions.” Emotion on the Brain. December 8, 2014. http://sites.tufts.edu/emotiononthebrain/2014/12/08/am-i-in-trouble-interpreting-facial-expressions/.

“Profile of a Psychologist: Paul Ekman.” Cognitive Consonance. http://cognitiveconsonance.info/2012/08/21/240/.

Plutchik. “The Nature of Emotions.” The Nature of Emotions. http://www.fractal.org/Bewustzijns-Besturings-Model/Nature-of-emotions.htm.

The Myers & Briggs Foundation. http://www.myersbriggs.org/.

Heffner, Christopher L., Dr. “Chapter 3: Section 2: Hans Eysenck’s Structure of Personality.” AllPsych. http://allpsych.com/personalitysynopsis/eysenck/.

Mapes, Diane. “How to Spot a Fake Smile: It’s All in the Eyes – NBC News.” NBC News. March 30, 2011. http://www.nbcnews.com/health/body-odd/how-spot-fake-smile-its-all-eyes-f1C9386917.

Ekman, Paul, Dr. “Parents’ Guide to Inside Out – Paul Ekman Group, LLC.” Paul Ekman Group LLC. 2015. http://www.paulekman.com/parentsguide/.

Character Rigging, Animation and Storytelling

Naughty Dog. “Making of Uncharted 4 Nathan Drake.” Computer Graphics Digital Art Community for Artist Job Tutorial Art Concept Art Portfolio. December 11, 2014. http://www.cgmeetup.net/home/making-of-uncharted-4-nathan-drake/.

Simantov, Judd. “Judd Simantov on Character Rigging and Modeling in Naughty Dog’s The Last of Us.” Game Character Academy. https://www.youtube.com/watch?v=myZcUvU8YWc.

Simantov, Judd. “Uncharted 2: Character Pipeline.” GDC Vault. http://www.gdcvault.com/play/1012552/Uncharted-2-Character-Pipeline-An.

Nelson, Paul. “Designing Branching Narrative.” The Story Element. 2015. https://thestoryelement.wordpress.com/2015/02/11/designing-branching-narrative/.

Croshaw, Ben (“Yahtzee”). “Making Faces – A BioWare Story.” The Escapist. December 16, 2014. http://www.escapistmagazine.com/articles/view/video-games/columns/extra-punctuation/12762-Improving-Bioware-s-Conversation-Animations-With-Mocap.

White, Olivia. “Firewatch Took Away Our Ability to Be Good People, and That’s Where It Shines.” Polygon. February 12, 2016. http://www.polygon.com/2016/2/12/10966494/firewatch-agency-campo-santo.

Procedural Content

Davidson, Kim. “Sponsored: Go Procedural – A Better Way to Make Better Games.” Gamasutra Article. http://www.gamasutra.com/view/news/233899/Sponsored_Go_Procedural__A_Better_Way_to_Make_Better_Games.php.

Hosking, Claire. “Opinion: Stop Dwelling on Graphics and Embrace Procedural Generation.” Polygon. 2013. http://www.polygon.com/2013/12/10/5192058/opinion-stop-dwelling-on-graphics-and-embrace-procedural-generation.

Moss, Richard. “7 Uses of Procedural Generation That All Developers Should Study.” Gamasutra Article. January 1, 2016. http://www.gamasutra.com/view/news/262869/7_uses_of_procedural_generation_that_all_developers_should_study.php.