Hello. I have an updated emotion test, along with some in-engine work.
First, here is my updated emotion test in Maya. I currently have placeholder hair that I plan to replace later, and there are no wrinkle normal maps or other shader features in this video. This is just the Maya test. My goal in the video below was to push the rig further than the older test. I've now got corrective blend shapes, which help bring more life to the face and give the lips more deformation to better simulate real lips in motion. I also changed how the eyes are focused, adding an option to follow either the head movement or the neck movement, as well as control over the distance the eyes are focusing at. I'll have another post later showing the difference it makes.
In the engine, I've been redesigning the way some of the blueprints work. This interface is just a test to get all of the individual animation pieces controlled properly. I'm currently designing another GUI for controlling them, but I need to learn a little more about Unreal's UMG before I show it.
After sending the animations to Unreal (I'll have a post about that later), I separate out parts of the head in the character's animation blueprint and combine them back together with a Layered Blend Per Bone node. Currently, I have the mouth and brow sections of the face separated out.
All of the animations are also put together with a blendspace. Here is an example of a Happy/Surprise blendspace.
In the shader for the character's skin, I have scalar parameters that drive additional normal maps, which get blended with the default normal map for brow, eye, and mouth-line wrinkles.
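The scalar-driven blend can be sketched as a weighted mix of the two tangent-space normals followed by renormalization. This is a simplified stand-in for the material graph (real setups often use fancier blends like reoriented normal mapping); the function name and the plain lerp are my own choices for the sketch.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Blend a wrinkle detail normal onto the default tangent-space normal,
// weighted by a scalar parameter (0 = no wrinkle, 1 = full wrinkle),
// then renormalize so the result is still a unit normal.
Vec3 BlendWrinkleNormal(Vec3 base, Vec3 wrinkle, float weight) {
    Vec3 n{ base.x + (wrinkle.x - base.x) * weight,
            base.y + (wrinkle.y - base.y) * weight,
            base.z + (wrinkle.z - base.z) * weight };
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return { n.x / len, n.y / len, n.z / len };
}
```

With the scalar at 0 you get the default normal back untouched; as an animation raises the brow, the driving variable pushes the weight toward 1 and the wrinkle map takes over.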
Variables then drive each animation. This is my test GUI for getting all of the animations into the engine and testing all of the combinations.
In addition to the Paul Ekman research (Unmasking the Face by Paul Ekman and Wallace V. Friesen), I've also been using Anatomy of Facial Expression by Uldis Zarins and Sandis Kondrats as a reference.