Scripts

Hello again. I just wanted to make a brief post on some of the scripts I’ve been making to aid my thesis project.

The first script is very simple. I have a control rig and a skin rig. To save time, I store the skin joints in one variable and the control joints in another, and then each skin joint is parent constrained to its matching control joint. This is helpful when I have many joints I need to parent constrain while making changes to my rig hierarchy.
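As a rough illustration (the joint name patterns here are placeholders, not my actual naming), the core of the script is just a loop of parent constraints in Maya Python:

import maya.cmds as cmds

# Placeholder name patterns; in the real script these lists come from selections.
skin_joints = cmds.ls('skin_*_jnt', type='joint')
control_joints = cmds.ls('ctrl_*_jnt', type='joint')

# Parent constrain each skin joint to its matching control joint.
for skin_jnt, ctrl_jnt in zip(skin_joints, control_joints):
    cmds.parentConstraint(ctrl_jnt, skin_jnt, maintainOffset=True)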

The next is actually just a few expressions I use for controlling blend shapes and the external face control rig. I made the face control expression so that I could easily drive a single joint with multiple controls. The blend shape expression was made because I’ve had to export and re-import my blend shapes a few times, and I didn’t want to re-import and re-key every blend shape individually. Also, when I bring in a new character, it will be easy to transfer the blend shapes over to the new geometry. I currently have 26 blend shapes, so this has saved a lot of time, and it’s much easier to debug.
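As a sketch of the idea (the node and attribute names below are made up for the example), a Maya expression lets several controls add up into a single joint, and the blend shape version does the same thing for blendShape target weights so the hookup can be rebuilt after a re-import:

import maya.cmds as cmds

# Hypothetical example: two controls both contribute to one mouth-corner joint.
cmds.expression(
    name='mouthCorner_L_expr',
    string='mouthCorner_L_jnt.translateY = smile_ctrl.translateY * 0.5 + jaw_ctrl.translateY * 0.25;'
)

# The blend shape expressions look similar, driving a blendShape target weight
# from a control attribute instead of keying the weight directly.
cmds.expression(
    name='smile_shape_expr',
    string='face_blendShapes.smile = face_ctrl.smile;'
)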

The last script is the animation baking script. Before a rig gets sent to Unreal, I need to bake the joints and blend shapes, then strip out the control rig, which won’t be used anymore in Unreal. Selecting all of the joints, baking them, and so on took a while. So this script lets me select the chosen bones and blend shapes, the chosen meshes to export, and the “junk”. It bakes the bones and blend shapes, deletes the “junk”, then exports the mesh to a chosen folder after prompting for a name for the FBX file. There’s also an option to re-open the file with all of its joints returned to a pre-bake state so that I can send out a different animation from the same file.
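A stripped-down sketch of the bake-and-export flow, assuming the bones, blend shape nodes, meshes, and “junk” have already been collected (all names and the FBX path are placeholders):

import maya.cmds as cmds

def bake_and_export(bones, blend_shape_nodes, meshes, junk, fbx_path):
    # Bake the skin joints and blend shape weights over the current timeline.
    start = cmds.playbackOptions(query=True, minTime=True)
    end = cmds.playbackOptions(query=True, maxTime=True)
    cmds.bakeResults(bones + blend_shape_nodes, time=(start, end),
                     simulation=True, preserveOutsideKeys=True)

    # Strip out the control rig and anything else Unreal won't need.
    cmds.delete(junk)

    # Export the chosen meshes and the baked skeleton as an FBX
    # (requires the FBX plugin to be loaded).
    cmds.select(meshes + bones, replace=True)
    cmds.file(fbx_path, force=True, type='FBX export', exportSelected=True)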

Eye Shader

Hello again. I also have an update on the shaders, particularly for the eyes. I couldn’t get all of the features that Unreal’s latest eye shader has because I wanted to use my own geometry, but I found a few workarounds so that the shader would at least work for the project.

eye_turn.gif
eye_light.gif
eye_pupil_distance.gif

I might include pupil dilation, but it’s a little buggy at the moment. It also does not react to the light automatically; for now I’m modifying a parameter in the shader by hand.

Updated Emotions Test and some Unreal Engine Work

 

Hello. I have an updated emotion test, along with some in-engine work.

romtest3.gif

First, here is my updated emotion test in Maya. I currently have placeholder hair that I plan to replace later. There are also no wrinkle normal maps or other shader features in this video; this is just the Maya test. My goal in the video below was to push the rig further than the older test. I’ve now got corrective blend shapes, which help bring more life to the face and give the lips more deformation so they better simulate real lips as they move. I also changed how the eyes are focused, providing an option to follow either the head movement or the neck movement, as well as an option to change the distance at which the eyes focus. I’ll have another post later showing the difference it makes.
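The eye focus setup is roughly this idea (a simplified sketch, not the exact rig, and all names here are placeholders): both eyes aim at a shared locator whose parent constraint weights act as the head/neck follow switch, and whose position sets the focus distance.

import maya.cmds as cmds

# A single locator that both eyes aim at.
focus_loc = cmds.spaceLocator(name='eye_focus_loc')[0]

# Constrain it to both the head and neck joints; the constraint weights
# become the "follow head" / "follow neck" switch.
constraint = cmds.parentConstraint('head_jnt', 'neck_jnt', focus_loc,
                                   maintainOffset=True)[0]
cmds.setAttr(constraint + '.head_jntW0', 1)
cmds.setAttr(constraint + '.neck_jntW1', 0)

# Each eye joint aims at the locator; moving the locator closer or farther
# changes the distance the eyes converge on.
for eye_jnt in ('eye_L_jnt', 'eye_R_jnt'):
    cmds.aimConstraint(focus_loc, eye_jnt, aimVector=(0, 0, 1),
                       upVector=(0, 1, 0), worldUpType='scene')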

In the engine, I’ve been re-designing the way some of the blueprints work. This interface is just a test to get all of the individual animation pieces controlled properly. I’m currently designing another GUI for controlling that, but I need to learn a little more about Unreal’s UMG before I show it.

After sending the animations to Unreal (I will have a post about that later), in the character’s animation blueprint I separate out parts of the head and combine them back together with a blend per bone node. Currently, I have the mouth and brow sections of the face separated out.

anim_blueprint.PNG

All of the animations are also put together with a blendspace. Here is an example of a Happy/Surprise blendspace.

blendspace.PNG
blendspace2.PNG
blendspace3.PNG
blendspace4.PNG

In the shader for the character’s skin, I have scalar parameters that drive additional normal maps, which get blended with the default normal map for brow, eye, and mouth-line wrinkles.
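Per pixel, the material is doing something like the following blend (shown here as a small Python/NumPy sketch of the math rather than the actual material graph): each scalar parameter lerps from the base normal map toward the wrinkle normal map.

import numpy as np

def blend_wrinkle_normals(base_normal, wrinkle_normal, strength):
    # base_normal / wrinkle_normal: arrays of shape (..., 3), already decoded
    # from the 0..1 texture range to -1..1 tangent-space vectors.
    # strength: the scalar parameter (0 = no wrinkles, 1 = full wrinkles).
    blended = base_normal * (1.0 - strength) + wrinkle_normal * strength
    return blended / np.linalg.norm(blended, axis=-1, keepdims=True)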

 

wrinklemaps.PNG

Normal map blending for the wrinkles in the face.

 

HighresScreenshot00016.png

No wrinkles on a neutral face, compared with wrinkles on a worried face.

Variables then drive each animation. This is my test GUI for getting all of the animations into the engine and testing all of the combinations.

BugScreenShot_00000

Neutral face.

BugScreenShot_00001

A smile.

BugScreenShot_00002

The same smile with the eye wrinkles.

 

BugScreenShot_00003

Disgust and anger.

BugScreenShot_00006

Disgust.

BugScreenShot_00004

Fear.

BugScreenShot_00005

Sadness.

 

In addition to the Paul Ekman research (Unmasking the Face by Paul Ekman and Wallace V. Friesen), I’ve also been using Anatomy of Facial Expression by Uldis Zarins and Sandis Kondrats as a reference.

 

Test Emotions

The main skinning and head controls are complete. I’ve started on some corrective blend shapes and wrinkle maps in ZBrush, as well as getting some of this into Unreal. My next step is to create a GUI in Unreal that allows you to control the various emotional states.

A range-of-motion test in Maya for the first head model.

I also did research into the Facial Action Coding System, reading Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues by Paul Ekman. I split the animation segments of my character into brow, eyes, nose, and mouth, similar to how the book focuses on individual facial features. I was also able to make a few blends between emotions.

 

emotions.png

Emotion “blends” using different facial pose combinations.

I also made wrinkle maps in ZBrush based on the most extreme poses. I then imported the low-poly versions of these models into Maya, where I used them to aid some areas of the face rig that weren’t able to reach the correct shape. For instance, the eyebrows have a corrective blend shape that helps bulge the inner brow when it is lowered to an extreme.
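As an illustration of the hookup (the node and attribute names are placeholders, and the actual rig may drive this differently), a corrective target like this can be keyed against the brow joint’s rotation with driven keys:

import maya.cmds as cmds

# Corrective target weight is 0 with the brow at rest...
cmds.setDrivenKeyframe('head_blendShapes.browCorrective',
                       currentDriver='brow_inner_L_jnt.rotateZ',
                       driverValue=0, value=0)
# ...and ramps up to 1 as the inner brow is lowered to its extreme.
cmds.setDrivenKeyframe('head_blendShapes.browCorrective',
                       currentDriver='brow_inner_L_jnt.rotateZ',
                       driverValue=-30, value=1)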

 

browcorrective.jpg

An example of one of the corrective blend shapes used.

 

head_wrinkles.png

Wrinkle maps in ZBrush.

My next update will include some Unreal Engine work.

Minor Update

 

 

rig2

Character Rig Controls

 

I’m just making a quick post to say I’m still working on my thesis. My first character’s texturing is almost done. I’ll include finished concept art and rigging/skinning updates next.

 

Lia

The character rendered in Substance Painter’s Iray.

 

 

screenshot001.png

The eye texture rendered in Marmoset 2.