I built a FACS-based blendshape facial rig for the orc head, designed with motion capture in mind. My pipeline started with sculpting the base head, followed by generating and transferring the necessary FACS blendshapes. From there, I built out the controllers and finalized the setup for compatibility with ARKit motion capture.
To ensure animation-ready topology, I started the sculpt in ZBrush using a 3DScanStore basemesh, followed by a texture pass in Substance Painter. Building on that specific foundation allowed me to use the 3DScanStore Multi-Expression basemesh to easily generate and extract the initial FACS shapes. However, translating human expressions 1:1 onto an orc requires a lot of manual refinement. I spent significant time sculpting custom correctives, especially around the tusks, to fix broken anatomy from the raw transfers and ensure the deformations felt natural and alive.
The controller setup was fairly straightforward. I used multiplyDivide nodes to directly convert control translation and rotation values into blendshape weights. To handle opposing movements on a single control, like a brow moving up or down, I wired in condition nodes. These automatically check whether the control's value is positive or negative and seamlessly trigger the correct blendshape.
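The multiplyDivide/condition wiring can be sketched in plain Python. This is a minimal illustration of the logic, not the actual node network; the attribute names, blendshape names, and 0-10 control range are assumptions:

```python
def brow_weights(control_ty, control_range=10.0):
    """Mimic a multiplyDivide + condition node pair: positive translateY
    drives the 'browUp' shape, negative drives 'browDown'."""
    value = control_ty / control_range  # multiplyDivide: normalize to 0..1
    if value >= 0:                      # condition node: firstTerm >= 0
        return {"browUp": min(value, 1.0), "browDown": 0.0}
    return {"browUp": 0.0, "browDown": min(-value, 1.0)}
```

A single control attribute thus fans out into two mutually exclusive blendshape weights, which is exactly what the condition node guarantees in the rig.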
Translating actual performance data onto the orc was surprisingly straightforward. I recorded facial mocap on an iPad, exported the animation as an FBX, and built a transfer scene to act as a bridge. By bringing in a neutral ARKit head to read the incoming data, I could wire those outputs directly into the orc's controllers. It is a simple setup, but it works.
If I were to take this project further, I would want to push it both artistically and technically. Artistically, the character could use a final polish pass, adding a proper groom, refining the textures, and improving the blendshapes for better deformations. Technically, the ultimate goal would be getting the rig fully playable in Unreal Engine. To achieve that, I would explore using a skinning decomposition algorithm to convert the current blendshape setup into a game-ready joint rig. Finally, I would look into writing custom tools to completely automate the facial rigging and mocap transfer pipelines.
I began developing this auto-rigger on my first day in the Technical Art program at TGA. My goal was to create a simple, intuitive tool capable of rigging any humanoid or creature in hours instead of weeks. Because game assets constantly evolve, the script is designed for rig iteration while the character model is still a work in progress. To round out the toolset, I integrated a custom "keying sets" feature to speed up animation, alongside dedicated functions for the painless export of skeletal meshes and animations.
At the core of the auto-rigger is a modular guide joint system. You simply select from a list of limb types, align them to the character's proportions, and parent the guides together to intuitively map out the final skeleton hierarchy. To speed up the workflow, I wrote a custom mirroring function that handles multiple complex selections simultaneously. It automatically updates all naming conventions and perfectly preserves your custom hierarchy across the axis.
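The naming half of the mirroring function can be sketched like this. The `L_`/`R_` prefix convention is an assumption for illustration; the real script also handles selection sets and hierarchy, which are omitted here:

```python
def mirror_name(name):
    """Swap side prefixes so a mirrored guide gets the correct name
    (e.g. 'L_arm_guide' becomes 'R_arm_guide')."""
    if name.startswith("L_"):
        return "R_" + name[2:]
    if name.startswith("R_"):
        return "L_" + name[2:]
    return name  # center guides keep their name unchanged
```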
Generating the final rig involves significant under-the-hood processing. It starts with strict validation checks, flagging missing suffixes, duplicate names, or version errors to ensure pipeline stability. Once validated, the script generates deformation joints directly from your guides, automatically calculating and inserting twist joints. Finally, it builds the complete control rig. Limbs are outfitted with IK/FK switches, soft IK, automatic pole vectors, and foot roll setups. To support diverse character types, I also included specialized modules like an IK spline spine, FK chains, and a dedicated Arachnid Leg module.
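A simplified sketch of the validation pass, showing the suffix and duplicate checks (the `_guide` suffix and error strings are illustrative; the actual script also runs version checks):

```python
def validate_guides(guide_names, required_suffix="_guide"):
    """Collect all pipeline errors before rig generation so the user
    gets one complete report instead of failing mid-build."""
    errors = []
    seen = set()
    for name in guide_names:
        if not name.endswith(required_suffix):
            errors.append(f"Missing suffix: {name}")
        if name in seen:
            errors.append(f"Duplicate name: {name}")
        seen.add(name)
    return errors
```

Collecting every error up front, rather than aborting on the first one, keeps iteration fast when a scene has several naming issues at once.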
Once the initial rig is generated, the script allows for seamless, on-the-fly iteration to accommodate evolving character designs. You can dive back into a completed rig, reposition guide joints to adjust proportions, and propagate those updates across the setup while perfectly preserving your existing skinning. The system is also highly modular. If art direction changes, you can instantly bolt on new parts, like a tail or weapons, or even swap out entire modules, such as replacing standard legs with arachnid limbs, all without breaking the rest of the character.
One feature I’m particularly proud of is the Soft IK setup, which solves the common animation headache of jarring "IK pops" when a limb fully extends. Under the hood, the script builds a system of hidden locators to intercept the animator's main control. It continuously measures the limb's extension, and just before the joints lock into a perfectly straight line, a custom MEL expression applies an exponential math curve. This gently slows down the output movement, creating a smooth, natural stretch instead of a rigid mechanical snap. It’s a complex setup behind the scenes, but for the animator, it just means they can push extreme poses freely without ever fighting the rig.
I also built an Automatic Pole Vector (Auto PV) system so animators don't have to constantly babysit knee and elbow controllers. Behind the scenes, the script uses hidden locators and math nodes to track the limb's position. It then adds custom sliders directly to the rig, letting animators seamlessly blend whether a joint automatically follows the root, the foot, or stays completely manual. For the animator, limbs naturally orient themselves as the character moves, drastically speeding up the workflow without sacrificing control.
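The core vector math for placing an automatic pole vector target can be sketched like this: project the mid joint onto the root-to-end line, then push outward along the limb's bend plane. The `offset` distance is an illustrative parameter; the rig does this with hidden locators and math nodes rather than a script at runtime:

```python
def pole_vector_position(root, mid, end, offset=2.0):
    """Return a pole target position that sits on the limb's bend plane,
    offset outward from the mid joint (elbow/knee)."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    add = lambda a, b: tuple(x + y for x, y in zip(a, b))
    scale = lambda a, s: tuple(x * s for x in a)
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    line = sub(end, root)
    to_mid = sub(mid, root)
    t = dot(to_mid, line) / dot(line, line)   # parametric projection
    proj = add(root, scale(line, t))          # closest point on root->end
    direction = sub(mid, proj)                # points out along the bend
    length = dot(direction, direction) ** 0.5
    return add(mid, scale(direction, offset / length))
```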
Finally, I built dedicated export functions to automate the messy, destructive process of getting a character into a game engine. Behind the scenes, the script imports references, extracts the mesh and skeleton from the complex rig hierarchy, and bakes all animation directly to the deformation joints. It then cleans the scene to output a perfectly lightweight FBX containing only the baked joints and skinned geometry. Once saved, the tool instantly reverts your Maya scene back to its original working state so you can continue animating without interruption.

I plan to continuously develop this tool, starting with upgrading the rebuild function to handle joint renaming and deletion, rather than just additive changes. From there, I want to expand the module library to include quadruped setups and refine the soft IK math to function perfectly with global rig scaling. There are always new features to add and systems to optimize, but I am incredibly proud of the robust foundation I’ve built since day one.
My main goal for this project was to prepare a character for procedural animation while stress-testing my custom rigging script on a complex humanoid/arachnid hybrid. Alongside building out the core multi-limbed rig, I also focused on creating a solid corrective joint setup for her upper body to keep the deformations looking natural.
I started with a high-poly sculpt by Marcus Whinney and used ZBrush's ZRemesher to quickly generate workable topology, since manual retopology wasn't the main focus of this project. After a quick UV pass and some basic texturing in Substance Painter, I ran the character through my auto-rigger. The resulting skeleton includes all the standard features from my script, like IK/FK switches, twist joints, soft IK, and automatic pole vectors.
A fun challenge was setting up the corrective joints. I used Unreal's Pose Wrangler add-on to drive the correctives with Radial Basis Functions (RBF) instead of traditional set driven keys. This approach gave me great results and made exporting the corrective drivers straight into Unreal Engine incredibly easy.
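The idea behind RBF-driven correctives can be sketched with a simple normalized Gaussian kernel. This is a simplified illustration of the interpolation principle, not Pose Wrangler's actual solver (which solves a linear system over the sample poses); the `sigma` parameter and pose representation are assumptions:

```python
import math

def rbf_weights(pose, sample_poses, sigma=1.0):
    """Weight each stored corrective pose by its distance to the current
    driver pose using a Gaussian kernel, then normalize to sum to 1."""
    def kernel(a, b):
        dist_sq = sum((x - y) ** 2 for x, y in zip(a, b))
        return math.exp(-dist_sq / (2.0 * sigma ** 2))
    raw = [kernel(pose, s) for s in sample_poses]
    total = sum(raw)
    return [w / total for w in raw]
```

Unlike set driven keys, which interpolate along one driver axis at a time, an RBF blend like this reads the whole pose at once, which is why it handles compound shoulder rotations so much more gracefully.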
If I were to take this project further, there are a few key areas I would love to improve. On the art side, I’d like to remodel her face to support a full facial rig, push the textures to a higher standard, and do a proper manual retopology pass for better performance and cleaner deformations. On the rigging side, my main goal would be swapping her leg IK setup to use a spring solver instead of a standard rotate plane solver for more natural multi-joint movement.
The first part of this setup is a dynamic look-at system driven by aim constraints. Instead of relying on a traditional blendspace with baked poses, I built a hierarchy of target controllers for the head and torso. The head utilizes spring interpolation for smooth, natural transitions, while the chest controller features a built-in offset to generate organic secondary motion for free. This approach is not only much faster to set up, but it also creates a highly responsive, better-looking result.
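The spring interpolation used for the head target can be sketched as a damped spring integrated per frame. The stiffness and damping values are illustrative, and Control Rig's built-in spring interpolate node handles this internally, but the behavior is the same: slight overshoot, then settle:

```python
def spring_step(current, velocity, target, dt, stiffness=80.0, damping=12.0):
    """One Euler step of a damped spring toward the target. Underdamped
    values give the slight overshoot that reads as organic follow-through."""
    accel = stiffness * (target - current) - damping * velocity
    velocity += accel * dt
    current += velocity * dt
    return current, velocity
```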
The most exciting part of this setup is the procedural locomotion, driven primarily by Full Body IK and Control Rig physics. The stepping logic continuously checks the cycle time and the distance of each leg from its optimal default pose, automatically triggering a step transition when a specific threshold is met. The pelvis movement is highly dynamic: its horizontal position averages the placement of all the legs, while its translation and rotation are driven by velocity-scaled sine waves. To ground the movement, the tail uses spring interpolation for a natural offset, and the torso counter-rotates against the pelvis.
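The step-trigger condition can be sketched in a few lines. The threshold values and the beat check are illustrative stand-ins for the Control Rig graph logic, which evaluates these conditions per tick:

```python
def should_step(foot_pos, ideal_pos, cycle_time, step_interval=0.5, max_drift=30.0):
    """Trigger a step when the leg has drifted too far from its ideal
    stance position and its slot in the step cycle has come up."""
    drift = sum((a - b) ** 2 for a, b in zip(foot_pos, ideal_pos)) ** 0.5
    on_beat = (cycle_time % step_interval) < 1e-3
    return drift > max_drift and on_beat
```

Gating steps on both drift and cycle timing is what keeps the gait looking deliberate: legs only reposition when they actually need to, and never all at once.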
With my previous projects wrapped up, I dove straight into my next technical challenge: rigging "Achilles," a character model by Arcade Studio. For this project, my primary focus was simulated secondary animation and optimizing my skinning workflow by using proxy meshes.
On the left is the final deformation rig, and on the right are the proxy meshes used for skinning. To keep the pipeline efficient, the skirt mesh also doubles as a cloth simulation proxy in Unreal Engine. Because my weighting workflow relies on Delta Mush and skinning decomposition, I had to create an adjusted, dedicated joint hierarchy for the proxy. The Delta Mush solver distributes weights without considering initial influences, so stripping out non-deforming joints beforehand prevents unwanted skinning data. For secondary motion in-engine, the helmet and hair utilize joint dynamics, while the skirt is driven by cloth simulation.
Hi, I’m Nathaniel, a Technical Animator currently studying Technical Art at The Game Assembly.

My background spans Game Art at Future Games and Game Animation at TGA, giving me a deep understanding of the entire character pipeline, from how a model is built to how it moves and functions in-engine. I thrive on bridging the gap between art and code. My day-to-day usually involves writing Python tools to streamline workflows, building robust rigs that animators actually enjoy using, and setting up runtime IK and animation logic.

When I’m not staring at nodes, you can usually find me painting Warhammer miniatures, diving into the lore, or relaxing at home with my family and our dogs.
This showreel highlights my work as a Creature Animator on Mortal Online 2 as well as some personal work. These pieces were developed during my internship at Star Vault as the final chapter of my Game Animation studies at The Game Assembly.
This showreel highlights my work as an animation student at The Game Assembly, focusing on Technical Animation.