
MOCAP data


Motion has been captured using the facilities at LCC. I have used both of the existing MOCAP systems: the IMU-based (inertial measurement) Rokoko suit, and the camera-based MOCAP, which uses body markers attached to a suit and tracked by cameras. This started as a practice-based, experimental approach to my thesis topic, in which I evaluate the animation fidelity of avatars in the virtual worlds of Social VR, but it also became an experimental tool for producing animation at the level of today's production studios.

There’s always a learning curve. It must be said, though, that this is a particularly hands-on experience due to the nature of the technology: even with a technician there to supervise you, there are many practical things you can only learn through your own experience, self-taught through an array of mistakes and wrong assumptions. So here is my very own list:

1. Do not stare at the PC. Your avatar on the screen will be catching all your attention, but you will end up with head data that looks constrained to that TV screen.

2. Start every take in a T-pose. This is very important for retargeting the data inside Maya's built-in character definition tool. If you don't have that T-pose as a reference, the retargeting will output very awkward results: a broken body, twisted and tangled limbs, or joints slightly misplaced.

3. Come to mocap with a clear idea in mind. For example, creating a game cycle animation means trying to match the starting pose with the ending pose, for a better result with less struggle recreating poses later on.

4. Be mindful of timing. Long recordings add up in data very quickly, which means Maya will take its sweet time importing and processing them, not to mention working in the Graph Editor on long footage captured at 120 FPS (see the rough estimate after this list).

5. Camera-based systems are the best mocap. They require more setup to start with, such as getting all the wearables on and getting the calibration process right; sometimes it must be repeated, so give yourself an hour on top of the anticipated time on the day of recording. This MOCAP is highly precise and accurate. That said, the DIY hand setup we used before the VICON gloves arrived was painful to work with, as sweaty hands caused all the taped-on finger markers to fall off.

6. IMU systems like Rokoko are easy and quick (as long as the designated router is working), but they are not precise and they drift over time. There is significant misalignment between the physical body and the avatar; it almost makes you feel you need to move very slowly to keep your avatar following along.
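
To put point 4 in numbers, here is a rough back-of-the-envelope estimate; the joint and channel counts are illustrative assumptions, not measured values from the Vicon system:

```python
# Rough estimate of raw keyframe volume for a single take.
# Joint and channel counts are illustrative assumptions, not Vicon specifics.
fps = 120        # capture rate used in the sessions
seconds = 120    # a two-minute take
joints = 60      # typical full-body skeleton (assumption)
channels = 6     # translate XYZ + rotate XYZ per joint

frames = fps * seconds               # 14,400 frames
keys = frames * joints * channels    # ~5.2 million keys for Maya to chew through
print(f"{frames:,} frames -> {keys:,} animation keys")
```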



Workflow for Cleaning and Preparing Mocap Data

SOFTWARE USED: VICON – MAYA – CASCADEUR – UNREAL

The following is a guide I’ve written based on the process I undertook: the full pipeline I use for processing motion-capture animation before bringing it into Unreal Engine. This workflow combines Maya, Mixamo, and Cascadeur to achieve clean, polished animation.


1. Prepare the Animation Environment in Maya

Set Maya’s default animation tangents to Spline (or Auto/Legacy Spline) so the curves are smooth from the beginning.
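
If you prefer to set this by script rather than through the Preferences window, here is a minimal sketch using Maya's Python commands (run it in the Script Editor):

```python
import maya.cmds as cmds

# Set the default in/out tangent type for new animation curves to Spline,
# so curves created during retargeting and baking come out smooth by default.
cmds.keyTangent(g=True, inTangentType='spline', outTangentType='spline')
```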


2. Import the Mixamo-Rigged Character

Import your Mixamo-rigged character into Maya.

If you have already created a HumanIK definition for this character, load the preset; otherwise, create a new Character Definition and save it for reuse.
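
The import itself can also be scripted; a minimal sketch, assuming a hypothetical file path and the standard fbxmaya plug-in:

```python
import maya.cmds as cmds

# Load the FBX plug-in if it isn't already available.
if not cmds.pluginInfo('fbxmaya', query=True, loaded=True):
    cmds.loadPlugin('fbxmaya')

# Import the Mixamo-rigged character; the path and namespace are placeholders.
cmds.file('C:/mocap/mixamo_character.fbx', i=True, type='FBX',
          ignoreVersion=True, namespace='character')
```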


3. Import the Mocap Skeleton

Bring in the raw mocap FBX.

  • Add a root joint and parent the mocap skeleton under it.
  • Rename its Character Definition (e.g., Driver).
  • Assign the appropriate joints in the HumanIK panel.

    Note: Vicon data is missing a root joint, which results in errors in Maya/Unreal. This is solved by creating a joint at the origin (0, 0, 0) and making it the parent of the rest of the skeleton.

Edit the namespace to add a ‘mocap’ prefix so that Maya recognises the mocap skeleton and HIK can be applied successfully.
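
A scripted sketch of both fixes (the root joint and the namespace). ‘Hips’ as the top mocap joint is an assumption; substitute your skeleton's actual top node:

```python
import maya.cmds as cmds

# 1) Create a root joint at the origin and parent the mocap skeleton under it.
cmds.select(clear=True)                 # so the new joint isn't auto-parented
root = cmds.joint(name='root', position=(0, 0, 0))
cmds.parent('Hips', root)               # 'Hips' = top joint of the mocap data (assumption)

# 2) Move every joint into a 'mocap' namespace so HIK can tell the skeletons apart.
if not cmds.namespace(exists=':mocap'):
    cmds.namespace(add='mocap')
descendants = cmds.listRelatives(root, allDescendents=True,
                                 type='joint', fullPath=True) or []
# Rename deepest joints first so parent paths stay valid while renaming.
for path in sorted(descendants, key=lambda p: p.count('|'), reverse=True):
    cmds.rename(path, 'mocap:' + path.split('|')[-1])
cmds.rename(root, 'mocap:root')
```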

4. Characterise the Mocap Skeleton (HumanIK)

Use HumanIK to create a proper rig definition:

  • Use frame 0 (or any frame where the character is in a T-pose); a quick scripted sanity check is sketched after this list.
  • Complete the HIK mapping for the mocap skeleton.
  • This becomes your Driver.
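
A hedged sanity check for that T-pose frame: it assumes the mocap skeleton's bind pose has near-zero joint rotations, which holds for many but not all rigs, so treat it as a heuristic.

```python
import maya.cmds as cmds

TPOSE_FRAME = 0     # the frame expected to hold the T-pose
TOLERANCE = 5.0     # degrees of rotation considered close enough to zero

cmds.currentTime(TPOSE_FRAME)
for joint in cmds.ls('mocap:*', type='joint'):
    rx, ry, rz = cmds.getAttr(joint + '.rotate')[0]
    if any(abs(r) > TOLERANCE for r in (rx, ry, rz)):
        print(f'{joint}: rotate = ({rx:.1f}, {ry:.1f}, {rz:.1f}) -- check this joint')
```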


5. Retarget the Mocap to the Character

Set the Source to the mocap rig (Driver) and the Character to your Mixamo rig (Driven).

Check joint alignment in the viewport to ensure the motion transfers correctly.


6. Baking Options

Depending on where you plan to clean the animation:

6a. Clean in Maya

Bake animation onto the Control Rig if you want to refine the movement using IK inside Maya.

6b. Clean in Cascadeur

Bake animation onto the skeleton if you plan to refine the movement in Cascadeur.
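
Baking can also be done by script. A minimal sketch for option 6b, with an assumed frame range and an assumed ‘character’ namespace for the Mixamo rig:

```python
import maya.cmds as cmds

# Bake the retargeted motion onto the character's joints (option 6b).
start, end = 0, 1200                             # frame range: an assumption, match your take
joints = cmds.ls('character:*', type='joint')    # Mixamo rig joints (namespace assumed)

cmds.bakeResults(joints,
                 time=(start, end),
                 simulation=True,                # evaluate per frame, needed for live retargets
                 sampleBy=1,
                 preserveOutsideKeys=True)
```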


7. Export the Animation (Spline Curves Recommended)

When exporting from Maya:

  • Ensure the animation curves use Spline tangents.
  • Optional: Apply Euler filtering and Butterworth smoothing to reduce mocap jitter and improve visual quality (a scripted version is sketched after this list).
  • Export only the animated skeleton.
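
The optional clean-up and the export can be scripted too. A sketch, assuming the ‘character’ namespace from earlier; the Butterworth flags follow recent Maya releases and may differ in older versions:

```python
import maya.cmds as cmds
import maya.mel as mel

joints = cmds.ls('character:*', type='joint')
curves = cmds.keyframe(joints, query=True, name=True) or []

# Euler filter: removes gimbal flips on rotation curves.
cmds.filterCurve(curves, filter='euler')

# Butterworth smoothing to knock down high-frequency mocap jitter.
cmds.filterCurve(curves, filter='butterworth', cutoffFrequency=7.0)

# Export only the animated skeleton; the output path is a placeholder.
cmds.select(joints)
mel.eval('FBXExportBakeComplexAnimation -v true;')
mel.eval('FBXExport -f "C:/mocap/clean_take.fbx" -s;')
```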

8. Transfer from Maya to Cascadeur

Use the Maya-to-Cascadeur Bridge (must be installed beforehand).

Send the characterised Mixamo rig to Cascadeur.
Make sure you select Mixamo Spine 2 as the chest controller, since Cascadeur’s Auto-Posing relies on that structure.


9. Import Animation into Cascadeur

Drag and drop the exported FBX directly into Cascadeur.

9a. Extract the Mocap Keyframes

  • Switch to Joints Mode.
  • Select all joints and all keys of the mocap animation.
  • Copy the full interval (Ctrl + Shift + C).

9b. Paste onto the Cascadeur Character

  • Open your Mixamo-rigged character scene in Cascadeur.
  • Switch to Joints Mode.
  • Select the character’s joints and timeline.
  • Paste the mocap interval onto the timeline.

Now you can use Cascadeur’s Auto-Posing, Physics Tools, and Tweaking workflow to produce high-quality, natural animation.




Resources:

Retargeting mocap in Maya with a Mixamo character:

https://www.youtube.com/watch?v=sfNlMUMdVyw


Cascadeur bridge with Maya:

https://www.youtube.com/watch?v=tiTGzay7Xto
https://github.com/Nathanieljla/cg3d-maya-casc
