Want to turn a simple video into a high-quality 3D animated MetaHuman? In this tutorial, we’ll show you how to use DeepMotion’s Animate 3D and Unreal Engine.
Introduction
If you’ve ever wanted to bring a MetaHuman to life without having to set up motion capture suits or complicated camera rigs, DeepMotion’s Animate 3D service provides an easy solution. By simply uploading a video of a person moving, Animate 3D uses AI to track and capture the motion data, which you can then import into Unreal Engine to drive a MetaHuman. In this tutorial, we’ll walk you through the entire process—from recording and uploading a video to retargeting the animation to a MetaHuman inside Unreal Engine.
1. Preparing Your Video for Animate 3D
Record Your Video
Use any recording device (smartphone, camera, etc.) to capture footage of a person performing the motion you want to replicate.
Keep the person in full view, ideally with minimal background distractions. Good lighting and clear contrast against the background make it easier for DeepMotion’s AI to track your subject.
Keep It Simple
Keep your subject centered and in frame whenever possible.
Don’t move the camera too much or too quickly; stable footage produces more accurate results.
Edit or Trim (Optional)
If necessary, trim the video to just the section of movement you need. This will help streamline the upload process and reduce potential errors.
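If you prefer to trim from the command line, ffmpeg can cut a clip without re-encoding. The sketch below only builds the argument list (the file names and timestamps are placeholders, not from this tutorial); note that with stream copy the cut snaps to the nearest keyframe, so the result may start slightly before your requested timestamp.

```python
def trim_command(src, dst, start, end):
    """Build an ffmpeg argument list that copies src between start and end
    into dst without re-encoding (-c copy is fast and keeps quality)."""
    return [
        "ffmpeg", "-i", src,
        "-ss", start,   # start timestamp (HH:MM:SS)
        "-to", end,     # end timestamp (HH:MM:SS)
        "-c", "copy",   # stream copy: no re-encode, keyframe-aligned cut
        dst,
    ]

# Placeholder file names; run the result with subprocess.run(cmd, check=True)
cmd = trim_command("raw_take.mp4", "trimmed_take.mp4", "00:00:05", "00:00:25")
print(" ".join(cmd))
```

If you need a frame-accurate cut rather than a fast one, drop `-c copy` and let ffmpeg re-encode.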
2. Uploading and Generating Animation with Animate 3D
In your Animate 3D dashboard, click the “Create” button.
Select your recorded video file and confirm. The system will upload and pre-process your video.
Adjust Settings
Skeleton/Character Type: Choose the skeleton type that best suits your project. For animating a MetaHuman, select the "Adult Female with Facial Rig (UE)" character, which works best for retargeting in UE.
Animation Settings: Turn on Hand Tracking and Face Tracking. Fine-tune other animation settings, such as the Foot Locking option, as needed.
Process and Download
Click the "Create Animation" button to submit your animation job, then wait for Animate 3D to finish processing.
Once complete, you can preview the motion capture result by clicking the "Preview" button.
Download the animation in FBX format (recommended for Unreal).
Upload a Video
Select a Character
Animation Settings
Download an FBX
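Animate 3D also offers a developer API for automating this submit-and-wait workflow; the actual endpoints and status values are documented by DeepMotion, so the sketch below shows only the generic polling pattern with an injected status function (the status strings are placeholders, not DeepMotion's real API values).

```python
import time

def wait_for_job(fetch_status, poll_interval=1.0, max_polls=600):
    """Call fetch_status() until it reports a terminal state.

    fetch_status is any zero-argument callable that returns the job's
    current status string, e.g. a wrapper around an HTTP status request.
    """
    for _ in range(max_polls):
        status = fetch_status()
        if status in ("SUCCESS", "FAILURE"):  # placeholder terminal states
            return status
        time.sleep(poll_interval)
    raise TimeoutError("animation job did not finish in time")

# Example with a fake status source standing in for a real API client:
statuses = iter(["PROGRESS", "PROGRESS", "SUCCESS"])
print(wait_for_job(lambda: next(statuses), poll_interval=0.0))
```

Injecting the status function keeps the loop independent of any particular HTTP client, so you can drop in whatever request code DeepMotion's API documentation prescribes.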
3. Importing Your Animation into Unreal Engine
Setting Up Your UE Project
Open Unreal Engine 5.5.2 or later and create or open a project.
Import the FBX File
Drag and drop the FBX file exported from DeepMotion into the Content Drawer.
Unreal will prompt you to choose import settings for the skeletal mesh.
Skeletal Mesh Settings: Make sure the "Import Morph Targets" option is checked.
Animation Settings: Check the "Import Animations" and "Snap to Closest Frame Boundary" options.
Click Import.
Morph Target
Frame Boundary
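If you import many DeepMotion clips, the same import can be scripted from the editor's Python console instead of clicking through the dialog. The sketch below is a best-effort guess at the relevant property names (`import_morph_targets`, `snap_to_closest_frame_boundary`); verify them against your engine version's Python API reference, and note that the `unreal` module only exists inside the Unreal Editor.

```python
# The three checkbox settings from the import dialog, mirrored as flags.
SETTINGS = {
    "import_morph_targets": True,            # "Import Morph Targets"
    "import_animations": True,               # "Import Animations"
    "snap_to_closest_frame_boundary": True,  # "Snap to Closest Frame Boundary"
}

def build_import_task(fbx_path, dest="/Game/DeepMotion"):
    """Hypothetical scripted FBX import; runs only inside the Unreal Editor."""
    import unreal  # available only in the editor's embedded Python

    options = unreal.FbxImportUI()
    options.import_mesh = True
    options.import_animations = SETTINGS["import_animations"]
    # Property names below are assumptions based on the dialog labels:
    options.skeletal_mesh_import_data.set_editor_property(
        "import_morph_targets", SETTINGS["import_morph_targets"])
    options.anim_sequence_import_data.set_editor_property(
        "snap_to_closest_frame_boundary",
        SETTINGS["snap_to_closest_frame_boundary"])

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = dest
    task.automated = True  # suppress the interactive import dialog
    task.options = options
    # To execute:
    # unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return task
```

Treat this as a starting point, not a drop-in script: FBX import property names have shifted between UE releases.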
4. Retargeting the Animation to a MetaHuman
Prepare Your MetaHuman
If you haven’t already, import or add your MetaHuman to the project. You can download MetaHumans via the Quixel Bridge plugin and add them directly to your Unreal project.
Go to the imported MetaHuman folder in the Content Drawer and open the MetaHuman Blueprint. Note the skeletal mesh asset name under the Body component of the MetaHuman; this is the name you will use as the target skeletal mesh in the next step.
Retarget Your Animation to MetaHuman
Go back to the Content Drawer, right-click the imported DeepMotion animation asset, and select "Retarget Animation" from the context menu.
Set the "Target Skeletal Mesh" to the MetaHuman skeletal mesh asset name you recorded earlier.
Double-click the animation track you want to retarget.
Click the "Export Animations" button to export the animation.
Open the MetaHuman Blueprint
Record the MetaHuman Mesh Name
Retarget Animation to MetaHuman
5. Attach and Preview the MetaHuman Animation
Select the "Add Level Sequence" menu item to add a Level Sequence for editing and previewing the animation.
Go to the Content Drawer, navigate to the MetaHuman folder, and drag the Blueprint into the Sequencer.
Remove the body and face control rigs (we are not editing the animation here), and rotate the character to face the light source for visibility.
Right-click the "Body" animation track, select "Animation" from the context menu, and enter the MetaHuman animation sequence name to attach the retargeted animation to the MetaHuman Blueprint.
Now you can select the entire range of the animation sequence in the timeline and press Play to review the body, face, and finger animation retargeted to the MetaHuman.
Drag the MetaHuman Blueprint into the Sequencer
Clean Up MetaHuman configuration
Attach Animation to MetaHuman
Play MetaHuman Animation
Here is a more detailed video tutorial of the process above.
Conclusion
Bringing a MetaHuman to life doesn’t have to require sophisticated motion capture setups. With DeepMotion’s Animate 3D website, you can turn a single video into a workable animation, then use Unreal Engine’s retargeting system to apply that animation to a MetaHuman. By combining the power of AI-driven tracking and UE’s robust animation tools, you can create lifelike, custom performances for your virtual characters quickly and efficiently.
Happy animating! If you found this tutorial helpful or if you have any tips of your own for refining Animate 3D motion captures, feel free to share them in the comments below.