Epic Games releases MetaHuman Animator
Originally posted on 23 March 2023 for the preview, and updated with details of the final release.
Epic Games has released MetaHuman Animator, its much-anticipated facial animation and performance capture toolset for its MetaHuman framework.
The system streamlines the process of transferring the facial performance of an actor from footage captured on an iPhone or helmet-mounted camera to a real-time MetaHuman character inside Unreal Engine.
Epic claims that it will “produce the quality of facial animation required by AAA game developers and Hollywood filmmakers, while at the same time being accessible to indie studios and even hobbyists”.
The toolset was announced during Epic Games’ State of Unreal keynote at GDC 2023 earlier this year, and is now available in the latest version of its free MetaHuman plugin for Unreal Engine.
Part of Epic Games’ framework for creating next-gen digital humans for games and animation
MetaHuman Animator is the latest part of Epic Games’ MetaHuman framework for creating next-gen 3D characters for use in games and real-time applications – and also, increasingly, in offline animation.
The first part, cloud-based character-creation tool MetaHuman Creator, which enables users to design realistic digital humans by customising preset 3D characters, was released in early access in 2021.
Users can generate new characters by blending between presets, then adjusting the proportions of the face by hand, and customising readymade hairstyles and clothing.
Generates a MetaHuman character matching video footage of an actor
MetaHuman characters have facial rigs, so they already supported facial motion capture, but transferring that motion from video footage of an actor with different facial proportions required manual finessing.
MetaHuman Animator is intended to streamline that retargeting process: a workflow that Epic Games calls Footage to MetaHuman.
As with the existing Mesh to MetaHuman workflow, it generates a MetaHuman matching source data: in this case, video footage of an actor, plus supporting depth data – about which, more later.
A ‘teeth pose’: one of the standard reference frames the MetaHuman Animator toolset uses to generate a MetaHuman character matching an actor’s facial proportions from video footage of that actor.
Works from one to four reference frames of an actor’s face
The process begins by ingesting the footage into Unreal Engine, and identifying key reference frames from which the MetaHuman plugin can perform a solve.
On footage captured with a professional camera, only a single frame is necessary: a frontal view of the actor with a neutral facial expression.
With iPhone footage, Epic recommends also identifying left and right views of the actor’s face to improve the quality of the solve.
A further reference frame showing the actor’s exposed teeth improves the quality of mouth animations.
The MetaHuman plugin then solves the footage to conform a template mesh – a MetaHuman head – to the data. Users can wipe between the 3D head and the source image to check the solve.
The template mesh is then used to generate an asset that can be used for animation.
Processing is done in the cloud – the only part of the workflow that doesn’t run locally – and the resulting MetaHuman is downloaded to Unreal Engine via Epic’s Quixel Bridge plugin.
Extract facial motion from video and apply it to a MetaHuman
The result is a MetaHuman rig calibrated to the actor’s facial proportions.
The MetaHuman plugin can then extract facial motion from video footage of that actor and transfer it to the 3D character, with the user able to preview the result in the viewport.
The animation can then be exported to Unreal Engine as a Level Sequence or an animation sequence.
Exporting as an animation sequence makes it possible to transfer the facial animation seamlessly to other MetaHumans, meaning that the actor’s performance can be used to drive any MetaHuman character.
Other benefits of the workflow
The control curves generated by the process are “semantically correct” – that is, structured in the same way as they would be if created by a human animator – making the animation easier to edit.
MetaHuman Animator also supports timecode, making it possible to sync the facial animation with full-body motion-capture; and can use the audio from the facial recording to generate tongue animation.
Works with anything from iPhones to pro helmet-mounted cameras
MetaHuman Animator is also designed to work with a full spectrum of facial camera systems.
For indie artists, that includes footage streamed from an iPhone using Epic’s free Live Link Face app.
Live Link Face 1.3, released alongside MetaHuman Animator, updates the app to enable it to capture raw video footage and the accompanying depth data required, the latter via the iPhone’s TrueDepth camera.
Larger studios can use standard helmet-mounted cameras: MetaHuman Animator works with “any professional vertical stereo HMC capture solution”, including those from ILM’s Technoprops division.
Price, release date and system requirements
The MetaHuman Animator toolset is part of Epic Games’ free MetaHuman plugin. The plugin is compatible with Unreal Engine 5.0+, but MetaHuman Animator requires Unreal Engine 5.2+.
Live Link Face is available free for iOS 16.0 and above. To use it with MetaHuman Animator, you will need version 1.3+ of the app and an iPhone 12 or later.
MetaHuman Creator is available in early access. It runs in the cloud, and is compatible with the Chrome, Edge, Firefox and Safari browsers, running on Windows or macOS. It is free for use with Unreal Engine.
Use of the Unreal Engine editor itself is free, as is rendering non-interactive content. For game developers, Epic takes 5% of gross beyond the first $1 million earned over a product’s lifetime.
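The royalty terms above can be sketched as a short calculation. This is an illustrative helper, not an official Epic Games tool; `unreal_royalty` and its parameters are hypothetical names for the 5%-beyond-$1M rule described above:

```python
def unreal_royalty(lifetime_gross: float,
                   exemption: float = 1_000_000.0,
                   rate: float = 0.05) -> float:
    """Royalty owed under the terms described above:
    5% of lifetime gross revenue beyond the first $1 million."""
    # No royalty is due until the product's lifetime gross exceeds the exemption.
    return max(0.0, lifetime_gross - exemption) * rate

# A game grossing $3M over its lifetime pays 5% of the $2M above the exemption:
print(unreal_royalty(3_000_000))  # → 100000.0
# A game grossing $500k pays nothing:
print(unreal_royalty(500_000))    # → 0.0
```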