Thursday, July 9th, 2020 Posted by Jim Thacker

Epic Games releases free iOS app Live Link Face for UE4


Epic Games has released Live Link Face, a free app that streams facial animation data, captured from a live actor via the TrueDepth camera in modern iPhones, to characters in Unreal Engine.

As well as facial expressions, the app can capture head and neck rotation data, and comes with a range of professional production features, including Tentacle Sync integration and support for Open Sound Control.

A powerful app aimed at large virtual productions as well as individual animators
Live Link Face isn’t the first tool we’ve seen to take advantage of the depth-sensing capabilities of modern iPhones to capture facial animation data: Rokoko and Maxon have both released their own free apps.

However, it is probably the most fully featured, with functionality aimed both at online streamers and individual animators looking to capture facial reference data, and at big virtual productions.

Stream facial animation directly from an iPhone to Unreal Engine
For the first group of users, Live Link Face streams facial animation data from the live actor to a 3D character in Unreal Engine, for use in real-time performance, or to capture reference data that can be refined manually.

The data can be streamed over a wireless network, or by connecting the iPhone to a workstation via Ethernet; and the 3D avatar can be overlaid on the phone's video display to preview the performance.

As with other apps of this type, it’s based on Apple’s ARKit augmented reality framework, so the 3D character needs the standard set of facial blendshapes specified by the ARKit documentation.
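As a rough illustration, the sketch below checks a character's morph-target list against a handful of the ARKit blendshape names. It is only a sketch: the names shown are a small subset of the full set defined in Apple's ARKit documentation, and the example character rig is hypothetical.

```python
# Minimal sketch: verifying that a character rig exposes morph targets for the
# ARKit blendshape names. REQUIRED_SUBSET lists only a handful of the full
# ARKit set for illustration; character_morph_targets is a hypothetical rig.

REQUIRED_SUBSET = [
    "eyeBlinkLeft", "eyeBlinkRight", "jawOpen", "mouthSmileLeft",
    "mouthSmileRight", "browInnerUp", "cheekPuff", "mouthFunnel",
]

character_morph_targets = {
    "eyeBlinkLeft", "eyeBlinkRight", "jawOpen", "mouthSmileLeft",
    "mouthSmileRight", "browInnerUp", "cheekPuff",  # "mouthFunnel" missing
}

missing = [name for name in REQUIRED_SUBSET if name not in character_morph_targets]
if missing:
    print(f"Character is missing blendshapes: {missing}")
else:
    print("Character exposes every blendshape in the subset checked")
```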

According to Epic, the app can “natively adjust when performers are sitting at their desk rather than wearing a head-mounted rig with a mocap suit”, also streaming head and neck rotation data.

As well as streaming animation directly to Unreal Engine’s Take Recorder, the app also captures reference video footage of the actor in MOV format, and raw facial animation data in CSV format.
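Epic doesn't spell out the CSV layout here, so the snippet below is only a sketch under the assumption of a timecode column followed by one column per blendshape, with values in the 0.0–1.0 range; the column names and example file path are hypothetical, so check a real export before relying on it.

```python
import csv

# Minimal sketch: loading a raw facial animation CSV exported by the app.
# The assumed layout (a "Timecode" column plus one column per ARKit blendshape)
# is an illustration, not Epic's documented format.

def load_take(path):
    frames = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        for row in reader:
            timecode = row.pop("Timecode", None)
            # Treat every remaining column as a blendshape curve value.
            curves = {name: float(value) for name, value in row.items()}
            frames.append((timecode, curves))
    return frames

# Example usage (file name is hypothetical):
# frames = load_take("MySlate_3_iPhone.csv")
# print(len(frames), "frames captured")
```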

Users can browse the library of captured takes and play back the reference video directly within the app.

Supports Tentacle Sync and OSC to ensure precise sync with other mocap hardware during production
For the second group of users, Live Link Face also supports a range of key technologies used in professional motion capture sessions and collaborative virtual production.

As well as providing “robust timecode support and precise frame accuracy”, the app supports Tentacle Sync, enabling it to connect to the master clock on a capture stage.

That should ensure that facial animation data recorded via Live Link Face syncs precisely with full-body data recorded by performers’ inertial capture suits or an optical mocap system.

The app also supports the Open Sound Control (OSC) protocol, making it possible for other software to control the app remotely: for example, to start or stop recording on multiple actors’ iPhones simultaneously.
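As a rough sketch of what that remote control could look like, the snippet below broadcasts a record-start message to several phones using the third-party python-osc package (pip install python-osc). The OSC address, argument layout, port and IP addresses are placeholders for illustration; the app's actual OSC command set is defined in Epic's documentation.

```python
# Minimal sketch: sending a record-start command to several phones running the
# app over OSC. Address, arguments and port below are assumptions, not Epic's
# documented API.

from pythonosc.udp_client import SimpleUDPClient

PHONE_IPS = ["10.0.0.21", "10.0.0.22", "10.0.0.23"]  # hypothetical phone addresses
OSC_PORT = 8000                                      # assumed listen port

def start_recording(slate, take):
    for ip in PHONE_IPS:
        client = SimpleUDPClient(ip, OSC_PORT)
        # Hypothetical address and arguments: slate name and take number.
        client.send_message("/RecordStart", [slate, take])

start_recording("Scene01_ShotA", 3)
```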

As well as animation data, the app can capture production metadata like slate names and take numbers.

System requirements and availability
Live Link Face is available free for iOS 13.0 and above. To make use of it, you will need an iPhone with a TrueDepth camera: that is, an iPhone X or later. It is compatible with Unreal Engine 4.25.

Use of the Unreal Engine editor itself is free, as is rendering non-interactive content. For game developers, Epic charges a 5% royalty on gross revenue beyond the first $1 million a product earns over its lifetime.
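For example, a title grossing $1.5 million over its lifetime would owe Epic 5% of the $500,000 above that threshold, or $25,000.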


Read more about Live Link Face on Epic Games’ blog

Read about facial capture workflow via Live Link Face in the online documentation

Download free iOS facial mocap app Live Link Face from the App Store