Videos: Epic Games’ new Unreal Engine digital human demos

Thursday, March 22nd, 2018 | Posted by Jim Thacker

Epic Games has been showing off its work on photorealistic digital humans for a few years now, and its State of Unreal keynote from GDC 2018 didn’t disappoint, featuring not one but three impressive real-time demos.

The demos, created using 3Lateral’s facial capture technology and rendered in Unreal Engine, turned one actress into a digital replica of another – and turned Andy Serkis into an alien.

Siren demo shows off state-of-the-art facial and full-body capture technologies
Epic’s flagship project was Siren, created in partnership with Vicon, facial capture specialists 3Lateral and Cubic Motion, and Chinese tech giant Tencent.

The demo, which is also being shown on Vicon’s booth on the GDC show floor, used the performance of actress Alexa Lee to drive a photorealistic digital replica of a second actress, China’s Bingjie Jiang.

3D and 4D scans of Jiang’s face were carried out by 3Lateral, which also supplied the digital character’s facial rig, based around its Rig Logic real-time facial solver.

Cubic Motion handled facial performance capture, tracking, solving and animation, taking video feeds of Lee’s performance – the version shown at GDC was recorded prior to the show – to drive the CG character.

Lee’s full-body performance was recorded using Vicon’s Vantage optical motion-capture system, and streamed using the new live link functionality of its Shōgun 1.2 software and Unreal Engine 4.19.

Vicon solved directly onto the Siren custom skeleton, removing the need to retarget data, and also developed a new algorithm to help realistically animate the CG character’s fingers.

New advances in real-time rendering for skin, eyes and hair
Although not as spectacular as the live performance based around Ninja Theory’s Hellblade: Senua’s Sacrifice at GDC 2016, the demo also showed how far Unreal Engine itself has evolved since then.

Epic Games CTO Kim Libreri ran through the “crazy improvements” that have been made to rendering eyes, skin and hair, including dual lobe specular reflections, back scatter and screen space irradiance.

“From a rendering perspective, we feel we’re getting pretty close to crossing the Uncanny Valley in real time,” he commented.

3Lateral’s Meta Human facial reconstruction system drives a digital Andy Serkis
But rendering is only one part of the challenge when it comes to bridging the Uncanny Valley: facial animation is equally crucial to creating a believable digital character.

Joining Epic on stage, 3Lateral CEO Vladimir Mastilovic ran through two of his studio’s own real-time demos, again rendered in real time using Unreal Engine.

The first showed a digital reconstruction of Andy Serkis – the motion-capture actor’s motion-capture actor, thanks to his digital performances as Gollum in the Lord of the Rings trilogy and Snoke in the new Star Wars movies – emoting his way through the ‘Tomorrow, and tomorrow, and tomorrow’ speech from Macbeth.

The CG character was created using 3Lateral’s Meta Human framework for volumetric capture, facial reconstruction and data compression: Mastilovic noted that data can be compressed by a factor of up to one million.

The data is also cleaned during streaming: the system estimates muscle contraction curves automatically.

The curves can also be offset manually: one mildly unnerving part of the keynote showed the gaze direction of the digital Serkis being puppeted in real time.
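
The curve-offset idea is easy to sketch. The snippet below is purely illustrative – the curve names (`jaw_open`, `gaze_yaw_deg`) and the `apply_offsets` helper are invented for this article, not part of 3Lateral’s actual system – but it shows the principle: a solved performance is a stream of per-frame control values, and a manual offset is simply added to a value before it drives the rig.

```python
# Hypothetical sketch: a facial performance stored as per-frame control
# curves, with a live manual offset applied before the curves drive the rig.

def apply_offsets(frame_curves, offsets):
    """Return a copy of one frame of solved curve values with manual
    offsets added (the original solved frame is left untouched)."""
    adjusted = dict(frame_curves)
    for name, delta in offsets.items():
        adjusted[name] = adjusted.get(name, 0.0) + delta
    return adjusted

# One frame of a (made-up) solved performance.
frame = {"jaw_open": 0.4, "brow_raise_l": 0.1, "gaze_yaw_deg": 0.0}

# Puppet the gaze in real time by offsetting the solved value.
puppeted = apply_offsets(frame, {"gaze_yaw_deg": 15.0})
print(puppeted["gaze_yaw_deg"])  # 15.0
print(puppeted["jaw_open"])      # unchanged: 0.4
```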

New possibilities for retargeting facial animation data
Mastilovic noted that the technology opened up a range of possibilities, from having a virtual character address each audience member directly in VR experiences to simply retargeting facial animation data better.

As an example of the latter, 3Lateral also showed Serkis’s facial performance retargeted to a CG character with quite different facial proportions: its own digital alien model Osiris Black.
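
Why do curve-based solves retarget so cleanly to characters with different proportions? Because the solver outputs semantic weights rather than vertex positions, and each character interprets those weights with its own shapes. The sketch below illustrates this with a plain linear blendshape evaluation – the `evaluate` helper and the character data are invented for illustration, not 3Lateral’s actual rig maths.

```python
# Hypothetical sketch of curve-based retargeting: the solver outputs
# semantic weights (e.g. "jaw_open" = 0.4), and each character, human or
# alien, interprets those weights with its own per-curve shape deltas.

def evaluate(neutral, deltas, weights):
    """Blend a character's neutral vertex positions with weighted
    per-curve deltas (a classic linear blendshape evaluation)."""
    result = [list(v) for v in neutral]
    for name, w in weights.items():
        for i, d in enumerate(deltas.get(name, [])):
            for axis in range(3):
                result[i][axis] += w * d[axis]
    return result

# Two characters with different proportions but the same control set.
human = {"neutral": [(0.0, 0.0, 0.0)],
         "deltas": {"jaw_open": [(0.0, -1.0, 0.0)]}}
alien = {"neutral": [(0.0, 0.0, 0.0)],
         "deltas": {"jaw_open": [(0.0, -2.5, 0.0)]}}  # a much bigger jaw

weights = {"jaw_open": 0.4}  # one frame of a solved performance

# The same solved weights drive both rigs, each to its own proportions.
print(evaluate(human["neutral"], human["deltas"], weights))  # [[0.0, -0.4, 0.0]]
print(evaluate(alien["neutral"], alien["deltas"], weights))  # [[0.0, -1.0, 0.0]]
```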

Arguably, it looks more realistic than the CG version of Serkis himself: not because the output is better, but because the brain is better at spotting minor inconsistencies in human faces than in those of creatures.

Either way, it was a demonstration both of the state of the art in facial capture and of the paradox of Andy Serkis as an actor: he’s better at playing fictional creatures than he is at playing himself.

Watch Epic Games’ demos of real-time digital humans in its GDC 2018 keynote
(The relevant section starts at 00:15:25)

Read fxguide’s article on how 3Lateral’s technology was used for the CG recreation of Andy Serkis



© CG Channel Inc. All Rights Reserved.