Friday, August 8th, 2014 Posted by Jim Thacker

Faceware Technologies ships Faceware Live 2.0


Faceware Live powering a live marketing display for Snickers in Korea. The real-time markerless facial mocap system has been updated to support Unity and multi-character tracking, and is now publicly available.

Faceware Technologies has released Faceware Live 2.0: an update to its real-time markerless facial motion-capture system – and the first version that most people will actually get to use.

Although announced at Siggraph 2013, the first version of Faceware Live has been in testing at a “handful of studios” since late last year.

Support for Unity and MotionBuilder, multi-character tracking
The system processes live video footage of an actor’s face and streams the processed data into MotionBuilder – and now Unity – where it can be used to drive a CG character.

Aside from Unity support – quite a big thing in itself – the update expands Faceware Live’s feature set a fair bit, including the option to stream multiple characters simultaneously, and a ‘one-click’ calibration system.

You can find more details in the news release excerpts below. Pricing is available only on request – which suggests it isn’t cheap – but you will be able to request a 30-day trial of the software later in August.

PRESS RELEASE (Excerpts)
Faceware Technologies, the leading provider of markerless 3D facial motion capture solutions, today announced that it has upgraded and released version 2.0 of its real-time facial mocap and animation product, Faceware Live.

Faceware Live produces facial animation in real time by automatically tracking a performer’s face and instantly applying that performance to a facial model. Faceware Live can use a computer’s onboard camera or webcam, the Faceware Pro HD Headcam System, or any other video capture device to drive the real-time performance. Under the hood is patented computer vision technology, called Live Driver, from our partner company Image Metrics, which is capable of real-time facial tracking in nearly any lighting condition with any performer.

Upgrades to version 2.0 include:

  • Instant Calibration – With Faceware Live 2.0, calibration of any new performer takes only 1-3 seconds. There is no need for a lengthy range-of-motion sequence or a complex character setup process. Performers make a single “neutral” expression and, with one button click, they are “found” by the tracking technology and their performance can be captured.
  • Improved Facial Tracking – Faceware Live captures nearly 180 degrees of motion, allowing the live talent to be freer in their performances. In addition, Faceware Live tracks 22 more facial points than version 1.0, significantly increasing the quality of the facial motion capture.
  • Consistent Calibration – One area of real-time motion capture that is often overlooked is consistency of calibration from day to day with the same actors. Faceware Live features an entire interface to aid in this process, allowing the user to store and recall calibration frames as an overlay on the live feed, as well as to toggle a grid for consistent camera placement and framing. Faceware Live also allows multiple actors to be calibrated at once using Live’s multi-sync calibration feature.
  • Support for High-Frame-Rate Cameras – Studios looking to achieve ultimate quality and responsiveness will be eager to use Faceware Live 2.0’s unique ability to accept data from high-frame-rate cameras. The quality of facial motion extracted at higher frame rates allows for more believable performances in real time, which is critical for capturing areas of subtle movement and detail such as the lips and eyes.
  • Stream Multiple Characters – With version 2.0 of Faceware Live, there is no limit to the number of characters that Faceware Live can track at once. For example, studios can have multiple characters interacting with each other in front of a live studio audience, or in complex pre-viz sequences and game scenes.
  • Unity Game Engine Support – In addition to Autodesk MotionBuilder®, Faceware Live 2.0 now works in tandem with Unity, one of the most popular game engines on the market. Real-time facial animation can be displayed in the Unity editor as well as live in the game itself. Future game engine support is planned. Faceware Live 2.0 streams facial data over TCP, allowing studios to custom-integrate Faceware Live 2.0 into nearly any rendering environment (see the sketch after this list).
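
The TCP streaming mentioned in the last point is what makes those custom integrations possible. As a rough sketch only, the Python snippet below shows what a receiver for such a stream might look like; the host, port and newline-delimited JSON payload of named control values are assumptions made for illustration, not Faceware’s documented protocol.

    # Minimal sketch of a receiver for a real-time facial data stream over TCP.
    # The wire format here is an assumption for illustration: the host, port and
    # newline-delimited JSON messages of named control values (for example
    # {"jaw_open": 0.42, "brow_up_left": 0.10}) are hypothetical, not Faceware's
    # documented protocol.
    import json
    import socket

    HOST = "127.0.0.1"   # machine running the capture software (assumed)
    PORT = 9000          # streaming port (assumed)

    def stream_frames(host=HOST, port=PORT):
        """Yield one dict of control name -> value per received frame."""
        with socket.create_connection((host, port)) as sock:
            buffer = b""
            while True:
                chunk = sock.recv(4096)
                if not chunk:
                    break  # the sender closed the connection
                buffer += chunk
                # Split complete newline-terminated messages out of the buffer.
                while b"\n" in buffer:
                    line, buffer = buffer.split(b"\n", 1)
                    if line.strip():
                        yield json.loads(line)

    if __name__ == "__main__":
        for frame in stream_frames():
            # In a real integration these values would drive blendshapes or rig
            # controls in MotionBuilder, Unity or another engine; here we just
            # print them.
            print(frame)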

Read more about Faceware Live 2.0 on Faceware Technologies’ website