
Sneak peek: Adobe’s Speech-Aware Animation system

Thursday, August 20th, 2020 | Posted by Jim Thacker

 
Adobe has unveiled Speech-Aware Animation, an interesting new AI-driven system for generating head movements for a character automatically inside Character Animator.

The feature is available in a new public beta of the software – which generates real-time puppet-style animation from video footage of an actor – along with a new Limb IK system and updates to the timeline.

Adobe’s blog post about the beta doesn’t give it a specific version number, but a lot of the online documentation refers to it as Character Animator 3.4.

New Speech-Aware Animation system generates head and eyebrow movements from audio recordings
Character Animator 3.4 – or whatever it ends up being called – is the first experimental version of the software Adobe has put out since launching its beta program earlier this year.

Its key feature is Speech-Aware Animation, a new AI-driven system that automatically generates head movements and eyebrow positions for an animated character from recorded speech.

The underlying technology is based on Project Sweet Talk, a demo Adobe showed at Adobe MAX 2019.

In its original form, it generated mouth shapes from audio recordings as well as head movements and eyebrow raises, but since Character Animator already has a lip sync system, that doesn’t feature here.

The implementation is slightly odd: Speech-Aware Animation has to be calculated separately from lip sync, and unlike lip sync, it isn’t generated from scene audio – you have to import a separate audio file.

Updated 2 September 2020: Adobe told us that separating the two processes increases workflow flexibility, but that it may combine the two operations in future, depending on the feedback to the public beta.

However, it can automatically turn and scale the character’s head as well as simply tilting it, and users can adjust a range of control parameters to finesse the animation manually.

Expanded Limb IK system, plus workflow updates to the animation timeline
Other new features in the public beta of Character Animator 3.4 include Limb IK which, as the name implies, extends the software’s existing Arm IK inverse kinematics system to a character’s legs.

The feature makes it possible to reposition a character’s entire arm or leg by moving the hand or foot (or paw, claw or tentacle tip), with the rest of the limb following and deforming automatically.

Users also get a self-explanatory new Pin Feet When Standing option for the Walk behaviour.

Other changes include the option to merge takes in the Timeline panel, plus a number of workflow improvements, including the option to colour-code, filter or isolate takes.

Pricing and availability
Speech-Aware Animation is available as part of a public beta build of Character Animator, available free to existing users of the software. New users can install a trial version. It requires Windows 10 or macOS 10.15.

The current stable build, Character Animator 3.3, is available for Windows 10, Windows Server 2016+ and macOS 10.13 on a rental-only basis via Adobe’s All Apps subscriptions, which cost $79.49/month.

 
Read an overview of the upcoming features in Character Animator on Adobe’s blog

Read Adobe’s FAQs for its public betas




© CG Channel Inc. All Rights Reserved. Privacy Policy.