Adobe has released a new update to Character Animator, its work-in-progress 2D animation software, adding a new toolset for creating walk cycles, and a viseme editor for adjusting lip-synching.
The new version – it isn’t officially numbered, but it’s the second beta – will be on show at NAB 2017, along with updates to Adobe’s other video tools, including After Effects CC 2017.2 and Premiere Pro CC 2017.1.
A simple real-time system for 2D puppet animation
First released in preview form in 2015, Character Animator is intended as a simple way of creating 2D animations from still raster or vector images, including standard skeletal animation tools.
Facial animation can be puppeteered in real time, with the software processing live footage of an actor and translating their facial movements to the 2D character, including lip-synching.
Although still in beta, the software has been used in production, including a live segment of The Simpsons.
New automated toolset for setting up walk cycles
New features in Beta 2 include an automated toolset for creating walk cycles. The workflow is pretty simple: users just click in the viewport to tag the character’s legs, arms and body, then hit play.
There is a range of preset movement styles (walk, run, sneak, and so on), along with a basic set of parameters including Stride Length, Step Speed, Step Phase, Arm Swing and Elbow Bend.
The toolset works in conjunction with the facial-capture system, so users can set up a walk cycle, then puppeteer the character’s head movements on top of it in real time.
Edit the timing of visemes to modify Character Animator’s automated lip-synch
There is also a new viseme editing system, enabling users to refine the output from Character Animator’s automated lip-synching.
Visemes now appear in their own track in the timeline, enabling users to adjust the length of individual mouth shapes, or even replace them entirely, selecting new shapes from a drop-down menu.
The Nutcracker Jaw behaviour, used for very simplified characters, has also been updated: the jaw now widens automatically according to the loudness of the voiceover, and can rotate clockwise or anticlockwise.
New options for streaming live to Facebook, YouTube and Twitch
Streaming options have also been expanded, with support for NewTek’s NDI technology for livestreaming animation to services like Facebook Live, YouTube Live and Twitch.
The update also adds support for Adobe’s Mercury Transmit system for sending output to a second monitor.
New workspaces for common tasks, more layer blending modes
Other changes include a new workspaces system, providing a range of ready-made UI layouts designed to focus on the tools required for common tasks, like rigging, offline recording and streaming.
The software also now supports Photoshop’s full set of layer blending modes, and native language versions are now available in Simplified Chinese, Portuguese, Italian, Spanish, Russian and Korean.
Pricing and availability
Character Animator CC Beta 2 is available via subscription to Adobe’s Creative Cloud. It is included with After Effects CC, which costs $19.99/month; subscriptions to all of Adobe’s creative tools start at $49.99/month.