Sneak peeks: Adobe’s Project Perfect Blend and Project Scenic
Experimental Adobe technology Project Scenic makes it possible to generate low-res 3D layouts that guide the composition of images created using the Firefly generative AI toolset.
Adobe has previewed a new batch of experimental graphics technologies during the popular Sneaks session at Adobe MAX 2024, its user conference.
There were nine in total, ranging from new image-editing and vector animation technologies to generative audio and tools for identifying deepfakes.
However, two in particular caught our eye, both because of what they do, and because they seem likely to make their way into commercially available Adobe tools.
Project Scenic makes it possible to use 3D to control the layout of images created with Adobe’s Firefly generative AI toolset, while Project Perfect Blend is a neat AI-based light-matching and de-lighting tool for compositing photographic images.
Project Scenic uses 3D to guide the composition of images generated by Firefly
Project Scenic brings 3D to Firefly, Adobe’s set of generative AI tools – but as a layout tool, rather than as the final output.
It enables users to enter simple text prompts like ‘a cozy campsite’ to generate a 3D layout, populated with low-resolution 3D assets.
The layout is then used as a compositional guide for the final 2D image, with the AI model matching its camera position, and the placement of objects within the scene.
Users can adjust the layout by moving individual objects around using standard 3D gizmos.
To change the camera position, in addition to the usual keyboard controls, users can type in further text prompts, like ‘top view of the tents’ or ‘look at the campfire’.
It’s also possible to use the 3D layout to refine the look of the 2D image, since individual 3D objects can be assigned separate text descriptions when generating the final image.
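Adobe hasn’t said how Scenic’s layouts steer Firefly under the hood, but the general idea – using a render of a rough 3D scene as a structural condition for an image-generation model – can be sketched with open-source tools. The snippet below is a minimal illustration using a depth-conditioned ControlNet with Stable Diffusion via Hugging Face’s diffusers library; the depth render standing in for the low-res 3D layout, the file name, the model choices and the prompt are all assumptions for illustration, not Adobe’s pipeline.

```python
# Minimal sketch: condition image generation on a depth render of a rough
# 3D layout, analogous in spirit to (but not the same as) Project Scenic.
# Assumes 'layout_depth.png' is a depth render of the low-res 3D scene
# (hypothetical file name).
import torch
from PIL import Image
from diffusers import StableDiffusionControlNetPipeline, ControlNetModel

# The depth-conditioned ControlNet steers composition; the text prompt steers content
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

depth_map = Image.open("layout_depth.png").convert("RGB")  # stand-in for the 3D layout
image = pipe(
    "a cozy campsite at dusk, tents around a campfire",
    image=depth_map,
    num_inference_steps=30,
).images[0]
image.save("campsite.png")
```

Moving an object in the 3D layout and re-rendering the depth map before running the pipeline again gives broadly the same kind of compositional control that Scenic’s gizmos and camera prompts provide, albeit far more manually.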
Project Perfect Blend auto-magically matches the lighting of images in Photoshop
However, the demo that got the biggest response from the live audience at the Sneaks session was Project Perfect Blend, a new AI tool for compositing still images.
With it, users can composite a new foreground object into a background image in Photoshop, importing it as a separate layer, and removing the background.
Clicking a ‘Harmonize’ button then relights the foreground image, automatically matching the light direction, tonal range and color palette to the background image.
If the foreground image has baked-in lighting, Project Perfect Blend automatically de-lights it before applying the new lighting scheme.
It also generates shadows cast by the new foreground onto surrounding objects, helping to blend it seamlessly into the background.
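Adobe hasn’t detailed the model behind Harmonize, and the demo’s relighting, de-lighting and shadow generation clearly go beyond classical techniques. For context, though, the simplest non-AI baseline for matching a foreground layer’s tonal range and color palette to a background is Reinhard-style color transfer in Lab space, sketched below with OpenCV and NumPy; the file names are placeholders.

```python
# Crude, non-AI baseline for the color/tone-matching part of compositing:
# transfer the background's per-channel Lab mean and standard deviation to
# the foreground layer (Reinhard et al., 2001). No relighting, de-lighting
# or shadow casting - those are where Perfect Blend's AI model comes in.
import cv2
import numpy as np

def match_color_stats(foreground_bgr: np.ndarray, background_bgr: np.ndarray) -> np.ndarray:
    # Work in Lab space so lightness and color shift independently
    fg = cv2.cvtColor(foreground_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    bg = cv2.cvtColor(background_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)

    # Per-channel statistics (ideally computed only over the cutout's opaque pixels)
    fg_mean, fg_std = fg.mean(axis=(0, 1)), fg.std(axis=(0, 1))
    bg_mean, bg_std = bg.mean(axis=(0, 1)), bg.std(axis=(0, 1))

    # Normalise the foreground's statistics, then impose the background's
    matched = (fg - fg_mean) / np.maximum(fg_std, 1e-6) * bg_std + bg_mean
    matched = np.clip(matched, 0, 255).astype(np.uint8)
    return cv2.cvtColor(matched, cv2.COLOR_LAB2BGR)

# 'cutout.png' and 'plate.png' are placeholder file names
fg = cv2.imread("cutout.png")
bg = cv2.imread("plate.png")
cv2.imwrite("cutout_matched.png", match_color_stats(fg, bg))
```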
It’s one of those tools that just looks immediately useful, and one that we could see making its way into the commercial version of Photoshop soon.
So when will I be able to use this technology in Adobe software?
While Project Scenic and Project Perfect Blend are research projects, not commercial tools, tech shown in Sneaks sessions can make its way into Adobe products quite quickly.
One of the highlights of the 2020 session, physics-based scene layout system Physics Whiz, became part of Substance 3D Stager, which was released less than a year later.
Watch the full recording of the Sneaks session from Adobe MAX 2024
Read Adobe’s overview of the other technologies previewed during the Sneaks session