Unity rolls out Unity AI in Unity 6.2
Unity has released Unity 6.2, the latest version of the game engine and real-time renderer.
The release introduces Unity AI, a new suite of AI tools that includes generative AI features for creating game code and art assets like sprites, textures and animations.
You can read our FAQs on Unity AI in the story below.
Other changes relevant to artists include a new Mesh LOD system for automated LOD generation, and updates to the Shader Graph, rendering pipelines, 2D system, and UI Toolkit.
What is Unity AI?
The headline change in Unity 6.2 – and probably the most divisive – is Unity AI, a new suite of AI tools integrated into the Unity Editor.
It isn’t actually as big a change as it sounds, since it supersedes Unity’s existing AI services, Muse and Sentis, and has very similar functionality.
According to Unity, the new incarnation of the AI tools provides better integration into the Unity Editor, a wider choice of AI models, and more flexible pricing.
How does Unity AI differ from Unity Muse?
Unity AI has three key components, two of which are very similar to the features in Muse.
For CG artists, the most pertinent is Generators, a set of tools for generating sprites, textures, materials, animations and sounds.
The key difference from Muse – which is now being retired completely – is that Unity AI uses AI models developed by third parties, not just Unity itself.
For sprites, that means a range of LoRAs from Scenario and Layer trained on the Stable Diffusion and Flux foundational models.
For textures, Unity has retained its own AI model, which generates tileable textures from text prompts or image references.
For animations, Unity’s own text-to-animation model is available alongside Kinetix’s video-to-animation model.
The second component, Assistant, is a generative AI tool for answering users’ technical queries, automating routine tasks, and writing code.
It uses LLMs from OpenAI’s GPT series and Meta’s Llama series.
How does Unity AI differ from Unity Sentis?
The third component of Unity AI, the Inference Engine, is simply Sentis.
Despite the change of name, it’s still the same product: a means for running custom AI models inside the Unity Editor, or in-game in the Unity runtime.
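In practice, that means importing a trained model file – typically ONNX format – as a project asset and scheduling inference from a script. The sketch below is a minimal illustration based on the publicly documented Sentis 2.x API; since the product has been renamed, namespace and class names in the Unity 6.2 Inference Engine may differ, and the `StyleClassifier` component and its feature array are purely hypothetical examples.

```csharp
using Unity.Sentis;
using UnityEngine;

// Hypothetical component running a custom model in the Unity runtime.
// API names follow Sentis 2.x and may differ under the Inference Engine rename.
public class StyleClassifier : MonoBehaviour
{
    public ModelAsset modelAsset; // an ONNX model imported as a project asset
    Worker worker;

    void Start()
    {
        // Load the serialized model and create a worker on the chosen backend
        var model = ModelLoader.Load(modelAsset);
        worker = new Worker(model, BackendType.GPUCompute);
    }

    public float[] Run(float[] features)
    {
        // Wrap the input data in a tensor, run inference, and read the result back
        using var input = new Tensor<float>(new TensorShape(1, features.Length), features);
        worker.Schedule(input);
        using var output = worker.PeekOutput().ReadbackAndClone() as Tensor<float>;
        return output.DownloadToArray();
    }

    void OnDestroy() => worker?.Dispose();
}
```

Because the worker runs locally, on the CPU or GPU of the target device, inference like this works in shipped builds as well as in the Editor.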
How does Unity AI use your data?
By default, Unity AI does not use user data to train AI models in Unity 6.2.
If you opt in to data sharing in the Unity Dashboard, Unity uses data like text prompts and “metadata from attached objects” for training.
Unity says that it does not use either runtime applications developed with Unity or media assets like user-created images, 3D meshes or audio to train its own AI models.
The developers of third-party AI models also do not use Unity user data to train models, regardless of what options you select in the Unity Dashboard.
What data sets were the Unity AI models trained on?
Each of the AI models was trained on different data sets, so by opening up Unity AI to third-party models, Unity is essentially leaving it to its users to do their own homework.
According to the Unity AI Guiding Principles webpage, “you are responsible for ensuring your use of Unity AI [does] not infringe on third-party rights and [is] appropriate for your use”.
Does Unity AI run locally, or in the cloud?
The Unity AI Assistant and Generators run in Unity Cloud. The Inference Engine runs locally.
How much does Unity AI cost?
Unlike Muse, which cost $30/month, Unity AI does not require a separate subscription.
Unity has said that ‘Unity Points’ – included with paid Unity subscriptions, and available to buy separately – will be required to perform actions in the Assistant and the Generators.
Although the credit-based system is supposed to come into effect with the stable release of Unity 6.2, at the time of writing, there are no details of points on Unity’s Plans & Pricing webpage.
What are the other new features in Unity 6.2 for CG artists?
Although Unity’s blog post on Unity 6.2 focuses mainly on Unity AI, there are a few other new features in the release.
For CG artists, the most significant is likely to be Mesh LOD, which automatically generates Level of Detail versions of imported 3D meshes.
It provides fewer customization options than the existing LOD Group system, but is less computationally expensive, and has a lower memory footprint.
There are also updates to Unity’s render pipelines, with support for prepass layers in the URP, and support for NVIDIA’s DLSS 4 Super Resolution render upscaling in the HDRP.
The Shader Graph gets a new Append node, plus a number of workflow improvements.
The 2D toolset gets the option to preview changes made to sprites in the Sprite Editor window directly in the Scene view.
The UI Toolkit gets support for World Space UIs, making it possible for UI elements to be moved, scaled or rotated like any other in-game object.
What are the other new features in Unity 6.2 for developers?
Developers get new diagnostics features, including “enhanced crash and ANR [Application Not Responding] reporting to diagnose issues faster”.
A new Developer Data Framework makes it possible to control how your data is used by Unity services: diagnostic data is collected by default, but you can choose to disable data collection.
There are also updates to support for XR devices, although support for Magic Leap headsets has now been deprecated, and will be limited to existing projects from Unity 6.3.
Price, system requirements and release date
The Unity Editor is compatible with Windows 10+, macOS 11.0+ and Ubuntu 22.04/24.04 Linux.
Free Personal subscriptions are now available for artists and small studios earning under $200,000/year, and include all of the core features.
Pro subscriptions, for mid-sized studios, now cost $2,200/year. Enterprise subscriptions, for studios with revenue over $25 million/year, are priced on demand.
Read an overview of the new features in Unity 6.2 on Unity’s blog
Read a full list of new features in Unity 6.2 in the online release notes
Read Unity’s FAQs about Unity AI