With Muse, Unity aims to give developers generative AI that’s useful and ethical

Unity is joining the rest of the industry in offering generative AI tools to its users, but has taken care (unlike some) to ensure those tools are built on a solid foundation rather than on theft. Muse, the new suite of AI-powered tools, will start with texture and sprite generation, then expand into animation and coding as it matures.
The company announced these features alongside a cloud-based platform and the next major version of its engine, Unity 6, at its Unite conference in San Francisco. After a tumultuous few months, in which a major product plan was completely reversed and the CEO was ousted, the company is no doubt eager to get back to business as usual, if that's even possible.
Unity has long positioned itself as a champion of smaller developers who don't have the resources to work in a heavier development platform like rival Unreal. For them, AI tools could be a genuinely useful addition: not every studio can afford to spend days creating 32 slightly different wooden wall textures in high definition.
While there are plenty of tools for generating or mutating such assets, it's often better to be able to say "make more of these" without leaving your main development environment. The simpler the workflow, the more gets done without worrying about details like file formats and siloed resources.
AI-generated assets are also commonly used in prototyping, where artifacts and slightly wonky quality (still present regardless of the model these days) don't really matter. But having your gameplay concept illustrated with original, fitting artwork rather than stock sprites or free 3D model samples can make the difference in getting your vision across to publishers or investors.

Examples of sprites and textures generated by Unity’s Muse.
Another new AI feature, Sentis, is a little harder to parse: Unity's press release says it "enables developers to integrate complex AI data models into Unity Runtime to create new gaming experiences and features." So it's a bring-your-own-model path to in-game features, and it's currently in open beta.
AI for animation and behaviors is on the way and will be added next year. These highly specialized scripting and design processes could benefit greatly from a generative first draft or a guided wizard.

Image Credits: Unity
A big part of this release, the Unity team emphasized, was making sure these tools don't operate under the shadow of future IP infringement cases. As fun as image generators like Stable Diffusion are, they are built from the work of artists who never consented to having it ingested and regurgitated.
“In order to provide results that are useful, safe, responsible, and respectful of other creators’ copyrights, we challenged ourselves to innovate in our training techniques for the AI models that power the generation of sprites and textures from Muse,” reads a blog post about responsible AI techniques accompanying the announcement.
The company said it uses a fully custom model trained on Unity-owned or licensed images, although it did use Stable Diffusion to, essentially, generate a larger synthetic dataset from the smaller one it had assembled.

Image Credits: Unity
For example, a wooden wall texture can be rendered in multiple variations and color treatments using the Stable Diffusion model, but no new content is added, or at least that's how they describe it working. The resulting dataset is therefore not only built from responsibly sourced data but is also one step removed from it, reducing the likelihood that any particular artist or style will be reproduced.
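To give a rough sense of what that kind of dataset expansion looks like in practice, here is a minimal sketch using the Hugging Face diffusers library's image-to-image pipeline. This is not Unity's actual pipeline; the model ID, file names, prompt, and parameters are illustrative assumptions. The idea is simply that one seed texture can be turned into several close variants.

```python
# Illustrative sketch only, not Unity's pipeline: generate img2img variations
# of a single seed texture to expand a small dataset synthetically.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Assumed model ID for illustration.
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Hypothetical seed asset; any licensed texture would do.
seed_texture = Image.open("wooden_wall.png").convert("RGB").resize((512, 512))

# A low strength value keeps the output close to the original texture, so
# mostly color and surface detail vary rather than new content appearing.
variants = pipe(
    prompt="seamless wooden wall texture, weathered planks",
    image=seed_texture,
    strength=0.35,
    guidance_scale=7.0,
    num_images_per_prompt=4,
).images

for i, img in enumerate(variants):
    img.save(f"wooden_wall_variant_{i}.png")
```

The custom model is then trained on the expanded set of variants rather than on outside artwork directly, which is the step-removed quality the company is describing.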
This approach is safer, but Unity admits the quality of the initial models it's offering is lower as a result. As noted above, though, the raw quality of generated assets isn't always what matters most.
Unity Muse will cost $30 per month as a standalone offering; the community will no doubt soon let everyone know whether the product justifies that price.