June 7th, 2023 by chris
Leveraging both UNITY and UNREAL for our game.

Of the 4 games we have released, 3 have been created using Unity. The engine is incredibly robust, and we know it very well. Because our games are mostly 2D, using something like Unreal is overkill for us. We also like the fact that our games look good but run on low-spec machines.

That said, I am an artist, and while I'm comfortable enough with the art flow in Unity to get assets into the game engine and light them - I couldn't help but look at the art tools in Unreal with envious eyes!

My main focus has always been on creating high-poly art and assets, which is why our games have predominantly used 2D prerendered graphics. But Unreal promised to break that limitation with Nanite and Lumen.

After watching the short film Irradiation by Sava Zivkovic last year - specifically his process video - I decided to bite the bullet and learn Unreal.

Our games have always had prerendered cinematic sequences, but we have always had to be sparing. They take an enormous amount of time to do, and we have often had to design around those limitations. The hope was that, in switching to Unreal, we could eliminate much of the time sink in rendering these sequences and let loose!

I contacted Sava Zivkovic to get the PC specs he used to create Irradiation and bought a PC that mimicked his setup. His main recommendation was the graphics card - an RTX 3090 (not easy to acquire at the height of crypto, and in far-off South Africa, I might add).

While I knew that the game itself would still be created in Unity and the environments would still be rendered with V-Ray from 3DS Max, I wanted to ensure parity between the different software. It should all feel cohesive.

While learning to use Unreal for our cinematics, I created a short 60-second film. I set myself a limited time to do this - one day. It was a gamble to buy such expensive hardware without knowing if it would even speed up our workflow, especially considering I had never used Unreal before.

The short film - - was the proof of concept I needed to be certain we had made the correct choice.

I kept my workflow from past cinematics unchanged: I animated the characters in 3DS Max and exported the animations directly into Unreal. I may look at using Unreal's Control Rig for future games - but I wanted to use the tools that I'm most comfortable with.

In our previous cinematics, a single 30-second sequence could take a week or more to create. The rendering alone - if done locally - would take the majority of that time. Another downside was that if something went wrong, fixing it required either a rerender or a lot of alterations in the compositing stage.
Some of this time could be saved by purchasing time on a render farm - but we are a small studio on a tight budget, so that wasn't feasible for us.
Using Unreal meant that the feedback was real-time: alterations could be made and output at the same time. Having this level of feedback was incredibly freeing from both an artistic and a production point of view.

For the environments, because the game uses high-poly models, I could take what I had already created and bring it right into Unreal. The models did require retexturing to use Unreal's materials, but once I was used to the workflow, it went remarkably quickly.

We invested in a Rokoko Smartsuit 2, which helped shave days off the time it took to create a sequence. The suit is incredibly easy to use with just one person, which was important as Stasis: Bone Totem was built entirely remotely.
My small office served as a motion capture stage, with polystyrene cubes as props and masking tape on the tiled floor to outline the 'stage'.

For lip sync I used a mixture of Live Link and NVIDIA's Audio2Face software - with a lot of hand tweaking. Going forward, I will more likely take advantage of the many AI tools being created for facial animation.

At the end of production, I had created 44 minutes of high-quality, 4K cinematic sequences.
To create the link between the Unreal renders and the V-Ray/Unity in-game renders, I matched the lighting of the scenes as closely as possible. I also mixed some Unreal-rendered elements (namely the character portraits) into the game to ensure it all worked cohesively.

Unreal is an incredible tool for an artist. While our games' core engines will likely continue to be built around Unity, we will most certainly keep using Unreal as a rendering and art tool in our future games.
