The virtual production industry is one of constant change and growth. Even several years after The Mandalorian brought many new eyes and developments into the space, new hardware and software are released quite often. The past couple of weeks have been no different. Shortly after NVIDIA’s RTX PRO™ Series announcement, Chaos announced the public release of Arena, their proprietary, real-time software for in-camera visual effects (ICVFX) on LED walls. A few days later, Foundry followed suit and announced their upcoming release of Nuke Stage.
When it comes to software for driving an LED wall for ICVFX, video game engines like Unreal Engine and Unity have been the only options. Game engines were a natural starting point for virtual production. For one, they are already real-time engines capable of rendering highly detailed environments at high frame rates. Second, they are open for end users to modify code, write their own plugins, and so on, which made it easy for various industries to develop new workflows.
These engines do have some drawbacks, though. Game engines were not designed to run massive LED walls with multiple nodes synced together, sharing the same frustum. However, developers at the movie studios saw the potential and, with the help of NVIDIA and Epic, were able to make things work. It is also very clear that they are still game engines, first and foremost. When major features are added to Unreal, they are rarely supported on the virtual production side, or they are labeled as “Experimental” and not advised for use in active production. The other issue is that these engines are large and complex. Anyone who has used Unreal will attest to how difficult it is to learn. Not only is there a steep learning curve, but any individual workflow uses only a fraction of the toolset; many of the tools and features built for games aren’t needed for ICVFX. It can be challenging to navigate everything in the editor to find precisely what is needed.
This is where Arena from Chaos and Nuke Stage from Foundry come in. Both were designed from the ground up for ICVFX, and ICVFX only. There are no extraneous toolsets to wade through, and every feature and update that ships is meant to work on an LED wall.
On top of this, many in the visual effects field are already familiar with both V-Ray and Nuke, and both products promise to slot seamlessly into existing workflows. Scenes built for V-Ray will work in Arena without adjustments, and vice versa. That means if an environment will be used on the wall with live actors, but also in an offline-rendered shot, both will match without adjusting lights or materials for different engines. Similarly, those familiar with compositing in Nuke will be able to use those same skills in Nuke Stage. After all, ICVFX is live compositing.
Exactly how well either of these works in practice is still a big question. However, Chaos has already shot its own short film on a volume to get firsthand experience. That shows real commitment to the industry and a willingness to put in the work to understand the needs of this sort of filmmaking. I have not yet seen a full shoot on Nuke Stage, but I do not doubt one will come soon.
I eagerly look forward to studios trying these applications on their LED Volumes. Multiple competing products should push the developers to build better, more useful tools. Unreal Engine is a fantastic piece of software, but having a tailor-made solution is attractive to many studios. If you are interested in either of these, let us know in the comments, and we will try to get some hands-on time and maybe some benchmarking with them.