The journey to making the upcoming film Gods of Mars changed course dramatically once real-time rendering entered the picture.
The movie, currently in production, blends cinematic visual effects with live-action elements. The filmmakers had originally planned to shoot primarily with physical miniature models, but they switched gears once they experienced the power of real-time NVIDIA RTX graphics and Unreal Engine.
Director Peter Hyoguchi and producer Joan Webb used an Epic MegaGrant from Epic Games to bring together VFX professionals and game developers to create the film. The virtual production started with scanning the miniature models and animating them in Unreal Engine.
“I’ve been working as a CGI and VFX supervisor for 20 years, and I never wanna go back to older workflows,” said Hyoguchi. “This is a total pivot point for the next 100 years of cinema — everyone is going to use this technology for their effects.”
Hyoguchi and his team produced photorealistic, intergalactic worlds in 4K using a combination of NVIDIA Quadro RTX 6000 GPU-powered Lenovo ThinkStation P920 workstations, ASUS ProArt Display PA32UCX-P monitors, Blackmagic Design cameras and DaVinci Resolve, and the Wacom Cintiq Pro 24.
Stepping Outside the Ozone: Technology Makes Way for More Creativity
Gods of Mars tells the tale of a fighter pilot who leads a team against rebels in a battle on Mars. The live-action elements of the film are supported by LED walls displaying real-time graphics rendered in Unreal Engine. Actors are filmed on set, with a virtual background shown behind them.
To keep the set minimal, the team builds only what actors will physically interact with, and uses the Unreal Engine environment for the rest of the scene.
One big advantage of working with digital environments and assets is real-time lighting. Previously, when working with CGI, Hyoguchi and his team would previsualize everything in a grayscale environment, then wait hours for a single frame to render before seeing a preview of an image or scene.
With Unreal Engine, Hyoguchi sees scenes ray-trace rendered immediately, with lights, shadows and colors in place. He can move around the environment and check how everything looks in the shot, saving weeks of pre-planning.
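For readers curious what "turning on" real-time ray tracing looks like in practice, it is typically a handful of renderer settings in an Unreal Engine 4 project's `DefaultEngine.ini`. The snippet below is a generic sketch of those standard settings, not the film production's actual configuration:

```ini
; DefaultEngine.ini -- hypothetical example project, not from Gods of Mars
[/Script/Engine.RendererSettings]
r.SkinCache.CompileShaders=True   ; skin cache shaders must be compiled before ray tracing can be enabled
r.RayTracing=True                 ; enable hardware (DXR) ray tracing support
```

With these set and an RTX-class GPU present, ray-traced reflections, shadows and global illumination can then be toggled per feature in the editor's post-process settings.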
Real-time rendering also saves money and resources. Hyoguchi doesn't need to spend thousands of dollars on render farms, or wait weeks for a single shot to finish rendering. The RTX-powered ThinkStation P920 renders everything in real time, which allows more iterations and a faster, more flexible creative workflow.
“Ray tracing is what makes this movie possible,” said Hyoguchi. “With NVIDIA RTX and the ability to do real-time ray tracing, we can make a movie with low cost and less people, and yet I still have the flexibility to make more creative choices than I’ve ever had in my life.”
Hyoguchi and his team are shooting the film with Blackmagic Design’s new URSA Mini Pro 12K camera. Capturing such high-resolution footage provides more options in post-production. They can crop images or zoom in for a close-up shot of an actor without worrying about losing resolution.
They can also color and edit scenes in real time using Blackmagic DaVinci Resolve Studio, which uses NVIDIA GPUs to accelerate editing workflows. With the 32-inch ASUS ProArt Display PA32UCX-P monitors, the team calibrated their screens so all the artists can see the same rendered color and details, even while working in different locations across the country.
The Wacom Cintiq Pro 24 pen displays speed up the 3D artists' workflow and provide a natural connection between the artist and the Unreal editor, both when moving scene elements around to build the 3D environment and when keyframing actors for animation.