VIRTUAL PRODUCTION

As films become imbued with more complex action, and invariably more complex visual effects, filmmakers are turning to new production techniques to imagine these scenes virtually before they are shot, and then to scout sets, interact live with CG assets and characters, and to shoot, revise, and iterate virtual scenes on the fly.

As VFX have grown to be a greater part of movies and television, there is a growing divide between what filmmakers can see through the camera on the live-action set and what they have to imagine will be added digitally many months later. Virtual production attempts to unite those two worlds in real time. Using game engines like Unreal and Unity, combined with high-powered graphics cards, camera tracking, and VR and AR, filmmakers can now create scenes across the physical and digital worlds together.

A new type of filmmaking

Virtual production empowers filmmakers to create shots based on what “feels” right, rather than staring at a computer and guessing where things should go when something isn’t working. And they can do it quickly, without spending hours moving heavy equipment.

The technology puts the tools of storytelling back in the hands of the filmmakers, rather than an army of technicians. This lets them explore ideas much faster, with intuition and a stronger creative vision.

Art of (LED Wall) Virtual Production Sets, Part Two: 'How you make one' –  fxguide
The full video explores shooting with real-time parallax; blending CG and real-world sets with set extension tools; using VR tools to scout, set dress, and measure environments; working collaboratively in live, multi-user sessions; controlling lighting and environment from an iPad; and using nDisplay to blend the output from multiple render nodes.

New worlds

Several films with hugely fantastical worlds and characters have taken advantage of virtual production techniques to get their images on the screen. 

Avatar represented a huge leap forward in virtual production (and several sequels are on the way). Ready Player One relied on virtual production to bring its OASIS world to life; it is in the OASIS that the avatars of the human characters interact. That virtual world was ‘filmed’ with motion-captured actors using a game engine-powered, real-time simulcam set-up that allowed director Steven Spielberg to ‘find’ the right shots, since the sets the actors occupied had already been built in rough form digitally. He could also revisit scenes afterwards, re-framing and re-taking shots even after the performances were done.

Ready Player One

Epic Games’ Unreal Engine is one of the significant players in this virtual production environment. It allows filmmakers, for example, not only to film a scene with motion-captured actors and a virtual camera, but also to introduce ray tracing into the real-time rendered output. I personally find the result astonishing.

This demonstration is a collaboration between Epic’s ray tracing experts, NVIDIA GPU engineers and the creative artistry of ILMxLAB.

The two major game engine makers, Epic Games (Unreal) and Unity, are certainly pushing virtual production; their engines are the backbone of many real-time rendering and virtual set production environments. It’s important to note for independent filmmakers that these engines are generally free to get started with.

Unreal Engine by Epic Games

Virtual Camera

A virtual ARRI Alexa with a 28mm lens, from I AM MOTHER

The Virtual Camera enables a user to drive a Cine Camera in Unreal Engine, emulating cinema cameras like ARRI or RED, using an iPad in a virtual production environment. With ARKit or an optical motion capture system such as Vicon or OptiTrack, the position and rotation of the iPad are broadcast wirelessly to the PC, with the PC sending video back to the iPad.
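To make the tracking loop concrete, here is a minimal Python sketch of broadcasting a device pose wirelessly to a render PC. The JSON-over-UDP packet format is purely illustrative: real systems (ARKit via Live Link, Vicon, OptiTrack) use their own protocols.

```python
import json
import socket
from dataclasses import dataclass, asdict

# Hypothetical pose packet: position in metres, rotation as a quaternion.
# Illustrative only -- not the wire format of any real tracking system.
@dataclass
class PosePacket:
    px: float  # position
    py: float
    pz: float
    qx: float  # rotation quaternion
    qy: float
    qz: float
    qw: float

def encode_pose(p: PosePacket) -> bytes:
    """Serialise a pose to a UDP-friendly JSON payload."""
    return json.dumps(asdict(p)).encode("utf-8")

def decode_pose(raw: bytes) -> PosePacket:
    """Reverse of encode_pose, run on the receiving PC."""
    return PosePacket(**json.loads(raw.decode("utf-8")))

def send_pose(sock: socket.socket, addr, p: PosePacket) -> None:
    """Fire-and-forget send; UDP suits high-rate, latest-wins pose data."""
    sock.sendto(encode_pose(p), addr)

# The device would call send_pose() every tracking frame (60+ Hz);
# here we just check the payload round-trips losslessly.
pose = PosePacket(px=1.0, py=1.5, pz=0.2, qx=0.0, qy=0.0, qz=0.0, qw=1.0)
assert decode_pose(encode_pose(pose)) == pose
```

UDP is the natural transport here because a dropped pose packet is harmless: the next frame supersedes it, and retransmission (as TCP would do) only adds latency.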

Example of selecting film format
SmartVCS: Intelligent Virtual Cinematography – Girish Balakrishnan

Camera settings such as Focal Length, Aperture, Focus Distance, and Stabilization can be adjusted using touch input. Additionally, the Virtual Camera can be used for taking high-res screenshots, setting waypoints, recording camera motion, and other tasks related to virtual production.
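As a sketch of how such touch-driven settings might be modelled on the sending side, here is a small Python example. The class, field names, and ranges are hypothetical, not Unreal Engine’s actual Virtual Camera API.

```python
from dataclasses import dataclass

# Hypothetical lens-setting model; the numeric ranges are illustrative
# physical limits, not values from Unreal Engine's API.
@dataclass
class LensSettings:
    focal_length_mm: float = 35.0   # assumed 12-300 mm zoom range
    aperture_f: float = 2.8         # f-stop, assumed f/1.0-f/22
    focus_distance_m: float = 3.0   # metres, minimum 0.1 m

    def clamp(self) -> "LensSettings":
        """Keep raw touch-input values inside plausible physical limits
        before they are applied to the virtual cine camera."""
        self.focal_length_mm = min(max(self.focal_length_mm, 12.0), 300.0)
        self.aperture_f = min(max(self.aperture_f, 1.0), 22.0)
        self.focus_distance_m = max(self.focus_distance_m, 0.1)
        return self

# A pinch gesture that overshoots the zoom range gets clamped:
s = LensSettings(focal_length_mm=500.0).clamp()
assert s.focal_length_mm == 300.0
```

Clamping on the input side keeps a stray gesture from driving the virtual lens to a setting no physical cine lens could reach.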

Tutorial on how to use an iPad or iPhone with Unreal Engine

The physical cinematography equipment of a film set is replicated in the virtual world and tracked so the two match exactly. This allows the director of photography to put their hands on the gear and react more naturally as the performances unfold.

The film’s assets are built in the game engine instead of constructing a physical movie set.

The history of Virtual Production

Its earliest uses and iterations can be traced through advancements and innovations in filmmaking technology. Peter Jackson’s The Lord of the Rings: The Fellowship of the Ring (2001) used virtual reality (VR) goggles and virtual cameras to plan camera moves.

James Cameron took it a step further with Avatar (2009), as seen in Image 1, creating a bioluminescent species and exotic environments with a motion capture stage and simulcam. Simulcam is a VP tool used to “superimpose virtual characters over the live-action in real-time and aid in framing and timing for the crew” (Kadner 2019).
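Simulcam’s real pipeline involves tracked cameras and a real-time renderer, but the per-frame compositing step it relies on is essentially the standard Porter-Duff “over” operator. A minimal Python sketch, illustrative only:

```python
def over(fg, fg_alpha, bg):
    """Porter-Duff 'over' for one pixel: composite a premultiplied-alpha
    CG foreground (fg) onto a live-action background (bg).
    All channel values are floats in [0, 1]."""
    return tuple(f + b * (1.0 - fg_alpha) for f, b in zip(fg, bg))

def composite_frame(cg, alpha, plate):
    """Composite a whole CG frame over the camera plate, pixel by pixel.
    cg and plate are 2D grids of (r, g, b) tuples; alpha is a matching
    2D grid of coverage values."""
    height, width = len(plate), len(plate[0])
    return [
        [over(cg[y][x], alpha[y][x], plate[y][x]) for x in range(width)]
        for y in range(height)
    ]

# A half-transparent red CG pixel over a blue live-action pixel:
assert over((0.5, 0.0, 0.0), 0.5, (0.0, 0.0, 1.0)) == (0.5, 0.0, 0.5)
```

A production system would do this on the GPU over millions of pixels per frame, but the math per pixel is the same: the CG contributes its premultiplied colour, and the live plate shows through in proportion to the remaining transparency.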

Jon Favreau continues to lead the charge with ground-breaking film and television projects such as The Jungle Book (2016), The Lion King (2019), and The Mandalorian (2019), designing and altering photorealistic environments in real time.

Image 1: Facial and motion capture in Avatar. Source: ComingSoon.net.
