What is Virtual Production? Part 1

30 September 2022


Virtual Production (VP) is a current buzzword in the film and entertainment industry whose popularity has risen sharply since its mainstream introduction with The Mandalorian in 2019. In this three-part blog we’ll discuss what virtual production includes, explore the different types of virtual production, and examine the critical role motion capture technology plays in virtual production workflows.

The Mandalorian (2019). Image from ILM.

Definition of Virtual Production:

Leading studios and developers in the virtual production space define virtual production in a variety of ways. Weta states that virtual production is “where the physical and digital worlds meet”, whereas Unreal Engine describes it as “a broad spectrum of computer-aided production and visualization filmmaking methods”3,7. These definitions are somewhat ambiguous because of the vastly different ways in which filmmakers can engage with technology during the various stages of a production.

Computer-aided production?! Isn’t that just VFX?

Although these definitions may sound like traditional visual effects (VFX) in filmmaking, an important distinction between traditional VFX and VP is the real-time workflow. Traditional VFX workflows rely heavily on chroma keying against green/blue screens, with a strong emphasis on post-production.
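To make the chroma-keying idea concrete, here is a minimal sketch of a keyer. This is purely illustrative: the green-dominance test, the `margin` threshold, and the `chroma_key` helper are assumptions for demonstration, not any studio’s actual keying pipeline, which uses far more sophisticated mattes.

```python
import numpy as np

def chroma_key(frame: np.ndarray, background: np.ndarray, margin: int = 40) -> np.ndarray:
    """Replace 'screen' pixels (green dominates red and blue) with the background."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    screen = (g - np.maximum(r, b)) > margin  # boolean matte: True where the screen shows
    out = frame.copy()
    out[screen] = background[screen]          # composite the background into matte pixels
    return out

# Tiny 2x2 RGB example: the two green-dominant pixels get replaced.
frame = np.array([[[0, 255, 0], [200, 50, 50]],
                  [[10, 240, 20], [30, 30, 30]]], dtype=np.uint8)
background = np.full((2, 2, 3), 128, dtype=np.uint8)
print(chroma_key(frame, background))
```

Because this keying and compositing traditionally happens in post, not on set, the creatives filming in front of the screen never see the composited result at shoot time, which is exactly the pain point described next.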

The traditional workflow has been a pain point for many creatives on set because they cannot see a final-pixel result of their on-set work until deep into the post-production process3. Reaching final pixel in the post-production pipeline often takes many months, and this delay creates uncertainty during production because guesswork is often required3.

Some examples:

  • The cinematographer may have to “guess the lighting of the unseen green screen element”3.
  • The director may not know exactly what the creature behind the green screen looks like3.
  • The actors are traditionally required to envision what is beyond the green screen, and these imagined end results likely differ from actor to actor, causing a disconnect.

Avengers: Endgame (2019). Images from AWN.

But I want to see it NOW!!!

This green-screen pain point, of everyone having to imagine the scene, has pushed filmmakers towards technology capable of displaying real-time visualizations. Innovations in LED panels, video game engines, and motion capture tracking have made live virtual backgrounds possible.

Thor: Love and Thunder (2022). Image from ILM.

  1. Video game engines capable of powerful real-time physics and lighting simulations that can be projected onto LED screens.
  2. LED screens with smaller pixel pitches and accurate colour output.
  3. Motion capture systems that accurately track physical camera movements and translate them to virtual cameras in real time.

These innovations are powerful because they can all be manipulated in real time. The game engine’s physics and lighting simulations mimic their real-life counterparts, giving the 3D world convincing qualities, and they let directors request on-set changes such as moving a tree, shifting the sun, or altering the wind direction.

LED screens with increasingly small pixel pitches and accurate colour profiles allow the panels to be captured seamlessly in-camera. This minimizes colourist corrections in post and reduces the chance of lighting artifacts from the LEDs.
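A common industry rule of thumb (an assumption here, not an official specification) is that the minimum distance at which individual LEDs stop being visible, in metres, is roughly the panel’s pixel pitch in millimetres. A trivial sketch of that heuristic:

```python
def min_viewing_distance_m(pixel_pitch_mm: float) -> float:
    """Rule-of-thumb minimum camera-to-wall distance for a given pixel pitch.

    Approximation only: metres of distance ~= millimetres of pitch.
    Real stages also account for lens choice, moire, and panel brightness.
    """
    return pixel_pitch_mm

for pitch in (1.5, 2.8, 5.0):
    print(f"{pitch} mm pitch -> keep the camera at least ~{min_viewing_distance_m(pitch):.1f} m away")
```

This is why ever-smaller pitches matter for ICVFX: they let the physical camera move closer to the wall before the LED grid itself becomes visible in-frame.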

Finally, improvements in mocap camera tracking allow for further immersion: the camera is tracked accurately in real time, preventing jitter and ensuring correct movement of the virtual background, so the background’s perspective remains correct for the camera position and lens configuration.

Thor: Love and Thunder (2022). Image from ILM.

So Virtual Production is just big LED walls and tracked cameras?

Pioneered by The Mandalorian (2019), the most well-known method of virtual production is in-camera visual effects (ICVFX). ICVFX is a virtual production method in which a traditional VFX blue or green screen is replaced with LED panels that play a virtual background in real time. These panels display images from a video game engine to create a live background that actors can perform in front of while being filmed by a physical camera. This process requires tracking the physical real-life camera (or cameras) on the stage; the camera movements are translated to the 3D camera(s) in the digital world to create a seamless match between the physical and digital worlds. The resulting depth-dependent shift of the background, known as parallax, is vital to pulling off the illusion of making real-life characters look like they are in the virtual environment.
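The parallax effect can be sketched with a simple pinhole-camera model: when the tracked camera translates sideways, near objects shift far more in the image than distant ones, and the virtual background must reproduce exactly that shift. All numbers below are illustrative assumptions, not values from any real production pipeline.

```python
def project_x(point_x: float, point_z: float, cam_x: float, focal: float = 35.0) -> float:
    """Horizontal image coordinate of a point under a simple pinhole model."""
    return focal * (point_x - cam_x) / point_z

near = (0.0, 2.0)   # a prop 2 m from the camera
far = (0.0, 50.0)   # a background element 50 m away

# Move the camera 0.5 m to the right and compare the image-space shifts.
near_shift = project_x(*near, cam_x=0.5) - project_x(*near, cam_x=0.0)
far_shift = project_x(*far, cam_x=0.5) - project_x(*far, cam_x=0.0)

print(f"near object shifts {near_shift:.2f} image units, far object shifts {far_shift:.2f}")
# The near prop shifts 25x more than the distant background for the same move.
```

If the tracked camera pose fed to the game engine lags or jitters, this depth-dependent shift breaks and the LED wall immediately reads as a flat screen, which is why accurate real-time camera tracking is central to ICVFX.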

Parallax examples:

Virtual production examples | ILM The Mandalorian Season Two:

Although ICVFX is the best-known form of virtual production, it is not the only one. Epic Games defines four main types of virtual production2:

  1. Visualisation
  2. Performance Capture
  3. Hybrid Green Screen Live
  4. Full Live LED Wall (ICVFX)

In the next blog post, Part 2, we will take a deeper dive into these variations.


  1. Hogg, T. (2019). Weta and Thanos come full circle in ‘Avengers: Endgame’. AWN. Retrieved 27 September 2022, from https://www.awn.com/vfxworld/weta-and-thanos-come-full-circle-avengers-endgame?fbclid=IwAR15nco4YPZHn-wNEfln-0tQ3xTo1IeWdoWWV56YRkHbpgvna0hUuROO9so
  2. Kadner, N. (2019). The virtual production field guide, volume 1. Unreal Engine. Retrieved 13 September 2022, from https://cdn2.unrealengine.com/vp-field-guide-v1-3-01-f0bce45b6319.pdf
  3. Kadner, N. (2021). The virtual production field guide, volume 2. Unreal Engine. Retrieved 13 September 2022, from https://cdn2.unrealengine.com/Virtual+Production+Field+Guide+Volume+2+v1.0-5b06b62cbc5f.pdf
  4. Rao, N. (2021). What is virtual production? CG Spectrum. Retrieved 12 September 2022, from https://www.cgspectrum.com/blog/what-is-virtual-production
  5. Unreal Engine. (2022). The future of filmmaking is here. Unreal Engine. Retrieved 12 September 2022, from https://www.unrealengine.com/en-US/virtual-production
  6. Vicon. (2021). Vicon x ILM. Vicon. Retrieved 12 September 2022, from https://www.vicon.com/resources/case-studies/vicon-x-ilm/
  7. Weta FX. (2022). Virtual production is where the physical and digital worlds meet. Retrieved 14 September 2022, from https://www.wetafx.co.nz/research-and-tech/technology/virtual-production/
