Asher Vast

Virtual Productions and the future of the Sound Stage.

Updated: Jun 15, 2020

I have been a filmmaker for about 8 years now, and one of the most common challenges we face as filmmakers, time after time, is finding quality locations to film at. In a corporate setting, if a client is tasked with finding a filming location, you might be provided with a board room, or worse. In a narrative setting, you may have a small budget and lack the funding to secure the locations you need, so you get stuck with a subpar location that brings its own set of challenges.

You could have the best camera and lighting equipment money can buy and be the best of the best at your craft, but at the end of the day, your end product will be boring at best if you don't have quality locations to shoot in. This is where Virtual Productions come in.

Virtual Productions are already taking hold on both large productions and indie sets. James Cameron built one of the first iterations of this tech for Avatar back in 2009, but it wasn't until 2014, with the release of Unreal Engine 4, that these kinds of tools started to become available to the public. For anyone new to this technology, a virtual production (also referred to as real-time compositing) is basically where you film a real person in front of a green screen while using a computer and a software engine such as Unreal Engine 4 to place a 3D room, or 'environment', around the person.

This allows for big-production looks with practically endless scene possibilities that you can see in the monitor in real time as you shoot.

Furthermore, you can even use a motion-tracking device that sits on the camera and adapts the scene around the character based on the camera's movement in real time, so the 3D room moves as you would expect based on what the camera is doing. But enough talking; let's look at what a virtual production looks like.
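To make the tracking idea concrete, here's a minimal sketch (not Unreal's actual API, and the `update_virtual_camera` function and calibration offset are hypothetical): each frame, the tracker's pose is read and mirrored onto the virtual camera so the 3D environment moves in lockstep with the physical camera.

```python
import numpy as np

def update_virtual_camera(tracker_position, tracker_rotation, offset=np.zeros(3)):
    """Mirror the physical camera's pose onto the virtual camera.

    tracker_position: (x, y, z) reported by the motion tracker, in meters.
    tracker_rotation: 3x3 rotation matrix reported by the tracker.
    offset: hypothetical calibration offset between the tracker mount
            and the camera's lens (zero here for simplicity).
    """
    virtual_position = np.asarray(tracker_position, dtype=float) + offset
    virtual_rotation = np.asarray(tracker_rotation, dtype=float)
    return virtual_position, virtual_rotation

# Each frame: read the tracker, move the virtual camera to match.
pos, rot = update_virtual_camera((1.0, 1.5, 0.2), np.eye(3))
```

In a real setup this loop runs at the engine's frame rate, and the offset comes from a calibration step, since the tracker sits on top of the camera rather than at the lens itself.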

Here's an example of a virtual production where they're using the Unreal Engine and a camera motion tracker (HTC Vive Tracker).

Here's how it can be used in an interview setting.

Here's an example of the Weather Channel using Unreal for their Virtual Production.

Equipment and Software Needed:

  • Camera with SDI Output $1K+

  • Vive Pro (Base Station 2.0) $200

  • Vive Tracker $100

  • Modern Computer with Fast GPU and CPU $2K+

  • Blackmagic Design Decklink 8K Pro $650

  • Green Screen $50+

  • Unreal Engine 4 Free
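To show what the green screen is actually doing in this pipeline, here's a crude, hypothetical chroma-key in Python with NumPy. A real-time compositor does this on the GPU with far more sophisticated keying, but the core idea is the same: pixels where green clearly dominates get replaced with the 3D environment.

```python
import numpy as np

def chroma_key(foreground, background, green_threshold=1.3):
    """Replace strongly green pixels of `foreground` with `background`.

    foreground, background: HxWx3 float arrays (RGB, values 0..1).
    A pixel is keyed out when its green channel exceeds both red and
    blue by the threshold factor -- a crude stand-in for a real keyer.
    """
    r, g, b = foreground[..., 0], foreground[..., 1], foreground[..., 2]
    mask = (g > green_threshold * r) & (g > green_threshold * b)
    composite = foreground.copy()
    composite[mask] = background[mask]
    return composite

# Tiny example: a pure-green pixel gets the background, a gray pixel stays.
fg = np.array([[[0.0, 1.0, 0.0], [0.5, 0.5, 0.5]]])
bg = np.array([[[0.2, 0.2, 0.8], [0.2, 0.2, 0.8]]])
out = chroma_key(fg, bg)
```

This is also why even lighting on the green screen matters: shadows push pixels below the threshold and leave green fringes in the composite.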

The use cases for Virtual Productions are basically unlimited but some areas where it's particularly useful are:

  • Product Demos

  • Music Videos

  • Interviews

  • Short and Feature Films

  • Commercials

Models are also an important part of your Virtual Production: in many cases you'll be using a prefabricated 3D environment model that you place your actor in. You could build your own, but unless you're an experienced 3D designer, you'll probably be better off purchasing a professional product. There are online marketplaces for these environments where you can buy them individually or even in packs.
