What is Virtual Production?
Aniceto explains that virtual production is essentially real-time visual effects. When you’re looking through the camera, you’re seeing what the final shot will look like. In the past, filmmakers would shoot against a green screen and then replace the green background with a digital environment in post-production. With virtual production, everything happens in real time: actors can see the world they’re interacting with, and the director can visualize the final product as it’s being filmed. The approach relies heavily on previsualization (previs), allowing for a seamless and dynamic production process.
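To make that “what you see through the lens is the final shot” idea a little more concrete, here is a minimal, hypothetical sketch of the real-time loop behind it: every frame, the renderer reads the physical camera’s tracked pose, redraws the digital environment from that viewpoint, and pushes the result to the on-set display (such as an LED wall). All names here (read_tracked_pose, render_environment, send_to_led_wall) are illustrative placeholders, not any particular engine’s or tracking system’s API.

```python
# Minimal sketch of a real-time virtual production loop (illustrative only).
import time
from dataclasses import dataclass


@dataclass
class CameraPose:
    position: tuple          # camera position in studio space (meters)
    rotation: tuple          # pan, tilt, roll in degrees
    focal_length_mm: float   # drives the rendered field of view


def read_tracked_pose() -> CameraPose:
    """Placeholder for a camera-tracking feed (e.g. an optical tracking system)."""
    return CameraPose(position=(0.0, 1.6, -3.0), rotation=(0.0, 0.0, 0.0), focal_length_mm=35.0)


def render_environment(pose: CameraPose) -> bytes:
    """Placeholder: render the digital set from the physical camera's viewpoint."""
    return b"rendered-frame-pixels"


def send_to_led_wall(frame: bytes) -> None:
    """Placeholder: hand the rendered frame to the LED processor / wall."""


def run(frame_rate: float = 24.0, frames: int = 240) -> None:
    # Because the background is re-rendered from the live camera pose every
    # frame, whatever the camera sees on the wall is already the final shot.
    frame_time = 1.0 / frame_rate
    for _ in range(frames):
        pose = read_tracked_pose()
        frame = render_environment(pose)
        send_to_led_wall(frame)
        time.sleep(frame_time)


if __name__ == "__main__":
    run()
```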
5 Categories of Virtual Production
- Virtual Reality (VR) – A computer-generated environment that immerses users by creating realistic scenes and objects that make them feel like they are truly inside a different world. A great example of VR would be the Oculus headset. When you put on an Oculus, you’re transported into a fully virtual world, experiencing it as if it were real, thanks to the immersive visuals and interactive features.
- Mixed Reality (MR) – Blends the physical world with digital elements, allowing users to interact with both simultaneously. A great example of MR is the Apple Vision Pro headset. When wearing the headset, you can still see the real world around you, but with virtual objects or information layered on top. For instance, you could be looking at your physical surroundings while also interacting with a virtual screen, like writing a Word document, all seamlessly integrated into your environment. This combination of real and virtual elements enhances the way we work and experience digital content.
- Computer Generated Imagery (CGI) – Traditionally, CGI and green-screen work were done in post-production: actors were filmed in front of a green backdrop, and the backdrop was replaced with digitally created environments or elements after the shoot. With advancements in virtual production, much of this work is now done in-camera, meaning actors are immersed in the digital environment in real time as the scene is shot. The director, actors, and crew can see the finished effect while still on set, which improves the creative process and reduces the need for extensive post-production work. In-camera VFX enhances the realism and efficiency of filmmaking by enabling a more seamless integration of live-action and digital content.
- Augmented Reality (AR) – An interactive experience that enhances the real world by overlaying computer-generated elements onto a user’s view of their surroundings. A well-known example is Pokémon Go, where your phone’s camera adds virtual characters to the real world as you explore. In broadcast media, AR is often used when a presenter is on stage and a map or other visual elements appear on screen alongside them, enhancing the information being shared. Essentially, AR blends the physical and digital worlds, augmenting the user’s experience with additional layers of virtual content.
- Extended Reality (XR) – A comprehensive mix of all immersive technologies, combining virtual reality (VR), augmented reality (AR), and mixed reality (MR) into a single experience. While it’s harder to capture XR in a single image, it encompasses various forms of virtual production. A simple example is a person standing in front of an LED wall with a scene displayed on it: as the camera pans past the edge of the wall, an augmented extension of the scene continues beyond the physical panels, creating the illusion of a larger environment (a simple sketch of this set-extension idea follows this list). XR blends and expands the digital and physical worlds, enhancing the overall immersive experience.
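As a rough illustration of that set-extension idea, the sketch below keeps the camera’s own view wherever the physical LED wall fills the frame and fills everything beyond the wall’s edges with a rendered continuation of the same scene. It is a simplified, hypothetical example: the rectangular wall mask and all function names are assumptions, and a real system would derive the wall’s footprint from tracked camera and stage geometry.

```python
# Hypothetical sketch of XR set extension: keep in-camera LED-wall pixels,
# fill the area beyond the wall's edges with a rendered extension of the scene.
import numpy as np


def wall_mask(height: int, width: int, wall_box: tuple) -> np.ndarray:
    """Boolean mask of the pixels covered by the physical LED wall in this frame.

    wall_box is (top, left, bottom, right) in image coordinates; this simple
    rectangle stands in for real tracked camera/wall geometry.
    """
    mask = np.zeros((height, width), dtype=bool)
    top, left, bottom, right = wall_box
    mask[top:bottom, left:right] = True
    return mask


def extend_set(camera_frame: np.ndarray,
               rendered_extension: np.ndarray,
               wall_box: tuple) -> np.ndarray:
    """Composite: in-camera pixels where the wall is, rendered pixels elsewhere."""
    height, width, _ = camera_frame.shape
    mask = wall_mask(height, width, wall_box)[..., None]  # add a channel axis
    return np.where(mask, camera_frame, rendered_extension)


# Dummy 1080p example: the wall covers the centre of the frame.
camera_frame = np.full((1080, 1920, 3), 128, dtype=np.uint8)        # live camera feed
rendered_extension = np.full((1080, 1920, 3), 40, dtype=np.uint8)   # virtual scene extension
output = extend_set(camera_frame, rendered_extension, (200, 300, 900, 1600))
```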
Obstacles with LED for Virtual Production
Types of Virtual Production Stages
The presentation highlighted several virtual production stages built by Orbital Studios and other facilities across the U.S., showcasing the advanced capabilities they offer for film and commercial production. Orbital Studios, for example, developed a large main stage used in their short film “Goliath,” as well as a car process stage designed with a curved LED wall and ceiling. This setup enables car scenes to be filmed in a controlled studio environment while replicating a wide range of backdrops. As in recent car commercials, including those featuring Matthew McConaughey, the driver never actually hits the road but instead appears to drive through diverse landscapes projected on the LED walls.
Resolution Studios in Chicago offers LED walls of various sizes, including a larger one approximately 60 feet wide and 15 to 20 feet tall, which they use creatively for commercial rentals. Renovo Media Group’s stage setup includes modular LED panels, making it easier to monitor for pixel issues and ensure smooth visual displays.
T-Mobile’s production space takes a different approach, using an LED wall set at a 90-degree angle to maximize efficiency for in-store commercial shoots. This setup allows T-Mobile to film realistic store scenes without visiting individual locations, since scenes can be tailored to look like different stores using photographs captured at each one. The approach saves on production costs and provides logistical flexibility, with the team able to adapt visuals quickly.
The Planar team emphasized their nationwide support, with techs, salespeople, and engineers on hand to assist with any panel or production needs. Key team members were introduced, including Alex Zhu, the general manager of the virtual production team; John Foecking and Sean Dervin, application engineers; and colleagues from their sister company, OptiTrack. The team also invited attendees to explore the stage setup and interact with the camera, further demonstrating the power and flexibility of virtual production technology.
If you missed the session, don’t worry—you can watch the full recording here.