At Bluewater’s Tech Expo, attendees explored an engaging breakout session led by Andrew Aniceto, VPXR Business Development Manager at Planar. The session, titled “Filming with LED,” highlighted the thrilling possibilities that LED technology opens up for virtual production.

Aniceto began the session by showcasing the short film “Goliath.” The project, he explained, gave the Orbital Studios team a unique opportunity to push their capabilities even further, incorporating everything from intricate stunts to wire work. With properly tracked cameras, the digital environments blend seamlessly with practical sets and real effects, creating an immersive transition between the real and the imagined. This powerful combination exemplifies the level of realism and creative freedom that virtual production with LED technology can achieve.

Orbital Studios is dedicated to empowering filmmakers to ignite their imaginations and create at the speed of thought. Through innovative techniques, the team placed both the stage and the virtual world directly in the palm of the performer’s hand, letting her control her surroundings live on camera as she was swept into the sky by a towering, twenty-story giant. Orbital Studios continually pushes beyond what was once considered impossible, combining magic with technical expertise to pave the way for the filmmaking possibilities of tomorrow.

The “Goliath” short film was created at Orbital Studios using cutting-edge Planar LED walls and OptiTrack camera-tracking systems. These technologies enabled the team to seamlessly blend digital environments with practical effects, creating a captivating, immersive experience that brought the story’s visual elements to life.

What is Virtual Production?

Aniceto explained that virtual production is essentially real-time visual effects: when you look through the camera, you’re seeing what the final shot will look like. In the past, filmmakers would shoot against a green screen and replace the green background with a digital environment in post-production. With virtual production, everything happens in real time. Actors can see the world they’re interacting with, and the director can visualize the final product as it’s being filmed. The approach leans heavily on previsualization (previs), allowing for a seamless and dynamic production process.

5 Categories of Virtual Production

  • Virtual Reality (VR) – A computer-generated environment that immerses users by creating realistic scenes and objects that make them feel like they are truly inside a different world. A great example of VR would be the Oculus headset. When you put on an Oculus, you’re transported into a fully virtual world, experiencing it as if it were real, thanks to the immersive visuals and interactive features.
  • Mixed Reality (MR) – Blends the physical world with digital elements, allowing users to interact with both simultaneously. A great example of MR is the Apple Vision Pro headset. When wearing the headset, you can still see the real world around you, but with virtual objects or information layered on top. For instance, you could be looking at your physical surroundings while also interacting with a virtual screen, like writing a Word document, all seamlessly integrated into your environment. This combination of real and virtual elements enhances the way we work and experience digital content.
  • Computer Generated Imagery (CGI) – Traditionally, CGI and green screen work were done in post-production: actors were filmed in front of a green backdrop, which was later replaced with digitally created environments or elements. With advancements in virtual production, much of this work is now done in-camera, meaning actors are immersed in the digital environments in real-time as the scene is being shot. This allows the director, actors, and crew to see the finished effect while still on set, improving the creative process and reducing the need for extensive post-production work. In-camera VFX enhances the realism and efficiency of filmmaking by enabling a more seamless integration of live-action and digital content.
  • Augmented Reality (AR) – An interactive experience that enhances the real world by overlaying computer-generated elements onto a user’s view of their surroundings. A well-known example is the Pokémon Go game, where the camera on your phone adds virtual characters to the real world as you explore. In broadcast media, AR is often used when, for instance, a presenter is on stage and a map or other visual elements appear on the screen, enhancing the information being shared. Essentially, AR blends the physical and digital worlds, augmenting the user’s experience with additional layers of virtual content.
  • Extended Reality (XR) – A comprehensive mix of all immersive technologies, combining virtual reality (VR), augmented reality (AR), and mixed reality (MR) into a single experience. While it’s harder to capture a single image of what XR looks like, it encompasses various forms of virtual production. A simple example is when a person stands in front of an LED wall with a scene displayed. As the camera pans past the wall, an augmented scene extends beyond the physical wall, creating the illusion of a larger environment. XR blends and expands the digital and physical worlds, enhancing the overall immersive experience.

Obstacles with LED for Virtual Production

When using LED technology for virtual production, there are several key challenges to address. One of the main issues involves scan lines and moiré patterns, both of which can degrade the visual quality of the final product. LED walls can look different through a camera lens than they do to the human eye, largely because digital camera sensors are more sensitive to light, leading to visual discrepancies. Scan lines, for instance, appear as black bars and are especially visible on cameras with rolling shutters, which read the image sensor line by line and so are more prone to these artifacts. Global shutter cameras, which capture the entire frame simultaneously, reduce the likelihood of scan lines, and genlock can further help by synchronizing the camera with the LED wall’s refresh.
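
To make the genlock point concrete, here is a minimal timing sketch in Python (not Planar’s tooling, just the arithmetic): banding becomes much less likely when the wall’s refresh rate is an integer multiple of the camera frame rate and the exposure window spans whole refresh cycles. The 3840 Hz refresh rate used below is a common LED processor setting, assumed here for illustration.

```python
# Minimal sketch of the camera/LED timing check behind genlock.
# Not Planar's tooling; refresh and frame rates are illustrative.

def exposure_time_s(frame_rate_fps: float, shutter_angle_deg: float) -> float:
    """Exposure time in seconds for a given frame rate and shutter angle."""
    return (shutter_angle_deg / 360.0) / frame_rate_fps

def sync_report(led_refresh_hz: float, frame_rate_fps: float,
                shutter_angle_deg: float = 180.0) -> None:
    """Flag the two conditions that keep scan lines off camera."""
    exposure = exposure_time_s(frame_rate_fps, shutter_angle_deg)
    cycles = exposure * led_refresh_hz          # refresh cycles per exposure
    rate_locked = led_refresh_hz % frame_rate_fps == 0
    whole_cycles = abs(cycles - round(cycles)) < 1e-6
    print(f"{led_refresh_hz:.0f} Hz wall, {frame_rate_fps} fps, "
          f"{shutter_angle_deg:.0f} deg shutter: "
          f"{cycles:.2f} cycles per exposure | "
          f"rate locked: {rate_locked} | whole cycles: {whole_cycles}")

# 3840 Hz wall at 24 fps with a 180-degree shutter: exactly 80 refresh
# cycles per exposure, so banding is unlikely.
sync_report(3840, 24)
# The same wall at 25 fps drifts (76.80 cycles), so without genlock the
# shutter beats against the refresh and scan lines can appear.
sync_report(3840, 25)
```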

Moiré patterns, on the other hand, result from interference between the camera’s sensor grid and the LED panel’s pixel grid. The effect is particularly pronounced when the two grids don’t align, producing visible artifacts. To mitigate moiré, optical low-pass filters can be used; they soften the image slightly but help suppress the unwanted patterns. Adjusting the camera’s angle or focus can also alleviate the effect, as can moving to a finer pixel pitch on the LED wall, ideally 1.5mm or lower.
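
As a rough illustration of the grid interference itself, the sketch below estimates how many sensor photosites one LED pixel covers once the lens projects it onto the sensor, using a thin-lens approximation. The 6-micron photosite pitch and the one-to-three-photosite “danger zone” are assumptions chosen for illustration, not figures from the session.

```python
# Rough moire-risk estimate: moire tends to appear when the projected
# LED pixel grid lands near the sensor's own sampling grid, i.e. when
# one LED pixel covers only a few photosites. All numbers here are
# illustrative assumptions, not measurements from the talk.

def photosites_per_led_pixel(led_pitch_mm: float, focal_length_mm: float,
                             distance_m: float,
                             photosite_pitch_um: float = 6.0) -> float:
    """Photosites covered by one LED pixel, thin-lens magnification f/d."""
    magnification = focal_length_mm / (distance_m * 1000.0)
    imaged_pitch_um = led_pitch_mm * 1000.0 * magnification
    return imaged_pitch_um / photosite_pitch_um

for distance_m in (4.0, 8.0, 24.0):
    n = photosites_per_led_pixel(2.6, focal_length_mm=50.0,
                                 distance_m=distance_m)
    # Assumed rule of thumb: roughly 1-3 photosites per LED pixel puts
    # the sensor near its Nyquist limit for the panel grid.
    verdict = "likely" if 1.0 <= n <= 3.0 else "less likely"
    print(f"2.6mm wall, 50mm lens, {distance_m:.0f}m: "
          f"{n:.1f} photosites per LED pixel -> moire {verdict}")
```

In this toy model the risk peaks in a middle band of camera-to-wall distances; in practice, crews also count on the wall simply falling out of focus behind the talent, which is part of why the distance guidelines discussed next help.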

Choosing the appropriate pixel pitch, and the distance between the talent and the LED wall, are also essential considerations. Lower pixel pitches provide higher image quality but pack in more pixels to drive, raising processing demands and cost. For instance, a 2.6mm pixel pitch might be more affordable, but it requires talent to stand farther from the wall, around 11-14 feet, to avoid artifacts. Testing and setting guidelines based on pixel pitch and distance can significantly improve visual results and streamline virtual production.
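
The 2.6mm example above can be stretched into a rough linear rule of thumb. The sketch below back-solves feet-per-millimeter factors from that single data point; treat the output as an extrapolation for planning conversations, not an official Planar guideline.

```python
# Hedged rule of thumb scaled from the talk's single example:
# a 2.6mm pitch wants roughly 11-14 feet between talent and wall.
# The per-millimeter factors are back-solved from that figure,
# not an official specification.

FEET_PER_MM_LOW = 11.0 / 2.6    # ~4.2 ft per mm of pixel pitch
FEET_PER_MM_HIGH = 14.0 / 2.6   # ~5.4 ft per mm of pixel pitch

def talent_distance_ft(pixel_pitch_mm: float) -> tuple[float, float]:
    """Estimated (low, high) safe talent-to-wall distance in feet."""
    return (pixel_pitch_mm * FEET_PER_MM_LOW,
            pixel_pitch_mm * FEET_PER_MM_HIGH)

for pitch_mm in (1.5, 2.6, 3.9):
    low, high = talent_distance_ft(pitch_mm)
    print(f"{pitch_mm}mm pitch: keep talent roughly "
          f"{low:.0f}-{high:.0f} ft from the wall")
```

Finer pitches let talent work closer to the wall, which is exactly the trade-off against the extra processing cost noted above.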

Types of Virtual Production Stages

The presentation highlighted several virtual production stages built by Orbital Studios and other facilities across the U.S., showcasing the advanced capabilities they offer for film and commercial production. Orbital Studios, for example, developed the large main stage used in its short film “Goliath,” as well as a car process stage designed with a curved LED wall and ceiling. This setup enables car scenes to be filmed in a controlled studio environment against a wide range of backdrops. As in recent car commercials, including those featuring Matthew McConaughey, the driver never actually hits the road but instead appears to drive through diverse landscapes displayed on the LED walls.

Resolution Studios in Chicago offers LED walls of various sizes, including a larger one approximately 60 feet wide and 15-20 feet tall, which it uses creatively for commercial rentals. Renovo Media Group’s stage uses modular LED panels, making it easier to monitor for pixel issues and keep the displays running smoothly.

T-Mobile’s production space takes a different approach, using a 90-degree angled wall to maximize efficiency for in-store commercial shoots. The setup lets T-Mobile film realistic store scenes without visiting individual locations: by photographing each store, the team can tailor the backdrop to match any of them. This saves on production costs and provides logistical flexibility, with the team able to adapt visuals quickly.

The Planar team emphasized their nationwide support, with techs, salespeople, and engineers on hand to assist with any panel or production needs. Key team members were introduced, including Alex Zhu, the general manager of the virtual production team; John Foecking and Sean Dervin, application engineers; and colleagues from their sister company, OptiTrack. The team also invited attendees to explore the stage setup and interact with the camera, further demonstrating the power and flexibility of virtual production technology.

If you missed the session, don’t worry—you can watch the full recording here.