Are you ready for the future of filmmaking? Virtual Production is the buzzword on everyone’s lips, but before you jump in, it’s important to understand the different workflows and technologies involved. Don’t be caught saying, “You know, the Mandalorian thing,” in 2023. Let’s break it down.
Firstly, Virtual Production isn’t just about the LED wall made famous by The Mandalorian. It encompasses a range of techniques that combine virtual and physical filmmaking. Previsualization (Previs) is the process of creating a virtual version of a film before shooting, allowing filmmakers to stage action and block camera angles. Techvis is similar, but focuses on planning the shoot to ensure it’s possible to film what the director wants. Postvis generates temporary visual effects after principal photography, providing VFX studios with a better understanding of the director’s intent.
In-Camera VFX (ICVFX) is what most people think of when they hear Virtual Production. It includes LED volume and greenscreen workflows, where visual effects work is done before the shoot and projected onto LED panels or composited in real-time. LED volume workflows project the CG environment behind actors and foreground set pieces in real-time, creating a natural integration of foreground and digital background. Greenscreen ICVFX removes the expense of an LED display wall while still providing real-time feedback of the final shot.
Finally, the camera frustum and spherical projection are techniques used to preserve parallax, so objects appear to shift against their background when the camera views them from different positions. A spherical projection of the environment is generated from the center of the stage space and displayed across the LED walls, providing background lighting and a source for scene reflections. That projection won’t match the exact perspective of the camera filming the scene, however, so a camera frustum view is rendered specifically for the filming camera’s point of view. The frustum is projected slightly wider than the area actually captured by the lens and feathered into the surrounding spherical projection. The result is a seamless blend of lighting, reflections, and correct in-camera perspective.
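To make the overscan idea concrete, here is a minimal sketch of widening the rendered frustum beyond the lens’s actual field of view; the 10% overscan, sensor width, and focal length are illustrative values, not taken from any particular system.

    #include <cmath>
    #include <cstdio>

    const double kPi = 3.14159265358979323846;

    // Horizontal field of view for a simple pinhole model of the lens.
    double HorizontalFovDeg(double sensorWidthMm, double focalLengthMm) {
        return 2.0 * std::atan(sensorWidthMm / (2.0 * focalLengthMm)) * 180.0 / kPi;
    }

    int main() {
        const double lensFov  = HorizontalFovDeg(36.0, 50.0); // ~39.6 degrees
        const double overscan = 1.10;                         // render 10% wider

        // Widen the half-angle tangent rather than the angle itself, so the
        // overscan stays proportional to the image plane.
        const double halfRad = lensFov * kPi / 360.0;
        const double wideFov = 2.0 * std::atan(overscan * std::tan(halfRad)) * 180.0 / kPi;

        // The strip between lensFov and wideFov is what gets feathered into
        // the spherical projection, hiding any seam as the camera moves.
        std::printf("lens FOV: %.1f deg, rendered frustum: %.1f deg\n", lensFov, wideFov);
        return 0;
    }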
DMX lighting is another crucial element of virtual production. The technology dates to the 1980s and has survived thanks to its bulletproof design; most professional production LED light fixtures include DMX controls. Pixel mapping is an emerging technique that takes a view of the virtual world and converts it to DMX signals to drive a bank of DMX lights. This creates a low-resolution LED volume that can be interactively positioned and exhibits far less metamerism than light cast from an LED wall.
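A minimal sketch of the pixel-mapping step, assuming one RGB fixture per three DMX channels; the fixture layout is illustrative, and a real rig would ship the resulting universes over a transport such as Art-Net or sACN.

    #include <array>
    #include <cstdint>
    #include <vector>

    struct RGB { uint8_t r, g, b; };

    // One DMX universe carries 512 channels, so it fits up to 170 RGB fixtures.
    using DmxUniverse = std::array<uint8_t, 512>;

    std::vector<DmxUniverse> PixelMap(const std::vector<RGB>& fixturePixels) {
        std::vector<DmxUniverse> universes;
        for (size_t i = 0; i < fixturePixels.size(); ++i) {
            if (i % 170 == 0) universes.push_back(DmxUniverse{}); // start a new universe
            const size_t ch = (i % 170) * 3;
            universes.back()[ch + 0] = fixturePixels[i].r;
            universes.back()[ch + 1] = fixturePixels[i].g;
            universes.back()[ch + 2] = fixturePixels[i].b;
        }
        return universes;
    }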
Driving high-resolution LED wall displays used to require niche and expensive dedicated playback servers with proprietary software. But now, real-time rendering in generalized game engines like Unity and Unreal Engine has changed everything. Unreal Engine, in particular, has become a generalized platform for digital content creation. Even better, it’s completely free to use for virtual production and other applications. Unity is also pursuing virtual production, but recent layoffs have raised questions about their commitment to the market.
In conclusion, virtual production is a fascinating and rapidly evolving field. By combining cutting-edge technologies like spherical projection, DMX lighting, and real-time game engines, filmmakers can create stunning and immersive virtual worlds that look amazing on camera.
Unleashing the Power of Unreal Engine
Unreal Engine is a game engine that has taken the virtual production world by storm. Its visual scripting language, Blueprints, lets non-programmers create custom behaviors and tools without engaging a full development team. Unreal is written in C++ and the full source code is available, so developers can extend it with additional C++ and even modify the base code if needed. Epic is constantly working to increase realism while rendering at 60 frames per second and beyond.
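As a taste of that extensibility, here is a minimal sketch of a custom C++ actor exposing a function to Blueprints; the class, module macro, and function are hypothetical examples, not part of the engine itself.

    // MyVPActor.h -- a hypothetical actor in a UE5 C++ project.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "MyVPActor.generated.h"

    UCLASS()
    class MYPROJECT_API AMyVPActor : public AActor
    {
        GENERATED_BODY()

    public:
        // BlueprintCallable surfaces this as a node in the visual scripting
        // graph, so artists can drive it without touching C++.
        UFUNCTION(BlueprintCallable, Category = "Virtual Production")
        void SetTimeOfDay(float Hour);
    };

    // MyVPActor.cpp
    #include "MyVPActor.h"

    void AMyVPActor::SetTimeOfDay(float Hour)
    {
        // A real implementation might rotate a sun light or update sky
        // parameters; left as a stub in this sketch.
    }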
The new combination of the Lumen lighting system and the Nanite polygonal streaming system in Unreal Engine 5 enables physically based rendering (PBR) at quality levels previously unseen in real-time gaming. Unreal also includes software called nDisplay, designed to drive multiple synchronized clusters of real-time rendering servers. This allows separate sections of an LED wall to be rendered in tandem at high resolution.
For all its power, Unreal Engine is first and foremost a game engine. As such, many of the workflows for Virtual Production are a little “clunky.” This has led many systems integrators to build an additional interface layer on top of Unreal Engine, simplifying the workflow for end users. Keep this in mind if you expect to simply “plug in” Unreal Engine and start making movies.
Notch: The Streamlined Authoring Environment
Notch is a dedicated real-time media server system that has been around for a long time in the live entertainment space. Notch’s main appeal over Unreal is its streamlined authoring environment. Unlike Unreal Engine, which is first and foremost a game design tool, Notch is tailored to artists designing live experiences. The learning curve is much gentler, and the artist isn’t fighting an interface cluttered with irrelevant toolsets and a workflow paradigm focused on creating packaged games.
On the flip side, Notch lacks Unreal Engine’s expansive features and almost limitless expansion capabilities. Notch tends to be run through TouchDesigner or Disguise rather than operating as a standalone media server environment.
TouchDesigner: The Prototyping Tool
TouchDesigner is another node-based programming tool and media server for creatives. TouchDesigner seems to have made less of an impact as a core real-time engine for running virtual production systems, but it has found plenty of use as a tool for quickly prototyping control interfaces that adjust various aspects of a virtual production set: time-of-day lighting adjustments, toggling hero camera screen displays, switching setups and scenes, and so on.
Disguise: The Turnkey Solution
A Disguise server is a powerful hardware device designed specifically for real-time rendering and playback of high-resolution video content. Its primary use is in live events, concerts, broadcasts, and immersive experiences where real-time visual effects and interactive elements are crucial.
Developed by the company Disguise, the Disguise server is known for its ability to handle large-scale video processing and manipulation. It’s equipped with a combination of high-end GPUs, CPUs, and large amounts of RAM to efficiently handle real-time rendering and playback tasks.
Disguise supports both Unreal Engine and Notch as the real-time engine to drive the hardware. Disguise also provides enhanced features like digital set extensions.
StageCraft: ILM’s Proprietary Real-Time Engine
StageCraft is ILM’s proprietary real-time engine for virtual production and specifically ICVFX. Unreal Engine was used for the first season of The Mandalorian but was replaced by StageCraft in season two. Evidently ILM felt that an internal tool streamlined for virtual production was a better fit than a generalized game development platform. It will be interesting to see if that investment pays off long-term. ILM has bet against commoditized general platforms in the past and lost; its internal compositing tool CompTime was replaced first by Shake, and then ultimately by Nuke. There have already been reports of compatibility issues with teams working on assets in Unreal and then needing to convert them for use in StageCraft.
Genlock: The Critical Technology for LED Walls
Part of the magic of displaying content on LED walls is distributing the workload across multiple workstations rendering the scene. It’s critical that the frame of video rendered by each computer is displayed at exactly the same time as the others. If not, spatial lag, flicker, or tearing of the frame may result.
Genlock (generator locking) is actually an old technology from the analog video days that has found new life in the era of virtual production. Unlike your run-of-the-mill VITC or LTC timecode generator, a genlock typically uses a tri-level sync pulse to force all frames in a system to align to exactly the same time.
The timing alignment is so critical in an LED volume that the length of sync cables needs to be taken into account and delay compensated in order to ensure that everything fires in perfect synchronicity.
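For a sense of the scale involved, here is a back-of-the-envelope sketch; the 0.66 velocity factor is typical for coax cable and assumed here, as is the 100 m run.

    #include <cstdio>

    int main() {
        // Sync pulses travel through coax at roughly 2/3 the speed of light.
        const double speedOfLightMps = 299792458.0;
        const double velocityFactor  = 0.66;   // typical coax, assumed
        const double cableLengthM    = 100.0;  // illustrative run

        const double delayNs =
            cableLengthM / (speedOfLightMps * velocityFactor) * 1e9;

        // Roughly 505 ns over 100 m: tiny next to a 41.7 ms frame at 24 fps,
        // but enough to skew scan timing between wall sections, which is why
        // sync generators offer per-output delay offsets to compensate.
        std::printf("propagation delay over %.0f m: %.0f ns\n",
                    cableLengthM, delayNs);
        return 0;
    }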
Virtual Art Department: The Crew Behind the Scenes
Virtual Art Department is the name given to the crew responsible for designing virtual sets, as well as sourcing, scanning, modeling, and texturing the digital props that will appear in the scene. In many ways this is simply a fancy name for something that’s been taking place in animation and game design for decades: modeling, texturing, and scene layout. The “new name” is very much an effort to engage traditional art directors in the hybrid digital process of designing sets for virtual production.
Are you curious about the digital dark arts of virtual production? Production designers and art directors have been using everything from SketchUp to CAD software to design and lay out live action sets for years. But with virtual production, the layout created in the computer is the final product, not just a representation of what will ultimately be built as a physical set.
What sets virtual production apart from traditional animation and game design is the blending of the virtual and the practical. Virtual production sets include practical foreground stage props that need to blend seamlessly with virtual props. Reality capture is one way to achieve this, creating digital replicas of real-world prop pieces and matching the color and lighting of virtual background objects to the physical ones on the practical foreground set.
As AI and imaging tech continue to improve, physical props may become less important, with digital foreground elements being superimposed over actors. However, actors will always perform best with real set pieces to interact with, so it’s unlikely that we’ll ever see a complete shift to entirely virtual set design.
The concept of virtual production is still in its infancy, with little standardization of workflow and virtual prop formats. Unreal Engine’s “packaged game” paradigm makes asset control and iteration difficult, but the Pixar-created open standard USD schema may be the long-term solution. Current production pipelines leverage version control systems like Perforce and Git to help tame the asset creation pipeline.
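As a hint of what USD interchange looks like in practice, here is a minimal sketch using Pixar’s USD C++ API; the file and prim names are illustrative, and building it requires the pxr libraries.

    #include <pxr/usd/sdf/path.h>
    #include <pxr/usd/usd/stage.h>
    #include <pxr/usd/usdGeom/xform.h>

    int main() {
        // Create a new USD layer and define a transformable prim for a prop.
        // Any USD-aware tool (DCC apps, engines with USD plugins) can then
        // open, reference, or layer this asset into its own scene.
        pxr::UsdStageRefPtr stage = pxr::UsdStage::CreateNew("setPiece.usda");
        pxr::UsdGeomXform::Define(stage, pxr::SdfPath("/SetPiece"));
        stage->GetRootLayer()->Save();
        return 0;
    }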
To perfectly marry the practical set to the virtual one, the LED wall’s base needs to be color corrected to match the live action foreground floor. Edge blending is used to dial in a feathered strip of pixels at the edge of the LED panels to match any local variation of color and contrast in the practical set.
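A minimal sketch of the edge-blending idea, as a per-row linear feather toward a color measured on the practical floor; the strip width and blend shape are illustrative, since real systems expose far finer controls.

    struct RGB { float r, g, b; };

    // Blend the bottom featherRows rows of the wall toward the color measured
    // on the practical floor; t ramps from 0 to 1 approaching the wall's base.
    RGB FeatherTowardFloor(RGB wallPixel, RGB floorColor,
                           int rowsFromBase, int featherRows) {
        if (rowsFromBase >= featherRows) return wallPixel; // outside the strip
        float t = 1.0f - static_cast<float>(rowsFromBase) / featherRows;
        return { wallPixel.r + (floorColor.r - wallPixel.r) * t,
                 wallPixel.g + (floorColor.g - wallPixel.g) * t,
                 wallPixel.b + (floorColor.b - wallPixel.b) * t };
    }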
Reality capture is the process of converting a real, physical object into a digital representation that looks indistinguishable from the real thing when filmed by a virtual camera. There are several methods for capturing real-world objects, including photogrammetry, NeRF object capture, and LIDAR. Each method has its strengths and limitations; LIDAR is typically used to capture entire scenes, and rigs range from $20,000 into six figures for extreme quality and precision of capture.
Virtual production is still the Wild West, but with the right tools and techniques, the possibilities are endless.

Looking to capture objects without breaking the bank? Desktop scanners use laser technology to create detailed 3D models of smaller objects, making them a budget-friendly complement to LIDAR. And if you’re worried about file size or mesh deformation, 3D wrapping techniques like those found in the Wrap software can retopologize heavy scans into clean, animation-ready meshes.

There are plenty of other terms worth knowing in the virtual production landscape, from Brainbar (the collection of computers used to operate the system) to LED panel pixel pitch (the spacing between pixels on an LED panel, which determines how close the camera can get before artifacts appear).

On the engine side, Unreal’s Lumen and Nanite systems let filmmakers achieve unprecedented levels of detail and realism in their virtual sets, replacing the hassle of offline light baking with real-time lighting calculations that allow for on-the-fly changes. Add virtual location scouting and DMX simulation for lighting, and the possibilities keep expanding. Don’t miss out on the future of filmmaking: dive into the world of virtual production today!