Monday, 1 November 2021

Virtual Production 101: What You Need to Know

NAB

https://amplify.nabshow.com/articles/virtual-production-101-what-you-need-to-know/

It’s less a question of if you’ll ever find yourself shooting in an LED volume than when. With the technology still at the cutting edge, there are some essentials to consider before budgeting, tech provisioning and filming with virtual production.

Virtual production with LED walls would have become popular on its own without the pandemic. But the operational impact of this technology, which dramatically reduces the crew footprint on set and can eliminate the need for location work and travel, cannot be overstated.

Major productions that would have been shot in real-world locations or on green screens have been reconfigured to be partially or entirely shot on LED volumes instead. These include Star Trek: Discovery; Star Trek: Strange New Worlds; Netflix’s 1899; Thor: Love and Thunder; and Bullet Train.

Frame.io (newly acquired by Adobe) has an enlightening set of tips and tricks for newcomers to virtual production which I’m going to précis here.

1 Fix it in Pre

To make an LED volume perform at its best, the lion’s share of visual development occurs in pre-production. That’s a reversal of the recent norm, where issues on set were fixed in post.

On a virtual production, the schedule is front-loaded with more pre-production time and a less extensive post period.

“Many seasoned filmmakers aren’t accustomed to the idea of making every decision in terms of effects imagery before production occurs and may find the process counterintuitive,” says Noah Kadner, the virtual production editor at American Cinematographer and author of the Frame.io guide.

Assets such as models, characters, 3D environments, etc., must be completely camera-ready before production starts. Along the way, this also means a lot more iteration and visual development can occur. Indeed, the Virtual Art Department, previsualization, and virtual scouting are all vital parts of the LED volume pre-production workflow.

“In many ways, the production day becomes about executing a carefully validated plan instead of best guess shots in the dark, as non-virtual productions often seem.”

2 New Production Roles

A Virtual Production Supervisor acts as the liaison between the physical production team, the Art Department, the VFX team, and the “brain bar” (ILM’s term for its Volume Control Team).

Frame.io suggests the VPS combines the roles of VFX Supervisor and Art Director. Their responsibilities include overseeing the Virtual Art Department (another acronym to juggle – VAD) during pre-production and supervising the LED volume during production.

“The VAD is where all elements that ultimately wind up on the LED walls are designed and created. This area encompasses a traditional art department, with an emphasis on digital assets. The VAD is constantly creating objects which may be digital models, real-world props, or both.”

Clearly, understanding what the VPS and the VAD do in a virtual production is essential.

3 Avoid looking like a video game

Photorealism is the aim nine times out of ten, but the pitfall of the virtual environment looking like a video game is real. Photogrammetry is the go-to technique: a method of measuring physical objects and environments by analyzing photographic data, from which 3D assets are then constructed.

Frame.io name-checks a few useful photogrammetry tools, such as the ML/AI-driven software RealityCapture.

“The effort needed to create a photorealistic 3D asset from photogrammetry is often far less than making the equivalent from scratch digitally.”
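
Under the hood, photogrammetry tools begin by detecting and matching features across overlapping photos, then triangulate 3D points from those correspondences. As a rough illustration of that first step only (a sketch, not RealityCapture’s actual pipeline), here is a minimal Python example using OpenCV; the two image file names are placeholders for overlapping photos of the same object.

    # Illustrative first step of photogrammetry: find matching features
    # between two overlapping photos. Requires opencv-python; the file
    # names below are placeholders.
    import cv2

    img_a = cv2.imread("photo_a.jpg", cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread("photo_b.jpg", cv2.IMREAD_GRAYSCALE)

    # Detect keypoints and compute binary descriptors in each photo.
    orb = cv2.ORB_create(nfeatures=5000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # Match descriptors across the two photos; these correspondences are
    # what a full pipeline uses to solve camera poses and triangulate
    # 3D points.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)

    print(f"{len(matches)} candidate correspondences between the photos")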

4 Get the most powerful system you can afford

The more GPU power in your system, the greater the level of detail you can have in an environment on your LED wall in real time. Speccing the hardware isn’t something you should have to worry about yourself: a quality integrator can ensure you have a system that performs well and doesn’t blow its fans nonstop. Many of the key components and plugins for virtual production, such as camera tracking and LED panel support, are only available on Windows, and you may need multiple PCs if the volume has multiple surfaces.

5 LED panels trade quality for cost

Pixel pitch is the distance between the centers of adjacent LEDs on the screen and is measured in millimeters. Because you’re re-photographing the screen, the pixel pitch directly correlates to how the image looks. If it’s not dense enough, the image can look low resolution. Or even worse, you may see moiré (interference between the camera’s image sensor grid and the pattern created by the spacing of the LEDs on the screen).

The higher the pitch is, the more likely moiré will appear when the camera focuses close to or onto the screen.

For reference, the pixel pitch of the LED panels used on The Mandalorian is 2.8mm. But that screen is also approximately 20 feet tall by 70 feet across, so the camera can be much further away and is less likely to hold focus on the screens. If you are working in a smaller volume, this can become even more of an issue.

Panels are now available at 1.5mm and even denser pitches, which can mitigate or eliminate moiré. The trade-off is that the finer the pitch, the more expensive the screens become.

“So, there’s ultimately a perfect storm to consider which factors in pixel pitch, camera-to-screen distance, focal length, focus plane, camera sensor size, and content resolution to determine whether your footage shows a moiré pattern or not.”
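
To get a feel for how those factors interact, here is a deliberately rough back-of-the-envelope sketch in Python. It estimates how large one LED pixel appears on the camera sensor relative to the sensor’s own photosite spacing, using a thin-lens approximation and an arbitrary risk threshold; the example numbers are illustrative assumptions, and nothing here replaces a proper camera test on the actual volume.

    # Rough moire-risk estimate, not an industry formula. Assumes the
    # screen is in sharp focus and the camera-to-screen distance is much
    # larger than the focal length; defocus greatly reduces risk in practice.

    def moire_risk(pixel_pitch_mm, distance_m, focal_length_mm,
                   sensor_width_mm, horizontal_photosites):
        # Size of one LED pixel as projected onto the sensor.
        magnification = focal_length_mm / (distance_m * 1000.0)
        projected_pitch_mm = pixel_pitch_mm * magnification

        # Spacing of the sensor's own sampling grid.
        photosite_pitch_mm = sensor_width_mm / horizontal_photosites

        # If the projected LED grid lands near the sensor's sampling pitch,
        # the two grids can interfere and produce moire. The threshold of 3
        # photosites is an illustrative rule of thumb, not a standard.
        ratio = projected_pitch_mm / photosite_pitch_mm
        return ratio, ("higher risk" if ratio < 3 else "lower risk")

    # Example: 2.8mm panels shot from 6m away with a 50mm lens on a
    # 35mm-wide sensor with 4096 horizontal photosites (all made-up values).
    ratio, verdict = moire_risk(2.8, 6.0, 50.0, 35.0, 4096)
    print(f"one LED pixel spans about {ratio:.1f} photosites -> {verdict}")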

6 Need walls, a ceiling and a floor?

There’s a significant scale continuum from the simplest single-wall, rear-projection LED setup to the massive volumes used on The Mandalorian.

In general, the larger the volume, the more expensive it will be to rent or to purchase if building from scratch. So, it’s critical to determine how much LED volume you need.

“The choice you make in volume size and form also has a huge impact on interactive/emitted light. If, for example, you put actors/set pieces in front of a single, flat LED wall, your subjects will be dark silhouettes against the screen, like someone standing up in a movie theater. On the other hand, if you have LED sidewalls, ceilings, etc., you will have emissive lighting falling naturally on your subject.”

But even if you don’t need or can’t afford an enveloping volume, it’s still very possible to create interactive lighting in sync with the screen content. See below…

7 Using interactive lighting

Digital Multiplex or DMX is a protocol for controlling lights and other stage equipment. Specifically, you can use DMX lighting to turn multicolor movie lights into effects lights for LED volumes.

“You can program specific lighting cues and colors with DMX directly in Unreal Engine or via a lighting board. Or, through pixel mapping, you can set any light on your stage to mimic the color and intensity of a portion of your 3D scene. You can mimic passing car headlights, street lamps, tail lights, you name it.”

To make it all work, you need a DMX-compatible light, preferably with full-color control. Some great examples of full-color DMX lights include ARRI SkyPanels and LiteGear LiteMats.

Next, you need pixel mapping software. Unreal Engine has DMX control, so you can control DMX lights directly from within scenes. Some other examples of external pixel mapping applications include Enttec ELM and MadMapper.
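
To make the pixel-mapping idea concrete, here is a minimal Python sketch that pushes a sampled screen colour to a DMX fixture over Art-Net using only the standard library. The node IP address, the universe number and the assumption that the fixture’s first three channels are red, green and blue are placeholders; in practice this mapping is exactly what Unreal Engine’s DMX support or tools like ELM and MadMapper configure for you.

    # Minimal Art-Net (ArtDmx) sender sketch using only the standard library.
    # All addressing below (IP, universe, channel layout) is hypothetical.
    import socket
    import struct

    def artdmx_packet(dmx_data, universe=0, sequence=0):
        # ArtDmx layout: "Art-Net" ID, opcode 0x5000 (little-endian),
        # protocol version 14 (big-endian), sequence, physical,
        # universe (SubUni + Net), payload length (big-endian), DMX data.
        payload = bytes(dmx_data)
        return (b"Art-Net\x00"
                + struct.pack("<H", 0x5000)
                + struct.pack(">H", 14)
                + bytes([sequence & 0xFF, 0, universe & 0xFF, (universe >> 8) & 0x7F])
                + struct.pack(">H", len(payload))
                + payload)

    def send_colour(rgb, node_ip="10.0.0.50", universe=0):
        # Map a colour sampled from the screen content onto the first three
        # DMX channels, assumed here to be the fixture's R, G and B.
        dmx = bytearray(512)
        dmx[0:3] = bytes(rgb)
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(artdmx_packet(dmx, universe), (node_ip, 6454))
        sock.close()

    # e.g. a warm orange sampled from a passing-headlight region of the scene
    send_colour((255, 140, 40))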

8 Mastery of color

Understanding color science is integral to the cinematographer’s craft and essential when using one digital device to rephotograph the output of another digital display.

The light cast from LED screens can cause unexpected or undesirable results depending on what surface it falls on. Kadner warns about metamerism, which refers to the visual appearance of an object changing based on the spectrum of light illuminating it. LED panels are designed to be looked at directly, not to act as lighting sources.

“One way to mitigate this issue is to supplement the emissive light coming off the LED panels with additional movie lighting. It’s more work to set up but the results are worth the effort.”

Manufacturers are also responding by developing LED panels with better full-spectrum color science.
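
To build some intuition for the metamerism problem, here is a toy Python calculation with made-up five-band spectra: two light sources that render a white card identically can render a costume fabric very differently. The band values and the crude R/G/B bucketing are purely illustrative, not measured data.

    # Toy illustration of metamerism-style colour shifts under spiky LED light.
    # All spectra are invented five-band values (roughly 450-650nm).

    broad_light = [1.0, 1.0, 1.0, 1.0, 1.0]   # smooth, full-spectrum source
    rgb_led     = [1.0, 0.0, 2.0, 0.0, 2.0]   # spiky, LED-panel-like source

    white_card  = [1.0, 1.0, 1.0, 1.0, 1.0]   # flat reflectance
    costume     = [0.2, 0.9, 0.3, 0.9, 0.2]   # reflects between the LED spikes

    def camera_rgb(light, surface):
        # Reflected energy per band, bucketed into crude R/G/B channels.
        reflected = [l * s for l, s in zip(light, surface)]
        b = reflected[0]
        g = reflected[1] + reflected[2]
        r = reflected[3] + reflected[4]
        return (round(r, 2), round(g, 2), round(b, 2))

    for name, surface in [("white card", white_card), ("costume", costume)]:
        print(name,
              "under broad light:", camera_rgb(broad_light, surface),
              "under RGB LED:", camera_rgb(rgb_led, surface))

    # The white card reads identically under both sources, but the costume's
    # colour shifts and darkens under the spiky LED spectrum.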

9 Virtual production is not zero-sum

To my mind this is the most important piece of advice: it’s less a tip than an approach to shooting in the volume. There’s a lot of talk about producing pixel-perfect final shots on set and eliminating post altogether. Maybe the technology will advance to that extent in time, but even then it may not be creatively desirable.

For example, according to ILM, around fifty percent of the final shots on season one of The Mandalorian were captured in-camera. The finality of a shot captured in an LED volume can vary from “ready to edit” to “requires some additional enhancements.”

“Don’t think of this as a zero-sum game. Think of it more as a continuum of potential additional finessing in post vs. all or nothing,” says Kadner.

“Most visual effects supervisors who’ve worked in LED volumes agree that it’s far easier to fix visual issues with in-camera effects shots than to start with green screen cinematography and add the entirety of the background imagery in post-production. It’s a reductive and refining process vs. starting with a blank canvas.”

10 Prepare to experiment and be outmoded

The pace of change in virtual production with LED technology, and in related areas such as AI, camera to cloud, 5G connectivity and volumetric photography, inevitably means that as soon as you’ve locked down the tech spec for a project, elements of it will have advanced.

Frame.io points to Epic Games’ latest release of Unreal Engine, which is accompanied by a host of tools expressly designed for the virtual production filmmaker.

What was completely impossible or highly difficult to accomplish one day may be standard operating procedure the next. Each version offers advancements that will make things faster and more realistic in virtual production.

“So, to save your time, sanity, and budget, embrace constant change. Attend many webinars, watch a lot of YouTube videos, read all you can, and above all, experiment.”

  
