Tuesday, 13 April 2021

Fix it in prep: the new mantra for COVID-safe virtual production

NAB Amplify

If you’ve got $15 million, you can make a 35-minute VFX-driven drama on par with The Mandalorian, but most producers will be lucky to be making an entire series on that budget. Fortunately, the virtual production technology and techniques with which Disney+ wowed the industry are becoming available fast, their adoption accelerated by COVID-19. Virtual production happens to be the form of live-action production best suited to COVID safety.

https://amplify.nabshow.com/articles/fix-it-in-prep-the-new-mantra-for-covid-safe-virtual-production/

“The pandemic forced a lot of companies that had been unwilling or unable to work with remote artists to have to do so in an instant,” says Adam Maier, producer at LA’s ReelFX.

“That broke down a ton of boundaries. We’re seeing a huge acceptance of virtual production and sudden interest in what it can bring in terms of remote collaboration.”

A year ago, Maier was working at “transmedia” studio Brud and was part of a project organized by the Entertainment Technology Center@USC to road-test COVID-safe procedures and affordable virtual production on a live-action short called Ripple Effect.

The results, exhaustively detailed in a white paper, are a blueprint for how producers anywhere might resume production more virtually than before.

“Virtual production brings together the best of traditional film with the best of video games, graphics and VFX, to be able to do a lot more than on a traditional film set,” Maier added.

Katherine Billhart, executive producer and director of virtual production at ETC, said, “The goal is to walk away from each of our setups with final visual effects captured in camera.”

This “Fix it in Prep” philosophy, in which a production re-allocates at least two-thirds of its post VFX resources to the pre-production stage, is a key difference between virtual and traditional production workflows.
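To make that reallocation concrete, here is a back-of-the-envelope sketch in Python; the two-thirds prep share comes from the article, but the dollar figure and the prep/post labels are hypothetical, not drawn from the ETC paper.

# Hypothetical "Fix it in Prep" budget split. The 2/3 prep share is from
# the article; the $300,000 VFX line item is made up for illustration.

def split_vfx_budget(total_vfx_budget, prep_share=2 / 3):
    """Return (prep_spend, post_spend) for a given total VFX budget."""
    prep_spend = total_vfx_budget * prep_share
    return prep_spend, total_vfx_budget - prep_spend

prep, post = split_vfx_budget(300_000)
print(f"Prep (previs, techvis, asset builds): ${prep:,.0f}")   # $200,000
print(f"Post (remaining fixes, compositing):  ${post:,.0f}")   # $100,000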

“We planned out shots ahead of time to cut down time spent on stage which makes people safer and saves studios money,” Billhart said.

“Ripple Effect” was made with the participation of Stargate Studios, which did a lot of the mapping, geometry and playback work; XR Stages, which provided the LED wall; and visualization company ICVR, which created the virtual world and virtual assets in Unreal.

“Yes, you’ve got The Mandalorian on one hand, but we were trying to look at what a small production or a regular studio can do,” explained executive producer Erik Weaver. “There are not a lot of people familiar with VP, so one change was getting people to understand how virtual production works. The great thing is that once they do experience it, it’s not difficult to comprehend.”

While “Ripple Effect” was filmed in an LED volume, the paper’s authors don’t dismiss greenscreen. The paper weighs the pros and cons of each and suggests producers do the same before committing to one route. “‘Fix it in Prep’ and ‘Fix it in Post’ are two different philosophies that can both be applied to either an LED wall or traditional greenscreen workflow. There are pros and cons for either which can impact schedule, cost, and savings. Virtual Production workflows, techniques, and tools should provide a path toward balancing the two philosophies.”

The paper concludes that VP should be considered an integral part of “physical production,” rather than a separate entity. It reads, “The concept of virtual production as a separate entity exists today because virtual production tools, techniques, and workflows applied to physical production require new skill sets, an adoption of computing and computer language on set, and team leadership with a strong understanding of VFX.”

The goal for the industry should be to educate existing departments to help them acquire skills in computing and real-time technology and ensure that baseline VFX knowledge is a minimum requirement.

Safetyvis

In parallel with the short film, the ETC designed a separate “Safetyvis” project in partnership with DigitalFilm Tree and ICVR to develop real-time production safety planning tools.

Weaver gathered 309 control practices from U.S. states, various countries and the Hollywood unions and entered them into a spreadsheet. “It’s hard to visualize on a spreadsheet,” he said. “So, we took it to DigitalFilm Tree and brought it to a whole other level.”

ICVR simulated the virtual workspace in Unreal, including where things are staged and where people can walk. “We made avatars using game engine mechanics to work out what works from a safety point of view,” said Weaver.

They scanned the entire stage using LiDAR for accurate measurements. “If the idea of LiDAR on a phone catches on, it means anyone can have an accurate environment that they can creatively iterate in,” said DigitalFilm Tree CEO Ramy Katrib.

The scan also showed how many people could fit safely in the space.
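The white paper doesn’t spell out the calculation, but a rough capacity check from a LiDAR-derived floor area might look like the sketch below; the six-foot spacing, packing discount, and stage dimensions are illustrative assumptions, not figures from the project.

import math

# Rough capacity estimate from a scanned stage footprint. Each person is
# allotted a clear circle whose radius is half the distancing guideline,
# so two adjacent circles keep their occupants the full distance apart;
# packing_efficiency discounts for walls, set pieces and walkways.

def safe_capacity(floor_area_sqft, distancing_ft=6.0, packing_efficiency=0.7):
    """Estimate how many people fit while staying distancing_ft apart."""
    per_person_area = math.pi * (distancing_ft / 2) ** 2
    return int(floor_area_sqft * packing_efficiency / per_person_area)

# Example: a hypothetical 60 ft x 40 ft working area around the volume.
print(safe_capacity(60 * 40))  # roughly 59 people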

“In order to represent that, we developed ring lights,” explained game engine artist Andrea Aniceto-Chavez. “Any time a person gets too close to another, the ring lights turn red. We can see where all the departments are and avoid the space getting too crowded.”
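A minimal sketch of that ring-light check, written in plain Python rather than Unreal Blueprints or C++, with an assumed six-foot threshold and made-up crew positions:

from dataclasses import dataclass

# Simplified version of the ring-light idea: an avatar's ring turns red
# whenever any other avatar is inside the distancing threshold. Positions
# are in feet on the stage floor plan; the 6 ft threshold is an assumption.

@dataclass
class Avatar:
    name: str
    x: float
    y: float

def ring_color(avatar, others, threshold_ft=6.0):
    """Return 'red' if anyone else is within threshold_ft, else 'green'."""
    for other in others:
        if other is avatar:
            continue
        dist = ((avatar.x - other.x) ** 2 + (avatar.y - other.y) ** 2) ** 0.5
        if dist < threshold_ft:
            return "red"
    return "green"

crew = [Avatar("DP", 0, 0), Avatar("gaffer", 4, 3), Avatar("VP supervisor", 25, 10)]
for person in crew:
    print(person.name, ring_color(person, crew))  # DP and gaffer read red

In an engine implementation the same test would presumably run every frame against live avatar positions and drive the color of the ring material.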

Key advantages of volumetric capture techniques, from both cost and COVID standpoints, are that productions can re-create exotic landscapes or basic city streets while reducing the size of the crew that travels to those locations.

Directors, producers, department heads, and other stakeholders can see the creative and understand its intent as it is displayed on the LED walls with minimal latency, allowing for remote participation on set. This positively impacts the schedule: those working remotely may not need to provide input all day, only to join in at specific times.

However, the paper notes that only a limited number of operational smart-stage LED volumes are currently available for use, which also keeps their rental costs very high.
