KFTV
Virtual Production (VP) is an evolution of established film and TV production methods, with as many benefits as there are potential obstacles. With VP stages, or Volumes, shooting up all over the world, filmmakers with experience of its tools are keen to share both the pluses and the pitfalls in order to keep virtual production on track.
“VP means a lot of things to a lot of people; it’s a very
broad umbrella,” says Callum MacMillan, virtual production supervisor,
Dimension Studio. “At its core is a realtime games engine and the
representation of 3D worlds on a giant LED wall to create the illusion of a
real environment.”
In this sense VP is an extension of classic filmmaking
techniques like front and rear projection or green screen in which actors are
filmed against artificial backdrops for cost efficiency and greater control.
“It is the next evolution of VFX,” says John Rakich,
president of the Location Managers Guild International. “What is radically
different with virtual production is the introduction of an interactive light
source that enables the cinematographer to capture final frames in-camera.”
The Mandalorian, for example, featured a central
character in a reflective suit who could be photographed with light cast from
the surrounding LED panels.
VP also needs to be understood as a set of tools and
processes that extend beyond the stage environment. Applications range from
previz and remote scouting to character creation based on performance
capture. Assets such as models, characters and 3D environments must
be completely camera-ready before production starts.
“It takes a process that might have happened in post and
moves it into preproduction,” says Steve Jelley, Joint MD, Dimension.
“It means you can shoot VFX straight through the camera with
no long post process time afterwards,” adds Craig Stiff, lead real-time artist.
“Department heads are shooting what they see,” emphasises AJ
Sciutto, Director of Virtual Production at Magnopus, whose credits include The
Lion King. “There’s no guesswork needed to ‘fix it in post’ when you can
reframe a shot on set based on what you see through the lens. This production
value alone, if done right, is a key motivator.”
The practical benefits of virtual production
VFX houses MPC and ILM laid much of the groundwork to make
Disney’s The Jungle Book, The Lion King and The Mandalorian. Now
virtual production stages have multiplied across the world.
“The pandemic accelerated demand,” says Sciutto. “There was
a lot of motivation to get back to work safely without sending hundreds of crew
on location.”
Even before Covid, VP was heading into the mainstream.
Hollywood may be reaching the limits of efficiency with traditional production
methodology, which is why the predictability of the virtual environment is of
such value.
“You can send a two-person crew on location to scan and
digitise backgrounds for filming in an LED volume,” says Sciutto. “All the
variables of shooting on location such as weather conditions or where craft
services will actually be can be controlled.”
Not having to set up a huge base camp or pay for travel
translates to direct cost savings but indirect savings mount up too.
“Simply not unloading and repacking kit each day saves
time,” says Max Pavlov, CEO & Co-founder, LEDunit. “You can shoot at golden
hour all day long if you want. There’s no need to stop and wait for perfect
light or fret about paying overtime.”
Location-based reshoots, which Deloitte calculates can account for 20 percent
or more of the final production cost of high-budget films, can be eliminated
with careful pre-planning.
This should also result in better working conditions for
crew. “There’s no need to work long and anti-social hours and risk burnout,”
says Sciutto.
VFX costs on a high-budget sci-fi or fantasy film can be as
high as 20 percent of the total film budget but shooting against an LED wall
significantly reduces overheads for compositing and rotoscoping. That’s because
VP flips the traditional linear process of making movies on its head.
“With traditional VFX, if you want to make a change at any
point you have to go all the way back to that point and take the shot all the way
through the pipeline again,” explains Ed Thomas, head of VP, Dimension. “With
games engines we can put all the individual aspects of a linear pipe into one
environment to co-exist at the same time. Potentially you have production
design, lighting, layout modelling and animation all happening at the same time
in one environment.”
This pooling of resources “allows for greater creative
freedom through more rapid iteration and feedback by being able to see the
results prior to principal photography,” says MacMillan.
What content works best
VP stages have found a natural home for filmmakers wanting
to bring fantasy story worlds to the screen but the technology is equally
applicable “from a spaceship to a pack shot,” says DoP and camera operator
Richard Dunton.
Ryan Beagan, Warner Bros’ VP of virtual production, says
it opens up opportunities to shoot in areas that are difficult to travel to, or
simply downtown in a city where permits would be cost-prohibitive.
“Instead, you can use photogrammetry and Lidar scans to
rebuild those locations in a studio. Other locations might let you scan, but
not film in – a cathedral, for example.”
Scenes of conversations in moving cars, walk-and-talks and
other ‘process shots’ are prime candidates. “Things you would spend a lot of
money to shoot on location can be shot on a VP stage at just the same quality
without the need to send the whole crew,” says Sciutto.
Another key advantage is for actors, who are able to
key their performance to locations they can see rather than to invisible
creatures or environments requiring greater leaps of the imagination.
“The ability for actors to play within true-to-reality
environments is the real focus of virtual production,” says Philipp Klausing, EP and
MD at 1899 producer Dark Ways. “We are recording performances, not the
background.”
But for all its advantages VP does not make creative or
commercial sense for every production.
When director Denis Villeneuve invited The Mandalorian
cinematographer Greig Fraser to photograph Dune it was not to tap
Fraser’s virtually unique VP expertise but the gritty realism he brought to
films like Zero Dark Thirty. Both Fraser and Villeneuve wanted audiences
to feel the heat and sand of a desert by shooting in the Middle East.
“VP is not a silver bullet,” says Sciutto. “There are a lot
of people who are trying to force things into a VP workflow that they
shouldn’t. There is absolutely a tactility to locations you don’t get inside of
a volume.”
Rakich advises that a volume works best when there’s a
physical boundary to the scene but less well if you want to portray crowds of
people. Photographing LED screens in close-up can result in problematic visual
artefacts, which is why shots in a volume tend to have their backgrounds
blurred and are shot with a shallow depth of field.
“If your scene is set in a room as tight as a bunker it
makes no sense in VP,” says Pavlov. “The camera would be so close you would
probably see the pixel pitch (of the screen). You are better off building it.”
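As a rough illustration of Pavlov’s point, a rule of thumb sometimes used when planning volume shoots (an assumption for this sketch, not a figure quoted in this article) is to allow very roughly a metre of camera-to-wall distance per millimetre of pixel pitch, with a safety factor of two to three for in-camera work. A minimal Python sketch:

    # Minimal sketch, assuming the common "1 m per 1 mm of pixel pitch" rule of
    # thumb plus a safety factor; real volumes are validated with camera tests,
    # lens choice and depth of field rather than a formula.
    def min_camera_distance_m(pixel_pitch_mm: float, safety_factor: float = 2.0) -> float:
        """Rough minimum camera-to-wall distance in metres for a given LED pitch."""
        return pixel_pitch_mm * safety_factor

    for pitch in (1.5, 2.3, 2.8):  # illustrative pitch values in mm, not from the article
        print(f"{pitch} mm pitch -> keep the camera roughly "
              f"{min_camera_distance_m(pitch):.1f} m or more from the wall")

On that reckoning, a bunker-sized set would force the camera well inside the safe distance for typical wall pitches, which is why Pavlov suggests building it physically instead.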
Framestore’s head of episodic, James Whitlam, advises
that a dialogue-heavy drama filmed in tight close-ups would be too
expensive on a virtual stage but that “a volume is perfect for recurring scenes
across multiple seasons where you can achieve economies of scale.”
Nor is shooting in a volume an all-or-nothing choice. Stages composed of modular and portable LED
panels can be built to spec. “A scene set on board a train, for example, might
only need enough panels to cover the windows to give your interiors realistic
reflections,” says Chris Hunt, CFO at LED specialist Brompton Technology.
Even The Mandalorian and 1899 make use of
regular sound stages alongside a volume. This can be to accelerate schedules
(by shooting different scenes simultaneously) or for aesthetic reasons.
“Shooting an entire show in a volume can look too sterile,” says Sciutto.
Physical builds and props on the volume stage need to merge
convincingly with the CG backdrop, putting additional pressure on production
design.
“The whole LED image in camera should transition seamlessly
between floor, ceiling and wall of the Volume,” says Sciutto. “Most LEDs are at
90-degree angles to the stage and require some painted clean up in post.”
Warner Bros’ virtual production stage housing House of
the Dragon was designed with this in mind. “Based on the shot we are able to lower the LED ceiling and tilt it toward the
camera at an angle that allows the image to
be captured with the best colour quality (like a TV, LEDs will display the best
colour when viewed head on) and so that the pixel join is seamless and looks
infinite,” Beagan explains.
“The hardest thing to shoot is just a person and a screen,”
he adds. “Placing physical sets or objects like a car, an obelisk or columns
that repeat seamlessly into the wall works really well because the camera and
your eye can fix on content that is lit in the real world.”
Physical set and virtual world blending
Nor are volume stages necessarily
cheap. Depending on the project’s scope, they can require additional labour to
create VFX.
“The cost of using local realtime animators is not
comparable to the outsourced prices you get from sending VFX to vendors around
the world (to places like India),” says Sciutto.
Another big cost is LED panels, originally built for the
digital outdoor signage industry and currently expensive rental items.
“Everything is bespoke and there’s no assembly line
methodology for putting together a volume stage,” Sciutto says.
Moreover, for all the incredible photoreal quality of games
engines like Unreal, the software is not yet capable of rendering some
computer graphics at the highest quality. Elements like human background
characters, rainforest foliage and waterfall simulations are picked out as
troublesome.
“There are many interlocking parts to VP which can create
cascading problems if one element is misjudged or overlooked,” says Erik
Weaver, Director of Virtual & Adaptive Production at the Entertainment
Technology Center. “For example, the idea that the LED screens themselves can
illuminate the actors and set is broadly true but masks a maze of complications
and decisions that the cinematographer and gaffer have to master.”
Understanding the decision-making process
Technology aside, the biggest impact of virtual production
is the shake-up it means to conventional filming. In theory, with the advantage
of seeing finished pixels through the lens, the DoP should be more confident
that their creative decisions on colour and light will translate to the final
picture. For that to happen, there are new production roles and lines of
decision making which need to be agreed and understood.
The Virtual Art Department (VAD), for instance, led by the
VAD Supervisor, will include leads for lighting and asset creation within the
virtual world. Their work needs to interface with the gaffer and the production
designer (who works across both real and virtual sets). The ultimate
responsibility for the image composition rests with the DoP and director.
“Filmmakers as well as animators and crew from the games
engine world both need to understand what the boundaries are,” says Klausing.
“Basically, we need a ‘bible’ for delivery of content and who talks to whom.”
“There is a well-established hierarchy on a normal set. You
plug in virtual production and this changes,” agrees Wilding. “On 1899
we got into a flow where we were not treading on each other’s toes but we’d
have to do ten of these jobs to really know whether this is a template that
works.”
Various bodies are working to agree job descriptions and
pay rates. A new lexicon for creating in Virtual Production is being developed
by organizations including Netflix, SMPTE, EBU, VES, and MovieLabs.
Education required
Another SMPTE initiative aims to create “a set of best
practices and standards,” says Katie Cole, Gaming & Virtual Production
Evangelist. “At its core it is a library that will store any type of digital
assets.”
That’s becoming increasingly urgent in order to slash the
pre-production time needed to build CG assets, particularly for producers
who need to convince investors to finance the production.
“I
am looking for an independent unified library of location-based data and a
unified technology standard that will enable producers working anywhere to pick
and choose assets and download them, with some adjustment, ready to play in any
volume,” Klausing says.
Beyond that there are numerous education initiatives to get
experienced filmmakers and graduates alike hands-on with VP stages.
“Just as when the industry went from shooting film to
digital, some directors and DoPs need convincing [about VP],” says Pavlov. “But
when they try it, they understand how easy it is.”
Paul Franklin, the VFX supervisor-turned-director, is a convert.
“Once in a digital world your creativity is infinitely malleable. For me,
virtual production is a way of taking what I would have done as a VFX artist
and bringing it live onto the set.”
The Future of VP
VP is evolving at pace to make content creation within volumes more accessible to filmmakers. Workflow frictions will be smoothed with practice and cinematographers will gain more experience of lighting with LED walls. Technology components will require greater standardisation so that productions can get up and running quicker on stages which are less bespoke.
LED technology will receive most attention. The current generation of panels were built for the advertising sector – and it shows.
“Colour temperature is a big issue,” Klausing says. “The spectrum that the display can emit is limited to RGB and not as rich as we’d like it to be.”
Sciutto points to the “hugely reflective” black plastic substrates between each pixel. “In a film environment you lose a lot of the contrast in your image because the LEDs are reflecting the light. We need denser pixel arrays or darker nano black substrates.”
The ability of games engines to render images in realtime can hit a limit as more digitally detailed imagery is processed. It can mean that the background animation is unable to run at the same frame rate as the cameras. One solution, from Mo-Sys, is to offload the rendering from the set directly into the cloud.
“Mo-Sys NearTime will allow you to render Unreal scenes in higher quality while maintaining real-time frame rates,” says Grieve. “It can remove moiré (visual interference) completely – that’s a big issue in LED volumes.”
As this approach continues to advance, we’ll move towards more realistic rendering in real-time. “This can power video screens that can turn a movie set into the Holodeck,” says Sciutto. “Having put a film crew inside the world of their story, we can soon put the audiences inside the films as well.”
Klausing further points to the benefit of VP to local language producers. “Without a big budget they have normally been prevented from travelling and restricted to making local shows in the local market. With VP they can create local language content that is located anywhere.”
Netflix is investing €500 million in German-language titles from Germany, Austria, and Switzerland over the next couple of years and has committed to housing multiple series at Dark Bay.
“I envision that Germany can become a European leader in virtual production,” Netflix’s director of International Originals, Rachel Eggebeen, told Deadline.