With 3D, High Dynamic Range, high frame rates and 4K capture, the sequel to the most successful movie of all time is intended to open a window into another world.
Thirteen years after its release, Avatar remains the
number one grossing film of all time at more than $2.8 billion worldwide. It is just as widely lauded in the industry for
its groundbreaking use of virtual production technology, which helped develop
more sophisticated performance capture techniques and the ability to view actors
and CG assets live on set with a ‘virtual’ camera.
Avatar: The Way of Water will have some way to go
just to break even on its reputed $350-400m budget (much of which is calculated
to be spread across three more follow-ups) but already the film is being talked
about as a major Awards contender, notably in technical and craft categories
including cinematography, production design, VFX and editing.
Cameron himself is no stranger to sequels, having written
and directed two of the most successful of all time: Aliens and Terminator
2: Judgment Day. Planning for a follow-up to Avatar began as soon as
it became apparent just what a huge hit it had become. In spring 2010
the filmmaker, producer Jon Landau and other key heads of department met to
review what aspects of the filmmaking process had worked best, and what they
could have improved on. That exercise led to a decision to explore the story
world further and resulted in 1500 pages of notes – too many to tell in one
film alone.
Screenwriters including Rick Jaffa and Amanda Silver (who
both scripted the Planet of the Apes reboots) were hired to shape the
notes into an ambitious series of four films. The process took months but
Cameron wanted to have all four screenplays completed before moving on to
production.
“I wanted to map out all the stories and then get the
economy of scale of capturing the actors across multiple films and then filming
the live action,” Cameron says in the Disney film’s production notes. “The
thinking was we could consolidate the different stages of production
together—performance capture, live action and then post-production.”
Rather than create a host of new planets and moons, Cameron
chose to continue exploring new biomes and cultures of Pandora itself
with the Avatar sequels. He reasoned that the moon could contain a range of
landscapes—just like Earth. Pandora is after all a metaphor for our world.
The director has spent much of the intervening decade underwater,
pursuing environmental and exploration projects including setting a
solo deep diving record of 35,787’ to the bottom of the Mariana Trench in 2012.
Seemingly as at home in the sea as on land, Cameron appears to want to take the
legacy of filmmaker and aquatic pioneer Jacques Cousteau to another level.
Production design
Production Designer Dylan Cole was tasked with designing
everything relating to natural Pandora and the Na’vi, while PD Ben Procter was
charged with focusing on the environments, vehicles and weapons of the film’s
human industrial/military organisation, the Resources Development Administration
(RDA).
That’s unusual in itself since most films have one
production designer who manages everything that goes in front of the lens.
“Dylan and Ben weren’t just designing for movie two—they were designing across
the whole metanarrative,” Cameron says.
Cole gave the reef people (the Metkayina clan) a slightly
different shade of blue than the Omatikaya, with a different physiology (large
hands, wider chests and rib cages) and thick protuberances of fin-like
cartilage beneath the skin.
A mammalian species called Ilu was conceived as “a cross
between a bi-plane version of a manta ray fused onto the long neck of a
plesiosaur with the canard wings of a European jet fighter.”
By contrast, the creatures called skimwing are amphibious
with a design inspired in part by the flying fish, but with a very different
head shape and bright Pandoran wings. “The design can’t just look cool,” Cole
says. “It needs to function as if it were real.”
Underwater performance capture
Setting much of the film underwater proved no barrier to
attempting what no one had done before: performance capture underwater. The
key to it was to actually shoot underwater and at the surface of the water so
actors were seen swimming, diving and emerging from the water realistically.
“It looks real because the motion was real,” Cameron says.
At Manhattan Beach Studios in LA, the home of Cameron’s
production company, Lightstorm, they built a water tank measuring 120 ft x 60 ft and 30 ft
deep, holding more than 250,000 gallons of water, complete with a wave machine.
This was the film’s underwater Volume.
“We could create wave interaction with the creatures and
people surfacing, getting hit by a wave and trying to say their lines and
trying to breathe at the same time,” Cameron says.
Two six-foot diameter ship propellers drove a 10-knot
current in the tank – an effect that looks even faster on film thanks to the high
frame rates. The tank had windows for reference
cameras to film underwater, while camera rigs also recorded the facial
performance of the actors – underwater.
For the performance capture technology to work underwater
the water itself needed to be clear. Initial plans to have the camera crew wear
SCUBA gear while shooting in the tank had to be ditched because the breathing
apparatus created disturbances.
“Every one of those air bubbles is a little wiggling mirror,”
Cameron relates, “and the system that’s trying to read all the marker dots on
the actor’s body can’t tell the difference between a marker dot and a bubble.”
That left only one option: everybody working in the
tank, whether actors or crew operating a camera or holding a light, had to hold
their breath. The production employed free diving expert Kirk Krack to help. Kate
Winslet was reportedly able to hold her breath underwater for over seven
minutes.
Part of the solution involved covering the surface of the
tank in small white balls that prevented overhead studio lights from
contaminating the performance capture system below, while still allowing anyone
below to surface safely through them should the need arise.
As soon as the characters emerge
from the water the action continues ‘on the ground’, which necessitated a
separate, adjacent volume capture stage at Manhattan Beach Studios ringed by cameras
recording data in 360 degrees.
Once the performances were captured,
Cameron and the team then shot out the scenes, with characters clad in
CG costumes, digital props and CG environments, using virtual cameras on
stage in Wellington, NZ.
Fusing the performance capture data
from the underwater scenes seamlessly with the ‘air bound’ scenes was among the
production’s trickiest problems.
Cameron explains, “The computer’s taking data from one
volume, data from the other volume and in real time, integrating all that
information. [It’s] showing me on my Virtual Camera people coming and going,
swimming up, getting out onto a dock or diving in and swimming underwater.
Obviously, the software to do that took quite a while to get worked out, but
the end result was amazing.”
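To make that concrete, the sketch below shows the basic idea in hypothetical form: two capture volumes each report performer positions in their own coordinate frame, and a calibrated rigid transform maps both into one shared world space that the virtual camera can render every frame. The transforms, data layout and numbers are invented for illustration and are not Lightstorm’s actual pipeline.

```python
# Hypothetical illustration of merging marker data from two capture volumes (underwater
# and dry land) into one world space, as described above. All values are invented.
import numpy as np

# Rigid transforms (rotation, translation) from each volume's local frame into a shared
# world frame. In practice these would come from calibrating the two stages together.
underwater_to_world = (np.eye(3), np.array([0.0, 0.0, -9.0]))  # assume the tank floor sits 9 m below the dock
dry_to_world = (np.eye(3), np.array([0.0, 0.0, 0.0]))

def to_world(points: np.ndarray, transform) -> np.ndarray:
    """Map an (N, 3) array of marker positions into the shared world frame."""
    rotation, translation = transform
    return points @ rotation.T + translation

def merge_volumes(underwater_markers: np.ndarray, dry_markers: np.ndarray) -> np.ndarray:
    """Combine both streams into a single point set for the virtual camera, per frame."""
    return np.vstack([
        to_world(underwater_markers, underwater_to_world),
        to_world(dry_markers, dry_to_world),
    ])

# Example frame: one performer swimming below the surface, one standing on the dock.
frame = merge_volumes(np.array([[1.0, 2.0, 3.5]]), np.array([[4.0, 1.0, 0.0]]))
print(frame)
```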
Cinematography
Performance capture of the lead actors including Kate
Winslet, Zoe Saldana, Sam Worthington and child actor Jack Champion began
as early as September 2017 and ran for roughly 18 months, with Cameron and the
cast working on scenes for all four of the sequels.
Russell Carpenter ASC, who shot Cameron’s True Lies
and won the Best Cinematography Oscar for Titanic, was tasked with
designing an interactive lighting scheme that would combine CG with live
action. Virtual lighting for the film took a full year in prep alone.
“Our lighting that we did in the live action scenes had to
merge seamlessly with whatever environment we were in, whether it was a dense
jungle, or underwater, or in the RDA facilities,” says Carpenter.
The lighting team built a system of moving lights, which
could be operated remotely, allowing them to make extremely precise strikes of
light exactly where they should be.
Acquisition was made natively (i.e. not converted in post) in 3D
and 4K using Sony CineAlta Venice cameras in their Rialto configuration,
which enables the sensor block to be separated from the camera’s processing
hardware – handy for stereoscopic pairing. Data was fed through a pipeline at
various resolutions and frame rates including 3D 48fps in 2K and 4K, 3D 24fps
in 2K and 4K, and 3D 24fps in HD for Cameron to monitor on set.
In turn, this necessitated viewing feeds of the live action
on stages in Wellington, NZ from multiple 3D camera systems, simultaneously.
“We are shooting stereoscopically from one 3D rig, often two
rigs and sometimes three stereo pairs simultaneously and everything is
processed instantly,” explained Geoff Burdick, SVP of Production
Services & Technology for Lightstorm Entertainment.
A screening room adjacent to the stages and a mobile
projection pod built into a small trailer housing a Christie Digital 3D
projector were set up for projecting DCI-compliant dailies.
Massive amounts of data were being pushed around live on
set. Burdick says, “We needed HFR and high res and everything had to be in 3D.
This may not be the science experiment it was when shooting the first Avatar,
but the sync for 3D at those higher frame rates and resolutions is still an
issue.
“In effect, we are seeing it in a theatrical environment
instantly. We look at every set up, every rehearsal, every take and every feed
live as it is shot on-the-fly in 2D and 3D. We are looking at back focus,
actual camera focus and lighting. We can see the good with the bad at the point
of acquisition and we address issues live.”
The production also used a variety of additional Sony
cameras including multiple Alpha mirrorless interchangeable lens cameras,
PXW-Z450 and PXW-X320 camcorders, and the waterproof RX0 camera.
HFR
Frame rates alternate between 24 and 48 in the final picture, with the
director dialling between the two, using the higher speed
to smooth the motion in faster action sequences and toning it down during
slower dialogue scenes. This was likely done using a postproduction process
called TruCut Motion with which Cameron recently remastered Avatar and
Titanic.
Pixelworks’ motion grading software is a form of frame
interpolation without the dreaded soap-opera effect that can make narrative
movies look like video. The technology
allows filmmakers to dial in the motion, with any source frame rate, shot by
shot, in post. Filmmakers can also defer the desired shutter speed look until post. The
software then ensures that these creative choices are delivered consistently to
screens.
The process does not touch the colour grade, though it can be run ahead of
colour grading if required. Being able to subtly ramp the motion up, for a wide pan
across a dramatic landscape for example, without the viewer noticing the change
preserves suspension of disbelief and avoids the judder of fast pans, which is
accentuated in stereo 3D.
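As a rough illustration of the per-shot ‘motion dial’ concept (and emphatically not Pixelworks’ proprietary TruCut Motion algorithm), the sketch below shows how 48fps source material might be delivered either at its full frame rate for smoother action, or blended down to a 24fps look with a longer effective shutter for dialogue.

```python
# Illustrative sketch only -- not TruCut Motion. Shows a per-shot "motion dial":
# footage captured at 48fps is either kept as-is for smooth, high-frame-rate motion,
# or consecutive frame pairs are averaged to emulate 24fps with a wider shutter.
import numpy as np

def dial_motion(frames_48fps: np.ndarray, smooth_motion: bool) -> np.ndarray:
    """frames_48fps: array of shape (num_frames, H, W, 3), assumed captured at 48fps."""
    if smooth_motion:
        # Action shot: keep every captured frame for the high-frame-rate look.
        return frames_48fps
    # Dialogue shot: average consecutive frame pairs, halving the rate and
    # approximating the motion blur of a conventional 24fps capture.
    usable = frames_48fps[: len(frames_48fps) // 2 * 2]
    pairs = usable.reshape(-1, 2, *frames_48fps.shape[1:])
    return pairs.mean(axis=1).astype(frames_48fps.dtype)
```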
Editing
With Cameron, the editorial team led by Stephen Rivkin, ACE,
John Refoua, ACE (both Avatar alumni) and David Brenner, ACE, assembled
the best performances for each moment of a given scene into ‘performance
edits’, in preparation for the Virtual Cameras to create specific shots.
This was a technique that Cameron helped pioneer in 2009 and
has since evolved into full-blown virtual production, enabling him to integrate
CG versions of live action performances into a CG environment.
“I could see everybody where they’re supposed to be, above
or below the water, and I could talk to them over the diver address system,”
Cameron says. “They were acting to real-time direction based on what I was
seeing on the virtual camera.”
Once the Virtual Camera shots were edited into cut
sequences, the CG shots and live action performances were turned over to Weta. In effect, the editing team were pre-editing sequences including
all CG lighting, CG props, CG costumes, characters, creatures and environments ahead
of the live action photography. It’s a hybrid form of the craft that
merges the editing techniques of pacing and shot selection used to create
wholly animated films with the flexibility of honing the story based on
performance and multicam footage captured on set.
A notable advance is the ability for
Cameron to select any moment in any one of multiple takes by an
actor and use it to build the scene. Without performances being captured as
data this would be impossible, but it means that in ensemble scenes, for example, he
could select take 1 from one actor, take 10 from another and take 15 from a third –
the best performance from each – and build the scene
seamlessly.
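A hypothetical sketch of that idea follows: because every performance exists as data per actor and per take, a scene can be assembled by picking the best take for each performer independently. The names and data layout are invented for illustration.

```python
# Hypothetical illustration: assemble a scene from independently chosen takes per actor.
from dataclasses import dataclass

@dataclass
class Take:
    actor: str
    take_number: int
    capture_file: str  # stand-in for the captured motion/facial data

# Captured library: every actor's every take is available as data (invented entries).
library = {
    ("Actor A", 1): Take("Actor A", 1, "actor_a_t01.mocap"),
    ("Actor B", 10): Take("Actor B", 10, "actor_b_t10.mocap"),
    ("Actor C", 15): Take("Actor C", 15, "actor_c_t15.mocap"),
}

# The director's per-actor selection for this scene.
selection = {"Actor A": 1, "Actor B": 10, "Actor C": 15}

# Build the scene by pulling each actor's chosen take from the library.
scene = [library[(actor, take)] for actor, take in selection.items()]
for t in scene:
    print(f"{t.actor}: take {t.take_number} -> {t.capture_file}")
```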
VFX
For the first film, Weta had developed an image-based facial
performance capture system, using a single SD head-rig camera to record the
actors’ facial expressions and muscle movements, including eye movement. This head rig was upgraded for the
sequel with two HD cameras designed to capture an even higher fidelity
performance.
“We look at every actor and every performance at a
frame-by-frame level to make sure it matches [with the VFX],” says Senior VFX
Supervisor Joe Letteri. “To me, it always comes down to the characters and the
ability to be in the moment with them. Having that performance as detailed as
possible [helps us] make sure that that’s what we see in the final shots.”
Every element of the lush exotic world needed to be created
and rendered digitally. More than five years of R&D went into writing new
software and methodology for the sequel, which claims significant breakthroughs
in lighting, shading, and rendering of scenes set underwater.
Accurately solving how water moves was the biggest
challenge, ranging from the displacement caused when a huge creature moves
through it to the way the tiniest raindrop lands on somebody’s forehead, trickles
down into their eyebrow and down their face. It was an incredibly complex
problem, but they weren’t starting from scratch: Cameron had first made water
simulations on Titanic.
“The beauty of it is, if you can solve water for this movie,
you can do all water anytime until the end of time,” he says. “So, these tools
become incredibly important for the effects industry at large.”
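For a sense of the class of problem, the toy height-field ripple simulation below propagates a single disturbance (a ‘raindrop’) across a water surface. It bears no resemblance to the scale or sophistication of Weta FX’s proprietary solvers; it simply illustrates the kind of calculation involved.

```python
# Toy height-field water ripple: a discrete 2D wave equation on a grid. Illustrative only.
import numpy as np

N = 128                        # grid resolution of the simulated surface
c, dt, damping = 0.5, 1.0, 0.995
height = np.zeros((N, N))      # surface height at the current timestep
prev = np.zeros((N, N))        # surface height at the previous timestep

height[N // 2, N // 2] = 1.0   # a single "raindrop" disturbance at the centre

for step in range(200):
    # Acceleration of the surface is proportional to the Laplacian of the height field.
    lap = (np.roll(height, 1, 0) + np.roll(height, -1, 0) +
           np.roll(height, 1, 1) + np.roll(height, -1, 1) - 4 * height)
    new = (2 * height - prev + (c * dt) ** 2 * lap) * damping
    prev, height = height, new

print("max ripple amplitude after 200 steps:", float(np.abs(height).max()))
```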
With a combination of live action and CG, one of the most
difficult things to conceive, let alone execute, was the interactive lighting. The
production was also shooting in 3D, at High Dynamic Range, at 48 frames per second,
and Carpenter had to embrace all of those things.
Costume Design
Although the vast majority of Na’vi costumes were only going
to be realised digitally on screen by Weta FX, many of the costumes and much of
the jewelry were fabricated by costume designer Deborah Scott as real, tangible
items.
“One of the reasons that we’ve made the garments to
completion is that the motion of the garment cannot be understood without
having a whole piece,” she explains. “If something’s heavy or feathery or light
or stringy, the way these things move in air, standing in a breeze, underwater,
you really have to have the sample to see what happens to it.”
Letteri adds, “If someone walks and moves their arm, the
cloth folds and wrinkles and bends with them. If their costume is made out of
lots of little pieces like beads and strings or feathers or woven bits, that
all has to go through this really detailed physical simulation to make it
behave as if it were a real piece of cloth.”
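The toy mass-spring chain below hints at what that simulation involves, in vastly simplified form: costume pieces modelled as point masses connected by springs, following a moving anchor (the performer’s arm) under gravity. It is an assumption-laden illustration, not Weta FX’s cloth system.

```python
# Toy mass-spring chain: a string of "beads" pinned to a moving anchor. Illustrative only.
import numpy as np

num_points = 10
rest_length = 0.1
stiffness, damping, dt = 200.0, 0.98, 0.01
gravity = np.array([0.0, -9.81])

positions = np.stack([np.zeros(num_points), -rest_length * np.arange(num_points)], axis=1)
velocities = np.zeros_like(positions)

def step(anchor: np.ndarray) -> None:
    """Advance the chain one timestep; the first point follows the performer's motion."""
    global positions, velocities
    forces = np.tile(gravity, (num_points, 1))
    for i in range(num_points - 1):
        delta = positions[i + 1] - positions[i]
        length = np.linalg.norm(delta)
        # Hooke's law along the spring joining consecutive points.
        f = stiffness * (length - rest_length) * delta / max(length, 1e-8)
        forces[i] += f
        forces[i + 1] -= f
    velocities = (velocities + forces * dt) * damping
    positions = positions + velocities * dt
    positions[0] = anchor
    velocities[0] = 0.0

# Swing the anchor like a moving arm; the chain lags, folds and settles behind it.
for t in range(100):
    step(np.array([0.3 * np.sin(t * 0.1), 0.0]))
print("tip position after the swing:", positions[-1])
```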
Cameron says he’s determined to make sure the sequels are
entertaining and laden with spectacle. At the same time, he’s imbued them with
themes that are important to him—environmental stewardship and the importance
of family.
“With Avatar and where I’ve chosen to take the story
and open up the landscape and the characters that I’ve brought in and some of
the questions that get asked, I don’t feel there’s anything that I need to say
cinematically that I will not say across these four films,” he says.
Shooting out Avatar 2-3-4
Not only has much of the performance
capture already been completed for some of the principal actors, but many scenes have
already been shot for Avatar 3 and some for the fourth in the series. Film
productions usually shoot all the scenes on a set before striking it, but this
time they were shooting scenes spanning later sequels so as not to have to
rebuild the sets.
Right from the start, editing too
had to take into account the story and character arcs spanning multiple movies,
since any change to a character in The Way of Water would impact Cameron’s
overall vision for the story arc through Avatar 4, which is not due to hit screens
until December 2026.
A fifth film is planned but yet to
be greenlit.