NAB
Told in real time, Hijack is the
Apple TV+ thriller starring Idris Elba that follows a hijacked plane as it
makes its way to London over a seven-hour flight while authorities on the
ground scramble for answers.
With so much of the action taking
place in midair, the production made extensive use of virtual production stages
and techniques. The show could have been shot against blue/green screen, but that
would have necessitated far more VFX and would not have given the actors the
experience of “seeing” the film’s environment during their performances.
Moreover, director and co-creator Jim
Field Smith was keen to achieve as much in-camera as possible, as production
VFX supervisor Steve Begg explained to Vincent
Frei at Art of VFX.
“He hates, as I do, the giveaway
camera positions and moves that signpost the unreality of a lot of CG shots, no
matter how beautifully they are lit and rendered. For example, shots like
flying up to a jet and passing through the window into the cockpit. We tried as
much as possible to make the shots look feasible in the real world. We never
have shots just outside the aircraft looking in, for example.”
In sequences featuring the Eurofighter jets,
the camera never leaves the canopy when the fighter pilots are on screen:
all such shots use locked-off cameras on wide lenses within the cockpit space.
Every other shot of the jets is either a long-lens POV or a wide,
non-subjective shot.
Director of photography Ed Moore elaborated on the
need for authenticity in a case
study posted to the Lux Machina website.
“Almost everyone has been on a plane, so if something doesn’t feel real to the
viewer, they’ll immediately be taken out of the story,” he said. “There are
visual cues we all recognise when we’re on a flight, like the sun’s beams
coming in and hitting your TV screen, for example. Little things like that make
the plane’s world feel real. It was important for me to immerse the audience in
it as much as possible, so when the hijack occurs, they feel like they are in
this pressure cooker with the passengers.”
The series was shot on four Volume
stages in the UK, all operated by Lux Machina and featuring a combination of
LED configurations. One stage was for the cockpit itself, situated on a fully
automated gimbal that faced an LED wall. Another stage was used to film the air
traffic control room and featured a large LED screen that played back the
flight map in real time, mimicking a real air traffic facility.
The set piece was a full-scale
230-seat Airbus A330. LED screens were installed on tracks on either side of
the plane, providing moving sky content and giving the actors the illusion that
they were in flight.
To create this illusion, Lux’s
Virtual Art Department (VAD) created assets containing an array of digital
clouds hovering above land, water or desert.
“We were able to create and control the content
that was put onto these LED screens, and put them on either side of the plane
to really create an immersive space for the actors to be in,” Lux Machina producer Kyle Olson explained
in a promo video.
Previs on the two main VFX sequences
was completed by NVIZ. As this was not an overt VFX project (though it still
contained 900 shots), the main creative work, comprising the jet shots, the
crash and a handful of matte shots, was assigned to a single major vendor,
Cinesite UK.
The imagery on the LED backgrounds
consisted mainly of cloudscapes and landscapes generated in Unreal
Engine and rendered as 12K Notch LC plates for playback.
One episode featured the plane flying
in from the Thames estuary into London, approaching Northolt and then landing at
Denham aerodrome in a spectacular crash. All the cockpit views of London and
the estuary were created using a stitched six-camera array shot from a
helicopter, played back at twice normal speed.
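A two-times speed-up of pre-shot footage amounts to a simple frame-index remap: every output frame samples the source two frames further along, halving the plate's travel time without changing the playback frame rate. A rough sketch of the idea (illustrative only; the show's actual playback system is not described at this level of detail):

```python
def remap_frames(source_frames: list, speed: int = 2) -> list:
    """Return the frames shown when the source is played `speed`x faster.

    Hypothetical illustration: stepping through the source by `speed`
    halves (for speed=2) the on-screen duration of the plate.
    """
    return source_frames[::speed]

plate = list(range(10))      # stand-in for 10 source frames
print(remap_frames(plate))   # [0, 2, 4, 6, 8]
```

Real playback systems would resample with motion interpolation rather than drop frames outright, but the index arithmetic is the same.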
“The plane crash was originally going
to be a lot wilder and audacious than the one we ended up with, with the A330
crash landing onto a motorway through loads of recently abandoned cars,” Olson
reveals in the video. “Then there was a bit of a reality check figuring out
this will not be believable and countering the reality factor we’ve built up in
the rest of the show, ultimately switching the location to a crash landing at
Northolt [on the outskirts of London]. Northolt is too short for an A330 landing,
we were told, adding to the jeopardy.”
During pre-production, Moore had a
four-screen flight simulator in his office, complete with cockpit controls. He
flew it for seven hours, tracking the same route as the show’s plane, to get a
sense of the lighting.
“It gave him a lot of ideas for the
type of imagery that he was interested in, and that imagery was provided to us as
a mood board, so to speak,” Lux CTO Kris Murray explained. “Our team could
recreate portions of that, or take inspiration from them, to create customized
versions of images that Ed could control.”
The system’s playback technology was
at the core of the virtual shoot, allowing LED screens to display footage of
cloudscapes, air traffic maps, and airport information monitors.
“We took the 3D workflow we used on large-scale
productions like House of the Dragon and mashed it with the
type of work we’d previously done using 2D plates,” said Murray. “That meant
building a pipeline that allowed us to export content, in the same format as
Unreal’s nDisplay, that could be played on a pre-rendered playback system.”
The VFX department shot master plates
of the London skyline from 5:00 a.m. to 9:00 p.m., which provided a range of
lighting options from sun-up to sundown, depending on what time of day the
script called for. With that content on the LED wall, no post-production work
was needed.
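To illustrate the idea (a hypothetical sketch, not Lux Machina's actual tooling): if a master plate exists for each capture hour between 5:00 a.m. and 9:00 p.m., choosing the background for a scene reduces to picking the plate whose capture time is nearest the time of day the script calls for.

```python
from datetime import time

# Assume (hypothetically) one master plate per hour of the shoot window.
PLATE_HOURS = list(range(5, 22))  # 5:00 a.m. through 9:00 p.m.

def closest_plate(script_time: time) -> int:
    """Return the capture hour of the plate nearest the scripted time."""
    minutes = script_time.hour * 60 + script_time.minute
    return min(PLATE_HOURS, key=lambda h: abs(h * 60 - minutes))

print(closest_plate(time(6, 40)))   # a dawn scene -> 7 (the 7:00 a.m. plate)
print(closest_plate(time(20, 50)))  # a dusk scene -> 21 (the 9:00 p.m. plate)
```

In practice the plates would be keyed by filename or timecode rather than a bare hour, but the nearest-match lookup is the core of it.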
“This set piece could easily have
been the most expensive location in the entire production — because prime real
estate in London with views of Big Ben and the River Thames would cost you an
arm and a leg to rent out,” said Spencer Chase, VP operator and technical
director for Lux Machina.
Begg said the scenes featuring
the Eurofighter cockpit were the most challenging.
Having elected to shoot them in the
270-degree volume on a motion base, the team assumed they’d “get a nice ambient
light wrap-around the cockpit and pilot.” They did, to an extent, but the canopy
reflected through more than 300 degrees, so the edges of the volume crept into
the reflections.
“Being a TV show it was shot in a mad
hurry (i.e., no testing time) and although everyone was initially quite happy
with the results, after closer scrutiny we saw all sorts of issues that needed
fixing. Lots!” Begg said.
“I’d anticipated we’d have problems
with the reflections in the visors, so we had them high-res scanned in order to
get a good track and replace everything in them with CG versions of the
cockpit, the pilot’s arms and the sky. The moment the reflections were sorted, the
shots really started to come together, with added high-speed vapor on top of the
Volume sky along with a high-frequency shake. I stopped them doing that in-camera, as
I had a feeling we’d be fixing these shots, big time. If we had, they would have
been a bigger nightmare to track.”