IBC
The visual effects category of the 2021 Oscars is wide open, with last summer’s blockbuster Tenet the frontrunner. But with none of the shortlisted films up for Best Picture - usually a guide to success - it is a tough category to call.
https://www.ibc.org/trends/oscars-nominations-inside-the-vfx-category/7373.article
UK-based facilities Dneg, Framestore and last year’s winner MPC Film are involved in four of the five nominations and arguably the fifth too. Toronto’s MR. X is nominated for its work on Love and Monsters, but it was Technicolor stablemate Mill Film that was awarded the contract in 2019. The facilities merged under the MR. X brand last June.
Tenet
Espionage thriller Tenet required 300 VFX shots, but maestro Christopher Nolan’s desire was to film as much as possible in-camera. “He wanted everything to feel grounded in the real world and not appear magic,” explains Dneg VFX supervisor Andrew Jackson.
Wrapping their heads around a story that works forwards and backwards, often in the same scene, was the first problem. “After the fourth script reading, I’d got to a point where I thought I understood, but trying to explain it to someone else was another matter,” Jackson says. “After a couple of months, we developed a muscle that we’d never used before - this ability to think forwards and backwards at the same time. What would it look like when an inverted event interacted with the forwards world?”
For example, when a car pulls away and its wheels spin, what does the kicked-up dust look like in reverse? Showing an inverted bullet hole is one thing, but how long has the hole been there before the inverted bullet is fired?
Theoretical physicist Kip Thorne, who previously guided Nolan’s time-warped Interstellar, was consulted. “He liked the idea of there being a little area of disorder in front of something and an area of order behind as an object moves through, as if the forwards world is pushing back on the inverted entropy of the object. The ratio was 70% - 30%.”
To put that another way, Dneg hit upon the idea that an inverted event was swimming against the tide. “Pushing against the normal flow of time would push back on the event,” Jackson says. “An explosion would explode out and at the same time the world would force it to go backwards. For a bullet hitting a wall you would see the hole in the wall and see a tiny bit of debris falling out as if the hole is still forming before the inverted bullet hits it.”
Still confused? Imagine having to translate these concepts to screen. To Nolan’s brief, Dneg came up with many practical ways of filming “with a twist” that would look real but unusual. For example, they dragged cars behind a truck going backwards while its wheels spun forwards, so the kicked-up dust spun ‘the wrong way’, and then reversed the footage so the dust appeared to be sucked into the wheels.
Much of this came into play during the main car chase sequence, filmed on location in Tallinn. The audience sees the action once from a linear point of view and later sees the same event in reverse. Both directions had to make sense and appear as a logical progression.
“Half the cars and characters are inverted on the same road with people driving forwards, transfers of characters to and from inverted vehicles. It’s a really complex interaction.”
Dneg built a pre-viz model of the sequence in Maya, including the physical positions of all cars and actors, to allow Nolan and DP Hoyte van Hoytema to scrub back and forwards to make sure it worked both ways.
The climactic battle required a ten-storey building to be simultaneously imploded and exploded. Dneg built two-thirds-scale miniatures, filmed them being blown up from matching camera angles, reversed one half of the footage and composited the two together.
More subtle touches help the audience understand what they are watching. Snow falls constantly during the battle scene, except that for half the time - when the scene is inverted - the snow is ‘falling’ upwards.
The One and Only Ivan
Creating photorealistic creatures that could act was the task set for 500 artists at MPC Film’s studios in London, Montreal and Bangalore. Disney’s family film centres on silverback gorilla Ivan (voiced by Sam Rockwell), who lives in captivity with other animals, including an elephant (Angelina Jolie).
The production began with a traditional set and location shoot of live-action actors, including Bryan Cranston, with puppeteers standing in for the key animated characters. This gave the actors an eyeline to play their scenes to and allowed DP Florian Ballhaus to frame shots.
Ivan’s physical movement was recorded on a motion capture stage using playback from the voice actors to govern the pacing of scenes. Director Thea Sharrock used a virtual reality headset to plan out her shots and best angles. CG artists used in-house software to simulate hair, skin and feathers.
The next stage was virtual production within the Unity game engine. The filmmakers used VR tools for dollies, camera heads, cranes and Steadicam to shoot the master scenes on a stage and view the pre-recorded animation clips from any perspective.
A large part of MPC’s 1055 shots involved photorealistic CG environments and digital set extensions. “A lot of care was taken to match the full CG shots to the cinematic look and feel of the live action sections of the movie,” explains production VFX supervisor Nick Davis. “Each of the camera lens’s distinctive features was replicated and subtleties like lens-breathing during focus shifts were included on full CG shots to give added realism.”
Ivan got special treatment. MPC’s Character Lab team researched gorilla bone structure, muscles, the nuances of movement and, most importantly, the eyes.
Davis adds, “Ivan’s eyes had to speak figuratively to the audience and his thoughts and emotions needed to be expressed through subtlety of expression.”
The Midnight Sky
In the George Clooney-directed The Midnight Sky a lonely scientist in the Arctic races to stop a group of astronauts from returning home to a mysterious global catastrophe.
Clooney admitted that his role as a stranded astronaut in Gravity (for which Framestore shared the Oscar for Best Visual Effects) helped him to conceive some of the space sequences, telling Vanity Fair, “once you’re in the antigravity kind of world…up isn’t up, and down isn’t down.”
Framestore were tasked by overall VFX supervisor Matt Kasmir to deliver nearly 500 shots (One of Us provided additional work). Notably, they used ILM’s ‘Anyma’ performance capture system to record actress Felicity Jones’ face and then transferred the facial performance to a body double. That solved the challenge created by Jones’ pregnancy, as she was unable to travel to the shoot location or hook up to a rig to simulate zero gravity.
“The head motion and eyelines were adjusted to work with specific actions and the position of the camera,” explains animation supervisor Max Solomon. “Considerable sensitivity had to be used though as it was surprising how quickly small adjustments made shots feel broken.”
To simulate weightlessness, cast members were shot suspended on wires, pre-vized by Clooney and cinematographer Martin Ruhe using the Nviz virtual camera system. Animation was done in Autodesk Maya, rendering in Freak (Framestore’s proprietary renderer) and compositing in Foundry Nuke.
The ‘Aether’ spacecraft was designed by Jim Bissell, production designer of E.T., with the support of Framestore art director Jonathan Opgenhaffen. “The buzzword was ‘topological optimisation’,” explains Opgenhaffen. “The ship’s components had to work practically. Its beauty stems from its functionality and the availability of existing and emerging technologies.”
Iceland doubled as the polar cap, but the amount of actual snow posed some problems. “Unfortunately, when they went to shoot the plates, a lot of the snow had melted so we had to replace it,” explains VFX supervisor Shawn Hillier. “We’d worked on snowy landscapes for the first season of His Dark Materials, but we really had the opportunity to push our snow shaders further to hold up in all of the close-up shots of the snow moving across the surface of the ground, with the light scattered through it.”
Mulan
The epic sweep of director Niki Caro’s and cinematographer Mandy Walker’s vision for Disney’s Chinese fable required multiple VFX shops under the command of supervisor Sean Andrew Faden.
Weta Digital took care of the Imperial City, using a LiDAR scan from a backlot in China’s Hubei province to measure distances and depth for use in CGI. The film’s rooftop chase sequences called for paint and roto work to remove stunt rigs and any glimpses of modern China in the background.
Framestore’s 300 shots included the Phoenix - Mulan’s ‘spiritual guide’ - and the design of the long sleeves of the Witch (played by Gong Li) that transform into weapons.
“The key was to ensure the sleeves moved in a realistic way, giving the sense that they are alive while keeping the look of cloth,” says VFX supervisor Hubert Maston. “A significant chunk of time was spent in achieving that balance.”
A sequence in which Mulan embarks on a journey to battle the Witch was filmed against blue screen on a partial set, providing a worldbuilding challenge with VFX backgrounds of large mountains, blocks of rock and a CG sky.
“It was quite a challenge to rebuild that volcanic environment filled with fully-simulated CG effects including steam and smoke and volume rendering for a realistic physical lighting,” adds Maston.
Sony Pictures Imageworks enhanced the third act battle sequence. From plates shot in a valley on New Zealand’s South Island, artists had to add a mountain that would be the site of an avalanche. To achieve this, they built a new system inside Houdini called Katyusha, for simulating snow and ice.
Faden also encouraged Imageworks to create what he called the ‘Pepsi Challenge’: running slow-motion footage of Mulan riding her horse Blackwind through a canyon side-by-side with the CG Blackwind. The CG Blackwind is seen in battlefield shots too chaotic and dangerous to be shot with a real horse.
For example, in a sequence in which Mulan races to catch up with her friend Chen when he is swept up in an avalanche, actress Yifei Liu was shot on a green mechanical buck which was replaced by a CG horse. Imageworks created a hair solver called Fyber that was less data-heavy so artists could integrate it into scenes faster.
Image Engine and Crafty Apes also worked on shots for this show.
Love and Monsters
Paramount’s monsterpocalypse Love and Monsters gave Technicolor’s Toronto-headquartered facility MR. X lots of scope to create several outlandish creatures. The trick was to give them personality.
“The performance goal was to create creatures with a sense of character, identity and individuality, born from their environment and circumstance,” the film’s animation supervisor, Matt Everitt, told AusFilm.
“For example, we have a creature that is 15 metres in length, blind and rises from its hiding place beneath the earth. It feels its way around the world through its 7-metre-long face tentacles, has hundreds of legs and moves like a snake.”
MR. X (then Mill Film) partnered with the production on set in Queensland in 2019, working with VFX supervisor Matt Sloan to deliver 463 shots. This included lots of blood, slime, goo, dust and debris to interact with, plus environment work.
For a monstrous crustacean called the Hell Crab, the team used heavy machinery on the beach set to destroy real-world props before integrating the CG crab, extra props and sand.
“One of the first conversations with [director Michael Matthews] was the moment when our hero Joel realises that the Hell Crab is actually a wounded soul, in need of its freedom. It’s a moment of subtle performance; a dilation of the pupils, micro darts and contractions in the eyes showing fear and a need for connection, not anger. Finally, a moment of connection between the monster and the man.”