Tuesday, 4 February 2020

The Mandalorian totally redefines CGI for television

RedShark
The Mandalorian’s digital backlot upgrades how we think about rear projection and matte paintings.
Star Wars ‘sidequel’ The Mandalorian lands in our universe next month with state-of-the-art visual effects that build on one of the oldest tricks in cinema’s book. Rear projection was introduced by Hollywood in the 1930s, typically as a moving backdrop to characters ‘driving’ a car or, more heroically, by Alfred Hitchcock to film Cary Grant being chased by a crop duster in North by Northwest. The concept was adapted to front projection (by Kubrick for 2001: A Space Odyssey) and then to green screen, with computer graphics backgrounds composited behind the live action in post.
There has always been criticism of green screen: directors and actors can’t visualise what they are playing against, and cinematographers can’t light the scene properly. It is also an inherently post-production process, adding time to the schedule and removing many creative decisions from the set.
Director Jon Favreau, who masterminded VFX breakthroughs in Disney films The Jungle Book and The Lion King, was having none of that when he was asked to guide The Mandalorian into being.
Taking the virtual production techniques developed while shooting those features, along with new technology from Disney’s VFX house ILM, The Mandalorian has redefined CG for TV.
It’s done so on a whopping budget compared to most series, rumoured to be north of US$100m for eight episodes, but this is Star Wars we’re talking about. It’s not only a heavily VFX-driven story but it’s carrying the Empire’s nit-picking fanbase into new territory, not to mention being the flagship for the Disney+ VOD service. So, it had to work.
Instead of investing in massive sets and occupying large soundstages (which Star Wars films have historically done at Pinewood), the show uses a series of rear-projected LED screens that essentially act as a real-time, in-camera replacement for green screen.
Called ‘the Volume’ during production and subsequently dubbed ‘Stagecraft’ by ILM, the technology’s real innovation is that when the camera moves inside the space, the parallax of the CG environment changes with it. If the camera pans to follow a character, the perspective of the environment shifts accordingly, recreating what a camera moving through that physical space would actually see.
This wouldn’t have been possible a few years ago, but compute power is now fast enough to render realistic 3D environments in real time. The rendering itself is powered by Epic’s Unreal Engine; game-engine technology of the same kind drove the digital sets of The Lion King.
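To make that concrete, here is a minimal sketch of the underlying maths: the long-established ‘generalised perspective projection’, the standard way to keep imagery on a fixed flat screen correct for a moving viewpoint. This is not ILM’s Stagecraft code and every name and number in it is made up for illustration; it simply shows how a tracked camera position becomes an asymmetric viewing frustum each frame.

```cpp
// Minimal sketch (illustrative only, not ILM's Stagecraft code): the
// textbook "generalised perspective projection" that keeps a flat screen's
// imagery in correct parallax for a tracked camera. Screen corners and the
// tracked camera position share one stage coordinate system.
#include <array>
#include <cmath>
#include <cstdio>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    double dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    Vec3 cross(const Vec3& o) const {
        return {y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x};
    }
    Vec3 normalised() const {
        const double len = std::sqrt(dot(*this));
        return {x / len, y / len, z / len};
    }
};

// Off-axis (asymmetric-frustum) projection for a planar screen defined by
// its lower-left (pa), lower-right (pb) and upper-left (pc) corners, seen
// from the tracked camera position pe. Column-major, OpenGL-style.
// (A full setup also needs a view matrix rotated into the screen's axes and
// translated by the camera position; omitted here for brevity.)
std::array<double, 16> offAxisProjection(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe,
                                         double nearP, double farP) {
    const Vec3 vr = (pb - pa).normalised();     // screen right axis
    const Vec3 vu = (pc - pa).normalised();     // screen up axis
    const Vec3 vn = vr.cross(vu).normalised();  // screen normal, towards camera

    const Vec3 va = pa - pe, vb = pb - pe, vc = pc - pe;  // camera -> corners
    const double d = -va.dot(vn);               // camera-to-screen distance

    // Frustum extents on the near plane; these shift as the camera moves,
    // which is exactly what changes the parallax rendered on the wall.
    const double l = vr.dot(va) * nearP / d;
    const double r = vr.dot(vb) * nearP / d;
    const double b = vu.dot(va) * nearP / d;
    const double t = vu.dot(vc) * nearP / d;

    std::array<double, 16> m{};
    m[0]  = 2.0 * nearP / (r - l);
    m[5]  = 2.0 * nearP / (t - b);
    m[8]  = (r + l) / (r - l);
    m[9]  = (t + b) / (t - b);
    m[10] = -(farP + nearP) / (farP - nearP);
    m[11] = -1.0;
    m[14] = -2.0 * farP * nearP / (farP - nearP);
    return m;
}

int main() {
    // Hypothetical stage: a 6 m x 3 m wall, 4 m in front of the origin.
    const Vec3 pa{-3, 0, -4}, pb{3, 0, -4}, pc{-3, 3, -4};
    // In production the camera pose would come from an on-set tracking
    // system, refreshed every frame.
    const Vec3 trackedCamera{0.5, 1.7, 0.0};
    const auto proj = offAxisProjection(pa, pb, pc, trackedCamera, 0.1, 100.0);
    std::printf("frustum skew terms: %.3f %.3f\n", proj[8], proj[9]);
}
```

On a real stage something like this would be recomputed once per frame from the tracking data, with the game engine then rendering the photo-real environment through that frustum onto the wall.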
As with The Lion King, The Mandalorian’s directors (including Bryce Dallas Howard and Taika Waititi) and cinematographer (Greig Fraser) used virtual reality headsets to pre-visualise scenes. The blocked and lit approximate scenes were assembled in editorial into a more refined cut with the right pacing, and that cut served as the template for ILM to pre-populate the rear-projection screens with photo-real landscapes prior to shooting.
For the actors this approach was beneficial since they could relate more to the story surroundings, for instance knowing where the horizon is, even if the screen was only in their peripheral vision. It also meant that light from the LEDs served as fill light on the actors and props, making the illusion much more complete and eliminating the need to remove green-screen spill in post. The result is that much of each episode can be shot ‘in camera’, a real boost to the faster turnaround of a TV schedule.
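As a rough, back-of-the-envelope illustration of why a bright image wall doubles as a light source (hypothetical numbers, not a description of the production’s actual lighting rig): the light arriving at an actor from the wall can be estimated by summing the cosine-weighted, inverse-square contributions of many small emissive patches.

```cpp
// Back-of-the-envelope sketch (hypothetical numbers): estimate the light
// ("irradiance") arriving at a point from a large emissive LED wall by
// numerically summing small Lambertian patches:
//   E = sum over patches of L * cos(theta_receiver) * cos(theta_emitter) / r^2 * dA
#include <cmath>
#include <cstdio>

int main() {
    // A 6 m x 3 m wall in the plane z = -4, emitting uniform radiance L.
    const double width = 6.0, height = 3.0, wallZ = -4.0;
    const double radiance = 100.0;               // assumed brightness, arbitrary units
    const double px = 0.0, py = 1.7, pz = -2.0;  // actor's face, 2 m from the wall
    // Receiver normal faces the wall (0,0,-1); each wall patch faces the actor (0,0,+1).

    const int steps = 200;
    const double dA = (width / steps) * (height / steps);
    double irradiance = 0.0;
    for (int i = 0; i < steps; ++i) {
        for (int j = 0; j < steps; ++j) {
            const double wx = -width / 2 + (i + 0.5) * width / steps;
            const double wy = (j + 0.5) * height / steps;
            const double dx = wx - px, dy = wy - py, dz = wallZ - pz;
            const double r2 = dx * dx + dy * dy + dz * dz;
            const double r = std::sqrt(r2);
            const double cosReceiver = -dz / r;  // both cosines reduce to the
            const double cosEmitter  = -dz / r;  // same term in this geometry
            irradiance += radiance * cosReceiver * cosEmitter / r2 * dA;
        }
    }
    // Compare against pi * radiance (~314 here), the value for a full
    // surrounding hemisphere of the same brightness, to gauge how much of
    // the subject's lighting the wall supplies -- and it is automatically
    // the right colour, because it is the scene itself.
    std::printf("estimated irradiance on the subject: %.1f\n", irradiance);
}
```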
“We got a tremendous percentage of shots that actually worked in-camera, just with the real-time renders in engine,” Favreau explained at Siggraph 2019. “For certain types of shots, depending on the focal length and shooting with anamorphic lensing, there’s a lot of times where it wasn’t just for interactive – we could see in camera, the lighting, the interactive light, the layout, the background, the horizon. We didn’t have to mash things together later. Even if we had to up-res or replace them, we had the basis point and all the interactive light.”
It’s also a digitally upgraded version of the matte paintings used in Hollywood’s golden age to lend epic scale to studio-bound productions. Half of a physical starship prop might be built on set, with the remainder completed as an LED illusion.
ILM’s design is also an advance on the system pioneered by international post-production group Stargate Studios. The US company built a multi-facility business with offices in Colombia, Vancouver and Malta (and at one point in London and Germany) offering a ‘virtual backlot’, designed as a low-cost alternative to shooting on location. It has featured on series such as The Walking Dead, Grey’s Anatomy and NCIS.
The Mandalorian is set five years after the events of Return of the Jedi but 25 years before the events of The Force Awakens and, like Rogue One, can be considered a side story (sidequel) rather than a pure prequel. Series two lands at the end of 2020.
