Friday, 1 April 2022

Behind the Scenes: 1899

IBC

article here

The Netflix drama, shot on a volume stage with a revolving turntable, is the new blueprint for episodic virtual production.

Netflix period drama 1899 is set on a migrant steamship on the high seas and was originally intended as a multi-location shoot in Spain, Poland and Scotland. Covid derailed those plans, and the show pivoted instead to shooting almost entirely on a virtual sound stage, becoming the largest production since The Mandalorian to do so.

Netflix helped fund the build of a new VP facility from scratch in order to make the series with Dark Ways, the production outfit of showrunners Jantje Friese (writer/producer) and Baran bo Odar (producer/director).

Dark Bay, the new volume stage at Studio Babelsberg, may have been born out of necessity but the ambition for the project was not curtailed. The technology was new to both Friese and bo Odar but that didn’t get in the way of the epic scope of their high seas adventure.

“What makes Dark Bay and this production special is that it emerged from the perspective of the creators,” says Philipp Klausing, 1899 producer and MD at Dark Ways. “Our entire approach was focused on and influenced by a showrunner capturing the image. That created a lot of industry firsts.”

Friese and bo Odar sought advice from Barry Idoine, the DP who had shot episodes of The Mandalorian. They hired Framestore, an Oscar winner for its virtual production VFX on Gravity, to become the show’s VFX and VP producers.

“The main challenge was the fact that the volume didn’t exist before we started shooting,” explains James Whitlam, MD, Episodic, Framestore. “It meant an enormous amount of testing. We had a test volume in London but it was nowhere near the same size so we couldn’t tell if it would work on screen. When we wanted to be testing on the actual volume, they were still pouring the concrete floor for the stage. It compressed our R&D into an incredibly uncomfortable place.”

The set would have to convincingly display life at sea. Large physical sets of the ship were constructed, and background plates were shot on the ocean for rendering in Unreal. The stage included rain and water atmospherics designed so that they wouldn’t damage the set. The alternative would have been to shoot in a tank with water cannons and greenscreen.

“Everybody is focused on the content on the wall and thinks that this is virtual production, but it is not,” says Klausing. “VP is about the composition of the foreground and the actors and giving them a true-to-reality environment to play in. We want to record them, not the background.”

Believable virtual assets (in-camera VFX) don’t come down solely to the quality of the real-time rendered content on the wall. The illusion is created by a foreground set that is lit to match and blends seamlessly with the VFX when you look through the camera.

“When you compose a picture, you need to take your physical set and the atmospheric effects (fog, rain, dust) and blend them with the virtual backdrop to create as authentic a scene as possible,” Klausing says. “The majority of scenes will have an actor moving through a door or interacting with an object or a character, so it’s how you design the physical space and the virtual space behind it that creates a perfect composition.”

Physical revolutions

In the case of 1899, there would have been huge gaps in the shooting schedule had Dark Bay not been designed with a revolving stage.

“We started with a blank piece of paper and the notion of making the volume a perfect cylinder with tiny doors just big enough to squeeze the actors in, so that your set is shootable in 360 degrees. But our show is set on a ship with massive set pieces. Some of these sets are 20m-wide decks or a big engine room. These set pieces, if you just bring them all through the small door, will take three to four weeks each to construct. This is clearly inefficient.

“So, the only way to make it work was to have a big opening – but if you have that then you lose the reverse shot. You can only shoot, say, in 200 degrees. That’s frustrating. We solved it with the rotating turntable.”

The entire stage can be rotated 180 degrees in 30 seconds. The camera always faces the same direction, but the filmmakers can quickly set up the reverse shot.

“The original idea was to give big set designs access to the stage faster, but it turned out to be an enabler for the shooting process. Your base light is set, the camera on a crane is set and you can quickly rotate the set and shoot.”
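For a sense of what that buys on the floor, here is a trivial back-of-the-envelope sketch in Python. It relies only on the figure quoted above (180 degrees in 30 seconds) plus the assumption of a constant rotation rate, which the production has not confirmed.

```python
# Rough timing for partial turns of the Dark Bay turntable, assuming the stage
# rotates at a constant rate and the quoted figure of 180 degrees in 30 seconds.

FULL_REVERSE_DEG = 180
FULL_REVERSE_SECONDS = 30

def rotation_seconds(angle_deg: float) -> float:
    """Time to rotate the stage through a given angle at the assumed constant rate."""
    return abs(angle_deg) * FULL_REVERSE_SECONDS / FULL_REVERSE_DEG

if __name__ == "__main__":
    for angle in (45, 90, 180):
        print(f"{angle:>3} degree turn takes about {rotation_seconds(angle):.0f} s")
```

Even a full reverse costs half a minute, which is why the turntable ended up shaping the shooting process and not just the set builds.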

DOP Nikolaus Summerer shot on the Alexa Mini LF with two cameras running simultaneously (pointing in the same direction) at up to 48 frames per second, synced for display on Roe Ruby 2.3mm pixel pitch panels. Arri modified its large-format anamorphic Alfa lenses for 1899 “so that bokeh fell perfectly into the wall”, reports Klausing. “We had no problems with moiré. With that we gained so much more movement inside the volume. All of a sudden we could move actors inside the space rather than just within a 10 sq m area.”
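It is worth unpacking why the bokeh matters. The thin-lens sketch below (Python, with hypothetical lens and distance numbers that are not taken from the production) compares the defocus blur of the wall on the sensor with the size a single LED pixel projects to: once the blur comfortably exceeds the projected pitch, the pixel grid, and with it the moiré risk, washes out.

```python
# A thin-lens sketch of why the LED pixel grid stops reading on camera once the
# wall sits in the defocused background. Illustrative only: it ignores anamorphic
# squeeze, the optical low-pass filter and debayering, and every number in the
# example run is an assumption rather than a setting from 1899.

def wall_blur_vs_pitch(pixel_pitch_mm: float, focus_dist_mm: float,
                       wall_dist_mm: float, focal_mm: float, t_stop: float):
    """Return (defocus blur of the wall on the sensor, projected LED pixel pitch
    on the sensor), both in mm. The grid washes out when blur >> projected pitch."""
    aperture_mm = focal_mm / t_stop                               # entrance pupil
    blur_mm = (aperture_mm * focal_mm * (wall_dist_mm - focus_dist_mm)
               / (wall_dist_mm * (focus_dist_mm - focal_mm)))     # circle of confusion
    projected_mm = pixel_pitch_mm * focal_mm / (wall_dist_mm - focal_mm)
    return blur_mm, projected_mm

if __name__ == "__main__":
    # 2.3mm pitch as quoted for the Roe Ruby panels; 50mm lens at T2.8, actor in
    # focus at 4m, wall 6m from camera -- hypothetical framing numbers.
    blur, pitch = wall_blur_vs_pitch(2.3, 4000, 6000, 50, 2.8)
    print(f"defocus blur {blur:.3f} mm vs projected LED pixel {pitch:.3f} mm")
```

With these illustrative numbers the wall’s defocus blur is roughly four times the projected pixel, the regime in which the panel reads as continuous background rather than a grid.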

On-set VFX workflows

Virtual production upends the traditional process by shifting VFX, usually the last shots to be finished in any picture, into the centre of production. That means there are potentially more voices impacting the creative decision making on set. New roles, such as VP Supervisor and the Virtual Art Department (VAD), can cloud lines of decision making and it was something that Dark Ways had to work through on 1899.

“I think the biggest learning is to merge these two worlds together,” Klausing says. “The on-set people [such as VAD] need to understand that there are technical issues that might take a little longer than expected. For example, you can move [set-up] a camera fast in a natural [regular set] environment. It’s not so easy in a VP world. With Steadicam, if you have a lot of movement and actors and someone bumps the camera then you have to reset the entire system and recalibrate. That said, we never experienced big delays with our tracking system.”

He stresses that they enjoyed a particularly close collaboration with Framestore and other key partners: “The end result was that we captured a lot of shots in-camera and made the illusion perfect.”

According to Whitlam, the team achieved its target of capturing more than half of the entire production in camera without needing much in the way of additional post treatment.

Normally, on an ambitious 2,500-shot show the schedule is so tight that the studio needs to hire multiple vendors: no single vendor can devote the resources to it from start to finish in the timeframe, and splitting the work also lessens the studio’s risk and gives it buying leverage.

The VFX for The Mandalorian was all done by ILM, and on 1899 it was all controlled by Framestore.

“We got a great result because of that,” Whitlam says. “We didn’t have the additional complication of multiple VADs from different companies trying to feed into one environment. We’re seeing numerous shows that will split the VAD between different companies. Some of these vendors are doing it for the first time and delivering content to a team trying to get it all to fit together and look like it’s in the same world.”

One piece of money-saving advice is to create just the section of the digital asset that is required for the scene. There’s no point building a photoreal 360-degree world if you can hide portions of it with a physical set or will only ever show, say, 200 degrees of it on screen.
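To put a number on that, the sketch below (Python; the shot list, sensor width and helper names are invented for illustration) lists the horizontal coverage arcs of a scene’s planned set-ups to show how little of a full 360-degree backdrop may actually need building at full fidelity.

```python
# Sketch of the "build only what you will show" arithmetic: given planned camera
# headings and focal lengths for a scene, estimate which slices of the backdrop
# need full-fidelity assets. A real VAD would do this per shot inside the engine
# with the actual lens package; everything here is illustrative.

import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view of a spherical lens (thin-lens approximation)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def coverage_arcs(setups, sensor_width_mm: float):
    """setups is a list of (heading_deg, focal_length_mm) pairs; returns the
    (start, end) arcs in degrees that the backdrop must cover (not normalised)."""
    arcs = []
    for heading_deg, focal_mm in setups:
        half = horizontal_fov_deg(sensor_width_mm, focal_mm) / 2
        arcs.append((heading_deg - half, heading_deg + half))
    return arcs

if __name__ == "__main__":
    shot_list = [(0, 32), (40, 50), (-60, 75)]   # hypothetical headings and lenses
    for start, end in coverage_arcs(shot_list, 36.7):
        print(f"needs backdrop from {start:.1f} to {end:.1f} degrees")
```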

“On 1899 our volume supervisor [Jack Banks] interacted a lot with production designer Udo Kramer and the DP. Our VP Supervisor [Alyssa Mello] was running the floor and she had a really strong relationship with the 1st AD on set. The two of them acted as traffic control, making sure everybody knew what needed to happen and was in the right place at the right time to keep the thing flowing.

“We got into a flow where we weren’t treading on each other’s toes. But what we did on 1899 worked because of the particular individuals. Whether that’s a template for what we do in future I don’t know. We need to do 10 of these jobs to really know.

“But we’re all sharing our experience. There is an incredible sense of camaraderie between different companies and providers because we can all see the power of this technology and we don’t want to scare filmmakers off. We want to see this succeed. Over the next couple of years we want people to have a good experience with this tech and see how it can benefit them.”

Standardisation of methodology

Klausing thinks the age-old convention of telling a story from the creative perspective of a director or showrunner won’t change but that everyone involved needs to adapt lines of decision making to keep this true.

Another issue is that every volume stage – and there are now dozens worldwide – is bespoke and custom-built, which hinders quick set-up times.

“You could build an asset that works fine in Babelsberg and take it to another volume with different servers and it wouldn’t work without further optimisation,” Whitlam says. “There is no standardisation of the machines that are driving the screens and therefore every single time you go to a different stage with an asset you are going to have to modify it so it works for that particular environment.”
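A minimal per-stage profile is one way to make that portability problem explicit before an asset travels. The sketch below is an assumption about what such a profile might record, not a description of how Dark Bay, Framestore or any real volume catalogues its hardware; the field names and numbers are invented.

```python
# Hypothetical per-stage profile: every volume has a different canvas, pixel
# pitch and render-node budget, so an environment tuned for one stage needs
# checking (and usually re-optimising) before it moves. Fields and figures are
# illustrative assumptions, not a real specification.

from dataclasses import dataclass

@dataclass
class StageProfile:
    name: str
    canvas_width_px: int        # total LED canvas resolution fed by render nodes
    canvas_height_px: int
    pixel_pitch_mm: float
    render_nodes: int
    target_fps: int

@dataclass
class EnvironmentAsset:
    name: str
    authored_width_px: int      # canvas resolution the environment was authored for
    authored_height_px: int
    measured_fps_per_node: float   # profiled frame rate on a comparable render node

def needs_reoptimisation(asset: EnvironmentAsset, stage: StageProfile) -> bool:
    """True if the asset was authored for a smaller canvas or cannot hold frame rate."""
    under_resolved = (asset.authored_width_px < stage.canvas_width_px or
                      asset.authored_height_px < stage.canvas_height_px)
    too_slow = asset.measured_fps_per_node < stage.target_fps
    return under_resolved or too_slow

if __name__ == "__main__":
    stage_a = StageProfile("Volume A", 18000, 4000, 2.3, 10, 48)
    ocean = EnvironmentAsset("ocean_horizon", 16000, 4000, 52.0)
    print(needs_reoptimisation(ocean, stage_a))   # True: canvas wider than authored for
```

Even a checklist this crude surfaces Whitlam’s point: without a shared description of the stages themselves, every move between volumes becomes a bespoke optimisation job.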

It’s not just the technology; the skillsets of crews are variable too.

He adds: “You could have a great New York street alley as your virtual backdrop but if you haven’t built a believable foreground, if you don’t have a DP who is sympathetic to the technology and knows how to light it, if you’ve got an inexperienced VAD, then your results won’t be the same in volume A as they are in volume B.”

Standardisation and education issues are being addressed by numerous overlapping initiatives including by SMPTE, EBU, VES, disguise and Netflix.

Framestore wants to iron out standardised hierarchies on set, job descriptions, pay rates and nomenclature. “Basically, a language book not just for delivery of content but who does what and who talks to who,” Whitlam says.

Klausing calls for an independent open source library of digital assets so that producers can pick from a database of pre-existing background scenes and, with some enhancements, get up and running quickly.

“I think this is essential because, with the exception of a few tentpole productions, all projects need prep time to get casting together, to get the script in order and to close financing. It is not possible in terms of time and cost to commit to such a heavy VFX investment upfront. A library takes away this risk. Right now, this is the biggest obstacle preventing this technology from breaking through. We need all the big players, the streamers, to come up with a revenue model, to unify the pipeline work and make it accessible for everyone.”
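What would such a library look like in practice? The sketch below imagines a minimal searchable catalogue so a production could shortlist near-fit backdrops during prep, before committing to heavy VFX spend. The schema, entries and helper are invented for illustration; no such shared database currently exists.

```python
# Hypothetical catalogue entry for a shared virtual-production asset library:
# lightweight, searchable metadata per environment so productions can shortlist
# candidates early. Schema and entries are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class LibraryEnvironment:
    name: str
    era: str                    # rough period the asset reads as
    setting: str                # e.g. "ocean", "harbour", "city street"
    coverage_deg: int           # how much of a 360-degree backdrop is built
    engine_version: str
    tags: list = field(default_factory=list)

CATALOGUE = [
    LibraryEnvironment("north_atlantic_swell", "1890s", "ocean", 360, "UE5.1", ["night", "fog"]),
    LibraryEnvironment("period_dockside", "1900s", "harbour", 220, "UE5.1", ["dawn"]),
    LibraryEnvironment("modern_alley", "present", "city street", 180, "UE5.3", ["rain"]),
]

def shortlist(setting: str, min_coverage_deg: int):
    """Return catalogue entries matching a setting with at least the needed coverage."""
    return [env for env in CATALOGUE
            if env.setting == setting and env.coverage_deg >= min_coverage_deg]

if __name__ == "__main__":
    for env in shortlist("ocean", 200):
        print(env.name, env.era, env.tags)
```

Any real shared library would of course need far richer metadata (rights, licensing, LOD tiers, per-stage performance notes), but the prep-time argument Klausing makes starts with being able to search at all.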
