IBC
The Lord of the Rings: The Rings of Power is the new reference point for postproduction in the cloud. Arguably no show of this scale has worked in the cloud so comprehensively as the Amazon show.
Across the eight hours of the first series a mammoth 38,090 shots were captured, of which 9,164 were visual effects shots. By contrast, Marvel movies routinely run to 2,000-3,000 VFX shots. Footage totalled 1,648 content hours, and since it was shot in 4K UHD that translated to 860TB of data. The post-production pipeline for the show was built entirely in the cloud, from the ingest of camera raw and metadata (including the whole gamut of image science) through VFX pulls, returns, conform and finishing, to all deliverables.
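As a rough sanity check on those figures (this is simple arithmetic on the article's own totals, not production data):

```python
# Back-of-envelope check on the stated totals.
total_bytes = 860e12      # 860 TB of captured 4K UHD data
content_hours = 1648      # 1,648 hours of footage

bytes_per_hour = total_bytes / content_hours
avg_bitrate_gbps = bytes_per_hour * 8 / 3600 / 1e9

print(f"{bytes_per_hour / 1e9:.0f} GB per content hour")   # ~522 GB/hour
print(f"{avg_bitrate_gbps:.2f} Gb/s average camera data")  # ~1.16 Gb/s
```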
Production was based in Auckland but drew on 12 VFX shops around the world, who in turn farmed work out to 1,500 artists, plus hundreds more artists in New Zealand, all linked and collaborating on assets in the cloud.
You can argue that an Amazon show, which over the course of five planned series is budgeted at $1 billion, can afford to put all its content in the cloud. You might also imagine that AWS would provide favourable rates of storage, movement, ingress and egress for the flagship show of another part of its business.
But putting all of your eggs in one basket (in this case, an S3 bucket) might be deemed a risky venture, given that by their own account the team had no idea how to build such a globally interconnected cloud pipeline when they set out.
The innovation seems to have paid off.
“That I would be able to manage and produce all of the technical departments from pre-visualization to exhibition, meaning all of the technical trades including camera capture, colour pipeline, editorial, post-production, visual effects, final archive and delivery, would be connected and interconnected, was the real key, the power,” said POTR producer Ron Ames.
“This is not about a technology as a standalone thing. It is actually practical, it is useful, it is efficient. It is about making art.
“So when we talk about the MovieLabs 2030 Vision, it's right now. It's doable and it's actionable and it is useful.”
Starting with a map
Author JRR Tolkien described his process of creating the world of Middle-earth as beginning with a map. Trying to make the story fit the other way around, he wrote, “lands one in confusions and impossibilities and in any case it's weary work to compose a map from a story.”
So, the Rings team started with a map also. “We recognised that we were going to have to plan this very carefully,” Ames said. “We moved to New Zealand at the end of 2019 and started to build a cloud production map.”
This came into its own after three months of shooting, when Covid hit and production was forced to shut down. They reverted to cloud-based production from home.
“We were doing cloud production anyway but Covid made it a requirement and sent us to the next level of development.”
Perhaps the most important aspect of the pipeline was metadata management. It was the glue that enabled hundreds of people to work together on assets at the same time in multiple locations.
“We think of these things as dry and technical but they're not - they're art making,” Ames said. “Underlying all of this are thoughts and ideas - notes about a note of music or a single line of dialogue. Every one of those was a potential asset.”
Metadata is most precious
Metadata collection was standardised, centralised and comprehensive. Lens and camera metadata was captured on set automatically from ARRI DNA smart lenses.
Each department had its own iPads and its own software that fed into QTAKE, and the data was then attached to every frame on ingest via Moxion, Autodesk's review tool. The metadata was logged and tracked in ShotGrid and stayed with the shot all the way through editorial and sound mixing and beyond.
“We could always identify the nature of that frame, where it came from and how it was connected to the rest,” Ames said. “Each department only cared for the metadata that was important to them so we knew the information was accurate.”
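As an illustration only (the field names below are invented, not the show's actual schema), the per-frame record such a pipeline tracks might look something like this in Python:

```python
from dataclasses import dataclass, field

# Hypothetical per-frame metadata record, loosely modelled on the article's
# description: lens/camera data captured on set, departmental notes layered
# on top, everything travelling with the shot through post.
@dataclass
class FrameMetadata:
    shot_id: str                      # e.g. "EP101_0420"
    camera: str                       # camera body identifier
    lens: str                         # smart-lens metadata source
    focal_length_mm: float
    t_stop: float
    colour_space: str                 # image-science pipeline tag
    department_notes: dict = field(default_factory=dict)

frame = FrameMetadata(
    shot_id="EP101_0420", camera="A-cam", lens="DNA LF 57mm",
    focal_length_mm=57.0, t_stop=2.8, colour_space="LogC/AWG",
)
# Each department only touches the keys it owns, which is why the
# information stays accurate.
frame.department_notes["costume"] = "beard layout v3 approved"
```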
Remarkably, all footage was whisked from set to cloud without any on-the-ground LTO tape backups.
“We were actually concerned that we would lose data or that somehow or other it might be corrupt because there's no checksum,” said Ames.
But none of that was required. Instead, they used just one S3 bucket - “one bucket to rule them all” - with push and pull permissions for all artists and vendors.
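The article doesn't describe the actual bucket layout, but a minimal sketch of the pattern it implies - a single bucket, per-vendor key prefixes, and checksummed uploads to address the integrity worry above - might look like this with boto3 (the bucket and prefix names are hypothetical):

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "one-bucket-to-rule-them-all"   # hypothetical name

def push_asset(vendor: str, local_path: str, key: str) -> None:
    """Upload under the vendor's prefix; S3 verifies the SHA-256
    checksum server-side, so a corrupt transfer fails loudly."""
    with open(local_path, "rb") as f:
        s3.put_object(
            Bucket=BUCKET,
            Key=f"vendors/{vendor}/{key}",
            Body=f,
            ChecksumAlgorithm="SHA256",
        )

def pull_asset(vendor: str, key: str, local_path: str) -> None:
    """Download a shared asset; IAM policies scoped to the
    vendors/<name>/ prefix would supply the push/pull permissions."""
    s3.download_file(BUCKET, f"vendors/{vendor}/{key}", local_path)
```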
The 9,000+ visual effects shots were divided among twelve vendors including ILM, Weta, Plains of Yonder, Outpost, Method Studios, Atomic Arts, Rodeo FX, Rising Sun Pictures, Cause and FX and Cantina - all of different sizes, with different internal pipelines, spread all over the world.
Ames said, “The price of admission to join us was a willingness to explore new technology, to share assets in a standard way so that nobody could put a gadget or a gizmo on a visual effect that meant somebody else couldn't use that asset. So by creating a standard we determined that we could actually do this.
“We doubled, even tripled, our data because we had never done this before but by the end we had total faith. We never lost a frame, we never lost an asset. All of it was trackable and easily shared with our vendors.”
None of the production team had paper scripts. Nothing was kept in any form other than digital, and everything could be shared immediately via iPads. When producing under Covid protocols, team members needed to be separated, so they put monitors around the stage “like a TV network” so that all three shoot units could collaborate at a distance using iPads.
Collaborative production
Since no-one can know which asset might have value, they tracked every single iteration from the very beginning. Concept art, set drawings, tests - all materials created at every stage of the show were shared on the cloud platform. This meant that departments like marketing could access work much earlier than normal.
“The ability to share in this democratic way changes everything,” Ames said. “I cannot describe the power of it. Walking onto a set and having a cut or a diagram or a note from the showrunners - anything that was necessary was available at all times at the push of a button, securely with permissions.
“I could ask someone to show me every asset to do with King Durin III and it would show me a list of a set drawing, a beard layout, a costume note, all of which could be easily found and shared.”
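The article doesn't say how such lookups were surfaced, but since everything was tracked in ShotGrid, a query along those lines could be expressed with the shotgun_api3 Python client roughly like this (the server URL, credentials and field names are placeholders):

```python
import shotgun_api3

# Hypothetical connection details; real ones would come from the studio.
sg = shotgun_api3.Shotgun(
    "https://example.shotgunstudio.com",
    script_name="asset_lookup",
    api_key="REPLACE_ME",
)

# Find every asset whose name mentions King Durin III, whatever
# department produced it: set drawings, beard layouts, costume notes.
assets = sg.find(
    "Asset",
    filters=[["code", "contains", "Durin"]],
    fields=["code", "sg_asset_type", "description"],
)
for asset in assets:
    print(asset["code"], "-", asset["sg_asset_type"])
```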
Into post
They created a common asset methodology using USD, Universal Scene Description (originally devised by Pixar, and widely adopted by houses like ILM and Weta, as a means of describing and exchanging 3D scenes).
“You can look at every one of our assets and know precisely what it is and it could be shared with any other vendor,” Ames said. “And no vendor could get into trouble.”
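As a flavour of what a USD-based asset convention enables (this is a generic usd-core example, not the show's actual asset structure):

```python
from pxr import Usd, UsdGeom

# Create a self-describing asset that any vendor's pipeline can open.
stage = Usd.Stage.CreateNew("durin_crown.usda")   # hypothetical asset name
root = UsdGeom.Xform.Define(stage, "/DurinCrown")
stage.SetDefaultPrim(root.GetPrim())

# Standard layer metadata travels with the file, so a receiving vendor
# knows exactly what it is without bespoke gadgets or gizmos.
layer = stage.GetRootLayer()
layer.comment = "Hero prop - King Durin III crown"
layer.customLayerData = {"show": "POTR", "department": "props"}
layer.Save()
```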
4K colour pipeline
With the showrunners, DPs and directors in New Zealand watching a reference monitor, a colourist in Idaho was able to run real-time collaborative DI sessions.
“We could watch the colour just as if we were in the room with him,” Ames said. “All the files were cloud-based with no local files anywhere.”
Archive
Ames has worked on digital restoration projects for Powell and Pressburger's classic The Red Shoes and for Martin Scorsese's Gangs of New York, and knows only too well the perils of storing nitrate film.
“When I asked for the answer prints to Gangs of New York no one, including Marty, knew what the colour was supposed to look like. If you’ve ever tried to restore something from LTO tape it's near impossible. Now, we have the technology to keep assets forever in the cloud with colour information and everything that is required.”
Localisation and QC
The team were even able to work from home on the massive QC and language localisation project by spinning up virtual machines and moving giant files onto and around the cloud platform.
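The article doesn't detail the machine types involved, but spinning up a cloud workstation for a QC pass is, in the simplest boto3 terms, something like the sketch below (the AMI, instance type and region are placeholders):

```python
import boto3

ec2 = boto3.client("ec2", region_name="ap-southeast-2")  # assumed region

# Launch a GPU-backed virtual machine for a remote QC session, then
# terminate it when the pass is done so it only costs money while in use.
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder workstation image
    InstanceType="g4dn.xlarge",        # hypothetical GPU instance
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]

# ... run the QC or localisation pass against files in S3 ...

ec2.terminate_instances(InstanceIds=[instance_id])
```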
Season 2
Storage of every asset in the cloud also gives the team a jump-start on Season 2, which is already filming. “Every single piece of artwork that we created, every piece of cut footage, music, sound, concept art is all at our fingertips.”
“When we're on set and a director is chasing the light and it's dirty and it's cold, we have every possible asset at our fingertips and can share it. When we had crews spread across New Zealand and artists all over the world, we were able to connect and communicate efficiently, and scale as necessary to meet all of our goals and deadlines cost-efficiently.”
At the end of Martin Scorsese’s The Aviator (on which Ames worked), Leonardo DiCaprio, playing Howard Hughes, stands in front of the mirror and says to himself over and over, “The wave of the future.”
“That's what we did every single day,” Ames said. “We knew that this is the wave of the future and the future is now.”
Company 3 Synapse
Company 3 was the lead technical partner, building the whole post-production environment in the cloud for POTR.
“We took the traditional facility and put it into the cloud,” said Weyron Hendriques, SVP of product development at Company 3. “On the finishing side we also have HDR and SDR reference-quality streams distributed over AWS. We used the whole technology to collaborate on the show, with a colourist in Idaho and conform on the US West Coast liaising with facilities in New York, Auckland and London.”
Company 3 integrated its workflow automation system, Synapse, into AWS. The concept is to provide uniform, high-quality content to downstream post-production pipelines. The tool handles dailies ingest for all on-set capture, syncs the sound with picture, checks the quality of what was shot and makes QuickTimes for editorial.
“Modern productions use many camera platforms with multivariate colour spaces, picture geometries and metadata formats,” explained Hendriques. “Synapse connects the metadata to all of the camera source files. This allows the system to create editorial, VFX and finishing deliverables with the correct colour and geometry.”
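Company 3 hasn't published how Synapse models this, but the underlying idea - keying deliverable recipes off per-camera metadata so every output gets the right colour and geometry - can be sketched in a few lines of Python (all names and values here are invented):

```python
# Hypothetical per-camera profile table: the metadata harvested with each
# source file selects the colour transform and reframe for deliverables.
CAMERA_PROFILES = {
    "alexa_lf": {"colour": "LogC/AWG -> ACEScct", "geometry": "4.5K OG -> 4K UHD"},
    "sony_venice": {"colour": "S-Log3/S-Gamut3 -> ACEScct", "geometry": "6K FF -> 4K UHD"},
}

def deliverable_recipe(source_metadata: dict) -> dict:
    """Look up the correct colour and geometry handling for one clip."""
    profile = CAMERA_PROFILES[source_metadata["camera_type"]]
    return {
        "clip": source_metadata["clip_name"],
        "colour_transform": profile["colour"],
        "reframe": profile["geometry"],
    }

print(deliverable_recipe({"camera_type": "alexa_lf", "clip_name": "A001C003"}))
```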
On POTR, on-set work and dailies were handled in Auckland, with conform in Sydney. Dailies of camera raw and metadata were uploaded to a Sydney AWS bucket. Synapse harvested the camera raw and metadata from that bucket and built its database in the Synapse core in Los Angeles.
For VFX, the editorial team would submit a visual effects pull request into Synapse. Synapse would then orchestrate its virtual machines in Sydney to process the VFX pulls from the camera raw and deliver the processed pulls to each vendor’s cloud bucket. The process ran in reverse, as a round trip, for VFX returns.
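As a very rough sketch of that round trip (this is not Synapse's actual code; the bucket names, prefixes and processing step are all stand-ins):

```python
import boto3

s3 = boto3.client("s3", region_name="ap-southeast-2")

DAILIES_BUCKET = "potr-dailies-sydney"      # hypothetical
VENDOR_BUCKETS = {"ilm": "ilm-potr-in", "weta": "weta-potr-in"}  # hypothetical

def process_pull(shot_id: str, vendor: str) -> None:
    """Deliver processed camera-raw frames for one shot to a vendor."""
    frames = s3.list_objects_v2(Bucket=DAILIES_BUCKET, Prefix=f"raw/{shot_id}/")
    for obj in frames.get("Contents", []):
        # In the real pipeline a VM farm would debayer/convert here;
        # this sketch just copies each object across server-side.
        s3.copy(
            {"Bucket": DAILIES_BUCKET, "Key": obj["Key"]},
            VENDOR_BUCKETS[vendor],
            f"pulls/{shot_id}/{obj['Key'].split('/')[-1]}",
        )

def receive_return(shot_id: str, vendor: str) -> None:
    """Round trip: pick up the vendor's finished shot for conform."""
    s3.copy(
        {"Bucket": VENDOR_BUCKETS[vendor], "Key": f"returns/{shot_id}.exr"},
        DAILIES_BUCKET,
        f"returns/{shot_id}.exr",
    )
```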