Wednesday, 15 July 2020

MPEG at war: The future of video codecs beyond 2020

IBC

The codec wars that have been brewing for the past few years have claimed their most high-profile casualty. MPEG, the very body that has led audio and video standards for over three decades, has ceased to exist – at least if you believe its co-founder and chair Leonardo Chiariglione.
He resigned at the beginning of June, blaming the “feudal” and “obtuse” bureaucrats of MPEG’s overseers at the International Organisation for Standardisation (ISO).
https://www.ibc.org/trends/mpeg-at-war-the-future-of-video-codecs-beyond-2020/6246.article
“MPEG is dead as the thing that has existed for 32 years,” he tells IBC365. “If it survives it is another thing entirely.”
Chiariglione has made the website he administered for MPEG a tombstone for the organisation but the body itself continues to live on within ISO.
“Don't believe what you read,” warned Larry Horn, President and CEO, MPEG LA in a statement to StreamingMedia. “MPEG’s doors are not closing.  Its foundation is strong, its data flows, and its work will continue to leave a timeless mark.”
So, what’s the truth? On the one hand this can be dismissed as so much office politics, on the other it threatens to plunge video compression including next-gen knight-to-the-rescue Versatile Video Coding into an uncertain future.
IBC365 unpicks the bones.
Since its establishment in 1988, the Moving Picture Experts Group has developed a peerless portfolio of standards and technologies that by its own estimation has created an industry worth more than $1.5 trillion a year, or 2% of the world gross product.
Quibble with that figure if you will but there’s little doubt of the role MPEG has played in assisting the transition from analogue to digital media.
As codec expert Jan Ozer puts it, “MPEG started with the promise to end ‘format wars,’ and it delivered on it mostly—with clear successes of MPEG-1, MPEG-2, and MPEG-4/AVC (aka H.264).”
There are dissenting voices. Yuriy Reznik, Video Technology Fellow at Brightcove says MPEG’s past is not a continuous success story, but rather “a cascade”, including periods when it played catch-up or was reduced to secondary function.
The much-celebrated MPEG-4 AVC (H.264) came about not because of MPEG’s initiative, but because of the ITU-T, which had been working on the project since 1998, he says.
“HEVC and VVC were also joint projects with ITU-T, so yes, [they] became another successful MPEG codec, but MPEG’s function in its creation was rather secondary,” he says.
Companies whose IP is included in MPEG standards have, however, been handsomely rewarded. According to Chiariglione, MPEG-2 patent holders shared revenues of US$1bn a year during the validity of the MPEG-2 standard.
Most IP holders reinvested in new technologies that have fed MPEG’s virtuous cycle of more than 180 standards, including six video coding standards and six audio coding standards.
Unfortunately, as Chiariglione writes, only in fairy tales does the story end with happily ever after.
“MPEG did not live happily after MPEG-2 because most MPEG-2 IP holders had difficulty adjusting to internet video distribution,” he says.
By anyone’s account, HEVC has been a debacle. The technology is sound but its administration muddled with a war of words and attrition simmering among patent holders.
“The HEVC standard has some use in broadcasting, but its use on the web is estimated to be at 12%,” Chiariglione says. “If one considers that broadcasting is a declining market and video on the web is constantly rising, one understands that ISO standards will be gradually relegated to a more and more marginal market.”
AOM rival
Into the void stepped Amazon, Cisco, Google, Intel, Microsoft, Mozilla and Netflix. They developed AV1 as a viable competitor to HEVC for streaming and in so doing have made the Alliance for Open Media (AOM) the most likely successor to MPEG.
Meanwhile, Chiariglione’s attempts to change policy at ISO level were falling on deaf ears. Some 57.5% of all patent declarations submitted to ISO relate to MPEG standards, yet to its former chair ISO is an “ungovernable secretariat” incapable of change.
“The problem was not internal to MPEG but due to poor handling of the standards outside of the organisation,” he maintains. “ISO is a feudal organisation. If you are a peasant no one listens to you. If you are a duke maybe they do.”
A recent restructuring of the working groups and technical committees by ISO was the last straw.  Seemingly this left the Italian without the autonomy he had previously enjoyed and triggered his departure.
“I was being asked to undo what I had created and therefore I left,” he says sadly. “A more indecent offer could not have been conceived.”
To rub salt in the wound, Gary Sullivan, the electrical engineer who led the development of H.264 and HEVC and who is also an employee of AOM founder member Microsoft, has been elected chair of the new group (ISO SC29) overseeing MPEG.
“A candidate for that position should not even have the suspicion of the appearance of a conflict of interest,” says Chiariglione.
The fate of VVC
So where does this leave MPEG’s next-gen video standards?
Three MPEG codecs are due for release in 2020: Versatile Video Coding (VVC); Essential Video Coding (EVC), otherwise known as MPEG-5 Part 1, with dual baseline and main profiles; and MPEG-5 Part 2, Low Complexity Enhancement Video Coding (LCEVC), which is largely a rebranding of technology from V-Nova.
Technically, nothing is expected to derail these. “The train is on the track and unless deliberately stopped it will continue,” Chiariglione says.
VVC/H.266 was finalised this month and promises to cut bit rates attainable with HEVC in half.  According to Ozer, “EVC is a response to AV1’s supposedly royalty-free status and the need for licensing simplicity. LCEVC responds to the increasing CPU requirements for encode/decode which have become burdensome.”
As with any codec, industry adoption will depend on technical merit, ease of implementation and business cases. “The change in MPEG’s top-level structure does not mean that it will be less effective,” says Reznik. “All sub-group chairs that have been driving technical work are the same. All member countries, companies, and individual experts are also the same. I don’t see any impact on innovation in the near term.”
That said, seeking to land three codec schemes at the same time “is evidence of an internal struggle, lack of clear thinking, and some bad decisions made in the process,” argues Reznik.
“In retrospect, it is clear that EVC could have been avoided by more thoughtful requirements and selection processes in VVC, and LCEVC could have been left alone at V-Nova without calling it an MPEG standard and introducing confusion about it somehow being related or helpful to EVC.”
More pertinently, Ozer observes that the landscape in which H.264/AVC succeeded as a one-size-fits-all codec is now vastly changed. Launching a single codec every ten years no longer works.
“We may end up with seven video coding formats that will need to be considered for future video products and services by content distributors,” suggests Christian Timmerer, Head of Research and a co-founder at Bitmovin. “In other words, the multi-codec scenario is becoming a reality [which raises] some interesting challenges to be addressed in the future.”
“We need competition to get the licensing terms of MPEG under control,” says Thierry Fautier, Vice-President, Video Strategy at Harmonic. “If, at the end, the codec is a commodity, then it might become royalty free. In a unicast world, there is by definition room for multiple codecs, meanwhile there is a cost associated to it, and the market will regulate how much it can accept.”

Future media, future standards
It is not, then, the current crop of codecs we should be concerned about but the future of the digital media technology which is, per Chiariglione, “developing at an aggressive rate.”
Codec development costs millions if not billions of dollars and takes many years to deliver. MPEG succeeded in balancing reasonable access to its technologies for users with a reasonable return and incentive for inventors, in turn helping to make the most widely used standards in consumer electronics history available to the mass market.
In Chiariglione’s opinion that model is broken while AOM’s royalty-free imperative does not drive a virtuous cycle of investment in R&D. 
“AOM could do what I was prevented from doing by having the freedom to go on to create a good standard,” he says. “But I am of the strong opinion that if AV1 is the only game in town then innovation will suffer incredibly. Once you have control of the market with a standard you no longer have a motivation to go beyond that.”
AV1 is not perhaps as royalty-free as first thought. IP administrator Sisvel unveiled licence costs for certain patent holders earlier this year. 
Chiariglione continues, “MPEG was unique in the sense that we controlled the market but inside of that was a continued search for innovation because it was coupled with royalties. I don’t see how AOM can deal with a world of continuous innovation in capture and display technologies.”
In his blog he even contends, “The future is one where thousands of well-paid jobs in media compression and academic research that feeds it will go because there is no longer a pressing demand for new compression technologies.”
If not AOM, then who should lead codec standardisation?
“Another standard setting organisation like the DVB,” says Ozer. “What’s remarkable about MPEG is the amount of work and rigor that goes into testing, defining, and documenting new standards, which is needed to incorporate disparate technologies from dozens of contributors. I’m not behind the scenes with AOM, but I’ve never seen that level of documentation or effort from them.”
Whichever body it is, he warns, “Avoiding HEVC-like licensing imbroglios is essential.”
Thierry Fautier points out that MPEG’s strength is based on its members. “I have not seen any company deciding to step out of the ‘new MPEG’, so I believe MPEG still has the same firepower.”
We could perhaps look to China, where the standards consortium behind the AVS2 codec is now releasing AVS3 for 8K.
“Between AOM, AVS, and the ‘new look’ MPEG, that is probably more than the market can chew,” Fautier says. “There will not be any unified standardisation of video codecs as AOM and AVS were created to work around the MPEG licensing issues.”
Media to simulate the senses
From MPEG-1, developed to compress SD signals, to VVC, which is designed for 8K, MPEG has only “scratched the surface” of the possible.
“We have handled 2D rectangular video and we have handled multi-channel audio (MP3) but we have only just started on video-based point cloud compression, which is at a comparable stage to where MPEG-1 was thirty years ago,” Chiariglione says.
The ultimate goal of codec development, he suggests, is to create media experiences which simulate the human senses. MPEG initiated a project three years ago on Immersive Media to enable VR 3DoF, 3DoF+ and, in future, 6DoF.
“2D video is a long, long way from being a replacement of the visual experience of human eyes,” he says. “There is so much potential in immersive 3D video and so much work to do that there is plenty of evidence to demonstrate a need for an organisation such as MPEG.”
Yuriy Reznik believes the industry has more codecs than it needs or can deploy in the next 5-10 years. “There is really no need for anyone to ‘step in’ and accelerate work on new ones,” he says. “If anything, it would be good if MPEG, AOM and other codec-producing entities would just take a break and do some longer-term research. It is better to bring fewer codecs and with significant gains as opposed to many with marginal ones.”
Fundamentally, he says, the need for codecs to be ‘standard’ may eventually diminish – at least for most web/streaming applications.  “Therefore, it is entirely possible that some of the next-generation codecs will come from individual companies and not from organizations like MPEG or AOM.”
Chiariglione himself has wasted no time since resigning. Look out for news of his new venture coming soon which is “more or less” in the codec development space.
As his MPEG homepage cryptically states: ‘Remember the phoenix’.

Tuesday, 14 July 2020

3 Predictions for the Media and Entertainment Industry Emerging from COVID-19

copywritten for Avid 
As the media and entertainment industry gradually unlocks from months of quarantine, it’s preparing to reenter a world that’s been permanently changed. Above and beyond social distancing and sanitary measures, the film and TV business faces an uncertain future stretching from the foundations of production to the economics of consumption.
However, there are some navigational markers CEOs and CFOs can use to guide their strategic decisions. In the aftermath of the pandemic, Avid predicts that the future of media and entertainment is poised to see large changes:
  • An acceleration of existing trends toward subscription models for production technology
  • A changing mindset toward distributed workflows and cloud applications
  • A growth in niche content from OTT providers
While challenging, the prognosis isn’t gloomy. In fact, the media and entertainment community has an opportunity to reimagine how it creates, produces, and delivers great content.

Growth in Subscription Models for Production Technology

Already well underway prior to COVID-19, the transition across the media and entertainment industry to cloud-based subscription services has accelerated in response to the crisis.
Data from the IABM’s Coronavirus Impact Tracker found evidence that demand for legacy forms of software licensing, such as perpetual licenses, could decline significantly. In fact, this trend may accelerate suppliers’ transition to as-a-service offerings.
This holds true for even the biggest media organizations, which have begun migrating to enterprise subscription commercial models such as enterprise license agreements, Avid’s Chief Revenue Officer Tom Cordiner said in a video interview.
“There are lots of benefits for them in doing that,” Cordiner explains. “They’ve got a lot more flexibility in how they deploy their technology, so they can maximize use of their licenses for both their colleagues in-station or in-facility but also those working increasingly remotely.”
He adds, “Subscriptions also give them faster access to new releases and it’s a much easier, more streamlined license activation or ordering process. I certainly see that as a trend that will continue. The economics are just so much more compelling for our customers, and it’s happening right now.”
The desire for subscription models could mirror the transition to business models based on operating expenditure, according to Josh Stinehour, principal analyst at research consultancy Devoncroft Partners. “Part of the tail wind for operating expenditure is the inescapable impact of the current circumstances on capital expenditures for the next 18–24 months,” Stinehour says. “It is simply going to be a new technology budget reality.”

Rise in Remote Distributed Workflows

Arguably the biggest change wrought by the industry’s collective experience is an enlightened attitude toward the benefits and practicalities of remote workflows. Where media technology buyers were inclined to take a wait-and-see approach when investing in new cloud and remote production, those risk-averse preferences are shifting radically as a result of the external shock.
The IABM Tracker certainly underlines this, highlighting increased investment in virtualization and remote production. Most buyers suggest that, after the pandemic, they can’t imagine things going back to the way they were before.
“Our customers are racing to adopt many new modes of working,” confirms Cordiner. These ways range from streamlined business continuity and disaster recovery workflows to remote distributed editing, finishing, review, and approvals, to Edit on Demand, using the collaborative power of production asset management for editing and content repurposing in the cloud.
“I think [technology vendors] can play a key role in helping customers adapt and thrive in this new world,” Cordiner says. “Some of our customers are taking very bold steps and going right in right now. Some are a little more cautious and testing workflows, starting small but then rapidly scaling.”
Devoncroft has tracked cloud adoption within the media sector for the past decade and now suggests that the circumstances of the first half of 2020 will represent yet another milestone for cloud. “The most difficult aspect of changing workflows is convincing users to alter behavior,” says Stinehour. “[COVID-19] has changed behavior to a degree no amount of [argument] could have ever accomplished.”

Increasingly Niche OTT Options

The pandemic and subsequent lockdown led to an increase in content consumption and new subscriptions. This surge benefited subscription video-on-demand (SVOD), catch-up services, and even linear TV viewing figures.
As lockdowns lift across the world, analysts expect some of these gains to reverse. However, the acceleration of consumer behavioral change looks set to benefit streamers. An Ampere Analysis report featured on Rapid TV News forecasts that streaming may gain an additional 12% of revenue growth by 2025.
“The drive to reach consumers directly . . . is going to continue to be front of mind as the dominant strategy for all media companies,” Cordiner says.
They will do this, he suggests, either by sheer scale of content (the model pursued by Comcast, Disney, Netflix, Amazon, and Apple) or by catering to people’s passions with more niche OTT options.
Examples include BritBox, an OTT platform for British TV shows from UK broadcasters ITV and BBC; the June launch of Barça TV+, a new SVOD from Spanish La Liga champions FC Barcelona; and the launch of Nowave—a French-based SVOD specializing in rarely seen art house movies.
Netflix and Amazon Prime Video are also ramping up local content in international markets. The US, India, and China have the highest proportion of audiences regularly watching local content, according to an Ampere Analysis report featured on Digital TV Europe. In markets like the UK, popular local content can outperform even some blockbuster international titles, the analyst found.
Direct-to-consumer strategies are geared toward the long-term future of media and entertainment alongside a mix of revenue models. Together with subscription and advertising-led models, increased transactional VOD is likely to follow some studios’ recent experimentation with collapsing traditional release windows. There will, of course, be tiered or hybrid offerings of these monetization strategies to give consumers even more choices.
Whatever the strategy, audiences need more content—and quickly—according to Cordiner. He advises that every platform will need to keep its value high with fresh programming to satisfy the demands of increased viewing and prevent content-hungry viewers from churning away to something else.

The Old Guard: Interview with DP Tami Reiker

RedShark News
Following the trail blazed by Patty Jenkins’ Wonder Woman and Anna Boden and Ryan Fleck’s Captain Marvel, director Gina Prince-Bythewood and star Charlize Theron are ready to kick ass in the female-powered superhero movie The Old Guard. Unlike the DC and Marvel blockbusters of the past decade, however, the made-for-Netflix feature eschews fantasy world-building and wall-to-wall VFX for a realism that grounds its characters in the present day.
“Pretty gritty,” is how Tami Reiker, ASC, describes the story’s visual tone. “This is a big beautiful action drama with two very powerful female characters who we definitely wanted to look and feel natural.”
The key reference for Reiker and Prince-Bythewood was Zero Dark Thirty, Kathryn Bigelow’s take on the Navy SEAL raid on Osama bin Laden, shot largely handheld by Greig Fraser ASC.
“The heroes of our film can’t crush buildings or fly,” Reiker says. “Their superpower is the ability to regenerate until they are thousands of years old, but their near immortality doesn’t mean they don’t experience great pain or can’t be hurt. That is what we want to evoke.”
Such a down-to-earth treatment reverses the colour-saturated, cartoon-like gloss of the genre and is becoming a trend in itself. Low-budget sci-fi Code 8 (2019) placed its X-Men-style mutants in a recognisably bleak lo-fi future. Earlier this year, Sony Pictures’ Bloodshot, starring Vin Diesel as a superhuman marine, was filmed by DP Jacques Jouffret like a live-action documentary. Marvel’s The Eternals, due next year, is being directed using improvisational techniques by Chloé Zhao for a cast headed by Angelina Jolie.
“I never wanted any moment to take an audience out of the fact that these could be real people,” Prince-Bythewood told Vanity Fair about the heroes in Greg Rucka’s graphic novel adaptation. “This is somebody that can get stabbed and walk away, but it’s going to hurt.… This is a woman alive 6,000 years, and we come to her at the point where she wants it over.”

Crossing convention

It is still worthy of note that The Old Guard also cuts across convention by having key creative roles filled by women. The picture is edited by Prince-Bythewood’s regular collaborator Terilyn Shropshire, with VFX supervision by Sara Bennett, the first and only female VFX supervisor to win an Oscar (for Ex Machina).
This is Reiker’s sixth collaboration with the director after they met in New York and worked on projects including Disappearing Acts (2000), Beyond the Lights (2014) and TV show Cloak & Dagger (2018).
“What drew Gina and I to this story was the opportunity to tell an action story driven by powerful women,” Reiker says. “They are not over-sexualised. They are not in crazy seductive outfits. They are badass fighting women.”
Theron, who is also a producer, stars alongside If Beale Street Could Talk’s KiKi Layne and trained for months so that much of the fight scenes could be captured in camera. One such scene takes place on a military plane set, rigged 15 metres in the air on a gimbal rocking back and forth to simulate motion, making handheld photography especially challenging.
“There was a lot of discussion about whether we really did want to shoot this handheld even as the plane takes these crazy dives, but our mindset was very much that we needed the audience to feel part of the action,” Reiker explains. “We could have removed a wall or shot from outside but we needed the space to feel real, not imaginary. For the same reason we used minimal green and blue screen.”

Using the ALEXA 65 handheld

Camera operators Simon Tindall and Tom Wilkinson were strapped into the plane set to maintain the illusion for the audience of being onboard. That’s no mean feat given that the cameras are hefty ARRI ALEXA 65s, which weigh over 10kg without lenses [Prime DNAs] and accessories.
“This is the first time Gina and I have done a multi-million-dollar budget movie so we had a much bigger sandbox to play in, but we felt that the ALEXA 65 delivers such incredible image quality that this was our first choice,” Reiker says. “The one hurdle was whether we could shoot an entire film over 16 weeks handheld [which, excepting a few crane, Steadicam and aerial shots, The Old Guard is].
“We made a lot of tests using various rigs but the clincher was finding Simon [who coincidentally operated on Zero Dark Thirty] and Tom who were confident they could operate over the shoulder. That meant we could film the story using the latitude of natural movement.”
Another scene, set in a bunker, features Theron and team ambushed and riddled with bullets. Filmed with three ALEXA 65s, all handheld, the scene is illuminated only sparsely by a pair of ceiling lights.
“We’re wanting to stay true to the confines of the space we created. It’s the first time we see our heroes in action and the first time we see them regenerate, so it’s important to understand how painful their death can be – even if they aren’t quite dead yet.”

Varicon LED

The other principal craft decision was to use an ARRI Varicon LED, a filter device that allowed Reiker to selectively reduce the overall image contrast range of individual scenes by adding light to shadows and darker areas of the shot.
The Varicon is an illuminated filter that slides into the matte box, and can be viewed directly through the viewfinder for manual dialling of the flash level. She also used it to add a colour cast to shadows without affecting highlights.
“I’d used the Varicon before on High Art (1998) and episodic drama Carnivàle (2008) to flash the film and bring out detail in darker areas. For The Old Guard, I worked with ARRI Rental to custom-make a 4x5 filter just for our film so I could not only vary the contrast but adjust the hue.”
Reiker made extensive tests to calibrate the precise degree of illumination required for each scene.
“The effect lends the picture a more filmic quality,” she explains. “You are opening up the shadows, particularly in the faces and contours and beauty of the skin. We can put warmth into the shadows or add a blue to scenes set in Paris at night. It’s a really incredible tool.”
The studio shoot took place at Pinewood Shepperton with trips to Sandwich, in south-east England, dressed as a French town. Desert scenes were filmed on location in Morocco.
After a year on the film, Reiker had to leave due to a prior commitment. “Barry was the perfect person to pick up when, unfortunately, I had to leave.”
Working to Reiker’s template, Barry Ackroyd BSC, who filmed handheld for Bigelow’s The Hurt Locker and Paul Greengrass’ Captain Phillips as well as Theron’s previous action pic Bombshell, guided the film through the end of principal photography. He also remotely supervised the colour grade while the production, like the rest of the industry, was in lockdown.

Battle Stations: Shooting large format on Greyhound

Panavision 

Shelly Johnson, ASC, shoots large-format 8K with the DXL and Sphero 65 lenses to capture the close-quarters naval drama "Greyhound"

The World War II action drama Greyhound puts the audience on board a U.S. destroyer with its heroic but inexperienced captain and crew as they battle to protect a convoy of merchant vessels from wolf packs of U-Boats in the North Atlantic. It’s a fictional account for which authenticity was the key. Established in the screenplay written by Tom Hanks — who also stars as Navy Commander Ernest Krause — that authenticity was reinforced throughout the production by director Aaron Schneider and cinematographer Shelly Johnson, ASC.
“Our movie is very procedural,” Johnson explains. “The dialogue is freighted with technical jargon about rudder commands and nautical vectors, but the real story lies between the lines and is about the impact that these actions have on the crew. So many of the captain’s commands are gut calls made in the moment, with life and death consequences. In order to convey the heroism and vulnerability of the characters, we had to give the audience situational awareness and context. We had to teach them this language.”
Schneider asked Johnson to research everything he could about the mechanics of naval operation — how to plot an engagement, when threat levels are raised, even how sonar navigation works. “In the heat of shooting the movie, Aaron wanted me to tell him if anything needed clarification,” Johnson says. “For example, if Tom’s script called for the vessel to maneuver 30 degrees starboard, then it was my job to ensure the procedures carried out on the bridge were represented visually. If I could find a place for the audience to stand within the pilot house [the destroyer’s command bridge], I was doing my job.”
A replica pilot house was built onstage in Louisiana. The close quarters of the 10’x18’ interior set — filled with charts, wheel, compass and other instruments, not to mention a camera operator, focus puller, and cast dressed in period- and climate-appropriate jackets — naturally informed the filmmakers’ decisions regarding the movie’s visuals. To be able to get close to the characters, and particularly to Krause, from whose perspective the story is told, Schneider was keen to shoot large format. 
“The larger the format, the longer the focal length that can be used to get the shot you need,” explains Schneider, who was a cinematographer before becoming a director and who remains a member of the ASC. By using a longer focal length closer to the actor, he says, “you can actually make a more intimate film on a human level. That was particularly important for our story, which is set in the confines of a World War II pilot house.
“If we were in a corner on Greyhound,” he continues, “instead of a 24mm, we could use a 35 or 40mm and get much the same shot but with a more natural focal length that didn’t feel like we were looking through a peephole on a door. It meant we could get up next to Tom and explore the reactions on his face, which showed the weight of the decisions [Krause] had to make. Greyhound is a character study in that regard, so I wanted to get the audience up close to the actors. Large-format photography translates that proximity into intimacy.”
“We were less interested in showing the detail associated with large-format shooting than we were in how the optics would perform,” Johnson says. “An immediate advantage of having the camera in that 2- to 3-foot range [from the actor] is that, with the longer focal length, it adds a textural depth to the set. It’s something the audience will respond to almost subliminally.”
Knowing that the combination of set and story also called for a handheld approach, “maneuverability certainly played into our camera selection,” Johnson reflects. “I knew I wanted the VistaVision-size sensor and ideally to shoot 8K. That led me straight to the [Panavision Millennium] DXL. It’s not just a computer with a lens mounted onto it; it’s set up like a true production camera. You can take it into the heart of a production with all the modules and inputs that make shooting so much easier. Plus, the full Panavision service and support is always reassuring on any project.”
For his lens package, Johnson turned to the Sphero 65 range. “I wanted them because they did everything that an old lens does exceptionally well,” he shares. “They had those unique ‘imperfections,’ distortions and aberrations instead of a sterile separation of sharp colors. When you get to the edge of the face, there’s not that cut-out feel. Everything just kind of melds. It feels very analog. We couldn’t get them until the last minute because they are in such demand. [Panavision’s] David Dodson was digging through every corner of the Earth to pull together a set.”
Johnson shot mostly with the 35mm, 40mm and 50mm focal lengths. “There were a few things we shot from afar, as though from a chase boat, where we’d put on a longer focal length,” he recalls. “But most of the time we were right there next to the characters, specifically Krause and the faces of the young crew, so it made total sense to have a short depth of field, reasonably close focus, and expressive optics.”
Tungsten practicals provided illumination through doorways and portholes of the set, augmented by some warmer interior sources — all of which contrasted with the cool, stormy cyan and gray exteriors. “Since we were going to constantly mix light as the actors moved from inside to outside and as the time of day changed, I expected to use filtration, but it just wasn’t necessary,” Johnson notes. “I found the DXL color science separated the tones extremely well while the Spheros blended them together in a very unelectronic and beautiful way. The combination of lens and camera really is the look of the movie.”
Working close to the actor in a confined space with large-format lenses presented an advanced task for operators Don Devine and George Billinger as well as 1st ACs Michael Charbonnet and Ryosuke Kawanaka. “The job they did was incredible,” Johnson says. “In my opinion the 65 format is the most difficult format to pull focus for. Even though the focal length is fairly similar to anamorphic — a 40mm in anamorphic is similar to a 40mm in 65 spherical — the 65 is a lot less forgiving. That’s part of using this specialized gear. You rely on your crew so that the creative process on the set remains pure.” 
Both operating and focus pulling were complicated by the set’s placement atop an elevated gimbal that continually rocked to simulate the ship’s motion. “Even in a ‘static’ frame when the actor was 3 feet away and standing still, the focus would have to go from 2 feet to 4 feet,” Johnson says. “This was all done largely by feel. The gimbal would be moving one way, the actors another, and the operators were also moving. The assistants were able to watch on a monitor, but they and the monitor were also rocking! 
“So much of what we were doing was about freeing the camera so the actors could run from outside, through the pilot house, and out the next door to the deck on the other side,” the cinematographer continues. “Being able to pan, turn and follow the actors helps the audience understand the scale of the space the crew are operating in — that running from one side of the ship to the other is just a few steps.”
The delivery for Apple TV+ was 4K, but Johnson captured at 8K to take full advantage of the large-format sensor and optics. “It’s the edges where all the goodies are,” he says. “We didn’t want to lose the vignetting or halations in the highlights, which are most active at the edges.”
The film takes place over three days and two nights, a period that Hanks’ script structures into a sequence of four-hour “watches” signifying the rotation of crew on the bridge. Johnson used these precise times to build what he calls his “manifesto,” which provided a blueprint for programming the lighting on set and critical guidance for visual effects. “Aaron wanted most of the film to take place against stormy atmospheric seas and skies, but we wanted to differentiate between the watches,” Johnson explains. “Otherwise it would all look the same.”
Johnson’s manifesto detailed how the light would be impacted by changing conditions such as sky, spray, rain, and the state of the ocean. “As a starting point we took the latitude of the vessel, which was sailing in November, so we knew the sun would rise late and set early,” he says. “I asked myself, ‘What is the color of dawn, and does that change day to day? How intense is the color of the horizon? What is the contrast ratio between interior and exterior, and how does the sunset transitioning into darkness alter the interior lighting?’”
With this manifesto in hand, the filmmakers were then able to program an LED lighting array that enveloped the set and afforded actors and camera complete freedom of movement. “Each watch could have its own look or looks, so there were maybe 20 looks with which we could program the entire movie,” Johnson notes.
“I wanted to avoid the contamination of green light reflected from greenscreen, so we went with white screen [surrounding the set],” he continues. “It meant that all the light reflections inside the set on the metal details are real. It provided a solid base for the lighting scheme, and the VFX team were able to pull keys very effectively. It really freed us up and let me get light in from any angle I needed, and the skilled electric crew and I could adjust and mix color intensity pretty quickly according to my manifesto.” 
To further enhance the movie’s authenticity, Johnson kept the set’s low ceiling in shot and was able to hide small lights among its conduits, ducts and gauges. “There was something about the way the Sphero lenses paired with the DXL camera and with the style of lighting we were creating that defined the world of Greyhound,” he says.
Aside from five days aboard the USS Kidd, a museum ship moored in Baton Rouge, the film was entirely stage-bound — not that you’d know it from the movie’s absolute attention to detail. “Every single day on the film, when I was walking on the stage and up the stairs to the gimbal, I would say to myself, ‘There’s probably 150 DPs who would love to have this job, and I’m the one who’s here,’” Johnson reflects. “I’m very thankful to have had this opportunity.”

Homecoming: Playing tricks with the mind

Panavision 

Cinematographer Jas Shelton achieves continuity with a twist using DXL2 for the return of Homecoming

The psychological thriller Homecoming returns for its sophomore season with fresh twists, nail-biting cliff-hangers, and more of the unnerving cinematography that characterized the critically acclaimed first run.
Cinematographer Jas Shelton and director Kyle Patrick Alvarez took the reins for all of Season 2, following the work of DP Tod Campbell and director Sam Esmail on Season 1. For Shelton and Alvarez, who had previously collaborated on the features C.O.G. and The Stanford Prison Experiment, their challenge with Homecoming lay in maintaining continuity with the storytelling that made the first season of the Amazon Prime original so distinctive while finding a new visual language for a timeline that overlaps and extends the drama.
“Our intent was to deliver continuity with Season 1, for example in color palette, the use of long takes, and a subjective point of view, but to find a new camera language to tell some new ideas,” Shelton explains.
The setup for Season 2 finds lead character Jackie (Janelle Monae) waking up in a row boat adrift in a lake, unable to recall her name, birthday or address. Her ensuing search for identity will lead her into the heart of the Geist Group, the unconventional wellness company for war veterans behind the Homecoming Initiative.
“Our camera package was guided by Season 1, but that was not the only reason for continuity with the Millennium DXL2,” Shelton explains. “We wanted to shoot with a large sensor and shallow depth of field. The Light Iron color science in the DXL2 maximizes these attributes to create beautiful images.”
He continues, “I’ve worked with the DXL many times before [including on the feature Like a Boss and the series Soundtrack] and I keep coming back to it. Each digital cine camera can be treated like film stock in that you select the sensor that will give you the right look for the particular project at hand, but I really like the color science of the DXL. It’s not just what you can capture off the sensor, it’s what the Light Iron color science is able to do with the image that is impressive.”
As with Season 1, Shelton paired the camera with G Series anamorphic lenses that were modified for the show. Panavision’s Guy McVicker explains, “Through the use of some new proprietary processes and unique adjustments, we were able to give Jas the opportunity to really embrace the aesthetic that anamorphic has to offer. The larger imager of the DXL also gave us more negative to exploit and stretch.” 
Shelton adds, “The uniqueness of the anamorphic aesthetic was heavily embellished, yielding a much stronger curvature of the field and additional obliqueness in the bokeh. Also, the contrast was lowered, the sharpness reduced, and the gradation of focus from sharp to soft enhanced.
“We recorded full-resolution 8K in compressed 7:1, sometimes dropping to 3:1 for scenes with heavy VFX,” he continues. “It was important to obtain the maximum information from the sensor and to cover the anamorphic glass, which lends a 3D quality to the visuals.” 
While the first series used changing aspect ratios to relay different story timelines, Season 2's filmmakers opted to lean into a cleverly calibrated color design. Shelton explains, “We decided to keep anything that takes place in Geist’s headquarters as sun-kissed and warm, and we made a different LUT for scenes with Jackie. This is apparent from the moment she wakes up in the boat and goes on her journey of self-discovery. It emphasizes dark blues, cool greens, and a gothic palette — especially in the first part of the show — with lots of fluorescent lighting built into the sets.” 
Because of the complex nature of the show, Shelton used meticulous storyboards, a reflection of his close collaboration with production designer Nora Takacs Ekberg. The wide range of locations meant, for instance, shooting exteriors at the Big Sky Movie Ranch in Simi Valley, Calif., and then traveling south to Torrance for Geist HQ interiors, which posed their own complications because there were only eight hours of daylight and the facility had a giant sunroof. “Trying to create a daylight look in there that we could use for 12 hours rather than eight was challenging,” the cinematographer says. “Then we went up to Lake Arrowhead for a week, shooting out on the water quite a bit for the opening sequence.”
Touch points for Season 2's tone included Alan J. Pakula’s 1970s conspiracy films like The Parallax View and Todd Hido’s dark and moody landscape photography. 
“Kyle wanted shots of Jackie to have this subjective point of view so that the audience is always following with her and experiencing her discovery,” Shelton notes. “The idea was to never let the camera settle until she comes face to face with Audrey Temple [played by Hong Chau] at the end of the second episode.
“To underscore her state of mind," he adds, "we borrowed ideas from the movies of Pakula and Brian De Palma where the camera has a very probing quality to it.”
Shelton mounted the DXL2 on an Oculus gimbal head and moved it on a dolly to convey a floating camera movement. “The design of the camera makes it easy for me to devise shots,” he reports. “We play with POV by using wider, more deliberate angles and longer takes in the early episodes, and then, as we unravel the mystery of what happened to Jackie, we employ tighter, more frenetic handheld shots in the later episodes. 
“DXL2 just suits the way I want to work,” he adds. “Its look is very filmic and soft, especially when using depth of field. The colors are muted, not overly punchy, whereas with other cameras the results are more saturated and I have to rein them in.”
The final shot of Episode 1 features one of the series’ elaborate long takes. The setup starts with a close-up of Chris Cooper (portraying Leonard Geist) in a farm field. As Cooper walks into his farm house, the camera, on a Technocrane, pulls back with him. The move then continues through the house and out of the front door, landing in a high, wide establishing shot of the farm.
“The exposure range between the day exterior of the fields and the dark interior of the farm house was fairly extreme,” Shelton explains. “It was important for us to maintain a shallow depth of field with Chris, so that the fields in the background were somewhat soft and hidden, preserving the reveal of the farm for the wide viewpoint at the end of the shot.”
To navigate that exposure range, Shelton employed the Panavision LCND filter system, which can be dynamically adjusted across a wide range of neutral density, from ND0.3 (one stop) to ND1.8 (six stops). “The LCND system proved to be the perfect solution,” the cinematographer shares. “Rather than having to do an iris pull and starting with a deep stop that lets the audience see too much of the background, we were able to start at a T4 and maintain that stop throughout the entire shot, despite the extreme exposure shift. The exposure pull was seamless with the LCND — very smooth and graduated as the camera transitioned from the exterior to the interior and back again. It’s a truly remarkable tool that has become a permanent fixture in my toolkit.”
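A quick aside on the ND notation in that quote: density is the base-10 logarithm of the attenuation, so every 0.3 of density costs one stop of light. A minimal sketch of that conversion (the function names here are purely illustrative, not from any Panavision tool):

```python
def nd_to_stops(density: float) -> float:
    """Stops of light loss for a given ND density (each 0.3 of density = 1 stop)."""
    return density / 0.3

def nd_transmission(density: float) -> float:
    """Fraction of light transmitted (density is log10 of the attenuation)."""
    return 10 ** -density

# ND0.3 passes roughly half the light (1 stop); ND1.8 cuts 6 stops.
print(f"ND0.3 = {nd_to_stops(0.3):.0f} stop, ND1.8 = {nd_to_stops(1.8):.0f} stops")
print(f"ND1.8 transmits {nd_transmission(1.8):.1%} of the light")
```

So holding a constant T4 while ramping from ND0.3 to ND1.8, as described above, compensates for roughly a five-stop swing in scene brightness without touching the iris.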

Thursday, 9 July 2020

F1 promotes remote production to front of the grid

IBC
F1 has returned to action with big changes to convince governments of its ability to wrap the sport’s normal circus in a Covid-19 biosphere. 
Aside from barring all spectators, F1 has dramatically reduced the number of officials at eight announced events, slashing its usual cast of 3,000–5,000 staff to 1,200 and cutting on-site broadcast personnel from over 250 to just 60. 
That prompted Formula One Management (FOM), which controls host broadcasts, to bring forward a remote production model it had been planning to introduce from 2022. 
Instead of producing and distributing the world feed on site as normal, camera feeds are being sent from track directly to FOM’s headquarters at Biggin Hill airfield in Kent from where the race is directed, packaged and distributed to rights holders. 
One of those, Sky Sports, is also managing presentation remotely via Sky Studios in west London, which is the base for former drivers-turned-pundits Anthony Davidson and Karun Chandhok. Sky still have an on-site presentation team, led by Simon Lazenby and including Martin Brundle, but Brundle’s famous grid walks are absent for the time being. 
Channel 4’s 2020 coverage produced by Whisper includes highlights of every qualifying session and race, but Covid conditions mean the entire team, including presenter Steve Jones and former F1 drivers David Coulthard and Mark Webber, along with race commentator Ben Edwards, are now based in the UK.  
“Using a studio would have been the straightforward option, but we wanted an environment with more energy,” explains John Curtis, Whisper’s F1 editor, of the decision to locate presentation at The Silverstone Experience museum for the opening few race weekends. 
Regular on-the-road partner Gearhouse built the set and manages cameras, comms feeds and relevant track time pages as well as connectivity back to Timeline in Ealing which provides the presentation gallery and edit suites. 
“Our attitude was, ‘if we are going to be here let’s turn it into a positive’,” Curtis says. “Rather than replicate normal production, how do we create a programme we couldn’t create on site?” 
Accelerating production

Upping the production values includes use of a touchscreen for analysis, access to more UK-based talent, additional presentation positions, a camera jib and triple the number of cameras (from 2 to 6). 
“We’ve a championship-winning Mercedes car on set and a 70-inch monitor for viewing live action, neither of which we would have on the road,” explains Curtis. 
The world feed is sent from Biggin Hill to Whisper’s production gallery in Ealing and onward to Silverstone. The presentation feed is sent back to Ealing where it is pulled together with the live feeds and distributed.   
“We rely entirely on the world feed; we’ve no cameras on site,” Curtis says. “The F1 team has been helpful in providing us an extra camera on-site for interviews where slots are available.” 
The team aim to present from the Silverstone track when Channel 4 airs live full-race coverage of the British Grand Prix on 2 August. Whisper also hopes to gradually get presenters back on-site at Grands Prix as the season unfolds, but production will remain remote for the foreseeable future. 
“The track is still the best place to present,” Curtis says. “While we already had quite an established remote workflow in place where producer, director, vision mixer and assistant producer were all based in Ealing, F1 is unique in the sense that presenters have the opportunity to mix with teams and gather information from team managers and ex colleagues between races. That’s a softer, less visible but valuable benefit of having direct access. Being remote makes programme packages with interviews and reactions harder to build.” 
Whisper also use Blackbird to remotely edit interviews and behind-the-scenes long-form content. 
Shift to cloud distribution

FOM is still using satellite as the dominant transmission technology for world feeds produced out of Biggin Hill, but it has taken steps to move this into the cloud. 
Beginning with the Austrian Grand Prix at the Red Bull Ring it is working with London-based M2A Media to take supplementary feeds, such as onboards and pit lane, into the AWS cloud for global distribution. 
“It’s a way for F1 to maximise their reach and gives them more flexibility,” says Marcus Box, CTO, M2A. “The traditional model was restrictive in the number of feeds they could produce. Over time they are looking to expand and enrich content. A satellite may only carry a subset of onboard cameras in a particular region, so where there’s a driver from that country – say, Finland or Brazil – we’re able to give broadcasters access to that driver’s onboards.” 
The M2A Connect service is not a playout solution but it is used to overlay graphics and to change audio tracks. 
“It can also be used to acquire (contribute) links from the venue which F1 may look to do in future. We are also talking with F1 about further regionalising the feeds that broadcasters receive at ingest or egress location.” 
Operating as an orchestration layer over AWS Elemental MediaConnect, AWS Elemental MediaLive and AWS Elemental MediaPackage, M2A Connect offers Formula 1 a single web console for confidence monitoring and for setting up live sources, scheduling events, managing entitlements and delivering tailored live video. 
M2A’s founders, Box and Mariana Kalkanis, were previously at the BBC where Box helped get iPlayer running and was instrumental in bringing in AWS as a cloud vendor. He says M2A’s philosophy is “simplicity at scale.” DAZN is its key client. 
“We’ve taken some complex use cases and wrapped them in an orchestration layer which is aimed at media operators rather than broadcast engineers,” Box says. “Orchestration can be abstract for users so we’ve set this up for operators to clearly monitor confidence feeds with clear alerts for any issues with the source.” 
Refreshing the formula

Already planned for 2020 and introduced in Austria is a new set of racing statistics powered by Amazon Web Services (AWS), continuing the official cloud and machine-learning partnership F1 has had with AWS since 2018. 
Six real-time racing stats, including car performance scores, ultimate driver speed comparison, driver skills rating, high-speed/low-speed corner performance, car/team development and overall season performance as well as qualifying and race pace predictions will roll out as on-screen graphics from July through December of this season. 
The new gauges are aimed at helping fans to further understand the split-second decisions and race strategies made by drivers or team strategists that can affect a race outcome. 
To generate the graphics, live data is taken from 300 streaming sensors fitted to each F1 race car, generating more than 1.1 million data points per second that are transmitted from the car to the pit. This is then compared with 70 years of historical race data stored in Amazon S3, and the results are output in context on F1’s international race broadcast feed, including its digital platform F1 TV. 
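As a rough back-of-the-envelope illustration (our own arithmetic, assuming the quoted 1.1 million figure is per car), those numbers imply each sensor streams a few thousand samples per second:

```python
# Figures quoted above: 300 streaming sensors per car,
# more than 1.1 million data points per second (assumed to be per car).
sensors_per_car = 300
points_per_second = 1_100_000

per_sensor_rate = points_per_second / sensors_per_car
print(f"~{per_sensor_rate:,.0f} data points per second per sensor")
```

That works out to roughly 3,700 samples per second per sensor on average, although in practice sampling rates will vary widely between sensor types.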
“Our existing relationship with F1 has already produced statistics that have brought fans [virtually] into the race paddocks, and our study of race car aerodynamics is influencing vehicle designs for the 2022 season,” said Mike Clayville, VP of worldwide commercial sales at AWS in a press release. 
F1 is also considering using 5G mobile networks in future for a better experience, as the technology has reduced latency. 
From this season, subscribers to F1 TV Pro now get access to all onboard cameras and hear unedited team radios, as well as watch an upgraded multi-screen Pit Lane channel. This channel covers multiple battles at once, with additional graphics, additional angles and expert commentary. 
All F1 TV subscribers also have access to improved Live Timing features on the official F1 app, “with more data than what the F1 teams get on the pit lane with” according to the somewhat clumsy wording on the official website.  
Audience crashes as paywall goes up

“Formula 1 has been viewed or consumed very much in the same way for the last 30 to 40 years,” admitted Rob Smedley, chief engineer of F1. “It is images of cars going round and round a track and it’s not really telling much more of a story than that.” 
The increased use of car data is an attempt to better engage fans. 
In the UK, F1 was already facing an uphill struggle after beginning a new contract last year which made Sky Sports the exclusive live broadcaster of all but the British GP. 
According to data shared by The Independent, the move led to UK audiences tanking by 13.6% over the course of last season to a cumulative (Sky and C4) 54.8 million. Of that diminished figure, only a fraction watched online.  
Barb data shows that across Sky Sports and Channel 4, each race averaged just 70,680 online viewers. 
That’s not quite what F1 owner Liberty Media had in mind when it put online streaming at the heart of its plans for the sport when it got the keys to it three years ago in a £3.7bn takeover.  
There are two driving forces behind it, concluded The Independent. First, F1’s dramatic action is better suited to a bigger screen and, second, its fans tend to be older than those of other sports so they aren’t as used to watching TV online. 
Backing up that idea is Barb data showing that the smallest screens attract the fewest viewers. Some 45.6% of viewers across both Sky and Channel 4 streams last year watched on a PC or laptop, followed by tablets at 32.3% and smartphones at 22.1%. 
F1 TV’s debut on the Roku platform addresses this, marking the first time F1 TV is available on a large-screen format, though it is currently available only in North America. 
Green is go

Environmental concerns are heating up, with F1 conscious of reputational damage related to its perceived fossil-fuel emissions. Of the 256,000 tonnes of CO2 equivalent emitted during the 2019 race calendar, just 7.3% was attributable to the operation of events, including broadcasting. The international travel and logistics required to put on each race, however, accounted for 72.7% of F1’s total carbon output. F1 plans to reach net zero within a decade, though going green arguably demands more urgency. 
“Environmental impact is part of every decision we make, from what water bottle we use on site to how we travel,” says Whisper’s Curtis. “We will work with F1 and any other group we can to try and speed up the pace of change, and we look at ourselves and what we can do better. Sometimes the greenest option is not the cheapest one, but it’s about making the right choice.”