Tuesday, 2 March 2021

Behind the Look: Mank

Written for RED

Erik Messerschmidt ASC channels cinematographic legend Gregg Toland to recreate classic-era Hollywood for David Fincher 

https://www.red.com/mank

Shot in black and white and often in deep focus, David Fincher’s Mank evokes 1930s classic cinema with rigorous attention to digital detail.  

Made for Netflix, biographical drama Mank stars Gary Oldman as Citizen Kane screenwriter Herman J. Mankiewicz as he races to finish the Kane screenplay for Orson Welles.  

Famously, Fincher was among the first A-list directors to embrace digital filmmaking. Since The Curious Case of Benjamin Button in 2008 he hasn’t deviated from using RED cameras. There was no question he was going to shoot anything other than digital for Mank and, according to cinematographer Erik Messerschmidt ASC, Fincher had always envisioned the picture in black and white.

“It would be a crime not to make this movie in black and white,” explains Messerschmidt, who earned an Emmy nomination for shooting Fincher’s Netflix series Mindhunter. “Digital was just right for this project for all manner of reasons.”  

He continues, “David and I need to be able to look at a monitor and get very specific about everything that exists in the frame – the set dressing, the composition, the lighting, the overall tone. Being able to see the image on the monitor and make those creative decisions analytically is crucial to David’s process. In contrast, there is an imprecise nature to composing a shot with a film camera. It just doesn’t provide the same level of control we need.” 

There were additional editorial reasons why digital was appropriate for Mank.  

“For this movie we wanted to shoot very deep focus photography for most of the film and then be very specific about where we used shallow focus,” Messerschmidt says. “Shooting on film would have significantly limited our creative choices, particularly with focus and depth of field.” 

Deep focus keeps everything in the frame in focus – foreground to background – and requires a small aperture and lots of light. It’s a technique pioneered by American landscape photographers like Ansel Adams in the 1920s, adopted for cinema in the 30s and popularised by Citizen Kane.
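For a sense of the arithmetic involved, here is a minimal sketch of the standard hyperfocal-distance approximation that governs deep focus. The focal length, T-stops and circle of confusion are assumed example values, not Mank’s actual settings:

```python
# Illustrative only: why deep focus demands a small aperture. Thin-lens
# hyperfocal approximation; the focal length, stops and circle of confusion
# below are assumed example values, not the production's.
def hyperfocal_m(focal_mm: float, f_number: float, coc_mm: float) -> float:
    """Focus at the hyperfocal distance and everything from half that
    distance to infinity is acceptably sharp."""
    return ((focal_mm ** 2) / (f_number * coc_mm) + focal_mm) / 1000.0

COC = 0.025  # circle of confusion in mm, a common Super 35 assumption
for stop in (2.8, 5.6, 11):
    h = hyperfocal_m(32, stop, COC)
    print(f"32mm at T{stop}: sharp from {h / 2:.1f} m to infinity")
```

Closing the iris from T2.8 to T11 pulls the near limit of acceptable sharpness from roughly 7 m to under 2 m, which is exactly why the technique needs so much light.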

In prep, the DP evaluated a number of RED cameras, including a prototype of Komodo, a RED DSMC2 Weapon housing a color Monstro sensor, and both a 6K Dragon and an 8K Helium monochrome sensor in a RED RANGER body.  

Recent arthouse hits like Roma were shot in color and converted to black and white, but this route would not offer the range of control the filmmakers required. 

“I was keen to learn if having color information in the digital negative would give us expanded freedom in the DI or tonal control we would otherwise lose if shooting just in black and white,” Messerschmidt says. “What we found was that the 8K Helium black and white camera was superior in tonal quality. It gave us a silvery platinum print quality and tonal depth that we weren’t seeing from the colour cameras. It was overwhelmingly clear to us when we sat in a theatre and screened the tests.” 

Monochrome sensors are capable of higher detail and sensitivity because there is no color filter array (no Bayer pattern) over the incoming light. A color sensor filters the light to match its red, green and blue pixels. RED’s monochrome sensor effectively records three times as much light from the scene, which in turn translates into a 1 to 1.5 stop improvement in light sensitivity.  
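As a back-of-envelope check on that claim (a sketch, not RED’s own math): photographic stops are log base 2 of the exposure ratio, so a threefold light gain is about 1.6 stops, broadly in line with the quoted figure.

```python
import math

# Sanity check: a 3x light advantage expressed in photographic stops.
# One stop = a doubling of light, so stops = log2(ratio).
light_ratio = 3.0
print(f"{light_ratio}x light = {math.log2(light_ratio):.2f} stops")  # ~1.58
```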

Having decided on the Helium Monochrome RANGER, Messerschmidt made further tests against charts and then in real-world settings with stand-in actors in wardrobe, comparing day and night interiors and exteriors, filtration, lens resolution, and depth of field. 

“The aim was to see how far we could push the camera to where the balance of speed and grain (digital noise) was where we wanted it,” he explains. “We wanted to shoot as deep an f-stop as we were comfortable with so speed was very important. We found the camera recorded a very clean image at 1600 ASA but we actually preferred the grain at 3200 ASA. That changed the contrast slightly. You got more noise in the shadow and a bit more highlight retention.” 

Working with colorist Eric Weidt, Messerschmidt took footage through a rough grade, examining it projected on a large screen in 4K and on a 4K HDR display in the DI suite. 

The greater light sensitivity of the monochrome sensor helped meet the intent to shoot large parts of the movie in deep focus, in keeping with the techniques Gregg Toland helped pioneer for Citizen Kane.

To emphasise depth of field, Messerschmidt used a Cmotion Cinefade, an accessory that allows a gradual transition between deep and shallow depth of field within a single shot at constant exposure. 
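The exposure bookkeeping behind that trick is simple to sketch (illustrative T-stops, not the production’s): as the iris closes, a variable ND in the same housing thins out by the same number of stops.

```python
import math

# Sketch of the compensation a Cinefade-style iris/ND pairing must provide.
# Exposure scales with the square of the T-stop, so the light lost going
# from t_wide to t_deep is log2((t_deep / t_wide) ** 2) stops.
def stops_between(t_wide: float, t_deep: float) -> float:
    return math.log2((t_deep / t_wide) ** 2)

delta = stops_between(2.0, 11.0)
print(f"T2 -> T11 loses {delta:.1f} stops")             # ~4.9 stops
print(f"ND to remove at T2: {0.3 * delta:.2f} density")  # ND quoted at 0.3 per stop
```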

Footage was acquired in 8K Redcode RAW at a 2:1 aspect ratio but framed by a 20% reduced center extraction in 2.2:1. The resulting capture has a resolution of around 6.5K. This gave enough latitude to help with some reframing and shot stabilisation before delivery to Netflix in 4K HDR at a 2.2:1 aspect ratio. 
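The numbers are easy to verify, taking Helium’s published 8K width of 8192 photosites as the starting point:

```python
# Back-of-envelope check on the capture-to-delivery pipeline described above.
sensor_width = 8192                         # Helium 8K width in photosites
extraction_width = sensor_width * 0.8       # 20% reduced centre extraction
extraction_height = extraction_width / 2.2  # framed at 2.2:1
print(f"extraction: {extraction_width:.0f} x {extraction_height:.0f}")  # ~6554 x 2979, i.e. ~6.5K
print(f"margin over 4K delivery: {extraction_width / 4096:.2f}x")       # room to reframe and stabilise
```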

With Panavision LA and Keslow Camera, Messerschmidt tested dozens of different lens options on exterior and interior sets before selecting the Leica Summilux-C, the same primes deployed on Mindhunter.

“They seemed to perform best in terms of resolution and also apparent depth of field. Other lenses technically held more resolution at T11 but their apparent depth of field was also less.” 

With Weidt working on FilmLight's Baselight, Messerschmidt developed a similar workflow to the one they devised for Mindhunter S2, notably monitoring on set exclusively in HDR, only this time in black and white.   

“We took LOG3G10 out of the camera and monitored on a Canon 24-inch professional display in Dolby PQ (Dolby Vision). In post, Eric applies the same LUT so it’s empirically extremely close to what we saw on set. The beautiful thing about this was being able to see images that were very close to the dynamic range of the sensor’s capabilities. When the sensor was clipped, the monitor and waveform were clipped, so we were in a good position to protect highlights while being confident the images we captured on set would be replicated in Eric’s DI suite.” 
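For readers curious what such a monitoring pipeline does under the hood, here is a minimal sketch: decode the camera’s Log3G10 signal to scene linear, map it to display luminance, and encode with the SMPTE ST 2084 (PQ) curve. The Log3G10 and PQ constants are published; the mid-grey-to-nits mapping is an assumption, and the actual show LUT would add tone mapping and creative adjustments.

```python
# A sketch of a log-to-PQ monitoring transform, not the production LUT.
# Log3G10 constants from RED's published curve; PQ from SMPTE ST 2084.

def log3g10_to_linear(y: float) -> float:
    """Decode RED Log3G10 to scene-linear (18% grey encodes to ~0.333)."""
    a, b, c, g = 0.224282, 155.975327, 0.01, 15.1927
    x = y / g if y < 0.0 else (10.0 ** (y / a) - 1.0) / b
    return x - c

def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: absolute luminance in cd/m^2 to PQ signal."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    y = (max(nits, 0.0) / 10000.0) ** m1
    return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2

MID_GREY_NITS = 15.0  # assumed display brightness for 18% scene grey
linear = log3g10_to_linear(0.333)  # camera log -> scene linear (~0.18)
print(f"PQ code for mid grey: {pq_encode(linear / 0.18 * MID_GREY_NITS):.3f}")
```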

In keeping with the classic Hollywood look, Messerschmidt’s camera is relatively static on a dolly or crane, with no handheld or Steadicam work.  

“We restricted our tools to what would have been available in the 1930s,” he says. “We tried very hard to stay true to long takes and holding on two shots.” 

Day for night challenge 

A singularly challenging scene where Hearst walks with Davies through the gardens and zoo of Hearst Castle was shot day for night outdoors.  

“Any day exterior is always challenging and day for night just makes it more complicated. It’s all about controlling contrast, often by adding a tremendous amount of light onto the actor without pointing lots of 18kW HMIs at their face and making it uncomfortable for them to perform.” 

The DP and colorist built a day-for-night LUT into the camera for the scene, which was shot at Huntington Botanical Gardens near Pasadena. Often the biggest ‘tell’ that a scene has been shot day for night is if the car headlamps or street lights are not bright enough: they’ve been noticeably underexposed in relation to the front light. To counter this, Messerschmidt deployed 400W bulbs in specially built practical fixtures.  
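Not the production’s LUT, but a toy illustration of the core moves a black-and-white day-for-night transform combines: pull exposure down a couple of stops and steepen the shadows so daylight reads as moonlight. It assumes a scene-linear monochrome frame in a NumPy array.

```python
import numpy as np

# Toy day-for-night transform (illustrative, not Mank's LUT):
# underexpose, then raise gamma to deepen shadows and add contrast.
def day_for_night(frame: np.ndarray, stops_under: float = 2.0,
                  gamma: float = 1.3) -> np.ndarray:
    dimmed = frame * (2.0 ** -stops_under)      # each stop halves the light
    return np.clip(dimmed, 0.0, 1.0) ** gamma   # gamma > 1 crushes shadows

frame = np.random.rand(4, 4)  # stand-in for a real frame
print(day_for_night(frame))
```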

Other night scenes in Mank were shot night for night. The decision to shoot day for night in this case was partly in homage to 1930s filmmaking, when common production techniques included hiding light bulbs behind a candle, painting shadows on a dark wall and, indeed, shooting day for night. 

“Mank and Marion enjoy a platonic romance and in this scene she opens up to him about her relationship with Hearst,” Messerschmidt explains. “Fincher wanted the scene to have a bit of a magical quality to it. This is enhanced because we’re showing them walking among Hearst’s zoo of elephants, giraffes and monkeys, all of which required some visual effects help.” 

To enhance the picture’s contrast and period look they added effects such as flare enhancement and distortion around highlights in post.  

“I really love the circular halation the lenses of the period produced around highlights,” Messerschmidt adds. “In the release prints of the period you got this very subtle bloom in the blacks – a kind of halation around the darker parts of the frame. That was something David really wanted to bring out. So, with Eric we worked out a way of keying the blacks to a certain level and adding a bit of blur. We’re kind of art directing each frame.” 

In another subtle nod to cinema’s photochemical heyday, Fincher simulates the ‘gate weave’ caused when 35mm release prints advanced through the sprockets of a projector. The artifact is most noticeable during title cards and dissolves, which is where, in Mank, a similar effect is applied digitally. 

“Colour cinematography relies heavily on focus and color separation to guide the audience’s eye but in black and white you are more reliant on light and texture to tell the story,” Messerschmidt says. “It’s been a tremendously rewarding experience.” 


Monday, 1 March 2021

Going viral with Venice in lockdown London

written for VMI

Enterprising young filmmakers Edoardo Forato and Fyras Slaiman hope their new short film Influenza will catch the attention of producers and directors.  

https://vmi.tv/blog/production-story/going-viral-with-venice-in-lockdown-london/

The period drama, set amid the 1920 Spanish flu outbreak, is vividly told with a combination of Sony VENICE camera and Leica Summicron lenses.  

That’s an uncommonly high-end package for a production by two recent film school graduates and not possible without the perseverance of cinematographer Slaiman.  

“We had a starting budget of £4,500 from London Film School as part of our MA which we pushed beyond £20k using Kickstarter and personal funding,” he explains. “I wanted the very best for our story and having used the Sony F65 before I wanted to try to see if we could get hold of a VENICE, in particular because of its wide dynamic range from ISO 500 up to 2500.  

“A friend introduced me to Jonathan at VMI who quickly understood what we needed. They really helped us out with support and pricing and, even better, the results were exactly as I imagined.”   

Originally titled The Last Dance, writer-director Forato’s story about a man trying to cope with the death of his wife was adapted to reflect the current pandemic.  

The filmmakers had to be patient as their initial shooting plans were repeatedly delayed as a result of lockdown. Finally, they got permission for the nine-day shoot in February in three London locations: an abandoned flat, a luxury Soho apartment and the Langdon Down Centre’s Normansfield Theatre, which was used for scenes of a cabaret.  

“We used a lot of our budget on buying Covid-19 tests for our actors and crew. We were all working under Covid-safe protocols including social distancing and wearing masks,” Slaiman says.  

He explains that the film’s story takes place in two worlds: that of a cabaret, in which everyone is dressed in colourful nineteen-twenties costumes, and the more personal world of the lead character, Coleman.  

“Expressing these distinct colour schemes was the reason I wanted to work with the VENICE,” he says. “The cabaret world is vivid and brightly coloured while Coleman’s world is darker, monochromatic and blue. The sensitivity of the camera and its incredible color space offers a good mix so I could push the saturation without affecting the skin tones. To keep the surrealistic mood of the film we shot the blue monochromatic scenes day for night, changing the colour temperature and underexposing while shooting RAW to be able to fully adjust the parameters in the grade.”  

Using Summicrons enhanced the overall mood. “The blur and depth of field from the Summicrons make it look like a painting,” he says. “I mainly used a 35mm for framing close to the actor so we can see some background without forcing the face too much.”  

For lighting, Slaiman chose to work with a mirror kit and Source Fours. “Because of the location we needed to play with tight spaces and still have directed light. It had to look like theatre lighting. It’s remarkable how a 3cm x 3cm diffused mirror can light a fill on a person from 10 metres away.”  

At London Film School, Forato and Slaiman were tutored by luminaries including editing master Walter Murch ACE and Tristan Oliver BSC who shot Wes Anderson’s Isle of Dogs.  

Slaiman’s hero, though, is Darius Khondji AFC, ASC, who has worked with David Fincher, PT Anderson, Jean-Pierre Jeunet, Wong Kar-wai, and the Safdie brothers.  

“I don’t believe a DOP should have a style. They should be the translator of the director’s vision,” Slaiman says. “When you look at someone’s showreel you shouldn’t be able to discern a singular style. Darius Khondji has worked with so many great directors you almost can’t believe it all comes from one person. I feel like he always creates the perfect mood for the movies he’s working on. There will be beautiful arresting images of course but I believe that what makes a good film are images that tell that story.”  

Influenza is in postproduction and will be entered into film festivals later this year.  

Slaiman adds, “It was a great experience with VMI and I’ll be taking my future projects, hopefully with bigger budgets, back to them too!”  

Sunday, 28 February 2021

Mars Rover landing: How NASA sent high quality video from another world

RedShark News

Major sports events like the Super Bowl, World Cup Final or the Olympics have long been the pinnacle of global televised viewing and of TV tech innovation. Over the next few years, that status could be dwarfed by a new genre of live broadcast events from space. 

https://www.redsharknews.com/mars-rover-landing-how-nasa-sent-high-quality-video-from-another-world

As nation states and commercial enterprises bid to control tickets to the moon and to colonise Mars, we’re entering a fevered new age of galactic travel and exploration. 

Most of it will be televised. That’s evident at NASA, which increases the number of cameras it carries on each successive mission. 

The recent live coverage of the Perseverance landing was part of NASA’s PR effort to justify the $2.4 billion it took to build and launch the thing. The estimate to land and operate the rover during its prime mission alone is $300 million. 

The cameras and onboard microphone “can be considered a ‘public engagement payload,’” say NASA. “They are likely to give us a good and dramatic sense of the ride down to the surface!” 

As if that wasn’t enough to grab people’s attention, the space agency dubbed the rover’s entry and descent to the crater ‘Seven Minutes of Terror’. One wonders how it will up the ante when it comes to billing the human landing, which it is under presidential orders to deliver by 2033. 

Part of this sensationalism is down to the lag in transmitting data from the craft to Earth.  

The delay, inconsequential between Earth and the Moon but anywhere from four to 24 minutes over the 225 million km (140 million miles) to Mars, means a gap in the ‘live’ broadcast during which apparently not even NASA’s ground control is sure of the outcome.  

Another reason for not broadcasting the stream in anything like realtime is that its quality would not be deemed suitable for telly. 

“We probably could do it today, but definitely not in HD,” Stephen Townes, Chief Technologist for the Interplanetary Network at NASA’s Jet Propulsion Lab told Forbes.

Perseverance is able to send data directly to NASA’s Deep Space Network (DSN) antennas on Earth. However, at between 800 bits-per-second and 15,625 bps to a 70m DSN antenna, that’s not going to cut it for HDTV, which requires at least 8 Mb/s (UHD requires over 57 Mb/s). 

That’s not the primary way of getting data back to Earth, though. Perseverance is able to send data from the surface to a Mars orbiter at a maximum of 2 Mb/s, for onward relay to Earth. There’s still a delay, but 2 Mb/s is about the same bandwidth required for a stable stream from Netflix. 

“Once it is on the Mars Reconnaissance Orbiter [the signals] can be sent to Earth at 500 kilobits-per-second (kb/s) up to 3 to 4 Mb/s depending on the distance between Mars and Earth,” said Townes.   
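Some rough arithmetic makes clear why nobody streams HD straight from the surface. Using the article’s figures (the one-minute clip length is an assumed example):

```python
# How long would one minute of HD video (8 Mb/s) take over each link?
clip_bits = 60 * 8e6  # 60 seconds at 8 Mb/s

links = {
    "direct to 70m DSN dish (15,625 b/s)": 15_625,
    "orbiter relay, low end (500 kb/s)":   500e3,
    "orbiter relay, best case (4 Mb/s)":   4e6,
}
for name, rate in links.items():
    s = clip_bits / rate
    print(f"{name}: {s / 3600:.1f} h" if s > 3600 else f"{name}: {s / 60:.1f} min")
# direct: ~8.5 hours; via the orbiter: between roughly 2 and 16 minutes
```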

Ye cannae change the laws of physics, but you can bend them a bit.  

NASA is currently upgrading from radio to optical communications using a system it calls Deep Space Optical Communications (DSOC). Working with data encoded in photons and beamed over laser light is set to vastly increase the data rate.  

“It’s a very significant step in demonstrating the viability of optical communication at Mars distances,” Townes explains. NASA has demonstrated optical communication of up to 622 Mb/s from the Moon, but Mars at its closest range is over 150 times farther away from Earth, which makes communicating from Mars 22,500 times harder. 

Right now, using a 5 metre telescope in California as a receiver, NASA expect to get around 50 Mb/s using DSOC. With a larger ground telescope of around 10m diameter, it could support 200 Mb/s. 
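The ‘22,500 times harder’ figure is inverse-square-law arithmetic: received signal flux falls with the square of distance. A quick check with round distance figures (Moon about 384,400 km; Mars at closest approach about 54.6 million km, which works out nearer 140x, with the article’s ‘over 150x’ reflecting less favourable geometry):

```python
# Inverse-square scaling of an optical link from lunar to Mars distance.
moon_km, mars_km = 384_400, 54_600_000
ratio = mars_km / moon_km
print(f"Mars is ~{ratio:.0f}x farther; the link budget is ~{ratio ** 2:,.0f}x worse")

# Naively scaling the 622 Mb/s lunar demo down by ratio^2 leaves tens of kb/s,
# which is why DSOC needs larger apertures and better detectors to reach 50 Mb/s.
print(f"naive scaled rate: {622e6 / ratio ** 2 / 1e3:.0f} kb/s")
```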

Eyes in the sky 

Perseverance is carting 25 cameras to the Red Planet – the most ever flown in the history of deep-space exploration. Twenty-three are part of the spacecraft and two are on the Ingenuity Mars helicopter. 

Of the 23, nineteen are fitted to the rover. They are all assembled from easily available commercial hardware, according to NASA, and include nine colour cameras for engineering and three for entry, descent, and landing (two of which are in colour and dedicated to ‘public engagement’). Another two colour cameras have a zoom and are mounted on the mast that stands up from the rover deck. These are aligned to capture stereo 3D images, “providing a 3-D view similar to what human eyes would see, only better,” NASA claims. 

There are various scientific and navigation cameras too, the most intriguing of which is the SuperCam that fires a laser at targets smaller than 1mm and vaporises the rock into plasma. An onboard spectrograph records all this to reveal the chemical composition of the mineral. 

If there is anything in those sponge-like rock formations revealed by the rover then it had better watch out. That’s the trouble with tribbles. 

Next Giant Steps 

Space watching via channels like NASA TV is already routine but it’s going to become a cinematic event over the next decade as the number of missions and the ambition of them rockets upwards.  

Ingenuity, the Mars helicopter, will shortly begin a series of flights filming over the Martian landscape. ExoMars, a Russian and European Space Agency programme, plans to land another rover on Mars in 2023. There are also probes due to launch to Jupiter and Saturn’s moon Titan in the late 2020s, and space tourism is being launched by SpaceX, Virgin Galactic and Russian agency Roscosmos. 

Tom Cruise, of course, plans to make an actual blockbuster in orbit on the ISS, in a film to be made in partnership with NASA. 

But ratings will go into stellar overdrive with manned missions back to the lunar surface and to Mars, which should lift off by 2030.  

Even if you think it’s all being filmed on a lot in Burbank, these seat-of-the-pants thrill rides will be must-watch moments. 

Friday, 26 February 2021

VR? AR? Today, It's All About XR

Streaming Media

VR is dead, long live VR! That about sums up the reality check the industry is having to spin about next-gen immersive video. The GSMA, which represents the interests of mobile network operators worldwide, admits that virtual reality (VR) continues to suffer from "post-hype realities," a hype that the GSMA's annual Mobile World Congress (MWC) played a role in building. At the same time, the GSMA points to the current mobile ecosystem as a "laboratory" for AI and immersive reality: "Whereas the smartphone wars centred on the app economy, the new battleground is in AI development and a push towards immersive reality." 

https://www.streamingmedia.com/Articles/Editorial/Featured-Articles/VR-AR-Today-Its-All-About-XR-145251.aspx

Analysts predict the global augmented reality (AR) and VR market will grow. Prescient & Strategic Intelligence charts a yearly 42.9% increase, topping $1.274 billion in 2030, largely based on an uptick in the sales of tablets and smartphones. 

Futuresource Consulting expects that the market will have remained relatively flat in 2020, with just fewer than 18 million VR and AR headsets estimated to have shipped. It says this is mainly driven by smartphone VR viewers, which account for more than half of all VR shipments. The market mix is expected to rapidly develop, however, with Futuresource Consulting forecasting that console VR headsets, PC headsets, and all-in-ones (such as Oculus' Quest 2) will enjoy solid growth through to 2024.

In short, VR has failed to take flight, but its time will come again. This article reviews what happened, what's going on now, and what's next for immersive video.

VR 1.0 Stalls

Facebook lit the fuse for VR in 2014 when it acquired Oculus for $3 billion. VR tech and production companies multiplied, with 2017 being the pinnacle. Jaunt raised $100 million from Disney, Google, and Sky, among others, before being bought by Verizon in 2019. Since then, though, things have fizzled. Nokia sidelined its OZO division. Google shuttered its VR platform, Daydream. NextVR hit a market value of $800 million before selling to Apple in 2020.

"The VR market was overhyped and did not deliver meaningful results besides CG-based VR games," says Thierry Fautier, president of the Ultra HD Forum and VP of video strategy at Harmonic. "VR failed to take off because the technology was not convincing, due to severe issues like motion sickness and the ‘screen door effect,' and the quality was not good enough (4K vs. the 16K probably needed). We now see a second wave of VR enabled by 5G and new HMDs [head-mounted displays] with 8K capability."

The hype cycle linking VR and AR to 5G deployment began in 2017. In April that year, Korea Telecom and Verizon claimed the world's first end-to-end 5G network in a demonstration in which a Verizon employee appeared as a hologram on a monitor at Korea Telecom's headquarters. 

"If the 5G network is commercialized, 3D hologram video calls will be available as one of the representative 5G-based services," a Korea Telecom spokesperson told The Korea Herald. "Through a complete hologram video call, users will be able to meet a person in a remote area in a real size in real time."

At MWC 2018, South Korea's SK Telecom debuted 8K "social virtual reality" and "hologram AI." In September that year, Vodafone demonstrated a live holographic appearance of the England women's soccer captain over 5G. In February 2019, U.K. mobile operator EE showcased its 5G network by putting a hologram of "digital supermodel" Shudu on the red carpet at the BAFTA Awards. At MWC 2019, ZTE and China Telecom teamed up to show 5G 8K VR panoramic live streaming from a traveling bus. 

For years, the industry has been building the hype around 5G and, in particular, its "game-changing" capability to stream video games and AR/VR experiences in real time. After all, 5G is the first network infrastructure that can deliver the speed (100Mbps) and latency (10 ms) required by VR and AR mobile applications. But despite heavy investment from Apple, Microsoft, Google, and Facebook, VR/AR is far from mainstream. 

"The lack of a suitable supporting network is one of the factors that has contributed to a slower than expected uptake in mainstream adoption of VR," says Sol Rogers, CEO and founder of immersive content creator REWIND, in an article he wrote for Forbes. "5G will usher in the next era of immersive and cloud-connected experiences. Faster, more uniform data rates, lower latency and lower cost per bit will ensure it."

VR headset ownership has dipped from peaks in 2017 in Australia, the U.K., and the U.S. Whereas 13% of U.S. households owned one in 2017, by 2019 that number had fallen to 8%. According to the GSMA, "Expensive, clunky hardware and continued challenges with dizziness have not helped, but limited content libraries beyond gaming and lack of edge infrastructure are also to blame." 

Sports leagues and broadcasters are among those that continue to see VR as a means of boosting viewer interest. The challenge will be to convince pay-TV customers to spend more for VR broadcasts when they are already being pulled in the opposite direction toward cheaper and more-flexible streaming packages. 

Even the $1.2 billion market size predicted for VR/AR in 2030 is dwarfed beside the existing $1 billion-plus market value of esports or the $1.52 trillion in total global mobile revenues anticipated by the GSMA by 2025.

"XR [extended reality, the catchall term for VR, AR, and mixed reality, or MR] has been a disappointment insofar as it's not met the hype that has often surrounded it, but these expectations were, in hindsight, largely misplaced," says Stephen Mears, a research analyst at Futuresource Consulting. "Too many technological developments needed to be made and coincide with one another for XR to really take off, and there's still a few steps to go. Notably, improved visuals must be combined with spatial audio technologies, and likewise the SoCs [systems on a chip] must be further developed for specific XR use cases."

AR is a little different because it is inherently more suitable for on-the-go mobile computing. Google, Microsoft, and Magic Leap quickly realized that AR's biggest early-days potential lies in B2B. Even Lenovo has launched AR platform ThinkReality for manufacturing.

"[Consumer AR] is hard, as the technology cannot deliver the consumer expectation at a price consumers find agreeable," Fautier says. "Google Glass was a failure because the experience was flawed and [the] price was high. The Magic Leap experience was not great, the ecosystem incomplete, as it required a special piece of hardware, and the price was high."

Earlier this year, Gartner removed AR from its list of emerging technologies, considering the tech to have matured and to be ready to move into the enterprise space.

VR 2.0 = XR

The term XR is now in vogue. You can't get a clearer description of its components than that provided by futurist and strategic advisor Bernard Marr: 

  • VR: A fully immersive experience where a user leaves the real-world environment behind to enter a fully digital environment via VR headsets.
  • AR: An experience where virtual objects are superimposed onto the real-world environment via smartphones, tablets, heads-up displays, or AR glasses.
  • MR: A step beyond augmented reality where the virtual objects placed in the real world can be interacted with and respond as if they were real objects.

Qualcomm is a leading promoter of an XR future. The chipmaker is explicitly linking XR with 5G and has corralled a number of mobile operators to commercialize a head-worn "XR viewer" that can be connected to a 5G smartphone and powered by Qualcomm silicon. "Mobile XR has the potential to become one of the world's most ubiquitous and disruptive computing platforms of the next decade, just as the smartphone of this decade has become," says a Qualcomm blog post. "At some point in the future, we see the convergence of the smartphone, mobile VR headset, and AR glasses into a single XR wearable. … XR glasses could replace many other screens in your life—even big ones like the TV in your living room." 

China Mobile (the world's largest wireless network operator), Deutsche Telekom, EE, KDDI, Orange, SoftBank, Verizon, Vodafone, Telefónica, and NTT DOCOMO are also on board. "We believe that the next decade will see a transition into a heads up society where eventually glasses will be an alternative to smartphones," says Sean Seaton, SVP of group partnering and devices at Deutsche Telekom, in a press release. 

Qualcomm supports Nreal, an XR viewer that connects via USB-C to an Android device, with all of the processing, storage, and wireless network connectivity needed for XR taking place on the phone instead of the headset. Brands including iQIYI, 3Glasses, and Panasonic are expected to follow next year with gadgets bearing a "Qualcomm XR Optimized" badge. Qualcomm mentions enterprise applications such as how "workplace meetings can be revolutionized through holographic telepresence with virtual collaboration platforms." 

Exploring XR

Meanwhile, Deutsche Telekom is teaming up with Orange to develop consumer XR content. The first partnership is with Montreal-based Felix & Paul Studios to provide "the ISS experience." Morgan Bouchet, director of digital content innovation and head of XR at Orange, says, "[We] have been moving forwards with projects to develop platforms that deliver XR/5G, and have been confronted with a common challenge: finding high-quality interactive and immersive content. Supply is still limited in this market, with designs being few, far between, and expensive. Together, we are better equipped to compete with larger players, especially [Google, Apple, Facebook, and Amazon], in order to purchase premium content."

The two companies are also looking into XR in sports and TV. Daniel Aslam, Deutsche Telekom's global partnering and XR business development manager, states, "We are exploring XR [Immersive] Media use cases where for example the TV interacts with mixed reality glasses. This we will continue in 2021 in a co-development project with Orange." 

"Growth in XR in the short-term will be driven by gaming and B2B, particularly in education and training," says Futuresource Consulting's Morris Garrard. "Improvements in productivity in the workplace will help bring XR into consumers' homes."

Perhaps the biggest (non-Qualcomm) XR product will be Apple's glasses, rumored to launch in mid-2021, possibly with an 8K spec.

"Ultimately, there will be compelling use cases for standalone (untethered) XR headsets and wearables, most likely in enterprise AR initially," says Mears. "However, the industry must ensure a seamless consumer experience and, crucially, software and content developers need to be brought on board to make sure that there is something actually worth doing with these devices when they're developed."

VR and AR require bandwidth between 20Mbps and 30Mbps and can therefore be consumed in a stadium without an HMD, which Harmonic believes will be a popular use case. Multi-view, multicam, and multi-game HD video, which will also require 20Mbps, has been demonstrated at the RG Lab with France Télévisions.

"Before 2025, we will see in-stadium applications on mobile devices," predicts Fautier. "For VR or 2D, the applications are multicam, wide-angle camera for live and catch-up action. For AR, we envision score and statistics overlay and point cloud for volumetric capture. We believe multi-view and replay of action in the stadium can now be deployed at scale with 5G."

XR Applications

Nor does XR need to be considered a purely visual opportunity. For example, there is capacity for audible AR, in which the world is described to the wearer through a voice assistant. Simon Forrest, principal analyst at Futuresource Consulting, posits that this may take the form of an advanced hearable product that's able to use location-based sensors and spatial awareness to deliver precisely timed, contextually aware information that's relevant to what the user is doing. "This has consequences on the advancement of AI in voice assistants and how internet search operates because all results must be spoken as the top answer, each and every time," he says.

Beyond headline speeds, the ultra-low latency in 5G effectively brings edge devices and cloud services far closer topologically, blurring the boundaries over where a compute resource can be placed. Furthermore, the opportunity for network slicing—effectively reserving guaranteed bandwidth for specific applications—will present mobile networks with advanced capability. 

"This opens up new applications that demand real-time response from cloud services, with XR one vertical that is able to take full advantage," says Forrest. "Nevertheless, we're still around 5 to 7 years away from blanket 5G service coverage in most developed regions, and certainly, devices won't be ubiquitously connecting at gigabit speeds, since this would necessitate a massive densification of 5G network infrastructure."

Forrest concludes that XR products cannot simply rely on high-bandwidth mmWave. Centered on 26 GHz, mmWave adds the "super data layer," providing ultra-high bandwidth service of up to 10Gbps over very short distances. "Therefore, 5G should be considered as one part of the overall solution for XR," he says. "In tandem, SoCs are being developed with neural network accelerators on board. This innovation is affording opportunity for edge-AI, massively improving the compute capabilities available on XR devices themselves. Combining this with low-latency 5G connectivity then allows cloud-based compute to run alongside. Ultimately, 5G enables a balance of compute between edge and cloud, with capacity to execute in near real-time."

For in-home XR experiences, 5G will be "essentially meaningless," says Forrest, with Wi-Fi 6 far more important in ensuring low-latency, high-quality XR experiences if the content being consumed or used is being directly streamed, such as sports. Moreover, the Wi-Fi 7 (802.11be) standard, presently under specification and scheduled for commercial launch in 2024, is expected to offer peak connectivity speeds of 30Gbps and lower latencies, competing directly with 5G mmWave. Forrest points out that this will necessitate fiber-to-the-premises in order to connect that massive local bandwidth to the internet backbone.

New codecs such as Versatile Video Coding (VVC) and Essential Video Coding (EVC; MPEG-5) will help reduce video bandwidth for broadcast/push VR applications, and these codecs will be used in tandem with 5G and Wi-Fi 6 connectivity. But video frames aren't necessarily the best technology for XR, and there are alternatives under development. One such innovation is in volumetric capture techniques: Video can be captured using multiple cameras, and then a "point cloud" is generated, which describes the 3D [x, y, z] coordinates and color of each pixel in the scene. 

"From thousands of points, a set of 3D objects are created, and these can be rendered in the headset; coupled with the increase in local compute performance, this offers potential for new XR products based around graphical rendering techniques, rather than just video frames," says Forrest.
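A quick size estimate shows why those compression efforts matter. With a naive per-point layout (three float32 coordinates plus 8-bit RGB), even a modest capture swamps any wireless link; the point count and frame rate below are assumed, illustrative values:

```python
# Uncompressed point-cloud bandwidth, naive layout: x, y, z float32 + RGB.
bytes_per_point = 3 * 4 + 3
points, fps = 1_000_000, 30  # assumed: 1M points per frame at 30 fps
raw_bps = bytes_per_point * points * fps * 8
print(f"uncompressed: {raw_bps / 1e9:.1f} Gb/s")  # ~3.6 Gb/s, far beyond 5G budgets
```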

BT Sport is exploring how to capture volumetric video and deliver interactive immersive experiences over 5G, both within sports stadiums and to augment the live broadcast at home. Ideas include streaming a real-time virtual volumetric hologram of a boxing match onto a viewer's coffee table, simultaneously with the live feed. It is all at the proof-of-concept stage, but is part of a nearly $40 million U.K. government-funded program to develop applications that will drive 5G take up. 

6DoF Free Viewpoint

Companies within MPEG are actively defining algorithms for the compression of point clouds, such that capture and transmission are standardized. These are MPEG V-PCC and MPEG G-PCC for video and graphics respectively, both of which seek to capture 3D volumetric data and reduce bandwidth significantly.

"Beyond 2030, we anticipate there will be 6DoF [6 degrees of freedom] light field applications that will require high computation on the client side, as well as new display technology," Fautier says. "Those instances will [be suitable for] location-based environments. A second application is the free viewport application to render a game. While that application is not in real time today, we expect it to be."

In 2019, Canon used its Free Viewpoint Video System to provide highlight videos at the Rugby World Cup. The company says that the system "makes use of multiple cameras positioned around a stadium … and the high-resolution feeds are combined and converted into 3D spatial data, generating a comprehensive three-dimensional map of the action moment by moment."

Another application, which requires specific software in the 5G network, is FeMBMS (Further evolved Multimedia Broadcast Multicast Service). This new mobile broadcast mode in 3GPP Release 17 addresses distribution of popular content up to a radius of 60 km. Harmonic expects to see the first deployment at the Paris Olympics in 2024. If that is conclusive, it could be widely deployed after 2025. 

"Mobile broadcast is the wild card, as it has failed in 4G," Fautier says. "However, if the QoE is much better in situations where streaming is congested, then FeMBMS will be successful. MNOs will be able to monetize their network to content providers, broadcasters, or event organizers (like the IOC or FIFA)."

If XR is to fly, it needs a disruption in display technology, according to Qualcomm. Attributes of these "glasses" include delivering field of view for both immersive VR and useful AR (completely opaque for VR, yet at least about 85% transparent for AR), driving high dynamic range (HDR) of at least Rec. 2020 gamut, and refreshing at a minimum of about 120Hz (per eye). It also has to cost less than $100. 

Qualcomm also says that 5G-enhanced mobile broadband is required for XR mass adoption. "XR video will be the killer use case for 5G," it states. Target attributes include latency down to 1 ms and a uniform experience—even at the cell edge. Qualcomm predicts that over the next decade, 5G XR should advance to attain speeds of 200Mbps to 5,000Mbps, with interactive, real-time, 3D Free-Viewpoint, 6DoF, 8K/90 fps–120 fps HDR video.

Spatial Computing Frontier

XR devices are the gateway toward the real prize targeted by tech giants like Magic Leap, Microsoft, Apple, Google, and Nvidia: the next-gen internet, which is theorized as 3D and tactile, requiring a new human-machine interaction designed around voice and gesture.

Spatial computing, as defined by Magic Leap, "allows digital content to interact with its physical surroundings and people to interact with content seamlessly, without the limits of a screen." The company is bringing this forward by running spatial computing on the 5G infrastructure of partners such as Japan's NTT DOCOMO.

Nvidia's bid for a controlling share of this online future is called Omniverse. Targeted initially at the enterprise, Omniverse "fuses the physical and virtual worlds to simulate reality in real time with photorealistic detail," according to president and CEO Jensen Huang in a recent keynote. "Cloud native and photoreal, with path tracing and material simulation, the Omniverse allows designers and artists and even AIs to connect in a common world. This is the beginning of the Star Trek Holodeck, realized at last." 

Thursday, 25 February 2021

Florentine Films Finishes Hemingway Remotely with ClearView Flex

copy written for Sohonet

Influential documentarian Ken Burns has made many landmark films in his four-decade career, including the Academy Award-nominated Brooklyn Bridge, The Civil War and The Vietnam War. His latest work is Hemingway, a three-part, six-hour series made for PBS with producer-director Lynn Novick, exploring the life and work of the legendary writer.

https://www.sohonet.com/our-resources/blogs/florentine-films-finishes-hemingway-remotely-with-clearview-flex/

Like all of Burns’ work, it is produced by Florentine Films, the production company he set up in Walpole, New Hampshire, with fellow filmmakers Elaine Mayes, Lawrence Hott and Roger Sherman. Each member works independently, but releases content under the shared banner of Florentine Films. 

“We’ll never run out of stories,” says Daniel White, post-production supervisor, Florentine Films. “We’ll have half a dozen in the hopper at various stages of production. I’ll be leapfrogging from film to film to film. There’s always something going on.”

In February 2020, White was working on a restoration of Burns’ marathon 1994 documentary Baseball when Covid started to shut things down.

“We had to very quickly figure out a solution to be able to continue coloring remotely with Technicolor PostWorks, New York. Right away we tested ClearView but it just wasn’t quite ready for what we needed. Within a month Sohonet had ramped it up, including with 5.1 audio, and we tested it again.”

White sent LG C9 55-inch monitors to PostWorks for it to calibrate to professional standard. The facility shipped them to the homes of Florentine’s post team who, in the meantime, had equipped themselves with the latest version of Apple TV and upgraded to the fastest local internet speeds.

“With ClearView, everyone who works at Florentine Films was working remotely on the project,” White says. “Suddenly, we were able to stream directly from the post house to do color and titling sessions. We were pretty impressed at being able to do it remotely. Usually, we’d send our editors and producers in and out of the edit facility to sit with the online editor. The travel would be expensive and disruptive to our families.” 

While White was in his basement office in Keene, New Hampshire, colorist Jack Lewars was at home in Brooklyn.  “We were streaming remotely and doing color grades,” says White. “It looked terrific and worked smoothly – essentially like a Netflix stream.”

A similar workflow applied to the show’s sound mix. Re-recording mixer Joshua Berger had a ClearView box at his stage at Harbor Picture Company in Manhattan, streaming to Hemingway’s sound editors working remotely from home.

“We had one of our dialogue editors in Ithaca, New York listening to streams played out of Harbor in Manhattan. When we did our mix playbacks we had picture and sound coming through in sync and at cinematic quality via ClearView Flex. We pay a lot of attention to detail on both the picture and sound side and being able to accomplish all of this hundreds of miles away from our mixer and colorist was remarkable.”

The pandemic made access to the archive of Hemingway’s manuscripts, correspondence, scrapbooks and photographs housed at the John F. Kennedy Presidential Library in Boston a little tricky. As a result, some of this material requiring a 4K scan came in at the last minute, but the project still finished on time and on budget. 

“ClearView Flex improved by leaps and bounds during the crisis. More facilities started using it. Without it, I’m not sure where we’d be.”

White has already moved on to other Burns projects on subjects as diverse as Muhammad Ali, the American buffalo and the Holocaust. Even as the pandemic eases the most immediate concerns of working in a facility, remote solutions have opened up a flexible work environment that won’t be reversed. 

“Now that we have these tools, in the future we won’t need to travel as much,” says White. “If we’re setting looks it would be nice to sit in the room with your colorist directly or to point to something on a screen to your online editor. Many people will want to do their final review in the facility too. But for two thirds of the process, including titles or graphics sessions, you can do it from the comfort of your couch.”

The regular workflow for White and the production team is to have Zoom open on one screen to chat while sound and picture are streamed from the sound house or finishing facility. 

“I can tell the online editor at PostWorks to ‘pause right there’ or ‘can you back up 5 seconds’ or ‘let’s look at that graphic again’. I can ask the colorist to darken this or that area or make that parchment paper a little bit warmer.

“By contrast, for all of us to schedule something where we all had to be in Manhattan at the same time and figure out hotels and travel and food took a lot of time and expense. 

“Just being able to text each other ‘let’s do a Zoom at 11am tomorrow’ and set up a ClearView session very quickly has more than paid for itself in just one person’s travel costs alone.”