Wednesday 31 March 2021

How IP is bringing broadcast and AV closer together

IBC

Once, there was clear daylight between broadcast and professional AV production but now, thanks mainly to IP adoption, the gap is narrowing. 

https://www.ibc.org/trends/how-ip-is-bringing-broadcast-and-av-closer-together/7426.article

Innovation is a two-way process; broadcast workflows are being used to address ‘enterprise’ requirements, while video conferencing tools and LED displays are increasingly bleeding into broadcast. But the division between the two sectors could all but disappear if a new proposal to underwrite proAV equipment with an open IP standard based on SMPTE 2110 gains ground.

“IP opens the floodgates for the two to merge,” says Liam Hayter, senior solutions architect at NewTek. “The tools to create and present content will homogenise around commercial off-the-shelf product.”

The proposal is called IPMX (Internet Protocol Media Experience) and is backed by SMPTE, AMWA and AIMS, the trio that (with VSF) designed and delivered ST 2110. There are already a number of competing and fairly entrenched media over IP protocols in proAV including NVX, NDI, HDBaseT and SDVoE. That’s why AIMS may have a harder time persuading this adjacent industry to yield than it did in broadcast.

“When an industry achieves a certain level of maturity it will plateau with only proprietary solutions,” says David Chiappini, EVP of R&D, Matrox. “Markets explode when open standards arrive.”

This article goes deeper into IPMX and underscores the blurring of the AV-broadcast lines.

Converging expectations
In truth, there have long been deep impacts in both directions. Broadcast refined the art of video production workflows and built products and methodologies for content management designed to extract maximum value over time from funded-for-monetization content.

“In broadcast, the ability to maintain the most pristine digital copy of source content is essential because content gets continuously edited and re-adapted much beyond the original live broadcast,” says Sam Recine, VP of sales, Matrox. “In proAV, the focus is often elsewhere: situational awareness, shared decision-making, training, customer and channel support, and more.

“The fully networked enterprise or government organisation facility has continued to evolve into networks of facilities. And in live events, proAV deals with professional experiences that need to be torn down and rebuilt night after night as venues are continuously re-jigged for different events.”

Both markets are very similar in their expectations of reliability and performance. Neither is willing to tolerate ‘black to air’ or a signal failure, although AV customers may be more inclined to test and experiment with solutions.

“In many instances, AV has to adapt to the location or the brief of a client,” says Darren Gosney, technical sales manager EMEA, Blackmagic Design. “Traditional broadcast can be slower to evolve, with large infrastructural changes needing to be made to facilities, production galleries, OB trucks and transmission.”

Broadly speaking, the demands in areas of live production - whether for broadcast or AV - are the same. ProAV has adopted the same cameras, monitoring, routing and distribution as, say, a national broadcaster.

“The difference is when you move into more niche areas of AV such as video conferencing or corporate,” Gosney says. “Then the demands for AV become more defined around user interface and control. Video systems become a communication device and not a production-based technology. Here is where we see more specific product development and integration with other AV vendors.”

Higher resolutions such as 8K have been adopted far more quickly in the proAV space, specifically for applications such as video walls and projection mapping. In the live event space, the ability to have higher resolution content displayed in full quality is hugely beneficial.

“There is a focus on “broadcast quality” which won’t always be interpreted in the same way between different groups,” Gosney says. “While a broadcast engineer might determine this to mean 10bit 4:2:2 video, this may not necessarily resonate with a live AV company, who may define broadcast quality by a specific video resolution, a product feature such as keyers on a switcher, or redundancy for mission critical applications.”

Graham Sharp, Broadcast Pix CEO, thinks there’s no real divide between products. “I honestly think it’s marketing. There are many more ProAV users than broadcasters who have 2K and 4K equipment. It may be that a broadcaster is more likely to avoid on-air mistakes, but given the amount we now see on network TV in the US, I’m not even sure that is true.”

High-end content demand
Heightened consumer demand for high-end content is a driving force for the overlaps between broadcast and proAV technologies.

“Whether watching a blockbuster film on a mobile device or streaming a virtual event, audiences are accustomed to consuming the largest video resolution formats with deep colour, HDR and the highest fidelity audio,” says Bryce Button, director of product marketing, AJA Video Systems. “This demand roots back to the democratisation of high-end content production and distribution, made possible by IP and advancements in broadcast technologies that simplify affordable capture and delivery of high-resolution content.

“Tablets and affordable UHD HDR HDMI monitors have also raised end user demands,” Button adds. “We live in an age in which you can easily record 4K or even 8K at high frame rates, often with Dolby Vision HDR already integrated.”

The trend towards more open, universal, and interoperable content is not specific to either market. It is also a logical expectation from customers.

“The abstracted nature of an IP network allows all manner of different devices to communicate on a common platform,” says Kieran Walsh, Director of Application Engineering, EMEA at Audinate. “Interoperability is key. In audio, being able to mix and match different vendor equipment from a variety of different traditional ‘vertical’ markets through a common interoperability platform such as [audio network] Dante, allows for the greatest palette of solutions to the designer.

“Having a unifying technology allows for greater imagination and, ultimately, a superior execution of a particular requirement to be realised.”

A use case for the seamless crossover between AV and broadcast can be found in a sports stadium. While one crew is producing the live match for broadcast, another team might be routing elements of the same media to pitch-side screens, VAR, concourse and VIP screens or fan-parks outside. When the communication is video, one common foundation makes sense.

Enter IPMX
Pressure is building on proAV to adopt a single media over IP standard to ensure interoperability and build out AV at scale.

“Taking advantage of the standards work done by SMPTE in the broadcast world and adapting it for the proAV market with discovery protocols and KVM style extension, will allow manufacturers to produce more interoperable products,” says Sid Lobb, head of vision and integrated networks, Creative Technology. “This will offer integrators and AV companies alike much more flexibility in offering solutions.”
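That discovery layer draws on AMWA’s NMOS specifications, which expose registered senders and receivers over plain HTTP. Purely as a minimal sketch of the idea (the registry address and printed fields are assumptions for illustration, not any particular vendor’s implementation):

```python
# Minimal sketch: list senders registered with an NMOS IS-04 Query API.
# The registry URL is hypothetical; real deployments discover the registry
# via DNS-SD or static configuration.
import requests

REGISTRY = "http://nmos-registry.local:8080"  # hypothetical address

def list_senders():
    resp = requests.get(f"{REGISTRY}/x-nmos/query/v1.3/senders", timeout=5)
    resp.raise_for_status()
    for sender in resp.json():
        print(sender["id"], sender.get("label", ""), sender.get("transport", ""))

if __name__ == "__main__":
    list_senders()
```

Connection management (NMOS IS-05) then patches a chosen sender onto a receiver; the point is that the same open, HTTP-based model works whether the endpoint is a broadcast gateway or a proAV encoder.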

From aquariums to zoos and everything in between, Broadcast Pix says it is fielding enquiries from all areas of business, education, government and churches trying to communicate with their constituents through streaming video.

“There is a whole generation of users who are not broadcast-trained and they need products that are easy to understand, install and use,” Sharp says.

AIMS began making its case in earnest in 2020. The Alliance’s 100+ backers include Avid, AJA, Sony, Riedel, Leader, Calrec and Canon, many of which sell gear into both camps.

IPMX builds on the four years of work that SMPTE has done in battle-hardening ST 2110 and adds a number of things to meet the requirements of the AV industry.

This includes support for HDMI, the most ubiquitous signal connector in proAV; digital copy protection with HDCP; and media compression using JPEG XS.

“An open standard allows vendors to focus their development energy on building more value-add and less on reinventing that foundational layer,” says Chiappini. “The installer base benefits because they have multiple choices to create solutions for customers, and the customer benefits because open standards protect against obsolescence.”

Relaxed timing
In a marked change from ST 2110, IPMX will support asynchronous or relaxed timing.

“2110 is based on a genlocked studio and insists on expensive grandmaster clocks and PTP switches. IPMX does not require these since the vast majority of proAV cases don’t need this level of timing,” he explains. “In proAV I am using set top boxes, Blu-ray players, PCs, digital signage players - all independent of any synchronisation effort. Adding asynchronous support allows us to use standard inexpensive kit to drive all our sources and make integration easier.”

On the roadmap for inclusion in later versions of the IPMX standard is support for highly compressed profiles such as H.264, HEVC and AV1 to enable wireless, second screen and mobile delivery over the public internet.

“There is no way to unlock the full potential of converged alpha-numeric data, live communications, and high-performance proAV media experiences without open standards,” says Recine. “There is no way for proAV to most fully harness new developments like artificial intelligence without volume and scale. The bottom line is that innovation combined with customers seeking to protect and scale and adapt their investments make open standards unstoppable.”

ProAV live and event producers are eager to embrace VR and AR as well as mixed reality applications by working with broadcast and film production technologies to create new applications.

“In these instances, the processing and performance of resolution and colour sampling for keyers or compositional software will match proAV needs from broadcast technical development,” says Gosney. “We’re already seeing incredible development in these technologies which bridge broadcast technology with AV user experiences or engagement.”

 

How Sky News swiftly launched a channel dedicated to the George Floyd murder trial

IBC

Sky News conceived and launched a channel dedicated to the trial of Derek Chauvin, who is accused of the murder of George Floyd, in just nine days. IBC365 finds out how it was done.

https://www.ibc.org/how-sky-news-swiftly-launched-a-channel-dedicated-to-the-george-floyd-murder-trial/7428.article?clearcache=1

Former Minnesota police officer Derek Chauvin is on trial in Minnesota on two counts of murder and one count of manslaughter for the death of George Floyd, an unarmed black man, in the city of Minneapolis last May.

The jury’s decision is the most eagerly anticipated in the United States since that of OJ Simpson in 1995, when Sky first broadcast live feeds from a court.

“An idea came forward in the newsroom for a dedicated channel which is something we’ve done before,” says James Whicher, director at Sky News.

“With Covid lockdown and working from home we were keen to look at new and different ways of working a complicated production without being in the studio. Our galleries are quite a precious resource and tying one of them down for a month could impact our day-to-day operations.

“So, the question was asked of the tech team, is it possible to do a dedicated channel with a remote set up with minimal crewing to look after it for a month?”

A key component was mirroring the look and feel of Sky News branding without using any of the existing graphics capability controlled by hardware in-house.

“I thought the main problem we’d face would be to recreate our entire branding at such short notice,” Whicher says.

“We had a chat with Singularlive (a cloud graphics specialist) and they put together our Sky News look within a couple of days. It sends out a URL from the cloud that you just copy into the system here and the live graphics appear. It’s remarkably straightforward.

“We’re taking the main courtroom mix and one ISO of the judge so that if the producer felt anything was inappropriate for the viewer to see we can cut away,” Whicher says.

The live feed for George Floyd Killing: The Trial is from US network Court TV, distributed over LiveU Matrix in the cloud and downloaded at Sky’s premises at Osterley, just outside London. It’s a feed that Sky News would take anyway in order to pull clips for news coverage of the case. The feed is converted from the US-standard 30fps to 25fps (1080i50) and decoded from an HEVC transport stream to uncompressed video in SMPTE 2022-6 using kit from Open Broadcast Systems. Its software-based encoders and decoders are used for a number of operations at Sky.
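Sky’s chain uses LiveU Matrix and Open Broadcast Systems kit, but the general pattern, pulling in a compressed contribution feed, standards-converting it and handing it on, can be sketched generically with ffmpeg. Everything below (addresses, bitrates, the use of ffmpeg itself) is an illustrative assumption rather than Sky’s configuration:

```python
# Illustrative only: receive an HEVC transport stream over SRT, convert to 25fps
# and re-encode for onward contribution. Not the actual Open Broadcast Systems
# workflow; requires an ffmpeg build with libsrt.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "srt://0.0.0.0:9000?mode=listener",    # hypothetical SRT listener for the inbound feed
    "-vf", "fps=25",                              # crude 30->25fps conversion (drops/duplicates frames)
    "-c:v", "libx264", "-preset", "veryfast", "-b:v", "20M",
    "-c:a", "aac", "-b:a", "192k",
    "-f", "mpegts", "srt://contribution.example.net:7000",  # hypothetical onward SRT destination
]
subprocess.run(cmd, check=True)
```

A proper broadcast standards conversion would use motion-compensated interpolation rather than a simple fps filter; the sketch just shows where that step sits in the chain.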

Cloud flexibility
The program feed is routed from Osterley over SRT into the easylive.io cloud and delivered over SRT to the Sky contribution hub. An MWEdge software gateway from Techex provides protection and monitoring for the content on ingress and egress from the cloud.

“Sky and Techex have been working together for almost 10 years designing and delivering systems for video compression, uncompressed multiviewers and video transport,” says Russell Trafford-Jones, Manager, Support & Services at Techex. “The deployment of software-based MWEdge for transport is a natural next step.”

The MWEdge software can sit on a VM, bare metal or in the cloud and was recently shortlisted for an IABM BaM award. “MWEdge works well because it can use RIST and SRT for delivery over the internet, with seamless switching between paths using ST 2022-7 even when using RIST and SRT,” he adds. “It allows Sky to verify the signal is good coming into them, with encryption supported for security.”

The whole channel can be run by production staff rather than a director. Explains Whicher: “Because this is the first project of this nature and scale we set up a workstation in the Sky newsroom. Once set up, a producer logs in and runs the channel for the day, which is a bit unusual. We needed something that was operationally relatively straightforward to use. To make it as easy as possible we put the workstation in the newsroom with access to other producers if required, but we could just as easily run this from someone’s home.”

Indeed, Whicher is able to monitor the channel from home and could take control at any time. He even customised an X-keys panel for easy cutting between sources although mixing can also be done using a keyboard.

“We’ve decided to let the trial speak for itself but we do have the flexibility to add studio presentation or live commentary if we need to. One of the nice things about a cloud-based system is that it is expandable in short order. If a decision were made to have a correspondent across the coverage we can set that up and bring it in as another input. That reporter could work from home as well.”

With a deadline of little over a week before the trial began, Whicher admits to being initially a bit daunted by the task but in two days came up with a demo platform to show to the editorial team.

The broadcaster has created pop-up channels before (Sky prefers the term ‘dedicated channel’) including a Brexit-Free news channel in 2019, but this is the first time it has done so entirely in the cloud.

“It involves a lot of collaboration with the technical department at Sky but we’ve got a fabulous wealth of knowledge so there was never any doubt that we’d deliver.”

The project was rapidly spun up, doesn’t lock up any gallery resources at Sky HQ and can be operated by just one person.

“It was a very quick turnaround from inception to going live in a matter of days and we did so without any real difficulty,” says Richard Pattison, Sky News deputy head of news technology.

“It was mainly a matter of getting everyone lined up in time. We also needed it to run for several weeks so the ability to create graphics and tickers in the cloud without having to tie up expensive machinery in Osterley was important.”

Whicher suggests that setting up channels dedicated to covering significant events could become a regular feature of Sky News output.

He says: “On the basis of how quickly we managed to put it on air without any major technical hiccups I think dedicated channels in the cloud will be increasingly looked at and I would think be a normal operation in years to come. The ability for 24-hour news channels to react to breaking news events, in greater depth, with clean, stable feeds and on brand is incredibly important. The technology is here to do it.”

George Floyd Killing: The Trial is available on Sky channel 524, on YouTube and online through the Sky News website and app from 3pm to 11pm Monday to Friday for the duration of the trial.

Mixing in the cloud

Sky’s brief to easylive.io was for a 24/7, cloud-based video mixer from which they could pilot and run the show collaboratively and remotely.

Another goal was to ingest multiple live sources (SRT, WebRTC, and other video protocols) and to be able to mix them with overlays, lower thirds and graphics, perform picture-in-picture and add remote commentators as needed for live commentary on the trial - all the basics required of a video mixer.

A further requirement was that one operator be able to run the show (including the other technical partners onboard).

“This is exactly what we do,” says Philippe Laurent, CEO and co-founder Easylive.io. “We offer a fully cloud based visual mixer for switching multiple live inputs, VODs, graphics and overlays. Everything is mixed in real time in the cloud to deliver broadcast-grade live feeds.”

While easylive.io operates in one cloud and Singular’s graphics run in another, there were no issues integrating the two.

“Singular has been a long-time partner of ours,” explains Laurent. “We are the first cloud-based service to integrate HTML overlays, therefore it was a no brainer to work closely with Singular. You add a new input in your mixer and select Singular to receive their overlays. This action is literally performed in two clicks.”

Control of the channel is enabled on Sky’s desktop through Easylive’s GUI. “Our video mixer has been built under an API, meaning that you can fully control it from external services and automate the production,” he says.

“We support every type of video protocol (RTMP(S), SRT, WebRTC, MPEG-TS (UDP/TCP, RTP, HTTP(S)), HLS, DASH, etc.) in both ingest and contribution. The studio itself delivers the expected experience from any professional video mixer (overlays, playlist, media bin, PIP, templating, etc.).”
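easylive.io’s actual API is not documented in this piece, so the call below is purely hypothetical; it simply illustrates the kind of automation an API-driven cloud mixer makes possible, such as cutting programme output to a named input from a producer’s own script:

```python
# Hypothetical example of driving a cloud video mixer over a REST API.
# The endpoint, authentication and payload are invented for illustration;
# they are not easylive.io's real interface.
import requests

API = "https://mixer.example.com/v1"        # hypothetical base URL
HEADERS = {"Authorization": "Bearer <key>"}  # hypothetical token

def cut_to(input_name: str) -> None:
    """Cut the programme output to the named input."""
    resp = requests.post(f"{API}/program/cut",
                         json={"input": input_name},
                         headers=HEADERS, timeout=5)
    resp.raise_for_status()

cut_to("courtroom_mix")  # e.g. return to the main court feed after a cutaway
```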

The service is fully SaaS, runs on AWS and is being deployed worldwide.

“Wherever you are located in the world, you can contribute and collaborate with your colleagues in real time,” Laurent adds.

 

Face Off: “Welcome to Chechnya” Brings AI to Civil Rights Fight

 NAB Amplify

When the Academy announced its Oscars shortlist for VFX it had pundits scrambling to make sense of one of the nominees. Out were Wonder Woman 1984, Tom Hanks’ naval drama Greyhound and Marvel spin-off The New Mutants. In were Tenet, Love and Monsters, Mulan… and Welcome to Chechnya, a documentary detailing the brutality the LGBTQ community faces at the hands of the Chechen government.

https://amplify.nabshow.com/articles/face-off-welcome-to-chechnya-brings-ai-to-civil-rights-fight/

The film may not have made it through the bake-off but it retains the distinction of being the first ever doc to be shortlisted for consideration in this category. It’s also to the Academy’s credit that it recognised the achievements of work a long way outside the mainstream.

In investigative reportage, when the identity of subjects needs to be protected they are conventionally blurred out or appear in silhouette. That’s even more the case here, where the queer Chechens are in danger inside and outside of the country from what has been described as ethnic cleansing.

Filmmaker David France was all too aware of his obligations but also wanted to preserve the emotional resonance of his subjects’ experiences. So with VFX supervisor Ryan Laney, he struck upon a ground-breaking combination of digital face-replacement and machine-learning software to replace some of the individuals in the film with digital stand-ins.

There has been a multitude of face-swapping CG in recent blockbusters, including Thanos/Josh Brolin in Avengers: Endgame, but this is journalism, where the integrity of the story really matters.

Laney describes the technique as “a bit like a prosthetic where the new face is painted over the old face, so it’s 100 percent the original person’s motion, we’re not fabricating anything new.”

France and Laney, a VFX guru who has worked on Fight Club, Jurassic Park III, Harry Potter and the Chamber of Secrets and Ant-Man, trialled several ideas, from rotomation — of the kind seen in A Scanner Darkly — to standard-issue pixelated faces, to asking artists to reinterpret the faces, only to find this superimposed an artist’s impression on the reality.

They even tried a Snapchat-like technology to put digital glasses, masks or new noses on the subjects to disguise them in some way. “What that really wasn’t doing, though, was helping us tell this really urgent, human story,” France and Laney tell Digital Trends.

“We kept losing the human aspect of it. It wasn’t until we saw Ryan’s first pass on the face swap using a volunteer that we knew we had something that would allow us to show the movie to an audience. We had promised everyone in the film that we wouldn’t release it until they were satisfied with their disguises and their presentation.”

France asked 22 people — mostly queer activists in New York — to lend their faces as a physical shield to protect the people in the film. They were filmed against green-screen from multiple angles in varied lighting conditions with their faces subsequently mapped using machine learning over the subjects in the film. 

“While all eye and mouth movement and facial tics belong to the original subject they are all being carried out beneath the skin of these volunteers,” France says. “This could allow us to know their stories; it’s still them. We see the weight on their faces. It comes through, it’s unmanipulated, we’re picking up those expressions.”

The filmmakers also went to extraordinary lengths to secure the original footage captured on location in Chechnya, including overwriting the cards rather than just deleting the data in case it fell into the wrong hands.

In the US, in postproduction, “we built a secret lab that was entirely offline, so all of our turnovers were on a hand-delivered drive that was given to me in person. All of the transfers we did for dailies and work reviews were all done in very encrypted fashion with passwords that weren’t shared online.”

Every frame was scrutinized to ensure nothing that could put a subject at risk remained. “We knew it was going to be studied on a forensic level,” France admits.

R&D alone took the best part of a year. And budgeted at just $2.2 million, the film includes 400 such shots, or an hour of its 107-minute run time.

“That’s a big change for documentary filmmaking. There’s now this tool for filmmakers to tell their stories in ways that haven’t been done before, and it also provides some additional security for witnesses to tell their stories and do it in a human way. They don’t need to be monsters in the shadows. They can have a voice and be in the light and have their story translated effectively and truthfully.”

 


Monday 29 March 2021

Is End-to-End Cloud Production Possible? Seems Like Yes

NAB Amplify

Mulan and Hidden Figures cinematographer Mandy Walker ASC ACS has expressed confidence in using cloud tools and services for dailies, having experienced the technology as part of the HPA Tech Retreat.

https://amplify.nabshow.com/articles/is-end-to-end-cloud-production-possible-seems-like-yes/

Walker was on hiatus while filming was paused (due to Covid) on Baz Luhrmann's 'Elvis' biopic and was able to supervise production of a short film for the HPA Tech Retreat’s real-world stress-test of remote distributed creative workflows.

“Going forward, I feel much more relaxed about accessing remote systems on a movie,” she said.  “For instance, if I have a second unit in another location or part of the world, I’d feel confident getting really good quality dailies to be able to pass comment.

“The other thing is that, in the post process, I want to be able to say we can work with someone in LA or London while I’m in Australia, or vice versa, and not feel the experience is going to suffer in terms of quality or time. I feel we’re there.”

Showing technology and workflows connected

The camera-to-post demonstration at last year’s Tech Retreat proved remarkably prescient as the world entered lockdown a month later. The 2021 Supersession took this to another level by following the progress of six short films made under Covid conditions in different parts of the world, with every element of editorial and post-production managed remotely in software and in the cloud.

“It’s very important we show the industry different variations on a similar theme: how do we solve VFX and post with a remote crew that could be anywhere in the world?” explained organizer Joachim Zell, a former EFilm and Technicolor technologist. “We want to show technology connected and creating real deliverables.”

The HPA asked groups of young filmmakers variously in London, Dubai, Mongolia, Mexico City, Brisbane and Hollywood to each write and produce a short film related to the pandemic and shot under Covid conditions. All the principal filmmakers are women and include Mexican DP and producer Sandra De La Silva, Saudi director and producer Abeer Abdullah, Australian director and producer Ruby Bell, British producer, writer and actor Bayra Bela, and Mongolia’s Azzaya Lkhagvasuren.

To test the remote cloud workflow to its limits, all the films were acquired at UHD resolution with cameras including Alexa LF (4.5K), Sony Venice (6K), RED Komodo (6K) and a Blackmagic Design Ursa Mini Pro 12K.  Each movie is being made available with deliverables from HD to 8K, a range of color spaces including Rec.709 and Rec.2020 and sound mixes such as stereo, 5.1, Dolby Atmos and DTS. Language versions and archiving were also performed using distributed teams.

HPA helped organize access to a worldwide pool of craft artists for supervision, including Walker and Christopher Propst ASC.

For example, the Brisbane film was shot over a weekend on the Gold Coast with the score and editorial performed at different locations in Sydney. One producer was in Australia, another in LA with sound post at Skywalker, picture post at Dolby in LA and VFX in London.

Supervising cinematographer Walker was able to view dailies on Moxion from a location away from the set. She also supervised the DI using an online application from Colorfront. “The quality was amazing,” she reported. “It meant I could watch what they were doing, make comments, even rewind takes. There are existing digital systems for dailies and color timing but this has now stepped up.”

In keeping with the progressive aspects of the Supersession productions, this short included a number of women HoDs. Walker and Bell were also keen to use the opportunity to give other young filmmakers a leg up. The B-camera DP on Elvis was the short’s cinematographer, one of the film’s electricians served as gaffer, and a grip became key grip.

AWS management

Content for all the projects was uploaded to a central data lake on the AWS cloud, where AWS applied encryption and authentication and orchestrated compute instances to fire up virtual workstations.

“We were able to connect various editors in different locations to high-speed storage so they could edit BMD RAW or ARRI RAW without needing to take their workflow outside of the cloud,” says AWS Solutions Architect Matthew Herson. “Since the virtual workstations in the cloud are in proximity to the data, it is high speed and fundamentally shifts from having to download to a laptop or edit bay. Once that process is done, it can be handed off to the next part of the chain, such as color.”

AWS’ Media Asset Preparation Service (MAPS) was used to exchange assets across AWS storage mediums such as Amazon S3.

“MAPS gives anyone the ability to individually upload files, do previews or dynamic data movement for editorial workflows,” says Herson. “When they get to the end of a project they can use media exchange to seamlessly hand off the files between facilities or vendor companies in a fast and secure fashion.”
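MAPS itself is an AWS-managed layer, but the underlying movement of assets into S3 can be illustrated generically with boto3. The bucket name, keys and presigned-link step below are assumptions for illustration; this is not the MAPS API:

```python
# Generic illustration of pushing camera rushes into S3 and sharing a time-limited
# preview link, the kind of object storage a service like MAPS sits on top of.
import boto3

s3 = boto3.client("s3")
BUCKET = "supersession-data-lake"  # hypothetical bucket name

def upload_rush(local_path: str, key: str) -> str:
    """Upload a rush and return a one-hour presigned URL for remote review."""
    s3.upload_file(local_path, BUCKET, key)  # multipart upload handled automatically
    return s3.generate_presigned_url(
        "get_object", Params={"Bucket": BUCKET, "Key": key}, ExpiresIn=3600
    )

print(upload_rush("A001_C002.braw", "brisbane/day1/A001_C002.braw"))
```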

A sticking point was uploading rushes to AWS in the first place. The first plan, which was to rely on the filmmaker’s home bandwidth connections, was a non-starter.

“We had ARRI RAW or 12K Blackmagic RAW files set to upload overnight but when the operator checked the next morning only a couple of shots had uploaded,” said Zell. “It’s a calculation; you know what your upload speeds are and what your data volume is. We had to call on friends to help us.”

These included professional high-bandwidth links locally in each city, from the likes of Sohonet and Brisbane post-house Cutting Edge.
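The calculation Zell describes is simple arithmetic, and it shows why home broadband was a non-starter. The figures below are illustrative, not the productions’ actual data volumes:

```python
# Rough upload-time estimate: hours needed to push a day's rushes at a given uplink speed.
def upload_hours(data_gb: float, uplink_mbps: float) -> float:
    # GB -> megabits -> seconds -> hours
    return (data_gb * 8 * 1000) / uplink_mbps / 3600

# e.g. 2 TB of raw camera files over a 20 Mbps home uplink vs a 1 Gbps facility link
print(round(upload_hours(2000, 20), 1))    # ~222.2 hours: hopeless overnight
print(round(upload_hours(2000, 1000), 1))  # ~4.4 hours: workable
```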

“As 5G improves it will enable a much higher quality production in the cloud,” noted AWS Global M&E Partner manager Jack Wenzinger.

Camera to cloud

A live stream of the productions on set was also recorded. This was achieved by taking a feed from Teradek wireless units on a camera into the Wi-Fi network. Qtake and Moxion video-assist software were used to manage the signal to destination. In Mexico, the live TX was managed by 5th Kind, and in Dubai the camera feed was routed directly over LTE and 5G networks managed by Samara.

In a separate demo, raw camera images were also taken straight to the cloud using Frame.io’s Camera to Cloud (C2C) application. The potential here is to link location capture of background plates to realtime playback on an LED volume.

“A director could request another angle, or a different camera move and these could be fed live as background plates to the virtual production stage,” says Zell.

Eliminate conform

Hosting everything in the cloud can also eliminate the traditional conform stage of relinking high-res media back to editorial assemblies. This was made easier by doing editorial and DI in one platform.

“We had original 6K camera sources and a few scenes with three different tracks which we had to cut all together,” explains Aleksandras Brokas, a Lithuania-based editor who worked on the London project Kintsugi. “When building the workflow we decided not to make proxies. We basically eliminated conform by going straight from editorial to DI within Resolve. It was game changing and saved us a lot of time.” Resolve’s audio tool, Fairlight, was also used by the audio mixing team in Taiwan for the same project.

On the audio front, Steve Morris, Director of Engineering at Skywalker Sound, said, “Ideally every sound mixer is at least in a controlled room to get the basics right to start. Trying to interact and review on Zoom meetings and streaming content with time zone differences on this was quite challenging.”

He added, “You can take a multi-channel mix and encode it binaurally and get some spatial aspect of the mix for listening back on headphones. If the headphones are high quality you can add the signature of the [soundstage] itself. With Covid everyone has to work with whatever kit is in their house.”

Skywalker Sound mixer Roy Waldsburger found he was able to suggest ways of improving the audio while it was being captured on set, rather than having to fix it in post months later.

“I could watch the live stream of the location shoot in Brisbane and felt I wanted them to see if they could capture some background chatter by miccing up a pair of the other actors. I learnt that while it’s very useful to communicate with the location recording team there are limits to what I should be saying.”

In other words, there are protocols to be worked through about how much ‘interference’ is helpful from crew not physically present on set.

Cloud costs

The Supersession was a demonstration of what is possible, but one area deliberately not touched on was cost. Zell and the HPA were able to draw on the resources and expertise of vendors and cloud providers like AWS at no cost. Being involved in an event with such high visibility among Hollywood CTOs is a big draw. Zell says another event is needed in order to discuss the various cost implications of ingress and egress in and out of the cloud, and of cloud archive.

“These are absolutely key questions but I wanted to take everything to its limit and not to worry about cost,” he says. “What it has done is thrown up other questions, such as: if the data gets lost, whose fault is it? If the company which hosts your data goes bankrupt, what happens to your content? We will need to do another major investigation of this in time.”

He added, “I also learned a lot about certain people I will never want to work with again and certain technologies I never want to work with again. I won’t be making this list public.”

 


 

Will Motion Grading become a new cinematographic tool?

RedShark News

When Tom Cruise urged us all to switch off motion smoothing he knew it was mission impossible. Even if people do make the effort to retune the default setting on their TVs, the outcome is often worse.

https://www.redsharknews.com/will-motion-grading-become-a-new-cinematographic-tool

Motion smoothing certainly helps remove picture imperfections like strobing, blur and judder, but it leaves the image looking less filmic. The disparaging take is that it makes cinematic-looking shows look like a soap opera. That’s because TVs use video interpolation to artificially boost the frame rate and smooth the viewing experience of fast-action sports and gaming, but leaving it on can “make most movies look like they were shot on high-speed video rather than film,” according to Cruise.

The situation appears to be getting worse. Line up a dozen different TVs and you’ll see a dozen different looks, none of which match what the DP saw in the grading room. HDR only amplifies this issue.

“Judder increases by a factor of three or more and shutter speeds seem faster and more choppy, and there’s nothing a TV can do about it,” says Richard Miller, EVP Technology at Pixelworks. “Motion smoothing can take off the worst of it but it brings in these other artefacts, and Tom Cruise will come looking for you.”

But there is a tech solution riding to the rescue.

Motion control

The appearance of motion is affected by factors like camera and subject motion within a shot, contrast range and frame rate.

“When you watch streaming TV the image you see does not match the approved reference in the grading room,” explains Miller. “The issue is exacerbated by brighter screens, larger screens and higher dynamic range and increasing use of more aggressive camera work.”

Examples of the issue include:

The Queen's Gambit - Episode 1, approx 3 mins and 5 secs in showing a pan with a bridge.

Stranger Things - S3, Ep2 - about 25 mins showing a pan inside a mall. 

Unbelievable - Ep1, the opening sequence of a pan across a street.

The Stranger Things example is most noticeable where the bright lights and colours seem to blur and the lines of the mall jag. You want to tear your eyes away.

“The dilemma is that most users have motion smoothing turned on,” Miller says. “This is the way the TV makers determine is most comfortable for users. Filmmakers hate it. It causes the fake soap opera effect, with artefacts like halos behind heads. Sometimes it janks back and forth. You wouldn’t know that unless you were an expert but over the course of a show it does change the feel of the storytelling. Spielberg was looking for that harsh, hard look in Saving Private Ryan to make you think you are on that beach on D-Day, not something so smooth it destroys any suspension of disbelief.”

Miller explains that judder becomes much more objectionable when the overall brightness is a lot higher than usual. This is a human perceptual effect that’s becoming a problem as more and more filmmakers look to make use of high dynamic range and more devices are capable of displaying it.

“If too much contrast is put into the grade, judder and strobing can become a real problem,” Miller says. “While HDR is supposed to go up to 10,000 nits, most of what we are seeing is actually around the same as SDR. It does have wider dynamic range so you get a bit more in the dark areas but colorists are having to compromise their HDR grades in order to avoid judder artifacts.”

He says, “If you want to use HDR you have to deal with motion. HDR without motion grading is compromised.”

Pixelworks, a Silicon Valley-based imaging specialist, claims to have developed the only solution to the problem. The motion grading capability is layered onto the company’s TrueCut content creation tool and delivery platform. It has been out for a while and now the company is making a new push, hinting that streaming services are getting ready to back the technology.

“TrueCut allows the filmmaker to control the judder, shutter speed or frame rate for an entire show or shot by shot,” Miller explains.

A key element is a Motion Appearance Model. Just as algorithms (such as TrueCut’s Color Appearance Model) are used in the grade to maintain the perception of a colour hue in any lighting environment, the MAM does the same for movement. When combined with a Color Appearance Model, a new Motion Picture Appearance Model is created. This is the first time such a model has been developed.

How it works

The software works during capture, into post and through to distribution.

On-set, much like a Colour Decision List is created today, cinematographers can use the TrueCut tools to preview and create an initial motion look which is captured as a Motion Decision List. TrueCut Motion can be used to deliver this motion look for dailies as well.

It provides a ‘virtual frame rate’, enabling higher frame rates to be used where appropriate in production, while retaining a 24fps filmic look if desired.

“Even though HFR scares some people, we are focused on keeping a 24 fps look to the final show if that is the director’s intent,” says Miller.

Members of the American Society of Cinematographers were also consulted for their input and Pixelworks says feedback is positive.

In post-production, TrueCut Motion is generally used after colour grading. It’s available as a plug-in to DaVinci Resolve. For example, the colorist/DP could adjust Judder on a scale from 0 to 360, where 0 is the judder typically seen in 24 fps footage and 360 is no perceptible judder. Motion Blur is calibrated on a scale from 0 to 720, where 0 is no motion blur added to the original, and 720 is 720 degrees of effective shutter added.

For home streaming, Pixelworks has developed TrueCut as a high frame-rate streaming format to play back across TVs and mobile devices. It is compatible with HDR formats like Dolby Vision and HDR10.

Pixelworks says the format is supported by streaming services and suggests that announcements will be made later this year. You’d expect that to include at least one of Netflix, Amazon and Disney+.

It explains that, when selecting a title, the TV or STB streaming app will play the TrueCut motion graded version if such a version of the title exists and the device is certified compatible. This is similar to the approach used already to determine whether to play the SDR, HDR or other formatted version of the title.

TrueCut is backwards compatible with “tens of millions” of TVs in homes today, including many if not most LG displays and Samsung TVs sold since 2019.

“We expect most new models [of any TV maker] coming to market from 2022 will support the format,” says Miller.

In China, six features have been mastered using TrueCut Motion including Pegasus and The Bravest. Sony has also remastered Men In Black: International as a showcase screening for its theatrical LED screens.

With a few of the top streamers on board the workflow is likely to take off. Time will tell.

 


Friday 26 March 2021

Id3as: Streaming Tech Designed for When Things Go Wrong

Streaming Media

Id3as directors Steve Strong and Adrian Roe have been in business together for 30 years. Roe began his career in retail and moved into fintech. He admits that they knew nothing about the media space until a decade ago when they joined forces with video tech specialist (and Streaming Media contributing editor) Dom Robinson.

https://www.streamingmedia.com/Articles/News/Online-Video-News/Id3as-Streaming-Tech-Designed-for-When-Things-Go-Wrong-145960.aspx

"What became obvious when we looked at media clients embarking on streaming live events was that when things went wrong there was very little you could do about it," Roe says. "You were lucky if you had a log entry for a fall over.

"Since we came from dev background, we thought that we could make that pain go away and Id3as came into being as a result."

The UK-based outfit set itself the task of delivering streaming solutions which are more reliable than any one link in the chain. The aim was to drive out costs associated with downtime, overcapacity, and undercapacity.  

When Roe helped build online banks for companies like Northern Rock he did something similar by using back-to-back service level agreements (SLAs). "That isn't a viable way of doing it now," he says. "The approach needs to be that you construct a system so that if any single element of your system goes wrong nobody notices.

"There have been attempts to do this with generic containers and Kubernetes, but they're not really solutions on their own. Our technology can self-orchestrate or coexist within Kubernetes and Docker environments, and runs on nearly any chipset, OS, and platform. We use Erlang, a programming language developed 25 years ago to deliver Carrier Class telecommunications."

True high availability, he continues, is about accepting things will go wrong and making sure that when they do, it doesn't impact customers downstream.

"We're not interested in how well a solution is doing in lab conditions. We're much more interested when the s--- hits the fan."

The company developed a virtualised video pipeline that can be tailored and installed to deliver premium streaming models for broadcasters and operators.

Advanced Video Pipeline (AVP) is a modular architecture that spans ingest through transcoding and enrichment to packaging and CDN. All functionality and data capture is exposed through published OpenAPIs. As such, AVP can be integrated into existing control and monitoring systems.

"We are not seeking to be the next Wowza or Unified or ATEME," Roe says. "We understand our niche and want to partner with a comparatively small number of high value enterprise customers."

For Nasdaq's digital media services—owned by Intrado—Id3as enables 75,000 live events a year. These are mostly financial fair disclosure events and a "perfect example of how we can help scale business and manage cost by looking carefully at where in practice things can go wrong."

In this case, Nasdaq was finding that phone calls were dropping out regularly during teleconference events. Roe says introducing AVP immediately took the impairment rate down from 8 to less than one, and that loss is now negligible.

For DAZN, Id3as provides both mezzanine encode and ABR ladder creation for a number of services, particularly rich audio manipulation. One of these is a remote commentary solution for DAZN's live event production which began half a dozen years ago and was Id3as' first WebRTC project.

"While we manage the super high-quality encode of the livestream, we could also do a decent quality low latency feed pushed to a commentator based anywhere else on the internet. This decreased the costs for DAZN of having to send multiple commentators to the event and increased the number of languages they could produce in."

Another example of Id3as' "real-world practical dirty stuff," rather than clinical lab-tested performance, is its work to deliver Arqiva's Hybrid TV capability. "They had purchased a company that kick-started their capability in the area, with technology based on off-the-shelf components," Roe says. "The problem they faced was that many of the TV manufacturers had paid at best scant regard to international standards—the solution only worked on 16% of target devices.

"Six weeks into an engagement with us, we had a solution running on 96% of target devices. Lots of devices didn't follow the standards, so we provided compatible (but not standard) streams for them."

It has also worked with Limelight to replace its real-time streaming platform with one based on WebRTC. Limelight Realtime Streaming delivers reliable, broadcast-quality, real-time video streaming using the UDP data transfer protocol and is integrated with Limelight's global CDN.

Id3as, which still has only 8 full time employees, is intent on enabling CDNs like Limelight to hit their SLAs.

"You can kind of ignore the 99% and below SLA," says Strong. "Frankly, you've got so much time to respond to stuff that you can just have a single box set in your cupboard doing the job. At 99.9%, your time is more limited. If you're very organized, you can probably still get away with a fairly simplistic approach to delivering that sort of number.

"Once you get up to four nines, it starts getting out of the realms of human control. You've got to have multiple systems live the whole time. You might get away with something like an n+1 model where you've got a bunch of live systems and a couple of hot spares that are sat there running. But you're certainly in a world where you're going to have to have some form of distributed system."

With SLAs of 99.999 or more, "the reality is you've got 0.86 of a second to respond," he says. "And that's not just to respond; that's to detect, fix, and have the service back up and running. That's very little time to do anything, particularly on wide area networks. You've got ping times measured in hundreds of milliseconds. You've only got 860 milliseconds to do it. You've got to have multiple systems live delivering the service the whole time."
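Strong's figures follow directly from the availability percentages. A quick calculation of permitted downtime per day and per year shows why five nines pushes recovery out of human hands; this is just the arithmetic, not Id3as' tooling:

```python
# Allowed downtime for a given SLA, per day and per year.
def downtime_seconds(sla_percent: float):
    frac = 1 - sla_percent / 100.0
    return frac * 86_400, frac * 86_400 * 365  # seconds per day, seconds per year

for sla in (99.0, 99.9, 99.99, 99.999):
    per_day, per_year = downtime_seconds(sla)
    print(f"{sla}%: {per_day:.2f} s/day, {per_year / 60:.1f} min/year")
# 99.999% works out at roughly 0.86 seconds a day -- the figure Strong quotes.
```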

Wednesday 24 March 2021

Production for the Planet: Extreme E’s Big, Fast Experiment

NAB Amplify 

Extreme E is an ambitious attempt to marry live sports with urgent environmental messaging with as lean a production footprint as it is possible to achieve. NAB Amplify has the full story.

https://amplify.nabshow.com/articles/production-for-the-planet-extreme-es-experiments/

Most sports would give anything to bring fans back into the arena, but not Extreme E. The new all-electric rally-style racing series that launches next month was designed as a purely televised product, broadcast live without spectators. That could have been the project’s Achilles’ heel when it was conceived two years ago, but in these strange times it’s a remarkable piece of serendipity.

“We desire to be the biggest off-road sport on the planet and one of the world’s biggest motorsports,” says chief marketing officer Ali Russell.

Extreme E is an ambitious attempt to marry environmental education with professional sport. Its founder, the Spanish businessman Alejandro Agag, successfully launched electric single-seater championship Formula E in 2014 with the backing of Formula One’s governing body, the FIA. Extreme E, which also has FIA status, takes this to another level by using electric SUVs to race in five areas of the globe most affected by climate change.

“Climate change is the biggest issue facing all of us but studies show that it is not cutting through to the mainstream outside of news,” Russell counters. “Sport is an incredibly popular passion point and people watch appointment-to-view events together, so the opportunity to use sport to communicate vital information is too big to pass up.”

With 30 percent of the planet’s CO2 emissions coming from transportation, Extreme E exists to showcase the performance of electric vehicles and to accelerate their adoption.

“We ultimately want to make sustainability sexy and to be as innovative a platform for renewables as F1 has been for the combustion engine. It’s about making electric cars cool and aspirational by showing we can drive in a variety of epic locations already damaged by climate change.”

After the Saudi Arabian desert, the five-event calendar then visits a beach location in Senegal, a glacial valley in Greenland, the Brazilian Amazon, and ends in December in Tierra del Fuego.

“We’re not on the ice cap but on the area where the ice has melted. We’re not in the middle of the Amazon but in an area that has been deforested. And when we leave, we replant a larger area with trees.”

He says, “We want to highlight e-mobility and the electric vehicle. The laziest thing would be to do an electric version of World Rally or Dakar. We needed a concept that would be easy to consume and understand, that would be high impact and high adrenalin to attract younger audiences.”

Inspiration was taken from cricket’s Twenty20, a heavily abbreviated version of the traditional five-day game.

“We need a media product which is futuristic,” says Russell. “Our use of heavy data is like the film Tron. It’s more like an eSport than a traditional motorsport.”

All of this is particularly challenging for a live broadcast given the locations are remote and infrastructure-free. The plan is to have as little footprint as possible.

“Extreme E is very progressive. It links the environment with gender equality, with green technology, and it’s also biosecure at a time when sport is going through severe challenges. I think it’s captured people’s imaginations.”

The Race and Broadcast Format

Nine teams, including one owned by seven-time F1 champion Lewis Hamilton, compete on a 10km course over the race weekend. In a break from tradition, each team must field a male and a female driver, sharing the same car and swapping driving duties during the race.

“Every business knows you have better decision-making when you have gender equality,” says Russell. “We’ve followed mixed doubles tennis in having male and female drivers competing, winning or losing together. It’s their teamwork that will be one of the most fascinating aspects.”

All3Media group stablemates Aurora Media Worldwide and North One will produce live coverage across the weekend (3 x 90-minute blocks including qualifiers and the final 2-hour race) plus highlights shows, a 30-min race preview, 300 VOD films for digital, and a 20-part behind-the-scenes doc series, Electric Odyssey.

In another change, the conventional paddock lane of team garages has been compressed into one area. Aurora hopes that this “Command Center” will conjure a sense of theatre as teams jockey together over monitors.

The E-SUVs themselves use 400kW (550hp) batteries and are built from the ground up by Spark Racing Technology. Each race team is limited to eight people on site, including the drivers, one engineer and five mechanics.

More than 70 broadcasters have bought rights. Discovery will take Extreme E to more than 50 markets on Eurosport. Sky Sports will air all five races live in the UK. Fox Sports has carriage in the US (along with Discovery’s streamer MotorTrend). Other rightsholders include free-to-air ProSieben Maxx in Germany, Austria and Switzerland, RDS in Canada, Disney ESPN in Latin America, and TV Globo in Brazil. Chinese distribution is via Channel Zero and sports content app Douyin.

“A lot of PSBs have seized the opportunity to talk about climate change in a different way,” says Russell.

Designing Remote Production on a Global Scale

Unlike a NASCAR or F1 circuit, Extreme E race tracks cover wild terrain. Even if cabling such a massive area weren’t an issue then putting any kit or people in harm’s way was a no-go.

“Imagine a Red Bull air race on the ground,” says Donald Begg, NEP’s director of technology for major events. “Racers have to go through certain gates but how you go through them is kind of up to you.”

NEP had to come up with a wireless solution. This is based on four nodes with receiving aerials (2GHz and 7GHz) which take in the nearest camera feeds, encapsulate them in IP and send them back to the TV compound over millimeter-wave links (at 75GHz).

“The ideal topology would be to have two nodes on either side of the TV compound cabled to the compound – but in practice, due to the terrain, this doesn’t look possible in every case. Instead, we’ll look to have at least one node cabled with the other three connected wirelessly.”

Each node has two paths (clockwise and anticlockwise) back to the compound for redundancy. “Interference is most likely to occur due to wind,” Begg says. “The wave links have a fairly tight beam and there’s a greater risk of movement when putting the nodes on sand than on the side of a building.”

Mobile RF System

The TV compound at each site needed to be self-sufficient. Everything is being backhauled to the London production gallery, with track-side equipment kept to a minimum.

NEP built a flypack designed to survive extreme temperatures and conditions. Included are Marshall Electronics’ all-weather lipstick POV cameras. Four of these are mounted on each car.

Three drones, supplied by Aerios Solutions along with operators, carry Sony Alpha cameras. The drones can fly at 90km an hour and into 35 knots of wind. Two will be used to track the cars, while a third will be tethered and provide an overview of the entire track.

RF links are enabled by Mini Tx UHD, designed by NEP company BSI. This tiny encoder, which measures just 85mm x 56 mm x 28mm can transmit in two different frequency bands (2GHz and 7GHz) via its software-defined radio, offering complete on-site spectrum flexibility without the need for changeable radio modules.

These cameras are supplemented by Sony shoulder-mounted PXW-Z750 camcorders, which are 4K-ready should a requirement for UHD be made later. These cameras also offer the ability to record at 100p frame rates for playback in post.

Similarly, a pair of Sony F55s was selected because they can record at a higher speed in camera while still allowing the production to capture a live 50p signal.

“There are four constituent elements to the production and four locations involved,” says Lawrence Duffy, managing director at Aurora Media. “All the camera sources plus the original race mix are sent to London. Car telemetry is managed by Barcelona-based Al Kamel Systems, the AR and VR by NEP in Hilversum (in the Netherlands). This layering is why you need very experienced people on-site and in London to create the output.”

Westbury Gillett, a producer-director of Formula E, will mix the feed on-site using a Grass Valley Kula while the director and EP Mike Scott create a master show in London. Commentary is also remoted to London.

“Honestly, producing the city center racing of Formula-E is a lot more complicated than being in the middle of nowhere,” Russell says.

Backhauling to London

Two satellite uplinks at the OB provide “resilience and grunt” with up to 30 signals split between the links and sent simultaneously. The signals are downlinked in the UK (at Salford near Manchester, or at Milton Keynes closer to London) and sent over NEP’s network to its central London production hub at Grays Inn Road.

For the sporting graphics, Al Kamel sends a data stream of car telemetry over NEP links into a dedicated server at Grays Inn Road. This data is added to the raw drone feeds which are then bounced to Hilversum for the addition of augmented reality graphics, and routed back again to be mixed into production.

That trip from site to satellite to London adds half a frame of delay. The round trip to Hilversum adds an extra 100ms.

Begg says, “Everything arrives at Grays Inn Road in sync but the additional five frames can be noticeable. We’ll add an element of delay in order to balance that out but we’re looking to technologies like Starlink to bring latency down.”
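Balancing the paths is a matter of padding every feed to the latency of the slowest one. The sketch below uses the article’s figures (a 100ms Hilversum round trip, 50 frames per second) purely to illustrate the arithmetic; it is not NEP’s actual delay management:

```python
# Illustrative delay balancing: pad each feed so all arrive in sync with the slowest path.
FPS = 50.0
FRAME_MS = 1000.0 / FPS  # 20 ms per frame at 50fps

# Hypothetical additional one-way latencies, in milliseconds, for each processing path
paths_ms = {
    "direct_camera_feeds": 0.0,
    "drone_plus_ar_via_hilversum": 100.0,  # round trip to Hilversum for AR graphics
}

slowest = max(paths_ms.values())
for name, latency in paths_ms.items():
    pad_frames = (slowest - latency) / FRAME_MS
    print(f"{name}: add {pad_frames:.0f} frames of delay")
# The Hilversum leg alone accounts for 100 ms, i.e. five frames at 50fps.
```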

Starlink, SpaceX’s series of LEO satellites which are entering customer tests, could also provide additional capacity. Believe it or not, satellite bandwidth is one of the limiting factors of this production and a reason it is produced in HD 1080p 50.

“We had to design the system to operate anywhere in the world from Saudi Arabia to the top of a mountain in Nepal,” says Duffy. “The bottleneck is the satellite system. We trade off the best possible quality versus available bandwidth.”

Hybrid Storytelling

Al Kamel harvests car data including position and speed, as well as data about the climate and terrain at each location. This includes latitude and longitude, the rate of glacial ice loss, the rise in sea level or square meters of deforestation, information that will be fed into the race narrative.

Duffy calls it hybrid storytelling.

“Motorsport fans are used to commentary about how track temperature, humidity or rainfall affects the race but we’re also using the planet data to fuse the story of these challenging environments with the story of the racing. They provide a wider story of these locations and the damage being done to them by humans.”

Environmental initiatives, such as a sea turtle sanctuary on the Red Sea coast, will be highlighted.

“There’s been a step-change globally where the scientific community has been joined by business and media to lead on climate change. It’s a commercial reality that science can’t do it on its own. Change of this scale needs politicians, organizations, and sport to make a difference.”

Graphics, AR and Unreal Engine

The AR will be overlaid on aerials of the track so fans can see where the cars are at any one time. For the VR, Aurora and Extreme E are creating a virtual world, mapping the track for each location based on a drone survey. They are also integrating car telemetry produced by Al Kamel to create a live 3D model of each car, including elements such as how the car is moving around on its chassis.

“We want to explain these diverse and unique landscapes in terms of angles of elevation and descent, and to do that I felt it was best represented in a 3D world,” Duffy says. “I also wanted to do that live, not in a post-produced environment.”

One animated view will be from the driver’s eye showing the gradient of the terrain and the contours of the rock, ice or sand. To create this, a detailed aerial survey of each location will produce a topographical overview accurate to 2 cm. This data is translated into a point cloud and combined with stills photographs of the terrain for texture, then rendered in Unreal Engine. During the race, GPS data from cars will feed an AI image tracking algorithm to produce the real-time 3D model.

“The algorithm knows where the track cameras are in the environment and overlays the virtual objects,” explains David O’Carroll, Aurora’s operations director. “The AR and VR are sophisticated storytelling techniques to get this high-impact race over to the audience in realtime.”

Aurora has not used Unreal Engine in live sport before. “It gives us an opportunity to enhance the graphical output on top of the more traditional Vizrt systems.”

Multiskilled Crew

The production comprises just seven permanent staff and an occasional team of 24 which is far fewer than any comparable live sports broadcast.

“Everybody was tasked with finding individuals who could double up,” Duffy says. “Race teams hired engineers who could also run data. We wanted camera operators who can rig and ENG crew who can edit (preditors).”

Once an operator has set up the broadcast graphics gear they will then operate the race replay system for official adjudication. Camera ops will help rig POV cameras as well as their own camera.

“It’s difficult to predict how this will all gel but I hope to find that someone going in as a camera guarantee will come out after a few races as an audio engineer,” says Begg. “That’s a great boost to their skills.”

A reduced crew is only possible by using a more fixed-rig approach to coverage. “Motorsport can eat a lot of operators,” says Duffy. “Luckily the Extreme E car lends itself to onboards.”

Onward Distribution

After transmission, the live programming and all the rest of the content, including VTs, highlights and digital, is uploaded to a cloud-based Digital Media Hub (DMH) for the rights holders to search, view and use.

“The DMH provides a dual purpose: to make content easily available to rights holders; and provide a rich suite of assets that rights holders can use to enhance their own content,” explains O’Carroll.

The DMH itself comprises a cloud-native storage and content distribution platform developed and managed by Base Media Cloud and Veritone.

Red Bee Media provides satellite distribution services, picking up live signals from the production gallery and delivering them to broadcasters.

In addition, Red Bee is providing services to transcode and live stream race content to Extreme E’s digital platforms. The production will further utilize Red Bee’s OTT platform for global live streaming on the championship’s website and other digital assets.

Sustainable Production

So how does its own sustainability add up?

A massive 75% of carbon costs are reckoned to be cut by using a ship rather than airplanes to get from A to B. RMS St. Helena is transporting the championship’s freight and infrastructure, including vehicles, to the nearest port to each of its five-race locations. It is both a floating paddock and a base for scientific research.

The ship’s engines and generators have been converted to run on low sulfur marine diesel – known as “champagne”– rather than heavy diesel.

Extreme E is committed to having a net-zero carbon footprint by the end of its first season; however, current projections for the championship estimate that its footprint will equal 20,000 tons of carbon.

“We can’t eradicate [carbon] totally so we are climate offsetting,” Russell says. “We have a legacy scheme which more than covers our carbon footprint and we will incentivise the governments we work with to invest in these projects.”

This includes planting 1 million mangroves on the coast of Africa.

Carbon offsetting is being arranged through ALLCOT certificated programs. A Lifetime Cycle Assessment, which calculates the overall series impact, is being audited by EY.