Monday, 30 November 2020

Is this the greatest realtime single-take feature ever?

No Film School

How director-writer-camera operator Gavin Michael Booth made two simultaneous feature-length single-shot takes – and why

https://nofilmschool.com/realtime-single-take-feature

Shot in two 80-minute single takes, filmed simultaneously in two different parts of a city, and presented in split screen, Last Call is perhaps groundbreaking in how far it pushes realtime filmmaking.

The film was directed and co-written on a micro-budget by Canadian filmmaker Gavin Michael Booth, and the technical feat is all the more remarkable for sustaining a genuinely potent narrative about shared humanity.

Last Call follows a suicidal alcoholic (played by the film’s co-writer Daved Wilkins) on the anniversary of his son’s death. When he attempts to call a crisis hotline, a misdial connects him with Beth, a single mother working as the night janitor (Sarah Booth) at a local community college. The split screen feature showcases both characters in real-time as they navigate a life-changing conversation. 

“Anyone can do single takes as a gimmick,” says Booth, who previously made the short film Fifteen for Blumhouse, which was broadcast live on Periscope. “It’s got to serve the story.”

Booth met Wilkins through a filmmakers coffee morning group in LA where membership was only allowed if no-one complained about the business. “You had to be positive,” smiles Booth. “Daved came up to me and said he had a friend who had just completed training as a crisis worker and would I be interested in a project about a guy calling a crisis hotline in realtime.

“That’s the outline of the film but it changed significantly, because most crisis hotline calls are done in 20 minutes – they either get the caller to agree to be safe or they send someone out for a wellness check to take care of them. In our film, Daved’s character misdials and instead interacts with a random stranger. This instantly puts the audience in the shoes of the person taking Daved’s call, at the same time as ratcheting up tension, because you also have this viewpoint – which she doesn’t share – of the caller’s actions.”

British director Mike Figgis pioneered realtime split-screen cinema at the advent of digital photography with Timecode in 2000. While that film was a largely improvised project, Last Call has a tightly scripted narrative. It’s a two-hander with each actor’s storyline taking place at the same time, but in separate locations, as they talk on the phone.

It would perhaps have been easier to shoot one single take and then record the other, but this would have been unfair to the actor going second. “They would have to perfectly memorise the timing of each line in order to synchronise with the other take. It just felt much more organic even though it was difficult to get both sides at the same time.”

On location 

They scripted over the course of a year, basing the locations in Booth’s home city of Windsor, Ontario.

“Since we knew the locations as we were writing, we were able to custom-write and prep scenes knowing how everything was laid out. That was a huge bonus; otherwise we’d have had to rewrite the script to match the practical locations. Two weeks before we arrived in Windsor to film, Daved was in LA and Sarah was in Montreal on projects, so we did rehearsals over the phone, which worked out perfectly for them as actors.”

Once on the ground in Windsor the budget only allowed for 10 days of rehearsal and four shooting days. “We were either going to get it or not,” he says.  

“We filmed every rehearsal and watched it back to see if a particular section was getting boring, and if so tried something visually to spice it up. I was like an NFL coach able to watch the game plays back to perfect the technical aspects of the performance.”

Not content with shooting both takes simultaneously in real time, they shot in locations several blocks away from each other. The crew for each was a camera operator and a sound operator. Cinematographer Seth Wessel-Estes was in charge of Daved’s storyline; Booth took charge of the other storyline, featuring his wife.

“We didn’t have radio or comms with each other. It was like a stealth mission. As soon as we yelled action then, unless a runner from one side of the city ran over to us and said the camera op had slipped down the stairs, we just rolled. If the actors fumbled a line or the camera shook a little bit we accepted that as part of the process of doing realtime. It’s all about getting to the end. We had eight tries over four nights and managed to get five complete takes of the movie.”

They shot using a pair of RED Helium cameras in 8K. 

“Since nobody had attempted to shoot this long a take in 8K before, we were able to develop a low-level partnership with RED to test it out,” Booth explains. “The RED has a fan that kicks in once every two minutes or so and can ruin a take by interfering with the audio, so we were also testing how low you could run the fan without compromising the sensor it has to cool.”

To minimise environmental heat, they cranked up the air conditioning in both locations and turned it off just before filming. They tried wrapping the cameras in ice packs but ended up with a watery mess.

“We did find a compromise where we could run the fan at a constant speed, just very low, that wouldn’t affect the audio,” Booth says. “The only thing it prevented us from doing was getting an extreme close-up on an actor, but that was okay, because the closer you get to an actor the harder your focus challenge becomes, and in a realtime movie you don’t want that to be the thing that messes it up.”

8K cropping and reframing 

Shooting in 8K gave Booth the latitude to crop and reframe later. Even though there is no editing in the film, there are keyframes along the timeline that constantly shift both images. The film also alternates between horizontal and vertical split-screen perspectives.

“When we rotate between the perspectives, both images are slowly cropping and moving in, in order not to show any edge of the frame,” Booth says. “Honestly, 8K is overkill but we are future-proofed. We have an 8K master, so when 8K TV or streaming comes in we are ready.”
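As a rough illustration of the headroom Booth describes (a sketch in Python, not the production pipeline – the frame sizes are standard UHD dimensions, the functions are hypothetical): an 8K source delivering a 4K frame leaves a 2x zoom-and-drift margin before the frame edge would ever show.

```python
# Rough sketch (not the actual pipeline): pan-and-zoom headroom an 8K
# source leaves when delivering a smaller output frame.

SRC_W, SRC_H = 7680, 4320   # 8K UHD source frame
OUT_W, OUT_H = 3840, 2160   # 4K UHD delivery frame

def max_zoom(src_w, src_h, out_w, out_h):
    """Largest zoom factor before the crop window would exceed the source."""
    return min(src_w / out_w, src_h / out_h)

def window_in_bounds(x, y, w, h, src_w=SRC_W, src_h=SRC_H):
    """True if a crop window (top-left x, y; size w x h) never shows the frame edge."""
    return x >= 0 and y >= 0 and x + w <= src_w and y + h <= src_h

print(max_zoom(SRC_W, SRC_H, OUT_W, OUT_H))        # 2.0: a 4K window can crop in 2x
print(window_in_bounds(3840, 2160, OUT_W, OUT_H))  # True: bottom-right quadrant is safe
```

Any keyframed reframe just has to keep every interpolated window passing that bounds check, which is why the images can drift continuously without revealing an edge.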

Naturally, they required just a single identical lens for each camera. Approaching rental houses with that request wasn’t popular, as it would mean them being unable to rent out the rest of the lens kit. Luckily, online shop lensrental.com came to their rescue with a pair of Canon 24mm primes.

“The lens was great in low light; the iris opened up very well,” says Booth. “We used the tilt handle system for focus pulling. The focus wheel on the rig was wirelessly connected to the lens, which was awesome because it eliminated needing another crew person who might risk casting a shadow, or having radio frequency problems if they’re trying to pull focus from a distance.

“There are times, particularly on Daved’s side of the story when he leaves the bar and goes out onto the street and there is a massive exposure difference. So, in real time we’re very, very carefully closing and opening the iris between these scenes. We did adjust this further in Resolve.” 

“Our biggest fear was that any slip on that focus wheel would break the whole take – and nobody wanted to be at that 70-minute mark and be the one to do that!”

Booth pulled in some favours to get access to key locations, including an education college whose students were willing to help out.

“The college was incredibly generous to hand us the keys at 6pm each day.  We shot over 14 days and had our base there. That’s a huge ask if you don’t have money to pay them.” 

Filming on the fly

For Daved’s side of the story they used Booth’s favorite local watering hole, owned by a friend who plays the bartender in the movie. They also needed a high-rise tower within walking distance, “because if you’re going to walk somewhere 40 minutes away it’s going to be a very long, boring movie!”

In the college they dimmed the lighting by removing a number of overhead fluorescent tubes and hiding small magnetic LEDs. The bar’s lighting is barely changed but in the small apartment they were challenged to hide fixtures, given that the camera has a 360-degree viewpoint. 

They had no permits, just run and gun, which meant they were always at risk of unintended extras breaking the frame.  

“We started shooting at 2am, which is when the bars close, so we tended to frame away from the strip and look into the street. We’re dodging real people, real traffic, and anything can happen. I like the energy and the franticness of not being able to cut and trying to get everything together.”

They planned alternative routes out of the bar via the kitchen and into the alley out back which had less chance of running into people. 

In the take used in the movie, a van pulls by and someone leans their whole head out of the window and shouts something derogatory, just as Daved is walking into an apartment. “Luckily we used the 8K to crop that out,” Booth says.

“If we’d had the money, we would have had fancy wireless systems so I’d be able to monitor everything, but Seth and I were monitoring on our own. It’s a bit like theatre. Once the curtain’s up, the director can’t do anything.

“What is great about the Helium is that you get a 1080p proxy at the same time as the 8K, so I was able to take those two proxy takes, quickly throw them into Adobe Premiere, line them up, and we could watch a rough cut of the movie within half an hour of each take. But as a director, I was blind until we watched it. We’d do one take, have an extended lunch break during which we watched the whole movie, made notes and went again.”

No ADR required 

Recording audio is just as tricky, if not more so, when shooting long single takes. The biggest fear was getting a boom shadow in shot, so instead they used wireless lav mics.

“We figured out a way to put two mics on each actor, so if one came loose we had a backup. We played with what fabrics to use and how loose the clothing could be so we didn’t get rustling sound. Daved is ankle-strapped with transmitters and the cords run up his legs to his chest. We had Sarah wear janitor’s overalls, mostly to hide the two mic packs strapped to her back. We managed to get every single word clean.”

There is no ADR in the movie, though there is one moment where Beth screams and spikes the microphone, so they had to patch in one second of audio from a previous take to correct it. Props to the sound team of George Flores, Joey Lavoie and Fernando Henna.

“I didn’t want to do this movie and have to do 50 percent of it in ADR because why do it real time if you’re just going to replace it in post?” 

In keeping with the raw production values, composer Adrian Ellis recorded the music live to picture. 

Shot almost exactly two years ago, the film picked up 25 awards on the festival circuit, including the Founders Award at Napa Valley and Best Feature at Hamilton, eventually landing a theatrical release with Mutiny Pictures and a streaming distribution deal with Apple TV+, with more to follow.

“In indie film the slog is finding a distributor,” Booth says. “That side of the business is not the friendliest to filmmakers. There’s a lot of people who don’t have the interests of the filmmaker at heart. We bided our time to get the right deal.”

He adds, “I feel most alive as a filmmaker when things are on the fly. That’s been my whole upbringing since high school, which has been: let’s pick up a camera with your friends and go make something. This film retains a lot of that mentality.”

 

Wednesday, 25 November 2020

Leapfrog to the future today

Copy written for Blackbird

The media technology industry has casually used the phrase ‘next-generation’ for years without ever spelling out what it means. In the age of Covid the term has become meaningless. Next-generation is as redundant as the tools and services it is meant to classify.

https://www.blackbird.video/uncategorized/leapfrog-to-the-future-today/

Next-generation has come and gone. What the industry needs is the chance to rapidly catch up and even outpace the accelerant that the pandemic has forced on businesses worldwide.

It needs leapfrog tech. Let’s define what that is.

Cisco identified the beginning of the Zettabyte Era in 2016. Already, we are overshooting that. The IEEE predicts that by the end of the decade (i.e. this month!) we will be creating more than 50 trillion bytes of information per person, per year, and entering the era of the Yottabyte.

Even before we get there the limitations of existing media networks are apparent.

Video is the chief culprit. It is on course to drive global Internet video traffic up 33% a year through 2022, with live Internet video growing at an astonishing CAGR of 73%. Video streaming is forecast to constitute 79% of all mobile network traffic by 2022.

But these predictions made in 2018 are out of date. Not even Cisco forecast the pandemic.
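To put a figure like a 73% CAGR in perspective, the compounding arithmetic is simple. A quick, purely illustrative Python check (the growth function and starting value are assumptions, not from the Cisco forecast):

```python
# Illustrative arithmetic only: what a compound annual growth rate (CAGR)
# implies over a few years. Values are normalised, not real traffic figures.

def grow(start, cagr, years):
    """Value after `years` of compound growth at rate `cagr` (0.73 = 73%)."""
    return start * (1 + cagr) ** years

# Live video traffic at a 73% CAGR almost triples in two years...
print(grow(1.0, 0.73, 2))            # ~2.99x
# ...and is roughly 9x after four years.
print(round(grow(1.0, 0.73, 4), 1))  # 9.0x
```

That compounding is why even forecasts from 2018 can be overtaken so quickly once the growth rate itself shifts upward.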

Reset what you think you know

Coronavirus has accelerated digital behaviours. In its ‘Q3 2020 State of Streaming’ report, Conviva estimates that streaming video viewing time rose 57% globally, year-on-year. 

Media owners from Walt Disney on down are restructuring their entire output around direct to consumer services over IP.

Watching people play video games has emerged as one of the biggest drivers of online bandwidth. Overall viewing of Twitch has soared over 100% year on year.

With all this come demanding expectations from consumers for better-than-broadcast quality. That means no delay, no glitches, superb video and audio quality and – increasingly – the ability to interact with and personalize the content.

Nor does it stop there. 4K HDR is the new standard. In short order, content owners and network operators will look to monetize investment in 5G by offering realtime interactive experiences such as sports betting, a choice of multiple angles on a live event, 8K-originated VR and cloud gaming.

Producing exponentially more content and better content when having to work in remote distributed workflows is hard – if you don’t have leapfrog tech.

Cloud is a necessity

Cloud-based remote collaboration is no longer merely a ‘nice to have’ but a necessity. Video production teams have to pivot their workflows to take advantage of the cloud’s scalability, flexibility and mobility. Not tomorrow. Yesterday.

Cloud video editing platform Blackbird wasn’t conceived around a pandemic but it’s no accident it can work in one. It was conceived around the concepts of resilience and freedom – freedom from location, freedom from proprietary systems and hardware and the resilience to continue to operate with very little resource. 

And one more thing. The biggest impact on media industry survival in 2021 is not Covid-19. Even as we emerge from the pandemic, crowds return to live events and work resumes a degree of normality, the global, sectoral, company-specific and individual response to saving the planet should move front and centre.

Blackbird not only enables Covid-safe production to ride out the pandemic, it is permanently ‘green’, allowing carbon-neutral targets to be hit today.

That’s leapfrog tech.

Tuesday, 24 November 2020

Halo Makes ClearView Flex Integral to Client and Artist Remote Workflows

Copy written for Sohonet

Halo had a heads-up that something strange was happening at the turn of the year, when filming suddenly shut down on a nature documentary it was working on in China.

https://www.sohonet.com/2020/11/24/halo-makes-clearview-flex-integral-to-client-and-artist-remote-workflows/

“We knew it was serious since the producers were saying they didn’t know when they’d be able to film again,” says Halo Head of Operations James Cregeen. “We never guessed this was part of something that would impact the whole world.”

Halo provides world-class creative talent, award-winning post-production services and state-of-the-art facilities, from online editing to audio mixing, spanning all genres of TV and feature-film grading, in the heart of London. Its clients include all the UK’s major broadcasters as well as Disney+, NBCU, Warner Music and Netflix.

The facility was already using PCoIP technology internally and was aware of other tools that could help in the transition to remote workflows.

“ClearView Flex was a tool we’d been talking to Sohonet about for a while. Sohonet manages our firewall and VPN so when we started to roll out remote services we had this existing relationship and were familiar with their support team and setup.”

When lockdown was enforced, Halo had a number of offline projects ready for finishing. Cregeen says, “We were fortunate that when things ground to a halt, we had a lot of shows cut and locked and ready to go.”

With three central London buildings connected via Sohonet’s high capacity, low latency fibre Cregeen explains that it was fairly straightforward to relocate remotely. 

“We’d already extended desktop control from our main hub at Noel Street to our other facilities via PCoIP so taking that remotely to someone’s home over VPN was not a massive jump and was quite quick to spin up.

“The main challenge was how to get a reference monitor with trustworthy full-screen pictures and in-sync audio for artists working from home. That was where ClearView Flex came in.

“It’s something we had looked at repeatedly over the previous year and just not had the operational need to roll it out on a continuous basis. There was always the odd occasion where somebody was remote and couldn’t attend a session and it would have been useful in those circumstances. Covid was the real drive to bring it in.”

In conjunction with the facility’s remote desktop and PCoIP solutions, ClearView Flex has been used across various parts of Halo’s finishing process, including for Flame artists and online editing, and even for some audio reviews where ClearView Flex was the main stereo output.

“It has worked really well,” Cregeen says. “It has integrated really well into our video matrix. You just give it a position on the matrix and can route any SDI signal into it; you just need to send out links to the session and away you go.”

ClearView has proved particularly useful for final attended post review sessions and for signing off on credits sequences.

“The limitation on the colour side, which was originally an 8-bit stream, has been addressed,” Cregeen says. “The upgrade to a 10-bit service massively opens up the use of ClearView Flex in other parts of the business.”

He continues, “In the review process, it’s vital that everyone trusts that they are seeing the same great quality pictures at the same time. The alternative is to send out a QuickTime for everybody to watch offline and feedback notes. That’s nowhere near the same experience as having an over the shoulder remote session facilitated by ClearView.”

Halo’s online editors and Flame artists are able to work at home with Teradici as their PCoIP monitor and an Apple TV plugged into an HD screen streaming their output from ClearView.

“We’re effectively using it to create the full three monitor setup you’d have in a suite. The artist can dial in their respective clients whenever they get to the sign off stage.”

Another deciding factor for Halo was that ClearView Flex satisfied all security concerns. “Working remotely has brought everyone’s minds back to security and knowing the tools you use are certified makes a big difference,” he says.

As Covid shows no sign of relenting, the industry has to get used to remote as the new normal. Halo expects to continue in a hybrid set up for the foreseeable future where remote is part of every workflow in some form. 

“It’s a constantly changing picture,” he reports. “The lead time on being requested to spin up a remote solution to a particular part of the post process is less than 24 hours—and we need to react. Clients are rightly concerned whether they should or should not travel, whether their team should be working together in the same room or remotely. We’ve got to have all these tools to be able to offer whatever solution is needed.”

Over the last six months, Halo’s technical operations staff have launched a new portal to give clients more involvement in the data management of their assets. Tools like this and ClearView Flex have made certain projects possible. 

“If we’d not done these things key jobs just couldn’t have happened,” Cregeen says. “The hardest thing for everyone is having this mixed world in which everybody is remote or everybody is locally in the office. You have to gear up to support problems and capacity in equal measure, not knowing from one week to the next what the ratio will be of remote to local working. That’s tough from a planning perspective.”

Arguably, the changing nature of post has been hardest on production company clients. Without a de-facto standard of performing a remote edit, clients are moving from one facility to the next not knowing what the new remote working setup will be. 

“It is vastly different from going from one Avid house to another where for the most part the technology is the same. As a result, we’ve ramped up our call centre and operate it as a support desk for clients to help guide their project.

“In this regard, one of the benefits of ClearView Flex is just how simple it is. You get an email link and put in the PIN, or open the app on Apple TV and away you go. It’s all very straightforward for the end-user. At the moment when everybody is swapping from one place to the next with confusion about what software they are expected to use, the simplicity of ClearView is key.”

As and when normal working practices resume, Halo is keen to retain the benefits of working from home. 

“Everybody is quite desperate to get back to the office to an extent, but we have found a lot of productivity benefits by just giving people the ability and flexibility to securely access media remotely and do things when they can and when they want to,” Cregeen says. “People want to hang on to the benefits of remote working. We all enjoy what we do and having access to systems and tools 24/7 is something we should be grateful for.”

 

Monday, 23 November 2020

MediaKindness: We are in this together

Copy written for MediaKind

https://www.mediakind.com/blog/mediakindness-we-are-in-this-together/

These past few months have been an anxiety-producing experience for us all. The pandemic has affected us in many ways, uprooted our working lives and, sadly, delivered great suffering to many. There is still uncertainty ahead. But the shared crisis has also demonstrated the best of humanity. It has pulled people together more than it has caused them to push apart. That’s what happened at MediaKind, and I’d like to share with you some of the ways we have all contributed to reaching out and helping each other.

Global MediaKind teams working from home

As you may know, MediaKind has 1,600 employees and contractors. Our global team traditionally operates from hubs in the U.S., Canada, France, the U.K., Israel, and India, and from satellite bases spanning Brazil, Mexico, Russia, Spain, Belgium, Australia, and Japan.

 

Like many organizations, this time last year, we were predominantly office-based. Team members in our main hubs, in particular, rarely worked from home. That flipped 180 degrees overnight in March when the full extent of Covid-19 became clear. We decided to prioritize health and safety and get everyone home.

 

I remember discussing this with Angel Ruiz, then our CEO and now chairman of the MediaKind board. We needed to send everyone home – that was clear – but how do we proceed from there? Few companies had detailed plans for business continuity and employee safety in the event of a global pandemic.

 

We believed that with the MediaKind community’s help, we would figure it out and pull through. And you know what? That’s what we did.

Global teamwork in the wake of Covid-19

China was the first population to endure the crisis. When our people there ran out of PPE, they put out a call for help, and our team in California stepped up and shipped masks over. As the virus spread worldwide and colleagues in other offices experienced lockdown, the China team reciprocated tremendously. They spontaneously offered to ship surplus masks to those who needed them and also shared how they had managed during the quarantine.

 

When everyone was forced into working from home, my biggest concern was the potential impact on mental health, particularly in areas where local laws restricted people’s ability to leave home at all. Maintaining ties to colleagues, projects, and the wider company as a whole was imperative.

LinkedIn Learning: whole employee education

We quickly decided to ramp up online training opportunities. LinkedIn Learning gave people the flexibility to access virtually any content of their choosing. That was a deliberate choice and antithetical to the conventional rollout of training programs.

 

Typically, training tends to be a top-down approach in which an organization will carefully script regimens of learning paths. But we didn’t have the time. To be frank, urgency forced our hand. The need to connect people to information and education resources was far more critical.

 

From the first week of June, we were able to turn LinkedIn Learning on for all our employees and contractors, and the results were incredible. Crucially, everyone was able to self-serve the positive learning programs that suited their environment. The curriculum ranged from technical engineering tools to materials encouraging a greater understanding of mindfulness and unconscious bias. Everyone took part. In LinkedIn’s view, this was among the highest activation rates they’ve seen.

Uniting #teamMediaKind through inspiring initiatives

Our morale and cultural strength as a company have significantly increased. I say this with confidence because I’ve witnessed it first-hand. MediaKind is in the process of integrating several legacy cultures from companies acquired going back many years in different corners of the world. The challenging 2020 circumstances have helped bring us together in a natural, organic, and authentic way.

 

This feeling of unity is reflected on Glassdoor, where we’ve seen our overall internal ratings improve naturally over the past few months. I’m proud that, as a company, we could keep people focused on caring for other employees.

 

One of the most inspiring contributions came from Olie Baumann, based in Southampton, UK, who initiated an online cycling club for anyone interested in cycling and keeping fit. His idea personifies MediaKindness in bringing a group of people together from all over MediaKind in a communal experience. Today, the cycling club embraces 50+ people who meet virtually to exercise, compete in online races, support charitable causes, and have fun as a team regardless of distance. We hope the launch of this MediaKind club will inspire many more ideas in the future.

MediaKindness to our MediaKind family

While the Covid-19 pandemic has radically changed the way we operate, I’m proud of our folks and the way they have embraced their own MediaKindness over the past nine months. Never before have people felt so keenly the importance of caring for their own wellbeing, and that of their families, communities, and colleagues.

 

Caring deeply about helping people through health concerns. Taking up learning opportunities to stimulate professional and personal development. Becoming a more productive member of the company and society. Taking part in virtual yoga sessions to free the body and the spirit or sharing virtual cooking lessons with colleagues. Whatever the motivation, all of our efforts to keep each other happy, informed, and safe throughout this unprecedented year should be applauded.

Friday, 20 November 2020

Verizon preps 5G edge for 8K live

Streaming Media

Hot on the heels of AWS’ play for live and uncompressed end-to-end video production in the cloud comes news that another of its technologies is being used to test broadcast distribution over 5G.

https://www.streamingmedia.com/Articles/News/Online-Video-News/Verizon-Preps-5G-Edge-for-8K-Live-with-AWS-and-Zixi-144011.aspx

An unnamed “major global broadcaster” is apparently testing live and live-linear 4K and 8K broadcast workflows from Verizon’s 5G edge, using technology from Zixi and AWS.

“The goal of AWS, Verizon, and Zixi is to address the three main elements of live streaming success: latency, overhead, and uptime,” said Eric Bolten, VP of Business Development at Zixi, in a statement. “The industry has evolved from science experiments to real world production deployments today, not next year.”

Video is 5G’s killer app – but 8K? 

5G network operators view video as the killer app for 5G in its early-phase rollout, as they can take immediate advantage of the standard’s 10Gbps+ speeds and service latency of less than 1ms to deliver more, higher-quality content with realtime interactivity.

Most previous 5G media tests have focussed on the contribution part of the process. Using the network to broadcast live events is the logical next step and is planned for in 3GPP Release 16, which introduces enhanced ultra-reliable low latency communication (eURLLC) to deliver millisecond latency, time-sensitive networking, and improvements to ‘high power, high tower’ transmissions to support higher mobility and better coverage of terrestrial TV.

This would offer content owners such as sports franchises and pay TV broadcasters the chance to monetize new video-centric applications such as 8K VR, interactive viewer-selectable angles of a sports match, sports betting and realtime augmented reality content. 

The 8K live broadcast landscape is rarified but growing. BT Sport, arguably the world’s most progressive broadcaster, had earmarked the start of the 2020/21 soccer season in the UK to begin live broadcasts in 8K. It already delivers the world’s first regular 4K UHD matchdays. Restrictions on getting technicians into stadia, and Coronavirus-related delays on Sony 8K cameras from Japan, have delayed but not cancelled its ambition.

In January, Verizon ran a behind-closed-doors test of 8K over 5G at a Pro Bowl viewing event in Miami. It is part of a long-term partnership the operator has with the NFL to outfit NFL stadia with 5G networking and showcase the tech’s potential to give fans an even more immersive experience.

The 49ers had five 8K cameras fitted inside Levi’s Stadium earlier this month, which are being used, initially at least, for zoomed views of both end zones.

The biggest use case in 2021 by far will be the postponed Tokyo Olympics which promises to be a showcase for 8K applications ranging from broadcast to VR. 

One of Verizon Media’s priorities for 2021 is to address emerging use cases that it sees broadcasters investing in. According to the operator’s Darren Lepke, Head of Video Product Management, these include “realtime video and interactivity, wagering and gamification. We’re developing new video protocols that deliver realtime video at scale and integrating things like gamification engines or video chat features so you can watch a football match with your mates.”

He points out that for Verizon Media, realtime interactive video at scale is not dependent on 5G, but 5G certainly improves the experience. “Where you have a high-speed network and users consuming content on the go, the reliability and performance of your video will increase, which gets you a much more solid experience than today.”

Lepke added that he thought it unlikely we’d see 8K streaming any time soon. “My personal opinion is that we are still in early adoption of 4K video. You do need giant TVs in people’s homes to see the benefit of 8K, but on a mobile device there’s barely a reason to stream 8K since you can’t tell the difference (between the image quality of a 4K signal). That said, there is an ecosystem of device manufacturers and encoding partners working to build out 8K media.”

AWS, Zixi and Verizon test

The test announced today comes from Zixi, whose live streaming protocol is being used to secure the UHD stream.

“Today the business of live events is cumbersome, infrastructure intensive, and high cost,” said Gordon Brooks, Executive Chairman and CEO for Zixi in a statement. “What we’re doing with Verizon 5G and AWS Wavelength Zones is streamlining that process. We’re changing the economics; we’re changing how you go about doing it and how you go about experiencing it.” 

AWS Wavelength is described as AWS infrastructure deployments that embed AWS compute and storage services within communications service providers’ datacenters at the edge of the 5G network. Application traffic from 5G devices can reach application servers running in Wavelength Zones without leaving the telecommunications network. This avoids the latency that would result from application traffic having to traverse multiple hops across the internet to reach their destination, enabling customers to take full advantage of the latency and bandwidth benefits offered by 5G. 
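The latency argument behind Wavelength comes down to hop-count arithmetic. A quick illustrative sketch (the hop counts and per-hop figures below are round-number assumptions, not Verizon or AWS measurements):

```python
# Illustrative arithmetic only: why keeping traffic inside the 5G network
# (fewer hops to an edge zone) beats traversing the public internet to a
# distant cloud region. Per-hop latency is an assumed round number.

def round_trip_ms(hops, per_hop_ms):
    """Total round-trip latency: each hop is crossed twice (out and back)."""
    return 2 * hops * per_hop_ms

# Device -> public internet -> distant cloud region (many hops).
internet_path = round_trip_ms(hops=12, per_hop_ms=4)
# Device -> Wavelength Zone inside the carrier's 5G network (few hops).
edge_path = round_trip_ms(hops=2, per_hop_ms=4)

print(internet_path)  # 96 ms
print(edge_path)      # 16 ms
```

The per-hop cost is the same in both cases; the saving comes entirely from traffic never leaving the telecommunications network, which is the point Wavelength's design makes.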

To help content providers deliver what Zixi calls “new kinds of live streamed sports and entertainment experiences” it is using AWS Wavelength to power its Software Defined Video Platform (SDVP). The SDVP, says Zixi, leverages ultra-low latency access to AWS compute and storage services enabled at the Verizon 5G Edge to process huge amounts of UHD video and compress it for delivery to mobile devices. 

“Wavelength allows us to move video processing to the edge and to deliver additional performance at every part of the content delivery chain,” said Bolten.  

Zixi has also designed the SDVP to support remote and distributed production scenarios.

“In order to scale, to go from hundreds of streams to tens of thousands of streams and clients, you need the ability to have views across the organization, between organizations, and easily manage both,” said Bolten. “5G better optimizes overhead, so you can maximize traffic spectrum and reduce latency. Now with Wavelength available at the 5G network edge, this combined mobile edge compute (MEC) solution foretells the future of mobile broadcast workflows.”

More broadly on the benefits of MEC for broadcast and OTT delivery to the consumer, Bolten says it has very high potential for disruption in a number of ways for organizations that are acquiring or distributing a signal.

Ongoing field testing shows promising results, with latency below 10ms over an 80-100 Mbps pipe. Zixi says tests of linear video broadcast via an MEC solution are promising. It is looking to streamline the process of using termination devices, like 4K 200 Mbps decoders, and WiFi networks in a hybrid infrastructure for video file delivery.

“The technology makes virtual control rooms possible, where a production team can order 10 or 20 camera sets and have curated subsets that are available to the general public,” Bolten said. “That’s when sponsorship and customization begin to support real business models. There’s no question that mobile edge computing is the future of live production and contribution. We are migrating what we have already been doing with AWS using Amazon S3, and now making a connection to Wavelength Zones at the edge of 5G networks. These kinds of gateways on the 5G network open up everything.”