Saturday, 31 July 2021

Google makes essential leap towards large scale quantum computing

RedShark News

Like herding (Schrödinger's) cats, calculating with qubits can go all over the place. Correcting errors in real time is essential for reliable large-scale quantum computations. Now, Google and Honeywell have separately devised error correction techniques to give the field a turboboost. 

https://www.redsharknews.com/google-makes-essential-leap-towards-large-scale-quantum-computing

Quantum computing companies from Amazon to Microsoft need to decrease the error rates that currently limit the sophistication of quantum computation.  

Progress has been held back by the nature of qubits, which can be built and controlled in different ways.  

“The trouble is all qubits can be easily perturbed, and calculations are derailed when they are. That's why quantum computers typically run at extremely low temperatures in vibration-proof housings,” reports CNET.

Every quantum computing company is trying to improve qubit operations, too. That work includes not just error correction, but also making qubits less prone to errors in the first place, lengthening the time multiple qubits stay entangled so they can perform calculations, and compensating for errors after calculations are complete.  

That’s why Google’s and Honeywell’s demonstrations are considered breakthroughs. Both use logical qubits – quantum computing’s storage and processing units – to make computation more robust against disturbances like vibration and electromagnetic emissions.

Researchers from Google laid out the results of their calculations in a paper showing logical qubits overpowering errors. It was swiftly followed by research from Honeywell, which claimed its solution could detect and fix qubit errors during operations so that calculations can run longer.

Google’s researchers explain, “Realising the potential of quantum computing requires sufficiently low logical error rates. Many applications call for error rates as low as 10⁻¹⁵, but state-of-the-art quantum platforms typically have physical error rates near 10⁻³.”

Honeywell began its quantum computer development program a decade ago and uses trapped-ion technology, in which charged atoms hold quantum information. The firm says its Model H1 consistently achieves the highest quantum volume – a comprehensive performance measurement used widely by the industry – of any commercial quantum computer.

The firm is in the process of merging its quantum compute division with that of quantum software experts Cambridge Quantum Computing to advance and build integrated hardware and software quantum machines. 

The new company will have a long-term agreement with Honeywell to help manufacture the critical ion traps needed to power the quantum hardware. Honeywell will also invest up to US$300 million in the venture.

“Joining together, [Honeywell and CQC] will become a global powerhouse that will create and commercialize quantum solutions that address some of humanity’s greatest challenges, while driving the development of what will become a $1 trillion industry,” said Ilyas Khan, founder of CQC.

While the H1 test machine uses 10 qubits to encode a single logical qubit, Google expects to require about 1,000 physical qubits for each logical qubit as it moves to deliver a practical quantum computer by the end of the decade.  
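To get a feel for the gap between today’s physical error rates (around 10⁻³) and the 10⁻¹⁵ target, here is a minimal sketch using the textbook surface-code heuristic rather than Google’s published model: the logical error rate falls exponentially with the code distance, while the physical-qubit count per logical qubit grows roughly with the square of that distance. The threshold and prefactor values below are illustrative assumptions only.

```python
# Illustrative sketch of surface-code error suppression (textbook heuristic,
# NOT Google's published model). The threshold and prefactor are assumed values.

def logical_error_rate(p_phys, d, p_threshold=1e-2, prefactor=0.1):
    """Rough heuristic: p_L ~ A * (p/p_th)^((d+1)/2) for code distance d."""
    return prefactor * (p_phys / p_threshold) ** ((d + 1) / 2)

p_phys = 1e-3  # state-of-the-art physical error rate cited in the article

for d in (3, 5, 11, 25):        # code distances (assumed values)
    n_qubits = d * d            # ~d^2 data qubits per logical qubit (roughly double with measure qubits)
    p_log = logical_error_rate(p_phys, d)
    print(f"distance {d:2d} (~{n_qubits:4d} physical qubits): logical error rate ~ {p_log:.1e}")
```

Under these toy numbers, a code of distance 25, using on the order of a thousand physical qubits, lands near the 10⁻¹⁵ regime, broadly in line with the thousand-to-one ratio cited above.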

Is Gamification of the Olympics the Right Play?

NAB

Major public broadcasters and the International Olympic Committee share a desperate need to wean themselves off their aging viewership. Both are trying to be hip to a younger crowd by making pacts with digital platforms and wooing younger viewers with interactive options. The pending Tokyo Games, played behind closed doors, will be the largest-scale attempt yet to bring gamification of live sports to Gen Z.

https://amplify.nabshow.com/articles/is-the-gamification-of-the-olympics-the-way-to-go/

For the IOC the writing has been on the wall for a decade. It’s why it began dedicating more effort to streaming and digital content back in 2012. It’s why IOC President Thomas Bach struck a $1.3 billion deal with Discovery to be the gatekeeper for all Olympic content in Europe from 2018-2024.

Speaking to Eurosport, he stressed that younger people’s viewing habits would shape much of the content, in particular their consumption of content via smartphones.

Tokyo 2020’s official mobile app

At the time, David Zaslav, president and CEO of Discovery Communications, said that its “mobile and direct-to-consumer platforms can reach more than 700 million people across Europe. This agreement will bring the Olympic Games to more viewers on more screens than ever before.”

That boast was backed up only last week by Andrew Georgiou, Discovery president of Sports, who told IBC 365, “This is our first summer Olympic games and we definitely want it viewed by more people across Europe than any other Games in history. We’ll achieve it because we’ve got the breadth of coverage.”

And he’s not likely to be wrong. A new survey carried out by AdColony found that nearly half of fans plan to watch the rescheduled Tokyo 2020 Games on a mobile device. In addition, the majority are set to multitask on their mobile while watching the Games and, most intriguingly, the study found that the audience is expected to skew female, 63% versus 35% male.

 

Growing Digital Engagement

Core to the IOC’s agenda over the next five years is to “Grow digital engagement with people,” and since 70% of all IOC revenue is derived from broadcast rights sales it is the ability of broadcast partners to reach youth audiences which is vital for its continued relevance.

Hence the battery of digital and gamified products being thrown at the wall from Tokyo. Just as host broadcaster Olympic Broadcasting Services is experimenting with AI and 5G at this Olympics, it is issuing a range of data- and graphics-driven gimmicks to see which will stick with the elusive millennial.

“Our ambition is to bring the magic of the athletes’ achievements to the world on an unprecedented scale,” says OBS CEO Yiannis Exarchos in an extensive official guide.

“Technology is going to play a critical role and allow us to bring fans ‘inside the venue’ virtually. The IOC and OBS believe these new digital innovations will leave a legacy which we will build on at future editions of the Olympic Games.”

Hit the Cheer Button

With fans barred entry, effort is being made to make them feel virtually present. For example, the reactions of fans watching back home will be displayed in five-second video selfies on a video matrix inside the venue, to give athletes something to respond to.

A virtual “cheer” button will be embedded on several broadcasters’ digital platforms. Fans can watch the broadcast feed of an Olympic event and virtually clap or cheer by clicking the button. The system collects all the cheers and renders a global map of “cheer activity.” The map is sent as a video stream to broadcasters and showcased on venue video boards.
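OBS hasn’t published how the cheer data is processed, but conceptually the backend is a simple fan-in: count click events by territory and turn the counts into a map feed. A purely hypothetical sketch (the event format and field names are invented for illustration):

```python
# Hypothetical sketch of aggregating "cheer" clicks by country; the real OBS
# pipeline is not public, and the event format here is invented.
from collections import Counter

def aggregate_cheers(events):
    """events: iterable of (country_code, timestamp) pairs from broadcaster apps."""
    return Counter(country for country, _ in events)

sample_events = [("JP", 1), ("GB", 2), ("JP", 3), ("US", 4), ("JP", 5)]  # dummy data
for country, cheers in aggregate_cheers(sample_events).most_common():
    print(f"{country}: {cheers} cheers")  # counts that would drive the rendered world map
```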

OBS will produce 30% more content compared to Rio 2016, much of it headed to digital platforms. Digital publishers can draw on a repository of up to 9000 clips and short-form assets called Content+. This includes behind-the-scenes content from the competition venues and further content filmed with smartphones. Additionally, OBS plans to produce 1800 fast-turnaround clips from all sports for digital consumption.

It will create 180- and 360-degree content specifically for virtual reality, which Discovery (among other rights holders) will distribute in Europe, and will augment a number of events with graphics or tracking technology, including a claimed first-of-its-kind broadcast enhancement that uses AI and computer vision to overlay visualizations during the athletics sprint events.

In another breakthrough, the coverage of the archery competition will provide biometric data from miniature sensors worn by the archers. The monitor’s receptors will detect the heart rate and transmit the data wirelessly to generate the on-screen television graphics.

“Audiences will be able to witness the heartbeat variations and adrenaline rush experienced by the archers as they shoot their arrow,” OBS says.

Twitch Sidecasting

Broadcasters are also making extensive digital deals for Olympic coverage. Discovery is pumping out bespoke content to Snap, Instagram and Facebook and NBC has a deal with Amazon-owned Twitch to produce and deliver live content. This includes “game-ified pre-Olympic activations, Olympic athlete interviews and Olympic-themed gaming competitions.”

“Twitch viewers tune in not only to watch their favorite athletes but to also take part in pre and post-game interviews and virtually connect with other fans from around the world,” said Michael Aragon, Chief Content Officer at Twitch.

The Tokyo 2020 FanZone

The most interesting activation, for its merger of the linear and the nonlinear sportscast, is what NBC calls Primetime Sidecasting. During NBC’s primetime block, Twitch creators will commentate live on a companion interactive broadcast that encourages co-viewing of the nightly primetime show. Twitch creators, watching NBC’s primetime coverage of the Games off-camera, will invite viewers to submit discussion topics in line with what anchors and guests are covering on the linear channel.

The overall aim is to attract advertisers to Twitch’s “robust youthful audience” with Olympic content, with both parties sharing in the ad revenue.

On top of this there is the Tokyo 2020 FanZone, described by the IOC as an “interactive gamification experience” available online on official Olympics digital channels.

eSports To Save the Day?

A trivia game is focused on pulling in Gen Z with prizes up for grabs. A Magic Moments product seems to be a highlights reel plucked from the archive. Another initiative, Fantasy Challenge, asks fans to create a team by selecting their 10 favorite athletes from individual sports. They can start a league with friends or join an existing one to compete against other teams.

This will no doubt reach its apotheosis in Paris 2024 when competitive computer gaming is expected to become an official Olympic event. The path to eSports inclusion was already being laid when the Intel Extreme Masters Pyeongchang tournament was broadcast on the Olympic Channel, with partial support from the International Olympic Committee, ahead of the 2018 Winter Games. Five Korean League of Legends players bore the Olympic torch during its journey through South Korea, marking another first in the relationship between the Games and competitive gaming.

The fast-growing eSports industry doesn’t need Olympic certification as much as the IOC needs the eSports stamp of approval.

The IOC is “a 19th century organization trying to deal with a 21st century phenomenon,” said former IOC VP and chairman of the OBS board, Dick Pound.

He was trying to counter resistance to including eSports in the movement. Though he warned against the inclusion of violent computer games. “We can get taken to the cleaners in a major hurry if we’re not very careful about this.”

At the same event, Thomas Bach reiterated that the IOC had to connect with hundreds of millions of gamers worldwide if the Olympic body is to remain relevant with the younger generation.

 


Wednesday, 28 July 2021

Immersive Video: Here’s a Good Place to Start

NAB

Immersive is the new goal for just about every piece of video storytelling, whether the outcome is entertainment, education, communication or a sales pitch.

https://amplify.nabshow.com/articles/immersive-video-heres-where-to-start/

Immersive video is often used interchangeably with virtual reality, although the two differ in one crucial respect: While immersive video is designed to make viewers feel like they are inside the video, VR experiences often enable users to interact with and/or direct at least part of the simulation.

“360 video allows users to navigate within the virtual environment to get a 360-degree view,” says Emily Krings, blogging for live streaming platform dacast. “They can see what is above, below, and all around them, depending on how they move. Immersive video alone, on the other hand, does not necessarily give viewers the ability to actively participate.”

Until recently, VR headsets were relatively expensive. A VIVE (SteamVR) headset cost $1,000 in 2016; you can now pick up an Oculus Quest 2 for $299, bringing VR and immersivity within reach of almost any budget.

VR Production Tips

Krings provides a five-step guide to producing immersive video for VR headsets. This goes from the obvious – use a 360-degree camera from the likes of GoPro, VUZE, Ricoh or Insta360 – to the sensible: be very strategic with your point of view.

“Since the goal is for viewers to feel like they are in the scene, it is a good idea to record at eye level, to create a first-person POV. Either a head mount or chest mount can help you achieve this. A neck camera or glasses with a built-in camera would work, as well.”

Other tips: incorporate a storyline.

“Your content should be displayed in a way that feels meaningful.”

Emily Krings, dacast

“This is not to say that you need to create a script or narrate every move, but your content should be displayed in a way that feels meaningful,” she advises. “For example, if your video involves exploring a landmark, you’re not going to want to simply walk in a circle around the subject. Think of how a tourist would approach it in real life. They’d likely step back for a full view, get close to examine details or plaques. They may even grab a brochure about the place. Create the story around the experience that you’re trying to create for the viewer.”

Similarly, it’s a good idea to incorporate some natural movements that will make the video more lifelike. Within reason, of course. Extreme jerkiness would lead to nausea.

And since many immersive videos are structured as a walkthrough or involve some other sort of ‘journey,’ adding unexpected turns can make it more lifelike. 

“Let’s say that you are walking through a city. It’s not likely that you’ll walk perfectly straight and keep your head looking directly ahead the entire time. Naturally, you’re going to see what’s going on around you. You may move to the side to dodge a person coming your way. Incorporate these turns and pivots as you record your content.”

Future Applications

Some of these ideas could be used to encourage remote attendees to tune into live-streamed virtual events with a VR headset, the idea being that this would help remote participants feel a sense of presence, in contrast to the two-dimensional Zoom experience.

With developers like Microsoft building virtual presence video communication systems as next-gen replacements for Teams, perhaps we should all master the language of video immersivity.

 

Monday, 26 July 2021

IBC BTS We Are Lady Parts

IBC

With songs such as ‘Nobody’s Gonna Honour Kill My Sister But Me’, Channel 4 sitcom We Are Lady Parts sets out to lift the veil on Muslim women. Equal parts subversive, silly and sweet, Working Title’s music comedy is about a Muslim female punk band - called Lady Parts - who are on a mission to find a lead guitarist and maybe, just maybe, get a proper gig.

https://www.ibc.org/trends/behind-the-scenes-we-are-lady-parts/7766.article

Creator (writer, director, producer) Nida Manzoor says that she took creative inspiration from The Young Ones and Spinal Tap, while numerous other references pop up explicitly (A Clockwork Orange, Brief Encounter) or implicitly to guide the look.

Cinematographer Diana Olifirova shot a 16-minute pilot for the show in 2018 and says the creative intent for the series commission was to widen the scope.

“Nida wanted to create a cinematic comedy not just a basic TV show, to make it as vibrant as possible and to use the language of cinematography to tell the story. I loved this. We would work a lot on every scene to tell the story using cinematic means not just by acting and dialogue alone.”

An early example is in episode one where lead character Amina walks around the park, sees a guy walking towards her and immediately falls in love.

“I did that shot in the pilot and repeated it in a different location,” Olifirova explains. “We just felt it was such a very simple way of conveying this feeling of getting hooked on someone you see on the street. There’s no dialogue, just a voiceover.”

The shot walks with Amina and slowly pulls her toward us on Steadicam. As she sees the guy the camera travels 180-degrees around her. We see the guy from her point of view and then we move 180-degrees around both characters. In the end we focus on Amina but the guy has disappeared.

“It’s a magic trick you do in camera. It’s one developing shot where the action happens in one movement and it makes you question whether the guy was there or not. In the grade we made the moment when she looks at the guy a little warmer and when he disappears everything returns to a cold exterior. The more things we do like that the more inspired I get about using cinematography in telling the story.”

Born in Ukraine, and a graduate of the NFTS, Olifirova is an emerging talent with festival screenings of her work including the short All of Me.

For Lady Parts she cites horror-thriller Green Room as a main reference for interior colour tones, Richard Ayoade’s Submarine for some editing points, camera movement and composition, and American Honey (lensed by Robbie Ryan BSC) for its free form handheld camera work.

“I take all those on board and enhance with my own imagination and bounce a lot of ideas off the locations, the art department and costumes.”

Her camera choice was the Alexa Mini paired with Cooke anamorphic lenses, shooting a CinemaScope aspect ratio of 2.35:1.

“We wanted to put all five main characters in one frame so a cinematic aspect ratio which is more width than height was logical. I love anamorphic and wanted lenses that would be picturesque but not too distracting. The Cookes are a nice medium between being too in-your-face and being quite delicate.”

Look design involved differentiating the more demure and kind personality of Amina from that of the more raucous and unsettled character of Saira, the band’s guitarist.

“Amina’s world is very pastel in design and her scenes calmer in comparison to others. I used a lot of Steadicam and tracking movements and tried to light her with softer light, a bit less colour, more warmth and diffusion. Everything in her world looks cleaner and more pristine but this changes throughout the series. Mostly this change is elaborated in the costume. I tried to keep a similar vibe with her in camera movement and lighting.

“With Saira we use handheld always and have much more contrasty lighting, more colour and angles and a much harder light. For the scene where she goes back to family, I wanted to try to shoot using only light inside the set rather than any sunlight. I used mainly practicals just to make it very different to the other story worlds. Everything with her is darker in general.”

It’s clear that Olifirova had a lot of fun designing the music sequences. In Ep 1, Amina solos on guitar in the style of Don McLean, in her wardrobe. This was shot in a 50m-long corridor with a camera track and dolly. Actor Anjana Vasan was also on the platform as it tracked backwards through rows of clothes.

Olifirova had the camera on a slider with a zoom lens to perform a contra zoom movement while inside the ‘moving’ wardrobe. She also added interactive lights. Oh, and there are puppets dueting the chorus.

“We pulled a lot of things together to make it surreal and strange,” the DP says.

For a pastiche of Brief Encounter, shot in black and white, she swapped out the cool LED bulbs on location for purple and green colours and also used white tungsten bulbs. “Even if you didn’t see it, you can sense that the lighting works for the feeling we’re trying to evoke.”

Her favourite of the musical numbers was the finale’s night-time exterior. It’s the scene of the band’s last gig, which needed to be emotionally impactful but at the same time look inexpensive.

“I was battling between wanting to use super fancy laser lights and keeping it low-key since that’s what Nida wanted. I used a couple of powerful moving lights that only turn on at the very end of the sequence. I think they work because the lights don’t touch any of the characters or go through the camera to make any flares. They don’t call attention to themselves. At the same time, I hope the scene is a fittingly touching emotional end.”

Covid impacted production, mainly forcing a change of exterior locations to studio builds. Olifirova didn’t mind this, saying that she likes to create within the space.

“It meant we had more creative input to light scenes as we wanted. We could take away the ceiling, choose lights of any colour. I like having space. For me space is the most important because I like to have wide shots that create composition and enhance the emotion. The more I can construct a shot in front of the camera the happier I am.

“I like using mirrors and practical lighting and [production designer] Simon Walker provided us with such a variety of both. For example, if you don’t have a window on one side of the set but you have to shoot in the opposite direction you can use a mirror to reflect the window and add depth. I used a lot of those tricks.”

Olifirova grew up wanting to be a photographer and took as many courses as she could in Kiev.

“When I realised how much more you could do with a moving image I began experimenting with double exposure and different movements and dynamic lighting changes. I realised how much more it gives me as an image author.”

She started shooting more and more and went to film school in London. While movies are a passion, what really drives her is the act of creation.

She recently wrapped shooting Netflix’s upcoming British teen series Heartstopper based on the graphic novels by Alice Oseman and produced by See-Saw Films.

“What motivates me is making things happen together with a team. The challenge is that each image in a film has such a small amount of time for a person to see that image so the cinematographer’s craft is to draw their attention to the element they need to see. It’s exciting for me to achieve that shot after shot after shot.”

 

 

Saturday, 24 July 2021

Hybrid Release Strategies Are Amazing… But We Have Questions

NAB

The performance at both box office and premium VOD of Disney’s Black Widow has focused minds in Hollywood. Has the dilemma of release windows and playing off exhibition against streaming been cracked?

https://amplify.nabshow.com/articles/hybrid-release-strategies-are-amazing-but-we-have-concerns/

Well, not so fast.

As outlined by NAB Amplify, cinema versus streaming is no zero-sum game and exhibitors aren’t out of the picture, but the odds are stacked in the studios’ favor.

Black Widow’s domestic opening box office gross of $80 million isn’t chicken feed, but the movie’s progress stalled when traffic fell 41% from Friday to Saturday, “an almost unprecedented drop for a Marvel title,” says The Hollywood Reporter.

There’s no doubt its theatrical outing was cannibalized by a day/date release onto Disney+. But that’s okay — for Disney — since the studio got to take home 100% of the $60 million in revenues (which IndieWire calculates as meaning that about 2 million of Disney+’s 103 million global subscribers paid $30 to screen the film at home).
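IndieWire’s back-of-the-envelope math is easy to reproduce from the figures above: divide the reported Premier Access revenue by the $30 fee to estimate buyers, then compare that with the subscriber base.

```python
# Reproducing IndieWire's rough Premier Access estimate from the figures above.
pvod_revenue = 60_000_000      # reported Disney+ Premier Access revenue, USD
price_per_purchase = 30        # Premier Access fee, USD
subscribers = 103_000_000      # Disney+ global subscribers cited above

buyers = pvod_revenue / price_per_purchase
print(f"Implied buyers: {buyers:,.0f} (~{buyers / subscribers:.1%} of subscribers)")
# -> roughly 2 million households, around 2% of the subscriber base
```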

“On that basis, Disney would have so far earned more from PVOD than in theaters,” says IndieWire’s Tom Brueggemann. “A single weekend’s performance is not the final word for movies, theaters, or even Disney, but it suggests major implications for all concerned.”

Comscore’s box office analyst Paul Dergarabedian tells THR that overall domestic revenue crossed $100 million for the first time since before the pandemic struck. He also noted that the marketplace is still grappling with “latent consumer reticence.”

“If the pie is big enough to power $158.8 million worth of global theatrical revenue plus $60 million worth of streaming, it shows that consumers love to have a choice,” Dergarabedian says. “But this model does not apply to all movies, and that’s why each film’s big-screen/small-screen success must be evaluated on a case-by-case basis.”

And therein lies the rub.

But not all studios are Disney, and not all films are Black Widow. Not even all Disney films are Black Widow.

Disney didn’t release streaming numbers for its major cinema and PVOD release Mulan in 2020 nor for animated feature Raya and the Last Dragon, which made just $8.5 million in theaters, according to Wired.

True, those were released in the midst of the pandemic, when cinemas weren’t open or simply not an option for many, but Wired’s point is that whether a movie succeeds in theaters and on streaming depends on the type of movie and the audience it serves.

“It’s a crapshoot,” says Wired’s Angela Watercutter. “While there’s little doubt the traditional 90-day window between a movie’s theatrical release and its debut on streaming/VOD is permanently closing, how studios — and, for that matter, theater chains — will navigate that is full of open questions.”

For a movie with a built-in fan base like Black Widow, it’s a no-brainer, Wired says. Put it in the theaters, and send it to Disney+. For a movie like F9, which is part of a franchise built around the moviegoing experience, keep it theater-only, at least for a few weeks. Indies can go to art houses and streaming the same day — cinephiles will find theaters, everyone else will Netflix and chill. A movie like Dune, meanwhile, probably has enough hype to hit HBO Max the same day as theaters and still make money.

These hybrid strategies are likely to shift movie to movie even if Disney follows through on its plans to return to a theatrical-first pattern later this year.

What Disney has also done in releasing PVOD numbers for Black Widow is shake up how these hybrid strategies are accounted for.

Industry chatter suggests that Disney revealed the numbers in order to prop up an apparently underperforming title. A Marvel movie that opens to less than $200 million is considered a failure.

Or it released numbers to put pressure on exhibitors not playing ball and demanding too much of the take for showing the film (Japanese exhibitors are singled out).

Either way, the move puts “pressure on studios to reveal such information going forward on behalf of filmmakers, talent and agents,” says THR.

Disney is unlikely to make a habit of it. Who wants to admit failure if they don’t have to? Consequently, the jury is out on whether it will show and tell for day-and-date Premier Access releases, such as Jungle Cruise, starring Dwayne Johnson.

Strategies will change as audience behavior does. Welcome to the new, flexible release pattern.

“Appointment TV viewing may be dead,” Gartner analyst Eric Schmitt tells Wired, “but appointment movie watching is just making its debut and looks to have a long runway ahead.”

 


Friday, 23 July 2021

NFTs Are Another Sign of Creators Taking Back Control

NAB

https://amplify.nabshow.com/articles/nfts-are-another-sign-of-creators-taking-back-control/

Across the board, creators are striving to claw back control, agency and, increasingly, revenues from employers, publishers and distributors.

One way is via non-fungible tokens (NFTs), which give creators new ability to go directly to customers. But are they a flash in the pan or here to stay?

NFTs are irreplicable blockchain-based tokens that effectively assign ownership, in some form, for a specific digital item. A robust market for NFTs has sprung up among collectors and speculators.

Key milestones in the market’s development included the sale of a digital collage artwork by the artist Beeple for $69 million and the sale of the first-ever tweet (by Twitter founder Jack Dorsey) for $3 million. The NBA’s Top Shot Licensed Digital Collectibles NFTs launched in June 2020, and had traded more than $550 million for video “Moments” by May 2021.

Although musicians may have missed out on live performances and the merchandise sales that go with them, Grimes sold thousands of NFTs at $7,500 each for two short videos — the digital equivalent of signed, limited-edition prints, as PwC points out in its Global Entertainment & Media Outlook 2021–2025.

The Kings of Leon launched an album last March as an NFT that included a limited edition vinyl disc, along with MP3 files and a GIF of the artwork.

Brands are getting in on the act too.

In June, Anheuser-Busch minted NFTs for its premium Stella Artois beer and partnered with ZED RUN to create 50 unique horses for digital horse racing. The horses sold for millions of dollars, according to Spencer Gordon, who leads the marketing team at Anheuser-Busch.

“When they sold we donated the money to help bars and restaurants in Europe that were suffering from the (COVID-19) crisis,” he tells Forbes.

But the cutthroat race to be one of the first brands to launch an NFT has resulted in a clutter of low-quality NFTs issued only for the sake of earning money rather than resonating with buyers, writes Forbes’ Rashi Shrivastava.

“There is no real functionality or even art to it,” says Nick Tran, global head of marketing for TikTok. “There is no craft, it was literally just, let’s get this out quickly, get the headline, move on to the next thing. Ninety-five percent of those are probably going to fail.”

TikTok has a creator marketplace that helps brands in 40 countries find partners, and its Creator Fund enables people whose self-generated content makes waves on the platform to earn money from their posts.

Young creators are also at the core of the business model of Roblox, a gaming platform that enables users to build their own games and play games developed by others. Roblox went public in a blockbuster IPO in March 2021 and boasts a market capitalization of about $55 billion.

“Substack, the newsletter platform company whose slogan is ‘Take back your mind,’ has emerged as the portal of choice for hundreds of independent writers — many of whom have left struggling newspapers and digital media operations and are now eager to sell subscription newsletters to their fans and audiences,” reports PwC.

Unionization is another sign of creators asserting themselves. In Hollywood, a standoff between the Writers Guild of America and the Association of Talent Agents resulted in a new code of conduct for agents, aimed at ending the “packaging” or bundling of talent by agents for TV and film production.

Musicians are seeking a bigger payback too. PwC reports that the rapid growth in streaming has powered corresponding increases in the value of large catalogues of music and their associated rights. That’s good news for formerly embattled creators who aim to monetize their portfolios of work.

Taylor Swift, after a long-running dispute with the company that owned rights to her master recordings, began re-recording and reissuing her previously recorded hit songs to regain ownership. Other major transactions included Paul Simon selling his catalogue to Sony for $250 million, Stevie Nicks selling a majority stake for $80 million to independent operator Primary Wave, and Bob Dylan selling his 600+ song catalogue to Universal for a reported $300 million.

 


AI Compression Enters the Frame

NAB

https://amplify.nabshow.com/articles/ai-compression-enters-the-frame/

In a world where online video use is soaring and bandwidth remains at a premium, video compression is essential to keep the gears running smoothly.

But conventional techniques have reached the end of the line. The coding algorithm on which all major video compression schemes have been based for 30+ years has been refined and refined, but it is still based on the same original concept.

Even Versatile Video Coding (VVC), which MPEG is targeting at ‘next-gen’ immersive applications like 8K VR, is only an evolutionary step forward from HEVC, itself a descendant of H.261 from 1988.

What’s more, the physical capacity of the silicon chip is reaching its limit too. Codecs are at an evolutionary cul-de-sac. What we need is a new species.

AI codecs developed

As this article for RedShark News makes clear, the smarts of codec development are being trained on artificial intelligence, machine learning, and neural networks. They have the benefit of being software-based and therefore more suited for an environment in which applications will run on generic hardware or virtualised in the cloud.

Among companies with AI-based codecs is V-Nova. Its VC-6 codec, standardised as SMPTE ST 2117, can calculate bitrate at high speed to optimise bandwidth usage while maintaining an appropriate level of quality.

Nvidia’s Maxine system uses an AI to compress video for virtual collaborations like video conferencing.

Haivision offers Lightflow Encode which uses ML to analyse video content (per title or per scene), to determine the optimal bitrate ladder and encoding configuration for video.

This also uses a video quality metric called LQI, which represents how well the human visual system perceives video content at different bitrates and resolutions.
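Haivision doesn’t disclose the internals of Lightflow or LQI, but the per-title idea behind it can be sketched: measure bitrate/quality pairs for a given piece of content, then keep only the ladder rungs that deliver a meaningful quality step. The sample points and threshold below are invented for illustration and are not Lightflow’s algorithm.

```python
# Illustrative per-title bitrate-ladder pruning (not Lightflow's actual algorithm).
# Each tuple is (bitrate_kbps, quality_score) measured for one title; the numbers
# are made up for the example.
measurements = [(400, 55), (800, 68), (1500, 79), (2500, 84), (4500, 91), (8000, 93)]

def build_ladder(points, min_quality_gain=4):
    """Keep a rung only if it improves quality by at least min_quality_gain."""
    ladder, last_quality = [], float("-inf")
    for bitrate, quality in sorted(points):
        if quality - last_quality >= min_quality_gain:
            ladder.append((bitrate, quality))
            last_quality = quality
    return ladder

for bitrate, quality in build_ladder(measurements):
    print(f"{bitrate:>5} kbps -> quality {quality}")
# The 8000 kbps rung is dropped: nearly double the bitrate for only ~2 extra points.
```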

Perceptual quality rather than ‘broadcast quality’ is increasingly being used to rate video codecs and automate bitrate tuning. Metrics like VMAF (Video Multi-method Assessment Fusion) combine human vision modelling with machine learning and seek to understand how viewers perceive content when streamed on a laptop, connected TV or smartphone.

It originated at Netflix and is now open source.

Perceptual quality and VMAF

“VMAF can capture larger differences between codecs, as well as scaling artifacts, in a way that’s better correlated with perceptual quality,” Netflix explains. “It enables us to compare codecs in the regions which are truly relevant.”
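One way to try VMAF yourself is through FFmpeg’s libvmaf filter (available in builds compiled with libvmaf support). A minimal sketch, assuming local files `distorted.mp4` and `reference.mp4` with matching resolution and frame rate:

```python
# Minimal sketch: scoring an encode against its source with FFmpeg's libvmaf filter.
# Assumes an FFmpeg build that includes libvmaf and two aligned local files.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "distorted.mp4",      # the encode under test (first input)
    "-i", "reference.mp4",      # the pristine source (second input)
    "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
    "-f", "null", "-",          # discard output; we only want the metric
]
subprocess.run(cmd, check=True)
print("Per-frame and pooled VMAF scores written to vmaf.json")
```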

iSize Technologies has developed an encoder to capitalise on the trend for perceptual quality metrics. Its bitrate saving and quality improvements are achieved by incorporating a proprietary deep perceptual optimisation and precoding technology as a pre-processing stage of a standard codec pipeline.

This ‘precoder’ stage enhances details of the areas of each frame that affect the perceptual quality score of the content after encoding and dials down details that are less important.

“Our perceptual optimisation algorithm seeks to understand what part of the picture triggers our eyes and what we don’t notice at all,” explains CEO Sergio Grce.

This not only keeps an organisation’s existing codec infrastructure and workflow unchanged but is claimed to save 30 to 50 percent on bitrate at a latency cost of just one frame – making it suitable for live as well as VOD.
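iSize’s precoder is proprietary, so the sketch below is only an illustrative stand-in for the general idea: spend detail where the eye is likely to look and smooth it where it isn’t, before handing the frame to a standard encoder. It uses OpenCV’s spectral-residual saliency model as the “importance” map; the kernel sizes and file names are assumptions.

```python
# Illustrative perceptual pre-filter (NOT iSize's precoder): blur low-saliency
# regions before encoding so the codec spends fewer bits on detail viewers are
# unlikely to notice. Requires opencv-contrib-python and numpy.
import cv2
import numpy as np

def perceptual_prefilter(frame_bgr):
    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, sal_map = saliency.computeSaliency(frame_bgr)        # values in [0, 1]
    if not ok:
        return frame_bgr
    weight = cv2.GaussianBlur(sal_map.astype(np.float32), (31, 31), 0)[..., None]
    blurred = cv2.GaussianBlur(frame_bgr, (9, 9), 0)
    # Keep detail where saliency is high, blend toward the blur where it is low.
    out = weight * frame_bgr.astype(np.float32) + (1 - weight) * blurred.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)

frame = cv2.imread("frame.png")                              # hypothetical input frame
cv2.imwrite("frame_prefiltered.png", perceptual_prefilter(frame))
```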

The company has tested its technology against AVC, HEVC and VVC with “substantial savings” in each case.

“Companies with planet-scale streaming services like YouTube and Netflix have started to talk about hitting the tech walls,” says Grce. “Their content is generating millions and millions of views but they cannot adopt a new codec or build new data centres fast enough to cope with such an increase in streaming demand.”

Using CNN

ML techniques, which have been used heavily in image recognition, will be key to meeting the growing demand for video streaming, according to Christian Timmerer, a co-founder of streaming technology company Bitmovin and a member of the ATHENA Christian Doppler Pilot Laboratory research project. The lab is currently preparing for large-scale testing of a convolutional neural network (CNN) integrated into production-style video coding solutions.

Timmerer’s team have proposed the use of CNNs to speed the encoding of ‘multiple representations’ of video. In layperson’s terms, videos are stored in versions or ‘representations’ of multiple sizes and qualities. The player, which is requesting the video content from the server on which it resides, chooses the most suitable representation based on whatever the network conditions are at the time.

In theory, this adds efficiency to the encoding and streaming process. In practice, however, the most common approach for delivering video over the internet – HTTP Adaptive Streaming – is constrained by the cost of encoding the same content at multiple quality levels.

“Fast multirate encoding approaches leveraging CNNs, we found, may offer the ability to speed the process by referencing information from previously encoded representations,” he explains. “Basing performance on the fastest, not the slowest element in the process.”
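The ATHENA/Bitmovin work isn’t reproduced here, but the core idea of fast multirate encoding can be sketched: fully analyse one “reference” rendition, then reuse its decisions (keyframe positions, scene cuts, and so on) to constrain the other renditions instead of analysing each from scratch. Everything below, including the helper functions and the rendition list, is hypothetical.

```python
# Hypothetical sketch of fast multirate encoding: analyse once, reuse for all
# renditions. The helper functions stand in for a real encoder integration.

RENDITIONS = [(1920, 1080, 6000), (1280, 720, 3000), (854, 480, 1200)]  # (w, h, kbps)

def analyse_reference(source):
    """Stand-in analysis pass: in reality this would come from the reference encode."""
    return {"keyframes": [0, 48, 96, 240], "scene_cuts": [48, 240]}

def encode_rendition(source, width, height, bitrate_kbps, hints):
    # A real implementation would pass the hints (forced keyframes, skipped
    # analysis) to the encoder; here we just report what would happen.
    print(f"Encoding {width}x{height} @ {bitrate_kbps} kbps, "
          f"reusing {len(hints['keyframes'])} keyframe positions from the reference")

hints = analyse_reference("mezzanine.mov")      # slowest step, done once
for width, height, bitrate in RENDITIONS:
    encode_rendition("mezzanine.mov", width, height, bitrate, hints)
```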

IP protection and standards

There’s a body looking to wrap a framework around these and future developments in media, as well as applications in other industries. MPAI – Moving Picture, Audio and Data Coding by Artificial Intelligence – was founded by MPEG co-founder Leonardo Chiariglione.

He blogs about the 1997 match between IBM’s Deep Blue and Garry Kasparov, which made headlines when the machine beat the man.

“As with IBM Deep Blue, old coding tools had a priori statistical knowledge modelled and hardwired in the tools, but in AI, knowledge is acquired by learning the statistics.

“This is the reason why AI tools are more promising than traditional data processing tools. For a new age you need new tools and a new organisation tuned to use those new tools.”


What Are Movies Now Anyway?

NAB

While Hollywood focuses on the business merits of a streaming and exhibition symbiosis, there’s less chatter about what constitutes cinema itself.

https://amplify.nabshow.com/articles/what-are-movies-now-anyway/

As we better understand the new screen culture taking shape, it looks like we may all lose in the long run.

In a brilliantly argued opinion piece for The New York Times, journalist and cultural critic Anthony Scott questions whether we will ever again enjoy the kind of freedom that “going to the movies” once meant.

While counting himself a cinephile, Scott is not on a nostalgia trip for the experience of seeing films in theatres. He is clear-eyed that Hollywood has always dominated western cinema and done so mostly with content that pleases rather than challenges.

“A dogmatic, winner-take-all techno-determinism, which sees streaming as the inevitable and perhaps welcome death of an old-fashioned, inefficient activity, is answered by an equally dogmatic sentimentality about the aesthetic and moral superiority of traditional moviegoing.”

What was so great about the movies we watched before the pandemic, Scott asks. First cable TV and then streaming services have made many movies, especially indie fare and “world cinema,” accessible.

“That in itself might be a problem. When everything is accessible — then nothing is special. Movies exist in the digital ether alongside myriad other forms of amusement and distraction, deprived of a sense of occasion.”

He admits that, like many of us, he was thrilled to go back to the cinema recently to watch F9, Black Widow or A Quiet Place Part II, but notes that our desire to return may be giving these films a critical pass. They all had decent but not stellar reviews. We just wanted the experience of enjoying them in the dark with other people, without being able to pause the stream, once more.

“After more than a year of subsisting on screening links, we found the critical zones of our cerebral cortices flooded with fan endorphins. Whether or not this was a good movie, it undoubtedly offered a good time at the movies, and as such a reminder of what we really cared about.”

As pointed out in a recent article on the coexistence of theatrical and streaming services, one reason streaming services and movie theaters are going to coexist for a long time is that the same companies hope to derive profit from both.

It’s in the interest of a decreasing number of mega-companies like Disney and Discovery-WarnerMedia to keep us watching their content universes at home and at the movies.

“Recent headlines provide fresh evidence that, at the corporate level, the boundaries between film, television and the internet are not so much blurry as obsolete,” says Scott.

Disney swallowing Fox; Warner Bros. and its corporate sibling HBO Max being unloaded by AT&T onto Discovery; Netflix, Apple and Amazon scoping out old studio real estate in Los Angeles; Amazon acquiring MGM.

“Tech companies are movie studios. Movie studios are TV networks. Television is the internet.”

There are creative upsides to this, Scott suggests. “Novels that once might have been squeezed into two hours or tamed for network or public television — “Normal People,” “The Queen’s Gambit” — can find a more organic, episodic scope. Filmmakers like Barry Jenkins (The Underground Railroad) and Luca Guadagnino (We Are Who We Are) can test their skills in extended, intricate narrative forms. Actors, especially women and people of color, can escape from the narrow typecasting that is among Hollywood’s most enduring and exasperating traditions.”

On the other hand, the glut of content may well turn out to be unsustainable. At some point household subscriptions will peak. In which case, how much are we willing to spend on ad hoc purchases — via the iTunes store or video-on-demand or “virtual cinema tickets” — on top of our monthly Netflix or HBO Max fees, Scott asks.

“Those banal household questions have large cultural implications. If we stick to the platforms and consume what’s convenient we risk circumscribing our taste and limit the range of our thought.”

He also addresses the idea that we’re all slaves to the industrial-tech-content megaplex. It’s not a unique idea, and most of us are willing just now to trade our time and our data privacy (via algorithms and searches) to the giant corporations serving us more Loki or Bond.

“The screen doesn’t care what we are looking at, as long as our eyes are engaged and our data can be harvested. Movies didn’t create this state of affairs, but they are part of the technology that enabled it. As art becomes content, content is transmuted into data, which it is your job, as a consumer, to give back to the companies that sold you access to the art.”

Scott has no answers to the questions he wants to provoke: Is the price for endless entertainment that of submission to a narrowing range of unimaginative content choices? Can critical curiosity be sustained in the face of corporate domination? How can audiences and artists take back control of the movies?

Do they/we even want to?

 

Our Industry’s on the Rebound… But From a Not-Awesome Reality

NAB

https://amplify.nabshow.com/articles/our-industrys-on-the-rebound-but-from-a-not-awesome-reality/

The bounce back is on. Following a torrid year for most parts of the global entertainment and media biz, research from consultants PwC (PricewaterhouseCoopers) confirms that the industry has regained its momentum, with revenues even outpacing the economy as a whole.

Leading the charge are verticals including music streaming, gaming and — perhaps surprisingly — virtual reality.

In its Global Entertainment & Media Outlook, PwC forecasts a 6.5% growth in total global E&M revenue this year and a further 6.7% rise in 2022 with global revenues reaching $2.6 trillion in 2025.

It’s not all good news, though. Since 2020 saw a titanic $80 billion drop in revenue, any growth is starting from a very low base, and the overall growth rate paints a less optimistic picture, averaging just 3.5% year-on-year to 2025.

Young Generations Prefer New Media Platforms

Traditional media is, not surprisingly, something that many younger consumers… don’t consume. They have little awareness of, or interest in, what was trendy and is still central for older generations. According to the report, media platforms designed for young consumers, or that enable lightly produced, authentic content, have boomed. Gaming is central to the youth movement and is becoming a significant driver of data consumption — in fact, it is on pace to be the fastest-growing content category in that regard, accounting for 6.1% of total data consumption globally by 2025, up from 4.7% in 2020.

In terms of consumer spending, traditional TV and home video take the largest share of total revenue, although the segment will contract at a -1.2% CAGR to 2025. Newspaper and consumer magazine revenue will also contract over the next five years, and VR earns distinction as the fastest-growing segment.

Its revenues surged by 31.7% in 2020 to $1.8 billion and are projected to sustain a CAGR of 30%+ over the next five years to reach a $6.9 billion business in 2025.

Musicians Are Seeking a Bigger Payback

Against the odds, and despite widespread predictions of doom, music has been one of the standout E&M performers in recent years, as streaming has finally gained critical mass. Revenues from live music slumped by 74.4% in 2020 and are expected to return to 2019 levels only in 2023. But between 2020 and 2025, the music sector as a whole is expected to grow at a 12.8% CAGR, fueled by rapid growth in both live performances and digital streaming, which will be a $29.3 billion business in 2025.

From Universes To The Metaverse

When entertainment companies create “universes” of content, they leave room for growth and sustained engagement. Disney is the past master, its Marvel universe spanning various Avengers movies and episodic shows. PwC points out that WarnerMedia’s HBO has a prequel to Game of Thrones in the works called House of the Dragon.

“If universes are providing proprietary advantages now, the longer-term future may lie in the metaverse — a more open, multi-brand environment built around consumers. The metaverse enables intellectual property owned by many different E&M companies to coexist on a single online platform.”

For example, DC Comics’ Batman can interact with Disney’s Captain America while Travis Scott performs.

“There are major opportunities to sell music and merchandise through games platforms: it’s all about meeting consumers where they are, rather than having them come and find you,” says PwC.

Don’t Write Off the Blockbuster

Last Christmas, WarnerMedia released Wonder Woman 1984 simultaneously on HBO Max and in movie theaters. It then announced it will launch all its 2021 movies in the same way. This strategy has caused worries among A-list talent, who are concerned about the impacts on their residuals, which used to be based on box office. Cinema saw a 70.4% collapse in revenues, according to the report.

“But we shouldn’t be too quick to write off the commercial viability of the expensive, large-scale, spectacular films that benefit most from the communal, big-screen experience,” the analyst says.

Animation, the production of which is less affected by anti-pandemic measures, has continued to exert especially strong drawing power at the box office. Demon Slayer, which debuted in October 2020, grossed $95 million in 10 days, the fastest a Japanese film had ever reached that milestone. In China, the comedy Hi Mom, released in February 2021, has already garnered $850 million in box-office revenues.

In India, longer, more lavish big-budget movies starring major Bollywood actors still aim for theatrical release. Elsewhere, examples in 2021 of the enduring appeal of blockbusters at the box office have included the strong US and global performance of Godzilla vs. Kong over the Easter period. No Time to Die is slated to launch in theaters in October 2021, and despite multiple delays, MGM/UA’s decision to wait for a cinema release should pay dividends.

Cinema will be the fastest-rising segment in the advertising category, notes PwC, although this is mostly driven by the rebound in 2021 coming from a very low base in 2020.

Werner Ballhaus, global entertainment and media industry leader at PwC Germany, commented in an accompanying press release, “Whether it’s box office revenues shifting to streaming platforms, content moving to mobile devices, or the increasingly complex relationships among content creators, producers and distributors, the dynamics and power within the industry continue to shift.

“Even in the areas that offer the most compelling topline growth — like video streaming — the nature of competition is likely to change dramatically over the coming years. And all the while, the social, political, and regulatory context in which all companies operate continues to evolve in unpredictable ways. All of which means that sitting still, relying on the strategies that created value and locked up market share in the past, will not be the most effective posture going forward.”

 


Research: Election Elevated TV News Revenue

NAB

The 2020 election, with its hardcore battle lines, was a boon for TV news, with local TV stations in particular either on par with or outpacing cable and network TV.

https://amplify.nabshow.com/articles/research-election-elevated-tv-news-revenue/

In its annual health check of news media, Pew Research Center found the audience for local TV news increased across evening and late-night time slots. Financially, local TV companies generated more revenue in 2020 than in 2019, consistent with a cyclical pattern in which revenue rises in election years and falls in non-election years.

Per the report: In 2020, viewing for local network affiliate news stations (ABC, CBS, Fox and NBC) increased in two key time slots — evening (4 p.m. to 7 p.m.) and late night (11 p.m. to 2 a.m.) — according to Comscore StationView Essentials data. The average local TV audience for both of those periods increased by 4%, though the average audience for the morning news time slot (6 a.m. to 9 a.m.) decreased 4% in 2020.

Local TV over-the-air advertising revenue grew 8% last year to $18.4 billion, according to analysis from the Pew Research Center.

Revenue for the 839 local TV stations defined as “news-producing stations” (stations that have a news director and are viable, commercial and are English-language affiliates in the U.S.) was $15.3 billion, according to the BIA Advisory Services database.

The Center also notes that five major publicly held local TV station companies — Gray, Nexstar, Scripps, Sinclair and Tegna — report political advertising revenue separately from other types of revenues in their official filings. In 2020, these companies reported a total of $2 billion in political ad revenue, compared with $1.2 billion in 2018 and $843 million in 2016, the two most recent election years.

The main three cable news channels (CNN, Fox News and MSNBC) also fared well with ratings and revenue increases. The prime-time news audience on Fox News soared 61% to about 3.08 million; CNN’s audience leapt from 1.05 million in 2019 to 1.80 million in 2020, a 72% increase. MSNBC’s audience jumped 28% in 2020.

Total revenue for the three major cable news channels increased 3%-5% (to $1.7 billion for CNN, $2.9 billion for Fox News and $1.1 billion for MSNBC), according to estimates from Kagan media research group.

Since they spent modestly (CNN’s expenses remained largely unchanged in 2020, Fox News saw its expenses decline 4% and MSNBC’s expenses increased 3%), they each grew profit by roughly 6%-7% from the previous year.

Newsmax, a smaller cable news channel that gained prominence during the election, lost $1.5 million despite making $26 million in advertising revenue, because in its first year of operation it spent $28 million on its newsroom.

Network TV news — appointment viewing for many Americans — saw its audience increase somewhat across the three major networks in 2020.

“Average audiences for the network TV Sunday morning political talk shows on ABC, CBS, Fox and NBC — This Week, Face the Nation, Fox News Sunday and Meet the Press, respectively — benefited from the election season, each increasing by about 20% in 2020,” finds the survey.

The average audience for the four newsmagazine shows aired by the networks — ABC’s 20/20, CBS’s 60 Minutes and 48 Hours, and NBC’s Dateline — increased for each network about 10% in 2020, following a stable year for CBS and NBC in 2019 and a 21% increase for ABC in 2019.

 

All the Ways the Olympics Will (and Won’t) Be Immersive

NAB

The IOC and its major rights holders like NBC and Discovery — which have paid billions for the privilege — will be hoping to immerse viewers in wall-to-wall Olympic action these next two weeks. The broadcast coverage of the competition is critical to these organizations and more so this year without the color of fans cheering at the venues.

https://amplify.nabshow.com/articles/all-the-ways-the-olympics-will-and-wont-be-immersive/

OBS has prepared well though, arming itself with a fleet of innovations to enhance coverage intended to attract Gen Z. On the face of it this could be the most immersive Games yet. Let’s look at a few of the ways it plans to do this.

UHD HDR

First and foremost, the host feed is being delivered in 4K Ultra HD and HDR for the first time. There’s no doubt this upgrade in picture quality will enhance the viewing experience. Better color, better contrast, better pixels. Quite how many viewers will see this, though, is open to question. NBCU, for example, is distributing some UHD feeds, but these will be upconverted from HD. The BBC isn’t taking UHD either, declaring it too expensive to backhaul from Tokyo. Discovery is showing UHD, but on one channel accessed by viewers in Europe via Discovery+.

UHD Audio

UHD also means an audio upgrade to a standard 5.1.4 configuration “to enable viewers to have a more realistic audio experience” by adding overhead mics. But with no spectators in the venue, this immersive experience may not be one OBS wants to shout about. It could just be the sound of proverbial tumbleweed or shouts in an echo chamber. On the other hand, we may actually hear the coaching commands, outbursts from athletes and referee direction in ways we simply cannot when they are drowned out by crowds.

Multi-Camera Replay Systems

Between 60 and 80 robotic 4K cameras are at select venues, including those hosting gymnastics, athletics, BMX freestyle, street skateboard, sport climbing and volleyball. The feeds will be stitched together to create multi-cam replay clips — an effect OBS say is similar to the bullet-dodging sequences in The Matrix.

2D Image Tracking

Video tracking technology will help viewers keep track of the position of the athletes across sports, including marathon and race walks, road cycling and mountain biking, triathlon and canoe sprint. Instead of GPS positioning or wireless equipment, OBS’ 2D image tracking is based on image processing technology that allows motion tracking. A ‘patch’ (a square) is defined on selected video frames in order to identify each of the athletes/boats. The computer then creates a ‘label’ that is attached to each of the identified athletes/ boats and that will be maintained even as the image changes. This captured data is then made available to a graphics rendering platform for on-screen presentation. Additional data captured using more traditional GPS positioning can be combined with the ‘labels’ to identify athletes, their speed, distance to finish or relative position to the leader.
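OBS hasn’t documented its tracker, but patch-based tracking of the kind described can be prototyped with OpenCV: draw a “patch” around an athlete on one frame, then let a tracker keep the label attached as the image changes. A minimal sketch, assuming a local clip named race.mp4 and an OpenCV build with the contrib trackers (in some builds the tracker lives under cv2.legacy):

```python
# Illustrative patch-based tracking (not OBS's system): a box is drawn around an
# athlete on one frame, then a CSRT tracker keeps the "label" attached to them.
import cv2

cap = cv2.VideoCapture("race.mp4")               # hypothetical venue feed
ok, frame = cap.read()
patch = cv2.selectROI("select athlete", frame)   # operator defines the patch
tracker = cv2.TrackerCSRT_create()
tracker.init(frame, patch)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, box = tracker.update(frame)
    if found:
        x, y, w, h = map(int, box)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, "ATHLETE 1", (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break
```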

Biometric Data

Biometric data will be captured just for the archery contest, made possible by the athletes’ stillness. Four cameras will be trained on their faces to analyze any slight changes of skin color generated by the contraction of blood vessels. Through an on-screen graphic, audiences will be able to witness the heartbeat variations and adrenaline rush experienced by the archers’ bodies as they shoot their arrows.
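The archery system’s internals aren’t public, but what it describes is essentially remote photoplethysmography: the pulse produces tiny periodic colour changes in facial skin, so the dominant frequency of the averaged green channel over a face crop approximates heart rate. A rough illustrative sketch, assuming a steady, well-lit face crop and a known frame rate (the synthetic test signal at the end is obviously not real footage):

```python
# Illustrative remote heart-rate estimate (rPPG), not the actual OBS system.
# Assumes face_frames is a list of BGR face crops sampled at a fixed frame rate.
import numpy as np

def estimate_bpm(face_frames, fps):
    # Average green-channel brightness per frame: the pulse modulates this slightly.
    signal = np.array([frame[:, :, 1].mean() for frame in face_frames])
    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to plausible human heart rates (40-180 beats per minute).
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60

# Quick check with synthetic data: a 72 bpm signal sampled at 50 fps for 10 seconds.
fps, bpm = 50, 72
t = np.arange(0, 10, 1 / fps)
frames = [np.full((8, 8, 3), 128 + 2 * np.sin(2 * np.pi * (bpm / 60) * ti), dtype=np.float32)
          for ti in t]
print(f"Estimated heart rate: {estimate_bpm(frames, fps):.0f} bpm")  # ~72
```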

360-Degree Replays

Intel’s True View technology will come into play during basketball matches. Thirty-five 4K cameras are mounted at the concourse level of the Saitama Super Arena to capture volumetric video that, once processed, renders 360° replays, bird’s eye views and freeze frames from any perspective on the court. OBS will produce up to 10 True View clips for every basketball game.

Virtual Reality

OBS plans 110 hours of live immersive 180-degree stereoscopic and 360-degree panoramic coverage from the Opening and Closing Ceremonies, as well as from select sports like beach volleyball and gymnastics — sports chosen based on the possibility of getting cameras closest to the athletes.

Discovery subscribers will be able to view the VR coverage, as will users of the NBC Olympics VR by Xfinity app, which includes watch parties for Oculus friends.

12K Video Wall

A 50-meter wide screen will broadcast 12K resolution footage of the sailing events that spectators have traditionally watched from nearby piers with binoculars. Floating on the water of the Enoshima Yacht Harbor, the screen will give spectators the sensation of the races being held right in front of their eyes. If there were any spectators…

5G and AR Glasses

The total absence of spectators, including VIP decision makers, is a blow for Intel, Alibaba and other tech sponsors which have created demos of immersive experiences for people to enjoy at the venues. These include wearable glasses at the swimming venue delivering AR graphics over 5G, and video replays available to golf fans at the Kasumigaseki Country Club, also over 5G.

5G won’t be a complete washout though. Footage from the Opening and Closing Ceremonies will be contributed over 5G in a test for future Olympic events.  

Mobile First

Digital publishers can draw on a repository of up to 9,000 clips and short-form assets called Content+. This includes behind-the-scenes content from the venues purposefully filmed with smartphones. Artificial Intelligence is also being trialed as a means to automate the logging and clipping of all this video, and hence distribute that medal-winning celebration to social media in an instant.

Olympian Effort: Tokyo 2020 Host Production Upgrades To UHD, IP And Cloud

BroadcastBridge

Olympic Broadcasting Services (OBS), the host broadcaster for the Games, was founded twenty years ago and has arguably gone through its hardest and most intense period of digital transformation for Tokyo 2020.

https://www.thebroadcastbridge.com/content/entry/17189/olympian-effort-tokyo-2020-host-production-upgrades-to-uhd-ip-and-cloud

What it does matters hugely to the wider Olympic movement, since rights-holding broadcasters generate a massive 75% of revenues for the IOC.

It also matters to the broadcast industry at large, since it is the highest-profile showcase of what is possible with technology for the production and distribution of live events at scale.

For Tokyo 2020, during which it plans to output 9,500 hours of content, 4,000 hours of which are live, OBS is leading the charge into 4K UHD, IP and cloud. Its on-site production footprint in Tokyo will be 30 per cent smaller than at Rio 2016, while content production will be up by about 30 per cent. That’s only possible because of cloud, global fiber and IP links, and OBS’ digital repository of content called Content+, which allows more broadcasters to stay at home and work remotely.

There are other service innovations too, including extensive tests of AI and 5G workflows, which are planned for wider introduction in six months’ time at the Beijing Winter Games.

We’re going to concentrate on the main uplifts – UHD, Cloud and IP.

UHD
Almost all of the content captured will be produced natively in UHD HDR. OBS will deliver the UHD HDR feeds to rights holders, while simultaneously ensuring content delivery in HD 1080i SDR. To do this, it has created a single HDR/SDR production workflow model that will allow the OB trucks to generate an HD 1080i SDR output via conversion from the primary UHD HDR signal.

A new, fully IP infrastructure has been built to support the transport of signals for the contribution network. OBS’ Venue Technical Operations team has developed a set of LUTs in-house to maximise quality across all cross-conversions. By capturing content natively in UHD HDR (or up-converting it to UHD HDR) and then down-converting, OBS claim the final HD 1080i signal delivered to the RHBs will offer higher quality across all platforms than a standard HD production.
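OBS has not published its LUTs, but the idea of deriving the SDR output from a single UHD HDR master can be illustrated with a simple look-up table remap. Everything below is an assumption for illustration: real broadcast conversion LUTs are typically 3D cubes and far more carefully tuned.

```python
# Illustrative single-master down-conversion: the UHD HDR signal is the source
# of truth and the HD SDR output is derived from it through a LUT.
# Real broadcast LUTs are 3D (RGB cubes); a 1D per-channel table is used here
# only to show the mechanism.
import numpy as np

def apply_1d_lut(hdr_image, lut):
    """hdr_image: float array in [0, 1]; lut: 1D array of output values."""
    idx = np.linspace(0.0, 1.0, lut.size)        # input code values the LUT samples
    return np.interp(hdr_image, idx, lut)        # per-pixel remap

# A made-up 33-point "HDR to SDR" curve that gently compresses highlights.
hdr_to_sdr_lut = np.linspace(0.0, 1.0, 33) ** 1.2

uhd_hdr_frame = np.random.rand(2160, 3840, 3)    # stand-in for a captured UHD frame
hd_sdr_frame = apply_1d_lut(uhd_hdr_frame, hdr_to_sdr_lut)
# Down-scaling to 1920x1080 and interlacing would follow in the real chain.
```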

OBS has made substantial changes before, notably the transition from SD to HD at Beijing 2008. The change now is similar in terms of breadth, but the challenge is greater because the step from HD to UHD is much more demanding than the step from SD to HD. There is also the additional element of the transition from SDR to HDR and WCG.

“UHD technology has reached a maturity level such that we are all confident that it is ready for Tokyo 2020,” says OBS Chief Technology Officer, Sotiris Salamouris. “Technologically, it is a big step and made even more substantial because of the size of the broadcast operation at an Olympics. We need to support more than 40 venues – this means a comparable number of available production units. There are more than 50 outside broadcast vans and fly-pack systems that we need to cover the Games, and all of them need to transition to UHD and HDR/WCG. It is a tremendous change and requires a lot of attention to how it is being engineered.

“Everyone in the industry knows that it has been coming, and while they may have expected it to take place over a number of years, the reality is that it has recently been speeding up, especially in the world of sports. Most of the equipment that is reaching the broadcast market now is UHD ready and works on UHD workflows. However, this is not something that you can take lightly, especially in our own production environment, since there are so many moving parts that need to be brought together.

“We understand that there is a lot of attention on us and on what we are doing in Tokyo. It has a major impact on the broadcast industry because if we weren’t to introduce it for the Olympics, it would probably take even longer to be widely adopted.”

In terms of tech specs: the SMPTE 292 standard is used for production of the 1080i/59.94 HD-SDI signal, while the UHD production adheres to the SMPTE 2036-1 standard and follows the 59.94 Hz specification with a 5.1.4 audio configuration. The HDR standard is Hybrid Log-Gamma (HLG).
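For reference, the HLG transfer characteristic defined in ITU-R BT.2100 uses a square-root segment for darker scene light and a logarithmic segment for highlights. A small sketch of the OETF:

```python
# HLG opto-electrical transfer function (OETF) as specified in ITU-R BT.2100.
# E is normalised scene linear light in [0, 1]; the result is the signal value E'.
import numpy as np

A = 0.17883277
B = 1.0 - 4.0 * A                 # 0.28466892
C = 0.5 - A * np.log(4.0 * A)     # ~0.55991073

def hlg_oetf(E):
    E = np.asarray(E, dtype=float)
    log_arg = np.maximum(12.0 * E - B, 1e-12)   # guard the log for very dark values
    return np.where(E <= 1.0 / 12.0,
                    np.sqrt(3.0 * E),
                    A * np.log(log_arg) + C)

# Example: 18% grey scene light maps to a signal value of roughly 0.67.
print(hlg_oetf(0.18))
```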

What OBS are not saying is how many of its broadcasters are taking the UHD feed natively. NBC, for example, is not, choosing instead to upconvert from HD for select affiliates. The BBC planned to stream a feed in 4K, presumably over iPlayer as it has done in the past, but having to run all of its production remotely from the UK has pushed up bandwidth costs, so again all its output is HD.

IP
Migrating to IP has been on the cards for a while. What is happening now is actually the last part of the transition to IP, with reliable IP-enabled solutions for live transmission. For Tokyo 2020, all OBS’ UHD overlay design, on top of its HD broadcast systems, is based on IP, with IP being used for all UHD contribution and distribution.

“It was even more demanding as we implemented a full ST-2110 platform to carry, route and distribute UHD content, with its extremely high bandwidth requirements,” says Salamouris. “But for us, it was a step that made sense. In our environment and with our own very extensive and complex requirements, when it comes to signal contribution, processing and then distribution, IP, and in particular ST-2110, was probably the only technology that could scale with all our needs. Using IP offers us more flexibility and a far higher scalability than the legacy technological stack.”
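To put the bandwidth in perspective, a rough back-of-the-envelope estimate for a single uncompressed UHD stream of the kind ST 2110-20 carries (active video only, ignoring packet overhead) comes out near 10 Gb/s:

```python
# Rough bandwidth estimate for one uncompressed 2160p59.94 10-bit 4:2:2 stream,
# roughly what SMPTE ST 2110-20 carries (active pixels only, overhead ignored).
width, height = 3840, 2160
frame_rate = 60000 / 1001            # 59.94 Hz
bits_per_pixel = 20                  # 10-bit 4:2:2: Y every pixel, Cb/Cr every other

bits_per_second = width * height * bits_per_pixel * frame_rate
print(f"{bits_per_second / 1e9:.1f} Gb/s per UHD feed")   # ~9.9 Gb/s
```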

Cloud
In collaboration with Alibaba, OBS has created a suite of cloud services specifically designed for data-heavy broadcast workflows. It says this allows broadcasters to virtualise a large part of their broadcast systems and network platforms in their own private cloud installations, integrated with Alibaba Cloud technology.

“With the launch of OBS Cloud, OBS can accommodate tailored, fully-fledged cloud-based front and back-end solutions for the RHBs to help them more easily set up all or part of their processes in the Cloud,” says Isidoro Moreno, Head of Engineering, OBS. “For broadcasters, this is a dramatic inflection point in the cost structure of their on-site production as they reduce up-front investments. Also, they can significantly keep their set-up time to the minimum and have their equipment all prepared for their Olympic coverage before even setting foot in the host city.”

Salamouris elaborates, “Cloud technology is already adequate for several of our demanding post-production workflows. Not only that, cloud may be the best option available to address some of the unique challenges that we are facing when trying to implement complex workflows, especially in an unforgiving environment such as the broadcast of large sports events where the most scarce resource is actually time.”

He argues that cloud will benefit OBS’ mission in generating more content, covering more hours and distributing an increasing volume of additional footage.

“As a result, the size and complexity of the broadcast systems that we have to install and operate locally in our facilities in the host city(ies), both in the IBC and in the venues, has continued increasing. The time available, however, that we have to build our technology systems in the host city(ies) is on average eight to 10 weeks, and that isn’t going to increase. So, though we have an expectation to continually increase our production output, the timespan available to build all the necessary infrastructure will always be the same, or may even have to be further reduced in the future.”

OBS can see a hard limit approaching if this trend continues, and this is precisely where cloud technology helps. “It provides us with the opportunity to implement our systems much earlier and without any dependency with the local infrastructure in the actual Olympic venues, including the IBC,” he says. “You can build systems on the cloud, test them properly, switch them around and do all your preparation well in advance, all before setting foot in the host city(ies).

“Then you can fire it up, just before the Games, with all the systems already configured, tested and ready for operation. So now that you can disassociate yourself from being local in the venue or the IBC and being able to operate off-site in the cloud, it means that you can continue increasing the size, the complexity of your systems, and consequently, the volume of your output, without further increasing your infrastructure in the host city(ies).”

Nonetheless, it’s not clear which broadcaster is taking advantage of OBS Cloud for Tokyo 2020.

“In terms of broadcasting, it is still relatively early days in the full change to cloud technology, and Tokyo 2020 will mark a first step,” admits CEO Yiannis Exarchos. “The Beijing 2022 Winter Olympics may then become a facilitator for its wider use.”

AI
For Tokyo 2020, OBS intends to leverage AI-led solutions in some of its broadcast workflows, as a way of testing how they might be included in future operations. OBS will run an Automatic Media Description pilot project based on athlete recognition, conducted on a select number of specifically chosen sports.

OBS will combine existing metadata, such as the Broadcast Data Feed and video logs, with image recognition based on athlete bib numbers. Additionally, it will use speech-to-text technology to complement and improve the tagging of media assets. Such applications will allow a faster and more efficient turnaround of workflows such as image selection, automatic searching and clipping. By Beijing 2022, OBS is aiming to expand this process to as many sports as possible, make the most of AI-driven tools in its internal workflows and open the service to RHBs.
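As a loose illustration of how recognised bib numbers, start-list metadata and speech-to-text output could be merged into searchable tags, consider the following sketch. The data structures, values and helper logic are hypothetical, not OBS’ actual pipeline:

```python
# Hypothetical sketch of automatic media description: merge bib-number
# recognition, start-list metadata and speech-to-text into clip tags.
# All structures and values here are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class ClipMetadata:
    timecode: str
    bib_numbers: list          # from image recognition on athlete bibs
    transcript: str            # from speech-to-text on the commentary track
    tags: list = field(default_factory=list)

# Stand-in for the Broadcast Data Feed: bib number -> athlete name.
start_list = {14: "A. Athlete", 27: "B. Runner"}

def tag_clip(clip: ClipMetadata) -> ClipMetadata:
    # Attach athlete names matched from recognised bib numbers.
    for bib in clip.bib_numbers:
        name = start_list.get(bib)
        if name:
            clip.tags.append(name)
    # Very crude keyword tagging from the commentary transcript.
    if "finish" in clip.transcript.lower():
        clip.tags.append("finish-line")
    return clip

clip = tag_clip(ClipMetadata("10:42:13:05", [14], "And that is the finish!"))
print(clip.tags)   # ['A. Athlete', 'finish-line']
```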

“Ultimately, OBS is trying to develop applications that can use this enriched data to create automatic summaries and create pattern detections,” says Guillermo Jiménez, Director of Broadcast Engineering. “Data generated through AI-based solutions can be used post Games to analyse production to help improve the predicted content for each user. Combined with the IOC’s Sports Data Project, AI can provide insights into the expected performance of athletes and comparisons with previous Games and other major events.”

Catering For Digital
With the change in viewing patterns in the last decade has come a shift to digital in the approach taken by OBS towards content production and delivery. Traditional linear TV is no longer the sole way to watch live content and enjoy all the action from the Games.

An OBS service designed primarily for digital is Content+, an online resource offering short-form content from across the Games that can be easily shared across all platforms. OBS will deploy dedicated crews in Tokyo to generate behind-the-scenes content from the competition venues, the Olympic Village and around the city. It will also generate content with smartphones, providing short video clips from back-of-house athlete areas that will be available to broadcasters’ social media teams.

Overall, between 7,000 and 9,000 clips are expected to be produced to help enhance and supplement RHB coverage. Additionally, OBS plans to produce 1,800 fast-turnaround clips from all sports, offering broadcasters access to highlights content.

Main Feed Production
OBS will use a total of 1,049 camera systems for the broadcast of the Tokyo 2020 Games, of which more than 210 are slow-motion systems and 11 are four-point cablecam systems. Aside from the main international feed, at some venues it will produce Multi Clip Feeds (MCFs), which run simultaneously with the main coverage and offer unseen angles from point-of-view cameras, super slow-motion cameras and other specialty systems. These help broadcasters tailor their own programming with enhanced analysis. There will be 75 multilateral feeds and 28 MCFs coming from the venues in Tokyo, with 68 multilaterals distributed in UHD.

It will also be employing Intel’s True View system in basketball, which uses virtual cameras for immersive replays. In Tokyo, a total of 35 4K cameras will be mounted at the concourse level of the Saitama Super Arena to capture volumetric video that, once processed, renders 360° replays, bird’s eye views and freeze frames from any perspective on the court.

“With 3D Athlete Tracking technology, we will be able to reveal never-before seen insights into athletes' velocity and acceleration, and how they perform against each other,” says Mark Wallace, OBS Chief Content Officer. “The technology can convert that data into visual overlays which can be broadcast over replays, providing commentators with a great tool for analysis and further fan engagement. There is a unique phenomenon with the television audience for an Olympic Games. Many are what I would call the Olympic viewer. They are someone who doesn’t usually watch a lot of sport and they decide they want to watch the Olympics because of the storytelling, narrative and personalities. Our job is to entertain, inform and educate the viewer, engage them and keep them interested in sport and the story.”