Monday, 1 November 2021

The World-Building VFX of “Foundation”

NAB

Quantum math, Apple design, Polynesian tribal art and magnetic sand combine with freezing location shoots. The intergalactic, head-spinning themes of Apple TV+’s Foundation required world-building on a planetary scale. Here is some of it.

https://amplify.nabshow.com/articles/the-world-building-vfx-of-foundation/

“But if you don't have anything to point to at the beginning it becomes daunting to develop the visual language,” Goyer explained to 3D World. “We kept coming back to: ‘What's something we haven't seen before?’”

That was thrown over to concept artist Stephan Martinière.

“As more and more sci-fi films are being made, creating a unique visual signature becomes harder, but there are still plenty of interesting ideas for coming up with unique visuals,” he says. “One proposal was to take Apple's sleek design and try to carry it into some of the spaceship looks. Another was to give the Anacreons a very distinctive tribal look loosely based on Polynesian designs. There was also a very specific architectural direction for some of the environments.”

Tribal-looking spaceships

Goyer’s brief was to describe the emotional effect he wanted a scene or object to achieve. “The Empire is aggressive and male so I wanted their ships to be like knives, which meant that they weren't just folding space but ripping it,” he says.

Martinière said the early concepts for the ships were cool but too modern and did not ultimately fit the storyline of an ancient and feudal society. Incorporating less sophisticated tribal designs into their technology helped give the Anacreons “a unique visual signature but also established the right narrative”. The front part of the ship looks like two ancient carved wood shields.

“Does it need to look like it can work? I would say I have seen hundreds of different designs for spaceships and I don’t think anyone worries about that. Even Transformers make you believe the impossible.”

It’s different for a costume or a weapon. The crossbow is a good example. Martinière had to think of a design that could be functional if the weapon was going to be a handheld practical prop. Even then it still needed to be designed so the mechanical parts would work. The simpler solution, he says, was to make “a cool shape and have it fire a laser beam that works too.”

As the VFX shot count went from 1,500 to 3,900 across the ten episodes, an additional 19 facilities had to be brought in to support lead vendors DNEG, Important Looking Pirates and Outpost VFX.

However, Goyer was determined to film as much on location as possible – about 60% of the production in the end, he estimates. This included principal photography at Troy Studios in Limerick (Ireland), and in Germany, Iceland, Malta, Fuerteventura and Tenerife.

"The visual effects had to be as naturalistic and photoreal as possible,” he explained. “I want the show to feel like a Terrence Malick movie and for the actors to experience as much as they can in reality.  I wanted each country to represent a different world and for the actors to feel cold.”

Physical sets and Sandograms

Physical sets included the 'Aircar'. "We got a dune buggy from Germany and brought it back to Limerick,” states Conor Dennison, one of three supervising art directors on the project. “We cut it down the middle, stretched the whole thing out for an extra 10 to 15 feet, put in a new roll cage, new hydraulics, and built an Aircar sitting on top of it. One driver was facing forward and the other facing backwards underneath looking at the actor overhead. When they were going to the left the pneumatics were set up in such a way that it would go to the right, so the actor would go the right way.”

Holograms in Foundation take the form of 'Sandograms'. "The majority of our holograms are meant to be solid particles that coalesce into whatever the hologram was," notes Chris MacLean, production VFX supervisor. "It worked extremely well with static objects and a 2.5D approach developed with DNEG.”

Mural of Souls

Displaying the history of the Empire in the Imperial Palace is the Mural of Souls, made of moving colour pigments. The initial approach was to put acrylic ink and ferrofluid in a pan and run a magnet underneath it; filmed at high speed, the result looked cool, but keeping the mural wet all of the time would have been impractical.

"Then we came up with the samsara, where the Tibetan monks make mandalas out of sand and wipe them away,” explains MacLean. “What if we take that and turn it up to 11? Take the magnets from the ferrofluid and have the sand be magnetic. The magnetic sand stays on the wall, twirls and makes these crazy images."

Simulations were placed on top of the physical mural created by the art department. "There was depth given to the various key features on the mural so depending on what was actually there, there was a custom particle layout, motion paths and noise fields," continues VFX supervisor Michael Enriquez. "It was a lot of back-and-forth testing, and once we got it to work, the effect went throughout the entire shot."
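Neither the show nor DNEG has published the actual setup, but the ingredients Enriquez lists (a particle layout, motion paths and noise fields) are simple enough to sketch. Here is a toy version in Python; every name and value is illustrative, not production code:

```python
import numpy as np

# Toy "sandogram" update step: particles seeded in a layout are advected
# along a motion path and perturbed by a noise field each frame.
rng = np.random.default_rng(7)
pos = rng.uniform(0.0, 1.0, size=(5000, 2))       # particle layout on the mural

def motion_path(p):
    """Swirl particles around the mural centre (stand-in for artist-drawn paths)."""
    d = p - 0.5
    return np.stack([-d[:, 1], d[:, 0]], axis=1)  # perpendicular vector = rotation

def noise_field(p, t):
    """Cheap analytic noise; a production tool would use curl or Perlin noise."""
    return 0.15 * np.stack([np.sin(9.0 * p[:, 1] + t),
                            np.cos(9.0 * p[:, 0] - t)], axis=1)

dt = 1.0 / 24.0                                   # one frame at 24fps
for frame in range(240):                          # ten seconds of animation
    vel = motion_path(pos) + noise_field(pos, frame * dt)
    pos = (pos + dt * vel) % 1.0                  # wrap to stay on the wall
```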

Prime Radiant

The device known as the Prime Radiant displays the lifework of revolutionary mathematician Hari Seldon (Jared Harris).

“We know that Hari Seldon and Gaal Dornick [Lou Llobell] are the only people that can understand this math, but we're so far into the future I don't want to see Arabic numbers," remarks Goyer. "I also want it to be beautiful and spiritual. When Gaal and Hari look at the math it's almost like they are communicating with angels or God."

The solution was found by Chris Bahry, co-founder of Toronto-based studio Tendril. According to MacLean, Bahry does quantum math in his spare time: "He came up with something that I hope becomes the ultimate sci-fi MacGuffin.”


Virtual Production 101: What You Need to Know

NAB

https://amplify.nabshow.com/articles/virtual-production-101-what-you-need-to-know/

It’s less a question of if you’ll ever find yourself shooting in an LED volume than when. With the technology still at the cutting edge, there are some essentials to consider before budgeting, tech provisioning and filming with virtual production.

Virtual production with LED walls would have become popular on its own, without the pandemic. But the operational impact of this technology, which dramatically reduces the crew footprint on set and can eliminate location work and travel, cannot be overstated.

Major productions that would have been shot in real-world locations or on green screens have been reconfigured to be partially or entirely shot on LED volumes instead. These include Star Trek: Discovery; Star Trek: Strange New Worlds; Netflix’s 1899; Thor: Love and Thunder, and Bullet Train.

Frame.io (newly acquired by Adobe) has an enlightening set of tips and tricks for newcomers to virtual production which I’m going to précis here.

1 Fix it in Pre

To make an LED volume perform its best, the lion’s share of visual development occurs in pre-production. That’s a reversal of the recent norm, where issues on set were fixed in post.

On a virtual production, schedules for films are pre-loaded with more pre-production time and a less extensive post period.

“Many seasoned filmmakers aren’t accustomed to the idea of making every decision in terms of effects imagery before production occurs and may find the process counterintuitive,” says Noah Kadner, the virtual production editor at American Cinematographer and author of the Frame.io guide.

Assets such as models, characters, 3D environments, etc., must be completely camera-ready before production starts. Along the way, this also means a lot more iteration and visual development can occur. Indeed, the Virtual Art Department, previsualization, and virtual scouting are all vital parts of the LED volume pre-production workflow.

“In many ways, the production day becomes about executing a carefully validated plan instead of best guess shots in the dark, as non-virtual productions often seem.”

2 New Production Roles

A Virtual Production Supervisor acts as the liaison between the physical production team, the Art Department, the VFX team, and the “brain bar” (ILM’s term for its Volume Control Team).

Frame.io suggests the VPS combines the roles of VFX Supervisor and Art Director. Their responsibilities include overseeing the Virtual Art Department (another acronym to juggle – VAD) during pre-production and supervising the LED volume during production.

“The VAD is where all elements that ultimately wind up on the LED walls are designed and created. This area encompasses a traditional art department, with an emphasis on digital assets. The VAD is constantly creating objects which may be digital models, real-world props, or both.”

Clearly, understanding what the VPS and the VAD do in a virtual production is essential.

3 Avoid looking like a video game

Photorealism is the aim nine times out of ten, but the risk of the virtual environment looking like a video game is real. Photogrammetry is the go-to technique: a method of measuring physical objects and environments by analyzing photographic data, from which 3D assets are constructed.

Frame.io name-checks a few useful photogrammetry tools, such as the ML/AI-driven software RealityCapture.

“The effort needed to create a photorealistic 3D asset from photogrammetry is often far less than making the equivalent from scratch digitally.”

4 Get the most powerful system you can afford

The more GPU power in your system, the greater the level of detail in an environment you can have on your LED wall in real-time. It’s not something you should have to worry about: A quality integrator can ensure you have a system that performs well and doesn’t blow its fans nonstop. Many of the key components and plugins for virtual production, such as camera tracking and LED panel support, are only available on Windows, and you may need multiple PCs if the volume has multiple surfaces.

5 LED panels trade quality for cost

Pixel pitch is the distance between individual LED lights on the screen and is measured in millimeters. Because you’re re-photographing the screen, the pixel pitch directly correlates to how the image looks. If it’s not dense enough, the image can look low resolution. Or even worse, you may see moiré (when the camera image sensor conflicts with the pattern created by the spaces between the LED lights on the screen).

The higher the pitch is, the more likely moiré will appear when the camera focuses close to or onto the screen.

For reference, the pixel pitch of the LED panels used on The Mandalorian is 2.8mm. But that screen is also approximately 20 feet tall by 70 feet across, so the camera can be much further away and is less likely to focus on the screens. If you are working in a smaller volume, this can become even more of an issue.

Panels are now available at 1.5mm and even denser pitches, which can mitigate or eliminate moiré. The trade-off is that the lower you go, the more expensive the screens become.

“So, there’s ultimately a perfect storm to consider which factors in pixel pitch, camera-to-screen distance, focal length, focus plane, camera sensor size, and content resolution to determine whether your footage shows a moiré pattern or not.”
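That “perfect storm” can at least be roughed out with a thin-lens calculation. The following back-of-envelope is mine, not Frame.io’s: it projects the LED pitch onto the camera sensor and compares it with the sensor’s own sampling pitch, since moiré is worst when the two grids land at similar spatial frequencies. All numbers are illustrative.

```python
def projected_pitch_ratio(pitch_mm, distance_m, focal_mm,
                          sensor_width_mm=36.0, h_photosites=8192):
    """How big does one LED pixel appear on the sensor, in photosite units?

    Thin-lens magnification for an in-focus screen: m = f / (d - f).
    """
    m = focal_mm / (distance_m * 1000.0 - focal_mm)
    projected_mm = pitch_mm * m                  # LED pitch imaged on the sensor
    photosite_mm = sensor_width_mm / h_photosites
    return projected_mm, projected_mm / photosite_mm

# Mandalorian-flavored numbers: 2.8mm pitch, 50mm lens, camera 6m from the wall.
p, ratio = projected_pitch_ratio(pitch_mm=2.8, distance_m=6.0, focal_mm=50.0)
print(f"projected pitch {p * 1000:.1f} microns = {ratio:.1f}x photosite pitch")
# Risk peaks when the ratio falls toward ~1-2, i.e. the LED grid lands near
# the sensor's sampling (Nyquist) frequency; well above that the grid simply
# resolves as visible pixels, well below it blurs away.
```

In practice the focus plane does much of the heavy lifting: with the actors in focus and the wall well behind them, lens blur suppresses the grid before it can alias, which is why bigger volumes that keep the camera far from the screen are more forgiving.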

6 Need walls, a ceiling and a floor?

There’s a significant scale continuum from the simplest single-wall, rear-projection LED setup to the massive volumes used on The Mandalorian.

In general, the larger the volume, the more expensive it will be to rent or to purchase if building from scratch. So, it’s critical to determine how much LED volume you need.

“The choice you make in volume size and form also has a huge impact on interactive/emitted light. If, for example, you put actors/set pieces in front of a single, flat LED wall, your subjects will be dark silhouettes against the screen, like someone standing up in a movie theater. On the other hand, if you have LED sidewalls, ceilings, etc., you will have emissive lighting falling naturally on your subject.”

But even if you don’t need or can’t afford an enveloping volume, it’s still very possible to create interactive lighting in sync with the screen content. See below…

7 Using interactive lighting

Digital Multiplex or DMX is a protocol for controlling lights and other stage equipment. Specifically, you can use DMX lighting to turn multicolor movie lights into effects lights for LED volumes.

“You can program specific lighting cues and colors with DMX directly in Unreal Engine or via a lighting board. Or, through pixel mapping, you can set any light on your stage to mimic the color and intensity of a portion of your 3D scene. You can mimic passing car headlights, street lamps, tail lights, you name it.”

To make it all work, you need a DMX-compatible light, preferably with full-color control. Some great examples of full-color DMX lights include ARRI SkyPanels and LiteGear LiteMats.

Next, you need pixel mapping software. Unreal Engine has DMX control, so you can control DMX lights directly from within scenes. Some other examples of external pixel mapping applications include Enttec ELM and MadMapper.
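Unreal Engine or those mapping apps handle the plumbing for you, but the protocol underneath is simple enough to show. Here is a minimal Python sketch that “pixel maps” one sampled color to a hypothetical three-channel RGB fixture over Art-Net, a common DMX-over-UDP transport; the node’s IP address, universe and fixture patching are all assumptions:

```python
import socket
import struct

def artdmx_packet(universe: int, channels: bytes) -> bytes:
    """Build an ArtDMX packet: 8-byte ID, OpDmx opcode, protocol 14,
    then up to 512 bytes of DMX channel data (padded to an even length)."""
    if len(channels) % 2:
        channels += b"\x00"
    return (b"Art-Net\x00"
            + struct.pack("<H", 0x5000)    # OpDmx opcode, little-endian
            + struct.pack(">H", 14)        # protocol version, big-endian
            + bytes([0, 0])                # sequence (0 = disabled), physical port
            + struct.pack("<H", universe)  # 15-bit port-address
            + struct.pack(">H", len(channels))
            + channels)

# Pixel mapping in miniature: drive a (hypothetical) RGB fixture patched at
# DMX address 1, universe 0, with one color sampled from the wall content.
sampled_rgb = bytes((255, 96, 32))         # e.g. a passing tail light
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(0, sampled_rgb), ("192.168.1.50", 6454))  # Art-Net port
```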

8 Mastery of color

Understanding color science is integral to the cinematographer’s craft and essential when using one digital device to rephotograph the output of another digital display.

The light cast from LED screens can cause unexpected/undesirable results depending on what surface it falls on. Kadner warns about metamerism, which refers to the visual appearance of an object changing based on the spectrum of light illuminating it. LED panels are designed to be looked at directly, not act as lighting sources.

“One way to mitigate this issue is to supplement the emissive light coming off the LED panels with additional movie lighting. It’s more work to set up but the results are worth the effort.”

Manufacturers are also responding by developing LED panels with better full-spectrum color science.

9 Virtual production is not zero-sum

To my mind this is the most important piece of advice: less a tip than an approach to shooting in the Volume. There’s a lot of talk about being able to produce pixel-perfect final shots on set, eliminating post altogether. Well, maybe the technology will advance to that extent in time – but it may not be creatively desirable either.

For example, according to ILM the percentage of final shots captured in-camera on The Mandalorian was around fifty percent in season one. The finality of shots captured in an LED volume can vary from “ready to edit” to “requires some additional enhancements.”

“Don’t think of this as a zero-sum game. Think of it more as a continuum of potential additional finessing in post vs. all or nothing,” says Kadner.

“Most visual effects supervisors who’ve worked in LED volumes agree that it’s far easier to fix visual issues with in-camera effects shots than to start with green screen cinematography and add the entirety of the background imagery in post-production. It’s a reductive and refining process vs. starting with a blank canvas.”

10 Prepare to experiment and be outmoded

The pace of change of virtual production with LED technology and related areas such as AI, camera to cloud, 5G connectivity and volumetric photography inevitably means that as soon as you’ve locked the tech spec down for a project, elements of it will have advanced.

Frame.io points to Epic Games’ latest release of Unreal Engine, which is accompanied by a host of tools expressly designed for the virtual production filmmaker.

What was completely impossible or highly difficult to accomplish one day may be standard operating procedure the next. Each version offers advancements that will make things faster and more realistic in virtual production.

“So, to save your time, sanity, and budget, embrace constant change. Attend many webinars, watch a lot of YouTube videos, read all you can, and above all, experiment.”


Who’s Going to Win the Media Entertainment Wars?

NAB

A handful of companies with expertise in storytelling will win the media entertainment wars and land a market cap of $500 billion, forecasts Jason Kilar, CEO, WarnerMedia.

https://amplify.nabshow.com/articles/whos-going-to-win-the-media-entertainment-wars/

It won’t be a surprise to learn that Kilar thinks one of them will be WarnerMedia.

“I wouldn’t be surprised if you see a relatively small number of storytelling-centric companies that are worth $400, $500 billion each because the internet allows for a scale that heretofore was never possible,” he said in a recent interview.  “I’m not just talking about streaming business models, which of course will be very important to our future, I’m talking about things like video games and interactivity and NFTs, and a whole host of other things that have yet to be invented.”

In the whole 50-minute interview for the VC firm Greylock Partners (an early Airbnb investor), shared as a podcast and transcribed to over 10,000 words, there is not a single mention of the $43 billion gorilla in the room: Discovery.

Interviewer David Sze is either too polite to ask or the interview has been given under strict instruction that this is a no-go area.

The nearest Kilar comes is when he talks about why he thinks WarnerMedia will be in a good position to succeed.

“What you’re likely to see is a relatively small number of services that have achieved global scale, and are able to thrive by confidently, and with conviction, investing in ambitious storytelling. By that I mean scripted, unscripted, documentaries, and perhaps sports for some of those players.”

Unscripted and sports. Yup, that’ll be Discovery.

Kilar is best known as the founding CEO of Hulu. He served on the boards of DreamWorks Animation and Univision, and co-founded video service Vessel, which was sold to Verizon in 2016. He joined Warner in 2020.

The Greylock article begins by declaring that Kilar has “an instinct for disruption” – though evidently no inkling of the mega-move to merge WarnerMedia with Discovery that happened right under his nose. By all accounts, AT&T boss John Stankey went over Kilar’s head to strike the deal with Discovery chief David Zaslav, making Kilar’s position untenable.

Disappointingly, Greylock’s interview is more concerned with blowing smoke than with taking the opportunity to ask Kilar some taxing questions: Kilar’s thoughts on the merger and his own (believed to be short-term) future are off limits.

Likewise, there is no challenge to WarnerMedia’s highly contentious decision to release its theatrical slate day and date with streaming on HBO Max. Filmmakers like Denis Villeneuve, remember, are still angry about a decision that seemed to show no respect for the art.

Here’s how Sze wafts a vague question at him: “[WarnerMedia’s decision] marked a major milestone for the industry that had been talked about for decades, but no one had had the bravery or the moment in time to take that move, and Jason really spearheaded that.”

“It reflects what I’ve seen,” Sze goes on, “which is Jason’s constant thirst for innovation — his willingness to make bold moves and take big risks when he knows it’s the right thing to do, even if it’s not always the most popular thing to do.”

To which Kilar responds, vaguely: “My opinion is that with the proliferation of screens, and having access be as easy as it ever has been in the history of media — that’s a good thing for a storytelling company, because it means that if you lean into those screens, you can be far more accessed than you were before.

Not only that, you’re able to do it in a way where you can have a direct relationship with the customer. Historically, we haven’t had that opportunity. So those are two things I’m particularly excited about.”

Expertise in story worlds

The ability to spin content and story worlds across media will stand WarnerMedia and its 98-year history in good stead, Kilar believes.

“If you just take a step back, and say: ‘What are the different ways that you can move people through story?’ It turns out that a seven-second video that users generate has a tremendous potential to move people through story. That’s the reason, when I look at WarnerMedia and think about our gaming and interactive business and the intellectual property that we’re sitting on, I get so excited about the ways that we’re able to move through the world through story in new ways, in addition to and alongside motion pictures and documentaries and television series.”

While subscription-funded content will remain the main model for WarnerMedia, the CEO nods toward the need for very diverse business models. He sees free ad-supported content as trainer wheels for paid services.

“It doesn’t mean we’re not going to participate in ad-supported environments that have great reach, because I think there are a lot of kids under the age of 10 who are falling in love with Batman, Superman, Aquaman, Game of Thrones, and all these other things because of TikTok.

That’s a great entree into those characters and worlds, where then they can get, obviously, a lot more by going to HBO Max or playing a game that’s set in Westeros with interactive and gaming.”

Kilar also pointed to 2022 and what he called the elevation of HBO Max, the premium DTC service CNN+, and the elevation of games and interactivity, “which I think is going to be absolutely a hallmark of WarnerMedia for the next five decades.”

Before Amazon was Amazon

The most interesting part of the interview arises when Kilar discusses Amazon, a company he joined in 1997 after leaving Harvard Business School, when there were just a handful of employees and where he stayed for a decade.

He suggests that Jeff Bezos “the entrepreneur was heavily inspired by Walt Disney the entrepreneur. It was just a tiny little bookstore at the time, and not a lot of revenue, and certainly not a lot of people. But there was a lot of conviction that there could be a better way when it came to selling books to people.”

He explains how Amazon was built by trial and error. “What we learned at Amazon is that it’s one thing to say you want a marketplace and then another if you have a right to actually build a winning marketplace.”

It launched an auction site called zShops, and then a general auction marketplace to compete with eBay, which failed “big time”. He explains, “We did the thing that was ultimately a mistake, which was going to create a ‘me too’ product. The reason why it failed is that we didn’t have anything really to offer the suppliers of that marketplace. They gave us the benefit of the doubt for a couple of months, but then they were like, ‘I’m not getting enough business here. I’m going to go back to eBay.’

“The big insight for the birth of Amazon’s marketplace was: ‘Don’t get distracted by what eBay is doing.’ Instead, ask yourself, ‘What can you do for suppliers that you uniquely have a strength in?’ And what we had a strength in was book buyers. That was the connection, the insight that led to a successful marketplace.

“It was a tiny marketplace at the start — new books and used books — but it then expanded very aggressively into music and video and kitchen products and consumer electronics. It took decades, but it was the right road for us, because we failed miserably with a general auction approach and strategy — which was just trying to be eBay, which we were never going to be successful at.”


Horror Comedies Are Alive and Kicking (and Biting) on TV

NAB

https://amplify.nabshow.com/articles/horror-comedies-are-alive-and-kicking-and-biting-on-tv/

Recent TV horror comedies like Netflix’s Santa Clarita Diet, FX’s What We Do In The Shadows, HBO’s Los Espookys, and CBS’ new sitcom Ghosts present an exciting amalgamation of two already massive genres in their own distinctive ways.

“This deadly combination has had wide appeal, pulling in audiences who might not usually be able to stomach gore, and inviting fright fans to enjoy heartwarming and slapstick content,” says Saloni Gajjar, writing at pop culture site The A.V. Club: https://www.avclub.com/how-horror-comedies-are-staking-their-claim-on-tv-1847900515

This genre fusion isn’t new, as Gajjar points out. Shows like The Munsters and The Addams Family brought the approach to the small screen in the 1960s. Netflix’s upcoming coming-of-age series Wednesday, about Wednesday Addams, and Rob Zombie’s The Munsters reboot for Peacock will keep the respective IPs going strong. An animated series based on Tim Burton’s feature hit Beetlejuice ran on ABC and Fox from 1989-1991.

Dark dramas of the ’90s like Twin Peaks and Buffy The Vampire Slayer added gallows humor to offset the horror. More recent shows like Fox’s slasher parody Scream Queens and Starz’s Ash Vs. Evil Dead take a stab at expanding the style on TV. Starz also has the Courteney Cox-led Shining Vale in the pipe for 2022, in which her character gets possessed by the ghosts in her new house.

You could arguably throw the socio-political satire of Squid Game into the mix. So what seems to have reanimated horror comedies on TV?

Santa Clarita Diet’s showrunner and creator Victor Fresco tells the A.V. Club that comedy blends best with horror because “jokes and mysteries are similar in how they thrive on tension and ever-increasing stakes… The more you can keep the suspense percolating for either of them, the tenser and better the result will be.”

Los Espookys co-creator Ana Fabrega says that comedies like What We Do In The Shadows or Santa Clarita Diet are rooted in subverting tropes of one specific mythology. Los Espookys though is inspired more by Ghostbusters or Scooby Doo’s Mystery Inc.

“We purposely keep it abstract to open up what a horror-comedy is and can be. [Co-creator and co-star] Julio Torres and I wanted to keep it open-ended to play around in all aspects of what’s considered as magical realism,” Fabrega says. “We want to give a nod to a range of subjects, from Scooby Doo to Latin telenovelas, before putting our own weird spin. It resonates because it’s not cartoonish, but it’s also not seriously scary.”

CBS’s take on the British series Ghosts uses supernatural elements to drive a feel-good network comedy, as one half of a married couple starts to communicate with ghosts after a near-death incident.

“We want to be character-driven in an inspirational way,” showrunner Joe Wiseman says.  “The ghosts add an unusual flavor, but become a launching point to tell relatable stories.”

The Ghosts showrunners hope that their show resonates with fans of What We Do In The Shadows as well as NBC’s The Good Place, which also tapped into larger commentary about life after death.

Co-showrunner Joe Port adds, “We’re trying to carve out a specific comedy for those who enjoy ghost stories. It’s similar to watching something like hangout comedies where a disparate group of people are forced to hang out, but with occult and other additional special effects.”



The Internet Is The Engine of Economic Growth

NAB

The internet economy grew seven times faster than the total U.S. economy during the past four years, with millions of jobs generated by, if not dependent on, connections to the Web.

https://amplify.nabshow.com/articles/why-and-how-the-internet-is-the-engine-of-economic-growth/

The study, commissioned by the Interactive Advertising Bureau (IAB), found the internet economy’s contribution to U.S. GDP has grown 22 percent a year since 2016, in a national economy that grows at two to three percent per year.

In 2020 alone, it contributed $2.45 trillion to the U.S.’s $21.18 trillion GDP.

“It’s clear that the U.S. economy is undergoing a radical transformation driven by the market-making power of the internet,” said David Cohen, CEO, IAB. “It’s now possible for a business located anywhere in the U.S. to reach a global market.  As regulators continue to examine online and digital data policies, they must understand how the internet powers economic growth and how proposed regulations could slow or even stop that growth.”

The study, ‘The Economic Impact of the Market-Making Internet – Advertising, Content, Commerce, and Innovation: Contribution to U.S. Employment and GDP,’ also discovered that the internet generated 17.6 million direct and indirect jobs in the period, marking “a dramatic increase” compared with just three million jobs when IAB began measuring employment growth in 2008.

Specifically, the research estimated that 850,000 people are self-employed and 450,000 work for small businesses in jobs that could not exist without the internet.

The study also showed that the commercial internet directly generated seven million jobs and indirectly provided jobs to another 10.65 million people fulfilling service needs created by internet-based companies.

There are 200,000 full-time equivalent jobs in the online creator economy.  That approaches the combined memberships of craft and labor unions SAG-AFTRA (160,000), the American Federation of Musicians (80,000), the Writers Guild (24,000), and the Authors Guild (9,000).

“Not only large firms, but also large numbers of small firms and individuals, now have the platforms and tools to find customers, engage with them, and transact,” said John Deighton, a Professor of Business Administration Emeritus at Harvard Business School who led the research.  “And founders don’t need to bring large amounts of capital to the table. Investors have shown great willingness to supply the capital, confident that advertising, sale of subscriptions and licenses, and freemium options will get them an attractive return on their investment.”

Since IAB began measuring the economic impact of the internet in 2008, the internet’s contribution to GDP has grown eightfold, from $300 billion to $2.45 trillion.
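Those endpoints are enough to sanity-check the headline numbers; a quick back-of-envelope (mine, not the study’s):

```python
start, end = 0.30, 2.45          # internet contribution to GDP, $ trillion
years = 2020 - 2008
print(end / start)               # ~8.2x: the "eightfold" growth since 2008
print((end / start) ** (1 / years) - 1)   # ~19% compound annual growth
# The 22%/year figure applies to the faster 2016-2020 stretch; set against a
# total economy growing ~3%/year, that is the "seven times faster" claim.
```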

It would be unsurprising if the impact of the internet on GDP were replicated in other major economies around the world.


Innovating the Data-Driven Fan Experience

NAB

The battle for sports fan engagement is heating up, and data holds the key. A trio of executives from F1, Bundesliga and the NHL spoke about how they are unlocking data using Amazon Web Services.

https://amplify.nabshow.com/articles/innovating-the-data-driven-fan-experience/

Noting that Netflix and Disney are using AWS to pioneer new ways to launch new services, Usman Shakeel, Amazon’s Director of Solutions Architects, said, “Sports, media and entertainment must reinvent how they create content, how they optimize their supply chain for audience attention as well as making it highly scalable for consumers to consume anywhere anytime.

“We have customers with archives of tape gathering dust and looking at the cloud for a next generation of workflows.”

German soccer league Bundesliga, for instance, has moved to reach younger audiences, more attuned to watching short snippets of vertical video on TikTok, with a new app powered by AWS.

“We decided to take the best of TikTok – highly engaging content with great music and a great swiping experience – and remove all the data tracking and privacy issues,” said Andreas Heyden, EVP Digital Innovations at Bundesliga, at the virtual round table convened by AWS. “It is our responsibility to protect young target groups from the potential danger of social media.”

The app is GDPR compliant and includes thousands of individual ‘taste’ profiles which can be tailored to a fan’s affinity with multiple Bundesliga clubs.  

F1 – fuelled by data

Arguably no sport generates more data than Formula One but the motorsport has historically kept this data internally. That’s begun to change since Liberty Media acquired the property in 2017.

“Probably because of their experience of US sports, Liberty Media knew how important data was in sportscasting,” says Rob Smedley, Data Director at F1. “Within the teams themselves F1 is the most data driven sport on the planet but we just weren’t getting that data out at fan engagement level.”

Deciding to do something about it was one thing, but it has taken several years to sort through and join up the maze of data-collection processes, from areas including onboard telemetry, local weather, and race timing and stats.

“There are 7.2 billion combinations of timing loop data over a two-hour race which was unrelated to the image metadata which F1 owns and operates. The systems in place had grown organically. There was no real holistic solution. So, for analysts to get to the data to build products was a job in itself.”

Smedley has led “a massive digital transformation” of F1 data archives and of the way its data is stored, processed, presented and streamed in realtime. The result is what he calls F1 AWS Insights, or “data widgets we use to tell the story.”

This is necessary, he explained, because the broadcast feed alone is inadequate to convey the sport’s highly complex stories.

“When you watch the live broadcast of a football game then 90 per cent of what you see on camera is what’s happening on the pitch. In F1 the ratio is different. Sometimes the broadcast feed is only telling 10 per cent of the story because it is only able to show a small portion of the track.”

An F1 track ranges from 3km to 8km in length. Excepting aerials, the broadcast feed can only show a few hundred meters at a time, with maybe 2-3 cars out of a field of twenty.

In that respect F1 is similar to golf.  “What you watch on the broadcast feed is only a tiny proportion of what is happening on the course but the way the PGA can package HD image data and telemetry data or statistical data is really well advanced. We take inspiration from that.”

Now Liberty Media feels better equipped to engage existing and new fans with AWS Insights. That’s important as the sport attempts to make greater inroads into North America, where fans expect data as part of the experience. There are signs it is working: the recent Grand Prix in Austin, Texas, drew 360,000 spectators over the race weekend.

“Data is now essential for fans,” says Smedley. “If you take it away it gives a completely different viewing experience. Younger demos in particular have a voracious appetite for data.”

Video entertainment platform for NHL

Hockey fans can now get a clearer, real-time view of what’s happening on the ice through the NHL’s new UHD-enhanced video production pipeline.

“We are trying to build an entertainment video platform for the NHL,” said Dave Lehanski, EVP Business Development and Innovation at NHL.

Considering the League hosts 190 different video channels and each NHL season (regular season and playoffs) includes upwards of 1,400 games, establishing a fixed UHD infrastructure that could provide consistent video quality across productions was a crucial consideration.

“We invested a significant amount of time and capital to build a tracking system for the players and the puck,” he explained. “The system generates up to 50 points a second and more than 1 million data points per game. It allows us to create new content, tell stories and present it all in a really compelling way.

“To transform the fan experience however we need to combine that data with video. The real magic happens when data and video are aggregated as one.”

The NHL is doing this using AWS Elemental Link UHD, a High Efficiency Video Coding (HEVC) encoding device announced earlier this year. The Link UHD connects a live ultra-high definition (UHD) video source, like a camera or other video production equipment, to AWS Elemental MediaLive for video processing in the AWS cloud. 

The NHL installed half a dozen of these at each of the 32 NHL venues and uses them to ingest streams of video from 6-8 cameras into the AWS cloud for aggregation and processing with data. It can then push out different packages of data + video to media partners, coaches, officials and fans.

For example, the NHL can read the speed of players from the data in realtime; by adding a camera at ice level to follow certain players, it can match the data to the video and illustrate to the fan exactly what a 15mph or a 22mph skater feels like, bringing that story to life.
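The League hasn’t published how that pairing works, but conceptually it reduces to a timestamp join between a roughly 50Hz telemetry stream and a 60fps video feed. A minimal sketch, assuming (and it is an assumption) that both feeds carry a common clock:

```python
import bisect

def nearest_sample(sample_ts, values, frame_ts, max_skew=0.02):
    """Return the tracking value closest in time to a video frame,
    or None if the feeds have drifted apart by more than max_skew seconds."""
    i = bisect.bisect_left(sample_ts, frame_ts)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(sample_ts)]
    j = min(candidates, key=lambda k: abs(sample_ts[k] - frame_ts))
    return values[j] if abs(sample_ts[j] - frame_ts) <= max_skew else None

# Illustrative feeds: puck/player tracker at 50Hz, video at 60fps.
track_ts = [k / 50 for k in range(500)]           # ten seconds of samples
speeds_mph = [10 + 0.02 * k for k in range(500)]  # made-up skater speeds
for f in range(600):                              # ten seconds of frames
    mph = nearest_sample(track_ts, speeds_mph, f / 60)
    # a graphics system would burn `mph` into this frame's overlay here
```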

“Latency is the key to that,” says Lehanski. “How we pair the data from the ice with the video in the cloud and back is the power of the UHD infrastructure we have put in.”

Other applications include facilitation of a sports book. “What better example is there for a fan to get realtime data on a game with which to make betting decisions or to view the outcome of bets they made in realtime?”

He added, “In essence these new cameras with Link UHD encoders can encode multiple regional signals at every arena for every game of the season at low cost.”

Footage captured with the updated pipeline is used for a wide range of applications including Video Cast - the NHL’s web-based video player platform with logging capabilities and statistics integration that it provides to internal stakeholders, rights holders, TV networks, and radio stations.

NHL also makes the content accessible through video management software for broadcasters looking to distribute these live video angles or obtain a tertiary path for a program feed. Hockey operations and player safety teams can access close-up angles of every event that happens on the ice, with plans to make recorded UHD footage available to referees, coaches, and players for post-game review and performance analysis. 

NHL SVP of Technology Grant Nodine added, “As we continue to build out the pipeline, the goal is to spin out an archival-quality UHD file that’s a simple stream to store. We want to make search and retrieval of archived footage simpler, give broadcasters instant access to NHL content for syndication and licensing, and facilitate the delivery of new in-game analyses, predictions, and video highlights to enhance fan experiences.” 

There are plans to incorporate more AI/ML and computer vision technologies into the pipe. “Ultimately, we want to be able to feed our UHD video to computer vision applications to derive additional insights about the game, which will ensure more data-driven video content that benefits hockey fans, referees, players and coaches.”

So You Say You’re Planning a 16K Live Stream…

NAB

The world is barely screened in HD, let alone 4K UHD, yet 8K TV is so advanced as to be considered inevitable, leading some companies to actively pursue live video production and distribution at 16K resolutions — by 2024.

https://amplify.nabshow.com/articles/so-you-say-youre-planning-a-16k-live-stream/

Intel, along with Japanese broadcaster NHK and Brazilian broadcaster GloboTV, just announced plans to experiment with multiple streams of High Frame Rate, High Dynamic Range 8K and even 16K at the Paris Olympics, which is just three years away.

What’s more, this leap in resolution will be accomplished over the open internet and not over satellite or cable as per traditional TV broadcasts.

“We are way beyond proof of concept,” says Ravindra Velhal, global content technology strategist and 8K lead at Intel, writing in an Intel-sponsored article at VentureBeat. “Right now, we’re at the beginning of another seven-year cycle for a new TV 8K format.”

Scaling 8K Over the Internet

The Olympics was first broadcast live in 8K in limited fashion at the 2012 Olympic Games in London, led by NHK with input from the BBC. Intel streamed matches of the 2018 FIFA World Cup in 8K over a dedicated link. At the Tokyo Games earlier this year Intel, NHK and GloboTV broadcast 8K at 60 frames-per-second with HDR (High Dynamic Range) to Brazil and Japan. It was believed to be the first live, broadcast-quality transmission on an open IP network cloud.

“The technical feasibility we’re showing now is using an agnostic cloud service provider, so that you can have millions of clients consuming content in 8K globally,” said Velhal. “What we’re doing with OBS/NHK is to show that we can take the 8K signal and scale it to a larger area, beyond one city or country, over an open Internet cloud. That’s the big difference.”

If we’re to take Intel’s description of its achievement at face value, then it is remarkable given the data it is processing and the ultra-low latency it claims for an 8K live sports experience.

According to Intel, the major innovation here is its Xeon Scalable-based local encoding and delivery solutions. You need a lot of them, and Intel doesn’t go into cost, but the whole focus of its involvement is to sell more chips.

The company isn’t shy about explaining the workflow, however. Content is captured at 8K 60fps HDR and carried in “big, fat” 48-gigabit-per-second optical lines. That’s fed to Intel’s encoder server, from where data is either distributed directly to consumers at 80-100 Mbps or offered as a contribution feed at 250 Mbps to rights holders. The higher quality is necessary for broadcasters to further manipulate the signal for their own presentation.

According to Velhal, right now, the web service provider cannot handle distribution of more than 100 Mbps. “Basically, we’re delivering 8K on the existing 4K infrastructure,” he says.

This comes back to Intel chips. If you want to process 50 Gbps of data and compress it to 80-100 Mbps, you have to use 112-core Xeon-based servers. At the Tokyo Olympics, they used encoding servers equipped with four Xeon 8380H processors. Going off this price chart, I think that works out at 4 x $8000 x 112 = $3.584 million. Forgive me if I’m wrong on that, Intel. Seems like a heck of an investment.
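Whatever the true hardware bill, the data reduction itself is easy to put into numbers; again, my arithmetic rather than Intel’s:

```python
capture_gbps = 48                    # 8K 60fps HDR feed into the encoder
dist_mbps, contrib_mbps = 100, 250
print(capture_gbps * 1000 / dist_mbps)     # ~480:1 down to the OTT feed
print(capture_gbps * 1000 / contrib_mbps)  # ~192:1 to the contribution feed
print(4 * 8_000 * 112)               # the guessed server bill: $3,584,000
```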

8K Data Crunching and Latency

Point is, 8K live streaming at scale can be done. Delivery to the open Internet cloud is managed using standard repeat request protocols like RTP and TLS or RTP and HLS. Intel says it solves the bottlenecks of 8K TV playback using a single HDMI 2.1 cable. The only non-Intel part used in the Tokyo proof of concept was an Nvidia graphics card, which handles color correction and output to the HDMI 2.1-compliant TV. All the rest is done on Intel CPUs, on both the encoding and the decoding side.

Impressively, the round-trip latency from venue to screen (encoding the 50 Gbps input signal into a 200-250 Mbps contribution feed and an 80 Mbps distribution signal for OTT) is 200-400 milliseconds.

“That’s a world record in itself, though we have yet to get an Emmy Award for it,” Velhal says.

He has, however, twice won Hollywood’s prestigious Lumiere Award, has twice had projects nominated for an Emmy, holds several patents, chairs the 8K Association, and advises film industry forums worldwide.

He went on to explain how Intel divides the 8K screen into multiple horizontal bands, each with a dedicated Intel Xeon core. “That’s how we do a lot of metrics calculation, add, multiply, add, because there’s a huge amount of vector data or scalar data. Quality of service is important for broadcasting industry standards, because we are doing a lot of this parallelism here. That’s how we’re able to achieve 200-400 millisecond latency from input to output.”
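Intel hasn’t published the implementation, but the band-per-core pattern Velhal describes is a classic data-parallel split. A toy sketch, with a trivial per-band statistic standing in for the real motion search, transforms and rate control:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def process_band(band: np.ndarray) -> float:
    """Stand-in for per-band encoding work on one horizontal stripe."""
    return float(band.astype(np.float64).mean())

def split_into_bands(frame: np.ndarray, n_bands: int):
    """Cut the frame into horizontal stripes, one per worker core."""
    return np.array_split(frame, n_bands, axis=0)

if __name__ == "__main__":
    frame = np.zeros((4320, 7680, 3), dtype=np.uint8)    # one 8K frame
    with ProcessPoolExecutor() as pool:                   # one band per core
        results = list(pool.map(process_band, split_into_bands(frame, 16)))
```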

Noting that the pricing for 8K TVs is consistently falling, along with the fact that YouTube has more than one million 8K videos available, and that entire 8K workflow toolsets from capture to post-production “are increasingly affordable,” Velhal says you don’t need to wait for your cable carrier to start streaming their library in 8K.

Now, On To 16K

The breakthroughs in streaming 8K live continue the “format momentum” that has led the industry to deliver digital TV, HD and UHD. There are always those who will argue about whether 8K resolutions are even necessary, or visible to viewers without a very large screen, and Intel is keen to point out that its experiments combine 8K with HDR and HFR, attributes that significantly upgrade the viewing experience.

On that note, Velhal says, “For the Paris 2024 Summer Olympic Games, Intel technologies will continue to push pixel frontiers to even live 16K, multiple 8K TV channels or 8K with 120 frames-per-second over 5G. Technically, 16K is several times the 8K 60fps data rate. When the next platform comes we’ll continue to evolve and advance this technology and explore new frontiers.

“The work we’re doing is the future of Olympic broadcasting, the future of sports broadcasting, and the future of live entertainment broadcasting. We are preparing the world for the democratization of 8K using open Internet.”