Saturday, 14 September 2024

Media vendors sacrificing training and interop for revenue and efficiency, IABM warns

Streaming Media

article here

Media tech vendors cannot continue to do business the same way they always have, the IABM has warned. What’s more, they are prioritizing investment in products that generate revenue, flexibility and efficiency at the expense of training and interoperability.

“These are the three core things that your products need to deliver to your customers to be successful,” Chris Evans, head of knowledge and insights at the media technology trade body, said at IBC2024. “This is what customers are looking for right now. But the cost of this response is less investment in innovation and future roadmaps, less investment in training and support and less investment in interoperability and transparency.”

Those crucial items had fallen to the bottom of a ranking of investment priorities polled from IABM members.

“This is a really clear trend but does it risk organizations becoming more insular and inwardly focused?” Evans quizzed. “When we've got our back against the wall we’ve proven that that doesn't have to be the case. Many partnerships and collaborations emerged during the Covid crisis that led to really successful business innovations. We continue to hear from the industry and see in our data that training and interoperability are vital catalysts for the next generation of technology to embed itself in the workforce.

“This is a warning sign to the industry not to lose sight of the key areas of mid- and long-term success in the face of what are short-term priorities.”

The warning came during a state of the industry report by the IABM at the International Broadcasting Convention, which found that business confidence had dipped in the wake of macroeconomic headwinds and sector-specific challenges, such as the Hollywood strikes.

“Business confidence continues to be volatile and has fallen year on year, comparing 2024 with 2023,” Evans noted. “Since April we have seen slight uplifts in business confidence but the picture overall serves to remind us where we are as an industry. Some of the challenges that were in play in 2023 have persisted into this year and continue to negatively influence businesses.”

One of the central issues in play for the industry is the monetization of content. Content owners and service providers are still navigating business transformation as they streamline existing services whilst pivoting to develop new revenue streams.

The IABM reported a “crisis of confidence” among traditional media pivoting to streaming in early 2023, but since then it has tracked results suggesting that efforts to stabilize business transformation are beginning to pay off. These measures include a crackdown on password sharing, price increases and the introduction of ad tiers.

“None of these have been particularly well received by end users at first, but evidently not to such an extent that we’ve seen mass departures” from SVODs, Evans said.

Turning to FAST channels, “another evolution of the DTC offering,” Evans said the slowdown in the growth of SVOD has put increased emphasis on the development and launch of FAST channels. As of May 2024 there were a staggering 1,943 FAST channels in the U.S., he said.

“That’s pretty substantial growth year on year but a slight caveat is that that growth rate at least in the U.S is slowing. Inevitably, we're going to reach a saturation point. We’re in the [midst of] a land grab for FAST viewership.”

 

Look to YouTube

All the recent attention on competing with pure-play streamers like Netflix seems to have taken the broadcast industry’s eye away from an equally, if not more, potent competitor: YouTube.

The perception that consumption on smartphones is cannibalizing viewership needs to be reappraised, since the Alphabet-owned platform now claims the second highest share of viewing on the living room screen in the U.S., per Nielsen figures.

 “We have to look at the proposition that we offer in the living room on the large screen, as well as the small screens and the digital first pieces,” Evans urged the broadcast community. He added that this represents an opportunity for traditional content owners with linear services.

Working with YouTube could provide “another portal or gateway” into broadcaster content “provided that content is versioned in a way that is sticky and appeals to people where they're watching first and then pulls them through to your longer form content.”


Against this backdrop is increasing pressure on broadcasters and media owners to merge and consolidate. The $8bn Skydance transaction for Paramount seems concrete, but there is speculation of asset divestment by Warner Bros. Discovery, and even of a merger of public service broadcasters, such as between Channel 4 and the BBC in the UK (denied by the BBC).

 

Cloud and AI interlinked

Cloud services continue to be very important but, surprise, surprise, the tech sector with the fastest year-on-year growth is AI, ML and analytics.

 “What's important is the interlink between AI, ML and Cloud,” Evans said. “This will actually reinforce the importance of Cloud because for many organizations there's just simply not the ability to invest in the compute power to deploy AI and GenAI on-prem. They will have to look to third-party Cloud providers to do that initially.”


Chip manufacturers are releasing new chipsets to capitalize on this demand, and the IABM suggested that competition in this space is trending towards reduced price points – but only slowly. Nvidia’s explosive Q2 financial results at the end of August are evidence of that demand.

“Right now, the cost of implementation of compute means that this largely rests on public cloud services and the development of new data centers that can cater to demand for AI and Gen AI.”

 

Hardware rebounds

Media technology equipment manufacturers have talked a lot in recent years about the shift to virtualization and the importance of software defined products but hardware remains a strong, even growing part of the pie.

“For all the benefits of different SaaS pricing models, many companies still need the acceleration of FPGA-enhanced hardware,” Evans said. IABM figures reveal an uptick in the share of revenue accounted for by hardware.

“That’s not to say we’re going to see a significant bounce back [to hardware]. I see this really as a kind of recalibration and stabilization of the mix. We certainly won’t rebound to 2021 levels, but we will start to see the recent fall-off in hardware sales level out.”

 

Parallel market

Media tech companies are also finding more customers for their products outside of traditional broadcast media. Indeed, in 2023 more than half of member company revenues were generated by so-called parallel markets, such as corporate, education and houses of worship, rather than by conventional broadcast.


“These non-traditional organizations are looking to broadcast media technologies to enrich how they create content and serve an audience.”

Vendors might naturally have looked for customers outside the traditional base last year as the strikes ground Hollywood production to a halt but the trend is here to stay.


“What we will see in 2024 is a stabilization, but I do not think that the parallel markets are going to recede significantly,” Evans said. “They’re going to be vital moving forwards, playing a part in your overall revenue share. Everyone has their own definition of what a broadcaster is, but one of the best statements I’ve heard is to think of potential customers and prospects as content-rich organizations.

 “Corporations like fashion houses and banks are already producing content and are in the market for professional media systems to help them organize, manage and monetize it.”



Friday, 13 September 2024

IBC Conference: Uncovering the truth behind macro trends in the media industry

IBC

The media and entertainment market is set to exceed $1tn in 2024, driven by the explosive growth in streaming video, but how many people really understand the dynamics behind the trend? IBC365 speaks with Omdia’s Maria Rua Aguete to learn more.

article here

Omdia has compiled some fresh research to be unveiled at IBC2024 “where the focus will be on what consumers watch, where they do so, and for how long,” says Maria Rua Aguete who leads the media and entertainment (M&E) team at the research analyst.

When the global value of media and entertainment tops $1tn this year, online video will be the biggest part of that at $392bn. Omdia figures put traditional TV second at $327bn, followed by games (sizeable at $220bn), music ($44bn) and cinema ($36bn).


Advertising is now the biggest revenue earner in online video (generating 61% of its $392bn total in 2024) and ads are also doing business in gaming (21% of its total). Aguete notes that significant numbers of advertisers are investing in campaigns across games, traditional TV and online video.

Online video advertising will be by far the number one source of revenue by 2029, when Omdia projects it will rake in $362bn alone. This is followed by subscription streaming revenues and then pay-TV.

“In fact we can see how streaming video revenues will overtake that of pay-TV by 2028,” she says.

If you remove social media video from the total advertising pie, then what is bigger - subscription video or advertising video? The answer according to Omdia is that by 2029, SVOD revenues will be at $185bn compared to premium AVOD at $141bn with TikTok not far behind on over $100bn in that time frame.

Pay-TV outlook

Here’s another stat: by 2026 there will be more homes globally watching free content online than via traditional free-to-air television.

While in 2024 most free TV viewing (57%) was on broadcast TV, this reduces to 46.7% in 2026 and to 39% by 2029 as viewers move to free and free ad-supported content online.

“Back in 2016, the global composition of all homes paying for TV – either pay-TV or streaming – was dominated by traditional pay-TV,” says Aguete. “In fact, back then, 74% of all homes paying for either a streaming service or pay TV were subscriptions to pay-TV only, and only 6% had a streaming-only service.”

Omdia will show that by 2029, 44% of people will have both pay-TV and SVOD, but the number of homes taking only pay-TV will have declined dramatically to 30% while those taking SVOD only will have risen above a quarter of all households.

“The switch to online is clear,” she says. “The pay-TV bundling strategies of service providers have pushed the pay-TV only home into sharp decline.”

There’s more to being entertained than the living room

Aguete will also cover the rapidly growing media consumption space of the ‘connected car’. There will be 555 million more connected cars on the road by the end of the decade than at the start.

“At IBC there will be lots of people talking about the connected car and it is really important because the market here will grow to be worth almost $1bn by 2030. Having a presence in the connected car will be a hot topic at IBC 2024.”

Highlighting the top 10 video services per country, Aguete reveals some distinct differences in consumption habits, although across the board it is YouTube that stands out as number one.

In the US, for instance, Instagram Reels scores highly with more people watching video on the Meta-owned platform than on Netflix. Three social media services rank in the top four in the US but free ad-supported TV (FAST) channels including Tubi and Roku also have a strong top 10 presence.

In the UK and France by contrast, public service broadcasters are prominent in the top 10. German consumers prefer to watch Prime Video over Netflix while in Brazil and South Korea, Samsung TV scores a top 10 hit with its range of FAST channels.

CTV operating systems

“Globally, the Smart TV OS leader is Samsung. What if Samsung TV starts producing its own content?” poses Aguete.

A major section in her presentation will focus on the battle for the TV operating system (TV OS).

“Smart TV companies with FAST services can be seen as frenemies,” she says. “The media owners and operators that LG or Samsung partner with, for example, include channels provided by Pluto TV, Tubi and Rakuten.”

Samsung TVs skew towards affluent owners, Omdia reports, although many users attach laptops, games consoles and even STBs to their Samsung TVs, often bypassing the TV’s own CTV apps.

How much time people spend watching the top 10 services is a question that Aguete says she gets asked a lot – and at IBC she will provide the answers.

“We know that consumers go to YouTube more often than they go to Netflix, but how does that add up? In the US, for example, more than 100 billion hours of YouTube were watched in 2023, along with 60 billion hours of TikTok – double the time spent watching Netflix, which had about 38 billion hours.”

How does that compare to using online services for browsing? Omdia’s stats reveal that the third most viewed service after YouTube and TikTok in the US was Facebook but even on Meta’s social media platform users are spending more time watching videos than actually browsing. “Video has become critical for all social media platforms,” she says.

If you divide total viewing hours by population, you can estimate how much time people spend watching, and something interesting happens. Doing this per head of population in the US shows the average YouTube user spending 53 minutes per day on the site, 30 minutes per day on Facebook and 19 minutes per day on Netflix.
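That per-capita arithmetic is straightforward to reproduce. A minimal sketch, using the annual viewing hours quoted in the article and an illustrative US population of roughly 335 million (Omdia’s actual population base is not stated, so the exact minutes shift with that assumption):

```python
def minutes_per_day(hours_per_year: float, population: float) -> float:
    """Average viewing minutes per person per day, from a platform's total annual hours."""
    return hours_per_year * 60 / (population * 365)

US_POP = 335e6  # illustrative assumption; the population base Omdia divides by is not stated

youtube = minutes_per_day(100e9, US_POP)  # ~100bn YouTube hours in 2023 -> ~49 min/day
netflix = minutes_per_day(38e9, US_POP)   # ~38bn Netflix hours -> ~19 min/day
```

The Netflix result rounds to the article’s 19 minutes per day; the YouTube result lands a few minutes under the quoted 53, which suggests Omdia divides by a somewhat smaller population base.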

“Since you cannot really compare paying subscribers with people watching for free, I did a deeper analysis per user to take into account subscribers to Netflix. When you do that, in fact, the most engaged users are Netflix ones, who are spending 90 minutes a day with the platform.”

In this analysis, YouTube is second with 81 minutes, TikTok third with 79 and Amazon Prime fourth with 51 minutes per day.

However, if you combine video views and browsing/interaction activity, it is Facebook that emerges on top of the leading M&E sites in the US. It has the most active users, despite declines in its overall user numbers.

Globally, YouTube is mainly watched on the smartphone (63% of people do so, according to Omdia), while another 27% watch the Alphabet site on a Smart TV.

Conversely, with Netflix most people (58%) watch it on their Smart TV but it’s interesting that 30% of people also do so on their smartphone.

Another slide highlights which devices people watch on. In the UK, most people watch through Smart TVs, as they do in Germany. STBs are important in South Korea but in this market the smartphone dominates.

Aguete is also presenting Omdia findings around 4K at IBC, in the form of a round table for members of the World Ultra-HD Video Industry Alliance (UWA). While there are more than one billion 4K TV sets around the world today, most of them still in use, 4K content continues to lag behind. She says: “We’ve been speaking about 8K for some time, but what about content? Is there enough content for consumers to watch?”

IBC Keynote - Paramount CTO Phil Wiser: “Content creation is our bedrock”

IBC 

article here

The media landscape is changing in dramatic ways. Paramount CTO Phil Wiser says the transition to digital can be managed astutely with the right blend of technology, business and, crucially, of content.

The evolving consumption patterns on linear TV, coupled with streaming TV viewing reaching a record high, have created a seismic shift that the entire industry is grappling with. Phil Wiser, Executive Vice President and Chief Technology Officer (CTO) at Paramount Global, prefers to take a long-term view. “The major media companies are in the game. When you take a step back, you’ll see we’re pretty well positioned.

“The most important thing to talk about is our content,” he says. “There’s so much press written about shifts in content distribution, but it’s the content itself that is vital to the industry. Content creation is the bedrock of Paramount.

“We need to look beyond any single quarter to understand the transition to streaming,” he explains. “It’s been somewhat gradual relative to other industries over the past two decades. Clearly, it has accelerated recently, but it’s not as if the industry has been idle in adapting to change.”

Previously, Wiser was the CTO at Sony Corporation of America and Sony Music, responsible for creating the digital businesses at Sony Music, including the forging of the groundbreaking deal to launch iTunes. This background has given him the tools and the knowledge to ensure Paramount continues to innovate for the future, especially as the media landscape remains ever-changing.

“Most of the music industry failed to embrace internet distribution and either sat on the sidelines or tried to stop it aggressively. When the shift happened, they ended up with nothing and took a backseat to companies like Apple.”

Rather than resisting the shift to digital, Wiser says broadcasters and studios are keenly focused on the future of distribution and evolving business models.

“The performance of our streaming businesses demonstrates significant growth. This growth is nothing without the quality of our content. We have many high-performing shows and films in the library and due to be released.”

AI in the mix

An increasingly potent part of the technology mix that Hollywood is grappling with is artificial intelligence (AI) and how best to deploy it. Wiser says that Paramount has taken a conservative approach to the broad adoption of AI, but the long-term impact remains a focus.

“Within the group, we’ve done a good job at training people on the potential of AI so that we’re in a strong position to organically adopt it, but as of today, it’s not having a significant business impact on how we produce content.

“That said, if you look out three to five years, we’ll see AI impact growth substantially. Some of the drivers of this will come from the maturity of the technology and the way AI tools become more applicable to our workflows.”

The nexus of technology and creativity has been at the heart of Hollywood’s success over decades, and Wiser sees no reason why that relationship should change. Indeed, he believes in doubling down on empowering creatives with new technology.

“Another big part will be the innovation that comes out of the creative community. I think AI is better applied and grown organically from within the community.

“It’s also worth noting that AI/ML is a very fluid environment,” he says. “Companies that may have invested heavily in the technology 18 months ago may not have found the return they were looking for. We see our role as enabling and educating the creative community in part by continuing to watch developments very closely.”

Cloud and sustainability update

Another long-term technology play overseen by Wiser, along with other studio CTOs, is the move to cloud. This is being guided by the MovieLabs 2030 Vision.

“It’s certainly been one of the most important strategic imperatives in the industry,” Wiser says. “It has enabled us at Paramount and our peer group to share meaningful conversations around what our ambitions should be and how we should collectively get there.”

“Over 95% of our enterprise systems are cloud-based, which includes our video-intensive tools, systems, and asset storage. The transition to the cloud facilitates more downstream fluidity and opens opportunities to better exploit assets. We’re seeing significant shifts in areas like animation, whereas cloud-based production remains a work in progress.”

The adoption of cloud technologies and virtual solutions underlines Paramount Tech’s focus on sustainability. Over the last four years, the company has traded owned and operated data centres for shared facilities, which yielded near-immediate carbon efficiency gains.

“The pandemic brought about a remote work model that reduced our physical footprint across the company and delivered additional direct sustainability benefits. We continue to make progress in our efforts toward sustainable software engineering by weaning off inefficient systems and architectures with direct links to our energy consumption measurements,” Wiser says.

Consumers value content

Despite being buffeted on all sides by changes in distribution, consumer behaviour, business model and technology, Wiser is confident that the importance of quality content will remain constant and valuable.

“I believe that the concept of story is not going to change even over a longer time frame. Certainly, the personalised organisation of content is going to continue to improve and refine. As we’ve seen from social media platforms, a change in the way content is surfaced can have a dramatic impact on the overall experience. End user services continue to become increasingly optimised around how they’re sorting content and pushing that to users.”

But Wiser believes there is still incredible value in expert content curation. “People want stories that have continuity, that they can go deeper into. The primacy of the concept of storytelling in 5-10 years is probably going to be very similar to what it is today.”

“Just look at the many attempts to move to a much shorter snackable form of content at broadcast quality. They haven’t really been successful. We’re still living with 20-30-minute episodic season-based constructs to tell a story. That’s not to deny innovation in terms of formats but in terms of story-driven content that people want to consume I feel that will remain the same in five years’ time or longer as it does today.”

At a time of change, Wiser’s IBC keynote will offer a steady hand in guiding the industry toward the business models that will drive long term monetisation.

Tuesday, 10 September 2024

Dry for wet

Definition

article here and also p12-16 here


ICVFX developments have made shooting in and around water a more fluid process, with real-time effects now standard practice. Find out how studios are maintaining naturalism

Water simulation in CGI has advanced massively over the last 30 years, from James Cameron’s Titanic through to the amazing VFX in Avatar: The Way of Water. The technology has matured and rendering power is at the point where many effects can be run in real time.

“The number one benefit of shooting on LED over green screen in a water-based environment is the reflections and refractions,” says Dan Hall, head of ICVFX at VP studio 80six. “We all know that no VP shoot will look correct without a good relationship between the VAD (virtual art department) and AD. When you have water in the foreground, you get an accurate representation of how it reacts to the light that’s emitted from the background. This helps sell the realism of what you are shooting.”

Hall supervised a high-speed test that involved pouring drinks against an exotic beach background. “We were shooting at over 200fps when I noticed that the virtual sea elements were essentially stationary throughout the pouring. To add more movement to the sea, we did something that wouldn’t have been possible on location – we increased the speed of the waves.

“To the eye, the waves appeared to be moving quickly, but on reviewing the shot they looked like they were rolling up the beach at only a slightly slower than normal speed while the drinks poured very slowly.”

The way lighting from an LED volume penetrates practical fog on-set lends itself to underwater work. Craig Stiff, lead real-time artist and VAD operator at Dimension, explains: “Green screens cause green spill and would penetrate the fog, so practical fog would likely not be used and instead would be added in post-production. With LED volumes, the light is coming from visible structures which don’t have to be too out of focus. Reflections and out-of-focus edges are also accurate, meaning no painting out or rotoscoping/keying.”  

Having said that, it’s important to consider how the effect will be layered or composited as part of the final image. For example, in survival thriller No Way Up, directed by Claudio Fäh, Dimension and DNEG 360 used SFX elements like fog and haze to give the impression of being deep under the ocean. This, combined with a virtual moon casting light through the fog, sold the idea of it being underwater.

“On-set, we used two haze machines to add atmosphere to the practical set,” Stiff adds. “When used correctly, haze is a great tool for blending the LED wall with the practical set. We then composited VFX elements like ripples, bubbles and fluid simulations in post-production to blend the shots.” 

Plan the dive, dive the plan      

Like anything in virtual production, there are caveats to the way you do things. When working with water, you need to take time to ensure everything has been set up correctly. For example, according to Hall, moiré patterns may become an issue, especially if you want to focus on a highly reflective object like water inside the volume.

 “Any wire work (such as for swimming), has to be well planned and executed,” advises Stiff. “Hair and floating materials need to be considered because they won’t behave like they are underwater. Therefore, tight clothing, tied-back hair or head coverings are the best option, unless you can account for it in post-production.”

Practical effects

Augmenting video backgrounds with practical water effects is not only possible but encouraged, producing a more realistic final image, as if you were on location.

For example, says Hall, when working with rain or particle effects, you want to ensure that the physical properties of the virtual rain – including spawn frequency, droplet size, wind direction and material – match those of the physical rain. 

“If there’s rain falling on a subject, it should also be in the virtual environment. Though it’s not always necessary; if you have a shallow depth-of-field in a large volume with physical rain, it may not be needed. This can save on GPU resources.

“The VP supervisor will also be able to advise you on the capability of practical water effects on the volume: LED panels have varying tolerances to humidity.”

However, as we all know, water and electronics do not mix well. The environment has to be very controlled.

“If shooting using a water tank and the LED volume as a backdrop, whether that is for underwater or above, the main thing to worry about is safety,” says Stiff. “These are large electronic devices next to a pool of water with actors. Keeping a decent distance between the water tank and the volume is wise, but this creates a gap between the practical and digital water. To allow the blend between real and digital, camera angles should be kept low to the water surface or high to obscure the edge of the water tank.”

Tim Doubleday, head of on-set VP, Dimension and DNEG 360, claims that the main challenge is limiting practical water to a single area. “When done right, the two elements work brilliantly together since you get all the natural reflections and refractions from the LED wall in the practical water.”

Catch the next wave

Water simulations have advanced to the point where they can now run in real time, including waves, crashing surf and complex water movements.

“The Fluid Flux plug-in for Unreal Engine seems to be widely used in the community and has produced great results,” says Hall. “I also know there has been impressive use of ICVFX for underwater scenes, so I’m interested to see how that progresses.

“As hardware improves, we can do more in real time, which will only lead to more accurate and realistic virtual water elements,” he adds.

The way light interacts with water can also be rendered with a high degree of realism. Doubleday thinks we will see further advancements in how objects interact with water, such as a boat carving through water leaving a trail in its wake, or how a heavy object disperses water when dropped from a height.

“These situations can be simulated using complex offline processes, but I can’t wait to see them running in real time on a giant LED wall!”

Deeper dive: Shark Attack 360, Season 2

Diving into the factual landscape of shark behaviour, the second instalment of Disney+ show When Sharks Attack 360 investigates why sharks bite humans. As the evidence mounts, the international team of experts analyse data in a VFX shark lab, all in order to understand in forensic detail why sharks attack.

For the docuseries, animation studio Little Shadow developed a hybrid VP workflow. Instead of using a traditional LED volume, it used a mix of custom-built systems and off-the-shelf tools to facilitate live green-screen keying, camera tracking and live monitoring.

“We blended live-action footage with CGI, allowing us to transform an underground theatre in Islington into a virtual 360 shark lab,” explains Simon Percy, director at Little Shadow.

“Initially, we used a LiDAR scan to create a 3D model of the venue, which we then employed to plan and scale the project. Due to the venue’s layout and lack of soundproofing, we ran a 4K signal across 110m and four floors using BNC cable, which allowed us to keep most of the equipment separate from the set.”

The creation and integration of CGI assets, such as the shark models and virtual marine environments, were key for building the immersive underwater settings, which were then played back on-set using the VP box, providing immediate visual feedback.

Percy continues: “We built the flight case around a custom PC for Unreal Engine, a pair of Blackmagic HyperDeck Studio recorders, the Ultimatte 12 4K for keying and a Videohub 20×20 for signal management. We also frame synced our cameras to Unreal using the DeckLink 4K Pro. This approach proved both mobile and flexible, ensuring quick playback with real-time asset generation and comping adjustments on shoot days.”

A private Wi-Fi network connected the flight case to an on-set laptop, allowing them to control it remotely, including live switching via ATEM 1 M/E Constellation 4K.

“To bring the underwater scenes to life, we used a green screen and the Ultimatte, which allowed us to integrate the virtual 3D elements into the scenes using AR. This enabled the presenter to have a precise real-time interaction with the sharks. In combination with DaVinci Resolve’s AI functions such as Magic Mask for rotoscoping, we were able to blur the lines of where real and virtual production meet,” Percy adds.

Looking ahead, technological advancements in water and fluid physics simulations are moving quickly. “With the advent of powerful RTX GPUs from NVIDIA and tools like JangaFX’s Elemental suite, we can now simulate water dynamics in closer to real time – a process that would have previously taken days to complete. Blender’s capabilities for large ocean simulations, augmented by plug-ins like Physical Open Waters, hint at the possibilities for increasingly realistic and cost-effective water effects in the future of television production.”

Monday, 9 September 2024

5 minutes with Colourist Pete Ritchie

 interview and copy for Sohonet

article here

Colourist Pete Ritchie is in constant demand for grading high-profile TV campaigns and corporate videos for premium brands including Amazon, Vodafone, Samsung, Toyota, SK-II, Heinz, 3 Mobile, and Facebook. He’s been doing so over a 30-year career which began in Melbourne, Australia, and has continued for the last two decades in Auckland, New Zealand, where he now lives. He explains how he maintains the great relationships he has with clients all over the U.S. and Asia, working across time zones in about as remote (and beautiful) a location as you can get.

A lot has probably changed since you began in the industry…

Yes, that’s right. I joined the industry straight from school aged 18 when transfers were being made from film to one-inch tape. Technology has changed the business and the art of what is possible dramatically. About 20 years ago, I was working for a facility in Sydney and got poached by a facility in Auckland — a city I have been living in ever since. I met my wife here and we've got a family, and everything just sort of fell together. 

About 10 years ago I made the decision to go freelance and set up my own little studio in Auckland with a partner who is a Flame artist. At the time, I probably had one of the first remote freelance kits in the country. I could just go and set up in a production company office and work there for the day if needed. The lighting was never ideal, but it was really good fun.

Then came Covid, when we had to work from home. After that there was no real reason to keep the studio going with the cost of overheads. It was a pretty easy decision to close it down and set up a permanent space at home. It’s a nice suite but it’s not really set up for large client reviews, and that's where finding a remote systems partner became really important. I desperately needed something and when a colleague of mine at Company 3 mentioned ClearView to me I thought I’d better have a look.

Who do you work for mostly?

I have a great relationship with director Simon Clark who is based in Seattle, Washington, with his production company Hey Brutus. He's fantastic and we do a lot of work together. I have other regular clients in Tokyo, Singapore and Australia. The time zone works quite well for them in that they can brief me at the end of their day, and I can happily grade away during regular NZ hours and then present the project for their morning, at the start of their day.

What is your principal kit? 

The essential items that I have here at home are a Blackmagic DaVinci Resolve Panel, a Sony PVM-X series calibrated OLED monitor, a Mac computer to power it all and the ClearView Flex.

What has been your experience with ClearView?

Fantastic! It's enabled me to maintain the international relationships that I already had, to build new ones, and to provide local approvals with clients sitting at their agency office in Auckland.

With the way budgets are now, people don't always have the money or time to sit in a suite with me for eight hours. It’s no longer an efficient way of working. Remote sessions suit everyone.

Can you deep dive your workflow for us? 

It depends on the job but I love to get a brief up front along with any references for look. I come on at the tail end of a job that their team have been working on for six months, so there's already been lots of conversations. I like them to share all their ideas; any sort of visual reference I can get from pitch documents or links to work they like is great. We’ll have a good chat up front, then they’ll leave it with me to get a feel for the photography and explore looks.

Quite often if you've got a client next to you in a suite they'll be very direct in terms of where they'd like it to be, which is great and helpful but can also cut off some creative options. I prefer to take their opinions on board and then work through all the multiple directions we could take the grade. 

Personally, I enjoy the process more if I am allowed to have a play with the photography and get to understand it first. I’ll then present options to them. In my experience I can get seventy per cent of the way by doing this. That's where the ClearView session works really well.

We also use Frame.io a fair bit but it’s a slower process because it doesn’t have live feedback. ClearView is fantastic because we can sit on a shot and talk about whether we could try it with a warmer look, say, or brighter, or tweak it any different way. We can see those changes right there rather than having to sign off and arrange another session. Clients really value that live interaction because they are just too busy to be sitting all day on reviews.

That’s not to say you won’t have multiple approval sessions. Sometimes you will go through that loop half a dozen times, but the whole process is so much quicker, more creative and more fluid. ClearView gives me that opportunity to go live back and forth with the client, which is what gets you to the final look that everyone's happy with.

IBC Keynote: Prime Video backs AI to redefine intuitive streaming

IBC

Prime Video and Amazon MGM Studios Technology’s Girish Bajaj shares the latest on a complete user experience redesign, from leveraging AI to improve search and personalisation, to nurturing growth in shoppable TV.

article here

As entertainment shifts to streaming, the battle for viewers will not be lost or won on the quality of content alone but on the ability to serve people what they want, quickly.

“What we hear from customers is that they want a simple experience,” says Girish Bajaj, Vice President, Prime Video and Amazon MGM Studios Technology, who will keynote IBC on September 14. “They tend to be very frustrated with all of the choices that they have today in streaming as well as managing multiple accounts across multiple different apps.

“Based on customer feedback, it is very clear to us that they want to spend less time searching and more time watching content. This is what we work on every single day,” he says.

According to research firm Nielsen’s 2023 State of Play report, using data from its Gracenote arm, US consumers spend over 10 minutes trying to decide what to watch.

“Our focus is to build a product for Amazon Prime Video that is simple to use because customers are seeking simplicity.”

Building it is not quite so simple however. Not when another goal is to personalise the entertainment experience for its more than 200 million members worldwide.

Improved AI for content discovery

Amazon recently debuted an entirely new AI-powered interface intended to streamline and enhance the overall content discovery experience with more personalised recommendations.

“We want them to have more control over their streaming experience. We think that’s the way to reshape the future of entertainment video,” Bajaj says.

Amazon aims to be a ‘one-stop-shop’ for video entertainment by aggregating a vast amount of premium programming in a single app. This ranges from Amazon MGM originals to live sports (like Thursday Night Football in the US). It also serves programming from partners as an add-on subscription, such as Max and Crunchyroll (you can even subscribe to services like Paramount Plus via Prime Video), plus dozens of FAST channels, with plenty more upcoming, he says.

The UEX redesign offers different rows of recommended content, depending on which profile you’re signed in to. Prime Video’s recommendation engine is built on Amazon Bedrock, a service for creating generative AI experiences, and draws on what you have watched previously. It also offers new categories to help you find content based on taste and on what is ultra-popular in different regions.

The company is using AI to scale and deliver on its personalisation promise, something which Bajaj will elaborate on at IBC. He says, “We use computer vision and machine learning, specialised hardware, distributed systems, and AI quite extensively at Amazon and in Prime Video. We think it’s very core to what we do and we leverage both traditional AI as well as generative AI to build and power recommendations.”

Building a seamless user experience

Bajaj started with Amazon in 2006, as a Software Development Engineer helping to develop Kindle, and in 2012 joined Prime Video. Since then, he has launched the technology that powers the company’s channels business, expanded the service to over 200 countries and territories, rolled out the X-Ray features which give additional TV and movie information during playback, and launched Prime Video Ads.

As a senior technology executive, Bajaj now spends a good deal of his time optimising the tech stack to ensure the platform delivers for customers.

“We have to ensure an equally positive experience for all types of content, whether that’s a series or a movie or a live event and whether that member is in a city or the countryside. We spend a lot of time working with partners who have the scale and capabilities to meet these needs as well as trying to stay ahead of innovations. We spend a lot of time obsessing about building a high-quality streaming experience that works at scale. A global streaming product is not easy to do. It is very complex and we build features for every single content type.”

That includes a particular focus on live events where Bajaj’s aim is to enable viewing with the lowest latency possible. “We don’t want customers to find out about a score or some attribute of the game before they see it and have that spoiled for them.” This is something he will also delve into at IBC.

Amazon even employs AI to help spot issues before they arise for content distribution. Depending on where you’re connecting from, it will use a method to determine the best AWS cloud from which to pull the content.

With younger viewing demographics chalking up more time on social media and YouTube, it’s worth asking how Amazon Prime sees the long-term future of long-form video.

“We think long-form video is really, really important and a key way that customers engage with content,” Bajaj insists. “It’s going to stay and we’re optimised for long-form premium content. There’s a future for it and we’ll continue to lean into it.”

T-Commerce

Shoppable TV – the notion of connecting viewers directly to retail sites from the Connected TV – is another area of revenue growth for Amazon. Earlier this year advertising giant GroupM and its clients including Danone agreed to develop original, shoppable content for the Amazon Live FAST (free ad-supported TV) channel.

Previously, Amazon Live produced a travel-focused shoppable stream during a Virgin Voyages sailing, enabling viewers to virtually immerse themselves in the cruise experience for the first time and shop AI-powered image-generation vacation packages. Another brand, Method, integrated its Simply Nourish haircare products within Amazon Live’s “Beauty Haul” and “Get Ready with Me” episodes, where host Brandi Milloy inspired viewers with trending beauty tips while they could conveniently shop the featured products.

“We think shoppable TV is a key part of the value proposition. We bring the platform to our partners as well as our consumers where they can watch something that they love but also be able to shop for the products either they find interesting in the show itself or shop for something that’s related to the show in some way.

“It’s all about giving customers multiple ways to engage with Prime Video and our service.”

Aaron Reid / Supacell

British Cinematographer 

Cinematographer Aaron Reid grounds a new breed of superhero in south London.

article here

Few would have pegged Peckham as the home of a new breed of superheroes, but that’s where the new six-part Netflix series Supacell puts them. Written and directed by Andrew Onwubolu, commonly known by his stage name Rapman (Blue Story), this sci-fi drama about a group of Black superpowered people is shot by Aaron Reid (A Town Called Malice).

“They wanted something grounded. It needed to feel big in scale yet for the first few episodes it was very important to establish the emotional heart of our characters,” says Reid.

“Every reference Raps gave us was American. It was Marvel and DC, Snowfall, BMF and The Wire. That led us to a few important decisions.”

One was to work with Joseph Bicknell, a British grader working at Company 3 in New York. “He has a real eye on the American sensibility because he is working with American commercials and music videos every day but he knows British terrain and its weather.”

Another was to shoot on Alexa 35, a camera selected by Reid because of its 17 stops of dynamic range. “Our leads are incredibly attractive and I wanted to bring that out so having a lot of range in the highlights felt like a really good option,” he explains. “We were going to be shooting dark skin tones in environments where we might be shooting against a hot window. I wanted to have exposure of the window and the exposure of our subjects.”

With orders for the camera already backlogged before its release, the production worked with rental house One Stop Films to beat the competition. With DIT Alix Milan, Reid tested the camera's capabilities side by side with the Venice and the Alexa LF. Then, with the agreement of Netflix technologist Chema Gomez, he dispatched One Stop Films to Munich to make the case to ARRI, returning with agreement to use two units and making Supacell (probably) the first UK show to shoot on them.

“The main feature I liked is it is really hard to overexpose anything,” Reid reports. “We have scenes with explosions and we kept all the detail in all the highlights. Even if an image might seem blown out, all you need do is put a bit of a curve on it and bring back the information. Plus, it has a cleaner feel than an LF or Mini.”

The new camera came with a challenge since there was no firmware with which to upload the exterior and interior LUTs designed by Reid and Bicknell, so Company 3 wrote one.

Wanting a faster T stop, Reid paired it with Tribe7 Blackwings. “They’ve got a really nice bokeh and a soft quality about them. At the same time they hold details really nicely, and they’ve got their own slight bit of contrast too.”

Creative considerations

As sensitive as the camera may be, darker skin tones are naturally more absorbent of light, so Reid lit almost every scene, from the set piece finale to smaller scenes in a café, with a simple key light on the lead characters.

“Our cast has the full spectrum from lighter to darker skin, and how to light them is something the cinematographer should understand. It wouldn’t matter what cameras I use, it’s something I’d always consider.”

He explains, “I light for the space but we also had some handheld Jem Balls so when the action is moving we could move around with our actors. If I was working with white skin tones I might not have needed to but in low light situations especially that light doesn’t translate so you’ve got to give it a little kick on to your characters.”

Reid lensed episodes one, two and six and shot all the pick-ups for the show, with Sam Heasman shooting the other block.

Local lads

Both Reid and Rapman are south London bred, so they were very familiar with most of the locations for Supacell in Thamesmead, Slough, Peckham and Deptford. Because of the scheduling, there were periods when the DP was juggling seven lighting crew variously rigging, shooting and derigging locations, some of which doubled for the same story location.

He credits gaffer Paul Parker and rigging gaffer Thomas Thomas for their wealth of experience in keeping it all within budget. Pixipixel supplied all lighting equipment and HVO-powered generators for the production.

At an army base in Slough they rigged a large inflatable green screen to shoot live action for a scene set in Piccadilly Circus which began with extensive previs.

“The idea was to keep it grounded – until it isn’t,” says Reid of the superhuman elements that increasingly invade the story. “The opening scene is very stylised and hints at what is to come. Then we revert to a drama that looks and feels normal. The way the super powers are shown is exactly how Raps envisioned it.”

By the finale the world building of Supacell has only just begun.

“The idea that characters could gain superhero powers because they’ve inherited a particular gene means they could pop up anywhere. This series is a South London show but Raps could go where he wants. Why not take it to LA, the Bronx and Mumbai or Australia?”

Of Rapman himself, Reid says his energy was infectious on set. “On one night exterior shoot we’d put up four large cranes with lights, only for everyone to arrive to find the set waterlogged and gale force winds blowing. We took the cranes down but Raps was calm and said let’s carry on and shoot it another way. He knows how to get the best out of his actors and the whole team.”