Tuesday, 21 April 2026

Behind the scenes: Rooster

IBC

article here 

When cinematographer Blake McClure signed on to shoot HBO’s new comedy Rooster, he wasn’t looking to reinvent the sitcom; he was looking for a way to stretch the visual language of comedy by integrating the intimacy of large format.

In an arena where dialogue often dominates, camera grammar in comedy is usually expected to stay politely out of the way. What emerged instead was a bold, large‑format aesthetic built around the Blackmagic URSA Cine 17K 65, a camera McClure initially doubted but ultimately used to redefine the show’s emotional texture.

“Comedy is driven by dialogue. It’s about timing, tone, the speed of the joke. There’s not a lot of air between lines so there’s less room for expressive camera language. You don’t get the lingering, the following of characters, the visual storytelling you get in drama,” he explains.

McClure has balanced a career shooting comedy (Miracle Workers, starring Daniel Radcliffe, Apple TV’s Loot and numerous segments for Saturday Night Live) with drama (Ryan Murphy horror Grotesquerie, The Dropout starring Amanda Seyfried and psychological thriller Ratched).

“Rooster finally gave me the chance to merge those two worlds,” he says.

Rooster stars Steve Carell as a pulp fiction author who lectures at an Ivy League university where his daughter is a professor. It’s the latest show from creators and showrunners Bill Lawrence and Matt Tarses, whose back catalogue includes Scrubs, Spin City and Ted Lasso.

“While still a comedy at heart, Rooster is more character‑driven, blending humour with more dramatic and personal themes,” McClure says.

The show’s visual identity began with Dream Scenario, a 2023 satire starring Nicolas Cage and shot in 16mm by DP Benjamin Loeb. McClure and director Jonathan Krisel initially explored Super 16 to emulate its contrast and texture but the idea quickly ran into practical and aesthetic limitations.

“Our sets were tight. We couldn’t throw backdrops out of focus. Super 16 has a deeper stop, so it wasn’t helping,” McClure recalls. “I started thinking: what’s the exact opposite? Medium‑format 65mm.”

Large‑format portraiture—epitomised by Hoyte van Hoytema’s work on Oppenheimer —had long fascinated McClure. “The emotional closeness, the lack of distortion, the way longer lenses compress space without pushing the camera back,” he says. “It felt right for Rooster’s character‑centric storytelling.”

There was just one problem: 65mm is rarely feasible in television. Multi‑camera setups, tight schedules, and data demands make it a luxury. ARRI’s Alexa 65 had been used to shoot The Revenant, Barbie and Dune: Part Two but was not an option for a TV budget.

But in spring 2025, with Rooster in pre‑production, Blackmagic announced the release of the URSA Cine 65, a camera McClure admits he initially dismissed.

“They’ve been very consumer‑oriented. I wasn’t sure it would hold up to a demanding TV schedule,” he says. “But the idea of 65mm portrait storytelling kept pulling me back.”

Camera placement in large format

The URSA Cine 65 didn’t just change the look; it changed how McClure approached position and framing.

“I did initial tests to see if I even wanted to use the camera. Then I brought the director in for more tests. Finally, we did hair, makeup and wardrobe tests representing the show’s tonal range. That’s where we figured out focal lengths for over‑the‑shoulders. With the wide field of view, foreground shoulders sometimes looked too close or made actors feel far apart, so we often eliminated the foreground shoulder entirely and shot just inside the eyeline.”

They always ran two cameras, and about 80% of the time had a third. He shot with a full set of Camtec Falcons, mostly using a 55mm T1.3.

“Because the sensor is so big, I didn’t want to just use longer lenses and push the cameras far back—that defeats the purpose,” he says. “But being physically close created challenges with shadows and angles.

“We were also physically closer to the actors than they were used to. They’d look at the camera and say, ‘Why are you shooting an extreme close‑up?’ But it was actually a medium.”

For example, in scenes set in the college president’s office, he would sometimes put two cameras on the wall or cross‑shoot. “If an actor stood up, our close cameras couldn’t tilt without shooting up their nostrils, so we’d have a third camera higher up to maintain continuity. Sometimes that third camera became a profile shot. We had to rethink placement constantly.”

The 8K workflow

The URSA Cine 65 can shoot 17K, but McClure settled on 8K, balancing quality with practicality. Convincing HBO required transparency, but he had an ally: British DP John Brawley had already used Blackmagic’s 12K camera on Apple TV comedy Shrinking which was also made by Bill Lawrence’s production company Doozer Productions.

“It helped that DigitalFilm Tree was the same postproduction facility [as Shrinking] so they were familiar with Blackmagic RAW. Working with the URSA Cine 17K 65 wasn’t a huge leap for them. But studios don’t want to hear ‘8K’ because of data costs,” McClure says.

Even the idea of capturing to the camera’s 8TB media mags was off‑putting: offloading that much footage could take hours, which is not conducive to fast‑turnaround TV schedules.

Instead, Blackmagic advised the team to use CFexpress cards which hold about 1TB. “We used about 10 per camera and rotated them as the studio cleared media for deletion.”

Also making an 8K workflow viable was McClure’s choice among the compression options offered by the camera.

“We tested all the compression settings and landed on Q3 which uses a variable bit rate to encode only the moving parts of the frame,” he explains. “More specifically, Q3 allocates more data to areas of high detail and less to static parts of the frame, reducing the overall data rate. You don’t notice it visually, but it meant that our 8K files were actually smaller than an Alexa 35 show I’d just done. This made it an easier sell to the studio, since we weren’t asking them to approve a massive data storage footprint for the DIT or post production. Plus, it held up beautifully in grading.”

Delivery in 4K gave ample room for reframing in post. “We shot 8K using the full width of the sensor within our 2:1 aspect ratio. You can zoom into this camera 350% and it still holds up,” he says. “I’m fine with punch‑ins if they serve the story.”

Lighting for naturalism

The script’s first line mentioned ‘New England fall colours’, so that became the guiding idea for the look of the show, despite shooting exteriors at The University of the Pacific’s Stockton campus near San Francisco and on stages in Los Angeles.

“We pushed as much sunlight‑feeling light as possible—20Ks, Molebeams, anything to create that fall atmosphere,” McClure says. “We embraced blown [overexposed] windows. That allowed us to use exterior light as bounce and fill, then augment with smaller sources inside.”

With colourist Josh Bohoskey, he referenced Dream Scenario for its blown‑highlight, high‑contrast look. “Josh built LUTs that brought the lifted shadows back down while keeping a slight lift. We also kept the palette warm, dialling out blue computer‑screen hues, for example. Production design was fantastic and set the foundation.”

He employed an LED colour contrast filter from Camtec. The Color-Con consists of a diffusion filter surrounded by LEDs in a filter holder that occupies two spaces in the mattebox. In addition to controlling colour and contrast, the LEDs can create a controllable, directional hot spot. McClure likens it to a modern version of the Lightflex that Freddie Francis, BSC used on The French Lieutenant’s Woman (1981), or the ARRI Varicon.

“With the sensor being so large, every lens naturally has a bit of shading toward the edges of frame. The Color-Con filter exaggerates that. I could focus more LED light on the centre of the filter. So it’s a little bit brighter in the centre and that enhances the shading. Depending on our stop, some shots almost look like there’s a hole punch while others are very gentle.”

He explains, “It lifts the highlights and shadows, almost like shooting through smoke without needing smoke. I’m obsessed with baking-in as much of the look as possible; colour, tone, LUTs, everything.”

Lessons learned

Even now, finishing colour on the tenth and final Season 1 episode, McClure is still learning from the format.

“The biggest lesson is: don’t drift back into old habits. Sometimes I drifted back into lensing in instead of physically moving closer, but then you lose the charm of the format. Staying committed to that closeness takes more time and sometimes means sacrificing coverage, but everyone agreed the results were worth it.”

It’s a testament to how a comedy series became a proving ground for large‑format storytelling.

“I wanted the show to feel like a portrait, intimate and relationship driven. I’ve always loved medium format photography, and this was the closest I’ve been to that in motion. The intimacy we got was worth every bit of problem‑solving.

“It’s not the right camera for every project—just like anamorphic or 65mm isn’t always right—but it now sits alongside Alexa and Venice as a legitimate tool. It should be considered seriously.”

Country music to comedy

Born in Nashville, Tennessee, McClure got his first spark of inspiration from an unlikely source, the Ernest movies (Ernest Goes to Camp, Ernest Scared Stupid, etc.), because they filmed in and around his neighbourhood. Seeing crews moving about and setting up big lights revealed the world of filmmaking to McClure, who went on to attend and graduate from Watkins Film School.

He worked as a PA on the Coen Brothers’ O Brother, Where Art Thou?, shot by Roger Deakins, and credits those two months on set with Deakins with teaching him as much about cinematography as he learned in film school.

“When I first moved to Los Angeles, I had a reel full of country music videos. The first people I connected with happened to be comedy filmmakers, including Oz Rodriguez, who directed on Rooster. I started shooting shorts with that group, and as often happens in this business, one connection led to another. That was 15 years ago, and it built from there. I haven’t shot a music video since—but I’m shooting my first one since then this weekend, in Key West.”


Wednesday, 15 April 2026

Spatial computing: “Instead of showing people a story, you’re letting them inhabit it”

IBC

article here

Leveraging generative AI, computer vision, and data from real environments, spatial computing has opened the door for cutting-edge systems that blend the physical and digital worlds into a new frontier of human-technology interaction.

Marketed by Meta boss Mark Zuckerberg as the metaverse, a virtual playground populated by avatars, the next-gen internet is now being reconfigured around spatial computing with applications accelerated by AI. 

“The metaverse didn’t die — we simply stopped using that word,” says Rosemary Lokhorst, CEO and co-founder of XR developer Badass Studios. “What we’re seeing now is the same idea evolving and becoming more practical through spatial computing.”

For years, spatial computing – whether labelled VR, AR, MR, or ‘the metaverse’ – has cycled through waves of hype and recalibration. Recently, something has shifted.

“AI is enabling spatial computing by solving problems that seemed impossible just a few years ago—scene recognition, environmental awareness, gesture understanding, natural language processing,” explains Neil Trevett, president, The Khronos Group, and VP of developer ecosystems, Nvidia. “These were previously hard research problems. Today, they are increasingly productised capabilities.”

At the same time, spatial environments are becoming training grounds for AI. Digital twins allow systems to learn how to interact with complex, real-world physics and human behaviours.

“The result is a feedback loop. AI enables spatial computing, and spatial computing enables AI,” says Trevett, who describes the metaverse simply as “spatial computing experiences where users are connected together.”

Khronos develops open standards for 3D graphics, compute acceleration, and AI. The technologies now overlap. “AI’s impact on spatial computing is fundamental,” Trevett says. “In turn, spatial computing is evolving into a natural user interface for AI, embedding intelligence directly into the environment rather than confining it to a 2D screen.”

On a technical level spatial computing leverages technologies like computer vision to create interactive 3D representations of environments. By analysing visual data, computer vision interprets the geometry and layout of physical spaces. According to Nvidia, other technologies, such as Gaussian splats and NeRFs, enable the rapid reconstruction of 3D scenes for visualisation and analysis. Generative AI can transform 2D images into 3D animations, enhancing the integration of digital content with the real world.

Take out the jargon, however, and spatial computing is really about using technology in a way that mirrors how we experience the real world.

“It’s about creating an environment where you feel connected to what’s happening around you and able to share that moment with others,” says Lokhorst. “It’s location-based computing — technology that understands and interacts with space.”

The idea behind the metaverse was similar: a three-dimensional environment with depth and space where you can move around and feel as though you’re actually there. One difference is that instead of fantastical VR worlds experienced vicariously by animated proxies of ourselves (the Ready Player One or Snow Crash version in popular culture which Zuckerberg bought into), the spatial internet is grounded in reality.

“What excites me most is how generative AI, computer vision, machine learning, and AI agents work together,” Patrick Hadley, Sponsored AR Product Leader at Snapchat, told an audience at CES. Snapchat’s AR lenses are used 8 billion times per day. That scale gives it a live testing ground for what comes next.

“Think of spatial computing as the canvas, generative AI as the paint, computer vision as the eyes, and ML as the technique,” he said. “Together, they’re enabling entirely new experiences.”

Nonetheless, even Meta, which by some estimates has spent $60 billion on attempting to build the metaverse, has pivoted to talk about spatial computing.

“We’re building what we see as the next generation of the internet—the spatial internet—where people can feel presence and togetherness across devices and locations,” said Anne Hobson, Policy Lead for Metaverse Products at Meta at CES.

Notably, Hobson is still in charge of ‘Metaverse Products’ like the Quest headset or Ray-Ban Meta glasses. “[These are] devices that blend the physical and digital worlds,” she said. “They give AI a first-person view of what you’re seeing in real time, making AI more useful in the moment.”

The global spatial computing market was worth $102.5 billion in 2022 and is projected to reach $469.8 billion by 2030, according to some estimates.

Nonetheless, Meta has scaled back its ambition, focusing on developing wearables as the interface to spatial computing rather than building the metaverse itself. At the start of the year it shed 10% of jobs at Reality Labs with this new strategy in mind.

Other companies are stepping in to furnish the software building blocks of the spatial internet. They are gathering data from real environments and parsing it through Large Language Models (LLMs) to create digital counterparts, rendered in some cases using games engines.

Niantic Labs is one. Famous for designing mobile AR game Pokémon Go and now owned by Saudi Arabian group Savvy Games, it is building a shared coordinate system of the world for humans and machines. That means reconstructing and understanding real-world spaces so headsets, drones, robots—anything with a camera—can interact in real time.

“We’ve scanned over a million places worldwide and for us that ground truth data is essential,” explained Azad Balabanian, product manager at Niantic Spatial, at CES. “While generative AI is powerful, we can’t over-index on fully synthetic outputs. For many enterprise applications you need millimetre-level accuracy.”

Its geospatial model was showcased at an event during the Super Bowl in late February, when Niantic Spatial enabled a physical robot and its digital twin to share the same reality, viewable in real time on mobile phones. Because the robot and phones were all localised to the environment, they all had the exact same understanding of where they were in space.

“This demo demonstrated the next frontier of our work: AI that understands the physical world,” the company enthused. “We believe there is a significant, untapped potential that is realised when AI moves beyond the screen and into our physical reality. Our mission is to move past the idea of AI as a digital only tool by giving it a sense of place.”

Another company fusing LLMs with real world physics is World Labs. The startup is valued at over $5 billion by investors including Autodesk and Nvidia. Its founder, Fei-Fei Li, talks about how ‘spatial intelligence’ plays a fundamental role in defining how we interact with the physical world and of the challenge in designing computer sims that mimic this.

“[We need] a new type of generative model whose capabilities of understanding, reasoning, generation and interaction with the semantically, physically, geometrically and dynamically complex worlds – virtual or real – are far beyond the reach of today’s LLMs,” she believes. “The field is nascent.”

But this research isn’t a theoretical exercise. Li says, “It is the core engine for a new class of creative and productivity tools.”

Li is positioning Marble, World Labs’ virtual world building tool, as integral to new immersive and interactive experiences. Just like the vision for the metaverse, this is conceived as a fully mapped 3D digital world that we all share.

“We’re approaching a future where stepping into fully realised multi-dimensional worlds becomes as natural as opening a book,” she argues. “Spatial intelligence makes world-building accessible not just to studios with professional production teams but to individual creators and anyone with a vision to share.”

Content producers are already busy operating in spatial computing modes.

British firm Nexus Studios creates XR content for mobile devices, such as for horror studio Blumhouse, and massive immersive screen experiences at Las Vegas Sphere. It also creates multi-sensory experiences for theme park rides, museums and gallery installations.

“We’re well-versed in both cinematic storytelling and what we call spatial storytelling,” says Chris O'Reilly, co-founder and chief creative officer. “These huge new screens are architectural-scale storytelling environments. They’re not just screens you watch — they’re spaces you inhabit.”

The canvas of spaces like MSG Sphere allows creators like Nexus to describe what they do as world-building. “You can render them as planets, or be inside someone’s bloodstream. The challenge is ensuring your artists don’t think of the space as just a large rectangle. Instead of framing shots, you’re sculpting environments. Instead of showing people a story, you’re letting them inhabit it.”

Badass Studios is already building digital twins of sports like E1 racing and MMA, repurposing the data into live AR overlays on the broadcast or virtual game simulations.

“Imagine watching tennis or football in virtual reality,” Lokhorst says. “You could enter the stadium virtually, choose your seat, and watch the match from anywhere. You might even stand on the pitch during a penalty.”

Similar applications were promised several years ago during the first metaverse hype and arrival of 5G.

“A lot has changed technologically since then,” she says. “Compute power has increased, rendering engines like Unreal Engine have improved dramatically, and high-resolution environments are easier to transmit over the internet.

“AI has also accelerated development. Where building a game environment once took about a year, we can now do it in two to six weeks. For example, recreating a city like Monaco or Miami might take two or three weeks.

“Today it’s becoming more industrial and practical. Sectors like military training and healthcare simulations have helped improve the underlying technology and infrastructure.”

Miniaturisation and comfort

Previous waves of XR were defined by bulky headsets and niche gaming use cases, but the current phase is characterised by miniaturisation and distribution.

Ziad Asghar, GM for XR and Personal AI at semiconductor giant Qualcomm, said at CES, “We’re in the middle of a major transition—from personal computing to mobile computing, and now to spatial computing. The convergence of XR and AI is unlocking use cases that simply weren’t possible before.”

Smart glasses, smartwatches, even earbuds with cameras “can understand and interact with the world around you in ways a device in your pocket cannot,” he said.

“But there are real challenges. You need incredible AI processing on-device. You can’t send everything to the cloud. That means best-in-class performance per watt, excellent connectivity, low power consumption—and all in a tiny form factor. A smartphone battery might be 20 times larger than what fits in smart glasses, yet users expect the same experience.”

A solution is emerging out of stealth mode in Dubai. Xpanceo is developing a smart contact lens designed to integrate XR, night vision and optical zoom. A small companion device worn on the body handles processing and wireless power transfer. The company describes the concept as an “invisible computing platform” designed to replace screens altogether and also as a “habitat for intelligence” where data, sensors, and human perception converge.

Founders Roman Axelrod and Dr. Valentyn Volkov will wear the prototype at its first public demonstration at the beginning of 2027 (the timing suggests CES).

Axelrod and Volkov call it the “after-glasses” era, telling Forbes that, if their team succeeds, the computer will no longer be a device we hold or wear. It will be something we look through, a living interface between biology and the digital world.

 

Tuesday, 14 April 2026

NAB preview: Automation, reinvention and politics to steal the show

IBC

NAB 2026 looks set to bring a raft of creativity and technological innovation, yet serious political and environmental questions remain.

article here

As the global media‑tech community heads to NAB, the mood across the sector is harder to read than at any point in the past decade. Economic headwinds, supply‑chain friction, shifting audience behaviour, and the gravitational pull of AI have created a landscape that feels volatile. Yet the IABM, which represents broadcast and media‑tech suppliers, says the narrative of decline simply doesn’t match what it’s seeing on the ground.

“It’s not as negative as it seems,” says Stan Coote, the organisation’s CTO. “When budgets tighten, marketing is usually the first thing cut. This year, it’s the opposite. Vendors want visibility because they have real innovation to talk about.”

That said, the cost of Las Vegas accommodation and air fares plus reluctance to run the gauntlet of heavy-handed TSA and social media scrutiny is expected to dampen international attendance.

“You don’t want to say it’s a domestic show but there’s no question we are running into some [people] who say they are not going,” Coote reports. “NAB is still where the industry launches products. Even if some people don’t attend physically, they want the wrap‑ups: ‘What did I miss?’”

Industry optimism is echoed in the IABM’s latest business‑intelligence data. Chris Evans, Head of Knowledge and Insights, has been analysing the first wave of the 2026 MediaTech Industry Tracker, which surveys both suppliers and buyers across the sector. What emerges is a picture of divergence rather than uniform contraction.

“Those heavily invested in pure linear broadcast are feeling the squeeze,” Evans explains. “But companies that have diversified into streaming, digital, enterprise, education, government, or other adjacent verticals are finding new customers. It’s a business‑model story. Are you still chasing the same buyers, or are you repositioning your products for new markets?”

Evans adds that trust is becoming more important than ever. “With tighter budgets, buyers want confidence in vendors. They want to understand the roadmap, the ROI, the reliability. NAB is still the place to build those relationships and access the North American market.”

Broadcast: no longer the centre of gravity

The IABM’s revenue‑mix data tells a story of gradual rebalancing. Today, 61% of supplier revenue comes from media and broadcast, with 39% from adjacent markets. That’s actually a rebound: 2023 saw the lowest broadcast share in years, driven by post‑COVID disruption and the Hollywood strikes.

“We’re seeing a rebuild on the broadcast side,” Evans notes. “But the adjacent‑market strategies vendors developed during that period are now maturing. They’re learning how to position their products for adjacent markets.”

Coote highlights a major conceptual shift in the organisation’s technology roadmap. What were once “parallel markets” are now framed as transdisciplinary and plenary integration.

“Media now spans multiple industries, and the benefits flow both ways,” he says. “Education and healthcare are using AI to generate content and graphics, and those techniques are feeding back into broadcast. Audiences care more about compelling content than ultra‑high production polish. That’s influencing everyone.”

Ross Video is one company with considerable crossover in target markets for its products.

Simon Hawkings, Sales Strategy and Business Acceleration Director, says he’s anticipating “a very practical” NAB, with conversations focusing less on what’s possible 10 or 15 years from now and more on what’s practical today.

“The defining trend will be a shift from innovation to operational impact. AI, automation, cloud, and robotics are no longer framed as future disruptors; they’re being deployed to solve immediate challenges around efficiency, scalability, and cost. I think the focus of the show will move toward streamlined workflows.”

Tariffs and supply chains

While macroeconomic pressures are real, the IABM isn’t seeing the kind of existential strain some feared. Tariffs remain a challenge (enough for the organisation to establish a dedicated working group) but suppliers are adapting.

“Some members have introduced transparent tariff surcharges,” Coote explains. “The bigger issue is double tariffs — a semiconductor taxed on import, then the finished product taxed again when shipped elsewhere. That’s forced supply‑chain redesign. But we’re not seeing companies shutting down. In a niche industry, if you need the kit, you need the kit.”

Energy costs and shipping disruptions, meanwhile, have not yet materially affected the sector.

Press freedom on the agenda – FCC grilled

NAB convenes while talk show hosts are being pressured off the air and pro-Trump M&As are being waved through by Federal Communications Commission (FCC) chair Brendan Carr.

The White House stooge recently issued a press release to encourage US broadcasters to show “patriotic, pro-America content that celebrates the American journey … from our founding through the Trump Administration today.”

In response, advocacy group The Media and Democracy Project penned an open letter claiming Carr had “turned the Commission into an authoritarian agency in which decisions reflect the views of one person… while at the same time weaponizing the FCC against the major broadcast networks because they present some programming that displeases the President.” 

It will be interesting to see if any of this gets aired by NAB’s Associate General Counsel Larry Walke when he leads the session ‘Ask the FCC’.

A trio of FCC commissioners, including Deputy Bureau Chief Alexander Sanjenis, will field questions. Will Walke pull no punches or be briefed to stick to general questions about media ownership and the transition to NextGen TV?

The session ‘The First Amendment and Press Freedom in Today’s Media Landscape’ couldn’t be more explicit. Here, Anna Gomez – the lone Democrat on the FCC – will have her say on what role, if any, the US government should play in addressing public concerns about news content. Or, as NAB itself bluntly puts it, ‘When does oversight become overreach?’

NAB is to be commended for putting media censorship and press safety under the spotlight. Astonishingly, this is press safety in America, not some foreign warzone.

‘The Cost of Bearing Witness: Journalist Safety in a Polarized America,’ moderated by CNN anchor John Berman, will hear from Al Jazeera’s managing editor Mohamed Moawad alongside Mara Gassmann of the Reporters Committee for Freedom of the Press and PBS investigative journalist A.C. Thompson. President Trump’s order to end federal funding for PBS was recently ruled unlawful by a US judge.

Sustainability – Not on the agenda

After a few years of apparently being taken seriously, the veil has fallen. Cutting carbon from the production and supply chain doesn’t warrant a single conference session of the approximately 500 at NAB 2026. If there is one, it is buried so deep in the programme as to be inconsequential.

Perhaps this means the message is so embedded in corporate policy that it doesn’t need underlining. Perhaps reducing CO2 automatically follows from adopting more remote and IP‑based production workflows. Perhaps it is just a sign of the times: sustainability may continue, but it’s not to be promoted above the parapet in America.

Devoncroft’s Josh Stinehour is scathing. “When the microphone is hot at industry panel sessions, media technology buyers dutifully answer in the affirmative about the importance of sustainability to technology deployment decisions. [But] when answered within the safety of anonymity and asked in a framing of impact on commercial success, sustainability is not viewed as an important trend in the global media technology industry.”

Stinehour was commenting on the analyst firm’s 2025 report, which ranks the importance of industry trends by technology stakeholders. It found that sustainability considerations are now “standard fare” in US vendors’ product marketing efforts, but the importance of the subject to business P&L comes 20th out of 22 categories and is trending downward year on year.

Europeans shouldn’t feel smug either. The broadcast kit‑buying community here voted Sustainability / Carbon Reduction 12th of 22 in Devoncroft’s survey – an improvement for sure, but still not a top‑ten consideration.

With AI the single most pervasive topic and technology at NAB and across the wider industry, you would expect at least one session to focus on the impact of energy‑draining data centres on the environment.

Creators - Expanded presence

The organiser continues to woo the younger generation by offering free passes to creators and expanding the number of creator‑centric sessions. There’s sound reason too, since the creator economy is collectively raking in $250 billion a year according to Goldman Sachs, with YouTube now valued by financial services company MoffettNathanson at $560 billion, outranking Netflix.

YouTuber Mark Fischbach, known as Markiplier, turned heads in Hollywood earlier this year by taking his self-directed and produced horror film Iron Lung – in which he also starred – direct to cinemas, bypassing distributors and taking $50m at the box office. It’s somewhat of a coup for NAB to have him talk about how individuals can build up filmmaking skills from shorts to longform on YouTube, then use their social media following to generate interest on other platforms.

Creator sessions such as ‘Beyond Views: Measuring Creator Impact’ examine how success is defined beyond simple metrics, while ‘Creator Survival Guide: Contracts, Burnout and the Business of Building Content’ tackles the realities of sustaining a creative career.

What was once dismissed as niche is now operating like a parallel Hollywood. Microdramas, vertical series and creator-led short-form shows are building massive, loyal audiences through low-cost, rapid production cycles and direct-to-fan distribution. Erin McFarlane, Head of Vertical Content at Dhar Mann Studios, which has a deal with Fox to make 40 narrative-driven vertical titles, shares her experience in ‘Microdramas: The 60-Second Studio Surge.’

For Ross Video’s Hawkins, individual creators are now a “serious force” in the ecosystem. “With tools becoming more powerful, accessible, and interconnected, high-quality production is no longer limited to large teams or traditional infrastructures. This growing creator economy is reshaping expectations, prioritising speed, flexibility, and independence.”

Friday, 10 April 2026

Live at The Grand National: ITV Sport Production ready to give viewers the ride of their life

SVG Europe

article here

Preparations are coming to a head for the latest running of the world’s greatest steeplechase, and for ITV Sport Production’s lead creative director Paul McNamara, “it’s all about giving people a comfortable ride.”

The Randox Grand National draws a global audience of between 500 and 800 million and is one of the crown jewels of British sport, alongside Wimbledon and the Oxbridge Boat Race, as an event that has become a cultural institution.

“To borrow a phrase, the National is when the majority of people go racing,” says McNamara, who will also direct the feature race for the tenth time.

The three-day event at Aintree, Liverpool [April 9-11] is a fraction of the 117 race days covered live and free-to-air by ITV Racing (produced by ITV Sport Production, part of ITV Studios) over the year, under a contract with Racecourse Media Group recently extended from 2027 to 2030. Nonetheless, its regular audiences are dwarfed by an occasion which attracted a peak domestic audience of 5.2 million last time out. ITV’s coverage of Royal Ascot came a close second, reaching five million viewers across its five days.

“We are lucky to be the custodians of this fabulous event,” says McNamara. “As a production we look back at archive films from the thirties, forties and fifties and wonder what people in 30 and 40 years will think when they look back at what we've done. In many ways the race coverage itself remains very similar. What we’ve done over the years is layer in additional cameras like drones and depth of field cameras but the story of the race has a similar pattern.

“The core ambition is to honour that legacy while pushing coverage forward using the tools available today.”

Visually, the Grand National remains one of the most comprehensively covered events in British sport. The 50-camera setup includes a wire cam along the home straight, drones, and helicopters, alongside hi-motion cameras and FS7 depth-of-field units. Yet there is a recognition that, in the current economic climate, simply maintaining this level of coverage is an achievement in itself.

“It’s a massive strength to stay where you are,” McNamara observes. “I want nothing in terms of covering this. For example, I don’t think we could integrate any more fence cameras and even if we did there's a risk of overcomplicating the live race.

“For us, it's not about tricks. It's about showing people clearly the story of the race so that they can follow all the action as it happens. Our focus is not on adding more for the sake of it, but on delivering clarity and consistency.”

Expanding coverage beyond the race

What has changed significantly over the decades, and notably under ITV Racing’s stewardship (since 2017), is the presentation of the meeting as an entertainment experience. With more than 12 hours of live programming building up to a race that lasts around 12 minutes, the challenge lies in sustaining engagement throughout.

“It's always been our mantra that we're not a sports channel - we're an entertainment channel that does sport,” explains McNamara. “If a viewer tunes in at any point, regardless of their familiarity with racing, they should be entertained, not alienated by terminology.”

With 150,000 people attending Aintree, ITV will take viewers into the hospitality tents, grandstand and paddock to reflect all the colour and excitement. “We’ll follow [ITV Racing presenter] Oli Bell into different areas of the course. We’ll see the fans cooling down the horses and explain the welfare of the animals. We’ll have sections on fashion. Even betting is demystified, with presenters explaining how to place a wager in simple, accessible terms. Everything is there to inform and entertain.”

The emphasis on accessibility also informs production decisions. Presenters are often positioned in the parade ring, rather than in more detached studio environments.

“When we began our racing coverage we adopted a football philosophy which is to think of the parade ring like the managers’ dugout on the side of the pitch. That’s why we present from the parade ring, right at the heart of the action.”

There are subtle production changes, notably in presentation of audio. Horse racing has traditionally relied on augmented sound due to the scale of the courses, but ITV has worked to modernise this by refreshing its sound libraries (down to the type of horse running on different ground and at different speeds) and increasing the use of live “actuality” audio captured around the track. Microphones positioned at key fences including Becher’s and Canal Turn and remote sections of the 4 mile 550 yard course help create a more immersive experience, blending authenticity with carefully managed enhancement.

Graphics and rerun

While live graphics tracking technologies and AI-based systems have been explored, they are not yet considered reliable enough for real-time use in such a complex environment.

“Tracking and timing is complicated,” says Tony Cahalane, Technical Director, ITV Sport Production. “We’ve trialled AI-based visual recognition, but commentators are usually ahead of the tracking. Tracking can be done on an oval course, but it’s much harder on a course with the scale of Aintree.

“We do name horses in the rerun when we can be absolutely accurate. As a result, live coverage prioritises a clean, uncluttered picture, with more detailed analysis and visualisation deployed in reruns where accuracy can be guaranteed.”

The rerun itself is treated as a significant production in its own right. A separate EVS, editorial team and truck handles it, working with expert analyst Ruby Walsh to refine how the race is broken down and explained. Its popularity is such that it has become the second most-watched race in the UK – after the live run airs just minutes before.

“The key to making this run as slickly as it now does is in using the preceding two days as rehearsal. We don’t just rock up. We can source new angles from drone or hi-motion or fence cam not shown in the live run. A lot of work goes into producing this.”

Digital output

Digital output has become an increasingly important part of ITV’s operation. Alongside the linear broadcast, a dedicated on-site team captures, edits, and distributes content in real time. Clips from races, behind-the-scenes moments, and lighter editorial features are pushed across social platforms, while each day’s Opening Shows are streamed simultaneously on YouTube.

Maggie Price, Senior Production Manager, explains, “We have two digital producers on site, one who clips races and content and another who collects editorial content and colour. They will be constantly cutting and then pushing things out to our social media sites. We have a dedicated ENG camera at our disposal who can gather any additional information for anything that Paul might want to put into the show, as well as content for our VTs of cultural features created for airing the following day. All these are edited on site.”

She also highlights ITV’s Social Stable outreach on digital channels as “a great chance to interact with viewers” and to pull that content into the main linear programming.

“We invite viewers to share how they are watching the race and this creates a sense of participation that extends beyond the broadcast itself,” she says. “The team working remotely across ITV Racing’s social media platforms will usually export 16x9 footage from the ITV programme output and then we will edit it via Premiere Pro into a vertical format (using keyframe tracking and scene edit detection), which makes the content able to be posted on social media platforms like Instagram, TikTok and YouTube Shorts.”
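The 16:9-to-9:16 conversion Price describes can be sketched with a little geometry. This is a simplified static centre crop for illustration only; the function name is ours, and ITV’s actual workflow uses Premiere Pro keyframe tracking to move the crop window with the action:

```python
def center_crop_vertical(width, height):
    """Largest 9:16 window that fits inside a widescreen frame, centred horizontally.

    Simplified illustration: a real workflow (e.g. Premiere Pro with keyframe
    tracking and scene edit detection) would animate this window per shot.
    """
    crop_w = round(height * 9 / 16)
    crop_w -= crop_w % 2              # keep width even, as video encoders expect
    x_offset = (width - crop_w) // 2  # horizontal position of the crop window
    return crop_w, height, x_offset

# A 1920x1080 programme frame yields a 608x1080 vertical crop, offset 656 px in.
print(center_crop_vertical(1920, 1080))
```

In practice the horizontal offset would be keyframed to follow the horses rather than fixed at the centre of the frame.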

Underlying ITV’s Grand National coverage is a long and detailed planning process. Preparations begin months in advance, with initial creative discussions taking place in the autumn and continuing through site visits, production meetings, and ongoing collaboration with the racecourse.

Cahalane says, “It’s a wonderful event and a great team effort; one that continues to demand care, attention, and respect year after year.

“We’ve already put dates in the diary for 2027.”

Streaming Year in Review

Streaming Media

article here

Two stories dominated streaming media in 2025: Netflix versus the rest, and YouTube takes TV. YouTube may be the bigger story. “Netflix knows who its competition is,” said media guru Evan Shapiro at IBC25. “It’s YouTube.”

No-one was really surprised when YouTube CEO Neal Mohan declared, “YouTube is the new television.” That’s because the figures don’t lie. In his annual letter in February, Mohan revealed that TV screens have officially overtaken mobile as the “primary device for YouTube viewing in the US.”

“The ‘new’ television doesn’t look like the ‘old’ television,” Mohan wrote. “It’s interactive and includes things like Shorts, podcasts, and live streams, right alongside sports, sitcoms and talk shows people already love.”

Noting that 45 million Americans watched election-related content on the platform, he underscored: “YouTube will remain the epicenter of culture.”

Already the largest internet TV provider in the US with more than 9 million subscribers, in May YouTube held 12.5% of all TV use, according to Nielsen’s Gauge, the highest share of TV for any streamer to date. Nielsen also noted that month that YouTube Main (excluding YouTube TV) is up over 120% since 2021, leading a collection of free services (such as PlutoTV, Roku Channel and Tubi) which have been “a major driver of streaming’s overall success”.

In July, in the UK, regulator Ofcom reported that YouTube was now the second most-watched streaming service behind BBC iPlayer. Among 16- to 34-year-olds, YouTube was the most-watched service overall.

Alphabet’s video platform has become the indispensable distribution partner for studios and broadcasters. More and more of their content is being carried on YouTube, with evidence to suggest that this does drive viewers back to the source – but the journey is not without a hitch.

Despite tapping new revenue from an ad-sales pact with YouTube and claiming views of its content on the platform have leapt 169%, UK broadcaster Channel 4 warned of challenges working with social media networks.

“We have no control over third party platforms,” said Louisa Compton, C4’s Head of News, at the Edinburgh TV Festival in August. “The algorithms are shady and non-transparent. I also believe public service broadcasters’ content should be kite marked [given prominence] on those applications.”

Disney pulled its content from YouTube TV for two weeks in November, saying that YouTube was refusing to pay fair rates for its channels (YouTube claimed that Disney was using the blackout as a negotiating tactic that would have resulted in higher prices for its subscribers).

“Social media have the central ability to control the media experience of the audience,” said Kevin Mayer, co-founder and co-CEO of Candle Media, at the RTS conference in September. His company owns CocoMelon, the largest family channel on YouTube with 200 million subscribers, yet Mayer highlighted the vulnerability of relying on algorithms that could change overnight.

“YouTube is hard to deal with at times. Monetisation is lumpy. They tweak the algorithm. You have to be careful, but the power and global nature of those platforms is undeniable.”

Nonetheless, 85% of internet users watch YouTube each month, with nearly one in five watching full-length movies and TV shows on the platform, charted Ampere Analysis. Significantly, it is 35–64-year-olds who are powering YouTube’s “golden age of film and TV,” according to Ampere.

Expect this to continue into 2026 with a “stronger emphasis on driving consumption on TV through licensed, long-tail movies and classic TV series,” predicts Jason Platt Zolov, Senior Consultant, Hub. “By packaging and promoting familiar titles in ways that appeal to audiences encountering them for the first time, YouTube is poised to extend its dominance beyond short-form and capture more living-room viewing.”
Paramount Tops Netflix in Battle to Acquire WBD

In a bidding process in which Paramount Skydance was the pundits’ favorite, Netflix bid $82.7 billion (an equity value of $72 billion) for Warner Bros. Discovery’s streaming and studio business in early December and entered negotiations with the company.

Immediately following the Netflix bid, Warner Bros. Discovery shareholders rejected an upgraded offer from Paramount Skydance valuing the entire business at $108.4 billion. In addition, any deal must pass regulatory hurdles—including European regulators—and any deal reached after the principal parties come to terms could take up to 18 months to close.

In late February 2026, Paramount Skydance increased its offer to $111 billion in cash, or $31 per share, and stated that it would cover a $2.8 billion termination fee owed to Netflix should Warner Bros. Discovery break the deal and a $7 billion regulatory termination fee. The acquisition would be for the entirety of Warner Bros. Discovery, including its film and TV studios and linear cable networks.

On Feb. 26, Netflix declined to raise its offer to match Paramount’s revised bid. At press time, the deal was expected to be finalized following a Warner Bros. Discovery shareholder vote on March 20.

If approved, the $111 billion deal would unite two major Hollywood studios and one of the deepest content libraries in the industry.

“For all the regulatory noise, this deal ultimately comes back to the fundamentals of the entertainment business,” says Ed Barton, research director at Caretta Research. “Control of premium IP and global distribution maximizes engagement and builds scale that compounds.”

Paramount Skydance’s franchises (Star Trek, Mission: Impossible, Transformers, and SpongeBob SquarePants among them) combined with HBO’s premium positioning create a formidable global streaming proposition.

The resulting content juggernaut “would immediately become a more credible challenger to Netflix—not necessarily in raw scale on Day 1, but in depth and variety of monetizable franchises,” says Barton. “And while Netflix loses access to Warner Bros. IP in this scenario, it is far from vulnerable. It remains the most effective company in the market at monetizing IP at global scale.”

Three days after Warner formally signed an acquisition agreement, Paramount Skydance head David Ellison announced plans to merge the Paramount+ and HBO Max platforms, giving the combined platform 200+ million subscribers.

The combined Paramount Skydance-Warner Bros. Discovery entity will operate with a net debt of $79 billion. The deal is backed by $54 billion in debt commitments from Citigroup, Apollo, and Bank of America, according to Reuters.

Skydance Paramount
In August, more than a year after announcing its purchase of Paramount for $8 billion, Skydance Media closed the deal, with CEO David Ellison promising to turn his new toy into a “tech-forward company that blends the creative heart of Hollywood with the innovative spirit of Silicon Valley.”

In November Paramount reported its first quarter since merging with Skydance. Revenue of $6.7bn for Q3 was flat year on year as its TV division continued to struggle. The company also posted a net loss of $257m on merger-related expenses and restructuring costs.

The DTC side was rosier. Streaming revenue increased 17% to $2.17bn, with Paramount+ accounting for more than 80 percent of that and growing its subscribers by 14% to over 79 million.

A unified technology stack is being introduced across Paramount+ and Pluto TV to enhance performance, improve user experience, and reduce costs. Paramount is also developing AI tools to power personalisation and recommendations. However, it will also raise Paramount+ prices in the US in the new year.

“Our direct-to-consumer business is our top priority,” David Ellison, Chairman and CEO of Paramount, wrote to shareholders. “We expect it to be profitable in 2025 with growth in profitability in 2026.”
Strength in streaming
Strength in streaming continued to offset the structural weakness in traditional television in 2025, as studios shed their cable businesses and began to turn a profit from their DTC ventures.

This macro trend is reflected in the swelling value of the streaming video market. While estimates vary by research analyst, the global market is projected to grow from $250bn in 2025 to $1.6 trillion by 2035, or from $811bn in 2025 to $2.6 trillion by 2032.

Overall, the US remains the “largest and most influential” streaming video market globally, according to PwC’s 2025 Global E&M Outlook report. It generated $61.9bn in revenue in 2024 and is on track to reach $112.7bn by 2029. The next largest market, per PwC, is China, which is one fifth the size.

For comparison, the global broadcasting and cable TV market was estimated at $356bn in 2024 and is projected to reach $450bn by 2030, growing at a CAGR of 4%, whereas the global creator economy is expected to grow at over 23% a year between now and 2033, by which time it will be worth $1.3 trillion.

Streamers spent around $95bn on content in 2025, surpassing commercial broadcasters, according to Ampere Analysis, with Netflix leading the pack. Netflix spent around $18 billion on content in 2025 – and “we’re not anywhere near a ceiling”, Netflix CFO Spencer Neumann told Variety in March, so if the deal for WBD proceeds, expect future figures to dwarf that. Pre-sale, WBD was planning to spend in the order of $19.5bn on content including sports this year, according to MoffettNathanson figures reported in THR.

Per MoffettNathanson, Disney was tracking to outlay $23bn in 2025, Amazon $9.1bn, and pre-merger Paramount Global around $15.2bn.
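The competing market projections above are easier to compare as implied compound annual growth rates. A quick back-of-envelope sketch, using the article’s own figures (the `cagr` helper is ours, not from any cited analyst):

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by moving from start to end over years."""
    return (end / start) ** (1 / years) - 1

# $250bn (2025) -> $1.6tn (2035): roughly 20% a year
print(f"{cagr(250, 1600, 10):.1%}")   # ~20.4%
# $811bn (2025) -> $2.6tn (2032): roughly 18% a year
print(f"{cagr(811, 2600, 7):.1%}")    # ~18.1%
```

Despite starting from very different base estimates, both forecasts imply streaming growing at four to five times the 4% CAGR cited for broadcast and cable.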
Peacock
By the end of the year, Comcast’s board had approved the separation of the company’s cable television networks (including CNBC, MSNBC and E!) and complementary digital platforms from its remaining businesses, in a process that began a year previously. Versant Media Group is the new independent, publicly traded company [VSNT], led by Mark Lazarus, who will be living up to his name if he can turn around the fortunes of legacy media.

At the same time Versant acquired Free TV Networks, which provides both broadcast networks and FAST channels in the US. Comcast retains NBC, Bravo and Peacock, which ended the year with around 41 million subscribers and a $217m loss in the third quarter of 2025, following a $436m loss for the same quarter in 2024.

A new package of NBA games, in addition to Sunday Night Football games produced by NBC Sports, is expected to give the streamer a leap into the new year. Peacock revenue dropped slightly to $1.4bn in Q3, compared with $1.5bn in the same period of 2024, when the Paris Olympics boosted results.

Commenting on Q3 results, Comcast’s then president Michael Cavanagh, now co-CEO, stressed Peacock’s reliance on sports. “As audiences continue to shift from linear to streaming, the multiple benefits of sports become an even greater advantage,” he remarked. “Live sports continue to deliver strong viewership and ad performance across broadcast and streaming. Running linear and streaming as one integrated media business gives us real scale and flexibility. It allows us to align programming, marketing, promotion and monetization across NBC, Peacock and our studios… and well positioned to grow.”
Comcast bid for greater UK share
November saw Comcast move for the broadcasting wing of UK commercial broadcaster ITV. The business, which includes ITV’s terrestrial TV channels and streaming platform ITVX, is valued at US$2.1bn. Comcast, which already owns pay-TV broadcaster Sky, aims to create a UK-focused streaming giant with an advertising marketplace based on Comcast’s Universal Ads platform.

The move would face regulatory scrutiny because of the monopoly it would hold (around 70% of domestic TV advertising). However, Comcast also understands that the UK market, which currently supports five main public service broadcasters, is under increasing pressure to consolidate. If ITV were sold, pressure would grow on the BBC and Channel 4 to pool their resources, including their streaming services iPlayer and All4 (though the major PSBs already combine to operate the connected TV app Freely).

Sir Peter Bazalgette, former chair of ITV and a shareholder, said, “There’s going to be an inevitable consolidation of domestic broadcasters all across Europe. There are four or five domestic broadcasters across Europe who can’t all have a long-term future against the streaming giants. There is going to be a consolidation, and ITV are going to lead it in the UK.”
Apple
In the year that Apple rebranded Apple TV+ to Apple TV, the tech company continued its policy of curated rather than volume content. Since the company doesn’t release subscriber numbers, the best guesses are that by the end of 2025 at least 45 million people were paying $12.99 a month to access shows like Slow Horses and F1: The Movie. It’s reckoned that Apple scaled back its annual content budget from $5bn to $4.5bn but is still operating the service at a $1bn loss.
Amazon Prime Video
In June Amazon defaulted all subscribers, including paying members, to an advertising tier unless they opted to pay extra to avoid ads.

That default flip puts Amazon in a different category from other streamers – instead of discounting into an ad tier, Amazon is monetizing the entire base, noted Subscription Insider.

By the end of the year, Prime Video had launched advertising in 16 countries including Australia, Brazil, India and Japan, and gathered 315 million monthly viewers, up from 200 million when it introduced ads in April 2024. Amazon does not break out ad revenue for Prime Video, but in the third quarter total ad revenue across all of Amazon was $17.7bn, up 24% year on year.

Calling the 315 million figure “a transformative milestone”, Jeremy Helfand, VP of Prime Video Advertising, said, “We’re just beginning to unlock what’s possible when premium entertainment, engaged viewers, and innovative ad-tech converge with relevant and performant advertising at this unprecedented scale.”

“2026 could see Amazon Prime Video introduce a universal video search experience that spans platforms – including services outside the Amazon ecosystem,” predicts Mark Loughney, Senior Consultant, Hub. “By positioning itself as the easiest place to find anything to watch, Amazon stands to become a default viewing hub, with more consumers centralizing and managing their subscriptions through the Prime Video interface.”
Disney
In its Q3 earnings report, Disney CEO Bob Iger made a point of highlighting the success of its orientation to streaming: “Our DTC business was running a $4 billion operating loss just three years ago,” and by Q4 (to September) this had swung to a $352m profit and a 39% increase in DTC operating income.

There are now 131.6 million Disney+ subscribers, an increase of 3.8m compared to Q3, split roughly 50/50 between domestic and international. There are also around 64 million Hulu subscribers, bringing the combined Disney+ and Hulu base to 195.7m.

Since June, when it closed a $9bn deal to buy Comcast’s 33% stake in Hulu, Disney wholly owns Hulu, which it plans to fold into the Disney+ app. From Q1 2026, Disney will no longer report subscriber numbers for Disney+, Hulu, and ESPN+ because “the metric has become less meaningful” for evaluating the performance of its businesses, management said.

Full-year revenues increased 3% to $94.4bn, a performance that was considered flat by the markets and which was dragged down by its linear business, where domestic networks revenue and operating income dropped 16% and 21%, respectively. In 2026, Disney anticipates spending $24bn on content across entertainment and sports.