Tuesday, 11 November 2025

Touch the future: Immersive video will soon make its presence felt

IBC

As XR devices become more accessible and 6G wireless systems emerge, we’ll move from simply watching video to stepping inside it.

article here

Imagine a world that fuses the digital, physical, and human to create revolutionary immersive experiences. Ericsson calls it The Internet of Senses. Nokia describes “a new world of sixth-sense applications”, and European tech body ETSI talks of the ubiquitous communications network acting as a ‘radar’ to sense and comprehend the physical world. 

The dawn of 6G

Video codec and mobile standards developer InterDigital thinks that the world is on the verge of stepping inside video. It forecasts that, with the arrival of 6G, we will experience the coming together of machines, ambient data, intelligent knowledge systems, and new computation capabilities. 

According to Nokia: “One striking aspect of that will be the blending of the physical and human world, thanks to the widespread proliferation of sensors and AI/ML combined with digital twin models and real-time synchronous updates.”

6G is expected to launch commercially by 2030, with an initial release planned for 2028. Included in the 2028 release is Integrated Sensing and Communication (ISAC), a technology that is considered to have huge potential. ISAC allows the network to become a source of situational awareness, collating signals that are bouncing off objects. It would collect data on the range, velocity, position, orientation, size, shape, image, and materials of objects and devices, essentially expanding the functionality beyond just communication. 

There are 32 potential use cases for ISAC listed in the technical report from the mobile specification group 3GPP. Among them is the ability to build digital representations of the physical world, a so-called digital twin. For example, a digital twin could incorporate a player’s physical environment into an extended reality game. 
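To make that concrete, here is a minimal Python sketch of the kind of per-object record an ISAC-enabled network could emit and how it might feed a digital twin. The field names and structure are illustrative assumptions, not taken from the 3GPP technical report.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical sketch of a per-object report from a sensing-capable
# network. Field names are illustrative, not from the 3GPP report.
@dataclass
class SensingReport:
    object_id: str
    range_m: float                              # distance from sensing node, metres
    velocity_mps: Tuple[float, float, float]    # velocity vector, m/s
    position: Tuple[float, float, float]        # estimated x, y, z position
    orientation_deg: float                      # heading estimate, degrees
    size_m: Tuple[float, float, float]          # bounding-box dimensions
    material: str                               # e.g. "metal", "fabric", "glass"

def update_digital_twin(twin: dict, report: SensingReport) -> None:
    """Fold one sensing report into a toy digital-twin state store."""
    twin[report.object_id] = {
        "position": report.position,
        "velocity": report.velocity_mps,
        "material": report.material,
    }

twin_state: dict = {}
update_digital_twin(twin_state, SensingReport(
    "sofa-01", 4.2, (0.0, 0.0, 0.0), (1.0, 2.5, 0.0), 90.0, (2.0, 0.9, 0.8), "fabric"))
print(twin_state)
```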

“ISAC will enable motion detection and tracking of people and objects,” says Valérie Allié, Senior Director for Media Services at InterDigital. “We will have all this sensing data that will be integrated with high video quality and ambisonic audio. That will enrich spatial computing and deliver even more exciting XR experiences.” Analysts Futuresource predict that 6G deployment will coincide with the maturation of XR hardware and software ecosystems, expected between 2028 and 2032. Ericsson also expects that by 2030, most of us will be using XR devices for all our communication, much as we use smartphones today.

“As we get closer to 2030 and the release of the first 6G standards, XR entertainment is going to become an expectation. We will see everything from interactive digital sports venues to real-time augmented city guides and digital twins,” says Lionel Oisel, Head of InterDigital’s Video Lab, which is based in Rennes, France. “But the success of these experiences will hinge entirely on the quality of experience – where ultra-low latency, responsive interactivity, and consistent media synchronisation are all essential to unlocking XR’s full potential.” 

Universal haptics 

The research lab also believes that haptics will play a bigger part in how we virtually experience sports, films, and TV. In contrast to visual or auditory interfaces, haptic technology is said to enhance realism by stimulating the sensation of touching, grasping, or manipulating virtual objects – making digital landscapes feel more tangible. 

In January 2025, the first MPEG-I Haptics Coding standard was published, paving the way for haptics to be encoded, streamed, and rendered to mobile displays, headphones, and XR headsets.

With a standardised format, haptics can now be streamed alongside audio and video in the same bitstream. It can be authored once and played anywhere across networks, devices, and platforms. In short, according to developer SenseGlove: “haptics is finally ready for prime time.” 

The idea is to encode the haptic signal just once and still enable playback on any device, rather than continuing to create a different process for each unique platform from Microsoft, Sony, Apple, or cinema’s D-Box system.
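To illustrate the principle, here is a minimal Python sketch of that “author once, play anywhere” idea: one device-independent effect description, resampled for whatever actuator each device exposes. It is a sketch of the concept only, and assumes nothing about the actual MPEG-I Haptics Coding bitstream syntax.

```python
import math

def engine_rumble(t: float) -> float:
    """Authored effect: amplitude 0..1 as a function of time in seconds."""
    return 0.5 + 0.5 * math.sin(2 * math.pi * 30 * t)  # a 30 Hz rumble

def render(effect, duration_s: float, sample_rate_hz: int, max_level: int):
    """Resample the same effect for one device's update rate and output range."""
    n = int(duration_s * sample_rate_hz)
    return [round(effect(i / sample_rate_hz) * max_level) for i in range(n)]

# The same authored signal, targeted at two very different devices.
phone_taptic = render(engine_rumble, 0.1, 200, 255)  # fine-grained actuator
seat_pad     = render(engine_rumble, 0.1, 50, 7)     # coarse rumble motor
print(phone_taptic[:5], seat_pad[:5])
```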

There is a clear use case in gaming. For example, when you play Battlefield 6, you will experience over 170 curated effects designed specifically for the game, provided you have the right haptics gear, like a seat pad. As Razer describes its Sensa HD Haptics technology: “You’re no longer just reacting to the fight on the screen, your body becomes part of it.”

“You've seen haptics in gaming before, but wouldn't it be cool if somebody could make a movie with haptics that you experience through your TV or on your chair?” posited Liren Chen, CEO of InterDigital.  

Philippe Guillotel, Senior Director at InterDigital and a leader of the group in MPEG that is standardising representations of haptic data, says he is trying to convince streamers like Netflix that physical feedback will bring a new experience and added value to their content. 

“Since everything is offline [on-demand], it would be easy to create content with haptics. The issue is the device. One of the reasons we are concentrating on delivering haptics to smartphones, game controllers, and especially to the headset is that most people have these. We need devices to be inexpensive to be adopted by the market.” 

“There is a creative aspect to haptics and we are engineers,” he says. “So, we need artists. We need to educate people in creative schools that haptics is a new modality. [Creatives] can learn how to do it, and they have to understand how people perceive it. Then, we will have a much better content experience.”  

Earlier this year, Apple released a trailer for F1: The Movie, which synced action on-screen with the iPhone’s Taptic Engine: “making you feel the roar of Formula 1 engines.” Subtle moments, like a seatbelt snapping or a ping pong ball bouncing, trigger delicate taps, while high-speed crashes jolt your hands.   

New video codec underway 

InterDigital is also competing for its technologies to be included in a new video codec, currently being developed jointly by the ISO and ITU as a successor to the MPEG standard Versatile Video Coding (VVC). The new codec, H.267, is intended to use less bandwidth than VVC without increasing complexity on the decoder side.

There is currently a call for proposals out to the industry. Submissions will be evaluated in January 2027, followed by a standardisation stage and a final standard release scheduled for 2029.

In testing, InterDigital claims its technologies have already demonstrated performance gains averaging 25% over VVC. Some tests show gains of double that.
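Read as bandwidth, a compression gain of that size is easy to picture: at comparable quality, the bitrate falls by the same proportion. A minimal sketch, assuming an illustrative 8 Mbps VVC baseline (a figure not given in the article):

```python
# Back-of-envelope reading of a claimed average 25% efficiency gain,
# and the doubled figure some tests show. The 8 Mbps baseline is an
# assumed illustrative number, not from the article.
vvc_bitrate_mbps = 8.0

for gain in (0.25, 0.50):
    new_bitrate = vvc_bitrate_mbps * (1 - gain)
    print(f"{gain:.0%} gain: {vvc_bitrate_mbps} Mbps -> {new_bitrate} Mbps "
          f"for comparable quality")
# 25% gain: 8.0 Mbps -> 6.0 Mbps for comparable quality
# 50% gain: 8.0 Mbps -> 4.0 Mbps for comparable quality
```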

The target for H.267 is to deliver improved compression efficiency, reduced encoding complexity, and enhanced functionalities, such as scalability and resilience to packet loss. 

“It's a real big challenge and a great opportunity to develop new ideas, patents, and algorithms,” said Edouard Francois, Senior Director and 2D Codecs Lead at InterDigital. “In particular, we are exploring how AI can be used in synergy with traditional video compression methodologies.”

Other groups likely to respond include Nokia, Ericsson, Fraunhofer HHI, and MediaTek. Oisel explains: “This standardisation period will determine which tools are adopted (therefore licensable). To do that, you have to prove that it delivers huge gains and also that you don't have high complexity. The issue with AI tools is that they put the complexity on the decoder side, which is something that chip makers like Broadcom will fight against because they don’t want to add complexity to their hardware. If you come with a tool with huge gain but also huge complexity, then this won’t be selected.”  

 


Monday, 10 November 2025

Behind the scenes: Frankenstein

IBC

article here

Cinematographer Dan Laustsen tells IBC365 why he and Guillermo del Toro turned the classic nightmare, Frankenstein, into a love story of ice and warmth between father and son.

Having spent decades contemplating his vision, Guillermo del Toro had a fully conceived approach to his magisterial screen version of Frankenstein, which would test the capabilities of every single aspect of film craft. There would be giant sets, huge props, and a complex wardrobe.

Set against the backdrop of the Crimean War, but otherwise largely faithful to Mary Shelley’s gothic fable, this is the tale of scientist Victor Frankenstein (Oscar Isaac), who reanimates a new creature (Jacob Elordi) from the body parts of dead soldiers – only to realise that his control has limits.

Creature design 

In the film, Frankenstein selects mutilated bodies for his experiment and puts the pieces together like a jigsaw puzzle. The makeup needed to reflect that, but also have a certain beauty to it.

Creature Makeup Designer and Prosthetics Master Mike Hill previously helped transform Doug Jones into the Amphibian Man in 2017’s The Shape of Water, and conjured the carnival sideshow performers in 2021’s Nightmare Alley.

“We agreed that we didn’t want all these garish wounds and stitching,” Hill says. “Victor Frankenstein is not a butcher. He’s trying to make the perfect man, so he wouldn’t make this thing look like a car accident. This was a sympathetic being. I didn’t want to make [the creature] too good-looking because at the end of the day, he is a revived corpse, but it was very smart of Guillermo to say that these body parts came from soldiers, all moderately healthy and strong young men.”

Hill’s design was convincing enough that DoP Dan Laustsen could shoot Elordi’s monster as he would any other character. “The creature feels like a normal person,” Laustsen says. “It was a character and didn’t require any special treatment.”

The new creature takes the form of a soldier resurrected from a mass grave who needs to feel like a baby, and then like a philosopher, and then like a man.

Crafting a classic

“I wanted it to feel like an old movie that was made in the heyday of Hollywood,” the director explains in a ‘making of’ featurette. “Luscious and beautiful and operatic.”

He shared this vision with Laustsen, working with del Toro on their fifth film together after earning Academy Award nominations for The Shape of Water and Nightmare Alley.

“We talked about making a classic movie, but it also had to look modern. To do so, we were shooting wide angle with the camera moving a lot, and having big vistas, strong close-ups, single source lighting, and a very strong colour palette,” the Danish DoP explains to IBC365.

Unpicking those key decisions, Laustsen says his first instinct was to shoot with a large-format (LF) Alexa 65 camera to produce an image close to the classic 70mm print.

“We shot Nightmare Alley half with Alexa 65 and half with Alexa LF, but I felt Frankenstein needed to be shot with Alexa 65 all the way,” he says. “I had to check with my Steadicam Operator James Frater if that was possible because the Alexa 65 camera is a monster. It's a really heavy camera. I didn't want to shoot part of the picture with the Alexa Mini LF, but to shoot 65 throughout.”

Frater was okay with the choice, so Laustsen proceeded to deploy LF Leitz Thalia lenses, which were wider-angle than those he had used on previous del Toro productions.

“I think one of the reasons it looks so classical is that we shot most of the movie on the 24mm, the widest lens Leitz makes for the Thalia range. Again, using classic film language, we often start on a big wide shot, so we see more or less everything in the set and end on a big close-up of one of the actors.”

To further this classic feel, diffusion filters were inserted inside the camera to create a specific type of romantic image. “The Leica lenses are really nice because they're sharp from edge to edge. If we want to have a flare, we can produce one; there are no surprises. The flip side is that the image is very sharp. That's okay for the sets, but it's not good for the skin tones. So, we shot with a Black Pro-Mist 1/4 and 1/8 filter inside the camera. We can have deep blacks but still a kind of flare in the highlights. It’s organic and beautiful.”

A language of colour

Light and colour are always vital elements in a del Toro film. Here, del Toro conceived of Frankenstein’s childhood in black and white and red. His mother and his home are red, and since the character loses both as a child, the colour haunts him. For the rest of the movie, he’s the only character who wears red – red gloves, red scarf.

“Before anybody joins him on a project, Guillermo’s already got strong ideas about sets, the costume and the colour palette for different scenes,” Laustsen says. “That's a really good idea because it acts as a guide for every head of department [HoD]. We’re starting from the same position. Everything is planned together and blended together. There’s a very close relationship between HoDs and it’s based on this colour palette.”

For scenes depicting Victor with his creature, Laustsen and del Toro brought together “two sides of the colour wheel”, moving from steel-blue cyan to deep, warm amber tones offset with heavy layers of shadow.

“We are playing a lot with the contrast between amber and the steel blue. When the creature and Frankenstein are together, the creature is lit with steel blue and his father is lit by tungsten. It’s totally unrealistic, but it evokes a feeling of coldness contrasted with warmth. This horror story is about love.” 

Light up the room

“We are not afraid of the darkness. We are not afraid of single-source lighting. When you're shooting locations like dining rooms, we need to have a lot of negative fill inside the room.”

To illuminate the lavish sets created by Production Designer Tamara Deverell, Laustsen placed powerful 24kW tungsten lamps outside. Laustsen adds: “To keep the same mood and colour palette consistent across all those sets, you have to be able to control the light from outside. By lighting from behind the windows, we can control the sunlight through the atmosphere and keep the blacks pretty deep.”

This decision allowed the actors to move around with ease and created pathways for the sizable camera on dollies, cranes, and Steadicam. Additionally, coloured gels in front of the lights created the desired hue.

“We didn’t make a LUT [look-up table]. My mindset was ‘I'm shooting it as if it were shot on film, so if I want to change the colour, I'm going to change that on the lights. I'm not changing the colour in the camera.’ I shot more or less the whole movie at 4200 Kelvin, a colour temperature at which the candles look good and the daylight combines well with the steel blue.”

Set the stage

Frankenstein’s family home, depicted in the first half of the movie, is a composite of stately residences shot on location at: Gosford House in East Lothian; Burghley House in Lincolnshire; Dunecht House in Aberdeenshire; and Wilton House in Wiltshire.

“We used a lot of atmospheres like mist, steam, and smoke, and the windows acted as a gobo to control the shape of the emitted light and its shadow. This adds a dimensionality to the image that I hope will feel particularly immersive.”

The elaborate staircase at Wilton serves as a focal point for the fictional Frankenstein estate and links del Toro’s production to one of legendary director Stanley Kubrick’s most revered projects, Barry Lyndon. That period production famously shot scenes on film lit by real fires and candlelight.

“Guillermo and I talked about going with candlelight, and we did a lot of tests there. Every cinematographer in the world wants to shoot something like Barry Lyndon, but we wanted to have a bit more control over the light. When you have candles everywhere, you cannot control the contrast, so we decided to use fewer practicals in favour of single-source lighting. I shot the whole movie at the same T-stop [exposure, in this case, a T-4] inside and out.”

Playing with fire

The team did, however, use real fire torches as the key light for night scenes set on the exploration vessel ‘Horisont’, which is icebound in the Arctic. Instead of creating the ship in VFX, del Toro insisted on a scale build, mounted on a mechanical gimbal at outdoor stages in Toronto, so it would look as if it were being rocked physically by the creature.

“We had big discussions about using flaming torches because, of course, they can be very dangerous, and we’d need to use LED torches, but Guillermo and I were keen to shoot as authentically as possible. When you have a real torch, the light will constantly flicker. It looks organic because it is, and the effect is much more dramatic.” 

The gigantic conflagration that destroys Frankenstein’s laboratory was also achieved in-camera by blending photography of the set in Toronto, Canada, with a miniature 20:1 scale set shot in London on a RED camera.   

“The key was to shoot high speed between 72 and 125 frames, which is why we shot that scene with a RED camera. It has a large sensor, so I can still use the same Thalia lenses we shot the whole movie on.” 
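Those two numbers fit a common miniatures rule of thumb, which the article does not state but is worth noting: overcrank by roughly the square root of the scale factor so that fire and falling debris move believably at full size. A quick Python check under that assumption:

```python
import math

# Rule of thumb (an assumption here, not something the article states):
# overcrank a miniature by ~sqrt(scale factor) so fire and debris read
# at full size. For the 20:1 miniature at a 24 fps base rate:
scale_factor = 20
base_fps = 24
print(round(base_fps * math.sqrt(scale_factor)))  # ~107 fps, within the 72-125 range quoted
```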

A father-son interpretation

Frankenstein has endured as a tale partly because it allows for different readings. The story could demonise the creature as the embodiment of everything that is inhuman, or condemn Frankenstein as the true monster for daring to be a god. Del Toro’s version reveals the humanity in both characters through their father-son relationship.

“The first time Victor sees the creature for real, when he opens the blinds and lets sunshine come in, we shoot it a little bit like a love story. There's warm light for the first time that the father and son are together.”

Similar warm light is used in an early scene when we see Frankenstein senior trying to teach his young son about science. 

“One of the scenes I like very much is the first time the creature sits with his father in the lab, and his father is tenderly shaving him. It’s a simple scene with the sunrise reflecting in a broken mirror. You feel the chemistry between the two actors, and you can also see that Daddy doesn't understand anything about kids.” 

This arrangement is mirrored in the film’s final scene when the pair reconcile, and sunrise streams in through the window. 

 


Friday, 7 November 2025

Comcast moves for ITV to create a UK-focused streaming giant

Streaming Media

article here

UK commercial broadcaster ITV has confirmed it is in early-stage talks about a possible sale of its broadcasting business to Comcast, which already owns pay-TV broadcaster Sky in the UK.
The division includes ITV’s terrestrial TV channels and its streaming platform ITVX. The deal would value the business at £1.6 billion ($2.1bn).
Sky News – itself part of any potential merger with ITV’s news operations – said “the approach centres on the potential creation of a UK-focused streaming giant.”
Sir Peter Bazalgette, chair of ITV until September 2022 and a shareholder, told BBC Radio 4: “There's going to be an inevitable consolidation of domestic broadcasters all across Europe. There are four or five domestic broadcasters across Europe who can't all have a long-term future against the streaming giants. There is going to be a consolidation, and ITV are going to lead it in the UK.”
ITV’s largest single shareholder, Liberty Global, which jointly owns Virgin Media O2 with Spanish telecoms operator Telefónica, halved its 10 per cent stake in ITV last month.
The proposed deal does not include ITV Studios, the content arm behind dramas such as Mr Bates vs The Post Office and the reality format Love Island.
Bazalgette said ITV’s share price doesn't reflect the value of ITV Studios and “probably discounts all of their commercial revenue from their channels and ITVX. This is one way to release some value.”
ITV and Sky, along with Channel 4, are planning to pool resources into a new advertising marketplace in collaboration with Comcast in 2026. This will be based on Universal Ads, Comcast’s advertising platform, which has been “designed to make television as easy to buy as social media” and includes video generation from Streamr.AI at its core.
Alongside the AI video generator, the marketplace will also reportedly allow easy access to on-demand and streaming inventory from the three sales houses through a single campaign powered by Comcast’s FreeWheel technology.
ITV’s share price is down around 75 per cent from where it was a decade ago. In other words, quality of performance hasn’t translated into commercial value. In a supremely competitive field, the world is moving away from linear TV, which is where ITV still generates much of its revenue.
“Free-to-air channels across the world are not seen as having a great amount of value,” said Bazalgette. “In fact, they throw off a massive amount of cash and still sell a lot of advertising, so they're undervalued by the marketplace and this is one way of trying to correct that.”
If ITV were to join with Sky in the UK, they would hold about 70% of the TV ad marketplace, a near monopoly which would not normally pass the regulator. However, given the parlous state of public service broadcasting and the government’s desire to keep it running, Comcast may sense the sentiment has changed.
Bazalgette called UK competition rules “completely out of date”, adding that the real market is video advertising, where Google and Meta are prime competitors: “Google and Meta have nine times the combined advertising revenue of Sky and ITV, so the CMA (Competition and Markets Authority) needs to redefine what the advertising market is. Once they've done that, I think they’ll probably say that this deal was fine.”
Media analyst Ian Whittaker, also speaking to BBC Radio, said the move was “essentially a massive dare to the UK government.”
He said: “Comcast’s pitch is that the UK needs to be seen to be open to business and that a merger is the only long-term survival option given the changing structural environment.”
ITV shares rose as much as 19 per cent in morning trading on the news.
The other main UK commercial broadcasters, Channel 4 and Channel 5 (styled as 5), face similar pressures to consolidate. Five is owned by Paramount Skydance, which under new ownership has begun cost-cutting in the US. Channel 4’s future has been a topic of debate for some time, with repeated speculation of a merger between its digital services and BBC iPlayer.
“Channel 4’s future over 10-15 years is very uncertain, and at the very least it is going to need to find ways of collaborating with other broadcasters, like sharing streaming services or selling advertising together, because its long-term future is not healthy. But it is a very valuable brand, so we've got to have a great deal more flexibility in the television market to preserve the value of the domestic broadcasters and the public good of the programmes they make.”
He argued the UK media industry should have had a strategy for the survival of public service broadcasters in place a decade ago. “Instead, we're very late to it. [We] should credit ITV with probably triggering that reappraisal. In a way the market and the companies have done what governments haven't done.”
On Thursday, ITV warned that the uncertainty surrounding the UK government’s financial plans, to be announced in a budget on 21 November, was hurting ad revenue. As a result, it would “temporarily” cut £35m from its budgets.
The company also said it expected advertising revenues to fall by 9% in the key fourth-quarter advertising period in the run-up to Christmas.
In June, Comcast sold pay-TV group Sky Deutschland in Germany to RTL for an initial price of €150 million ($176m), with the final sum determined by RTL’s share price. The US company paid around $40 billion in 2018 for the Sky operations in Italy, Germany and the UK, outbidding Rupert Murdoch’s 21st Century Fox.
ITV launched ITVX three years ago. By the end of 2024 it had recorded 6 billion streams and claimed to have “outpaced all other major streaming platforms in terms of growth in viewer hours – with a 35% growth in viewer hours, ahead of iPlayer, Netflix, Disney+, Channel 4 and Amazon Prime”.
Year-end 2025 figures are due in a month.

Friday, 31 October 2025

BTS: Good Boy

IBC

From casting his own dog as the lead to shooting at dog’s-eye level, first-time feature director Ben Leonberg has perfected a filmmaking process entirely built around a pet. The result is critical acclaim and a viral smash for horror season.
article here
They say never work with animals, but filmmaker Ben Leonberg had the skill and patience to spend 412 days creating his pet project: an adult horror movie starring his own dog. Good Boy, which was made for peanuts, has become a bone-fide breakout hit.
“Back in 2012, while rewatching the opening scenes of Poltergeist, a ‘what if’ struck me that I couldn’t shake: what if the family dog was the only one who knew the house was haunted?” Leonberg explains.
Good Boy is the result of that question. It’s a paranormal thriller told from the perspective of a dog, with a distinct visual approach that often frames scenes from 19 inches off the ground.
“In this movie, the instincts and simple reasoning of a pet drive the story and storytelling. Indy is a real dog [a retriever] with real senses, and he'll follow his nose, literally. It's not teaching him to be in a movie. He has no idea he was in a film. It's just that we made the movie around him.”
At the start of Good Boy, Indy moves with his human owner to an isolated cabin in the country. The canine hero is immediately vexed by empty corners, tracks an invisible presence only he can see, perceives phantasmagoric warnings from a long-dead dog, and is haunted by visions of the previous occupant’s grim death. Is the dog really sensing an evil dread, or are we reading too much into his expression? The film plays on the ambiguity.
Projecting performance
“Everyone who's had a dog has at some point wondered why it’s staring at nothing or barking in the middle of the night,” says Leonberg. “I think that's very relatable. Also, dogs in horror movies is a trope we've seen before. They are often the ones who can sense danger before the human characters catch on. The inspiration for my film was to expand that kind of character by telling the story entirely from their point of view.”
In the process it became an intriguing thought exercise into the nature of filmmaking itself.
“People often ask how we got Indy to look scared and the truth is we didn't do anything. It’s just what you're seeing. There are things you can do, such as air conditioning the room enough so he won’t pant and generally keeping things calm, but all dogs have this very neutral expression. It’s the other filmmaking tricks and techniques that create the performance. If the audience feels scared, it’s because they’re projecting that onto him. In reality, he's having the time of his life!”
He elaborates: “In Hitchcock movies, for example, the actors aren't doing a whole lot. They’re not emoting much, but because the camera is doing something in relation to them, it creates a performance and suspense. The filmmaking tells you how to feel and you put that onto the character.”
Adjusting to the X factor
Nonetheless, when your lead actor is a dog, traditional filmmaking rules go out the window. For three years, Leonberg and his wife Kari Fischer (also the film’s producer) worked around Indy’s schedule, capitalising on his natural curiosity, eliciting specific expressions with silly noises, posing him in specific positions, waiting under beds for hours to get the perfect shot and enticing him around the haunted house set with treats.
“All shots with Indy were captured on closed sets so that we could maintain his focus, and we only ended up acting in it because I’m one of two people Indy truly loves and listens to.”
In reality, the set was their own home in a rural location in New York State, into which they moved during Covid. The familiarity of the environment was one key to how the filmmakers enticed a performance out of their lead actor. The rest required constant invention and planning to build shots around his daily schedule.
“Indy is an enormous X factor,” says Leonberg who storyboarded the whole film knowing that Indy couldn’t be relied on to ever hit exact marks. “It’s not the kind of movie where I could have had a board artist draw every detail of the room and explain to my actor exactly how all the elements of the mise-en-scène would come into play. I had a goal of what the shot was supposed to accomplish in terms of the story but the process was always in flux, trying to figure out how to actually execute on this.”
Sometimes a shot that was supposed to be a medium had to be changed to a two-thirds shot or a close-up just because of where Indy ended up.
“I’d have to adjust the shot that came next in response to that new frame. It was definitely hard, but also really fun. It's a novel way to make a film, using time as the primary resource, where you're not trying to spend a lot of money or do 12-hour days, day after day. This was working a few hours a day at the pace of a hobby over the course of several years.”
Leonberg did all the camera and lighting himself, crediting Wade Grebnoel (his surname, backwards) in the titles. He shot on a RED Dragon X 6K with vintage Nikon AIS glass. “The hero lens was the 15mm. It probably got used in every single scene for close-ups where his face fills the frame. It’s a wide-angle lens that is perfect for a canine face. Vintage lenses take some of the edge off the resolution and make the movie feel a little bit more handmade and organic. That was certainly what I was going for here.”
He wasn’t totally solo. Fischer learned how to operate the camera for a few shots where her husband had to be in front, and he brought in an additional camera operator for one day when they both needed to be free to move around with Indy.
However, for all but five of the 412 days of principal photography, it was just the three of them - including the dog. Leonberg also redid the electrics and built practical effects, including rain machines, in creating the spectral presence that haunts the film.
Leonberg had a career in commercials before moving into narrative filmmaking with shorts like Bears Discover Fire, which featured a life-size puppet of a grizzly. He also earned an MFA in directing from Columbia University, where he taught for over five years.
He drew on his background working in immersive media to execute some of the most complex shots, leveraging compositing techniques he perfected in VR to remove himself and Fischer from shots where they coached Indy while on camera.
The recorded production audio, mostly from the camera or a planted mic, was just a guide. Since Leonberg and Fischer were coaxing Indy through each shot, the sound required an entire rebuild in post, including replacement of all the dog’s footsteps.
What comes next
It’s not a spoiler to state that Indy survives. Leonberg revealed as much before the film’s release, likening it to audience knowledge of Ethan Hunt’s invincibility in Mission: Impossible.
“My co-writer (Alex Cannon) and I never seriously considered a version in which he dies. If it was a sad ending, it might not have worked out quite so well. Horror movies are horrible by definition, but people still like an ending where it feels like the hero has some sort of completed journey. We always knew we were going to arrive where we did, with Indy living to fight another day, so to speak.”
Independently produced by Leonberg and Fischer’s company ‘What's Wrong With Your Dog?’, Good Boy has made over $7 million in cinemas and is also available on horror streamer Shudder.
Unsurprisingly, after all the attention his film has achieved, Leonberg is fielding pet-related scripts. “Having developed a unique set of skills, at least working with my own dog, I’m being asked for my advice on other animal film projects. It’s not what I want to do next, although I probably will return to animal storytelling at some point.”
Instead, he wants to explore how new stories can be told using original perspectives.
“Refreshing genre tropes through new kinds of characters and marrying that to a literal new perspective is definitely exciting. Even though it was physically taxing to make a film from 19 inches off the ground from Indy's point of view, I want to take the idea further.”

Thursday, 30 October 2025

Can GenAI unlock ad revenue for cash-strapped broadcasters?

IBC

The first AI-created adverts are coming to TV as broadcasters look to compete with social media. ITV and Channel 4 explain why they are now scaling up.
article here
Last month, Channel 4 became the latest broadcaster to offer advertisers the ability to create ads using Generative AI. It followed US media conglomerate Comcast (parent to Sky in the UK, the NBC network and streamer Peacock), which launched a GenAI service in June, and ITV, which began its trial over a year ago.
All are now looking to scale up the proposition, which targets the hundreds of thousands of small and mid-sized businesses who currently advertise with Meta, Amazon and Google.
Additionally, ITV, C4 and Sky are planning to pool resources into a new advertising marketplace in collaboration with Comcast in 2026. This will be based on Universal Ads, Comcast’s advertising platform “designed to make television as easy to buy as social media”, which includes video generation from Streamr.AI at its core.
“Our motivation is fundamentally to bridge the gap between the millions of advertisers that are out there and the thousands who currently advertise on ITV,” says Jason Spencer, Business Development Director at ITV. “Essentially, what we see in that huge gap is a growth opportunity for us to engage those SMEs who are used to making their own ads on Meta and YouTube. We want them to see that actually TV is no longer outside of their reach.”
The argument is that historically TV advertising has been too expensive and too complex for most businesses. Only 7,000 out of the UK’s 3 million advertisers run campaigns on TV, according to clearance and regulation service Clearcast.
Breaking down the barriers
The aim is to “democratise access to TV”, as Barry John, Head of Sales Operations at Channel 4, puts it. It solves a problem for advertisers who may in the past have thought TV was either too expensive or too complicated to buy, or for whom the process of putting advertising on CTV seemed out of reach for various reasons.
“We are providing a set of tools that can give an advertiser a suitable quality creative in order to advertise their products with us where they may previously have never thought it was an option.
“This is not about building a huge brand campaign with a long-term initiative. It is very firmly around the sort of small to medium-sized businesses whose objectives are typically sales of some kind.
“We feel there is a route with GenAI to provide a good enough ad for them that will allow them to be on the most powerful marketing medium that we've seen – TV - in a way that is time and cost efficient for them.”
Channel 4’s solution is claimed to cut the cost of producing a 30-second spot by around 90%. In 2024, ITV’s in-house team of five made about 1,000 ads for 200 new-to-TV advertisers with budgets between £500 and £5,000. The original GenAI ads launched last year took around 10 hours to create and cost around £500 each. Spencer says that cost has reduced even further.
Comcast claim that using GenAI, “What used to take months and thousands of dollars can now happen in a single afternoon.”
Both ITV and Channel 4 are offering the use of GenAI as a managed service for clients. Their in-house teams will use various AI tools to create an ad, liaising with the client on expected outcomes and tweaking the creative accordingly.
Both claim the GenAI tools they use are ethically sourced, properly licensed and copyright safe. Both work directly from the marketing assets, like websites and existing videos, owned by the client, and both go back and forth between the AI models and the internal sales team to finesse prompt engineering. The ads will be subtitled automatically, delivering further efficiencies.
Both broadcasters also use Streamr.ai, the video generator recently acquired by connected TV ad company Magnite. A key reason Channel 4 and ITV chose it over dozens of rivals is that Streamr has fed UK broadcast compliance rules (BCAP) into its video generation engine, with the aim of having the ad cleared by Clearcast at the first time of asking, thereby removing another impediment for advertisers.
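As a toy illustration of the general idea of building compliance into the generation loop, here is a minimal Python sketch that screens draft ad copy against a rule list before it goes anywhere near clearance. The rules and wording are invented examples; the actual BCAP code and Streamr’s implementation are far more involved.

```python
# Invented example rules; not the actual BCAP code or Streamr's logic.
RESTRICTED_CLAIMS = {
    "guaranteed": "absolute claims usually need substantiation",
    "the best": "superlatives usually need substantiation",
    "risk-free": "financial-style claims need qualification",
}

def precheck(script: str) -> list[str]:
    """Return a list of flags for a draft ad script; empty means no hits."""
    lowered = script.lower()
    return [f"'{term}': {why}" for term, why in RESTRICTED_CLAIMS.items()
            if term in lowered]

draft = "The best pizza in Leeds, guaranteed hot in 20 minutes."
for flag in precheck(draft):
    print("FLAG:", flag)
```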
GenAI ad workflows normalised at ITV
ITV has just switched to Streamr, having launched using ‘enterprise licenses’ for models that likely include ChatGPT. ITV won’t say which other models it uses, but it will continue to use them alongside Streamr for different elements of production. Over the past year, AI has become “normalised” in ITV’s ad creative workflows.
“Brands don't necessarily come to us saying they want a GenAI ad,” Spencer explains. “When we look at a brief, we see that it lends itself to using a range of tools and techniques, from [real] video cameras and graphics packages to GenAI. AI has become integrated into what we do day-to-day. We're not just making ads that are solely end-to-end GenAI. There might be certain shots within certain ads using Generative AI. We might use AI for storyboarding or to speed the process in other ways so we can spend more time discussing with the client how we can flex things.”
“We're producing very simple storytelling with a low cost of entry for an ad that can be delivered self-serve in 30 seconds,” Spencer says. “If an advertiser says they want a more sophisticated ad, they can upgrade to one made by our creative production team who will use, amongst other things, the enterprise licenses through GenAI.”
ITV has trained its entire sales team to be able to use the tech. “We've turned our commercial team into producer-creators,” Spencer says. “That's a door opener. We're using this as a pilot initially to see how this helps us to engage more SMEs. We're stress testing the capabilities of it now.”
He adds, “The market is quite tough at the moment. We're up against the Big Tech platforms who have a much more frictionless way of working with SMEs. What we've tried to do is chip away at the perceptions and this is a long-term thing. You have to keep saying it and saying it that the barriers to entry are not what people think they are. You can't just say it once and launch some tools and ‘hey presto’ everyone is sold. It's a matter of continuing that mantra and proving it.”
C4 moves into beta
As well as Streamr.AI, Channel 4 uses an AI product from Telana. John explains: “Streamr is meant for achieving direct response-type outcomes, such as including a QR code for consumers to activate and proceed with the sale.”
If the client’s marketing goals are more advanced, perhaps needing detailed planning, then C4 will engage with Telana for “a more hand curated process.”
Longer ads could be created by linking six five-second clips together, smoothed by postproduction, for a more bespoke ad. “This would be used less for direct response and more for creating a marketing message around brand values.”
Channel 4 is now moving from pilot phase into beta and opening up to more SMEs. “There is a set of target advertisers that we will proactively reach out to who've already registered interest with us. It will grow from two to three clients this year to five to 10, then 15, a month from January.”
Universal marketplace for SME ads
With AI technology improving at pace and with tools available to automate the workflow from concept to publish, there is a question over the future of creative agencies. Meta boss Mark Zuckerberg has outlined plans for a completely AI-driven advertising business from detailed campaigns to the creative. Meanwhile, Amazon sellers can now generate video promos with a simple text prompt to an AI chatbot. The ads can appear on Amazon’s online marketplace and across Prime Video and Twitch.
Broadcasters are taking tentative steps in this direction too.
In 2026, Channel 4, ITV and Sky will launch an advertising marketplace in collaboration with Comcast. Called Universal Ads, it uses an AI video generator that mirrors Meta’s plans to fully automate ad production.
The marketplace will also allow easy access to on-demand and streaming inventory from the three sales houses through a single campaign powered by Comcast’s FreeWheel technology.
In May, Comcast launched a GenAI tool with Creatify (in which Comcast is an investor). The solution, integrated into Universal Ads, makes creating a TV-ready commercial “as simple as building a social media ad. No studio required. No pre-existing video assets needed. No large production budget.”
The targets are small business owners, who are told by Comcast that “what used to take months and thousands of dollars can now happen in a single afternoon.”
“AI is developing extremely fast but ultimately, we see this as an augmentation as much as an absolute pure creation tool,” says John. “This is not replacing creativity in its entirety because we don't think it will get to that point where it can do that as well as a set of human creatives can in terms of understanding brand or consumer behaviour.”
He views GenAI as opening opportunities that wouldn't exist without it, such as the ability to create almost infinite varieties of the same creative. “You’ll create a core asset and then put it into the AI engine to output a different final frame for a particular type of person or specific location based on other consumer and retail data. Dynamic creative optimisation is where we think GenAI really comes into its own and where it will scale quickly.”
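A minimal sketch of that dynamic creative optimisation idea, with invented segment names and copy: one shared core asset paired with a segment-specific final frame.

```python
# Toy illustration of dynamic creative optimisation: one core asset,
# many segment-targeted variants. Segment names and copy are invented.
CORE_ASSET = "core_spot_25s"

SEGMENT_END_FRAMES = {
    "london_commuters": "Order tonight on the 18:04 home.",
    "families":         "Feeds four for under £20.",
    "students":         "10% off with a student card.",
}

def build_variants(core: str, end_frames: dict[str, str]) -> list[dict]:
    """Pair the shared core spot with a segment-specific final frame."""
    return [{"creative": core, "segment": seg, "end_frame": copy}
            for seg, copy in end_frames.items()]

for variant in build_variants(CORE_ASSET, SEGMENT_END_FRAMES):
    print(variant)
```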