Thursday, 9 April 2026

Live from the Grand National: NEP saddles up for three-day, course-wide coverage

SVG Europe

article here

For the team at NEP, the Randox Grand National is the culmination of a year-round operation that must scale, adapt and perform under intense pressure.

“This is our tenth year working on the National for ITV,” explains Jon Harris, NEP UK’s Head of Technical Project Planning and Management. “We cover around 100 transmissions a year, week in, week out for ITV Racing (produced by ITV Sport Production, part of ITV Studios) so we’re never more than a few days away from our next OB - but this is the biggest.”

The field of 34 runners for this Saturday’s main event includes 2025 winner Nick Rockett, 2024 winner I Am Maximus and Iroko, fourth in the 2025 National and an 8-1 contender.

At the heart of NEP’s operation is its purpose-built outside broadcast unit, Equinox, designed specifically to meet the demands of horse racing coverage.

“Back in 2017, when ITV took on the contract, we looked at their technical specifications and combined that with our experience of racing production,” Harris says. “Equinox was built to fulfil those needs. Everything is centred around that truck.”

Equinox provides a consistent operational base throughout the racing calendar, but crucially, it is also designed to scale. “It supports the week-to-week racing coverage, but also has the capacity to handle major events like Royal Ascot and the Grand National,” he says. “That gives us a real head start when we get on site.”

A rolling operation

Equinox came to Aintree straight from Musselburgh over the Easter weekend. A second OB unit, Atlantic, joined the main operation after finishing another racing fixture at Kempton on Monday night. This compressed turnaround is made possible by a design philosophy focused on rapid deployment and continuity.

“We already have elements pre-rigged, which gives us a huge advantage and allows us to set up in much shorter timeframes compared to traditional setups,” Harris says. “Tuesday and Wednesday we rig and test ready to go live on Thursday. And for the crews, the truck provides an operational continuity with the same layout and workflows they’re used to. That’s really important.”

A unique technical challenge

Unlike other major racecourses, Aintree presents a particular challenge: it is only used once a year for ITV racing. “With somewhere like Ascot, we’re there multiple times a year, so there’s familiarity and some fixed infrastructure,” Harris explains. “At Aintree, everything is a temporary cable rig.”

In preparation, NEP’s rigging team installs fibre to key points around the course beginning in February. “Beyond that, every cable is run fresh for the event.”

Reliance on RF technology

Aintree’s scale is significant. The course itself is 4 miles 2½ furlongs (about 6.9km) long and the ground undulates in places. The famous Canal Turn, for example, sits around two kilometres from the main compound, requiring extensive connectivity planning.

“We rely heavily on RF because we need flexibility,” Harris says. “This allows cameras and presenters to move freely across the course, from the track to hospitality areas, capturing both the racing and the atmosphere.”

He adds, “ITV wants to capture the essence of the event, not just the sport. It takes a lot of coordination, making sure the right microphones are matched to the right cameras, and everything is correctly tuned.”

There are in fact 17 other races broadcast from Aintree before the 4pm Saturday start of the National. Nonetheless this is as much a social spectacle as a sporting event, and that shapes the production. “Horse racing itself is actually very short,” Harris notes. “What surrounds it (the build-up, the crowd, the fashion) is a big part of the OB. There’s a push to make things more dynamic, to get out and about—even before the racing starts. Using RF cameras and technologies like LiveU, we can capture people arriving, the atmosphere building.”

Fence cameras are particularly valuable for replays and analysis. “They might not always be used in the live cut, because you need to maintain orientation, but they’re great for telling the story afterwards.”

For aesthetic shots, the production deploys specialist rigs developed by NEP using DSLR cameras with racking control, but the core race coverage is standard 1080i/50 in SDR.

Camera firepower

NEP populates Aintree with 57 cameras, including two 3x-speed and two 6x-speed Sonys, plus four fence cams covering eight jumps of the National (waterproofed Marshall units to which NEP adds FX mics). One tracking car is fitted with one of the HiMo cameras and another carries a custom NEP mobile jib.

For race coverage, ITV deploys a full-size Batcam drone and has a couple of smaller units for pre-records with ENG crews.

The wirecam, supplied by Gravity Media’s Specialist Cameras division, is a CAMCAT Standard with tracking data for AR graphics. It runs for over 850m at the grandstand end of the course, parallel to the final stretch, crossing the course so it also covers both the race start and finish.

Gravity also supplies a helicam equipped with a GSS B512X stabilised head and carrying a Sony P50 camera with a Canon CJ45 lens. “These give us both atmospheric grandstand wides as well as dramatic race coverage,” says Tony Cahalane, Technical Director, ITV Sport.

Planning critical

With such reliance on RF, contingency planning is critical. “Our job is always to ask: what happens if something goes wrong?” Harris says. Interference, signal loss or external disruption are all risks. To mitigate them, NEP maintains a fully cabled backup system.

“We can cover the race entirely on fixed cameras if we need to,” he explains. “It wouldn’t be as good, but it would work. Key presentation positions are also hardwired so we can ensure continuity even in worst-case scenarios. We’ve always got fallbacks. It’s about giving ourselves time to fix the issue while staying on air.”

The sound of the spectacle

Microphones are placed across the course to capture both the roar of the crowd and the quieter, more atmospheric moments. “At the far side of the course, it can be almost silent,” he explains. “You have to reflect that contrast. We’re trying to paint a picture with sound. It’s about making people feel like they’re there.”

In excess of 50 FX mics are sub-mixed and augmented by bespoke recorded horse effects added as a live dub, as is the norm in racing coverage.

Presenters are equipped with individual microphones to ensure clarity amid the noise, while coordination ensures seamless integration with RF cameras.

ITV is currently running a contract evaluation process covering the facilities required for the next rights period.

Cahalane explains, “As always, we’ll look at what NEP has delivered and what others could deliver, but we’re still hugely impressed! First incumbents are always in a strong position, but it’s a value‑based process.”

For Harris, who has clocked up 16 consecutive Nationals, the priority is delivering a broadcast that allows viewers to follow the race.

“The most important thing is that people can see where their horse is and understand what’s happening,” he says. “Everything else has to support that. Get the basics right and everything else can be built around that.”

ends

Behind the scenes: Undertone

IBC

article here


While looking after his dying parents, VR filmmaker Ian Tuason became obsessed with demonic possession stories, which planted the seed for the uniquely creepy sound design of his first feature.

You won’t hear Baa Baa Black Sheep in quite the same way after seeing Undertone, the latest twist on found-footage horror, which emphasises sonic scares over visual gore. Like last year’s low-budget breakout chiller Good Boy, Undertone preys on audience fears by letting their imagination run riot about what might lurk in the shadows.

“Sometimes the most terrifying thing of all is our imagination, and what we project onto something that may or may not be there,” says Canadian writer-director Ian Tuason. “This is a found audio movie and a soundscape above all else. I created a sound design for this story where everything is directional. The audience can close their eyes and feel where everything is — or might be. When something far away is suddenly  getting closer behind you, the terror becomes heightened and amplified.”

The story takes place in one small house and is told through the point of view of one woman nursing her dying mother, who is bedridden upstairs. The woman is Evy (Nina Kiri), who also co-hosts a podcast investigating paranormal activity. Evy is a sceptic when it comes to unexplained things that go bump in the night. You might be able to join the dots.

“I always considered my mother’s company as the safest place I could be, and when I saw her in this vulnerable, dependent condition at the end of her life, my imagination took off,” says Tuason. “A possession movie where you’re caregiving for a parent and they become possessed by demonic forces was something that I hadn’t seen before.”

The soundscape

Tuason knew the sound for the movie was going to do 80 percent of the lifting in terms of the storytelling. Having experimented with shooting live-action shorts on a 360-degree camera, he wanted to duplicate the immersive effect of those visuals in the sound design of his feature debut.

“I thought as much about the aural elements of Undertone as I did the visual elements you see on camera,” he says. “The intention was to create a soundscape of increasing force and menace.”

Tuason wrote audio directions and camera directions into the script. “The script would indicate that a baby cries behind you, or we hear from right to left the tumbling of a body downstairs,” says Tuason.

Other background sounds like stairs creaking or the ticking of a clock are amplified. As in A Quiet Place (2018), the sound of silence is also threatening. The atmosphere becomes more intimate when Evy puts on noise-cancelling headphones to listen to audio clips for her podcast; they also mean she can’t hear what may be going on around her.

“A lot of the horrifying elements in the movie are intensified because you, as the viewer, are trying to create images of what Evy is listening to through her headphones.”

He employs audio apophenia, the phenomenon of hearing hidden messages in songs when they are played backwards.

“Audio apophenia is trying to make sense of a random sound that’s played in a different  context, which becomes scary, because the listener is creating the horror in their own mind,” says Tuason.

One of these subverts the innocuous nursery rhyme, a tune chosen because it is in the public domain and therefore free of rights.

The film’s supporting characters, including medical professionals, Evy’s boyfriend, podcast listeners phoning into the show, and a married couple on the audio files, all appear off-camera, requiring actors who were skilled with voice-over work.

Except for the audio files, recorded and designed by Dane Kelly prior to production, the sound design in Undertone was completed in post. A surround-sound mix was later expanded into Dolby Atmos.

Another similarity with Good Boy is the ingenuity forced on the filmmakers by the low production budget. Tuason wrote and filmed the entire story in the small two-storey Toronto house he grew up in, the same one where he had nursed his parents in their dying days.

Over the two and a half years of preparation and writing, he took hundreds of photographs which he turned over to production designer Mercedes Coyle to transform the nondescript decor into something a little more chilling in its banality.

This included sourcing and positioning religious iconography like ceramic statues, crosses, and paintings of Jesus and Mary, as well as creepy childhood drawings (based on the scribblings of Tuason’s young nephew).

Director of photography Graham Beasley filmed on the Alexa Mini LF, partly because its small size was useful for getting into tight corners and stairwells, and for capturing Evy’s claustrophobic mindset. Beasley frames more for what we don’t see than what we do, allowing our fears to fill in the gaps.

His lighting scheme begins with an emphasis on daylight but gradually turns to darkness — a neat metaphor for the dying of the light. The upstairs of the home set is lit in a cold manner, suggestive of hospital lighting, while the downstairs feels more nurturing.

Background in VR filmmaking

Tuason began his career making 360-degree virtual reality-inspired horror shorts on YouTube, including Continuity Problems (2009) and Extreme Close Up (2011). In 2014 he set up his own VR production company DimensionGate pioneering immersive 3D sound for live-action cinematic content for headsets like Meta Quest. His 2015 live-action 360-degree  horror short 3:00am racked up 9 million views on YouTube after being shared by Ashton Kutcher and Lil Wayne.

Following that viral success, Tuason was hired to make VR content for clients including Warner Brothers, The Canadian Football League, The Chainsmokers and Snoop Dogg. In 2024 DimensionGate released VR game Stab It in which you have to stab bats in a cave to survive.

Undertone began life as a radio play in 2018 about paranormal investigators who stumble upon some sinister audio recordings. The project was inspired by user-generated horror memes and myths distributed online (dubbed creepypastas), which are often presented as first-person accounts or faux found footage. For his radio play Tuason explored an internet phenomenon in which paranormal enthusiasts record themselves listening to songs played backward.

He then digitally reversed tracks, filmed the actors as they listened, and captured the distorted audio himself on an iPhone. “I discovered these YouTube channels of people examining songs in reverse, searching for hidden messages… and knew I wanted to write about sitting alone in the dark at night looking and listening.”
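
Reversing a track digitally is trivial to do. As a purely illustrative sketch (not Tuason’s actual workflow, and the filenames are hypothetical), a few lines of Python with the soundfile library are enough:

    # Illustrative only: flip the sample order of an audio file to play it backwards.
    import soundfile as sf

    data, rate = sf.read("nursery_rhyme.wav")            # samples as a NumPy array
    sf.write("nursery_rhyme_reversed.wav", data[::-1], rate)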

One particular ‘creepypasta’ that stuck with Tuason was the Elisa Lam story, in which a young woman vanished while staying in a dilapidated hotel in downtown Los Angeles. In the aftermath of her disappearance, hundreds of armchair sleuths pored over security-cam footage of Lam’s apparent spectral presence in the hotel’s corridors and elevators, with some speculating she’d fallen victim to the videotape malevolence of the 1998 Japanese horror Ring.

The feature only took shape when, during Covid in 2020, Tuason found himself tending to both of his parents who each had a terminal cancer diagnosis.

“It’s easy to lose faith when you watch your parents dying,” says Tuason. “This was a very personal story. Evy was me. Everything that ended up on the screen, or in the speakers of Evy’s headphones, was intentional and honest, and meticulously thought out.”

Tuason wears the influences of The Blair Witch Project and Paranormal Activity on his sleeve. Having sold the rights to Undertone to A24 for a seven-figure sum, he has since been hired by rival horror house Blumhouse to direct a reboot of Paranormal Activity, scheduled for a May 2027 release at Paramount Pictures.

Wednesday, 8 April 2026

Vertical dramas: Market disruptor or passing fancy?

IBC

As Hollywood wakes up to vertical micro-dramas, their rise shouldn’t be dismissed as a fad but recognised as a profound shift in the production, consumption and gender bias of global storytelling

article here
Bite-sized video series designed for mobile viewing are taking the world by storm and even the $26 billion in annual revenue predicted by 2030 seems conservative.
“That number is very realistic—if not bigger,” says Vivian Wang, Head of Content at LA-based Crisp. “Vertical storytelling has limitless potential, and we haven’t even fully seen it yet.”
When people talk about vertical content, they often call it “micro-drama,” but industry professionals usually refer to it as vertical drama, vertical series, or simply “verticals.”
When the format first emerged in China, most of the content was adapted from online literature, produced on very low budgets, and distributed through social media. TikTok, in particular, fundamentally changed how people use their phones.
“People used to believe that vertical viewing was only for short moments—standing in line, taking a break, or filling gaps during the day,” Wang says. “But now we realise that people scroll on social platforms on their couch for three hours without noticing the time passing. Meanwhile, watching a horizontal screen for 20 minutes can already feel long. This shift is revolutionary.”
Crisp is one of a growing number of startup production companies and platforms focused on vertical video and targeting the English language market. Others include Vigloo, My Drama, ReelShort, CandyJar, and DramaBox. Spanish-language network TelevisaUnivision offers vertical versions of telenovelas on Vix.
Following Disney’s announcement of plans to launch a new vertical video product later this year, Netflix has also signalled an increase in vertical content on its mobile app.
“There’s never been more competition for creators, for consumer attention, for advertising and subscription dollars,” Co-CEO Ted Sarandos told analysts last week, which makes you wonder whether $83 billion cash for Warner Brothers’ back catalogue and HBO subscribers wouldn’t have been better spent on a creator platform.
“Vertical content is already close to mainstream,” says Sasha Tkachenko, the Head of Studio at My Drama, a micro-drama platform owned by Ukrainian company Holywater in which Fox Entertainment has invested. “Nearly everyone has a smartphone and vertical viewing is unavoidable.”
Profound behavioural change
China now produces around 3,000 vertical shows every month. This phenomenal economy is built on a behavioural change in the way people consume content. Watching horizontally and passively is a completely different mindset from watching vertically, with your thumb in control of the next swipe.
“When your hands are off the screen, the power belongs to the storyteller. You’re ready to listen, ready to have your mind changed,” Wang says. “Film and television evolved to demand that storytellers offer something meaningful enough to justify that power. If a story doesn’t offer value within the first 5 to 15 minutes, audiences increasingly refuse to give it their attention.”
Vertical viewing flips that dynamic. Wang says, “The power is in the audience’s thumb. You can scroll away at any moment. The audience isn’t looking to have their worldview challenged—they just want immediate entertainment. That changes storytelling completely.
“This is why traditional filmmakers sometimes look at vertical shows and think they’re ridiculous. But the numbers say otherwise. People watch them. People pay for them. The problem isn’t the audience—it’s our perspective as filmmakers. We must seriously rethink what people want to watch.”
Wang herself has a traditional media background. Before coming to the US, she wrote and produced four films (including Goodbye To Youth) and a TV series in China. Transitioning into vertical content required a major learning curve. “In China vertical content took off very quickly while Quibi (the Jeffrey Katzenberg-backed short-form video startup which closed in 2020) failed in the US. That shows that short-form, vertical narrative content can work—it’s just that only certain types of storytelling succeed in this space, while others don’t.”
Genre experimentation
Melodrama and drama have dominated the vertical space and continue to perform well, but hybrid genres are emerging, such as melodrama combined with thriller, detective stories, or sports drama.
Holywater have seen strong success with LGBTQ+ content and commissioned LA producer Second Rodeo to produce Playback, described as the “first-ever musical produced specifically for vertical video”. Unscripted content is being tested across the industry, including the series Love or Dare which debuted on My Drama before Christmas, but no one has truly cracked the code yet. ESPN even offers vertical sports content via the Verts tab in the ESPN App.
“The genre itself doesn’t matter as much as the execution,” Tkachenko says. “Rule number one is that vertical content must be fast-paced and immediately relatable. The viewer needs to instantly recognise themselves or someone familiar on screen. Characters should be clearly defined and almost archetypal. For example, if the main heroine is a woman in her twenties from a decent family with specific traits, those characteristics need to be instantly visible. Clear, simple character construction is one of the core drivers of success.”
Episodes are short (My Drama’s are between 40 seconds and two minutes) so the structure must be tight. “You need a strong hook at the beginning, a conflict in the middle, another escalation, and a powerful cliffhanger at the end,” she advises. “If any of these elements are missing, viewers will simply swipe away to another app. Every episode should grab attention immediately and hold it until the final second.”
Audiences should be involved in story development. “The simplest method is to produce a pilot of around 10 to 15 minutes and release it for free. We then ask viewers directly whether they want the story to continue,” Tkachenko explains. “If more than 60 percent respond positively, we move forward with full production. Pilots are also where we test new genres and tropes that we feel less certain about.”
My Drama also actively engage communities on social media, testing cover designs or titles and allowing viewers to vote. “We closely monitor fan edits, comments, and requests for sequels, prequels, or specific actors,” she says. “Listening to the audience usually pays off.”
Raising production quality
Improving production quality doesn’t always mean higher budgets. Holywater work extensively with partners in Eastern Europe where high-quality content can be produced at a fraction of Western budgets, according to Tkachenko.
“Productions that might cost $150,000 per episode in the UK can be made for $25k without sacrificing quality. This efficiency is a key part of how we raise production standards.”
Its deal with Fox will allow access to Hollywood talent both in front of and behind the camera. Titles co-produced with Fox already on the platform, such as Billionaire Blackmail and Bound by Obsession, “look fantastic”, says Tkachenko. “The partnership absolutely elevates production value. Its scale, infrastructure, and access to locations across the U.S. benefit everyone involved.”
Holywater’s pipeline usually starts with books. It has a library of thousands of titles available on its app My Passion, which are tested for audience interest. Successful titles move into My Muse, another app where AI is used to adapt and visualise book IP into animated content. “Producing books and AI content is far less expensive than live-action production, so this helps reduce risk,” says Tkachenko.
The company also uses AI to support scripting, music creation, and visual effects such as crowd scenes, stunts or large locations for live action shoots.  
“Speed is critical: a live action title is typically produced within three to four months, while My Muse can produce a full title in one to two weeks. In many cases, it’s becoming difficult to tell whether certain shots are AI-generated or live-action.”
At Crisp, Wang is looking for content that is “vertical-coded”—stories that fully understand and embrace the format. She explains, “A hit for us is a show that generates over $1 million in revenue. Production value matters, but storytelling matters most.”
It takes time, data, and thousands of hours of viewing to truly understand why certain content works. Success depends on understanding the current emotional pulse, she says.
“Vertical content isn’t driven by curiosity as much as anxiety. People come looking for comfort, fantasy, and emotional release. Understanding that anxiety—and fulfilling it quickly and effectively—is the core skill of this format. What filmmakers think is ‘good’ often doesn’t match audience behaviour. Learning that was surprising for me—but invaluable.”
Monetisation is shifting
The vertical industry currently relies on the IAP (in-app purchase) model which Deloitte forecasts will reach $7.8 billion in 2026. Consumers watch a few episodes for free, get hooked, and then (usually between episodes six and fourteen) are asked to subscribe for unlimited viewing or unlock episodes using coins (microtransactions). Additional revenue comes from licensing content to other platforms.
Platforms spend enormous amounts of money on promotion—often more than on production—primarily through algorithm-driven clips and ads on social platforms.
However, this model may be temporary. In China, ByteDance launched RedFruit in 2022, a free platform that removed paywalls and monetised through in-app advertising (IAA), quickly amassing over 236 million monthly users. The longer viewers watch, the more ads they see, and the more revenue is generated.
“When it's free to watch, the focus is on how long can I keep the viewer on my platform instead of how much money I want you to pay. That immediately changes the format.”
For example, RedFruit has started to produce shows with longer episodes of three to four minutes for stories with a slower burn pace.
“The money shots are mostly saved for the later episodes instead of at the beginning to encourage people to stay watching,” Wang says. “You have to create really good content that is worth staying for.”
However, RedFruit’s most revolutionary move is its generous revenue-sharing model. “This is really badass and mind-blowing, especially to artists in China,” Wang says. “I personally have friends who are writers of vertical dramas who are earning millions of dollars every year completely from passive revenue share.”
The model is familiar from YouTube, of course, only RedFruit will also pay production budgets – and pay the creator an upfront fee – unlike traditional platforms that rely solely on backend profits.
Internationally, we’re starting to see similar momentum. TikTok, still owned by ByteDance in the US despite pending ownership changes, has just launched an IAA drama app called PineDrama. While the Western market still relies largely on paywalls, the shift toward free, ad-supported platforms is coming.
“We are open to licensing vertical content from producers worldwide,” Tkachenko says. “Deals usually involve a flat fee, sometimes combined with profit-sharing, depending on the title, language, and market.”
Female led storytelling
Interestingly, vertical drama originally started as male-oriented content in China—“stories about underestimated men who turn out to be secret geniuses,” says Wang. But around 2020, platforms shifted toward female-focused romance, and that’s when the genre truly exploded.
My Drama’s core audience is typical of the format: women aged roughly 30 to 45. Male content is emerging but represents a fraction of total production. Tkachenko says men tend to watch micro-drama for validation and revenge-based narratives, whereas women watch for escapism and emotional relief. These psychological needs differ significantly, so content must be tailored accordingly.
“Action-driven stories and revenge arcs work well for men. Sci-fi is more challenging due to budget and quality expectations, though we are experimenting with AI in this area. Strong existing IPs, however, are always worth exploring.”
Wang ties the growth of verticals to the demand for female-oriented storytelling. “Traditional film and television have historically been male-oriented,” she says. “Vertical dramas, especially romance, cater directly to female fantasies and emotional needs.
“Many people initially cringe when watching them, but that reaction often comes from internalised discomfort about openly enjoying female wish-fulfillment,” Wang continues. “This format allows women’s perspectives and desires to be centred in visual storytelling in a way film and TV rarely have. That is one of its most revolutionary aspects.”
Within the year, Tkachenko predicts we may see vertical-first titles re-edited for horizontal viewing or even cinema release. Dedicated Vertical Awards have also emerged.
“At markets like MIPCOM, vertical content is still a relatively small niche, but interest is growing rapidly,” she reports. “People are very curious and eager to understand what’s happening in this space.”

How vertical video became the new frontline for live sports

IBC

article here

Live sports entertainment remains the most powerful driver of real-time engagement in media, but the format through which it’s delivered is rapidly evolving.

As vertical video becomes the dominant consumption format – not just for social clips but increasingly for live content – the question for broadcasters is no longer whether to adapt, but how efficiently they can scale.

For Fox Sports, the answer appears to be training machines to think like camera operators.

“Around 90% of our viewership comes from vertical video,” Ricardo Perez-Selsky, Sr. Director, Digital Production Operations, Fox Sports explains. “That alone shows the scale of demand.”

Fox Sports broadcasts NFL, college football, MLB, motorsports including IndyCar and NASCAR, and global soccer properties such as the FIFA World Cup. Across that portfolio, vertical consumption is no longer a secondary format – it’s the primary one.

Its coverage of LIV Golf, the World Baseball Classic and IndyCar is already complemented by vertically formatted video highlights published to social media. When Fox broadcasts all 104 FIFA World Cup matches live across the US this summer, its streaming and TV coverage will be accompanied by extensive vertical video programming.

All of it is being delivered using technology developed with AWS Elemental, built around an AI model trained on Fox Sports content over two years of tests.

From dedicated workflow to machine learning

The origins of the project trace back to summer 2024, when Fox Sports Digital was covering the Euros and Copa América tournaments.

“For our digital-exclusive content (pregame, halftime, postgame, highlights) I basically built out a small control room dedicated to vertical video,” Perez-Selsky says. “It had its own director and producer and team of editors.”

When Amazon engineers visited the setup, they were struck by the fact that Fox Sports was taking 16:9 video and doing a full recut specifically for vertical platforms.

“I explained why that was so important to both our team and our audience.”

The concept sparked an internal hackathon at Amazon later that summer. That experimentation became the early stages of AWS Elemental Inference – an AI-driven system designed to automatically convert 16:9 broadcast feeds into 9:16 vertical video.

Teaching a machine visual storytelling

“While other tools can convert horizontal to vertical using ball tracking or player tracking, what sets this apart is the depth of development. We trained the model on how Fox Sports Digital produces vertical highlights – focusing on storytelling, smooth camera motion, and following the flow of play, not just the ball.”

The AI model had to learn not just where the action is, but how broadcast cameras behave. “There’s an art to sports camera work. It’s not just panning left and right. There’s a ramp-up and ramp-down. There’s smooth acceleration and deceleration, anticipating passes, not just reacting.”

Over 18 months, the model learned how to behave like a camera operator within a 16:9 frame – avoiding jerky movement and judging where the action is heading.

“It’s more like visual storytelling,” Perez-Selsky says. “Following the action of a play as opposed to just following the ball.”

For example, in soccer, “You’re covering Lionel Messi with the ball, then he passes it 50 feet ahead and there’s this jerky movement trying to catch up. Our technology can anticipate when that’s going to happen. You see a ramp-up and a ramp-down so it’s smooth motion in the highlight.”

That level of refinement took time. “The first time we put it through, it was a little choppy, a little rough. But by the second and third time, you could see that machine learning taking place. It improves with repetition. The machine had to learn how to act like a camera operator inside of a 16:9 frame.”
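
For a sense of what that camera-operator behaviour amounts to in code, below is a minimal, purely hypothetical sketch (neither Fox’s nor AWS Elemental’s actual model) of a 9:16 crop window easing toward a point slightly ahead of the action inside a 16:9 frame. The look-ahead and easing are what produce the ramp-up and ramp-down rather than a jerky snap toward the ball:

    # Hypothetical sketch: smooth a 9:16 crop window inside a 16:9 frame.
    # 'targets' are per-frame x-centres of the action (e.g. from a detector);
    # the crop centre eases toward a point slightly ahead of the action so
    # pans ramp up and down instead of snapping to the ball.

    FRAME_W, FRAME_H = 1920, 1080
    CROP_W = int(FRAME_H * 9 / 16)      # width of a 9:16 crop at full frame height

    def smooth_crop_centres(targets, look_ahead=12, easing=0.15):
        """Return one crop-centre x per frame, clamped inside the 16:9 frame."""
        centres, current = [], float(targets[0])
        for i in range(len(targets)):
            # anticipate where the action is heading by peeking a few frames ahead
            ahead = targets[min(i + look_ahead, len(targets) - 1)]
            goal = 0.5 * (targets[i] + ahead)
            # exponential easing gives the ramp-up / ramp-down of a human operator
            current += easing * (goal - current)
            half = CROP_W / 2
            centres.append(min(max(current, half), FRAME_W - half))
        return centres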

Replacing 80% of manual editing

While live sports has historically been built for horizontal television screens, reformatting that experience for vertical, in real time, without doubling production costs has remained a stubborn technical and operational hurdle.

For instance, vertical conversion required manual keyframing inside a non-linear editing (NLE) system. “An editor would take 16:9 content and keyframe it into 9:16. That was probably 80% of the workload,” says Perez-Selsky.
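
For readers unfamiliar with that step, here is a rough, purely hypothetical illustration of what manual keyframing boils down to (not any specific NLE’s behaviour): the editor marks the crop position at a handful of frames and every frame in between is interpolated.

    # Illustrative only: interpolate a 9:16 crop position between manually set keyframes.
    def interpolate_keyframes(keyframes, num_frames):
        """keyframes: {frame_index: crop_centre_x}; returns one value per frame."""
        marks = sorted(keyframes.items())
        out = []
        for f in range(num_frames):
            if f <= marks[0][0]:                 # before the first keyframe
                out.append(marks[0][1])
            elif f >= marks[-1][0]:              # after the last keyframe
                out.append(marks[-1][1])
            else:
                (f0, x0), (f1, x1) = next(
                    (a, b) for a, b in zip(marks, marks[1:]) if a[0] <= f <= b[0])
                t = (f - f0) / (f1 - f0)
                out.append(x0 + t * (x1 - x0))   # linear blend between keyframes
        return out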

Now, that process is fully automated. “The most time-consuming piece (keyframing) is entirely automated.”

Nonetheless, Fox Sports retains vital human oversight. “Custom graphics, custom copy, publishing – you still want eyes on that. And I’d argue you’d want that to stay manual anyway.”

But the heavy lifting is handled by AI. What’s more, the ability to ‘self-improve’ applies across all sports, according to the Fox executive.

 “You can feed it basketball, football, soccer – and it gets better. Anything you feed through it actually learns. That said, if you want optimal 9:16 quality, the machine needs time to learn.”

No vertical cameras required

The automation also shifts the economics of vertical production. Rather than deploying separate vertical camera crews, Fox can leverage the primary broadcast feed.

“It depends what you’re trying to capture,” Perez-Selsky says. “Broadcast cameras, which are usually now in 4K, provide resolution and access to ultra-slow motion and multiple angles that an iPhone simply can’t.

“Looking ahead to something like the FIFA World Cup, you might have 20 cameras on the pitch. That variety gives you far more flexibility. Using AWS tools, we can take the world feed and generate high-quality vertical clips. An iPhone can provide a unique perspective, but it’s limited in access and scale.”

“Always-on” distribution strategy

Vertical video highlights are not currently treated as a premium upsell by Fox, where the motive is reach and expanding its audience.

“Highlights and live streams help us maintain an ‘always-on’ presence. If someone follows Fox soccer or IndyCar on TikTok or Instagram, they’re consistently served high-quality vertical content. Our social accounts are round the clock, either with live programming, highlight reels, or evergreen content. Nothing is behind a paywall. That consistency helps grow our subscriber base and broaden our audience. If there are gaps in content, audiences go elsewhere.”

For major events, Fox even offers free vertical previews – for example, the first half-inning of the World Baseball Classic, or the first few minutes of FIFA World Cup matches.

“The goal is to meet audiences where they are and encourage them to tune in via broadcast or the Fox Sports app.”

Technically there’s nothing to stop whole matches being streamed live to mobile. “It’s certainly possible,” Perez-Selsky says. “The only thing that inhibits that now is potential media rights and distribution agreements.”

Fox is already experimenting with format-specific live vertical experiences. For the IndyCar Grand Prix, for example, it has served a vertical live stream composed exclusively of in-car cameras. “It’s not the full broadcast, there’s no commentary – more of a raw experience,” Perez-Selsky says.

The source remains 16:9 broadcast video, processed through AWS Inference before distribution to TikTok, Instagram Reels, YouTube Shorts or the Fox Sports app.

Beyond sports

Fox’s model is sport-trained but the applications extend further. Award shows, concerts, and entertainment events are all viable with sufficient training data.

“If you fed it every Oscars or every Grammys from the last 10 years, absolutely. It’ll do a decent job right away. The only limitation is the model needs to learn it. Vertical will be part of the conversation for everything going forward, World Cup and beyond.”

Vertical is the essential play

AWS and Fox are not the first to target the market for vertical consumption. NBC Sports was another beta partner with AWS in developing Inference. Last August, ESPN launched Verts, a feed of vertically formatted clips within its revamped mobile app. A mobile-first highlights feed produced with Samsung was distributed from the recent Winter Olympics in Milano Cortina.

In April 2025, OTT solutions provider Quickplay launched a version of its Quickplay Shorts tool for live sports. This includes an orchestration layer, CMS and front-end leveraging TwelveLabs’ multimodal AI models to analyse, understand and timestamp key moments in videos. Its customers include Philippine streamer Cignal, which has been running a ‘live shorts’ service since last April, leveraging content from professional local leagues like the Philippine Basketball Association.

“With Quickplay Shorts, sports broadcasters can own the conversation around the game, to keep viewers engaged and to drive them to higher value, live game viewing opportunities,” says Juan Martin, Co-Founder and CTO, Quickplay. “The change in viewing behaviour requires a strategic reimagining of audience engagement.” 

Bitmovin is building agentic workflows to automate clip generation, encoding and publishing to mobile-first platforms with demos expected at NAB 2026 and a release of its live vertical workflow tool in Q3.

“Vertical is a hot topic among customers,” says Jacob Arends, Senior Product Manager, Bitmovin. “Broadcasters, telcos and streaming platforms come to us to encode and optimise their content because they are challenged with how to compete with the wave of short-form, scrollable experiences.”

Bitmovin offers content metadata enrichment powered by AI scene analysis. It extracts granular metadata (objects, scenes, speech, actions) from video to optimise search, recommendations, content reuse and clipping.

“If it’s a goal, you want to follow the player who scored, track the ball into the net, and maintain contextual relevance. That requires machine learning models that understand what’s happening in the scene,” Arends says.
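
As a purely illustrative example of the kind of scene-level metadata such a pipeline might emit (the field names are assumptions, not Bitmovin’s actual schema), a single detected moment could be represented along these lines:

    # Hypothetical scene-level metadata record for AI-assisted clipping;
    # field names are illustrative, not Bitmovin's actual schema.
    from dataclasses import dataclass, field

    @dataclass
    class SceneMetadata:
        start_ms: int                 # where the moment begins in the source feed
        end_ms: int                   # where it ends
        action: str                   # e.g. "goal", "penalty", "pit_stop"
        objects: list = field(default_factory=list)   # tracked entities ("ball", "player_10")
        transcript: str = ""          # speech recognised over the segment

    goal_clip = SceneMetadata(start_ms=3_421_000, end_ms=3_436_000, action="goal",
                              objects=["ball", "player_10"], transcript="and it's in!")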

Expectation not a trend

René van Koll, Senior Solutions Architect at Big Blue Marble thinks both formats will continue to coexist. “Look at radio or newspapers – everyone predicted their demise, yet they still have a market. Likewise, some content simply suits landscape better. Think of traditional cinema: that experience doesn’t translate naturally to vertical. But the market is clearly moving toward vertical, and its success shows there’s strong demand. I expect both formats to live side by side for the foreseeable future.”

From a business perspective, the strategy is about engagement first, but new revenue streams are on the radar.

“Live sports rights are among the most valuable assets in media. You can’t replicate a live or viral moment and if you’re not there, you miss that audience,” says Regina Rossi, Head of Product, AWS Media Services. “So this is about expanding reach and engagement – but also monetisation. By adding live metadata and unlocking vertical distribution, customers can create new revenue opportunities and extend their content to additional platforms.”

Martin says consumption is shifting from long-form, TV-first experiences to discovery-driven, engagement-driven formats. “People aren’t waiting for a scheduled time to sit in front of the TV. They’re flowing through content, engaging dynamically, following creators, watching news highlights and sports clips. Traditional broadcasters and rights holders need to adapt to that shift.”

It’s worth recalling that streaming was viewed as something to support the main TV experience but over the last decade has evolved to become the dominant consumption platform. 

“When most viewing happened on television, horizontal made sense,” says Rossi. “But mobile consumption has dramatically increased, and vertical viewing has become the norm. Live sports in vertical, optimised for scrolling and discoverability, is an expectation – not a trend. I believe it’s a long-term shift.”

Wednesday, 1 April 2026

Digital twins: Creating XR experiences that go beyond what’s possible at the venue

SVG Europe

article here

A new front is being opened up in the way sports are covered. Real-time digital twins, immersive AR experiences, and live gaming are bringing sports to a new generation of fans.

“Because we have that digital twin, we can use it in many different ways from broadcast graphics to augmented reality, virtual reality, gaming and more,” says Rosemary Lokhorst, co-founder and CEO of Badass Studios, which has developed a real-time digital twin platform powering XR fan engagement and shared-reality activations for live sports, broadcast, gaming, and immersive events. “XR creates experiences that go beyond what’s possible at the track.”

Badass Studios, which is headquartered south of London, was launched during Covid in September 2020 by Alexander Sangwin-Skillen, who has a background in live event design, and Ben Douglas, who worked on M&As at GE Capital before moving into sports marketing. 

One of their early projects was with Red Bull. They took a Rocket League esports game and mapped it onto Wembley Stadium so it could be played in virtual reality. “Instead of two teams playing in a conventional virtual arena, you could actually see the cars driving around the pitch inside Wembley,” Lokhorst explains.

Lokhorst joined two years later as co-founder, bringing expertise in gamification and narrative storytelling. She is the writer and producer of the multiple award-winning children’s game Shadow’s Edge, which doubles as a therapy tool for emotional health.

“My role was to help Alex and Ben think about the commercial model and what they could do with their technology,” she explains. “Pretty quickly we made a joint decision: instead of continuing as an event production company, we would focus entirely on the technology platform behind these experiences.”

The vision became creating real-time digital twins of live events.

There’s also a commercial angle. Ticket prices to premium live events are becoming extreme. Match tickets for the FIFA World Cup this summer range from $190 (£142) to $790 (£591) for the round of 32, with tickets on the secondary market much higher. A concert ticket for a major act that costs £180 in the UK might cost $1,700 in the US.

“If you can offer a virtual experience for $100, millions of people could participate,” she says. “Our business model reflects that. Our platform is licensed, customers choose the modules they want — AR broadcast, VR experiences, gaming distribution — and then we share [a percentage of] the revenue generated.”

Data capture: E1 racing 

Recently, the studio worked on the UIM E1 Series Championship for electric powerboats. Its process starts by mapping the locations well in advance. For example, the calendar for the 2026 E1 series is known a year ahead, travelling to locations such as Jeddah, Miami, Monaco, Dubrovnik or Lake Como.

“We begin by using Cesium (a plugin to Unreal which is used to render Google’s Photorealistic 3D Tiles), which gives us a base version of the environment,” Lokhorst explains. “Then we interrogate it closely, because sometimes the resolution isn’t high enough. When that happens, we replace important elements ourselves.

“For example, if there are iconic buildings, like Villa d’Este on Lake Como, we rebuild them in Unreal Engine. That gives us a detailed base environment.

“Then we adjust elements we want to manipulate. Water is a good example: if boats are racing, we need realistic spray, wakes and movement. We also add assets we’ve already built, such as sponsor logos and structures like the Ocean Club paddock.”

The boats themselves are also pre-built using actual CAD drawings for accuracy. Around two weeks before the race Badass receives additional data, including the exact racetrack. It uses a tool in its own platform called Track Mapper to plot data points on the map. This automatically generates the racetrack and sponsor placements.

During the race itself, Badass ingests live telemetry from Al Kamel, the official data provider. This includes GPS position, speed, throttle, trim, weather conditions and more. This is mapped onto the virtual environment, effectively creating two parallel worlds: the real race environment, with the actual boat; and a virtual environment, where the same boats, branding and movement exist in real time.
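
As a purely illustrative sketch of the kind of mapping involved (the telemetry fields and the flat-earth projection are assumptions; Al Kamel’s feed format and the Badass platform internals are not public), each GPS sample can be projected into local track coordinates and used to move the corresponding virtual boat:

    # Illustrative only: project live GPS telemetry into local track coordinates.
    # Field names and the equirectangular approximation are assumptions, not the
    # actual Al Kamel feed or Badass Studios pipeline.
    import math
    from dataclasses import dataclass

    @dataclass
    class BoatTelemetry:
        boat_id: str
        lat: float        # degrees
        lon: float        # degrees
        speed_kph: float
        throttle: float   # 0.0 - 1.0

    ORIGIN_LAT, ORIGIN_LON = 45.98, 9.26   # reference point, e.g. on Lake Como
    EARTH_R = 6_371_000                    # metres

    def to_local_metres(sample: BoatTelemetry):
        """Approximate x/y in metres around the track origin (fine over a few km)."""
        x = math.radians(sample.lon - ORIGIN_LON) * EARTH_R * math.cos(math.radians(ORIGIN_LAT))
        y = math.radians(sample.lat - ORIGIN_LAT) * EARTH_R
        return x, y   # feed these positions to the engine to move the virtual boat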

During the broadcast Badass has the ability to move sponsorship branding dynamically.

“For example, if a key moment happens near the final corner, we can reposition sponsor branding there so that it gets maximum visibility during the action,” she says.

“Our platform is essentially the underlying layer. On top of that, customers can choose modules such as AR, VR or mixed-reality broadcast tools.”

Virtual broadcast is a fully digital environment with cameras placed anywhere desired. “Because it’s virtual, we can position cameras overhead, track the action dynamically, or place them in impossible locations,” she adds.

“If broadcasters want the real footage but with digital graphics and sponsor placements, we simply remove the environment and boats from the virtual scene and overlay the AR elements onto the live broadcast.”

There’s also the possibility to run the same environment on simulators at the event itself. “Visitors can sit in a simulator and race on the same track using the same data. They can race against each other, or choose a mode we call Live GP, where they race against the actual drivers on the water.

“Because we know the exact location of the real boats from telemetry, players can literally compete against teams like Tom Brady’s team or Will Smith’s team while the race is happening.”

At events, people can experience the simulator either on a standard screen or with a Meta Quest 3 headset. Badass is also exploring support for Apple Vision Pro for higher-resolution experiences. Additionally, the game can be played through a browser.

“We’ve also published a closed beta on Steam, and later this year we plan to release the game publicly so people can race from home — even racing live against the real pilots,” Lokhorst says.

Badass is also creating digital twin activations for Extreme E, the sustainable race car series, which, like UIM E1, was founded by Alejandro Agag. Features such as VR cockpit views give fans the thrill of sitting in the driver’s seat, while AR overlays provide real-time stats and insights unavailable in person. These technologies let fans access multiple perspectives, behind-the-scenes content, and interactive features that deepen their connection to the race.

The studio has worked with global motocross series WSX and mixed martial arts organisation PFL, repurposing data into live AR overlays on the broadcast or virtual game sims.

“Our long-term vision includes team sports as well,” Lokhorst says. “Imagine watching tennis or soccer or American football in virtual reality. You could enter the stadium virtually, choose your seat, and watch the match from anywhere. You might even stand on the pitch during a penalty.”

She says Badass is also in discussions with a cricket league. “Some sports are simply hard to watch closely in the real world — sailing or boat racing, for example. In a virtual environment you can move anywhere you want.”

Beyond sports

Entertainment is a natural extension. “Imagine attending a Taylor Swift concert virtually, but in real time. You could walk to the front of the stage or dance alongside avatars performing the choreography,” Lokhorst says.

Virtual concerts are not new. Examples include ABBA Voyage or the Travis Scott concert in Fortnite. “But those are usually recorded experiences,” she says. “What we want to create are live shared moments that feel interactive.”

Among the challenges that need to be overcome is the ability to create a social experience within an immersive environment, so the fan experiencing a virtual live event doesn’t feel isolated. The second is delivering a live, real-time and increasingly personalised experience without breaking the bit budget for connectivity.

On the first, Lokhorst says the approach is to rebuild environments such as stadiums and populate them with other avatars.

“Of course, we can’t realistically place 40,000 people in the same virtual instance. Instead, you see the people within your field of view, and the rest are simulated based on data. The goal is to create the feeling of a shared space.”

Managing bitrate is more complex but approaches here include sending only the data that is needed for the person’s field of view.

“The base environment is the same for everyone and that gets distributed once. The only data moving across the network are the dynamic elements — things like where the boats are. This is similar to how online games like Grand Theft Auto operate. The world exists locally, and the server simply updates the dynamic data.”
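
A toy sketch of that split, assuming the static environment already lives on each client and the server only streams a small per-tick message of dynamic state (illustrative only, not the studio’s actual protocol):

    # Illustrative only: the environment is local, the server streams tiny updates
    # for the dynamic entities (boat positions and headings) each tick.
    import json

    def encode_tick(tick, boats):
        """boats: {boat_id: (x, y, heading_deg)} in local track coordinates."""
        msg = {"t": tick,
               "boats": {bid: [round(x, 2), round(y, 2), round(h, 1)]
                         for bid, (x, y, h) in boats.items()}}
        return json.dumps(msg).encode()        # typically a few hundred bytes

    def apply_tick(scene_boats, payload):
        """Update locally rendered boats from one server tick."""
        msg = json.loads(payload)
        for bid, (x, y, heading) in msg["boats"].items():
            scene_boats[bid] = {"x": x, "y": y, "heading": heading}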

In terms of connectivity, Badass utilises a mix of fibre connections, WiFi and 5G, depending on the location. When there’s a broadcast setup already in place, it often piggybacks on their infrastructure.

“5G can be inconsistent depending on the area, so we expect 6G to provide wider and more reliable coverage. At the moment our latency is extremely low — around 0.0018 seconds — which is already sufficient for real-time experiences.”

AI accelerator

The studio has just been accepted into Dell’s AI Accelerator programme which Lokhorst says will help them to optimise the infrastructure. 

“One area we’re exploring with Dell is how AI can reduce the amount of data that needs to be transmitted by predicting certain events,” she explains.

AI is already boosting its capability to build digital twins. LLMs help in development workflows for coding assistance, documentation and standardising processes. AI is also able to collect, clean and organise telemetry data far more efficiently than before.

“We also use synthetic data,” she says. “For example, if we want to pitch a race to a new city like Hong Kong, we can simulate an entire race environment with generated data and show officials what it would look like.

“AI also fills data gaps. If a camera feed drops during a race, we can temporarily switch to the virtual broadcast and use predictive data to maintain the continuity. For certain environment elements, generative AI can automatically fill in buildings along a coastline. Beyond that, AI helps with predictive modelling — for example predicting optimal race lines, which can be used for training simulations.”

Similar applications were promised several years ago during the first wave of metaverse hype and the arrival of 5G.

“A lot has changed technologically since then,” she says. “Compute power has increased, rendering engines like Unreal have improved dramatically, and high-resolution environments are easier to transmit over the internet.

“AI has accelerated development. Where building a game environment or a city like Miami once took about a year, we can now do it in two to six weeks. Today it’s becoming more industrial and practical. Sectors like military training and healthcare simulations have helped improve the underlying technology and infrastructure.”

True mixed reality experiences

So much so that Lokhorst believes that traditional sport and virtual worlds will increasingly converge.

“I think we’re moving towards true mixed reality experiences. Imagine being able to walk onto the pitch alongside Lamine Yamal when he steps up during a penalty shoot-out. Or standing next to a Formula One pit stop in the virtual world. Or practising keepy-uppy with David Beckham. Or paying to have the Ted Lasso seat if I want.

“These would all be avatar experiences, but younger audiences are comfortable with that. It could also bring families together. Parents might watch the broadcast normally, while children interact with the same event through the virtual layer.

“Ultimately, it’s about creating more shared moments.”