Tuesday, 15 July 2025

How Sky Sports delivers the Lions Tour down under

IBC

On 19 July, after six warm-up matches, the British & Irish Lions finally face the Wallabies in a three-Test series running until 2 August. For the first time Sky Sports is broadcasting the Tests in UHD, but this is almost an incidental upgrade compared to the significant remote production it is managing from Down Under.

“The 1997 Lions Tour was essentially one satellite World Feed coming back in. Now it’s 30 feeds,” says Sky Sports Senior Director Sam Foskett, who has worked on all eight Lions Tours since 1997. “The difference between then and now is huge. 2013 was the last time we sent our full crew to Australia for the Test matches, probably 25 people. Now it’s just six travelling from the UK.”

The last two Lions tours have been a joint venture between the host Union and the Lions. “Up until 2017, the host Union was in charge of everything and the host broadcaster wasn't but now it's very much a commercial and broadcast partnership and a real collaboration,” explains Foskett. “That's given us the opportunity to get all the camera angles we need to be able to tell the stories in depth in a way we never had before.”

The 2017 Lions Tour from New Zealand was the first sports event that Sky produced remotely. “That was ambitious. We'd done some remote for the US Open tennis but there'd never been a remote from the Southern hemisphere. It was also proof of concept for Formula One.”

At that point, Sky’s F1 team of 40 crew were flying with the rest of the F1 circus around the world. Following the success of the Lions’ PoC from NZ, F1 went remote as well.

The host broadcaster is Australian SVoD Stan Sport which is owned by media group Nine Entertainment. Foskett emphasises the unusually close collaboration with Stan on the production.

“We’re sharing on screen pundits. Their guys come across and do content for us and vice versa. We're sharing filming of training sessions and features. They've given us as many camera feeds as we want. Stan’s vision mixer used to work for us in the UK.”

Foskett worked with Stan’s host director Hamish France on the 2015 Rugby World Cup.  “We’re very comfortable with how we can help one another,” he says. “It means I can speak to him during the game should I need to. For instance, during warm-ups he'll tell me exactly which camera is doing what and that means I can use his cameras to augment our presentation. That direct line of communication is very unusual.”

Since the World Feed is principally for an Australian audience Sky’s brief is to augment it to tell the story for British and Irish fans. “The key to that is that we take a clean feed to which we add our graphics.”

Sky’s graphics (designed by an Australian First Nations artist) use match data supplied by Oval Insights and managed by AE Live. “Stan press one button and that does a Stan wipe on their feed and a simultaneous Sky-branded wipe on ours. That means we have our creative all the way through so it consistently looks like our programme.”

Sky takes in an exact copy of the Sky World Feed as a backup and, as further redundancy, takes in Stan’s World Feed as well. “That’s got all of their graphics on, so should we have any trouble with our clean Sky feed we can jump to that, but it’s not ideal because we would be using their graphics and we’re trying to keep the Sky look across everything.”

Among the 30 feeds that Sky is bringing back from Australia to Osterley are a Spider-Cam for every match, super slow-motion EVS feeds for replay angles, coverage from the changing rooms pre-game and at half-time, and coach-cam during the game. There is no RefCam for this event, though, since World Rugby declined its use.

For the first time the Tests are being broadcast in UHD (SDR) and will feature Sky unilateral cameras, including its own RF cam permitted onto the pitch during warm-ups, post-match and (potentially) half-time. Stan Sport will share additional RF cine cams with depth-of-field lenses/sensors for post-try celebrations.

“The great thing is that Rugby Australia and all the teams seem quite relaxed about cameras on the pitch during warm-ups or during tries,” says Sky’s production manager Rebecca Lea.

Sky has a couple of ENG crews on tour, one focussed on Sky Sports News and the other shooting feature material. “On matchday, one of those is located behind the goal post the Lions are attacking. The other has a roaming brief and can go anywhere to get fans, injured players on the bench, or live action.

“We’re also taking a lot of features from Stan Sport and we’re reciprocating with cameras, so if we get a definitive angle on something, Stan will have a feed of that as well. We’re essentially collaborating on everything we can.”

During the warm-up matches all Sky’s ISO camera feeds were returned via internet points around each stadium to LiveU800s positioned in the OB compound. The two ENG cameras each had their own LiveU600 attached to them. For the Tests the level of facility was ramped up to include a dedicated NEP Australia OB truck at Melbourne, Brisbane and Sydney.

“To make remote production workable city to city we’ve shipped Sky’s remote production flypack into Australia,” Lea explains. “With this kit plugged in, it provides a consistent remote production experience at each Test match venue.”

The Sky flypack contains: Net Insight Nimbras to build the data trunk over the Telstra network to the UK, NTT encoding and decoding devices, Riedel talkback, plus a Calrec RP1 for remote audio mixing.

This is kit normally put to work on Sky Sports’ remote coverage, and it will be shipped to New York for the US Open immediately following the Lions Tour.

“For Sam in the gallery [in Osterley] it will feel like a regular remote production just like we do all the time and we're not relying on internet connectivity with the LiveUs,” Lea says. Sky HQ is also where highlights and show closures and teasers are postproduced.

All of this is achieved with a minimal crew of six from the UK (plus seven talent) and a couple more local staff. The Sky UK crew includes production manager, sound guarantee, camera-op, producer, and data engineer.

“That's where the big Sky tech jigsaw is really clever. The distances in Australia mean logistics are challenging but the remote production kit means we can produce the show just as efficiently from the UK.”

Indeed, Sky Sports is able to throw more creative bells and whistles at the Lions Tour coverage because it can host presentation from the flagship studio used for Monday Night Football. The studio features a Virtual Window in the central LED screen, allowing presenters to stand in front of the screen and creating an illusion of depth. Former Lions captain Sam Warburton will lead analysis in the studio, interacting with the Piero graphics system.


Monday, 14 July 2025

The United Arab Emirates – Ambition without bounds

AV Magazine

At risk of being overshadowed by its noisy neighbour, the Emirates are a distinct, exciting and growing market

In the decade up to 2018 rapid high-tech modernising in the Middle East was focused on Dubai. That spotlight shifted to Saudi Arabia and its Vision 2030 cultural renaissance, leaving the Emirates in the shade.

“There’s lots going on in KSA but it’s early days, and a lot of the focus is on future projects for which they still have to build the infrastructure and the buildings themselves before the AV is installed,” says Steve Simpson, account and project manager, Datapath. “Meanwhile, the UAE isn’t standing still. There are opportunities and projects happening, especially in cyber security, education and utilities.”

Johnny Hickman, sales account manager at Matrox Video agrees. “The UAE pro AV market is thriving, fuelled by investments in infrastructure, tourism, and technology. Demand for advanced AV and AVoIP is strong, especially in videowalls for control rooms, entertainment, and signage.”

KSA’s Vision 2030 poses competition, but the UAE’s mature infrastructure and early tech adoption (e.g. Expo 2020) give it a leading edge. Both markets are set to coexist and complement each other.

Distinct strategic markets
Netgear maintains “distinct strategic focuses” for both markets. Sales chief, Annamalai Ar reports: “The UAE is characterised by an expectation for cutting-edge technology implementation, making regular system upgrades virtually inevitable for businesses seeking to remain competitive.”

He cites rising demand at stadiums and racing tracks for specialised AV network switches of the type that Netgear sells. “The primary drivers of AV investment over the next five years will be live events, stadium developments, government office modernisation, and new infrastructure initiatives,” he adds. “These sectors represent the cornerstone of future pro AV growth in the region.”

Brad Maiden, regional General Manager, d&b audiotechnik calls AV business culture “dynamic, supportive and above all, a lot of fun. Whilst it is growing and robust, it is also a close-knit community where competition is fierce, but (mostly) with a good nature.”

“The majority of project processes are tendered and the level of documentation required to win projects is of a really high standard. The quality of your brand, work and character really matter in this market.”

He highlights the Coca-Cola Arena, Dubai, which is installing the full range of d&b’s signature SL-Series “making it one of the most sonically sophisticated venues in the world.”

Justin Joy, senior sales manager, Peerless-AV judges the local pro AV market “positive” on the back of “an established, globally connected business hub with prominent investment opportunities.

“Multinationals are opening more local offices there. As more major rental companies set up bases in Dubai, this is having a big impact on UAE’s live events sector which is seeing exponential growth. We continue to see increased demand for dvLED mounting systems across multiple sectors, including corporate, hospitality, finance and auditoriums.”

He also observes that the UAE is slightly overshadowed by the KSA’s ‘Giga Projects’ which major integrators are focusing on. “Having said that, UAE remains the AV design hub for both UAE and KSA as the top consultants are Dubai office based,” he says.

UAE's strategic position as a tourism and business hub
The UAE is a “dynamic and forward-thinking environment for pro AV,” according to Sergio Molho, director of business development at WSDG. “Its openness to foreign investment and strategic position as a tourism and business hub give it a unique advantage. While Saudi Arabia is seeing massive, rapid growth, the UAE maintains its own distinct appeal, driven by its stability and international connectivity.”

Molho further judges Emirati AV culture to be highly sophisticated and aspirational. “There’s a strong push to showcase the latest, boldest, and often most luxurious technology, sometimes prioritising visual impact over functionality. It is a market that thrives on innovation and spectacle, with a continual appetite for differentiation. This cultural mindset drives demand for standout installations and future-forward systems.”

Also calling the pro AV landscape “incredibly vibrant with a wealth of opportunities” is Rebecca Knight at Disguise highlighting the ambition across live entertainment, government and enterprise which is driving significant investment in advanced media server technologies.

“There’s a real appreciation for quality and reliability,” Knight adds. “AV culture in the UAE has historically been characterised by a pursuit of excellence and a willingness to embrace technology to deliver world-class experiences. The scale and ambition of these projects is driving significant demand not just for pro AV technology, but expert technical and creative services to help bring these visions to life.”

Simpson has lived and worked in the region since 2003, working exclusively in AV. He says the market has evolved a lot in that time. “One thing is for sure - they do not like to pay for services or recurring licence fees, so if this is your business model it is a tough sell.”

The fact that the company’s Aetria control room product is made, tested and approved in the UK has helped win high-end security projects. “This is one aspect of change which definitely helped us get into certain projects, and was never an issue a few years ago.”

Simpson advises care in posting to social media in the region “or you could be in trouble subject to the laws that govern this.” Consequently, billboards and advertising are still big business with many huge DooH displays, around malls, on the sides of buildings and main roads. “Lighting or projection mapping on large monuments also plays a big part of events with large sound staging and amazing content,” he notes.

DooH an expanding category
Joy also finds DooH an expanding category with advertisers “embracing new technology advances and recognising the importance of targeting communities and audiences in new urban areas.

“AV projects in UAE are highly organised and key consultants are more involved than they ever have been in the control of design and eventual product selection in a project.”

Peerless recently completed a flat-to-wall dvLED video wall project with an international bank in UAE. It’s part of increased investment in the banking sector to modernise branches and enhance competitive edge while improving internal and customer communication.

According to AVIXA, the wider MENA market is expected to generate $13.5 billion in pro AV revenues this year, growing at 5.4 per cent to 2029. While giga-projects for tourism and sports make headlines, the top two solution areas are conferencing/collaboration and security/surveillance, representing 16 per cent and 15.2 per cent of revenues respectively. Corporate and media/entertainment are the top vertical markets, at 20.8 per cent and 17.1 per cent of revenues.

“Control rooms remain a strong vertical in the UAE, with AVoIP streaming now a standard requirement rather than a future aspiration,” says Hickman. “This shift isn’t limited to mission-critical environments. In the corporate sector, organisations are increasingly adopting AVoIP to support a wide range of applications, such as 1G or 10G content distribution for meeting rooms, lobbies, and breakout areas.”

Broadcasting abilities
Separately, in-house media studios are demanding the ability to broadcast uncompressed content, requiring higher-bandwidth 25G-capable solutions. “Open standards like IPMX and SMPTE ST 2110 are gaining traction across these applications, offering the interoperability, scalability, and performance needed to support multiple concurrent workflows in a unified AVoIP ecosystem,” says Hickman.

“Over the past few years, the region has seen a significant rise in technical maturity, with more integrators and value-added distributors (VADs) gaining deep expertise in technologies like AVoIP.

“The ability to demonstrate products, providing prospective customers with a hands-on experience, is very important, and many AV VADs and integrators have invested heavily in demo facilities and experience centres to meet this need.

“We see growing interest in sustainable solutions, smart city integrations, and immersive, interactive deployments driven by AI, IoT, and automation.”

Growth is expected to continue, driven by scalable IP-based workflows, reduced hardware needs, and open standards like IPMX, confirms Hickman.

Hospitality growth
With an extensive coastline and a great climate for most of the year, the UAE is seeing a large investment in beach bars and clubs, restaurants, hotels and pop-up leisure events.

“Leisure is a large part of the offering in the UAE and they do it so well here,” says Maiden. “Service is key. In Dubai in particular, but also in Abu Dhabi and Ras Al Khaimah (RAK), you will receive some of the best service and experiences anywhere in the world.”

The Wynn Integrated Resort is driving development in RAK. Nearby Ajman is increasing its entertainment and leisure sectors. Saadiyat Island (Abu Dhabi) is expanding with museums and galleries:

“It will be a global cultural destination when it’s finished,” says Maiden.

According to Simpson, Abu Dhabi hosts the majority of the large ministries and government projects. Dubai has always been more focused on tourism but also has a thriving financial centre. “Sharjah, Al Ain, Ajman and other emirates only have a few projects - mainly in government or utilities - with the exception of Ras Al Khaimah, where they have a good number of high-end hotels now.”

Dubai remains “the powerhouse for business with a strong entrepreneurial climate and government initiatives supporting development of young talent,” reports Joy.

Hickman notes that Dubai leads in entertainment, smart city projects, and events.

“Abu Dhabi focuses on cultural, education, and government sectors. Sharjah and Al Ain prioritise heritage and cultural tourism.”

Gateway growth at Al Maktoum International
The ambitious $35 billion expansion of Al Maktoum International Airport is set to create the world’s largest airport, handling up to 260 million passengers annually. When finished around 2034 it will feature five parallel runways and 400 aircraft gates, covering approximately 27 square miles, five times the size of the current Dubai International Airport (DXB), which has consistently topped the list of the world’s busiest international airports.

WSDG is leading the acoustics and systems engineering through its Berlin office working via London-based architect, Leslie Jones Architecture.

“The scale and vision of this project are unparalleled, aiming not only to redefine air travel in the region but also to establish a new global benchmark for airport design and functionality,” says Molho.

Sharjah has also begun a $327 million expansion that will increase the airport’s capacity to 20 million passengers a year.


Friday, 11 July 2025

Shrinking camera gap

Definition

At a recent presentation to the Hollywood Professional Association, postproduction entrepreneur Michael Cioni demonstrated just how far the gap between high-end cine cameras and consumer imaging has shrunk.

He made a side-by-side comparison of footage from an ARRI Alexa Mini, a Fuji GFX100 and an iPhone 16. All were fitted with the same Nikon prime lenses and shot under the same lighting conditions.

“You can see in these results that a camera that costs closer to $100,000 [Alexa] and $1,000 [iPhone] are not that dramatically different. This is visual proof that accessibility has got so narrow that we need to think twice about it.”

Cioni was urging studios and streamers to relax their thinking around the types of equipment that should be used to create cinema and TV content; otherwise, creators and YouTubers will beat them using technology that costs far less yet is virtually indistinguishable in quality.

There are signs that this is happening but it is being led by auteur directors and DPs prepared to think outside the box. They are not necessarily choosing inexpensive cameras to fit lower budgets but for greater flexibility in storytelling. What these cameras tend to have in common is their smaller size, making them perfect for run-and-gun filmmaking, while also supporting a wide range of accessories for more complex productions.

RED Komodo

The RED Komodo-X has found a solid niche as a B-cam on projects including Furiosa: A Mad Max Saga, where Simon Duggan ASC, ACS deployed a five-camera Komodo rig for background plates and another set mounted on sliders attached to the undercarriage of the War Rig vehicle.

The camera contains the same image science as stablemates like the V-Raptor, making for consistency in cutting. At less than $3,000 each, multiple Komodos can be rigged on set, and if one gets trashed it won’t break the budget.

“F1 cars are so low to the floor there was limited room for us to grip so the Komodo’s weight and size was essential for us to rig multiple bodies as crash cams,” explained Azul Serra, ABC about the action sequences for Netflix series Senna.

The box-format V-Raptor body itself weighs just over 4lbs and can be mounted on a drone for first-person view (FPV) sequences. The Helicopter Girls have pioneered this using a gimballed FPV rig on projects including a woodland chase sequence on Wicked: For Good for second unit DP Sam Renton; Stuntnuts: The Movie for Ben Davis BSC; and an FPV shot of racehorses on Downton Abbey 3 for Ben Smithard BSC.

“What’s remarkable about that shot is that it doesn’t look like it could possibly be from an FPV drone,” says Helicopter Girls co-founder Emma Boswell. “It looks like it should be from a tracking vehicle or a wire-cam. It’s an astonishing sequence.”

Director Kazik Radwanski and DP Nikolay Michaylov made Komodo their A camera for indie romance Matt and Mara, in part to meet 4K deliverables.

“Given it’s the size of a Rubik’s Cube, the Komodo complemented our shooting style, which is very documentary, run-and-gun and very self-sufficient,” says Michaylov. “But it’s also a camera that can be transformed into more of a studio build, which we required for some sequences in the film.”

Sony Venice

Similarly, the Sony Venice has been used in its Rialto extension mode to cram multiple cameras into tight spaces. The sensor can record full 6K in large format, suitable for IMAX and way beyond the quality that action cams like the GoPro could achieve.

Claudio Miranda ASC fitted six Venices into the cockpit of an F-18, and another four externally, for the aerial photography of Top Gun: Maverick. Multiple Venices were placed aboard the sets of Apple TV+ saga Masters of the Air. Erik Messerschmidt ASC rigged nine Venices on fast-driving cars for Michael Mann’s 2023 biopic Ferrari.

Sony just launched an even smaller extension system which is the size of a smartphone. The system can also be used to shoot stereoscopically. When two units are placed side by side, the distance between the two sensors is only 64mm, mimicking the average distance between our own pupils.

“You can operate it like a medium format stills camera and shoot from the waist if needed,” says Kate Reid BSC who shot a test film for Sony directed by the Lynch Brothers inside a replica 2x2x2m space capsule.

One-ers

The current vogue for movies and episodes shot in single takes is possible because of the combined light weight and image quality of camera tech.

Philip Lozano AFC chose the V-Raptor because it was half the weight of an Alexa to shoot the 88-minute single-shot indie horror MadS. Another factor was the ability to record 6K RAW for at least 90 minutes without changing media.

“I didn’t want to start shaking involuntarily because my muscles were tired,” Lozano says. “Nor did I want to photograph with a lower spec camera. The whole package including camera, rig, battery, lens was only 12 kilos but you still need core strength to be able to control the wides and the tight shots.”

A specially designed rig gave Lozano the ability to stabilize the horizontal movement. “The idea was that I could move the cameras as if I had a Steadicam or dolly as well as handheld.”

The DJI Ronin 4D has built-in stabilization, a compact design, and full-frame image quality that enables filmmakers to capture dynamic shots. It’s also relatively inexpensive, costing around £6,000.

The action scenes in Alex Garland’s Civil War were shot on the Ronin 4D, notably the climactic battle for the White House. It was his experience working with camera operator Dave Thompson that led Garland to shoot the visceral action of Iraq War docu-drama Warfare entirely on the system, with Thompson as DP.

DP Matt Lewis also used the Ronin 4D to shoot the acclaimed Netflix series Adolescence. “Any slightly larger gimbal would have been too limited,” Lewis says. “It would have been heavier and had to have been connected to a single operator the whole time. We couldn’t have done handoffs or anything like that. So much of what ended up in the show was based on being able to be nimble.”

Sports action

Sports broadcasters are always demanding smaller imagers with high dynamic range to get closer to the action on field or track. Two of the latest innovations in this regard debuted in April.

Proton claims to have made the world’s smallest slow-motion camera, measuring 36x36x90mm. The Proton HFR captures 1080p with 12-bit dynamic range at frame rates up to 240fps. It also has a global shutter, which eliminates motion artefacts. It is available with a Flex option which allows the camera head to be separated from its processing unit so it can be used in even smaller spaces.

Germany’s Dream Chip Technologies makes some of the world’s smallest cameras, in use during the Super Bowl, on cars covering Daytona and Formula E racing, and for soccer ref-cams. Measuring just 30x30x31mm, the AtomOne Mini features dynamic range and colour reproduction that match standard system cameras.

“Broadcasters don’t want to use a DJI or GoPro because they can't colour match it to cameras like the Sony HDC-3200,” says Dream Chip’s Christian Kuehn.

The new AtomTwo has similar HDR qualities and captures 1080p at 60fps with a global shutter. “The global shutter means you don’t get the visible distortions and disturbances associated with a rolling shutter. Five years ago it was impossible to make such a tiny global shutter camera, but sensor manufacturing has made incredible advances. So many cameras are coming out now with a global shutter that I think the rolling shutter will soon be obsolete.”

Thursday, 10 July 2025

Video replicants and the drive for ethical LLMs

IBC

Image generators such as Veo 3 can now convincingly simulate human emotions, interactions and voice but the speed of development leaves production companies crying out for ethical LLMs.

It is an irony of the AI revolution that a photoreal intergalactic space battle - among the hardest and most expensive things to shoot in real life - is among the easiest to reproduce in Generative AI, but simulating a basic dialogue scene over several minutes between two people? Forget it. For now.

“This is a turbulent era,” says digital artist László Gaál. “One of the worst aspects of AI today is that the pace of technology change is so fast no one can adopt it. Everything that I tell a client on day one will be not true in three months’ time.”

The Hungarian gave up his career as a colourist to concentrate full time on producing purely AI generated content. What he has produced, including a mock Volvo commercial and news report from a fake car show, has made him a talent in demand by brands including L’Oreal to guide them in AI experiments.

“I really enjoy experimenting with different kinds of workflows, for example, using traditional analogue photos and turning them into something else,” he says.

He claims to be the first to transfer an AI video back to film stock just to see what it would be like to bring together the two ends of the filmmaking spectrum: analogue celluloid film and AI generated footage.

He was one of the first to try Veo 3, churning out the photoreal 70’ AI-generated auto show video in less than 24 hours after Google’s release.

“GenAI is extremely good at delivering almost everything in a very short amount of time but there are technical limitations,” he says. “10-20 per cent of the time there are inaccuracies, especially with product shapes, logos and texts.”

For example, he says, a completely different workflow is required if you just want a random car driving in your scene than if you want a specific brand’s car driving in that scene.

The most interesting development for Gaál is Veo 3’s audio generation. “Previously, Gen-AI video of people with dialogue didn’t just look fake,” he says, “it looked scary to have incredibly realistic people with very strange mouth movements. The progress on lip sync in Veo 3 is huge. It’s the first model that generates realistic talking. Equally as important is the ability to generate any kind of sound effect. For example, if you want to create a nice exterior scene of the British countryside, Veo 3 will generate audio to match.”

Other AI creators agree. Hashem Al-Ghaili tested whether Veo 3 could handle macro videography and found that not only did it generate astonishingly detailed ‘footage’ of insects it did so with sound effects for each shot. “It’s going to change documentary filmmaking forever,” he said.

Nonetheless, you can’t supply your own audio to a Veo 3 generated animation. “In a text prompt you can describe what your character should say and then Veo 3 will make it say that but [the software] won’t allow you to upload an image, for example of yourself, and make it talk.”

Another limit is that each generation is only eight seconds long. “That’s why so many videos you see made with Veo 3 are of random characters saying one sentence and then we never see them again. Currently, there’s no solution for one character to speak for longer than eight seconds. You cannot reference previous generations.”

A new feature in Veo called ‘Ingredients’ goes some way to addressing this. It allows users to upload specific elements (such as photos of a character, object or vehicle) which can be maintained in successive generations. This feature improves customisation while maintaining visual consistency, but there is - as yet - no audio capability to accompany it.

“There’s a lot of interest from creative agencies but it’s mostly stemming from their lack of knowledge. Currently half of the job is figuring out the workflows so you can make something happen that no one else has thought [of],” he says.

He thinks agencies are trying to apply their existing conception of ad production to AI without understanding what AI can actually achieve.

“They’ve seen a five second example of an AI video on Instagram and say they want the same, but with our product in 10 different locations and with a consistent character throughout. I am getting some briefs like that where they just want to force someone to do their ideas their way. I don't see any genuine interest in what is possible.”

He says other companies are experimenting with full-time ‘AI Engineers’ who are creating tools and workflows: “Those are closer to what I believe.”

Human in the loop

The world’s biggest VFX companies are adding AI into their pipelines. Cinesite has launched an internal technology exploration unit called TechX; DNEG’s AI subsidiary, Brahma, recently acquired Metaphysic; and Canada’s RodeoFX is exploring AI principally in its advertising division, where projects tend to be smaller and AI tools can make artists more agile.

“We have a group using AI to go faster, to explore more things we couldn’t do before and to make more tests with clients,” explained Marie Amiot, VP Advertising Services and Original Content at RodeoFX, speaking at the Annecy International Animation Film Festival. “Our artists who use AI are happy with the result but it is challenging and also raises ethical and environmental questions.”

LA-based AI production company Asteria is also using AI to speed render times and make more room for creative experimentation. “We’re not generating in the style of Studio Ghibli or anyone else,” insisted Senior Director Arvid Tappert, speaking at Annecy in reference to the viral craze which saw users of OpenAI’s GPT-4o reproduce characters in the revered Japanese animator’s style. “That’s not what we’re about. We want to make non-derivative work from our own ideas and we want to control the AI.”

The dilemma for independent video production and postproduction companies is that the pressure to produce more content at far lower cost may be too hard to resist if an AI tool can get to the same result more efficiently. That risks redundancies as well as copyright infringement.

“We are doing a lot of premium productions that are expensive to produce and that limits how many original projects you can make,” says Barbara Stephen, President of Australia’s Flying Bark Productions. “If we can use emerging tech to help talent tell more and different stories it is an opportunity for us. That said, our business is based on the value of IP and copyright. We have no tolerance for infringement of the rights of artists.”

Rob Hoffman, Head of Industry Strategy at computer maker Lenovo, laid out the macro pressures impacting film and TV producers: “You are all being asked to deliver a larger volume of higher fidelity content than ever before against timelines and budgets that aren’t scaling with what you’re being asked to do. There is a fundamental gap between your client’s expectations and your ability to be able to deliver. Pipelines and production tools are just inherently complex and they’re getting more complex. Throw in concerns about technology taking over the creative process or taking away the artists themselves and this has rippling effects across the industry.”

The burden of trying to figure out all of these challenges shouldn't fall on the shoulders of individual creators or artisan studios, he said.

“It's the responsibility of those that are creating the tools and technology being used for film, television and game development to be stewards of the industry. We all need to be doing a hell of a lot more than we are today.”

Emily Hsu, Senior Producer at Epic Games sympathised: “Any decent producer wants to empower creators with the best tools to remove as many roadblocks as possible but if you can’t hire more headcount then the only lever you can pull is more tech and better workflows.”

She believes the market will decide. “Audiences should be given more credit for distinguishing AI slop from quality creator driven content. Slop means it is made without intention or skill and without the eyes of a creator.”

Not everyone is so optimistic that humans will remain ‘in the loop’.

Tim Miller, founder of Blur Studio said at Annecy, “I don’t feel it’s safe to say that humans have to be in the process. There’s a lot of slop, true, but I’ve also seen AI tools do things that I’d not thought possible. If any of us think it won’t continue or accelerate then they are running towards a cliff with hands over their eyes.”

He added, “AI doesn't care whether artists were involved or not. If a group of artists has plagiarised you then the industry could shame them into not doing it again, but AI doesn't give a shit. It will use whatever it needs to accomplish the task. That's scary but also the interesting part.”

Ethical image generators

To move forward, the creative industries would like to use AI models that have been trained either on internal (bespoke/owned) content or on licensed data, rather than on data scraped off the web.

Nicolas Dufresne, independent director and developer, said, “Animators want regulation. Those afraid of AI need regulation. Everyone wants regulation except the main developers. We need to teach how AI works and show what the consequences are of bad AI but the problem is the opacity of the main AI developers.”

Google, OpenAI, Meta, Runway (co-developer of the original Stable Diffusion model) and Midjourney are accused of a lack of transparency over the data on which their image generators were trained. Several are the subject of ongoing lawsuits.

Alternatives are emerging. Asteria has launched an AI imaging model called Marey, built with Moonvalley, where, according to Tappert, “everything is fully licenced and paid for. It offers customisation options like fine-grained camera and motion controls. OpenAI and Google say it can’t be done and that they need to scrape [the internet] but now we have something positive to show that can use your own material and everyone gets paid, which is the way it should be.”

Spanish stock library Freepik has partnered with Fal.ai to launch F Lite, another open-source image model trained on licensed data. It is built on a dataset of 80 million images, far smaller than the billion-plus images typically used to train large image generators. Nonetheless Freepik claims it is the largest publicly available text-to-image model “trained entirely on legally sound content.”

There is evidence that the AI giants are mining specialised data sets rather than trawling the internet, even paying companies for those assets.

“Instead of seeking volume, the new trend is that these companies are asking for specially curated data, which should at least be better for the environment because less processing power will be needed,” notes Dufresne.

Unleash the genAI

In January 2023 Gaál set up an Instagram account under a pseudonym showcasing AI-generated images he had made of cars, posting them as if shot for real on a stills film camera. He says he stopped posting after people began to mistake the AI images for real photographs.

“My idea was to follow the AI journey with AI by creating an account of AI generated photographs made by an invented photographer, Rick Deckard. After a few months no-one could really tell which one was AI generated and which one wasn’t. The AI became so real that people mistook the images for the real thing.”

The nod to Blade Runner hints at the idea that one day, machines might not even know they are machines. “How will it feel for a genAI machine to create the images and videos we prompt?” says Gaál. “Will it fear being turned off?”

Al-Ghaili is similarly awed, even if his tests are tongue in cheek. “Imagine if AI characters became aware they were living in a simulation,” he posed in a Veo 3 generated video.

In a world where AI-generated characters are being introduced to video games, this is not as far-fetched as it sounds.

Wednesday, 9 July 2025

Telestream vs Encoding.com lawsuit goes public with reputations at stake

Streamingmedia

article here

Telestream, the veteran digital video software and workflow tech provider, is fighting a lawsuit brought by two former employees which may have salutary lessons for company owners planning a sale to private equity.

The suit, brought against Telestream by Greggory Heil and Jeffrey Malkin, co-founders of Encoding.com, claims unpaid acquisition consideration, wrongful termination, fraudulent transfer, and successor liability.

Ultimately, this case (No. CU0001096, Superior Court of California, Nevada County) may be settled in court but the jury of public opinion is already weighing its verdict – and it doesn’t look good for the reputation of the private equity-led management at Telestream.

First let’s get some facts on the table.

CEO Dan Castles co-founded Telestream (with two others) in 1998. He oversaw a private equity sale to Thoma Bravo in 2012 and grew the company with a series of acquisitions, nine of which were fuelled by the investment of Genstar Capital which has owned the company since 2015.

Heil and Malkin founded Encoding.com in 2008 before selling to Telestream in 2022. It was Telestream’s 13th and last acquisition, financed with Fortress Investment Group as lead lender. The two principals took senior leadership positions across Telestream’s cloud initiatives. Transaction details were undisclosed.

In 2021, Genstar moved to sell Telestream but reportedly failed to get beyond the second round of bidding.

In early 2023, having guided five acquisitions in quick succession since 2020, Castles stepped back to be replaced as CEO by Rhonda Bassett-Spiers. She previously ran iTradeNetwork, a company in the perishable food industry.

Bassett-Spiers was herself replaced in fall 2024, with Castles rehired as CEO.

On June 7 2025, Telestream enacted a corporate restructure, moving assets into a new entity called Telestream 2 LLC.

Genstar Capital is a San Francisco-based PE firm with $50 billion of assets currently under management.

Headquartered in New York, Fortress is an investment firm with $51 billion of assets currently under management. It is owned by Abu Dhabi state owned bank Mubadala. Its website states, ‘We cultivate lasting trust by developing relationships grounded in honesty, integrity, and time-tested results.’

Interview with Greggory Heil

Streamingmedia spoke with Heil and the following is his version of events beginning with the 2022 buyout.

“Dan presented himself as just a great guy,” Heil says. “I knew all the founders of the 12 other companies that were acquired as part of the Telestream family and they were respected in the industry. There was a kind of old school anti-Silicon Valley startup mentality that was attractive to me because I cared a lot about what we had built with Encoding.com and we wanted to make sure it was in good hands.”

Shortly after the acquisition Bassett-Spiers was brought in by Genstar. “That was a bull in a china shop operation,” Heil says. “They fired hundreds of people, tried to redo the branding and product and package it up for sale.”

“Rhonda brought her team with her who were very bright and SaaS savvy but knew nothing about the video tech industry. The company struggled. The lenders to Telestream [led by Fortress] stepped in very aggressively, got rid of Rhonda, and foreclosed on the company. They converted the debt they held into equity and took over the company.”

This is the point at which things turned south for Heil and Malkin.

“They fired us ‘for cause’ to avoid having to pay our employment for the rest of our earnout. It's a nasty move, but it's a move.”

Heil and Malkin challenged the action. Their initial filing in 2023 alleges that the plaintiffs were terminated ‘for cause’ without documentation to avoid payment obligations.

“Firing someone ‘for cause’ in America is extremely difficult,” Heil says. “You literally have to drive a car when drunk into the side of the building or do something equally egregious and you have to have it documented. But that wasn't the case at all. There was no warning. It was purely financially driven. They thought, we can save some money and tie these guys up in court until we sell the thing.”

What happened next, according to Heil, is that the company withheld roughly 30% of the purchase price agreed for Encoding.com.

“They said we didn't meet some borrowing condition. Or ‘go screw yourselves’ basically,” he says. “So now we have to fight that one too.”

Their suit was amended to allege that Telestream withheld final earnout payment owed to the founders due to borrowing conditions.

Then in June the company restructured in a way that left the Encoding.com founders fuming and willing to take the battle public.

“The messaging to employees and to customers was that this is just a routine asset allocation. Just a little restructuring. Nothing to see here, folks. In reality, they took everything of value out of Telestream 1 and left just the Encoding.com debt and the employee debt. So, really, it’s just Jeff and I and the Encoding.com shareholders left in Telestream 1 with just a little bit of their own debt as cover.”

Encoding.com shareholders alongside Malkin and Heil include angel investors as well as Harmonic and Rackspace Technology.

“This is a classic case of investment bankers, PE firms and their lawyers getting together and scheming to save money via a legal path without ever stepping back and thinking, ‘How will customers feel about this? How will our employees feel about this? What’s really the narrative here?’”

The case is about to enter discovery, where Heil hopes to find some answers.

“It appears to be a very targeted asset transfer where they kept what they wanted and isolated us strategically and on purpose.”

His post about the action on LinkedIn – his second in four years (the first of which was for winning a technical Emmy) – has been viewed over 20,000 times.

“Customers have called me. Employees have called me. Some employees have quit because they feel misled. I’ve kept quiet in large part because I respect the people who work there but this is so egregious I think it's important for people to know.”

“Maybe we fight this successfully, maybe not. We’re going up against Goliath here. Fortress knew this lawsuit was coming and we're just these little guys but it's a cautionary tale for other founders that are going to sell to Private Equity that might have a similar deal structure in place. Maybe you think twice about what's promised to you contractually when [PE] have a whole hidden playbook to get around those contractual obligations.”

The suit reads: Plaintiffs allege the transfer was pre-planned with Telestream’s lenders to isolate those liabilities. The structure mirrors a fraudulent transfer under California law, designed to hinder recovery by former executives.

“You make your decision as a founder based on what you see,” Heil says. “You spend all this time and money with legal negotiating a contract. You base your assessment on whether or not to sell on what you think you’re going to get contractually. If they have a whole playbook that makes you spend hundreds of thousands of dollars in court just to get all of what is due, then I hope this is a cautionary tale.”

He says he and Malkin have to prove in court that Telestream 2 is a continuation of Telestream 1 because their original suit was against Telestream 1.

“A lot of times when companies do this, they set up a new brand, a new management team and a go-to-market strategy and product portfolio. They could have done that but instead they tried to do this in the quiet of the night and say to the public that everything's rock solid. Telestream employees and customers had to sign a new contract with Telestream 2 but it's the same business. Then in a courtroom, they are basically arguing that it's not the same business.

“It's crazy. Do I have to make an exhibit of the IBC booth showing that they have the same booth in 2025 as they did in 2024? It’s so silly, but the court doesn't know about our industry and what's going on. Telestream 1 is just an empty shell now. There's nothing there except debt. It's just putting another hurdle in front of us in order for us to get our employment claims.”

Their complaint was amended a third time to accuse Telestream of fraudulent transfer and successor liability as a continuation of business operations. The discrepancy between the legal and public narratives in relation to Telestream 1 and Telestream 2 is described in the suit as a ‘contradiction that is central to concerns over transparency and fairness.’

“We're accusing them of something that is illegal but the mountain that that puts in front of you to try to prove is the real issue. They know that we're small guys. They know they're in the wrong, but they also know it's this huge legal mountain for us to come fight them.

“They did this ‘non judicial’ asset grab to avoid the public bankruptcy process. Non-judicial means it happens overnight with no oversight and no accountability. That's why they took this risk against us. It’s got to the point where I think people need to know what happened.”

Asked about his feelings towards Castles, Heil is conflicted.

“I think he was in a difficult position. He did everything he could to avoid bankruptcy and talk the lenders [Fortress] out of bankruptcy. He was successful in a way but then it put him in this position of doing something that's a little bit shady of targeting certain creditors. I do think he's complicit in this. The previous CEO had put the company in a bad place. He comes in, brings back the old management team, stabilizes the ship, goes around on an apology tour to everyone saying ‘we're back at the helm’ - and those are all great things. I understand that the lenders are his boss but the way he handled this was not ethical.”

Telestream right to reply

Streamingmedia reached out to Genstar Capital and Fortress for comment and received no reply.

Telestream did send us back a statement, quoting a spokesperson: “We cannot comment on matters related to individual employment or investment. We can confirm, separately, that on June 7, Telestream completed a recapitalization transaction, which has significantly strengthened the Company's financial position for the benefit of our stakeholders. We're pleased to be on the other side of that effort, which has left us better positioned to support our customers and employees and to serve the industry going forward.”

A court date is set for January.

28 Years Later: Union VFX cracks complexity of iPhone filmed infected action

RedShark News 

article here

Danny Boyle shot almost the entirety of zombie horror sequel 28 Years Later on an iPhone 15, which succeeded in immersing the viewer in the heart of the gory, disorientating action at the same time as it presented technical challenges for the VFX team.
This was especially the case in sequences shot handheld with either a 20-camera or a 10-camera lightweight 3D-printed rig, arrayed to capture a 2.76:1 widescreen aspect ratio.
“Danny is driven by a kind of in-camera realism but when you’re doing something as complex and ambitious as this with iPhones there’s a lot of limitations we were perhaps the first filmmakers to encounter,” says Union VFX’s Dillan Nicholls, who was DFX Supervisor on 28 Years Later, working with overall VFX Supervisor Adam Gascoyne.
Union VFX have worked on previous Boyle projects including Pistol and T2 Trainspotting and because of that relationship knew the facility could handle what was thrown at them.
Boyle and cinematographer Anthony Dod Mantle not only wanted the mobility of the phone camera but leant into the aesthetic generated from the device’s limitations.
“It's not the most VFX friendly source media but that's not the look they wanted. They don't want that clean Marvel superhero look. That's exactly what they were trying to get away from.”
For example, many of the scenes of arrow hits were captured using the rigs. Union were tasked with rebuilding a scene from up to 20 cameras in post, match-moving each one of the cameras to get them aligned and synced correctly to match the edit before they applied VFX.
“If we’re creating a CG arrow it was like having 20 shots in one and they needed to exactly line up. Any prep or compositing work that you’re doing on one angle of those 20 has to look the same on the others. It’s like doing stereo 3D work where you’ve got the two eyes. It needs to look the same from each of those 20 different eyes that you’re going to see in quick succession, otherwise it’s going to bump.

“It's a bit of an unusual thing to do and our pipelines are not exactly set up to do 20 very fast cuts and make that into effectively one shot that works from multiple angles. Certainly doable but slightly odd and achieved through meticulous planning.”
The ProRes files meant that the team were dealing with 4:2:2 chroma subsampling – less colour information than is normally required for keying.

“It's very difficult to work with because there's less information in the colour channels. You’re better off with a green screen than a blue screen but still not very effective. So for the Causeway Chase, which could have been a big blue screen or green screen shoot, we used gray screens and roto.  There's other advantages to not using blue or green screens which also played into the decision but one of the reasons was we knew the cameras wouldn't cope very well with keying.”
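As a back-of-envelope illustration of the keying problem (general video engineering, not anything specific to Union’s pipeline), the per-frame sample counts for a UHD frame show how 4:2:2 halves the colour information available relative to a 4:4:4 master:

```python
# Rough illustration: per-frame sample counts for a 3840x2160 frame
# under different J:a:b chroma subsampling schemes. In 4:2:2 the two
# colour-difference channels are sampled at half the luma's horizontal
# resolution, so a keyer pulling a matte from colour has half the
# colour samples of a 4:4:4 master.
W, H = 3840, 2160

def samples(scheme):
    """Return (luma, chroma) sample counts per frame for a J:a:b scheme."""
    j, a, b = scheme
    luma = W * H
    # 'a' chroma samples per J luma pixels on even rows, 'b' on odd rows;
    # two chroma channels (Cb and Cr).
    chroma_per_row_pair = (a + b) * (W // j)
    chroma = 2 * chroma_per_row_pair * (H // 2)
    return luma, chroma

full = samples((4, 4, 4))    # 4:4:4 - chroma at full resolution
prores = samples((4, 2, 2))  # 4:2:2 - as carried by the ProRes files

print(full[1] / full[0])     # 2.0 (two full-resolution chroma channels)
print(prores[1] / prores[0]) # 1.0 (half the colour information)
```

Fewer chroma samples mean softer, stair-stepped edges in the colour channels, which is exactly where a chroma keyer works – one reason a grey-screen-and-roto approach can be the safer choice with this source media.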

Nicholls continues, “The sensors also offer quite a limited dynamic range. With the fire shots, it clips at a certain level and you can see that in the cinema. You can see that you’re not seeing detail in the whites of the fire in the way that you would if it was shot on Alexa or on film. Our task was to ensure that when we’re adding our fire simulations they’re clipping and matching the same quality that you get from the iPhone.
“It has a particular kind of noise and artefacts that another director or cinematographer would have asked us to remove but this was actually the aesthetic they wanted. So all the VFX we put in can’t look too clean. We needed to kind of break it. We needed to make sure it has the same limitations as the iPhone.”
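The highlight-matching Nicholls describes can be sketched in a few lines. The values and ceiling below are invented for illustration and are not the production’s actual transfer maths; the idea is simply that a linear HDR CG element holds detail above the plate’s white point, so it is clipped to the same ceiling before compositing so both blow out the same way:

```python
import numpy as np

# Hypothetical sketch, not the studio's actual pipeline: clip a linear
# HDR CG fire element to the (limited) ceiling of the camera plate so
# CG highlights clip the same way the iPhone footage does.
rng = np.random.default_rng(0)
cg_fire = rng.uniform(0.0, 8.0, size=(4, 4))  # linear HDR values, detail above 1.0

PLATE_CEILING = 1.0  # assumed white point of the normalised phone plate
matched = np.clip(cg_fire, 0.0, PLATE_CEILING)

# Any CG highlight detail above the ceiling is discarded, mimicking
# the plate's clipping behaviour.
print(matched.max() <= PLATE_CEILING)  # True
```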

They found other challenges with using the iPhone. The metadata from the pinch zoom didn’t feed through correctly into post. There was difficulty too in matching the video’s motion blur, which Union haven’t fully understood yet.
“There were certainly challenges where we were mixing a lot of CG with live action where it would take quite bespoke work to get the correct matchup. The iPhone seems to constantly adjust some of these things internally.”
Dod Mantle wanted to use the iPhone auto-stabilisation feature. “It’s great for amateurs like you or I but it breaks the math of how you track cameras,” says Nicholls.
“We knew this beforehand and were able to advise Anthony that if he used auto-stabilise there would be some limitations on what we can do. Basically, it breaks the image so that parts of the image slide around. Match move is very much a mathematical process based on parallax. It’s not like compositing where you can eyeball it in. With match move it either works or it doesn’t, and if you start automatically moving pixels around without having a way to undo that then it gets difficult. We had to be very flexible and did find ways of working with it using multiple camera tracks at different layers and depths.”
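The parallax maths Nicholls is referring to can be seen in a toy two-view triangulation (the focal length and baseline below are hypothetical numbers, not from the shoot): depth is recovered from pixel disparity, so stabilisation that silently slides the image by even a single pixel shifts the recovered depth substantially.

```python
# Toy two-view triangulation (all numbers invented for illustration).
# For two views separated by a baseline B, depth Z = f * B / disparity,
# where f is the focal length in pixels and disparity is the pixel
# offset of a feature between the two views.
f = 2000.0   # focal length in pixels (hypothetical)
B = 0.5      # baseline between camera positions in metres (hypothetical)

def depth(disparity_px):
    return f * B / disparity_px

true_disparity = 10.0
z_true = depth(true_disparity)           # 100.0 m

# Auto-stabilisation sliding the image by just 1 pixel corrupts disparity...
z_shifted = depth(true_disparity - 1.0)  # ~111 m: a ~11% depth error

print(z_true, z_shifted)
```

Unlike compositing, there is no way to “eyeball” around this: the solver either converges on consistent geometry or it doesn’t, which is why undoing (or obtaining) the stabilisation data matters.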
Since then, Union VFX have been talking to Apple about whether they can provide that data. “We were honest about the areas that we struggled with if they wanted it to be used more frequently for visual effects.”
It’s possible that Apple might expose the stabilisation data, so that a director and cinematographer can use the tool as they wish on set while VFX teams invert the stabilisation for tracking and then reapply it afterwards.
Union worked on around 930 shots but the total count is much higher. “Danny likes to keep everything going in the edit right up until the end. We certainly worked on more shots than are in the final edit.”
Holy Island (aka Lindisfarne) in Northumberland was the real-life location for the survivors’ community in 28 Years Later. It even has a causeway, but the real one winds and curves; while it was used as the basis for photography, the straight causeway in the film was a CG construction built from photography at Bovingdon.
The casting of child actors, notably 13-year-old Alfie Williams playing Spike, limited filming time – especially tight at night. It’s one reason why the nighttime Causeway Chase was staged in the studio. The set contained a 100-metre ankle-deep water tank and the sequence contained around 130 shots.
“We would get a lot of the water splash interactions as they're running through the water in-camera, but everything else was full CG water extensions.”
A lot of work went into the night sky of this sequence too, Boyle’s idea being: what would the UK be like if there had been no electricity and no pollution for 28 years?
“He was really excited that we create something with amazing kind of nebular skies. Our base references were photographs taken at the Kielder Forest Observatory in Northumberland, and we added some reference images from the Hubble telescope of the Carina star birth nebula, so it feels hyper real and magical.”

Arrows were safety arrows drawn in the bow on set and then dropped rather than fired. CG took over the arrows in flight. Actors wore blood squibs to achieve a blood explosion on camera augmented by VFX.

They added flies to the deer heads and made them more gory, and created the umbilical cord in the baby birth scene, “but even here most of that was achieved on set.”
For a scene in a Happy Eater fast food restaurant they added layers of smoke. “You can generate some smoke on set to get a feel for how it would look, but essentially we needed layers of benzene across the interior of that building and it’s not something that you’re ever going to achieve practically doing multiple takes.”
When the gas ignites in the Happy Eater, VFX augmented the stunt performers wearing protective suits to look as if set on fire, or replaced them with digi-doubles.

A fox running out of a cottage was shot in camera, but two sequences of herds of deer and a murmuration of birds were largely Union’s work.

“We wanted a Jurassic Park moment, giving that sense of nature having taken over the landscapes. It was a case of taking what was there in the photography, the shapes of the fields, and thinking what would that look like if the farmer had suddenly abandoned it.”

The actors and extras playing the infected wore prosthetic body parts on top of shorts, requiring significant cleanup. “They’re wearing shoes running around in the field so those needed removing or replacing. In sequences where they’re wearing harnesses the infected needed complete body rebuilds.
“These are maybe trivial or straightforward things but when you’re talking about hundreds of shots and needing to paint out these costumes on several hundred extras that is a serious amount of work. It requires a lot of planning.”
The island was a DMP (digital matte painting) environment. It’s a crucial story point, and the filmmakers were very conscious to establish its location in relation to the mainland. “This explains why it’s safe and why they’ve made a home there. This meant that the DMP asset couldn’t be just a one-off that you see from one angle where you can sort of fudge it. It’s supposedly a 1.5 mile-long causeway so everything had to be mathematically correct in our asset build to enable us to then film it from whatever angle is needed to tell the story.

“Some of those angles are quite wide, therefore there’s less emphasis on the detail, but then they would ask for a close-up. It had to be rigorously planned and designed to make sure that the components of set they filmed on Holy Island – such as the village gate – sat with our CG world. The mainland in the background on the real Holy Island was too close for the story world so we had to push it back and get that sense of 1.5 miles distance.”
All of that work fed into the nighttime Causeway Chase, which is where the Alpha infected chases Spike and his Dad (Aaron Taylor-Johnson) back to the sanctuary of the island.

While production design, prosthetics and SFX created Boyle’s vision for the infected, Union had creative input quite late in the process. “Danny decided he wanted to make the Alphas more menacing and less human. So we did a round of look dev, doing digital makeup effectively by greying their tongues and lips, creating bags under their eyes, reddening their eyes. Wherever the larger infected appear in the film we applied that look by tracking those textures.”

An infrared look on some nighttime shots (i.e. where the infected are feasting on a deer) was developed by Union with Boyle. “Not super complex, but I think really quite effective.”
Part two of the planned trilogy, 28 Years Later: The Bone Temple, was shot back-to-back with this film, is currently in post and is due for release next January. It is directed by Nia DaCosta with photography by Sean Bobbitt.