Thursday, 10 April 2025

NAB show review: Tariffs, technology and legacy business in the spotlight

IBC

article here

Artist-led, AI-driven, fan-first media showed the way forward at a NAB Show dominated by tariff-hit hardware vendors and advertiser-weakened broadcasters
Trump didn’t just cast a shadow over NAB, he sucked the air from the room. Only 55,000 turned up in Vegas, a massive drop on the 92,000 who participated in NAB Show 2019. Tariffs were the talk of the town as economic uncertainty gripped an industry already challenged to make ends meet.
“What the hell are we doing here?” asked NAB’s opening keynote speaker Stephen A. Smith, who was presumably booked because he’s ESPN’s leading broadcaster. Since he is also considering a presidential run in 2028, there were overt political messages too.
“There’s a reason folks all across the globe clamour to come to the U.S.,” he said, “but that is not a licence for us to be arrogant and dismissive of legitimate concerns about our republic. It is a reason to stand up and uphold our principles. Not enough politicians are doing this.”
He blamed the U.S media for partisan reporting. “We made the pivotal mistake of taking sides. We’ve got cable networks on the right and the left but what about the truth? Your rhetoric is feeding into the nonsense that disintegrates all 350 million American citizens.”
“I have a sports background but to me this is common sense,” Smith added. “Because of ideology, we’ve got too many selfish people whose sole interest is in going up against each other, not whether it’s right or wrong for the American people. That’s what ticks me off about politicians on Capitol Hill.”
An intersection of non-interacting people
NAB has always been a more parochial show than IBC since it is dominated by the interests of its owners at North American TV stations and cable networks. Like the broadcast business everywhere, though, it is struggling for identity and relevance in the age of the creator economy.
“It’s not an industry so much as a collection of people who use media to do lots of things now,” said Barbara Ford Grant, an AI technologies strategy consultant to Marvel Studios and principal of her production company, BFG Productions. “The best way I can describe NAB is as an intersection of non-interacting people. There are sports here, American broadcasters, creators, robotics and virtual production but all seem to operate separately.”
A traditionally trained VFX artist and the first woman chair of the Academy’s 92-year-old Scientific and Technical Awards Committee, Ford Grant has held leadership roles at HBO, DreamWorks Animation, Sony Pictures Imageworks, Digital Domain and Walt Disney Studios.
“Creatives are not part of the conversation, and that troubles me a lot,” she told a NAB Summit convened by the trade analyst Devoncroft. “When I visited shows like NAB and IBC in the past looking for new technology it was always grounded in the creative context of how we wanted to get our stories out to people.
“The most interesting creativity I saw at NAB this year was around live sports because they have this much more direct relationship with their fans than film and TV does anymore.”
There may not have been a lot of millennials in the LVCC but their fingerprints were everywhere.
“Ten years ago, it was Sony, Panasonic, ARRI, Grass Valley and Avid who were the big companies and broadcasters had to spend millions of dollars on their gear. Today the big companies are Adobe, Blackmagic and DJI. It’s clearly a creator-prosumer level industry now.”
Darren Long, a content supply chain director at ITV, spoke at a breakfast meeting sponsored by Dalet. He said the talk there was “a much-needed gut check for our industry”.
“Forget buzzwords for a second: we’re now firmly in a space where innovation, ROI and operational efficiency aren’t nice-to-haves, they’re survival tools,” Long said.
“Efficiency is the new currency. Forget hours of content; think in terms of how fast and how smart we deliver it. Broadcasters can’t scale beyond 24/7, but digital can. That’s where revenue and relevance now live.”
Long added, “Let’s stop just building products for the sake of it. Let’s start building clear capabilities within organisations — joined-up, efficient, and profitable — so we can get the right content to the right audiences in the most effective way.”
Vendors recoil from tariff hit
The shock imposition of tariffs had an immediate effect. Australian manufacturer Blackmagic Design was in town promoting its latest 12K camera, PYXIS, but after announcing a $5,000 price on Friday, April 4th, the company raised it by $1,500 for the U.S. market over the weekend. Other BMD products, including the PYXIS 6K, were also up around 32% for U.S. buyers.
Most cameras, including those from Sony, Canon and ARRI, as well as lenses from the likes of Leica, are assembled at least in part in Asia or Mexico, with price rises across the board likely.
Grass Valley has its manufacturing base just outside Montreal and said prior to NAB that if tariffs came into force it would have to increase prices to mitigate the impact.
At its NAB press event, GV executive chair Louis Hernandez Jr warned that all vendors needed to slash costs. “Not a little bit, not trimming. I’m talking about 30%, 40%, 50% to get this industry profitable,” he said. “That’s the challenge and that’s exactly what we’ve set out to solve.”
Tariffs only exacerbate existing challenges in the industry which, for Hernandez Jr who also runs the VC group Black Dragon Capital, means the margins for broadcast production have bottomed out.
“As investors, we’ve been tracking the financials of this industry for a long time,” he said. “A slow, steady decline, still below net profitability overall. There are a lot more ways to consume, a lot more media, and therefore for every asset we create, every story, the revenue drops significantly because of the sheer number, and that created our problem.”
The American TV and film sector is likely to be affected most by tariffs, both directly and indirectly. That includes a potential ban on Hollywood cinema releases in China.
Ampere analyst Richard Broughton said, “Hardware products will likely face price hikes due to heavy reliance on imports from China. Streaming devices and TVs – often manufactured in or with Chinese partners – will likely become more expensive, dampening consumer demand and extending replacement cycles. Ad-funded media could also take a hit, as key advertisers consider pulling back spend as confidence is hit.”
Puget Systems, a systems integrator of workstations based just south of the Canadian border near Seattle, has temporarily paused orders for components that would be exported from affected countries.
“We are working with our supply partners to understand their strategy to be able to better predict what our cost changes will be,” said president Jon Bach. CPU coolers and fans for example are hit with a 20% hike.
“Thankfully these are not very expensive items in the grand scheme of things, so it won’t have a large impact on system prices, but every dollar hurts!”
The bigger picture, however, suggests that far from pushing vendors to on-shore production in the U.S., tariffs are likely to accelerate the transition towards software, cloud and services running on more commodity hardware.
Tom Morrod, Research Director and co-founder, Caretta Research noted, “There are going to be some vendors that get hit hard by the shifting sands of global trade just as many were hit hard by chipset availability and supply chain disruption coming out of the pandemic. But the vast majority of value is now tied up in managed and professional services, cloud compute and software, so if any industry is ready to ride this disruption out, it should be media.”
Mind the gap
As Morrod noted, if anything, tariffs are likely to push production towards cheaper products, faster. These include lower-cost portable cameras like the Ronin 4D or RED Komodo, software switching and cloud production tools – the sort of tools already used by creators.
“Studios working on $100 million or above productions have been in a really sweet spot, but now they’re not doing well,” Ford Grant said. “They’re taxed with having to make something that is a substantially better experience than what everybody else is doing, because you want to get audiences into theatres, but they also have to do it a lot cheaper.
“On the other hand, we’re seeing YouTube influencers like MrBeast having to figure out how to make 22-minute episodes. They have to have a supply chain, and they have to figure out how to evolve into a studio. The gap between what used to be completely different industries is shrinking. You can feel that on the show floor.”
Nowhere is this shrinking gap emphasised more than with AI which is putting the means of production in the hands of pretty much anyone.
“This is the age of the generalist,” said Eric Shamlin, CEO of AI-driven production studio Secret Level and co-chair of the TV Academy’s AI Task Force, during the SMPTE-produced summit. “The other thing we are seeing is it’s putting a spotlight back on the creative vision. … People can now create space operas in their bedroom. I think we are about to see a massive unlocking of human creativity…To be a creative, previously, was a very limited group. This blows that apart.”
The integration of AI-driven performance versioning tool DeepEditor into Avid, the industry’s “most trusted editing platform”, signifies a pivotal shift. As Nick Lynes, co-founder and co-CEO of AI company Flawless, told IBC, “2025 is the year the dam breaks. Provided those AI tools are trusted, AI is transformational in an entirely positive way.”
Companies like Grass Valley, though, risk being behind the curve. CEO Jon Wilson said the company is getting feedback from its customers and will only adopt AI when appropriate.
“I'm not ready to say AI is going to be central to our strategy going forward, but it will be a core part of our strategy, because increasingly it's top of mind for our customers and accelerating in the discussions that we're having with them,” Wilson told TVTech.
Barbara Ford Grant said, “I listened to a lot of executives talk about how they're looking at AI for automated tagging, or they're thinking about doing this or they've started to do that. But entire cottage industries are going to exist in the time it takes them to move their MAM!”
New AI driven studios like Secret Level, Asteria and the Russo Brothers’ AGBO Studio could upend the Hollywood order.
Ford Grant added, “I think jobs are at risk, but I have a lot of positivity because I see the creative potential in businesses that are creator-led. The further your media business is away from the creative process and from the development of new IP and artistry, the further away you are from what’s happening now.”

Wednesday, 9 April 2025

Practical advice for lighting the volume

IBC

article here


If virtual production is to sell the illusion of what’s being filmed, the LED lighting and background environment must be merged with physical sets and practical lighting as seamlessly as possible.

An LED volume not only provides an extension of the scene environment, it essentially acts as a massive light box. Light emitted by the walls can be used to create dynamic reflections that interact with the set and actors in real time. This lighting can be adjusted and fine-tuned using light cards as well as colour and brightness controls.

“While the volume is a great base source of lighting we highly recommend pairing it with traditional practical lighting for the best result,” says Jamie Sims, VP Projects Manager at MARS Volume. “This is where a skilled Unreal Operator can make a huge difference. Our Unreal Operators and VP Supervisors work hand in glove with DOPs and gaffers to achieve the creative vision.”

Dan Hall, VP Supervisor at Slough’s Virtual Production Studios by 80six says, “Candles, lamps, even fish tanks are fantastic examples of practical lights because they’re subtle and give you an accurate representation of how light will work in a room. Additionally, it takes the eye away from the background, which should not be the focal point.”

Soft and hard lighting

LED panels are ideal for creating soft lighting, which generates soft-edged shadows, but they can’t produce hard light: crisp, hard-edged shadows, spotlights or ‘beauty lighting’. This is where creative collaboration with the Gaffer and DoP on a production is crucial to creating the required look.

“While LED screens are an excellent source of interactive lighting and reflection, they are behind on colour rendition when compared to today’s practical LED fixtures,” says Sam Kemp, Virtual Production Technical Lead at Garden Studios.

Hard light is produced by a point source light, such as a tungsten Fresnel or an LED point-source fixture. Consequently, a volume without any additional fixtures can't produce hard light and therefore scenes in daylight require the addition of practical fixtures to 'sell' the idea of direct sunlight.

Kemp notes, “Practical fixtures can replicate hard sources such as sunlight and also help to fill the spectral deficiencies of RGB LED panels. Standard lighting control communications like DMX can be used from the engine for synced effects.”

Image Based Lighting

Image Based Lighting (IBL) is a form of pixel mapping that uses calibrated photographic or video colour (RGB) information to generate subject and environment lighting. The technique – which some practitioners describe as a philosophy – uses images and lighting displayed on LED sets to produce realistic reflections and ambient lighting in a scene.

“The three main benefits are accuracy, time saving and control,” says Tim Kang, Principal Engineer, Imaging Applications at lighting vendor Aputure. “The biggest one for me is control. We’ve been chasing naturalism in lighting for 100 years but have only been approximating the real world. With IBL you can get the naturalism you want and you can control the variables and much more directly.”

Garden Studios has been using IBL since 2021 primarily for driving and VFX heavy scenes. It has recently developed a workflow for tracking hard sources, allowing for a sun source to automatically move around a car driving down winding lanes.

“The key is finding a good balance between IBL and traditional lighting controls; between the VP team and the Desk Op,” says Kemp. “Image based lighting doesn’t really apply to specific sources when talking about practical fixtures (such as a normal light on a stand) and more to the conceptual control of those sources, such as mapping the colour and intensity of a video to a light fixture’s output colour.”
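The pixel mapping Kemp describes – driving a fixture’s output from the colour and intensity of a video – can be sketched in a few lines. This is an illustrative sketch only, not Garden Studios’ actual pipeline: the region layout, the 0–1 RGB frame format and the 8-bit dimmer/RGB fixture channels are all assumptions.

```python
import numpy as np

def map_region_to_fixture(frame: np.ndarray, region: tuple) -> dict:
    """Average the RGB of a frame region and express it as 8-bit
    fixture levels plus an overall intensity (all values 0-255).
    `frame` is an H x W x 3 float array of RGB in 0..1;
    `region` is (y0, y1, x0, x1) in pixel coordinates."""
    y0, y1, x0, x1 = region
    mean_rgb = frame[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    intensity = mean_rgb.max()  # brightest channel drives the dimmer level
    # Normalise colour so the fixture's dimmer carries the brightness
    colour = mean_rgb / intensity if intensity > 0 else mean_rgb
    return {
        "dimmer": int(round(intensity * 255)),
        "r": int(round(colour[0] * 255)),
        "g": int(round(colour[1] * 255)),
        "b": int(round(colour[2] * 255)),
    }

# A synthetic "frame": left half orange, right half black
frame = np.zeros((10, 20, 3))
frame[:, :10] = [1.0, 0.5, 0.0]
levels = map_region_to_fixture(frame, (0, 10, 0, 10))
```

In a real volume the region would correspond to the patch of wall or ceiling panel nearest the fixture, and the resulting levels would be pushed to the fixture over DMX each frame.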

An accurate colour pipeline is key to matching colours, and this includes the pipeline for IBL. Allowing adequate time to complete camera calibration leads to a smoother shooting experience.

“Garden Studios calibrates its screens’ colour pipeline so virtual fixtures lighting virtual content will correctly match their physical equivalents,” explains Kemp. “A colour meter helps match lighting from LED panels (e.g. from a ceiling panel) to physical fixtures, as does using DMX modes such as CIE-XY (a universal colour space representing the colour spectrum visible to the ‘average human’). Newer fixtures can define a source colour space when using RGB modes for pixel mapping.”
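The CIE-XY DMX mode Kemp mentions works in chromaticity coordinates rather than raw RGB, which is what makes it useful for matching dissimilar sources. As a rough illustration of the maths involved (not any particular fixture vendor’s implementation), converting an sRGB colour to CIE 1931 xy looks like this:

```python
def srgb_to_xy(r: float, g: float, b: float) -> tuple:
    """Convert an sRGB colour (0..1 per channel) to CIE 1931 xy
    chromaticity, the representation used by CIE-XY DMX modes."""
    # Undo the sRGB transfer function to get linear light
    def lin(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)
    # Standard sRGB (D65) to CIE XYZ matrix
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = X + Y + Z
    # Chromaticity discards brightness; fall back to D65 white for black
    return (X / s, Y / s) if s > 0 else (0.3127, 0.3290)

x, y = srgb_to_xy(1.0, 1.0, 1.0)  # sRGB white lands on D65: x~0.313, y~0.329
```

Because xy discards brightness, two sources commanded to the same chromaticity should match in hue even when their native gamuts differ, which is exactly the matching problem described above.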

It's not always as straightforward as it sounds since identical LED panels might have been produced in different batches and therefore emit light differently.

“Assuming that the colour pipeline has been set correctly for the Volume, we can pixel map lighting fixtures from the environment to ensure accurate colour replication,” says Hall. “But trying to match an LED panel and a lighting fixture that are in no way identical is extremely hard, as they display different colour gamuts. You must ensure your colour pipeline is set correctly and then dial it in by eye. You have to trust your trained eye to see what looks right or not.”

Virtual and real camera team collaboration

The clear advice to production is to pair the DOP, Gaffer and Production Designer with the Virtual Production Supervisor at the earliest stage possible.

“We always recommend a pre-light before a shoot so that the gaffer and DOP can run through all of the shots and lock off any variables before the shoot day,” says Sims. “Working in a Volume gives you so many possibilities, but with that we find that leaving the experimentation to shoot day is an unwise strategy - as it can lead to the time on a shoot day running away. A pre-light day is highly recommended to find what works, confirm approaches and lock everything off so that when it comes to shoot, everything can be achieved quickly and smoothly.”

It is also important for the Production Designer to be “synced” with the Virtual Production Supervisor from an early stage in production. Sims explains, “This is to ensure that the virtual set can be married up to the physical set that is being built. This becomes especially important when trying to make the line between virtual and physical set seamless. Once the set is built and in situ the VP team can then colour match the virtual environment to the physical set.”

Matching practical set and fixtures with virtual assets

Some of the biggest challenges on a virtual production set become abundantly apparent when trying to extend the physical elements of an environment seamlessly into the virtual world. The complexity of the challenge depends entirely on what you are trying to bring together and the illusion you are trying to create.

Sims cites the example of attempting to convincingly marry physical and virtual sets for the outside of a building. “You need to match up straight solid lines and subtle block colours so anything that isn’t bang on perfect or colour matched will be glaringly obvious. This also means your camera tracking needs to be inch perfect to avoid jumping or unwanted shaking.”

Less challenging environments are ones where the line between physical and virtual aren’t as strict, for example, a sandy desert. Colour matching is vital here to sell the illusion.

“To overcome these challenges, we have to underscore the importance of the pre-light day, and getting up close and personal with your VP team at your volume stage. Construction collaboration is key here. The more time the VP Supervisor has to colour match with the set in position the better. Set build days and pre-light days allow for this care and consideration to be taken.”

Fighting on a freight train

Garden recently shot a fight scene on a moving freight train with its custom lighting controller, using a combination of IBL mapping, DMX cues and OSC variables (Open Sound Control, a protocol for networking sound synthesisers and other devices for musical performance or show control).

“As the train moves around corners and through a tunnel, a hard-source light array kept the sun in the correct relative position, flickering behind trees, and pixel-mapped LED tubes gave full-spectrum soft fill on the talent, automatically changing intensity in the tunnel,” Kemp explains. “Close-up fill lights were manually set; everything else could be fully automated.”
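The OSC cues Kemp refers to are just small network packets, and the wire format is simple enough to build by hand. A minimal sketch follows; the address `/lights/fill/intensity` is invented for illustration and is not taken from Garden’s controller.

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message carrying one float32 argument.
    OSC pads every string with NULs to a 4-byte boundary and encodes
    numeric arguments big-endian."""
    def pad(s: bytes) -> bytes:
        # Always appends at least one NUL terminator
        return s + b"\x00" * (4 - len(s) % 4)
    return (pad(address.encode("ascii"))
            + pad(b",f")                 # typetag string: one float argument
            + struct.pack(">f", value))

# e.g. a hypothetical cue dimming the fill as the train enters the tunnel
packet = osc_message("/lights/fill/intensity", 0.2)
```

In practice the packet would be sent as a UDP datagram to the lighting controller, which maps the address to the matching fixture parameter.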

80six worked on a recent car shoot where the windscreen was taken out and there was therefore no LED ceiling for the shoot.

“Traditionally, when you shoot through a windscreen while someone is driving, there will be reflections of the sky on the windscreen,” Hall notes. “Because the shoot we were doing was as if the camera were inside the car and we only shot out of the lateral windows, we didn’t require an LED ceiling because there was no reflective surface.

“We put an old school light on a revolving wheel that spun in time with the plate playback to simulate the illusion of orange streetlights passing overhead. The colour of the orange sent to the fixture was selected from the footage of the driving plate.”

Tuesday, 8 April 2025

Bright future: How CoSTAR will ideate the next wave in UK creative IP

IBC

article here

If the UK’s creative industries are to continue to add hundreds of billions of pounds in value to the country’s economy then much will rely on the success of a new network of tech labs exploring the future of media.

“The UK’s creative lifeblood is creative IP,” says James Bennett, director of CoSTAR National Lab. “How does that creative IP live on different platforms and reach audiences beyond screens in hybrid spaces? We need to see creative applications in 5G and 6G that work together with AI neural networks to create a future of holographic imagery, innovative live performance and enhanced mixed reality experiences.”

The CoSTAR Network is the evolution of the government-funded Creative Industries Clusters Programme, which ended in 2023 and spent £56 million to drive innovation and growth across the UK’s creative industries. Four of those clusters (led by universities in Dundee, Edinburgh, Belfast and York) are participants in CoSTAR, which has been awarded a £75.6 million grant over six years by the UKRI Infrastructure Fund, delivered by the Arts and Humanities Research Council.

University R&D powering creative innovation

Each of the Labs is equipped with a private 5G network, compute power for AI and the latest equipment for virtual and mixed reality production, though each has a different focus.
Just as importantly, the Labs are supported by a team of leading researchers with expertise in the use of immersive and virtual technologies.

“We're turning the traditional academic route of engaging with industry on its head,” says Bennett. “Historically industry comes to universities. At CoSTAR, we are embedding University researchers in the heart of the industry.”

Bennett ran the StoryFutures Cluster project which saw the creation of nearly 150 projects exploring novel storytelling formats and audience experiences. “We've spent the last five years making sure that university research is at the service of industry via the Creative Clusters program,” he says. “We had proof of concept that if you put R&D from universities in the service of creative industries, you get growth and innovation and new products and services.”

Four Labs have launched, with a fifth National Lab opening at Pinewood next year. This is intended as a convergent media production hub and will coordinate efforts to bring the network’s research and infrastructure to bear on projects.

“It’s been a really tough time for our creative sectors and a long wait since the end of the first phase of Creative Industries Clusters,” says Bennett. “We know there is a huge appetite among our creative sector to have opportunities to innovate and to be supported by world class research. In really uncertain times, what CoSTAR provides for them is a safe space to make serious attempts at innovation.”

Flexible funding to deliver value

SMEs and start-ups can apply to two Access Programmes for UK companies and partners to access the infrastructure and expertise of the entire CoSTAR network. Typically, applications will be either based around an existing project where R&D might supercharge development, or where the lab acts as a sandbox for pilots and prototypes.

“What we look for in each is essentially whether there is an innovative idea that's got a clear route to growth that's underpinned by an ethical, sustainable and inclusive approach to that growth that we can actually support,” Bennett says.

The Access Programme fund is worth £7 million over three and a half years, but Bennett says the value to recipients is double that. “The real value of the seven million cash is much more like fourteen million because the infrastructure itself is being provided for free, including the staffing, to support companies’ R&D. So, when a company gets a cash grant from us, that is then match-funded with access to the infrastructure.

“We can only grow the UK’s creative industries if we have innovative SMEs and startups that are experimenting in the space. CoSTAR will offer opportunities for large organisations to work with SMEs, but the lifeblood will be getting lots of SME innovation through the door, seeing what's possible, accessing the kit and people.”

CoSTAR Screen Lab

In March, Ulster University unveiled the CoSTAR Screen Lab virtual production facility at its Studio Ulster campus in Belfast Harbour Studios.

“Northern Ireland has long punched above its weight in screen production,” says Declan Keeney, Co-Founder & CEO of Studio Ulster and Director of the CoSTAR Screen Lab. “We're seeing the creative industries replacing the heavy industries here, clustered around the harbour. We have about 1200 AAA crew here and a nascent but fast-growing creative technology sector. These are well paid creative technology jobs. CoSTAR Screen Lab will accelerate the development of breakthrough techniques that will redefine how content is created.” 

Studio Ulster itself is a wholly owned subsidiary of the University and a large commercial facility with two LED Volume stages, which have hosted the four-part BBC Factual documentary Titanic Sinks Tonight.

The facility is wired to the SMPTE ST 2110 IP standard, giving it the power to run 32 channels of 8K compressed video at any one time. A third ICVFX stage, installed by Los Angeles-based NantStudios, opens in April.

CoSTAR Screen Lab is designed into the core of the building. It offers facilities for ICVFX, robotics and a 5G private network. There are 3D and 4D volumetric scanners capable of ingesting multiple images a second from a 250-camera array (the fourth dimension, alongside height, width and depth, is time).

Over and above these state-of-the-art toys, the Lab offers access to expertise, particularly in AI, computer vision systems, cognitive robotics and ambisonic audio. Last year Invest Northern Ireland and the NI Department for the Economy invested £16.3 million in an Artificial Intelligence Collaboration Centre (AICC) based at Ulster University in partnership with Queen’s University Belfast. Michaela Black, Professor of AI, and Daryl Charles, Professor of AI and Computer Games, are among the academics on site.

Also on campus is Professor Greg Maguire, former technical animation supervisor at Walt Disney Feature Animation, Lucasfilm Animation and ILM, where he worked on Avatar. Maguire is also founder and CEO of Belfast animation company Humain, which builds technology to create digital humans.

This Lab has issued a funding call to support creative and innovative use cases for 5G across the screen and performance sectors facilitating multi-site collaboration.

“The CoSTAR Screen Lab is about getting local and national companies to make proof of concepts and develop capabilities in their companies,” says Keeney. 

“The investment point is very high for this technology but if you have access to a facility like the Lab and the world class expertise we have in the building, all of a sudden you're empowered to take your idea to the next stage.”

CoSTAR Live Lab – understanding the experience

The Live Lab based at West Yorkshire’s Production Park near Wakefield will explore immersive, multisensory, and interactive technologies in the live environment.

Production Park already boasts one of Europe’s largest campuses of companies dedicated to innovation in live performance. It hosts large stages where artists like Pink, Metallica, Beyoncé and Foo Fighters have come to set up their arena tours before taking the show on the road. Among the companies established on site are Tait, the global staging, scenic and automation supplier for live events; sound reinforcement specialist L-Acoustics; and LED display vendor ROE Visual. Its facilities include markerless performance capture in partnership with Vicon.

“The artists that come to Production Park are not just here to rehearse, they ideate their entire tour here,” explains Live Lab Co-Director Helena Daffern. “They turn up with the seed of an idea ‘I want to be catapulted in on a giant giraffe’ or whatever their ambition is for their tour and it gets developed and designed here.”

Daffern explains that the Live Lab’s foundational research is not just around the live performance industry but the very “concept of liveness” itself.

“That’s where the network really comes into its own, because the way we engage with audiences across all different types of media is changing. We want to interact with our digital world in a different way. The research we can do across the CoSTAR Network will let us explore the human experience of how we interact with screen and gaming technologies. That’s why the network is so important: it allows us to share knowledge and innovate in an efficient way rather than in silos.”

There are even lab spaces dedicated to user experience which explore biometrics for heart rate and skin conductance. “We want to understand what really drives the human experience and response to new technologies,” says co-director Gavin Kearney. “By using visual tracking from a camera turned on an audience, can we infer from their facial features exactly what they’re feeling and emoting? It’s these types of technologies that help drive the new generation of immersive experiences.”

One stage at Production Park featuring a 28-channel loudspeaker array installed by L-Acoustics will be offered for experimentation in immersive audio. 

“For example, we could take an audience within Live Lab and run tests on 50 people or we could bring them in to experience one of the Arena venues with a 10,000 capacity,” he says. “That's the wonderful thing about this Lab - everything is scalable.”

Another avenue of exploration is connecting performers with audiences over the internet. “We’re looking at the technologies that will enable shared virtual environments to happen in a meaningful way,” says Kearney. “Our dedicated spaces allow us to test new technologies under controlled conditions so we can vary things like the codecs, bandwidth, latency conditions and so on. We can think about each of the individual technologies in turn and then converge them to create something unique.” 

Wakefield Council recently poured £3.2m into expanding the studios to include four additional production studios tailored for live music and film, as well as new facilities for The Academy of Live Technology. 

Live Lab is currently inviting applications for a 'New Frontiers for Live Performance' pilots and prototypes programme.  Applicants can apply for cash funding of up to £13K to contribute to the costs of their R&D project. In total, the support package on offer is valued at over £100K per project. 

Daffern adds, “If the network is successful it will have succeeded in bringing together different strands of R&D, shared knowledge, resource and facility.”

CoSTAR National Lab – into the metaverse?

The CoSTAR National Lab at Pinewood will offer virtual production technology, a 236m² sound stage and labs featuring spatial audio, volumetric capture and multisensory devices, as well as a private 5G/6G network.

“This is where convergent media experiences are going to live,” says Bennett. “We will look around the corner to what is coming in converged media landscapes where it's hybrid physical and virtual or real time interactions across different devices. We’re also thinking about the built environment as a canvas on which these creative experiences and creative IP can live.”

BT is providing the telco network at the site, while Disguise is providing its RenderStream technology which enables real-time streaming of data between media servers and rendering engines. It is commonly used in virtual production, live events and immersive experiences. CoSTAR also has agreements with a number of unnamed “large organisations” to become partners with news to be announced.

The focus is not just entertainment. Bennett says they are doing work around “accessibility and wayfinding” that will provide new forms of e-commerce.

“A lot of our future landscape gets imagined by Hollywood and features holographic images and AI generated audio visuals coming at us from all angles. One of the really interesting pieces we're putting together is how you actually create an environment where we may have huge amounts of sensory experiences bombarding us yet be able to block things out and focus on particular areas. How do we create experiences that enable people to enjoy the next wave of the metaverse?”

CoSTAR Realtime Lab – connectivity and AR

With the main site located at Water's Edge in Dundee with close links to the UK video games sector, and a second facility at Edinburgh College of Art, the RealTime Lab run out of Abertay University will specialise in virtual production, integrating CGI, motion capture and AR. It is equipped with a Mo-Sys tracking system, ROE Carbon LED panels and Brompton processors. While Scotland’s screen sector can look to benefit from the Lab, Abertay’s demonstration of real-time geographically dispersed production over 5G has already caught the eye. 

There are plans to evolve this experiment over 5G and nascent 6G networks with Pinewood when that lab launches in 2026.

CoSTAR Foresight Lab – skills for the future

Led by Goldsmiths, University of London, the Foresight Lab is a thinktank scanning the creative industries sector-wide, with a focus on key areas including decarbonization, and advising on the regulatory framework to support growth and innovation. 

Board members include ILM, DNEG, BBC R&D, the RSC, Framestore, USC School of Cinematic Arts and Microsoft.

“They've been leaning into the debate around AI and copyright, which is being reviewed by the government, for example,” says Bennett. “It has a 20,000-company business tracker to look at emerging trends such as where public and private money is being spent globally on CoSTAR technologies and where market activity is going, where the skills gaps are and where intervention is needed most urgently. That provides a really good context for what is happening in the sector.”

Among the other questions it is researching: How extensive is the use of convergent technologies (including artificial intelligence) by firms working in CoSTAR sectors? What are audience experiences and expectations for products and services using CoSTAR technologies? And what essential data structures and metadata elements should be collected for CoSTAR technology productions?

“CoSTAR is the next obvious step for UK Creative Industry,” says Bennett. “The Creative Industries generate six percent of UK GVA (worth £124 billion in 2024), but only receive around one percent of R&D spend. Now we are making cutting edge infrastructure available within academia where industry can access it. Fundamentally we are putting world-class research at the service of creative industries to grow innovation in an ethical and sustainable framework.”

Monday, 7 April 2025

Building Creative Projects in the Cloud with House of Parliament and Gunpowder

interview and text written for Sohonet

article here

House of Parliament, an independent VFX and creative studio, was founded in early 2020 with a vision of reimagining the traditional studio. Five years on, the company is a serial award winner working on the highest profile projects. That reputation includes delivering nine commercials for Super Bowl 2024 in just one month. In the fast-paced world of visual effects (VFX) and creative production, innovation and adaptability are crucial to Parliament’s success.

Underpinned by over twenty years of experience in high end visual effects, Parliament are experts in consulting, creating and executing visual content to the highest level.

Parliament’s animated work has appeared in prominent productions such as Taylor Swift’s self-directed 2024 VMA Video of the Year for 'Fortnight' featuring Post Malone, and 'Smoke and Mirrors,' awarded the 2024 Prix Ars Electronica’s 'Golden Nica' to conceptual artist Beatie Wolfe. The studio is also a finalist for VFX Company of the Year at the Ad Age Creativity Awards for campaigns including Apple’s 'Flock' directed by Ivan Zacharias for Smuggler, and LAY'S 'The Little Farmer' directed by Taika Waititi for Highdive and Hungryman.

The Power of Collaboration for Speed and Scalability

The Parliament pipeline was built, managed and resourced by their technology partners at Gunpowder, and designed to exploit the latest developments in scale, speed and collaboration.

“We are effectively their CTO,” says Tom Taylor, founder of Gunpowder, a leading systems integrator specializing in cloud virtualization solutions. Their role in visualizing and implementing Parliament’s workflow is extensive. “We build the pipelines, we operate the render farm, we help them scale and we help with all the upgrades. We manage billing to ensure projects remain on budget and that the infrastructure is on tap as required and costs don’t spiral.”

The key to the success of House of Parliament’s VFX workflow is the virtualized version of Sohonet's real-time review tool ClearView Flex (aka VFlex). Taylor says: “It is exceptionally easy to set up and use which producers love. Since ClearView Flex gives peace of mind to their clients it makes Parliament happy, and it reduces a lot of engineering time for us.”

Solving Critical Connections

A key issue was solving the critical connections for interactions between clients and artists working from home. “We’d jury-rigged open-source tools to get streams at a high enough quality to clients remotely,” reports Taylor. “To be honest we were not consistently successful. Sometimes it would work well, sometimes it would falter. And it always required an engineer to set up and do some tweaking during the session. We found ourselves constantly trying to make it work. We did not want the clients to notice, and it was getting to that point.”

In 2022 Gunpowder reached out to Sohonet. Taylor explains, “I knew at the time, the virtualized version of ClearView Flex (VFlex) was operating in AWS, but Parliament was on Google Cloud. Sohonet arranged for us to beta test a version of VFlex in Google and we set it up. From day number one it was like night and day.

“The producer suddenly had control. It was easy enough and clear enough that they could then manage the sessions. The clients were happy because it looked great, and they were also using a tool that they were familiar with. You can’t overestimate the importance of this. Lots of clients had used ClearView Flex all over the world and they were excited to use it when we presented it to them.

“The clients wanted it. We wanted it. Sohonet delivered it for us - in Google, specifically - so that we could move forward. We've got very smart engineers who tried to build this but in the end for peace of mind of the clients and for ourselves we ended up using VFlex and we haven't looked back.”

The result: smoother collaboration, less downtime, and happier clients.

Benefits of VFlex

VFlex has become an essential part of Parliament’s daily workflow, allowing artists to work with Autodesk Flame, Houdini, and Maya in a virtualized environment. Its reliability in maintaining color accuracy and quality across devices has significantly enhanced client satisfaction and streamlined the creative process. 

“Now, we don’t need engineers to set up sessions,” says Taylor. “We are no longer relying on open-source tools that risk disrupting our workflow. Think of it this way: we had a whole chain of plug-ins that were our version of VFlex. To get that chain working took a lot of effort. And, if any one of those pieces got updated it would quite often break something else in the chain. We were operating in a very unstable structure for sending daily reviews out to clients, crossing our fingers to see if it would work.

“With VFlex, it’s 180-degrees different. It's a known product and clients are very comfortable with it. They know that if they’re watching a ClearView stream that it’s going to be excellent quality, and we know it's not going to lag. Plus, it’s going to be color accurate.”

Ensuring artists and clients are seeing the same thing is a perennial issue with distributed workflows, but not when VFlex is part of the solution.

“Colorimetry is notoriously tricky when you have some people on an iPad, others on an iPhone or laptop and sitting on the other side of the world. Getting that consistency of viewing experience is exceedingly difficult,” Taylor says. “VFlex gives us peace of mind. We know that the source signal is consistent across any device that the client wants to connect from.”

The Cloud-First Mentality

House of Parliament launched in March of 2020 with a roster of high-profile projects signed and ready to go. Notably, this included production on multiple 2024 Super Bowl commercials. With everything set, the global pandemic enforced lockdown just one week later. 

“They weren't able to get a lease on office space or obtain infrastructure or equipment,” says Taylor. “We had to scramble, fast, and figure out how we were going to do this.”

Cloud postproduction studios were not a new concept at the time, but none had abandoned on-premises workstations entirely. Out of necessity, Parliament had to pioneer a cloud-first mentality.

Gunpowder tackled the problem head-on, talking with cloud providers and using available infrastructure. In a matter of weeks, they had built an alpha cloud studio that enabled Parliament to scale out to 100 artists across different regions and get the commercials done and dusted for Super Bowl LV.

Post Super Bowl, still in the pandemic, Gunpowder reviewed the infrastructure and began to evolve it. “The first few months were definitely a scramble,” Taylor recalls. “We needed this to work irrespective of the issues we encountered. It was trial by fire.”

Scale for Super Bowl

While no two projects are the same, Gunpowder built a core pipeline for Parliament that can scale. VFlex is integral to each one.

Taylor says: “Each department has a volume control in it, if you will, and depending on a job’s ebbs and flows we turn it up or down. That can be multi-region. It can be different countries. If they want to hire a specific designer who's in Australia to produce a certain look, we can get that person in front of the project within minutes. We are literally able to grab a slider bar and drag it up and get 100 extra machines online in three minutes.”
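Taylor's "volume control" metaphor maps naturally onto a thin layer over a cloud provider's instance-scaling API. The sketch below is purely illustrative (the department names, machine caps and `set_volume` helper are hypothetical, not Gunpowder's actual tooling); a real implementation would issue a managed instance-group resize where the comment indicates.

```python
# Illustrative per-department "volume control": a 0-10 slider level
# is translated into a target machine count, so scaling a department
# up or down becomes a single value change.

DEPARTMENTS = {"comp": 0, "cg": 0, "design": 0}      # current machine counts
MAX_MACHINES = {"comp": 100, "cg": 200, "design": 50}  # hypothetical caps

def set_volume(dept: str, level: int) -> dict:
    """Translate a slider level (0-10) into a scaling action for one department."""
    if not 0 <= level <= 10:
        raise ValueError("slider level must be between 0 and 10")
    target = round(MAX_MACHINES[dept] * level / 10)
    delta = target - DEPARTMENTS[dept]
    DEPARTMENTS[dept] = target
    # A real implementation would resize a cloud instance group here,
    # e.g. via the provider's managed instance group API.
    return {"department": dept, "target": target, "delta": delta}

action = set_volume("cg", 5)  # dial the CG department up to half capacity
```

The point of the design is that producers never think in machine counts, only in how "loud" a department needs to be for the current job.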

This flexibility enabled Parliament to more than triple in size to accommodate the increase in work, involving over 300 artists, 2 PB of data, and thousands of hours of rendering to complete nine spots ahead of Super Bowl LVIII 2024—all over the course of just six weeks.

“One of the nicest compliments we received from Parliament was that they didn't even have to think about doing this. The key to VFlex is that it is easy to set up. It just works. Producers love to use it, and it makes our clients happy.”

Template for Success

Parliament recently opened a design department and is working with Gunpowder to explore the integration of real-time workflows. “Design and post workflows are traditionally kept separate but we’re bringing the two together so that our 3D artists can benefit from being able to model quickly in tools like Unreal and then bring those tools back into Maya.”

Separately, Gunpowder has taken the cloud template and applied it for clients outside media and entertainment, in verticals spanning sports, architecture firms, toy manufacturers and more.

“We not only help legacy creative VFX studios accelerate their transition to dynamic cloud-based operations and workflows, but our goal is also to free production teams to concentrate on delivering their best creative work, by taking care of the cloud infrastructure and management.”

House of Parliament’s partnership with Gunpowder exemplifies how cloud-based solutions can redefine creative production. By focusing on robust infrastructure and reliable client interactions, the studio has set a benchmark for the VFX industry, showcasing how innovation and collaboration lead to success.


Thursday, 3 April 2025

Enginelab and the new breed of cloud postproducer


article here

It takes a brave soul to launch a new VFX facility given the meltdown at one of the industry’s largest, but creative entrepreneurs conversant with cloud economics are confident that there are good opportunities to be grasped.

UK startup Enginelab is the latest of a new breed of postproduction company designed around facilities in the cloud and powered by AI.

Two of its three founders come from Untold Studios which broke ground in 2018 establishing the world's first cloud-native creative studio with a template of cloud render nodes and virtual workstations.

Sam Reid was CTO of the initial Untold team, helping grow the company from a handful of employees to several hundred and bringing international business to its creative services from commercial brands, pop artists, studios and streamers.

“I've learned a thing or two about how to work in the cloud and how to make the cloud work for media and entertainment,” says Reid. “We're cautiously optimistic that increased volumes of work are coming back into the market and that new studios are going to pop up that will need next generation technology, solutions and workflows to support them.”

Describing Enginelab as a full-service independent technology business he adds, “We don't need edit suites. We're not going to be hiring artists. We're going to be providing the infrastructure for studio businesses and we’re going to be the technology experts they can call upon for guidance and leadership.”

Joining Reid in the venture are Daniel Goller, a senior developer colleague from Untold, and Matt Herman, who founded roto and paint shop Trace VFX before selling it to Technicolor in 2016. Subsequently, Herman took animation and visual effects outfit Psyop from multiple on-prem studios to a fully cloud and remote operation, expanding the business by opening lightweight facilities in Mexico City, Berlin and Hamburg.

“Because we have [set up facilities] once before we should be able to do it again but a lot quicker,” Reid says. “We’re also going to use AI to help us do that.”

Specifically, Enginelab will use AI to automate processes. “AI helps with technical manipulation, the really boring, mundane jobs that an artist would have to do, so they can focus more on their craft,” Reid explains. “I’ve spent a lot of time at Untold evangelising and implementing AI workflows. Now I’m keen to unlock efficiencies in workflows for other businesses. For example, AI can write code a lot more efficiently and a lot better too.”

It’s not too much of a stretch to suggest that the recent collapse of Technicolor is the end of the line for post models built on volumes of real estate, thousands of employed staff and huge overheads. They are being replaced by leaner organisations where infrastructure is for hire, tailored per project and scaled up or down as required.

“It's all very well shutting everything down and minimising spend, but you need to be able to quickly kick it back into motion when you get a big project that needs lots of render nodes, for example,” Reid says. “You also have to be comfortable doing it, because it’s one thing knowing you can do it, but you need to have the team around you who know how to do that properly so you don't end up with huge bills and in situations you find it very difficult to get out of.”

In his obituary to Technicolor, Michael Elson, COO at MPC from 1998 to 2008, said The Mill was “founded by visionaries and powered by super talent, ravished by neglect”. MPC, he said, was “killed by a management so adrift it’s criminal”. Of Technicolor itself Elson concluded, “A corporate behemoth was never equipped to deal [with] a world that requires you to be light on your feet and adaptable.”

Reid and Herman are alumni of The Mill, both starting out their careers there in the engineering departments. They are wary of making the same mistakes as its parent.

“It’s about staying lean and not falling into a trap of huge overheads by being able to adapt to dips in work,” Reid says. “Cloud technology helps with that because you can be very in control of the costs.”

He adds, “I really enjoyed working at The Mill and it’s sad to see what's happened to it. It's where I fell in love with technology. One thing I’ve learned is that when our backs are against the wall everyone bands together. You can see it happening right now. We're having some really interesting conversations with people about setting up new studios and hopefully we'll be able to help them.

“The future is definitely much less about having a physical presence and owning kit. The facilities are disposable to be honest.

“People are the assets and always have been in this industry. We need to protect them because they are the ones that drive value.”

Enginelab are optimistic that the industry as a whole has turned a corner after the last few years of Covid, strikes and economic downturn. It also has its eye on the 29.25% tax credit for UK VFX that comes into effect on 1 April 2025 (backdated to cover activity after 1 January 2025), although the legislation was not expected to receive Royal Assent until late March.

“There is zero chance it will fail at this point,” Neil Hatton, CEO of the UK Screen Alliance, tells IBC365. “HMRC, however, won't issue guidance until it's written in law and there are signs that this is causing some clients to hang back on commitment until they are 100% certain of what is claimable.”

Reid highlights the increasing global and transient nature of the workforce and shifts in locating productions to soak up different tax benefits.

“We hope to see a lot more studios come to the UK especially for films and HETV work. The key to success in 2025 is being able to work with pockets of people around the world. Our challenge is how to make it a seamless and frictionless process.”

They aspire to emulate the business model of Untold which spans longform as much as shortform work.

“With a longform project you are looking at many months to potentially years of work, so things like managing the data become a lot more of a challenge and more of a focus point. Advertising can be started and finished within a few weeks. The challenge here is to be very efficient and render shots quickly.

“We should be able to set up a very secure environment for creatives to focus on what they do best while we make the technology work really hard for them. Those artists could be in Boston or Cape Town as equally as they might be north of London.”

Having established a relationship with AWS at Untold, Reid says it starts as Enginelab’s preferred cloud provider. “If a customer wants to use a different cloud provider then we'll be agnostic. I'm not a cloud salesman, I'm a technologist. We want to work with businesses to craft them the best technology solution that could be in the cloud or it could be on prem or it could be both.”

“If it was a full cloud environment with render, storage and workstations there for maximum efficiency we can also help businesses work together. If more people use the same platform we can create some smart automations and ways of sharing data.

“For example, a big feature film might want to engage us to host their data and we would securely serve data and functionalities out to different vendors on that show. Certainly, there will be a power in numbers if everyone is using the same infrastructure.”


Wednesday, 2 April 2025

Relo Metrics Gets up to Speed with F1 Sponsorship

Streaming Media

article here

As sports move online and sponsorship investment follows, tracking of on-screen brand exposure needs to keep pace, which is not easy when logos are travelling at 200mph. Relo Metrics is using a new AI-powered tool to do just this, measuring millions of sponsorship placements in real time, with expansion into other motorsports and then European football leagues to follow.
“F1 is our first truly global sport,” says CEO Jay Prasad. “Every race weekend it gets 100 million viewers around the world. We’ve plans to expand into other motorsport series like Formula E, WEC, and MotoGP. European football is probably where we're heading next.”
According to the SportsPro ‘Formula One 2024 Business Report,’ F1 is experiencing explosive commercial growth and attracting 100+ million weekly viewers, a global audience comparable to the annual Super Bowl. Sponsorship spend across F1 and its teams for the 2025 season is projected to reach more than $2.9 billion, an increase of ten percent year-over-year (YoY), according to a recent study by Ampere Analysis.
Debuting at the Australian Grand Prix, Relo’s Census platform enables brands, teams, and agencies to track sponsorship performance instantly and optimise investments mid-season.
“Motorsport in general has some inherent challenges to it because it's not two teams playing one another like on a football pitch,” says Prasad. “Each [branded car] is very different from one another.”

Formula One runs 10 teams, MotoGP has 11 and NASCAR can field up to 17 teams, with multiple cars interweaving with one another at high speed. This clearly creates challenges for accurate tracking. In addition, the circuits and grandstands are outfitted with LEDs and signage, some of which are virtual assets inserted into the broadcast stream.

“Because of these inherent challenges we’ve invested further in computer vision-based AI to build neural network models to train our models and basically solve this problem,” says Prasad.
Specifically, Relo has doubled memory capacity and throughput by upgrading to Nvidia’s multi-modal AI model, to train on larger datasets more efficiently.
Traditional methods struggle to accurately capture fast-moving brand placements on cars traveling at over 200mph, often relying on manual annotation or delayed post-race analysis.
“From a quality standpoint, just because [a logo] appeared on screen doesn't necessarily mean it had value. If it's totally blurry, then it doesn't have value. Virtual assets could also be animated.”
The camera angles offered by the host broadcast are already designed to maximise the amount of time sponsor logos are on screen.
“Measurement has to keep up with this,” says Prasad. “There are also inserts of driver cameras [point of view cameras from the car cockpit] which our model also takes into account.”
Relo applies scene detection frame by frame across the host broadcast feed as the race is played out live. It runs three detection models simultaneously: one to detect the logo, another to understand logo placement (such as whether it is on a car or a billboard), and a third sponsor recognition model that classifies each detected logo, identifying the brand by probability.
“You have logo, placement, and rights holder.  We have to create a taxonomy of all the placements on the car, the track, and the driver and then track what's happening across all those variables using this advanced model.”
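As a rough illustration of how three models can be chained per frame, here is a minimal sketch in Python. The detector functions are stand-ins for trained computer-vision models, and all names and values are hypothetical rather than Relo's actual code.

```python
# Illustrative per-frame pipeline: logo detection, placement
# classification, and brand recognition with a probability score.

def detect_logos(frame):
    # Stand-in: a real model returns bounding boxes for candidate logos.
    return [{"box": (120, 40, 60, 20)}]

def classify_placement(frame, box):
    # Stand-in: a real model labels where the logo sits (car, billboard, driver).
    return "car"

def recognise_brand(frame, box):
    # Stand-in: a real model returns a (brand, probability) pair.
    return ("ExampleBrand", 0.93)

def analyse_frame(frame):
    """Run the three models on one frame and merge their outputs."""
    results = []
    for detection in detect_logos(frame):
        placement = classify_placement(frame, detection["box"])
        brand, prob = recognise_brand(frame, detection["box"])
        results.append({"box": detection["box"], "placement": placement,
                        "brand": brand, "confidence": prob})
    return results

frame_results = analyse_frame(None)  # a real feed would supply decoded frames
```

The key design point is that each stage is independent, so any one model can be retrained (for a new series, circuit or sponsor roster) without touching the other two.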
However, at such speeds even the computer vision can still get it wrong. “We can analyze the race data live to deliver initial sponsorship scores but we will then do additional quality processing post-race to deliver a final report. We correct errors with manual checks. We do reinforcement with humans in the loop and the more mistakes we correct, the smarter [the machine] becomes.”
Relo also calculates the media value by factoring in viewership numbers, which are not just global figures but comparisons of viewing across different geographic areas.
“There's a bit of a lag in getting those numbers. We can analyze some things fairly quickly and get it up to 95% accuracy level and then for final publication we apply viewing figures.”
Relo Metrics also parses the data through social media. “There’s an important window immediately after the race when things go viral. We can capture all the activity in social and digital sports and news websites.”
The idea is for brands to gain greater understanding into the value of their investment when their logo is exposed for fractions of a second a time on screen.
“Instead of just reporting total values for posts or partners, we delve into the specifics to uncover potential hidden value. This approach allows for a deeper understanding of how each element contributes to overall sponsorship effectiveness, revealing insights that might otherwise go unnoticed.”
Relo Metrics’ analytics software is licensed directly to brands, enabling them to compare F1 sponsorship performance across multiple sports leagues and media environments.
“Expanding Relo Census into F1 is about more than just tracking sponsorship exposure; it’s about bringing motorsports into a larger ecosystem of sponsorship valuation,” Prasad says.
“We built Census so that brands could benchmark views across sports,” Prasad says. “We started with the major North American sports and built computer vision models for the NFL and NBA then WNBA and MLS. Now we’re adding NWSL and more and more professional sports. That means we've created a syndicated set of placements in those sports. Multinational brands are looking for a more advanced capability and measurement so they can track and compare their investments across multiple sports. And they want F1 to be a part of the way they're analysing their sponsorships overall.”
Sports sponsorship is projected to reach $151.4 billion by 2032, according to Allied Market Research.
“Say you’re a marketing lead at a bank and thinking about investing in a sport and you are thinking the Northeast U.S. is a good market for you, you might want to know which other brands are active in the Northeast and what share of voice do they have across what sport and what placement is driving all the value. We judge for clarity, for duration, for share of voice. Many different factors go into our quality score.”
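A quality score of the kind Prasad describes can be sketched as a weighted combination of normalised factors. The weights and factor names below are assumptions for illustration, not Relo's published methodology.

```python
# Illustrative quality score: clarity, duration and share of voice
# (each normalised to 0-1) combined with hypothetical weights.

WEIGHTS = {"clarity": 0.4, "duration": 0.35, "share_of_voice": 0.25}

def quality_score(factors: dict) -> float:
    """Weighted average of normalised exposure factors."""
    if set(factors) != set(WEIGHTS):
        raise ValueError("expected factors: " + ", ".join(WEIGHTS))
    return round(sum(WEIGHTS[k] * factors[k] for k in WEIGHTS), 3)

# A sharp, long-held logo with moderate share of voice:
score = quality_score({"clarity": 0.8, "duration": 0.5, "share_of_voice": 0.6})
```

In practice many more factors would feed in (blur, occlusion, screen position), but the principle of scoring each exposure rather than merely counting it stays the same.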
Relo already works with two NASCAR teams and plans to adapt its model for any motorsport.
“We are confident that we'll be able to keep adapting what we've built so that we can offer this level of granularity and scale to all motorsports.”
F1 sponsorship
Teams account for 72% of total sponsorship revenue, with corporate F1 deals contributing the remainder.
The average number of sponsors per team is 32. McLaren leads with 51 unique partners. Williams’ title sponsorship deal with Australian software corporation Atlassian is the largest team asset sold for the upcoming season so far; Ampere estimates it to be worth between $25m and $35m annually.
French luxury goods company LVMH announced a 10-year sponsorship deal with Formula 1 starting in 2025, valued at over $100 million a year.  Its brands include Louis Vuitton, Moët Hennessy and TAG Heuer.
Technology and financial service brands (cryptocurrency, software/SaaS and gambling) are the largest investors in new sponsorship deals signed for the 2025 season. They include American Express, entering its first full season as an F1 Global Partner, and IBM with Scuderia Ferrari.