Wednesday, 16 April 2025

Flawless AI: “You can't put new lines in people's mouths without consent”

IBC

AI tools like Flawless visual dubbing are making a strong case for standard use in Hollywood

article here

When The Brutalist came under fire for using AI to massage the tricky phonetics of its non-Hungarian-speaking actors, the controversy may have torpedoed Academy votes for a Best Picture win. Such injustices won’t happen in future as the entertainment industry gets a more mature grip on what AI actually is.

“2025 is the year when the dam breaks,” says Nick Lynes, co-founder and co-CEO of AI company Flawless. “Provided those AI tools are trusted, AI is transformational in an entirely positive way.”

Already this year indie feature producer A24 has lured Adobe’s Chief Strategy Officer and VP of design, responsible for the rollout of AI tool Firefly, to work on future film projects. DNEG has acquired Metaphysic and corralled its IP into an already substantial AI R&D division.

Last year, indie producer Blumhouse (the label behind hits such as BlacKkKlansman and M3GAN) made a deal with Meta that will enable filmmakers like The Spurlock Sisters and Casey Affleck to use Meta’s Movie Gen model to create short films.

Lionsgate, the studio behind the John Wick and Hunger Games series, has partnered with Gen-AI developer Runway, allowing the AI company to create and train an AI model on the film studio’s content. Director James Cameron is a board member at Stability AI, despite apparently intending to attach a pre-credits notice to Avatar: Fire and Ash stating that no Gen AI was used to make it.

“Understandably, when AI first broke onto the scene and started to demonstrate its capabilities there was a negative reaction,” Lynes says. “We are now in a phase of people beginning to understand what AI is truly capable of when it is in the hands of professionals. The fear cycle subsides and an education cycle kicks in.”

Flawless is a London-based startup co-founded by Lynes and Scott Mann, the director and producer behind 2022 breakout thriller Fall. It has launched DeepEditor, an ‘ethical’ AI tool which allows for rapid replacement of onscreen filmed dialogue with alternate lines, without requiring the crew to reassemble on set. The company claims this costs less than one-tenth of a traditional reshoot; the technology was used on Fall to digitally replace expletive-filled dialogue and keep a PG-13 rating for international distribution. Its success is said to have led directly to a Fall sequel now in production with ten times the crew size.

Last month, Flawless paid to present DeepEditor at the British Film Editors Awards, which could be seen either as foolhardy or as a statement of confidence and transparency about its product.

“It was received really well,” Lynes says. “We had people coming up to us saying that they were really cynical but now they are excited by the possibilities. One of them said we were really brave to walk in there and present an AI product to a room full of film editors, but gave us credit where credit’s due.”

Flawless has not yet presented to American Cinema Editors (ACE) but says it is talking to “the big trade bodies in North America and around the world.”

Its confidence stems from having built its AI tech in compliance with ethical AI standards. Lynes explains that its models are derived from “clean data” either licensed or created by the company itself (presumably on Mann’s productions like Fall).

It has also integrated consent flows within the software and uses its own Artistic Rights Treasury (ART) system to manage rights and ensure compliance. “Any changes made to dialogue in post will require the explicit informed approval of the acting talent,” says Lynes.

Its AI model is able to study the “idiosyncratic style of every performance” in any selected feature rather than apply generic changes. “This means that when any modification or alteration is made the exact nuance of each particular character's performance is maintained,” says Lynes. “We preserve the style of the actor’s original performance.”

Similar technology developed by Flawless in a product called TrueSync is capable of syncing actors’ lip movements with different languages, improving the quality of dubbed films and making global distribution more accessible and less expensive.

The company calls its visual dubbing approach ‘vubbing’. Only the actor’s mouth is being manipulated “with micro movements in the rest of the face,” says Lynes.

The Light, the German-language film directed by Tom Tykwer which opened the Berlin Film Festival, will be dubbed into English using TrueSync. Flawless also has partnerships with Deluxe and Pixelogic to use TrueSync for localising titles.

“Again, nothing gets done without approval,” Lynes insists. “A director can’t drive new dialogue into an actor’s performance, even if we preserve their style, without the talent giving explicit informed consent. You can’t ask them to ‘say’ a new line and not have their consent for that. We’re not in the business of puppeteering.”

These attributes have been commended by the US actors’ and voice artists’ union SAG-AFTRA. Its National Executive Director and Chief Negotiator Duncan Crabtree-Ireland says SAG-AFTRA is working with Flawless “to ensure the future of AI technology aligns with our mission of supporting and protecting creative professionals.”

Flawless insists that it will use only original voice artists for its “visual translations” and not synthetic AI voice technology.

“This is not something we're involved with,” Lynes stresses. “We don't have skin in the game.” Nonetheless he thinks the field of synthetic voice tech will evolve and become part of the filmmaking process.

“The tech is not good enough yet but it will be in time, and then it will become a creative choice [whether to use it or not]. That would require a conversation between the talent, the unions and producers, and there would probably emerge some kind of commercial arrangement where relevant consents are requested.”

It’s another sign of how feeling in Hollywood is changing toward ways AI can empower, not replace, human creativity.

“The idea that labour groups want to clamp down on artificial intelligence to halt progress is a misconception,” said Crabtree-Ireland in a statement for the World Economic Forum in January. “We don’t want to stop innovation; we want to be part of guiding it.”

If Flawless is to be adopted into the postproduction workflow it would need to be integrated with Adobe, Avid and Final Cut Pro editing tools. Lynes says there is an announcement along such lines coming in Q2.

“Something Scott said to me on day one was that there is absolutely no way we're going to get filmmakers to change their workflows,” Lynes says. “They might change their workflows as a result of things being introduced to their existing workflows, and they evolve it themselves, but we can't expect people to do anything differently. We’re very sympathetic to the existing workflow.

“AI is basically additive not subtractive to the creative process and the more dialogue we can have about it the better the understanding will be and the more people will come to realise that AI can make a positive difference.”

In the education phase of coming to terms with AI, he says, companies like his can play a role.

“We can sit with professionals and explain to them that no data and no performances have been stolen to generate new performances because the data that sits within our system has been legitimately sourced with the permission of everybody concerned. You simply can't put new lines in people's mouths without consent.”

 

Tuesday, 15 April 2025

BTS: Daredevil: Born Again

IBC

article here

The Hell’s Kitchen of Disney’s masked vigilante reboot is given a grungy seventies overhaul by lead cinematographer Hillary Fyfe Spera.

The hallmark of Marvel’s Daredevil TV show, which ran for three seasons on Netflix, was its emphasis on a crime-fighting superhero who actually bled and hurt. The reboot streaming on Disney+ is even grittier, doubling down on the use of real locations, in-camera effects, and cinematography styled on seedy seventies New York.

“I don't come from a Marvel comic book movie background but I felt that my experience and my identity as a cinematographer could bring something to a story which is really about people in very otherworldly situations,” says Hillary Fyfe Spera, the lead DP, who shot the pilot, the finale and seven of the nine episodes in total.

“I'm a very analogue DP in general and it’s important to me to get as much as we could in-camera. I felt like the style of the show should be very analogue and that the camera should feel very tied to the action rather than to big VFX.”

Daredevil: Born Again picks up where the Netflix Daredevil left off, with Matt Murdock (Charlie Cox), a blind lawyer with heightened abilities, fighting for justice through his law firm, while former mob boss Wilson Fisk (Vincent D’Onofrio) moves to become mayor of New York. When their past identities begin to emerge, both men find themselves on an inevitable collision course.

Bullseye in one shot

The Netflix show became renowned for a number of fight scenes shot as a ‘oner’, notably an 11-minute sequence in Season 3 episode four, ‘Blindsided’, where Murdock battles his way out of a maximum-security prison. The show’s creators throw back to this in the pilot by staging their own ‘oner’, set in a bar, as Daredevil attempts to capture the villain Bullseye.

“We shot the bar sections on location and as one camera move,” explains Fyfe Spera. “There's lighting cues and practical effects and choreography all timed and very well-rehearsed working with second unit director and stunt coordinator Philip Silvera. The shot continues into the stairwell all on location in the same bar as a camera operator walks backwards up the stairs.  When we get to the roof, that’s the stitch. We shot the roof scene on stage. The next stitch happens when the camera tilts down onto the street, which was a plate of the practical bar and practical street. The timing of all of this was a challenge as was making sure it all felt like it was part of the same language.”

Gritty and vintage look and feel

For the overall look of the show, Fyfe Spera was inspired by ‘70s classics like Taxi Driver and The French Connection to create a gritty and grounded image of Hell’s Kitchen. Another go-to look reference was New York street photographer Saul Leiter who shot his greatest work in the 1950s and 60s. 

To achieve this, she shot on the Alexa 35 in anamorphic to create more cinematic visuals, opting for vintage Panavision G Series lenses and incorporating flares to make the image a little imperfect and naturalistic.

“From the initial first read of the scripts it just felt to me like it would really lend itself to being 2.39 shooting with anamorphic lenses. There's something very present about them. I liked the relationship we were able to have with characters in the frame juxtaposed with wider cityscapes. It enabled us to show characters in close-up but in a wide shot to show their feeling of separation from their surroundings.”

She also used anamorphic zooms of the type frequently used in 1970s movies. Aside from The French Connection she reviewed another Gene Hackman thriller of the period, The Conversation (1974), and paranoia-thriller stablemate Klute (1971), lensed by Gordon Willis. Boston-set 1973 gangster film The Friends of Eddie Coyle and Michael Mann’s Thief (1981) were other choices, as was the more obscure 1983 film Variety, about a woman obsessed with pornography.

“It depicts a certain specific era of New York when Times Square was a lot seedier,” says the DP who has lived in the city for over 20 years. “I was trying to bring in that texture.”

Translating sensory moments

To convey Murdock’s acute sense of hearing and sixth sense she employed two different approaches, both mostly achieved in camera. One was the use of a close-focus anamorphic lens that could be “very present” with Murdock as he senses something. “The lens also lent itself to a bit of flare as the camera moves around his head so we combine aspects of the world and the sensory experience.”

For bigger sensory moments she used a combination of a dolly move with a spherical zoom and an aspect ratio change (done in post). “We mounted two cameras on the same dolly to capture a 270-degree field of view, both with wide-angle primes. In post, we combined those images to give us this feeling that he's observing and sensing the whole world, like tuning a radio dial into the one specific thing that he's pinpointing. It's a fun way of translating an audio or non-visual experience in a visual way.”
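
For readers curious what that post combine involves, a crude stand-in is to cross-fade two overlapping frames into one extra-wide plate. A real stitch would use lens solves and tracked alignment; the 25% overlap here is an assumption for the sketch.

    import numpy as np

    def combine_wide(left, right, overlap=0.25):
        """Join two same-size frames, cross-fading the shared region."""
        h, w, _ = left.shape
        o = int(w * overlap)
        fade = np.linspace(1.0, 0.0, o)[None, :, None]   # left weight ramps down
        blend = left[:, -o:] * fade + right[:, :o] * (1.0 - fade)
        return np.concatenate([left[:, :-o], blend, right[:, o:]], axis=1)

    # Two 1920x1080 plates become one 3360-pixel-wide combined plate.
    a = np.full((1080, 1920, 3), 0.4)
    b = np.full((1080, 1920, 3), 0.6)
    print(combine_wide(a, b).shape)   # (1080, 3360, 3)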

Her favourite scene in episode one sees Fisk and Murdock in a diner having an extended conversation, for which the De Niro/Pacino diner scene in Heat was a huge reference. You learn a lot about the two characters, and about the way the series is going to pan out, as they explore the limits of their relationship.

“I kept the coverage very simple, letting the performances speak for themselves. The exciting thing about that scene from a story perspective is that these two guys who are bigger than life are just meeting in the diner, being human beings, while New York street life is happening right outside the window. It's obviously a practical location. People do come up to the window, that happens naturally with real New Yorkers, and I just felt like that was as real as it gets. Sometimes my job is letting it play and not messing with it too much. It’s as exciting to me as any action scene.”

The power of reflection and light

She also established rules for how each character was shot, using wide, lower angles on Fisk to make him appear larger.

The show is staged in a number of locations, real and studio-built, much of it featuring glass, mirrors and reflections on New York streets. She maintained a naturalistic lighting approach throughout while including signature keyframes from the comic books, such as giving Fisk a halo of white light, something that first happened by accident when they came to shoot the diner scene. She incorporates more white light around him as the story progresses and he goes back to his old ways.

“We do a lot of interactive lighting on the show and we’re always studying the way light hits cars passing by or the way we feel traffic moving. I need to shout out to our incredible gaffer Charlie Grubbs who did an amazing job of letting the light ricochet around this chrome diner. In a bright daylight scene like this we’re open to embracing those anomalies.”

Production design, led by Michael Shaw, built mirrors and glass into sets and used reflective materials to outfit locations. “I love having a plan going into a location shoot but I always keep one eye open to the anomaly that will occur naturally.

“Reflections are a theme of the show. This duality of personalities and identities. We were always looking for an opportunity to use reflective surfaces.” 

Segments in the show framed as vox-pop newsreels were shot with spherical lenses to give a different tone and style. “We called it the ‘Shadow Unit’ because it’s about the real New York hiding in plain sight. The filmmakers producing that are doc filmmakers who I have worked with in the past.”

Her first love was still photography. “I was always with a camera as a kid and that developed when I went to college, where I discovered cinematography. I didn’t go to film school. I learnt by shooting everything I could possibly get my hands on. The answer is always ‘yes’ and I think that taught me how to react in different environments and how to always find a way to tell the story visually.”

Fyfe Spera has a number of documentaries on her CV, including Alice Neel and The Secret Life of Scientists and Engineers, as well as narrative shows like Strangers and Dexter: New Blood.

Of working on a Marvel production she says, “It’s something I hadn't done before and I'm proud of those human moments that we have and that we also earn the more comic book moments that we all want to see. I feel like they're all in the same conversation.”

 


Friday, 11 April 2025

Cyber security for autonomous vehicles

IEC e-Tech

The more transport systems modernise and become autonomous, the more they can be hacked. International standards from organisations like the IEC can provide the appropriate cyber security requirements.

article here

In January, Italy’s maritime navigation systems governing vessels in the Mediterranean Sea were reportedly manipulated. The hacker was a 15-year-old schoolboy. While nothing other than teenage mischievousness seems to have been the motivation on this occasion, cyber security experts were on the alert. The case highlights the need for organizations everywhere to secure their transport systems as ships, trains and automobiles become increasingly autonomous. Autonomous vehicles (AVs) pose a particularly alarming threat as the world has witnessed a growing number of incidents involving trucks and vans used as weapons in terror attacks.

The challenge of connected vehicles

By 2030, 95% of new vehicles sold globally will be connected, while the number in service is forecast to increase from 192 million in 2023 to 367 million by 2027. Last year, “massive-scale global incidents” from hackers targeting connected vehicles and services rose from 5% to 19% of all incidents, according to a data security specialist.

It is not just the digitization of almost every aspect of vehicle control and communications which raises the threat of cyber attacks. It is that the interconnected nature of autonomous transport systems creates a vast attack surface for criminals. Self-driving vehicles use cloud services, GPS, sensors, cameras and communication networks to function. They are connected to other Internet of Things (IoT) devices, which makes them vulnerable to cyber attacks and hackers.

While the automotive industry is grappling with the issue, governments are worried too. “A large cyber-terrorist attack targeting the operating systems of many self-driving vehicles simultaneously could cause mass casualties,” was the catastrophic scenario presented by British MPs after their own investigation into autonomous cars. The MPs concluded that self-driving vehicles pose cyber security risks “because of their connected rather than automated capabilities”.

IoT connectivity is the weak link

The automotive industry has been aware for at least a decade of how a malicious actor could remotely exploit vulnerabilities, including IoT functionalities and components which are not cyber secure, to invade user data and control a vehicle’s core functions, such as braking or accelerating. According to this article by a cyber security company, targets now include advanced driver assistance systems (ADAS), vehicle-to-everything (V2X) communications and over-the-air (OTA) updates.

And it is already happening. Automotive hacks in which attackers managed to control vehicles remotely and were able to exploit the system’s shortcomings to “put drivers, passengers or pedestrians’ life in danger” rose dramatically in 2024 to account for over 35% of all cases of automotive cyber attacks. “Threat actors are now deploying large-scale, sophisticated, AI-powered attacks targeting not just vehicles but also the EV charging infrastructure, API-driven applications and smart mobility IoT devices,” the report’s authors say. “This expanding attack surface demands a transformative, pro-active approach to cyber security.” As attacks on autonomous vehicles increasingly target V2X protocols rather than simpler elements of the vehicles, adopting a standards-driven approach to cyber security across the whole product lifecycle is deemed crucial.

A secure automotive future

Several established standards provide the building blocks for supply chain trust across the whole product lifecycle. The IEC 62443 series of standards defines requirements and processes for implementing and maintaining electronically secure industrial automation and control systems. These standards set best practices for security and provide a way to assess the level of security performance, including in transport. The approach to the cyber security challenge is a holistic one, bridging the gap between operations and IT as well as between process safety and cyber security.

ISO/SAE 21434 is the standard for automotive cyber security engineering. It provides a comprehensive framework to identify, assess and mitigate cyber security risks across the supply chain. The standard covers the entire lifecycle of a vehicle, from concept to decommissioning. Additionally, compliance with ISO/SAE 21434 helps manufacturers satisfy UN Regulation 155 (R155), which requires cyber security management systems (CSMS) to take a systematic approach to risk management.
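
At the heart of ISO/SAE 21434 is the threat analysis and risk assessment (TARA), which combines an impact rating with an attack feasibility rating to produce a risk value from 1 to 5. The sketch below is purely illustrative: the category names follow the standard, but the numeric weighting is an assumption, not the normative mapping.

    # Illustrative sketch of an ISO/SAE 21434-style risk lookup.
    # Category names follow the standard; the additive matrix is an
    # assumption for illustration, not the normative mapping.
    IMPACT = ["negligible", "moderate", "major", "severe"]
    FEASIBILITY = ["very low", "low", "medium", "high"]

    def risk_value(impact: str, feasibility: str) -> int:
        """Combine impact and attack feasibility into a 1-5 risk value."""
        i = IMPACT.index(impact)
        f = FEASIBILITY.index(feasibility)
        return min(5, max(1, i + f - 1))   # clamp to the standard's 1-5 range

    # Example: a remotely exploitable braking flaw scores the maximum risk.
    print(risk_value("severe", "high"))    # -> 5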

Compliance with such standards has become mandatory for manufacturers and their suppliers, making it crucial for all involved parties to have robust cyber security measures in place. While R155 is mandatory only in the EU, Japan and Korea, its requirements are used by most car manufacturers as a best practice across the globe.

The industrial cyber security programme of IECEE, the IEC System of Conformity Assessment Schemes for Electrotechnical Equipment and Components, tests and certifies cyber security in the industrial automation sector. IECEE includes a programme that provides certification to standards within the IEC 62443 series.

Maritime cyber resilience

Cyber attacks on shipping companies have become commonplace. Amid growing acknowledgement of the risk, few within the industry believe efforts to combat the threats are sufficient.

Maritime systems, like rail control systems, are often outdated and poorly protected. Ships are increasingly being targeted through their GPS systems, which can impact navigation. Their automatic identification systems (AIS) can also be hacked and even spoofed. AIS is an essential safety system which uses radio signals, aiding maritime vehicles to identify and keep track of each other, thereby preventing collisions. As much as it is essential, it can become a weak link for cyber attacks.

“A ship is like a floating factory,” describes one maritime information provider. “There are lots of systems that talk to each other, and they're not protected. Usually there's no encryption or authentication. It's easy to hack individual systems. It's just like hacking a Tesla, except it's much bigger.”

The IEC has published international standards specifying cyber security measures for ships. One of its technical committees, IEC TC 80, has taken on the role of developing international standards for the Global Maritime Distress and Safety System (GMDSS), an internationally agreed set of safety procedures and communication protocols used to increase safety and make it easier to rescue ships in distress. IEC TC 80 has issued IEC 63154, which specifies requirements for shipborne equipment to provide a basic level of protection against cyber incidents (i.e. malicious attempts which actually or potentially result in adverse consequences to equipment, their networks or the information that they process, store or transmit).

Hackers and cyber criminals are becoming more sophisticated and use new digital technologies to access data and take control of transport systems. But tools, such as international standards, are there to protect the transport industries as they invest to modernize. By following the steps described in the standards, many of the risks can be avoided.

 

HPA Award Winner: Damian McDonnell finishing colourist on Time Bandits

Interview and copy written for HPA

article here

Creating not just a whole new fantasy adventure world but one fantasy adventure world per episode was the herculean task set by Taika Waititi, Iain Morris and Jemaine Clement for the Apple TV+ series Time Bandits, and not least for colorist Damian McDonnell, who graded all ten episodes.

His standout work on the pilot won the HPA award for Outstanding Color Grading [Live Action Episode or Non Theatrical Feature] at the 2024 HPA Awards in November. “It was incredibly creative in many ways and I was given a lot of latitude to put my stamp on it,” McDonnell says. “What is notable about this show is that it is very look orientated. Every time the characters jump through a time portal the new story world had to look distinct from the previous scene and we kept that up for ten episodes. The grade was pretty much active the whole time to maintain consistency through the whole series.”

A reimagining of the 1981 classic by Terry Gilliam, Time Bandits tells the tale of an eleven-year-old history nerd who falls in with a ragtag bunch of time-travelling thieves as they plunder treasure from various episodes in history. The incredible world-building involves episodes set among the Mayans, the Neanderthals, mediaeval villages and Prohibition era America, which required around two million frames to be pulled for VFX plates.

“We set the main character’s hometown to be dull and all the other locations to be strong looks. It really was a creative endeavor. I feel like I had a lot of creative input in the show which is such a treat.”

Adding to the technical challenge were the inevitable logistical speed bumps: multiple clients, different time zones, multiple shoot locations, a changing team of crew members and the need to work at a distance. While the main shoot was at studios and locations in and around Wellington, NZ, the show’s postproduction services team, The Rebel Fleet (TRF) is based 600km away in Auckland. The production team included LA-based DoP Mike Berlucchi, co-producer Jake Rice and VFX supervisor Tobias Wolters, while the writing and directing team of Waititi, Morris and Clement are about as internationally mobile as you can get.

“There were a lot of moving parts,” says the New Zealand native who has a Resolve suite at his home in Wellington in addition to the facility in Auckland. “My main point of contact was Mike but there were two other cinematographers and different producers were involved. It was VFX-heavy. There was a lot of remote grading but it all worked exceptionally well.”

TRF provided on-set DIT, Video Assist and Dailies services across main and second units. This helped with collaboration on the shoots and keeping consistency on track. TRF incorporated an ACES 1.3 color pipeline that went right through to picture finishing. This meant everyone from dailies to VFX, grading, delivery and mastering – whatever their location – could be confident that what they were seeing was completely accurate and consistent.
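
In OCIO-based pipelines like this, that consistency comes from every department loading the same config and applying the same output transform to the same scene-referred data. Here is a minimal sketch of the idea using the OpenColorIO Python bindings; the config path and colour-space names are assumptions that vary by installation.

    # Minimal sketch: apply a standardized ACES output transform with
    # PyOpenColorIO. The config path and colour-space names below are
    # assumptions; a real show points OCIO at its own ACES 1.3 config.
    import PyOpenColorIO as ocio

    config = ocio.Config.CreateFromFile("aces_1.3/config.ocio")

    # The same processor everywhere means the same picture everywhere.
    processor = config.getProcessor("ACEScg", "Output - Rec.709")
    cpu = processor.getDefaultCPUProcessor()

    pixel = [0.18, 0.18, 0.18]      # scene-referred mid-grey in ACEScg
    print(cpu.applyRGB(pixel))      # display-referred Rec.709 values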

“Working on Time Bandits was a real privilege for us at The Rebel Fleet. It was a creatively ambitious show, and we loved being part of a team that brought such a bold vision to life. Our goal was to provide an end-to-end workflow that gave the creative team confidence—no matter where they were in the world—that what they were seeing was accurate, consistent, and creatively flexible. From on-set support through to dailies, onto VFX pulls and into Post, we focused on building a system that supported collaboration and helped maintain a clear visual through-line across every episode.”

— Mike Urban, CEO, The Rebel Fleet

“It was my first time working with ACES color management but it actually simplified the process quite a lot, particularly when handling so many VFX, because the output transforms were all standardized,” says McDonnell. “The best thing is when you can really trust the workflow. I could just concentrate on the grade.”

He continues, “I’ve done Dailies grading before but for me it’s mostly just a reference. People employ me for my creative input, to apply a fresh look. One reason is that once it comes from the edit, things are in a different order from how they were shot and originally viewed. But it has to be collaborative. Fundamentally, these aren’t my images, these are someone else’s. I’m only helping to get the best look.”

Here the relationship between colorist and cinematographer is crucial. “Mike and I developed a shorthand pretty quickly. One thing I really appreciated was his minimalistic approach. He wouldn’t overlight a scene and he’d always be open to creative discussion. He would embrace ideas and give me some ideas in return.

“I’m a big believer in pushing things beyond what you feel comfortable with and then pulling it back, because if you don’t go beyond that point, you don’t know that you’ve reached it. Mike was open to this approach and that’s how we developed the looks of the show together.”

TRF also organized deliverables including HDR and SDR finishes. “That’s a lot of work to do on the back end. As a colorist, I always appreciate the people who can take the technical delivery off my hands. Pete Harrow [The Rebel Fleet CTO] was great with doing all the colour science. Apple was really happy too, which was great.”

McDonnell graded all ten hours of the show. “It’s not really about control. It’s just that with a show this complicated, it was easier for me to simply do everything. I had to keep all the information in my head because it was always changing.”

He started his career over twenty years ago as assistant colorist on Peter Jackson’s Lord of the Rings: The Two Towers and The Return of the King and assistant editor on King Kong, moving to LA in 2008 as a colorist at Laser Pacific and then Technicolor on features such as Life of Pi, Iron Man 3 and Captain America: The Winter Soldier.

On his return to Wellington he worked at Park Road Post for eight years, including on The Hobbit, and was later the on-set colorist and supervising colorist on Mortal Engines. His recent work includes X, Pearl and Our Flag Means Death, also with Berlucchi.

Thursday, 10 April 2025

NAB show review: Tariffs, technology and legacy business in the spotlight

IBC

article here

Artist-led, AI-driven, fan-first media show the way forward at a NAB Show dominated by tariff-hit hardware vendors and an advertising-weakened broadcast sector
Trump didn’t just cast a shadow over NAB, he sucked the air from the room. Only 55,000 turned up in Vegas, a massive drop on the 92,000 who attended NAB Show 2019. Tariffs were the talk of the town as economic uncertainty gripped an industry already challenged to make ends meet.
“What the hell are we doing here?” asked NAB’s opening keynote speaker Stephen A Smith, presumably booked because he is ESPN’s leading broadcaster. Since he is also considering a presidential run in 2028, there were overt political messages too.
“There’s a reason folks all across the globe clamour to come to the U.S.,” he said, “but that is not a licence for us to be arrogant and dismissive of legitimate concerns about our republic. It is a reason to stand up and uphold our principles. Not enough politicians are doing this.”
He blamed the U.S media for partisan reporting. “We made the pivotal mistake of taking sides. We’ve got cable networks on the right and the left but what about the truth? Your rhetoric is feeding into the nonsense that disintegrates all 350 million American citizens.”
“I have a sports background but to me this is common sense,” Smith added. “Because of ideology, we’ve got too many selfish people whose sole interest is in going up against each other, not whether it’s right or wrong for the American people. That’s what ticks me off about politicians on Capitol Hill.”
An intersection of non-interacting people
NAB has always been a more parochial show than IBC since it is dominated by the interests of its owners at North American TV stations and cable networks. Like the broadcast business everywhere though it is struggling for identity and relevance in the age of the creator economy.
“It's not an industry so much as a collection of people who use media to do lots of things now,” said Barbara Ford Grant, an AI technologies strategy consultant to Marvel Studios and principal of her production company, BFG Productions. “The best way I can describe NAB is as an intersection of non-interacting people. There are sports here, American broadcasters, creators, robotics and virtual production, but all seem to operate separately.”
A traditionally trained VFX artist and the first woman to chair the Academy’s 92-year-old Scientific and Technical Awards Committee, Ford Grant has held leadership roles at HBO, DreamWorks Animation, Sony Pictures Imageworks, Digital Domain and Walt Disney Studios.
“Creatives are not part of the conversation, and that troubles me a lot,” she told a NAB Summit convened by the trade analyst Devoncroft. “When I visited shows like NAB and IBC in the past looking for new technology it was always grounded in the creative context of how we wanted to get our stories out to people.
“The most interesting creativity I saw at NAB this year was around live sports because they have this much more direct relationship with their fans than film and TV does anymore.”
There may not have been a lot of millennials in the LVCC but their fingerprints were everywhere.
“Ten years ago, it was Sony, Panasonic, ARRI, Grass Valley and Avid who were the big companies and broadcasters had to spend millions of dollars on their gear. Today the big companies are Adobe, Blackmagic and DJI. It’s clearly a creator-prosumer level industry now.”
Darren Long, a content supply chain director at ITV, spoke at a breakfast meeting sponsored by Dalet. He said the talk there was “a much-needed gut check for our industry”.
“Forget buzzwords for a second,” Long said. “We’re now firmly in a space where innovation, ROI and operational efficiency aren’t nice-to-haves, they’re survival tools.”
“Efficiency is the new currency. Forget hours of content; think in terms of how fast and how smart we deliver it. Broadcasters can’t scale beyond 24/7, but digital can. That’s where revenue and relevance now live.”
Long added, “Let’s stop just building products for the sake of it. Let’s start building clear capabilities within organisations — joined-up, efficient, and profitable — so we can get the right content to the right audiences in the most effective way.”
Vendors recoil from tariff hit
The shock imposition of tariffs had an immediate effect. Australian manufacturer Blackmagic Design was in town promoting its latest 12K camera, PYXIS, but after announcing on Friday 4 April that it would cost $5,000, the company raised the U.S. price by $1,500 over the weekend. Other BMD products were also up 32% for U.S. buyers, including the PYXIS 6K.
Most cameras, including those from Sony, Canon and ARRI, as well as lenses from makers like Leica, are assembled at least in part in Asia or Mexico, making price rises across the board likely.
Grass Valley has its manufacturing base just outside Montreal and said prior to NAB that if tariffs came into force it would have to increase prices to mitigate the impact.
At its NAB press event, GV executive chair Louis Hernandez Jr warned that all vendors needed to slash costs. “Not a little bit, not trimming. I’m talking about 30%, 40%, 50% to get this industry profitable,” he said. “That’s the challenge and that’s exactly what we’ve set out to solve.”
Tariffs only exacerbate existing challenges in the industry which, for Hernandez Jr, who also runs the VC group Black Dragon Capital, means the margins for broadcast production have bottomed out.
“As investors, we've been tracking the financials of this industry for a long time,” he said. “A slow, steady decline, still below net profitability overall. There are a lot more ways to consume, a lot more media, and therefore for every asset we create, every story, the revenue drops significantly because of the sheer number, and that created our problem.”
The American TV and film sector is likely to be affected most by tariffs, both directly and indirectly. That includes a potential ban on Hollywood cinema releases in China.
Ampere analyst Richard Broughton said, “Hardware products will likely face price hikes due to heavy reliance on imports from China. Streaming devices and TVs – often manufactured in or with Chinese partners – will likely become more expensive, dampening consumer demand and extending replacement cycles. Ad-funded media could also take a hit, as key advertisers consider pulling back spend as confidence is hit.”
Puget Systems, a systems integrator of workstations based just south of the Canadian border near Seattle, has temporarily paused orders for components that would be exported from affected countries.
“We are working with our supply partners to understand their strategy so we can better predict what our cost changes will be,” said president Jon Bach. CPU coolers and fans, for example, are hit with a 20% hike.
“Thankfully these are not very expensive items in the grand scheme of things, so it won’t have a large impact on system prices, but every dollar hurts!”
The bigger picture, however, suggests that far from pushing vendors to onshore production in the U.S., tariffs are likely to accelerate the transition towards software, cloud and services running on more commodity hardware.
Tom Morrod, Research Director and co-founder, Caretta Research noted, “There are going to be some vendors that get hit hard by the shifting sands of global trade just as many were hit hard by chipset availability and supply chain disruption coming out of the pandemic. But the vast majority of value is now tied up in managed and professional services, cloud compute and software, so if any industry is ready to ride this disruption out, it should be media.”
Mind the gap
As Morrod noted, if anything, tariffs are likely to push production towards cheaper products - faster. These include low-end portable cameras like the Ronin 4D or RED Komodo, software switching and cloud production tools – the sort of tools already used by creators.
“Studios working on $100 million or above productions have been in a really sweet spot, but now they're not doing well,” Ford Grant said. “They're taxed with having to make something that is a substantially better experience than what everybody else is doing, because you want to get audiences into theatres, but they also have to do it a lot cheaper.
“On the other hand, we’re seeing YouTube influencers like MrBeast having to figure out how to make 22-minute episodes. They have to have a supply chain, and they have to figure out how to evolve into a studio. The gap between what used to be completely different industries is shrinking. You can feel that on the show floor.”
Nowhere is this shrinking gap emphasised more than with AI which is putting the means of production in the hands of pretty much anyone.
“This is the age of the generalist,” said Eric Shamlin, CEO of AI-driven production studio Secret Level and co-chair of the TV Academy’s AI Task Force, during the SMPTE-produced summit. “The other thing we are seeing is it’s putting a spotlight back on the creative vision. … People can now create space operas in their bedroom. I think we are about to see a massive unlocking of human creativity…To be a creative, previously, was a very limited group. This blows that apart.”
The integration of AI-driven performance versioning tool DeepEditor into Avid, the industry’s “most trusted editing platform”, signifies a pivotal shift. As Nick Lynes, co-founder and co-CEO of AI company Flawless, told IBC, “2025 is the year when the dam breaks. Provided those AI tools are trusted, AI is transformational in an entirely positive way.”
Companies like Grass Valley though risk being behind the curve. CEO Jon Wilson said the company is getting feedback from its customers and will only adopt AI when appropriate.
“I'm not ready to say AI is going to be central to our strategy going forward, but it will be a core part of our strategy, because increasingly it's top of mind for our customers and accelerating in the discussions that we're having with them,” Wilson told TVTech.
Barbara Ford Grant said, “I listened to a lot of executives talk about how they're looking at AI for automated tagging, or they're thinking about doing this or they've started to do that. But entire cottage industries are going to exist in the time it takes them to move their MAM!”
New AI driven studios like Secret Level, Asteria and the Russo Brothers’ AGBO Studio could upend the Hollywood order.
“I think jobs are at risk, but I have a lot of positivity because I see the creative potential in businesses that are creator-led,” Ford Grant said. “The further your media business is away from the creative process and from the development of new IP and artistry, the further away you are from what's happening now.”

Wednesday, 9 April 2025

Practical advice for lighting the volume

IBC

article here


If virtual production is to sell the illusion of what’s being filmed, the LED lighting and background environment must be merged with physical sets and practical lighting as seamlessly as possible.

An LED volume not only provides an extension of the scene environment, it also acts as a massive light box. Light emitted by the walls can be used to create dynamic reflections that interact with the set and actors in real time. This lighting can be adjusted and fine-tuned using light cards as well as colour and brightness controls.

“While the volume is a great base source of lighting we highly recommend pairing it with traditional practical lighting for the best result,” says Jamie Sims, VP Projects Manager at MARS Volume. “This is where a skilled Unreal Operator can make a huge difference. Our Unreal Operators and VP Supervisors work hand in glove with DOPs and gaffers to achieve the creative vision.”

Dan Hall, VP Supervisor at Virtual Production Studios by 80six in Slough, says, “Candles, lamps, even fish tanks are fantastic examples of practical lights because they’re subtle and give you an accurate representation of how light will work in a room. Additionally, it takes the eye away from the background, which should not be the focal point.”

Soft and hard lighting

LED panels are ideal for creating soft lighting, which generates soft-edged shadows, but they can’t produce hard light: crisp, hard-edged shadows, spotlights or ‘beauty lighting’. This is where creative collaboration with a production’s gaffer and DoP is crucial to achieving the required look.

“While LED screens are an excellent source of interactive lighting and reflection they are behind on colour rendition when compared to today’s practical LED fixtures,” says Sam Kemp, Virtual Production, Technical Lead, Garden Studios.

Hard light is produced by a point-source light, such as a tungsten Fresnel or an LED point-source fixture. Consequently, a volume without any additional fixtures can’t produce hard light, so daylight scenes require practical fixtures to ‘sell’ the idea of direct sunlight.

Kemp notes, “Practical fixtures can replicate hard sources such as sunlight and also help to fill the spectral deficiencies of RGB LED panels. Standard lighting control protocols like DMX can be used from the engine for synced effects.”
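
DMX triggered from the engine typically travels over the network as Art-Net or sACN before reaching the fixtures. As a rough illustration of what that traffic looks like, here is a minimal Art-Net ArtDMX frame built by hand; the universe number, channel layout and broadcast address are assumptions for the example.

    # Rough sketch: broadcast one Art-Net ArtDMX frame so a fixture
    # listening on universe 0 receives RGB levels on channels 1-3.
    import socket
    import struct

    def artdmx_packet(universe: int, channels: bytes) -> bytes:
        header = b"Art-Net\x00"
        opcode = struct.pack("<H", 0x5000)    # OpDmx, little-endian
        version = struct.pack(">H", 14)       # protocol version 14
        seq_phys = b"\x00\x00"                # sequence, physical port
        uni = struct.pack("<H", universe)     # sub-uni / net
        length = struct.pack(">H", len(channels))
        return header + opcode + version + seq_phys + uni + length + channels

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

    levels = bytes([255, 96, 0] + [0] * 509)  # warm orange on channels 1-3
    sock.sendto(artdmx_packet(0, levels), ("255.255.255.255", 6454))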

Image Based Lighting

Image Based Lighting (IBL) is a form of pixel mapping that uses calibrated photographic (video) colour (RGB) information to generate subject and environment lighting. The technique – which some practitioners describe as a philosophy – uses images and lighting displayed on LED sets to produce realistic reflections and ambient lighting in a scene.

“The three main benefits are accuracy, time saving and control,” says Tim Kang, Principal Engineer, Imaging Applications at lighting vendor Aputure. “The biggest one for me is control. We’ve been chasing naturalism in lighting for 100 years but have only been approximating the real world. With IBL you can get the naturalism you want and you can control the variables and much more directly.”

Garden Studios has been using IBL since 2021, primarily for driving and VFX-heavy scenes. It has recently developed a workflow for tracking hard sources, allowing a sun source to automatically move around a car driving down winding lanes.

“The key is finding a good balance between IBL and traditional lighting controls; between the VP team and the desk op,” says Kemp. “Image based lighting doesn't really apply to specific sources when talking about practical fixtures (such as a normal light on a stand) and more to the conceptual control of those sources, such as mapping the colour and intensity of a video to a light fixture’s output colour.”
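
At its simplest, that mapping can mean averaging the pixels of a chosen region of the video frame and driving a fixture with the result. Below is a bare-bones sketch, assuming an 8-bit frame held as a NumPy array; the sampled region and the direct scaling to fixture levels are assumptions.

    # Bare-bones IBL pixel mapping: average a region of the frame that
    # feeds the LED wall and use it as a fixture's RGB output levels.
    import numpy as np

    def map_region_to_fixture(frame, y0, y1, x0, x1):
        """Return (r, g, b) levels from a frame region's mean colour."""
        region = frame[y0:y1, x0:x1].astype(np.float32)
        mean_rgb = region.mean(axis=(0, 1))   # average over the patch
        return tuple(int(round(c)) for c in np.clip(mean_rgb, 0, 255))

    # Example: sample the top-centre of a UHD frame for a ceiling fixture.
    frame = np.zeros((2160, 3840, 3), dtype=np.uint8)
    frame[:200, 1800:2040] = (240, 180, 90)   # a pretend sunset patch
    print(map_region_to_fixture(frame, 0, 200, 1800, 2040))  # (240, 180, 90)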

An accurate colour pipeline is key to matching colours, and this includes the pipeline for IBL. Allowing adequate time to complete camera calibration leads to a smoother shooting experience.

“Garden Studios calibrates its screens’ colour pipeline so that virtual fixtures lighting virtual content will correctly match their physical equivalents,” explains Kemp. “A colour meter helps match lighting from LED panels (e.g. from a ceiling panel) to physical fixtures, as does using DMX modes such as CIE-XY (a universal colour space representing the colour spectrum visible to the ‘average human’). Newer fixtures can define a source colour space when using RGB modes for pixel mapping.”

It's not always as straightforward as it sounds since identical LED panels might have been produced in different batches and therefore emit light differently.

“Assuming that the colour pipeline has been set correctly for the volume, we can pixel map lighting fixtures from the environment to ensure accurate colour replication,” says Hall. “But trying to match an LED panel and a lighting fixture that are in no way identical is extremely hard, as they display different colour gamuts. You must ensure your colour pipeline is set correctly and then dial it in by eye. You have to trust your trained eye to see what looks right or not.”
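
The reason CIE-XY modes help is that chromaticity coordinates are device independent: a metered xy value can be converted into whatever gamut a given fixture or panel uses, rather than copying RGB values between mismatched devices. Here is a small sketch of the standard conversion from CIE 1931 xyY to linear RGB; treating sRGB/Rec.709 primaries as the fixture gamut is an assumption, since real fixtures publish their own.

    # Small sketch: convert a metered CIE 1931 xy chromaticity (plus a
    # luminance Y) into linear RGB for a fixture. Using the sRGB/Rec.709
    # matrix as the fixture gamut is an assumption for the example.
    import numpy as np

    XYZ_TO_SRGB = np.array([
        [ 3.2406, -1.5372, -0.4986],
        [-0.9689,  1.8758,  0.0415],
        [ 0.0557, -0.2040,  1.0570],
    ])

    def xy_to_linear_rgb(x, y, Y=1.0):
        """CIE xyY -> XYZ -> linear RGB, clipped to the target gamut."""
        X = x * Y / y
        Z = (1.0 - x - y) * Y / y
        rgb = XYZ_TO_SRGB @ np.array([X, Y, Z])
        return np.clip(rgb, 0.0, 1.0)   # out-of-gamut values simply clip

    print(xy_to_linear_rgb(0.3127, 0.3290))   # D65 white -> roughly [1, 1, 1]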

Virtual and real camera team collaboration

The clear advice to production is to pair the DOP, Gaffer and Production Designer with the Virtual Production Supervisor at the earliest stage possible.

“We always recommend a pre-light before a shoot so that the gaffer and DOP can run through all of the shots and lock off any variables before the shoot day,” says Sims. “Working in a volume gives you so many possibilities, but we find that leaving the experimentation to shoot day is an unwise strategy, as it can lead to time running away on the day. A pre-light day is highly recommended to find what works, confirm approaches and lock everything off, so that when it comes to the shoot, everything can be achieved quickly and smoothly.”

It is also important for the Production Designer to be “synced” with the Virtual Production Supervisor from an early stage in production. Sims explains, “This is to ensure that the virtual set can be married up to the physical set that is being built. This becomes especially important when trying to make the line between virtual and physical set seamless. Once the set is built and in situ the VP team can then colour match the virtual environment to the physical set.”

Matching practical set and fixtures with virtual assets

Some of the biggest challenges on a virtual production set become abundantly apparent when trying to extend the physical elements of an environment seamlessly into the virtual world. The complexity of the challenge depends entirely on what you are trying to bring together and the illusion you are trying to create.

Sims cites the example of attempting to convincingly marry physical and virtual sets for the outside of a building. “You need to match up straight solid lines and subtle block colours so anything that isn’t bang on perfect or colour matched will be glaringly obvious. This also means your camera tracking needs to be inch perfect to avoid jumping or unwanted shaking.”

Less challenging environments are ones where the line between physical and virtual isn’t as strict, for example a sandy desert. Colour matching is vital here to sell the illusion.

“To overcome these challenges, we have to underscore the importance of the pre-light day, and getting up close and personal with your VP team at your volume stage. Construction collaboration is key here. The more time the VP Supervisor has to colour match with the set in position the better. Set build days and pre-light days allow for this care and consideration to be taken.”

Fighting on a freight train

Garden recently shot a fight scene on a moving freight train with its custom lighting controller, using a combination of IBL mapping, DMX cues and OSC variables (Open Sound Control, a protocol for networking sound synthesisers and other devices for musical performance or show control).

“As the train moves around corners and through a tunnel, a hard-source light array kept the sun in the correct relative position, flickering behind trees, and pixel-mapped LED tubes gave full-spectrum soft fill on the talent, automatically changing intensity in the tunnel,” Kemp explains. “Close-up fill lights were manually set; everything else could be fully automated.”
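
OSC is what allows cues like the tunnel change to be triggered as named variables rather than raw channel data. Here is a hedged sketch using the python-osc package; the address patterns and the controller's IP and port are assumptions defined by whoever builds the show-control rig.

    # Hedged sketch: fire a tunnel cue on a lighting controller over OSC.
    # The address patterns and the controller's IP/port are assumptions.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("192.168.1.50", 9000)   # lighting controller

    client.send_message("/train/tunnel", 1)      # enter tunnel: kill the sun
    client.send_message("/fill/intensity", 0.35) # dim the soft fill
    client.send_message("/train/tunnel", 0)      # exit tunnel: restore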

80six worked on a recent car shoot where the windscreen was taken out and therefore there was no LED ceiling for the shoot.

“Traditionally, when you shoot through a windscreen while someone is driving, there will be reflections of the sky on the windscreen,” Hall notes. “Because the shoot we were doing was as if the camera were inside the car and we only shot out of the lateral windows, we didn’t require an LED ceiling because there was no reflective surface.

“We put an old school light on a revolving wheel that spun in time with the plate playback to simulate the illusion of orange streetlights passing overhead. The colour of the orange sent to the fixture was selected from the footage of the driving plate.”