Tuesday, 12 May 2026

Creative Cities Convention: “See your background as an asset”

IBC

article here

The Creative Cities Convention in Liverpool, UK, featured a range of highlights, including the first public speech from Channel 4’s new CEO, strategies to strengthen working-class voices, and the latest updates on a burgeoning regional production base.

Breaking into telly is still perceived as a career path best suited to those with the right connections and financial support. However, despite – or perhaps because of – this, working-class voices were dominant at the Creative Cities Convention in Liverpool last week. 

Hometown heroes

Oscar and Bafta winners from the area talked up their working-class backgrounds as instrumental to their success. 

Liverpool-born Producer Jimmy Mulville, Co-Founder of Hat Trick Productions, grew up in a “two‑up, two‑down slum house as the fifth of 11 children. That stays with you.” 

The experience drove him to prove people wrong, he said. 

“I didn't fit in at home because I read books [and no one else in his family did]. When I got into Cambridge University, I felt I didn’t belong there. I knew I wouldn’t last five minutes at the BBC. Being an outsider is a great fuel for anybody in the creative industry.” 

Jimmy McGovern, Cracker Screenwriter and fellow Scouser, was born in 1949, “in an area that was totally poverty-stricken, but it was fine because everybody was poverty-stricken,” he recalled. “I wasn't even aware of poverty until I passed the 11 plus and I went to a school with the sons of bank managers and doctors. That’s when I understood. That's where I got my sense of social justice from.” 

Andrea Arnold, the Director of critically acclaimed social realist films like American Honey, Fish Tank, and Bird, grew up in a council house in Dartford and thought she’d end up working in a factory. 

“People who’ve had education and money grow up with a belief,” she said. “I didn’t have that. You have to build it yourself, and that takes time.” 

She added: “Everyone is capable of so much more than they realise, but people get told they’re not powerful. I want young people from backgrounds like mine to believe in themselves.” 

Tony Schumacher, the Writer of Merseyside-set crime dramas The Responder and The Cage, found success in the M&E industry via policing, taxi-driving, and hard graft.

“I grew up wanting to be normal, but what does that even mean? If you’re from where I’m from, you know you’re not ‘normal’ in the way the world defines it.” 

In Schumacher’s view, the social‑mobility ladder actually seems longer now than it did in the 1970s. 

“See your background as an asset,” he urged. “When you walk into a room full of people who are nothing like you, that’s your opportunity.” 

Uplifting universal narratives

Director Asif Kapadia (known for Senna, Amy, and Kenny Dalglish) spoke about being a working‑class, second‑generation immigrant from North London.

“At school in Hackney, everyone was from somewhere else,” he told the audience via video call. “I didn’t realise I was a minority until I entered the industry.” 

He got a grant to attend university. “If I had been starting out now, my parents couldn’t have afforded to send me,” he said. “My perspective – being brown, working‑class, and from an immigrant family – was different. That’s a strength. Be your own boss. 

“And speak up. On my first job as a runner, they were darkening white actors’ skin to ‘look Asian.’ I walked out, found two actual Asian people on the street, and brought them in. I didn’t care if I got fired. You have to stand up for what’s right.” 

Kapadia is currently working on 70 Up, the final instalment of the landmark documentary series, which has followed the same group of Britons every seven years since 1964. 

“The Up project tracked children from different backgrounds to examine whether social mobility was possible,” said Kapadia, who noted the series’ “deep connection to class, opportunity, and the stories we tell about ourselves” – themes that resonate strongly with his own body of work. 

However, even these established film and TV makers suggested that prejudice among decision-makers, who are often London-based, remains.  

“Just because you are working class or from a particular area doesn’t mean you can only write certain stories,” said Schumacher. 

McGovern added: “If you have empathy, you can tell universal stories.” 

PSB merger on the agenda 

Sir Philip Redmond, the creator of Grange Hill at the BBC, then Brookside and Hollyoaks at Channel 4, reiterated his belief that these two public service broadcasters (PSBs) should merge.

“Every single one of you can get behind the idea that we need a wider debate about what licence payers want from public service broadcasting,” he told the TV industry audience. “Public service broadcasting exists to do what we need it to do, not what others think it should do. That’s a social debate, and it needs to be pushed.

“The future of this £5bn, licence‑fee‑raising organisation is what’s at stake,” he added. “If we want to be unique and specific, the only thing we can do is make sure we have a strong, confident PSB – one that isn’t frightened to ignore the algorithm (of commissioning). There’s one thing I can’t stress enough. Everyone knows this shouldn’t just be about short‑term pressure.” 

Yet, there was pushback from the incoming Channel 4 CEO, Priya Dogra. Making her first public speech in the role, the former Warner Bros. Discovery (WBD) and Sky executive said: “I spent years in M&A, and the thing you learn is that there are no mergers, only acquisitions. Someone is always buying someone else. From Channel 4’s perspective, that’s the wrong outcome. It would mean Channel 4 being subsumed into another organisation. Losing Channel 4’s editorial voice and the impact we have on content and on indies would be a loss for society and for the creative economy.” 

She said the broadcaster was open to “strategic alignments” and that she supported the need for a financially stable outcome for the BBC in its charter renewal. However, she drew the line at introducing advertising around BBC content, whether linear or on YouTube. 

“Beyond the seismic commercial impact on us and other broadcasters, it risks undermining the BBC’s universality,” she said. “It could compromise what makes the BBC the BBC. We already have one commercially driven public service broadcaster – us. Creating another doesn’t strengthen the ecosystem; it weakens it.

“It would be helpful if the government took that option off the table and gave the industry some certainty – especially in an ad market that’s structurally challenged and volatile.” 

Strengthening the region

Finding the funds to develop production and filling skills shortages are two vital, perennial issues if the industry is to grow in cities across the M62 corridor. 

“There’s a lot more infrastructure in the north than there was in 2020, but we still have a long way to go,” said John Whittle, Managing Director at production company Lime Pictures. “We’re not only competing with the Southeast and other parts of England, but with Wales, Scotland, and Northern Ireland. The north of England is in a place where we have to compete and collaborate.” 

Screen Alliance North was established three years ago as a collaboration among four regional agencies (Manchester, Yorkshire, North East, and Liverpool). Its latest report identified 67 production companies operating across the north of England. It also found that £103m had been spent across the four regions since 2022 to make 285 productions. Further, it revealed that over 3,700 people have been trained on courses funded by the alliance. 

“When the BFI tendered for the skills cluster, we came together as a ‘super‑cluster’ representing more than 10 million people,” said Caroline Cooper Charles, CEO of Screen Yorkshire. “That has allowed us to work strategically, combine our knowledge of local crew bases, and make sure everything we do is evidence‑based and not duplicated across regions. 

“We all work closely with our local production communities, and we don’t want to lose that. However, the partnership has allowed us to bring more business into the north – not just from each other, but from across the UK and internationally. We’ve opened the industry to people who weren’t previously engaging in it but now can.” 

This City is Ours, The Responder, and Time are among the award-winning dramas that have helped drive growth in the Liverpool City region since 2019, according to the Liverpool Film Office. Its Impact Report 2019-2025 found the film and TV industry created 5,408 full-time equivalent jobs during that period, with more than 1,600 productions said to have added £150m to the local economy. 

BBC looks north 

Since the BBC moved to Salford 15 years ago, staff numbers have grown from 2,000 to around 3,500.  

Heidi Dawson, BBC Head of the North of England and Controller of Radio 5 Live, said: “I was one of those who moved. I grew up in Lancashire and went to the University of Manchester, but at the time, I had to live and work in London to build a career in the industry. Moving to Salford meant I could come home and do the job I wanted to do here. So, I want to challenge the misconception that it was just a bunch of Londoners travelling north.” 

Major departments like BBC Sport, BBC Children’s, and Radio 5 Live were there from day one. Breakfast TV and Morning Live have followed. Every BBC radio network also has a national programme coming out of Salford. 

“We’ve also got almost a thousand software engineers. The people driving major BBC products like iPlayer and Sounds are based here,” she said. 

Building a long-term home for production

To continue the region’s development, the next step seems to be to anchor productions in the city with a new studio, which is taking shape in a former Littlewoods building. 

The Depot, two 20,000ft² stages adjacent to the Littlewoods building, has been open since 2021. A further six stages and postproduction facilities are planned, provided that finance can be secured. 

Hat Trick Productions’ Mulville said he is working with the London Screen Academy (which provides 16 to 18-year-olds with vocational training in behind-the-camera roles) to create a film and TV education hub on the campus. 

“I approached the London Screen Academy and said: ‘It's a brilliant school, but if you keep it in London, you will become a stereotype. You've got to get this idea out to other places. Liverpool is ideal.’” 

Instead of importing craft talent from elsewhere to make shows in Liverpool, Mulville said: “Local people should work on productions made here.” 

He also expressed concern about the recent trend of BBC dramas portraying the city as a drug capital. 

“I’ve got a rom‑com set in Liverpool – I’ll ring the writer and tell him to stick a bag of cocaine in it, so it gets commissioned,” he joked. “Tell these stories, but tell the other stories too about families, love, community, women’s stories. Not just crime.” 

 


Eurovision 2026: NEP shares tech plan for seventh consecutive song contest outing

SVG Europe

article here

For its seventh successive Eurovision Song Contest, NEP is operating vision mixing from two OB trucks: UHD2 (from the Netherlands) and UHD24 (from Germany). Both are equipped to handle either the live acts or the interval acts and moderation, although in practice those roles are divided between the trucks and two directors. If one truck fails, the other takes over.

These are accompanied by a pair of vans from ORF supervising the broadcast music mix (one acts as a backup).

Because the complexity and volume of signal management is too vast to be contained in an OB truck alone, NEP has built a technical operations centre (TOC) on site.

There are 28 live cameras in the arena, of which 24 are Alexa 35 Live, augmented by four Sony FR7 and FX6 cine-style cameras (four handheld, one gimbal). Ten of the total are outfitted as RF cams, eight on tripods, two mounted on Scorpio 45 cranes and four on Steadicam. The total includes three railcams (remote dollies with telescopic towers), three PTZs and three aerial camera systems (two 2D systems and one RTS Rope Climber overhead tracking system).

In the trucks NEP is deploying a Grass Valley Karrera/K-Frame vision mixer with Lawo mc²56 and mc²66 on the audio side.

The router and multiview are controlled by NEP TFC, the company’s software management platform.

Screen control

NEP division Creative Technology (CT) has supplied the video wall – around 500 sq m of ROE panels in total – and controls it, as well as the media that runs on it during the show.

“What we’ve done here is special,” explains Karl Wigenius, head of video for CT Sweden and product manager for video on ESC 2026. “All signal distribution is ST 2110 natively, directly from the media servers to the screens. There are no baseband cables like SDI or HDMI and no conversion. That’s unique at this scale.”

This system has been developed over the past three years for Eurovision.

“We built a main and backup system so every system has a one‑to‑one backup. If something fails, we instantly switch,” he adds.

Monitor screens in the video control room show live values from 26 MediorNet network switches and roughly 480 devices connected over IP, which together send about 4.2TB per second across the network.

CT’s video screen control also integrates with the OB trucks, receiving signals, timecode via Riedel, and graphics feeds.

 

One wrinkle is that the rolling‑shutter of the Arri cameras makes synchronisation with the LED screens challenging. Wigenius says: “That’s new for this year, it’s something that we haven’t worked with before on this scale. It’s a learning curve certainly. The lighting team like a rolling shutter because they can play more with strobes and lasers but it’s more of a challenge. If you step frame‑by‑frame, the LED might already be on the next frame during the camera’s first capture.”
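Wigenius’ frame-by-frame observation can be illustrated with a toy timing model. The sketch below is not production code from NEP or CT; the sensor readout time, LED refresh period and offset are invented numbers chosen only to show how a rolling shutter lets the top and bottom of one captured frame see two different LED frames.

```python
# Toy model of rolling-shutter vs LED-wall sync: each scanline is read out at
# a slightly later time, so lines near the bottom of the frame can land in the
# *next* LED frame. All timing values here are illustrative assumptions.

def led_frame_for_scanline(line: int, n_lines: int, readout_s: float,
                           led_period_s: float, start_offset_s: float) -> int:
    """Index of the LED frame visible to a given scanline."""
    t = start_offset_s + (line / n_lines) * readout_s
    return int(t // led_period_s)

# 1080-line sensor with a 10 ms rolling readout, LED wall refreshing every
# 20 ms (50 Hz), capture starting 15 ms into an LED frame:
top = led_frame_for_scanline(0, 1080, 0.010, 0.020, 0.015)        # sees frame 0
bottom = led_frame_for_scanline(1079, 1080, 0.010, 0.020, 0.015)  # sees frame 1
```

With these assumed numbers, the top of the captured image shows one LED frame while the bottom already shows the next – the mismatch Wigenius describes when stepping through footage frame by frame.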

The Riedel network itself includes 230 analogue and 430 digital radios communicating over 60 channels. Riedel is supplying over 190 Bolero wireless intercom beltpacks, over 120 intercom panels, more than 80 network switches and the 26 MediorNet racks for signal distribution across the entire main venue.

Lighting and graphics

The ESC has switched for the first time to a fully LED- and laser-based lighting rig, which delivers CO2 savings. The design by Tim Routledge comprises 3,107 fixtures (mainly Ayrton, Martin and Robe) featuring 28,000 individually controllable LEDs, all supplied and rigged by Neg Earth Lights and Acme Lighting. Eighty high-speed winches are used for a moveable grid of lights – another first for the ESC.

“Tim has created a ‘ballet of lights’ above the main stage,” says Michael Krön of Austrian public service broadcaster ORF, who leads the host broadcast and is executive producer of ESC 2026. “Each light has its own winch and can move down to the stage floor, creating a dynamic, beautiful choreography of light. It’s something truly new.”

For graphics, ORF is using a range of Vizrt solutions, including Viz Engine, with virtual environments and AR graphics built in Viz Virtual Studio. Viz Engine also renders all on-screen graphics, including the data-driven graphics that display each country’s scores throughout the broadcast. 


Eurovision 2026: Arri and Riedel reflect on their Song Contest debut

SVG Europe

article here

From Mozart to modern pop, Vienna provides the heritage-rich backdrop for the 70th edition of the Eurovision Song Contest (ESC), which begins today and concludes with the grand final on Saturday. 

This year there’s a significant change to the workflow for the world’s largest music competition and one of the world’s biggest televised live events.

The introduction of Alexa 35 Live cameras, some 24 out of a total 28-camera plan, is intended to deliver new creative possibilities, but the priority is to ensure the 170 million potential viewers of the show don’t miss a beat.

It’s also the first major public outing and integration for Arri and its new parent, Riedel.

“We are the newbies in this game,” Florian Rettich, senior trainer and consultant for digital workflow support at Arri, told SVG Europe at the Stadthalle venue in Vienna. “It was a learning curve for everybody involved. At the beginning we needed to talk to people and explain the differences. Also we needed to learn what is the expectation because we come from a different background. Technically we can do it, but this is just the first reference. We’re proving we can fully integrate into a live environment while adding extra value.”

While early discussions about Arri’s involvement in the 2026 ESC surfaced last August, the tender document from ORF included both a traditional and a cinematic multicam workflow. It was only in January that the final decision was made, seemingly at the behest of the ORF creative team.

“We submitted both options and worked closely with Arri to integrate their Live Production System (LPS-1, consisting of a Fibre Camera Adapter and Fibre Base Station) into the OB trucks,” explains Axel Engström, technical project manager, NEP. “We essentially teamed up to make it possible.”

This process was straightforward, he said, with only eight of the cameras running over SMPTE fibre and the rest wireless or via SDI.

Broadcast vs cinematic constraints

Even then, the ESC production is equipping the Alexa 35s with standard broadcast lenses and transmitting in 1080i, which is not an ideal showcase for what any cine-cam can really do.

“It’s always a compromise,” admits Arri managing director, David Bermbach. “You need to match what’s required for the output. Creatively, of course, the ideal would be PL‑mount lenses and a progressive format. But for this environment, right now, this is the best approach.

“From the camera side, the output is actually 1080p internally, with fewer conversions before going into the interlaced signal. But yes, next time, we’d like to be involved earlier in the creative process. Then you can discuss lighting, camera positions, lens choices, and what changes if you go a different route.

“Again, it’s not our call. ORF or the creatives chose specific lenses for good reasons. We can offer input from our world, and we can also learn from theirs.”

Nonetheless, he believes there’s huge potential in the new approach. “We’re trying to bring cinematic philosophies into this [broadcast] world. Not just creatively but ways of working. But it’s a step‑by‑step process. The first thing is to make sure that we deliver the baseline.”

He indicated that the narrative production side of Arri’s business could learn from broadcast in terms of “workflows, automation and more risk‑averse processes”. “The HDR experience goes the other direction, from cinema into broadcast,” he adds.

One of the most visible changes this year is the use of LUTs to give each performance a distinct visual identity. Arri proposed more than 50 colour profiles, with around 30 ultimately selected (almost one per entry). “Think of it this way, we’re doing 35 video clips live,” says Rettich. “Normally you’d apply a colour grade in post‑production. Now we’re getting closer to that — but live.”
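The mechanics Rettich describes can be sketched in a few lines: a LUT is just a lookup table mapping input code values to graded output values, which is what makes applying a “look” live essentially free. The sketch below is an illustration, not Arri’s implementation – real camera LUTs are typically 3D colour cubes, and the gamma-style curve here is invented for the example.

```python
# Minimal illustration of a 1D LUT: map each 8-bit input code value to a
# pre-computed output value. The gamma-0.8 "look" below is an assumption
# made up for this example, not one of the ~50 profiles Arri proposed.

def apply_1d_lut(pixels, lut):
    """Apply a 256-entry lookup table to a list of 8-bit pixel values."""
    return [lut[p] for p in pixels]

# Bake the look into a table once, then apply it per frame at negligible
# cost - the property that makes "grading live" feasible at all.
lut = [round(255 * (v / 255) ** 0.8) for v in range(256)]
graded = apply_1d_lut([0, 128, 255], lut)  # mid-tones are lifted, ends pinned
```

Because the curve is precomputed, the per-pixel work at show time is a single table lookup regardless of how complex the grade is.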

It’s questionable, however, whether viewers will notice the difference, at least in this debut.

“They’ll see it’s slightly different, but they probably can’t spot why,” Rettich suggests. “It’s more stylised. A more post‑produced feeling.”

Bermbach emphasises that the collaboration with NEP and Riedel was central to making the integration work. “Working with NEP was outstanding. They were very open‑minded. Together, we can achieve things none of us could manage individually. There’s a lot of potential and we will develop this show by show to see what we can do together.”

In a sense, the collaboration harks back to the roots of the Eurovision Song Contest in 1956, which was established as a test of whether live television could be broadcast in multiple countries at the same time and as a way to connect audiences across borders.

It’s now a European and increasingly global phenomenon. Last year’s event achieved record-breaking viewership and engagement, solidifying its status as a major cultural event. The 2025 ESC, held in Basel, was watched by 166 million viewers across 37 markets (up 3 million from 2024) for the event’s three live shows. The grand final itself exceeded a 50% viewing share in 19 of the 37 broadcast markets, with Iceland leading at 97.8%. 

However, Iceland, Ireland, Spain and Slovenia withdrew in protest over Israel’s participation and won’t be airing the show domestically. While Bulgaria, Moldova, and Romania return to the contest, bringing the number of competing countries to 35, the combined TV audience is thought unlikely to top last year’s figure.

The technical set up at the Stadthalle began in earnest around two months ago and in the final week included several as-live rehearsals ahead of the three live programmes: semi-finals on 12 and 14 May and the grand final on 16 May.

“Normally, a production of this scale would have four to five years of planning; Eurovision gives you less than one,” explains Michael Krön of Austrian public service broadcaster ORF, who leads the host broadcast and is executive producer of ESC 2026. “You must build a budget before knowing the concept or even the venue. Meeting the financial goals set by the ESC board is extremely demanding.”

Another challenge was combining production specialists from LA, London, and across Europe with Austria’s own creative talent. He adds: “Many people want to show they can deliver the best in sound, lighting, and staging. Keeping our artistic vision intact amid so many strong voices is challenging, but essential.”

The wind sector plans future growth

 IEC e-tech

article here

Standards and certification are the invisible framework behind the wind energy sector’s global expansion and its tech advances, despite current challenges.

The global renewables sector is growing at extraordinary speed with the IEA expecting capacity to double between now and 2030, increasing by 4 600 gigawatts (GW). At the same time, the entire renewables sector faces extraordinary challenges. Few industries embody this tension more clearly than wind energy, where technological breakthroughs, geopolitical disruption and economic strains are unfolding simultaneously. “The renewables sector is growing and faces challenges at the exact same time, and wind is the prime example,” says Jonathan Colby, Chair of IECRE, the IEC System for Certification to Standards Relating to Equipment for Use in Renewable Energy Applications.

IECRE is one of the four IEC Conformity Assessment (CA) Systems. It was established a little more than ten years ago to provide third-party certification and testing services for all power plants producing, storing or converting energy from wind, marine and solar photovoltaic (PV) sources. The CA system ensures that the essential quality and safety requirements in standards are met and, as a consequence, that reliable performance can be expected from these systems. For the wind sector, that means checking that the industry aligns with the IEC 61400 standards, developed by IEC TC 88, the technical committee which produces standards for wind energy systems.

Overcoming current challenges

One of the main challenges for the wind sector is a widening disconnect between costs and market expectations. “Supply chain challenges, tariffs and geopolitical instability have driven up manufacturing and project costs. At the same time, energy markets continue to push prices downward generally, squeezing margins for developers and manufacturers alike,” Colby explains.

The world's installed wind power capacity now meets well over 10% of global electricity demand - with onshore wind accounting for over 90% of that, according to figures from the World Wind Energy Association (WWEA). By 2030, capacity is expected to double to 2 terawatts (TW), according to one global leader in wind turbine maintenance and blade repair. However, several factors are leading the IEA and the Global Wind Energy Council (GWEC) to revise their forecasts downwards. Earlier predictions were predicated on a rapid expansion of offshore wind capacity, but trade tensions, shifting political priorities and international conflicts are creating volatility in a sector built on long-term, capital-intensive investments.

“People have invested huge amounts of money into offshore wind, and then policy shifts come in and really complicate projects,” Colby says. “These are high-capital, long-duration investments - you can’t just turn them on and off.” Such instability can undermine investor confidence which is essential in a sector where the projects require billions in upfront capital to complete.

However, despite these challenges, the market for wind energy is still growing. In 2025, according to a report by the research group Climate Central, the US generated a record 853 210 gigawatt hours (GWh) of electricity from solar (46%) and wind (54%). The IEA has adjusted its outlook for China, reports global think tank Ember, though the country is still expected to account for around 61% of global operational offshore wind capacity by 2030.

China is leading the way

Chinese manufacturers are leading a rapid escalation in turbine size and capacity, announcing 25 to 30 MW turbine designs. Although ratings vary depending on load, wind speed and hours of operation, a single 30 MW turbine could power approximately 30 000 homes a year. One of China’s largest private wind turbine manufacturers has announced a plan to build a 50 MW floating turbine featuring a novel twin-head, V-shaped design.
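The 30 000-homes figure can be sanity-checked with a back-of-envelope calculation. The 40% capacity factor and 3 500 kWh annual household consumption used below are illustrative assumptions (both vary widely by site and country), not figures from the article:

```python
# Back-of-envelope check of "one 30 MW turbine ~ 30 000 homes a year".
# Capacity factor and household consumption are assumptions for illustration.
rated_mw = 30
capacity_factor = 0.40            # assumed; good offshore sites reach 40-50%
hours_per_year = 8760
annual_mwh = rated_mw * hours_per_year * capacity_factor   # ~105 000 MWh/yr

household_kwh_per_year = 3500     # assumed average household consumption
homes_powered = annual_mwh * 1000 / household_kwh_per_year  # ~30 000 homes
```

Under these assumptions the turbine yields roughly 105 GWh a year, which works out to about 30 000 average households – consistent with the article’s figure.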

Rotor diameters have also expanded dramatically, reaching up to 300 metres with blades of 150 m - roughly three times larger than many experts once considered feasible. Installing such massive structures requires specialized vessels and an installation infrastructure that are only just beginning to emerge. For example, Dutch and Chinese companies, among others, have recently launched vessels capable of supporting offshore wind turbines with capacities of more than 20 MW. “Imagine the size of the crane and barge you need to deploy a 50 MW turbine offshore. In some cases, those vessels are under development or don’t exist yet,” Colby comments.

Floating wind: the next frontier

While onshore wind still has room to grow - particularly in regions like China’s Gobi Desert - many countries are approaching saturation due to land constraints, planning challenges and public opposition. In addition, turbine sizes have outgrown onshore logistical limits. You can’t transport a 150 metre blade on land. Offshore, it’s easier to build by the coast and tow it out. “The potential of onshore wind isn’t completely exhausted but the industry is moving decisively offshore,” says Colby.

For offshore wind, fixed-bottom turbines remain the most cost-effective option in shallow waters. But as these sites become congested, developers are moving into deeper waters where floating systems are essential. “Floating is your only option in deeper water - and deeper water has the potential to unlock higher energy yields,” Colby adds.

Floating offshore wind turbines (FOWTs) can be deployed further from the shore, making them less visible to nearby dwellers - an added bonus. “The future of wind energy is going to include large floating wind turbines in certain markets where near-shore water depths don’t allow for fixed-bottom turbines,” says Colby.  

Revenue mechanisms like the UK’s Contracts for Difference (CfD) are also driving offshore growth. A new financial allocation round is coming in July with major funding for floating and fixed‑bottom offshore wind. Globally, new offshore projects are announced almost daily with schemes in Greece, the Baltics, Scandinavia, Spain, Chile, Japan and Korea all detailed in the last few months alone.

Innovations for end-of-life management

While turbine size still dominates the headlines, innovation is happening across multiple fronts. One of the most pressing challenges concerns what to do with turbine blades at the end of their life.

New materials that can be recycled rather than discarded, particularly thermoplastic composites (which can be melted down to extract reusable resin), are now more widely used in turbine blades. At the same time, developers are looking to extend turbine lifespans beyond their original design limits.

Standards and conformity assessment work hand in hand

Lifecycle assessment, reusability and recyclability are areas of interest to both TC 88 and IECRE. The latter is about to launch two new schemes: one covering asset management of large projects, the other through-life management and recycling. TC 88 has developed a new technical specification to support this work, IEC 61400-28.

Elsewhere, engineers are experimenting with new construction approaches, from modular towers assembled offshore to the reintroduction of wood as a structural material. Floating wind, meanwhile, is driving innovation in subsea systems. “You’ve got cables that are moving constantly - there’s nothing to clamp them to, as in fixed systems,” Colby says. “Developing dynamic cables and reliable underwater connectors is critical to making floating systems viable at scale and the IEC is developing the standards for that.”

As technology evolves, the frameworks that underpin the wind sector are under increasing pressure to keep pace. In this environment, standards and certification are becoming central - not just to ensure the safety and performance of wind energy systems, but also to reassure investors and regulators.

IEC TC 88 has just published the first international standard for floating wind design. IEC 61400-3-2 outlines essential design and safety requirements for FOWTs, ensuring their engineering integrity and protection against various hazards during their planned lifetime. The standard is crucial for the safe and efficient deployment of floating wind structures, addressing design scenarios and load cases to meet global industry needs.

Historically, standards in renewables have focused on individual components - ensuring that turbines, blades and electrical systems meet defined specifications. Increasingly, however, the industry is shifting toward a more holistic approach.

“The newest IECRE wind energy deliverable is project certification, which is just starting to take off - and I think it’s going to be very important, especially for the offshore sector,” Colby explains. Project certification expands the scope from individual components to entire systems, covering everything from design and construction to installation and maintenance and site-specific considerations. This reflects the complexity of offshore wind, where risk arises not just from equipment failure but from how systems interact in challenging marine environments.

In this context, certification becomes a critical tool for reducing uncertainty. “It’s going to reduce risk, increase quality, and help guarantee performance - which in turn drives insurance costs down and brings investors to the table,” Colby says. The financial implications are significant. Even small increases in perceived risk can deter investment in projects of this scale. “Any volatility or uncertainty, and investors start to pull back immediately,” Colby warns.

Still early days for hybrid systems

Hybrid renewable systems using combinations of wind, solar, wave and hydrogen, are widely discussed but commercial rollout is slow. Wave energy projects, in particular, are still at a trial stage, making developers cautious about integration.

“It is a little avant-garde at the moment to talk about hybrid systems,” Colby says. “The problem is that wave systems exhibit a lower TRL (Technology Readiness Level) for marginal MW output. There's a lot of reasons why a hybrid system makes sense, but the wave community needs to show thousands of operational hours of reliability and real strong performance before major wind owners are going to take on the risk of integrating wave components into their billion-dollar offshore windfarms.”

More promising are integrated systems designed from the outset to combine technologies, as well as offshore wind paired with hydrogen storage. One project being installed off the coast of Gran Canaria comprises a 4.3 MW wind turbine and a hydrogen system with an electrolyzer, hydrogen energy storage, a fuel cell and a battery system.

Offshore floating photovoltaic (OFPV) is likely to integrate more quickly with offshore wind since floating PV, first introduced in land-based environments like reservoirs, is a proven technology. (For more on floating solar tech, read: The bright future for floating solar tech | IEC e-tech). The outlook for the wind energy sector in the years to come is therefore full of promise despite short-term challenges.

 


Monday, 11 May 2026

Avoiding Action Fatigue: How a Fight Scene From Normal Was Shot to Stand Out

American Cinematographer

My interview and words article here

The action-thriller's cinematographer shares a lighting plot and details his use of strobes, practical rain and a global shutter to capture a blizzard-bound set piece. By Armando Salas, ASC

In my first conversation with director Ben Wheatley about Normal, he said, “Think High Noon shot in the '70s, and all the action takes place in a blizzard during a power outage.” Given that many of the sequences would naturally fall into the same lighting palette using candles, flashlights, moonlight and a lot of atmosphere, our second conversation was about how we would avoid action fatigue. Ben was adamant that each action beat needed its own visual stamp. The hardware‑store fight is the clearest example of how we approached that.

This sequence takes place in the middle of the night as Sheriff Ulysses Richardson (Bob Odenkirk) and a pair of bank robbers (Reena Jolly and Brendan Fletcher) — both of whom have become Ulysses' unlikely allies — are being hunted through the blizzard. Bob is shot at and blown through the window of a hardware store. That impact sets off a backup alarm system and triggers slow-firing strobes throughout the store. He has an extended hand-to-hand fight with the hardware-store owner (Carson Nattrass), during which a sprinkler head gets knocked off. So, now it’s raining inside while a blizzard rages outside. The combination of strobes, rain and snow created a completely different look from the rest of the film.

The blizzard that serves as the backdrop for the hardware-store fight sequence was captured on a 150' street set built on soundstages in Winnipeg, Manitoba. Night-gray backings and heavy atmosphere allowed the actors to disappear in 40', helping to depict the scale of the snowstorm entirely on stage.

Setting the Stage

All of this was done on stage. Production designer Jean-Andre Carriere oversaw the build of 150 feet of street and storefronts on soundstages in Winnipeg, Manitoba. That, in and of itself, presented a challenge because the set was built fire lane to fire lane, so I shot some “visibility tests” during prep to make sure I could make the street recede into nothingness.

We hung night-grey backings at either end of the street and filled the stage with enough atmosphere that an actor could be perfectly resolved in a medium shot on a wide lens, turn and run from camera, and disappear in 40 feet. Since that look defined several sequences, the hardware‑store fight needed to feel completely different.

Early on, I pitched the notion that the store might have a backup battery powering its alarm system. Ben immediately said, “I love it — and you know what looks great with strobes? Rain.” My first reaction was, “Rain from where?” His answer: “We’ll knock out a sprinkler head.”

Special effects had to plumb the sprinkler system and engineer a practical gag where a crowbar strike would break the vial in the sprinkler head and start the water flow.

We tested water pressure extensively. Too much and it looked like a monsoon, too little and it was just mist. Then there was the constant battle of keeping lenses from fogging and balancing the temperature of the water with the temperature of the stage.

Lighting and Programming Cues

For the strobes, the art department cut four‑by‑six‑inch holes into the set and installed glass diffusers so they looked like part of the alarm system on screen. Behind each diffuser, we mounted Prolycht LED Fresnels, pushed right up against the glass.

Gaffer John Clarke and I tested a variety of burst patterns. What worked best, especially once we added rain, was a fast burst with a slow chase across the set. It took just under a second for all five strobes to fire. The effect was chaotic but readable, and it gave us those beautiful flash frames that freeze the action mid‑impact.

Ambient room tone came from Arri Orbiters bounced into white cards, with selected ceiling tiles removed to create soft overhead ambience. But the strobes alone weren’t enough to light faces, so we added fixtures that operated at low intensity for modeling, then jumped to full output when the strobe chase passed through them. Every light had to be tied into the cue. Ami Buhler, our console programmer, was instrumental in making that system work.

Why Global Shutter Was Essential

The combination of strobes and rain made camera choice especially important. We chose the Red V‑Raptor XL (X) specifically for its global shutter — essential not only for strobes, but for rendering flashing police lights in a snowstorm.

I wanted to avoid broken frames in which half the frame rendered frozen, backlit snow while the other half fell to black. Fixing that in VFX would have been prohibitively expensive; the V-Raptor captured the frozen snow in a very natural way.

To help capture shots such as this, which depicts the hardware-store owner (Carson Nattrass) amid the rain, the art department cut holes into the set walls and installed glass diffusers to mimic an alarm system, behind which the lighting team mounted Prolycht LED Fresnels that fired in a fast-burst, slow-chase pattern — all five strobes completing their cycle in just under a second.

I shot 8K for a 4K finish to get the benefit of low noise, full use of the sensor (supersampling) and a really beautiful image even at 1,600 ISO.

We primarily shot with two cameras, but in the hardware store the action was so specific and the aisles so tight that we could only squeeze in a second camera for a handful of shots. My A‑camera operator, Matt Schween, was a key collaborator and shot mostly handheld for this sequence. The camera becomes a third participant in the fight, and Matt had to anticipate movement without revealing that he knew what was coming. His instincts were impeccable.

I shot the show on T-Tuned Tribe7 Blackwing7 primes. This was my first time using that lens set, and I loved the character they brought to the image. For longer focal lengths in exterior work, we carried Arri Signature Zooms, but the hardware‑store fight was all primes.

In-Camera Effects

The show LUT was developed with my longtime collaborator Ian Vertovec (supervising colorist, Light Iron). We built a film emulation LUT with deep shadows and noticeable grain, which was very much the aesthetic Ben wanted for the 1970s and '80s films we were referencing. The atmosphere from the blizzard lifted the shadows and added texture. The final look was very close to our show LUT. For final color, we graded in London. I worked remotely with Rob Pizzey, using Sohonet’s ClearView sessions and Frame.io reviews.

Ben wanted the action in this scene to escalate through the props, which also allowed the comedy to emerge. The stunt team would pre‑shoot the fight on video while we were filming on another set, and we’d refine the set pieces based on that. It helped that Bob is incredibly skilled at selling choreography and giving the camera exactly what it needs.

There’s a grisly practical gag right at the end involving a penny nail and a prosthetic eye piece, which I love. Every film Ben referenced was from the '70s and '80s and built on practical effects, so shooting effects in-camera was baked into his vision. CG was always the last resort.

It’s a fight in the middle of a blizzard, inside a store that’s raining, lit by strobes. It’s chaotic, but it’s controlled chaos.

 

Friday, 8 May 2026

Intelligent by design - Sir David Attenborough interview

IBC Daily Executive Issue 2011

Sir David Attenborough explains the vital role technology has played in the evolution of natural history filmmaking.

“Watching time lapse photography of flower buds opening is marvellous in 2D but in 3D it really is transcendental,” says Sir David Attenborough. “You experience the sensation of being able to touch the plant.”

Technology has come a long way since Attenborough’s first foray into filmmaking in the 1950s, but his relationship with innovative recording equipment has been almost as intimate as it has with some of his subjects. While the 85-year-old is, among other things, a world-respected broadcaster and iconic presenter, synonymous with the natural history genre he has helped define, his success has in part been built on a natural affinity and enthusiasm for new storytelling tools.

“In 1952 TV was regarded by the BBC’s governors as essentially electronic,” says the distinguished recipient of IBC’s International Honour for Excellence. “The serious business of broadcasting took place in TV studios – or on radio – and when you asked for extra money to make a film the reaction was ‘what for?’. It was extremely difficult to get funding.

“TV was 405-line monochrome and the telecine machines only had one gauge – 35mm – for reels which cost a fortune and weighed a ton,” he recalls. “It simply wasn’t possible to make the sort of film I wanted to make in Africa in 35mm so I asked to use 16mm and was told in no uncertain terms that 16mm was for amateurs.

“I persisted and put my case to the head of TV who finally agreed to allow me to take 16mm on location (for Zoo Quest) and I became the first user of that format at the BBC. The difficulties didn’t end there because the standard that BBC staff cameramen worked to was also 35mm and most wouldn’t touch 16 with a barge pole so I had to employ freelancers.”

Attenborough recalls that the Cine Special he took on his first expeditions used a side-loading reel running 2 minutes 40 seconds, that it operated in just 40-second bursts and only then by winding it up.

“As you can imagine it was extremely difficult to film any kind of continuity of animal behaviour and there was no way of synchronising sound with picture so I couldn’t talk to camera,” he says.

In the mid-1960s Attenborough was appointed Controller of BBC2, during which time he also helped usher in colour transmissions.  But programme making and the natural world was too big a draw and after resigning in 1973 he spent three years planning and filming Life on Earth which debuted in 1979 to universal acclaim. It was the most ambitious project yet achieved by the BBC Natural History Unit.

“Natural history is full of marvellous opportunities for colour and showing birds of paradise in black and white was a very frustrating thing to have to do,” he says. “Colour meant natural history could be so much more visually exciting and Life on Earth was expressly designed to take advantage of this.”

Another major technical advance, crucial to the evolution of the genre, was the commercial jet plane. “Suddenly you could schedule round-the-world trips to capture environments at different times of the year where previously a trip to Australia to film a three-minute sequence of some owls was just not practical.”

Cheaper, faster flights made 13-hour-long series economical. “Every series that followed was a response to a technical advance,” he says. “The Private Life of Plants was a direct response to the introduction of servers, which allowed you to film continuously over long periods, to generate amazing time lapse footage. Infra-red cameras enabled us to film the nocturnal behaviour of mammals (The Life of Mammals) that had not been possible before. Cool lighting systems and highly light-sensitive cameras allowed us to film insects on a macro scale whereas previously the poor things had been scared by the light and heat and had exhibited abnormal behaviour.

“Digital of course made a huge difference, with cameras that can operate soundlessly without scaring wildlife, that you can record hours of footage on just waiting for something to happen, and that are also miniaturised so you can put them in hides.”

The last big technical change, he suggests, was the introduction of gyro-mounted helicopters which enabled aerial shots of wildlife activity, of wolves hunting buffalo for example, that had never been seen before.

More recently Attenborough has been pioneering 3DTV documentaries, closely involved in the production of Flying Monsters 3D (Atlantic Productions for Sky) which was the first of its kind to win a BAFTA.

“3D works under certain circumstances and depends on how you produce it,” he says. “Right now, the technology is limiting although the results can be liberating. It takes four to five people to lift the cameras and 30 minutes to change a lens, which is no way to react to fast-moving animal behaviour. The systems are very temperamental, which means you could be sitting around for an hour and a half while the cameras are aligned.

“Frankly it’s a nightmare to anyone accustomed to crawling through the bushes trying to get close to nervous animals with a small camera capable of capturing sound and vision. You can’t do it with 3D – at the moment.”

The restrictions on the use of long lenses, which tend to create a flat, cardboard-like effect, are equally limiting.

“If you ask a cameraman to go to South America and film landscapes in 3D but they are not allowed to use lenses longer than 75mm, then they are simply not going to be able to bring back content as good as it would be in 2D. Audiences’ expectations will be let down and that is to be avoided at all costs.

“That’s why you have to choose your subjects carefully to exploit the value of 3D. I chose to work with fossils (for Flying Monsters 3D) and deliberately that of a creature which moved in three dimensions so that its 40ft wing span can fly over the audience if you wish it to.”

Atlantic Productions’ follow-up is Bachelor King, a 3D narrative documentary currently in postproduction about penguins on the island of South Georgia. Again, it was Attenborough’s choice.

“The thing about 150,000 penguins on a beach is that they look identical so that when they move or you lose track of them during filming you can simply construct the story from another one,” he says. “The beach’s other inhabitants included seals, and they don’t move too fast either.”

He believes that the future for 3D depends on technical development. “At the moment the big problem with 3DTV is that you have to wear glasses which occlude light. That means you can’t see the person next to you, read a newspaper or do anything you would normally do other than watch the TV and that means you have to have event programming.”

Although he has just completed another landmark production, Frozen Planet, following the cycle of the polar seasons, which took three years to film and saw him film at the North Pole for the first time, Attenborough continues to search for new worlds to explore.

“We still know remarkably little about the deep ocean because our technology is limited by pressure at depth – but that would make for an incredible subject,” he says. “As for the next thing after 3D, that could be holograms. Imagine that – creatures popping out of your TV and appearing in your living room.”

Thursday, 7 May 2026

Mathieu Kassovitz tells filmmakers: “Adapt or die”

IBC

article here

The French director and actor has ditched CGI for AI in his next film and says AI is part of the creative dialogue

There’s no more propulsive, streetwise or analogue movie than 1995’s Cannes-lauded classic La Haine (Hate), but its writer-director Mathieu Kassovitz now believes actors and all below-the-line crafts could soon be superseded by AI.
So much so that he is rewiring the production of his latest film with AI from set design to performance.
“I’m preparing a film that simply couldn’t be made without AI,” he says, describing The Big War, his adaptation of French comic-strip artist Edmond Calvo’s 1945 masterpiece La Bête est Morte! Featuring animal characters and set during the Nazi occupation of France in World War II, Kassovitz’ version of the satire will be photorealistic.
“The comic is very Walt Disney, but I’m making a very realistic film,” he explained to IBC365 at the World AI Film Festival in Cannes. “Realistic rabbits in realistic forests, with tanks, with the human war in the background, and rabbits fighting big bad wolves in the forest.”
Quotes from VFX studios in France and the US ranged between $50 million and $60 million. “For a film for children, but about war, that’s not a risk I want to take. Not a risk financiers should take. And it’s not good for cinema. I like films whose commercial ambition matches their budget.”
During the 3–5 years of prep, “AI arrived” and everything changed. “I subscribed to one of these models and typed: ‘little rabbit in a burning forest’. The images weren’t what they are today, but already they were astonishing. And that completely changed how I thought about the film.”
What AI unlocked was not just cost reduction — though he says the film’s budget has been slashed by almost two thirds — but creative freedom.
“Traditionally, making this film would require going into forests with a crew, filming ping-pong balls placed where the characters will be, then a team comes to volumetrically scan the place, then I wait six months before seeing my characters in the environment. It gives me very few editing choices. And it’s a physically demanding but not very interesting shoot. No actors; just a soundtrack and ping-pong balls.”
“Now, I can generate and iterate on images much more freely. For example, my characters are 70 cm tall. If I shoot at 35mm from 1.2 metres, I see their feet touching the ground.  Showing just that simple detail of contact with the ground explodes the cost of a VFX shot. Previously, I had to avoid this. Now, all 760 shots can be staged differently. I can generate the environments and the character as I want them and with far greater naturalism than the best VFX today.”
Kassovitz recently paused pre-production to pour resources into a new studio and into hiring coders and engineers to program and train bespoke tools for the film.
He compares the moment to George Lucas inventing the tools he needed for Star Wars. “He had a script, but the VFX of the time weren’t good enough or were too expensive, so he built his own tools.”
“For the past couple of years, we’ve been working with designers to develop our characters, costumes, and environments. We create everything first, then use AI as a tool to build on that foundation.”
The challenge isn’t generating images. “AI can give you a 100 million dollar image on your screen in five minutes,” he says. “The challenge is control. If I ask a character to move left, will it move left consistently? Can I actually make a full film this way?
“To do that, we need to build tools on top of existing models—layers, APIs, systems that allow us to control outputs precisely for filmmaking purposes.
“Right now, AI is like the early internet. A powerful framework that’s open to everyone, but still chaotic. The real value comes from building the tools that sit on top of it—the ‘clients,’ in a way—that make it usable.”
For Kassovitz, AI is not replacing the collaborative nature of filmmaking; not yet. “Cinema is never the work of one person. It’s 50 people… actors, technicians, editors and everyone brings something to the table. Now we have something that we can all gather around and say: ‘here’s what I give you, what do you give me back?’ And from there, a creative ping pong begins.
“If I were someone like Ridley Scott, maybe I could write my own cheque. If I were Terry Gilliam, maybe I could illustrate characters and sets, but I can’t. I rely on people who are good at that to provide the spark, which we then feed into the machine. The AI expands and refines those ideas. It isn’t just executing instructions. It becomes part of the creative dialogue.”
And yet, Kassovitz has seen something in AI that startled even him. Taking out his phone he shows IBC365 a video from a 20-minute AI-made film of a wizened old man staring back in close-up.
“I felt an emotion that gave me chills. Genuine emotion in the character’s eyes, just like you would expect from a human actor. That’s when I thought, we have a problem.  Because honestly, I’m not sure you could always get that from a real actor in the same way. In fact, you cannot have an actor that looks like that.”
The implications for actors as well as technical crafts are profound. “They have to adapt or die,” he says. “That’s how it’s always worked. When blue screen came in, it changed sets. When digital cameras replaced film, it changed workflows. In every revolution you have some artisanal layers that have to evolve.”
That doesn’t mean everything disappears. For example, stop-motion animation still exists. “It’s slower, more labour-intensive, but it has a unique quality that audiences can feel. If you’re passionate enough to spend years on something like that, it will show and it will be different from what AI produces.
“There will always be credits. There will always be heads of department. Unfortunately, some roles will disappear or be recycled elsewhere. But it’s like when digital arrived: suddenly cameras became accessible. Before digital, you couldn’t touch a camera unless you were certified. Today it’s accessible to everyone. Is that good or bad? I think it’s good.”
He predicts the rise of AI “superstars” - entirely digital actors with millions of followers. “I guarantee you that in a few years people will want to see the Tom Cruise that they have in their head and not the real Tom Cruise… and Tom Cruise would say, you know what? Give me my money, have fun.
“In my case, for The Big War, I’m not really using traditional actors on screen. I’m working with animated characters like rabbits and animals. The actors I do work with are voice actors. But even there, AI gives me more freedom. I don’t have to put performers in motion-capture suits with cameras strapped to their heads. I can focus more on creative direction and performance.
“In a few years, audiences may not even question it. Younger generations won’t necessarily care whether something is AI-generated or not. Unless it’s explicitly labelled, they may not even be able to tell the difference.”
He is blunt about the industry’s anxieties. “Yes, AI can produce a lot of crap, but humans have produced crap for 40 years too. What matters is authorship. The only limit is my taste. I feed the machine with my inspiration, and I must get back something that feels like what I’d get from a human team.”
Kassovitz was dismissive of attempts to reward filmmakers for AI models trained on their work – even his own.
“In La Haine I stole shots from Scorsese who stole from Kurosawa who stole from Eisenstein. That’s the fucking rule.”
A scene of Vinz (Vincent Cassel) posing with his hand as a gun in front of a mirror is an acknowledged lift from Taxi Driver. A rooftop scene where one of the characters clicks his fingers to turn off the lights on the Eiffel Tower “like they do in movies” is another. “I stole that from an Italian film,” Kassovitz says.   
“What matters is intent. If I see some guys that are doing La Haine… and they’re doing some stupid shit with it, of course I’m gonna say, what the fuck are you doing? I’m going to sue you. Come and ask me, you can have my permission, we can work on it. No problem. But theft is theft with or without AI.”
He admits to having lost some love for cinema in the decades since La Haine’s release “because VFX are everywhere, even in intimate films. I no longer know what I’m looking at,” he says. “Is it real snow? A real car? A real apartment? It kills emotion. But now we’re no longer in fake reality — we’re in recreated reality.
“We were lucky to know analogue. To go from 16mm to Super 8, and 35mm to 70mm. The next generation won’t know that, and that’s tragic. But it will push people to be more personal, more creative — to do things AI can’t do.”
The future, he believes, lies in specialised cinematic models: “If we create cinema‑specific models trained only on films — millions of hours of art — we might create the last artistic tool we’ll ever need.”