Thursday, 8 January 2026

Multi-camera live virtual production on a broadcast budget

IBC

German broadcaster SWR claims a world-first live multi-camera virtual production, with potential learnings for broadcasters everywhere.


Single-camera live virtual production is well established in TV, but production using multiple cameras has been reserved for dramatic content – until now.  A recent trial at German regional public broadcaster Südwestrundfunk (SWR) claimed the world's first live virtual production using three tracked cameras. The results were noted by fellow domestic public and private broadcasters, the EBU and others further afield, including Norway’s NRK.

Part of the ARD network, SWR launched the three-month trial in September 2025 to explore the potential of virtual studio production.

“We wanted to evaluate the role that virtual production can play in future in-house productions – whether for entertainment, culture or information,” explains Michael Eberhard, Technical Director, SWR. “The aim is to make public broadcasting fit for the future through technological innovation and to reach new target groups – especially on digital platforms – in an economically sustainable manner.”

Challenges for TV productions

TV shows made using virtual production are typically produced with only one camera since the LED wall can only represent the perspective of one camera at a time.

A television studio needs at least three cameras for meaningful use, which means the vision mixer and director see the perspective of the other cameras only after the cut. Because the wall cannot display the matching background for every camera at once, the camera crew also can't line up their shots against it. The challenge is compounded because cameras must be switched seamlessly, with the virtual background instantly matching whichever camera has been selected.

This is what the proof of concept at SWR set out to conquer. In Studio 6 at the broadcaster's headquarters in Baden-Baden, a 10x4m Crystal LED Verona wall was installed, working in combination with three standard studio cameras (Sony HDC-5500s), each with an Ocellus tracking system and a dedicated Unreal Engine. A Brompton server controlled the display of graphics on the LED wall. The broadcaster used its existing Sony switcher. The planning, construction and calibration of the wall were carried out in collaboration with the Austrian AV and rental company AV-Professional.

The three-month test began with 'Fehler im System' (Errors in the System), a narrative role-playing show created in collaboration with production company Midflight Productions. Six virtual sets, built in advance in Unreal Engine, were designed to appear integrated with physical objects, including a table and props, placed in front of the wall.

Technical solution

A separate image of the 3D environment is rendered for each camera. Each odd-numbered frame on the wall shows the 3D environment of the camera currently cut to programme. Each even-numbered frame shows a pure blue image.

Since the LED wall switches 100 times per second between blue screen and 3D environment, each camera ran synchronously at 100fps, simultaneously generating two 50p signals – one with the 3D environment and one with the blue frame. To do that, the Sony cameras were upgraded with a high-speed licence to work at twice the broadcast standard.
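
As a rough illustration of this interleave, the minimal Python sketch below uses made-up buffer types rather than SWR's actual broadcast chain: the wall alternates environment and blue frames at 100Hz, and each camera's 100fps capture can then be split back into two 50p streams.

```python
# A minimal sketch of the odd/even interleave described above, assuming
# frames are numbered from 1 and the wall starts on an environment frame.
# Frame buffers are plain numpy arrays; this is illustrative only.
import numpy as np

FPS_WALL = 100        # wall refresh / camera capture rate
FPS_BROADCAST = 50    # each demultiplexed stream is a standard 50p signal

def wall_frame(frame_number: int, env_frame: np.ndarray,
               blue_frame: np.ndarray) -> np.ndarray:
    """What the LED wall shows on a given frame (counted from 1)."""
    # Odd frames: 3D environment of the camera currently cut to programme.
    # Even frames: pure blue key image.
    return env_frame if frame_number % 2 == 1 else blue_frame

def demux_100p(capture: list[np.ndarray]) -> tuple[list[np.ndarray], list[np.ndarray]]:
    """Split a camera's 100fps capture into two 50p streams:
    one seeing the environment, one seeing the blue key frames."""
    env_stream = capture[0::2]    # list index 0 is frame 1, 3, 5, ... (environment)
    blue_stream = capture[1::2]   # frame 2, 4, 6, ... (blue)
    return env_stream, blue_stream
```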

“The video signal with the 3D environment of the current camera is used for program output,” explains Patrick Volgar, Engineering Technician, SWR. “The video signal with a blue frame is used to key the perspective of each camera individually and apply it to the monitor image. This means each monitor image always shows the perspective of its own camera and makes multi-camera production possible.”
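
That keying step can be sketched in the same simplified terms, assuming NumPy frame buffers and a crude RGB blue key with an arbitrary threshold rather than SWR's actual keyer: the blue-frame stream provides a matte for the wall area, which is then filled with that camera's own Unreal render to build its monitor image.

```python
# Illustrative per-camera monitor keying: key the blue wall out of the
# blue-frame capture and replace it with this camera's own Unreal render.
# The simple RGB key and threshold are assumptions for the sketch.
import numpy as np

def blue_key_matte(frame: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    """Rough matte: 1.0 where a pixel looks like the pure blue wall."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return ((b - np.maximum(r, g)) > threshold).astype(np.float32)

def monitor_image(blue_frame_capture: np.ndarray,
                  own_unreal_render: np.ndarray) -> np.ndarray:
    """Composite the camera's own virtual background behind the live
    foreground (presenter, props) captured against the blue frames."""
    matte = blue_key_matte(blue_frame_capture)[..., None]
    return matte * own_unreal_render + (1.0 - matte) * blue_frame_capture
```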

The tracking system on top of the cameras is marker-free (no reflectors or gyroscopes) and works via five black-and-white cameras and infrared LEDs. The system automatically creates a reference map of fixed points in the studio for orientation, enabling camera movement to be synchronised with the graphics played back through Unreal.
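
The article doesn't detail the data path from tracker to renderer, but the underlying idea of keeping tracked poses and rendered frames in step can be sketched with hypothetical structures (not the Ocellus or Unreal API): each pose carries a genlocked frame number, and the renderer looks up the pose for the frame it is about to draw, offset by any measured system delay.

```python
# Hypothetical sketch of frame-accurate pose lookup; field names, units and
# the delay-compensation scheme are assumptions for illustration only.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class CameraPose:
    frame: int                              # genlocked frame counter
    position: tuple[float, float, float]    # studio coordinates, metres
    rotation: tuple[float, float, float]    # pan/tilt/roll, degrees
    zoom: float                             # lens field of view, degrees

class PoseBuffer:
    """Buffer of recent poses keyed by frame number."""
    def __init__(self, delay_frames: int = 0):
        self.delay = delay_frames           # measured pipeline latency, in frames
        self.poses: dict[int, CameraPose] = {}

    def push(self, pose: CameraPose) -> None:
        self.poses[pose.frame] = pose

    def pose_for_render(self, render_frame: int) -> CameraPose | None:
        # The renderer asks for the pose matching the frame it will display,
        # shifted by the measured system delay.
        return self.poses.get(render_frame - self.delay)
```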

“We all had to learn how to use the tracking system in combination with Unreal and the Wall,” says Volgar. “All of this was new to us. In doing so, we realised it is helpful to have someone who understands in real detail how to work with Unreal.”

There was a five-frame delay when switching cameras on the wall, something easily manageable for most content, especially recorded shows, but a potential issue for live music.

“Virtual production could be challenging to use for music because when you want to switch on time, on the notes, the delay makes it not impossible, but harder. That said, the overall experience, for instance, when watching on our output monitor or in the gallery, was almost as if you were cutting in a traditional studio.”

After starting out with blue screen, the team changed to green so that the blue hues of the graphics and set didn't interfere with the keying.

Another tweak was to give each operator's viewfinder a red-bordered frame designating where they needed to be in relation to the wall to retain accurate perspective. "It wasn't strictly necessary, but it felt more comfortable for the operators to have a visible reference to help them stay inside the frame of the wall," he says.

To avoid moiré, the visual glitch caused when high-resolution digital sensors look at high-resolution digital screens, Sony provides a plug-in for Unreal with which productions can previsualise camera and lens combinations. This can include a heat map of moiré in the studio to make sure that the creative ideas are technically possible. SWR's crew also opened the camera iris to f/2.8 for shallower depth of field and placed props 2.5m from the wall.
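
The reasoning behind the iris and distance choices can be approximated with a thin-lens estimate. The numbers in the sketch below (focal length, distances, LED pixel pitch) are purely hypothetical, not SWR's measured setup; the point is that if the wall's defocus blur on the sensor is much larger than an imaged LED pixel, moiré becomes unlikely.

```python
# Rough optics sketch: estimate how defocused the LED wall is when the
# camera focuses on a subject in front of it. All numbers below are
# assumptions for illustration, not SWR's measured configuration.

def wall_blur_mm(focal_mm: float, f_number: float,
                 subject_m: float, wall_m: float) -> float:
    """Diameter (mm) of the defocus blur circle for the wall plane,
    with the lens focused on the subject (thin-lens approximation)."""
    f = focal_mm
    s1 = subject_m * 1000.0   # focus distance in mm
    s2 = wall_m * 1000.0      # wall distance in mm
    return (f * f / (f_number * (s1 - f))) * abs(s2 - s1) / s2

def imaged_led_pitch_mm(pitch_mm: float, focal_mm: float, wall_m: float) -> float:
    """Size (mm) of one LED pixel as projected onto the sensor."""
    return pitch_mm * focal_mm / (wall_m * 1000.0 - focal_mm)

if __name__ == "__main__":
    # Hypothetical framing: 35mm lens at f/2.8, subject 3.5m from the camera,
    # wall 2.5m behind the subject, 1.5mm LED pixel pitch.
    blur = wall_blur_mm(35, 2.8, 3.5, 6.0)
    pitch = imaged_led_pitch_mm(1.5, 35, 6.0)
    print(f"wall blur on sensor: {blur:.3f} mm, LED pixel on sensor: {pitch:.3f} mm")
    print("moiré unlikely" if blur > 2 * pitch else "moiré risk")
```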

“You can simulate everything up front so you know what distances are going to work, but of course, in a live situation, you can’t plan for every eventuality, so sometimes it may be necessary to advise your camera operator to pull back a little from the wall to avoid moiré,” Volgar says.

Interactive role-playing format

The broadcasts went live over two nights in October on the Amazon-owned platform Twitch, with positive feedback from audiences.

“It shows that innovative and interactive formats on Twitch and YouTube can reach and enthuse large audiences,” says Eberhard. “The use of innovative technology offers enormous potential for long-term audience loyalty and to strengthen SWR’s position in the digital entertainment landscape.”

This wasn’t the only format that SWR trialled. It invited pitches for programme ideas, received 19 and selected five. In addition to 'Fehler im System', the use cases included a variation on a classic culinary show set in different historic periods ('Kochen in Epochen'); a challenge show, 'Cosplay Masters', with a focus on faster prototyping without the high costs of set construction; and a short-form science explainer set on Mars to evaluate the feasibility of creating social media formats in virtual production, including video podcasts, 9:16 aspect ratios and fast conversion times.

Having finished the PoC, SWR is evaluating what it has learned. The LED wall has returned to the rental house AV-Pro in Vienna.

Important for SWR, and for the other broadcasters that came to see the PoC in action, was the possibility of using existing equipment such as studio cameras and a production switcher.

“This demonstrates that investing in virtual production need not be done from scratch. They can use legacy kit, and the most expensive component, the LED wall, can be rented for the periods when it’s needed,” says Sebastian Leske, Head of Cinema Business Development at Sony Europe.

Volgar thinks the technology may become standard within five years. “For me as a technician, it’s amazing to work with a new technology. It is clearly more flexible than green screen production.

“What is most exciting is the ideas the creative department will get from this. Editorial teams could present from locations that are otherwise too expensive, or that don’t even exist. Imagine presenting history from different cities or periods in time or a science show from Mars.”

Eberhard believes virtual production with multiple tracked cameras can be a game-changer for the media world. “It gives us creative possibilities and makes us more efficient, flexible and economical. The technology can also open doors for collaboration in public broadcasting, making it a perfect fit for our times.”

 

