Wednesday 23 December 2020

Extending reality - Eurosport Cube

AV Magazine

Eurosport and White Light re-engineered mixed reality studio Cube for Covid-safe remote presentation, integrating real and virtual worlds for live broadcast. Adrian Pennington investigates.

 

https://www.avinteractive.com/features/case-studies/extending-reality-21-12-2020/

The Eurosport Cube is a live presentation studio that brings together a variety of technologies to create an interactive mixed-reality set. First introduced for the 2018 Winter Olympics in Pyeongchang, where it won awards for originality and ingenuity, the Cube was sent back to the drawing board for an enhanced version scheduled to be deployed at the Tokyo Summer Olympics.

Covid-19 pushed the Games back 12 months, but the pan-European broadcaster decided to use the technology to overcome the challenges of the pandemic. It unwrapped a new incarnation for coverage of the US Open tennis in New York and again at Roland-Garros for the French Open in October.

As with Cube version 1, Cube 2.0 is a multi-vendor project led by Alex Dinnin, director of graphics and innovation at Eurosport, with lead technical specialist White Light (WL). “It won awards, which we are all very proud of, but when we stepped back everything took quite a while to produce,” says Dinnin. “It needed seven hours to build for one shot. We didn’t use it live. The key factor was that it needed to be much quicker. We needed to run EVS and graphics, and not have to convert it to a bespoke codec, so it could be controlled by a vision mixer just like a normal studio.”

But everyone was excited by the potential. “The first Cube was an R&D project,” says Andy Hook, technical solutions director at WL. “The fact that we’d replaced green screen with LED technology allowed us to create content that had not been easily achieved before. That was our starting point for v2.”

White Light based development on its own SmartStage technology, with which it has had success in the education and corporate sectors over the last few years. A key physical change was the removal of one of the Cube’s three walls. In Pyeongchang, Eurosport was shooting entirely inside the LED volume – much like how virtual production is being used to shoot shows like The Mandalorian. Cube 2.0 has two walls at 90 degrees to each other, with a virtual set built in Unreal affording much greater flexibility of camera angles.

“Cube 2.0 has a very small footprint (3.5 x 3.5 metres) but the virtual set extension enables the camera to move wherever we want,” Hook says. “We’ve seamlessly stitched the content shown on the LED walls with the virtual set extension.”

Travel and social distancing restrictions not only necessitated remote production from New York and Paris, but also meant that Eurosport’s lead tennis anchor, Barbara Schett, was the only presenter based at WL HQ in Wimbledon. All athlete interviews and contributors had to be virtually beamed in from various locations.

In practice at Flushing Meadows, this meant a player would come off court for a flash interview in front of a green screen that was part of the venue’s broadcast facility. The raw green screen feed was sent to the Eurosport production team at WL, which keyed the athlete out and placed them inside the Cube’s virtual environment.
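To make the keying step concrete, here is a minimal, illustrative sketch of pulling a matte from a green screen frame and compositing the subject over a rendered background, using OpenCV in Python. The file names and threshold values are placeholders; the actual production chain uses broadcast keyers and the disguise system rather than anything like this.

```python
# Illustrative chroma key: pull a matte from a green screen frame and
# composite the keyed subject over a rendered virtual-set background.
# File names and threshold values are placeholders, not Eurosport's pipeline.
import cv2
import numpy as np

def key_and_composite(fg_bgr: np.ndarray, bg_bgr: np.ndarray) -> np.ndarray:
    """Composite the green-screen foreground subject over the background."""
    hsv = cv2.cvtColor(fg_bgr, cv2.COLOR_BGR2HSV)
    # Treat saturated green hues as the backing to be removed.
    backing = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
    alpha = cv2.GaussianBlur(255 - backing, (5, 5), 0) / 255.0  # soften edges
    alpha = alpha[..., None]                     # broadcast over colour channels
    return (alpha * fg_bgr + (1.0 - alpha) * bg_bgr).astype(np.uint8)

if __name__ == "__main__":
    fg = cv2.imread("athlete_greenscreen.png")   # placeholder frame
    bg = cv2.imread("virtual_set_render.png")    # placeholder render
    cv2.imwrite("composite.png", key_and_composite(fg, bg))
```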

Teleportation
The athletes themselves view the studio output on a large monitor in their eyeline. The position of that monitor determines the position and eyeline of the athlete back in the Cube.

“The challenge we had with the US Open was that we were using someone else’s facility,” Hook reports. “You would probably only have the athlete for 30 seconds before going live in order to key and light them, place them in the set, and size them correctly. Our team sometimes only had 15 seconds to complete that whole process before the interview started. When we came to the French Open we had a bit more control and this informs how we continue to develop this technology.”

Continues Hook: “We’ve done hundreds of hours of live corporate events using SmartStage and a big part of that is teleporting CEOs and keynote speakers into the space. We’ve been doing a lot of work on how we improve that remote acquisition and also how we improve the depth of capture and the automation of the set-up process.”

Seven servers are configured as one unified, networked system controlled from a single GUI. Some of those servers are render nodes rendering out the Unreal realtime world/set, others are generating the outputs to the LED walls and the composite camera feed with AR, and one is acting as a dedicated ‘Director’ – providing a single control point for all of those and keeping everything in sync.
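The exact topology isn’t detailed here, but the role split can be pictured with a short, hypothetical sketch: render nodes for the realtime world, output nodes for the LED walls and the AR camera composite, and one director node issuing a common frame tick. Node names and the sync mechanism below are assumptions for illustration, not disguise’s actual software.

```python
# Hypothetical sketch of the role split: render nodes produce the realtime
# Unreal world, output nodes feed the LED walls and the AR camera composite,
# and a single "director" node keeps everything on a common frame clock.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    role: str  # "render", "led_output", "ar_output" or "director"

CLUSTER = [
    Node("rx-1", "render"), Node("rx-2", "render"), Node("rx-3", "render"),
    Node("out-led-1", "led_output"), Node("out-led-2", "led_output"),
    Node("out-ar", "ar_output"),
    Node("director-1", "director"),
]

def dispatch_frame(frame_number: int) -> None:
    """Director broadcasts one frame tick so every node renders in lockstep."""
    for node in CLUSTER:
        if node.role != "director":
            print(f"frame {frame_number} -> {node.name} ({node.role})")

dispatch_frame(0)
```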

“The disguise GUI is a full 3D realtime visualisation of everything that is going on in the studio, giving the operator in the gallery a view from any angle and also the ability to work offline and pre-visualise any changes they might want to make or rehearse. It also means we could have a world built in Epic Games’ Unreal, then add, within the same 3D space, worlds built in Unity or Notch or another engine. We assemble it all together in realtime.

“We have complete control over what appears in the LED wall (back plate) and in AR (front plate) or both, to the extent that we can have our presenter walk around virtual objects with no occlusion. To the presenter, the AR version of the athlete being interviewed looks a little skewed but their eyeline is completely natural and that’s what is important for the interaction to work.”
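The back plate/front plate split Hook describes can be summarised as a simple layer order: content rendered physically behind the presenter on the LED wall, the camera feed of the presenter, and AR rendered in front of her. The sketch below is purely illustrative and uses placeholder layer names, not disguise terminology.

```python
# Illustrative layer order behind the "back plate"/"front plate" split.
# Layer names are placeholders, not disguise terminology.
LAYERS = [
    ("back plate",  "rendered on the LED wall, physically behind the presenter"),
    ("camera feed", "the presenter captured in front of the wall"),
    ("front plate", "AR graphics composited over the camera feed"),
]

for name, role in LAYERS:  # listed back to front
    print(f"{name:>11}: {role}")
```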

Eurosport and White Light briefed the set design to Toby Kalitowski, creative director at BK Design Projects. His design was built in Unreal Engine by Dimension Studio. WL loaded that on to the disguise servers, where it is treated as a video source. The set’s background includes a window which is treated as another video source – for the French Open it was often a panorama of Paris.

Show control
Unreal was designed as a game engine, not a show control system, so triggering events in Unreal, changing lighting and sequencing that process has traditionally been difficult.

“What we’ve been able to do is provide a great front end so that the operator in the gallery is able to respond to any request that the director might come up with and trigger changes in Unreal and whatever other content is coming in.”
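As a rough illustration of that kind of trigger, the snippet below fires a named cue over OSC – one common way of driving a realtime engine from an external controller. It assumes an OSC listener (such as Unreal’s OSC plugin) is bound to the given port; the address pattern, port and cue name are placeholders, and the production system uses disguise’s own front end rather than anything like this.

```python
# Minimal sketch of firing a show-control cue into a realtime engine over OSC.
# Assumes an OSC listener (e.g. Unreal's OSC plugin) is bound to this port;
# the address pattern, port and cue name are placeholders, not the actual
# disguise/Eurosport control surface.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)  # engine host and OSC port

def trigger_cue(cue_name: str) -> None:
    """Send a named cue that the engine maps to a lighting or scene change."""
    client.send_message("/cube/cue", cue_name)

trigger_cue("window_to_paris_panorama")  # e.g. swap the virtual window feed
```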

Lighting control is also vital and with the new Cube the lighting director is able to synchronise both real and virtual lighting. If they fade up a real light in the studio, the system will turn a version of that light on in the virtual world at the same time.

Likewise, if the colour temperature of the live feed in the virtual window changes, the LD can change the studio light to match.
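A minimal sketch of that two-way mapping, assuming an 8-bit DMX dimmer level for the real fixture and a simple intensity/colour-temperature hand-off to its virtual twin; the ranges and the returned state dictionary are illustrative placeholders for whatever bridge the production system actually uses.

```python
# Illustrative bridge between a real studio light and its virtual twin:
# the physical dimmer level drives the virtual light's intensity, and the
# studio lamp's colour temperature follows the feed in the virtual window.
# Ranges and the returned state dictionary are placeholders.

def dmx_to_intensity(dmx_value: int) -> float:
    """Map an 8-bit DMX dimmer level (0-255) to a 0.0-1.0 intensity."""
    return max(0, min(255, dmx_value)) / 255.0

def sync_lights(real_dmx_level: int, window_colour_temp_k: int) -> dict:
    """Return the state to push to the virtual light and the real lamp."""
    return {
        "virtual_intensity": dmx_to_intensity(real_dmx_level),
        "real_colour_temp_k": window_colour_temp_k,  # match lamp to the feed
    }

print(sync_lights(real_dmx_level=191, window_colour_temp_k=5600))
```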

BlackTrax tracking, also new, automatically lights the presenter as they move. “Rather than having lots of different lit positions depending on where [Barbara] stands in the space, the lighting automatically maintains a perfect light on her,” Hook explains.

“Barbara creates a natural shadow on the floor because she is being lit properly with a key, fill and a back light. You also get realistic reflections on the set because the virtual environment and video sources reflect on to the walls, an interactive table, on to glasses, jewellery. It’s those little things that add believability and cause people to question whether what they are seeing is real.”
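The follow-spot idea behind that beacon tracking comes down to simple geometry: given the tracked presenter position and a fixed moving-head fixture, compute the pan and tilt needed to keep her lit. The sketch below shows that geometry only, with coordinates in metres in an arbitrary studio frame; it is not BlackTrax’s actual protocol or data format.

```python
# Geometry of an automated follow light: aim a fixed moving-head fixture at
# the tracked presenter position. Coordinates are metres in an arbitrary
# studio frame; this is not BlackTrax's actual protocol or data format.
import math

def pan_tilt(fixture_xyz, target_xyz):
    """Return (pan, tilt) in degrees to aim a fixture at a tracked target."""
    dx = target_xyz[0] - fixture_xyz[0]
    dy = target_xyz[1] - fixture_xyz[1]
    dz = target_xyz[2] - fixture_xyz[2]  # negative when the target is below
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(-dz, math.hypot(dx, dy)))
    return pan, tilt

# Presenter tracked near the centre of the 3.5 m stage, fixture rigged at 3 m.
print(pan_tilt((0.0, 0.0, 3.0), (1.75, 1.75, 1.6)))
```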

As mixed reality technology becomes more sophisticated and affordable it will be increasingly used in broadcast, but skills need to keep pace with the technology.

“As we’ve gone through different MR broadcast projects we’ve realised there are many elements which are outside the comfort zone of normal production teams, that need to be thought about in a different way, or that require different roles,” says Hook. “For example, you may have a traditional broadcast set design which is handed over to someone who will build that into a realtime virtual environment. The design can get lost in translation.”

He advises: “The lighting director needs to get involved early in the virtual studio production to make sure the lights are in the right place in both worlds. You need to plan shots around what will and won’t be seen in the real world and in the virtual world. You also have to consider that you only have one LED volume and it can only show content to one perspective at a time. All the cameras are tracked and you can cut between them live with the content in the LED wall cutting in sync to show the correct perspective. Cutting live is easy, but if you want to re-edit in post-production you need to have planned what any ISOs or B-roll look like on the other cameras as the background could be wrong.”
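One way to picture the single-perspective constraint Hook mentions is that a programme cut must also switch which tracked camera drives the LED wall render. The sketch below is hypothetical: the camera names and the render_wall_from() hook stand in for whatever the real tracking and render pipeline exposes.

```python
# Hypothetical sketch of the single-perspective constraint: a programme cut
# must also switch which tracked camera drives the LED wall render, so ISOs
# from the other cameras will show a "wrong" background. Camera names and
# render_wall_from() are stand-ins for the real tracking/render pipeline.
TRACKED_CAMERAS = {"cam1": "wide", "cam2": "presenter close-up", "cam3": "jib"}
live_camera = "cam1"

def render_wall_from(camera_id: str) -> None:
    """Placeholder for re-projecting the LED wall for this camera's pose."""
    print(f"LED wall now rendered for {camera_id} ({TRACKED_CAMERAS[camera_id]})")

def cut_to(camera_id: str) -> None:
    """Cut the programme feed and, in sync, switch the wall's perspective."""
    global live_camera
    live_camera = camera_id
    render_wall_from(camera_id)

cut_to("cam2")
```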

Adds Dinnin: “Before, it was a graphics project. Now it operates much more like a normal TV studio. We have a director and vision mixer who have never worked in mixed reality before who are able to sit down and work it out. If they want to change the background video source they just push a button. It is much more broadcast oriented and because it’s a 3D studio it can be reversioned to suit whatever sport we want.”

The Cube’s next outing is for the Australian Open in January, then comes next summer’s Olympics. Reveals Hook: “There are definitely tricks we haven’t shown yet which will show we have moved the goalposts even further on.”

 
