Tuesday 3 May 2022

Natural interacting

AV Magazine

article here

Plenty of new graphics technologies become wildly popular only to fade away quickly, such as the stereoscopic 3D movement around 2013.

With virtual production, however, technological advances such as the increasing quality and accessibility of gaming graphics engines and LED displays, and the speed of GPUs (resulting in low latency and high image quality), have seen demand soar across film and TV, from narrative storytelling to live sports presentation and reality shows.

Applications mixing live action with photorealistic augmented graphics (mixed or extended reality/xR) are becoming popular for one major reason: they remove many of the challenges the broadcast and film industries currently face.

Crews no longer have to travel to different locations to get captivating footage. It can be done at the home studio and with fewer crew on site. This capability can even be extended to ‘teleport’ actors and presenters who are in different locations, which Discovery is doing regularly for remote coverage of events such as the Olympics. There’s the added benefit of cutting the carbon footprint.

“When using green screens to create virtual scenes, one of their biggest limitations is that actors, hosts or presenters cannot actually see the environment they are supposed to be interacting with,” says Phil Ventre, vice-president of broadcast at disguise.

Now, the LED volumes that make up virtual stages not only surround presenters with photorealistic virtual environments but also allow them to interact with those environments in the most natural way. Furthermore, green screens require intensive and expensive post-production iterations for things that could easily be fixed if they were discovered during shooting.

“In a live environment, xR is even more beneficial as studios can make use of CG graphics in the moment. It is possible to have a live studio audience experience similar visual effects that the audience at home will enjoy.”

The BBC has been using augmented reality within physical sets for several years and in 2019 opened a virtual studio at Dock10, within Salford’s MediaCity, to house Match of the Day. That studio used Unreal Engine rendering technology and a compositing system from Zero Density to allow programme makers to create photorealistic output in realtime.

BBC’s own green screen solution
Then, for the Tokyo 2020 Olympics (which took place in summer 2021), the same studio switched to using Unreal Engine with Brainstorm. For the Beijing Winter Olympics, BBC Sport made another change, swapping out Brainstorm for graphics and virtual set control systems from Vizrt.

“It’s all about giving us more control,” explains John Murphy, creative director and head of graphics for sport at the BBC. “We were quite reliant on external providers – whether motion graphics specialists, Moov, Dock10 or Brainstorm – so for Beijing we developed our own green screen studio at MediaCity, building on the success of Tokyo but using our own graphics department in house to design and operate the virtual worlds.”

As the Winter Olympics is a smaller event than the Summer Games, BBC Sport scaled the VR set back to its Pres 2 studio, as it would for any other Winter Games. Pres 2 was converted into a four-camera greenscreen studio with a locked-off fifth ceiling camera added for the Games.

The studio is just 84 square metres, but the virtual set permits almost infinite expansion: in this case, seven different presentation positions.

“Previously, it was just a small physical set which was quite an uncreative and under-utilised space that studio directors didn’t want to use,” says Murphy.

“A presenter and guests would have been sitting around the same desk doing the same thing against the same screens and backdrop. What the virtual set allows us to do is to open up the space and make it far more creative and engaging.”

Jonny Bramley, executive producer for major events at BBC Sport, adds: “When you design a real set you’re very much constrained by those four walls, and the challenge with that is how creative you can be within those four walls.

“A VR set design means you’ve got to literally think outside of the box – outside those four walls – and so the scenes we had in Tokyo meant the set was the entire city of Tokyo with our little box on top of a skyscraper in the middle. This time the set is in a mountain environment and it’s our own little log cabin in the middle of that.”

The BBC’s team, led by Jim Mann and Toby Kalitowski, has had fun designing the virtual world for the Beijing Games. With very little, if any, snow on the ground in China, and with many events actually taking place against urban backdrops, the BBC’s virtual set is a fictional Swiss-style mountainside log cabin and ski resort amid snow-clad landscapes.

There’s even virtual wildlife that occasionally wanders into shot – reindeer, penguins, even a polar bear – in complete disregard for geography.

“Although it sounds gimmicky, the presenters have been referencing the wildlife so it creates a bit of fun rather than the standard 1+2 sitting around a desk,” says Murphy. “When the presenters (including Hazel Irvine and Clare Balding) talked about ‘going outside’, they were of course actually inside the VR studio.”

The technology that drives the virtual studio is a Viz 4 and Unreal Engine integration using the Viz Fusion keyer along with the Mo-Sys camera tracking system. Unreal Engine provides the virtual landscape and rendering, and Vizrt’s Viz 4 and Viz Arc provide control of the studio, including any AR.
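For readers curious about the mechanics, the sketch below (plain Python with NumPy, not the Mo-Sys or Vizrt APIs, and with illustrative pose and lens values) shows the basic idea behind a tracked camera driving a virtual set: the tracker reports the physical camera’s position, angles and lens field of view, and the render engine builds matching view and projection matrices so the virtual backdrop lines up with the real framing.

```python
# Minimal sketch (not the Mo-Sys or Viz APIs): how tracked camera data
# typically drives a virtual camera. A tracking system reports the physical
# camera's position, orientation and lens field of view each frame; the
# render engine builds matching view and projection matrices so the virtual
# background lines up with the real camera's framing.
import numpy as np

def rotation_matrix(pan_deg, tilt_deg, roll_deg):
    """Compose a rotation from pan (yaw), tilt (pitch) and roll angles in degrees."""
    p, t, r = np.radians([pan_deg, tilt_deg, roll_deg])
    yaw = np.array([[np.cos(p), 0, np.sin(p)],
                    [0,         1, 0        ],
                    [-np.sin(p), 0, np.cos(p)]])
    pitch = np.array([[1, 0,          0         ],
                      [0, np.cos(t), -np.sin(t)],
                      [0, np.sin(t),  np.cos(t)]])
    roll = np.array([[np.cos(r), -np.sin(r), 0],
                     [np.sin(r),  np.cos(r), 0],
                     [0,          0,         1]])
    return yaw @ pitch @ roll  # one common angle convention; real rigs vary

def view_matrix(position, pan_deg, tilt_deg, roll_deg):
    """World-to-camera transform for the virtual camera."""
    R = rotation_matrix(pan_deg, tilt_deg, roll_deg)
    V = np.eye(4)
    V[:3, :3] = R.T                                   # inverse of a rotation is its transpose
    V[:3, 3] = -R.T @ np.asarray(position, dtype=float)
    return V

def projection_matrix(fov_v_deg, aspect, near=0.1, far=1000.0):
    """Standard perspective projection from the lens's vertical field of view."""
    f = 1.0 / np.tan(np.radians(fov_v_deg) / 2.0)
    return np.array([
        [f / aspect, 0, 0, 0],
        [0, f, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0],
    ])

# Example: a tracked pedestal camera 1.6 m high, panned 20 degrees, 35 degree lens.
V = view_matrix(position=[0.0, 1.6, 4.0], pan_deg=20.0, tilt_deg=-5.0, roll_deg=0.0)
P = projection_matrix(fov_v_deg=35.0, aspect=16 / 9)
print(P @ V)  # the matrix a render engine would use to draw the virtual scene
```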

BBC Sport’s graphics department has been using Viz for some time and was waiting for the vendor to enable integration with Unreal Engine.

“Viz was the only ‘go to’ for Pres 2 because its technology is already inhouse and they were able to launch Viz 4 with Unreal for us to use on this project.”

The Viz Engine offers native integration of the Unreal Engine, while the Viz Arc control application enables producers to control aspects of both render pipelines.

Vizrt systems range from plug and play to something that requires proper planning over several months. “It all depends on the complexity of the application,” says Vizrt CRO Gerhard Lang.

“You can have simple setups with camera tracking and you can add in Steadicam, multiple cameras, cameras on cranes, and automated camera control. The choice of camera matters too. Lower quality cameras (outputting noisy signals) will be harder to key.”
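To see why a noisy signal hurts the key, the toy example below (plain Python with NumPy, not the Viz Fusion keyer; the colours, threshold and noise levels are made up for illustration) classifies each pixel by its distance from a reference green. Sensor noise pushes borderline pixels across the threshold, leaving speckle and holes in the matte.

```python
# Toy illustration (not the Viz Fusion keyer): why noisy camera signals are
# harder to chroma key. A simple keyer classifies each pixel by its distance
# from a reference green; sensor noise pushes borderline pixels back and
# forth across the threshold, producing speckle in the matte.
import numpy as np

def chroma_key_matte(rgb, key_color=(0.0, 1.0, 0.0), threshold=0.35):
    """Return a hard matte: 1.0 = keep foreground, 0.0 = replace with background."""
    distance = np.linalg.norm(rgb - np.asarray(key_color), axis=-1)
    return (distance > threshold).astype(float)

rng = np.random.default_rng(0)
# A synthetic 100x100 frame: left half green screen, right half a grey subject.
frame = np.zeros((100, 100, 3))
frame[:, :50] = (0.05, 0.95, 0.10)   # green screen
frame[:, 50:] = (0.5, 0.5, 0.5)      # foreground subject

for noise_level in (0.01, 0.15):     # a clean signal vs a noisy one
    noisy = np.clip(frame + rng.normal(0, noise_level, frame.shape), 0, 1)
    matte = chroma_key_matte(noisy)
    # Misclassified pixels show up as print-through of the background or as
    # holes in the subject; the noisier signal produces far more of them.
    errors = np.count_nonzero(matte[:, :50]) + np.count_nonzero(matte[:, 50:] == 0)
    print(f"noise {noise_level:.2f}: {errors} misclassified pixels")
```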

Lang predicts the growth of the technology among broadcasters “which do not have such a sophisticated team as the BBC.” This will be enabled by reducing costs further and putting processing and functionality in the cloud.

“Going NDI-native and cloud-ready offers content producers the opportunity to rationalise workflows but considerable care must be taken when seeking to unlock the potential. Live video production can be a complex challenge at the best of times and performing chroma keying remotely is tough.”

Fireworks
Fireworks is the first original UK film shot using virtual production (VP). The short was initially conceived as a stage play by writer Paul Lally, developed as an immersive experience for VR headsets using volumetric capture at the suggestion of producer Annalise Davis, and then reworked for virtual production.

“I was doing a test of another project in the volume at Dimension when I realised that all the ideas we’d explored in VR for Fireworks could be delivered using VP,” says director Paul Franklin. “Once you can find a really compelling creative reason to do it in VP then every aspect of it earns its place.”

Franklin is a former VFX supervisor at DNEG, where he won Oscars and BAFTAs for Interstellar and Inception.

Fireworks is told in realtime, in the tense final moments of an MI6 operation to take out a dangerous target who has been tracked down to a Tripoli marketplace.

Instead of visiting Lebanon, production designer Jamie Lapsley researched the city using Russian-language search engines to find “holiday videos” with which to build a version of the city in Unreal Engine.

“The project was a chance to design a whole city and to realise that in a way I could never do normally,” says Lapsley.

Dimension’s Unreal artists discussed lighting, materials and object placement with Franklin’s team before creating realistic props and environments in the software.

“The goal was to achieve the essence of a real street section,” says Ed Thomas, VP supervisor. “We built lighting and atmospherics to match what it feels like to be in Tripoli.”

Franklin navigated the virtual world with VR goggles to plan the shoot. “The time you would normally spend with your production designer, 1st AD and DP to work out how to get around the set, you have to devote to the virtual asset because it is going to be baked into principal photography as you shoot it,” he says.

“I was able to put up a virtual movie camera (an iPad on a shoulder mount) and walk around the space looking into the virtual world and could share that with my cinematographer.”

Franklin took individual frames from the VR recce to produce storyboards of the entire film.

“Until you put a headset on and explore it is very difficult to visualise issues you may run into,” says Ollie Downey BSC. “A VR recce means you can get to the bottom of things comprehensively.”

Dimension’s Unreal lead, Craig Stiff, not only built the virtual world, he also acted as virtual gaffer. “On a normal job you might say to your gaffer ‘I want a SkyPanel here and a backlight there’ and in Unreal Craig can just dial that in,” says Downey.

Virtual lights combined with practical fixtures were arranged by gaffer Andy Waddington. He says the physical lighting design has to match the colours and temperatures of the virtual illumination displayed on the LEDs.

The volume itself consisted of a curved LED screen for the main environment with side panels and a ceiling panel to represent more of the scene wrapping around the characters.

“We treated it as though we’d built our set actually in this street in Tripoli,” says Downey. “I let the background overexpose and treated it as I would a view from a window on a normal set – only one I have more control over.”

The real and virtual sets were blended. “For example, the texture of the ground has to blend seamlessly into the texture of your digital ground,” says Thomas. “Once you’ve got that it’s very hard to see where reality ends and virtual reality begins.”

La Soirée Extraordinaire
When the producers of French TV music show ‘La Soirée Extraordinaire’ wanted to present an evening of song to audiences in a way they’d never seen before, they tasked creative studio Blue Node Paris with immersing artists in extended reality.

Production company DMLS TV – which is part of the Banijay group – also partnered with Virtual Display Services, an XR specialist, on the concept. Blue Node’s CGI specialists handled the extended and augmented reality elements, providing a workflow for creatives to work swiftly with the artists during the tight five-week turnaround ahead of broadcast on M6.

Blue Node Paris also collaborated with D/Labs, which provided artistic direction and 2D content, and technical services provider AMP Visual TV on tests for the cameras and camera switching. Ten cameras were used, five of them tracked. The production used the disguise xR workflow to shoot 43 musical numbers in four days featuring xR, AR or a combination of both.

“No live broadcast shows in France have used extended reality before,” says Pierre-Guy di Costanzo, co-founder at Blue Node Paris. “We have many music TV shows, and it is always the same thing – someone speaking with a big screen.

“With xR you can go further and immerse the artists in the studio world. This has not been seen before. When the producer saw a demo in our studio he was interested in the technology. We could deliver this vision technically, creatively, and everything in between.”

The main challenge was integrating all the diverse elements required for each of the 43 unique tracks. Blue Node teamed with AMP to test the cameras using mini-scenes, so they’d know exactly what they had to do on-site at the La Seine Musicale concert venue, where all the tracks were shot.

On set, Blue Node Paris used two disguise vx 4 media servers and four rx render nodes to power the photorealistic virtual scenes developed in Unreal Engine. Five of the ten cameras were tracked using stYpe’s camera tracking system, including one mounted on a Louma crane, one on a Microfilms moving head, one on a Microfilms dolly and one on a pedestal.

“When you have twelve scenes a day to deliver, you need to prepare your timeline to switch from one to another,” says di Costanzo. “With disguise you can just click, shoot and go. Since disguise is native to Unreal, it allows you to deliver content on big screens.”

The show ‘transported’ singer Julien Doré from a forest cabin to the Grand Canyon, surrounded by dream-like animals. It showcased music duo Vitaa and Slimane in a chateau ballroom where the floor became a giant chessboard and the singers its queen and king.
