Monday, 31 December 2018

Rendering the impossible

FEED

When games engines meet live broadcast, the real and the photoreal are interchangeable


Design and production tools that enable broadcasters to create virtual objects that appear as if they’re actually in the studio have been available for a few years, but improvements in fidelity, camera tracking and, notably, the fusion of photoreal games engine renders with live footage have seen Augmented Reality go mainstream.
Miguel Churruca, the marketing and communications director at 3D graphics systems developer Brainstorm, explains, “AR is a very useful way of providing in-context information and enhancing the live images while improving and simplifying the storytelling. Examples of this can be found in election nights, plus entertainment and sports events, where a huge amount of data must be shown in-context and in a format that is understandable and appealing to the audience.”
Virtual studios typically broadcast from a green-screen set. AR comes into play where there is a physically built set in the foreground and augmented graphics and props are placed into the shot. Some scenarios may have no physical props at all, with everything behind and in front of the presenter being graphics.
“Apart from the quality of the graphics and backgrounds, the most important challenge is the integration and continuity of the whole scene,” says Churruca. “Having tracked cameras, remote locations and graphics moving accordingly, perfect integration, perspective matching and full broadcast continuity are paramount to provide the audience with a perfect viewing experience of AR graphics.”
The introduction of games engines, such as Epic’s Unreal Engine or Unity, has brought photorealism into the mix. Originally designed to quickly render polygons, textures and lighting in video games, these engines can seriously improve the graphics, animation and physics of conventional broadcast character generators and graphics packages, but it’s complicated because of the constraints of real-time rendering and operation.
That, though, has been cracked.

Virtual/real live music show
Last year a dragon made a virtual appearance as singer Jay Chou performed at the opening ceremony for the League of Legends final at Beijing’s famous Bird’s Nest Stadium. This year, esports developer Riot Games wanted to go one better and unveil a virtual pop group singing live with their real-world counterparts.
It’s a bit like what Gorillaz and Jamie Hewlett have been up to for years, only this isn’t as tongue-in-cheek.
K/DA is a virtual girl group consisting of four of the most popular characters in League of Legends. In reality, their vocals are provided by a cross-continental line-up of accomplished music stars: US-based Madison Beer and Jaira Burns, and Miyeon and Soyeon from the real K-pop girl group (G)I-DLE.
Riot tapped Oslo-based The Future Group (TFG) to bring them to life at November’s World Championship Finals opening ceremony at South Korea’s Munhak Stadium.
Riot Games provided art direction and a base CG model for K/DA’s lead member Ahri; TFG transformed Ahri into her popstar counterpart and originated models for her three groupmates, based on concept art designs from Riot.
LA post-production house Digital Domain supplied the motion capture data for the characters, while TFG completed their facial expressions, hair, clothing, and realistic texturing and lighting.
Lawrence Jones, executive creative director at TFG, says: “We didn’t want to make the characters too photorealistic. They needed to be stylised yet believable. That means getting them to track to camera and having the reflections and shadows change realistically with the environment. It also meant their interaction with the real pop stars had to look convincing.”
All the animation and the directing cuts were pre-planned, pre-visualised and entirely driven by timecode to sync with the music.
“Frontier is our version of Unreal which we have made for broadcast and real-time compositing,” Jones explains. “It enables us to synchronise the graphics with the live signal frame-accurately. It drove monitors in the stadium (for fans to view the virtual event live) and it drove real-world lighting and pyrotechnics.”
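At its core, this kind of timecode-driven playback reduces to a cue list keyed to frame counts: each incoming video frame carries a timecode, and any cue whose time has arrived is fired. The minimal Python sketch below illustrates the pattern; all names are hypothetical, as Frontier’s actual API is proprietary.

from dataclasses import dataclass

FPS = 30  # frame rate assumed for this sketch

@dataclass
class Cue:
    frame: int    # absolute frame number at which the cue fires
    action: str   # e.g. an animation start, a lighting or pyro trigger

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an 'HH:MM:SS:FF' SMPTE timecode string to a frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# Cue list authored in pre-visualisation and locked to the song's timecode.
cues = sorted([
    Cue(tc_to_frames("00:01:12:00"), "start_chorus_anim"),
    Cue(tc_to_frames("00:01:12:00"), "fire_pyro_3"),
], key=lambda c: c.frame)

def on_video_frame(current_tc: str) -> None:
    """Called once per incoming video frame with its embedded timecode."""
    now = tc_to_frames(current_tc)
    while cues and cues[0].frame <= now:
        print("firing cue:", cues.pop(0).action)  # route to renderer/DMX/pyro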
Three cameras were used, all with tracking data supplied by Stype: a Steadicam, a PTZ camera and a camera on a 40ft jib.
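It is this per-frame tracking data that keeps the CG locked to the physical lens: position, orientation and lens values are applied to the render engine’s virtual camera on every frame. A simplified illustration follows; the packet layout is invented, and real protocols such as Stype’s also carry lens distortion and timing information.

from dataclasses import dataclass

@dataclass
class TrackingPacket:
    x: float; y: float; z: float          # camera position (metres)
    pan: float; tilt: float; roll: float  # orientation (degrees)
    fov: float                            # field of view from the lens encoder

@dataclass
class VirtualCamera:
    """Stand-in for the render engine's camera object."""
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)
    fov: float = 60.0

def apply_tracking(pkt: TrackingPacket, cam: VirtualCamera) -> None:
    """Mirror the physical camera so CG is rendered from the same viewpoint."""
    cam.position = (pkt.x, pkt.y, pkt.z)
    cam.rotation = (pkt.pan, pkt.tilt, pkt.roll)
    cam.fov = pkt.fov  # zoom must be mirrored too, or AR objects 'swim'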
“This methodology is fantastic for narrative-driven AR experiences and especially for elevating live music events,” he says. “The most challenging aspect of AR is executing it for broadcast. Broadcast has such a high-quality visual threshold that the technology has to be perfect. Any glitch in the video not correlating to the CG may be fine for Pokémon on a phone but will be a showstopper for broadcast.”
Over 200 million viewers watched the event on Twitch and YouTube.
“The energy that these visuals created among the crowd live in the stadium was amazing,” he adds. “Being able to see these characters in the real world is awesome.”

WWE WrestleMania
World Wrestling Entertainment (WWE) enhanced the live-stream production of its annual WrestleMania pro wrestling event last April with Augmented Reality content it produced using Brainstorm’s InfinitySet technology.
The overall graphic design was intended to be indistinguishable from the live event staging at the Mercedes-Benz Superdome in New Orleans.
The graphics package included player avatars, logos, refractions and virtual lighting, plus substantial amounts of glass and other semi-transparent and reflective materials.
Using InfinitySet 3, WWE created a wide range of content, from on-camera wraparounds inserted into long-format shows to short self-contained pieces. Especially useful were a depth-of-field/focus feature and the ability to adjust the virtual contact shadows and reflections to achieve realistic results.
Crucial to the Madrid-based firm’s technology is the integration of Unreal Engine with the Brainstorm eStudio render engine. This allows InfinitySet 3 (the brand name for Brainstorm’s top-end AR package) to combine the high-quality scene rendering of Unreal with the graphics, typography and external data management of eStudio, giving full control of elements such as 3D motion graphics, lower-thirds, tickers and CG.
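The division of labour matters: Unreal renders the photoreal scene, while eStudio owns templated graphics whose fields are bound to external data. The hypothetical Python sketch below shows that data-binding idea for a lower-third; the classes are invented for illustration, since eStudio’s actual API is not public.

from dataclasses import dataclass

@dataclass
class LowerThird:
    """Invented stand-in for a templated broadcast graphic."""
    name: str = ""
    role: str = ""

    def render(self) -> str:
        # A real engine draws typography over the scene render;
        # here we simply return the text that would be displayed.
        return f"{self.name} | {self.role}"

def update_from_feed(template: LowerThird, feed: dict) -> None:
    """Push fresh external data (e.g. a stats feed) into the template."""
    template.name = feed["name"]
    template.role = feed["role"]

lt = LowerThird()
update_from_feed(lt, {"name": "Lawrence Jones", "role": "Executive Creative Director"})
print(lt.render())  # Lawrence Jones | Executive Creative Director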
The virtual studio used by WWE includes three cameras, each with an InfinitySet Player renderer running Unreal Engine plugins, all controlled via a touchscreen. Chroma keying is handled by a Blackmagic Ultimatte 12.
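At its simplest, the keyer’s job is the classic ‘over’ composite: the matte pulled from the green screen decides, pixel by pixel, how much keyed talent versus rendered set reaches the output. A minimal sketch follows; a hardware keyer such as the Ultimatte 12 also performs spill suppression, edge processing and much more.

import numpy as np

def composite(fg: np.ndarray, bg: np.ndarray, matte: np.ndarray) -> np.ndarray:
    """Per-pixel 'over' operation.
    fg, bg: HxWx3 float images in [0, 1] (keyed talent, rendered set).
    matte:  HxW alpha pulled from the green screen; 1.0 keeps talent."""
    a = matte[..., None]           # broadcast the matte across RGB channels
    return fg * a + bg * (1.0 - a)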
On the input side, InfinitySet is integrated with three Ross Furio robotic camera systems on curved rails, two of them sharing a track with collision detection.
WWE also uses Brainstorm’s AR Studio, a compact version which relies on a single jib-mounted camera tracked with Mo-Sys StarTracker. There is also a portable AR system designed as a plug-and-play option for on-the-road events.
Brainstorm’s technology also played a role in creating the “hyper-realistic” 4K AR elements broadcast as part of the opening ceremony of the 2018 Winter Olympic Games in PyeongChang.
The AR components included a dome made of stars and virtual fireworks that were synchronised and matched with the real event footage and inserted into the live signal for broadcast.
As with the WWE, Brainstorm combined the render engine graphics of its eStudio virtual studio product with content from Unreal Engine within InfinitySet. The setup also included two Ncam-tracked cameras and a SpyderCam for tracked shots around and above the stadium.
InfinitySet 3 also comes with a VirtualGate feature, which integrates the presenter not only into the virtual set but also into additional content within it, so the talent in the virtual world can be ‘teletransported’ to any video with full broadcast continuity.

ESPN
Last month, ESPN introduced AR to refresh the presentation of its long-running sports discussion show, Around the Horn (ATH).
The format is in the style of a panel game and involves sports pundits located all over the U.S. talking with show host Tony Reali via video conference link.
The new virtual studio environment, created by the DCTI Technology Group using Vizrt graphics and Mo-Sys camera tracking, gives the illusion that the panellists are in the studio with Reali. Viz Virtual Studio software manages the tracking data coming in from any tracking system and works in tandem with Viz Engine for rendering.
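Supporting “any tracking system” is essentially an adapter problem: each vendor’s packets are normalised into a single internal pose format before being handed to the renderer. A hypothetical sketch of that pattern (all field names invented):

from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple  # (x, y, z) in metres
    rotation: tuple  # (pan, tilt, roll) in degrees
    fov: float

def from_vendor_a(raw: dict) -> CameraPose:
    return CameraPose((raw["x"], raw["y"], raw["z"]),
                      (raw["pan"], raw["tilt"], raw["roll"]),
                      raw["fov"])

def from_vendor_b(raw: dict) -> CameraPose:
    # This vendor reports millimetres, so convert to metres first.
    pos = tuple(v / 1000.0 for v in raw["pos_mm"])
    return CameraPose(pos, tuple(raw["rot_deg"]), raw["fov_deg"])

ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

def normalise(system: str, raw: dict) -> CameraPose:
    """Route a raw packet through the right adapter before rendering."""
    return ADAPTERS[system](raw)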
“Augmented reality is something we’ve wanted to try for years,” Reali told Forbes. “The technology of this studio will take the video-game element of Around the Horn to the next level while also enhancing the debate and interplay of our panel.”

Sky Sports
Since the beginning of this EPL season, Sky Sports has been using a mobile AR studio for match presentation on its Super Sunday live double-headers and Saturday lunchtime live matches.
Sky Sports has worked with AR at its studio base in Osterley for some time, but moving into the grounds is intended to improve the output aesthetically, editorially and analytically. A green screen is rigged and de-rigged at each ground inside a standard matchday 5m x 5m presentation box with a real window open to the pitch. Camera tracking for the AR studio is done using Stype’s RedSpy, with keying on a Blackmagic Design Ultimatte 12. Environment rendering is in Unreal 4, while editorial graphics are produced using Vizrt and an NCam plugin.
Sky is exploring displaying AR team formations using player avatars on the floor of the studio, having them appear in front of the pundits.
Sky Sports head of football Gary Hughes says the set initially looked “very CGI” and “not very real”, but that it has improved a lot.
“With the amount of CGI and video games out there, people can easily tell what is real and what is not,” he says. “If there is any mystique to it, and people are asking if it is real or not, then I think you’ve done the right thing with AR.”

Spanish sports
Spanish sports shows have taken to AR like a duck to water. Specifically, multiple shows have been using systems and designs from Lisbon’s wTVision, part of the Spanish media group Mediapro.
In a collaboration with València Imagina Televisió and the TV channel À Punt, wTVision manages all virtual graphics for the live shows Tot Futbol and Tot Esport.
The project combines wTVision’s Studio CG and R³ Space Engine (its real-time 3D graphics engine), with Augmented Reality graphics generated using camera tracking from Stype.
For Movistar+ shows like Noche de Champions, wTVision has created an AR ceiling with virtual video walls, with its Studio CG product controlling all the graphics. For this project, wTVision uses three cameras tracked by RedSpy, plus Viz Studio Manager and three Vizrt engines, with the AR output covering the ceiling of the real set and the virtual fourth wall.
The same solution is being used for the show Viva La Liga, in a collaboration with La Liga TV International. 
AR is also being used for analytical overlays during live soccer matches. Launched in August, wTVision’s AR³ Football generates AR graphics for analysis of offside lines and free-kick distances from multiple camera angles. When a director switches cameras, the system auto-recalibrates the AR, taking only a couple of seconds to get it back on air.
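Under the hood, a pitch-locked overlay like this reduces to projecting pitch-plane coordinates through each camera’s calibration. The simplified sketch below uses a planar homography with invented values; a production system’s calibration also models lens distortion.

import numpy as np

# Hypothetical 3x3 homography mapping pitch metres to image pixels.
H = np.array([[12.0, 0.5, 640.0],
              [0.2, -9.0, 700.0],
              [0.0, 0.001, 1.0]])

def project(x: float, y: float, H: np.ndarray) -> tuple:
    """Map a pitch-plane point (metres) to image pixel coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# An offside line has constant x on the pitch: sample it and project it.
offside_x = 31.5  # metres from the goal line (illustrative)
line_px = [project(offside_x, y, H) for y in np.linspace(0.0, 68.0, 20)]
# line_px is then rasterised over the broadcast frame. Cutting to another
# camera just swaps in that camera's H, which is why the system can
# re-calibrate and be back on air in a couple of seconds.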

