Thursday, 19 January 2017

Lost in Time: The Future of Immersive Media

Streaming Media 

A high-concept game show devised by members of The Matrix visual effects team, produced in Norway, and being sold by the makers of The X Factor could herald the future of multi-screen entertainment.


Lost in Time is a mash-up of live action filmed on a green screen with audience participation and real-time graphics, spanning TV, gaming, mobile, and e-commerce.
"We are pioneering a new category of digital entertainment called interactive mixed reality (IMR)," explains Bård Anders Kasin, co-founder of The Future Group (TFG), the tech masterminds and co-producer of the show with FremantleMedia. "The real and virtual worlds merge to produce a new environment where physical and digital co-exist."
"All the big TV evolutions have come about from technology," says Dug James, SVP of development production and part of the global entertainment development team for FremantleMedia. "The advent of mini-DV and portable camcorders enabled the first reality shows. Nonlinear editing and logging of rushes enabled on location rapid turnaround formats like Big Brother and text voting pushed interactivity with the Idol and Got Talent formats. Fremantle are always looking for those forward-looking tech changes that deliver new ways of entertainment or storytelling."
Nolan Bushnell, the founder of games developer Atari Corp. and a consultant to the project, claims that the fusion of gaming with TV can "bring a standard construct for new kinds of entertainment."
Lost in Time follows three contestants as they are transported into different eras, including the Roaring Twenties, Wild West, Space Age, Ice Age, Medieval Age, and the Jurassic period, where they compete in a series of challenges against the clock with the aim of winning a jackpot prize. Viewers can also participate within the same virtual world via a mobile app, where they can play against the show's contestants as well as other players from across the country in real time.
While the TV show can be enjoyed without playing the second screen game, the format is targeting family audiences for broadcasters concerned about shedding core viewers to OTT services and mobile.
TFG has spent two and a half years and amassed $42 million in funding to develop the core technology, which is based on Epic Games' Unreal Engine.
The nut it has managed to crack is synchronising computer animation with live action photography so that people can interact with virtual objects, or physical objects can interact with the virtual world, in real time.
It's a breakthrough that puts the company three years ahead of anyone else, according to co-founder Jens Petter Høili. "Various companies might have developed different aspects of the technology, but no one has put it all together in the way we have," he says.
The technique is becoming increasingly popular in visual effects filmmaking on films like Avatar or The Jungle Book, where a director can direct actors against a green screen while viewing virtual backgrounds in real time. TFG takes this a stage further: it is fully live-capable. The virtual worlds are created in advance, then rendered live and mixed with live action photography.
"A games engine wants to render in as few milliseconds as possible whereas broadcast cameras records at anywhere from 25 to 50 frames a second," explains Bård Anders. "To make this work for broadcast we had to work a way of getting frame rates from virtual and physical cameras to precisely match, otherwise this would not work."
Working with Canadian broadcast kit vendor and virtual set specialist Ross Video, TFG devised a means of binding the Unreal game engine with linear timecode.
To do this the team were granted access to the Unreal source code and reengineered it so that the rendered virtual image is genlocked with studio footage captured with SMPTE timecode.
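The broad shape of that frame pacing can be illustrated with a small, self-contained sketch. This is not TFG's or Ross's actual code, just an assumed-for-illustration loop: the engine render call is held to a fixed broadcast cadence rather than running as fast as possible, and each frame is stamped with an SMPTE-style timecode label derived from a shared frame counter, so rendered frames and camera frames can be matched one-to-one.

```cpp
// Minimal sketch (not TFG/Ross code): pacing a render loop to a fixed broadcast
// frame rate and labelling each frame with an SMPTE-style timecode, so the
// game-engine output can be lined up frame-for-frame with studio footage.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    const int fps = 50;                                   // assumed broadcast rate
    const auto frame_interval = std::chrono::nanoseconds(1'000'000'000 / fps);
    auto next_deadline = std::chrono::steady_clock::now();

    for (long frame = 0; frame < 250; ++frame) {          // five seconds of frames
        // renderVirtualFrame(frame);                      // placeholder for the engine render call

        // Derive an HH:MM:SS:FF label from the frame counter.
        long ff = frame % fps;
        long total_seconds = frame / fps;
        std::printf("%02ld:%02ld:%02ld:%02ld\n",
                    total_seconds / 3600, (total_seconds / 60) % 60,
                    total_seconds % 60, ff);

        // Wait until the exact start of the next frame period: a fixed cadence,
        // rather than rendering "in as few milliseconds as possible".
        next_deadline += frame_interval;
        std::this_thread::sleep_until(next_deadline);
    }
    return 0;
}
```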
To achieve a pinpoint-accurate chroma key, a set of HD system cameras has been customised to output 4:4:4 RGB rather than the chroma-subsampled 4:2:2 YCbCr images of conventional broadcast.
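Why full-resolution colour matters to the keyer can be seen in a toy example. The following is only an illustrative sketch with assumed threshold values, not the production keyer: a per-pixel matte computed directly from RGB, which a 4:2:2 pipeline would feed with colour samples shared between neighbouring pixels, softening exactly the edges a key depends on.

```cpp
// Illustrative green-screen matte (not the show's keyer), computed per pixel in
// full-resolution RGB.
#include <algorithm>
#include <cstdint>

struct RGB { uint8_t r, g, b; };

// Returns matte alpha in [0,1]: 0 = pure key colour (transparent), 1 = foreground.
float greenScreenAlpha(RGB p) {
    float r = p.r / 255.0f, g = p.g / 255.0f, b = p.b / 255.0f;
    // How strongly the pixel leans towards green relative to the other channels.
    float spill = g - std::max(r, b);
    // Map "very green" to 0 and "not green" to 1, with a soft transition band.
    const float lo = 0.05f, hi = 0.30f;   // assumed tuning thresholds
    float a = 1.0f - (spill - lo) / (hi - lo);
    return std::clamp(a, 0.0f, 1.0f);
}
```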
Contestants and physical objects are tracked by 150 IR sensors positioned behind the green screen. This arrangement also enables motion capture in real time; demos of this have included contestants mocked up as stormtroopers.
"In a movie you'd put markers on an actor and remove them in post," says Bård Anders. "We don't have that luxury so we needed a whole new way of linking the IR signals with the camera tracking."
Even the speed and racking of the robotic cameras have been tinkered with. Such systems are typically designed for slow-moving tracking shots and gentle zooms in newsroom virtual sets, not for filming people running or jumping around.
The cameras are loaded with Ross's UX VCC software, which bridges the robotic and manual camera systems' tracking output and the Unreal engine.
Accommodation also had to be made for changes in physical depth of field from focusing or zooming, which naturally alter the picture's bokeh (the visual quality of the out-of-focus areas of a photographic image). To do that, profiles of each individual lens are fed to the UX VCC, which replicates the distortion inside the virtual camera model in real time.
"If a physical prop in the studio and a virtual prop are not aligned even by a fractional amount then the whole chain pulls apart," says Bård Anders. "The background optics of each lens which distort when you change focus need to be exactly matched in the games engine."
Production takes place in a 500 sq m (5,381 sq ft) studio on the outskirts of Oslo. The setup includes a Technocrane, an automated ceiling camera, several SolidTrack markerless tracking systems, plus Steadicam units. A military-grade simulation platform is used for flying or driving game elements.
The idea is that broadcasters could either use this as a hub and fly in to shoot their version of the show or establish their own green screen base equipped with a package of TFG gear. Further production hubs in the U.S. and Asia are planned.
TFG will offer a package of pre-built virtual environments as well as a library of 3D assets for content creators to build their own worlds.
This allows a broadcaster to tailor the show to suit. A Chinese version of Lost in Time might include a future Shanghai or an ancient Han Dynasty world in contrast to a version produced for Argentina, for example.
The entire technical setup will be sold to broadcasters along with a licence to produce the format. Crucially, that required the system to operate within a standard production environment.
"We could produce over IP but Fremantle needed this to scale which means it has to be able to plug into studios throughout world," says Bård Anders. "It is also important for broadcasters to use this without needing to train people to a large extent when they operate it."
Familiar broadcast switchers and control surfaces, such as the Ross Carbonite Black and the Ross UX VCC, are integrated. Directors will, however, have to familiarise themselves with selecting from a virtually infinite number of camera angles inside Unreal with which to replay highlights of a game from any angle.
As part of the current production, contestants will be recorded in 3D using photogrammetry so that animated avatars of their facial likeness can be inserted at certain points in the game's storyline.
The need to standardise an advanced suite of technologies also explains the decision to produce in HD, although there is nothing stopping a 4K production except managing (and paying for) four times the data overhead.
"There is a balance to be struck when you increase the resolution between resolution and frame rate but with graphics hardware advancing so fast we don't anticipate this will hinder us for long," explains Øystein Larsen, TFG's VP of virtual.
The Oslo production began using NVIDIA GeForce GTX 1080 cards and is now running on NVIDIA's latest GPU architecture, Pascal, which is "40% faster," says Larsen.
Likewise, the first season of the show is not being made for virtual reality because of the small market penetration of VR goggles. A VR app has been tested on platforms including Oculus and is available for any broadcaster that wants it.
Nor will the first season, which airs on commercial broadcaster TV Norge from March, be live. Most of the 60 minutes of each show is pre-recorded, but through the use of the companion app viewers can interact with live elements incorporated into the broadcast—for example, competing against other viewers during the show to win real prizes.
"In our experience when you do something this new something always fails in the first season," says Bård Anders. "Nothing has failed yet, but we decided to remove one element of risk which is live production. However, this is possible and will likely happen from season two."
When it does, the format's possibilities begin to open even further. It would be possible, for example, for players of the mobile app to compete live with competitors in the studio and for the same virtual world played in by studio contestants to change and react according to actions of players at home.
"This production is proof of concept," says Bård Anders. "Once we've nailed this we can really start to let our imagination's fly."

The Matrix Connection

Bård Anders was a technical director at Warner Bros during the making of The Matrix trilogy when he came up with the initial idea.
"When we started to experiment with gaming technology in the production pipeline I thought at some point we have to be able to do this in real time," he says.
In 2012 he founded The Future Group in Oslo by merging his startup with another tech company led by Høili, a serial entrepreneur who sold Høili Group to private equity firm Industri Kapital in 2005 and founded several ventures, including EasyPark.
Bård Anders called Larsen—lead technical director at Warner Bros during Bård Anders' time there—to head R&D on the virtual systems.
Another Matrix VFX alumnus, Kim Libreri, who worked alongside Larsen at the Manex and ESC facilities, is now CTO at Epic Games. The Matrix link continues with Michael Gay, who also worked on the trilogy with Bård Anders, Larsen, and Libreri and is now Epic's director of cinematic production.
"After 14 years we are all back working together again," says Bård Anders.

The Fremantle Connection

FremantleMedia will sell the show at the MIPTV sales market in April. James emphasises that for all the technology, the content itself has to be compelling to watch.
"We've spent a huge amount of time on how to tell real emotional stories. We knew this wasn't going to be a game show for gamers—this had to be for families. The tech has to be stable since it needs to perform day after day but we have to give viewers a reason to interact and we do that, we hope, by telling human stories featuring genuine personalities."
Another key for Fremantle is advertising and sponsorship. Product placement is a given, with logos designed to mimic the virtual environment (e.g., a Pepsi logo styled to fit a saloon in the Wild West).
TFG is also working with brands to create advertising spots using the Unreal Engine such that viewers need not leave the show's virtual universe.
"Instead of leaving the show at a commercial break, we enter another virtual world which is content or a story created by a brand," says Stig Olav Kasin, TFG's chief content officer and former TV producer of Norway's version of The Voice.
The app's games can be played offline when the show is off-air.
"Sponsors are attracted because they can go deeper into storylines of the game rather than just having a bumper, plus they can have the activation all week," says James.
Users can download the app to iOS or Android and invite friends to join through Facebook. "We are concentrating on using the biggest platform for today, which is TV, but it is designed to transition the viewers and the format over to the mobile platform from tomorrow," says Bård Anders.

The Next Frontier

The patented platform underlying the production, which Bård Anders refers to as an "integration architecture," is branded Frontier and is being sold and distributed as a separate product by Ross Video. The market for this package is post facilities, studios, and producers wanting to advance virtual set capabilities with the Unreal engine and produce IMR content.
"The most complex parts are the software which integrates with the hardware system and allows us to connect everything in the studio with the virtual graphics under control of a standard broadcast HD switcher," says Bård Anders.
Other customers might include producers wanting an off-the-shelf means of pre-visualising content in real time, rather than going to the expense of the proprietary systems currently built for films like The Jungle Book.
There are suggestions that Frontier could be used to reinvigorate existing TV formats. Instead of constructing a new physical set, why not build a brand new virtual one with Unreal's photorealistic graphics?
Outside of TV and film, TFG is eyeing applications in industry, e-learning, medical, and e-sports. For the latter, the tools could be used to insert game commentators into League of Legends, say, to analyse players' moves from within the game itself. Kind of like Tron.
Amusement parks are another potential home, offering an advance on current 4D simulation rides by putting people inside story worlds like Harry Potter or the Star Wars universe.
"We've been approached by a lot of the biggest companies around the world," says Høili.
Another spin-off product is an augmented reality graphics system which could be overlaid on soccer matches, for example, and linked with in-game sports data from the likes of Opta. Users could then pull up more data, such as stats or analysis, via touchscreen.
"What we are producing with now is like the first smartphone," says Bård Anders. "There will be a natural progression of this technology."
As depth cameras advance and GPUs accelerate the processing of 3D maps interfaced with animation software, the green screen could be removed and content could be created outside a studio.
A company called Owlchemy is already developing this. Using a ZED stereo depth camera, it can perform real-time mixed reality compositing, with a custom shader and plugin that matte out the user and depth-sort them directly into the game engine itself.
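The underlying idea is simple depth sorting: for every pixel, compare how far away the real surface seen by the depth camera is against how far away the virtual surface is, and keep whichever is nearer. The following is a conceptual sketch of that rule, not Owlchemy's shader.

```cpp
// Conceptual depth-sorted compositing: the nearer surface wins, so a person in
// front of a depth camera can occlude, or be occluded by, virtual geometry
// without any green screen.
#include <cstdint>

struct Pixel { uint8_t r, g, b; };

Pixel compositeDepthSorted(Pixel realColor,    float realDepthMetres,
                           Pixel virtualColor, float virtualDepthMetres) {
    // Exactly what a Z-buffer does, applied across the real/virtual boundary.
    return (realDepthMetres < virtualDepthMetres) ? realColor : virtualColor;
}
```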
The merger of platforms and formats is a clear trend and could indeed signal the future of media. Finnish talk show Tilt introduced a virtual reality component last year. The series, which broadcasts on TV6 and is produced by indie Reflect, has its contestants compete in VR games.
French distributor Kabo recently acquired the rights to sell the format internationally alongside another Reflect format which makes e-sports games available for viewers to play on mobile, connected TV and VR headsets.

Unreal TV

Fremantle is not the only media company exploring Unreal for new entertainment formats since Epic made the game engine available for free.
"Games, architecture, engineering, design, education and automotive not so much separate as they are converging into a digital content creation industry - an industry in which everything is interoperable and where Unreal is the engine that links them all together in real time," Epic chief executive Tim Sweeney told the Game Developers Conference.
Nickelodeon is developing an animated kids show codenamed Project 85 using the engine. Unreal's user forum quotes Nickelodeon saying that "the flexibility of the tool allows us to take bigger risks as we can now iterate on the fly and push into areas that were once cost and time-prohibitive. Using Unreal, kids can watch our stories, play with our characters, and invite them into their living room like never before."
ILM X Lab is also using Unreal to create immersive content in the Star Wars universe.

NewTek and Unreal

Announced in September, NewTek's Network Device Interface (NDI), a video-over-IP production protocol, will be implemented for Unreal Engine 4.
Brian Olson, vice president of product management, is convinced of the importance of the convergence of gaming engines and linear production, as well as the impact on e-sports and e-gaming.
"The convergence of linear video production and gaming engines has been taking place over the past several years in an organic fashion," he says. "In a quest for more and more realistic rendering, producers turned to gaming engines to provide state-of-the-art real-time 3D rendering of virtual environments. Gaming engines have gone beyond what has been possible with traditional virtual rendering engines, which were basically character generators to begin with. Things like global illumination, real-time reflections, and real-time shadows are difficult to do with most traditional virtual set products. Manufacturers are only now trying to bring turn-key products to market that utilize gaming engines.
He expects that colossal hard sets taking up thousands of square feet will, in most cases, become a thing of the past.
"Virtual environments created by gaming engines will be nearly as realistic and probably have more impact for much less money," he says. "Gaming engines will be even more prevalent in their native space. E-sports and e-gaming events will use IP video streams from actual electronic games and competitions for linear broadcasting and digital streaming. They key is getting video in and out of computers in an easy and high-quality fashion. That's where NewTek's NDI IP video standard really helps. Epic Games has implemented NDI in UE4 to help users with this problem."
