Monday 15 January 2018

Mixed Reality - Television on Steroids


Digital Studio

There is a vision of television – or, more broadly, of the storytelling possibilities of the entertainment industry – in which viewers will no longer be confined to their living rooms, but can experience the action just like live studio audiences or even gameshow contestants. Space and time will no longer be constraints, with viewers travelling through new universes without setting foot in the outside world.
This vision is being harnessed by content creators as mixed reality, or MR - “the result of blending the physical world with the digital world,” according to Microsoft. Rather than just adding artificial elements to a real scene as with augmented reality (AR), or creating a completely artificial environment as with virtual reality (VR), MR takes reality, digitises it, and then places all or parts of it into an environment that mimics the real world in real time. MR can be wholly immersive, or it can physically blend with a real-world view.
MR will be one of the technologies that will help spur a “fourth industrial revolution,” according to Cindy Rose, CEO of Microsoft UK.
Along with cloud services, AI and quantum computing, “these technology shifts will reshape our lives, our businesses, our organisations, markets, and society, and we believe that these technology shifts will deliver incredible benefits,” she said.
So how can the TV sector use this new technology? According to Stig Olav Kasin, chief content officer at The Future Group (TFG), the biggest challenge for content creators is developing a storytelling structure suitable for TV.
“Viewers are used to a linear 2D story. Creating a unified experience where people can move around freely like in a game simply isn’t possible, or at least no one has cracked it yet,” he says. “The danger for content creators is that they fall into the trap of making the same stories we have today, simply with new visual effects.”
Viewers should be able to participate in the story and universe – on equal footing and at the same time as the contestants.
This was TFG’s goal when it created the gameshow format Lost in Time in partnership with The X Factor and ‘Idol’ producer Fremantle Media International (FMI). Across six virtual worlds (including the Wild West, the Ice Age, the Medieval Age and the Jurassic Period), contestants compete in different challenges in a studio show. The contestants and props are real, but everything you see on screen is visual effects, akin to what was previously only possible in Hollywood movie productions. Better still, the visual effects run in real time in a full multi-camera setup.
The big departure from traditional gameshows, though, isn’t just the graphics. While the show airs on TV, viewers can compete on an equal footing with the contestants and be part of the story via a mobile or tablet app. The best players even win the same prize as the contestants. This takes MR one step further: the real and virtual worlds become one universe, allowing TV contestants and TV viewers to be part of the same storyline – one world, one story, one experience. This is what TFG calls interactive mixed reality (IMR).
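TFG has not published how the Lost in Time app talks to the show, but the basic play-along loop can be sketched: broadcast and app share a timeline, and the app fetches whichever challenge is currently on air and submits the viewer’s result against it. Everything below – the base URL, endpoints and field names – is hypothetical, purely to illustrate the shape of such a second-screen loop.
```python
# Hypothetical second-screen client. The endpoints, fields and timing model
# are illustrative only -- TFG has not published the Lost in Time app protocol.
import time
import requests

API = "https://example.com/lost-in-time"   # placeholder base URL

def run_local_challenge(challenge_id: str, ends_at: float) -> int:
    """Stand-in for the on-device mini-game; returns the viewer's score."""
    return 0

def play_along(viewer_id: str) -> None:
    """Poll for the challenge currently on air and submit the viewer's result."""
    while True:
        state = requests.get(f"{API}/current-challenge", timeout=5).json()
        if state.get("status") != "live":
            time.sleep(2)                   # nothing on air; try again shortly
            continue

        challenge_id = state["challenge_id"]
        ends_at = state["ends_at"]          # epoch seconds when the round closes
        score = run_local_challenge(challenge_id, ends_at)

        # Results are tied to the challenge id, so the viewer is scored against
        # the same round the studio contestants are playing on air.
        requests.post(f"{API}/submit-score", json={
            "viewer_id": viewer_id,
            "challenge_id": challenge_id,
            "score": score,
        }, timeout=5)

if __name__ == "__main__":
    play_along("viewer-123")
```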
“In the 2000s, viewers got the power to vote for their favourite contestant in talent competitions and reality shows,” says Kasin. “It was also good business for the broadcasters. In the IMR universe, however, viewers can be participants in their own right. At the same time, broadcasters and advertisers get the chance to communicate directly with the viewers and gather data about their interests and behaviour, like social media companies can.”
Lost in Time premiered on Discovery Communications-owned channel TVNorge, Norway’s second largest commercial channel, last spring. According to TFG co-founder and CEO Bård Anders Kasin, the show increased viewing in the channel’s slot by 64% and saw the mobile game played 7 million times during the season (from a local population of just 5 million). “The interactivity was extremely high – much higher than we expected,” he says.
Now the show is coming to the Middle East courtesy of Dubai TV. It is destined to be “the first of its kind in the Arab world,” according to Sarah Al Jarmen, Dubai TV’s director.
The Dubai TV version of the format is set to air across the channel’s pan-regional Middle East and North African footprint. It won’t differ radically from the Norwegian version, although it will be in Arabic and there will be a greater emphasis on mobile game interactivity. Two seasons of 13 episodes each – 26 in total – will be produced out of the Oslo hub, with contestants flown from the UAE to Norway, and will be shown in 22 territories across the Middle East and North Africa.
In a statement, Anahita Kheder, FMI’s senior VP for the Middle East and Africa, called the show “hugely ambitious and disruptive” and said the company was very excited “to bring this loud and buzzy format to the Middle East.”
At the moment, VR headsets aren’t distributed widely enough to justify a primetime slot for a live show, which is why TFG says it developed the games for iOS and Android devices, giving a large global audience the chance to compete and engage with the content. “However, once VR headsets are more widespread, it will open up a new world of possibilities for the TV industry to blend the best elements of gaming and traditional TV,” says Kasin.
Signs of this are already evident. In partnership with the Turner-IMG ELeague, TFG injected CG characters into the broadcast of the Street Fighter V Invitational esports event last spring. Characters from the game struck fighting poses on the studio set, viewable by the studio audience on adjacent screens and by Turner’s TV audience. It took a couple of months to produce the characters, but the resulting animations were displayed live, combined with the physical sets and presenters, without any post-production.
TFG was at it again in October for the ELeague’s Injustice 2 World Championship, broadcast on TBS, Twitch and YouTube. Among the 3D character animations presented to viewers at home, as if interacting with the studio audience, was Batman. This promises to be the first step in a wider deal to bring more superhero characters from the Warner Bros. stable into mixed reality.
It is worth noting that Bård Anders Kasin was a technical director at Warner Bros during the making of The Matrix trilogy, which is when he came up with the initial idea for the mixed reality platform.

A New Frontier
The technology platform underlying TFG’s MR format was developed with Canadian broadcast gear maker Ross Video and is being marketed as a standalone software application by Ross.
Branded Frontier, it’s promoted as an advanced form of virtual set for the creation of photorealistic backgrounds and interactive virtual objects.
At its heart is the Unreal game engine from Epic Games, used to render the background scenery with features such as particle systems, dynamic textures, live reflections and shadows, and even collision detection. It works in tandem with Ross’s XPression motion graphics system, which renders all the foreground elements.
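As a rough illustration of that division of labour – engine-rendered backdrop at the back, keyed live action in the middle, foreground graphics on top – here is a minimal compositing sketch in Python/NumPy. It assumes each layer already arrives as an RGBA frame of the same size with the virtual background fully opaque; the real systems do this on dedicated video hardware, not in Python.
```python
# Minimal layering sketch: virtual background, keyed live action and foreground
# graphics composited back to front. Each layer is an HxWx4 float RGBA frame in
# [0, 1]; the background layer is assumed fully opaque.
import numpy as np

def over(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Standard 'over' operator: lay fg on top of bg using fg's alpha channel."""
    alpha = fg[..., 3:4]
    out = bg.copy()
    out[..., :3] = fg[..., :3] * alpha + bg[..., :3] * (1.0 - alpha)
    out[..., 3] = fg[..., 3] + bg[..., 3] * (1.0 - fg[..., 3])
    return out

def compose_frame(virtual_background, keyed_live_action, foreground_graphics):
    """Back to front: engine-rendered scenery, then the keyed camera feed
    (talent and props), then foreground graphics such as scoreboards."""
    frame = over(keyed_live_action, virtual_background)
    return over(foreground_graphics, frame)
```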
Of course, game engines were never designed for broadcast. Unreal and the Unity engine are superb at rendering high polygon counts, textures and specular lighting as fast as possible on a computer, but they do not natively fit with broadcast signals, which must correspond to the fixed frame rates of SMPTE timecode. When it comes to raw rendering performance, however, game engines are a real step ahead of anything in a conventional broadcast virtual set: the difference between a few milliseconds per frame and the 20-40 milliseconds a conventional system needs to hold 25 to 50 frames a second.
What TFG and Ross have done is rewrite the Unreal code so that the frame rate output by the game engine’s virtual cameras matches the frame rate recorded by the robotic studio cameras. In doing so they have put photorealistic rendering into the hands of broadcasters. The virtual worlds are created in advance with features like global illumination, real-time reflections and real-time shadows, then rendered live and mixed with live-action photography.
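The actual modifications to Unreal are proprietary, but the underlying requirement – rendering on the broadcast clock rather than as fast as the GPU allows – can be sketched as a fixed-timestep loop locked to, say, 25 frames per second:
```python
# Sketch of a render loop paced to a broadcast frame rate (25 fps here).
# The real Frontier/Unreal changes are not public; this only illustrates the
# idea of rendering to the studio clock instead of as fast as possible.
import time

FRAME_RATE = 25                  # e.g. 25p / 1080i50 delivery
FRAME_TIME = 1.0 / FRAME_RATE    # 40 ms per frame

def run(render_frame, duration_s: float = 10.0) -> None:
    start = time.perf_counter()
    frame = 0
    while time.perf_counter() - start < duration_s:
        render_frame(frame)                        # draw the virtual scene
        frame += 1
        sleep_for = start + frame * FRAME_TIME - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)                  # hold until the next tick
        # if sleep_for <= 0 the frame was late; a real system would flag the
        # dropped frame rather than silently drift from the house clock

if __name__ == "__main__":
    run(lambda n: None)   # stand-in renderer, just demonstrates the pacing
```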
Fremantle is also promoting the format’s advertising potential. Product placement could simply be ‘written into’ backdrop animations designed to mimic the virtual environment (think of a Pepsi logo styled to fit a saloon in the Wild West). Commercials could also be created in Unreal Engine so that viewers need not feel they are leaving the show's virtual universe. Sponsorship will be tailored to the MENA market.
Other format sales of Lost in Time are pending – including in China – while Bård Anders Kasin reveals that TFG is working on a new virtual reality component for the series.

Karim & Noor
Blink Studios, the Abu Dhabi-based content creator, has incorporated MR experiences into its animated ‘edutainment’ series Karim & Noor. The ‘holotoon’ tackles key educational messages, touching on social-emotional learning, and puts children at the centre of an immersive storytelling experience via MR technology.
According to Nathalie Habib, GM and executive producer at Blink Studios, the best way to describe MR is to define the function of the most advanced mobile kit that offers the experience.
“Microsoft HoloLens is the first self-contained, untethered head-mounted holographic computing device for mixed reality,” she says. “It blends 3D holographic content into your physical world, giving holograms context and scale to interact with digital content and the world around you. HoloLens lets you go beyond the screen with holograms to visualize and work with digital content in relation to your real world, unlocking new insights and capabilities.”
She adds, “MR does not block you from your reality. That is the main impediment it overcomes. It engages you while you are still in touch with your reality which is favoured especially in verticals related to education and healthcare where real communication is equally important to virtual engagement.”
Producers wanting to work in MR will need the skills to build 3D holographic content for Microsoft’s Windows Mixed Reality software using the HoloToolkit for the Unity game engine, and to create holograms controlled by gaze, gestures and voice.
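In practice that means C# scripting inside Unity, but the core of gaze interaction is simple ray geometry: cast a ray from the head pose along the gaze direction and select the first hologram it hits. The sketch below shows that idea in plain Python against spherical stand-in colliders; the scene and names are invented for illustration.
```python
# Engine-agnostic sketch of gaze selection: cast a ray from the head pose and
# return the nearest hologram it intersects. HoloToolkit does this in Unity/C#
# with real colliders; spheres stand in for them here.
from dataclasses import dataclass
import math

@dataclass
class Hologram:
    name: str
    centre: tuple    # (x, y, z) position in metres
    radius: float    # proxy collider size in metres

def gaze_pick(head_pos, gaze_dir, holograms):
    """Return the closest hologram hit by the gaze ray (gaze_dir must be unit length)."""
    best, best_t = None, math.inf
    for h in holograms:
        oc = tuple(c - p for c, p in zip(h.centre, head_pos))   # head -> centre
        t = sum(a * b for a, b in zip(oc, gaze_dir))            # projection onto ray
        if t < 0:
            continue                                            # behind the viewer
        closest = tuple(p + t * d for p, d in zip(head_pos, gaze_dir))
        dist_sq = sum((a - b) ** 2 for a, b in zip(closest, h.centre))
        if dist_sq <= h.radius ** 2 and t < best_t:
            best, best_t = h, t
    return best

if __name__ == "__main__":
    scene = [Hologram("Karim", (0.0, 0.0, 2.0), 0.3),
             Hologram("Noor", (1.0, 0.0, 3.0), 0.3)]
    print(gaze_pick((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene))   # looking straight ahead -> Karim
```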
“Science fiction becomes science fact, and Unity and Universal Windows Platform app developers are leading this revolution,” says Habib. “Blink is not about just using and promoting MR devices but actually creating, developing and producing original content for them. We started our education in MR content creation with our own IP, Karim & Noor, by challenging and grappling with storytelling and immersion, and trying to understand the capabilities of the latest available technology to deliver an engaging story experience. We are working on identifying further content that will leap the MENA region into MR experiences, which are set to become mainstream in the next two to three years.”
