Wednesday, 22 March 2017

Why Mixed Reality Is the Future of Immersive Broadcasting

StreamingMedia

Intel and Microsoft are among those building tools for a merged reality video experience that could be streamed directly to the home.


Mixed Reality merges real and virtual worlds to produce environments where physical and digital objects co-exist and respond to users in real-time. There are some who believe this could be the future of entertainment.
Distinct from the full immersion of virtual reality (VR), and from augmented reality (AR), which superimposes a graphical layer over the real world, Mixed Reality (or merged reality) overlays synthetic content that is anchored to the real world and, importantly, interacts with it in real time.
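To make that distinction concrete, the short Python sketch below is a purely illustrative model (the types are invented for this example, not taken from any vendor SDK) of the per-frame loop an MR runtime performs: virtual objects are anchored to surfaces reconstructed from the real scene and re-positioned every frame so they stay locked to, and can respond to, the physical world.

from dataclasses import dataclass

# All types here are hypothetical, for illustration only.

@dataclass
class Pose:
    position: tuple      # (x, y, z) in metres, world coordinates
    orientation: tuple   # quaternion (x, y, z, w)

@dataclass
class Surface:
    pose: Pose           # latest estimate of a reconstructed real-world surface

@dataclass
class VirtualObject:
    pose: Pose

@dataclass
class Anchor:
    surface_id: str      # which real surface the virtual object is tied to
    offset: tuple        # local offset from that surface's origin

def update_frame(surfaces, anchors, objects):
    """One frame of a merged-reality loop: re-pose every anchored virtual
    object against the newest estimate of its real-world surface."""
    for obj_id, anchor in anchors.items():
        surface = surfaces.get(anchor.surface_id)
        if surface is None:
            continue  # tracking lost: hide the object or re-localise it
        objects[obj_id].pose = Pose(
            position=tuple(p + o for p, o in zip(surface.pose.position, anchor.offset)),
            orientation=surface.pose.orientation,
        )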
The BBC’s Research and Development division is investigating Multiplayer Broadcasting [http://www.bbc.co.uk/rd/projects/multiplayer-broadcasting] which blends live TV with the interactivity of online games by placing potentially hundreds of thousands of avatars alongside presenters in a shared virtual world. It sees this as “the next iteration of audience participation shows in a broadcast-VR enabled future.”
Next week, a pioneering example of the format will debut on Norway's TV Norge. Producer Fremantlemedia describes Lost in Time as an interactive mixed reality format that presents contestants playing "inside" a series of video games and incorporates audience play-along via second screens. A StreamingMedia.com article described it in depth in January.
While the first iteration has been pre-recorded, Fremantlemedia and its co-producer, The Future Group, see the TV Norge production as a testing ground and showcase for future developments, including streaming to a VR app and real-time viewer interaction with the video game world and studio contestants.
As Lost in Time shows, audiences can view MR content on conventional flat screens, but the real potential lies in the interactivity afforded by streaming to headsets.
Magic Leap is perhaps the most fabled prototype, but other VR/AR and holographic headsets will ship ahead of it.
Acer, for example, shipped a Windows headset to developers this month that supports both VR and AR experiences. Developers including the U.K.'s Rewind VR are creating content for Microsoft HoloLens. Metavision has released an SDK for its Meta 2 AR headgear, and later this year Intel will launch Project Alloy, a headset that uses Intel RealSense cameras to capture data about the user's real-world environment.
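Intel publishes open-source bindings for its RealSense cameras, so the kind of depth data Project Alloy relies on can be illustrated in a few lines of Python. The sketch below uses the real pyrealsense2 library, though the stream settings are assumptions and Project Alloy itself is not a publicly programmable device.

import pyrealsense2 as rs

# Start a depth stream from an attached RealSense camera
# (resolution and frame rate here are illustrative choices).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()      # block until a frameset arrives
    depth = frames.get_depth_frame()
    # Distance, in metres, to whatever sits at the centre of the image.
    print(f"Distance at image centre: {depth.get_distance(320, 240):.2f} m")
finally:
    pipeline.stop()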
According to Intel’s Sales Director, Steve Shakespeare, merged reality is more dynamic and natural than other virtual world experiences such as virtual reality, since it allows the user to experience a unique blend of physical and virtual worlds.
“In merged reality, viewers can seamlessly interact with and manipulate their environment in both physical and virtual interactions,” he says. “Similarly, while augmented reality overlays some digital information on the real world, it does not integrate the two in the way that merged reality does.”
The technical challenges are not small, though. In broadcast in particular, the biggest will be how to connect merged reality simulations live.
“At the moment, no networks currently exist that can deliver live mixed reality visuals due to the large amount of information that has to be transmitted,” Shakespeare says. “Significant investment in network infrastructure, such as 5G, will be crucial to creating live mixed reality in broadcast.”
Intel is working with key industry players, including Ericsson and ZTE, to make 5G a reality. The challenge beyond this lies in how developers analyse the full range of data that mixed reality requires: processing such large volumes of data in real time, as a live broadcast would demand, takes considerable computing power.
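A back-of-envelope calculation shows why the data rates are daunting; the resolution, frame rate, and bit depths below are illustrative assumptions, not figures from Intel.

# Rough data rate for a single uncompressed RGB-D (colour plus depth) stream.
width, height, fps = 1920, 1080, 60
colour_bits_per_px = 24    # 8-bit RGB
depth_bits_per_px = 16     # 16-bit depth map

bits_per_second = width * height * (colour_bits_per_px + depth_bits_per_px) * fps
print(f"{bits_per_second / 1e9:.1f} Gbit/s per camera, before compression")
# Roughly 5 Gbit/s: far beyond a typical 4G link, which is why aggressive
# compression and next-generation networks such as 5G are seen as prerequisites.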
Intel is addressing this. “In the next year, we plan to include an i7 Kaby Lake processor and Movidius technology into our Project Alloy merged reality headset which will make vision processing seamless, and make the technology invaluable for the live broadcast environment,” explains Shakespeare.
“Secondly, with many merged reality devices, you still end up tethered to external sensors or cameras, which present serious logistical challenges when space is a crucial asset. This is why, in Intel’s Project Alloy headset, the Intel RealSense cameras are attached directly to the headset to allow you to move around the room freely.”
Shakespeare describes the RealSense camera as "seeing" like a human eye. It can sense depth and track human motion. As a result, the experience “becomes much more organic” for the viewer.
Project Alloy is constantly evolving to match the speed of the firm’s next generation technology development. “As we keep incorporating new generations of vision processing technologies, faster processors, and more nuanced sensors, merged reality will become increasingly specialized in intertwining the real and the digital world,” he predicts. “This will create a new generation of immersive broadcasting.”
Further out, Intel expects to introduce a range of additional sensory technologies, such as haptics, to blur the boundary between real and virtual even further.
“Our sense of touch is incredibly important, and we’re used to having it in every interaction we have,” says Shakespeare. “This will be crucial for creating the most immersive and natural experience possible.”
U.K. virtual reality and VFX tools developer Foundry is also investigating mixed reality and the virtual production techniques needed to create MR content.
According to Dr. Jon Starck, head of research, mixed reality means the content is connected with the environment around the viewer. This, he says, allows for a more immersive, interactive experience in which the viewer and presenter are essentially able to communicate and steer the delivered content.
“The future of mixed reality could evolve broadcast into a completely non-linear format, adapting content according to the interests or direction of the viewer,” Starck says.
Early-stage examples of these formats are already in operation at national broadcasters. In 2016 the BBC showcased CAKE (Cook-Along Kitchen Experience), an object-based broadcasting experiment in which customised video content adapted to the viewer's recipe preferences.
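Object-based broadcasting of this kind essentially assembles a programme at play-out time from a pool of tagged media objects. The Python sketch below is a simplified, hypothetical illustration of that selection step, not the BBC's actual CAKE implementation.

from dataclasses import dataclass

@dataclass
class MediaObject:
    """One self-contained piece of a programme (a clip, caption, or audio bed)."""
    object_id: str
    tags: set           # e.g. {"vegetarian"}; empty means "always include"
    duration_s: float

def assemble_programme(objects, viewer_prefs):
    """Keep only the objects that match the viewer's stated preferences,
    preserving the editorial order in which they were authored."""
    return [o for o in objects if o.tags & viewer_prefs or not o.tags]

catalogue = [
    MediaObject("intro", set(), 20.0),
    MediaObject("meat-main", {"meat"}, 180.0),
    MediaObject("veg-main", {"vegetarian"}, 175.0),
    MediaObject("plating", set(), 40.0),
]

playlist = assemble_programme(catalogue, viewer_prefs={"vegetarian"})
print([o.object_id for o in playlist])   # ['intro', 'veg-main', 'plating']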
However, when it comes to MR experiences, interactive non-linear media is still very much in the research and development stage. One of the main challenges Foundry comes up against with this format is visual quality.
“If you’re incorporating real-time interaction—say of a television presenter or actor—it can be exceptionally difficult to deliver the level of quality that we are accustomed to when watching standard television,” suggests Starck.
There are many ways in which this process is being streamlined to create a more seamless interactive environment. One method is the use of digi-doubles: photo-real, interactive content created with game engines. Foundry builds photo-real 3D digital character models that can be animated. Through this process, according to Starck, human scans are used to build photo-real models that are incorporated into the virtual or mixed reality, creating a more realistic outcome, both visually and in interaction.
Another method is Holoportation, which Microsoft describes as “a new type of 3D capture technology that allows high-quality 3D models of people to be reconstructed, compressed, and transmitted anywhere in the world in real-time. When combined with mixed reality displays such as HoloLens, this technology allows users to see, hear, and interact with remote participants in 3D as if they are actually present in the same physical space.”
According to Starck, Holoportation is a primary example of R&D in the area that could lead to the streaming of mixed reality content to the home.
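The pipeline Microsoft describes is, in essence, capture, compress, transmit, and render. The Python sketch below is a deliberately simplified, hypothetical illustration of the compress-and-packetise step for a single point-cloud frame; none of it is Microsoft's Holoportation code, and the numbers are arbitrary.

import struct
import zlib

import numpy as np

def pack_pointcloud_frame(points):
    """Compress one captured point-cloud frame and length-prefix it for
    transmission. `points` is an (N, 3) float32 array of x, y, z in metres."""
    payload = zlib.compress(points.astype(np.float32).tobytes())
    return struct.pack("!I", len(payload)) + payload

# Illustrative use: 100,000 points standing in for one captured person.
frame = np.random.rand(100_000, 3).astype(np.float32)
packet = pack_pointcloud_frame(frame)
print(f"raw {frame.nbytes / 1e6:.1f} MB -> packed {len(packet) / 1e6:.1f} MB per frame")
# At 30 frames per second, even this crude packing implies hundreds of
# megabits per second, which is why compression and network capacity dominate.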
