Wednesday, 4 November 2015

Digital Media Will Explode Out of its Fixed Limits

IBC
The media world is on the cusp of change so profound that it will revolutionise our whole relationship with information and entertainment. This key message was infused throughout IBC2015, with deep implications across TV, mobile and cinema.
“We have reached an inflection point in how we all interact with digital media,” declared keynote speaker Mark Dickinson, SVP & General Manager of ARM's Media Processing Group. “Technology is about to enable us all to become more intimate with the digital world.”
Dickinson demonstrated an ARM R&D project as an insight into how we might interact with digital devices in the near future. A small image sensor could track a person's hand and enable the user to interact with a computer graphic world running on a laptop.
“I can move around and control my environment in a much more intuitive way than was ever possible before,” said Dickinson, adding that it was possible to put the technology into a mobile phone. Since ARM designs the microprocessors in 97% of mobile phones, we can take this as read.
“We are on the cusp of a very different era,” he stated. “Mobile has revolutionised the way we consume content but content has so far been designed for fixed screens and simply digitised and made portable. This is just the first step. What is really exciting is the next step which will fundamentally change the ways we create content that work far more intimately with our digital environment.”
He said he believed this was not a threat to broadcasters, but an opportunity for content creators.
Filmmakers are making increased use of systems which allow them to shoot live action and see the results blended into CG environments in realtime on set or on location. A director like James Cameron might use an iPad as the viewfinder to compose scenes mixed with real and CG elements.
Such virtual cinematography is already blurring the boundaries between production and post, and is now being transposed to the consumer as virtual and augmented reality experiences.
ILM's experimental lab ILMxLab is experimenting with iPad- and Oculus Rift-based virtual reality technology that lets movie fans enter a movie, interact with it and navigate through scenes.
“If we can allow directors to step into that world (‘Star Wars’, say, or ‘Jurassic Park’) the next move is to allow the audience to step in,” said Mohen Leo, ILM VFX Supervisor, at IBC. “Every time we create a ‘Star Wars’ project we create hundreds of digital models – characters and planets – which could be recombined into new stories in VR.”
ILM imagines a world where cinema becomes realtime and reactive, where audiences can virtually inhabit the worlds of their favourite characters and use them to tell their own tales, and where there will exist an interconnected universe of story experiences that let audiences immerse themselves to whatever degree they want.
“We are entering an age of immersive entertainment where it is possible to collapse the walls that have historically separated us from the story,” said Leo.
Perhaps the greatest paradigm-busting application will be in broadcast. If adopted, technologies broadly described as object-oriented broadcasting would shake the very foundations of televised media.
According to the BBC, which is very energetic in researching the area, in the world of object-oriented broadcast, a programme is “like a multi-dimensional jigsaw puzzle that is sent in pieces and can be reconstructed on-the-fly in a variety of ways just before the programme is presented to the viewer.” The solutions to the puzzle are provided by maps that tell the system where the pieces belong and how to combine them. Default versions of these maps are sent along with the jigsaw pieces. In some cases the map may be modified by the viewer to create a personalised experience. The map may also be modified by a system of sensors that perceives certain aspects of a user's relationship to their viewing or listening environment.
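The BBC's jigsaw metaphor can be made concrete with a short sketch. The code below is purely illustrative (the object names, the map structure and the `assemble` function are assumptions, not any real BBC API): a programme arrives as separate media objects plus a default map, and the receiver rebuilds the programme on-the-fly, letting the viewer (or a sensor) override parts of the map.

```python
# Hypothetical sketch of object-based assembly. The "pieces" are
# individual media objects sent alongside the programme; the "map"
# says which pieces to combine and how.
objects = {
    "video_main":   {"type": "video", "lang": None},
    "dialogue_en":  {"type": "audio", "lang": "en"},
    "dialogue_cy":  {"type": "audio", "lang": "cy"},
    "commentary":   {"type": "audio", "lang": "en"},
    "subtitles_en": {"type": "text",  "lang": "en"},
}

# The default map travels with the pieces: main video, an English
# dialogue/commentary mix, and no subtitles.
default_map = {
    "video": "video_main",
    "audio": [("dialogue_en", 1.0), ("commentary", 0.6)],
    "text":  None,
}

def assemble(default_map, viewer_overrides):
    """Reconstruct the programme just before presentation: start from
    the broadcast default map, then apply the viewer's overrides."""
    plan = dict(default_map)
    plan.update(viewer_overrides)
    return plan

# A viewer who prefers Welsh dialogue, louder commentary and subtitles.
personalised = assemble(default_map, {
    "audio": [("dialogue_cy", 1.0), ("commentary", 0.9)],
    "text":  "subtitles_en",
})
```

The point of the sketch is that personalisation is cheap: the pieces never change, only the map that solves the puzzle.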
“I think the idea is profound and little understood,” said BBC CTO Matthew Postgate, also an IBC2015 speaker. “It is about moving the whole industry away from thinking of video and audio as hermetically sealed and toward an idea where we are no longer broadcasters but data-casters creating information and delivering a computer graphic model of reality. That opens up all sorts of creative questions around veracity and flexibility.”
The first iterations of what an object-based media experience might be like will probably come from audio. In the US, two proposals to update the audio delivery of next-generation (Ultra HD, High Dynamic Range) broadcast are being considered by the Advanced Television Systems Committee for its forthcoming ATSC 3.0 standard.
The competitors are Dolby, with its AC-4 technology, and an alliance of Fraunhofer IIS, Technicolor and Qualcomm, which has developed MPEG-H.
Both groups promise greater interactivity and greater immersion, by letting viewers adjust the presence of various audio objects in the broadcast signal. This could include letting the user choose a language, bring an announcer's voice out of the background for greater clarity, listen to a specific race car driver communicating with his pit crew, or switch between the home team's and the visiting team's native broadcast mixes.
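"Adjusting the presence of audio objects" amounts to mixing separately delivered streams with per-object gains chosen by the listener. The sketch below is a toy illustration under assumed names (it is not AC-4 or MPEG-H code, and real systems work on encoded bitstreams, not raw sample lists):

```python
# Hypothetical object-audio mix: each named object carries its own
# sample stream, and the receiver scales each by a viewer-chosen gain.
def mix(objects, gains):
    """Sum the objects' sample streams, scaling each by its gain.
    Objects absent from `gains` are muted (gain 0.0)."""
    length = max(len(samples) for samples in objects.values())
    out = [0.0] * length
    for name, samples in objects.items():
        g = gains.get(name, 0.0)
        for i, s in enumerate(samples):
            out[i] += g * s
    return out

# Three objects in a motor-racing broadcast (toy sample data).
objects = {
    "ambience":          [0.2, 0.2, 0.2, 0.2],
    "announcer":         [0.5, 0.4, 0.5, 0.4],
    "driver_team_radio": [0.0, 0.3, 0.3, 0.0],
}

# One viewer boosts the announcer; another swaps in the team radio.
clear_commentary = mix(objects, {"ambience": 0.3, "announcer": 1.0})
pit_wall = mix(objects, {"ambience": 0.3, "driver_team_radio": 1.0})
```

Because the objects stay separate all the way to the receiver, both listeners are served by the same broadcast signal.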
Object-based broadcasting is fuelling BBC research & development around immersive audio and video, including investigations using Oculus VR, and on mobile, where video is pervasive. The technology relies on splitting AV into its component parts, which in turn relies on the flexibility of delivering data over IP.
Once that is achieved, media becomes another object with which the increasingly connected world can interact.
Postgate asserts, “Once you move to object-based broadcasting in a world of the internet of things there are fundamental questions about what role a media organisation plays.”