Wednesday, 17 April 2013

NAB perspectives: broadcast heads to big data

Sports Video Group
While the equipment needed to produce Ultra-HD broadcasting is being built, advances in imaging systems far in excess of 4K are in the works.
http://svgeurope.org/blog/headlines/nab-perspectives-broadcast-heads-to-big-data/


Ultra-HD was the expected headline act at NAB, where hardware from cameras to vision mixers is being refreshed to accommodate 4K, but it forms part of a broader picture in which big data can be used to drive new creative and business possibilities.
According to IBM, we create 2.5 quintillion bytes of data every day. This data comes from everywhere: posts to social media sites, purchase transaction records, cell phone GPS signals and electronic sensors in consumer gadgets, to name a few.
Big data as it applies to broadcast is the ability to record more than just a video feed from a single camera. Aside from resolution, the information captured through the lens can include higher frame rates, wider colorimetry and depth maps from smaller witness cameras for post-producing 3D.
Combined with leaps in GPU speeds, cheaper storage and leaner compression algorithms, it is becoming possible to manipulate, manage and retain more and more of this information all the way through to display.
Sensors from 4K to 28K
Digital camera maker Red, for example, is upgrading its 5K Epic camera to 6K and has 8K and even 28K sensors in development (the latter for super large format production).
“The question is not why we need 4K, 8K or beyond, but what am I going to do with a device that can shoot at higher resolutions than ever before?” said marketing chief Ted Schilowitz. “How can I manipulate images in production and in post to achieve effects that have never been seen before? It’s what stills photographers have been doing for years – creating otherworldly experiences that even the eye can’t perceive.”
Vision Research was presenting the Flex 4K, a high-speed camera capable of recording up to 1,000fps at 4096×2160, powered by core optical technology designed to record at very fast frame rates in sub-optimal lighting conditions. At NAB it showcased how far it could push this technology with a model designed for scientific research capable of HD resolution at 16,000fps.
“When you look at images of combustion and diesel engines [recorded at 16,000fps], you can see things you could never see before, like the flow of fuel in the chamber,” said business manager Patrick J. Ott de Vries.
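To give the big data framing some scale, here is a rough back-of-the-envelope calculation of the raw data rate a camera like this would generate. The 12-bit sample depth and the absence of compression are assumptions made for the sake of the arithmetic, not figures quoted by Vision Research.

```python
# Illustrative data-rate estimate for high-frame-rate 4K capture.
# Bit depth and "no compression" are assumptions, not camera specifications.

WIDTH, HEIGHT = 4096, 2160   # Flex 4K raster
FPS = 1000                   # maximum quoted frame rate
BIT_DEPTH = 12               # assumed raw bits per photosite

bits_per_second = WIDTH * HEIGHT * BIT_DEPTH * FPS
gigabytes_per_second = bits_per_second / 8 / 1e9

print(f"Raw, uncompressed: ~{gigabytes_per_second:.1f} GB/s")
# ~13.3 GB/s, or roughly 800 GB per minute of capture, which is why
# high-speed cameras typically record short bursts to internal memory.
```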
Making 3D simpler in post
Data is also key to concepts aimed at making 3D production as straightforward as 2D. German research outfit Fraunhofer HHI is collaborating with Walt Disney Animation Research and Arri to develop a trifocal camera system, which Disney plans to test this summer. The prototype employs an Arri Alexa as the prime camera paired with IndieCam GS2K satellite cameras, with depth maps extrapolated in post to render a 3D image.
“We don’t intend to fully get rid of manual processes for 3D in post but we can reduce it substantially,” explained research associate Nicola Gutberlet.
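The article does not spell out how the extrapolated depth maps become a second eye view. One common technique is depth-image-based rendering (DIBR); the sketch below is purely an illustration of that general idea under assumed values, not Fraunhofer's or Disney's actual pipeline.

```python
import numpy as np

def synthesize_right_view(left_rgb, depth, baseline_px=30.0):
    """Minimal depth-image-based rendering (DIBR) sketch.

    left_rgb    -- (H, W, 3) array from the prime camera
    depth       -- (H, W) depth map normalised to [0, 1], 1 = nearest
    baseline_px -- assumed maximum horizontal disparity in pixels

    Nearer pixels are shifted further, producing a crude second-eye view.
    A real pipeline would also fill the disocclusion holes this leaves.
    """
    h, w, _ = left_rgb.shape
    right = np.zeros_like(left_rgb)
    disparity = (depth * baseline_px).astype(int)  # per-pixel horizontal shift

    for y in range(h):
        for x in range(w):
            new_x = x - disparity[y, x]
            if 0 <= new_x < w:
                right[y, new_x] = left_rgb[y, x]
    return right
```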
Fraunhofer also previewed a single-body 3D camera intended for low-cost 3D live and recorded broadcasts. The Automated Stereo Production (ASP) system incorporates custom-built Zeiss lenses and a Stereoscopic Analyser, which calculates parameters for each shot, such as colour matching and stereo geometry, identifies incorrect settings and adjusts them on the fly.
It also showed what it claimed to be the first camera with single-shot sampling for high dynamic range (HDR) video. The system simultaneously records the full dynamic range between the brightest and darkest areas of the image, balancing extreme lighting conditions such as spotlights or under- and over-exposed scenes, without the need to take more than one image.
In post, the differently exposed samples can be fused into an HDR image without motion blur and with increased resolution, while the full dynamic range is retained.
“The non-regular sampling method of Fraunhofer is the ideal solution for broadcast and video, still image applications, as well as small-sized camera design,” said Dr. Siegfried Foessel, head of the Moving Picture Technologies department at Fraunhofer IIS.
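Fraunhofer's non-regular sampling scheme is not detailed in the article, but the general principle of single-shot HDR from interleaved exposures can be sketched roughly as below. The checkerboard layout, exposure ratio and saturation level are all assumptions made for illustration and are not the IIS design.

```python
import numpy as np

def fuse_interleaved_exposures(raw, ratio=8.0, sat=4095):
    """Illustrative single-shot HDR fusion (not Fraunhofer's actual method).

    Assumes the sensor interleaves short- and long-exposure photosites in a
    checkerboard pattern, the long exposure being `ratio` times the short one,
    with 12-bit samples saturating at `sat`. Each subset is brought onto a
    common radiometric scale, giving one HDR frame from a single shot.
    """
    h, w = raw.shape
    yy, xx = np.mgrid[0:h, 0:w]
    long_mask = (yy + xx) % 2 == 0                 # assumed exposure layout

    hdr = np.where(long_mask, raw / ratio, raw.astype(float))

    # Crude handling of clipped long-exposure photosites: fall back to the
    # horizontally adjacent short-exposure sample instead.
    clipped = long_mask & (raw >= sat)
    hdr[clipped] = np.roll(hdr, 1, axis=1)[clipped]
    return hdr
```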
The main 3D rig developers, 3Ality Digital and Cameron Pace Group, were notable by their absence at NAB. The former has offloaded the camera accessories design and manufacturing arm of Element Technica to Red, with 3Ality CEO Steve Schklair saying that he wants to concentrate his company on 3D features rather than television.
That no decision has been made about 3D coverage of the 2014 World Cup, even for the final alone, speaks volumes both about Sony's lack of interest this time around in selling the live rights to cinemas and about broadcast rights holders' reluctance to pay a premium for a 3D feed they feel won't find an addressable audience.
Exploring lightfields and camera arrays
The fate of 3D arguably lies in the take-up of Ultra-HD flat panels, which could deliver a glasses-free experience at full HD resolution and, eventually, a superior one. CPG co-chair Vince Pace was at NAB for a day talking up an agreement with Dolby to integrate the Dolby 3D glasses-free format into CPG’s stereoscopic production workflow.
“[3D TV] is just going through a development cycle to the point when you can sit on a couch and watch without glasses,” Pace said. “Autostereo is a model that works. The technology should not be looked at as a white elephant. It makes business sense.”
Data captured by CPG on set or live on location will be incorporated into Dolby’s algorithm to manage the playback of 3D content on autostereo displays, which have a more limited depth budget than stereoscopic ones.
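Dolby's algorithm itself is not described, but the underlying problem of squeezing a stereo grade into a tighter depth budget can be illustrated with a simple linear disparity remap. Both disparity ranges below are invented example values; real remapping is typically non-linear and scene-adaptive.

```python
def remap_disparity(d, src_range=(-80.0, 120.0), dst_range=(-20.0, 30.0)):
    """Linearly remap a disparity value (in pixels) from the depth budget of a
    glasses-based stereo grade into the tighter budget of an autostereo
    display. Both ranges are assumed example figures, not Dolby parameters.
    """
    s_min, s_max = src_range
    t_min, t_max = dst_range
    t = (d - s_min) / (s_max - s_min)   # normalise source disparity to [0, 1]
    return t_min + t * (t_max - t_min)


print(remap_disparity(120.0))  # far limit of the source grade maps to +30.0
print(remap_disparity(0.0))    # the screen plane happens to be preserved here
```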
Avatar editor Stephen Rifkin and Avatar producer Jon Landau were also at the show, discussing production planning for the sequel and indicating that the way data, notably from performance-tracked actors, will be captured and used to shape the film will be groundbreaking.
“We’ve shifted from an analogue type of filming to something like data mining where we can capture pixel density or HDR and many other parameters, process it and achieve an experience beyond reality,” claimed Pace, who seemed to be referring to both on-set film capture and the CPG/Dolby 3D process. “I am serious about that. We will get to a stage where you will experience every live event and the imagination of the filmmaker in ways we haven’t even imagined.”
In comments uncannily similar to those of Vince Pace, BBC R&D Technology Transfer Manager Nick Pinks suggested that access to more data via the new video-over-IP system Stagebox will open up richer storytelling techniques, particularly around live events.
“We think our production workflows will move beyond taking just camera footage toward capturing data sets,” said Pinks. “We have got some big ideas about how we might want to tell stories in the future. If we capture lightfields such as high dynamic range and GPS coordinates along with immersive audio, you can create a much broader picture than can be achieved with SDI or 4K. These are rigid standards. IP is so flexible, you can build something that will capture an awful lot more information than a traditional broadcast camera link and begin to tell stories in a completely different way.”
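Stagebox's actual data model is not described in the article; the sketch below simply illustrates the general idea Pinks describes of bundling arbitrary ancillary data sets with each frame over an IP link, rather than the fixed payload of an SDI feed. Every field name and value is hypothetical.

```python
from dataclasses import dataclass, field, asdict
from typing import Any, Dict
import json

@dataclass
class FrameBundle:
    """Hypothetical per-frame bundle for an IP-based contribution link."""
    timecode: str
    video_ref: str                       # pointer to the picture essence
    audio_channels: int = 16
    ancillary: Dict[str, Any] = field(default_factory=dict)

bundle = FrameBundle(
    timecode="10:22:31:07",
    video_ref="cam04/frame_000912.raw",
    ancillary={
        "gps": {"lat": 51.155, "lon": -2.585},   # example coordinates
        "hdr_metadata": {"peak_nits": 4000},
        "lens": {"focal_mm": 32, "iris": 2.8},
    },
)

# Serialise the whole bundle, picture reference and data sets alike,
# as it might travel over an IP link.
print(json.dumps(asdict(bundle), indent=2))
```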
At a more prosaic level, Stagebox itself is likely to be gradually shifted into everyday BBC production, with use already earmarked for remote production to augment coverage of major live events such as Glastonbury Festival this summer and next year’s Commonwealth Games.
