http://www.broadcastnow.co.uk/techfacils/production-feature/the-technology-of-tomorrow/5028969.article
From 4K stereo cameras to holographic TV and augmented reality, Adrian Pennington looks at how the most cutting-edge kit currently on the market will transform broadcasting.
OLED DISPLAYS
Since CRT monitors began to be phased out of operation five years ago in favour of LCD panels, Soho has been anxiously seeking displays of the same calibre as CRT for Grade 1 monitoring.
OLEDs emit light when an electric current is passed through a layer of organic molecules, and they have several advantages over LCDs, including greater power efficiency, richer colour reproduction, faster refresh rates and greater brightness. Pioneered by Kodak in the 1980s, they are only now becoming available at a price, size and reliability suitable for post-production.
“It’s the black levels that are key and since we stopped using CRTs, most of the LCDs have not matched up,” says Daniel Sassen, head of technical operations at Envy. “We are going to test Sony’s BVM and PVM range, which look extremely promising.” Sony claims its patented Super Top Emission solves one issue that previously dogged the technology: exposure to sunlight or air reacted with the organic material and shortened the panel’s life span.
“Every engineer has dealt with monitoring issues, and I’m hoping this will put to bed the problem of not being able to get hold of the old Grade 1 monitors,” says Sassen. With no need for a power-hungry back-light, ultra-thin, flexible screens have been prototyped by Sony and NHK, which may make a roll-up TV possible.
BEYOND 5.1: NEW SONIC SYSTEMS
Driven by the need for audio to keep pace with higher resolution and immersive image displays, technologies capable of capturing and delivering up to 22.2 channels of sound are being investigated.
“I’ve met with broadcasters who have moved from SD to HD and 3D but the audio has stagnated at 5.1,” says Pieter Schillebeeckx, head of R&D at microphone developer Soundfield. “They are asking: ‘what next?’”
While surround sound is standard in home receivers, 5.1 has its limitations, the most obvious being that it reproduces audio only on the horizontal plane. Adding channels at 7.1 and beyond would introduce two or more height channels to fill in the aural gaps.
Soundfield has already developed omni-directional microphones that capture the width, height and depth information at a central point for mixing into new configurations.
Dolby says 90% of existing HD TV sets are capable of using Dolby Digital Plus, the next generation of its audio technology, which can carry channel configurations of 7.1, 9.1 and even 22.2, the last a system under development at Japanese broadcaster NHK to accompany its Super Hi-Vision video format.
“It depends on how much precious spectrum you wish to take up with audio and how many speakers consumers want in their living rooms,” says Dolby Broadcast marketing manager James Caselton. “You get to 22.2 and it becomes a question of designing wires into the fabric of a house.”
Potentially cheaper and backwards-compatible alternatives to discrete channel formats include Ambisonics, which BBC R&D is investigating, and Wave Field Synthesis, researched by Germany’s Fraunhofer Institute.
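To make the idea concrete, the sketch below shows how a soundfield representation such as first-order Ambisonics (B-format) separates capture from playback layout: a source is encoded once into four channels at a central point, then decoded to whatever speaker arrangement is available. The function names, the simple ‘sampling’ decoder and the speaker angles are illustrative assumptions, not Soundfield’s or the BBC’s actual processing.

```python
import numpy as np

def encode_first_order(mono, azimuth, elevation):
    """Encode a mono signal into first-order B-format (W, X, Y, Z).

    Angles in radians: azimuth anticlockwise from straight ahead,
    elevation upwards from the horizontal plane.
    """
    w = mono / np.sqrt(2.0)                           # omnidirectional pressure
    x = mono * np.cos(azimuth) * np.cos(elevation)    # front-back figure-of-eight
    y = mono * np.sin(azimuth) * np.cos(elevation)    # left-right figure-of-eight
    z = mono * np.sin(elevation)                      # up-down (height)
    return np.stack([w, x, y, z])

def decode_to_layout(bformat, speaker_angles):
    """Naive 'sampling' decode of B-format to an arbitrary speaker layout.

    speaker_angles is a list of (azimuth, elevation) pairs; the same
    four-channel recording can be rendered to 5.1, 7.1 or a layout
    with height channels simply by changing this list.
    """
    w, x, y, z = bformat
    feeds = []
    for az, el in speaker_angles:
        feeds.append(0.5 * (np.sqrt(2.0) * w
                            + x * np.cos(az) * np.cos(el)
                            + y * np.sin(az) * np.cos(el)
                            + z * np.sin(el)))
    return np.stack(feeds)

# Example: one source recorded once, rendered to two different layouts.
signal = np.random.randn(48000)                       # one second of test audio
b = encode_first_order(signal, azimuth=np.radians(30), elevation=np.radians(10))
five_one_bed = decode_to_layout(b, [(np.radians(a), 0.0) for a in (30, -30, 0, 110, -110)])
with_height  = decode_to_layout(b, [(np.radians(30), 0.0), (np.radians(-30), 0.0),
                                    (np.radians(45), np.radians(35)),
                                    (np.radians(-45), np.radians(35))])
```

The same four-channel recording feeds both layouts; only the decode list changes, which is why such formats are attractive as a backwards-compatible path beyond 5.1.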
AUGMENTED REALITY
We’ve been augmenting reality since the dawn of storytelling: using location, opinion or character and weaving a fiction around it. But until now, this augmented reality has always been one step removed from the audience - imagined through the pages of a book or viewed through a screen.
“We’re at a point where, standing in Oxford Circus on a Saturday afternoon, you could raise your phone to the sky and bring about a wave of spaceships to transport you into a fiction based on reality,” says Simon Meek, Tern TV’s head of multiplatform. “The potential is huge, especially when we think about transmedia storytelling - letting stories cross over platforms and worlds.”
Plum Productions managing director Will Daws believes AR will become as ubiquitous as today’s apps. “You can take presenter talent out into the world and create all manner of tie-ins,” he says.
The technology used to support AR is fast becoming mass-market - any device with a camera, colour screen and half-decent processor can be used. Producers may want to team up with specialist programmers to build the application.
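As a rough illustration of what such an application does with that hardware, the sketch below places a label for a geo-tagged point of interest on screen: GPS position and compass heading feed a bearing calculation, and the camera’s field of view maps that bearing to a horizontal pixel offset. The coordinates, field of view and function names are assumptions for brevity, not any particular AR toolkit.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user to a point of interest, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def screen_x(poi_bearing, heading, fov_deg, screen_width_px):
    """Horizontal pixel position of the overlay, or None if outside the camera's view."""
    offset = (poi_bearing - heading + 180.0) % 360.0 - 180.0   # -180..180 relative to centre
    if abs(offset) > fov_deg / 2.0:
        return None
    return int(screen_width_px * (0.5 + offset / fov_deg))

# Example: a phone near Oxford Circus, pointing roughly south-east, 60-degree field of view.
poi = bearing_deg(51.5154, -0.1410, 51.5136, -0.1365)   # hypothetical landmark
print(screen_x(poi, heading=120.0, fov_deg=60.0, screen_width_px=1080))
```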
However, it’s important to keep everything in perspective. “Since AR only appears in the eye of the viewer, there is something about the nature of broadcast that mitigates against it,” observes Magic Lantern chief executive Anthony Lilley. “It’s a technology waiting for a killer app to happen.”
4K STEREO CAMERA
Camera manufacturers are pouring R&D into two principal areas: dual-lens (integrated) camcorders to complement twin cameras on rigs; and 4K production, beginning with sensors that capture images roughly 4,000 pixels across, four times the pixel count of full HD.
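That ‘four times HD’ figure follows directly from the pixel arithmetic, and it has knock-on effects for data rates. A quick back-of-the-envelope calculation, assuming the DCI 4K raster, 10-bit 4:2:2 sampling and 25 frames per second (all illustrative assumptions), looks like this:

```python
# Back-of-the-envelope pixel and data-rate arithmetic for 4K versus full HD.
hd_pixels = 1920 * 1080               # ~2.07 million pixels per frame
dci_4k_pixels = 4096 * 2160           # DCI 4K raster, ~8.85 million pixels per frame
print(dci_4k_pixels / hd_pixels)      # ~4.3x the pixel count of full HD

# Uncompressed data rate at 10-bit 4:2:2 (20 bits per pixel on average), 25 fps.
bits_per_pixel = 20
fps = 25
gbits_per_second = dci_4k_pixels * bits_per_pixel * fps / 1e9
print(round(gbits_per_second, 2), "Gbit/s before compression")
```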
Sony has announced the PMW-TD300 integrated camcorder as well as the F65, a digital cinema camera whose sensor is claimed to be capable of 8K capture, although output is sampled down to 4K in the first model.
However, a British start-up has combined 4K sensors and a stereo lens in a radical camera design that threatens to shake the industry as much as Red Digital did in 2007 with the Red One.
In a reversal of conventional camera development, which builds bodies around a chip, the Meduza can exchange components, including the sensor, for new ones as more advanced technologies become available.
“Red took a step ahead of the pack but the pack has now caught up,” says Chris Cary, chief executive of Meduza’s parent company, 3D Visual Enterprises (3DVE).
“Technology is moving so fast that however we build a camera, it would be obsolete before it came out. So with Meduza you get a professional package that is never obsolete. The day of building a camera around a sensor has passed. To us, the sensor is like film stock and DoPs will choose the right size, sensitivity and resolution for each project.”
3DVE is headquartered in London and uses some technology from the US defence industry. Annual rental including camera, accessories and support will be £25,000, and by the time of its autumn release, Meduza will have three different sets of camera heads and a selection of recording options to choose from. It will also have been trialled by cinematographers, who will either ratify the manufacturer’s claims or give it the kiss of death.
HOLOGRAPHIC TV
3D TV has barely begun; meet its successor. Only recently the stuff of science fiction, holographic TV is on its way and promises true 3D, complete with natural motion parallax, without the visual discomfort of today’s stereo displays - or the glasses.
Samsung and others have been investigating the Massachusetts Institute of Technology’s pioneering work.
“All the large consumer electronics companies are in touch with what we are doing,” says V Michael Bove, the principal research scientist leading MIT’s Object-Based Media Group. “Our desire is to make holographic TV an evolution of 3D TV.”
Bove’s team has used commercially available products to demonstrate the capture and transmission of holographic images. It used the infra-red signal of a single Xbox Kinect camera and a computer with high-powered chips to transmit a video hologram of an MIT employee - dressed as - what else? - Princess Leia.
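The Kinect used in that demonstration delivers a depth image rather than a hologram; the first processing step is simply turning each depth pixel into a 3D point, from which the fringe pattern driving a holographic display can later be computed. Below is a minimal sketch of that back-projection, using approximate intrinsics for a first-generation Kinect depth camera; the values and function names are illustrative and not MIT’s pipeline.

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx=585.0, fy=585.0, cx=319.5, cy=239.5):
    """Back-project a 640x480 depth image (in metres) into camera-space 3D points.

    fx, fy, cx, cy are approximate intrinsics for a first-generation Kinect
    depth camera; a real system would use calibrated values.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]           # drop pixels with no depth reading

# Example with synthetic data standing in for a live Kinect frame.
fake_depth = np.full((480, 640), 1.5)          # everything 1.5 m from the camera
cloud = depth_to_point_cloud(fake_depth)
print(cloud.shape)                             # (307200, 3)
```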
The chief engineering challenge is the development of cheap displays capable of resolving a holographic image at fast refresh rates. Only a matter of time, says Bove; small holographic screens could be available for PC and games use by 2015. IBM Labs predicts that holographic phone calls will be a part of our lives in the next five years, while Apple has patented technology capable of displaying ‘pseudo-holographic images’ by tracking and responding to the viewer’s eye movements.
CLOUD PRODUCTION
Broadcasters and post-producers are beginning to explore the practical benefits of hosting some, if not all, of a production in the cloud.
Video editing vendors Quantel and Avid have launched platforms (QTube and Interplay Central) that offer editing and media management functionality through a web browser. Targeted initially at news broadcasters, the services will encompass other genres with the spread of faster internet connections.
“We’re in the early stages of cloud production,” says Avid chief executive Gary Greenfield. “Interplay Central isn’t designed to replace Media Composer because, with the best will in the world, most journalists are not skilled craft editors.”
While online craft editing and true remote post-production are not on the cards yet, producers are taking advantage of the cloud’s benefits to facilitate collaboration, store rushes and review work in progress. “Cloud-based systems work perfectly for tapeless productions,” remarks David Sumnall, managing director of Middlechild TV, who exec produced Emergency Animal Rescue for Sky 1 HD using Aframe’s cloud service.
“We upload and store footage to Aframe, perform lo-res edits that are like the old offlines, drop commissioners an email for review of the latest version, and what’s really useful is that they can leave notes on the system for us to pick up.”
Aframe will become a mainstay of Middlechild’s productions going forward. Sumnall advises working with it as far in advance of a project as possible.
“The ability to include all members of a team in the project and have them collaborate in one virtual space with access to all the assets is a tremendous plus,” he says.
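The offline-style workflow Sumnall describes - full-resolution rushes in shared cloud storage, low-resolution proxies for cutting and review - can be sketched in outline as follows. Aframe’s own service handles this internally; the ffmpeg settings and the generic object-store upload (S3 via boto3 here) are stand-in assumptions to show the shape of the process, not Aframe’s API.

```python
import subprocess
import boto3

def make_proxy(source_path, proxy_path):
    """Transcode a full-resolution rush into a small H.264 proxy for offline editing."""
    subprocess.run([
        "ffmpeg", "-i", source_path,
        "-vf", "scale=-2:540",            # 540-line proxy, aspect ratio preserved
        "-c:v", "libx264", "-crf", "28",
        "-c:a", "aac", "-b:a", "128k",
        proxy_path,
    ], check=True)

def upload_rush_and_proxy(bucket, source_path, proxy_path):
    """Push both the original rush and its proxy to shared cloud storage."""
    s3 = boto3.client("s3")
    s3.upload_file(source_path, bucket, f"rushes/{source_path}")
    s3.upload_file(proxy_path, bucket, f"proxies/{proxy_path}")

make_proxy("card01_clip001.mov", "card01_clip001_proxy.mp4")
upload_rush_and_proxy("production-rushes", "card01_clip001.mov", "card01_clip001_proxy.mp4")
```

Commissioners then review the lightweight proxies in a browser and leave notes against them, while the full-resolution media stays untouched until the conform.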
SIMULTANEOUS 2D-3D PRODUCTION
3D TV producers are pinning their hopes on new technology to streamline the complexity of production and bring down 3D costs. Among them is a suite of software tools from 3ality Digital that automates lens alignment and convergence for live events.
“The software isn’t perfect, but that is what the testing is about,” observes Darren Long, director of operations at Sky Sports, which is trialling it. “The principle is to get from a situation in which five to eight operators all have slightly different ideas of depth budget toward a more controlled environment. Even one operator to manage two to three cameras is a massive step change.
“As the software matures, we’ll be able to take an analysis of 50 football matches and say ‘this is the kind of depth and pace we are looking for’, and build that into the editorial plan.”
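A ‘depth budget’ is a measurable quantity: the on-screen parallax of the nearest and farthest objects, usually expressed as a percentage of screen width and kept within agreed limits. The sketch below uses a simplified parallel-rig model converged by horizontal image shift to show the kind of check such software automates; every number and threshold is an illustrative assumption, not 3ality’s or Sky’s figures.

```python
def screen_parallax_pct(subject_m, interaxial_mm, focal_mm, convergence_m, sensor_width_mm):
    """On-screen parallax of a subject, as a percentage of image width.

    Simplified parallel-rig model converged by horizontal image shift:
    positive values mean the subject appears in front of the screen plane,
    negative values behind it.
    """
    disparity_mm = focal_mm * (interaxial_mm / 1000.0) * (1.0 / subject_m - 1.0 / convergence_m)
    return 100.0 * disparity_mm / sensor_width_mm

def within_budget(parallax_pct, max_in_front_pct=1.0, max_behind_pct=2.0):
    """Flag whether a shot stays inside an agreed depth budget."""
    if parallax_pct >= 0:                          # subject in front of the screen plane
        return parallax_pct <= max_in_front_pct
    return -parallax_pct <= max_behind_pct         # subject behind the screen plane

# Example: nearest player at 5 m, far stand at 60 m, rig converged at 7 m.
near = screen_parallax_pct(subject_m=5.0, interaxial_mm=40.0, focal_mm=30.0,
                           convergence_m=7.0, sensor_width_mm=9.6)
far = screen_parallax_pct(subject_m=60.0, interaxial_mm=40.0, focal_mm=30.0,
                          convergence_m=7.0, sensor_width_mm=9.6)
print(round(near, 2), within_budget(near))
print(round(far, 2), within_budget(far))
```

Automating that calculation across every camera is what replaces "five to eight operators with slightly different ideas of depth budget" with a single, consistent editorial setting.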
Rival developer Cameron Pace Group, fronted by Avatar director James Cameron, plans to go a step further and automate the 3D process for all 2D productions.
It is pouring substantial sums into a new ‘smart camera’ system linking data from the camera’s sensor to the OB truck, which will extract and reframe 2D and 3D feeds from the same camera positions.