Friday, 2 October 2015

SVGE Analysis: Exploring the options for live HDR broadcast

Sports Video Group Europe
High Dynamic Range (HDR) is suddenly all the rage. By expanding the ratio between the lightest and darkest parts of the video image, live event broadcasters see it as an essential improvement in the viewing experience for Ultra HD. http://svgeurope.org/blog/headlines/analysis-exploring-the-options-for-live-hdr-broadcast/
BT Sport, for example, which launched its UHD sports channel in the rec.709 colour space, has said it plans to introduce HDR within two years. It is likely that other broadcasters, such as those in the Sky group, will launch UHD sports channels in that time with HDR from the get-go.
That’s because when BT Sport was lining up the infrastructure for launch the options available to it for getting HDR from camera into the transmission chain were limited. That’s leaving aside the matter of how many – or how few – HDR-enabled TV displays are actually in homes, and forgetting for one moment that a standard for HDR to the home is slowly working its way through bodies like SMPTE.
Nonetheless the direction is clear. Consumer electronics manufacturers will be launching UHD screens featuring HDR as a default and there is unanimity that regardless of format – even SD – the picture looks better with higher dynamic range.
So what are the solutions on offer for the live broadcaster? There are several so let’s begin with Grass Valley, which claims to have been the first vendor with a ‘true HDR camera system’ for broadcast.
Grass Valley’s HDR approach
That claim is based on the Xensium-FT CMOS imager, which GV began developing a decade ago and which is fitted to its latest LDX 86 series cameras. Unlike traditional CMOS sensors, the Xensium-FT imitates the global shutter behaviour of a CCD, so it is not sensitive to fast camera movements with short exposure times or to short light flashes. It can also capture at least 15 f-stops, which allows Grass Valley to brand the output as Extended Dynamic Range (XDR).
“There are cameras that claim to be HDR ready but they just have one f-stop more than a regular camera, which is 200% more dynamic range,” says Marcel Koutstaal, vice president, camera systems, Grass Valley. “But the LDX 86 supports 15 f-stops, meaning at least 800% more dynamic range than other HDR cameras, or 15,000-20,000% more light than a regular camera.”
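Since each f-stop doubles the light captured, the percentages Koutstaal quotes follow directly from powers of two. A quick sketch of the arithmetic (the 7.5-stop figure for a "regular" camera is an illustrative assumption, not a Grass Valley number):

```python
def stops_to_ratio(stops):
    """Each f-stop doubles the amount of light captured."""
    return 2.0 ** stops


def percent_more(stops_a, stops_b):
    """Percentage more light captured at stops_a than at stops_b."""
    return (stops_to_ratio(stops_a) / stops_to_ratio(stops_b) - 1.0) * 100.0


# One extra stop doubles the light, i.e. 100% more:
print(percent_more(12, 11))    # → 100.0

# 15 stops against an assumed ~7.5-stop regular camera lands in the
# 15,000-20,000% range quoted in the article:
print(percent_more(15, 7.5))
```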
After a year of field trials, GV has now implemented Dolby’s PQ (perceptual quantizer) tone curve in the camera head, allowing HDR material to be encoded in a 10-bit signal. This function is more familiarly known as the gamma curve in CCD cameras. Since PQ is part of the SMPTE ST 2084 standard, Grass Valley XDR can be viewed on Dolby Vision certified professional monitors. Incidentally, Dolby has a prototype display, the Pulsar, that can theoretically show about 19 stops.
Of course, there is a hefty debate about which is the better route for transferring HDR through the chain. The main candidates under consideration at the standards bodies are SMPTE ST 2084 and the hybrid log-gamma approach proposed by the BBC and NHK. Both seek to define the OETF (Opto-Electronic Transfer Function) – in Dolby’s case the PQ curve – which converts light from a scene into 10-bit data. This data is converted back to light within the consumer display by a transfer function known as the EOTF (Electro-Optical Transfer Function). This too is up for discussion.
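For reference, the PQ curve defined in SMPTE ST 2084 fits in a few lines. This is a direct transcription of the published constants, mapping absolute luminance (up to 10,000 cd/m²) to a normalised signal value and back:

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32


def pq_encode(nits):
    """Map absolute luminance (0..10,000 cd/m^2) to a normalised PQ
    signal in [0, 1] (the encoding side, i.e. the inverse EOTF)."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2


def pq_decode(signal):
    """Map a normalised PQ signal back to luminance in cd/m^2
    (the ST 2084 EOTF applied in the display)."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)


# A full-range 10-bit code value is round(1023 * pq_encode(nits)):
print(round(1023 * pq_encode(100)))    # → 520, roughly SDR reference white
```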
It is important for public service broadcasters, in particular, to be able to reach the mass of households with plain old SDR screens, as well as the lucky ones with UHD HDR sets, without incurring additional cost or visual degradation for any viewer.
“No matter what algorithms are chosen, we have the flexibility in our software to work with whatever the industry or clients want,” says Koutstaal.
There are knock-on implications. “As soon as you start mixing HDR sources with SDR sources you have to take great care since it impacts on other areas such as vision mixing,” he says.
During last year’s test at the European Athletics Championships in Zurich, Grass Valley captured HDR in 1080p/50 in parallel with regular processed video output in 1080i/50, with both signals recorded live onto a K2 Summit 3G server. It has ongoing tests with UK outside broadcast companies, including at forthcoming sports events, but can give no details.
To achieve XDR operation, the LDX must be equipped with an XF Fibre transmission system to deliver XDR outputs as well as simultaneous standard dynamic range outputs in parallel. The XDR software upgrade option is enabled through a temporary or perpetual eLicense which can be introduced to the camera via USB stick.
Sony solution
Sony’s main reference study in this area is Sky Deutschland’s OB of the German Super Cup between Bundesliga champion Bayern München and DFB Cup winner VfL Wolfsburg on August 1, which was shot live in 4K HDR to test screens. It too plans future tests with UK OB companies but isn’t at liberty to say more.
The Sky D test used HDC-4300 cameras to capture higher dynamic range as S-Log3 data, Sony’s version of the transfer function that maps the range of brightness onto a 10-bit digital representation.
The signal is taken by the BPU 4000 baseband processing unit, and the output can be switched, recorded onto a Sony server and output from the truck. Accurate colour evaluation is enabled by the Sony PVM-X300 Trimaster series monitor.
“The production workflow with S-Log3 goes to the point at which the customer decides the mechanism of distribution to the home,” explains Peter Sykes, strategic technology development manager, Sony Professional Europe.
Like Grass Valley, Sony is agnostic about which variant of transfer function it will adopt and support. “This is mainly a distribution discussion,” says Sykes. “We are interested in supporting that and will introduce the necessary adjustments into our product when agreement is reached. Our concentration is on doing everything we can to capture higher dynamic range.”
Sykes also pointed out that Sony’s F65 and F55 cameras have been able to capture HDR for several years, but only now is the display side catching up so that the material can be viewed.
PION LiveStream
Danish developer PION launched, in 2013, a GPU-based system capable of extracting greater dynamic range from live broadcast pictures, but admits its timing may have been a little unlucky.
“We may have been out there a little too early,” says Michael Jonsson, Pion CTO. “In 2013 no-one was talking about HDR or optimising dynamic range. We talked with BT Sport about it (and tested with NEP) but there was not such a great buzz around the topic as there has been this year. Things have changed so much over the past few months that we intend to go back and re-start the conversation.”
PION’s solution, LiveStream, is, he says, “fundamentally” different from any other. Essentially, the software extracts higher dynamic range from images shot with existing low dynamic range cameras – no new HDR-enabled cameras required.
“Commodity technology is the enabler for us,” says Jonsson. “We are using off-the-shelf GPU technology powerful enough to process 30 frames a second at 1080p, or 60 interlaced in real-time.”
LiveScene – based on the company’s Live Camera Enhancement (LCE) technology – takes the live uncompressed feed from broadcast cameras, or an inbound compressed stream from ENG or event coverage, and passes it through a software algorithm. The result is claimed to overcome common live production challenges caused by adverse conditions – such as lack of highlight and shadow detail, noise, and poor colour or contrast. A 3U rack-mount box with a GPU, SDI I/O and a Xeon-class multicore CPU is fitted in the truck or studio.
“We are optimising the dynamic range by bringing down the highlights and extracting detail from the shadows,” explained Jonsson.
The company suggests that conventional real-time processing in live broadcast is limited to the camera processing configurable through the CCU or on the camera itself, which offers only a set of very conventional parameters to optimise the camera for the scene. LiveScene, by contrast, employs a sophisticated computational imaging approach, correcting every single pixel with reference to surrounding frames and the global context of the content.
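Jonsson’s description – pulling down the highlights while lifting detail out of the shadows – can be sketched as a global tone curve. This is a much-simplified illustration of the idea, not PION’s algorithm, which works per pixel with local and temporal context:

```python
def tone_optimise(pixel, knee=0.8, lift=0.05):
    """Toy global tone curve: lift shadow detail and roll off
    highlights. `pixel` is a linear value in [0, 1]. Unlike the
    real LiveScene system, this ignores neighbouring pixels and
    frames entirely.
    """
    # Lift shadows: a gentle gamma < 1 pulls up dark detail.
    v = pixel ** (1.0 - lift)
    # Compress highlights: above the knee, roll off smoothly,
    # asymptotically approaching 1.0 (leaving a little headroom).
    if v > knee:
        x = v - knee
        v = knee + (1.0 - knee) * x / (1.0 - knee + x)
    return min(v, 1.0)
```

Running a ramp of input values through the curve shows shadows coming up slightly while values near clipping are pulled down, which is the behaviour Jonsson describes.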
“We are probably the first to apply this kind of algorithm to live broadcast,” says Jonsson.
He says the system is also applicable to UHD. “The current system takes one channel in and one feed out. We are working on an updated version which will be able to process four HD-SDI feeds in real time, or one Ultra HD feed.”
The product is in test at Danish broadcasters DR (Danmarks Radio) and TV2, and in use at Nigerian broadcaster Channels TV.
Talking Technicolor 
Technicolor made a bold claim at IBC. “Pay TV operators rely on premium movies and live sports, and while Hollywood has proved it can make movies in HDR for cinema and the home, no-one is doing HDR for live sports – except us,” declared Mark Turner, VP, business development & relationships.
The company demonstrated live capture at 4K p60, up-converted from standard dynamic range (SDR) to HDR using software called intelligent tone management (ITM). This is apparently the same algorithm used by Hollywood colorists, but now running in real time on a server.
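Technicolor has not published ITM’s internals, but the general idea of inverse tone mapping – expanding an SDR signal so that highlights gain proportionally more luminance than mid-tones – can be illustrated with a toy expansion function. The 1,000-nit peak and 100-nit SDR reference white are assumptions for illustration, matching the figures Turner quotes for the demo output:

```python
def inverse_tone_map(sdr, peak_nits=1000.0, sdr_white=100.0):
    """Toy inverse tone mapping: expand a linear SDR pixel in [0, 1]
    to absolute HDR luminance in cd/m^2. NOT Technicolor's ITM
    algorithm - just the general shape of such an expansion.
    """
    # The quadratic term makes the expansion grow with signal level,
    # so highlights are stretched far more than mid-tones.
    expansion = 1.0 + (peak_nits / sdr_white - 1.0) * sdr ** 2
    return sdr * sdr_white * expansion


print(inverse_tone_map(1.0))    # peak white → 1000.0 nits
print(inverse_tone_map(0.18))   # mid-grey is lifted only slightly
```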
The signal output of the demo was 1,000-nit P3 in a Rec. 2020 container. “This is a close proxy to the UHD Alliance open standard for UHD,” said Turner.
Importantly, the upscaled signal is routed through an Elemental encoder which spits out a single stream to be received in HDR and SDR. “You can’t justify the cost of running two infrastructures so the distribution system needs to be combined,” he said. “The cheapest way of implementing HDR live is for the mix to happen as normal with the final mix upscaled. OB engineers can adjust the settings in realtime or apply different HDR settings to different sports.”
To receive HDR, viewers will need an STB or TV set fitted with Technicolor’s decoder. Technicolor says it is talking with a number of vendors and that its own HDR-enabled STB is being tested by pay TV operators such as BT Sport and Sky. There will be a live test of the workflow at a major US sports event this autumn.
Ikegami and Panasonic
Ikegami can stream component RGB 444 direct from the camera head to the control unit of its Unicam 3-CMOS 2/3-inch cameras. “This enables us to produce uncompressed RAW data from a 4K native sensor matching the UHD colour space defined in ITU-R Recommendation BT.2020,” said the company’s Europe president, Masanori Kondo, at IBC. The new 4K-native Unicam camera is intended for 4K studio and field systems.
Panasonic is currently developing HDR transmission technology but at this time does not have a specific application for live broadcast, according to Dean Offord, technical & operational assistant, Panasonic Europe.
