Friday 22 January 2016

Editing The Hateful Eight

Kodak

Editing the stunning 70mm 1:2.76 aspect ratio of The Hateful Eight.

Quentin Tarantino has shot all of his films on 35mm, only experimenting with digital when he guest-directed a scene in Robert Rodriguez's Sin City. The Oscar®-winning Pulp Fiction director's eighth full-length feature sees him revisit the Western genre after Django Unchained, which was his first collaboration with editor Fred Raskin, ACE (Guardians of the Galaxy, Fast & Furious).

Set in 1870s Wyoming, The Hateful Eight traps a rogues' gallery of characters in an isolated location during a snowstorm, with no certainty as to who can be trusted. Shot by Robert Richardson, ASC, and widely distributed in 70mm, the film has a look that helps define the period.
"Shooting digitally was never going to be an option, as it would automatically add an element of phoniness to the proceedings," observes Raskin. "The warmth and the tight grain of the film stock contribute to the reality of the era. And of course, Bob Richardson's lighting and the 1:2.76 compositions combine to make this an absolutely stunning picture."
The visual style of the movie is classical Hollywood, but with that unique Tarantino imprint. Many of the director's signature shots pop up: the split-field diopter and the hard profiles of the actors talking to each other, for example.
"I think we probably held wide shots longer than ever before thanks to the 70mm format and its 1:2.76 aspect ratio," says Raskin. "When the image is that striking and well-composed, cutting away when not absolutely necessary seems somewhat criminal. The handheld shaky-cam that dominates modern Hollywood cinema is nowhere to be found here. If the camera moves, it's on a dolly. This visual style contributes to the atmosphere of tension and dread that builds up over the course of the movie."
The footage was sent to FotoKem in Burbank, where the 65mm negative arrived daily for processing, printing, transfer, and creation of dailies files. All transfers matched the film print color timing, thanks to a custom LUT created to emulate the 70mm print. FotoKem also restored a decommissioned 70mm Prevost flatbed from the Academy of Motion Picture Arts and Sciences to assist in the rare process of syncing 70mm print dailies, adding DV40 audio sync playback on the flatbed.
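FotoKem's actual emulation LUT is proprietary, but the mechanics of that dailies step are simple to sketch: every scanned frame is pushed through a 3D lookup cube that maps the scan's colorimetry toward the look of the 70mm print. Below is a minimal Python illustration, with an identity cube standing in for the real print-emulation data and a nearest-neighbour lookup for brevity (production tools interpolate):

```python
import numpy as np

# Identity 33-point cube standing in for a real print-emulation LUT.
N = 33
grid = np.linspace(0.0, 1.0, N)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
lut = np.stack([r, g, b], axis=-1)            # shape (N, N, N, 3)

def apply_lut(frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map an HxWx3 float frame in [0, 1] through the cube
    (nearest-neighbour for brevity; real tools use trilinear
    interpolation)."""
    n = lut.shape[0]
    idx = np.clip(np.rint(frame * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in scan
graded = apply_lut(frame, lut)      # dailies frame with the 'print look'
```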

"Whenever I finished working with Quentin for the day, I would go over the work that we'd done and make a handful of tweaks and adjustments," explains Raskin of the workflow. "Then I would turn the sequence over to my first assistant so that the following morning, the film crew could start conforming the section we'd completed the day before. We had to keep the film crew conforming, so that when we finished a pass on the entire movie, it would only take an extra day or two before we were able to screen the work picture."
The photochemical finish, including time for negative cutting and color-timing, meant that Raskin and Tarantino had to have the cut locked before beginning to mix the film. "The upside, obviously, is that we were able to spend more time focusing on the mix," says Raskin. "The downside is that our time to cut the movie was forced to be a month shorter."
While on location in Telluride, Colorado, the production installed a 70mm projector at the Mason's Hall - one of the venues used for screenings during the Telluride Film Festival. When the footage came back from FotoKem, film assistants Paula Suhy and Michael Backauskas would sync it up.
"At the end of every shooting day we'd head to the Mason's Hall to screen the material shot two days earlier," he relates. "Everyone from the cast and crew was invited to watch. One of our producers would announce at the beginning of every screening, 'Welcome back to the Greatest Show in Telluride!' It was a nice way for everyone to end their day.
"Back in Los Angeles while we were working on the director's cut, the editorial team would prepare a weekly screening of recently completed scenes at the Directors Guild of America (DGA) to get a sense of how it played in 70mm.
"Quentin and I would sit in the fourth row of Theater 1 at the DGA with huge grins on our faces, immersed in the grandeur of the 70mm images," recalls Raskin. "The color of the film was generally richer than that of our Avid dailies, and the detail was astonishing. Sitting that close created the greatest difference in the viewing experience between screening on the Avid and screening in a theater. If we could follow the action sitting that close, we knew the sequences hadn't been cut too quickly."
The shoot required certain sequences to be filmed while snow was falling. Therefore the cast (including Kurt Russell, Channing Tatum, Samuel L. Jackson and Jennifer Jason Leigh) had to have the entire script memorized from day one. If they got snow they'd be shooting one scene, and if they didn't, they'd be shooting another.

"Since everyone knew the script so well it gave Quentin the ability to shoot 11-minute-long takes if he wanted to, which ended up being great for the performances," says Raskin. "I was watching some terrific theater on a daily basis. This also informed some of the editing choices; there are a handful of shots in the movie that are a couple of minutes long thanks to this, and of course, the trick has been to find an appropriate and effective time to return to the coverage."
Raskin continues, "In other, digitally-shot projects, I might want to take performances from two separate takes and fuse them, but Quentin wants to keep as much of the movie as untouched, original negative as possible. Quentin is not into digital trickery. Instead the goal becomes to make the best version of the movie using the footage as it was shot, as opposed to using visual effects to 'Frankenstein' the movie together."

With the picture complete and in cinemas, Raskin recalls viewing it for the first time with an excited yet unsuspecting crowd. "At that point it is no longer about watching the movie, it's about seeing how the audience reacts to it," he says. "With a good crowd, it's a blast. Hearing them laugh, shout, applaud - knowing that they're with the film and enjoying it - it makes all of the hard work and the long hours worth it."

Thursday 21 January 2016

IP live production power grab and the aims of AIMS

SVG Europe
Despite vocal efforts to avoid launching the new era of IP-based television as a minefield of proprietary, non-interoperable systems, broadcast manufacturers seem to be doing just that. There are already several transport protocols, with overlapping and competing components, which are in danger of creating customer confusion, or worse, leading customers up a blind alley.
http://svgeurope.org/blog/headlines/svge-analysis-ip-live-production-power-grab-and-the-aims-of-aims/
One clear division is whether to work with video over IP in live environments as compressed or uncompressed media. And, if compressed, then which codec is best?
Collectively the industry would seem to want one ring to rule them all, but since that would give the winner a considerable amount of power, the prize is commercially worth fighting for. This ambition puts customers and the industry at risk of being entrenched once again in islands of production. This article explains how.
The contenders
The contenders include the TICO Alliance, ASPEN, AIMS and a proposal from Sony. They variously support, rework or sidestep standards like SMPTE 2022-6 and the work of bodies like the Video Services Forum (VSF), the EBU and the Advanced Media Workflow Association (AMWA).
Matters are further complicated by the spin of marketers who claim that their approach is fully open to third parties and standards-based, claims which can be disingenuous. For example, all contenders make use of the muddy waters of RDDs (Registered Disclosure Documents), where technical recommendations have yet to be ratified by a standards body. Vendors can use RDDs to justify claims of a standards-based approach without being fully transparent that these documents are disclosures in principle only, not ratified standards.
Where the business core of the issue lies depends on who you talk to, but each manufacturer has a vested interest in selling more of its kit if certain video-over-IP routes are chosen, whether those routes are proprietary or open.
The aim of AIMS
The newest kid on the block is the Alliance for IP Media Solutions (AIMS) with Grass Valley, Nevion, Imagine Communications, SAM and Lawo as founder members. It’s a lobbying group for ‘support of open and publicly available protocols for moving media over IP’. It supports the work which AMWA began last September as the Networked Media Incubator (NMI) and which builds on the work of the EBU/VSF/SMPTE Joint Task Force for Networked Media (JT-NM). AIMS attaches particular importance to VSF technical recommendation TR-03 and, by association, TR-04.
“AIMS collectively — and SAM individually — see a danger in the adoption of proprietary formats,” says Tim Felstead, Head of Product Marketing, SAM. “Our philosophy is that we should be adopting open protocols that are as much as humanly possible royalty free.”
“There’s always a danger [of fragmentation] and it’s the reason we formed AIMS,” explains Steve Reynolds, CTO, Imagine Communications. “Instead of each member setting their own pattern to manage the transition to IP we made this joint decision to adopt a set of standards which 70+ members of the VSF have been working on for several years.”
Reynolds describes AIMS as a trade organisation. “We’re not a standards development body. We are not building new or proprietary protocols but committed to open and interoperable tools.”
TR-03 versus SMPTE 2022-6 
The TR-03 recommendation promoted by AIMS and devised by the VSF is similar to SMPTE 2022-6 with the crucial difference that while 2022-6 multicasts audio, video and vertical blanking essence in a single uncompressed stream, TR-03 passes individual streams through a network, to be re-composed into different combinations as needed for production purposes. An example might be connecting different audio tracks with a video.
“2022-6 is a point-to-point protocol and doesn’t change anything to the benefit of customers,” argues Felstead. “If you want video to go to one place and audio to go to another then you have to take 2022-6 through a process to de-embed the audio and send it to another location. The characteristic of TR-03 is that you start with three different transport streams (audio, video and ancillary data) and those three are then handled separately.”
TR-03 carries audio using AES67 and has its network clock distributed using IEEE 1588 Precision Time Protocol (PTP). Streams and synchronized groups of streams are described with the IETF Session Description Protocol (SDP).
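The practical difference is easy to show in miniature. In the sketch below (Python standard library only; multicast addresses, ports and payloads are illustrative, and nothing here implements the actual ST 2022-6 or TR-03 packet formats), the 2022-6 model puts everything on one multicast group, while the TR-03 model gives each essence type its own stream that receivers can subscribe to independently:

```python
import socket

def open_multicast_sender(ttl: int = 16) -> socket.socket:
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, ttl)
    return s

sender = open_multicast_sender()

# SMPTE 2022-6 style: video, audio and ancillary data travel embedded
# together, so one multicast group carries everything.
sender.sendto(b"<muxed SDI frame data>", ("239.1.1.1", 50000))

# TR-03 style: each essence type is its own elementary stream, routed
# and subscribed to independently.
sender.sendto(b"<video essence>", ("239.2.1.1", 50000))
sender.sendto(b"<audio essence>", ("239.2.1.2", 50010))
sender.sendto(b"<ancillary data>", ("239.2.1.3", 50020))
```

In the TR-03 case a downstream audio processor joins only the audio group; there is no de-embedding step, which is exactly Felstead's point.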
The EBU's work using 2022-6 to create a live IP studio at VRT in Belgium was certainly groundbreaking in demonstrating an interoperable network, but it is seen as just a first step for the industry into IP and not necessarily the right answer going forward.
“2022-6 brings us no further forward than SDI,” says Felstead. “Will we go back to the transition between linear and nonlinear where the industry substituted tape machines for servers and then just carried on in a linear workflow? If we take SDI into the IP domain we are guilty of making the same mistake, whereas the more adventurous way forward is to split the components and try to devise a better and more efficient workflow.”
It would be nice, he adds, to “avoid the painful birth that we had with networked audio standards [where ten different audio formats delayed widespread implementation by a decade and finally culminated in AES67] in video over IP.”
The ASPEN approach
Adaptive Sample Picture Encapsulation (ASPEN) creates a framework to carry uncompressed video over an MPEG-2 transport stream, making use of existing MPEG-2 components for synchronisation and the carriage of audio. This is currently documented via RDD 37, with SMPTE ST 302 covering audio and ST 2038 covering ancillary data.
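To make the transport-stream idea concrete, the sketch below shows the basic mechanics of TS carriage: essence is sliced into fixed 188-byte packets, each tagged with a packet identifier (PID), so video, audio and ancillary data remain separable streams inside one multiplex. This is a toy illustration only, with header flags simplified; it does not implement RDD 37 itself:

```python
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def packetize(payload: bytes, pid: int) -> list[bytes]:
    """Chunk a payload into 188-byte TS packets: 4-byte header
    (sync byte, PID, continuity counter; other flags omitted for
    brevity) plus 184 bytes of payload, padded on the last packet."""
    packets, cc = [], 0
    for off in range(0, len(payload), TS_PACKET_SIZE - 4):
        chunk = payload[off:off + TS_PACKET_SIZE - 4]
        header = bytes([
            SYNC_BYTE,
            (pid >> 8) & 0x1F,      # PID high bits
            pid & 0xFF,             # PID low bits
            0x10 | (cc & 0x0F),     # payload-only + continuity counter
        ])
        packets.append(header + chunk.ljust(TS_PACKET_SIZE - 4, b"\xff"))
        cc += 1
    return packets

video_packets = packetize(b"<uncompressed video samples>", pid=0x100)
```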
Led by Evertz, ASPEN is also being implemented by ChyronHego, Vizrt, Tektronix and PacketStorm. Cinegy, I-Movix, FOR-A, Hitachi and Matrox are also among 30 backers of the solution. Existing Evertz customers Dome Productions, Game Creek, NEP Group, Time Warner Cable Sports and Discovery Communications have deployed ASPEN. Interestingly, Sony is also a supporter with a key joint customer deployment at NBC Sports.
The Sony Networked Media Interface
Cisco, Evertz, Imagine, Matrox, Rohde & Schwarz DVS and Vizrt are among 42 backers for Sony’s live IP production protocol branded Networked Media Interface (confusingly with the same acronym – NMI – as the AMWA working group of which Sony is also a member).
Sony NMI is an adaptation of SMPTE ST 2022-6 and ST 2022-7 designed for the live environment. It supports the SMPTE standards for transport of uncompressed HD-SDI over IP as well as SMPTE 2059 PTP for timing. It diverges from AIMS in supporting the transport of media wrapped into a single multicast stream; it does not support the AES67 standard and it deploys its own Low Latency Video Codec for 4K 60p transmission over 10 Gigabit Ethernet. This technology is before SMPTE as RDD 34. Sony continues to participate in the development of IP technologies at the VSF, AMWA and EBU.
“To be honest I do not think there is a risk of industry fragmentation,” says Nicolas Moreau, Product Marketing Manager IP Live Production & Workflows at Sony. “If you look back to when we first introduced NMI it could have been seen as a proprietary solution, but Sony quickly opened up RDDs to SMPTE and other groups. We are committed to an interoperable approach.
“With regard to AIMS — we don’t see them producing standards and that is our goal,” he says. “TR-03 is not a standard yet, it is only a recommendation and we only support standards [though Moreau neglects to mention Sony’s own draft RDD 34]. To support AES67 you need product that meets that standard and as yet there is none.”
Moreau says his main concern is an undue focus on SDI-to-IP mapping and transport, so much so “that the customer tends to forget all about the other vital functions for doing production over IP. These include dynamic routing, device discovery and integrating software-defined networks. Much of this is also forgotten in the proofs of concept we see on the market. NMI is offering a path to solve that.”
Though considered proprietary by some, Sony says it is far from shutting out industry developments. For example, Moreau says that audio is one of the directions Sony is working on. “No-one is saying that any IP solution will be rock-solid for the next decade. Standards evolve and when the industry has something to look at we will take care of it. We are delivering FPGA IP cores today which could be updated if a new standard emerges.”
The TICO Alliance
Like Sony, IntoPix has devised a codec for compressing 4K over 3G-SDI and over 10Gig IP-based networks. This is supported by The TICO Alliance, a consortium of organisations including Grass Valley, EVS, Imagine, Nevion, Ross Video and Tektronix which says it is working in “an open and collaborative way with industry organisations, including SMPTE, VSF, and JT-NM, to guarantee an interoperable adoption.”
For some members of AIMS, though, the approach of Evertz, TICO and Sony cannot be supported. “There’s no doubt these are good technologies but our biggest bugbear is that they are proprietary and licensed,” says Felstead. “For that reason we believe it’s not sensible for customers to adopt it.”
Compressed solutions count
While SMPTE 2022-6, TR-03 and ASPEN offer uncompressed solutions, SAM’s Felstead argues that simple economics make compression necessary. “Moore’s law would suggest that bandwidth is growing ad infinitum but it still costs to develop routers and infrastructure that functions together at low latency so in economic terms, uncompressed is not automatically better,” he contends. “You need some form of compression in order to reduce bandwidth consumption.”
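Back-of-the-envelope arithmetic supports the point. Assuming 2160p60 with 4:2:2 10-bit sampling (20 bits per pixel) and a nominal 4:1 visually lossless mezzanine ratio, both figures being assumptions for illustration:

```python
# One UHD signal vs a 10 Gigabit Ethernet link (rough arithmetic).
pixels_per_frame = 3840 * 2160
bits_per_pixel = 20          # 4:2:2 sampling, 10-bit
fps = 60

uncompressed_gbps = pixels_per_frame * bits_per_pixel * fps / 1e9
compressed_gbps = uncompressed_gbps / 4   # assumed 4:1 mezzanine ratio

print(f"uncompressed 2160p60: {uncompressed_gbps:.1f} Gbps")  # ~10.0
print(f"4:1 mezzanine:        {compressed_gbps:.1f} Gbps")    # ~2.5
# One uncompressed stream saturates a 10GbE port before blanking and
# protocol overhead are even counted; light compression fits several.
```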
Even though SAM is a backer of Sony’s NMI, it wants to encourage adoption of the BBC-created, VSF-supported, royalty-free VC-2 codec for compression within a TR-03-based, industry-agreed interoperability framework. Like TICO, VC-2 is wavelet-based.
“For the time being TR-03 is about uncompressed and as yet is not dealing with compressed standards,” says Felstead. “That should be dealt with in line with the same philosophy, and SAM are already seeing that in our products today, with VC-2 compression and TR-03 streams in switchers, routers, modular interfaces, servers, playout, and others.”
The AIMS roadmap does indeed accept use of compressed as well as uncompressed bitstreams.
“If end-users want to use TICO or Sony compression, or for that matter AVC, J2K or VC2, in an AIMS framework that is valid,” says Imagine’s Reynolds. “We don’t see friction with what Sony and TICO are doing.”
However, according to Reynolds, the ASPEN approach is incompatible. “Evertz came up with a clever solution to doing IP encapsulation in the transport stream,” he says. “They launched some projects for ESPN and NBC and needed to come up with a framework to solve the move to IP and decided to do that with a suite of [MPEG-based] technology that was well understood. Nonetheless, this has compromises that are not aligned with the longer term vision of the industry. It’s evident that ASPEN was a short term solution to a short term problem. That’s not to say Evertz won’t move into TR-03 but we didn’t see the need to take that interim step.”
The future of AIMS
Cisco, EVS and the Telos Alliance have recently joined AIMS but Felstead says there’s no arms race. “I don’t think joining AIMS is necessarily the benchmark of success. Adhering to its philosophy is the important thing.”
Reynolds reiterates that Imagine wants the same level of guaranteed interoperability with IP as with SDI, and that this involves connecting many different manufacturers’ technologies.
“This is possible as long as there’s an adherence to the formula in the AIMS roadmap,” he says. “We believe we can build a momentum broad enough to avoid the balkanisation that bedevilled the move into file and to create an interoperable formula from day one. Anything that diverges from the work done in the VSF would, I believe, be detrimental.”
While there are clear overlaps and lines of compatibility between these groups, customers will surely be looking for greater coherence from the industry heading into NAB.

Will IP sink or swim on a lack of standards?

IBC

The great promise of IP is to create a truly open and interoperable environment for the smooth plug and play of best of breed technologies. A bit like SDI in fact, but with greater potential for economic and creative benefit. But competing paths to this goal risk sending the industry back to the bad old days of siloed workflows.
There are a number of initiatives which appear to be in an arms race for mass industry support. Some fuse proprietary technologies with open standards; others are attempting to work with a mix of standards and technical drafts before committees at SMPTE.
There isn't one which has all the pieces of the puzzle to genuinely claim to be fully open, end-to-end and standards based.
Sony was first out of the box launching its Networked Media Interface (NMI) for live IP production at IBC2014. Its 42 backers include Cisco, Evertz, EVS, Imagine, Matrox, Rohde & Schwarz DVS and Vizrt.
Sony NMI adheres to SMPTE standard 2022-6 for transport of uncompressed media over IP in a single multicast stream but promotes Sony's own codec for compressed 4K/UHD over a network. The company says that interoperability and compatibility with its protocols can be achieved “by gaining technical information under license”. The IP Live codec is in the process of becoming an open industry standard through SMPTE – but is not yet a standard.
The TICO Alliance, founded in April 2015, has united organisations including Grass Valley, EVS, Imagine Communications, Nevion, Ross Video and Tektronix to establish the TICO codec, devised by IntoPix with a similar aim to Sony's, at the heart of an IP live production ecosystem.
Although use of this codec also requires a licence, the TICO Alliance says it is working in an “open and collaborative way with industry organizations, including SMPTE, Video Services Forum (VSF), and the EBU's JT-NM (Joint Task Force on Networked Media)” with the aim of “guaranteeing an interoperable adoption”.
ASPEN (Adaptive Sample Picture Encapsulation) is led by Evertz and officially launched at IBC2015. It has the backing of 30 broadcasters, media companies, and kit makers including AJA, ChyronHego, Cinegy and Sony, with whom it has a joint customer for the introduction of video over IP at NBC Sports. While criticised for being an initiative of one vendor rather than cross-industry, Evertz describes ASPEN as the only field-proven, open framework for IP facilities available. It differs from other approaches in working with uncompressed media over MPEG transport streams. This is currently documented via a draft technical document before SMPTE.
AIMS (Alliance for IP Media Solutions) was founded last December by Grass Valley, Imagine Communications, Lawo, SAM, and Nevion (Cisco and EVS have since joined) as a lobbying group to “encourage interoperability within emerging all-IP based infrastructures based on existing open standards”.
Like Sony, it wants to use SMPTE standard 2022-6 as a building block, but unlike Sony it wants to do this by splitting the video, audio and metadata into separate paths using the VSF-devised TR-03. However, once again, TR-03 is a draft recommendation and not yet a standard.
Any confusion is understandable. You don't have to be eagle eyed to spot the same names cropping up as members of different bodies. Sony, for example, is a member of VSF and AMWA but not (yet) a member of AIMS which says its only goal is to promote VSF/AMWA concepts. Sony and Evertz support each other's initiatives. Imagine and EVS support Sony, TICO and AIMS but not Evertz. Grass Valley says it supports the open approach of AIMS but also backs the proprietary approach of TICO.
All manufacturers have a vested interest in selling more of their kit if certain video over IP routes are chosen, whether these routes are proprietary or open. Vendors backing multiple approaches are sensibly keeping abreast of all the options so that they are in a good position to move once technologies or standards gain critical momentum. It is also worth pointing out that there would appear to be wriggle room among key signatories to AIMS, and at Sony and Evertz, with regard to adapting their approach when standards finally coalesce.
Yet while standards remain unratified, vendors will look to sell solutions to customers that need an answer for IP installation today and cannot afford to wait. The danger is that this gap is filled by approaches that prove incompatible down the line.

Arena Television’s £20m IP UHD HDR scanners primed for NAB launch

SVG Europe
Arena Television is spending a mammoth £20 million ($29m) on not one but three triple-expander UHD all-IP trucks timed to be on the road this year. OBX, OBY and OBZ are identical in both dimensions (16.5m long x 5.7m wide) and specification, and are being launched in partnership with an equipment manufacturer from April.
The same manufacturer is also outfitting UHD IP kit in the studio of a US broadcaster, giving the vendor a European and North American double whammy of NAB2016 marketing. Because of the NDAs involved we cannot disclose the vendor’s name or details of the equipment involved although we can reveal the broad outlines.
“The end of the year will be the time of mass volume UHD delivery,” believes Arena managing director Richard Yeowart. “We will make a big promotional push for our first two trucks during the summer months. They are definitely the first all IP UHD HDR (High Dynamic Range) trucks on the market. They are primed to go beyond 4K to 8K should the industry go in that direction, or they can cope with High Frame Rates. It’s a very expensive investment but it is future-proofed.”
Yeowart has been planning a move into UHD for over a year but delayed investment until he felt a unified UHD-over-IP circuit was achievable.
The move comes serendipitously just as Sky’s entire OB output is up for grabs. The tender features 4K/UHD as a key criterion. SVG Europe understands that other UK outside broadcast suppliers are also building UHD scanners. NEP Visions shares Sky’s existing English Premier League contract along with fellow incumbent Telegenic. But Sky is not the only prize on the market. BT Sport, a world leader in live 4K broadcasting, can also be expected to increase its 4K output in the coming year.
From coach builder to SI
Arena OBX moved out of coachbuilder A Smith Great Bentley (ASGB) at the end of last year and is currently being systems-integrated at Videlio. It will be on the road in March, beta-tested until April and soft-launched in the summer.
OBY has been at Smiths since before Christmas and will remain there until the spring when it will move to Videlio — after which work will begin on OBZ. “Integration should speed up since we will then know the exact specifications such as pre-cut cabling,” says Yeowart. “We’ve gone for the longest truck we can get to create the biggest possible space inside.”
He explains, “It was clear to us that we needed to wind up ready to move to UHD. We’d looked closely at 3D and had one client pushing us hard to offer that. We didn’t think 3D would move mainstream and we got that right. What that meant for our business is that we were better positioned when the next major upgrade came along.
“We tend to build an HD truck every 18 months but the tipping point has now come to move into UHD. Add to that that we have capacity constraints – all our trucks are out on the road [in part catering for the gap left by NEP Visions’ fateful fire which took out several large mobiles] and investment is a necessity.”
The three new vehicles will add to Arena’s fleet of nine HD and seven VT trucks. Yeowart is considering the idea of upgrading some of these units to UHD.
Quad HD ‘sticking plaster’
“If we built last year we would have built a quad HD truck and wouldn’t have had second-generation UHD cameras,” he reports. “In fact, if we’d had to build for a contract this year starting in June then quad HD would be the easiest route. The manufacturing industry has been a bit slower than we wanted in coming to market with a large-scale UHD solution. Quad HD is a sticking-plaster approach but we’ve gone to manufacturers and explained what we want to achieve and the type of work we expect to be doing, and they have come up with a powerful solution.”
Arena is one of the four main UK OB suppliers, an independent company with 30 years’ history and a business model which has repeatedly seen it invest in the latest technologies ahead of winning the contracts to hang them on. “Quad HD will work but it also has to work much harder to achieve the same end,” says Yeowart. “There is four times as much wiring, and the router has to be four times bigger. Plus there is a compromise between an HD and UHD production which we don’t think is acceptable for clients.”
Arena’s new trucks will simulcast HD and UHD and offer HDR and SDR paths. There will also be some quad HD routing included to cater for clients requiring that legacy.
“We are having to learn a lot about IP and UHD,” says Yeowart. “For example, there is new training on how to rack cameras and how many people are needed to monitor an HDR feed in different areas of the truck. Plus, how do you guarantee that the HD output is okay in Standard Dynamic Range? There is a lot happening at once, together with putting all the data onto an IP stream.”
Waiting for HDR
The choice of HDR system caused further delay in speccing the trucks. Arena has run side-by-side HDR tests with two vendor options at some of the 38 EPL matches it covers in HD for BT Sport (in a contract which expires this year).
“We looked at different cameras and lenses and also looked at the new high-brightness screens of 4,000 nits emerging into retail from CES. There’s no doubt HDR allows us to show shadows and highlights and much brighter pictures, with an impact that makes it appear as if you are looking through a window. However, HDR comes at a premium for the broadcaster, and for us, so the process has to be right.
“We still don’t know which way a client will transmit the HDR, but the good thing is that we can select the appropriate path from the camera and just have to switch on a licence once the route is decided. In any case, we don’t need to know now, only in a couple of months.”
Fortunately Arena has had the luxury of working all this out at its own speed without building to the deadline of a particular contract.
“We are comfortable with what we have seen, that it does work,” says Yeowart. “You will see a truck with end-to-end IP and no conversion. It will be IP from the CCU, IP into the vision mixer, IP into the record machine and then to whatever format a broadcaster requires.
“As communications get faster IP will allow broadcasters access to the data stream back at base for remote production. It means we can employ top level IP engineers at Redhill [Arena’s HQ] for remote diagnostics. That changes the way the industry works. Once a truck has an issue on site now you have to deal with it locally, but the ability to remotely monitor on the road will be incredibly beneficial.”
While there is one dominant manufacturer which has partnered with Arena on the project, there is a second main vendor involved. Importantly, Yeowart is giving clients a couple of acquisition options by taking cameras from two vendors. “We won’t be buying a huge quantity of either. We could buy HD cameras and switch them to UHD by buying a licence, which would provide additional flexibility.”

Wednesday 20 January 2016

Live Sports Are Driving Streaming Video Innovation

Streaming Media Global

The desire to keep sports relevant to the younger audience and connect with mobile media consumption habits is driving innovation online.

A not-so-bold prediction: This year’s Olympic Games will be the most streamed live event in history. I’ll take a bet too that Super Bowl 50 will be the most streamed NFL event ... until next year’s Super Bowl. In turn, these will be surpassed by the Winter Olympics in 2018 and by matches from the FIFA World Cup later that summer. And so on.

The leapfrogging of live streaming records is a trend across all programming, but reaches its peak around major sports events. The value of the game for rightsholders in attracting audiences, subscriptions, or ad dollars holds as true today as it has since Olympic telecasts began with the 1936 Berlin Games. The unique appeal of drama shared in the moment is now driving the live experience online, forcing broadcast rightsholders to adapt.
“There is a huge paradigm shift in the way sports goes to market,” says Stewart Mison, strategic director of sports business development for Microsoft. “The last great transformation 20 years ago was the introduction of multichannel TV. That gave deeper insight into sport for armchair fans, but not personalized insight. Now we have the opportunity to do just that.”
One could argue that online coverage does not yet match the quality of service or communal experience that viewers receive on their TV—a claim borne out by TV audiences for events far in excess of online views. (Super Bowl XLIX racked up a record 1.3 million concurrent streamers for NBC Sports, but 112 million TV viewers.) On the other hand, what was Felix Baumgartner’s October 2012 Red Bull Stratos space jump other than a digital-only global extreme sports experience with 8 million concurrent live streams?
The sport itself matters. Super Bowls are single-event primetime U.S. TV viewing. By contrast, an Olympics typically takes place in different time zones, lasts 2 weeks, and includes multiple sports. Both, however, lend themselves to digital viewing in different ways.
Perhaps the primary one is that rightsholders and sports franchises are highly conscious of retaining Millennials, a demographic in danger of being priced out of live events, whether in person or on pay TV, and who prefer—at anytime, on any device—access to content that speaks a less studio-formal, more fan-driven language.
It is no coincidence that the sport showing the highest audience growth is professional video gaming. A predominantly online and youth phenomenon, part of the esports appeal is the (largely) free viewing of live events such as “League of Legends” (27 million watched the final online while 25 million tuned in to watch the final round of The Masters on TV) and the lack of territorial rights. Fans are as likely to view a game in South Korea from the U.S. as they are in Japan—geoblocking esports would probably stunt further growth.
To reach Digital Natives, broadcasters are demanding more value from sports organizations to whom they already pay billions of dollars. The collective value of TV rights for FIFA World Cups in 2018 and 2022 is more than $2.5 billion, of which Fox paid $450 million, five times more than ESPN paid for the previous two tournaments. In the U.K., BT and Sky collectively paid $7.8 billion for Premier League soccer matches from 2016–19, an increase of 70 percent over the last bundle. BT paid an additional $1.3 billion for three seasons of UEFA Champions League football. NBCU paid the IOC $7.75 billion to air a decade of Olympic Games from 2022–2032.
To maximize the value of rights, broadcasters and rightsholders must bring something complementary to viewing in the living room (or sports bar) while they protect their main TV revenue. Fresh research from Kantar Worldpanel ComTech reveals that U.K. consumers still choose their TV offer based on premium sports. In the quarter ending Sept. 30, Sky, BT, and Virgin Media all gained a share in overall market sales thanks to their strong sports offers. Notably, all three also increased the length or strength of their broadband discounts in the last quarter.
The tipping point is clear when examining the Olympics. The 2014 Sochi games marked the first time the amount of digital coverage worldwide (60,000 hours on 230 dedicated digital channels, including 155 websites and 75 apps) exceeded that of linear broadcasts (48,000 hours on 464 channels).
A record 10.8 million hours of video were consumed on NBC Olympics’ digital platforms, more than three times what was streamed for Vancouver 2010. Approximately 80 percent of the video was viewed via TV Everywhere-authenticated live streams on NBCOlympics.com and the NBC Sports Live Extra app.
“Sochi showed that the consumption of an Olympic Games on mobile and tablets is now as intense as that of traditional TV,” says Yiannis Exarchos, CEO of Olympic Broadcast Services (OBS). “Digital is no longer marginal. It is the heart of future innovation for broadcasting.”
The IOC is billing Rio as the first real multiscreen games, where OBS will provide broadcasters with additional material—such as real-time statistics feeds, different angles, and super slow-motion sets—which can be packaged as a second-screen experience.

New Production Techniques

Sports producers have consistently pioneered techniques designed to bring the experience of the game closer to home, whether by graphical analysis, wireless minicams, or live 3D. The current innovation around data, customizable coverage, and wearable cameras is ushering in a fundamental change in the way sport is presented and consumed. This media strategy is driven by the demand to reach audiences online.
The central idea is to exploit more of the content already captured from a live event and make it available over digital channels. For example, a typical UEFA Champions League match would be recorded by 15 cameras, while a final would more than double the camera options. A decade ago, only content produced as the main multilateral “world” feed would be available live, with additional shots—such as replay angles or slow-motion—only available postevent as a highlights reel.
Storage was once the prohibitive issue. The number of hours produced by FIFA’s media producer HBS has risen over successive tournaments from 2,200 in 2006 to 5,000 in 2014 as the cost of storage has come down and it has become easier to manipulate files in fast-turnaround workflows. HBS designs a central storage system based on up to 20 EVS servers into which every feed from every game, plus rushes from ENG crews, is recorded. Similar systems are in place at every large sports event for rightsholders to exploit across digital platforms and broadcast postproduction.
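A quick sizing sketch shows the scale involved, assuming a 100Mbps mezzanine recording rate (an assumption for illustration; actual HBS codecs and rates vary):

```python
# Rough storage footprint of a World Cup central server system at an
# assumed 100 Mbps recording rate (actual HBS formats vary).
def archive_tb(hours: float, mbps: float = 100.0) -> float:
    return hours * 3600 * mbps * 1e6 / 8 / 1e12

print(f"2006: {archive_tb(2200):.0f} TB")   # ~99 TB
print(f"2014: {archive_tb(5000):.0f} TB")   # ~225 TB
```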
UEFA has done just that for the 2015–2016 Champions League season, supplementing the world feed with enhanced data feeds, on-air graphics, and an infographics service. The intention is to gather more content from the venue and enable broadcasters to offer an enhanced digital experience. UEFA is also experimenting with embedding a watermark in the audio track of the world feed, enabling broadcasters to experiment with marketing strategies across second screens.
“A Lionel Messi goal would be watermarked linking the match action to a series of relevant additional content available on the viewer’s second screen,” says UEFA digital media solutions manager Olivier Gaches. “For example, further information about the player, an opportunity to view a selection of his previous Champions League goals, or an Adidas ecommerce promotion.”

To 4K and Beyond

The technology to produce 4K live has moved from workarounds to a fully fleshed-out equipment chain. A significant development has been the introduction by all the major manufacturers of systems cameras carrying two-thirds-inch chips rather than large format single sensors more suited for cinema. The two-thirds-inch cameras allow outside broadcasters to continue to use their existing inventory of zoomable lenses to maintain the characteristic depth of field of sports action.
New 4K wireless camcorders for touchline camera work are emerging, although it will take some time before the units are of a size and latency to be useful for minicams. Other missing elements include super slow-motion tools capable of generating 4K resolutions. The issue here is that the frame rate needs to be exceptionally high to offset light loss.
It is no coincidence that the first significant 4K live channels are sports channels distributed to the home over broadband. BT Sport was able to beat pay TV rival Sky to a 4K launch in August because its connection to customer homes is capable of download speeds up to 300Mbps, provided customers upgrade their package and add a YouView+ set-top box.
To achieve first-mover status, the telco compromised on production by launching with a 3G-SDI (serial digital interface) workflow, a workaround solution using conventional infrastructure. Although branded BT Sport Ultra HD, pedants could say the channel does not meet full-fat UHD since it lacks high dynamic range.
“IP live is not yet ready,” BT Sport COO Jamie Hindhaugh said shortly after launch. “We are looking at IP and attributes like HDR, and how that integrates into 4K. Our focus is on being trailblazers and staying out in front.”

In North America, Canadian telco Rogers Communications has committed to broadcasting in 4K by the end of 2016. It will connect its cable footprint of 4 million homes to the internet at 1Gbps, branding the service Rogers Ignite and requiring customers to buy a new STB if they want to watch Toronto Blue Jays’ home games in the highest resolution. Rogers is in a perfect position to do this since it owns the Blue Jays and also has a stake (with CTV and Liberty Media) in sports network Sportsnet. Being later to market than BT, it is able to add HDR to the UHD mix.
The next big development, and one which is ripping out the fabric of broadcast production worldwide, is the migration from SDI to IP video transport. We discussed this movement in the article “Resistance is Futile” in the November issue (go2sm.com/resistance), but it’s worth highlighting the implications for sports.
The first is that IP will slash the cost of sending OB trucks and crew to mix a live directed feed at a venue. NBC Sports and other broadcasters have already made templates for large-scale remote production workflows for the Olympics, NASCAR, and NHL. Doing so, using technologies such as Avid’s online collaborative workspace, is said to have increased production output by more than 30 percent. Around Super Bowl XLIX, NBC crew in Phoenix were able to collaborate on programming with teams in Stamford, Conn., over Avid’s cloud-based MediaCentral.

Remote Production

Remote production over IP for onward distribution to sportscasters and team websites or YouTube is set to open up second- and third-tier leagues and sports to coverage. A key reference is Pac-12 Networks’ remote coverage of 850 annual sports events. It uses T-VIPS and Nevion links to transmit talkback and telemetry to and from venues up to 2,500 kilometers away. Doing so saves an estimated $15,000 per game, or $13 million a year.
A single 4K camera (or pair of them picture-stitched to provide a panoramic view) can be controlled remotely from a central production hub, making more games economical to produce. However, broadcasters will still send significant mobile setups to flagship events such as a Super Bowl or Ryder Cup, in order to command a more complex multicamera arrangement and generate wider coverage from talent on the ground.
Many broadcasters are looking to time their adoption of video over IP with a move to 4K/UHD. The economic argument is simple: A 10Gbps Ethernet cable can transmit much more efficiently and cost effectively than quad-link HD SDI cabling.
Experiments in this area require unprecedented collaboration among the notoriously proprietary broadcast equipment makers. One proof of concept has been demonstrated by integrator Gearhouse Broadcast in which an EVS IT-based DYVI switcher was shown cutting together 4K signals over 10Gbps fiber from Hitachi cameras. Sports producers need to be convinced that the signals can be encoded/decoded as necessary and cut together in sync with audio without latency—critical for any live broadcast.
It’s worth noting that experiments into 8K continue to concentrate on sports. Japanese broadcaster NHK, which plans to introduce 8K domestic transmissions ahead of the 2020 Tokyo Olympic Games, tested the format at the FIFA Women’s World Cup and at a New York Yankees game, and plans to do the same at Super Bowl 50—even while the NFL has announced no plans to broadcast games in 4K.
NHK has an agreement with FIFA to offer Super Hi-Vision—which combines 8K with a 22.2 surround sound system and 60 frames per second—transmission of the 2018 World Cup. Just as 4K capture is used by broadcasters such as Fox Sports to zoom into and resize pictures for HD output, so 8K cameras could soon be used for similar purpose for 4K output.
Aside from high spatial resolution, the Ultra HD road map passing through standards bodies SMPTE, DVB, and ATSC includes HDR, wider color gamut, frame rates up to 120p, 10-bit sampling, and immersive audio. Each new enhancement increases the amount of data the network must transport, raising demands on the entire live IP production infrastructure.
“There’s lots of work to be done on the delivery side before 4K is widely viable,” says Bill Wheaton, executive vice president and general manager of media products at Akamai. “We’re going to have to change the fundamental technologies of the internet. Ninety percent uptime [reliability] isn’t good enough. If consumers miss that one goal, they’re going to be frustrated.”
Akamai forecasts that 500 million viewers will soon be watching prime-time live sports online. “With 500 million online viewers, we need 1500Tbps. Today we do 32Tbps, so you can see the huge gap we have to bridge,” says Akamai’s director of product enablement and marketing, Ian Munford.
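Munford’s two figures imply roughly 3Mbps per concurrent viewer, as a quick check shows (pure arithmetic on the numbers quoted above):

```python
viewers = 500e6                      # forecast concurrent audience
capacity_tbps = 1500                 # Munford's stated requirement
per_viewer_mbps = capacity_tbps * 1e12 / viewers / 1e6
print(f"{per_viewer_mbps:.0f} Mbps per stream")        # 3 Mbps
print(f"scale-up vs today: {capacity_tbps / 32:.0f}x") # ~47x over 32 Tbps
```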
Akamai’s solution involves using HTTP/UDP to prevent packet loss and reduce latency, to speed the transit of content, and make it easier to handle unpredictable peaks within the CDN. It will also use multicasting, pre-positioning of content at the edge to bypass unpredictable events, and peer-assisted delivery using WebRTC.
Akamai is focusing these efforts around live workflows built especially for sports. “You will see a transition from broadcast toward IP because, unlike broadcast, the internet is not a limited platform but one that allows editorial to be creative with the experience,” Munford says. “And people will pay for that better experience.”

3D Audio

The unlocking of audio from the broadcast picture is likely to be the first creative iteration of object-based broadcasting, a potentially seismic shift in broadcast presentation predicated on the move to IP. Object-based broadcasting conceives of a program “like a multidimensional jigsaw puzzle,” according to BBC R&D, that is sent in pieces and can be reconstructed on-the-fly in a variety of ways just before the program is presented to the viewer.
Two proposals to update the audio component of UHD delivery are being considered by the Advanced Television Systems Committee as part of the ATSC 3.0 standard. Dolby’s AC-4 competes with MPEG-H, developed by an alliance of Fraunhofer IIS, Technicolor, and Qualcomm.
Both technologies promise greater interactivity by letting viewers adjust the presence of various audio objects in the broadcast signal. The user could choose a language, bring an announcer’s voice out of the background for enhanced clarity, listen to a specific race car driver communicating with his pit crew, or have the option of listening to either the home team or the visitor’s native broadcast mix depending on fan preference.
BBC R&D’s web application Venue Explorer, tested at the 2014 Commonwealth Games, is an example of how a broadcaster might provide viewers with separate audio mixes corresponding to the part of the scene that they wish to look at.
“When viewing a wide shot, the audio will convey the ambience of the stadium, similar to what would be heard by someone in the audience,” a BBC R&D blog reads. “As the viewer zooms in to an area, the audio is remixed to a broadcast feed corresponding to what is being seen.”
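A hedged sketch of that remixing idea follows. This is not BBC R&D’s implementation; the equal-power crossfade law, the function name and the buffer shapes are illustrative assumptions:

```python
import numpy as np

def venue_mix(ambience: np.ndarray, broadcast: np.ndarray,
              zoom: float) -> np.ndarray:
    """Crossfade between stadium ambience (wide shot) and the
    broadcast mix (zoomed in). zoom runs from 0.0 (full wide) to 1.0
    (fully zoomed); an equal-power law keeps loudness steady."""
    theta = zoom * np.pi / 2
    return np.cos(theta) * ambience + np.sin(theta) * broadcast

# One-second stereo buffers at 48 kHz standing in for real feeds.
sr = 48000
ambience = 0.1 * np.random.randn(sr, 2)
broadcast = 0.1 * np.random.randn(sr, 2)
halfway = venue_mix(ambience, broadcast, zoom=0.5)
```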
The implications are exercising minds today. As digital competes with—and in some cases exceeds—linear coverage, it brings a tension that producers have yet to resolve.
“Broadcasters have been very keen to control the production, but today everybody can curate the live event from multiple angles,” Exarchos says. “The most important challenge for broadcast is how to integrate more democratic storytelling into coverage of a sports event.”
Live sports are so compelling because they deliver those water cooler moments of drama that resonate around the world in an instant. “I don’t believe that with all the infinite choices that are offered to experience a sport there is not a need for a narrator,” Exarchos says. “The question is how we make the traditional directed narrative a more shareable experience?”
The answer is surely a close marriage of broadcast-produced live pictures and commentary with social media reaction, supplemented by greater user choice of content and data, played out across giant screens and second screens alike as fans seek ever-closer immersion in the experience at the game.

Data Acquisition and Visualization

Wearable cameras and sensors are feeding the demand for an information-rich sports experience to mobile. In its report “Football’s Digital Transformation,” PwC predicts that a completely personalized user experience will become a natural expectation among fans.
MLB Advanced Media has been able to lead in this field, in part because of its control of media production for all 30 teams. As in the NFL (and unlike soccer), baseball’s regular in-game pauses provide a window to disseminate stats to fans who cannot get enough data.

Virtual Reality Live Streams 

The NBA’s Sacramento Kings became one of the first teams in professional sports to employ digital headgear when it broadcast live to Google Glass, augmented with graphical overlays, during home games in 2014. The technology allowed fans to witness the courtside experience through the eyes of the team mascot, cheerleaders, and sideline reporters, who streamed their first-person views. While Google has benched Glass, new streaming experiences have retrained their sights on the anticipated mass market for smartphone-adapted virtual reality gear. Leading the charge is NextVR, with a system of stitching and encoding multicamera streams based on R&D for 3D TV transmission. The producer has its own rig of RED cameras and has tested the service with sports organizations including NASCAR, the NHL, and the NBA, and will look to monetize a pay-per-view experience this year.

Social Media Fuels Ignition

The Formula E motorsport series enters its second season. The 10-race series, which is marketed to Millennials, is organized by the FIA using more environmentally friendly, electric-powered vehicles. It races on street circuits (in Paris, Beijing, Malaysia, and Long Beach) and allows fans to potentially influence the outcome of the race, perhaps making it unique in major professional sports.
FanBoost gives the three drivers with the most votes on social media prior to each race a 5-second power boost per car, per driver, temporarily increasing their car’s power from 150kW (202.5bhp) to 180kW (243bhp).
“Viewers don’t just passively watch. They influence the outcome from second screens,” says Ali Russell, CMO of Formula E Holdings.
Drivers are encouraged to interact with fans too. One of the most active, China Racing’s Nelson Piquet, Jr., won the inaugural championship.
“Fan boost is hugely successful in the Far East because fans don’t have any of the preconceptions that fans in Europe might have,” says Russell.

FIFA World Cup Streaming

Internet traffic during successive FIFA World Cups grew tenfold between 2010 and 2014, to 222 petabytes. For the 2014 FIFA World Cup tournament, HBS delivered 2,799,360 minutes of live video streams to multiscreen viewers via tech provider EVS, processed through the Elemental Cloud.
The Brazil World Cup was the first where viewers were able to use second screens to watch games from multiple camera angles and play back clips on-demand during a live match. Rightsholders could use turnkey apps and video players to deliver customized multiscreen services with this functionality.
EVS used C-Cast, the distribution platform connecting HBS’ live production to a central cloud-based platform, to supply rightsholders with six live mixed ABR video streams from 14 camera angles per game. The service also provisioned 17 unilateral video streams plus four audio commentaries. Using Elemental Cloud, EVS was able to process and deliver multiangle live coverage to 24 million mobile and web viewers worldwide.
EVS delivered live ingests to Elemental Cloud using Aspera FASP. According to Elemental, the efficiency and speed of the workflow meant that streams were delivered from source to viewer screens at latency levels comparable to satellite broadcasts. Over the course of the monthlong 64-match tournament, Elemental Cloud estimates it ingested and streamed 35,280 hours of video.
According to EVS, on-demand multiangle replays are the most consumed additional content distributed by sportscasters as part of a digital package. “It’s a staple feature of PVRs, but having the ability to do this from a smartphone or tablet is the perfect example of being able to consume the broadcast experience on the go,” says Nicolas Bourdon, senior vice president of marketing for EVS.

Friday 15 January 2016

Broadcasters look to the cloud for playout

TVB Europe p25


According to some vendors, cloud playout is good to go today. Yet for the most part, broadcasters are holding off on investment. It is still early days for the technology, of course, and Moore's law will send performance up and costs down so that realtime, highly reactive graphics, UHD streams and live components can be played from virtualised equipment running in datacentres.


MAM, scheduling, traffic and billing, and automation are already either in the cloud or cloud-ready. Even playout and encoding have been deployed in datacentre environments as pure software systems. This implies private cloud solutions rather than public cloud-hosted solutions. But, according to Andrew Warman, director, P&P strategy and market development, Harmonic, not all workflows have been replicated for cloud-type environments.

“Some will take a considerable amount of time to fully transition,” he says. “An important question that broadcasters must ask is, ‘Which of my workflows can readily be implemented in the cloud, and which are better serviced by a more conventional environment?’”

The business benefits

Based on discussions with customers, Harmonic finds that broadcasters have two or three reasons for thinking about a move to cloud-type playout. One is that they are looking to deploy pure software running on customer- or service provider-furnished hardware. This, says Warman, may go hand in hand with virtualization. Broadcasters save money because it is either part of the IT budget or is built into the service contract that runs their channels, leading to CAPEX and OPEX savings.
Another reason broadcasters are considering the cloud is IP connectivity. The expectation is that lower-cost Ethernet-based solutions equal savings. “Add to this the continuation of function integration that really took off with CIAB (channel-in-a-box), and broadcasters need fewer connections as systems become more integrated in software,” says Warman.

In addition, some broadcasters no longer want to be burdened with vendor-specific hardware and maintenance, or dealing with proprietary designs. Overall, IT equipment costs less to operate and manage.

Channel agility is a key benefit of moving to the cloud. Traditional broadcast workflows are expensive to build, take time to construct, and can be complex to operate. A cloud-based approach enables agile channel deployment since the software, licensing models, available processing power and connectivity are flexible.
Technologies like PlayBox's CloudAir were designed to include ad hoc services that may only be needed for an hour or even less. “New services or channels would therefore genuinely be up and running in little more time than it takes to exchange emails or make a phone call,” says company president Don Ash.
“In terms of a 24/7 channel starting on the cloud playout model, an efficient telco should be able to get these live in a matter of hours or even in a few minutes,” he says. “The most significant costs of running any programme channel then become, as arguably they always have been, the overheads of originating, acquiring or refining content and employing whatever administrative and creative people the organisation needs.”
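Ash's claim is easy to picture as code: once playout is pure software on a generic compute image, starting a channel is a single provisioning call. Below is a hedged sketch using AWS EC2 via boto3; the AMI ID and the idea of a pre-baked playout image are hypothetical illustrations, not CloudAir's actual interface:

```python
import boto3

# Hypothetical: an AMI pre-baked with playout software that fetches
# its schedule from the traffic system on boot. Launching a pop-up
# channel is then one API call rather than a rack build.
ec2 = boto3.client("ec2", region_name="eu-west-1")
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical playout image
    InstanceType="c5.2xlarge",        # assumed CPU-bound channel size
    MinCount=1,
    MaxCount=1,
)
print("channel instance:", response["Instances"][0]["InstanceId"])
```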

The barriers to adoption

Yet there are a number of barriers to adoption. “Where there is hardware which is not yet amortized it won't be economically reasonable to move existing playout to the cloud,” says Martin Schröder, MD and founder of PACE Media Development. “There will also be training costs to handle new technologies, and some new equipment will be necessary on the customer's site.”

On the technical side, Schröder says bandwidth from customer premises to the public internet can be a limiting factor. Some new skills will be necessary, he says, and there will be a need for a new monitoring/QC infrastructure, “IP-based instead of SDI.”

He also identifies a general fear of new technologies; security concerns; concerns regarding stability and of dependency on a cloud service provider. Perhaps most importantly there is a psychological barrier to overcome.

Broadcasters must tear themselves away from what Veset CEO Igor Krol calls the “very conservative paradigm of reliability and security” of linear TV. “When it comes to linear playout broadcasters are still stuck in their old ways, and struggle to think outside the existing ASI/SDI, proprietary hardware and GPUs paradigm,” he contends. “At best, they try to squeeze new technologies into old packaging. A glaring example of such thinking is the constant request to provide SDI over the internet.”

The biggest barrier is a lack of knowledge at all levels. The list of new things to know about touches every part of a playout business.

“Questions include: What resiliency models at the virtual machine level do you want to deploy? What are the different cost models in the different clouds? If you upload an extra three days’ worth of content over a long public holiday, will your business be hit with a huge bill? The considerations are huge, both financially and technically,” cautions Karl Mehring, Director of Playout & Delivery, SAM.
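Mehring's holiday-weekend example can be roughed out in a few lines. All rates below are assumptions for illustration (real pricing varies by provider, region and tier), but they show where the surprise tends to come from:

```python
# Assumed figures: a 10 Mbps mezzanine contribution feed, three extra
# days of content, and the delivery egress for a channel with 5,000
# concurrent viewers at 3 Mbps.
upload_gb = 10e6 / 8 * 3 * 86400 / 1e9          # ~324 GB ingested
storage_cost = upload_gb * 0.03                  # $/GB-month, assumed
egress_gb = 5000 * 3e6 / 8 * 3 * 86400 / 1e9    # ~486,000 GB out
egress_cost = egress_gb * 0.09                   # $/GB, assumed

print(f"ingest: {upload_gb:,.0f} GB, storage ~${storage_cost:,.0f}/mo")
print(f"egress: {egress_gb:,.0f} GB, ~${egress_cost:,.0f}")
# Ingest is usually free and storage is cheap; the surprise bill is
# the egress side, which scales with audience, not schedule.
```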

“It is also about understanding that the industry is in flux and that you have to be constantly re-evaluating and testing new solutions coming to the market,” says Krol. “The fact that you tested a so-called cloud solution twelve months ago does not mean that it ‘does not work’ for you. Actually, it may not have been a cloud solution at all.”

Mehring says he knows of content owners who would “jump at the chance” to put all their playout solutions into the cloud today, if they could trust that the level of service would be what they want. “The fact that these businesses have yet to find a solution they can commit to should indicate what they believe about the state of the industry today,” he advises.

“‘Not all vendors are equally capable’ is a nice way to put it,” suggests Jan Weigner, MD and co-owner, Cinegy. “This may create customer skepticism.”

Cinegy and other vendors which have built a business on software running in data centres tend to take issue with competitors which have remodelled their hardware products as IP.

“At the moment, the majority of hardware vendors have been marketing remotely accessed traditional playout hardware servers as 'cloud playout solutions', in some cases with additional layers of web-based consoles,” claims Krol. “The architecture suggests that you place the manufacturer’s hardware in a remote data centre and operate playout remotely via the internet.
“In reality, such an approach has little to do with cloud philosophy, and cannot be called a cloud solution,” he says. “Such solutions have no elasticity because the computing resources of such specialised playout hardware cannot be dynamically managed. Ultimately, users have to bear the high cost of acquiring and operating traditional hardware.”
Realistic expectations
Disaster recovery is the logical, cautious approach for broadcasters (Qatari broadcaster Al Rayaan and ITV have explored cloud playout DR proofs of concept).

“Once the realization sets in that you can just launch another couple of dozen channels in a matter of minutes to try, for example, to go hyper-local, people will do it,” says Weigner. “The competition among service providers will heat up as theoretically anyone with a credit card can be one at any moment. Is that a realistic threat? Probably not. But in reality customer relations, service quality, trust earned and ultimately price are the deciding factors.

“The cost of the actual playout engine – physical or virtual – is only one factor in the total cost of playing out a channel. But by going virtual and being IP-based I can 'virtualize' my staff, and my operations centre can be anywhere I have an internet connection. Cloud-based playout means that a broadcaster or service provider can run operations from anywhere on the planet - wherever the staff is most affordable.”

It is hard to argue with such a forceful personality as Weigner, and when it comes to Cinegy technology his claims may be justified. The fact remains, though, that few primetime channels are likely to be published from the cloud in the immediate term.

“As an industry we need to start being completely honest about what is achievable today,” cautions Mehring. “Some vendors appear to be raising expectations of what is possible today to unrealistic levels. Fortunately many customers have now worked this out, but it is a disruptive influence in the industry at a time of such change. The truth is we are at the early stages of our industry's transformation.”