Thursday, 9 July 2015

Is Vice the Future Face of News?

IBC
When Sir Martin Sorrell, the Chief Executive of WPP, the world's largest advertising group, was asked how best to understand the new media landscape, he referred to Vice.
“They understand how millennials think, what content millennials want,” he said.
WPP owns a 10% stake in Vice so perhaps there's an element of self-interest. But 'old media' investors are lining up to grab a slice of this hot property.
21st Century Fox paid $70 million for a 5 per cent stake, while Disney chairman Bob Iger, MTV co-founder Tom Freston and Disney/Hearst-owned network A+E Networks have followed suit, helping to value the 21-year-old youth publisher at more than $2.5 billion and rising.
Vice began as a punk magazine in Montreal and moved online seven years ago. It attracts the broadband generation to channels including Motherboard (covering technology), Noisey (a music discovery channel) and a food channel called Munchies.
It already had a reputation for gonzo-style journalism posted from hotspots of war and crime, before it formalised those video reports into Vice News in March 2014. Vice News has since become the fastest growing such channel on YouTube, gaining 1.1 million subscribers and 175 million video views as of early 2015.
Characterised by Vice founder Shane Smith as “the CNN of the street”, Vice has most recently extended its partnership with HBO in a deal that will see it produce a daily news programme and run its own branded channel on the HBO Now streaming service.
“Basically, as a news organization Vice is no different than anybody else,” Kevin Sutcliffe, Vice Media’s Head of News Programmes for Europe told Chatham House, The Royal Institute Of International Affairs earlier this year. “You’ve got to trust us. Getting it wrong is extraordinarily damaging. So it’s very old fashioned, that’s one of the basic bases of journalism.”
Its success, along with that of fellow news disrupter BuzzFeed, is no surprise to Vice executives, who believe traditional news organizations hold a misguided assumption that millennials are not interested in learning about the world.
“A great American word is to bloviate, which is basically to sit around chatting all day long on the news channels,” said Sutcliffe, a former editor of Channel 4 current affairs strand Dispatches who was hired to launch the channel. “Vice News is absolutely a response to that. We have a form of journalism that is immersive, raw, embedded and authentic.”
Sutcliffe, who has criticised BBC journalism and current affairs as “institutionalised” and “beige”, said Vice disagreed that youth audiences were apathetic about news.
“It’s just how it was being presented that was the issue... formatting is out of date, it’s run its course, it talks down to people, it is not representative of 16 to 35 year olds. It skews very old and that’s because it doesn’t speak to them,” he argued.
“With that in the back of our minds we tried to make what we think is a different form of television news and documentary. What does that look like? It looks like ‘Ambushed in South Sudan’, a film where two of our journalists go on a journey with the South Sudanese army to take a town. It’s a 25 minute film in which you experience this army trying to take a town and then retreating under fire. It’s an experiential documentary where you learn more about Africa, African wars, African people. It’s very up close, it’s very personal. That’s a hallmark of Vice News’ journalism: you’re in the mix with the story, with the journalist, with the people you’re meeting. It’s character driven and it’s immersive.”
That approach seems to have struck a chord, attracting large audiences to documentaries about the coup in Mali, the Ukraine conflict and, most notoriously, a film that embedded Vice News with the Islamic State.
“It was a global moment in terms of media because we remain the only media organisation to have got inside and been able to film with the Islamic State and got out,” said Sutcliffe. “That showed how we operate, which is a very raw and unmediated way.”
That Vice News is online is also an advantage since it is not tied to a particular schedule or format. “People now want authenticity,” said Sutcliffe. “News now does not break in a newsroom. News breaks on Twitter. We’re posting a lot of editorial every day, from around the world, from our writers and a range of video. If you actually look across our output, there’s an incredible range. We’re not competing with a BBC or cable news. We don’t need to fill their hours.”
Kevin Sutcliffe presents 'Online News Case Study: How Vice News is changing the paradigm' at the IBC Conference. Also see ‘The Big Talking Point: The internet era of TV is here, right? So how well is TV tackling the key issues?’

Wednesday, 8 July 2015

Reinventing the IBC Big Screen Experience

IBC
As digital cinema approaches market saturation, it promises to deliver new, exciting and enhanced presentation options and exhibition business models ranging from immersive audio to motion simulation, but at the cost of greater complexity in mastering the Digital Cinema Package (DCP).
Even a typical studio title demands more than 100 versions of the DCP, and this can rise to 450 versions for major releases. The basic DCP will be versioned for territories and include copies for subtitles, dubbing and language titles. DCPs will also be mastered for individual projection systems and theatre characteristics such as aspect ratio (flat or scope); resolution (2K, 4K); audio type (5.1, 7.1); and 3D (which requires its own set of 3D subtitles).
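To see how the numbers balloon, here is a minimal Python sketch that enumerates the combinations a distributor might have to master; the category values are illustrative assumptions rather than any studio's actual delivery matrix.

```python
from itertools import product

# Illustrative versioning axes -- the specific values below are assumptions
# for demonstration, not an actual studio delivery matrix.
territories    = ["UK", "FR", "DE", "JP", "LATAM"]   # subtitle/dub language groups
aspect_ratios  = ["flat", "scope"]
resolutions    = ["2K", "4K"]
audio_types    = ["5.1", "7.1"]
dimensionality = ["2D", "3D"]                        # 3D needs its own subtitles

versions = list(product(territories, aspect_ratios, resolutions,
                        audio_types, dimensionality))
print(f"{len(versions)} DCP versions before premium-format additions")
# 5 x 2 x 2 x 2 x 2 = 80 -- layer immersive audio mixes, motion-simulator and
# laser/HDR masters on top and the 100-450 figure quoted above is quickly reached.
```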
Further complexity is now being added with enhancements to presentation driven by growth in premium large format exhibition.
These include object-based audio formats (Dolby Atmos, Barco AuroMax); versions for motion simulators (D-BOX, 4DX, X4D); and versions with wider colour gamut and greater luminance (or brightness, to showcase laser-projected films).
Emerging presentation formats like Barco Escape require multiple DCPs tailored for multi-screen projection. Higher Dynamic Range (providing a wider range between the whitest whites and blackest blacks in a picture) is another creative option being promoted by Dolby. Disney/Pixar's ‘Inside Out’ and Disney's ‘Tomorrowland’ are the first titles to be mastered for Dolby Cinema, a presentation format that includes HDR and Dolby Atmos projected using Christie 6P laser projection.
Different frame rates are also open to filmmakers and distributors. Peter Jackson's ‘The Hobbit: An Unexpected Journey’ was the first major release made at 48 frames per second, an aesthetic choice which sharpens the picture by doubling the frame rate from the century-old standard of 24fps. Although the two later Hobbit sequels were also available in 24fps and 48fps versions, high frame rate (HFR) filming has not been widely adopted, but nor has it disappeared. Director Ang Lee is reportedly shooting his next feature ‘Billy Lynn’s Long Halftime Walk’ in 3D, 4K and either 60 or 120fps. VFX pioneer Douglas Trumbull wowed IBC last year with the presentation of a 3D, 4K, 120fps laser-projected short film.
While final distribution to cinemas is on hard drive, via fibre network or satellite, DCP administration and creation have become vastly more complicated. A facility like the new joint venture Deluxe Technicolor Digital Cinema is typically given two weeks to create all DCPs for any title, including a manual quality-control screening that ensures the mastering has been completed correctly.
“The margin has dropped out of distribution and the cost has shifted to producing and handling the DCP,” says Richard Welsh, CEO, Sundog Media Toolkit and SMPTE Governor for EMEA.
The problem is exacerbated by having to distribute to the bulk of theatres whose older digital cinema equipment has not kept pace with many of the new DCP enhancements.
A single venue alone may receive a dozen different DCPs, each accompanied by a separate electronic key that unlocks the encrypted film and is bespoke to individual players and projectors.
“The full cost savings anticipated by moving to digital have yet to be realised,” says Welsh.
An industry-wide shift to SMPTE-DCPs is an attempt to streamline the process. While the Digital Cinema Initiatives (DCI) spec (first published in 2005 and better known as Interop) succeeded in getting digital cinema off the ground, the fast-evolving nature of the technology makes SMPTE's proposals more suitable for incorporating new developments.
Early digital projection equipment installed in cinemas requires an upgrade to play back SMPTE-DCP movies. UNIC (the International Union of Cinemas) and the EDCF (European Digital Cinema Forum) are working with Dolby, Deluxe, Sony and others to ensure full conversion.
Down the line, DCPs with augmented reality components are possible, viewable in cinemas using a type of glasses.
The next step out from this, with prototypes already in the labs, is holographic projection, a form of 3D immersion viewable without glasses.
Immerse yourselves in the latest attempts to reinvent the 120-year-old cinema medium with an A-list panel of industry practitioners at the free-to-attend IBC Big Screen Experience conference sessions.

Wednesday, 1 July 2015

Computer gaming: virtual sport built for virtual production

IBC
The marriage of IP production with internet viewing was always supposed to expose niche live events to a wider viewing public but few would have bet on computer gaming becoming the next mainstream spectator sport.
While debate will rage about whether virtual games are on a par with more obviously athletic sports, electronic sports -- or eSports -- is a phenomenon with rapidly growing revenue streams, one that has attracted venture capital firms, major brands and broadcasters. “If you are a traditional media outlet trying to deal with massive change in Millennial viewing habits you have to be looking at eSports to capture this new audience,” says Ian Sharpe, CEO, Azubu.
SuperData pins the worldwide eSports audience at 134 million, rising to 153 million next year.  “The intersection of technology, fandom and interactive entertainment is presenting [the industry] with new ways of sharing great experiences on a global scale,” it states.
According to market researchers IHS Technology, 2.4 billion hours of eSports video were consumed online in 2014, a figure expected to hit 6.6 billion in three years, by which time the sport's global value will exceed $1 billion (SuperData). Home gamers have posted clips of their work online for years. Now games are watched live on dedicated gaming sites. Teams, some organised in leagues such as the European Gaming League (EGL) and Electronic Sports League (ESL), play strategy games like StarCraft 2, multiplayer online battle arena games like Dota 2, and first-person shooters like Counter-Strike.
The two largest networks are Major League Gaming's MLG.tv, which specialises in Call of Duty contests, and Twitch.tv for which Amazon paid nearly $1 billion last year. Competitors include Gfinity, Azubu and Dingit. The success of Twitch, which records a monthly audience in excess of 100 million, recently forced Google to launch YouTube Gaming with an emphasis on live streaming (currently in beta).
Also streamed on such sites are matches broadcast live from venues in front of fans. Among the biggest is the World Championship Finals of League of Legends, which drew 45,000 people to South Korea's Sangam Stadium last year. Publishers like Riot – owner of League of Legends – organise these events in support of their intellectual property, for which a nascent market in broadcast rights has emerged.
The world's first dedicated eSports stadium, with 15,000 seats, is being built in China, while in the UK Gfinity teamed with Fulham's 600-seat Vue cinema to run a weekly programme of matches – live streamed – from March until September [see Cinema 2020 at the IBC Big Screen Experience].
Production of the broadcasts is also growing in quality. Native digital content from within the games is ripe for streaming. On top of that, POV cameras capture shots of the gamers (their facial expressions and hand movements); wider positions show the venue's spectators watching the talent on giant screens; and replay systems are available to the live show's producer. Mics on the gamers can pick up their reactions and commentary, and VTs of player personalities can be inserted pre-show or during the show.
Vision mixers commonly used in outside broadcasts combine graphics, audio mixing and special effects, with the stream published to social media and converted to H.264 for online distribution. All of the feeds can be controlled remotely over IP, which Red Bull Media House does from the dedicated eSports studio it opened last year at its US headquarters.
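The article does not name the tooling used for that last step, but as a rough sketch of the idea, the following Python snippet hands a mixed programme feed to ffmpeg for H.264/AAC encoding and pushes it to an RTMP ingest point; the input source, bitrates and URL are placeholder assumptions.

```python
import subprocess

# Placeholder values -- swap in the real programme feed and ingest endpoint.
PROGRAMME_FEED = "programme_feed.mp4"                       # output from the vision mixer
RTMP_INGEST = "rtmp://live.example.com/app/stream-key"      # hypothetical ingest URL

# Encode the mixed output to H.264/AAC and publish it over RTMP.
cmd = [
    "ffmpeg",
    "-re", "-i", PROGRAMME_FEED,       # read the input at its native frame rate
    "-c:v", "libx264",                 # H.264 video
    "-preset", "veryfast",             # favour encoding speed for live use
    "-b:v", "4500k", "-maxrate", "4500k", "-bufsize", "9000k",
    "-g", "100",                       # keyframe interval (2 seconds at 50fps)
    "-c:a", "aac", "-b:a", "160k",
    "-f", "flv",                       # container expected by RTMP ingest
    RTMP_INGEST,
]
subprocess.run(cmd, check=True)
```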
German-based ESL, owned by Turtle Entertainment, runs its own web channel (ESL.tv) to promote the ESL league and produces all the content in-house. Azubu (pictured), which buys rights to live stream tournaments, uses a combination of Amazon Web Services cloud, Akamai CDN and the Brightcove online video player to service its subscribers.
“Higher production values are needed in order to take live event e-sports streaming to the next level,” believes Sharpe. “We need on-ramps into esports so that casual viewers can understand what it is all about. Stats are great at showing how people rank and are performing and giving an indication of what is happening. Commentary is another key and so is being part of the crowd in the experience. That means an emphasis on social media to contextualise the game. eSports are still creating that language and working out how to communicate that language.”
Unlike other sports, eSports affords the opportunity to chat in realtime with players while they are streaming. “This potential proximity is another reason for esports' popularity,” says Sharpe. “What we have to do is create a good experience for showcasing these personalities and build a solid global programme.”
Again unlike other sports, and thanks to live streaming, eSports has a chance to develop an instant global presence. Instead of isolating players in national leagues, networks like Azubu are intent on opening up players in Brazil or Korea to audiences in Europe, and vice versa.
“There is a realtime transparency between gamers and the fan community which is unique,” says Sharpe.
With the global video game industry expected to top $100 billion in 2015, according to Gartner (far in excess, incidentally, of the global box office for feature film of $36.4bn, according to the MPAA), gaming is primed to move into the mainstream of sports consciousness.
Its primary home will be online, but that didn't stop ESPN partnering with Red Bull to broadcast The International 4 – the annual world championship of Dota 2 – from Seattle's KeyArena on ESPN 3 last year. Viewing on a flat screen is one thing, but just around the corner is virtual reality. In early 2016, Facebook is to debut VR device Oculus Rift and Sony will bow Morpheus, its VR headset for PlayStation 4. This Christmas, game developer Valve will launch its own visor-style, head-mounted display called Vive.
While Samsung Gear VR (co-developed with Oculus) and Google Cardboard are already on sale, the latest entrants will work from a PC rather than a smartphone and are expected to deliver the next level in virtual experience. EGL's mission is typical of those at the centre of eSports' growth. Earlier this year it partnered with advertising agency BBH – which eyes EGL as a marketing platform – to turn gaming into a global sport to rival football and Formula 1.

Monday, 29 June 2015

Telstra Acquires Cloud Asset Management Specialist Nativ

Streaming Media


Australian telco Telstra has made a further acquisition, this time through video analytics firm Ooyala—which it bought last August—of U.K.-based media asset management developer Nativ.
The acquisition (for an undisclosed amount) expands Ooyala’s technology stack into video production, pre-production, and broadcast planning for hybrid OTT and on-air video services.
Nativ offers cloud-based media logistics and workflow software and services branded as Mio Everywhere. Among its key reference clients is U.K. broadcaster ITV. 
Nativ is Ooyala's second acquisition within the last nine months, following its purchase of European video ad tech provider Videoplaza last October.
A press release described the move as “the next step in Ooyala’s multi-phase strategy to deliver the most comprehensive, data-driven personalized TV and video platform that will power the next generation of television.”
The acquisition comes at a pivotal time for the industry, as the traditional systems for managing media workflow for on-air content delivery are undergoing major transformation, in the same way that media delivery and monetization solutions are transforming into more open, modular, cloud-based systems.
Ooyala plans to operate Nativ as a standalone new line of business under the Ooyala brand. Over time it will integrate Nativ’s technologies with its core video publishing, analytics, and monetization platform.
“The new TV marketplace can’t be serviced by legacy broadcast business systems," said Jay Fulcher (right), president and CEO for Ooyala. "New data-driven technologies and services will transform the way broadcasters, media companies, and brands operate in the era of multi-screen consumption. A transformation of this scale represents massive opportunity for the innovators that can drive new, future-proofed standards.”
Charlotte Yarkoni, president of Telstra Software Group and vice chairman of the board for Ooyala, stated, “Combining Nativ’s technology and team with Ooyala is a big step forward in executing our shared vision for a consolidated, global leader in personalized cloud TV and video. Nativ opens a lucrative new line of business for Ooyala. Following its acquisition of Videoplaza last October, it provides key media management and broadcast planning technologies that will extend the reach and power of its video, analytics and advertising offerings. Ooyala will stand out as a trusted provider who can meet the needs of broadcasters and media companies every step of the way.”
Nativ CEO Jon Folland said the deal enabled his company to catapult into a new phase of rapid growth, at global scale. Folland will remain part of the Ooyala executive team under Fulcher’s leadership. “With the backing of Telstra and their commitment to making the ongoing investments to support a rapid pace of innovation, we are now best in class in both standalone media logistics software and comprehensive, data-driven cloud TV,” he said.
The Mio platform includes a module for end-to-end management of multiscreen ad campaign workflows, and a data management module that gives companies the ability to model, gather and manage data across their value chain. This data management module is said to be complementary to Ooyala’s own analytics engine.
Telstra has previously invested $7.3 million in Elemental Technologies.

Wednesday, 24 June 2015

Digital’s phase two

Screen International 

p45 http://edition.pagesuite-professional.co.uk//launch.aspx?eid=b894dd61-fb5a-401e-9cb2-a0ba09be203f

Digital cinema is fulfilling its promise to deliver enhanced presentation options. But with that comes greater complexity in mastering and distribution. “In the early years of digital cinema, it was right to offer a dual 35mm and digital approach to mastering and distribution,” says Andy Scade, director of digital cinema at Deluxe London. “Now that process is largely complete, we are entering a second phase which is about pushing the new opportunities created by digital.”
“Over the past five years, the industry has begun to realise the potential of what digital cinema can bring to distribution,” says Richard Fish, commercial director of Eikon Group. “A digital infrastructure allows you to deliver to all platforms including home entertainment, VoD and subscription TV. Digital means speedier delivery for day-and-date releases. If you get it right, you can find efficiencies in the workflow.”
A typical studio release demands more than 100 versions of the Digital Cinema Package (DCP), which can escalate to 450 versions for major titles. The basic DCP will be versioned for territories and include copies for subtitles, dubbing and language titles. DCPs will also be mastered for individual projection systems and theatre characteristics such as aspect ratio (flat or scope), resolution (2K, 4K), audio type (5.1, 7.1) and 3D (which requires its own set of subtitles).
Further complexity is now being added with enhancements to presentation driven by growth in premium large format (PLF) exhibition. These include immersive audio formats (such as Dolby Atmos and Barco Auro); versions for motion simulators (D-BOX, 4DX, X4D); and versions with wider colour gamut and greater luminance (or brightness to showcase laser-projected films). Emerging presentation formats such as Barco Escape require multiple DCPs tailored for multiscreen projection.
Higher Dynamic Range (HDR), which provides a wider range between the whitest whites and blackest blacks in a picture, is the latest creative option being promoted by Dolby. Pixar’s Inside Out and Disney’s Tomorrowland are the first titles to be mastered for Dolby Cinema, a presentation format that includes HDR and Dolby Atmos.
Different frame rates are also open to filmmakers and distributors. The Hobbit: An Unexpected Journey was the first major release made at 48 frames per second, an aesthetic choice that sharpens the picture by doubling the number of frames from the century-old standard 24fps. Although the two later Hobbit sequels were shown in 24fps and 48fps versions, the method has not been widely adopted, though James Cameron’s Avatar sequels are expected to feature a 60fps DCP.
While final distribution to cinemas is on hard drive, via fibre network or satellite, DCP administration and creation have become vastly more complicated. A facility like the new joint venture Deluxe Technicolor Digital Cinema is typically given two weeks to create all DCPs for any title, including a quality-control screening that ensures the mastering has been completed correctly.
“The margin has dropped out of distribution and the cost has shifted to producing and handling the DCP,” says Richard Welsh, CEO of Sundog Media Toolkit, which develops software processing tools for post-production and digital cinema packaging. The problem is exacerbated by theatres with older digital equipment. A single venue may receive a dozen different DCPs, with a separate electronic key to the encrypted film that is bespoke to individual players and projectors. “The full cost savings have yet to be realised,” says Welsh. “It is in the studios’ interest to reduce complexity because that reduces the cost.”
An industry-wide shift to SMPTE-DCPs is an attempt to streamline the process. While the Digital Cinema Initiatives (DCI) spec, first published in 2005 and better known as Interop, succeeded in getting digital cinema off the ground, rapid change means SMPTE’s proposals are more suitable for incorporating new developments. “SMPTE-DCP will ensure more manageable DCP sizes, while allowing cinemas to benefit from new technology and staying compatible with the basis of DCI,” says Manel Carreras, SVP of marketing and content services business development at Ymagis.
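A rough sketch of how a facility might flag which flavour a package is, assuming the XML namespaces commonly associated with Interop and SMPTE composition playlists (treat the exact namespace strings as assumptions to be verified against the specs):

```python
import xml.etree.ElementTree as ET

def dcp_flavour(cpl_path: str) -> str:
    """Guess whether a composition playlist (CPL) is Interop or SMPTE.

    The namespace checks below reflect the URIs commonly associated with the
    two formats; verify them against the relevant specs before relying on this.
    """
    root = ET.parse(cpl_path).getroot()
    ns = root.tag.split("}")[0].lstrip("{")   # namespace URI of the root element
    if "smpte-ra.org" in ns:
        return "SMPTE-DCP"
    if "digicine.com" in ns:
        return "Interop DCP"
    return f"unknown (namespace: {ns})"

# Example (hypothetical path):
# print(dcp_flavour("CPL_example-feature_FTR.xml"))
```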
Early digital projection equipment requires an upgrade to play SMPTE-DCP version movies. The International Union of Cinemas and European Digital Cinema Forum are working with Dolby, Deluxe, Sony and others to ensure full conversion. Down the line, augmented reality (AR) components and holographic projection are possible. “One difficulty is making AR work no matter where someone sits,” says Welsh. “By overlaying elements to a film you are experimenting with the theatrical experience, which is more interesting than virtual reality.”





Reality Bytes

Screen International

p41 http://edition.pagesuite-professional.co.uk//launch.aspx?eid=b894dd61-fb5a-401e-9cb2-a0ba09be203f

The anticipated launch of virtual-reality (VR) headsets next spring has created a feeding frenzy in Hollywood. It is founded on a belief in VR’s potential as a new mass entertainment medium. In early 2016, Facebook is set to debut VR device Oculus Rift while Sony will bow Morpheus, a VR headset for PlayStation 4. This Christmas, game developer Valve will launch its own visor-style, head-mounted display called Vive. While smartphone-based Samsung Gear VR (co-developed with Oculus) and Google Cardboard are already on sale, the new entrants to the market will work from a PC and are expected to deliver the next level in virtual experience.
Whether consumers will take to them is the million-dollar question. But with forecasts of 171 million users worldwide by 2018 and the prospect of two billion existing smartphone owners who could be enticed to buy a VR accessory, studios are not about to wait for the tech to scale.
“Even a small percentage of [2 billion users] is a big market,” says Cliff Plumer, the former Digital Domain CEO and Lucasfilm chief technology officer who is now president of virtual reality tech company Jaunt Studios. “VR for mobile is a static experience, but the incoming technologies will give a much more immersive feel that will give creatives many more options. The key to mainstream adoption is to produce enough content that people will come back for more.”
“This is a pivotal point in VR’s development,” echoes Dale Carmen, founder of VFX and VR facility Reel FX. “It is not a fad.”
While VR’s natural early home lies with computer games, it is also being eyed as a live-event platform by the broadcast and sports communities. The initial interest from Hollywood has been as a marketing tool, with studio marketeers latching onto VR as a shortcut to publicity for theatrical releases including The Hobbit, Interstellar and How To Train Your Dragon 2. Reel FX delivered VR campaigns for Legendary Pictures’ Pacific Rim and Lionsgate’s Insurgent. 20th Century Fox commissioned VR promos for Wild, The Maze Runner and The Book Of Life through its tech incubator Fox Innovation Lab. Future titles on Fox’s slate, such as War Of The Planet Of The Apes and James Cameron’s Avatar sequels, are obvious candidates for a VR companion.
“The next two to three years will see the creation of an ecosystem for VR content to be distributed, bought and consumed by viewers,” says Felix Lajeunesse, co-founder of Felix & Paul Studios, which produced the Wild VR promo. “Right now, there are no models to generate revenue, so it makes sense for budgets to come from hardware manufacturers who need to seed the market with content or from marketing, where VR serves an immediate purpose.”
Recent examples include a VR promotion for Disney/Marvel’s Avengers: Age Of Ultron. The CGI short was delivered by the New York branch of UK VFX house Framestore and commissioned by Los Angeles-based ad agency 72andSunny. “Event-based and marketing campaigns are a way to turn heads but are not the future of VR,” says Mike Woods, Framestore’s head of VR.
DJ Roller, co-founder of virtual reality tech firm NextVR, agrees: “VR trailers and marketing promos are a powerful tool but are not the highest use of the medium.” Nor is using VR as a vanilla distribution platform for cinema releases. Users of online stores such as Samsung Milk can choose to stream ad-supported or free content, including full-length features in 2D or 3D, in a virtual theatre viewed through a headset. “Playing back movies can offer a more immersive ‘cinema-style’ environment when you are on a plane, for example,” says Woods. “But it doesn’t remotely stretch VR as an art form.”
Comparing the shorter duration of VR content to feature releases also misses the point, according to advocates. “A VR piece may be two minutes long but since it doesn’t play linearly like a film, you could, if the experience is good enough, immerse yourself in that environment for hours,” maintains Lajeunesse. Felix & Paul recently released a nine-minute VR piece for Cirque du Soleil and is currently working on a 20-minute non-fiction experience. “As we understand more about storytelling in VR, the length of pieces will naturally expand,” says Felix & Paul co-founder Paul Raphael.
Other reasons cited for the current short-form nature of scripted VR include the need to educate users about what to expect. “VR delivers such an intense emotional experience — because you are placed within the scene — that it will take time for people to get used to it,” says Woods. “As they do, their acceptance of longer content will grow.”
Another factor, which will also evolve with time, is the limited capacity of smartphones and internet connections to stream or store the video. “I don’t think anyone’s ready, creatively or technically, to produce a two-hour film, let alone spend that time watching it,” says Lajeunesse. “One interesting idea is to create a series of episodic or bite-sized content that builds into a larger whole.”
Taking notice
By dabbling in VR promos, studios such as Fox will also be working out how they can keep a seat at the table, should VR become a viable content platform. With technical expertise in the hands of VFX houses and VR kit developers, there is nothing to stop their suppliers from striking out as independent content producers. Carmen likens the situation to the impact YouTube made on video distribution. “The arrival of technology like [Oculus] Rift has democratised VR,” he says. “It means we can go straight to Samsung, Google or Oculus and make this content available.”
Virtual-camera maker Jaunt, which is backed by Google and Sky, spun off content arm Jaunt Studios after recruiting Cliff Plumer and ex-LucasFilm COO David Anderman to take charge of the offshoot. VFX and 3D conversion studio Legend3D launched a VR division in February and announced that it had partnered with feature film studios. As for Oculus, it tapped Pixar Animation creative and technical directors Saschka Unseld and Max Planck to head up Oculus Story Studio. Lost, the first of five short films slated for 2015, debuted at Sundance Film Festival in January.
“Our focus is to conceive, write, present and distribute our own content,” says Woods of Framestore’s VR ambitions. The company is drawing on the interactive storytelling skills of computer-game writers, with Woods citing “a huge appetite among consumers wanting to lead their own story in a wider constructed story arc”.
Relationships and contracts with key actors will give studios a valuable card to play. “There are opportunities for studios to create new content around an existing intellectual property as a specially designed piece of VR rather than a bolt-on trailer,” says Roller. “Other content might include virtual face-to-face meetings with a favourite actor.”
Unsurprisingly, VR has caught the attention of high-profile producers and A-list talent. Mandalay Entertainment founder Peter Guber is an investor in NextVR while Maleficent director and VFX supervisor Robert Stromberg is developing a VR slate for start-up The Virtual Reality Company, which also boasts Steven Spielberg as a board member. This month, Google released live-action VR short Help, shot by Star Trek 3 director Justin Lin, and Ridley Scott is reportedly planning a VR complement to his 20th Century Fox-backed science-fiction adventure The Martian. “The onus,” says Plumer, “is on creatives to push VR further and make it compelling for studios to invest more capital.”
While filmmakers such as Scott, Stromberg and Lin are clearly intrigued by the possibilities of VR, mastering the storytelling side of the medium represents a substantial leap. “Being a master storyteller in one form doesn’t necessarily transfer to VR,” warns Carmen. “In fact, it may be an obstacle. You need to dedicate yourself to the medium and set aside everything you’ve learnt.”
Meeting the medium
Most exponents dislike the term ‘Cinematic VR’, arguing there is nothing conventional about a medium in which the rules of content creation and experience need rewriting. “None of the old rules apply, from editorial to staging,” says Lajeunesse. “A cut in VR is disorienting. To move the camera in VR is to move a viewer against their will. You have to justify every cut and movement and you have to recalibrate the drama to allow space for the viewers’ imagination.”
Even the production technology itself is still in an experimental stage. The camera rigs are prototypes, the picture-stitching software is primitive, the rendering of data from up to 16 cameras is slow and professional-grade monitors to view material in 360 degrees do not exist. Overall, production is playing catch-up to consumer technology. The new wave of headsets brings eye tracking and voice control, while Morpheus integrates the PS4’s Move ‘wands’ so viewers can interact with the environment. Live social interaction with ‘friends’ within the virtual world is another focus and a key component of Facebook’s interest in Oculus Rift, which it acquired last year for $2bn.
So different is the VR experience believed to be that the mixed reception the industry received from its gung-ho approach to stereo 3D is being brushed aside. “First-generation VR will be viewed in time like the first brick cell-phone,” Roller predicts. “Technology coming down the track will ultimately be like a pair of glasses. It is set to be the most powerful medium we’ve ever seen outside of live performance.”



Going Wild: experiencing the VR short for the Reese Witherspoon vehicle



According to Felix Lajeunesse, the three-minute VR short his company Felix & Paul made for Fox Searchlight’s Wild “blurs the boundaries” between conventional trailer and original VR experience. Viewers find themselves on a tree stump in a forest, watching and listening as Reese Witherspoon’s character Cheryl Strayed (on whose bestseller Wild is based) walks up and sits on a nearby rock without acknowledging their presence. The camera remains static but allows the viewer to look in any direction.
“The intention is to give the viewer the sensation of being present,” says Lajeunesse. If the viewer looks in one direction, they will see Laura Dern (who plays Strayed’s mother) and be able to eavesdrop on their conversation. This only happens, however, if the viewer responds to certain audio cues (e.g. Dern’s voice coming from their right) and turns accordingly to look, with Dern triggered to appear by head-tracking sensors in the Samsung Gear VR.
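Neither Felix & Paul nor Samsung have described the implementation, but conceptually the trigger is a simple check of the viewer's head orientation against the direction of the audio cue. A minimal Python sketch, with all angles and thresholds invented for illustration:

```python
# Hypothetical values for illustration only.
CUE_DIRECTION_DEG = 90.0     # the audio cue placed to the viewer's right
TRIGGER_WINDOW_DEG = 30.0    # how closely the viewer must face the cue

def angular_difference(a_deg: float, b_deg: float) -> float:
    """Smallest absolute difference between two headings, in degrees."""
    return abs((a_deg - b_deg + 180.0) % 360.0 - 180.0)

def should_trigger_scene(viewer_yaw_deg: float, already_triggered: bool) -> bool:
    """Fire the scripted moment once the viewer turns toward the audio cue."""
    if already_triggered:
        return False
    return angular_difference(viewer_yaw_deg, CUE_DIRECTION_DEG) <= TRIGGER_WINDOW_DEG

# A head-tracking loop would poll the headset's yaw each frame, e.g.:
print(should_trigger_scene(viewer_yaw_deg=75.0, already_triggered=False))   # True
print(should_trigger_scene(viewer_yaw_deg=-40.0, already_triggered=False))  # False
```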
Since everything in a live-action VR shoot is in shot, the crew hid 50 feet away behind a rock during filming, while the camera rig was digitally removed in post-production. To keep equipment to a minimum, only natural lighting was used. “This was the first time either actress had been filmed entirely alone on set,” says Lajeunesse. “The moment they shared contributes to the intimacy of the piece.”


Tuesday, 23 June 2015

UHD: the ultimate goal


Broadcast (BroadcastTech p47)

The Ultra High Definition (UHD) genie may be out of the bottle but discussions about it are far from over. While studios, broadcasters, tech companies and other stakeholders might agree on the ultimate goal of creating a set of UHD standards that will drive public take-up of the technology, there is much debate about what is needed to get there.
Let us recap: already agreed is UHD-1 Phase 1, which, for all practical purposes, is about a spatial resolution four times that of HD (3,840 x 2,160) and a frame rate of 50/60fps. Also agreed is UHD-2, which at 7,680 x 4,320 pixels offers 16 times HD resolution.

UHD-2 was used for the joint NHK/BBC public demonstrations of Super Hi-Vision during the London 2012 Olympics and is only being considered in Japan at this time, with broadcasts expected to take place by 2020.
UHD-1 Phase 2 is an attempt “to define an immersive viewing experience”, explains Matthew Goldman, senior vice-president of technology, TV compression, at Ericsson, and is earmarked for standardisation in 2017/18 – although this may slip.
Encouraging adoption
The reason for a second stab at UHD-1 is a realisation that resolution alone does not produce a big enough uplift in picture quality and other attributes are needed to encourage greater public adoption.
Some of the main elements of UHD-1 Phase 2 include a wide colour gamut (WCG) and high dynamic range (HDR), sampled from a bit depth of at least 10 bits. Since all existing digital TV systems are only 8 bits, they are not able to represent subtle shades of colour or details in the shadows simultaneously with details in bright areas of the image.
“This will have one of the biggest impacts on the viewing experience,” says Goldman. “Colours will look more real, with subtle shading possible without visible banding artefacts; and highlights – such as the sun reflecting off surfaces or night-time stadium lighting – will ‘pop’ with realism.”
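As a back-of-the-envelope illustration of why the extra bits matter, the sketch below compares how many code values an 8-bit, 10-bit and 12-bit signal have to span the same brightness range. It is a simplified, linear view that ignores the transfer function used in practice, so treat it as intuition for banding rather than an exact model.

```python
# Simplified illustration: evenly spaced code values across a display range.
def code_values(bit_depth: int) -> int:
    return 2 ** bit_depth

for bits in (8, 10, 12):
    levels = code_values(bits)
    print(f"{bits}-bit video: {levels} code values "
          f"({levels // code_values(8)}x the steps of 8-bit)")

# -> 8-bit video: 256 code values (1x the steps of 8-bit)
# -> 10-bit video: 1024 code values (4x the steps of 8-bit)
# -> 12-bit video: 4096 code values (16x the steps of 8-bit)
```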
Higher than 50p frame rates are also on the table, with sports broadcasters pushing for 100/120p to eradicate motion blur at high resolution.
Another key element is dynamic range. The standard dynamic range (SDR) in use today was defined in the 1950s based on cathode ray tube technology. SDR luminance is measured in candelas per square metre (also known as ‘nits’), and the current production standard equates to 100 nits.
This can only deliver a limited contrast between the whitest whites and the darkest areas of the picture, but a higher dynamic range (starting at 5,000 nits and ranging up to 20,000) goes way beyond the current TV production standard and will make the single biggest difference to viewing.
There are a number of proposals for implementing an HDR system – also known as Extended Image Dynamic Range – from Dolby, Philips, the BBC, NHK and Technicolor. Apart from Technicolor’s, each proposal has been submitted to the ITU-R.
Dolby’s approach, dubbed ‘perceptual quantisation’, is an absolute measure of dynamic range that closely matches human vision but requires complex computations. The BBC, by contrast, wants to adopt a relative measure of dynamic range using log scales, which has the benefit of being simpler to calculate.
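Dolby's perceptual quantiser is standardised as SMPTE ST 2084, and a minimal Python sketch of its EOTF (mapping a normalised code value to display luminance in nits) gives a feel for the 'absolute' approach. The constants are those published for PQ, but treat this as an illustrative reference rather than production code.

```python
# SMPTE ST 2084 'PQ' EOTF: non-linear signal value (0..1) -> luminance in nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float, peak_nits: float = 10000.0) -> float:
    """Convert a normalised PQ code value to absolute luminance (cd/m2)."""
    p = signal ** (1.0 / M2)
    return peak_nits * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

print(round(pq_eotf(1.0)))     # 10000 -- the full code range maps to 10,000 nits
print(round(pq_eotf(0.5), 1))  # ~92.4 -- in the ballpark of today's 100-nit SDR white
```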
Whichever route is selected, when HDR, WCG, 10-bit sampling and higher frame rates in UHD-1 Phase 2 are factored in, working with all the data in production, let alone delivering it to the home, becomes fraught with difficulty.
“It is beyond the capability of current professional and consumer media interfaces to handle that amount of data,” says Goldman.
HDMI 2.0, the latest consumer interface to transport signals between a set-top box and a TV, arrives this year, but it won’t be able to handle 2160p at 100fps, so a new version of HDMI must be developed.
With the agendas of the various stakeholders at odds, they must trade off what is desirable in UHD-1 Phase 2 with what can be realistically delivered by 2017/18.
Since the benefits of HDR are independent of spatial resolution, the debate has spawned interest in an enriched version of HD, known as ‘Enhanced HD’. “There’s a huge push to define a 1080p version that includes 100/120p, HDR, WCG and deeper sampled bit depth that will fit into HDMI 2.0,” says Goldman. “That type of signal will fit into existing SDI cables as well. It takes up much less bandwidth than UHD.”
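To put rough numbers on that bandwidth argument, the sketch below computes uncompressed data rates for a few of the configurations discussed. The arithmetic is simplified (it ignores blanking, audio and interface overheads), so the figures are indicative only.

```python
def raw_rate_gbps(width, height, fps, bit_depth, samples_per_pixel=2):
    """Approximate uncompressed video data rate in Gbit/s.

    samples_per_pixel: 2 for 4:2:2 (one luma plus one chroma sample on average),
    3 for 4:4:4. Blanking intervals and interface overheads are ignored.
    """
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

configs = {
    "HD 1080p50, 10-bit 4:2:2":             (1920, 1080, 50, 10, 2),
    "'Enhanced HD' 1080p100, 10-bit 4:2:2": (1920, 1080, 100, 10, 2),
    "UHD-1 2160p50, 10-bit 4:2:2":          (3840, 2160, 50, 10, 2),
    "UHD-1 2160p100, 10-bit 4:2:2":         (3840, 2160, 100, 10, 2),
}
for name, params in configs.items():
    print(f"{name}: ~{raw_rate_gbps(*params):.1f} Gbit/s uncompressed")
# 1080p100 comes out around 4 Gbit/s versus roughly 16-17 Gbit/s for 2160p100,
# which is why the higher-rate UHD signals strain current interfaces.
```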
Bandwidth matters, especially to cable and telco service providers, so for them it may become a question of what delivers the best ‘bang per buck’ in terms of changing the viewing experience. “Perhaps Enhanced HD offers a more viable, compelling solution for service providers than UHD,” says Goldman. “The argument is not whether to include HDR but what form it should take.”
The difference is not insurmountable, given that several companies, Ericsson included, are represented in two or more of the standards organisations.
UHD-1 Phase 2 will also have a considerable impact on production and post. “Part of the process of deciding on commercial requirements for UHD-1 Phase 2 is to consider the practicability of programme production that could be used for the format,” says David Wood, commercial module chairman, DVB-UHDTV, and SMPTE fellow. “The degree of compatibility between Phase 1 and Phase 2 is one of the most important issues in its development. Current thinking is that both backwards-compatible and non-backwards-compatible versions of Phase 2 should be investigated, so a decision on which to use could be taken on the basis of factors such as the extra bandwidth the compatible version needs.”
Real-time data handling
Even if display technologies are capable of the frame rate, wider colour and HDR of UHD-1 Phase 2, colour-grading systems need to be able to handle the data in real-time.
“On face value, the image for Phase 2 is still 3840 x 2160, but the data differential between that and 10 bit UHD is 16 times,” warns SMPTE EMEA governor and Sundog Media Toolkit co-founder Richard Welsh. “If a facility thinks UHD/4K looks like a lot of data now, Phase 2 is an eye-watering prospect.”
Another challenge will be the number of deliverables. One suggestion is to automate the process by using one master format to derive all the deliverables downstream.
“This is not a simple case of throwing data away – just reducing bit depth from 16 to 12 to 10 requires careful management to avoid ugly artefacts,” says Welsh. “Likewise, changing the frame rate and colour space will present challenges. Will creatives want to see, and tweak, all these configurations? And will productions be able to afford it? That remains to be seen.”
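As a hedged illustration of the kind of care Welsh describes, the sketch below (assuming numpy; not any facility's actual pipeline) requantises a 16-bit master to 10 bits, either by simple truncation or with a little triangular dither added first to trade banding for fine, far less objectionable grain.

```python
import numpy as np

def requantise(frame_16bit: np.ndarray, target_bits: int = 10,
               dither: bool = True) -> np.ndarray:
    """Reduce a 16-bit image to a lower bit depth.

    With dither=False the values are simply rounded down in precision, which
    can leave visible banding in smooth gradients; adding a little noise before
    rounding (dither=True) breaks the banding up into fine grain.
    """
    shift = 16 - target_bits                 # e.g. 6 bits when going 16 -> 10
    scale = 2 ** shift
    data = frame_16bit.astype(np.float64)
    if dither:
        # Triangular (TPDF) dither of +/- one output quantisation step.
        noise = (np.random.random(data.shape) - np.random.random(data.shape)) * scale
        data = data + noise
    out = np.clip(np.round(data / scale), 0, 2 ** target_bits - 1)
    return out.astype(np.uint16)

# Example: a smooth 16-bit ramp, requantised to 10-bit with and without dither.
ramp = np.tile(np.linspace(0, 65535, 1920, dtype=np.uint16), (4, 1))
print(requantise(ramp, dither=False)[0, :5])
print(requantise(ramp, dither=True)[0, :5])
```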

Who’s setting the standard?
What are the differences between the SMPTE, ITU and DVB’s approach to UHD? SMPTE’s focus is broadly on production and archiving standards, the DVB’s interest is in content delivery to the home, while the ITU acts as a bridge or programme exchange.
David Wood, who advises the DVB and SMPTE, says: “SMPTE and ITU formats for UHD-1 programme production and exchange are aligned. The DVB broadcasting profiles will draw on the specifications, but not all of the features in the ITU/SMPTE specifications may be included. For example, UHD-1 programme production might be done at 12 bit, while delivery might be done at 10 bit. Meanwhile, the MPEG committee is exploring whether any changes should be made to the HEVC codec in order to code HDR more efficiently.”
The DVB hopes its specification for UHD-1 Phase 2 will be finalised in the second quarter of 2016.
Other bodies are keen to contribute to the formation of UHD standards. Unveiled at CES (pictured) in January this year, the UHD Alliance is mostly driven by film studios and display manufacturers, and is looking to agree a set of parameters for UHD viewing quality, and to stamp a ‘kite mark’ onto certified UHD TVs. Its members include Panasonic, LG, Netflix and The Walt Disney Studios.
The Ultra HD Forum is a US-based organisation set up in 2014 by Harmonic to cover all aspects of UHD production and delivery, including live, OTT and VoD. It is not to be confused with the UHD Forum, another group looking at the complete UHD ecosystem. The latter was launched by the UK’s Digital TV Group (DTG) in 2013 and is chaired by the BBC and BSkyB. The DTG is working with both US organisations, as well as the Forum for Advanced Media in Europe (Fame).
The UK’s Digital Production Partnership (DPP) has also committed to drawing up a definition of a UK delivery standard for Ultra HD programmes by the end of the year.

Q&A...David Wood on SMPTE
What will the UHD-1 Phase 2 specifications include?
The name ‘UHD-1 Phase 2’ is used by the DVB Project for a collection of features that will be included in a future transmission format. It is preparing the elements it believes Phase 2 will need for commercial success. 2160p spatial resolution is included and it is likely to include the BT2020 definition for wider colour gamut. HFR and HDR are under discussion. Advanced audio may also be a part of Phase 2.
What are the delivery requirements?
The working assumption is that HEVC compression will be used for Phase 2. Different combinations of features will affect the compressed bit rate, and different compressed bit rates may be more or less commercially viable for different delivery platforms.
When will UHD-1 Phase 2 be adopted?
It should be available in 2016, to allow services in 2017/18, but it is still under discussion.