Friday, 29 July 2016

Expand the cloud's horizon


Broadcast

The cloud is now capable of handling many of the processes involved in TV production, but can it be extended to include content management as well?


The basic advantages of cloud networks also apply to content management: fewer operational issues, better use of resources, reduced capital expenditure, increased mobility and collaboration, and the opportunity to develop new business models.

Participating: Hugo Bastos, project management office director at VSN; Julian Fernandez-Campon, business solution director, Tedial; Andrea Winn, sales and commercial director, TVT; and Paul Wilkins, CMO, TMD.

More and more aspects of production and delivery are being moved to the cloud. Why should content management follow suit?

Paul Wilkins Precisely because so many other processes are moving to the cloud, content management should start migrating too. If a producer is delivering programmes to a broadcaster via a cloud service, it makes sense for transcoding, quality control, proxy creation and other asset management functionality to be performed in the cloud as the content passes. If the metadata created and used for cloud processing also stays in the cloud, it is readily available to all users at all times.
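A minimal, vendor-neutral sketch in Python of the pipeline Wilkins describes: content arriving in cloud storage triggers QC, proxy creation and transcoding, and the resulting metadata stays in a cloud-side catalogue. Every name here is hypothetical rather than any particular cloud provider's API.

    # Hypothetical sketch: a delivery event drives the asset-management steps
    # and the metadata never leaves the cloud-side catalogue.
    CATALOGUE = {}  # stands in for a cloud metadata store

    def process_delivery(asset_id: str, source_uri: str) -> dict:
        record = {"source": source_uri}
        record["qc"] = {"passed": True, "checks": ["loudness", "black frames"]}
        record["proxy"] = source_uri.replace(".mxf", "_proxy.mp4")
        record["mezzanine"] = source_uri.replace(".mxf", "_mezz.mxf")
        CATALOGUE[asset_id] = record  # available to all users, at all times
        return record

    print(process_delivery("EP-101", "s3://incoming/ep101.mxf"))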
Julian Fernandez-Campon Cloud can provide media companies and content owners with further management and storage options. These range from storage services with additional backup in a separate location to full content management providing complete search capability and the ability for content to be managed by third-party service providers. In addition, broadcasters can control investment and access on-demand services that can adapt dynamically to meet their business needs.
Hugo Bastos More and more clients and projects require access to a content management solution from multiple locations, and that should be reason enough to consider a cloud or hybrid deployment. Users can access and submit content from virtually anywhere, create new business models, such as media exchange platforms and media marketplaces, as well as adapt to different workflows and workloads due to the scalability of cloud infrastructure.
Andrea Winn The key is operational flexibility, which is not possible when limited to on-site infrastructure. For example, we can create proxies of original content and make it available to our in-house compliance teams and client promo teams. At the same time, we can create multiple language files and extract EDL data for client use. From ingest to playout, the entire workflow is visible to all registered parties – something that can only be accomplished in the cloud.

Are there any downsides? 

JF-C Broadcasters should carefully analyse and adapt their operation to minimise the cost of the cloud service. In some cases, some local operation is required, which makes a ‘pure’ cloud approach inefficient and means moving to a hybrid approach instead. Some broadcasters cannot legally store their content outside their own country, which might limit the choice of cloud service provider.
AW Craft editing is a big factor. You’ve got to assess how much you need to do and whether you’re better off adopting a more traditional storage solution. Cloud storage can be very expensive and the movement of high-resolution files between regions is not necessarily cost-effective all the time. Beware the costing models of some cloud providers, which can be impenetrable. Broadly, deep archive in the cloud is cheap, but moving content out or between clouds may not be.
HB For several specific workflows and customers, an on-premises solution is still the best. The two main factors that might prevent cloud adoption are internet connectivity and the price of cloud storage. Connectivity, unfortunately, has no ready workaround: the client or project simply has to wait until the necessary infrastructure is in place. To mitigate the cost of cloud storage, we mostly recommend hybrid solutions where high-resolution content is kept at the customer's facilities, or a local data centre, with low-resolution proxies in the cloud for workflow processes.
PW The only real concern is the challenge of moving big files to and from the cloud. Connectivity needs to catch up. In particular, costs for downloads can quickly mount. It’s important to have a robust means of handling proxies so that downloads of full-resolution media are minimised. Some users may also have security concerns but, in truth, a business like Amazon will have the best security teams available. A single significant loss of privacy and a cloud firm would be out of business. It is rumoured that the [US intelligence agency] CIA uses Amazon Web Services, so it is probably good enough for us.

With multiple delivery channels and devices, programme rights are increasingly complicated. What is the media management solution?
PW Intellectual property rights are hugely complicated and challenging, but ultimately they can be expressed as metadata. Some of that metadata will only ever be seen by the lawyers and IP specialists, but some has to be fully embedded in the asset management platform because it will be vital to ensure reliable operation of automated workflows.
JF-C Media management solutions have to be rights-management aware, with efficient integration with the scheduling/rights management systems. But this is not sufficient. MAM solutions also need to evolve the concept of content rights. This means assigning rights not only to a piece of content and the dates on which that content can be aired, but also to the destination, such as a premium channel. This concept will allow broadcasters to manage delivery operations, such as components, transformations and packaging, combined with availability for each destination.
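A toy illustration in Python of the destination-aware rights model Fernandez-Campon outlines: a right attaches to a piece of content, a date window and a destination. The field names are hypothetical.

    from datetime import date

    RIGHTS = [
        {"asset": "MOVIE-42", "dest": "premium-channel",
         "start": date(2016, 8, 1), "end": date(2016, 12, 31)},
    ]

    def can_air(asset: str, dest: str, when: date) -> bool:
        return any(r["asset"] == asset and r["dest"] == dest
                   and r["start"] <= when <= r["end"] for r in RIGHTS)

    print(can_air("MOVIE-42", "premium-channel", date(2016, 9, 1)))  # True
    print(can_air("MOVIE-42", "svod-service", date(2016, 9, 1)))     # False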

How can MAM help content owners use metadata to maximise revenue or earn additional money?
JF-C Distributed access enables anyone within the organisation to reach content from any location. This high level of flexibility means that it’s easier to evolve and amend media services. Time to market is also reduced if the cloud infrastructure is already available. To meet the demands of multi-screen distribution, broadcasters require solutions that enable fast and secure access over IP, providing automated workflows that package and present content, which can then be delivered to the cloud, as well as to other sites. This should remove the unnecessary complexity caused by working across so many desktops and departments on a local area network.
HB For a MAM implemented to work as a marketplace, metadata has the obvious role of allowing a client to search for and retrieve the most relevant content as quickly as possible. In a different implementation, metadata can be used to trigger automated, as well as manual, processes or events as soon as a specific piece of content is detected. Certain metadata can trigger the system, for instance, to send an email to a specific person, or to automatically transcode content and publish it to specified social media platforms. Metadata allied to automation opens up an almost unlimited number of options.
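Bastos' metadata-triggered automation can be reduced to a simple rule table. The sketch below is illustrative only; the rules and actions are invented for the example.

    def notify(asset):
        print(f"email sent about {asset['id']}")

    def publish_social(asset):
        print(f"{asset['id']} transcoded and posted to social media")

    RULES = [  # (condition on metadata, automated action)
        (lambda a: a.get("category") == "breaking-news", notify),
        (lambda a: "social" in a.get("targets", []), publish_social),
    ]

    def on_new_asset(asset: dict):
        for condition, action in RULES:
            if condition(asset):
                action(asset)

    on_new_asset({"id": "CLIP-7", "category": "breaking-news",
                  "targets": ["social"]})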
AW The more metadata you have, the more opportunities it creates. One example: the value of footage in support of a breaking news story depends on being able to locate the clip and bring it into the workflow quickly. The converse is sometimes true: you might need to pull something from the schedule that is suddenly sensitive.

What is the best way of assessing return on investment?
HB In a word: time. The time saved by everyone using a centralised MAM, and by the simplified process of ingesting and retrieving content, is probably the best measure of the ROI of any MAM implementation.
JF-C There are several ways to measure ROI: compare current operational costs with the time and resources saved; measure business growth as a result of process optimisation; check, in real time, the bottlenecks for a specific production. In all cases, MAM solutions need reporting mechanisms that allow broadcasters to gather all this information, process it and take action.
AW It’s either got to save money or generate money – a good solution should be doing both. There will be, for example, knock-on cost savings in terms of the ongoing operational cost of the technology infrastructure.

PW A MAM should not only be controlling most of the workflows for management and delivery, it should be tracking equipment and staff utilisation, telling the resource planning department precisely how much each process costs. Only then can an enterprise make realistic judgements about what services to offer, how to monetise them and how to achieve strong and reliable revenue streams.
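Reduced to arithmetic, the panel's ROI framing compares what a MAM saves and earns against what it costs. A minimal sketch, with purely illustrative figures:

    def mam_roi(time_saved_value: float, new_revenue: float,
                total_cost: float) -> float:
        """Return on investment as a fraction of total cost."""
        return (time_saved_value + new_revenue - total_cost) / total_cost

    # e.g. £120k of staff time saved plus £80k of new revenue on a £150k MAM
    print(f"{mam_roi(120_000, 80_000, 150_000):.0%}")  # 33%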

Wednesday, 20 July 2016

Does the screen matter for live sports?

TVB Europe

As live sport pours online, TVB Europe asks whether OTT streams are as good as – or better than – the TV experience delivered by cable and satellite.


Sports rights are now a target for social media players, just as extending the reach of sports properties onto OTT platforms has become a strategy for sports franchises and broadcasters.

Broadly, traditional players need to expand into social and online to reach millennials, who are deserting studio-bound linear presentations for interactive, informal and mostly free TV-anywhere experiences.

Social giants, on the other hand, are increasingly moving into premium sports because the audience and profile of these businesses rely on scale, suggests Richard Broughton, research director, Ampere Analysis. “Sport is a mass-market form of entertainment, capable of attracting large audiences with a high value to sponsors and advertisers.”

Examples of the former include BT's live coverage of the UEFA Champions League on YouTube; Discovery's deal with Snapchat to create a dedicated mobile channel around the Olympics (with content supplied by BuzzFeed); Sky's investment in online sports network Whistle Sports; and the use by broadcasters, including Sky and the BBC, of Facebook Live to augment linear sports programming. In the latter camp is Twitter's live stream of 10 NFL matches (which also meets the NFL's need to look beyond traditional networks).

All sides are dovetailing on internet delivery, which raises the question of whether the quality of service between linear (satellite, cable) TV and broadband is indistinguishable.

“It can be equivalent – in all honesty I think OTT can be better in some cases – for example, OTT can do 4K now,” says Shawn Carnahan, CTO, Telestream. “The big question is what device is the viewer using, and is their prime concern quality or convenience?”

He adds, though, that OTT may not ever be able to achieve the very low latency of broadcast from a technical perspective. “In broadcast, each viewer has the exact same bandwidth and very low latency. I'm not sure OTT could ever achieve that same level.”

Telestream has introduced Lightspeed Live Stream to bring broadcast and OTT together in the live space. The solution is designed to provide high-quality encoding and to control the amount of bandwidth available between production and distribution, plus the amount available between distribution and the end user.

Neulion's EVP & Co-Founder Chris Wagner is in no doubt. “Online is better. Satellite and cable deliver at 30fps. We deliver at 4K at 60fps [Neulion streamed El Clásico’ Barcelona Real Madrid match live in 4K 60fps over the NeuLion Digital Platform to Sony 4K TVs in April].”

Neulion was signed by Eleven Sports Network to stream live (and VOD) coverage of La Liga, UEFA Euros, Formula 1 and the FA Cup Final to subscribers in Belgium, Luxembourg, Poland, Singapore and Taiwan. “Digital delivery of live video is better and looks better than cable,” says Wagner. “Satellite and cable platforms are being replaced by digitally delivered video.” 

Ian Munford, Director of Product Marketing, Media Solutions, Akamai, says the industry has reached the point where good HD quality live streams can be delivered with reliability. “We have the luxury in the UK of a great HD TV service. We can easily surpass that.” 

For Super Bowl 50, Akamai saw a “dramatic increase in every single viewing metric,” says Munford. In 2015 it counted 2.5 million viewers concurrently streaming the event live; this year it peaked at 4 million. “The average viewing time increased from 84 to 101 minutes and we saw a big jump in bit rate, from a 3.5Mbps average to a 4.5Mbps average. This tells us that there's a shift toward watching major sports using an OTT service, and away from snacking toward long-form viewing.”

There were reports, though, that online viewers of SB50 did not receive a buffer-free experience (not necessarily due to Akamai's involvement). The transition to HTTP-based streaming may have enabled OTT delivery, but it inherently introduces latency.

A study by network performance analytics firm IneoQuest (conducted before SB50) found that sports buffering enrages viewers, with two out of five consumers willing to wait only 10 seconds or less for the video to resume before they leave the stream.

“When you're using HTTP streaming technology there can be a challenge from the camera through to the playing device,” says Munford. “Some things are not in the rights owner's control.”

One issue is the shift in bit rate, where the live stream pixelates or blurs on account of ABR. “Dramatic shifts do impact the viewer experience,” says Munford. “We're seeing quite high abandonment rates as a result.”
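The bitrate shifts Munford describes come from adaptive bitrate (ABR) logic in the player: it picks the highest rung of a bitrate ladder that fits the measured throughput, so a bandwidth dip forces a visible quality drop. A bare-bones sketch in Python, with an illustrative ladder:

    LADDER_KBPS = [800, 1800, 3500, 4500, 8000]  # example renditions

    def pick_rendition(throughput_kbps: float, safety: float = 0.8) -> int:
        """Highest bitrate fitting within a safety margin of throughput."""
        usable = throughput_kbps * safety
        fitting = [r for r in LADDER_KBPS if r <= usable]
        return max(fitting) if fitting else LADDER_KBPS[0]

    print(pick_rendition(6000))  # 4500 - comfortable HD
    print(pick_rendition(2000))  # 800 - the visible drop viewers abandon over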

The content delivery network (CDN) has a number of Media Services Live delivery technologies designed to reduce latency. Its accelerated ingest capabilities minimise the time live video streams take to reach the CDN from their origination point. It uses HTTP/UDP to prevent packet loss, to speed the transit of content and to make it easier to handle unpredictable peaks. It will also use multicasting and peer-assisted delivery via WebRTC.

It has also opened a Broadcast Operations Control Center at its Cambridge, Mass., headquarters to monitor the reliability of OTT streams around major events like the Rio Olympics. “During London 2012, online traffic peaked at about one terabit per second (1Tbps),” reveals Munford. “We expect peaks globally of between 15-18 Tbps during Rio, setting new global records for online streaming traffic.”

Akamai forecasts that 500 million viewers will soon be watching prime-time live sports online. “With 500 million online viewers, we need 1500 Tbps. Today we do 32 Tbps [at peak], so you can see the huge gap we have to bridge,” says Munford.
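Munford's gap is straightforward arithmetic: 500 million concurrent viewers at roughly 3Mbps per stream (an assumption in line with the averages cited above) implies about 1,500 Tbps of aggregate capacity, against roughly 32 Tbps today.

    viewers = 500e6                 # forecast concurrent online audience
    avg_stream_bps = 3e6            # ~3Mbps per viewer, an assumption
    aggregate_tbps = viewers * avg_stream_bps / 1e12
    print(aggregate_tbps)           # 1500.0 Tbps, vs ~32 Tbps at peak today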

“Any organization looking to deliver high quality scaled events needs to plan how to deal with very large peaks of audience,” he says. “It's a bit like a power surge. Peaks can be unpredictable.”

Carnahan points out that no one would ever know about latency unless there is a 'back channel' such as Twitter also providing information about a live sporting event. “Tweets from my friends may be talking about something I haven't seen yet, due to latency on my OTT feed to my device. However, it can be as low as under 30 seconds – possibly under 15 seconds.”

Wagner counters: “You can't compare video streaming to texting. It's like watching a match live but listening to the radio commentary. Video quality and latency go hand in glove. If you want no latency then you'll get video quality at 800kbps.”

With 4K in particular the caveat is the last mile to the home. “When we deliver 4K we are reliant on end user bandwidth,” says Munford. “We are seeing an average 13Mbps [for 4K] and for that we need good fibre in the home. The technology is there, we can surpass [this speed] and we're confident we will continue to push those boundaries.”

Verizon claims to have reduced latency on live 4K online delivery to four milliseconds, for delivery of content including San Francisco 49ers pre-game coverage over the UltraFlix network. The industry norm, though, for streaming HD content is still only 720p.

Beyond the TV experience

Viewers will accept buffering and some pixelation due to ABR as the trade-off for interactivity and anywhere viewing. “It's a bit like the transition of music from CDs to streams – the quality is down, but the fact that I can listen to it wherever I am is a bonus,” says Carlo De Marchis, chief product and marketing officer at online video sports specialist Deltatre.

Beyond a simple simulcast of the live video is the opportunity with OTT to create what De Marchis calls the “beyond TV experience”.

“In DIVA [Deltatre's online platform] you can view synchronised multi-angle feeds from up to 12 cameras, available for review after a few seconds. We have timeline markers for pausing the live stream and playback of key incidents. There is social media interaction. There will be greater levels of audio choice and, in future, we will take the clean feed with no graphics and send it to a device where the user will define what graphics make sense to them.”

For live events one of the immediate opportunities is to stream additional content (such as alternate camera angles, secondary audio, etc) in addition to the broadcast feed. “Eventually, there will be an opportunity to stream additional content that is intended for a VR environment,” says Carnahan, adding that Telestream is investigating this. “An interesting case is to imagine a crowd-sourced production where a central location could be getting feeds from mobile users. Multi-camera production sourced from the crowd - perfect for sports.”

Armed with its 10-match (non-exclusive) Thursday night NFL deal, Twitter is trying to turn a second-screen experience into a first-screen experience, suggests Carnahan. “Instead of watching TV and tweeting about it, it will all be on Twitter. It remains to be seen how many people will turn to Twitter to ‘watch TV’. It’s an experiment. The issue is the trade-off of image quality for an enhanced user experience. Twitter is betting that the enhanced social experience of watching NFL football on their platform will, for some, outweigh the benefits of a traditional TV viewing experience. Twitter is not aiming to be just a second screen; they are changing the viewing experience. For some this may be worth it. Time will tell.”

Thursday, 14 July 2016

Live VR/360° Video Gets Social


Streaming Media Global

NextVR, LiveLikeVR, Greenfish Labs, and others are pushing to make live sports VR/360° viewing more social with avatars, spatial audio, and more.


Virtual reality and 360° video may provide the "best seat in the house" but it will be a failure for live sports if the experience of almost being there can't be shared. While multiple live streaming VR experiments are taking place, from the UEFA Euros to the Rio Olympics, developers are frantically trying to solve the issue of connecting the experience socially among fans. NextVR, arguably the leading producer of live VR, is on the verge of announcing a partnership with a social media platform.
"There is a prevailing view that VR is isolating but sports viewing is often shared," says Dave Cole, co-founder, NextVR. "We have a partnership with a social communications platform which will be our first foray into bringing communication with peers and friends into the virtual space."
The product is in beta test and primed to be announced by early September, ahead of a "marquee event late summer". It is likely that the initial form of social interaction will be via avatar.
"We plan to integrate APIs from gaming platforms like the PS4 into the the NextVR platform," says Cole. "It makes more sense for users to create one avatar and have the ability to port that to our platform, or other virtual spaces, than to have start from scratch each time. One idea is for users to invite friends inside a virtual lobby where they can assemble and share commentary on the experience."
Connecting friends and live experiences through VR is the long-term goal of Facebook founder Mark Zuckerberg. Facebook's Social VR team has demonstrated how "animated mannequins" of users/friends might look when augmented with a live view using Oculus Rift.
Other live streaming sports VR developers are working hard to offer at least the first stage of real-time social interaction.
LiveLikeVR, which has been paid to test its VR production pipeline by ATP Media and Sky, among others, is to launch a live sharing capability later this year. It too will use avatars, representing a fellow LiveLikeVR user sitting adjacent to another in LiveLike's virtual lounge, watching a game.
"When you put on a headset and launch our app, then immediately around you is the VIP suite is CG," says Andre Lorenceau, founder and CEO of LiveLikeVR. "On the right or left of the couch is an avatar of a person—which in the early stages of this development will be a representation of a person. It will be just head and shoulders, and eventually hands, but it's not intended to be hyper-real. What really makes it feel as if they are there with you is localised audio. If you're watching a game and your friends are talking to you [via head mounted device MD mic] it will sound as if they are a couple feet away on your right. If you turn you will see them and they will sound like they are in front of you. The sound of someone talking next to you changes according to where you are looking and this makes an extremely powerful feeling of being present with someone."
One issue is to synchronise the live video stream of the game with the audio component of "friends" speaking. "We are fixing these problems," he says.
Greenfish Labs, which has live streamed the PIAA track and field event, college football, and Pennsylvania ice hockey team the Hershey Bears in 360° video, is working on ways to incorporate picture-in-picture within the virtual view, plus live chat and realistic audio.
"We have a sound team working on developing our own plugin and software to make the audio sound realistic in the VR environment," says Greenfish CEO and founder Ben Duffey.  "We have a 8-mic audio set up, which records spatial audio at a live event and software which translates that to different areas of the video, so that as your head moves around the video the audio will relocate accordingly. You can't just have stereo sound—when you turn your head you need to be able to hear those sounds accurately."
Gaming platform VREAL is in beta with an attempt to enable any number of viewers to experience esports in virtual reality. "The future of gaming is VR, and the future of VR is social," says CEO Todd Hooper. "The core of our technology is to re-render games in real time on each specific machine. That enables a viewer to feel that they are inside the game and allows streamers to interact with their viewers. Streamers will be able to 'pass the mic,' interact with physical gestures, or even hand off digital items from in game to the viewers."
Hooper adds that this level of interaction is technically not possible with live action video. 

NextVR Plans Expansion

Separately, NextVR has revealed its "domestic and international expansion" plans. This includes "talking with the largest live production companies on the planet" about adopting NextVR technology, among them outside broadcast suppliers NEP and Game Creek Video.
"The aim is to provide expertise our VR production vehicle as a reference platform for producers rather than partnering them on building trucks," says Cole. "We can showcase how live VR streams can be achieved."
The NextVR mobile facility captures 4K video from 8 cameras per rig, with capacity for ten rigs. Per rig, that's 24,000 pixels horizontally and 6,000 pixels vertically at 60Hz. This totals 6 terabytes a second of raw data.
"A typical Netflix [HD] stream is 8Mbps, and we can deliver full 360 broadcast quality stereo video at less than that," claims Cole.
NextVR is busy with a five-year pact to test out VR with Fox Sports and also signed a deal to live stream VR music events for Live Nation. Cole says monetization is already here.
"Sponsor lift is already happening," he says. Lexus sponsored Fox Sports VR streams from the U.S. Open, for example. "We will test subscription and pay-per-view models this year. You will see a pay-per-view product [from a broadcaster] launching soon."
While NextVR's technical edge is in the compression technology first devised for broadcasting live stereoscopic 3D, its business model is in content, specifically in attracting as wide a user base as possible to its NextVR portal.
"We don't syndicate to other networks for both business and technical reasons," Cole explains. "We are building a platform with partners for consumers to come to and watch VR. We are highly incentivised to maintain content on our platform since that is how the company will be valued."

Sky Confirms August UHD Launch

Streaming Media Global

UHD service will go live on Sky Q Silver box with movies, drama, and Premier League, and could find traction where BT Sport has lagged.
As expected, pan-European pay TV broadcaster Sky has confirmed the launch of a new Ultra HD service for UK subscribers beginning August 13.
The service will go live on its Sky Q Silver receiver which was unveiled in February.
According to analyst Paolo Pescatore, director, multiplay and media at CCS Insight, the move will kickstart consumer appetite for 4K in the UK. "Though BT was the first provider to launch 4K in the UK, it has failed to see any meaningful uptake to its BT TV service," he says. "Despite a huge investment in 4K, BT has failed to build upon its early mover advantage."
Mobile operator EE, owned by BT, announced this week that BT Sport would be available free to its mobile customers for six months in a bid to increase take-up.
The Sky UHD programme line-up includes 70 movies, among them Spectre, plus drama and documentaries including the David Attenborough-fronted natural histories such as Galapagos, which was produced by Atlantic Productions for Sky's 3D channel but shot in 4K or higher.
More significantly, a series of live sports including 124 English Premier League football matches will be shot and aired in the format. From 2017, Formula 1 motor racing will also be available for UHD badging.
It is not yet clear what Sky classifies as UHD, i.e. whether it will demand a high dynamic range finish on drama or in the live sports coverage.
Sky is among broadcasters known to be interested in introducing Dolby Atmos, an object-based audio format, to the home. This technology was tested by Telegenic, a Sky outside broadcast supplier, for UEFA at the UEFA Euros last month.
Sky is also testing virtual reality, particularly around sports, and it would be logical to see it stream live VR from F1 circuits next year in tandem with the sport's new sponsor Heineken, which has stated its intent to shake up the sport's broadcast with VR.
BT Sport has a year's lead on its main rival and plans to up its quota of live sports for the coming season to include all EPL matches plus games from the UEFA Champions League. Virgin Media is also believed to be planning to introduce Ultra HD channels this year.
Sky's new product range is corralled into a TV Everywhere ecosystem. Sky Q, Sky Q Silver, Sky Q Mini, a Sky Q touch remote, Sky Q Hub and a Sky Q app all connect together under the term Fluid Viewing.
Set-up costs for Sky Q start at £99, and for those new to Sky the monthly cost starts at £42. The UHD programming will not cost users any further premium.
"Though the concept of fluid viewing has failed to resonate with consumers, premium content in 4K such as Premier League will drive awareness [and] appetite, and enhance their viewing experience," says Pescatore. "Sky's close relationship with content and rights owners still puts it in a far stronger position than its competitors. It comes when all providers will start to heavily promote their respective multi-play bundles ahead of the new Premier League football season. It's going to be a very busy second half of the year with Sky's entry into mobile, Vodafone's debut in TV and Virgin Media's new set-top box. Sky's latest move throws down the gauntlet to others, so let battle commence."

Hack the Future of Media

IBC

http://www.ibc.org/hot-news/hack-the-future-of-media

The world’s media is in a state of rapid flux and the rules are constantly being rewritten. With internet players coming to TV and TV moving to the web, defending the status quo is no longer an option. While traditional broadcast and pay TV models convulse with unprecedented disruption, the best tactic may be to break convention and innovate a way to prepare for the future.

IBC meets this industry challenge head on with the return of the IBC Hackfest. Following the success of its debut last year, the event will double in size for IBC2016, with exciting new partners and prizes taking it to the next level. Sponsors can set challenges for the best in international development talent at this high-octane event, essentially outsourcing their R&D and encouraging outsiders to look at their brands in an un-blinkered light.

This year, the IBC Hackfest invites 100 designers, developers and entrepreneurs to engage with the theme 'Re-imagining the Future of Cities Through Education, Entertainment and Sports'.

By tapping into proprietary APIs and SDKs made available by technology companies uniquely for this event, as well as scores of publicly available APIs, developers will brainstorm ways in which we might live and work together in the smart, net-connected urban environments of tomorrow.

They will draw on Open Data – the idea that information should be freely available without restriction from copyright, patents or other mechanisms of control – and work with established and emerging social media networks, artificial intelligence concepts and Internet of Things (IoT) technologies to unlock commercially oriented solutions that others have found hard to crack.

The IBC2016 event is augmented by AngelHack – the world's largest hackathon organiser – which will promote the Hackfest to its community of more than 98,000 developers worldwide. Participating sponsors will have the unrivalled opportunity to present a brief to teams of coders, user experience designers, hardware hackers and data scientists to achieve a specific software development goal.

The inaugural IBC Hackfest in 2015 produced some incredible results, including Emoment, a means of capturing moments through emotion; Old News, a second-screen engagement platform for news programmes; and Tapball, a sports gamification in which fans play along with friends by tapping their smart screen during a live-streamed match to 'bet' on goals or other events. Partners included Twitter, Amazon Web Services, Streamzilla and Monterosa.

Held in Amsterdam over 36 hours on 10-11 September in the Diamond Lounge, the IBC Hackfest is a fun and creative space where hackers join with over 55,000 attendees at the heart of IBC2016. 

For innovators looking to win high value prizes and to showcase their ideas to a hugely influential jury of broadcast industry professionals, participation at IBC Hackfest is empowering. Use your bright ideas to inspire and build something new, and sponsors could invest in you and take your IP to market.

For industry companies trying to build a developer community and access uncharted expertise, the IBC Hackfest is a necessity. Don't get stuck in a corner. Think your way out of the problem at IBC, the nucleus for creative technology invention. 

Wednesday, 13 July 2016

TV Journeys to Virtual Reality

IBC

It's still very early days for Virtual Reality (VR) and there have been predictions for its disruptive impact on everything from filmed entertainment to journalism. Although it is difficult to predict the impact, it is reasonable to expect that VR will not repeat the failure of stereo 3D.
According to Ampere analyst Andrew White, VR does not compete with standard video in the same way that 3D did since there is no way to convert 360-video to 2D while retaining the original context. VR, he suggests, should be seen as an entirely new medium, running in parallel or as a companion to TV and movies, rather than as an evolution of them.
JPMorgan Securities forecasts VR to be a $13.5bn industry by 2020, mainly comprising hardware sales topping 89.3m units. Growing sales of consumer gear are predicated on content to watch, but here too production seems to be moving at an astonishing pace.
While video games remain the big initial content draw for consumer VR, likely to be given a boost when Sony debuts PlayStation VR in October, movie studios and filmmakers are extending their ambition from short marketing promos to longer-form stories. The first feature-length VR action movie is planned for release in 2017 by cinema motion-seat developer D-Box and The Virtual Reality Company, a producer which counts Steven Spielberg as a shareholder.
To mention just two of many significant investments in this space, VR display maker HTC recently earmarked $100m for content, and Disney invested $65m in 360-video camera maker and producer Jaunt VR.
Facebook arguably kicked off the current surge with its $2bn purchase of Oculus in 2014. Since then, Facebook and Google have developed an ecosystem for VR from capture to distribution. Google will next launch Daydream, its VR platform for Android, complete with Daydream-ready phones and motion controllers, while Facebook has its own 17-camera rig design in the works. Google is also making a cinema-style VR camera with IMAX – which is launching a number of physical VR cinemas this year.
Broadcasters have spent the year road-testing VR on everything from documentaries to talk shows with most of the development focussed on live events.
“Currently, there is a joint industry initiative to make the technology work and drive uptake by enticing customers to the platform with free content,” says Futuresource Consulting analyst Carl Hibbert. “As soon as consumer payment becomes a core component, rights will become a major issue – whether that’s sports, concerts or other types of event.”
Sponsors are already investing. Automotive brand Lexus sponsored The Open Golf produced by US VR live stream specialist NextVR for Fox Sports. NextVR plans to test subscription and single view pay-per-view models this year, mostly around live music events. “2016 is a year of audience building. We are not going to put a paywall in the way of audience aggregation,” says co-founder Dave Cole.
VR opens up new opportunities in advertising on multiple fronts. “An entire industry is growing around promotional VR experiences,” notes White. “VR offers the opportunity for brands to touch consumers in ways which were previously unthinkable. With YouTube and Facebook both offering platforms for 360-video, more conventional agencies will see new channels opening up for immersive advertisements.”
Outside of gaming and entertainment, VR has a future in all manner of industries, from flight simulation to architectural fly-throughs. Applications in education include teaching via virtual classrooms and providing digitised campus tours to prospective students. VR is also making exciting strides in the healthcare market. Indeed, the global AR and VR healthcare market is poised to grow at a CAGR of around 17.8% over the next five years to reach $1.45bn by 2020 [Research and Markets]. Earlier this year the first live VR broadcast of a surgical operation was streamed from St Bart's hospital in London.
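That healthcare forecast unpacks to a starting market of roughly $0.6bn: growing at ~17.8% a year for five years to reach $1.45bn implies the base below.

    target_usd, cagr, years = 1.45e9, 0.178, 5
    base_usd = target_usd / (1 + cagr) ** years
    print(round(base_usd / 1e9, 2))  # ~0.64 ($bn implied starting market)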

Tuesday, 12 July 2016

Sony, Ang Lee among heavyweights invited to explore Douglas Trumbull's Magi Pod cinema format

Director Ang Lee and Sony Pictures are among Hollywood heavyweights invited to attend private demonstrations of VFX guru Douglas Trumbull’s (Blade Runner) anticipated pre-fabricated exhibition format Magi Pod.
The filmmaker, inventor and former vice chair of IMAX, has built a prototype screening room at his studio in Massachusetts and is inviting creatives, studios and exhibitors including Sony and Ang Lee to private demonstrations of the high-tech experimental format over the summer. 
“We’ve spent a year building a prototype theatre and I am just at the stage to start individual screenings,” he said. “Sony and Ang Lee are invited as are a lot of important media companies to witness the only option for the future of cinema other than a million dollar projection system.”
According to the three-time Oscar nominee, Sony will review the Magi Pod concept for the release of Ang Lee’s Billy Lynn’s Long Halftime Walk in November. However, he feels the time frame may be too tight for Billy Lynn and that the format may be best served by later high frame rate titles.
“I also want to invite [Avatar director] Jim Cameron and [producer] Jon Landau and Steven Spielberg,” Trumbull said.
Trumbull presented the system to Lee prior to shooting Billy Lynn, an act which Trumbull says convinced the director to make the project in the high-end specification, which will pose a projection challenge if it is to screen as the director intends it to be seen.
“When Ang came here and saw [Trumbull’s 3D 4K 120fps short film UFOTOG] he became very excited about shooting Billy Lynn that way,” Trumbull said.
“One of the outcomes was that he basically told the studio that if he couldn’t shoot it this way then he wouldn’t make it at all, which was a bold and brave thing to have done.
“Sony felt they had a safety net to go ahead and shoot [Billy Lynn] this way provided they had options to release the movie at any frame-rate from 120 to 60 or 48 down to 24fps.”
The studio were unavailable for comment.
Global rollout?
Trumbull’s business model anticipates a mass global rollout for the pre-built modules, and the industry veteran says he needs a major cinema chain like AMC or a studio like Sony to back it.
“We’re aiming to start a whole new company to develop and market it. This could be aligned with any large media company.”
Trumbull says that the Magi Pod is not only capable of playing back content in 4K, 3D and 120 frames a second - which is the ultimate technical specification for Lee’s picture - but will do so at a fraction of the cost of conventional and premium large format (PLF) theatres.
“Our prototype is unlike any movie theatre ever seen,” Trumbull says. “It is not a rectangular box and it doesn’t have a flat screen. It’s more like a holodeck, or ovoid, which envelops the audience. This gives a giant-screen experience in a relatively small space and on a modest budget.”
It is claimed that each Magi Pod would cost, per seat, a sixth of an “equivalent PLF experience” and less than half the build cost of a conventional theatre.
Savings are based on a pre-fabricated construction in which each ‘Pod’ would be shipped in its entirety to a site, where it would replace an existing small to mid-size multiplex screen. Installation would take a week rather than months, Trumbull claims.
“There will be significant install and real estate savings since there is no need to hire an architect or building contractor, or to go through local planning codes,” he explains.
“While giant screens have about 50ft of space above each person, which is not an optimal use of volumetric space, our system fits within a 20ft ceiling height and delivers a 100-degree-wide field of view in just 1,200 sq ft.”
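A rough check of that geometry, treating the screen as flat for simplicity (the Torus screen is curved): a 100-degree horizontal field of view means the screen subtends 50 degrees either side of centre, so its width is about 2.4 times the viewing distance. The 15ft distance below is illustrative.

    import math

    def screen_width_for_fov(viewing_distance_ft: float,
                             fov_deg: float = 100) -> float:
        return 2 * viewing_distance_ft * math.tan(math.radians(fov_deg / 2))

    print(round(screen_width_for_fov(15), 1))  # ~35.8ft wide at 15ft away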
In addition, the system would use Trumbull’s patented technique to deploy a single projector to show a 3D 4K 120fps picture rather than the dual projectors which have been the only way to show this high end specification to date.
The DCI standard would, though, have to be revised in order to show 4K 3D 120fps content.
“We are confident that there is no technical impediment to being able to do it,” said Trumbull. “It just requires more bandwidth, more storage media and more terabytes of data.”
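Back-of-envelope support for that point, assuming 120fps per eye against a standard 2D 24fps DCI presentation: the image data multiplies tenfold before any compression differences.

    frames_per_eye, eyes, baseline_fps = 120, 2, 24
    multiplier = eyes * frames_per_eye / baseline_fps
    print(multiplier)  # 10.0x the frames of a conventional 2D 24fps release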
“We need to redesign the movie theatre”
The Magi Pod comprises a Christie Mirage 4K25 projector throwing 14 foot-lamberts, 32-channel Christie Vive Audio surround sound, a hemispherically curved ‘Torus’ screen made by Stewart Filmscreen and a seating system manufactured by Irwin. Each pod will seat 70 people.
“In order to create an experience for consumers which is vastly different from the convenience of downloads or streaming to computers or TV, we need to redesign what the movie theatre could be,” he argues.
“Initiatives like Dolby Cinema (which combines laser projection with immersive audio) are a great direction to go but [installation] is hugely expensive. Magi Pods are a paradigm shift in movie experience and business.”
Trumbull pioneered the 70mm 60fps exhibition format Showscan in the 1980s and has been developing the MAGI filmmaking process for a decade.
“Solving the bottleneck in getting immersive cinema experiences to a wider audience was a completely unexpected result of what we’d been doing,” he adds.

Friday, 1 July 2016

Action Man: Profile Garrett Brown

British Cinematographer

Rocky Balboa ascending the steps of the Philadelphia Museum of Art in Rocky, Danny Torrance riding his tricycle through the cavernous halls of the Overlook Hotel in The Shining, Imperial stormtroopers rocketing between trees on the forest moon of Endor in Star Wars: Episode VI – Return of the Jedi.


What do these movie moments have in common? They were all filmed on the Steadicam® and all shot by the Steadicam’s inventor, Garrett Brown.

The Steadicam introduced a brand new vocabulary of camera movement into motion pictures and won Brown his first Academy Award in 1978. In 1985, the Washington Post, with only a slight hint of hyperbole, described the Steadicam as “the biggest thing since Technicolor”.

Yet the Steadicam was only the beginning of Brown’s innovations in camera-control technology. DiveCam put the viewer beside Olympic divers from their leap off the springboard to their plunge into the pool. And the MobyCam moved the audience underneath the water in sync with the athletes in competitive swimming events. In 2006, Brown received another Academy Award for Scientific and Technical Achievement for SkyCam, the aerial camera system which has become a staple of sports stadium broadcasts.

Looking at Brown’s early career, you would hardly imagine that he would revolutionise cinematography. He left college to pursue a calling in folk singing, even recording for MGM, but quit when The Beatles came along, he quips. With no job skills, he ended up despairingly selling VWs. “I'd always loved movies and my wife agreed to keep working while I learned moviemaking by reading all the outdated film books in the Philly library,” he says.

Fast forward through bit-part employment as an agency copywriter and commercial director to his own production start-up, complete with an 800lb ‘Fearless Panoram’ dolly to move his Bolex.

The Eureka moment has passed into legend. Fed up with the cumbersome Fearless, Brown launched a project to isolate his handheld self from the camera. In 1972, he began experiments and had a functional object a year later.

“Even my big early versions worked astonishingly well, even though they were way too clumsy and burdensome to be commercially successful,” he recalls. “I finally went into a hotel for a week and looked at all the drawings over and over and forced myself to come up with a smaller, lighter version that could actually handle 35mm movie cameras. And the marvellous result was that, unlike most inventions, mine could be demonstrated without giving away how it worked. I could show the results — a reel of impossible shots — and just blow away anybody in Hollywood who knew what was possible and what wasn't, but give them no clue how it was done.”

The demo reel included shots of a friend swimming the length of a pool and his wife (then girlfriend) running across a park and up and down Philly Art Museum steps. Rocky director John Avildsen got hold of a copy and called Brown up to recreate the scene with Sylvester Stallone. The same year, 1975, Brown was hired to shoot Steadicam scenes in Bound For Glory for Hal Ashby and Marathon Man for John Schlesinger.

Having devised the Steadicam as a humble means to rid himself of “my big crusty old dolly”, Brown confesses to being astounded by his invention's present ubiquity and usefulness.

The somewhat rigid language of old linear moves, literally ‘on rails’, has given way, he admits, to a flowing vernacular “that transports movie narratives and more closely resembles the way humans - with our astonishing internal stabilisers - actually perceive our lives.”

Tracking shots were part of the lexicon of cinema long before 1975, of course. Directors have always sought to push the boundaries of cinematic time and space – witness Orson Welles' classic three-and-a-half-minute opening sequence to Touch Of Evil (1958), or Alfred Hitchcock's experimental black comedy Rope (1948), composed of several single 10-minute film reels.

The Steadicam, though, freed the cinematographer to plot ever more complex and fluid compositions.


“My wife will nudge me in the middle of a particularly great Steadicam shot and it's still a thrill,” says Brown. “There are so many brilliant practitioners and it truly is an instrument, rather than just a stabiliser. It’s simply an elegant way to move an object in space, with a mass and weightlessness that could never be accomplished by hand. You guide it with your fingertips and the result is a really graceful, beautiful move. At its best, it's like a ballet for the lens. Of course it’s not curing cancer or ending WWII, but it's still extraordinarily useful and an immense amount of fun.”

A perfect example of immersive film-making (from Steadicam operator Larry McConkey) is the three-minute shot in Martin Scorsese’s Goodfellas (1990) in which Ray Liotta’s mobster leads his date into a nightclub through the exclusive back entrance, along winding corridors, through a busy kitchen and to a VIP table. While showcasing the supreme command Scorsese has over cinematic technique, the shot also invites the audience into the continuous hustle and bustle of the mobster's world.

Feature-length films like Alexander Sokurov’s Russian Ark (2002) have been filmed in one take, choreographed (after having to restart three times) by DP and Steadicam operator Tilman Büttner to render the finished piece more like a ballet.

“I can’t say I am necessarily enthralled with ‘one-ers’ unless they’re both sensible and valuable – nobody pays any attention to cuts, after all,” says Brown. “But the freedom to get the lens exactly where it’s wanted, to carry on up steps and over doorways in French curves that would drive a dolly crew berserk, remains completely seductive.”

There are so many great Steadicam shots, so asking Brown to select a personal top ten is like asking someone to choose which child they prefer. “I’ve just re-watched Joe Wright’s Pride And Prejudice (2005) and Simon Baker made some ravishingly beautiful and narratively perfect Steadicam sequences,” he says. “And The Revenant (2015) was astoundingly vital and gripping. Alejandro Iñárritu designed, and Scotty Sakamoto operated, some of my favourite sequences of all time.”


Brown, a member of American Society of Cinematographers, contributed to numerous features including Reds (1981), One From The Heart (1981), The King of Comedy (1982), Indiana Jones And The Temple Of Doom (1984), Casino (1995) and Bulworth (1998) before retiring from shooting in 2004 to concentrate on refining an arsenal of camera stabilisation supports for which he holds 50 patents. FlyCam is a high-speed point-to-point system; GoCam is a speedy miniature rail system and SuperFlyCam is an ultra light 35mm wire-borne flyer.

Brown's most recently released invention, and his all-time favourite, was not, however, a commercial success. Tango is a miniature crane perched on a Steadicam arm that permits floor-to-ceiling shooting and “marvellously smooth” traverses. “You stroll along, panning and tilting with a camera-less ‘master’ sled in one hand and a pantographically controlled ‘slave’ sled on the far end, and the little lightweight camera perfectly follows your intended moves,” he describes.

“In the old days I used to take out all of my camera inventions and shoot impressively with them to jump-start sales. Since I retired from shooting, unhappily there has been no champion for Tango. However, I’m confident it will be revived eventually. It’s too good and too exciting and is huge fun to operate.”

When Brown started out, the technologies for smooth camera movement were dollies, cranes and camera cars, all land-bound. “Aerials [via helicopter] came with fierce propwash and needed lots of space,” he says. “Now, gyro technology lets even minuscule platforms yield eerily smooth shots; and though much of the operating is ‘legato’ to say the least, and thus a bit dreamlike, that will certainly improve.”

Filmmaking via drone has taken Hollywood by storm, much in the way Steadicam once did. Does Brown think UAVs might also affect the language of cinema in time?

“Humans unconsciously ‘operate’ their eyeballs with fierce authority, so even though drones may show us startling vistas, their ‘effect’ is often relatively druggy and tame,” he says. “Eventually pilots and operators will acquire the rapid and precise panning/tilting chops that are a given with Steadicam, and failsafe drones will finally come into their own as narrative tools.”


Brown believes that gyro-stabilising and remote-control, and even autonomously operating technology, are here to stay and will only become more and more astounding “until of course, we take it all for granted!” But he's savvy enough to realise that even Skycam will eventually be displaced “by harmless clouds of nearly invisible drones that swarm around football squads, each assigned to a hapless player whose only defence will be a badminton racquet!”

Brown still has several unreleased inventions, which may yet revolutionise Steadicam operation and continue to provide the most visceral control of both moves and framing.


“I learned long ago to only attempt what I personally want. What still interests me are the fundamentals – how we perceive moving images, for example. The externals of camera manipulation, rather than the internal particulars. I also think we should help people understand that inventing is something that almost any of us might do. You don’t necessarily have to be a technical soul; you just have to really want something and to be motivated enough to chase it with a little money and a lot of thought.”