Friday, 16 December 2016

Trailer Made

Screen International 

A new system from Deluxe Technicolor Digital Cinema is designed to prevent errors in exhibition trailer screenings.

p40
http://edition.pagesuite-professional.co.uk/launch.aspx?eid=99c463ba-98bc-45fe-880e-51aac9b47b9e

When a cinema in California this summer screened the Sausage Party trailer to families waiting to watch Disney-Pixar’s Finding Dory, exhibitors everywhere might have breathed a sigh of relief. After all, this is an accident waiting to happen when trailer scheduling is done manually.

A new electronic service from Deluxe Technicolor Digital Cinema (DTDC) and Unique Digital is designed to eradicate such errors by ensuring the correct features are targeted with the correct genre and rating of pre-show trailers.
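The kind of compatibility check such a system could automate can be sketched in a few lines. The rating ladder and data shapes below are hypothetical illustrations, not DTDC's actual logic:

```python
# Illustrative sketch: keep only trailers whose certificate is no more
# restrictive than the feature's. Rating ladder is UK BBFC-style.
BBFC_ORDER = ["U", "PG", "12A", "15", "18"]  # least to most restrictive

def rank(rating):
    return BBFC_ORDER.index(rating)

def compatible_trailers(feature_rating, trailers):
    """Keep only trailers rated no higher than the feature itself."""
    return [t for t in trailers if rank(t["rating"]) <= rank(feature_rating)]

trailers = [
    {"title": "Sausage Party", "rating": "15"},
    {"title": "Trolls", "rating": "U"},
]
# A U-rated family feature should only ever pull U-rated trailers
print(compatible_trailers("U", trailers))  # -> [{'title': 'Trolls', 'rating': 'U'}]
```

A scheduling platform enforcing even this trivial rule centrally would have prevented the Sausage Party incident.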

It is just one element of a new agreement between two major industry players, underlining their ongoing commitment to the exhibition and distribution community. The venture is intended to close the historically disconnected link between studios and cinema management.

“We are bringing to market a range of services to help link the value chain of servicing content to exhibition and deliver a high level of automation and cost efficiency,” explains Sevan Brown, director of worldwide business development and strategic planning, DTDC.

“With installs of electronic delivery via broadband and satellite reaching more and more sites this changes the dynamic of the relationship between exhibitors and studios,” he adds. “What’s more, exhibitors are faced with a wide number of service providers competing to install digital equipment on their premises making the market fragmented and confused. Our intention is to offer long term market stability by working with chosen vendors in each territory in order for the industry to focus on its business and make room for innovation.”

A major partnership within this strategy is the distribution alliance between DTDC and Unique Digital. Following its successful launch and operation in Norway, the Smart Trailering service is now available in the UK. The system gives exhibitors full control to ensure content is used only during scheduled dates. Film booking departments can schedule trailers from head office, track electronic delivery and have playlists automatically updated on a theatre management system.

“Cinema owners can build their own playlist confident that they are using the correct trailer version, and movie distributors can prepopulate metadata and advise on trailer requirements. Trailers can be sent by the content provider electronically, straight to the site, using Unique Digital’s Movie Transit content delivery system,” says Brown.

Large chains can centralise trailer scheduling from a head office, while smaller independent exhibitors can manage programming remotely, from home if required. The system further rationalises delivery operations by giving exhibitors the ability to order trailers for an entire territory from one platform.

A next step is to feed back playback data (when trailers were played, the demographics of the audience) to aid studio marketing campaigns.

Initially launching across the UK and Ireland, where Unique Digital has a deal covering more than 100 cinema locations, DTDC and Unique plan to expand the service into the Nordic and Baltic regions and additional territories.

“This is a scheduling platform with the focus squarely on adding extra value for both distribution and exhibition by allowing bespoke programming of content responsive to market conditions at either a central or local level, with a further opportunity for reporting and analytics from the exhibitor,” says Brown.

Another initiative linking the value chain in support of content owners and exhibitors will automate the creation and distribution of keys to the exhibition community.

Currently, the keys required to unlock content ordered by exhibitors for playback are sent by email to a cinema office, where staff typically transfer them to the specific projector and server via USB. With vital information stored on PCs and memory sticks, the process is not only fraught with the risk of theft but also time-consuming and inefficient. Any issues cinemas have with the encryption must be resolved via a call centre, again not always the most efficient solution.
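In digital cinema these keys are KDMs (Key Delivery Messages), each tied to a specific playback device and an engagement window. A minimal sketch of the check a playback system performs, with hypothetical field names:

```python
from datetime import datetime

def kdm_usable(kdm, device_cert, now):
    """A KDM only unlocks content on its target device, inside its window."""
    return (kdm["device_cert"] == device_cert
            and kdm["not_before"] <= now <= kdm["not_after"])

kdm = {
    "device_cert": "CERT-1234",            # bound to one server/projector pairing
    "not_before": datetime(2016, 12, 16),  # engagement start
    "not_after": datetime(2016, 12, 30),   # engagement end
}
print(kdm_usable(kdm, "CERT-1234", datetime(2016, 12, 20)))  # True
print(kdm_usable(kdm, "CERT-9999", datetime(2016, 12, 20)))  # False: wrong device
```

Automating delivery straight to the right device removes the USB-stick step where most of the errors and security exposure arise.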

“There is a way in which keys can be better integrated at an operational level,” explains Brown.

The Deluxe Technicolor Portal is an online platform which enables exhibitors to raise and resolve key generation and delivery issues as well as gain access to critical technical and operational information associated with content playback. From the portal an exhibitor can review device information at a theatre level as well as download active keys and supplemental feature version files. 

“This is about improving operational procedure and not about standing in the way of any commercial relationship exhibitors have with content owners,” says Brown.  “It will reduce the number of errors and queries through increased automation, reduce the need for manual email correspondence and free up the support team to deliver a more defined and detailed support.”

The portal lays the foundation for further connectivity and automation with the exhibitor. The industry must move to a more automated approach with respect to all operations, with all content and keys delivered direct to the exhibitor’s Theatre Management System (TMS) and projectors.


“Integrated systems will ultimately enable full operational visibility of all distribution delivery to the exhibitor, and real-time management and fault resolution.”

Thursday, 15 December 2016

Could AI Create Your Next Production?

IBC
Artificial Intelligence is a staple trope of film and TV plots, from classics such as Blade Runner to the recent Channel 4 series Humans, but AI is also emerging as a serious content creation tool.
Amazon, Google, Microsoft, IBM, Oracle and Apple are among those now pouring millions of dollars into natural language processing, facial detection and sentiment analysis with the aim of using machine learning or cognitive computing for everything from automated cars to robot-run hotels, educational tools to manufacturing.
IBM CEO Ginni Rometty heralded 2016 as the dawn of “the cognitive era.” Research firm IDC predicts that AI will grow from $8 billion this year to $47 billion by 2020, by which time it will “eventually be built into all kinds of products and services.”
“That’s where we’re headed – AI everywhere,” states IDC chief analyst Frank Gens.
Sooner or later AI will also be a part of everyday media production and there are signs that this is already happening.
Insights provided by IBM's AI system Watson were used during Wimbledon this year to drive official social channels. Watson was also used to assemble video clips to help create the trailer for Fox sci-fi feature Morgan.
While a human editor crafted the end result, the use of a machine trained to ‘look’ for relevant clips was claimed to shave the best part of a month off the conventional post process.
A number of editing applications are using algorithms to analyse in-frame action and camera motion to partially or fully automate the logging, edit, grade and sound mix of anything from prosumer action videos to high shoot ratio documentaries.
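A crude sketch of the kind of analysis involved: a toy shot-boundary detector that flags a cut wherever consecutive frames differ sharply. Real tools compare pixel histograms and motion vectors; here each "frame" is reduced to a single brightness number for illustration:

```python
# Toy shot-boundary detection: a large jump between consecutive frame
# summaries is treated as an edit point. Threshold is an assumption.
def detect_cuts(frames, threshold=50):
    return [i for i in range(1, len(frames))
            if abs(frames[i] - frames[i - 1]) > threshold]

# Two steady shots with a hard cut between frames 3 and 4
footage = [120, 122, 119, 121, 30, 28, 31]
print(detect_cuts(footage))  # -> [4]
```

Logging a high shoot-ratio documentary is largely this operation repeated at scale, which is why it automates so readily.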
There are suggestions that this software will encroach into professional post - either leading to the loss of assistant editing jobs or freeing up editors for more creative work, depending on your point of view.
IBM says it views Watson as an assistive tool which works in the sweet spot between computer and human expertise. Some prefer to dub this IA - or Intelligent Augmentation.
“Perhaps in the near future, Artificial Intelligence systems will take advantage of this metadata and take things to an entirely new level by using metadata along with complex search algorithms to find new relationships between media and audience interest,” suggests David Schleifer, COO, Primestream. “Technology such as speech-to-text, voice-search systems like Siri, Cortana and Alexa are all maturing into powerful tools that will improve how we search and find what we are looking for. Properly catalogued media gives the broadcaster more opportunities to leverage their libraries to the public.”
From 2001: A Space Odyssey to Ex Machina via Terminator, Hollywood has tended to view AI as a dystopian vision. Even today the prevailing - dismissive - view is that machines are only as smart as the data you give them. Yet, given time, it may be possible for an AI to synthesise human creativity.
AI is being experimented with on the fringes of Hollywood. Sunspring, a short film shown at this year’s Sundance Film Festival, was scripted by a computer program.
At the Cannes Lions festival in June, a pop promo produced by agency Saatchi & Saatchi was scripted and directed (using drones) entirely by AI. Even the casting was done by a program that examined electroencephalogram (EEG) brain data from actors and matched them to the emotions it had detected in the song and its singer. 
Canadian data-analysis company Greenlight Essentials has launched a Kickstarter campaign to fund the first feature film co-written by artificial intelligence.
Logically, a machine will only be as intelligent as the data with which it is fed and humans will always have the edge on Emotional Intelligence - the ability to identify our own emotions and to empathise with those of others. It’s a crucial part of the creative process. 
Yet it would be foolish to ignore the trend toward using data to assist the production process, and, who knows, perhaps the creative process too.

Is Artificial Intelligence on the verge of taking over media?

Red Shark News

Slave to the algorithm: The next wave of technology that looks set to break over our collective heads with the promise of changing all before it is Artificial Intelligence. Here is a snapshot of the story so far.
Everything most of us have learned about Artificial Intelligence (AI) has come from Hollywood. There are the time-traveling robots trying to terminate us before we can give birth to the future leaders of the rebellion, the sadistic machines taking control of spaceships and the deus ex machina which seem more human than human — and still kill us. There are even some we fall in love with, but it still doesn’t end well.
AI is more prosaic than that. Well, at least it is for now.
The biggest corporations on the planet are embedding predictive intelligence into everyday apps to make our lives easier. The most obvious examples are chatbots or artificially intelligent digital agents like Alexa, Siri, Google Assistant, Facebook M, and Cortana.
Another example is Facebook’s recommended photo tags which uses image recognition. Amazon provides recommended products using machine learning algorithms; Waze (a GPS and maps app) provides optimal travel routes. It is all becoming increasingly ubiquitous.
This is Google CEO Sundar Pichai, speaking in October: “We are at a seminal moment in computing. We are evolving from a mobile-first to an AI-first world.”
While right at the beginning of the year, IBM CEO Ginni Rometty told CES: "It's the dawn of a new era: the cognitive era."

AI and the cognitive arms race

And she’s probably right. Just now IBM, Amazon, Google, Facebook and Microsoft are locked in an AI arms race.
Equity funding of AI-focused startups reached an all-time high in the second quarter of 2016 of more than $1 billion, according to researcher CB Insights.
AI is a catch-all phrase applying to any technique that enables computers to mimic human intelligence, using logic, ‘if-then’ rules, decision trees and machine learning (including deep learning).
There are nuances though. Machine learning is a subset of AI that includes abstruse statistical techniques that enable machines to improve at tasks with experience. The category includes deep learning. Deep learning is composed of algorithms that permit software to train itself to perform tasks, like speech and image recognition, by exposing multilayered neural networks to vast amounts of data.
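A toy illustration of what "improving at a task with experience" means in practice: a single artificial neuron (the simplest possible "network") learning the logical AND function by nudging its weights after each mistake. A sketch of the principle, not a deep network:

```python
# Classic perceptron learning rule: adjust weights in the direction
# of the error after every training example.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                      # repeated exposure to the data
    for (x1, x2), target in data:
        out = 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
        err = target - out               # learn from the mistake
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

print([1 if w[0]*x1 + w[1]*x2 + b > 0 else 0 for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

Deep learning stacks thousands of such units into many layers and trains them on vastly more data, but the underlying idea of iterative correction is the same.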
“There’s been a huge influx of data, with everyone feeding sound, text and imagery over social media, which has accelerated our ability to find ways to process and understand it,” says Ian Hughes, analyst at 451 Research. “Traditional processing and analytics is too slow since it doesn’t scale, so research has been pushed into AI as a way of dealing with data.”
In turn, AI data crunching has homed in on specific areas as opposed to blanket analysis.
IBM’s cognitive computer system Watson, for example, has combined its AlchemyLanguage APIs with a speech-to-text platform to create a tool for video owners to analyse video, forming IBM Cloud Video. It is able to scan social media in real time to monitor reactions to live-streamed events.
“AI used to be relegated to only the fastest supercomputers, but recent advances in software and the use of GPUs to process the algorithms mean that the cost of AI assistance is no longer a barrier to entry,” says Paul Turner, VP of enterprise product management at Telestream. “AI offers the promise of aiding in many facets of the business. You can certainly imagine that systems will be able to analyse the actual content for metadata gathering (facial recognition is one part of this, but other object detection could be just as useful). Given that metadata is key to automated workflows, this could vastly expand our capability to ‘mine’ content for other purposes.”
Nvidia, which has a history of leveraging computing power for interesting purposes, describes its TensorRT product as a “high performance neural network inference engine for production deployment of deep learning applications”.
It is targeting use in delivering super fast inferences and significantly reduced latency, as demanded by real-time services such as streaming video categorisation in the cloud or object detection and segmentation on embedded and automotive platforms. “With TensorRT developers can focus on developing novel AI-powered applications rather than performance tuning for inference deployment,” says Nvidia.

AI should not be confused with intelligent creation, yet even here the edges are being blurred. Editing software such as Magisto, with 80 million users, takes in raw GoPro or smartphone-shot video and automates the process of editing and packaging it, with a narrative timeline, tonal grade and background music, for consumers and even marketeers facing huge demands on their time and too much video to process and publish online.
For live production, how about AutomaticTV, devised by Spanish facilities provider MediaPro as a cost-effective alternative to a full OB, and in use at FC Barcelona to live-stream training sessions. If you imagine a soccer match as a series of pixels, the system is essentially instructing an algorithm to distinguish between green for the pitch, white for the ball, black pixels for the referee and a collection of coloured pixels for the players. Since soccer is fairly formulaic in presentation, the system can follow those pixels based on certain rules. Handball, roller hockey and basketball have also been trialled.
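That rule-based idea can be sketched in a few lines: label each pixel by whichever reference colour it sits closest to. A toy version with invented reference values; production systems are far more sophisticated:

```python
# Nearest-reference-colour classification of an RGB pixel.
# Reference values are illustrative assumptions.
REFERENCE = {
    "pitch": (30, 140, 40),     # green
    "ball": (240, 240, 240),    # white
    "referee": (20, 20, 20),    # black
}

def classify(pixel):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(REFERENCE, key=lambda label: dist(REFERENCE[label], pixel))

print(classify((35, 150, 45)))  # a grass-coloured pixel -> "pitch"
```

Track where the "ball" pixels move frame to frame and you have the raw signal an automated camera director needs.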
It is hoped that a documentary assembled by the Lumberjack AI system will be presented to the SMPTE-backed Hollywood Professional Association (HPA) by 2018; the system has already helped create Danish channel STV’s 69 x 10-minute episodes of the semi-scripted kids series Klassen.
Many questions are thrown up by this advance. Is the accidental juxtaposition of sound and image to create something new a function of solely human intelligence or can a machine be trained to produce content which is not formulaic? Can an AI recognise, in a mass of images, that the nuanced facial expression in an observational documentary is the heartbeat of a story?
And as for scripted shows: Sunspring, a short film shown at this year’s Sundance Film Festival, was scripted by a computer program. IBM’s Watson was used to sort and recommend a dozen clips from the full-length horror feature Morgan for a craft editor to compose a trailer. And at the Cannes Lions festival this year, a pop promo produced by Saatchi & Saatchi was scripted and directed entirely by AI. Even the casting was done by a program that examined electroencephalogram (EEG) brain data from actors and matched them to the emotions it had detected in the song and its singer.
Canadian data-analysis company Greenlight Essentials has launched a Kickstarter campaign to fund the first feature film co-written by an artificial intelligence.
These latter examples could be put down to novelty except for the thread linking the rapid rise of AI across all media. We want machines to take care of the neurologically impossible task of processing the volume of data we create and receive each day. The utopian view is that if a machine can take this off our hands then humans can find more time to create.
The dystopian? One word: Skynet.

Saturday, 10 December 2016

Live VR trialled at IMG Champions Tennis at the RAH

Sports Video Group
There can be few grander or more theatrical arenas in world sport than London’s Royal Albert Hall, the venue this past week for IMG’s Champions Tennis and the focus of another live virtual reality demonstration.
The season-ending finale of the ATP Champions Tour featured Grand Slam champions John McEnroe and Pat Rafter, former world No.1 Juan Carlos Ferrero and former doubles Grand Slam champion Xavier Malisse. British fan favourite Tim Henman, former British No.1 Greg Rusedski and former world No.4 Guy Forget were also in the line-up.
For the first time this event was given the VR treatment, streamed live to a bespoke app for a specially invited audience of broadcasters, sponsors and technologists.
“This is an awe-inspiring venue, ideal for the immersiveness of VR,” says Paul James, co-founder and head of production at Surrey-based production company Focal Point VR which is behind the demo.
Viewers were able to switch between multiple streams for different perspectives on the live action much as they would access a Red Button on TV – but all without leaving the VR stream.
“We are not editorialising in any way,” explains James. “This is a viewer-based experience in which they are the editor. The viewer makes decisions about when and where they want to watch. Our role is to make sure that when consumers do come to VR for the first time, they experience such a ‘wow’ that they will want to return for more.”
Necessity is the mother of invention?
Part of this approach is out of necessity. A traditional directed broadcast feed may prove uncomfortable to a viewer unable to adjust to the uncontrolled, unpredictable switching of camera angles or viewing distances in 360-degrees.
With the event running over five days from Wednesday 30, this was a prime opportunity for Focal Point VR to experiment with various aspects of what is still an emerging technology. The plan was to trial various camera positions and deliver a public stream from the best set-ups during the finals on Sunday.
Three different rigs were arrayed around the court: a stereo pair for 3D VR capture designed by Focal Point VR, a GoPro-mounted rig for output to YouTube, and a high-end Mini Eye rig designed by San Francisco’s 360 Designs comprising three Blackmagic Design Micro Studio Camera 4Ks outputting dual streams of 4K and 6K.
Signals from the latter configuration were fed into Focal Point VR’s own stitching and processing solution, which allows for multiple simultaneous VR streams and PVR functionality for viewers to rewind the action. The company partnered with video streaming agency Streaming Tank to encode and distribute the video.
“We are not producing highlights or replays in this instance,” says James. “The viewer can control the action they want to review. Rather than introduce graphics we want the viewer to look toward the scoreboard in the venue if they want to know the score.”
The 6K and 4K production was intended to offer a direct contrast with the standard VR experience on YouTube. Focal Point offers an end-to-end live VR solution, but its main advantage lies in its proprietary stitching and signal-processing software.
“Our technology gives us the ability to have areas of higher quality in a scene,” explains James. “We call it packing. It allows us to have native effective 6K resolution in the main areas of interest, such as the field of play, while areas of less interest – such as the view behind the VR viewer’s head – receive fewer pixels.”
With this region-of-interest optimisation, the producer is able to achieve a three-to-five-times reduction in stream sizes and data costs. The solution is agnostic about codec and CDN.
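As a rough illustration of how that kind of saving arises (the resolutions and densities below are invented for the example, not Focal Point VR's figures): keeping full pixel density only over the field of play and quartering it elsewhere already cuts the pixel budget by more than half, before codec efficiencies are added.

```python
# Back-of-envelope pixel budget for region-of-interest "packing".
full_6k = 6144 * 3072                 # whole sphere at native 6K density

roi = (6144 // 4) * 3072              # front quarter (field of play) at full density
periphery = (full_6k - roi) // 4      # the rest carried at quarter density

packed = roi + periphery
print(round(full_6k / packed, 1))     # -> 2.3, i.e. ~2.3x fewer pixels in this toy case
```

Stack a modern codec and adaptive streaming on top and a three-to-five-times end-to-end saving is plausible.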
Support is provided for all major HMDs, including Gear VR, Android/iOS cardboard, PC (Oculus Rift / HTC Vive) and PlayStation VR, with an embeddable HTML5 player also an option.
“A year ago the main question around VR would have been the reliability of stitching,” says James. “Now we have a rock solid approach. The issue is solving the problem with depth. There are clever ways of managing this with adaptive streaming or disguising stitch lines. We have a desire to achieve this with depth extraction.”
A grounding in gaming
The ability to write software tools comes from the company founder’s background in computer games development. Julian Davis, formerly CTO of Geomerics, which was recently acquired by ARM, is CTO; while James, CEO Jonathan Newth and chairman Ian Baverstock have built several video games and technology companies together.
“Making VR a shared and social experience is key to its lasting appeal. Leveraging our games industry experience, we are designing and building the components required to allow people to feel they are together in VR,” says Newth.
Recent research by Ampere Analysis found that 18 to 24-year-olds, the younger end of the so-called ‘millennial’ age group, were 17% less likely to identify sport as their favourite form of programming than the general population.
This should come as no surprise. As a demographic, millennials have become accustomed to being more in control over how and when they consume TV content and, as a result, streaming services have grown in popularity with this audience more than any other.
Streaming an event live in 360° to smartphone apps and virtual reality headsets offers a fundamentally different experience and what’s more, one that puts the viewer in control.
“VR is fundamentally different to any other form of media where someone else is directing where the audience should look,” says Baverstock. “You lose some advantages of editing and action replay and the ability to zoom in and out. In effect you lose the directed experience. What you get in return is a feeling of authenticity and presence which is of tremendous appeal to a cynical younger audience who see less value in the traditional live directed programme. Offering millennials the chance to be part of something live and under their own control is extremely valuable.
“Ultimately you want to be able to share the experience of attending an event virtually with a member of your family who may live many miles away,” he adds. “While this capability is unlikely to be available in the first commercial launches of VR, the technology will soon offer the chance for both of you to teleport to the same experience. A step even beyond that is to enhance the ability to move around within and interact with the VR world, for example by being able to point rather than just look. Our technology allows broadcasters to engage with consumers on a completely new level.”
Focal Point announced the launch of its VR platform at IBC this year. Seed funding investors include Simon Muderack (MD at Sigma Systems), John Chasey (operations director of TVR Automotive), Jez San (CEO of PKR.com) and Chris Wilks (CFO of Signum Technology).

Friday, 9 December 2016

BBC demos Planet Earth II footage in 4K HDR

Red Shark News
It might only be four minutes of footage but, 80 years after the BBC first started broadcasting, the new clips of Sir David Attenborough’s Planet Earth II now available on the BBC iPlayer are the best-quality pictures the corporation has ever made available to the public.
Somewhat out of the blue the BBC has announced that it’s begun a trial of 4K HDR video over iPlayer in a test featuring four minutes of footage from Planet Earth II - the first BBC series made in both 4K and HDR.
The demo is intended to show how the BBC could make UHD HDR material available to stream.
"We want to use this as a trigger to work with manufacturers to get their products updated so there's a pathway there for future on-demand BBC content,” explained Phil Layton, head of broadcast and connected systems at BBC Research & Development.
"One of the clips is a frog on a leaf with lots of rain, and the reason this is so interesting is that the redness of the frog is a really deep, Ferrari red that you would never get in broadcast television at the moment," Layton told BBC news online.
Naturally the BBC has chosen to deliver the HDR version in HLG (Hybrid Log Gamma), the HDR variant it developed with Japan’s NHK and one of two schemes, along with Dolby’s PQ, ratified by the ITU and the DVB for UHD phase 1 production (the ITU’s standardisation should make using either approach interchangeable although in practice PQ will probably be destined for more feature film and drama content with HLG for live work).
The difference is subtle but can be boiled down to PQ being perceptually better to look at, provided you have a top-notch reference monitor like the Sony BVM X300 or a panel certified with Dolby Vision. HLG is designed for HDR to be playable on all kinds of TV from the top-end 4K HDR modules coming to market in the new year through to HD and older 4K screens where pictures will still benefit from an HDR sheen.
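For the curious, the HLG curve itself is public: ITU-R BT.2100 defines its opto-electrical transfer function, which can be sketched in a few lines using the standard's published constants:

```python
import math

# HLG OETF per ITU-R BT.2100. Input E is normalised scene-linear light
# in [0, 1]; output E' is the non-linear signal in [0, 1].
A = 0.17883277
B = 1 - 4 * A                    # 0.28466892
C = 0.5 - A * math.log(4 * A)    # 0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)      # square-root segment (SDR-compatible toe)
    return A * math.log(12 * e - B) + C  # logarithmic segment for highlights

print(round(hlg_oetf(1 / 12), 3))  # 0.5 at the branch point: the curve is continuous
print(round(hlg_oetf(1.0), 3))     # 1.0 at peak
```

The square-root lower segment is what lets HLG pictures degrade gracefully on older SDR-style displays, while the log segment carries the extra highlight range.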
This theory seems slightly at odds though with a note from the BBC that as of today only Panasonic's latest screens support HLG, but, as the Corporation also says: “Although recent models from other manufacturers can also be updated to add the facility, it is unclear whether the firms will do so."
We’ll probably know more after CES next month. Certainly there is plenty of momentum building behind HDR, with estimates that penetration of true HDR TVs in North America will rise to between 10% and 14% by the end of 2017. Meanwhile, in the UK, both BT Sport and Sky are experimenting with HDR for live sports and will likely introduce it next year.
Watch this (wide colour gamut) space.

Thursday, 8 December 2016

Boom puts pressure on space

Broadcast 

With demand at an all-time high, an estimated 300,000 sq ft of purpose-built stages came on stream this year, with capacity set to rise further in 2017.

http://www.broadcastnow.co.uk/techfacils/boom-puts-pressure-on-space/5111971.article?blocktitle=Features&contentID=42957

The UK’s world-class craft skills base and facilities have long made it attractive to overseas producers, but the collapse of the pound post-referendum poured more fuel on an already overheated sector.
“To put it cynically, the exchange rate since Brexit has done us a favour and has, if anything, intensified the volume of enquiries, particularly from the US,” says Adrian Wootton, chief executive of Film London and the British Film Commission, both of which continue to “aggressively” market UK studios.
“We’ve done really well to manage capacity, but we do want more,” he adds. “I don’t think we’re tapped out in terms of demand or potential for more infrastructure.”
The unprecedented boom in studio occupancy continues to be driven by TV tax breaks, with high-end drama soaking up capacity on the nation’s sound stages.
Among those in production this year were series three of E!’s The Royals (3 Mills Studios), Sony Pictures’ The Halcyon (West London Film Studios), ITV’s Victoria (Yorkshire’s Church Fenton Studios), BBC1’s Poldark and Starz’s The White Princess (both Bristol’s The Bottle Yard). Amazon’s The Collection was one of the first shows to land at Pinewood Wales and Netflix’s The Crown was resident at Elstree.
“The Crown is shot as a feature, has similar budgets and uses the 15,000 sq ft George Lucas Stage, backlot and other, smaller stages,” says Elstree managing director Roger Morris, who had to turn away films such as Paddington 2 to accommodate it.
With demand showing no signs of abating, Elstree owner Hertsmere Borough Council has tabled a proposal to build a new 13,000 sq ft studio by 2018, on top of existing plans to add a 21,000 sq ft stage.
The sector seemingly can’t build space quickly enough to sate demand.
By Broadcast’s estimates, 300,000 sq ft of purpose-built stages came on stream this year: three sound stages totalling 70,000 sq ft at Warner Bros Leavesden; 170,000 sq ft across five stages at Pinewood’s Buckinghamshire HQ; and the 66,000 sq ft North Foreshore Film Studios, a new £20m development by the Belfast Harbour Commission.
With Game Of Thrones sucking up two new 21,000 sq ft sound stages at Belfast’s Titanic Studios, Northern Ireland needed fresh bespoke space to keep pace with the rise in TV drama. There are already plans to build a further 42,000 sq ft.
“We’re investing more in training and bringing more people into the region to service anticipated new productions,” says NI Screen head of production Andrew Reid, who is anticipating the departure of GoT by April 2018, once production wraps on the eighth and final series.
Capacity is set to rise further by the end of 2017 when Elstree unwraps its new stage, TVC returns with three studios, and Manchester’s The Space Project doubles in size with Outer Space, which will include a 30,000 sq ft stage. It follows the first full trading year of the drama hub, which has housed productions such as Cold Feet and Houdini And Doyle.
A little further down the track is a proposed £30m, 110,000 sq ft complex in Liverpool, the city that attracts more film and TV location shoots than any outside London.
Construction is earmarked to begin in the new year.
In Birmingham, which welcomed Steven Spielberg’s location shoot for Warner Bros’ movie Ready Player One in September, the city council and the NEC Group are considering whether to build a TV and film studio complex.
Meanwhile, unable to expand its current site but wanting to cash in on demand, the owners of Twickenham Studios are investigating management of external studios, either in the UK or abroad, under the Twickenham brand.
Like Pinewood, they’re keen to invest in TV and film production (indie feature Finding Your Feet was the first).
The X Factor final on 11 December will bring down the curtain on Wembley’s Fountain Studios, following January’s sale by parent Avesco to property developer Quintain for £16m.
Its laser-levelled floors will be demolished for housing, leaving shows like Britain’s Got Talent looking for a new home and calling time on a site that has been used as a film and TV space since the 1930s. The second iconic studio to go under the hammer to a real estate firm in 2016 enjoyed a different outcome.
The Pinewood board raised the possibility of a sale in February, announcing a strategic review of its structure and ownership. The motivation was a need to access quick capital for expansion, which its existing ownership base appeared to be stifling.
The £320m deal struck in July with Venus Grafton, a subsidiary of the French PW Real Estate Fund, was immediately hailed by chief executive Ivan Dunleavy as securing the studio group’s long-term future.

Full steam ahead
Pinewood director of strategy Andrew Smith declared it was full steam ahead for phase two of the £200m Pinewood development plan, which will build another five stages spanning 170,000 sq ft.
With the Buckinghamshire site busy with Star Wars Episode IX, the group launched a TV investment fund under creative director Helen Gregory (The Catch) and co-production director Christian Wikander (The Bridge), beginning with a Sony/AXN development deal.
Movie producers and their stars – and, by extension, the marquee names driving high-end TV – prefer to be within shooting distance of London.
That is one factor driving the proposal to repurpose a 17-acre brownfield site in Dagenham into a film and TV studio second only in size to Pinewood in the UK.
“We are getting repeat requests from productions wanting to be within the M25,” says Wootton. “Property in London is at a premium and this site is one of the last big development opportunities for media in the capital.”
The London Local Enterprise Partnership, Barking and Dagenham Council and Film London are all optimistic that a business case can be made for a publicly and privately financed facility, which they estimate could bring in more than £100m in UK spend. Pending the results of a feasibility study due in late spring 2017, doors could open in three years.
The move would offset the long-term uncertainty around another London site: 3 Mills Studios.
While studio head Tom Avison reports the busiest time yet for the Bromley-by-Bow base and solid bookings, operator London Legacy Development Corporation is mulling whether to reassign the 80,000 sq ft site, possibly as storage for museums that are due to open up in the Olympic Park.
“The film industry has historically been based in west London, yet the natural evolution of London is to the east,” says Avison. “We welcome a new studio, since it will entice more crew to be based near here.”

The wait continues in Scotland 

The Scottish government had been expected to rule on the £140m, 130,000 sq ft Edinburgh studio development in August but, at the time of writing, there had been no decision. However, it did approve a 30,000 sq ft extension at Wardpark, the home of Outlander.
Two 50 ft-high sound stages will bring space there to 78,000 sq ft, but the only other purpose-built unit in the nation is a 5,000 sq ft stage on Stornoway.

Record spend

While Scotland attracted a record £45.8m shooting spend in 2014 (the latest year for which figures are available), much of this was for the country as a backdrop. Across Scotland, annual production spend was up by around 40% in 2016.
“The need for more dedicated studio space is pressing,” says British Film Commission chair Iain Smith. “The Scottish government and Creative Scotland have been very cautious, despite attempts by many of us to persuade them of the opportunity they are in danger of missing.

“Ten years ago, Scotland had the most media activity outside the south-east. Now it has lost market share to Cardiff, Bristol, Northern Ireland, Manchester and Leeds at a time of greatest opportunity.”

Wednesday, 7 December 2016

Milestone year for OB firms

Broadcast

From the Rio Olympics to the Euros, 2016’s major sporting events were a showcase for emerging technologies such as UHD, VR and Dolby Atmos.


There’s no argument about what was the year’s biggest live event: host broadcaster Olympic Broadcasting Services (OBS) churned out more than 7,100 hours of coverage from Rio, with broadcasters including NBC Universal and the BBC contributing to 350,000 hours aired globally.
This smashed the 100,000 hours broadcast from London 2012 to reach an estimated all-time high TV audience of 5 billion.
Particularly significant, says OBS, was the unprecedented reach across digital platforms. The volume of online coverage was nearly double that of traditional TV, with live streams available from every session for each of the 28 sports featured – amounting to 218,000 hours, almost three times 2012’s 81,500 hours. OBS calls it “a milestone in Olympic broadcasting history”.
Drawing on more than 1,000 cameras, including mini-cams mounted on cycles and drones deployed for overhead views of events such as canoeing and triathlon, Rio was a testbed for future production tech.
The BBC was among the broadcasters offering viewers a 360-degree view of select Olympic action, part of 85 hours of live VR content captured for posterity.
With NHK, OBS achieved the largest live 8K UHD production to date, with around 100 hours sent back via satellite to public viewing areas in Japan and down-converted for 4K UHD output.

Pushing boundaries

This mix of capture and distribution has laid the groundwork for Tokyo 2020. “Along with the use of advanced technologies, we believe this experience has helped to define how we will move forward in the digital era,” says OBS chief executive Yiannis Exarchos.
By comparison, the year’s other major sporting event, Euro 2016, was far smaller – but even here, tech boundaries were pushed.
Uefa’s host broadcast team shot eight matches in 4K for the first time at the tournament, produced by lead contractor Telegenic, and tested immersive sound system Dolby Atmos and VR.
Nokia Ozo camera rigs were positioned behind the goals, on the centre line, in the tunnel and in the dressing room, in a production managed by Deltatre. The results suggest more work is needed.
“The debate we are having is about which experience feels most natural to the user,” reports Deltatre chief product and marketing officer Carlo De Marchis.
“Football is a challenge because of the size of the pitch. For sports with smaller areas of play, like basketball, a full live-streamed game in VR may work better because you can make it more immersive.”
On 13 August, Sky officially debuted its live UHD service for Leicester City’s opening Premier League game against Hull. It was one of 124 planned 4K broadcasts of Premier League matches this season, a tally strategically designed to contrast with BT Sport’s 42 live 4K games (the sum of its rights package).
Yet while Sky will give the UHD treatment to the full 2017 Formula One racing calendar, BT Sport – in its second year of 4K coverage – has a broader range.
It is fielding the format at FA Cup, Uefa Champions League and Europa League football games, as well as Aviva Premiership rugby matches. MotoGP meetings, FIM Speedway races and even PSA squash matches are also being covered in 4K.

Down-converting to HD

To streamline workflow, reduce onsite facilities and minimise risk, both broadcasters have opted to down-convert UHD to produce their regular HD programming.
“This is all about making sure that the HD editorial product isn’t compromised,” says Sky director of operations Keith Lane.
That means no experimenting yet with High Dynamic Range (HDR), although it is likely to be a fixture by this time next year.
“The bigger issue is working out an elegant workflow so we can produce HD HDR, UHD HDR and Standard Dynamic Range deliverables out of the same truck,” says Sky Sports technical manager Robin Broomfield. “We know where we want to get to, but it’s an added complexity.”
Meanwhile, as part of its attempt to innovate ahead of Sky, BT Sport will become the first UK broadcaster to implement Dolby Atmos surround sound into its live productions from early 2017.
Sky’s planned switch to UHD set the clock ticking for OB suppliers to upgrade around mid-2015.
While Telegenic had equipped trucks for 4K as early as 2013, specifically for the Fifa World Cup in Rio the following year, and Timeline had been commissioned to supply BT’s 4K output, other outfits held back to gauge demand in the market.
More pertinently, there was a critical technical decision to be made: whether to equip with the reliable, but ultimately inflexible and outmoded, combination of four HD-SDI cables, or take a leap of faith to the relatively untested, patchily standardised but future-facing approach of data-centric IP systems. Get the timing wrong and an OB firm could face ruin.
Lacking any UHD contract, Arena gambled on what managing director Richard Yeowart called the biggest risk of his career.
Unveiling a trio of 32-camera-channel, all-IP triple expanders at a combined cost of more than £20m, he declared: “Quad HD is a sticking-plaster approach to UHD.”
The key onboard equipment, supplied through a partnership with Grass Valley, included LDX 86 cameras and IP processing and routing kit. The gamble paid off when the company landed rolling contracts with BT Sport and Sky for its first two facilities, with the third scanner due in the new year.
These large-scale trucks also put paid to the idea that the introduction of IP might entail an immediate reallocation of resources for remote production.
“The amount of deliverables has more than doubled,” says Grass Valley product specialist Phil Myers. “It’s not just HD with stereo and 5.1; we’re also having to do UHD with 5.1 and Dolby Atmos and stereo and then make an HD variant and a clean version for the world feed.”
With HDR yet to be added to the mix, it seems premium events will need the larger on-site production spaces for some time to come.
“Just a simple thing like being able to reduce the weight of cabling in a truck that is going up and down Britain every week using a large amount of fuel is a bonus,” says Myers.
Other firms are upgrading too. CTV is building an IP core for its latest scanner and will arm it with a contingent of Sony HDC-4300 4K cameras in time for Sky’s Open golf coverage in July 2017.

NEP UK: a phoenix from the flames


The fire that gutted NEP UK’s headquarters, with the loss of six OB vehicles, in November 2015 forced the company (rebranded from NEP Visions in July) down the tried-and-tested Quad-HD route to UHD.
Barely 10 months after that potentially business-crippling event, the firm had not one but four 4K OB trucks on the road, each built to a similar template.
Costing around £10m each, they were equipped with Sony HDC-4300 cameras, EVS XT3s, SAM Kahuna 9600 vision mixers and Imagine Communications Selenio signal processing.
Like rival Arena’s trucks, the OB facilities are designed to operate in HD and UHD.

Acquisitions spree

Following a string of purchases in 2015, including Ireland’s Screen Scene, parent NEP Group continued its acquisitions spree with June’s deal for flypack and facilities provider Broadcast Solutions Group and July’s acquisition of Danish OB firm DBlux (now operating as NEP Denmark). In November, it spent £124m on kit and services provider Avesco.
The company also signalled its intent to move into cloud-based post and remote production. “Because of clients’ changing demands, we have to be more than just an outside broadcast company,” says NEP UK and Ireland president Steve Jenkins.

The group already offers these services outside the UK, with NEP Netherlands using cloud technology on RTL shows Carlo’s TV Cafe and Voetbal Inside.