Friday, 28 October 2016

Drones come of age

Broadcast

If all the buzz that attended drones last year seems to have shifted onto VR, that’s symptomatic of how quickly unmanned aerial filming has become commonplace. It’s also significant that innovation in the sector is incremental rather than dramatic.


For most film, TV and commercials work requiring high-quality but heavier camera/lens packages such as the Red Dragon, Sony F55, Canon C300 MKII, Phantom Flex4K or Alexa Mini, basic drone tech remains a trade-off between weight, power and flexibility.
The Aerigon rig from Sweden’s Intuitive Aerial remains one of the most popular. Updates to the Mark II model include new image stabilisation technology for full-size pro cinema cameras fitted with lightweight zoom lenses such as the Angenieux Optimo 15-40 and Canon 15.5-47.
Competitors include the Shotover Camera Systems’ U1, launched in April, and Freefly Systems’ Alta 8. The former features a gimbal which can be detached from the rig and used as a standalone gyro-stabilised platform for mounting on cranes and almost anything that moves.
Freefly’s octocopter, released in May, has upped the payload over the Alta 6 from 13kg to 18kg (6.2kg of which is taken up by the weight of the craft itself, giving an 11.9kg working payload). A camera can be switched between a conventional underslung configuration and a mount where the camera sits above the rotors for up-tilted shots.
U.S. start-up xFold’s Cinema rig can be configured with 8 or 12 motors and propellers and has a payload capacity of 20kg. What’s unique about this rig, at the heavy payload end of the market, is a design that can fold down to a third of its full size for ease of transport.
All these models include redundant flight control and battery systems which are important safety features.
Despite this, some aerial filming companies prefer to build their own UAVs in order to fulfil client demands to fly over congested areas with more complex flight patterns while earning an Operating Safety Case (OSC) from the CAA.
Flying Pictures says its bespoke 12-motor, 12-propeller UAV with dual (master and reserve) controllers is unique in the redundancy it offers (it holds an OSC permitting just 10m separation in congested areas).
Camera manufacturers are answering the call to reduce the size and weight of their cameras - Red’s Weapon is built from carbon fibre, for example - but the glass used in lenses remains the heaviest onboard item.
The more the power-to-weight ratio can be improved (while retaining redundancy), the greater the range of lenses, from anamorphics to short zooms, that can be offered. Battery technology is one factor holding this back. Flight batteries can weigh 2kg each and cinema-ready craft need to operate with a minimum of two. Even then, 8 minutes remains a typical flight time (20-30 minutes is common on sub-7kg craft carrying a GoPro).
Standard lithium-polymer flight batteries could give way to lithium-ion cells of the kind designed by Tesla, which would offer more energy for longer flight durations, but these are currently unproven and expensive.
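To put those figures in perspective, here is a minimal back-of-the-envelope sketch of hover time for a heavy-lift rig. The energy density, usable fraction, all-up weight and hover power numbers are illustrative assumptions chosen for this sketch, not the specs of any rig named above; they simply show why flight times land in the region of the 8 minutes quoted.

```python
# Rough hover-time estimate for a heavy-lift camera drone.
# All figures marked 'assumed' are illustrative, not manufacturer data.

battery_mass_kg = 2.0            # per flight battery, as quoted above
packs = 2                        # cinema-ready craft fly with at least two
energy_density_wh_per_kg = 150   # assumed lithium-polymer energy density
usable_fraction = 0.8            # assumed: packs are rarely run flat

all_up_weight_kg = 20.0          # assumed craft + camera/lens package
hover_power_w_per_kg = 150       # assumed power draw per kg of all-up weight

usable_energy_wh = battery_mass_kg * packs * energy_density_wh_per_kg * usable_fraction
hover_power_w = all_up_weight_kg * hover_power_w_per_kg

flight_minutes = usable_energy_wh / hover_power_w * 60
print(f"Estimated hover time: {flight_minutes:.0f} minutes")  # ~10 minutes, the same order as the 8 quoted
```

Swapping in a higher energy density (the lithium-ion gain mentioned above) stretches the result roughly in proportion, which is why battery chemistry is such a bottleneck.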
There’s no point carrying high-quality optics if the footage is shaky. When a craft with a heavy payload flies at full tilt, buffeting by wind is a major issue. UAV filming outfits tend to prefer image stabilisers made by Shotover (the K1 is a six-axis gyro-stabilised gimbal) or Freefly (its Movi brand includes the three-axis M10), or they work with the Aerigon’s integrated gimbal.
An alternative is the Ecofly version of the H-7 from Gremsy, which features a 32-bit ARM microprocessor for super-fast response and accurate calibration.
Operators want more powerful image stabilisation platforms that enable them to fly heavier packages reliably, and a gimbal that can cope with the shift in payload balance when longer zooms are used mid-air. There are rumours of new technology due next year which could shake up this field.
Downlink options include the Teradek Bolt Pro 2000, the Cobham Solo7 Nano, which weighs just 51 grams, and the 60-gram Connex Mini made by Amimon. The latter is claimed by its manufacturer to deliver zero-latency HD wireless links and comes with all the antennas, cables and connectors required for air and ground unit setup.
Based on this tech, Amimon has also addressed the rising interest in broadcasting drone races. Currently, these events are post-produced: the first-person-view feeds from drone-mounted cameras need to be realtime for the pilots to fly them, and are therefore analogue, SD and not broadcast quality.
ProSight, launched in the summer by Amimon, is described as the “first digital HD vision solution specifically designed for first-person view drones” and is being tested with U.S. drone racing producer FPV Live.
While heavier-payload rigs can cost upwards of £15,000, there are less expensive UAVs designed for lighter payloads and usually less demanding broadcast applications.
These include models made by market leader DJI. Its flagship, the Phantom 4, includes an obstacle-avoidance system and a 4K camera which can shoot 120fps slow-motion video. The Chinese vendor’s Inspire 1 also comes with its own 4K camera, while the Matrice M600 hexacopter claims a 6kg payload capacity and a 36-minute flight time. The company has incorporated its own HD live-streaming system, called Lightbridge, into many of its models.
In the last few weeks it has debuted the Mavic Pro, a consumer drone that can be collapsed to roughly the size of a water bottle for transportation and sports the same sensor as the Phantom 4, shooting 4K video at 30fps and 1080p HD at 96fps. It comes with optional goggles designed to offer first-person views shot from the quadcopter - something that could contravene CAA rules, which state that drones must be controlled in line of sight. It costs £800 and is in direct competition with GoPro’s new Karma.
GoPro Heros have been the camera of choice for many drone users at the budget end, and Karma (costing £1,000 including the Hero5 Black camera) also comes with the GoPro Plus cloud storage service and Quik editing app, in an attempt to simplify drone video production.
Reports suggest rising demand for flying these types of drones as props within film and TV storylines.
Aside from technology advances, perhaps the biggest shift in UAV usage may come from tightened regulations. While basic national safety rules apply (the CAA permits drones weighing up to 20kg to be operated within 150m of a congested area only with a licence), rules differ across the EU and the European Aviation Safety Agency believes a number of key safeguards are not being addressed coherently. It proposes to standardise rules across the EU, though whether the UK will find itself exempt post-Brexit remains to be seen.

Tuesday, 25 October 2016

Sky VR: ‘A smash-up between gaming, internet and broadcast’

Sports Video Group
José Mourinho’s fêted encounter with Chelsea fans, his first since leaving the club as manager last December, was captured in intimate glory in virtual reality. For Sunday’s match with Manchester United at Stamford Bridge, Sky fielded multiple VR rigs. The filming followed a test run during the Chelsea versus Leicester City game on October 15 and will feature in a VR documentary being made by Sky VR in association with the Premier League and destined for the Sky VR app around Christmas.
Sky VR is also hoping to continue its up close and personal relationship with British heavyweight Anthony Joshua at his upcoming world title fight with Wladimir Klitschko (recently delayed until 2017 due to a Klitschko injury).
The boxer’s last bout, against Charles Martin, was also filmed by Sky VR. “Of all the sports we’ve shot, the footage of him is my favourite,” says Richard Nockles, creative director at Sky VR. “We’d like to create a really intimate piece about him. We’re crossing our fingers that the request [to film] will be approved.”
During April’s clash with Martin, Joshua delivered the punch which floored the American in the corner where Sky’s VR team happened to be. It’s the kind of luck needed in sports coverage, but was only possible because of a lightweight Steadicam used by Nockles’ team.
“We used gyro-stabilised carbon fibre poles to float a camera next to Anthony Joshua,” says Nockles. “We were able to walk with him into the ring which, from a viewer’s perspective, is nothing short of magical.”
Similar systems were used at Chelsea. “We’re trying to capture what it’s like to go to a Premier League game and transport viewers into the heart of it. When done well VR can blow people away.”
Evolution of a VR vision
Nockles has been working in the medium for six years, setting up Surround Vision, a Shoreditch-based full-service VR/360 production company, in 2013. Since February he has been seconded to Sky to help the broadcaster shape the output of its in-house 10-person production unit, Sky VR Studio. It has been a period of experimentation which has seen Sky VR trial VOD and live streams at events including soccer, cycling and boxing.
“Before Sky, I’d always been on the sidelines of major events, even working for major brands in the ad industry, but now we have access to some of the biggest sports events in the world,” he says. “We have the luxury of taking that access to find ways of creating something truly dynamic, going over the barriers and taking the audience right into the heart of sport.”
With Sky VR he has made two F1 VR experiences produced with Formula One Management and Williams Martini Racing. Each a couple of minutes long, they ran on Facebook, taking viewers into the pit lane, team garages and out onto the track, and generated 2.5 million views apiece.
Most of Nockles’ work is focused on short form VOD, but he is also working with Sky’s tests of live VR streaming.
“When shooting live the majority of camera positions will be cabled because of the need to return feeds from the camera back out to the cloud,” he says. “We’ve still yet to really see a robust wireless live solution for live 360.”
Sporting differences
While certain sports, like boxing, lend themselves to the greater intimacy which VR promises, Nockles acknowledges that this is trickier when covering sports with a large field of play.
“The intimacy is lost but your motivation for viewing is different,” he says. “Most football fans go to watch a game of chess happening in front of them. When I go see Manchester City I love watching the movement across the entire pitch on and off the ball. A lot of the time you can’t see that the way soccer is conventionally covered but with VR you can.”
Sky is experimenting with a live solution where viewers are able to play with content by “pushing in” to a feed and switching from 360 to traditional feeds. “This becomes more of a playful experience,” says Nockles. “It’s all about testing to see how audiences like to engage.”
That’s important because the more engaged the viewer is the more likely they will want to spend longer in the experience. Just now videos of 2-5 minutes are optimal, though this is also a function of uncomfortable headgear.
“The weight of the first generation headsets is a problem,” he suggests. “Sony PS4 VR is a better design since it places the weight of the goggles on the head rather than the face.
“I bet if we look back even a year from now we’ll be laughing at the size of headsets. VR will develop like the mobile phone industry – we’ll quickly get smaller headgear with sharper, crisper screens.”
Gear specifics
Although Sky is an investor in VR camera developer Jaunt, Nockles says he is agnostic about kit, and is able to use “whatever is going to do the best job”. His personal preference is for smaller cameras which have less impact on the scene itself.
“If I can stash a camera so that no-one can see it, it means I can get into the heart of the action. Ideally the industry will develop cameras that are as small and as unobtrusive as the microphones placed in front of speakers at a press conference. That’s how close we need to be.”
Sky VR uses the Adobe Creative Cloud for editing and GoPro’s Kolor for stitching, although Nockles is also open to using tools from The Foundry and others. He prefers to work in mono rather than stereo largely because 3D doubles the workflow and production costs.
“Part of my role is to analyse the costs involved in a job, asking whether it’s worth doubling the effort to get a wow factor and engagement. There are some excellent examples of 3D and there’s a lot of effort being invested in both styles. There is no right or wrong way – it depends on the creative direction of each project.”
For VOD content, Nockles combines conventional static and moving camera shots, typically filmed from ceiling- or floor-mounted positions, with floating Steadicam shots for more dynamic movement. The team has used zip lines to propel a VR camera alongside the action and put 360-cams on drones, notably covering the build-up to this year’s Tour de France.
Live presents added difficulties, notably in avoiding motion sickness for viewers watching fast action. Putting VR cams on F1 cars as they corner at 150mph is one such red flag.
“I would absolutely love to see drone racing (a sport which Sky plans to broadcast) in VR but we have to work out how to make that safe for the viewer,” says Nockles. “We are trying to push the boundaries of content production at the same time as creating safe VR. Perhaps we can run a series of ziplines alongside the drones?”
Graphics are another huge asset in VR “for transitions and overlays, for social media interaction and as a way to hide the crew,” he says. “There will be a huge smash-up between gaming and the worlds of internet and broadcast. Augmented reality will be a part of this experience.”

Saturday, 22 October 2016

Nanocosmos' H5Live Player Offers Reduced Latency in Live Streaming

Streaming Media 

Berlin-based Nanocosmos has taken its core codec technology and developed a suite of tools for end-to-end live streaming and claims 0.5- to 2-second latency for cross-platform playback including on HTML5 browsers.
It recently debuted the patent-pending H5Live Player, which modifies the transport stream to enable playback by HTML5 browsers as well as iOS, Android, Windows, and MacOS platforms.
This was the result of feedback from its introduction in 2015 of WebRTC.live, an HTML/JavaScript client enabling plugin-free streaming directly through the browser (Google Chrome, Firefox, and Opera, with Microsoft Edge planned).
"After launching WebRTC.live we found customers increasingly asking us how they could create live end-to-end streaming services including a server environment," explains co-founder and CEO Oliver Lietz. "That's when we decided to create our own backends for live streaming. We can offer to take care of any worries a customer has about installing server products, configuring different streaming encoders. Scaling is another challenge for companies which are rapidly growing their online audience. They need to manage multiple ingest and output streams and yet still be able to perform instant live streaming with as low latency as possible.
"We felt existing solutions—DASH or HLS—were not really suitable because of the native latency of 10 seconds or more in an end-to-end live stream," he adds. "So we decided to create our own player software for mobile devices based on RTMP.
"H5Live runs on any platform and is a great replacement for existing Flash players based on RTMP," says Lietz. "It is a client/server solution based on the HTML5 Client and the H5Live Server. It works similarly to MPEG DASH but keeps end-to-end latency below 1-2 seconds."
Users can connect to any RTMP live stream from their existing live streaming workflow, or create a complete end-to-end live streaming solution from a camera or screen with nanoStream software services.
nanoStream is the firm's cross-platform live encoding and playback technology. Its SDKs and apps enable different integrations of CDN and server for live video delivery.
The firm's new bintu.live stream management platform enables users and developers to create, tag, and group streams.
"Businesses can either use ready apps or create their own branded apps with our help, and use the bintu.live streaming service for secure streaming and stream tagging," says Lietz. "For enterprise-level security, we offer both cloud-based and on-premise streaming servers."
A real estate company in Australia is already using the suite to power a live-streamed property auctioning business, where latency of around a second is critical to ensure bids are legally binding as the hammer falls. First-responder fire and police services in the U.S. and in India are using the software to live stream multiple views from mobile devices at incident sites.
Users can also add drone video feeds by sending live video direct to the cloud (from DJI or Parrot BeBop drones, for example, as well as GoPros) or sharing it through social media.
At IBC in Amsterdam in September it demonstrated connecting and playing a VR live stream from Orah's 360° camera, with its mobile RTMP player for iOS and Android, and the H5Live HTML5 player.
Its nanoStream technology powers Stringwire, the live video streaming platform from NBCUniversal News Group and Comcast, which supports Periscope-style citizen journalism.
TalkPoint, a provider of cloud-based webcasting software and part of the PGi group, also uses nanoStream codecs for its web video conferencing applications.
German broadcaster RTL is another customer, using Nanocosmos codecs for classic content preparation scenarios.
Such heavyweight customer endorsement shouldn't be surprising for a company with nearly 20 years' experience. It started in 1998 as a software engineering company focused initially on MPEG-1/2 codecs, with a strong R&D background built in cooperation with the renowned research institute Fraunhofer HHI. It later moved into high-level encoding and streaming APIs for desktop and browser applications.
Privately funded, the company has 15 permanent staff and operates a licensing model for its software.
"Our aim is to create and deliver live end-to-end low latency and cross-platform streaming services which can be run as white label," says Lietz. "Increasingly we will do that on the cloud to enable services to be accessed worldwide. We would also like to grow our partnerships with CDNs and other key vendors. Long-term partnerships with close customer relation, support, and consulting services are a decisive part of our offering."

Tuesday, 18 October 2016

I Am Hardwell

 Total Production International

The heart-pumping BigCityBeats Virtual World Club Dome, which hosted the finale to the DJ’s two-year world tour at Germany’s Hockenheimring, featured an ambitious Ultra HD multi-camera production live streamed to Hardwell’s global fan base, in an adrenalin-fuelled event led by VPS Media and NOMOBO.

p26 https://issuu.com/mondiale/docs/tpioct16_digitallr/1

‘Imma kick lips, tonight / We're gon' shove our ass, tonight / I hope I remember, tonight’. And what a night - or 48 hours - it was for superstar DJ Hardwell and 25,000 fans at the Hockenheimring in deepest Rhine country.

It was the DJ, record producer and remixer’s finale to two years on the road with the I AM Hardwell World Tour which has seen him play to sold-out shows in Jakarta, Singapore, Bangalore, Mexico City, Cape Town, Tel Aviv, Guatemala City, Manchester, Sao Paulo and Sydney.

On 27 August 2016 the Hockenheimring, more famous as the biennial home of the German F1 Grand Prix, was bathed in the light of electronic dance music with the BigCityBeats Virtual World Club Dome - and the heat of an extremely warm summer’s day. Temperatures reached a peak of 35°C and didn’t relent throughout the weekend.

German video production agency VPS Media has worked closely with BigCityBeats since 2001 and is tasked with AV production at most of their events, including live production and aftermovie / trailer productions.

For the BigCityBeats Virtual World Club Dome and the tour finale of “I Am Hardwell – United We Are”, BigCityBeats wanted something extra special. This meant not only giving the spectators an unforgettable evening but also sharing the experience with DJ Hardwell’s global fanbase. With no compromise on production values, VPS Media translated BigCityBeats’ call into a multi-camera high-resolution capture, combining a live-switched 4K/UHD recording for on-demand and aftermovie purposes with an HD encode streamed live to YouTube.

“This show is definitely the first time ever an EDM dance event has been captured this way,” says Constantijn van Duren, Executive Producer and Managing Partner at NOMOBO, a Netherlands-based live event music producer specialising in EDM. NOMOBO itself was commissioned directly by Alda Events to produce the broadcast side of things - the 4K live capture and live broadcast on YouTube.

The combination of NOMOBO’s fly-away production kit (including many Blackmagic Design components), used with ARRI Amira cameras for a live multicam production, has never been employed before.

Explains VPS Media producer Björn Aßmus, “In early 2016, BigCityBeats approached us with the request to provide technical equipment and crew for video production at the BigCityBeats Virtual World Club Dome. NOMOBO were charged with responsibility for the live production and were bringing their own infrastructure, glass fibre connections, live studio and FOH equipment. They asked us to provide most of the cameras and camera crew for the live production, a video mixing panel, as well as some of the streaming equipment for the playout to social media. VPS managing director, Andreas Schech, was in charge of the planning phase.”

The thrust of the brief was to produce live multicam coverage of the complete show in 4K/UHD, to be broadcast on YouTube and Facebook (in 2K), while also playing out parts of the show to SWR (the German public TV network). SWR was on site with a satellite news gathering unit ready to beam pictures across Germany. In addition, artist interviews and aftermovie material were to be shot in and around the event with a small documentary camera crew.

Having supplied and produced the AV for BigCityBeats’ EDM shows and festivals over many years, Aßmus says the chief difference with this show was the open-air setting. “The major events we have covered in the past were almost always indoor events in stadiums, so this time all equipment had to be rain-proof and protected from intense sunlight.”

Did we say it was hot? You never want rain, of course, but the heat presented its own challenge. “Given the extreme temperatures, it was an interesting challenge for both personnel and equipment,” says Aßmus. “The extreme atmospheric conditions, combined with an event taking place on tarmac in bright sunlight, were a challenge not only for the camera crew but also in terms of preserving the equipment. We had to be especially mindful of managing the schedule so that our crew could capture footage all day while also resting enough. Preserving the equipment was crucial for a reliable and failsafe production, especially for this live coverage.”

A black fabric cover was jury-rigged for the main FOH camera and camera-op, while sun-absorbent black umbrellas shielded other kit and crew. “When they weren’t shooting, the camera-ops placed their cameras under the stage and crane, in shade, at a central location.”

NOMOBO’s production team travelled from its HQ in Amsterdam and VPS Media from its offices near Frankfurt. Everyone arrived on the Thursday prior to Saturday’s show and performed an extensive test combining cameras, lenses and the fly-away production kit. Friday was a full setup day.

As you might expect of an F1 race circuit (used just a month earlier for the German Grand Prix), the area behind the paddock and pit lane at Hockenheimring provides plenty of space usually occupied by the racing teams, which meant short distances between FOH, stage, live mixing workstation and production office - a very convenient arrangement. Placing the production offices and mixing workstations in the pits was especially advantageous: mixing and production staff were very close to the stage while still avoiding the high temperatures outside, which also helped protect the equipment.

Aßmus explains how they managed preparations and rehearsal: “NOMOBO developed the camera layout and approached us with their production request. We went through revisions together, made recommendations regarding certain components, finalised the tech plan with them and briefed our camera crew accordingly. On location, our director of photography, Alexander Weber, got in touch with NOMOBO’s live broadcast director, Christian Laurman, and they conducted a briefing with our camera crew. During the final hours of the day, we defined our workflows for footage logistics, data management, and communication between our cameramen and NOMOBO’s staff.”

At EDM shows it is crucial to focus on the interaction between the DJ and the crowd. The BigCityBeats Virtual World Club Dome with the I Am Hardwell concert was no exception. “What you experience with EDM is a dramatic curve in the music that motivates the crowd to be active and to react to the cues given by the DJ by musical means or by shout-outs,” explains Aßmus. “Capturing the essential moments in this interaction and creating a narrative, like a dialogue of some sort, through careful selection of images, is the key to transporting the atmosphere to the viewer.

“It’s also important at this show to make sure we transport this glorious summer atmosphere, with people enjoying the light and the feeling of an outdoor event in perfect weather.”

Doors opened at 14:00 and the team went straight into action, covering the stream of fans into the stage area and shooting video interviews with Hardwell and support acts DANNIC, Funkerman and Kill The Buzz. After each set, backstage videos were recorded with each artist. The evening built to a crescendo at 20:30 when Hardwell lit up the stage and played three hours of electrifying energy.

The camera plan included seven ARRI Amira cameras, each operating at 4K (UHD) resolution, supplied by VPS. “We’ve got pan shots from our crane and wide-angle views of the stage to convey the scope of the show and the size of the event, and we’ve placed mobile cameras at the barriers and on stage to give a sense of presence and to show the emotions of the crowd in what we hope is an intimate way,” says Aßmus.

To support this, VPS employed Zeiss and Canon zoom lenses with a narrow depth of field, and used slow-motion recording at 200fps for the aftermovie and trailers.

Using the ARRI Amira camera sets, VPS delivered feeds from the camera positions through glass fibre connections to the central mixing suite, which was operated by NOMOBO. Live streaming technology was provided by NOMOBO sending out the master broadcast feed to its Master Control Room in Amsterdam. From there high-quality encoded streams were distributed to Hardwell's YouTube and social media channels as well as media partners in the U.S. and Asia. 

VPS Media used a separate Teradek Cube streaming encoder to stream to BigCityBeats’ YouTube channel, configured and operated from a MacBook Pro, which could also serve as a backup encoding device.

NOMOBO has built a unique fly-away production kit that can live capture any event in the world using high-end digital cinema cameras such as the Sony F55, ARRI Amira and Panasonic Varicam. The kit provides full camera control (iris, colour temperature, ND, ISO levels) as well as tally indicators, talkback, audio and camera power, all over hybrid SMPTE fibre cabling.

“Working with NOMOBO made this production a breeze, as their unique production kit made it very convenient for our camera operators,” adds VPS Media’s Schech. “Their sophisticated infrastructure and experienced crew, together with our expertise in high-quality camerawork is a combination that is hard to match.”

At Hockenheimring, NOMOBO deployed the ARRIs in combination with cine-servo lenses from Fujinon and Canon. The camera package also contained a motorised Sam Dolly track (provided by Eurogrip NL) and a two-man-operated 24ft Jimmy Jib.


More specifically on the cameras, the crew arranged one handheld Amira on stage and another mounted on the Sam Dolly. A third, also handheld, was located at the front-of-stage barrier. Between the first and second barriers, to the left-hand side of the stage, the team placed the 24ft crane carrying another Amira. A camera operator wielded a fourth ARRI nearby, while the fifth sat FOH on a pedestal. The final camera was carried around the festival to capture behind-the-scenes footage and artist interviews.

Friday, 14 October 2016

Boom time for UK studios

Broadcast

The impact of Brexit has, in the short term at least, increased demand for space as TV projects and feature films battle it out while studios expand capacity. http://www.broadcastnow.co.uk/features/boom-time-for-uk-studios/5110280.article

The UK might be mired in economic uncertainty, but studios are confident that their recent boom can continue.
“To put it cynically, the exchange rate post-Brexit has done us a favour and, if anything, intensified the volume of enquiries, particularly from the US,” says Film London chief executive Adrian Wootton.
Since 2013, the UK has gone from having almost no alternative studio space to ramping up space at The Bottle Yard, the opening of Pinewood Cardiff and Church Fenton in Yorkshire – not to mention three new stages at Leavesden, multiple warehouse conversions like the one in Hayes for Tiger Aspect’s Fortitude, and additional stages coming on stream at Pinewood HQ and in Belfast.
“Production is cheaper than it was before,” says Wootton. “In truth, we’ve done really well to manage capacity, but we do want more, from London to Scotland, because we know the demand is there. I don’t think we’re tapped out in terms of demand or potential for more infrastructure.
“We’re being very proactive and aggressive about marketing UK studios.”
In November, two massive sound stages, comprising 66,000 sq ft, will open in Belfast.
Costing developer Belfast Harbour Commission £20m, the soundproofed North Foreshore Film Studios are bespoke builds, not retrofits of existing warehouses, and can be split for two entirely separate productions or joined to create one workspace.
High-speed internet pipe Project Kelvin provides access to the US east coast, for studios wanting to screen dailies.

“The UK reached a tipping point a few years ago where there just wasn’t enough bespoke stage space to keep pace with the rise in TV drama,” says Andrew Reid, head of production at NI Screen.
“The Harbour Commission has had the foresight and confidence to address that demand.”
Indeed, there are already plans to build a further 42,000 sq ft. “We’re investing more in training and bringing more people into the region to service anticipated new productions,” says Reid.
Belfast’s capacity is at its limit with Game Of Thrones installed in the eight acre (64,000 sq ft) Titanic Studios.
The latter has added another two sound stages of 21,000 sq ft, and there is also the Linen Mill half an hour to the south.
NI Screen also leases three studios at a former Britvic distribution hub, comprising 86,000 sq ft, to productions including Fox feature Morgan and BBC drama Mother And Other Strangers.
Studio owners almost universally report a boom that shows no signs of stopping.
Bristol’s Bottle Yard “has had its best business year yet”, according to site director Fiona Francombe.
Although Deal Or No Deal is no longer a fixture, shows including Company Pictures’ Starz drama The White Princess and Netflix and E4 co-pro Crazyhead are shot there.
Mammoth Screen’s Poldark shot series three in Tank House 1 and 2, with the entire production co-ordinated from the Yard’s offices.
“We’re looking to invest next year to bring more spaces into use,” says Francombe.

In East London, “business is busier than ever for occupation and future bookings”, reports 3 Mills head of studios Tom Avison.
While MasterChef cooks away on site, the main driver is high-end drama, notably series three of E!’s The Royals, for which 3 Mills has pencilled in further series.
In Manchester, The Space Project plans to double capacity by September next year.
Benefiting from a £14m investment from the City Council, Outer Space will include a 30,000 sq ft stage, 10,000 sq ft set construction workshops and 40,000 sq ft of business units.
This follows the first full trading year of drama hub The Space Project, whose productions included Cold Feet, The A Word and Houdini And Doyle.
“Netflix and Amazon have driven exponential growth in the amount of content the industry has to create, and someone has to create space in which that is made,” says founder Susan Woodward.
“We are gaining a reputation for repeat business – Dragon’s Den is signed for another run – and we’ve had interest from American producers keen to locate outside London.”
Having been sold by Avesco to property developer Quintain in January, Fountain Studios will close in December, taking with it one of the biggest fully equipped TV studios in the country.
But there was some solace this week when London mayor Sadiq Khan announced plans for the capital’s first major new TV and film production studio in 25 years.
The London Local Enterprise Partnership and Barking and Dagenham Council are spending £80,000 putting together a business case for the proposed new Dagenham site, led by Film London, which they estimate could bring in more than £100m in UK spend and attract international productions.
Meanwhile, the new owner of Pinewood Group, US real estate firm Venus Grafton, paid £320m for the legendary brand and seems prepared to expand the business.
Five new sound stages, totalling 170,000 sq ft, opened in June, marking phase one of a wider £200m development.
The new owners will inject capital to kick-start the second phase of expansion, totalling a further 170,000 sq ft, and there’s a “masterplan” to redevelop the 80-year-old Shepperton site, says director of strategy Andrew Smith.
“Capacity has been constrained for many years and the UK has had to turn away business,” he says.
“Our five new purpose-built stages will allow Pinewood to accommodate two additional large inward-investment films this year.”
Films that might have shot in the UK but for a lack of space include Fox’s Alien: Covenant and Disney’s Thor: Ragnarok; both rerouted to Australia.
Elstree will unwrap a 21,000 sq ft stage late next year, increasing site capacity by 30%, with a further 39,000 sq ft planned.
Strictly Come Dancing is lodged in Stage 2 and Netflix’s The Crown will be back at the end of the year.
“The studios are probably going through their most successful period ever, but there’s still a shortage of suitable stages in the M25 London area.
“That’s where most clients want to work, and where the majority of crew, cast and skills are,” says managing director Roger Morris.
“We already have clients who wish to use our new stage once it’s built, but more is needed.”
That’s where new space in Liverpool might come in.
Developer Capital & Centric is to build a £30m, 11,000 sq m studio near the city centre. The first phase of the project, including the soundstage, is likely to begin in early 2017, according to Liverpool Film Office, which says C&C will invite bids to run the studio.
Meanwhile, the Scottish government continues to mull the rubber-stamping of the £140m Pentland Studio development outside Edinburgh, which would bring considerably more purpose built stages to the nation.
A decision is expected within the next month and, if approved, PSL Land – the private company spearheading the studio and film academy – could break ground on the 86-acre complex as early as February for opening in the first quarter of 2018.
Plans include five stages totalling 130,000 sq ft plus an exterior water stage.
The government has already approved a 30,000 sq ft extension at Wardpark, which is where Amazon and Starz’s Outlander is shot.
Two 50ft-high sound stages will bring space there to 78,000 sq ft, but the only other purpose-built unit in Scotland is a 5,000 sq ft stage on Stornoway.
Creative Scotland is also marketing 50,000 sq ft of converted space at the BBC’s Dumbarton Studios and an additional 435,000 sq ft of pop-up space at places like The Pyramids in Bathgate, Leith’s Pelamis Building, Glasgow’s West Way and Dundonald in Ayr, where ITV’s Loch Ness recently shot.


IMPACT OF BREXIT: SO FAR SO GOOD
“In the short- to mid-term, we’re as strong as we’ve ever been, but we need to find out what the government is going to do and then see what effect that has.”
Tom Avison, 3 Mills
“Most of our customers are domestic and there’s no sense of change yet. The biggest issue is uncertainty and that’s the one thing the industry doesn’t like.”
Fiona Francombe, Bottle Yard
“The biggest concern is the freedom of movement of labour. Some aspects of the industry, notably VFX, draw heavily on skilled labour from abroad. We can’t automatically fill the vacuum from the UK.”
Adrian Wootton, Film London
“Some clients have expressed concern about future investment and obviously the lack of production investment would affect studios. However, our business model is perhaps more resilient than some and we are confident we can weather any storm.”

Tuesday, 11 October 2016

Unified Remix Combats Ad Blockers

Streaming Media

Unified Remix is a software engine that creates a single stream from a playlist created by a call to an ad network, a rule set, or by CMS generation, preventing ad blockers from identifying ad server URLs.

Unified Streaming has launched Unified Remix, a solution to counteract ad blockers. Server-side ad insertion with Unified Remix can "significantly reduce lost impressions" claims Unified Streaming CTO Arjen Wagenaar.
"Ad blockers are browser plugins that have been configured to block known ad server URLs," he explains. "They are designed to notice when a playlist is edited and manipulated. When a browser makes an ad request, the ad blocker will simulate an error from the ad server, thereby effectively blocking the ad. With our solution, even if there's a trailer at the beginning, a bumper at the end of the video, and a mid-roll, the player will see a single stream, not one stitched from different origins. When presented  to a player the stream has a single origin and a single timeline and no discontinuity: there's no way for the player to differentiate between the main content and an ad bumper."
The software is a development of the Dutch company's Unified Origin module for industry standard webservers like Apache, Nginx, Microsoft IIS, and Lighttpd. The plugin allows a web server to ingest one format (HLS, MP4, fMP4) and to package it on the fly to all formats, including HbbTV and progressive. It supports standard DRM schemes like Adobe Access, AES, Marlin, FairPlay, PlayReady, and Widevine and is in use among CDNs, media organisations, and video subscription service companies.
Unified Remix is essentially a software engine that resolves a playlist created by a call into an ad network, a rule set, or by CMS generation. It creates a reference MP4 file for Unified Origin to use as source. Unified Origin then creates a single stream from the reference MP4.
"While a playlist may be created by a player in JavaScript or via a player plugin, it can just as readily be done completely on the server side, removing all decision logic from the player," explains Wagenaar. "With Unified Remix, the playlist is based on a general ruleset defined, for example, in a CMS or by consulting a recommendation system or an ad network. The ad choice can be based on individual characteristics originating in the player or by associating content with a specific channel – allowing for targeted channels.
The product supports many use cases, such as live-to-VOD (separating live ingest from playout); live scheduling (rotating 24/7 playlists); pre-, mid- and post-roll (VOD); bumpers for everyone (VOD and live); dynamic ad insertion (VOD); and live dynamic ad replacement.
"Viewer information (e.g. session ID or cookie) can be used to personalise streams on any level," says Wagenaar. "From a personalised stream for every viewer to streams to groups of viewers (e.g. based on geolocation, subscription model) or even a single stream for all viewers (e.g. a bumper)."
A broadcaster wanting to serve catch-up TV for live channels can create an archive-to-VOD (infinite live archive) which will dynamically change if the EPG changes. "Creating a remixed stream opens new doors for content owners to monetize content but also offers new options for creating personalized streams of, for instance, live archives," adds Wagenaar. "Best of all, Unified Remix will create a stream that plays on all platforms in all formats and devices, and as such it provides a great multiplatform experience."
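To make the "single stream, single timeline" idea concrete, here is a conceptual sketch of server-side playlist resolution. It is not the Unified Remix API; the Clip type and resolve_playlist function are hypothetical names invented purely to illustrate the approach described above.

```python
# Conceptual illustration only: flatten a server-side playlist into one continuous
# presentation timeline, so the player sees a single origin with no discontinuities.
from dataclasses import dataclass

@dataclass
class Clip:                 # hypothetical type for this sketch
    source: str             # e.g. "trailer.mp4", "main_part1.mp4", "mid_roll_ad.mp4"
    duration: float         # seconds

def resolve_playlist(clips):
    """Assign each clip an offset on a single, continuous timeline."""
    timeline, offset = [], 0.0
    for clip in clips:
        timeline.append({"source": clip.source, "start": offset, "end": offset + clip.duration})
        offset += clip.duration
    return timeline

playlist = [Clip("trailer.mp4", 15.0), Clip("main_part1.mp4", 600.0),
            Clip("mid_roll_ad.mp4", 30.0), Clip("main_part2.mp4", 600.0)]
for entry in resolve_playlist(playlist):
    print(entry)  # an ad blocker sees only one stream URL and one unbroken timeline
```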
The product runs on an annual license per web server.

Amazon is Coming: Why Amazon will be big at IBC2017

IBC
Next month Amazon will unleash Clarkson, May and Hammond with the worldwide debut of The Grand Tour, the much-anticipated new series from the former Top Gear team. You couldn’t get a more vocal or headline-grabbing act, with a ready-made global audience accumulated over many series under former masters BBC Worldwide.

http://www.ibc.org/hot-news/amazon-is-coming-why-amazon-will-be-big-at-ibc2017
It is an ideal show with which to herald Amazon’s arrival as a major TV player - if indeed it needed further trumpeting.
“The GAFA are coming,” warned WPP CEO Sir Martin Sorrell, in his IBC2016 Keynote, referring to the media ambitions of Google, Apple, Facebook and Amazon.
He highlighted the e-commerce giant as the key one to watch. “While all the focus is on Google and Facebook, the big one coming is Amazon,” he warned.
WPP represents, by Sorrell’s reckoning, between a quarter and a third of the total advertising media services market. On behalf of its clients WPP will spend $5.5bn with Google and another $1.5bn with Facebook this year. He would welcome Amazon as a serious challenger. 
“Google and Facebook account for 76% of ad growth,” he said. “We will be supportive of Snapchat, AOL/Yahoo and others [in their efforts to grow their share of digital advertising]. A duopoly is not what our clients want.”
However, Amazon will represent a serious threat to legacy media organisations since it has the deep pockets to spend on new exclusive content. It will also give Netflix, runaway SVOD leader, a run for its money.
Juniper’s Digital TV & Video: Network and OTT Strategies 2016-2021 report predicts that revenues from SVOD services including Netflix and Amazon are set to grow from US$14.6bn this year to US$34.6bn in 2021 as consumers in more countries move to non-linear video consumption.
The world’s largest online retailer has already doubled the amount it spends on content this year in an effort to attract more customers to Instant Video, part of its wider Prime service. Original content plays a key role in converting free trials to paying subscribers and driving consumers to Prime.
For example, the company’s German division looks set to pick up rights to the second instalment of hit drama Deutschland 83, after original broadcaster RTL declined to recommission. German TV trade site dwdl.de reports that Amazon may also top that with a third series, Deutschland 89.
Amazon is also recruiting a head of programme commissioning in London, signalling its willingness to invest millions in more British productions, according to a report in The Telegraph. The high-profile, London-based role will build on the eight series Amazon has so far commissioned in the UK, making it easier for British and European producers to deal with Amazon. Currently all its programme-making decisions are made in LA.
In order to differentiate itself from Netflix and Facebook and to prevent Apple getting in on the ground floor, Amazon is also believed to want to move into live sports.
A report from Bloomberg suggests that the Seattle-based retailer is on the hunt for rights to a number of high-profile sports events, including the French Open tennis championship, rugby and football games, and basketball matches.
According to Bloomberg, Amazon has hired Sports Illustrated executive James DeLorenzo to lead its sports division, as well as YouTube executive Charlie Neiman to manage sports partnerships and business development.
Such a move would tally with the company’s acquisition of encoding specialist Elemental Technologies a year ago to give it the basis to launch a live OTT solution.
Much has been made of the use Netflix has made of analytics - rather than conventional TV demographics - to determine content commissions and serve targeted recommendations to consumers. Arguably, Amazon has even more data about consumption habits based on purchase behaviour. Plus, it can make use of data culled from VOD service Lovefilm and movie and TV database IMDB, both of which it owns.
Data based on behaviour is arguably more precise than demographics, the theory being that if you have all this consumption data, you can market the right show to the right customer. The more programming that is inside Amazon’s app, the longer people are going to spend with the site.
In a recent advert simultaneously promoting The Grand Tour and Amazon’s connected TV stick Fire, Jeremy Clarkson launches a fleet of drones - surely symbolic of a pre-Christmas deluge of new subscribers signing to Amazon’s TV package.

FPV drone racing profile to increase as barriers to live broadcast are broken down

SVG Europe
As a sport, drone racing is still in its infancy, but the combination of video game-style esports action and a millennial fanbase has enticed major broadcasters to invest. The problem is that the video feeds currently used to transmit the exciting first person view (FPV) from onboard the remote controlled craft are SD and carried on an analogue frequency — making live broadcast tricky.
Technology to be demoed live at CES in Las Vegas in January 2017 will showcase HD feeds streamed from multiple drones and taken straight to air. Forthcoming broadcast coverage, however, will be post-produced. ESPN, Sky and ProSiebenSat.1 have all recently signed rights deals for the Drone Racing League (DRL), in which contestants race drones through empty malls, stadiums and subway tunnels. Eurosport is also in talks to show drone racing, according to chief executive Peter Hutton, speaking to Reuters.
Under the deals with the broadcasters, 10 hour-long episodes are being prerecorded, featuring six contestants and giving viewers a video feed from the cockpit.
“Drone races will be the Formula One of the future,” said Zeljko Karajica, who heads ProSiebenSat.1’s sports business 7Sports, in a statement. “It’s the perfect combination of physical racing, eSports and virtual reality.”
While aerial photography UAVs are designed to hover in place and carry a camera with a gimbal and GPS-assisted flight, racing drones are small, typically 250mm in size and usually built in an H-style configuration, with front-mounted cameras. These cameras transmit analogue video in the 5.8GHz band at a resolution of 480-600 TV lines (TVL). The signal returns to a base station or receiving device which feeds the goggles worn by pilots to see and control the drone in flight. The same feeds could be used for mixing into the broadcast or web stream — but will be SD. Boost the data and the latency climbs — and that’s lethal for drone racing.
“Even a delay of a couple of seconds is enough for piloting to become near impossible with crashes bound to happen,” says Marque Cornblatt, founder and CEO at Aerial Sports League (ASL). “The downside is that the signal is not pretty. It’s good enough to fly by but not to broadcast.”
Where ASL races have streamed live on the web the SD feeds are used in a quarter or third of the screen with graphics and statistics filling up the rest of the real estate. A further hazard is signal interference such as reflections of the signal off walls and metal (especially if flying indoors) and multi-pathing caused by conflicting signals from neighbouring drones.
Since, in many competitions to date, it is the pilots themselves who build their drones DIY-style, including integrating the FPV cameras, one pilot’s set-up can inadvertently interfere with the signal of a competitor. There are no standards yet in this nascent sport.
Race ready with less breakup and no multi-pathing
However, drone racing producer FPV Live is on the cusp of introducing a digital HD technology “which will revolutionise our ability to broadcast live,” claims co-founder Todd Wahl, who handles business development. It is working with Israel-based wireless links developer Amimon, which has adapted the technology used in its Connex lightweight receiver/transmitter specifically for drone racing.
ProSight, released in June, includes a 720P HD camera, plus the transmitter and receiver. FPV Live has tested the unit on board eight competitor drones flying simultaneously at the MultiGP Championships in Muncie, Indiana in September.
“We’ve proved we can be race ready with little to no delay at a higher definition with less breakup and no multi-pathing,” says Chris Toombs, founder and technology manager at FPV Live. “It’s not exactly HD, it’s lower than 720p so we call it high quality rather than HD, but it’s certainly a better image than analogue.”
FPV Live and Amimon plan to provide a live demo of a drone race streamed in HD at CES 2017. Despite the challenges of maintaining a digital signal in FPV drone racing — rapid maneuvering, obstacles and low flight — Amimon says ProSight provides zero-latency HD streams.
“In digital you can have a really tight frequency position because of the encoding,” explains Toombs. “Each pilot’s goggles are plugged into the base station receiver and bound directly to a signal that can only be transmitted from that pilot’s drone.”
Aside from wireless issues, it’s also tricky covering such a fast paced sport where tiny drones are zipping around at 80-140MPH. “You would need professional camera-operators trained in drone racing to be able to track a drone in flight and directors also versed in drone racing to understand when to switch between wider coverage and first person views,” says Cornblatt.
He doesn’t believe drone racing is yet ready for broadcast primetime. “The real experience is more akin to watching or playing a video game but the current drone racing viewing formats are built around old school models like NASCAR and Formula 1. There’s no harm to that but I think the format needs to evolve and take cues from esports and video gaming culture.”
In particular, Cornblatt thinks that when AR and VR mature, these technologies will converge with drone racing. “We’re doing some engineering projects to converge the worlds of gaming and drone racing,” reveals Cornblatt. “Being able to user-select an FPV camera angle and pilot, for example, seems like a natural to me.
“I definitely see ways we can bring the audience into drone racing and making them part of the experience. This could be as simple as using social media voting for a favourite pilot to give their drone a power boost, but we are also looking at other ways to involve an online audience. For example, can pro-pilots or even fans at home control a drone when they are not present at an event? Flying via the web yet racing in real space is a distinct possibility [widespread in the military] for incredible user interaction.”
Shortcut to serotonin and dopamine system
Self-described as “one part X-Games, one part NASCAR and one part Mixed Martial Arts”, the ASL specialises in producing live events (such as Game of Drones) and has chapters in Las Vegas, Austin, Amsterdam and Shenzhen, China. ASL’s two-day live event Drone Sports East at the World Maker Faire in New York last week drew 100,000 spectators.
“Drone racing is wildly popular among pilots, but the viewing audience has been slower to develop, primarily because the DIY underground culture of drone racing has not produced any events with the needs of an audience in mind,” says Cornblatt. “Aside from a few YouTube videos, few have experienced the thrill of a pilot strapping on FPV video goggles and entering an athlete’s out-of-body ‘flow state’.”
He describes this as “a shortcut” to the brain’s serotonin and dopamine system, which “acts as a drug for pilots who quickly become ‘addicted’ to the FPV flying experience.”
“This is great for the pilots,” he adds, but does nothing for the spectators, who are left watching tiny machines flying off in the distance – not particularly thrilling. “A real challenge is to create an event that will serve the live audience as well as those watching online.”
One way the ASL does this is to make the live event ‘theatrical’ by for instance having gates light up as drones pass through them; to project a mixed feed onto large screens on-site; and to install viewing stations with goggles at the event for fans to view the cockpit FPV.
“If you look at esports as a model then the audience online is growing well past 100 million (globally for 2015) but you can’t disregard the enormous popularity of the live esports event where audience numbers rival that of a Super Bowl. The same is happening in drone racing.”
More edgy and off the cuff than ESPN
FPV Live produces events such as Mega Drone X, held underground at the Louisville Mega Caverns. Its broadcast setup features low-cost prosumer cameras like the Canon G20 and small network-based HD cams placed on gates or at the commentary position (usually occupied by Joe Scully, race director at FPV Racing Events and “the voice of FPV racing”, according to Wahl).
FPV Live simulcasts to YouTube, the host organisation’s website and its own site, and is working to deliver feeds to Periscope and Facebook Live. It sees its audience as millennial. “These fans are interested in this raw behind-the-scenes action rather than the slick and polished world of traditional broadcast. We are a little more edgy and off the cuff [than an ESPN],” says Wahl.
He elaborates: “Although we have a production sheet [schedule], we interact with our audience via live chat and have them influence the live production. They can ask questions of pilots or request different camera angles. Education is the biggest hurdle. Commentary and colour help, but we want to make fans based on information, so we interview drone manufacturers about the technology. We want them to feel they have the power to interact with us.”

Tuesday, 4 October 2016

The limits of perception

AV Magazine

Director Ang Lee’s new film Billy Lynn’s Long Halftime Walk is pushing cinema frame rates to an unprecedented 120fps - territory where proAV projection has been operating for decades.



The November theatrical release of Billy Lynn’s Long Halftime Walk has a groundswell of awards buzz because of its unprecedented visual format: 4K resolution, 3D and a speed of 120 frames per second (fps, also denoted as 120p or 120Hz).

No major motion picture has ever been released at a rate of 60fps, let alone 120fps, the nearest being Peter Jackson’s The Hobbit, a 48fps 3D 4K film released in 2012.

Nor is this some experimental sideshow. Its director is the two-time Oscar winner Ang Lee (Brokeback Mountain, Life of Pi) and the $50 million Sony Pictures distributed Iraq war drama stars Kristen Stewart and Vin Diesel.

Yet higher frame rates have been relatively common in proAV for decades - and demand is steadily increasing, according to Barco’s Maria Dahl Aagaard, strategic marketing manager, simulation.

“This is especially evident within the projection industry and several of the high-end markets it serves, such as simulation and high-end visitor attractions,” she says. “Within these sectors HFR helps improve the viewing experience by providing smooth, lifelike images and suppressing motion blur artefacts. Think fast-jet and racing car simulators, roller coaster dark rides and also museum-based astronomy shows. These trends are filtering into other important market segments such as rental and staging, and more mainstream AV as well.”

Historically, expectations have been much lower for the consumer market, which has primarily been driven by the movie and TV industry, where films have been shot at 24fps - an artefact of Hollywood’s decision to accommodate the optical soundtrack 90 years ago.

Its shortcomings include blurry images - particularly noticeable in fast panning movements - and strobing, immediately apparent in stereoscopic, large-format movies.

For this reason HFR is widespread in theme park 3D and 4D attractions. “When you immerse an audience in front of a large screen the fast motion can create kinetosis (motion sickness),” explains Bryan Boehme, director of sales for location based entertainment, Christie. “Humans can perceive motion at 960fps so if we can bring rates up to 120 this smooths the image out.”

The current limits of projection technology can push frame rates up to 240 and even 480fps if resolution is reduced from 4K to 2K and 3D is neglected. “Resolution is actually perceived to be greater at higher frame rates,” says Boehme. “It’s like looking through a window in comparison to the representation of an image you get with 24fps.”

One problem: making content at such extreme frame rates is barely practical. Ang Lee is pushing production boundaries with his film, which consumed 40 times as much data as a conventional film.

Moreover, to shunt the data through the projector fast enough, some serious horsepower is required, which is why Christie’s Mirage is one of a handful of machines capable of it. It can process data at 1.2 gigapixels a second with a Trulife electronics processor and also uses Christie laser technology to throw 28fL (72,000) on screen. Even then, two of them are needed to show Lee’s picture at 120fps for both eyes.
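The arithmetic behind that dual-projector requirement is straightforward. The sketch below assumes a DCI 4K frame of 4096 x 2160 (an assumption, since the article does not state the exact raster) and uses the 1.2 gigapixel-per-second figure quoted above.

```python
import math

# Assumed DCI 4K raster; 120fps stereoscopic 3D as described above.
width, height = 4096, 2160
fps, eyes = 120, 2

required = width * height * fps * eyes          # pixels per second to the screen
per_projector = 1.2e9                           # quoted Mirage processing rate

print(f"Required throughput: {required / 1e9:.2f} Gpx/s")            # ~2.12 Gpx/s
print(f"Projectors needed:   {math.ceil(required / per_projector)}")  # 2
```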

Many cinema fans will argue for the textural aesthetics of 24fps and have criticised Jackson’s use of 48 on the basis that the picture looked too much like video.


That said, trends are changing “due to the exposure from home entertainment and visitor attraction venues. People are no longer willing to settle for second best,” says Aagaard.

PCs have always had higher refresh rates than TVs, but the latest TV sets coming into retail incorporate 120Hz panels. Once broadcasters like BT and Sky have rolled out 4K and high dynamic range Ultra HD services, they will turn their attention to upping frame rates from 50/60 to 100/120 to erase the motion blur which mars fast sports action.

Douglas Trumbull, the VFX whizz who pioneered HFR motion-simulation ride films in the 1970s with Showscan, is back with a new system. Magi Pod is capable of playing back 4K 3D at 60fps (per eye) using a single Christie Mirage and a hemispherically curved Torus screen. Trumbull intends to pre-fabricate the 70-seater Pods for quick installation into multiplexes or amusement parks.

Shooting at 120fps unlocks fine control in post over ‘the look’ of the material because the camera’s shutter angle is no longer baked into the rushes. In effect it offers a new technical and creative tool: technical because a number of exhibition options (e.g. 2K 3D 120fps, 4K 3D 60fps) can be derived from the 120fps master; and creative because a filmmaker could select frame rates for different moments within the same story, much as they’d use focal length, lighting or colour.
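A quick way to see the technical side of that flexibility: 120 divides cleanly into most common exhibition rates, so lower-rate versions can be pulled from the master by simple frame selection rather than awkward pulldown. The snippet below is a minimal illustration of that arithmetic, not a description of any particular mastering tool.

```python
# Which exhibition frame rates fall cleanly out of a 120fps master?
master_fps = 120
for target in (24, 30, 48, 60, 120):
    step = master_fps / target
    note = "clean: keep 1 frame in every %d" % step if step.is_integer() else "needs resampling/blending"
    print(f"{target:>3} fps: {note}")
```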

The only thing dragging back HFR from going mainstream is cultural inertia. “When cavemen drew stick paintings on rock that was state of the art,” says Boehme. “Imagine if they could see a renaissance painting. It would look much more realistic but it would also be so alien to them. It will take a while for us all to get out of our comfort zone.”