Monday, 18 April 2016

Faster than a ray of light

Broadcast 
Lytro’s light-field cinema camera could revolutionise film-making, while developments in holographic display are bringing Star Wars-style technology closer to reality.
Despite swapping celluloid for silicon, the fundamentals of recording images haven’t changed since the invention of photography.
New advances in light-field technology, first conceived in the 1840s, are about to change all that. Light-field is a way to capture or project rays of light in a scene.
Technically, a light field is 5D, comprising three spatial dimensions (X, Y, Z) plus two angular dimensions describing the direction of each light ray, together with its intensity.
Record all of this with enough fidelity and in theory you have a hologram. While there is currently no commercially available display capable of showing holographic video (see box), there are potential applications in using light-field data flattened into 2D to create visual effects and other image manipulations in post.
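To make the idea concrete, here is a minimal sketch of the classic shift-and-add refocus that light-field data enables. It assumes the common two-plane (4D) parameterisation used in practice — spatial coordinates (x, y) plus angular coordinates (u, v) — rather than the full 5D form, since radiance is constant along a ray in free space; all names and array shapes here are illustrative, not Lytro's actual pipeline.

```python
import numpy as np

def refocus(lf, alpha):
    """Synthesise a 2D photograph from a 4D light field.

    lf: array of shape (U, V, X, Y) — angular samples x spatial samples.
    alpha: refocus parameter (1.0 = the original focal plane).
    """
    U, V, X, Y = lf.shape
    out = np.zeros((X, Y))
    for u in range(U):
        for v in range(V):
            # Shift each sub-aperture view in proportion to its angular
            # offset, then sum: shifting before summing moves the plane
            # of focus; alpha = 1 reduces to a plain average of the views.
            dx = int(round((u - U // 2) * (1 - 1 / alpha)))
            dy = int(round((v - V // 2) * (1 - 1 / alpha)))
            out += np.roll(lf[u, v], shift=(dx, dy), axis=(0, 1))
    return out / (U * V)

lf = np.random.rand(5, 5, 64, 64)   # toy light field: 5x5 views of 64x64 px
image = refocus(lf, alpha=1.0)      # alpha=1 simply averages the views
```

Varying `alpha` re-photographs the same capture at different focal depths, which is the refocus-after-capture trick Lytro's consumer camera popularised.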
Moving rapidly from science fiction to experimentation, light-field has made it onto the agenda at the SMPTE Future of Cinema Conference at NAB and at the MPEG and JPEG committees, where a new working group on the topic, plus sound fields, has just been established.
What’s more, the company that brought light-field to the mainstream in 2012, with a consumer stills camera that enables users to refocus after capture, claims to have developed the world’s first light-field cinema camera.
“Light-field is the future of imaging,” declares Jon Karafin, head of light-field video product management at Lytro.
“It is not a matter of ‘if’ but ‘when’.” Light fields can be captured in two ways. One is to synchronise an array of cameras, each recording a different point within the same space; the other is to place a microlens array (MLA), comprising hundreds of thousands of tiny lenses, in front of a single sensor. Lytro has chosen the latter approach, building on a decade of software R&D, 18 months of intensive hardware design and $50m (£35m) in funding.
The company released the stills camera Lytro Illum and, last November, announced Lytro Immerge, a video camera designed for virtual reality that it has not yet released. These products are stepping stones to the company’s new system, the specifications of which are astounding.

Currently in alpha test and due to launch during the third quarter of this year, the Lytro Cinema Camera carries the highest resolution video sensor ever made at 755 megapixels (MP), capable of recording 300 frames per second (fps) through an MLA comprising more than 2 million lenslets.
The next highest resolution sensor publicly announced is a 250MP model in development at Canon.
By contrast, HD equates to 2MP and 4K to 9MP. The resolution needs to be vast for the system to process the unprecedented volume of information. According to Lytro, the effective output resolution will be 4K. “We are leapfrogging the leapfrog, if you will,” says Karafin. “We are massively oversampling the 2D to be able to change the optical parameters in post.
Everything that makes the image unique is retained but you can re-photograph every single point in the field of view.”
For example, the shutter angle and frame rate can be computationally altered in post. As studios move towards higher frame rate cinema (Ang Lee’s Billy Lynn’s Long Halftime Walk is shot at 120fps), Lytro thinks there’s a market for being able to render the same project at 24fps or 120fps for theatrical release and 60fps for broadcast, at the touch of a button.
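As a rough illustration of how such retiming might work, here is a sketch that averages frames from a hypothetical 300fps source into a chosen output frame rate and shutter angle. A real light-field pipeline would integrate ray samples rather than finished frames, so this is only a conceptual analogue; the function name and shapes are invented for the example.

```python
import numpy as np

def retime(frames, src_fps, out_fps, shutter_deg):
    """Resample a high-frame-rate capture to a virtual frame rate and
    shutter angle by averaging the source frames inside each exposure.

    frames: array of shape (N, H, W); returns the retimed frames.
    """
    frame_period = 1.0 / out_fps
    # A 180-degree shutter exposes for half the frame period, and so on.
    exposure = frame_period * shutter_deg / 360.0
    out = []
    t = 0.0
    n = len(frames)
    while t + exposure <= n / src_fps:
        first = int(t * src_fps)
        last = max(first + 1, int((t + exposure) * src_fps))
        out.append(frames[first:last].mean(axis=0))  # virtual shutter integration
        t += frame_period
    return np.stack(out)

src = np.random.rand(300, 4, 4)                  # one second captured at 300fps
out24 = retime(src, 300, 24, shutter_deg=180)    # rendered as 24fps, 180-degree shutter
out60 = retime(src, 300, 60, shutter_deg=180)    # same material rendered for broadcast
```

Because 300 divides evenly into 24, 60 and 120 exposure windows of useful lengths, a single oversampled master can plausibly serve all of Lytro's target output rates.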
With plug-ins for The Foundry’s Nuke available on launch, the system has a route into facilities where calibrations including depth of field refocus, tracking, stereo 3D and Matrix-style VFX can be created from the same raw information. Lytro admits the technology is “very challenging” and that work needs to be done on the software algorithms, compression system and hardware.
Nor will it be cheap. Too large to be handheld, the camera needs to be physically mounted to a dolly or crane and be cabled to a server up to 100 metres away.
Video village
The server itself is powerful enough to crunch data at up to 300GB/s. Even then, it will sit in a ‘video village’ supervised by technicians. The camera requires “a non-standard optical format” and Lytro will offer the unit with custom focal lengths.
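A back-of-envelope calculation shows why the server needs that headroom. Assuming a hypothetical 10 bits per photosite (the article does not state the sensor's bit depth), the raw sensor output lands close to the quoted figure:

```python
# Sanity-check the ~300GB/s server figure from the sensor specification.
pixels = 755e6          # 755MP sensor
fps = 300               # frames per second
bits_per_pixel = 10     # assumed raw sample depth (not stated in the article)

bytes_per_second = pixels * fps * bits_per_pixel / 8
print(f"{bytes_per_second / 1e9:.0f} GB/s")   # prints "283 GB/s"
```

At roughly 283GB/s of raw data, a 300GB/s server leaves only a thin margin, which helps explain why the camera must be cabled rather than untethered.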
The whole system, including operators, is being offered for rental. Cinematographers, already wary of virtual production technologies eating into their core crafts of camera position and light exposure, are likely to feel threatened.
Anticipating this, Lytro has incorporated a physical focus control and aperture check on the camera to give DoPs reassurance that the decisions they make on set are preserved down the line.
“There are those who swear by film as a capture medium, but for most cinematographers there is no right or wrong, just a tool that best meets the creative requirements,” says Karafin.
“Ultimately, this is a disruptive technology and we need the feedback of the studio, cinematographer and VFX community to make sure what we are creating meets their needs, as well as helping them understand the creative control this unleashes.”
Light-field-captured video may appear little different to the viewer until there are holographic screens capable of projecting a three-dimensional image.
Unsurprisingly, there are major developments here too, with companies including Zebra, Holografika and Leia3D among those with designs.
“We are not there today but we will cross that threshold,” says Karafin. “Holographic is the next generation of display technology and will truly change the way we all think about capturing native imagery.”

LIGHT-FIELD RIVALS AND HOLOGRAPHIC DISPLAY

Fraunhofer IIS
Fraunhofer researchers believe light-field technology could be a more efficient means of creating VFX in post. The institute has test-shot a live-action film with a 16-HD-camera array, but its main work has been in devising software to compute the data and route it to a conventional post environment.
A plug-in for Nuke will be available soon. “An MLA will not work so well for movies because the baseline difference between the images captured by one sensor is too small,” says Fraunhofer’s Siegfried Foessel.
“It’s good for near distance but not larger distances, and for this reason we work on camera arrays.”
Raytrix
Raytrix has designed a micro-lens array, with three types of lenslet, placed in front of a 42MP sensor to capture 2D video plus 3D depth information.
This outputs 10MP at 7fps. Raytrix markets to robotics and science industries because cofounder Christian Perwass believes light-field systems are not suitable for capturing most film scenes.

“They are workable with close-up subjects like a face, but if you want to extract depth information for scenes 10-20 metres away, you might as well use standard stereo 3D cameras,” he says.
Visby Camera
The US start-up is in stealth mode on a light-field VR system until 2017, the first part of which will be a codec.
Founder Ryan Damm explains: “The potential data stream with light-field is enormous since you need to encode and decode a 5D image at every possible angle.
To make this workable, the encoded files should not be much larger than today’s large video files.”
Leia3D
A team of former HP researchers is developing a holographic display using diffraction patterns.
Due for launch next year, the prototype is able to process 64 views with a depth that lets a viewer move their head around the image. Samsung and Apple have patents filed on similar technology. “Our patent has priority,” says co-founder David Fattal. “We are talking with them.”

It has also invented ‘hover touch’, the ability to interact with a hologram without touching the screen, and may combine this with technology that gives a physical sensation of touching a hologram, based on miniature ultrasonic transducers developed by the UK’s Ultrahaptics. “Holographic reality will be on every screen, everywhere,” says Fattal.

Friday, 15 April 2016

TV music: what’s the score?

Broadcast 

The much improved range offered by production music libraries is a bonus for programme-makers, but commissioned music can take a show to another level, says Adrian Pennington.

http://www.broadcastnow.co.uk/features/tv-music-whats-the-score/5102599.article?blocktitle=Features&contentID=42957

The natural home of bespoke composition is drama, but programme-makers of all stripes want to give their production a distinctive voice and identity, a big part of which is carried aurally.
“Composed music works really well for formatted entertainment shows such as The Apprentice or The Singles Project, where you can have it timed perfectly for a reveal or a story beat,” says Lime Pictures head of non-scripted Derek McLean. “We try to make reality shows feel more like drama.”
Ironically, music in drama has tended to become less overt as a result. “For documentaries and reality, there is often a need to create suspense and mood using a type and tempo of music traditionally associated with narrative drama, because there may be relatively little actual drama happening on screen,” says composer John Hardy.
While off-the-peg music has its place, there is a risk that it could feel generic. “You can never get the subtlety and sophistication from production music that you can get with a specially composed piece of music,” says Anne Dudley, whose credits include BBC1 family comedy Billionaire Boy.
It is the flexibility of being able to tailor, dissemble and time a bespoke track to the story that often attracts producers. “In documentaries, you need quite specific music that lifts the production but lets the footage speak for itself, rather than overpowering it,” says Grace Reynolds, executive producer at Twofour, who has used composed music on shows including Channel 4’s Royal Navy School and Educating Yorkshire. “As much as we may need a full dynamic tune, we are often crying out for simpler tracks,” she adds.
Philip Guyler scored BBC1’s National Lottery Stars and Hollywood movie Her, while his library compositions were used in Ricky Gervais’ Channel 4 comedy Derek and AMC’s zombie drama The Walking Dead.
“With library music, you can probably catch the mood and atmosphere, but you can’t react very closely to the emotion on screen,” he says. “You can move a library track around a bit, but you don’t have the flexibility to tailor it to the picture.”
The right fit
The fast-turnaround nature of some shows can make fresh compositions impractical, though a decision on musical direction will take the content into account.
“Composed music doesn’t work for TOWIE,” McLean says, because of the show’s tight editorial deadlines and the decision to feature a lot of commercial music to keep the storyline current (it also had a “very successful” product placement deal with Ministry of Sound’s Marbella Anthems album series).
For ITV Be spin-off Life On Marbs, Lime had a particular Balearic soundscape in mind, for which it commissioned composer Dobs Vye.
The two routes are not mutually exclusive. For Hollyoaks’ recent serial-killer plot, regular partner Audio Network supplied catalogue sounds and created bespoke music.
“It can be more expensive, but it’s worth the investment,” says McLean. “If you can negotiate a share of the rights, specially composed music becomes a revenue stream for the producer and the library gets to retain the music for reuse in other productions.”
Composed music is perceived as being expensive but, as Guyler points out, a good composer will adapt to the budget. “Electronic music can produce realistic acoustic-sounding scores and you can overdub the sample with live music to give more realism. This sort of production can be done cheaply.”
Paul Farrer (1000 Heartbeats, The Chase, Judge Rinder, The Weakest Link) adds: “Obviously it depends on the kind of score you want, but it’s always much less expensive than you think. As with other TV craft disciplines, the key is communication. Be up-front and trusting by sharing your concerns about time or budget – 99.99% of the time the composer will respect that and work with you to provide their best work for the budget.”
Working to a brief
A brief can be as open as half a dozen adjectives or fully finished pilot episodes with temporary music. “I’ve had people send me hours of music on CD to soak in and even had 3D set models delivered to give me the sense of scale,” reports Farrer. “The worst jobs are when producers have become welded to a piece of temp music that they don’t have the rights to use.”
Most discussions tend to be about tone. “‘Too cheesy’, ‘not catchy enough’, ‘too scary for afternoon audiences’,” says Farrer. “Once you’ve figured that out, the rest falls into place. Generally speaking, the more creative freedom you can allow, the better it goes for everyone.”
Highlighting specific music or elements of a track is often a better guide than a verbal description. “People don’t tend to know what they want until they hear it, so if a producer has only a vague idea, I’ll request that they identify some existing tracks that they like as a starting point,” says Guyler.
Producers will issue a pitch to half a dozen – sometimes more – composers and production music services. This puts added pressure on the composer to devise the right sound. “It’s a bit of a lottery,” says Guyler. “You might not win the pitch, but then hear music used in the final show that is nothing like the initial brief.”
A pitch can be anything from a sample of previously produced music or a 30-second taster to a full-blown demo. Demos are supposed to command a fee but there are reports that producers increasingly expect them to be done without payment.
“When you get called for a face-to-face pitch, you are aware that lots of others are going for the same job and sometimes you have to articulate coherent ideas before you’ve seen a frame of a picture,” says Dudley.
Composers increasingly find themselves working without sight of the locked-off picture but, much like colourists, it is their skill at interpreting a brief and collaborating with a producer and editor from script to post that is in demand.
“When you are working at speed, then any information the producer can give you is a help,” says Hardy, who scored BBC1 drama Hinterland using instruments as diverse as a bowed psaltery and wind chimes recorded forwards and backwards. “Lots of late changes to the script can be a nightmare if the music is no longer of the right duration.”
Reynolds adds: “I try to contact composers early on and want them to understand the project, with lots of meetings for them to share their thoughts on how music will fit in. We’ll show them footage as soon as we have it and the dialogue continues through the edit as they send tracks and we provide more detailed feedback about specific scenes. Relationships and trust are really important.”
On a returning series, producers will want to retain the musical continuity of the programme brand while moving the show forward. “Labelling and cross-referencing every audio version can come in handy if, for example, the production reintroduces a character from an earlier episode and you can quickly find and insert elements of their musical signature,” says Hardy.
He has used stems (audio elements) in a novel way to create an online game accompanying the third series of Hinterland, in which users can create their own soundtrack to a clip from the drama using a mix of separate audio pieces.
“The days of production music being seen as the poorer cousin to commissioned music are long gone,” says Farrer. “People will always want new music, just as audiences will continue to demand innovative film and TV content. There are parallels with photography. Image libraries are huge and instantly accessible. Does this mean we no longer need new photographs? Of course not.”



case study: Scoring Poldark

Anne Dudley is working on the 10-part second run of BBC1’s Poldark after scoring the first run of eight. “With a long-running series, there’s more opportunity to develop music than with a feature film,” she says. “The overall aim is to hide the music from the viewer while heightening the emotional content.”
She researched Cornish folk music and the history of the drama’s late 18th-century setting before meeting executives at producer Mammoth Screen to win the pitch.
“I went the extra mile and trawled through some film and TV soundtracks that I thought were in the right vein, to give us some references,” she says.
Dudley explored the idea further with Mammoth managing director Damien Timmer before replacing the temp music on a couple of scenes. “At this point, I began to understand the vocabulary the producers were using,” she says.
After a spotting session with the producer and editor, detailing the dramatic drive of each scene, Dudley spends two to three weeks on each episode, typically working to a final cut. “We’re scoring about 35 minutes per hour. Starting a series from scratch is always hard, but as you go on, you can return to certain ideas for characters or emotions.”
The series’ composition will combine elements of the original score with fresh music IDs to accompany new plot threads, themes and characters. Dudley writes on piano before orchestrating for violin and harp soloists, then oversees recording with an 18-piece string orchestra at Angel Studios.

Ang Lee's war drama 'Billy Lynn' faces projection challenge

Screen Daily 
EXCLUSIVE: Only a handful of exhibitors will be able to screen Ang Lee’s anticipated drama as the director intends.
When Ang Lee’s Billy Lynn’s Long Halftime Walk goes on release in November only a handful of exhibitors will be able to screen it as the director intends.
The TriStar Pictures war comedy-drama, starring Kristen Stewart, Vin Diesel, Steve Martin and Screen Star of Tomorrow Joe Alwyn in the title role, is the first to be shot in a combination of 4K resolution in stereo 3D and at 120 frames a second (fps) - a bold specification that exceeds all DCI compliant presentation systems.
The Oscar-winning director, who previously pushed the boundaries of 3D with Life Of Pi, spoke last year at CinemaCon about shooting at 120fps, which was in part chosen as a means to immerse viewers in Billy Lynn’s combat scenes.
“What Ang Lee is aiming for cannot be done on any DCI-compliant equipment today,” said David Hancock, director, head of film and cinema, IHS Technology, who was speaking to Screen as part of an upcoming feature on film specifications.
“The latest [projectors] can show 4K 60fps but if you wanted to show Lee’s movie in 3D you would need two of them [one for each eye of the 3D view].”
Digital Cinema Packages (DCPs) can be made in 4K 3D 120fps but they are non-compliant, and there is currently no DCP player or Integrated Media Block (IMB) - the server needed to handle the data throughput - capable of playing them.
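Some illustrative arithmetic shows the scale of the problem. Assuming DCI 4K geometry (4096x2160) with 12-bit samples across three colour components, and both eyes delivered at 120fps, the uncompressed stream dwarfs the 250 Mbit/s maximum bitrate that the DCI specification allows for a compressed DCP:

```python
# Why 4K 3D 120fps strains DCI-era playback systems: uncompressed
# bandwidth versus the DCI maximum compressed DCP bitrate.
width, height = 4096, 2160    # DCI 4K container
bits = 12 * 3                 # 12 bits per component, 3 components (X'Y'Z')
fps, eyes = 120, 2            # 120fps delivered to each eye

uncompressed_bps = width * height * bits * fps * eyes
print(f"{uncompressed_bps / 1e9:.1f} Gbit/s uncompressed")  # prints "76.4 Gbit/s uncompressed"

dci_max_bps = 250e6           # DCI ceiling for a compressed DCP: 250 Mbit/s
print(f"{uncompressed_bps / dci_max_bps:.0f}x the DCI DCP ceiling")
```

Even granting heavy compression, closing a gap of roughly 300x is why upgraded encoders and projection hardware, not just faster servers, are needed before the format becomes distributable.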
Preview
Lee is set to screen a 15-minute preview of the film in its fullest specification on Saturday (April 16) at the SMPTE Future of Cinema Conference in Las Vegas. The screening is made possible by a specially modified arrangement of an IMB with dual Christie Mirage projectors.
“This is the only projector capable of doing this and you need two to make it happen,” confirmed Christie Digital’s senior director, product management, Don Shaw.
“Our solution is not intended for cinemas but for theme parks.”
“No current technology in the market can do 4K 3D at 120fps per eye,” added Oliver Pasch, sales director, Digital Cinema Europe, Sony Professional Solutions Europe. “No system has the necessary bandwidth.”
There are reports that Texas Instruments is developing technology to upgrade projectors to play the format and there is work being done on more efficient compression algorithms to improve the efficiency of systems without damaging the overall image quality.
“The bottom line is that we will need adoption of these better encoders and upgrades to the projection systems to get to the point where 120fps 3D 4K can be distributed as a playable DCP,” commented Richard Welsh, chair of the SMPTE Future of Cinema Conference.
High frame rate
High frame rate (HFR) filmmaking gained a high-profile push with New Line Cinema/MGM’s The Hobbit franchise, directed by Peter Jackson, beginning in 2012.
The effect, which in the case of The Hobbit doubled the traditional 24fps at which film is shot and presented to 48fps, eradicates motion blur to deliver a crisp hyper-realistic look.
Ahead of that film, and anticipating a glut of HFR content, exhibitors upgraded select screens at a cost of $6,000-$10,000 per screen with HFR technology.
In figures supplied by IHS, at the end of 2015 there were 162,000 cinema screens worldwide, of which 149,000 were DCI-compliant digital screens and 74,000 were 3D.
However, the number of installed projectors capable of showing HFR content is harder to quantify since there is no data source on this, according to Hancock.
Estimates range from as low as 3,000 screens – those upgraded to show The Hobbit – to as many as 60,000 screens worldwide equipped with the latest projection systems.
However, the maximum that Sony digital cinema projectors and the DLP systems of NEC, Christie and Barco can play back is 2K 3D at 60fps or 4K 3D.
Released in several formats
The likelihood is that Billy Lynn’s Long Halftime Walk will be released in several formats including 2K 3D 48fps, the same specification as The Hobbit, with just a handful of specialist venues outfitted with non-DCI compliant technology to screen Lee’s ultimate vision.
“Lee is pushing the boundaries by trying to show what could be done. But there is no business model to get the film [at its maximum creative intent] into the hands of exhibitors,” said Shaw.
Lee’s comedy war drama, adapted by Simon Beaufoy and Jean-Christophe Castelli from the novel by Ben Fountain, centres on a heroic infantryman’s final hours before he and his fellow soldiers return to Iraq.
In 2014, filmmaker and VFX artist Douglas Trumbull presented his short sci-fi film UFOTOG at the IBC trade show in 4K 3D 120fps on dual Christie projectors.
Trumbull is developing a system called MAGI for processing content to that high specification and is seeking to license the technique to studios.
TriStar parent Sony Pictures declined to comment for this article.
Screen International will explore this issue in greater detail in an upcoming ScreenTech feature on film specifications.

Lytro debuts holographic video camera for cinema

Screen Daily 
The prototype is being revealed at the Future of Cinema conference (part of the NAB show) in Las Vegas this week.
Camera maker Lytro is launching a first of its kind light-field video system intended for feature film with significant implications for the future of production.
The system combines the highest resolution sensor ever developed with specialised software to enable frame rate, aperture and focal length to be determined in post-production.
The prototype is being unveiled at the Society of Motion Picture and Television Engineers’ (SMPTE) Future of Cinema conference in Las Vegas on April 16, part of the NAB Show (Apr 16-21), along with a short film showcasing the technology, and will be launched commercially in Q3 this year.
While conventional cameras record light as a two dimensional image - a mechanism that hasn’t changed since the invention of photography - Lytro’s camera is able to capture multiple viewpoints on set in a single shot, including the direction and intensity of the light rays.
This data can be used to rebuild detailed pictures of the depth and colour of objects in a scene enabling traditional craft decisions such as focal point, camera position and even lighting to be reframed after the event.
“Currently, key creative decisions such as the frame rate are ‘baked in’ at the time of capture. Light-field means you can computationally change the shutter time, frame rate and aperture as a virtual camera process,” said Jon Karafin, head of light-field video, product management, Lytro. “This is a disruptive technology and we need the feedback of the studio, cinematographer and VFX community to make sure that what we are creating meets their needs.”
One advantage of the technology, according to Lytro, is that studios will be able to create versions of the same film at 24 frames per second (fps), 48 fps or 120 fps as desired for theatrical releases and at 60 fps for broadcast transmission all from the same source material, rather than limiting such decisions to the time of shooting.
Other benefits of recording include making Virtual Reality and VFX production less resource-intensive and enabling live action to be treated in the same way as computer generated imagery. Lytro has developed a plug-in for The Foundry’s software Nuke giving the raw data a route into post-production suites where image manipulations such as a change in the depth of field, a shift in zoom or matting for virtual backlots is possible.
The unit will output 4K resolution but in order to do that, alongside computing information such as the depth of objects in a scene, Lytro has developed the highest resolution video sensor ever made at 755 Megapixels (MP) shooting 300 frames a second.
This is more than three times the resolution of Canon’s highest rated 250 MP sensor, which is still in development. In comparison, High Definition is the equivalent of 2 MP.
Before light hits the sensor it will pass through a micro-lens array comprising over two million hexagonal lenses each with a slightly different perspective on the scene. The captured information is otherwise known as a hologram.
“We are leapfrogging the leapfrog, if you will,” said Karafin. “We are massively oversampling the 2D image to be able to change the optical parameters in post. Everything that makes the image unique is retained, but you are able to rephotograph every single point in the field of view.”
Challenges
He admitted that the technology is “very challenging” and that work needs to be done on the software algorithms, compression system and hardware.
Too large to be handheld, the studio camera needs to be physically mounted to a dolly or crane and cabled to a server up to 100 metres away. The server is being designed to handle data at speeds of up to 300GB/s (it currently works at 100GB/s).
The server will sit in a “video village” off-set and be supervised by Lytro technicians. The camera requires “a non-standard optical format” and Lytro will offer the unit with custom focal lengths. The whole system, including operators, will be offered for rental. Lytro says some studios are running trials.
Anticipating negative reaction from cinematographers, Lytro has incorporated a physical focus control and aperture check on the camera plus keyframe metadata to provide reassurance to the DoP that the decisions they make are preserved into post.
“When the software is opened in post then whatever decision the DP or assistant or focus puller has made on set will be the same,” explained Karafin. “We are trying to be sensitive to that workflow.”
Advancements in tech
Light-field was first theorised 125 years ago but research only took off in the 1990s. While Lytro uses a single camera, German research institute Fraunhofer IIS has employed a multiple-camera array to achieve a similar effect.
Developments such as these have put the technology on the industry’s map. Light-field will be discussed by SMPTE at the Future of Cinema Conference. In addition, a new working group exploring light fields and sound fields (the audio equivalent) has been established by the MPEG and JPEG committees, chaired by Fraunhofer’s head of moving picture technologies, Dr Siegfried Foessel.
The Lytro Cinema Camera builds on a decade of R&D at the company which brought the first consumer light-field stills camera to market in 2012. Lytro made a move into light-field video last year after receiving a $50 million investment led by GSV Capital. It also made a quarter of its staff redundant in order to hire specialists in video and virtual reality. In November 2015, it announced Lytro Immerge, a light-field video system designed for VR acquisition which is yet to be released. The Cinema Camera is the company’s debut professional acquisition equipment.
Lytro was founded in 2006 by Ren Ng, who developed the technology at Stanford University, and has been helmed by CEO Jason Rosenthal since 2013. The company has received a total of $150 million in funding from backers including Netscape co-founder Marc Andreessen.
Audiences will not notice any difference in content shot with a light-field video camera as long as presentations are made on flat screens. However, several companies, including Leia3D and Zebra as well as tech giants such as Microsoft and Samsung, are working on holographic or augmented reality displays.
“I have no doubt that light-field is the future of imaging,” added Karafin.

Thursday, 7 April 2016

Sky Goes After Millennials With Live Streaming on Facebook

Streaming Media Europe

In a continuing effort to reach younger viewers on mobile devices, the pay TV powerhouse will start delivering sports coverage and live news on Facebook Live.
http://www.streamingmediaglobal.com/Articles/News/Featured-News/Sky-Goes-After-Millennials-With-Live-Streaming-on-Facebook-110295.aspx

Pan-European pay TV broadcaster Sky has revealed its intention to stream live news and sports coverage on Facebook Live this summer.
This does not mean live streaming of English Premier League matches, Formula One, Ryder Cup golf, or other sports to which Sky has broadcast rights. Rather, it means ancillary coverage of those events, in a bid to reach millennial audiences who are deserting the more conservative studio-bound presentation of traditional sports for coverage more in tune with the informal fan discussion they access on smartphones.
It's a strategy that has led Sky to invest $7m in sports streamer Whistle Sports and to partner with the company on a social media channel, in conjunction with Sky chat show brand Soccer AM, last October.
Sky says live streaming to Facebook will offer a unique behind-the-scenes look at a number of major events and breaking news stories.
In a blog post Sky digital director Lucien Bowater said Sky had already been experimenting with live streams and that the results have been "impressive."
"In less than 24 hours since [Facebook Live's] release, more than 100,000 people viewed [former cricketer] Michael Atherton’s exclusive interview with [cricketers] Freddie Flintoff and Kumar Sangakkara at the T20 World Cup," Bowater wrote. "In news, similar numbers watched [Sky News journalist] Mark Stone’s live report on the migrant crisis in Calais."
He added, "The platform works for us because it allows our news and sports teams to connect with the audience in a slightly different way—offering a different perspective, one that might not otherwise be possible in such a fast moving environment."
Sky previously partnered with Facebook on 'Instant Articles' enabling Sky to host articles from news and sports teams on the social network, as they’d appear on the web.
"By working with them, we've seen more views, more shares, and more interaction," said Bowater. "Increasingly though, our digital strategy has been geared towards video. We’ve seen an incredible response to some of the videos we've uploaded to our own social media channels."
Sky News, Sky Sports, and Soccer AM Facebook pages have amassed 14 million followers. In the past three months, the broadcaster says it has attracted almost 370 million video views, and in January, Sky News was one of the most watched Facebook video publishers in the world, Bowater noted.
This week Sky said it would stream some exclusive content ahead of boxer Anthony Joshua’s bid for the world heavyweight title, as well as coverage of the junior doctors' strike from Sky News.
Facebook is set to replace the Messenger button in its mobile app with a live video button as it ramps up competition with Twitter's Periscope. This will enable any user, not just major content owners, to live stream.
Facebook has also rolled out the ability to 'go live' in Facebook Groups and Facebook Events; 'live reactions' make it easy for viewers to express their feelings in real time during a live broadcast; while a new dedicated place on Facebook's mobile app lets users discover live video.

Wednesday, 6 April 2016

Whisper Films On The Starting Grid

TVB Europe

After Channel 4 took over free-to-air F1 coverage for the start of the 2016 season it put Whisper Films in the driving seat.

One of the prized contracts in the outside broadcast calendar went to Whisper Films earlier this year, an award that caused some controversy.

The indie, formed in 2010, is backed by Channel 4's Growth Fund, through which the broadcaster took a small stake in the company. When the BBC divested itself early of terrestrial UK rights to Formula 1 at the end of the 2015 season, Channel 4 took up the reins and put the presentation of its coverage out to tender.

Established by Sunil Patel, who oversaw the BBC's F1 output before leaving to launch Whisper, BT Sport presenter Jake Humphrey and ex-F1 driver David Coulthard, the indie beat more seasoned sports producers such as North One – which produced ITV's coverage of the sport – to the chequered flag.

Patel is unfazed by suggestions in the press of favouritism. "The pressure to succeed because we had this high profile win doesn't come into it. The pressure comes from all those on our team to deliver on our own high expectations. We are duty bound to keep fans entertained and to improve coverage where we can – there is no added pressure from any other party."

Because of the tight nine-week turnaround between landing the contract in mid-January and the start of the 2016 season in Australia – nearly two weeks of which was required for shipping equipment to Melbourne – Whisper wisely decided to rehire Presteigne Broadcast Hire as its OB partner.

Presteigne had designed and supplied F1 flypacks complete with air conditioning and power distribution systems for the previous seven years of BBC broadcasts and had the kit ready to go at its HQ in Crawley. It also supplies up to 15 crew including sound ops and engineers.

"By and large we are using the same kit as the BBC operation with one major uplift in editing," explains Patel, who will executive produce C4's coverage. BBC Sport had made the fateful decision to base its editing on Final Cut Pro 7 in 2011, just as Apple decided it would no longer support a professional version of the software. A sensible decision, then, for Patel to swap the suites out for four new Adobe Premiere suites.

On site, these are linked with EVS IP Director logging and search tools, themselves integrated with a trio of EVS XT3 servers and further hooked into an EditShare rack of collaborative storage. The rest of the kit, contained in two flypacks, remains the same and includes a Ross switcher, Lawo sound desk and Riedel Artist for talkback, with the only other significant addition being a Sony PMW-F5 with Canon Cine lenses to work alongside conventional RF cams.

"This will give us a real cinematic look for feature making, content we are familiar with given our heritage of branded high end content," says Patel.

Whisper has created a range of brand-funded sports content in association with companies such as Red Bull, UBS, Shell and Hugo Boss. It has also won conventional TV commissions, such as the BBC1 documentary Racing With The Hamiltons: Nic In The Driving Seat, and produced highlights for ITV4's coverage of the DTM German touring car series. Whisper also produced BBC2's studio-presented NFL highlights in the run-up to Super Bowl 50, introducing a touchscreen for pundit analysis.

Patel said he decided to apply for Growth Fund investment when TV commissions began to “dry up”, and he felt that C4’s backing would give Whisper “credibility” and better access to commissioners.

Formula One Management (FOM) runs a strict and well-oiled machine. Rights holders have to join the F1 title sting five minutes prior to race start and can only leave the host feed once the podium ceremony is over. In between, rights holders can only tailor the presentation with their own commentary.

"There is limited opportunity to do anything within the sport itself but the real difference is around the presentation aspect, hence our commitment to our talent line up," says Patel. "The difference will be in the insight we can give to viewers from the people we have in the pit lane and paddock."

The FOM set-up is deliberately formulaic across the world. “There are new places – such as Baku and Mexico for 2016 - which we will be keeping a watch over this year but each venue has its own unique challenges,” comments Presteigne head of technology, David O'Carroll.

All the opening sequences and feature material are stored and played off the EVS. A catalogue of historic race material is also held there. "For example, if Lewis Hamilton does something special in practice or on race day and he refers to a previous race, we do have the ability to find that moment he is talking about and play that incident as live," says Patel. "It's just a question of searching for the clip logged in IP Director."
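The "play it as live" workflow Patel describes rests on keyword metadata that loggers attach to each clip at ingest. As a rough illustration only (these names are hypothetical, not EVS's actual IP Director API), a keyword search over a clip log might look like:

```python
from dataclasses import dataclass, field

@dataclass
class LoggedClip:
    # Hypothetical stand-in for a logged clip entry
    timecode_in: str
    timecode_out: str
    keywords: list = field(default_factory=list)

def find_clips(log, *terms):
    """Return clips whose keyword metadata contains every search term
    (case-insensitive), ready to be cued for as-live playback."""
    wanted = {t.lower() for t in terms}
    return [c for c in log
            if wanted <= {k.lower() for k in c.keywords}]

log = [
    LoggedClip("10:04:12:00", "10:04:31:00", ["Hamilton", "overtake", "Monza 2015"]),
    LoggedClip("11:22:05:00", "11:22:40:00", ["Rosberg", "pit stop"]),
]
hits = find_clips(log, "hamilton", "overtake")  # matches the first clip only
```

The real system adds frame-accurate server playout on top, but the principle is the same: rich logging at ingest makes the archive searchable in seconds during a live show.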

Much of the chatter in outside broadcast circles is about how technologies such as IP can be used to cut the costs of sending crew and kit to events around the world. The F1 flypack and its dozen or so engineers and technicians are already a slim-line production.

"The only aspect I can see coming back to the UK would be editing, which won't happen until the cost of fibre reduces and internet speeds increase," says Patel. "The next generation of IP-enabled kit might hold the key to saving costs instead of transporting kit and the cost of hotels."

Perhaps more than any other sport, Grand Prix racing would seem to lend itself to a higher resolution yet despite dabbling in stereo 3D and making the most of advanced wireless technology, FOM's coverage remains resolutely HD for this season at least.

"In principle we could supply 4K in the pods with some minor alterations," says O'Carroll. "What is more challenging is the reliance on RF across the site. There's no viable 4K RF link that would allow us to acquire in 4K. That said, we can upconvert 50p from the camera, which would look pretty good, if not true 4K."

"Given the detail and design of the cars, 4K would be amazing for Formula One, but until a platform like Sky (which is broadcasting the rest of the F1 schedule in the UK) offers 4K to consumers I can't see it happening," says Patel. "It will take a year or two."


With a contract for ten races a year until 2018, the indie will be best placed to anticipate an upgrade should FOM – or Channel 4 – decide to up the ante.

Tuesday, 5 April 2016

Eurovision offers Flexible route to fibre future

Sports Video Group Europe
Eurovision – the EBU’s broadcast distribution division – is offering its new live transmission service Flex for use at events including the Rio Olympics and UEFA EURO 2016 Championships. The service, which marries a new set of portable Flex terminals to the Eurovision FiNE global fibre optic network, has been in development for just over a year following a market review by Eurovision.
“We service tier-one major live sports productions right down through short term transmission, newsgathering and streaming solutions,” explains Graham Warren, Director of Network at EBU/Eurovision. “For tier-one organisations and events our service is highly reliable, increasingly reliant on fibre bandwidth (as opposed to satellite) and moving toward very high quality video, some in 4K. This will not change in the short to medium term.
“But while we also service the lower end of the market with streaming solutions, we felt there was a gap in the middle which could be serviced better. This sort of event or organisation requires medium quality video, perhaps with mobility, and is associated with cost effective coverage such as ENG and tier-two sports events.”
Flex is a hybrid IP solution that gives Eurovision customers a cost-effective way to bring content on to the Eurovision global network. It is also designed to provide transmission resources in places where traditional broadcast fibre isn’t available or would be prohibitively expensive.
There are two principal applications. One is for mobile ENG and another for point to point contribution. Eurovision is also looking at providing audio over IP and streaming direct to the internet.
To do all this it has developed a set of Flex-compatible software-driven terminals. These include a mobile backpack solution for bonding up to eight 3G/4G cellular connections; a unit small enough to fit between a camera body and a battery pack that still offers up to eight 4G modems, a LAN connection, satellite/Ka-Sat/BGAN support and 32 GB of removable storage for recording video locally; and a rack-mounted encoder with two bonded Ethernet connections, expandable with eight 3G/4G modems and designed as a back-up for SNG trucks.
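At the heart of these terminals is cellular bonding: one video stream is split across several 3G/4G links and reassembled at the receiving end. A minimal sketch of the sending side, under the simplifying assumption of a weighted round-robin scheduler (real bonded-cellular schedulers also adapt continuously to per-link latency and packet loss):

```python
def schedule_packets(packets, link_rates_mbps):
    """Assign packets to links in proportion to each link's measured
    throughput: at each step, send on the link furthest behind its
    target share (weighted round-robin)."""
    total = sum(link_rates_mbps)
    targets = [r / total for r in link_rates_mbps]  # ideal share per link
    assignment = [[] for _ in link_rates_mbps]
    sent = [0] * len(link_rates_mbps)
    for i, pkt in enumerate(packets, start=1):
        # Deficit = packets the link "owes" relative to its target share
        deficits = [targets[j] * i - sent[j] for j in range(len(sent))]
        link = deficits.index(max(deficits))
        assignment[link].append(pkt)
        sent[link] += 1
    return assignment

# Eight packets over two modems, one measuring three times the throughput
links = schedule_packets(list(range(8)), [30.0, 10.0])
# The faster link carries three quarters of the packets
```

Sequence numbers on the packets then let the far-end decoder reorder and reassemble the stream, which is why these systems trade a little extra latency for resilience.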
There is also a series of encoding devices, including the Flex IO, a 1U-high IP encoder and decoder terminal with two Ethernet connections for managing equipment and connecting to the public internet or the Eurovision FiNE network; and the Flex O, a receiving (decoder) edge terminal for two or more simultaneous live broadcast video streams, with the ability to connect over fixed networks (Eurovision FiNE, corporate internet, ADSL).
These packages include a service providing access to the Flex web portal, for self-administration and control of maximum bitrate, video resolution and delay as well as management of connections to other Flex receiving terminals.
Eurovision is also developing a software-defined network controller called SON, or self-optimising network. It detects where the Flex terminals are located, captures a stream to the internet automatically – thereby reducing manual intervention – and routes the signal on to its destination selecting the best route possible.
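The route selection Warren describes can be pictured as a shortest-path search over the operator's network graph, with edges weighted by measured link quality. A toy sketch using Dijkstra's algorithm (illustrative only: Eurovision has not published how SON actually scores routes, and the topology below is invented):

```python
import heapq

def best_route(graph, src, dst):
    """Dijkstra's shortest path; edge weights stand in for a
    link-quality cost such as latency or a loss-derived score."""
    dist = {src: 0.0}
    prev = {}
    queue = [(0.0, src)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(queue, (nd, nbr))
    # Walk predecessors back from the destination
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

# Hypothetical topology: a Flex terminal reaching two POPs
net = {
    "terminal": {"pop_geneva": 12.0, "pop_london": 9.0},
    "pop_geneva": {"destination": 4.0},
    "pop_london": {"destination": 10.0},
}
route = best_route(net, "terminal", "destination")
```

The point of automating this step is exactly what Warren highlights: no manual intervention is needed to get a terminal's stream from wherever it pops up on the internet onto the FiNE backbone.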
The current Flex online web portal shows user statistics of network use and reliability and will be enhanced with a new version by the end of this year. “It’s a self-provisioned service – there’s no need for a network operations centre,” says Warren.
Higher bandwidth and higher quality
The Netherlands encoding specialist Mobile Viewpoint is Eurovision’s first partner in IP mobile news gathering but the EBU has already lined up other third-parties to provide Flex support. “There are fewer and fewer requests to use satellite and fibre connectivity and more use of the public internet both with ourselves and our competitors,” says Mobile Viewpoint CEO Michel Bais.
(Mobile Viewpoint is separately developing a system with Euromedia Group and Avid that will enable broadcasters to route low-res proxy files via the cloud for remote editing).
The system has already been deployed by Eurovision in the US to cover the presidential campaign. It will next see action during the Eurovision Song Contest, to be held in Stockholm in May. Flex will provide back-up connections for 28 participating nations for the live-to-air voting segments, in the event that satellite links fail.
“We will provide the point-to-point configuration and will remotely take control of the terminal and flow the signal back to the venue in Stockholm where it is decoded,” says Warren. “This is a major event (viewed by 197 million people in 2015) and we realised that the voting element was less redundant than it could be. Now we’ve added some real solidity to back up the satellite.”
The service will also be deployed in Rio and offered to non-rights holders at stand-up positions in several locations for the Games. The plan is to install mobile hotspots augmenting a broadcaster’s connectivity through the WiFi and/or cellular network and ingest into the Eurovision network.
While the French soccer stadiums playing host to the Euro Championships are well served by fibre, Eurovision will likely offer Flex in a similar way during the UEFA Championships tournament, for news stand-ups outside the venues.
“The service maximises use of any available connectivity at a certain place, captures it to the internet, routes the traffic to our closest POP, and pushes it back to the destination. This happens automatically.”
There are limits. The technology, while fully HD capable, is suited to sports and news activities at typically less than 15 Mbit/s, rather than the higher bitrates used for tier-one sports or the upper reaches of video quality such as 4K.
“Everything points to IP and the importance of software-defined programmable networks as enhancing the degree of flexibility you can achieve with this kind of service,” says Warren. “As the technology evolves and we see a more IT-centric approach in software apps and data centres, it would be reasonable to imagine that, down the line, Eurovision Flex could cope with higher bandwidth and higher quality.”