Thursday 31 May 2018

A unique superpower – in conversation with Joe Walker


IBC

Cutting films for an auteur is a tough challenge, one that demands deep immersion in the script and the personality of the filmmaker. Imagine, then, the balancing act required when you are the go-to editor for two of the world’s leading directors.
The most successful directors enjoy longstanding if generally unheralded creative partnerships with their editor. Thelma Schoonmaker has cut every Martin Scorsese feature since Raging Bull. Steven Spielberg’s collaborator in the cutting room since Close Encounters is Michael Kahn. Brian de Palma’s best work was co-created with the late Jerry Greenberg. Ridley Scott has come to rely on Pietro Scalia; Clint Eastwood on Joel Cox. Paul Greengrass’ signature kinetic style is undoubtedly enhanced by Christopher Rouse.
So, it says something about the talent of Joe Walker that not one but two of the most acclaimed directors working today have chosen to partner with him.
Walker has edited all of Steve McQueen’s features, including Hunger, Shame, 12 Years a Slave and, currently, the Chicago-set heist thriller Widows, alternating with projects for Canadian auteur Denis Villeneuve (Sicario, Arrival and Blade Runner 2049), gaining two Oscar nominations in the process. He has also worked with Rupert Wyatt (The Escapist), Rowan Joffe (Brighton Rock) and Michael Mann (Blackhat).
Rather disarmingly, Walker says the editor’s relationship to a director is like being a 1950s housewife. “If your husband likes puff pastry then so long as I am with him I’m going to put puff pastry on the table.”
More seriously, he imparts, “You are going to be sharing a lot of time and creative effort three feet away from someone for the best part of a year, so it’s essential you find a common purpose.”
The first feature collaboration is inevitably a way for both parties to find a way of working together. If there’s a creative and personality fit, then the pair will find their own shorthand.
“On Sicario I felt I wanted to give Denis the space to react honestly and not feel I was going to cry if he didn’t like what I’d done,” says Walker. “I love that he’ll direct me in the way he might direct an actor. He won’t give instructions so much as succinct notes like: ‘I need a quieter performance’ or ‘We need to see her resolve’, leaving the minutiae to your own intuition.”
When he interviewed for McQueen’s first feature project in 2007, they discovered they had grown up less than a mile from each other in London. Beyond that, they share unconventional attitudes to filmmaking. Walker was in a band, trained as a classical composer and says he’s drawn to the avant-garde; McQueen is a Turner Prize-winning artist.
“There’s a kind of punk attitude that comes from that background and it’s always going to lead to unconventional, even perverse choices sometimes,” Walker says.
His films with both McQueen and Villeneuve showcase experiments with cross-cutting and flashback structures.
“Playing with time is a superpower unique to editing,” he says. As an example, the original book and the screenplay for 12 Years a Slave were chronological. “After filming finished, we realised that this might not be the best way to drive the story and maybe the better way was by following the expressive inner world of [central character] Solomon. That led us to restructure the story with flashbacks.” 
Similarly, for director Kevin Macdonald’s Life in a Day (2010), Walker distilled 4,500 hours of footage, filmed in a single 24-hour period by YouTube users around the world, into a coherent documentary feature.
 “There was no way of telling what the flow or structure was going to be, with an open brief to be about anything in any language in any country,” he says. “We hired 25 assistants, who spoke multiple languages and would watch and translate the footage as it came in. The tonal key for me was finding a clip of a young American woman rushing to meet the midnight deadline and realising that nothing happened in her day, which somehow makes her feel lonely and disconnected from a world of amazing events. But isn’t that how we all feel?”
Short cuts
Walker trained at the BBC in the 1980s, splicing and track-laying 16mm for docs, arts, children’s and graphic design departments before gravitating to the drama cutting room.
Editor Ardan Fisher, best known for seminal BBC drama Edge of Darkness, helped Walker focus his career. “He developed my appreciation and appetite for editing,” says Walker. “He was a role model of someone getting tremendous creative satisfaction from their craft and was the first to really trust me with sequences to cut.”
At the same time, he had a parallel career writing music, including an orchestral score for BBC/HBO drama Dirty War and credits for sound editing on TV series like First Born and The Old Devils.
He avoids temp tracks [music used as a guide in production] as far as possible: “as soon as you rely on them, it’s probably John Williams pushing through a sequence rather than the sequence itself.”
It is far from unusual for an editor to have a background in sound, “[but] back in the day a picture editor wouldn’t prioritise sound and made no effort to dig out sound effects,” he says. “The term picture editor no longer does justice to the degree of involvement that most editors have with how a film sounds.
“I suppose I see the editing role as the rhythm section of a movie; the Charlie Watts of the ensemble. Setting the tempo and carefully placing all the rhythmic elements – whether dialogue or sound effects or the twirl of a hologram in the corner of a shot. The editing that goes into music is not so different from the editing that goes into a film,” he continues.
Imagination at play
Part of the creative bond Walker has with Villeneuve and McQueen is that both are happy to use silence or shape a scene with sound effects rather than have it washed over with a score. And their attitude to cutting is at one: “Our taste is not to cut too often unless warranted – it creates a tension and puts a strong frame around the performances,” says Walker. “An editor is not originating from a blank canvas, but we will use every subterfuge on every element of performance, picture, sound, VFX and soundtrack to improve the way we tell the story. It could be the subtle alteration of just one syllable of a line or replacing someone’s head. I take great delight in that alchemy.”
Such control becomes ‘sudoku heavy’ the more VFX shots are added. While 12 Years… featured a handful, Sicario contained more than 200, Arrival 750 and Blade Runner 2049 topped 1,150.
“With Arrival, I learned every trick in the book to try and provide progressively less and less to imagine as the cut progresses, because for a lot of the time the dailies had a blank white screen where two major characters would one day exist,” says Walker. “From temping in clip art from the storyboards, to incorporating the sound the alien heptapods made, the aim is to make it easier to select and time a shot.”
Walker’s own editorial process tends to start by learning the script much like an actor would. He makes fastidious notes on index cards and pastes them on the wall around his Avid for quick reference.
“When material is shot out of order you have to have an idea of where the story has come from and where it’s going to go. That’s simple continuity. For a scene in 12 Years… where Solomon is woken up by Epps [Michael Fassbender] and told to play fiddle – you have to know that in the previous sequence he received a beating.”
Both directors tend to evolve their projects over a lengthy period, so much so that Walker speaks of post as “a continuous shoot.”
“On Arrival we dropped a shot of Amy Adams’ hair in on the last day of DI – a year after we started post-production,” he says. “The creative process, with Denis in particular, is one of putting all the elements in place and then shaping them until someone tells us to stop.”

Because every possible option has been thoroughly explored, Walker says he doesn’t have second thoughts once a film has premiered.
“It can take a year or more to decompress from a film you’ve worked on so that you can watch it again without blinkers. At that point, when I see a film I worked on, I am watching as any member of the audience would, rather than just focussed on the edit.”
Early adopter
Having trained to cut on sprockets, then on Lightworks and Avid, Walker is keen to employ new technology. “Inexpensive VFX solutions such as Nuke have had a huge impact on what I do,” he says. “Routinely now I can alter a shot by combining different takes and move pieces of them around. This is a very crude assembly in my hands, but [VFX Editor] Javier Marcheselli will create a solid comp in Nuke as a guide for how long to hold on a shot, what its impact is going to be; otherwise, you’re just cutting plates and that can be less exact and certainly less compelling.”
The Las Vegas hologram fun-house in Blade Runner 2049 was created this way. It’s one of the most technically demanding sequences Walker has cut and one of which he is particularly proud. “It had so many layers and always felt very speculative. I was involved with pre-viz and the second unit in each iteration, carefully timing every holographic dancer against lighting designs and music. It didn’t work the way I’d first cut it – the ‘show’ had overtaken the manhunt and lost its spookiness. The whole scene was close to being dropped entirely but I’m glad we reworked it, not least because of the months of effort everyone lavished on it.”
He increasingly uses the RX Audio Editing tool to retrieve otherwise unusable sound recorded on set. “There’s a scene in Widows where the sound of an L-train drowned out the actor’s dialogue but applying RX you can hear them perfectly clearly. I’m almost certain we’ll be able to use more of the sync sound on this show, whereas previously much of this would have to be re-recorded.”
The main innovation he thirsts after, though, has only been theorised. “After 35 years sitting in front of a screen I could do with something to ease my neck and back. Instead of gripping a mouse and having my hands locked in one position I want to be able to spool through the shots, to work physically with the images just as I was with 16mm. I want something like the virtual screens of Minority Report to be able to play with.”


Dynamic APAC needs local attention

content marketing for Rohde & Schwarz 

The bald facts about the broadcasting equipment market in the Asia-Pacific disguise both the immaturity and the world-leading dynamism with which different countries in the region are facing the future.


Broadly, the APAC broadcasting kit market is expected to grow at a sizeable CAGR of 8.1% from its 2015 base of US$2,487.5 million through 2024, according to Persistence Market Research.
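As a rough sanity check, the forecast can be reproduced with the standard compound-growth formula. The sketch below assumes the 8.1% rate compounds annually across the full 2015-2024 horizon, which is an assumption about how the forecast is framed:

# Rough sanity check of the Persistence Market Research forecast.
# Assumes 8.1% CAGR compounding annually from the 2015 base through 2024.
base_2015_musd = 2487.5      # 2015 market size, US$ million
cagr = 0.081                 # compound annual growth rate
years = 2024 - 2015          # compounding periods (assumed horizon)

projected_2024 = base_2015_musd * (1 + cagr) ** years
print(f"Implied 2024 market size: US${projected_2024:,.0f} million")
# -> about US$5,015 million, i.e. roughly double the 2015 base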
While commercial UHD services are available in the region (e.g. South Korea’s DTH platform KT SkyLife and Japan’s Sky PerfecTV) and NHK even plans to switch on 8K UHD transmissions next year, other countries such as Vietnam, Cambodia and Burma face an uphill task just to switch over to digital.
For example, there were an estimated 44 million analogue cable homes in India at the end of last year – with a quarter of the pay-TV base still to switch by 2022 [per consultancy and research service Media Partners Asia].
But defining the market on pure broadcast terms is limiting in itself. This is something that the IABM recognised in its recent remodelling of the industry from one based around product categorisation to one that reflects the wider supply chain from creator to consumer.
It is also central to the expanded profile of BroadcastAsia. No longer a pure broadcast engineering trade show, the event has merged with CommunicAsia and the new NXTAsia to form ConnecTechAsia – the region’s answer to the converging worlds of telecommunications, broadcasting and emerging technologies.
Viewed through this lens, APAC’s real driving forces come into focus. Perhaps more than anywhere, connectivity in Asia is seen as crucial to driving economic growth and the development of nations, businesses and individuals.
The region’s incumbent operators are finding it essential to address mobile-first consumption, whether through government-led projects to upgrade national living standards or in a race to roll out 5G cellular networks and open up fresh commercial opportunities.
Mobile operators are offering video to differentiate their services in alliance with OTT providers, while OTT providers are disrupting the market just as they are doing in the US and Europe.
While online video represents a fraction of the US$120 billion in total APAC TV revenues today (per MPA), it is already reshaping strategies and consumer expectations across the region.
OTT SVoD revenues in APAC are on track to reach US$10 billion by 2021, encompassing 200 million subscribers, according to Rethink Technology Research.

Thursday 24 May 2018

Football bodies and broadcasters tackle UHD HDR


IBC
The BT Sport coverage of the UEFA Champions League Final is set to be a UHD and VR spectacular, while chances seem high that the BBC will commit to UHD HDR live streaming of FIFA World Cup in Russia.
The UEFA Champions League Final, European soccer’s annual showcase, attracts 350 million viewers worldwide. While that’s more than triple the figures for the Super Bowl, it still lags behind the billion-plus which FIFA claims tuned in for the 2014 World Cup Final.
Host broadcasting the UCL Final is a big deal, then, and this year the event is staged in Ukraine. With facilities of the scale and production experience required not available in the country, UEFA is taking responsibility itself in tandem with BT Sport.
Having produced the first 4K UHD host broadcast of a Champions League Final in 2017 from Cardiff, BT Sport’s expertise proved invaluable to UEFA in preparing for Kyiv.
According to UEFA, it is looking to build upon the “extremely successful 2017 final as well as building momentum ahead of Euro 2020”. This means taking the opportunity to test new things such as the introduction of a separate clips channel, special cameras, and more additional footage produced on match day than in Cardiff. Coverage in high dynamic range (HDR) is, however, not on the cards, though UEFA says it will incorporate HDR into future events, with testing to occur over the next cycle.
The production template is otherwise almost identical to the one BT Sport set up last year in terms of the traditional TV camera plan, Dolby Atmos audio and the workflow for highlights streamed in full 360 degrees – for which it won the IBC Innovation Award for Content Everywhere.
BT Sport Chief Operating Officer Jamie Hindhaugh says: “UEFA have looked at what we’ve done previously and built on that. We were the first to move into 4K and have always had a very healthy relationship with them.”
The host broadcast is also deploying Sony HDC-4300 cameras, as opposed to the Grass Valley LDX series cameras used in 2017. In addition, UEFA will use Sony MVS-8000X series mixers.
The production standard is UHD/4K over quad 3G-SDI, in SDR Rec. 709. In principle there will be no up-conversion, with the exception of certain special camera inputs, such as the in-goal mini-cameras, which will be up-converted from 1080p50.
UHD coverage is being delivered by 22 cameras, with feeds ingested to EVS XT4K live production servers, capable of up to four channels of UHD-4K in flexible in/out combinations to help create live slow-motion replays. Multi-angle replays are made available directly to the rights holders’ mobile applications through the EVS C-Cast content distribution platform.
Around 28 additional HD cameras are used to mix the HD feed, including specialist positions such as Spidercam, goal rail cameras and Heli tele. These signals are fed into banks of EVS XT3 servers.
VR operation
UEFA will be utilising two different workflows for VR 360 at the UCL Final. The first workflow replicates the successful model designed by BT Sport and Deltatre for Cardiff.
This allows fans to choose between different 360-degree camera angles (offered as highlight replays) or a live director’s cut.
BT Sport Chief Engineer Andy Beale says: “We developed this with UEFA and Deltatre as a joint concept. We’ve done 20 to 30 events in VR so it’s quite a standard offering from us now, yet it’s [still] trailblazing. We’re still the only broadcaster in the world with a complementary app [based on LiveLike’s app] that has a normal 2D interface plus embedded 360-degree.”
Last summer Nokia was still in business with its VR division and OZO camera range. It has since dropped all further development, but BT Sport and Deltatre continue to use the rigs.
“It’s been our workflow for this season, but it will probably change in future,” says Beale.
As at the FA Cup Final last Saturday, the VR production will deliver six or seven live streams from camera rig positions behind each goal and on each six-yard line to capture all angles of action in the box. There will be a fan view position in the stands and another at each managerial dugout. Additional colour shots of fans outside the stadium, the pre-match ceremony, player arrivals, dressing rooms and the trophy lift will be captured from portable VR rigs.
Deltatre manages the cameras out on the pitch and oversees the stitching from four VR vans. Those feeds are handed on to another unit for the live production.
“It’s a ‘TV-esque’ experience within the VR world. You can choose from multiple ISO angles, or the BT Sport-produced feed,” says Hindhaugh.
BT Sport is also pushing use of the magic window where users touch and scroll or pan and tilt on mobiles or tablets.
“You can use headsets… but you don’t have to,” says Hindhaugh.
Simultaneously, UEFA will be testing the VR 360 production to be deployed in the 2018-19 season, which will use two InstaPro 360° cameras behind the goals in conjunction with a 180° fixed camera.
HDR – not quite ready
BT’s unilateral presentation on site will be in full 4K (from Sony HDC-1400s), coordinated from a Telegenic UHD outside broadcast unit. This, and the VR production trucks, leave from Wembley straight after de-rigging from the FA Cup Final to arrive in Kyiv on Wednesday morning ready for the build-up on Friday.
The lack of HDR in Ukraine (and for its linear feed from the FA Cup) is explained by BT as a distribution issue rather than a technical one. While monitoring the different flavours of HDR and SDR output is complex, it is the lack of in-home displays capable of receiving HDR which makes going to town on it somewhat redundant at this moment.
This hasn’t stopped BT Sport innovating around HDR however. For the first time – BT likes to claim it as a world first – the 4K HDR coverage of the FA Cup Final included 4K HDR graphics.
Produced in tandem with graphics partner Moov and using ChyronHego’s Lyric platform to map the HDR graphics to the 4K live output, the result “offers a whole new branding and design perspective for creatives to explore,” says Beale.
BT Sport also completed a live trial from Wembley broadcasting HD HDR to smartphones cloned with its app [not universally available] capable of displaying the higher dynamic range.
Having recently acquired 5G spectrum from the UK government via mobile division EE for £302.6m, the broadcaster is looking ahead to rolling out a 5G network but doesn’t believe transmitting 4K HDR video to mobile is necessarily the best way to go.
Like Fox Sports in the US, which is trialling 4K HDR from two cameras at the upcoming US Open Golf, BT Sport believes 5G’s early benefits are more applicable to contribution rather than distribution.
“Because of the size and quality of the vast majority of smartphone screens we don’t believe that streaming 4K over mobile is necessary, and believe that HD HDR would provide a substantial uplift in user experience,” says Hindhaugh.
UEFA says that bringing a UCL Final to Ukraine is an opportunity it has been looking forward to, but it does come with certain logistical challenges due to the geographical distance between the Olimpiyskiy Stadium and UEFA HQ in Nyon.
“The logistics of organising a Final across this distance are challenging as far as the organisation of all staff and resources on-site is concerned,” it says. “This proved to be a challenge worth overcoming as the venue is set to provide a beautiful Final, and it additionally provided a test of procedures ahead of producing EURO 2020 across the European continent.”
Chief suppliers to UEFA for the event include Sunset & Vine and Germany’s TVN, as providers of the host broadcast production team; Gearhouse Broadcast for the Technical Operations Centre, feed distribution, FANTV and commentary; TV Skyline for commentary and special cameras; Deltatre for graphics and VR; Incast for commentary and media monitors; Heusser TV for FANTV production; Hawk-Eye for goal line technology; PERI providing scaffold and cable ways; as well as transmission and archive from the EBU.
BBC close to UHD HDR live from Russia
It is increasingly likely – though the BBC won’t commit at this stage – that some if not all of the 33 FIFA World Cup games to which the broadcaster has rights will be streamed live in UHD HDR over BBC iPlayer.
BBC Executive Producer for Football Phil Bigwood says: “It all depends on the trials in workflows and what can get back from Moscow. We are in ongoing conversations with FIFA. There’s a cost balance [to consider].”
He adds, “Watch this space – in a couple of weeks.”
The BBC has been trialling the format online for many months, notably upping the ante with the April streaming of a rugby league match in UHD HDR. The BBC contributed significantly to the HDR broadcast standard Hybrid Log-Gamma but the potential for anyone at home to display HDR images remains limited.
“That’s a big issue in terms of availability – it’s a niche product at present.”
All 33 live games will in any case be simulcast (in HD) online. iPlayer gets around 19 million views a month. “We are confident we can cope with it all streamed live on the website,” says Bigwood.
The live stream is a more complicated operation since it carries a lot of additional and interactive on-demand content, such as goal catch-ups and highlights.
The BBC is sharing resources with ITV to a greater extent than in many recent major events when the broadcasters team up in Russia next month.
“The cold war is over,” jokes Bigwood. “Budget undoubtedly comes into it – it’s a huge country to work in and it made sense to join up where we could.”
Match feeds and the FIFA Max Server containing a wealth of content are being shared, as is a main studio overlooking Red Square, while the broadcasters are adjacent to each other at the IBC in northwest Moscow.
With rights across all platforms, the BBC is offering more World Cup coverage than ever, including full 64-game radio coverage and red button. But it is on social media where most of the additional firepower is being concentrated.
Facebook, Twitter and Snapchat will receive quirkier news stories and analysis, for example.
“Our talent buys into it,” says Bigwood, referring to Gary Lineker’s hefty 7 million followers.
BBC teams in Salford will have full remote access to the FIFA Max Server, onto which HBS, the host broadcaster, will dump content from 40 crews (one following each team) and eight roving ‘colour’ crews. That means the BBC needs to send fewer camera crews itself, and allows for bespoke packaging of clips for social media, while feature editing and match highlights can be cut in Manchester.
“It’s definitely the most comprehensive one we’ve ever done,” says Bigwood. “I have looked after every World Cup since 2002 and the change is amazing in recent years. The demand has grown but so too have the options available to us.”

Saturday 19 May 2018

D-VHS: This is how your HD film collection might have been

RedShark News
The 2004 sci-fi feature I, Robot is notable for being more than just the last decent Will Smith movie. Oh yes. It was also the end of the road for D-Theater, a Hollywood-branded packaged media format which had the distinction of being the only format to provide high definition programming before the arrival of Blu-ray and HD-DVD.
It was also one of the shortest-lived physical media formats, ill-served by a combination of confused marketing, competing and incompatible equipment, limited programming and a market starved of HD TVs.
Along with I, Robot studio 20th Century Fox, three other studios – DreamWorks SKG, Artisan Entertainment (now part of Lionsgate) and Universal – had backed the format, which was otherwise known as D-VHS.
Releases included The Fast and the Furious, Fight Club, The Ninth Gate, Love Actually, Mulholland Drive and X-Men.
At the time of D-VHS’s introduction, in 1998, DVD was making inroads into the home. Released in 1995 and developed by Philips and Sony together with Toshiba and other partners, DVD was not, however, an HD format. What’s more, the recordable versions of DVD didn’t support HD either.
Spotting an opening in the market for a stop-gap until disc media advanced, JVC teamed up with Hitachi, Matsushita (Panasonic) and Philips to devise a new pre-recorded and recordable tape format for the living room.
D-VHS players were cunningly compatible with existing S-VHS cassettes but had the advantage of playing back D-VHS tapes – and recording programming – in SD and in HD.
D-VHS VCRs recorded MPEG-2 at 1080i or 720p in a choice of speeds: High Speed (HS), Standard (STD) and Low Speed (LS, subdivided into three- and five-speed variants), so a single tape could offer a variety of capacities. The quality of STD was considered better than DVD, since this speed had a much higher bitrate (roughly 14 Mbit/s versus 5 Mbit/s) and suffered fewer dropouts.
D-VHS was originally a standard definition format that recorded at the STD speed. When high definition recording at HS speed was later introduced, it consumed tape twice as fast. For this reason, a DF-240 cassette that records 240 minutes of standard definition holds only 120 minutes of HD.
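The arithmetic is easy to verify: capacity is simply bitrate multiplied by running time. A quick sketch, assuming a constant recording bitrate and taking HS as double the quoted STD figure:

# Back-of-the-envelope D-VHS tape capacity from bitrate x duration.
# The 14 Mbit/s STD bitrate is quoted above; HS at double STD is an assumption.
STD_MBIT_S = 14.0
HS_MBIT_S = 2 * STD_MBIT_S

def capacity_gb(bitrate_mbit_s: float, minutes: float) -> float:
    """Data laid down at a constant bitrate over a duration, in gigabytes."""
    bits = bitrate_mbit_s * 1e6 * minutes * 60
    return bits / 8 / 1e9

print(capacity_gb(STD_MBIT_S, 240))  # ~25.2 GB: a DF-240 at STD speed
print(capacity_gb(HS_MBIT_S, 120))   # ~25.2 GB: the same tape holds half as much HD

That squares neatly with the 25GB blank-tape figure mentioned below.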
As a result of all these different speeds, the tape labels were reportedly confusing for the consumer.
More critically, recordings from Mitsubishi and JVC D-VHS decks (NTSC and PAL) were incompatible. Tapes recorded on the JVC in D-VHS could not be played on the Mitsubishi or vice-versa. JVC's HM-DH40000U and SR-VD400U were also the only units to support the lower speed LS5 recording.
Also hurting demand was poor marketing, resulting in low consumer knowledge of D-VHS's advantages and capabilities. D-VHS had input limitations too. The FireWire input was the only way to record HD content to tape from an owner's TV, but very few TVs had FireWire connections and cable boxes with FireWire had to be specially requested from cable companies. Satellite FireWire-equipped boxes were rare or non-existent.
What’s more, studio-recorded D-Theater tapes, protected by D-Theater encryption, could only be played on D-VHS players bearing the D-Theater logo. Even then, D-Theater tapes were only released in the US.
The introduction in 2006 of Blu-ray and HD-DVD put the final nail in the D-VHS coffin.
Well, not quite. The fact remains that while the commercial life of D-VHS was cut short, it can also be argued it was a product released ahead of its time.
You can still buy blank D-VHS tapes, obtain about 80 movies in the format from eBay and play them back on second-hand JVC VCRs.
Assuming you don’t mind chunky cassettes, the reason you might want to is that, unlike 20 years ago, almost everyone now has at least an HD TV and, by all accounts, the quality of a D-VHS recording is superior to anything on standard DVD.
I’ve not tried it, but with standard tapes holding 25GB, rising to 50GB for higher-capacity versions, it should also be possible to store 4K video (played back in HD).
In online forums where D-VHS is brought up, the overriding sentiment is one of: ‘Why have so few people heard of this?’ and ‘Wow, the picture quality is pretty good’.
Life in the old dog yet.

Friday 18 May 2018

A way of seeing: In conversation with Roger Deakins


IBC

Looking back on almost 50 years of iconic films, ranging from The Shawshank Redemption and The Big Lebowski to Skyfall and Blade Runner 2049, the British cinematographer discusses his creative approach and his life in pictures. 
It irritates Roger Deakins to be called an artist. Cinematographers do not create art, he says: “We are storytelling. I help directors to tell the story they want in visual imagery.”
It’s a typically no-nonsense and modest response from the Devonshire-born filmmaker who is nonetheless a master of his profession. Famously nominated for 14 Oscars (eventually landing one this year for Blade Runner 2049) and first choice collaborator for directors Sam Mendes, Denis Villeneuve and the Coen brothers, the sixty-eight-year-old member of the British Society of Cinematographers (BSC) and its US counterpart is declared by former President of the ASC, Richard Crudo, “the pre-eminent cinematographer of our time.”
What stands out in a 46-year career spanning iconic films like Dead Man Walking, The Shawshank Redemption, The Big Lebowski and Skyfall is Deakins’ grounding in the visual, rather than the technical, aspects of the role.
“I certainly think there is an obsession with technical abilities at the expense of creativity and substance,” he says. “If you can light and photograph the human face to bring out what’s within that person, you can do anything.”
A natural painter who continues to sketch frequently while working out shot composition, Deakins has also been an avid photographer since childhood – indeed he might have become a photojournalist had his path not led to film school.
It was at Bath School of Art and Design, studying graphic design, that his passion for still photography took over and hastened a decision to enrol at the National Film and Television School. On graduation, he spent seven years travelling the world making documentaries.
One might think there is quite a gap between ‘run and gun’ documentary realism and the designed, rehearsed and infinitely larger scale production of a dramatic feature but for Deakins the progression was natural.
 “Certainly, on a feature film you can create a set and create the lighting and adapt what is in front of you to help tell a story in a more fictional way, but throughout my career I have used the techniques I learned through taking still photographs and making documentaries. I don’t see the two as dissimilar.
“The docs I made were generally unscripted and we would attempt to portray a situation, whether the aftermath of conflicts in Zimbabwe or Eritrea or more anthropological [such as ceremonial traditions of the Raj Gond in India] as truthfully as you can. That’s similar to features where you are working out how best to cover a scene. It’s an intuitive reaction to what is front of you. Sometimes you get the chance of another take, but often you do not. There will be something about the actor’s performance or about the existing light which will be unique, and you have to capture that.”
His first major feature film collaboration was with director Michael Radford on Nineteen Eighty-Four, a 1984 adaptation of Orwell’s novel, which Deakins still ranks among his own best work.
“It was the scale of the challenge and the brilliance of what Mike did with that film,” he says. “His vision for the film matched what I took from the novel and it was very much in tune with the culture and politics of the time.”
There followed a succession of acclaimed British indie features including Personal Services, Stormy Monday and Sid and Nancy, before Deakins shot his first Hollywood production, Mountains of the Moon, for Bob Rafelson in 1990. Shortly afterwards he met Joel and Ethan Coen, lensed Barton Fink, and forged an inseparable creative partnership on their films, including Fargo, No Country for Old Men and Hail, Caesar!
 “With any director I’ll discuss the choice of location and sets and discuss the way the camera might move and lens length, and how subjective or objective the camera might be,” Deakins says. “I usually start off with a thorough read of the script, literally reading line by line but each director will have their own way of working – some prefer to be extremely thorough in advance [like the Coens], others want the spontaneity of doing it all on the day.
“A cinematographer has to be able to run a crew and take them on a journey,” he explains. “When I began I was actually quite shy, so working with large crews was daunting at first but it’s a very fundamental part of the job. You also need to make decisions very fast. In some ways a director can spend a little more time making decisions than their DP, who has to make sure that what the director decides, happens.”
Unlike many DPs who assign themselves a camera operator, Deakins prefers to get behind the camera himself, another legacy of his grounding in documentaries.
“I find that being involved with the camera and the way it moves is important to the way I work. It’s more important than lighting really.”
Likewise, recording digitally hasn’t altered his approach. “It’s a bit more reassuring seeing the image you are shooting on-set and knowing what it is you recorded rather than receiving it from the lab the next morning, but the mechanics of working with a camera and a lens haven’t changed. You are still working with light and camera movement to give the audience a perspective on the story.”
What digital has enabled, he says, is a reduction in the size of the camera, and therefore greater portability and speed of image capture. However, he is generally dismissive of the latest image enhancements like ultra-high resolution, higher frame rates and higher dynamic range.
Having supervised the HDR transfer of films he has shot for Blu-ray release, he says: “In theory it’s nice to have that range. In reality, it makes the image kind of crude and superficial. I am not a fan.”
He adds, “There are so many new ways of telling stories visually. I like digital, but I am a purist. I don’t like 3D or other immersive technology. I like cinema to be like seeing a picture on a wall, as if you walked into a gallery and saw an Edvard Munch painting coming alive.”
Deakins though is no techno-refusenik. He has shot digital almost exclusively for the last six years and isn’t nostalgic about the demise of film; “I love film… but things move on.”
You might think the advance of technologies that automate the extraction of scene metadata, deciding focus, framing and exposure in post, is anathema to someone like Deakins, whose modus operandi is to create the ‘look’ of a film in camera.
“I am actually surprised that this hasn’t happened already and that we still use a camera with a lens,” he says. “This technology is kind of old fashioned.”

Having lent his expertise as a consultant on the look of a number of animated features, including WALL-E and How to Train Your Dragon, Deakins doesn’t see a fundamental difference between computational cinematography and his goal of telling a story.
“An animated film is created in a computer by placing a virtual camera anywhere you want and putting a light anywhere you want. It can be as photoreal as you want it to be. I feel that this kind of animated production will merge and blend with live action at some point.”
Love of film
Growing up in Torquay, Deakins had no connection to the business “or with [TV / film] artists and certainly no expectation of working in the arts.” But he loved film.
He joined a film club and was exposed to French new wave filmmakers Jean-Luc Godard and Alain Resnais, themselves influenced by the work of Jean-Pierre Melville whom Deakins also admired.
He recalls the impression made on him by The War Game, a BBC TV drama directed by Peter Watkins about nuclear war, deemed so unsettling that it was kept off air for 20 years.
“I encountered so many films and filmmakers from the film club. My overriding experience was ‘Wow, what is this!’,” he says. He holds particular appreciation for the distinctive cinematography and mise-en-scène of directors (and their DPs) like Andrei Tarkovsky (Ivan’s Childhood, Stalker) and Kenji Mizoguchi (Ugetsu, The Life of Oharu). Russian director Andrei Zvyagintsev (Leviathan, Loveless) is one of the few filmmakers working today reminiscent to Deakins of this style. “We are in a different world of superhero movies now, where the effect is more theme park than cinema,” he says.
He finds it sad that films are increasingly premiering online (on the small screen) and that hardly anyone in power in Hollywood has seen a movie by Tarkovsky, Melville or Mizoguchi.

Having wrapped principal photography on The Goldfinch for John Crowley (Brooklyn) starring Nicole Kidman earlier this year, Deakins says he selects projects only if he is emotionally drawn to the story.
“I ask myself whether I’d go and see this in a cinema. Making a film is a long investment of time; it’s always quite hard work with long hours, and an exhausting process if you care about what you are doing. So, I need to feel emotionally invested in the story.”
An occasional speaker at the NFTS when he is in the country, Deakins is always prepared to hand on advice. His website devotes a section to respond to the queries of aspiring cinematographers and peers alike.
“My advice is that you can observe somebody else and they can be an inspiration, but you can’t copy them. You have to discover your own way of doing it.”
One of Deakins’ heroes was the late Conrad Hall ASC, who lensed Butch Cassidy and the Sundance Kid, American Beauty and Road to Perdition, all of which won him Academy Awards. Hall’s 1972 film Fat City influenced Deakins’ decision to shoot movies instead of stills.
“I was excited by his work because he shot more in the Italian neo-realist way than traditional cinematographers,” he says. “He had a certain eye; a sense of seeing. That is really all you have to offer as a cinematographer. Otherwise, what have you got?”
Deakins’ eye is recognisable across his body of work.
“You spend your life trying to figure out how you see things. You are interpreting what is in front of you. An image is your own in some way. It’s a discovery.”


Is TV’s future in your hands?

Broadcast

Quiz app HQ Trivia is setting the pace for interactive broadcasting
https://www.broadcastnow.co.uk/home/is-tvs-future-in-your-hands/5129490.article


Last autumn, the team behind Twitter’s six-second video app Vine launched what is arguably the world’s most successful new gameshow.
HQ Trivia is a free-to-enter, live playalong show, accessed via a mobile app. It attracts audiences of up to 2 million, mostly in the US, for its twice-daily broadcasts. A UK version launched in January and has built an audience of around 200,000.
On the face of it, HQ Trivia is an old-fashioned pub quiz dressed up in familiar TV quiz show trappings. It’s a rapid-fire, presenter-led knockout competition, with 12 multiple-choice questions of increasing difficulty.
Users select their answer within a 10-second time limit (preventing Google searching), and correct responders progress to the next round.
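Mechanically, the format could hardly be simpler to model. Here is a toy sketch of the elimination logic, with all names and structures invented for illustration rather than taken from HQ’s actual code:

# Toy model of an HQ-style elimination round: answer correctly within the
# time window or you're out. Illustrative only; not HQ's real implementation.
ANSWER_WINDOW_S = 10  # the 10-second limit that deters Google searching

def run_question(players: set, correct: str, answers: dict) -> set:
    """Return the players who survive this question.

    `answers` maps player id -> (choice, seconds_taken); absent players are out.
    """
    survivors = set()
    for player in players:
        choice, elapsed = answers.get(player, (None, ANSWER_WINDOW_S + 1))
        if choice == correct and elapsed <= ANSWER_WINDOW_S:
            survivors.add(player)
    return survivors

players = {"amy", "ben", "cho"}
answers = {"amy": ("B", 4.2), "ben": ("C", 3.1), "cho": ("B", 11.0)}
print(run_question(players, "B", answers))  # {'amy'}: ben wrong, cho too slow

Run that twelve times over a shrinking player set and you have the entire gameshow.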
“TV producers might feel the format is lo-fi, crude and lacking in innovation, but it has been executed in a very impressive way. HQ Trivia is setting the pace for scheduled live interaction,” says Tom McDonnell, chief executive of Monterosa, the company that helped the BBC devise one of the first live TV and digital programmes, Test The Nation, in 2002.
App integration
Mobile companion apps connected to TV programmes are nothing new. “But in many cases, they are not integrated into the show format from the start,” says Rob DeFranco, vice-president of sales and development at interactive producer The Future Group (TFG).
Lost In Time, which TFG co-produced with Fremantle Media, featured a mobile connected app that allowed viewers to play against TV competitors in real-time.
Gamification is a proven mechanism for generating loyalty and recurring behaviours from viewers, and HQ Trivia developer Intermedia Labs has latched onto a formula that is attracting admirers.
“Indies, broadcasters and app developers are all thinking about how they can pivot ideas towards HQ’s brand of appointment-to-view programming,” says Tom Young, executive producer at Somethin’ Else, a London-based indie that has developed online audiences for ITV’s I’m A Celebrity… Get Me Out Of Here!, Saturday Night Takeaway and The Voice UK.
Indeed, Intermedia Labs co-founder Rus Yusupov has publicly declared an ambition “to essentially build the future of TV”.
Many concur with him. An article in the New Statesman argued that HQ Trivia rather than Netflix signals the future of television because viewers want to be participants, not mere spectators.
“HQ Trivia is certainly the future of broadcast,” says Peter Maag, executive vice-president of video streaming software developer Haivision. “The combination of synchronisation and acceptable latency delivery at scale is amazing at driving engagement through interactivity.”
HQ pushes reminders out to players to ‘tune in’ to the next broadcast, which creates a sense of anticipation around the live event. “It’s dictating the lifestyle habits of its users in a way that TV and radio station schedulers can only dream of,” says Young.
A ‘Friends on HQ’ feature lets users search for and connect with friends and family, the first of several enhancements coming to the app aimed at cementing the appointment to view.
“There is a compelling social element in which players can see and play against friends and other competitors in the same space,” says Tom Williams, chief executive and founder of digital production designer Ostmodern.
“The boundary between viewer and gameshow has been broken down. The viewer is intimate with the live event. In a way, they become the star.”
It’s also exciting to realise that hundreds of thousands of people have been knocked out while you are still in the running.
“Seeing thousands of chat messages stream past underneath the live video provides a sense of occasion that is more immersive than a TV show can offer,” says McDonnell.
The show’s 15-minute duration is also important. “There are lots of examples of indies trying to create short episodic drama, but stickiness has been low,” says Williams. “HQ Trivia has no narrative arc, meaning viewers can just drop in.” Young admires HQ Trivia’s hyperactive running order.
“The gameplay zips by with punchy formulaic links that its audience know inside out,” he says. “Only the countdown seems to last longer than 10-15 seconds. It’s content that’s purpose built for a short attention span.”
HQ’s main draw is the carrot of winning cash prizes from a typical pot of around £15,000 shared between sometimes hundreds of winners.
According to a Strategy Analytics survey, viewers like the ability to win prizes and receive discounts, and are least interested in posting photos and videos related to the show on social networks like Instagram.
Funded by venture capital, HQ’s makers are hunting sponsors. Nike and Warner Bros have advertised on the app, the latter promoting movie Rampage by drafting in actor Dwayne Johnson as host and upping the cash giveaway to $300,000 (£220,000).
Betting companies like Betfair would be a logical fit for longer-term sponsorship, suggests Strategy Analytics analyst Brice Longnos.
“The prizes and rewards earned must be tangible and feasible to obtain, relevant to the user’s interests and comparable with the amount of time and effort being put into using the app,” he adds.
Provided HQ Trivia keeps audiences loyal, the potential for monetisation is evident. As a result, one valuation puts Intermedia Labs at around $100m (£75m).
It is the wider potential for interactivity with video that causes most excitement. “Visually, it feels incredibly personal – talking directly to its users, framed perfectly as vertical video,” says Young.
“This approach is certain to influence the future of live mobile streaming. Talent auditions, news broadcasts and online shopping formats seem a logical evolution, but how about live music or stand-up comedy gigs, delivered directly to you daily at a regular time?”
McDonnell ponders: “What if iPlayer, BBC3, ITV Hub or even Netflix began to introduce playalong concepts? Talkshows or news programmes could be made more engaging by involving audiences, and a new generation of talent shows could run rapid-fire audition knockouts at lunchtime every day.”
The growth of eSports and interactive platform gaming among 10-35s should also spur producers into action. “Traditional broadcasters and producers need to borrow and incorporate these strategies in their formats to stay relevant and increase revenue streams,” says TFG’s DeFranco.
Live-streaming over mobile networks is critical to HQ Trivia’s success and the technology behind it should not be underestimated. The app has experienced glitches and struggled to cope with peaks in demand.
 “We’re working on making the service more reliable as we scale to meet demand,” HQ Trivia tweeted to players in November.
Monterosa, which created the playalong companion for Channel 4’s Million Pound Drop, the interactive app for Love Island and a Doctor Who mobile trivia game for the BBC, is offering broadcasters a pre-built solution.
Launched at Mip TV, Gameshow Live is a customisable live-to-mobile gaming platform supporting features such as trivia, polls and leaderboards, a push notification system and low-latency video. “It is in production with a small number of clients,” says McDonnell.
He says the big challenge for HQ will be competition. Like Farmville or Pokemon Go, the game may go viral but have a limited shelf life.
“Perhaps we need to look at apps like these as periodic: you’re going to get a year out of the idea, and that’s just how life is now,” says McDonnell.
Live synchronisation
Synchronising with live broadcasts is a wider issue. “Cross-platform formats increase the complexity compared to standalones,” says DeFranco.
“There are different platforms, different time constraints and different form factors to consider. The necessary co-ordination is now happening prior to the show being produced and developed, rather than being an afterthought.”
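One widely used pattern, sketched below under the assumption of timed metadata cues carried in the stream itself, is to key every interactive event to a stream timestamp rather than to wall-clock time, so each player sees the question at the same moment of video whatever their individual delivery latency:

# Sketch: sync play-along events to the stream position, not the clock.
# Event times and field names are illustrative; real systems vary.
EVENTS = [
    {"at": 12.0, "type": "question", "payload": "Q1: ..."},
    {"at": 22.0, "type": "lock", "payload": "answers closed"},
    {"at": 25.0, "type": "reveal", "payload": "correct answer: B"},
]

def due_events(last_pts: float, current_pts: float) -> list:
    """Events whose stream timestamp fell between two playback positions."""
    return [e for e in EVENTS if last_pts < e["at"] <= current_pts]

# Each client fires events off its own playback position, so a viewer two
# seconds behind another still sees the question at the same video moment.
for event in due_events(10.0, 13.0):
    print(event["type"], event["payload"])  # -> question Q1: ...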
Some believe further advances in streaming technology will ultimately make massive multi-user live interaction to mobile as straightforward as today’s broadcast.
“We are seeing requests for all types of genres,” says DeFranco. “The formula will become commonplace in the next three years.”

Microsoft's Surface Hub 2 claims it will change the way we work together

RedShark News
Few people are going to argue against working as a team in a ‘collaborative office culture’ being a good thing to aim for, although the jury is out as to whether corporate employers will ever truly make their staff feel like more than the cogs in the machine they actually are.
“Unlocking the power of the team has never been more important,” according to Microsoft’s blurb, which reckons this can be achieved by buying its Surface Hub digital whiteboard.
The fact is that at €6,500 (or €18,500 for an 84-inch version) not many people did. Microsoft says it sold only 5,000 of them, albeit to over half of the Fortune 100 companies, which serves to highlight the product’s rarefied price.
However, Redmond thinks it is onto something and is going ahead with a sequel. Surface Hub 2 has some interesting characteristics which could set it apart – provided the price is right. It’s not available until 2019 and there’s no information on cost at this stage, so it’s difficult to tell.
It will, though, come in just the single 50.5-inch size and will have a mount capable of rotating the 4K display from landscape to portrait mode. 4K cameras will rotate along with the screen and there are integrated speakers and far field mic arrays too, intended to allow those in the meeting room to feel that anyone dialling in remotely is right there with them as well.
The impact of this will be enhanced by an unusual 3:2 aspect ratio, so that in portrait mode the video caller will appear pretty much life-size – an effect made all the more impressive if the caller (or callers) are also equipped with a Surface Hub 2.
It will run the gamut of Microsoft corporate software including Teams, Microsoft Whiteboard, Office 365 and Windows 10.
Microsoft is pitching this as a bold attempt at transforming the way we work. Corporate vice-president for Surface computing Panos Panay describes it on the Microsoft blog as more of a huddle board – something “to get people out of their seats, to connect and ideate, regardless of location.”
Ideate. Mmm. The Merriam-Webster dictionary illustrates the transitive verb, meaning ‘to conceive’, with this example sentence: “The psychotic would repeatedly ideate the act of committing murder.”
I digress. 
“People will engage differently as the form factor enables them,” Panay continues. “It's sort of like your phone. You pick up that device for the first time and your behaviour has changed. You pick up a Surface device for the first time, and you've changed. We already have customers that are using it, we have IT pros telling us what they need, we have facilitators telling us where the future of the office is going. We can see how people are starting to change.”
Another feature – again only useful for companies with the cash in their AV budget – is that up to four of the displays can be ganged together, either in a larger portrait or landscape view, or used separately.
According to Microsoft this will have a profound impact on what groups can accomplish together, allowing users to display multiple pieces of content side-by-side or work simultaneously across Microsoft Whiteboard, PowerBI, PowerPoint, and a full view video call.
Of course, you can write on it too using a stylus for drawing, sketching and annotating. Being able to tilt the screen should make writing easier.
Microsoft quotes academic studies suggesting that companies that promote collaborative working are five times as likely to be high-performing.
Whether Surface Hub 2 does indeed move beyond “just passing along information” to a world where we are “collaborating real-time” is moot.

Wednesday 16 May 2018

How more standards could help the move to Media 4.0


IBC

Metadata is crucial as an enabler of automated production and targeted content. Would a standard help?
When a broadcaster shoots and distributes content, they are throwing away up to 95% of their raw material. For live broadcasts the figure is closer to 99%. Yet this massive wasted asset could be monetised if it can be accessed and shared online by anyone, or better still anything, within the media company.
Technology is arriving in the form of machine learning algorithms that will enable the automated production of video tailored to individuals on specific social media platforms, smartphones, streamed channels, and TV.
Some dub this Media 4.0 – the mass customisation and distribution of video content targeted to different channels (broadcast, digital and social media) using AI and metadata within an existing workflow.
Getting to this stage requires knowing what the content is and where it resides.
TVU Networks Chief Executive Paul Shen says: “The effort of locating content you’ve already shot is often costlier and potentially slower than going out and re-shooting material. With the increasing demand from consumers for customised video content combined with the coming 5G networks, producing more sophisticated stories faster will be critical to satisfy the market. The first step on this path has to be to index everything.”
The process of identifying and distributing video content so that media producers can follow their audience to whichever device they’re viewing on is possible now, but it’s a fragmented picture.
Codifying standards
There have been many attempts to codify standards for metadata in the past, most notably a push by the EBU to adopt a standard known as EBU Core. There are also well established common descriptive and rights formats including TVA and ADI, and common identifiers such as EIDR (Entertainment ID Registry) and ISAN, and, of course, many technical metadata standards.
Each broadly supports four basic tenets: data structure, content, value and format/exchange.
Prime Focus Technologies Vice President and Global Head, Marketing & Communications, T. Shobhana, says: “Together these provide the rules for structuring content, allowing it to be reliably read, sorted, indexed, retrieved, and shared. When metadata records are formatted to a common standard, it facilitates the location and readability of the metadata by both humans and machines.”
Avid VP Platform & Solutions Tim Claman adds: “We think a standard for time-based metadata would aid in the discovery of content. It should feel similar to enabling a search on the internet. If web pages were not designed using a common language and if data were not represented in a consistent form it would be impossible to find anything online. The industry should learn from that and agree to a common language and a common structure for time-based metadata.”
However, he cautions on the practicality of achieving this. “The industry has a mixed track record of developing and implementing metadata standards. You can’t be overly prescriptive without being restrictive.”
While a global metadata standard may ease broadcaster workflows, this is not considered likely.
IPV Executive Vice President of Sales and Marketing Nigel Booth says: “A lot of different standards already exist, but asset management vendors want to differentiate themselves – and their use of metadata is one of the ways they do this. So, standardising how content is tagged isn’t likely to be popular.”
Tedial General Manager, US, Jay Batista agrees: “Different AI vendors are supplying engines with various tagging options, and they consider their logging parameters both proprietary and a competitive edge in the marketplace.”
Broadcasters would benefit the most from a unified metadata schema, he says. “Yet, many content producers believe they must maintain an internal core metadata index for their unique productions and business requirements, and often these internal data models are not shared.”
It is believed more realistic to develop a way of sharing data rather than standardising it.
“It’s more important to standardise how content can be uniquely identified,” says Paul Shen. “If it is simple and transparent enough, we may not even need a standards body.”
However, it can be challenging to do this even within one company, let alone sharing data with third party systems and external media partners.
“We have done MAM projects which have failed because it has proved hard to get all stakeholders in one organisation to agree,” says Claman. “Even when you do get agreement on the metadata governance it is often only for a period of time.”
He explains that Avid conceives of metadata in strata. “What these layers have in common is the ability to be expressed as individual frames or moments. The more you can aggregate that time-based strata, the more discoverable your content becomes.”
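A minimal sketch of the idea, with every field name invented for illustration rather than drawn from any vendor’s schema: descriptive tags anchored to time ranges, so that layers from different sources can be merged into one searchable index:

# Time-based metadata "strata": tags anchored to time ranges, merged into
# one index. Field names are illustrative, not any vendor's actual schema.
speech_layer = [
    {"start": 12.0, "end": 15.5, "tag": "goal", "source": "speech-to-text"},
]
vision_layer = [
    {"start": 11.8, "end": 16.0, "tag": "goal celebration", "source": "object-recognition"},
]

def aggregate(*layers: list) -> list:
    """Flatten strata into a single time-ordered index."""
    return sorted((r for layer in layers for r in layer), key=lambda r: r["start"])

def find(index: list, term: str) -> list:
    """Return the moments whose tags mention the search term."""
    return [(r["start"], r["end"], r["source"]) for r in index if term in r["tag"]]

index = aggregate(speech_layer, vision_layer)
print(find(index, "goal"))  # both layers point at the same on-screen moment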
Avid advocates the idea of a consortium to devise such a standard, much like the way the industry united to forge standards around carrying audio, video and ancillary data over IP.
“Some vendors go into these [standardisation efforts] looking for an opportunity to differentiate and maybe claim intellectual property and get an edge,” warns Claman. “A consortium will work best if vendors follow the lead of users. It leaves less room for proprietary technology to be introduced into the mix.”
“If MAM providers are required by large broadcasters to standardise, it’s possible that vendors will be forced to collaborate to put forward a single-solution way of working,” says Booth. “An example of where this has happened is IMF (Interoperable Master Format).”
TVU revealed it is working with a number of equipment manufacturers and major broadcasters – believed to include Disney – on the best approaches to the issues. An initial meeting is being held in June.
“We want to create a consortium which would provide guidance to both manufacturers and media companies,” says Shen. “Every part of the industry needs to come together if [automated production] is to happen faster. I don’t believe any one company can do the heavy lifting.”
Sharing, not conflicting
One aim is to address potential conflicts in working with metadata originated under different AI/ asset management protocols.
Primestream Chief Operating Officer David Schleifer says: “The immediate area where I would see conflict is in assuming that the value of metadata in a file would be the same regardless of the AI-driven dataset that generated it. As the area is still maturing, I would not assume that, for example, face recognition from one system would be equal to face recognition from another. In fact, one system may focus on known famous people while the other might be a learning algorithm to build collections – therefore, different data used in different ways built on similar technology.
“AI is a perfect example of where an existing metadata schema would need to be expanded,” he adds. “With AI we do not yet know where it is going or how it will be used, so the schema needs to be extensible, allowing for growth. At a high level you can sort all types of metadata into categories like ‘tied to the entire asset’ or ‘tied to moments in time or objects in the image at specific times’, and so on. But in the end, creating the schema first will always lead to revisions later.”
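A rough sketch of the split Schleifer describes might look like the following: some fields are tied to the whole asset, some to moments in time, and an open ‘extensions’ bag leaves room for AI outputs that do not exist yet. All names here are invented for illustration.

```python
# Hypothetical extensible schema: asset-level fields, time-based tags,
# and an open extensions dictionary for future AI-generated metadata.
from dataclasses import dataclass, field

@dataclass
class TimedTag:
    start: float   # seconds into the asset
    end: float
    label: str
    source: str    # provenance: which engine produced this tag

@dataclass
class AssetMetadata:
    asset_id: str
    title: str                                            # tied to the entire asset
    timed: list[TimedTag] = field(default_factory=list)  # tied to moments in time
    extensions: dict[str, object] = field(default_factory=dict)  # room to grow

meta = AssetMetadata("abc123", "Evening News")
meta.timed.append(TimedTag(42.0, 47.0, "goal", source="vendor_a_logger"))
meta.extensions["sentiment"] = {"overall": "neutral"}  # added later, no migration
```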
Standardising the ontologies (the terms used to describe media) used within different domains would be useful when sharing content.
“Standardisation in this area would mean less confusion across industries,” says Booth. “For example, IPV’s Curator uses controlled vocabularies to ensure consistency and accuracy. Specific terms are selected and tagged instead of having different operators selecting their own terms.”
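In practice a controlled vocabulary can be as simple as a validation step at tagging time. The snippet below is a minimal sketch in that spirit, not IPV Curator’s actual implementation, and the terms are invented.

```python
# Minimal controlled vocabulary: operators may only apply agreed terms.
CONTROLLED_VOCABULARY = {"interview", "goal", "press-conference", "highlight"}

def apply_tag(tags: set, term: str) -> None:
    """Add a tag only if it belongs to the agreed vocabulary."""
    if term not in CONTROLLED_VOCABULARY:
        raise ValueError(f"'{term}' is not in the controlled vocabulary")
    tags.add(term)

tags: set = set()
apply_tag(tags, "interview")   # accepted
# apply_tag(tags, "chat")      # would be rejected: free-text synonyms cause drift
```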
An alternative is to exchange data through open technologies such as XML and REST APIs, which are increasingly popular for interchange between systems.
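By way of illustration, such an exchange reduces to an HTTP call returning a common serialisation. The endpoint and payload shape below are hypothetical:

```python
# Hypothetical REST exchange: fetch an asset's metadata as JSON.
import json
import urllib.request

def fetch_metadata(asset_id: str) -> dict:
    # Invented endpoint; substitute the real API of the MAM in question.
    url = f"https://mam.example.com/api/v1/assets/{asset_id}/metadata"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```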
“The challenge with descriptive metadata is that you don’t know ahead of time what is going to be interesting after the fact,” says Claman. “For this reason, for news and sports, you want as much automation of metadata creation as possible.
“We need extensible data models if we’re going to see widespread adoption.”
Metadata fusion
Booth calls for ‘metadata fusion’: bringing together data saved by different systems and checking where it agrees. “Doing so means that you can improve reliability. An example of this is combining speech-to-text and object recognition - if they both identify similar metadata, it’s likely correct. The key thing is to understand the provenance of the metadata - as long as you capture it, you can make a decision based on it.”
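The fusion idea can be sketched in a few lines: tags from independent engines are merged, agreement raises confidence, and provenance is retained so downstream users can judge each tag. The structures and the confidence boost below are invented for illustration.

```python
# Toy 'metadata fusion': merge tags from two engines, boost confidence
# where they agree, and keep the provenance of every reading.
def fuse(speech_tags: dict, vision_tags: dict) -> dict:
    """Each input maps a label to one engine's confidence score (0..1)."""
    fused = {}
    for label in set(speech_tags) | set(vision_tags):
        sources = {}
        if label in speech_tags:
            sources["speech_to_text"] = speech_tags[label]
        if label in vision_tags:
            sources["object_recognition"] = vision_tags[label]
        boost = 0.2 if len(sources) > 1 else 0.0   # agreement raises confidence
        fused[label] = {
            "confidence": min(1.0, max(sources.values()) + boost),
            "provenance": sources,                 # where each reading came from
        }
    return fused

print(fuse({"goal": 0.7}, {"goal": 0.6, "crowd": 0.8}))
```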
Downstream, licensing and reconciliation rules need to be considered and adhered to. Additionally, some content owners have clear contractual rules which restrict platform operators from modifying their data.
Piksel Joint Managing Director Kristan Bullett says the answer lies in “providing clear traceability of the origination of metadata, and also providing a mechanism to lock or restrict modification of attributes that should not be modified.”
Piksel is initiating its own metadata group, joining up disparate systems so that customers can purchase, ingest and manage localised metadata on a per-title basis, enabling advanced recommendation and content discovery functions.
The first metadata providers to join are Austrian content discovery specialist XroadMedia, Bindinc Metadata from the Netherlands, France’s Plurimedia and Mediadata TV from Spain. It is not yet known whether providers like ThinkAnalytics, Rovi/TiVo and Gracenote will be ‘invited into the club’ or whether the group will act as a purely competitive offer to these alternatives.
Piksel said its ‘ecosystem’, established primarily to aid content editors in augmenting and enhancing their existing metadata, will prove particularly useful for customers dealing with multilingual or cross-territory titles.
“Platform operators have been abstracted away from the responsibility for their metadata needs and need to work with the data that has been provided to them,” says Bullett. “Part of our vision is to bridge this gap and put that decision-making process into the hands of the people who are responsible for ensuring end customers get the best possible user experience.”
Automated production, personalised distribution
For production, TVU’s solution is MediaMind, which embeds metadata on ingest using speech-to-text recognition, as well as AI that identifies objects and people in specific video frames in real time. Content can be searched with TVU’s own search engine, and an API allows broadcasters to integrate it with existing MAM systems for archival search.
“If the producer is just interested in a few frames out of a twenty-minute file, today’s manual search processes can make locating the exact frames time-consuming,” says Shen. “Using an Artificial Intelligence engine with object and voice recognition will automate the process of tailoring and distributing clips to the appropriate outlets.”
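In outline, once an engine has tagged each frame at ingest, finding those few frames becomes an index lookup rather than a manual scrub. The index format below is invented, not TVU’s:

```python
# Toy frame-level index: labels recorded at ingest point straight to frames.
from collections import defaultdict

index: dict[str, list[int]] = defaultdict(list)

def ingest(frame_number: int, detected_labels: list[str]) -> None:
    """Called per frame at ingest with the AI engine's detections."""
    for label in detected_labels:
        index[label].append(frame_number)

# Simulated ingest (in practice the recognition engine supplies the labels).
ingest(14_500, ["reporter", "city-hall"])
ingest(14_501, ["reporter"])

print(index["city-hall"])   # jump straight to the relevant frames
```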
Tedial takes a similar approach, targeting sports production. Its SMARTLIVE tool uses AI logging to automatically create highlight clips and push them to social media and distribution platforms.
“Applications are being developed, especially in reality television production, where auto-sensing cameras follow motion,” says Batista. “AI tools such as facial recognition augment the media logging function for faster edit decisions as well as automatic social media deliveries.”
The current state of the art in AI augments rather than replaces news and sports production, supplementing the human-curated event presentation with automated storytelling.
The evolution of these tools suggests programmes will at some point be created entirely automatically to cater for different consumer tastes.
“With the growing capability of technology to collect every bit of data to analyse consumer behaviour, it could one day become plausible to create a formula for how content should be produced based on the target audience,” says Shen.
The BBC has been working on this, in the shape of object-based media, for nearly a decade and is expecting to deliver it within the next five years.
“Metadata in this space will be crucial as an enabler of targeted content,” says Batista. “Object-based media is a hugely interesting concept and could dramatically transform the way content is consumed. There are challenges, obviously, but you can easily see from a resource, storage, distribution, consumption and analytics perspective what opportunities this could bring.”
The media company of the future could be a mass producer of individually tailored content. A clue to how this would look is in music distribution. Five years ago, consumers tended to download music tracks to add to a personal collection. Today, it is more likely they will prefer a Spotify-like service to curate and stream the content they want for them.
“Media 4.0 will see video production move from a programme-centric to a story-centric process, where the content is automatically produced, targeted and distributed to the viewer,” says Shen. “Producers create the video content, and the AI engine automates the assembly of the material and delivers it in the most effective way to the target audience.”
Whether this is desirable or not for all content is another matter.
“The risk and challenge here is not in our ability to move certain types of programming to an automated process, but rather the loss of editorial judgement,” says Schleifer. “Systems that produce content in this manner will adhere to specific rules, and as a result will produce consistent content that will never challenge us to get out of our comfort zone. We need to figure out how a system like this can continue to push the envelope and challenge us.”