Wednesday 29 August 2018

Media Factory or Customised Curation? Content's future at IBC2018

content marketing for Rohde & Schwarz

Themed around ‘the search for growth’, IBC will showcase AI-assisted workflows, the status of IP infrastructure and the future of content.


Entering IBC2018 it’s clear that broadcasters are emerging from a period of initial, high-level exploratory discussions about IP, to one in which detailed requirements and challenges are now being actively investigated. The core protocols for IP video workflows are certainly there in ST-2110, but it’s the next level of detail, largely based on feedback from early adopters, that will further inform the standards bodies and help vendors to continue to advance IP workflows.
Broadcasters want a managed, seamless migration to IP with little to no risk. They don’t want to schedule a hard cut date and hope for the best. They want time to test their infrastructure, running legacy systems in parallel when possible, so that risk is completely mitigated.
That’s why, worldwide, the rollout of uncompressed IP is still very much in its infancy. There have been some SMPTE 2022-6-based installations replacing SDI, and even a few of the new SMPTE 2110-based installations. Unless they are building greenfield sites, broadcasters will seek hybrid solutions to transition operations to IP over the next few years.
Expect talk among CTOs of the merits of cloud-based microservices. This concept breaks playout down into its constituent parts, enabling broadcasters to pick and choose variables like data centres, bit rates and templates. Such an approach should make launching a channel as easy as opening an app on a mobile phone.
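As a sketch of the idea, a ‘channel recipe’ for such a microservices platform might look something like the following. Everything here – the service names, regions and parameters – is invented for illustration; no real vendor API is implied:

```python
# Illustrative sketch only: composing a playout channel from its constituent
# microservices. All names and values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ChannelRecipe:
    name: str
    data_centre: str          # where the playout services are instantiated
    video_bitrate_kbps: int   # target encode bitrate
    graphics_template: str    # branding/template package to load
    services: list = field(
        default_factory=lambda: ["ingest", "scheduler", "encoder", "origin"]
    )

def launch(recipe: ChannelRecipe) -> dict:
    """Return a deployment plan: each constituent service placed in the chosen data centre."""
    return {
        "channel": recipe.name,
        "region": recipe.data_centre,
        "pipeline": [f"{svc}@{recipe.data_centre}" for svc in recipe.services],
        "encode": f"{recipe.video_bitrate_kbps} kbps",
        "template": recipe.graphics_template,
    }

plan = launch(ChannelRecipe("PopUpSports", "eu-west", 8000, "sports-lower-thirds"))
print(plan["pipeline"])
```

The point of the pattern is that launching a second channel is just another recipe – swap the data centre, bit rate or template and redeploy, with no monolithic playout chain to rebuild.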
One buzzword that will be everywhere on the show floor is Artificial Intelligence (or Augmented Intelligence, or even Intelligence Amplification, depending on the degree to which you believe an AI can ‘think’). The technology is beginning to prove its mettle with practical examples of production and cost efficiency. Fast-turnaround, high-volume content like sports and news are logical targets for AI-facilitated products.
Linked to this is the increasingly viable concept of just-in-time content assembly. Here, small segments are created from the live event and used quickly. Catch-up and on-demand follow the linear programme with ever shorter delay. Sports is the prime example: a tennis match can be available on-demand within minutes of it ending. At Wimbledon this year, an AI workflow automated virtually the entire process of creating and publishing two-minute highlight reels online.
By breaking down a piece of media into separate objects, attaching meaning to them, and describing how they can be rearranged, a programme can change to reflect the context of an individual viewer.
The BBC, for one, thinks this approach (which it calls object-based broadcasting) has the potential to transform the way content is created and consumed: bringing efficiencies and creative flexibility to production teams, and enabling them to deliver to every member of the audience a personalised feed that reflects individual viewing habits.
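A minimal sketch of how such object-based assembly could work in practice – tagged media objects plus a rule for reassembly – with invented field names rather than the BBC’s actual schema:

```python
# Hypothetical object-based media model: each object carries a duration and
# descriptive tags; a simple assembler builds a per-viewer timeline.
media_objects = [
    {"id": "intro",     "duration": 30,  "tags": ["essential"]},
    {"id": "goal_1",    "duration": 45,  "tags": ["football", "highlight"]},
    {"id": "interview", "duration": 120, "tags": ["football", "analysis"]},
    {"id": "outro",     "duration": 20,  "tags": ["essential"]},
]

def assemble(objects, viewer_tags, max_duration):
    """Keep essential objects, then add viewer-relevant ones until time runs out."""
    timeline, used = [], 0
    for obj in objects:
        relevant = "essential" in obj["tags"] or set(obj["tags"]) & viewer_tags
        if relevant and used + obj["duration"] <= max_duration:
            timeline.append(obj["id"])
            used += obj["duration"]
    return timeline

# A viewer who wants a short, highlights-only version:
print(assemble(media_objects, {"highlight"}, max_duration=100))
# A viewer with time for analysis gets a longer personalised cut:
print(assemble(media_objects, {"analysis"}, max_duration=300))
```

The same source objects yield different programmes per viewer; only the tags and the time budget change.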
All these trends hook into a new stage of media industrialisation, or commoditisation, in which media is broken down to be reassembled as in a factory. It is up to the industry whether this leads to bland, cookie-cutter content or to fresh and challenging forms of curated editorial creation.

Monday 27 August 2018

Blast Off - Sam McCurdy BSC talks Lost in Space


British Cinematographer
To reboot the 1965 TV series Lost in Space, Netflix returned to the source material, in the form of the novel ‘The Swiss Family Robinson’, published in 1812, and combined it with a visual treatment straight out of the future.
The adventures of a family stranded light years from home on an unknown planet were first told in an enjoyably camp Irwin Allen version for CBS, and revisited in a 1998 New Line Cinema feature film. The Netflix version, produced by Legendary Television, is set in 2046, spans ten one-hour episodes and stars Toby Stephens, Molly Parker and Parker Posey, while also featuring Taylor Russell, Ignacio Serricchio and Mina Sundwall.
Stranded with no idea where they are, the family try to fix their spaceship as they encounter various mysteries, a robot, and a saboteur in their midst.
Showrunner (executive producer) Zack Estrin tapped director Neil Marshall and cinematographer Sam McCurdy BSC to set the project on its way. McCurdy lensed six instalments of the series, including the pilot, with Emmy-nominee Joel Ransom shooting five episodes to complete the first season. McCurdy had previously shot The Descent and Game of Thrones’ episode “Blackwater” for Marshall.


“Zack very clearly wanted the crux of the story to be about the family, but for the driving force to have a darker, more grown-up undertone,” says McCurdy. “The fundamental story is about a family lost in a difficult, life-threatening situation and how it challenges them and brings them closer together. In every tone reading and visual reference meeting we would mention (James Cameron’s) The Abyss and the Spielbergian world of E.T.”
Indeed, there is a scene in one episode in which a young member of the family has a close encounter with a creature, deliberately echoing the iconic shot of Elliott befriending E.T.
“We are all big fans of sci-fi – Neil, Zack and I – but we were keen to do something that wasn’t a brash, glossy superhero show. We wanted to build a world that was completely ours, but there are some definite references in there.”
The biggest thematic reference though was Jurassic Park/Jurassic World. “We know we are making a family show but it’s one grounded in a reality,” says McCurdy. “Because of the freedom Netflix gives you we knew we had the support to make this more a theatrical than a TV visual experience.”
They were wary, too, of comparisons with Netflix’s Stranger Things, wanting to differentiate their story by delivering a modern rather than an ’80s-nostalgic look. The template here was Christopher Nolan’s films, from The Prestige to Interstellar, which infuse high concept with realism.

These sensibilities led to McCurdy’s choice of the Red Weapon with Helium 8K S35 sensor. “I’ve been a believer in Red for some years and applaud the way they try to develop new technologies to suit new ways of filming,” he explains. “We tested the Helium extensively and it was quickly apparent that there was nothing else out there like this in terms of its modern look.”
McCurdy paired the Red Weapon with a set of Leica Summilux-C Primes. “These offer a very sharp, European sort of feel if you stop down a few stops, and a softer, American movie feel if you shoot a little more wide open,” he describes. “I wanted the softness to depict the family, but the extra sharpness and extra depth of field for visual effects to play around with.”
With a third to half the total budget being spent on visual effects, McCurdy wanted his shots to be big. “We don’t need to be spending money on a greenscreen comp outside of a window. If a shot needs VFX then let’s make it a big shot. Let’s make sure they are getting money from the set. So, we framed for family/group shots rather than close-ups. For me, this takes on a much more cinematic feel because you are not just cutting heads together but cutting from group shots against amazing locations and sets.”
The theatrical aesthetic also determined the use of two cameras (A-B) - and primary use of dollies and tracks with very little Steadicam.
“Although we deployed two Weapons, we shot the show as a single-camera drama. We used very little handheld or Steadicam and instead went back to basics. We didn’t want the new gimmicks like fancy 360-degree shoots. I felt we needed to be intimate with the family and their drama, and we were there to photograph that.”

Shot at The Bridge studio in Vancouver, a good third of the series used locations in British Columbia. “We travelled beyond Whistler and Blackcomb to mountain ranges covered in pristine snow. Just physically taking all the equipment up there was a big deal. The locations are remarkable and a big part of the show’s aesthetic.
“Even though we knew the plates would be augmented by VFX – and they could have chosen to design everything in a laptop – we all wanted to keep as much reality flowing through the piece as we could,” McCurdy notes.
The planet on which the Robinson family crash-lands houses several ‘eco-structures’, ranging from barren, snowy mountains to glacial landscapes and a rainforest of giant redwoods 30 feet in diameter.
“You are kind of blown away by the sheer scale of this wilderness and that’s entirely what we wanted to capture,” he says. “The Helium was phenomenal in natural light. It has a softness and a curve that coped with the extremes of big skies and snow as well as contrasts of massive tree lines and being deep in forested areas. We pushed it as far as we could, and it held up both for creative reasons and for VFX. The depth of exposure and latitude gained by the sensor meant it was the obvious choice.”
With a 4K deliverable requirement, McCurdy shot 7K compressed 7:1. McCurdy, along with production designer Ross Dempster and gaffer Todd Lapp, devised a lighting system for the spacecraft, crew quarters, mobile vehicles and ‘garage’ to operate in a range of different scenarios. The fixtures were routed back to a console under McCurdy’s control, allowing him, for example, to switch the lights to ‘emergency’ mode or have individual lights illuminate parts of the ship as a crew member walks through them.


The inspiration for part of the design was the smart lighting effects found in modern office buildings, which switch off after a certain time, or switch on automatically when they detect movement. “Each of the fixtures had a daylight tungsten LED and an RGB LED system in it so we could mix any combination of colours and run sequences at a flick of a switch. This gave us incredible 360-degree freedom to move the camera where we wanted.”
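The preset behaviour described can be sketched as a simple lookup: each fixture mixes a white LED channel with RGB, and a named preset recalls a complete look in one call. The preset names and levels below are invented, standing in for the real console programming:

```python
# Hypothetical lighting-console presets: each fixture mixes a white
# (tungsten/daylight) LED with an RGB LED system. Values are illustrative.
PRESETS = {
    "normal":      {"white": 0.8, "rgb": (0, 0, 0)},
    "emergency":   {"white": 0.1, "rgb": (255, 0, 0)},    # dim white, red wash
    "walkthrough": {"white": 0.0, "rgb": (40, 60, 120)},  # cool corridor glow
}

def apply_preset(fixture_ids, preset_name):
    """Return the state each fixture should adopt for a named preset."""
    state = PRESETS[preset_name]
    return {fid: dict(state) for fid in fixture_ids}  # independent copy per fixture

# Switch a run of corridor fixtures to 'emergency' mode at the flick of a switch:
corridor = ["corridor_1", "corridor_2", "corridor_3"]
looks = apply_preset(corridor, "emergency")
print(looks["corridor_2"]["rgb"])  # (255, 0, 0)
```

Because every fixture is addressable from one table, re-rigging a scene becomes a data change rather than a physical one, which is what frees the camera to move 360 degrees.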
He adds, “It was an incredible experience technically and creatively because we weren’t tied to the key lights. I don’t think we ever had need in any episode to bring a key light onto the set.”
Preferring to mix the camera “for the moment” rather than use a LUT, McCurdy set daily parameters with the on-set colourist. “The colour was then kept through the offline but inevitably when it came to DI, with so many VFX shots needing to go in, some things changed. A scene that was maybe scripted as dawn or dusk ended up as day or night.”
In what is now becoming commonplace for high-end drama, the entire show was given an HDR pass, as well as an SDR master. “Having done a few HDR shows, I was keen to split the two. You can’t just let one copy across to another. You need to start from scratch, and Legendary were very understanding with this, giving us a week for the HDR and 4-5 days for the SDR colour correction [in the pilot] with time reducing a little per episode.
“I was very proud when I saw the first cut of the pilot. It felt like a movie.”


One DP… Three distinct looks - Danny Cohen BSC

British Cinematographer

Cinematographer Danny Cohen BSC exhibits versatility in crafting stories for different directors, and those skills were brought to bear when shooting three films slated for release this year. Two were period pieces – Victoria and Abdul and Final Portrait – and one contemporary – Disobedience.

All three films were shot with Red cameras, but the look and feel of each couldn’t be further apart. “I guess you get comfortable working with a certain camera,” says Cohen. “The more you do, the more you can push things. You know its quirks, you understand the sensitivity to light, and recognise when you might lose detail in the highlights. There are always reasons to select the camera for a specific story and for these three films the Red just slotted in and worked.”
Victoria And Abdul sees Cohen reunited with director Stephen Frears after their acclaimed work on Florence Foster Jenkins. The enlightening and unlikely friendship between the Commonwealth’s daunting Monarch and an Indian clerk is set around the late 1800s. This time around, Cohen selected a Red Dragon combined with ARRI Master Primes for the bulk of principal photography.
“That combination gave a good strong image – and a bit of bite,” he says. “Since the image you shoot on-set can be manipulated in so many ways, you need to begin with quite a strong image. And I definitely wanted that slightly punchier image to resonate with the story.”
As luck would have it, Cohen’s pre-production coincided with an exhibition at the Victoria & Albert Museum about early colour photography. “People were beginning to experiment with various formats. Naturally, there’s some colour deterioration from prints over 150 years old, but you definitely get a flavour of how things looked through a lens at that time,” he notes.

Cohen also shot a smaller, more intimate film. Directed by Stanley Tucci, Final Portrait is about American art critic James Lord’s relationship with artist Alberto Giacometti. It mostly plays out in the artist’s studio in Paris in 1964, and was shot at Twickenham Studios.
“We only had 20 days to film and so each day we had to achieve a lot,” Cohen recalls. “I couldn’t really hang about orchestrating lights for each set-up. We used a lighting rig with a number of set-ups pre-programmed – day, night, hard sun, soft sun, flat, overcast – so we could swap the look at the push of a button.”
Cohen, who was Academy Award-nominated for The King’s Speech in 2011 and who shot the 2016 Best Picture Oscar contender Room, is gaining a reputation for being able to conjure fresh angles for telling a story within confined spaces.
“What I tried to do on Final Portrait was to have very few lights on the floor so I could give Stanley and the actors (Geoffrey Rush, Armie Hammer) the freedom to go anywhere, and not be held back by lighting stands and clutter. In my experience, actors give a better performance when they are not having to worry about tripping over equipment.”
That’s also why he felt the Red camera was perfect. “During prep, Stanley said that he wasn’t interested in using a dolly at all. He wanted the whole film handheld. We sent the dolly back so we weren’t even tempted to use it. Being handheld gave the film energy. Some moves were slick, others were messy, but because the Red is quite small, we shot two cameras (second camera was Iain Struthers) both of us with a camera on our shoulder for the length of the day. If we’d tried to do that with Alexa we’d have been a collective wreck.”
A major factor swaying Cohen’s decision on all three pictures was the light weight of the Red camera lump. “I’d say half of Victoria And Abdul, nearly all of Final Portrait and about two thirds of Disobedience were handheld. It really helps that the Red is small and user-friendly.”
The look of Final Portrait was inspired by archive footage of Giacometti in his Paris studio. “It was fascinating and informative to have that replicated by the superb production design of James Merifield,” he says. “I responded to seeing Giacometti in the original space, how he moved around and how it was lit at different times of day to get a sense of what the place must have been like.”
Cohen paired the Red Dragon with Zeiss Master Primes for Final Portrait and used the same arrangement for Disobedience – a film with a completely different aesthetic.
Shot last spring on location in North London, Disobedience is directed by Sebastián Lelio and stars Rachel Weisz and Rachel McAdams. It’s the adaptation of Naomi Alderman’s novel concerning a romance between members of an Orthodox Jewish family.

Disobedience
“It’s a very intimate story,” Cohen elaborates. “I was trying to use natural light and then manipulate it to make sense of the story. It’s a desaturated look, quite stylised and very different from Victoria And Abdul, which is lush and rich. I enjoy stylistically jumping around so long as it’s appropriate to the script.”
The trio of films were shot at 6K to maximise dynamic range with Victoria And Abdul and Final Portrait down-rezzed to 2K for cinema. Disobedience was treated with a 4K finish.
Cohen is currently back working with Frears on the TV drama A Very English Scandal, about politician Jeremy Thorpe, who was tried for conspiracy to murder in the 1970s. It’s Cohen’s second project with the Red Weapon 8K S35 with the Helium sensor, following Night In Hatton Garden for director James Marsh and his retelling of the headline-grabbing jewel robbery starring Michael Caine. Stay tuned for those visuals in 2018.


Friday 24 August 2018

Object-based broadcasting is coming, with wide implications for the media business

Videonet

Of all the technology initiatives that broadcasters are exploring, the one with arguably the most profound impact is not UHD-HDR or virtual reality or even OTT streaming. It is the ability to slice and dice content into a personalised feed delivered just to you on-demand, with customised editorial, length and quality of experience that fits the device you are using and the environment where you watch. This is all underpinned by object-based delivery over an end-to-end IP acquisition-to-distribution chain.
BT Sport has been exploring the potential for this concept for at least two years at its data-centric sports property MotoGP, trying to get fans more immersed in the action. Last month, BT Sport chief Jamie Hindhaugh called object-based delivery “the next major initiative.”
By breaking down a piece of media (a frame, a piece of audio, an object in the frame) into separate ‘objects’, attaching meaning to them and describing how they can be rearranged, a programme can change to reflect the context of an individual viewer. The individual would, in effect, be allowed to curate their own programme.
Live sports programmes are already at the forefront of just-in-time content assembly, as small segments are created from the live event and used quickly. Catch-up and on-demand follow the linear programme with ever shorter delays. A tennis match can be available on-demand in a matter of minutes after a game has ended.
Another UK broadcaster, the BBC, has been pioneering research into object-based broadcasting. Its progress update last week imagined how audiences in 2022 might create their own personalised streams for ‘Match of the Day’ (its flagship live and highlights football show), the weather forecast or even the popular soap opera ‘EastEnders’.
The BBC goes further and imagines the production roles that could emerge. We could see a ‘live reversioner’ who edits news programmes on-the-fly. There could be interactive drama producers who use automatically marked-up rushes of actors to offer bespoke packages, and who have access to all camera streams (from the cloud), with rushes classified automatically from AI-powered transcription.
The BBC thinks this technology has the potential to transform the way content is created and consumed. It anticipates efficiencies and creative flexibility for production teams, enabling them to deliver a personalised feed that understands the individual viewing habits of every member of its audience. “It’s about moving the whole industry away from thinking of video and audio as being hermetically sealed, and towards a place where we are no longer broadcasters but datacasters,” explains the BBC’s CTO, Matthew Postgate.
The audio side of object-based broadcasting has been developed in parallel, and in many ways is more advanced. Dolby leads the way here. It has reworked Atmos, its cinema audio mixing and playback technology, for use with TV. Sky Sports has introduced Dolby Atmos for subscribers using its Sky Q set-top box. BT Sport offers similarly enhanced viewing.
App developer Axonista has built an online experience for the shopping channel QVC using what it describes as an object-based workflow. This is able to extract graphics from the live signal so the ‘Buy now’ button on the QVC app becomes a touchscreen option on a smartphone.
The next step for object-based media pioneers is to find ways of making this concept scale, and making it infinitely repeatable and standardised. BBC R&D is partnering with Germany’s Magix Software and the French research institute b<>com in an EU-funded project called Orpheus, which is working to build an end-to-end object-based audio broadcast system. The initiative is based on the BBC’s IP production studio.
The BBC has devised a media composition protocol to help drive scale and standardisation. The result is UMCP (Universal Media Composition Protocol – only a working title) which enables descriptions of media sequence timelines, processing pipelines and control parameters. “The crux of the problem, as with any standard, is finding the sweet-spot between being well-defined enough to be useful, but free enough to allow for creative innovation,” BBC R&D says in a blog.
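Since UMCP itself has not been published in detail, the following is only a guess at the general shape such a composition description might take – a timeline of references to media objects plus a processing pipeline, expressed as plain data. Every field name here is hypothetical:

```python
# Hypothetical composition description in the spirit of UMCP: a timeline of
# media-object references with start times, plus processing steps and
# control parameters. Not the BBC's actual format.
composition = {
    "timeline": [
        {"object": "motd_intro",   "start": 0.0,  "duration": 15.0},
        {"object": "goal_cam_3",   "start": 15.0, "duration": 40.0},
        {"object": "pundit_audio", "start": 15.0, "duration": 40.0},  # parallel audio object
    ],
    "pipeline": [
        {"step": "loudness_normalise", "target_lufs": -23.0},
        {"step": "render", "resolution": "1920x1080"},
    ],
}

def total_duration(comp):
    """End time of the composition: the latest (start + duration) on the timeline."""
    return max(item["start"] + item["duration"] for item in comp["timeline"])

print(total_duration(composition))  # 55.0
```

The ‘sweet spot’ the BBC describes is visible even in this toy: the timeline is well-defined enough for a machine to render, yet the objects themselves stay free for a producer to swap or rearrange.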
There is a maze of other complexities to solve. For instance, an object-based workflow will need to manage rights for new versions of content that are assembled from many existing content parts. Then there is the IP infrastructure needed to efficiently narrowcast different versions of, say, ‘Match of the Day’ to millions of viewers at a time.
Despite the challenges, this is the way forward – content tailored for each individual viewer. The more sophisticated this becomes, the more personal the service will be, as the user interface itself will be different for each individual.
This trend impacts every area of the media business, with the structure of intellectual property rights just one example. It will influence how media is scheduled, and how advertising packages are put together and sold. Media asset management and broadcast business software systems will need even tighter integration.
Metadata becomes all-important because of the need to create enough relevant tags to define preferences in ever greater detail. AI and machine learning – buzzwords for IBC2018 – will play a key role, processing large amounts of data in a meaningful way at the level of the individual user.

Thursday 23 August 2018

Friday Night Dinner for Celere primes


content marketing for VMI

The latest series of the long-running Channel 4 comedy Friday Night Dinner, for Big Talk Productions, was filmed by DoP Matt Wicks using Celere primes.
Each series has been distinguished by a different DoP, and over ten years and five series you can see that each has its own look and feel, as successive DoPs have approached the task in differing ways while still maintaining a level of consistency in keeping with the show’s style.
“There is one aspect of Friday Night Dinner that I feel governs most of the decisions you make when shooting the show - that it’s all filmed on location in an actual house,” explains Wicks, who trades as Feathercut Films. “This makes things rather tricky at times as the space is very limited, especially when there can be up to seven cast members in a scene at one time and two cameras. Therefore, we needed a compact camera and kit footprint.”
Since all previous series had been shot on ARRI Alexa, a pair of Alexa Minis were the obvious choice. Wicks was also aware that previous series had been shot on zoom lenses but he felt he really wanted to use primes.
“My main reason for this was to try as much as possible to add depth and texture to the frame which can be tricky when your actors are three feet in front of a beige wall,” he explains.
“I’d used the Celere HD primes before on a couple of shoots and was pleased with how complementary they were to skin tone and their drop off. I didn’t want to share a set of primes between A and B cam so both of us [operator Barney Crocker] had our own set combined with two Angénieux zooms each.”
There was, therefore, a budget discussion to be had. “I knew at the back of my mind that the Celeres are very affordable and, having been happy with them previously, I decided to test them first,” says Wicks, who visited VMI to find the set he wanted.
“We looked at all the lenses from T4 down to wide open and found they worked best around T2.8–T4, going a tad milky wide open,” he says. “We also looked at a set of Zeiss Super Speeds Mark II and found them fairly similar.
“I would say there’s a touch more character to the Super Speeds but I didn’t feel it would make too much difference to our show,” he shares. “We actually ended up using a Super Speed 35mm during the shoot when one of the Celere 35mm had a focus ring issue and was sent off to be repaired. They intercut perfectly.” 
The series was shot mainly on the 25mm, 35mm and 50mm, in HD at ProRes 4444 XQ. Wicks tried to play a lot of the scenes in two-shots or ‘deep threes’ but felt that for the close-ups the 35mm looked really nice.
“You could bring the characters close to camera with no distortion and you could see more in the background which I feel works best for comedy,” he elaborates.
“Overall, throughout the shoot the lenses performed very well. Both 1st ACs were impressed with them. We only had an issue with one 35mm which had to be sent back a few times for the same focus ring issue. One thing I would say is that we noticed, at wide open when filming outside, that the flare we got from headlights and any practical lights in vision was quite distracting. A strange ghosting would appear around the light. It was for this reason that we used the zoom lenses outside for the night exteriors. Other than that, we were very happy with the lenses and the support we received from VMI.”


AI: Building the future of broadcast


IBC
Artificial intelligence technology is swiftly moving from experiment to practical use across production workflows and into the heart of content creation.
Not so long ago it was the subject of science fiction, but artificial intelligence is now being used to write pop songs, break news, assemble reality shows and even create Hollywood blockbusters.
Software first developed at New Zealand’s Weta Digital for the Avatar and Planet of the Apes films has been adapted by Vancouver-based Ziva Dynamics to create computer-generated simulations that can transform the way visual effects studios create characters.
Its machine learning algorithms are trained on real-world physics, anatomy and kinesiology to simulate natural body movements, including soft tissue-like skin elasticity and layers of fat. It is claimed to animate CG characters in a fraction of the time and cost of traditional VFX – and it’s been used on major productions Pacific Rim and Fantastic Beasts.
Japanese start-up Amadeus Code offers one of many AI systems being trained to produce music at the touch of a button. In its case, a user uploads a list of songs and the AI analyses them before automatically generating new pop tracks based on era, rhythm and range, all via an iPhone app.
These are just two examples of AI’s pervasive reach across the industry right into the heart of content creation. It is taking on laborious, expensive tasks such as closed captioning, metadata tagging and social media clip generation. Because of its ability to crunch volumes of data and yield meaningful results it is swiftly moving from experiment to practical use.
When the half-brother of North Korean leader Kim Jong-un was murdered in Malaysia in 2017 the news agency that broke the news – half an hour before anyone else – was Japanese start-up JX Press Corp. It used AI to scour social media to find breaking news then used another algorithm to write it up.
So impressive are its results that broadcasters NHK, Fuji Television and TV Asahi are clients, with the latter’s deputy editor-in-chief Koichiro Nishi quoted by Bloomberg as saying: “It’s a world of 100 million cameramen. A must-have tool.”
Endemol Shine Group (ESG) is using a Microsoft Azure AI workflow to replace an entirely manual selection process in the Spanish version of reality show Big Brother. Through machine learning, the system recognises patterns of language, keywords and emotional reactions. It tracks, monitors and indexes the activities of the Big Brother house’s residents and infers relationships between them.
“Watching screens for so many feeds and arduously logging moments is very tedious,” explains Lisa Perrin, CEO Creative Networks, ESG. “Now, we zero in on the most interesting actions rather than wading through hours of footage.”
Declaring the technology “ground-breaking”, Perrin says it will “completely revolutionise the way we produce our global formats” and open up “an unprecedented level of creative freedom”.
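The kind of keyword and emotion indexing described above can be sketched as a time-coded moment log. The cue lists and transcript data below are invented; this is not Endemol Shine’s or Microsoft’s actual pipeline:

```python
# Toy illustration: scan a transcript feed for keyword cues and log
# time-coded "moments" so producers can jump straight to them.
KEYWORDS = {
    "argument": ["shut up", "liar"],
    "romance":  ["love", "miss you"],
}

def index_moments(transcript):
    """transcript: list of (timecode_seconds, speaker, text) tuples."""
    moments = []
    for tc, speaker, text in transcript:
        lowered = text.lower()
        for label, cues in KEYWORDS.items():
            if any(cue in lowered for cue in cues):
                moments.append({"time": tc, "speaker": speaker, "label": label})
    return moments

feed = [
    (120, "Ana", "I love this house"),
    (305, "Ben", "You are a liar and you know it"),
    (420, "Ana", "Just shut up, Ben"),
]
for moment in index_moments(feed):
    print(moment)  # producers review only these flagged moments
```

A real system would use machine-learned language and emotion models rather than fixed cue lists, but the output is the same in spirit: an index of interesting actions instead of hours of raw footage.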
Accenture reports a major Latin American content producer experimenting with AI to automate and optimise production script creation for telenovelas. “An AI might help maximise the number of scenes scheduled for shooting per week, maximise the use of scenarios, minimise actors’ idleness or reduce time to move between shooting locations,” says Gavin Mann, the consultancy’s Global Broadcast Industry Lead.
AI-powered algorithms are able to analyse every nook and cranny of every frame of video, making it possible for a sports production team to sift through a mountain of metadata and put together a montage of great plays in a few seconds.
Getting granular
Wimbledon, for example, used IBM’s AI to automate the tagging and assembly of two-minute highlight reels for online publication. The system rates each play on metrics such as crowd noise and player gesture, speeding the search for creative editors building more extensive highlights packages. Israel’s WSC Sports has developed a similar automated workflow for the United Soccer League in the US and is currently churning out 300 clips per game in near real time.
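A hedged sketch of that rating-and-packing approach: score each play on simple metrics, then fill a two-minute reel with the top scorers. The metrics, weightings and clip data are invented, not IBM’s:

```python
# Illustrative highlight assembly: rank plays by a weighted excitement
# score, then pack the reel greedily within a time budget. All data and
# weights are hypothetical.
plays = [
    {"id": "ace_set_point", "crowd_noise": 0.9, "gesture": 0.8, "duration": 12},
    {"id": "rally_long",    "crowd_noise": 0.7, "gesture": 0.3, "duration": 35},
    {"id": "double_fault",  "crowd_noise": 0.2, "gesture": 0.1, "duration": 8},
    {"id": "match_point",   "crowd_noise": 1.0, "gesture": 1.0, "duration": 20},
]

def build_reel(plays, max_seconds=120):
    """Rank plays by excitement, then add them until the reel is full."""
    ranked = sorted(
        plays,
        key=lambda p: 0.6 * p["crowd_noise"] + 0.4 * p["gesture"],
        reverse=True,
    )
    reel, used = [], 0
    for p in ranked:
        if used + p["duration"] <= max_seconds:
            reel.append(p["id"])
            used += p["duration"]
    return reel

print(build_reel(plays))  # highest-scoring plays, within the time budget
```

The same scores double as search metadata: an editor cutting a longer package can simply sort by the excitement score rather than scrubbing through the whole match.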
“AI essentially turns haystacks of information into needles of insights, which might be the best metaphor yet for how traditional media companies can advance their businesses in a big way by focusing on all things small,” says Joe McGarvey, Marketing Director at Imagine Communications.
AI-powered machines are also proving adept at identifying unwanted content. Google reports that AI, not humans, detected about 80% of the 8.28 million videos removed from YouTube in the last quarter of 2017. Facebook acted against 1.9 million pieces of content on its platform in the first quarter of 2018, detected as fake accounts and fake news by AI.
“For many, the primary driver of adoption of AI technology is the opportunity to automate routine workflows that are manually executed,” says Stuart Almond, Head of Marketing and Communications, Sony Professional Europe. “Calling upon metadata in particular is a catalyst towards a richer environment for audiences. When applying this consumer lens, that’s when AI gets really smart and creates real, tangible benefits for both companies and end-users.”
Netflix is probably one of the best examples of how AI can help create a richer and more tailored experience for consumers, while at the same time driving business efficiencies.
“Its AI-driven recommendations engine is safeguarding over $1 billion of revenue each year by showing consumers the content they are really interested in and, in turn, keeping them from cancelling the service,” says Almond.
“It is a strong proof point that shows AI-based solutions can have a significant positive effect on revenues, if done right. The key going forward is adopting media supply chains that support this, bringing content acquisition and production into this process.”
Data, down to the finest detail, is now the currency with the most spending power in the media and entertainment industry. The more granular that media companies can get when it comes to knowing their networks, their audience and the way their audience is consuming content, the richer they will be.
“The challenge for media companies is finding a way to manage all the information generated from every aspect of workflow from viewer preferences to rights and network errors,” says Ian Munford, Akamai’s Director Product Marketing, Media Solutions. “Most media companies are drinking from the fire hose but AI has the potential to turn that data into action. Most uses of AI today are cutting edge.”
Speaking at CES at the beginning of this year, Amazon Vice President of Alexa Software Al Lindsay had advice for those concerned about an AI-powered future.
“Learn to code now,” he said. “If you know how to code, you can help change the outcome the way you want it to be. It’s that simple.”
AI at IBC
There has been a large focus on AI within Sony’s media services, which returns to IBC under the banner of ‘Go Make Tomorrow’. “The key drivers will be to open up more efficiencies and possibilities with how content is used in any workflow,” says Almond. “Sony is fiercely committed to collaborating with industry bodies and innovators to help our customers drive efficiencies and tap the potential of new technologies like AI and machine learning.”
Accenture is working with broadcast and video clients to incorporate AI into projects spanning basic automation of back office processes and compliance checks, to the optimisation of programming schedules and interpreting payments for complicated royalties contracts.
“We believe AI’s real power is helping reimagine business by augmenting, not replacing, human capabilities,” asserts Mann. “Automation in content review is one area in which companies can use AI to leap ahead on innovation and profitability.”
With such a new technology, and one developing at an incredible pace, Mann says often clients want to start with a small proof of concept to demonstrate that it actually works. “We can help them measure what is working, scale fast when it does and fail fast when it doesn’t.”
Accenture also offers access to its Amsterdam-based Innovation Center (only a mile from the RAI) for further discussion and demonstration of its “very wide range of use cases and client stories”.
Nuance Communications, which describes itself as a pioneer in conversational AI, says it is seeing demand for enhanced targeting based on voice profiles. “Telecommunications customers are asking for the ability to better target and tailor specific offers and messages to their end users,” states Dan Faulkner, SVP & GM. “Developments are beginning to make this targeting more intelligent.”
At IBC2018, Nuance is presenting a new voice biometrics tool for its voice and natural language understanding platform Dragon TV. Aimed at Smart TV deployments, the innovation is intended for more secure authentication through natural voice patterns. For example, when purchasing a film, rather than PINs, passwords and security questions, this technology allows consumers to buy the movie using their voice alone.
According to AWS Elemental Chief Product Officer Aslam Khader, the next phase of AI will involve the concept of “content lakes”, which means having all content and related metadata in a unified location and proximal to scalable, cost-effective and easy-to-use cloud-based AI services. He says: “The content lakes concept makes searching, identifying and moving huge chunks of content across different media workflows easy and efficient. You might think about this as media asset management 2.0.”
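The content lake Khader describes is, in essence, a single metadata-indexed store that every workflow can query. A minimal sketch of the idea follows; the field names and query helper are invented for illustration and do not reflect any AWS service API.

```python
# A toy "content lake": assets and their enriched metadata in one store,
# so any workflow can find and pull content without moving it around.
content_lake = [
    {"id": "clip-001", "kind": "sport", "labels": ["goal", "celebration"], "lang": "en"},
    {"id": "clip-002", "kind": "news",  "labels": ["interview"],           "lang": "fr"},
    {"id": "clip-003", "kind": "sport", "labels": ["goal", "replay"],      "lang": "en"},
]

def search(lake, **criteria):
    """Return asset ids whose metadata matches every criterion.
    For list-valued fields, a criterion matches if the value is present."""
    def matches(asset):
        for key, wanted in criteria.items():
            value = asset.get(key)
            if isinstance(value, list):
                if wanted not in value:
                    return False
            elif value != wanted:
                return False
        return True
    return [a["id"] for a in lake if matches(a)]

print(search(content_lake, kind="sport", labels="goal"))   # → ['clip-001', 'clip-003']
```

In practice the lake would sit behind scalable cloud storage with the metadata generated by the AI services mentioned above, but the access pattern is the same: one query surface across all content.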
At IBC, AWS will showcase ways to make it easier for media companies to enrich and monetise their content, with machine learning demonstrations covering video analysis for metadata enrichment; automated compliance and editing; automated transcription, translation and text-to-voice for closed captions, subtitling and audio description; and clip production for personalised clips and advanced sports graphics creation.


Friday 17 August 2018

Broadcast Management Systems: Moving Just in Time Further up the line


InBroadcast

BMS solutions are powering dynamic changes and individualised delivery to different audiences
Broadcasting is shifting from a ‘create one, distribute to many’ model to a narrowcasting model in which multiple variations of content are created for distribution to different groups, platforms and ultimately individuals. The Broadcast Management System (BMS) is an essential component in this evolution, required to streamline and automate workflows. Content is being delivered directly to the consumer as OTT, on social media, or in packages sold to third-party platforms. Pop-up channels are created for specific audiences and events.
This shift impacts every area of the media business – the nature and structure of IP rights, the scheduling of media and how advertising packages are put together and sold. However, the change has not been a ‘big bang’, and while BMS vendors are increasingly implementing digital-first solutions, traditional linear television still represents the largest revenue source for many of their customers.
“The key to supporting our customers through this change has been to provide them with a suite of applications and solutions that can handle both broadcast and narrowcast, non-linear and linear in a single application,” says Sina Billhardt, product manager, Arvato Systems. “This is combined with automation in workflows to mitigate extra workload from the additional variations and an approach to delivering software and solutions that anticipate and can accommodate future shifts and opportunities without knowing the detail of what they might be.”
According to Geert Van Droogenbroeck, marketing manager for MediaGenix, the changes make it essential for operators to recommend content based on usage. “Algorithms are written to track viewing preferences and content is personalised based on this information. Metadata becomes all important in creating sufficient relevant tags to define preference in ever greater detail. AI is being used to process large amounts of data in a meaningful way on a unique user level.”
He adds, “The more sophisticated this becomes, the more personal the service will be as the UI will be different for each user. Added complexity comes when a service needs to take into account both what a user is watching in a linear / catch up environment and an on-demand environment.”
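The tag-based preference matching Van Droogenbroeck describes can be sketched very simply: count the metadata tags of what a user has watched, then rank the catalogue by tag overlap. This is a minimal illustration only; the titles, tags and scoring are hypothetical, and a production recommender would be far more sophisticated.

```python
from collections import Counter

def build_profile(viewing_history):
    """Aggregate metadata tags from watched content into a preference profile."""
    profile = Counter()
    for item in viewing_history:
        profile.update(item["tags"])
    return profile

def rank_catalogue(profile, catalogue, top_n=3):
    """Rank titles by how strongly their tags match the user's profile."""
    def score(item):
        return sum(profile[tag] for tag in item["tags"])
    return sorted(catalogue, key=score, reverse=True)[:top_n]

history = [
    {"title": "Match Highlights", "tags": ["sport", "soccer", "short-form"]},
    {"title": "Cup Final Replay", "tags": ["sport", "soccer", "live"]},
]
catalogue = [
    {"title": "Costume Drama",      "tags": ["drama", "period"]},
    {"title": "Goal of the Month",  "tags": ["sport", "soccer", "short-form"]},
    {"title": "News Roundup",       "tags": ["news", "short-form"]},
]
ranked = rank_catalogue(build_profile(history), catalogue)
print([item["title"] for item in ranked])
```

The richer the tagging, the finer the profile – which is exactly why metadata, as he notes, becomes all-important.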
Another challenge is that content has to adapt to the destination device, not just technically but also from an editorial point of view: a long-form story formatted for Facebook needs to be presented differently on a website or a connected TV. “The changes have a huge impact on content itself,” says Van Droogenbroeck. “Traditional players have become more creative in how they package content for a target group. They split content into smaller parts that fit social and on-demand media, repackage seasons into different themes or add bonus content for the true fan.
“Media companies need to be able to slice and dice content for use on different platforms,” he says. “They need to manage rights for new versions of content that are assembled from many existing content parts. They will also want to present existing content in new and different ways, and group titles into collections for selling, planning or re-packaging. This requires a flexible content-centric system and quick editorial decisions that rely on a powerful management of media and material workflows and of complex rights and underlying rights and royalties.”
The BMS should therefore assist producers, curators and schedulers in managing this complexity and make it simple to connect the right audience to the right content version.
“Unfortunately, the total revenues of the market remain flat or show only a limited growth and broadcasters have to survive with lower incomes per channel,” points out Michal Stehlik, director - product development, Provys. “This leads to an increased pressure on the automation of all processes associated with each individual distribution channel.”
Provys believes that this automation is possible only when it is built on a strong foundation of content and rights libraries, regardless of the type and coverage of the channel. The key to success, it asserts, is finding the right content to offer, efficiently utilising all available rights and using information to support further content procurement or production.
“This is why we think BMS is a key system for transformation from a linear channel broadcaster to a content centric, multichannel operator,” says Stehlik. “We guarantee that Provys is the right solution to support this transformation.”
Just In Time assembly
In general, it is more efficient to package the channel for delivery at the end of the broadcast technology chain – a strategy which supports efficient reuse of content. The days when commercial breaks were compiled onto a single tape, subtitles were burnt into the picture and audio tracks were recorded together with the picture are over. Today, playout automation assembles all the necessary pieces ‘just in time’ (JIT), with graphics rendered during playout and no need to use post-production resources.
“It is now possible to introduce changes just a few minutes before transmission and produce multiple feeds with different branding from a single media asset,” says Stehlik. “From the BMS system perspective, we first define the rules and then schedule individual elements as separate objects. We expect that more and more broadcasters will discover the beauty and power of the information kept in our system which enables an enhanced, individualised experience for viewers of the future.”
In many ways, scheduling applications used in broadcast environments have been applying a just-in-time methodology for a while. Placeholders are commonly used for content that is not yet available while workorders, analogous to those used in manufacturing production processes, are sent to Media Asset Management systems to ingest/create/produce the media.
“The key to making this work is tight integration between the MAM and BMS systems and an understanding in the workings of both,” says Billhardt. “This is not common in the industry as few vendors offer solutions in both areas and on the user side, solutions are often specified and implemented by different departments. It’s an area where Arvato Systems offers a unique and valuable perspective.”
Looking a little way into the future, work is in progress to extend the Interoperable Master Format (IMF) specification to cover advertising. According to Billhardt, this has the potential to move the JIT assembly process even closer to the consumer and presents significant opportunities to further push the boundaries of ad sales.
Stream Circle, maker of the eponymous TV automation platform, also anticipates that the future lies in network streaming to narrow groups of viewers “provided that we know their exact profiles,” says CEO Josef Vasica.
Stream Circle works with raw content and, using its own graphics engine, “is able to assemble the final stream at the very last moment in the light of the latest available information,” says Vasica.
“Our system functions on the basis of a generic programme definition which is then enhanced by secondary events, graphics, ads, self-promos, etc. strictly on a JIT principle.
“IBC2018 can expect to see our latest multi-layered playout functions with all the latest and greatest features of IP-based television,” he adds.
Live sports programmes are at the forefront of just-in-time content assembly, as small segments are created from the live event and used quickly.  Catch-up and on-demand follow the linear programme with ever shorter delay. A soccer match can be available on-demand in a matter of minutes after a game has ended.
ProConsultant Informatique (PCI), which refers to its BMS (called LOUISE) as a Business Management Solution, says the BMS must be integrated with Business Process Management (BPM) tools to manage the workflow’s operations and tasks and to bring significant operational efficiency to media customers.
“Non-linear platforms are going to be more and more specialised, addressed to precise targeted individuals or groups, based on their characteristics, with multiple variations of content,” says PCI’s Laurence Thill.
“In this framework, LOUISE provides integrated functionalities to manage these different variations and to personalise the specific content addressed to the individuals and/or groups of final viewers. Media companies using LOUISE will be able to easily combine all of this information in order to precisely adjust and feed non-linear platforms with the appropriate content addressed to viewers.”
Clearly, all of this must be done in compliance with the rights and rules associated with each piece of content. Since the rise of non-linear platforms has significantly changed broadcasters’ rights management needs, PCI will introduce at IBC2018 a fully integrated module within LOUISE which enables users to manage the sale and/or re-sale of rights to third parties.
International media groups are centralising their content in a global content management system so that it is ready to be shared by channels and platforms across the globe. As the media asset management in MediaGenix’s WHATS’ON pilots all video, audio and subtitle flows, content can be shared by any channel in whichever region, platform, version or language it is needed.
Swiss public broadcaster RTS can, for instance, automatically create clips for fast publication online.  WHATS’ON users open the frame-accurate player from their WHATS’ON screen and set markers segmenting the content.  This facilitates its distribution on any platform while tracking the various rights on every individual segment of the content. The system informs users about rights problems and the additional costs for clearing the content.
At IBC2018, MediaGenix promises to deliver on a new concept of content itself – one that “breaks down the barriers between interstitials and products, episodes and programmes, but also between media and non-media content, such as derived products, apps, books, entertainment events.”
This will apparently make it even easier in WHATS’ON to split content up, assemble new content from constituent parts, schedule additional content, present it in alternative ways, and group titles into collections for selling, planning or repackaging.
“The whole exploitation lifecycle will be managed in an integrated way. With ‘Flights’ you will not need more than one scheduling action for multiple publication windows on multiple services and platforms,” he says. “Ultra-dynamic publication with one click of the mouse.”
For IBC, Arvato is focusing on “programmatic advertising” and bringing the best of online and digital advertising to traditional TV. “With linear still providing strong revenues for broadcasters and brand safety for their advertisers, by combining big data on audience insights with smart, automated placement optimisation, our customers are able to offer advertisers accurate targeting and reduce waste by controlling reach and frequency on linear channels for the first time,” explains Billhardt. “Alongside metadata-driven automated scheduling, we’ll also be demonstrating how the placement optimisation algorithms in our S4AdOpt application can now also be applied to promos, increasing viewer numbers and providing further revenue opportunities for our customers.”


Looking beyond the Game’s end

Broadcast



The Northern Ireland film and television sector is looking to a future beyond the final episode of the hit HBO show by planning to compete on the global stage.
In the decade from HBO agreeing to shoot a pilot for its new fantasy series in Belfast to the final day of filming last month, it is no exaggeration to say that the industry in Northern Ireland has been revolutionised.
Previously renowned for documentaries but lacking network commissions – described to Broadcast by Green Inc Film & Television owner Stephen Stewart as “chronically under-achieving” and with no history of large, incoming productions – the region has been transformed.
“The supply chain infrastructure is unrecognisable from what it was then,” says Richard Williams, chief executive, Northern Ireland Screen.
“We have two new studios both effectively built on the optimism and value proposition of Game Of Thrones and a depth and breadth of skilled resource from crew to post-production that is giving a generation of talent the feeling that anything can be made here.
“Arguably the most significant change lies in the perception and credibility of Northern Ireland, in London and particularly in Los Angeles,” he adds.
Succession plans
Williams’ screen agency is widely credited among indies for its ambassadorial and practical support. It has been ambitiously planning for Game Of Thrones’ succession, with the drama ending after the eighth series.
“It was strategically extremely important to have Belfast Harbour Studios open before Game Of Thrones came to an end for the simple reason that we aimed to shift from an ecosystem that broadly supported one large-scale, inward investment project to one that supported two such projects,” Williams explains.
The privately funded, £20m Harbour Studios comprises 64,000sq ft of soundstage. It is busy with the second season of Warner Horizon’s Krypton, “meaning a large chunk of supply chain companies had a degree of business no matter what happens”, says Williams.
Post-production houses Yellow Moon in Holywood, County Down, and Ka-Boom in Belfast, have both benefited from Game Of Thrones’ location in the nation.
Yellow Moon has employed more permanent staff, leased several buildings for the HBO team and installed new kit and editing suites, while Ka-Boom has expanded into wider production services, including being CAA-approved drone pilots.
Demand for craft and crew facilities is being further shored up by a growing number of UK-anchored TV dramas.
These include 3 × 60-minute period drama Death And Nightingales from Imaginarium and Soho Moon for BBC2 and three-parter Mrs Wilson, starring Ruth Wilson (and based on the memoir of her grandmother), co-produced by the BBC and PBS’s Masterpiece.
Romantic indie feature Normal People (co-produced by Canderblinks Films and Out Of Orbit) starring Liam Neeson and Lesley Manville (Phantom Thread) is currently filming based on a script from Irish playwright Owen McCafferty.
Later this year,  BBC1’s 8 x 60-minute The Dublin Murders from Euston Films, Element Pictures and Veritas Films will shoot in Dublin and Belfast, while the fifth series of World Productions’ Line Of Duty will return to Belfast.
HBO’s confirmation of a pilot for new Westeros saga (w/t The Long Night) at Titanic Studios’ Paint Hall in October is more good news. “Our hopes and expectations are that HBO will remain in Northern Ireland for many years yet,” says Williams.
“We are still keen on studio-scale feature projects and certainly when we ramp up to three inward investing projects over the next four years we expect at least one of those to be a feature.”
The estimated value of HBO’s investment to date in the region is £206m – not a bad return on £16m in Northern Ireland funds (see chart).

The Game Of Thrones halo is less tangible outside of drama but has had an impact nonetheless, not least in raising the profile, skill levels and workload of everyone from location scouts to costume designers.
“Everyone knows they can come here and make high-end shows,” says Kieran Doherty, the joint managing director of producer Stellify Media. “They know our crews are world class.”
Sony joint-venture Stellify is riding high on multiple wins, including Channel 5’s revivals of Blind Date and quiz Gino’s Win Your Wish List, plus ITV’s resurrection of Who Wants To Be A Millionaire?.
While the company has made entertainment formats like Can’t Touch This for the BBC and is making social experiment show Celebrity In Solitary for C5 in warehouse spaces in Belfast, Doherty says the region lacks suitable studios for larger-scale shiny floor shows.
Who Wants To Be A Millionaire?, for example, is housed at Dock10 in Manchester.
“One benefit of the Game Of Thrones crossover is that we can draw on set design and construction or make-up talent to make shows here, but high-end Saturday night shiny floors are harder to make without a dedicated TV studio,” says Doherty.
The keys to a burgeoning entertainment and fact ent sector in the region, however, lie with network commissioners. Locally this is known as ‘the Sean Doyle effect’ after the impact made by the London-based, Belfast resident commissioning editor at C5.
“He doesn’t have a remit to look to the regions but because he knows the sector here there’s an immediate trust and understanding of what we can all deliver,” says fellow Stellify managing director Matthew Worthy.
Doyle recently ordered a pilot for Celebrity Meltdown, about Britney Spears, from Waddell Media.
“A big turning point for all Northern Ireland indies would be if the BBC and Channel 4 could find someone who could fit Sean’s mould,” says the indie’s managing director Jannine Waddell.
“This is still a relationships business. There is more engagement from those broadcasters, but the difference is that I can meet Sean for a coffee today, whereas I’d need a day, spend £500 and arrange other appointments in order to catch up with execs in England.”
Green Inc’s Stewart adds: “Sean has been a very successful commissioner for the community here but that’s a direct result of him knowing who is on the ground. A lot of executives simply don’t have that knowledge. C4 and the BBC are doing a lot of good work to get more local commissions but there is more work to be done.”
Unfortunately, C4 recently struck Belfast off the shortlists for its new national HQ and creative hubs, which would likely have propelled production in the city and surrounding regions into overdrive.
Some indies are launching satellite offices in Belfast – initially, perhaps, in anticipation of an increased C4 presence, but also in order to tap network quotas. Endemol Shine’s Darlow Smithson Productions launched a Belfast base in April to expand its factual output.
Headed by producer Anne Stirling, who was hired from running her own production outfit, the indie is up and running with series three and four (40 eps) of Ill Gotten Gains for BBC Daytime.
Working together
For Green Inc and Waddell one answer lies in increased co-pro alliances. “It’s down to finance – broadcasters want more bang for buck,” says Waddell. “Americans tend to move faster than broadcasters in the UK but it’s always a slow process trying to get everything together.”
Waddell has half a dozen returning series, including Find Me A Home, Francis Brennan’s Grand Tour and At Your Service, all for RTÉ.
“The expectation of broadcasters in terms of development is so high and so expensive that you can’t compete with the big guns who have massive budgets unless you join forces,” says Stewart.
Green Inc co-pros include BBC4’s Hive Minds with Saltbeef and Ireland’s Got Talent with Dublin’s Kite Entertainment.
Northern Ireland Screen’s most recent funding incentive aims to boost co-finance deals with Canadian producers. Around £330,000 over three years is being made available to support development of digital media and TV projects.
Everyone is searching for a long-running returnable series such as 24 Hours In A&E or Bargain Hunt. “Once you have that volume of hours you can build an industry around it,” says Stellify’s Doherty.
Waddell adds: “A few years ago productions shooting here would have had to bring a lot of people over here, while talent growing up in Northern Ireland would have felt the need to move away to find work. That has changed. Now our talent can see that there is a consistent volume of fantastic work to build their careers on their doorstep.”