Wednesday, 14 August 2024

ACE Cinema Editor: Slow Horses

Cinema Editor 

Tinker, Tailor, Editor, Spy: Sam Williams makes a successful show even better

article here p23

Mixing espionage intrigue with whip-smart humour, Apple TV+'s Slow Horses has burned through three seasons since 2022, with two more in the works. Adapted from Mick Herron's award-winning novels by See-Saw Films and screenwriter Will Smith, the drama revolves around a group of British spy misfits under the notional command of washed-up MI5 chief Jackson Lamb (Gary Oldman), who somehow manages to get mixed up in plots that endanger state security. Also starring are Kristin Scott Thomas, Jack Lowden, Olivia Cooke and Jonathan Pryce.

Critically acclaimed from the start, with multiple Bafta nominations including one for Katie Weiland's editing of the pilot Failure's Contagious, Season 3 landed a Bafta win for Sam Williams' editing of episode 1, Strange Games (also ACE Eddie nominated), and a nomination for Zsófia Tálas' editing of episode 6, Footprints.

“We're a little bit shocked to be getting all these plaudits for season three because normally that doesn't happen on a repeat show,” says Williams (Luther, His Dark Materials), who joined for Season 3. “When you come in to edit a season of any show you are standing on the shoulders of giants because they've already done a lot of the work.”

One of the production's hallmarks is that a single director is handed responsibility for the whole six-episode run. In this case it was Saul Metzstein (Brassic), whose path had crossed with Williams' on Doctor Who, though they'd never directly worked together. With Metzstein's regular editor committed to another project, Williams got the invite.

Season 3 opens with an extended sequence shot on location in Istanbul, with Williams also in attendance. He explains, “They didn't want the expense of having to come back for reshoots, so they sent me out there with a mobile edit suite to begin cutting it together. The problem was that I then fell in love with every piece of the shoot because I'd been so closely involved in it.”

The first cut of this sequence ran about 15 minutes, when they ideally needed it to run around five.

“We were always aware that we might have to compromise the sequence because editorially we needed to do two things,” Williams explains. “We needed to establish that our two lead characters (Sean and Alison, played by Sope Dirisu and Katherine Waterston) are in love with each other. We also need to ensure that we re-introduce Jackson Lamb and the rest of the series regulars as soon as we can.

“The whole motivation for the Season 3 story rests on whether you believe Sean and Alison are in love, and the emotional impact you feel at the end of the sequence when she dies.

“That's down to the quality of the direction and acting, but just the meeting and falling in love part of the story was originally about five minutes long. In the end we trimmed that down to around 40 seconds.”

To do that, Williams says, he had to become more objective. “No-one cares if we were up until four in the morning to shoot and cut a scene in Istanbul. I had to make hard decisions and trim the scene to its essentials, while retaining the action and emotion.”

The ten scenes following this opening were the trickiest to finesse and to order, he says.

“The task was to introduce all our characters without stretching things out for 20-30 minutes and suddenly finding half of our time has gone by and we've barely started on plot.”

The frenetic action which closes the opening scene in Istanbul gives way to Lamb in a doctor's waiting room.

“There's a slow tracking shot across a doctor's waiting room and we just see a pair of feet and hear someone break wind. You don't quite know it's him, but if you've seen the show before then you know. The tracking shot ends on Lamb’s face where he is ruminating on death.

“The editing here is just very simple. It says ‘now we're just going to take things very slow’ and shows that we're back in a world where life feels weary.”

“Nothing works out the way you want it to. That sort of feeling is imbued in a lot of those opening scenes, so naturally, you're not going to start cutting all over the place.”

A follow-up scene with River (Lowden) and Standish (Reeves) packing boxes of files at Slough House continues the theme.

“Packing files is about the most tedious job you can imagine, but there's something else going on that isn't immediately apparent, which is that the whole story is really about files. If the camera team are favouring shots of boxes in this scene, that's the reason why.”

Season two hadn't been released by the time they started editing, but Williams and Tálas were able to watch rough cuts and get up to speed.

“Music is always a big thing to get sorted before you start any show since it's a large part of the look and feel and pacing,” he says. “Obviously here we already had a whole box of tricks to instantly call on, whereas the editing team on season 1 were still figuring it out.

“That said, we do as much if not more work on the sound as on picture. Executives who have invested in a project, or are about to buy into it, like to see as finished a product as possible, so the closer you can polish it with temp FX and music the more likely they are to buy it. The process also helps you as an editor, since a little bit of sound adds so much to the drama.”

For example, Slough House, the operational hub of Lamb's division in a less affluent part of London, is intentionally depicted as a dull environment. To help convey that, Williams layers in sounds of roadworks, police sirens and traffic.

“It's supposed to be in a rubbish part of town, so by adding some atmos you can do a lot of the storytelling.”

He continues, “It also helps your editing if you've fallen in love with the characters. With two series under the belt, there's a whole history to rely on. You can note certain little looks or things that they do that are shorthand for their character. You know instantly when you see something in Gary's performance that it is a very Lamb way to do things.”

One short scene in episode 1 shows Lamb ordering a greasy kebab from a high street take-away. He asks the shop owner to put as much spicy sauce on the sandwich as possible so he can't taste the meat. It's a typically rude and witty remark that tells you all you need to know about Lamb's character. Williams thinks Oldman may have ad-libbed it.

“Did you notice the design team having fun in that scene? It's subliminal, but there's a wide shot where Jackson goes into the kebab shop and you see a poster on the wall that just says, ‘Lamb is Great.’”

Williams edited episodes one, three and five, with Tálas cutting the other three. “We critiqued each other's work and watched all the episodes together with Saul.”

Another signature of the show's style is intercutting storylines, much of which is scripted but still requires tightening in the edit.

“Many scenes are written a lot longer because the writer won't be quite sure if it's going to work on screen. My job is to ensure it's to the point. I've always found that having a lot of material in a scene really helps the actor with their performance. Even though I've taken some of it out, their performance is really strong emotionally because they've worked through the original script.”

He says one of the joys of Slow Horses for an editor is the chance to show off different styles. “We go from the energy of a chase in Istanbul, or the suspense of the siege at the file storage facility in episode 6, to simple scenes such as Lamb at the doctor's.

“It was without doubt the best show I've worked on in a long time. So much fun. Everyone was lovely. And then to be getting all these accolades at the end just makes me realise how lucky I am to have worked on it.”

Behind the Scenes - Worldbuilding in Furiosa: A Mad Max Saga

IBC

article here

Director George Miller had been plotting a prequel to Mad Max: Fury Road for some time and even approached original cinematographer John Seale to shoot it. Seale had been nominated for work on the 2015 film but, approaching 80 when Miller finally got around to making it, Seale decided to pass.

The baton went to fellow Antipodean DP Simon Duggan, a New Zealander who has shot Alex Proyas’ I, Robot, Duncan Jones’ Warcraft and Mel Gibson’s Hacksaw Ridge.

“I knew that George had been developing both Fury Road and a prequel story for many years and had originally planned to shoot both films back to back,” Duggan tells IBC365.

“He explained how the prequel was to tell Furiosa’s journey from her childhood, her abduction and years growing up at the Citadel mostly disguised as a boy, then her first failed attempt to find the Green Place and her final revenge against her abductor Dementus.

Furiosa is a much more complex story than Fury Road. One creative goal was to establish a larger society in the wasteland that surrounds the Citadel Fortress.”

The lead is played by Anya Taylor-Joy, with Chris Hemsworth as Dementus, the deranged warlord leader of the Biker Horde.

Flawless continuity

Back in 2012, Miller had been planning to shoot Fury Road in stereo 3D and was testing a prototype 3D sensor camera and rig with the film’s DOP John Seale. In pre-production on that film, Seale visited Duggan in Sydney where he was shooting The Great Gatsby for director Baz Luhrmann in digital stereo 3D.

“I believe John realised that shooting real 3D was going to be too limiting for the intense action scenes in the harsh and dusty Namibian desert and decided to shoot digital but in 2D using the ARRI Alexa Mini,” Duggan says.

With the prequel filming in the Australian desert, a chief task was maintaining visual consistency between the two films. The Citadel Fortress, where chief villain Immortan Joe ruled over his dominions in Fury Road, makes a return in Furiosa as does the look of the barren wasteland. The design of the War Rigs is also consistent with other films in the franchise.

“Fury Road was the reference point as it is so original, with the intense red plains and blue skies, but there was still the opportunity to expand the look with the establishment of a much larger world than previously seen.

“George and I also talked about the Black and Chrome version of Fury Road and specifically how it feels more like an apocalyptic film from a much earlier time. I did manage to slide in a few nods to film noir lighting techniques such as slashes of light across faces and deep contrast lighting for interiors of the Citadel.”

Duggan says Miller also inserted a direct throwback to Mad Max 2, with Max observing proceedings below from a clifftop. “Furiosa does reveal a much larger world that exists in the post-apocalyptic wasteland so there was much more scope for the visuals,” he says.

Camera selection

Duggan photographed on the Alexa 65 “because it captures beautiful looking faces and the desert vistas attain an almost 3D feel when covered with the large format.” He also required several smaller cameras for the action work. These were mainly RED Komodos fitted into tight rigging points around the 78ft 18-wheel War Rig, and RED V-Raptors for Steadicam or handheld work. The additional cameras were also needed for higher filming speeds, to amplify moments of a character’s predicament, especially in sequences with Furiosa.

“Generally, for dialogue scenes with our actors we used two to three cameras,” adds Duggan. “George would rehearse with the camera operators recording on iPads and then his onset editor would do a quick 10-minute assembly in his trailer. Once George was happy with his approach to coverage we would shoot. We paid a lot of attention to the quality of light, facial modelling and making sure we could read into their eyes. Anya’s eyes especially were a window into her soul.”

Duggan principally shot ARRI DNA Primes. “ARRI even produced two new full-coverage 25mm Primes for the 65mm format, with each lens individually named ‘Furiosa’ and ‘Mad Max’ for our production.”

“For action sequences, we would often run the remote camera crane tracking vehicle, a second camera tracking vehicle, a drone, and rigged camera mounts to the various vehicles. The film’s Action Unit would have up to half a dozen cameras rolling at once along with an array camera rig of Komodos on a buggy capturing background sky plates at War Rig speed. We weren’t worried about other cameras being in shot as they could be erased in post later.”

Unreal Engine

As in Fury Road, it is the kinetic camera movement which propels the action in what is a relatively dialogue-free movie.

Duggan explains that Miller used Unreal Engine to construct virtual animated scenes using real inputs such as locations with sun data, vehicle dimensions and speeds and humans with actor heights.

“He was able to quickly plot movements for camera tracking vehicles, drones and also Steadicam shots especially the ‘oners’ where it was to be a single shot sequence. It was from this process that George realised how much more dynamic it was to be continuously part of the action with the cameras. This method also provided the crew with the technical information to make it happen.”

The whole film was shot in the state of New South Wales, Australia. The main location for the barren red earth exteriors was Broken Hill, with much of the War Rig stunt work shot in the town of Hay, which had a 4km highway that could be closed down. The remainder was shot in various exterior locations around Sydney, including a short time on the Disney Sound Stages; shots of Furiosa hanging underneath the War Rig, for example, were captured here.

“I was very honoured to be included in George’s Mad Max universe and to be able to follow up from Fury Road with another great film,” he says.

Duggan is currently back in the Australian outback shooting TV thriller Desert King for Netflix.

Tuesday, 13 August 2024

WBD Basks in Record Olympic Streaming but the Challenge is Retaining New Subs

Streaming Media

article here

The results are in and the winner by a mile is The Streaming Olympics. Labelled, rightly, ‘the first true Olympiad of the age of perpetual content’, the decision by the Olympic host broadcaster OBS and some key rights holders to embrace the everything-everywhere of action from Paris and to stream it online is a triumph – with irreversible implications for the future of live.

NBC by all accounts threw the works at these Games and it has paid off with handsome viewing figures to pair with advertising dollars, particularly on Peacock. In the UK, Discovery+ became the UK’s fastest growing paid streaming service this month, justifying the more than $1bn it took to take control of rights (in the UK) from the BBC.

It’s no coincidence either that the reviews of coverage on conventional broadcasters, without wall-to-wall OTT options, have not been as good.

There are lessons though. Audiences in Europe have enjoyed the Games being contained in the same time zone; something that won’t be the case for LA 2028, let alone Brisbane in 2032.

As noted by The Guardian’s correspondent, some AI highlights have not matched the highest production values; there have been glitches in the live stream and some wasted duplication. These will be ironed out. In the meantime, what this Olympics has demonstrated beyond doubt is that there remains appetite for live shared televised events provided audiences are given every option to slice and dice content as they wish.

WBD wishes the Summer Games were every year

For Warner Bros. Discovery, Olympic success is a welcome respite from weeks of bad news, which has seen it forced to write down $9bn in the value of its TV channels, accompanied by doomsday headlines for the future of TV. WBD is also smarting from the loss of NBA rights to Amazon, and CEO David Zaslav is contemplating an asset sale in order to restore investor confidence.

Following the Paris Games, its Eurosport division, which owns the European rights to the next two Summer Games in LA and Brisbane, is arguably the prize asset that has just shot up in value.

The challenge for WBD going forward will be to keep those subscribers engaged over the next four years. True, WBD’s Olympic rights package also includes the Winter Games in Milan Cortina in 2026 and the French Alps in 2030, but it will need to retain and recruit more sports fans, particularly given the loss of the NBA. Paris has proven the appetite for streaming coverage, and WBD will likely want to replicate aspects of its Olympic coverage across its regular properties like tennis majors and cycling until the next Olympiad.

In a bulletin touting the response to its coverage, WBD said its cumulative reach of more than 215 million viewers in Europe watching Olympics content on its platforms was 23% (+40 million) higher than for the Tokyo Games in 2021. This includes Max and discovery+, as well as Eurosport TV channels and free-to-air networks in Norway (TVNorge), Sweden (Kanal 5) and Finland (Kutonen, TV5).

It saw a record number of new paid streaming subscribers over the Games period – 77% more than Tokyo 2020 – with the most significant growth in France, Italy, Poland, Sweden and the UK.

WBD also boasted 4.5 billion video views of its Olympic content on social which is nearly ten times more than Tokyo 2020.

Andrew Georgiou, President and MD, Warner Bros. Discovery U.K & Ireland and WBD Sports Europe, noted: “Max has proven to be a game-changer for sports viewing with an enhanced product experience and new interactive features which encouraged more subscribers to come on platform and stay engaged for longer.”

While streaming led the way, with more than 7 billion minutes streamed by WBD over the course of the Games (six times more than Tokyo), it was at pains to point out that its linear TV audiences were double those of the previous Games, “demonstrating the continued attraction of the Olympic Games in Europe across all platforms”, it said.

 JB Perrette, CEO and President, Global Streaming and Games, WBD said: “Paris 2024 has exceeded all expectations for Max and Warner Bros. Discovery’s streaming business. We’ve added millions of new paying subscribers, and engaged millions of viewers daily on streaming who have watched billions of minutes of content during the Games. Our streaming growth momentum is only gaining strength, and we’ve still got almost half the global addressable market to go.”

IOC official media and broadcast stats

The IOC wasted no time in declaring the extent of its coverage and reach, claiming that over half of the world’s population would have engaged with Paris 2024 via broadcast or digital channels.

OBS's online content delivery platform (Content+) became the primary method of delivering short-form and social media content to the 36 media rights holders. Over 17,000 pieces were made available, of which approximately 790 were vertical content designed specifically for social media.

This resulted in more than 113,000 downloads over the course of the Games, according to the IOC, and “unprecedented” results on Olympics social media handles, with over 12 billion engagements – more than double that of Tokyo.

There was record usage of the Olympic web and app, reaching approximately 300 million people during Paris 2024, the highest for any Olympic Games edition.

AI was used to generate over 95,000 automatic highlights culled from the 11,000+ hours produced by OBS.

How the BBC fared

The BBC, traditional free to air home of the Olympics in the UK, has had the number of hours it could show slashed after losing some rights to Discovery. That led to criticism from some viewers annoyed that the broadcaster wasn’t covering the events they wanted to watch.

BBC Sport’s coverage of the Paris Games was streamed a record-breaking 218 million times online, more than doubling the Tokyo total of 104 million, with 12.2 million people watching on iPlayer.

Over 28 million unique users and 8.9m signed-in accounts used the BBC Sport website and app for the latest news and updates from Paris, with 62.2 million online requests for highlights clips.

In addition to the live coverage on iPlayer and the BBC Sport website, BBC One “enjoyed consistently high” viewing figures throughout the competition, with 36.1m watching on TV (59% of the UK population) and a peak of over 6 million on 14 separate days.

Alex Kay-Jelski, Director of BBC Sport, said: “It is not an easy job, but these figures across digital, linear, online and audio demonstrate that BBC Sport’s unique multiplatform offer is capable of uniting the nation with the very best of British storytelling.”


Thursday, 8 August 2024

The height of virtual production

Definition

Whether you want to fly among the clouds or capture the perfect golden-hour sunset, virtual production environments can deliver convincing, meteorologically accurate (or even science-fiction fantasy) skyscapes.

article here

and p14-17 here

Getting it right means ensuring you have chosen the right content pipeline for the project. “For a beautiful generic sunset, a 2D video playback content pipeline may be best,” explains Joanna Alpe, chief commercial officer at Bild Studios and MARS Volume. “You can capture this at high resolution with a camera and play it back with agility on an LED volume. If the scene calls for a more dynamic range of action, and your director needs more flexibility to control action sequences – such as planes flying across the sky and explosions occurring – 3D playback with scenes built in a real-time engine is the best tool for the job.”

Recent productions such as Masters of the Air have been important in proving what’s possible. Both Bild Studios/MARS and teams at Dimension worked on the Apple TV+ drama. “For flight scenes in the sky, you have a motion base moving your set piece around,” explains George Murphy, creative director, Dimension and DNEG 360. “The environment is genuinely surrounding the actors and filmmakers. They have the freedom to move through it and the environment reorientates to their movement. We’re able to immerse the actors and filmmakers in a world with natural reflections on surfaces, in characters’ eyes and in glass.”

Masters of the Volume

The workflow for creating skies in Masters of the Air involved integrating all available Unreal Engine sky-related techniques and systems, along with custom-made solutions to render convincing dynamic skies for the aerial battle scenes.

James Dinsdale (VP supervisor) and Chris Carty (senior content generalist) at Dimension/DNEG 360 helped merge these tools and systems into one larger asset, allowing for complete control and fast iteration when running the scene on a volume.

“We based our approach on an art-directed high dynamic range image (HDRI) projected onto a sphere 100km across – so the distant sky moved with the correct parallax effect,” says Dinsdale. “We layered in effects like atmospheric height fog and hazing to blend the horizon and integrate with the rest of the sky. We crafted fully volumetric clouds within each scene by using 3D volumetrics along with masking layers and extra custom tools and interfaces.”

This novel workflow was crucial for key moments, like the B-17 bombers dipping in and out of clouds. “It ensured the skies existed in a grounded and consistent space along with the other assets and planes in the scene, allowing for natural interaction in camera,” he adds.
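The reason such a large sphere reads as "distant sky" is simple parallax arithmetic: the angular shift of a backdrop shrinks with its distance. A rough back-of-envelope check (hypothetical numbers, not figures from the production) sketches this:

```python
import math

def parallax_deg(baseline_m: float, distance_m: float) -> float:
    """Angular shift of a background point when the camera moves
    baseline_m sideways, for a backdrop at distance_m."""
    return math.degrees(math.atan2(baseline_m, distance_m))

# Hypothetical 5 m camera move against backdrops at increasing distances;
# 50_000 m is the radius of a 100 km-diameter sky sphere.
for radius_m in (50, 500, 50_000):
    shift = parallax_deg(5, radius_m)
    print(f"backdrop at {radius_m:>6} m: {shift:.4f} degrees of shift")
```

Against a backdrop only tens of metres away the same move swings the background by several degrees, while against a 50km-radius sphere the shift is hundredths of a degree, so the distant sky behaves correctly while the nearer volumetric clouds still show visible parallax.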

Typically, for big open skyscapes you won’t miss the parallax that 3D (CGI) scenes bring, as objects are rarely found in the close foreground. Two-dimensional background plates are cost-effective to capture and photoreal out of the box, so they are often the most suitable final-pixel assets for skyscapes.

“Productions would capture a plate array from a physical location; these would be stitched to create a seamless 270° or 360° plate which is then played back on the volume,” says Alpe. “This approach has been used at MARS Volume for rooftop scenes, backdrops for set-built walls with windows and helicopter travel sequences.”

An exception to this was its work on Masters of the Air – its explosive aerial action required animation sequences and timed explosions happening in the foreground. In this case, 3D scenes were the best pipeline.

“One of the goals for Masters of the Air was making it historically accurate, and virtual production allowed us to do that,” says Alpe. “Weather data, sunrises and sunsets could all be reconstructed in the real-time engine with historical accuracy.”

Virtual cinematography of skies played a vital role in delivering natural reflections and performance flexibility – allowing the director and actors to see and respond to the aerial dogfights and explosions in real time.

“The Unreal Engine scenes were painstakingly created with this degree of care and attention. We were able to play them back to the LED volume and build tools that gave the directors maximum flexibility, for timing action sequences and what they would see on the set, in camera, at any one time.”

Augmented with practicals

If you need to get the sun in shot (or any kind of directional lighting), you need to augment it with practical lights. “A limitation of LEDs is in generating hard, crisp light, but they’re very good at general directional and ambient fill, with natural colour and immersion,” comments Murphy. “For aerial shots featuring movement through clouds, you’d need to replicate fog on-set, which is possible but requires forward planning.”

For the VR worlds of Netflix sci-fi hit 3 Body Problem, the team filmed against a large 180° wall consisting of ARRI SkyPanel LEDs filtered through and hidden behind a Rosco scrim.

“Our board operators could control any kind of colour we wanted,” explains Richard Donnelly ISC. “This enabled us to light the actors precisely – for instance with the sun rising – instead of being led by VFX. We augmented the set with many other lights but, essentially, we lit the actors by the wall. It’s almost the reverse of volume capture where you use plates filmed on location to light live action.”

The subtlety of balancing practical lights with the skyscape environment generated on a volume ‘is an art form’, says Alpe, one that sets an experienced DOP apart from the rest.

“The ambient lighting from an LED volume, as reflected on the skin tones of a person, can sometimes look different to what you would expect by being outside,” she says. “When working in a real-time environment, it’s important for volume control teams to take the time to share lighting understanding with the DOP, to empower them to fully understand exactly what they have control over in the scene. It is through the strength of this collaboration that DOPs can be set up for success on-set.”

LED screens featuring high dynamic range (HDR) are becoming standard, which means they produce a variety of intensities that feel more natural. “When you’re framing something up in camera, even if you’re supplementing it with practical lights, it can feel like those intensities and exposures and colour saturations are realistic. Once cinematographers begin to trust that, they’re becoming more confident in what they can achieve.”

A good VP team will be able to advise on the approach best suited to the needs of the production and director.

Ever skyward

“There’s some unusual phenomena in the skies – from rainbows to light shimmering through rain,” concludes Murphy. “Recent developments in ray tracing are enabling us to accurately capture the refractive nature of light through atmospherics in camera. We’ll see much richer clouds and skies on demand, which are more photoreal and detailed.”

IBC Conference: Solving the “common challenge” for pay TV operators

IBC

Orange’s Chem Assayag navigates the challenges facing modern broadcasters and how to develop a service proposition that captures customers in an increasingly crowded and competitive marketplace.

article here

Who owns the customer? “No one really owns,” says Chem Assayag, Senior VP Home Services Innovation, Orange. “We are at a time where the customer has tremendous choice in terms of what they can consume with more players battling to get their attention and for a share of the household wallet.”

Assayag’s IBC2024 Fireside Chat is titled ‘Who owns the customer? Winning the battle for control of the TV experience’ and the Orange executive emphasises that the battle for broadcasters is not lost.

“I grew up watching TV from a handful of channels and if we wanted to watch we had to be in front of our single TV set,” he says. “Today with so many ways to access content this freedom of choice has changed the balance of power between viewers and content providers.”

Assayag reminisces on how the landscape changed over three decades as more players entered the game. The arrival of multichannel cable and satellite networks such as Canal+ and Sky introduced competition to broadcasters “but were still very much TV-related content companies.”

Then in the early 2000s, telcos like Orange entered the market with IPTV services. Netflix followed a decade later ushering in SVOD streamers.

“Now we are seeing the increasing role of TV manufacturers like Samsung with connected TV (CTV) offerings. Tech companies like Google also muddy the waters by being both a technology supplier underpinning some operators’ services and a content provider. As a result, the market is highly complex, with lots of players competing and cooperating."

Orange, for instance, offers Netflix as part of its service at the same time as competing with the SVOD.

“The situation is kaleidoscopic and the configuration of those relationships is not always simple,” Assayag says. “For us as service providers and for the consumer.”

Content discovery

Assayag has strong experience in the world of digital services and digital TV, working for companies such as OpenTV, NDS (now Synamedia) and Qualcomm. Prior to joining Orange, he was the EVP of Marketing & Sales for Viaccess Orca, driving the company’s expansion for six years.

He says even the word ‘TV’ might be misleading. Are we talking about general video consumption or viewing solely on the large set in the living room?

“The two are related. People are consuming video on many more devices and in many more ways. They can pause, rewind, watch on-demand, catch up.

“There is still room for the main TV set in the living room as a device that allows people to gather around and create moments to share,” he insists. “If you wanted to watch a movie at home with your kids you would most likely do so on the big TV.

“The issue is how we make this movie available. It has to be easy to find. You have to enable restart and pause and all other consumption modes. You have to take into account the pricing framework.

“In that respect, all pay TV operators have a common challenge, which is to improve the quality of experience around search and recommendation. Despite everyone’s best efforts, search is still a pain for viewers. With more and more content, the ability to find content relevant to you has not been solved. We still have a lot of people complaining about it.”

Orange’s own recent ethnographic research, which sought to understand consumer behaviour through detailed diaries kept over several weeks, revealed just how frustrating it is to find content.

“It was taking at least 15 minutes to find content they wanted to watch, and despite the huge amount available they would end up watching an episode of a series they enjoyed for the third time, because it offered no risk. As an industry, we need to solve this problem.”

User interface

AI might help, for example, in providing more sophisticated natural language interactions with the systems embedded in a TV set or set-top box (STB).

In May 2024, Orange France launched the STB 6. This device integrates a far-field microphone, paving the way for a more immersive experience through natural voice commands enabled by a partnership with Amazon Alexa.

“It offers a friendly interface even without a remote control and marks a shift towards more interactive user experiences,” says Assayag. “In turn, this makes access to content easier which goes back to search and recommendation and quality of experience.”

Another trend Orange spotted is that younger generations either don’t know or don’t care where their favourite TV show or film is from; they just want access to it.

“Ten years ago, people would associate a show with a particular network, channel or broadcaster but now if they want to find the next episode, that show is no longer attached to the channel brand in their mind. If the channel is no longer relevant you need to take that into account when designing the search and recommendation engine.”

Orange in France sees itself as an aggregator able to help consumers navigate their way around content choice.

“There’s an expectation among customers that we aggregate the best content meaning all the regular TV channels, the key SVODs and the main pay TV channels, and that we allow them to consume on all devices using all video consumption modes,” he says.

“Our role as an aggregator is to negotiate the commercial deals in order to have this content line-up and also ensure the technical integration end-to-end is smooth for quality of experience.”

He says the appeal of live content watched on TV is in decline but remains of great importance. “Consumption of live is declining but much slower than what we expected. There is still an appetite for live sport and also for 24/7 news programming, particularly in France over the last few years.”

FAST channels haven’t taken off in quite the same way in France as in other territories because of the history of free-to-air services.

“The French market already has a lot of free thematic channels and we include many in our packages. Certainly, there is appeal for very targeted niche FAST channels such as those rolling episodes of the same series. We are monitoring FAST but it’s currently not a huge area of business for us.”

Smart homes

Large telcos like Orange have sought to diversify their product portfolio to capture revenue from the wider ‘smart home’ but Assayag describes the business to date as disappointing.

“I would say there’s a general disappointment about what smart home has delivered in terms of business for all market players including Big Tech with their voice assistants. One reason is that the market is fragmented among different technologies. We hope that standardisation of the sector through smart home protocols like Thread will make interoperability between devices easier and more attractive to customers.”

That said, Orange has identified home surveillance as an area of growth. It has rolled out Maison Protégée, a home surveillance service with dedicated hardware and support.

Back to the opening question, ‘Who owns the customer?’ The answer, for Assayag, is that there is potential for growth provided you get a few things right.

“Understand that you are not going to have the customer forever if your service does not keep performing well,” he says. “Two decades ago if your service was poor the customer had little choice but to stick. Now, they have so many options so you have to be very agile to make sure you keep them.

“For us, that means ensuring you have the right content, onboarding new content providers with commercial agreements and technical integration, and making it attractive from a pricing standpoint.

“There is a huge sensitivity to price now as a result of the cost-of-living crisis, and price is a very significant factor for customers. You have to find pricing schemes which are appealing and you have to deliver extremely good QoS, which includes making it easy to consume content on different devices. If you do that, then you can retain their business.”

Wednesday, 7 August 2024

5 Minutes with Jeff Drury, Director of Technology, Whitehouse Post/Carbon

interview and copy written for Sohonet


Whitehouse Post is a post-production partner of choice for blue-chip brands from Nike to IKEA, Bank of America, Samsung, Porsche and Budweiser, many of which have been featured in the Super Bowl. They primarily focus on servicing clients or creative agencies with editorial, finishing and color grading. The company’s talent roster of creative artists, engineers, designers and editors spans the globe.

article here

Whitehouse operates main offices in London, New York, Chicago and Los Angeles. Their sister company Carbon is a renowned creative studio, specializing in design, animation, live action production, and VFX, alongside color and finishing capabilities. Carbon also has studios in New York, Chicago and Los Angeles.

Jeff Drury is the Director of Technology and sits across both divisions, overseeing workflow, adaptation and integration of new technology as well as site-planning, installation, and system configuration of facilities.

The advertising market seems tough right now, would you agree?

Business is definitely less predictable than in years past. It rebounded following the pandemic but has stuttered since then, with some creative agencies retaining post in house. This is a talent-driven industry and at Whitehouse we have some of the best and most experienced editors, CG leads and directors around. They are what we will continue to lean into.

How have you seen post production technology change?

I would characterize it right now as ‘anything anywhere.’ In the 20 years I’ve been in the industry we’ve gone from installing computer systems the size of refrigerators to virtual machines in the cloud. We are now able to offer all the technical resources of an individual facility to anyone, anywhere. 

Carbon needs to scale up and down with each project and has the ability to connect with the best freelancers. Whether they are in Mexico, Portugal or the United States, they are able to dial into our central infrastructure and work with media from anywhere.

The same distributed architecture has enabled us to establish bases in specific markets that we weren’t necessarily in before: Kansas City, San Francisco and Richmond. Sometimes we operate within an agency’s walls, but under the same Whitehouse technical infrastructure. The media backbone, the service and support is the same.

Does this mean you can downsize the traditional on-premises facilities?

It’s a big question and one we are grappling with now. The amount of machinery we have in one physical location, with all the necessary air conditioning, electrical power and carbon output – all that will go and potentially be replaced by systems of equal if not greater firepower that could fit into the space of a closet. From there we would establish connectivity to all our artists, directors and clients. It would be a virtual studio networked in the cloud.

Is there a central media repository that staff and clients can access?

We remain flexible, depending on the project. For instance, if a project is based in London, it makes more sense for us to keep the media nearby, and if it then switches to our LA team we’ll replicate the project there. On the flip side, if we have 60 artists based in multiple locations working on a single project and they are constantly iterating versions, we will centralize that perhaps using AWS. 

Can all the craft functions of post be conducted remotely?

Technically they can be, but in practice some aspects of post remain a challenge. Audio is the main one because we can’t control the viewing and listening environment. Remote audio mixing, especially for final sessions, doesn’t yet compare to the experience of being in a professionally calibrated room. It does depend on the project, though: since most video content is ultimately consumed as a stream to a TV or mobile device, the remote listening environment is often good enough.

Why did you recently begin working with ClearView Flex?

We’ve known about Flex for a long time and often talked about implementing it for Carbon, but since everything we were remoting was in the cloud we had no need to route HDMI and SDI signals. As soon as Sohonet made ClearView Flex available over NDI – an IP-based video stream – it became a slam dunk.

Flex is so much better than conference video quality connections. Plus, you can screen media on an iPad or Apple TV which many clients are really comfortable with. It looks great, it’s very robust and it is very simple to use.

We use Flex every day for CG and grading with our telecine, with Flame and Nuke as well as Premiere and Avid where appropriate for the project.

To what extent are you using AI in your workflow?

AI has been helpful in the pitch stage to give people a better idea of the vision for post. In the past, you’d rent DVDs to put together a pitch reel. AI is incredibly useful at explaining concepts quickly to kickstart that creative process. For final output though, the results tend to be generic. It tends to look like stock footage. I don’t think anybody believes Generative AI will actually take over the role of humans in terms of creatively thinking through a concept, but it can accelerate that process.

What motivates you day to day?

Reinventing the way we can do things. My tenure at Whitehouse Post has been about transitioning to new ways of doing very old things. Every day we find new tools to take even more mundane work out of artists’ hands so they can spend more energy on thinking creatively. In my department we see our role as figuring out ways to cost-effectively take infrastructure away from users so they can focus on being creative.

We are in the process of planning new studios and new locations. We have such a dispersed workforce now; we don’t need 20 suites locked in expensive real estate. Being able to reinvent how that all fits together in a way that allows people to operate wherever they are is an exciting challenge.