Tuesday, 9 February 2021

Behind the Scenes: Judas And The Black Messiah

IBC

Editor Kristan Sprague explains why they made multiple cuts of the story of Fred Hampton, the firebrand leader of the Black Panther Party who was killed in 1969.

https://www.ibc.org/trends/behind-the-scenes-judas-and-the-black-messiah/7256.article

Several recent releases have tied the US civil rights struggle burning in the 1960s with the renewed urgency of today’s Black Lives Matter movement.

One Night in Miami prefigures the assassination of Malcolm X; Spike Lee’s Da 5 Bloods and Aaron Sorkin’s The Trial of the Chicago 7 are set in the aftermath of Martin Luther King’s assassination.

Now, Judas and The Black Messiah relives the potent history of Fred Hampton, a firebrand leader of the Black Panther Party who was killed by police during an FBI-led raid in 1969. He was a peripheral figure in The Trial of the Chicago 7.

These films will figure prominently in the race to win Academy Awards. Like Mangrove, Steve McQueen’s feature about harassment and injustice in ‘70s London, these stories have been criminally neglected on screen; if they were about white persecution, they would already have been filmed.

That’s particularly noticeable in the story of Hampton, a charismatic activist whose short life is a natural subject for cinema. He momentarily succeeded in uniting black power with rival anti-establishment factions, most strikingly a group of white Patriots (depicted in the film waving Confederate flags). Judas suggests that this spooked the federal authorities into extrajudicial execution, although even FBI suits are shown baulking at chief J Edgar Hoover’s extremism.

Dual protagonists
Filmmakers including Forest Whitaker, Antoine Fuqua and Casey Affleck had long tried to get a Hampton project off the ground. Rapper Mos Def tested for the role. Eventually, Ryan Coogler (the director of global smash Black Panther) and Charles King, who had produced the acclaimed civil rights-themed dramas Mudbound and Harriet, joined forces. They championed first-time feature director Shaka King, who had wanted to make a Hampton film since 2014.

“Shaka told me about his idea over many years and that he found it hard to get funding for a straight biopic,” Kristan Sprague, the film’s editor, tells IBC365. “The argument was that few people knew who Hampton was. That forced them to come up with a new angle which is part biopic but also more of a crime thriller genre.”

King and co-writer Will Berson merged their ideas with those of Keith and Kenneth Lucas, who had also been working on a Hampton screenplay. Their story wove the rise of Hampton (played by Daniel Kaluuya) with that of petty thief turned FBI informant William O’Neal (Lakeith Stanfield). In doing so, the film is structured a little like Martin Scorsese’s The Departed, with two lead protagonists.

“O’Neal’s character is going undercover with divided loyalties but Hampton is also conflicted about how much force should be used to achieve the Panthers’ goal,” Sprague says.

“Theirs is a very complicated relationship. Fred is a progressive socialist and O’Neal is a capitalist. He was cool as long as he was getting paid, although I’m not sure he ever understood why he did what he did.”

Judas and the Black Messiah conveys complex political machinations while compressing the timeframe of events and juggling several key relationships. Those are between O’Neal and Hampton, between O’Neal and his FBI handler (Jesse Plemons), and between Hampton and his fiancée Deborah Johnson (Dominique Fishback).

“This is not a documentary or a docudrama,” Sprague says. “Even if we know far more about Hampton’s life from documentary footage, stills and newspaper articles than anyone does about O’Neal, our story is very much a balance between these two.”

Rainbow coalition
A case in point is a scene which the filmmakers dubbed the ‘Rainbow coalition montage’. This portrays the Panthers uniting with groups of Puerto Ricans and white radicals and features a fiery speech from Hampton in front of the Chicago police department.

Inserted in the sequence is a flash forward of O’Neal taking FBI bribes at a restaurant. Cutting back to the steps of the police department, the crowd is chanting anti-police slogans. Hampton and Johnson are then shown looking at each other through the crowd before the scene cuts to an intimate moment of the lovers together.

“We were working on this sequence almost to the very end,” Sprague says.

“It was originally a much longer scene and included a sequence in which we see Hoover (Martin Sheen) declaring his hatred for the Panthers. We felt we needed to set up Hoover’s personal antipathy to Hampton much earlier so we shifted that to the beginning of the film.

“We’re aiming to show the strength of the Panthers at this time. It’s the high point for them in terms of building an organization with wide appeal. Simultaneously, O’Neal is getting in deeper with the FBI. He’s noticeably well-dressed because he is taking money, even while he is in two minds about doing so. This is also the last scene before we see Fred in jail, so it was important for us not to lose the focus on his relationship with Deborah.”

The shared looks between Hampton and Johnson are mirrored in a later demonstration, this time with private glances between O’Neal and Mitchell, his FBI handler. Sprague smoothly maintains the story’s momentum by often overlapping dialogue to bleed between scenes.

“There was a lot of to-and-fro in the edit,” he explains. “We’d pull back on Fred’s story so that we only see him through O’Neal’s eyes. Then we’d make a cut of the whole picture that went the other way toward a much more straightforward Hampton biopic. You can only really do that in the editing process where you get to watch all the rushes and select the takes that can tell the same story in different ways until you arrive at the one that feels right.”

The editor Jennifer Lame, hot from negotiating skipped timelines in Christopher Nolan’s Tenet, was hired as a fresh pair of eyes and is credited with additional editing.

Compressed timelines
“We did two different cuts that tried two different things. Jennifer did one while we were working on another. We didn’t go her way in the end but seeing it told differently unlocked other ideas which did inform our decision-making.”

While the production shot in Cleveland, which doubled for Chicago in the late 1960s, Sprague received footage from the set at his office in New York. He has been a friend of King’s since high school; the pair began their filmmaking careers together making the short Cocoa Loco in 2009. They reteamed for indie drama Newlyweeds in 2013 and two further comedy shorts, Mulignans and LaZercism.

Assisting the production were Hampton’s son and Johnson herself (now known as Akua Njeri). Sprague says they were involved during the writing process and spent a lot of time on set.

“It was hard enough for Daniel to have Fred Hampton Jr watching him every day but Dominique had to sit and perform in the presence of the woman she was playing. At the same time, Akua had a constant reminder of what happened 50 years ago.”

Behind the camera is Sean Bobbitt BSC who had previously worked with Steve McQueen on Hunger, 12 Years A Slave and Widows. He employs a distinct colour palette throughout the film worked out with production designer Sam Lisenco.

“Scenes with the FBI tend toward drab browns and oranges or very cold blues,” Bobbitt explains. “With the Panthers, you have a more vibrant colour palette. There are a lot of greens and yellows. Really, we’re trying to subtly differentiate those two worlds, but not in a way that makes them so disconnected that we don’t believe they are part of the same world.”

Bobbitt lensed with the Arri Alexa LF and Alexa Mini, using the widescreen format to frame strong compositions of groups of people and also to isolate characters – Hampton, O’Neal and Johnson at different times.

Bobbitt says the most technically challenging sequence was the climactic scene of the police raid on the Panthers’ property and the execution of Hampton.

“An overhead shot on a motion control arm goes from room to room to room. To get something like that is expensive, so a lot of time and effort went into designing the sequence so that we made the most of the time on set. Since this was also the culmination of the film, it needed to have more power to it, and horror as well.”

Friday, 5 February 2021

Riddle of the Sphinx: Haris Zambarloukos BSC GSC on Death On The Nile

British Cinematographer 

The peace and tranquility of an Egyptian river cruise onboard the S.S. Karnak is shattered after one of the passengers is found murdered. Renowned Belgian detective Hercule Poirot is entrusted with identifying the killer before they strike again in an exotic mystery steeped in lust, jealousy and betrayal.

https://britishcinematographer.co.uk/haris-zambarloukos-death-on-the-nile/

Director Kenneth Branagh assembles an all-star cast of potential murderers for Death on the Nile, a new feature adaptation of Agatha Christie’s 1937 novel. Disney’s follow-up to 2017’s Murder on the Orient Express sees Branagh as the fabulously mustachioed and famously fastidious sleuth tasked with solving the death of an American heiress onboard a honeymoon cruise in Egypt. Like Orient Express, it is written for the screen by Michael Green, production designed by Jim Clay and photographed by Haris Zambarloukos BSC GSC.

One of the delights for an audience in watching one of the Christie canon is playing detective themselves and sorting the red herrings from the essential facts among an array of likely suspects. Death on the Nile features Gal Gadot as unfortunate heiress Linnet Ridgeway, with Annette Bening, Armie Hammer, Letitia Wright, Sophie Okonedo, Ali Fazal and British comedy stars Jennifer Saunders, Dawn French and Russell Brand.

With many cinemas closed, Disney delayed Nile’s release by a few months but has stuck by a pre-Christmas theatrical outing. That’s fantastic news for a production shot at the highest fidelity, on large format 65mm film, and for which 32 70mm release prints have been struck. Zambarloukos details the scene of the crime.

There is a coda at the end of Orient in which Poirot hints at vacationing in Egypt. At what point did you know this would be the next Poirot project?

HZ: We knew that Michael Green had already been commissioned to write it but we also had to shoot Artemis Fowl first. For anyone that knows the books, Poirot’s reason for travelling to Egypt in the first place was not to solve the case that becomes Death on the Nile. He does go on vacation and then returns. In our film he says, ‘I’ve just got back and now I have to return.’ So, it was a nice touch from Michael to lead into Nile at the end of Orient.

 

Production was delayed due to the completion of Disney’s purchase of 20th Century Fox, so we began principal photography in September 2019, not 2018 as planned. That was great because it meant I got a lot of time to prep. I had read the book, but I did not go back to it because I just wanted to rely on our interpretation and Michael’s fantastic script. I remember it being my favourite Christie but I think Michael did a superb job.

In the interim, the whodunnit genre has been satirised by Rian Johnson’s Knives Out with Daniel Craig as a Poirot-esque sleuth complete with rogue accent. Did that play into your approach?

HZ: I enjoyed Knives Out but it really is a different interpretation to ours. I don’t know anyone who has researched Agatha Christie’s writing as much as Ken – down to what she was thinking and feeling while writing each book. This feeds into his interpretation of the story.

Ken has been championing diversity in casting for decades now (for example, casting Denzel Washington in 1993’s Much Ado About Nothing). His philosophy is ‘this is how I want the world to be and it doesn’t necessarily have to be the one inhabited by the characters defined in the source material’. He did it on Orient (casting Leslie Odom Jr. as Dr Arbuthnot) and he’s gone much further on Nile. Together with Michael’s script, he’s added a race issue which didn’t exist in the book and it is a much better story for it. We went away from a more English, tamer version of the story to get under the skin of the human condition.

Similarly, the images could portray the world as if from a 1930s LIFE magazine issue covering a cruise down the Nile but, given the film’s darker outlook, I wondered what it would look like as a cruise down the Mississippi. That was always my question to Ken - could we add any elements that bring that aspect out? Certainly, the choice of a blues soundtrack played into that.

 

Did you refer to previous film adaptations or were you inspired by other artistic references?

HZ: The references are quite varied. I always look at photojournalism from the period and in this instance Lee Tanner’s The Jazz Image: Masters of Jazz Photography and The First World War in Photographs by Richard Holmes were both great references. Artwork by Edward Hopper provides another tonal element. For example, his 1939 painting New York Movie depicts a girl waiting in the wings of a theatre. She’s in the shadows and you can see the red velvet curtain. In that picture there is nothing ominous or overtly scary or sexual but it evokes all those things in a very clever and emotional way without eliciting fear or showing her in a derogatory way, or as a victim. All of this added to the way we photographed this film.

The first eight minutes are in French and in black and white. That’s brave. Why that creative choice?

HZ: It’s a really daring thing to do in a big studio film. It recounts Poirot’s past as a soldier in the First World War, his approach to remembering and analysing how he becomes who he is. The film then moves to the ‘30s and a gritty, blues club in London where we feature the music of Sister Rosetta Tharpe, an African-American blues artist and guitarist. She is a very assertive woman, standing up and playing guitar. We meet a trio of characters played by Armie Hammer, Gal Gadot and Emma Mackey in that second part of the beginning of the film through dance and music. There is dialogue but it’s really about body language in this dance sequence. We shot this with very long Steadicams that intertwine the music, the choreography and their relationship. We planned all of this to be a very visceral and immersive introduction.

 

The design of Orient was bright, classic, glossy. Was this a template for this film or did you evolve the look?

HZ: It’s definitely evolved but we had to stay within the same language. We wanted the highest fidelity photography, which is 65mm analogue film, so we managed to stay with that process, with a classic approach to lighting and always an ode to all the film noir that Ken and I grew up loving, such as Dial M for Murder, or those large format human condition films like Giant (1956).

Both Ken and I love John Cassavetes. For us, it was how do you capture that rawness of performance while making a studio film? Those were the influences we played around with.

For Orient we’d watched a lot of Anthony Mann films and those were a huge influence on this film. The Night of the Hunter, shot by Stanley Cortez ASC, was a key text. It is a very Americana film that combined daytime scenes shot on a real river with night-time work shot in a studio. It’s a masterpiece in every way with some very iconic imagery, but I think my idea of turning the Nile into the Mississippi is more understandable as a concept when you see that film.

You and Branagh have shot on film for your previous five collaborations. Did you go with the same camera package as Orient?

HZ: Yes. Two Panaflex 65mm sync-sound cameras with a combination of System 65 lenses and Spheros. The Spheros are slightly older, from the David Lean era of 65mm. The System 65s were made for the format’s resurgence in the early nineties, led by Ron Howard’s Far and Away (1992) and Ken’s Hamlet (1996).

Our film stock was a Kodak combination of 500T 5219 and 250D, which is super fine grain. Exteriors in sunlight on 50 ASA 65mm look almost three-dimensional. We used 200T as well; even though we were shooting large format, we wanted to go further and combine large format with fine grain film. I certainly chose lenses based on their clarity. I know many people go for a vintage look and find something a little softer. I was going for the most immersive type of analogue filmmaking I could.

The team at Panavision, led by Hugh Whittaker and Charlie Toddman, were very supportive in making a 65mm film possible. They have kept those cameras working over the decades and it’s a privilege to use them. Likewise, Rob Garvey at Panalux was instrumental in helping us accomplish our very complicated stage rig.

 

This was shot like Orient at Longcross Studios with plates filmed on location in Egypt. Was it ever a possibility to shoot entirely on location?

HZ: The issue is that 1934 Egypt barely exists today. For example, in the 1960s they moved the Abu Simbel temple 300 metres away so that the Aswan Dam wouldn’t flood it. So, we built the entire four-storey high Abu Simbel at Longcross, complete with banks of water. The same with Giza and the Sphinx. In the 1930s the Nile went up to the feet of the Sphinx. Now all you see is the concrete expanse of Cairo.

Secondly, it’s difficult to shoot complex scenes on a river while floating, taking all the cast down there and scheduling them, on top of ensuring everyone’s safety on such a high-profile project.

Our whole design and research went into creating a set. We wanted to build a life-size boat inside and out; not to break it down into small sets but to shoot it as if we were on a boat. That’s a huge undertaking. Jim Clay built an amazing set to scale for the Karnak. It was so big we needed to build a temporary sound stage around it. We also wanted to use some real daylight when we got great sunlight in Longcross and use a little bit of water to basically film the boats carrying guests to the Karnak.

We recycled the railway from Orient and built the boat on that so we could wheel it in from outdoors to indoors. We built a very elaborate lighting rig that you could pull back and see the entire boat in one shot. You could step onto the boat and walk through all the rooms which were all lit for an analogue film f-stop. It was complicated and took most of our planning but I personally don’t think you can tell the difference when we cut - even from a shot filmed outside in real sunlight juxtaposed with one in apparent sunlight on our sound stage. It’s seamless because we took such great care and a detailed approach to our rig and construction.

You augmented the studio work with plates photographed on location in Egypt. Tell us about that.

HZ: We filmed on the Nile from a boat with an array of 14 8K Red cameras. We had a 360-degree bubble on top of the boat and two three-camera arrays pointing forwards and backwards as we travelled up and down. We specifically chose areas where modernity wasn’t present (or where it was, we removed it in post) and we also shot plates from the point of view of passengers onboard the Karnak.

VFX supervisor George Murphy edited the footage and stitched the plates together into what was essentially a very advanced virtual reality rig in which I could pan my camera. We did that before principal photography, so we never had to guess a month or so later what to put there. That’s a big help. Most shoots do their plate photography afterwards. It meant I could pretty much place the camera on any deck of the Karnak for any scene and know what the background would be.

 

As with Orient, did you play back footage in real time on LED screens outside the boat set?

HZ: I’d love to have done it live but on Orient we were only dealing with one wagon’s windows at a time. It was still the biggest LED set-up ever done to that point, but the Karnak set is 20 times bigger than that. There aren’t enough LED screens available – plus it would have been prohibitively expensive.

Instead, I went for a much larger version of a technique I’d used on Mamma Mia which was to hang back projection screens all around the boat – 200m in circumference, 15m high. We used Arri SkyPanels at a distance to create a sky or a part of the background. It could also be converted into a blue screen when we needed to. It meant that if I had a shot looking above the horizon line into the sky then it could be done in camera.

How confident were you of retaining colour and contrast from set to post?

HZ: I took stills on the recce and we used those to create the colours of our skies with this back projection. I take prints (not digital stills) so there is no misinterpretation. A still is a piece of paper that you can see. Once something is emailed across and seen by someone watching on another screen, the information can get lost.

At the same time there were a lot more checks and balances put in place. We had a projector at Longcross and I watched dailies with (dailies colourist) Sam Spurgeon every lunchtime. With Kodak and Digital Orchard we have a very quick process to convert analogue filmmaking into digital by the next morning. Film is processed at night, they scan at 4am and by mid-morning those digital images are transferred to our dailies suite at Longcross. At lunch we’d watch it digitally projected, having been processed, scanned and graded at 2K.

I check that first and give notes to Sam and those get transferred onto our dailies, which is what Ken, the editorial team, VFX and the studio team see. That’s a major check. It’s me with someone in a room, rather than me talking over the phone, which is a big difference. I have a very good relationship with Goldcrest and (DI colourist) Rob Pizzey who also sees things along the way. I supervise the grade at the end. So, there’s no need for anyone to interpret anything. It’s a collaboration in which we all look at the same images.

 

Did you shoot black and white for the opening scene or convert?

HZ: We shot colour for a couple of reasons. Although Kodak could manufacture black-and-white 65mm, there is no lab in the world that processes it. Plus, there’s a certain skill to grading black and white from colour negative, and the added benefit is that you can assign a grey tone to a colour. For example, you could take red and decide it will look a very dark grey or a light grey, so you get very detailed tones. Ultimately, I get much more control in the DI this way. The battlefield sets and costumes were very monochromatic, so it was quite limited in this case. The Germans wore grey and the Belgians wore dark blue and it’s a dark, sooty, gas-filled battlefield, but you could manipulate the blue in the sky a little bit more and certainly manipulate the intensity of people’s eyes - especially if they had blue eyes (which Branagh does).

Tell us about your operating team.

HZ: Absolutely. If there’s any merit or artistry in cinematography it is 100 percent because of the crew. I have a fantastic team, many have been collaborators for decades now. My A camera operator is Luke Redgrave, B camera Andrei Austin, Steadicam is Stamos Triantafyllos. I cannot describe how difficult it is to do the shots Stamos does at those lengths carrying that weight. I don’t know anyone else who could do it. It’s a combination of extreme artistry and extreme physical ability.

My fantastic key grip is Malcolm Hughes. The work that Luke and Malcolm do on a crane is incredible. My great gaffer was Dan Lowe and my A cam focus puller was Dean Thompson who has been my first since Cinderella and is an expert on 65mm.

How did you handle sound sync?

HZ: To do sound sync work on Orient we used sound cameras that are twice as heavy as high-speed cameras, so I wanted to develop a soundproof housing (blimp) for our camera on Nile. I took the problem to Stuart Heath at BGI Supplies at Longcross. They’ve made all sorts of props for us before, from Cinderella’s carriage to the furniture on Nile. I told him that I needed it really quickly. All my other attempts had failed. Stuart suggested using a material with which they soundproof the interiors of helicopters. He brought a draper in who basically measured the camera as if making a dinner suit for it and quickly made a couple of versions for us. It was very effective and really opened up the Steadicam possibility for us. All from just wandering into a workshop on the lot and asking a friend if he had any ideas about how to achieve something. In the old days that’s what everyone did – the answer was somewhere on the lot.

 

In Orient you created some stylish direct overheads of the train carriage. You’ve told us of the Steadicam dance sequence in Nile. Were there other stylistic flourishes?

HZ: Inside the sound stage we went twice round the Karnak with the entire cast all choreographed for this one great reveal of a murder. It was really hard work to do. I understand why it was cut in the edit although they have kept a lot of other single long takes and there are lots of places where you see the whole cast in a single shot.

However difficult you might think setting up a long single take is in terms of lighting and operating, it is equally, if not more, difficult to block a scene with multiple actors, keep the audience engaged and choreograph it in a way that is exciting and at the same time reveals things gradually. There’s a lot of pressure on a lot of people in shots like that. Everyone’s got to be on top of their game. Because we’re all so interdependent, it’s a domino effect: the further you go into the take, the bigger the responsibility for not getting it wrong, whether that’s the operator, the focus puller, the actor saying the final line, or the gaffer lighting a corner at just the right time. We always get excited about those shots but also very nervous.

Did you complete an HDR grade?

HZ: We did an HDR DI and one in Dolby Vision. I’m in two minds whether I prefer the 70mm analogue print or the Dolby Vision DCP. Film is inherently HDR but projectors are now interpreting the information that exists in a film neg. They are only just getting to where they can interpret the dynamic range of a digital camera, let alone of a film camera. There is nothing more immersive or more HDR than a 65mm neg scanned at 8K down to a 4K Dolby DCP. It has everything you’d want in a projection. We pushed the limits in our DI with Rob and I’d do minute adjustments in Dolby Vision. The HDR process is becoming very seamless.

 

Finally, after six films and 14 years working with Ken Branagh, could you tell us what makes your relationship tick?

HZ: It is a fantastic friendship. To begin with you must be able to maintain a professional friendship with any cast and crew which is all about doing your very best and understanding where you have common aesthetics and shared thoughts about humanity. Ask what kind of world you want this to be, because that will come through in your filmmaking.

As you say, I’ve spent years working in close proximity to Ken and we have a mutual affection and admiration for each other otherwise we wouldn’t be doing it for so long. He is relentless in pursuit of perfection and in his advancement of storytelling and is inspiring to work with. It means you have to be as relentless in your area of craft.

I think we both like making the same kinds of films. I’m a Greek Cypriot who grew up with Greek myth and tragedy. Ken’s love of Shakespeare is legendary. You can easily see the lineage between Aeschylus (the ancient Greek creator of tragedy) that goes all the way to Shakespeare. Perhaps that appreciation for the human condition in its best and worst forms is the tie that binds.

 

Thursday, 4 February 2021

Animation Powerhouse Animal Logic: Insights on Remote Review and ClearView Flex

copywritten for Sohonet

We talk to the animation powerhouse that is Animal Logic to see how they’ve been responding to COVID-19 and look at how they’ve been using Sohonet’s ClearView Flex to facilitate remote review across the globe and hit key production milestones.

https://www.sohonet.com/our-resources/blogs/behind-the-scenes-with-animation-powerhouse-animal-logic-insights-on-remote-review-and-clearview-flex/

Award-winning animation and visual effects studio, Animal Logic, has been inspiring audiences with great stories, breathtaking visuals and groundbreaking technology for 30 years, having worked on Hollywood blockbusters including Peter Rabbit, The LEGO Movie Franchise, Captain Marvel, The Great Gatsby and Happy Feet. With headquarters in Sydney’s Fox Studios, a second home in Vancouver, and development offices in LA, the animation powerhouse’s creative excellence relies on the collaboration of a global team of 600 artists, practitioners and support staff. 

In response to COVID-19, Animal Logic was quick to adopt remote production tools across all sites. This allowed them to continue to hit production milestones on cross-site projects including Warner Bros’ animated feature Super Pets.

We talked to Animal Logic Media Engineer Harry Smith about how they collaborate, both cross-site and remotely, from across the globe.

Q> When Covid-19 forced lockdown, what did Animal Logic do to continue working on projects?  

We quickly pivoted into a work from home strategy, employing various remote desktop solutions and ClearView Flex for secure, remote review of content from outside of our network. 

Q> What was Animal Logic looking for in a remote viewing solution? 

For a remote viewing solution, we required something that would allow viewers to connect from their preferred device that would support as many viewers as possible.  Critically, we needed a system that would maintain our color fidelity and a stable frame rate so people could focus on what they’re seeing.

Q> Did your team trial other remote viewing solutions?  

We did try other solutions and found they weren’t able to provide a stable viewing experience for the number of viewers we needed to support.

Q> How important has remote working become to your business?

Our preference will always be to work together in the studio, but when our employees or clients can’t, remote work allows us to continue to hit production milestones, thanks to tools like ClearView Flex.

Q> How did you learn about Flex and what does it enable you to do? 

Animal Logic uses Sohonet to connect our locations and our clients on a private, secure network, so it was a natural extension of our relationship. ClearView Flex has allowed us to set up reviews for our teams and external partners, allowing up to 30 viewers on a single stream.  We can use the system in place of our traditional review practices, which involved many people sitting in our on-site theatres. Flex has helped maintain communication between our teams in Vancouver and Sydney working on Super Pets [based on DC Comics’ family of crime-fighting animals] and the creative team at Warner Brothers in LA. 

Q> For this project, where are your clients located and how did Flex enable you to interact with them?  

Flex has allowed us to share content and, combined with video conferencing, enables us to discuss that content live with our clients in Los Angeles.

Q> And how would you describe the remote session experience using ClearView?  

The remote session experience has been good. We haven’t run into any issues after setting up the system, and the support team at Sohonet has been quick to respond and help sort out any issues we have had.

Q> Lastly, does ClearView Flex integrate with your other key remote production equipment? 

The Flex units sit in our facility server rooms, connected to our standard workstations provided to our staff. With remote desktop capabilities to those units, anyone can book out the resource and connect to control the desktop and set up a review session on ClearView Flex.

 


Monday, 1 February 2021

All the new TV smarts from CES 2021

As stay-at-home orders sent streaming through the roof, the TV remains the centerpiece for entertainment in the home. A quarter of TV viewing in North America is now of streamed services, according to Nielsen. Households upgraded their tellies in a record-setting year for shipments in 2020, according to the Consumer Technology Association (CTA), which expects steady demand for displays in North America through 2021. TV sales will hit 43 million units this year, with sets over 70 inches and 8K UHD models in high demand.

TV vendors have had a tough time in recent years as consumers tended to spend more money, more frequently, on smartphones. Greater consumer attention to the value of the main screen, and the maturation of a number of technical ingredients which combine to make for more immersive viewing experiences, have opened the window for TV makers, with CES as ever the perfect place to launch a sales pitch.

“The killer app for TV is TV,” said Madeline Noland, President of the ATSC, during a CES panel session about Next-Gen TV.

ATSC research reveals that home viewers want higher resolution and enhanced audio (from built-in 3D speakers to sound bars and 21.1 channels). They also want higher dynamic range, higher frame rates, bigger displays and Filmmaker Mode, a button which sets a film’s colour palette, contrast, aspect ratio and frame rate supposedly as the director envisaged. In fact, they want the whole package, to which smart interactive personalisation can now be added.

“What is exciting is the synergy between these consumer desires and today’s TVs which are bringing these features to life,” she said.

We’ve been a while getting here. Michael Davies, SVP Field and Technical for Fox Sports, said on the same CES panel that, visiting Japan last year, he was “embarrassed” to admit Fox was still broadcasting 720p SDR while the Japanese were talking about 8K.

“We’ve been living with HDTV for 20 years,” he said. “It’s been a pretty slow roll from there. We had three 8K cameras at Super Bowl LIV but that pales besides the other 120 cameras we had there.”

8K TV sales are predicted to grow by 300% in 2021, although that comprises a relatively small 1.7 million units.

 

Upscaling

“Even in 4K there is limited content today,” acknowledged Grace Nolan, VP Integrated Marketing, Samsung. “It will be a little stretch to get to 8K on a more mainstream level – [but] we won’t get there unless the industry is pushing. It’s encouraging to see 8K games consoles (PS5 and Xbox Series X support 8K gaming) coming out. We will catch up with Japan.”

In the meantime, vendors are relying on upscaling technology to make incoming lower-resolution pictures 8K-ish.

“We lean hard on AI and upscaling tech,” said Nolan. “The more data that is input into the TV, the better the processor is able to work to produce a more beautiful upscaled image.”

Aside from UHD, HDR and enhanced audio, the other near-universal component of 2021 TV hardware is features for gamers. Larger, brighter screens with higher refresh rates and gamer-only extras, as well as tie-ups with cloud gaming vendors, should help TV brands shift more gear.

Round-up by vendor

Decoding the barrage of branding and acronyms which goes hand in hand with new TV launches is a minefield, and two terms in particular make for confusion this year: MiniLED and MicroLED.

MiniLED could overtake conventional LED backlighting to become the main illumination source for the bulk of consumer electronics, not just flat panel displays. Apple is widely rumoured to be using MiniLED panels in its upcoming iPad Pro and MacBook upgrades. By siting tens of thousands of LEDs behind an LCD panel, divided into ‘dimming zones’, the technology delivers more precise differentiation between bright details without the light spilling into surrounding dark areas. Black levels of course have a direct impact on accurate colour representation in SDR and HDR images, while overall brightness levels are superior to those produced by OLEDs.
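For readers curious how local dimming works in principle, here is a minimal sketch in Python. It is illustrative only: the zone count and the simple peak-tracking rule are assumptions for the example, not any vendor's actual algorithm, which typically adds temporal smoothing and halo suppression.

```python
import numpy as np

# Illustrative sketch of zoned local dimming: the backlight is divided
# into a grid of dimming zones, and each zone is driven only as bright
# as the brightest pixel it must display. Dark zones can then sit near
# black while bright zones stay at full output.

def zone_backlight(image, zones=(4, 4)):
    """Return a per-zone backlight map where each zone tracks its peak pixel."""
    h, w = image.shape
    zh, zw = h // zones[0], w // zones[1]
    levels = np.zeros(zones)
    for i in range(zones[0]):
        for j in range(zones[1]):
            levels[i, j] = image[i*zh:(i+1)*zh, j*zw:(j+1)*zw].max()
    return levels

# A frame that is mostly black with a single bright highlight:
frame = np.zeros((8, 8))
frame[0, 0] = 1.0
levels = zone_backlight(frame, zones=(4, 4))
print(levels.max(), levels.min())  # the highlight's zone is at 1.0, the rest at 0.0
```

The more zones a panel has, the less a bright highlight lifts the black level of its neighbours, which is why vendors quote dimming-zone counts alongside contrast ratios.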

MicroLED is the more expensive solution to manufacture. It involves assigning microscopic LED arrays to individual pixels, allowing even greater control over picture brightness. Like OLED, the technology can show true blacks by switching any pixel off, but unlike OLED, MicroLED can deliver a much brighter dynamic range and more impressive contrast. Compared with LCD technology, MicroLED displays offer better contrast, response times and energy efficiency. MicroLEDs form the basis of Sony’s Crystal LED screens, which it is now marketing to film and TV productions wanting to shoot on virtual sets.

 

Samsung

Samsung, the world’s largest TV seller, has made MiniLEDs the backlight system for its new range of Neo QLED TVs. By shrinking the LEDs to a fortieth of their traditional size, Samsung says it has upped both brightness and black levels while allowing for more precision and less bleeding of bright areas into darker spots. The flagship is the 85-inch Neo QN900, which comes with a bezel-less screen similar to the infinity displays of its Galaxy smartphones. It’s less than a centimetre thick too, with the speakers embedded behind the screen. No price was given, but it could cost north of $10k.

An interesting feature is a ‘game bar’ which enables quick access to settings such as refresh rate and aspect ratio when attached to a PS5 or Xbox Series X. The aspect ratio can be changed from 21:9 to 32:9. It also supports 4K at 120fps which is another must-have for gamers.

Samsung’s CES headline generator, though, is its new MicroLED TVs, which come in 88, 99 and 110-inch sizes. Reports suggest the largest costs $156k, and it’s only 4K. Two years ago Samsung was demonstrating this technology in an 8K 150-inch version called The Wall, directed at the business-to-business market. These consumer-grade monsters come with a Multi View feature that enables the screen to be split into four separate 55-inch pictures, each with its own separate volume control.

LG 

The world's second bestselling TV brand is also introducing MiniLEDs into a range of displays it is calling QNED. The ‘Q’ refers to the tiny ‘quantum dot’ crystals the displays use to produce their colours; the ‘N’ refers to ‘nano cell’ particles used to absorb unwanted light wavelengths to improve colour reproduction and viewing angles; the ‘ED’ refers to ‘emitting diodes.’

LG is packing 30,000 of these tiny LEDs into the back of its largest 86-inch screens to produce, the company claims, a contrast ratio of 1,000,000:1 when paired with nearly 2,500 dimming zones and advanced local dimming technology.

In a video presentation the firm said, “The only way for LCDs to get bigger is for details to get more precise – hence Mini-LED. Blacks that are deeper and more precise than any other of our LCD TVs.”

However, LG’s premium picture quality will still be found in its OLED ‘Evo’ range which now come with a new processor.  During its press conference LG said the Evo benefitted from “a new luminous element” that would deliver “punchy images with high clarity, detail and realism.”

Gaming is a focus for LG too. It has a new partnership with Google which will see Google Stadia run on its TVs while Amazon Twitch has earned a prized position on LG’s ‘magic’ remote control.

A prototype of a 48-inch OLED capable of bending 1000mm from a conventional flat screen into a curved display for greater immersion also targeted gamers.

LG’s processor features ‘AI Picture Pro’, which is reckoned to be an improvement on last year’s debut. Developed from a database of over a million visual data points, the algorithm recognizes onscreen objects such as faces and bodies and distinguishes between foregrounds and backgrounds, removes noise and optimises contrast and saturation ostensibly “to make images more three-dimensional.”

Panasonic

Panasonic majored on OLED and also zeroed in on gamers with its 55-inch and 65-inch JZ2000. The model is billed as having low latency and support for HDMI 2.1 variable refresh rates as well as frame rates up to 120fps. An AI processor can automatically detect what you're watching or playing, as well as the ambient light of the room you're in, and calibrate settings including Dolby Vision IQ and HDR10+ Adaptive for optimal viewing.

As an example, Panasonic said it can detect a football game and adjust the picture accordingly to help accentuate things like the grass on the field or how players look. The AI system will also give you a sound setting that feels like you're in the stadium, Panasonic said.

Alongside Dolby Atmos support, the TV comes with side and upward-firing built-in speakers which create what the company calls 360° Soundscape Pro.

Hisense

Hisense is adding an 8K up-rezzing chip to its flagship ULED TVs later this year and will promote this through its official partnership with the rescheduled Euro 2020 soccer tournament.

Bigger news from the Chinese vendor, though, was the unleashing of a massive 300-inch version of its TriChroma laser TV. These are laser projectors which use short-throw technology to display 4K images on walls, but Hisense has added a smart platform, AI cameras to support interactive uses like online karaoke and fitness, and a TV tuner.

“Laser TV is the only TV category which experienced growth in China last year,” said boss Fisher Yu, who added that sales had rocketed 288% outside China in 2020.

Since Hisense launched the first laser TV in 2014, Samsung, Sony and LG have followed suit with their own ranges.


Saturday, 30 January 2021

HBO Europe Gothic Horror Series 30 Coins Finished in DaVinci Resolve

written for Blackmagic Design

Directed and co-written by acclaimed horror master Álex de la Iglesia (The Day of the Beast, The Last Circus), 30 Coins takes viewers into a world where nothing is as it seems and nobody can be trusted.

https://britishcinematographer.co.uk/hbo-europe-gothic-horror-series-30-coins-finished-in-davinci-resolve/

HBO Europe’s eight-episode drama series follows Father Vergara (Eduard Fernández), an exorcist, boxer and ex-convict, who is exiled by the church to be the priest of a remote town in Spain. As past enemies come back to haunt him, strange things begin to happen. The town’s ambitious mayor Paco (Miguel Ángel Silvestre) teams with local veterinarian Elena (Megan Montaner) to unearth the secrets of Vergara’s past. The three find themselves in the middle of a global conspiracy to control the 30 pieces of silver paid to Judas Iscariot when he betrayed Jesus of Nazareth. The coins are, as one might imagine, cursed, and contain supernatural powers.

Lending the story its distinct and unsettling visual style is director of photography Pablo Rosso ([Rec], Veronica), working closely with regular collaborator Chema Alba, senior colourist at Deluxe Madrid.

“Pablo is a wonderful person to create with,” begins Chema. “Our starting point was discussing texture more than colour. In pre-production we did a lot of lens tests to get the best anamorphic ‘feel’ with a special interest in flares, blurred corners and chromatic aberrations.

“I’ve also worked with Alex before and I know that he likes a lot of contrast, but you have to give him the possibility of seeing into the blacks and the shadows.”

When principal photography ended in December 2019 after a six-month shoot, and Chema received the first raw edit of the pilot, he and Pablo asked the production company if they could take a week out to prep the show together.

“Just before Christmas, we spent a week together brainstorming without being under pressure to commit ideas immediately to screen. This really helped us to hit the ground running when colour correction began properly in February.”

The pair had already curated a colour bible of looks and moods, which really helped to keep the series on track when the pandemic enforced lockdown in March, as Chema was able to complete a lot of the work on his own.

30 Coins was acquired on the Sony Venice, mounted with anamorphic Hawk Vintage ’74 lenses. Rosso shot 6K framed for a 2.1 aspect ratio for a 4K UHD deliverable. Aside from a few scenes filmed in New York, Jerusalem and Rome, the bulk of the show was shot on location in a small town near Madrid.

“The lenses themselves generated a natural blur around the edge of the frame, which helped with texture. For large parts of the show we enhanced and accentuated the soft blur of the image. The action stays central to the image, but the surrounding area has a lot of blue tones, grain and textures.”

Chema explains that the overall look of the show harks back to the classic horror films of the ‘70s which featured soft imagery and a deathly green colour palette.

“We are set in a Spanish town so there is a lot of yellow and orange in there, but we pushed the greens every time we could, especially in the shadows. We avoided blues at night and instead opted for grey hues, with black and white contrast for skin tone.”

Chema partnered with colourist Charlie Villafuerte, and together the pair made a huge number of shot composites in DaVinci Resolve.

In one episode, Chema and Charlie were tasked with including different layers within 300 shots, most of which contained complex VFX (DMP or 3D integrations). But there were no plates of the layers shot in-camera; they were all created in Resolve.

“VFX gave us different plates and we just played around with the image, depending on the type of shot, the angle of the lens and the size of the flies, and we composited layer upon layer,” Alba explains. “Alex always wanted the ability to change things at the last moment. I had five to 10 different plates of animated flies and we’d move them closer to or further from the camera in real time. Some shots have seven layers of image composited in Resolve. We have the source camera, then VFX, and now we have half a dozen different layers – all combining together in 4K. We were exploring the limits of the Resolve software and the hardware.”

Chema performed a significant amount of the compositing with the Curves Editor. “I used curves to make two different contrasts, one for about 80 percent of the image and a lower contrast, about 20 percent outside this curve, in the same image. This allows me to put a lot of contrast in that main curve, while retaining the raw image for the shadows. This technique allows you to get a lot of detail.

“We had a lot of texture and contrast across the show but if you push too much texture it works for the male characters, because they are rough or tattooed, but was too harsh for the skin tone on the female characters. So, for 10 or 20 shots across the show we removed the extra texture to soften some of the skin.”
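For readers curious how the ‘two contrasts in one image’ curve technique Chema describes behaves numerically, here is a rough sketch in Python. The 0.2 breakpoint, curve shapes and the 80/20 split are illustrative assumptions for the example, not Resolve’s internals or the show’s actual settings.

```python
import numpy as np

# Sketch of a split-contrast tone curve: a steep S-curve handles the
# main part of the tonal range, while values below an assumed 'knee'
# get a much gentler slope so raw shadow detail survives the grade.

def split_contrast(x, knee=0.2, shadow_gain=0.5, steepness=8.0):
    x = np.asarray(x, dtype=float)
    shadows = shadow_gain * x                      # low contrast below the knee
    # Logistic S-curve for the rest of the range, normalised to 0..1:
    s = 1.0 / (1.0 + np.exp(-steepness * (x - 0.5)))
    s = (s - s.min()) / (s.max() - s.min())
    return np.where(x < knee, shadows, s)

tones = np.linspace(0, 1, 11)
print(np.round(split_contrast(tones), 3))
```

The effect is that mid-tones and highlights gain punch from the steep section while the shadow slope stays shallow, which is broadly the trade-off Chema describes between contrast and retained shadow detail.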

In the final episode the director wanted a really dense atmospheric look which drew on all the colourists’ skill. Resolve combined VFX, multiple layers of atmospheric elements and blur, with the actors rotoscoped in the middle.

“Thanks to FX simulations and 2D/3D creative assets, our VFX department provided us with 30 different layers with diverse speeds, movements and densities. We used a combination of layer blending modes in the edit page, plus extra alpha outputs to isolate people or buildings. We used up to three or four layers of elements for each shot, played with contrast and tuned with Lens Blur FX to get the final texture. It was pretty challenging.”

The biggest challenge grading the show was simply time. “Every episode has about 2,000 shots because of the fast-paced nature of the edit, and there are a lot of layers to craft, so to get this amount of work done in three or four days per episode you have to go really fast.

“I’ve been working with Resolve for three years so I am very familiar with it. Resolve is really easy to use, it is so intuitive and really helped us manage the demand.”

 30 Coins is co-written by Iglesia and Jorge Guerricaechevarría. Executive producers for HBO Europe are Steve Matthews, Miguel Salvat, and Antony Root. Iglesia and Carolina Bang are executive producers for Pokeepsie Films. The series was produced with participation from HBO Latin America.

Online editors at Deluxe Madrid are Mario Martínez Duque and Juan Ugarriza, and post-production producers are Yolanda Hurtado and Paula Lidón.

 

Thursday, 28 January 2021

Episodic Content Continues to Grow and Virtual Production Meets the Demand

copywritten for Sohonet

Chuck Parker, CEO of Sohonet, looks ahead to 2021 and the top trends the film, TV and ad industry should expect to see. Blog 1 in this series zones in on the continued investment in episodic content, and how Covid-19 has given virtual production a material boost in financial viability.

Content has always been king, but it has never been more important than this year to own a larger, higher-quality and more exclusive library than your competitors. As content providers have doubled down on direct-to-consumer strategies, it is episodic content that continues to surge in volume. In contrast to single features, episodic productions provide a sustained wave of fresh content with which to engage subscribers for longer, generating higher perceived subscriber value.

https://www.sohonet.com/our-resources/blogs/episodic-content-continues-to-grow-and-virtual-production-meets-the-demand/

Market leader Netflix, which ended 2020 with over 195 million paid customers worldwide, spent $17.3 billion on content last year – two and a half times more than Amazon Studios. By 2028, analyst BMO Capital Markets predicts, Netflix will spend $26 billion per year on content. Amazon and Netflix have to spend considerably more on originals than rivals, in part because companies like Disney, NBCUniversal, ViacomCBS and WarnerMedia have deep content catalogues they can pull from to enrich their services. Nonetheless, Disney+ and HBO Max have also unveiled, at their investor days, huge investments in episodic slates building on franchises from Marvel, Star Wars, Pixar and DC.

In early 2020, there were analyst concerns that the lengthy lull in content production would negatively affect streaming business models in 2021. Netflix, however, said it still expects to launch more originals in each quarter of 2021 than in 2020. And with 73% of its subscribers coming from North America, Netflix has to pursue growth overseas with local content commissioned to attract customers in territories like India, Brazil and Spain.

As a result, the 2015 industry average of 1,400 annual TV and film productions (with material budgets) will soar well beyond 2,000 a year as competition intensifies. Even with such stratospheric multi-billion-dollar content budgets, each provider is aiming to strip costs out of production while still retaining top-tier value on screen. In parallel, the biggest challenge beyond the health and safety constraints of the pandemic is the real estate squeeze created by the surge in production volume, which is driving more and more productions to shoot their content “off-lot” in empty warehouses and other large structures.

 

Virtual production offers an answer 

Combining live-action footage with computer graphics in real time was still relatively nascent for TV and film production when Covid-19 crashed into our industry, but the key benefit of reducing the need to travel for on-location shooting has given virtual production a material boost in financial viability. While the pandemic persists, virtual stages are both a means to continue production safely and a way to reduce the cost of on-location work (i.e. associated travel and accommodation costs). If the practical effects delivered via virtual production can reduce the overall VFX budget of productions as promised, then this trend will accelerate rapidly.

The upfront capital cost of the technology remains high and the techniques are not yet embedded throughout the workflow, so expect growth to be gradual in 2021. When Lucasfilm and Disney made The Mandalorian Season 1, each 40-minute episode reportedly cost $15m – the same as the per-episode budget of Game of Thrones’ final season. From its investor day, we learned that Disney’s long-term goal in pioneering virtual production at scale is to produce episodic series with all the production value of its blockbuster features, with a step-change reduction in costs and a faster turnaround.

 

The underlying technology, such as fine pixel-pitch LEDs and hyper-performance graphics cards, will advance in performance and reduce in price as the industry scales up (aided by Moore’s Law), making virtual production more accessible. The surge of real estate investment for purpose-built production studios will certainly give it a boost as well.  Directors and actors will gain creatively by being able to work in real-time with CG assets on set. Practical effects – like miniature models and explosions – will increasingly be done more efficiently in software on the virtual production stage. Greater use of the techniques and technologies will lead to better results and more streamlined production.

This has implications for the VFX industry, and in particular for the titans of the sector. We will see VFX houses rapidly pivot to offer workflows and talent which marry virtual production with practical effects and the specialist creation of creatures and digital humans. It is not a given that the companies which were powerhouses prior to the pandemic will win this work. The VFX sector has suffered disproportionately during the halting of production: revenue streams were stunted and thousands of staff were laid off. The winners in the race to virtual production will be the businesses that can perform at scale, with the capital to invest in the resources required to support this industry pivot while re-invigorating their core VFX business.