Tuesday 31 October 2023

Behind the Scenes: The Killer

IBC

The precision design of David Fincher’s hit-man feature mirrors its subject.

article here

James Bond by way of B&Q is how director David Fincher conceived of The Killer, his feature adaptation of an acclaimed French graphic novel.

The character played by Michael Fassbender doesn’t have cars or gadgets provided by a state-of-the-art organisation. Instead, he shops on Amazon and in convenience stores, dresses like a tourist and creates his own low-fi means of trapping and eliminating people.

“The Killer is a spider sitting in a web,” said cinematographer Erik Messerschmidt ASC.

While the source material by Alexis Nolent and artist Luc Jacamon is quite expansive in terms of story and politics, the screenplay by Fincher and long-standing collaborator Andrew Kevin Walker (Se7en) strips things back to a more straightforward story of retribution.

When Nolent wrote the comic, one of his influences was Le Samourai, the 1967 picture directed by Jean-Pierre Melville and written by Melville and Georges Pellegrin. It starred Alain Delon as an ice-cold Parisian assassin.

Fincher also admired Melville’s movie and sent Messerschmidt a copy as a reference point.

“I’d seen it, but in film school,” the DOP said. “I watched it again and immediately understood what he was going for: that kind of patience, the idea of staying in purgatory while you’re doing the job. If you’re going to be good at anything, in the 10,000 hours thing [the notion, popularised by Malcolm Gladwell in his book ‘Outliers’, that it takes 10,000 hours of practice to become world-class at something], 9,000 of it may just be waiting.”

Messerschmidt was the gaffer on Fincher’s Gone Girl (2014), before taking the role of cinematographer for the majority of the FBI serial killer interview series Mindhunter (2017). He won his Academy Award for the black and white work on Mank (2020) and is now in high demand (he followed The Killer by shooting Ferrari with Michael Mann, and the pilot of the crime TV series Sinking Spring with Ridley Scott).

“When I sat down and talked to David he explained it’s not really about nihilism, it’s about precision and someone reconciling themselves with what they’re doing for a living,” he said.

“It is not your classic plot-driven drama. It’s much more existential and philosophical than it appears. And also about the monotony of procedure.”

The rules of the production aren’t codified as much as discussed and felt. “My experience with David is generally the creative decision-making is almost entirely instinctual,” said Messerschmidt. “The way he keeps things interesting, for himself and the rest of us, is pushing us to do something different. Not to do the pedestrian method, not cover things in the traditional way – not just in technique, but also stylistically and creatively. He encourages everyone to take chances.”

Behind the Scenes: The Killer - Location shoots

There were three main legs to the production: France, the Dominican Republic and America. With pandemic restrictions still in force in the summer of 2021, the team effectively prepped the movie from Europe, scouting in Paris and then flying from there to the DR and New Orleans (which also doubled for Florida), before traveling to Chicago and St. Charles, Illinois (which doubled for upstate New York).

The film opens in an empty office space overlooking a Parisian square, opposite a plush apartment where the first target is anticipated. The interior of the under-construction WeWork space, where Fassbender waits for his prey, was built on stage in New Orleans, as was the inside of the apartment opposite, where his target appears. They chose New Orleans partly because of French filming restrictions, hinted at by Fincher in the film’s production notes.

“The French government wants to know where you’re going with that sniper rifle with a foot-long silencer on it,” he said. “Even if it’s a fake one.”

“We shot exteriors and then built the interiors of the apartment he was firing into,” explained production designer Donald Burt (Oscar winner for The Curious Case of Benjamin Button and Mank). “They were comped into existing buildings in the style of mid-19th century Parisian architecture. We took those buildings and built our own version, to have the control we needed.”

Behind the Scenes: The Killer - Location shooting in 8K

Since shooting The Social Network (2010) on the RED One, Fincher has only used cameras manufactured by RED Digital Cinema. With each film – or TV series – he has used a different model. For The Killer, Fincher and Messerschmidt opted for the V-Raptor, which meant they could record footage in 8K resolution, with the amount of detail captured allowing maximum flexibility for any adjustments required to colour or framing.

They shot in the widescreen anamorphic aspect ratio, 2.39:1 (having shot both Mank and Mindhunter in 2.2:1). “I felt strongly we should change it up, especially as we’re going to all these different locations,” said Messerschmidt.

With rifles, cars and over-the-shoulder shots, or someone in the foreground assembling a rifle while the window across the street stays in view, the scenes continually offered up more width than height.

Each country had a slightly different look. Paris, for instance, is lit with a steel blue sodium vapor, said Messerschmidt. “And out of that came this concept of the whole movie having a dual colour, split tone of oranges and blues, or oranges and teals.”
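
As a rough illustration of the idea, and not the film’s actual grading pipeline, a split-tone grade can be sketched in a few lines: weight each pixel by its luminance, then pull shadows toward a cool tint and highlights toward a warm one. The tint values below are purely illustrative assumptions.

```python
import numpy as np

def split_tone(rgb, shadow_tint=(0.9, 1.0, 1.1), highlight_tint=(1.1, 1.0, 0.9)):
    """Push shadows toward blue/teal and highlights toward orange.

    rgb: float array in [0, 1], shape (H, W, 3).
    The tint values here are illustrative, not the film's actual grade.
    """
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])   # Rec.709 luminance
    w = luma[..., None]                               # 0 = shadow, 1 = highlight
    tint = (1 - w) * np.array(shadow_tint) + w * np.array(highlight_tint)
    return np.clip(rgb * tint, 0.0, 1.0)
```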

To make the DR feel hot and humid he used a filter to help the highlights bloom. Traditionally, this might have been achieved by using a glass filter on the camera, but the filmmakers were wary of this slowing them down. So Messerschmidt investigated other options, finding a software plug-in that could be applied in post to mimic the effects of certain diffusion filters.
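
The bloom those diffusion-emulation plug-ins produce can be approximated in the same spirit: isolate the brightest areas, blur them, and add the resulting halo back over the image. This is a minimal sketch rather than the plug-in the production used, and the threshold, radius and strength values are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bloom(rgb, threshold=0.8, radius=15, strength=0.4):
    """Crude highlight bloom: isolate bright areas, blur them, blend back.

    rgb: float array in [0, 1], shape (H, W, 3). Parameter values are
    illustrative; real diffusion-emulation plug-ins model specific glass
    filters far more precisely.
    """
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    mask = np.clip((luma - threshold) / (1 - threshold), 0, 1)[..., None]
    halo = gaussian_filter(rgb * mask, sigma=(radius, radius, 0))
    return np.clip(rgb + strength * halo, 0.0, 1.0)
```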

They also considered using handheld cameras, something which Fincher has generally avoided in his films. After tests exploring the idea of shooting handheld and then stabilising the footage in post, Fincher opted against it.

“We had lots of discussions about that at the beginning,” said Messerschmidt. “Then David came back and said, ‘Maybe I’m thinking about this the wrong way, what if we do it stable?’ So, instead, we reversed the process: shooting everything stable and then adding movement in postproduction, to reflect either the chaos of a scene or the state of the killer’s mind.

“The basic rule was when The Killer is in a state of confidence, the camera is fluid,” he adds. “And when he’s frazzled, like at the beginning when he’s thrown into this maelstrom of confusion, the camera breaks away. We were playing with when to apply that.”
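
Adding movement to locked-off footage in post generally comes down to generating a smooth, low-frequency offset per frame and translating (or warping) the image by it in the composite. The sketch below, with illustrative amplitude and frequency values rather than anything taken from the film, shows one simple way such offsets can be produced.

```python
import numpy as np

def synth_shake(n_frames, fps=24.0, amp_px=6.0, freq_hz=1.5, seed=0):
    """Generate per-frame (dx, dy) offsets that mimic handheld drift.

    Low-pass filtered noise gives the slow wander of a handheld operator;
    amp_px and freq_hz are illustrative knobs, not values from the film.
    """
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n_frames, 2))
    alpha = np.clip(freq_hz / fps, 0.0, 1.0)   # one-pole low-pass coefficient
    out = np.zeros_like(noise)
    for i in range(1, n_frames):
        out[i] = (1 - alpha) * out[i - 1] + alpha * noise[i]
    out /= np.abs(out).max() + 1e-8
    return out * amp_px   # pixels of translation to apply per frame
```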

A fight scene between The Killer and a character called The Brute (Sala Baker) in the confines of The Brute’s house required collaboration between Messerschmidt, Burt and stunt coordinator Dave Macomber.

“We had to think about how to explain the space, while simultaneously shooting a fight scene,” said Messerschmidt. “The sequence is hard, the camera is moving all over the place, the actors are moving all over the place, and it’s fast. So we have to think about how we’re going to stage it for the light.”

This meant discussions with the art department about finding sources, from lights fitted under the kitchen cabinets, to establishing streetlights outside. “We decided we wanted hard, artificial street light through the windows,” said Messerschmidt, which meant erecting lights on the exterior location to match that. “In terms of the scope of the movie, a tremendous amount of energy went into just figuring out that fight.”

Behind the Scenes: The Killer - Big Mouth Strikes Again

The conventional use of a soundtrack in a film is to segue the audience smoothly from scene to scene. You aren’t meant to notice the join – but noticing the join is exactly what Fincher wanted.

His instruction to nine-time Oscar-nominated sound designer Ren Klyce was to make it “very antisocial.” This meant experimenting with ‘vertical’ sound cuts: when a transition takes you from a very quiet shot into, for example, a street scene with a loud police siren, the audio is actually amped up rather than smoothed.

“Initially it was very hard for me to get on board with that because my instinct is to smooth everything,” said Klyce. “David wanted it to feel like there was a microphone attached to the camera and every time we’re in a new angle, that mic is picking up the sound from that perspective.”

The film’s opening provides a good example of this. “When we’re with The Killer, and he’s looking through the sniper’s scope, that has a sound. And Paris has a sound. When we are in the park, and the little boy is barely watched by his mum at the fountain, every time we cut the picture the sound of that fountain is moving in perspective to where the kid is and where The Killer is.”

Voiceover is a crucial element in putting the audience in the perspective of The Killer. It was rewritten significantly after shooting, informed by what Fassbender brought to the character and by simple practical realities.

What the character chooses to listen to – whether through the earbuds of his MP3 player, or in his hired vehicles – also provides an opportunity to give the audience a sense of who he is.

“We went through a whole process of vetting The Killer’s taste,” said Klyce. “Because we know so little about him – he barely speaks, aside from the voiceover.”

Various options were considered – from Bach to Dusty Springfield – but eventually the idea solidified around drawing all the songs from The Smiths.

The director said The Smiths had “the requisite mix of sardonic, harmonic and nihilist” adding, “What songwriters have as much fun with sinister concepts as Johnny Marr and Morrissey? We just kept coming back to The Smiths.”

Sunday 29 October 2023

Motion Capture Makes a Move Into Mainstream

NAB

Motion capture may have brought to life the Na’vi in Avatar for multi-billion dollar success but creating realistic motion is always a challenge and perfect data is a myth. What’s more, no one outside of Marvel or James Cameron has the budget for the most high-end systems or the time to work with them.

As Jon Dalzell, co-founder of the British-based developer Performit Live, explains in a webinar, traditionally there are two ways to capture motion.

You can use generic motion capture libraries, which involves searching for assets and paying a royalty fee for use. You then would need to adjust every animation for cohesion and manually keyframe any missing motion.

Or you can originate a location-based shoot, which entails everything from sourcing talent to hiring technical support, while typically waiting for days for the capture data to be processed.

All techniques, including top-of-the-range studio-based models using advanced camera tracking and markers, generate imperfect or noisy data that needs to be cleaned up.

Foot slide, for example, is a common issue where an animated character’s feet appear to slide or glide across the floor, instead of having a firm, realistic contact. This problem occurs due to inaccurate capture or translation of motion data onto the character model. It can also result from inadequate synchronization between the captured motion data and the animation rig, due to imprecise calibration.
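
A common cleanup pass for foot slide is to detect frames where a foot is plausibly in ground contact (low height, low velocity) and pin it in place for the duration of that contact. The sketch below is a simplified version of the idea under assumed thresholds; production tools would also re-solve the rest of the leg with IK so the correction doesn’t break the knee.

```python
import numpy as np

def lock_foot(foot_pos, height_thresh=0.03, speed_thresh=0.02):
    """Pin a foot joint in place while it is judged to be in ground contact.

    foot_pos: (N, 3) array of per-frame foot positions (metres, y-up).
    Thresholds are illustrative assumptions, not values from any real tool.
    """
    locked = foot_pos.copy()
    speed = np.linalg.norm(np.diff(foot_pos, axis=0), axis=1)
    anchor = None
    for i in range(1, len(foot_pos)):
        in_contact = foot_pos[i, 1] < height_thresh and speed[i - 1] < speed_thresh
        if in_contact:
            if anchor is None:
                anchor = locked[i - 1].copy()   # freeze at first contact frame
            locked[i] = anchor
        else:
            anchor = None
    return locked
```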

Facial capture technology in mocap studios involves tracking dots on the actor’s face using infrared light. Dalzell notes that the number of cameras and their proximity to the actor affect the accuracy of the subtle movements captured in 3D space.

Studios must procure specialized cameras to detect markers on the motion capture suits, converting physical movements into digital data. Commercially available camera systems can cost upwards of $250,000 — custom systems used by large studios likely cost even more.

Those suits are also expensive. They are embedded with sensors, crucial for capturing the essence of human motion. A $2,500 suit is considered cheap, with many camera-based options costing more than $15,000 each.

Alongside these, there’s the software to process and convert raw motion data for animators to work with.

“To capture your vision, you need to be able to create authentic motion,” Dalzell says. “Researching, organizing and producing motion capture takes time and money, often leaving you working against the clock.”

He claims the industry has a high attrition rate, with 90% of animators citing high stress and burnout, meaning there’s a need for more efficient and effective working processes and methods.

(That’s where Performit Live comes in. “Our platform digitally connects you seamlessly with highly skilled professional performers wearing our smart motion capture system enabling you to direct the performer through remote rehearsal, capturing the exact moves you need and downloaded to you in seconds.”)

“Wearable technology will have a renaissance with advancements in fabrics and electrical integration,” Dalzell says. “You will be able to capture motion data in any location without cables or limitations.”

Wearable technology like Performit’s can store motion data locally and upload it to the cloud when a connection is available, “allowing for unique and nuanced captures of elite performers in their real environments.”

Performit reports that they are developing technology for multi-performer remote production.

 

Saturday 28 October 2023

Majority Report: How Do We Monitor AI?

NAB

It is urgent that we regulate synthetic media and deepfakes before they undermine our faith in the truth, says Russell Wald, the director of policy for Stanford’s Institute for Human-Centered Artificial Intelligence.

article here 

“I’m concerned about synthetic media, because of what will ultimately happen to society if no one has any confidence in the veracity of what they’re seeing,” he says in an interview with Eliza Strickland at IEEE Spectrum about creating regulations that are able to cope with the rapidly evolving technology.

“You’re not going to be able to necessarily stop the creation of a lot of synthetic media but at a minimum, you can stop the amplification of it or at least put on some level of disclosure, that there is something that signals that it may not be in reality what it says it is,” he says.

The other area that Wald thinks would help in terms of overall regulation is greater transparency regarding foundation data models.

“There’s just so much data that’s been hoovered up into these models, [but] what’s going into them? What’s the architecture of the compute? Because at least if you are seeing harms come out at the back end, by having a degree of transparency, you’re going to be able to [identify the cause].”

Of calls for regulation coming from AI developers themselves, Wald is scathing: “For them, it really comes down to whether they would rather work now to be able to create some of those regulations versus avoiding reactive regulation. It’s an easier pill to swallow if they can try to shape this at this point.”

What he would really like to see is greater diversity of viewpoint in the discussions and decision-making process, not just from those in the tech industry, but from academics like himself and from lawmakers.

“Others need to have a seat at the table. Academia, civil society, people who are really taking the time to study what is the most effective regulation that still will hold industry’s feet to the fire but allow them to innovate?”

This would mitigate the risk of inherent bias in certain algorithms on which decisions in judicial systems or legal systems or medical contexts might be based.

Like many academics with knowledge of the subject, Wald calls for a balanced approach. AI does have significant upside for humans as a species, he says, pointing to the unprecedented ability of AI to sift through and test data to find solutions for diseases.

“At the same time, there’s the negative that I am truly concerned about in terms of existential risk. And that is where the human comes into play with this technology. Synthetic biology, for instance, could create agents that we cannot control. And there can be a lab leak or something that could be really terrible.”

Having given a precis of what is wrong, Wald turns to potential solutions by which we might regulate our way out of potential disaster. This is multi-pronged.

“First, I think we need more of a national strategy, part of which is ensuring that we have policymakers as informed as possible. I spend a lot of time in briefings with policymakers and you can tell the interest is growing, but we need more formalized ways of making sure that they understand all of the nuances here,” he says.

“The second part is we need infrastructure. We absolutely need a degree of infrastructure that ensures we have a wider degree of people at the table. The third part of this is talent. We’ve got to recruit talent and that means we need to really look at STEM immigration, and see what we can do because at least within the US the path for those students who can’t stay here, the visa hurdles are just too terrible. They pick up and go, for example, to Canada. We need to expand programs like the intergovernmental personnel act that can allow people who are in academia or other nonprofit research to go in and out of government and inform governments so that they’re more clear on this.”

The final piece in Wald’s argument is to adopt regulation in a systematic way. For this, he looks to the European Union, which is one of the most advanced territories in terms of formulating an AI Act. However, this is not expected to be ratified for at least another year.

“Sometimes I think that Europe can be that good side of our conscience and force the rest of the world to think about these things. This is the Brussels effect — the concept that Europe has such a large market share that they’re able to force through their rules and regulations, which are among the most stringent, and it becomes the model for the rest of the world.”

He identifies the UK’s approach to AI regulation as a potential model to follow because it seems to be more balanced in favor of innovation.

“The Brits have a proposal for an exascale computing system [to] double down on the innovation side and, where possible, do a regulatory side because they really want to see themselves as the leader. I think Europe might need to look into, as much as possible, fostering an environment that will allow for that same level of innovation.”

Wald’s concern that regulation will stifle innovation is not about protecting the larger companies, who can look after themselves, he says; it is the smaller players who might not be able to continue if the law is too stringent.

“The general public should be aware that what we’re starting to see is the tip of the iceberg,” he warns. “There’s been a lot of things that have been in labs, and I think there’s going to be just a whole lot more coming.

“I think we need to have a neutral view of saying there are some unique benefits of AI for humanity but at the same time, there are some very serious dangers. So the question is how can we police that process?”


Thursday 26 October 2023

ICVFX: Virtual Production enters 2.0

 IBC

Virtual production technology has reached a new level of sophistication and many productions are now being written with a Volume in mind. However, it is not right for all stories and some tech gremlins remain.

article here

LED volumes for the production of in-camera visual effects (ICVFX) are now a maturing technology, according to a post-IBC show report by Futuresource Consulting.

The report also concludes that virtual production (VP) and extended reality (XR) have gone past the “hype phase” of their development, with the performance of systems now being refined and the technology becoming accessible to more users.

A key factor in this is increased ease of use, with installations no longer reliant on a technical team with specific, specialised technical skills to operate the equipment. Manufacturers are now producing VP systems that are not as challenging to run, with software and firmware upgrades appearing on a regular basis to enhance the production of content on LED volumes.

ARRI Stage London is seeing this trend first hand. Filmmakers who have used an LED wall are returning with the understanding of what it can do, and significantly, are now working from scripts written for virtual production.

“Productions know they can save money using virtual production,” said Rob Payton, production specialist, ARRI Solutions. “The big shift in mindset is filmmakers are now thinking of the technology as a tool to create things they were previously unable to. The best results we see now stem from scripts written for the volume.”

The Chemical Brothers’ promo for their latest single ‘Live Again’ is a prime example. Produced by Outsider with directing duo dom&nic, the promo follows a dancer emerging from her trailer into a series of environments—with several scene transitions taking place live, within one continuous shot.

After the initial rush into VP and a period of “learning by doing”, confidence in the technology has grown. Instead of feeling confined in a tech-led space, Payton believes filmmakers now appreciate the LED volume for its creative freedom and production efficiency.

“There were some misconceptions that the whole look would be controlled by a VP supervisor and VFX,” he said. “But once cinematographers have experienced the workflow, they realise they retain complete creative control over the imagery.”

ICVFX: Virtual Production - Test and learn

Whilst some DPs might be on their second round of VP work, most filmmakers need to ‘try before they buy’ into the technology. It is why new VP spaces operated by Sony in Pinewood, by Anna Valley in West London and by kit hire firm CVP in Fitzrovia are primarily for training purposes. They give producers and DPs an idea of what to expect and let them test out scenarios.

“Systems are more interoperable with each other than they were a couple of years ago but issues do remain,” said Callum Buckley, Technical Consultant, CVP. “We can provide guidance for users to help them understand whether product ‘A’ works well with product ‘B’ and we can advise on whether their budget is being spent wisely.”

“We see it as a fantastic tool but it is not the be all and end all for a production,” reported Anthony Gannon, COO, Garden Studios which has hosted over 85 productions at its Volume stages ranging from adverts to multi-cam drama. “It’s not as expensive as it was and the market has become more competitive.”

The rate card for a day on Garden’s Volume stage is around £10K which also includes the studio’s full service. It will advise on whether a project is actually more suitable for a conventional studio. “It’s important to understand what is realistically achievable,” said Gannon. “VP has to be fit for purpose.”

ICVFX: Virtual Production - Fit for purpose

Volume work is understood to be hugely beneficial for shooting scenes featuring moving cars where the virtual backgrounds and additional physical lighting can deliver realistic reflections on the car surface or interior.

Examples include the neon-drenched driving sequences in Marlowe (2022) and the equally neon-drenched car work in Blonde (2022), albeit shot in black and white. Mank, another black and white shoot, also employed virtual production for its driving scenes. However, the majority of these films were shot on regular sound stages or on location.

If the project is VFX heavy, like the Chemical Brothers’ video, a Volume can work wonders in terms of efficiently corralling all the VFX into one space for the director, DP and acting talent to see what they are producing, live.

With 75% of its action taking place in mid-air, the production of the Apple TV+ mini-series Hijack made extensive use of VP. The drama was shot on four Volume stages in the UK, all operated by Lux Machina and featuring a combination of LED configurations. This show could arguably not have been made with the degree of authenticity it has without VP.

By the same token, extensive use of a Volume has its own aesthetic and, despite its undoubted gains in realism over green screen, it can’t yet replicate real-world situations as accurately as being there.

Some critics have noted the relative lack of scale in the lightsaber duels of recent Star Wars TV productions and chalk that up to the limitations of the Wall.

Star Wars: Andor is an exception. Perhaps the best live-action Star Wars spin-off since The Mandalorian, it made a virtue of shooting conventionally rather than using ILM’s StageCraft volume.

“We definitely discussed [using virtual production] but decided it did not lend itself to what we were trying to do,” John Gilroy explained. “Virtual production frees you up in some ways and it limits you in others. In the production design and look we wanted to go more realistic and therefore to shoot in a more old-fashioned way.”

Cues to the look of Andor came from Star Wars: Rogue One, which Gilroy edited for director Gareth Edwards. The same lived-in look using real-world locations was prioritised by Edwards for his latest sci-fi feature The Creator.

“There was a little bit of Volume work at Pinewood but very low,” Edwards admitted [as quoted in this Frame Voyager video]. “And if you do the maths, if you keep the crew small enough, the theory was that with the cost of building a [single, conventional] set which is typically like $200,000 - you can fly everyone anywhere in the world for that kind of money.”

Numerous VFX houses were contracted to work on The Creator, including ILM, Unit, Folks VFX and Outpost VFX, using a method that flipped recent VFX workflows on their head. Edwards shot the material first on location then retrofitted backgrounds, sci-fi craft and AI robots to achieve a more organic and less expensive method of production.

“We’re not saying that there wasn’t work done with the backgrounds, but VFX did not have to fully digitally recreate every scene,” he said. “This allowed more effort to be put into making the robots of this world feel even more real because locations, props and characters are already in the shot. Often there was no additional work or relatively minimal labour needed in finalising a character or environment.”

Production wasn’t all plain sailing on Hijack. Some issues with lighting in the Volume needed finessing in post, according to VFX Supervisor Steve Begg.

“Being a TV show, it was shot in a mad hurry (i.e. with no testing time) and although everyone was initially quite happy with the results, after closer scrutiny we saw all sorts of issues that needed fixing. Lots!”

Scenes featuring a Eurofighter cockpit in the 270-degree Volume on a motion base proved most problematic.

“I’d anticipated we’d have problems with the reflections in the visors so we had them high-rez scanned in order to get a good track and replace everything in them with CG versions of the cockpit and the pilots’ arms and sky,” Begg explained. “The moment the reflections were sorted the shots really started to come together, with added high-speed vapor on top of the Volume sky along with a high-frequency shake. I stopped them doing it in-camera as I had a feeling we’d be fixing these shots, big time. If we had, they would have been a bigger nightmare to track.”

That said, Hijack is an ambitious series that would probably not have been attempted a couple of years ago and was made by filmmakers pushing the tech to its limits.

ICVFX: Virtual Production - Lighting Evolution

Achieving accurate colour, especially for skin tones, is a problem that bedevils cinematographers working in Volumes but manufacturers are coming up with solutions to tackle it.

Most LED displays follow the standard RGB format, but new panels are being released that incorporate a fourth emitter of white LED. ROE has one such product called RGBW which it claims offers greater colour accuracy and minimises the amount of colour correction required in post.

A rival technology was launched by INFiLED at IBC. Branded ‘Infinite Colors’ the innovation is said to improve a variety of LED applications by allowing full variations in tone, saturation, and colour appearance in white light and custom colours.

LED lighting manufacturers are also moving in this direction. The MIMIK tiles from Kino Flo incorporate additional white emitters as well as RGB and are typically used as a reflective surface, with the key difference that you can also play back video content through them for real-life reflections. CVP has a set at its Great Titchfield test facility.

“Traditional LED panels won’t light skin tones well, their Colour Rendering Index is basically zero in the pinks whereas with Kinoflo you can light costume and skin tone correctly because they afford a much more balanced colour profile,” said Buckley. “The difference is staggering when you see it in the flesh.”

Another downside of a Volume is that it can’t yet mimic the sun, however good the rendering quality.

“It’s not hard light, it’s not a million miles away. It’s not a point source,” said ROE Visual R&D Manager Tucker Downs in a tutorial on colour science. “And if you’re thinking about how a streetlight casts light down, you have a shadow, well, you can’t recreate that in virtual production. A Volume can only really provide the global illumination and indirect illumination.

“Could we get a fully spectrally managed colour pipeline and virtual production to do spectral rendering with distributed rendering technologies? I think these are the things that you can look forward to and we should push for in the next ten years.”

ICVFX: Virtual Production - GenAI

AI-driven software Cuebric enables filmmakers to dream up camera-ready media in seconds, for playback in a volume.

“Five percent to 20% of budgets today goes into reshoots because even when you’re working with green screen or virtual production, reshoots are often the only way to achieve the director’s intent,” said Pinar Seyhan Demirdag, Co-Founder of Cuebric developer Seyhan Lee.

They calculate that for a medium size picture, reshoots cost a production $375,000 on average. For bigger budget shows that rises to an astonishing $24 million.

“If we were to save the industry even a fraction of that using AI it could help funnel those funds back into creativity and cut out unnecessary labour,” Demirdag said.

Further out, generative adversarial networks (GANs) could be used to achieve higher quality footage. French postproduction giant Technicolor is testing this and also using ChatGPT for concept ideation.

Technicolor’s labs are testing machine learning to standardise the more repetitive tasks like rotoscoping and exploring how AI tools like ChatGPT, Cuebric and Nvidia’s Omniverse might apply to games engines and virtual production.

“We’re using Stable Diffusion and Midjourney quite a bit for content creation and ideation moments,” explained Mariana Acosta, SVP, Global Virtual Production and On-set Services, Technicolor. “We’ve created a few TV spots that actually use Runway to generate backgrounds, including one for Salesforce which very much played into the AI aesthetic.”

Olan Collardy: Filmmaking with empathy

IBC

Cinematographer Olan Collardy talks to IBC365 about his approach to lighting diverse skin tones on screen. 

article here

Filmmakers would like to believe they have empathy with their subject but the degree to which they actually do when committing stories to screen is being questioned.

Recently, IBC365 covered the work of Digital Melanin (DMC), a project that seeks to challenge the way cinema’s de facto camera and printing technology represents skin colours on screen.

For cinematographer Olan Collardy the approach is less about tools and technology and more a mindset of taking care that not everyone on screen receives the same default treatment.

“Anyone should be able to tell a story about characters from a different culture to them but the onus is on the filmmaker to do due diligence in ensuring they empathise with the culture whose story they are telling,” said Collardy.

Until recently, image making, whether on film or in painting, has not been democratic, he contends. “It has been controlled by a small group of people who decided what stories to tell and how to tell them. It is only since digital cinematography that anyone could get their hands on a camera to take a picture of themselves and say ‘this is how I like to look’. Empathy means listening to people when they say ‘this is how I like my skin tone to look on camera’.”

Collardy, an exciting British talent who shot the vibrant and playful feature rom-com Rye Lane, explained how he likes to work.

“I want to ensure that any subject looks at the image I’ve created with pride,” he said. “That means collaborating with all departments to make sure you are respectful to your characters as the script mandates.”

That might mean working with make-up artists to tone a skin tone down or apply more oil to the skin to make it more reflective.

Collardy noted that darker skin tends to be more reflective than Caucasian skin, and when light shines on it this can cause specular highlights or glaring patches.

“As a black person I know how another black person on set must feel. Everyone has vulnerabilities and insecurities and the last thing an actor needs is for a producer or director to say ‘so and so is looking a bit dark, can we put some light on them?’.

“An actor will feel like there’s something wrong with them. Instead, if you feel someone is lit too darkly, it’s better to finish that take and have a discussion about it sensitively.”

However, it’s a misconception that when you light a darker skinned person you need more light.

Collardy said: “It is okay to embrace shadows with dark skin as long as there is information and shape in the image. You need more shape to ensure that you’re not losing the contour of faces. That’s what makes any face look good. The last thing you do is bring a light and just blast it into actors’ faces because then the image will look washed out.”

“We all have different tones of colour,” Collardy said. “Caucasian skin tones are not the same. Some are pale, some more reddish, others peachy. Some folk with darker skin tones have redder tones or exhibit more blue in the shadows. Every filmmaker comes to a story with different ideas for how to best photograph the people in it.

“It would be a disservice to artistry in general to dictate any particular style,” he insisted. “For example, when shooting dark skin tones there is no mandate to use warm light. That is one person’s philosophy. It always comes down to empathy. Does this lighting work for the character in the story and for the human actor? Am I doing them justice by keeping the shape of their face?

“There are many films shot by white DoPs and white directors where black people look amazing. All I’m saying is that the process should be about doing research and ensuring you have integrity in your work and that you are not shoehorning some process you have heard about somewhere into your process. It still has to be authentic.”

Collardy said he tends to get asked to shoot ‘black stories’. He said the script treatment will often call for a visual style for skin tones to either look like Moonlight, Barry Jenkins’ 2016 Oscar hit photographed by James Laxton ASC, or to look like HBO comedy drama Insecure.

“There’s a danger in putting a particular way of shooting something on a pedestal,” he stated. “When Moonlight and Insecure came out everyone tried to regurgitate that look.”

One technique used on Insecure was to shape the actors’ faces using rotating polarisers, which filter the light in such a way as to reduce reflections.

“It was a new and fresh look we’d not seen on TV that made black people look delicious. Then every DP started using polarisers which homogenised the process.”

On Rye Lane, Collardy worked with production designer Anna Rhodes to ensure there was a certain contrast between the characters and the south London they inhabit.

“If there’s a darker skin tone subject let’s make sure we don’t put them in a dark background and in a dark T-shirt. If we put them in a green jumper, against a wall which has a complementary colour, then things will start to look amazing.”

He often used extreme wide-angle lenses paired with a Bronze Glimmerglass filter. With Jack McGinity, colourist at facility Cheat, he ensured skin tones had a filmic softness to them.

“Rye Lane is alive with colour,” said Collardy. “We shifted the colours around, perhaps gave one scene a green hue, just something that doesn’t feel too real or neutral.”

Lighting Genius: MLK/X

British cinematographer Trevor Forrest recently shot episodes of Genius: MLK/X, a forthcoming Disney+ dramatisation about the relationship between Martin Luther King Jr. and Malcolm X.

He explained that gaffer Justin Dickson was vital to his approach to lighting.

“I’m a white guy from Wells-next-the-sea who only brings privilege,” Forrest said. “Justin had grown up in the Baptist South. When you bring someone onboard with that depth of subject matter there will be a fizzing of energy.”

Dickson explained, “My roots go back to Mississippi. My uncle marched with Martin Luther King Jr. My grandfather picked cotton. My great grandmother married a slave. When you are brought up like that you understand that education is different when you have to fight for it. A bus ride feels different when you have to fight for it. You understand that somebody came before me so I could sit in this seat. So, you bring that mood and that intuition and what was instilled in you at birth to everything you do and especially a project like this.”

He said he talked to his mother during the project to understand more of what her lived experience was like in order to convey that to Forrest. DP and gaffer also visited the Mississippi Museum of Art to look at prints of the period where they learned that many portraits had their subject lit by the natural light of a window.

“Skin tones are complex,” Forrest said. “Black skin has a reflection, then there are flesh tones underneath and then there are the characteristics that any individual has. There is a huge range of skin tones in MLK/X but our motivation was to plug emotion into the lighting.”

Dickson confirmed: “With some stories it is less about the technical aspects of lighting a scene or a character so much as about feeling it. That feeling is what we are trying to translate.”

Wednesday 25 October 2023

Pretty/Scary: Cinematographer Aaron Morton on “No One Will Save You”

NAB

Home invasion movie No One Will Save You can be added to the low budget horror film renaissance (think Huesera: The Bone Woman, Barbarian, M3GAN, Talk To Me) and was the most streamed film across all platforms in the US when it was released last month.

article here

Made for $22.8 million, the Hulu original is an almost wordless thriller in which Brynn (played by Booksmart’s Kaitlyn Dever), a young woman living alone as a seamstress in the countryside, fights back against alien invaders.

“He didn’t tell me about the lack of dialogue before sending me the script,” says cinematographer Aaron Morton (Black Mirror: Bandersnatch; The Lord of the Rings: The Rings of Power) of receiving the project from writer-director Brian Duffield. “Reading it for the first time it dawns on you.”

He adds, “It sounds counter-intuitive given the lack of dialogue, but the way the script conveyed the tension and terror that the characters feel did so much work to help us understand the approach to this film.

“One phrase we had throughout prep was that horror can be beautiful. We tried to make a beautiful film that was scary.”

Duffield, who wrote 2020 monster adventure Love and Monsters and 2020 sci-fi horror Underwater, is drawn to ideas that smash two things together that don’t necessarily go together.

“We talked a lot about if Todd Haynes was making Far From Heaven and if aliens invaded in the middle of it what that would feel like,” Duffield commented on the set of No One Will Save You in a video featurette.

It’s the kind of thing he brings to a lot of his scripts, says Morton, a New Zealander who previously lensed a film for the director about teenagers spontaneously exploding.

“Spontaneous (2020) was really a lovely love story between two kids who happened to be in a situation where their friends are literally exploding next to them. Brian loves smashing two disparate situations together and seeing what comes of it.”

He continues, “What is clever about No One Will Save You is that while we are learning about what’s happening to the world in terms of the alien invasion we’re also being drip fed information about Brynn’s character and what has happened to her in her life.”

While the script has a somewhat conventional narrative drive from set piece to set piece, what is missing, compared with a more traditional film, is coverage: the over-the-shoulder, reverse-shot grammar of two people having a conversation.

“We’re still using filmmaking conventions but making sure we have an awareness all the time that we were ticking the boxes for the audience in terms of what they were learning about story and character,” Morton explains.

“We knew we didn’t want to treat the lack of dialogue as a gimmick. It was just a by-product of the situation that Brynn found herself in. Our aim was not to draw attention to that device in the movie while being very aware of how much extra weight the images needed to carry in a ‘show, don’t tell’ sort of way. The nature of the film relies on the pictures doing a little bit of extra work.”

The most challenging aspect of the shoot for Morton was lighting the aliens. “The light is part of how they control humans,” he says. “We definitely wanted to be reminded of Close Encounters which did inspire a lot of what we were doing in our movie.”

This includes leaning into the classic tractor beam trope of being sucked into the mothership. They were lighting some reasonably large areas of swamp and night exteriors of Brynn’s house, often using cranes with moving lights. These were powerful Proteus Maximus fixtures from Elation Lighting, more commonly found at rock concerts.

“The moving alien lighting is built into the exterior night lighting. For instance, when Brynn is walking up the road in a forest, the forest is lit and suddenly a beam is on her and she gets sucked up into spaceship. We had the camera on a crane, and we had ambient ‘forest’ lighting on a crane so I could move that ambient lighting with her as she walked. We had another crane with the big alien light that stops her in her tracks. So it was this ballet of things happening outside the frame.”

He continues, “I love using moving lights (combined with the right sensor) because you can be so accurate in changing the color temp by a few degrees, even putting in Gobos to change the beam size, using all the things moving lights are great for in live events and putting them into the cinema world.”

The lighting design also gave the filmmakers another way to show the audience things that Brynn does not see. “She could leave a room and just as she turns away we play a light across a window just to remind people that the aliens are right there though she’s not aware of it.”

Since the color red plays a particularly important role in depicting the alien presence, Morton tested various cameras before selecting the Sony VENICE.

“You can quickly over-saturate red, green and blue colors with certain cameras, so it’s a big piece of the puzzle that you can figure out early. We felt the color science of the VENICE was incredible in terms of capturing that red.”

They shot anamorphic using a set of Caldwell Chameleon Primes. “What I also like about the VENICE is that you can chop and change the sensor size. The Chameleons cover the large format size very well but not perfectly so it meant I could tailor which part of the sensor we were using depending on which lens we were on.”

Elaborating on this he says, “You can very easily go from a Super 35 size 4K sensor on the VENICE to 6K using the full horizontal width of the sensor so sometimes, if I needed a bit more width in the room in a certain situation and what we were framing was forgiving enough, I could use a wide lens in 6K mode and not worry about the distortion we were getting because we’re using every inch of the image circle.”
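
The arithmetic behind that choice is simply angle of view: for the same lens, a wider sensor scan gives a wider horizontal field. A rough calculation is sketched below, using approximate, illustrative sensor widths rather than exact VENICE specifications.

```python
import math

def horizontal_fov(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view for a given sensor width and focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Approximate scan widths in mm (illustrative figures, not official specs).
MODES = [("Super 35 4K", 24.3), ("Full-width 6K", 36.0)]

for label, width in MODES:
    print(f"{label}: {horizontal_fov(width, 35):.1f} deg with a 35 mm lens")
```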

The production shot in New Orleans but is not location specific. “It’s Generica,” says Morton, who commends the Louisiana crew.

“We ran three camera bodies all the time with two full teams so we could be shooting with two and prepping another camera then leapfrogging one to the next.”

Morton has somewhat of a horror film pedigree having photographed 2013’s Evil Dead remake, the “Metalhead” episode of Black Mirror (arguably the series’ bleakest story) and The First Omen, directed by Arkasha Stevenson for 20th Century Studios starring Bill Nighy, which is scheduled for imminent release.

He was in the middle of shooting a feature for Universal in Dublin when the strikes hit and still has three weeks of work on that project to complete, for which he is using a package of Alexa 35 and Cooke S4 sphericals.

“I love going job to job and approaching each with fresh eyes and being given a clean slate. Whilst I welcome giving my opinion on what approach would work, I’d much rather have a two-way back and forth with a director about what they think so we can get the best story to screen.”

Reaction

With a big full-orchestra score from Joseph Trapanese, the film shows “Duffield admirably trying to turn his Hulu budget into Amblin production value,” reviewed Benjamin Lee at The Guardian. “It’s got the feel of a real movie, the highest compliment one can give right now to a film designed for streaming.”

Forbes critic Mark Hughes said, “The film is expertly paced and gets endless mileage from its premise. It’s one part Signs, one part Body Snatchers, and a whole lot of Cronenberg, all of which I love independently and am thrilled to see combined here.”

He adds, “the largely dialogue-free approach might sound gimmicky or distracting, but it’s neither and highlights how well written, directed, and performed the whole thing is.”

Others were more negative. Here’s Sam Adams at Slate: “[It] could have been a spectacularly scary short film. Instead, it’s a movie that starts off with an incredibly strong premise and sense of itself, and then squanders nearly all of it in a scattershot middle and confounding conclusion.”

 


Wednesday 18 October 2023

EMG handles ‘A comet tail of complexity’ for NFL London games

SVG Europe

Two NFL games came to London earlier this month, with the Buffalo Bills facing off against the Jacksonville Jaguars on 8 October, before the Tennessee Titans took on the Baltimore Ravens at Tottenham Hotspur’s purpose-built NFL stadium on 15 October.

article here

Apart from the vibe of the fans, which is different in Europe, the NFL international games, which this year were held in London and Germany, are a mirror of a regular season American Football Conference match from a broadcast point of view, so facilities providers and production teams have to bring their A game.

“There’s a comet tail of complexity to the operation,” says Bill Morris, Emmy Award-winning technical consultant and technical supervisor for NFL London games, speaking ahead of the Titans game. “We don’t just do a major full traditional OB onsite but a completely remote production for the wrap around shows. It is that combination which makes NFL Europe games unusually complex.”

Morris has been involved in the NFL Europe games since their launch in 2007, first at CTV, then at EMG and latterly as head of his own consultancy BMTV, which Morris dubs as “broadcast enablers” in terms of managing a lot of the planning to facilitate the smooth running of each live show onsite.

In Europe, EMG predominantly works with two major clients, Fox and CBS, depending on NFL rights deals. This year two of the three London games have landed with CBS, both held at the Tottenham Hotspur Stadium (the third game was held at Wembley Stadium on 1 October). CBS is producing the game within an envelope for NFL Network.

“This means NFL Network (NFLN) produces the wrap around pre-game, half-time and post-game show and CBS produces the actual game as part of that show. It is aired on both networks but it’s an NFLN show with NFLN talent,” Morris explains.

This follows the same pattern from 2022, which was a Fox show again within an NFLN envelope, but differs from previous years when the European games were done by unilateral broadcasters, CBS, Fox or ESPN (which produced this season’s Jaguars vs Falcons game from Wembley in concert with NEP in continuation of a relationship established in the US).

EMG’s “soup to nuts” provision on site in N17 ranges from accommodation for all production crew to complex transmission delivery.

“It’s a highly populated OB compound at Tottenham but geographically quite small,” says Morris. “We’ve had to think outside the box in terms of how we can accommodate all production needs.”

EMG has built “a small town” of stacked cabins and brought its own power generation units, and an enormous amount of tertiary kit for production needs. EMG’s Nova 111 is the lead truck supported by smaller units. The main gallery is in Nova 111, but EMG builds bespoke galleries for graphics and transmission in cabins which it fits out with flypack systems.

The scale of the game is on a par with a regular AFC Championship Game and as such attracts a raft of senior broadcaster execs to the venue who need to monitor and make editorial decisions onsite. To facilitate this, EMG builds them separate galleries too.

“There’s a lot of editorial work on site in terms of shaping the show, what needs to be seen and analysis,” he explains. “The broadcasters have a team of execs who are ultimately responsible for delivery so we give them access to all the sources. They have the ability to route monitoring sources to their own monitoring stack and we build a working environment where they can monitor the game in detail and have their own editorial discussions in a quiet environment rather than in the noisy cut and thrust of the main gallery.”

NFLN opens the show with an external presentation position looking at the stadium, and moves into the stadium with a custom-built NFL-branded field set built by EMG’s contractor. Gradually the show moves via a reporter position pitchside up to the booth, which also has a custom desk manufactured by EMG with construction partners Trans Sport.

Camera and format

Last year was the last interlaced production; coverage has since shifted to 1080p50 SDR. “Progressive elevates the look without doubt while bringing us into line with the US broadcast norm,” Morris says. “And that’s straightforward because EMG, along with all the major OB companies, has the ability to shoot ‘P’ since we’re all UHD capable (of which progressive is a byproduct).”

This year’s European games have more cameras than ever before. This game features 30 coverage cameras along with five presentation cams.

In addition, there are robotics on each goal provided by EMG’s specialist camera unit ACS; Spidercam, which is provided by the NFL itself; and POV cams capturing the team entrance to the field.

This year, for the first time, coverage includes Pylon Cams. The orange Pylons at each corner of the field are mounted with two cameras: one has a fixed standard lens, while the other carries a 360 lens whose feed is produced by US developer C360. C360 supplies a baseband camera output from each pylon and the analyst camera.

“Within that 360 camera they are able to zoom, pan, tilt and identify action within the frame. It’s a really good system, it’s intuitive and it hit air nine times during the last game for analysis,” Morris says.

Another brand new piece of tech is the RF ‘line to game’ cameras which are designed, installed and operated by C360 in collaboration with EMG Connectivity. Operators carrying small orange Pylons move with the 1st and Ten line.

“As the action progresses down the field these RF Pylons follow to provide analysis on the line of action,” he continues. “These were used extensively at last week’s game and provide a level of analysis and level of detail of strategy we’ve not seen before in Europe.”

A Steadicam carrying a Sony FX9 and a prime film lens provides beauty shots from the sidelines, offered up for EVS or cut live.

“I was sceptical about this until I saw it and we’re all fans now,” Morris says. “We used it on the ESPN games last year for the first time and it gives incredible shallow depth of field. It does present a tough challenge for the operator because the line of focus is so small, but it elevates the game with that filmic look.”

Also new this year (for Europe) is the virtual 1st and Ten line (FDL) that tracks down the field as the game progresses. This is inserted locally but operated from New York “with no discernable latency”.

Morris explains: “Unlike in the US, games in Europe have unilateral broadcasters hanging onto our output. We have to produce the FDL onsite in order to pass a ‘dirty’ output to our local unilaterals.”

The main European broadcasters for this round are RTL Germany, M6 (France), Saudi TV, ITV Sport (swapping with Sky Sports for this game) and DAZN. They take the main host feed and, to a lesser or greater degree, add their own talent and presentation.

More on 1st and Ten

The FDL was traditionally produced by SMT, initially by devising tripod heads capable of calibrating the field and by modifying the long lenses so that data was fed into the trucks to identify the FDL, which is of course a moving target.

“Now the mapping is done virtually with no modifications to the kit – which is great from a facilities company point of view,” Morris says. “We were always a little hesitant when a third party starts dismantling your lenses and providing different heads. The calibration time was also a protracted process but in the virtual world everything is much, much quicker. It is a crisper and more stable 1st and ten and it is not labour intensive.

“We’ve gone from a team of 2-3 crew on site to one operator on site just to guarantee the hardware, though in honesty the facilities provider could do this task as part of the service.”
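
Conceptually, a virtual first-down line is a graphic defined in field coordinates, projected into the camera image via a calibrated homography, and composited only over pixels that look like grass so that players and officials occlude it. The sketch below illustrates the projection and masking steps only; the function names and thresholds are assumptions, not any broadcast vendor’s actual system.

```python
import numpy as np

def project_yard_line(H, yard_x, field_width=53.3, samples=50):
    """Map a yard line from field coordinates into image pixels.

    H: 3x3 homography (field coordinates -> image pixels) obtained from
    camera calibration. Names and defaults are illustrative only.
    """
    ys = np.linspace(0.0, field_width, samples)
    pts = np.stack([np.full_like(ys, yard_x), ys, np.ones_like(ys)], axis=1)
    img = (H @ pts.T).T
    return img[:, :2] / img[:, 2:3]   # pixel coordinates along the line

def grass_mask(rgb):
    """Very rough 'is this pixel grass?' test so players occlude the graphic."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (g > r) & (g > b)
```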

If the workflow for the FDL operates on the same principles as remote production – labour and travel saving efficiencies – then why is more of the production also not remoted back to the US?

This is because game production is too complex for full remote. “The amount of individual feeds needing to be sent back to the States would exceed any reasonable amount of bandwidth,” says Morris.

He conservatively estimates a minimum of 100 feeds would need to go to LA. “It is a very, very well honed but complex operation. It takes the technical directors who run the switcher at least a day to build the show. Unlike European soccer events where a vision mixer cuts on behalf of the director with graphics inputs, NFL is so graphic heavy, multiple replay dependent, that the amount of feeds just exceed any reasonable budget.”

In terms of graphics, there’s no change to the operation of the basic font and score. The four-strong CBS/NFLN graphics team is in charge, using standard Vizrt machines and three main graphics engines over and above the 1st and Ten engine and the Pylon engines onsite.

NFLN wrap production

But that’s just production of game action. A whole other setup is arranged for NFLN’s pre-, half-time and post-game programming and this is a full remote operation. It’s why just one main OB is deployed in Europe whereas an NFL game stateside would have two.

“Camera feeds with embedded audio go straight to TX where they are multiplexed and sent to the NFLN gallery in LA where the show is stitched. This comes with a comet tail of complexity in terms of transmitting multiple feeds, multiple comms between LA and site, and returns back from LA to monitor what is going on,” he explains.

The Tottenham Hotspur Stadium itself is a ground-breaking, purpose-built NFL field, and one that Morris is proud to have had a hand in, having designed the cable, connectivity and IO overlay for the NFL. The stadium is contracted to host a minimum of two games per year over a 10-year partnership.

“It is up there as one of the best stadia across both Europe and the States,” he says. “The unique thing about Tottenham is that it has a football turf pitch above the NFL astroturf – which this year is brand new. It even has a pair of dedicated NFL locker rooms which only get used twice a year.

“All the naming of the various areas are in NFL parlance so, for a US NFL operator, when they walk into that stadium it’s home. It feels, looks and is labelled as a US stadium.”