Wednesday, 12 July 2023

Even the Industry’s Pioneers Are Still Working Out Virtual Production

NAB

We’ve only just scratched the surface of how to use light from virtual production displays, says Sam Nicholson, founder and CEO of Stargate Studios.

article here

Lighting the scene using LED screens is a big topic of discussion in the unions right now, he says, because the technology is neither purely a light nor purely a screen: it plays back images but can also illuminate the scene.

“I’m not going to take sides in that argument. But that’s something that we’re dealing with both playback and lighting right now, and trying to define them. It’s a new technology that’s right in between, nobody knows what to do with it.”

Nicholson is one of Hollywood’s leading virtual production and visual effects creatives. With nearly forty years as a visual effects supervisor, DP, director, and now virtual production supervisor, Nicholson and his company, Stargate Studios, have combined the latest in LED technology with their proprietary “ThruView” lighting system.

Interviewed at the 2023 NAB Show by Erik Weaver, director of ETC’s Adaptive Production project, Nicholson explained that VP is the process of capturing the real world and making it usable in such a way that it’ll play back in a volume.

“Virtual production is fabulous. [But] if you can afford to go to Rome to shoot live action, and get some great pasta, and have a great time, you know, go to Rome. If you can’t, then send a small team to capture Rome, the Vatican, and bring the data back [to your volume], and now you can control the situation.”

In his presentation, Nicholson talked about his journey with creating in-camera VFX, including working with the legendary Doug Trumbull on Star Trek: The Motion Picture. He described creating the effect of a 60-foot-high column of light shot in-camera and in real time on stage for director Robert Wise.

 

He worked through the green screen period, which he described as one where “basically the crew would get a lobotomy going in. Nobody knows what it’s going to look like. How do you light it? How do you light for daylight if you can’t see daylight behind me? Where’s the sun? Green screen was very messy, because all the actors didn’t know which way to look. Really bad for the actors and very difficult for a director of photography, and very frustrating for the director.”

He later worked on the groundbreaking TV series 24, elements of which were shot on green screen.

“It was kind of a game changer, because all of a sudden, they said it’s a lot cheaper to bring the location to the actors than taking the actors to the location. It’s very difficult to shoot in Washington, DC when you’ve got to get a permit from [multiple] groups to shoot anywhere.

“With green screen the actors didn’t have to be out all night. But dammit, we hate being on green screen. I mean, we had Kiefer Sutherland, like, walk off the set because he hated shooting on green.”

Virtual production solves a lot of these problems, giving actors a cue as to the environment they are in. But it is still very early days in the technology’s development.

Covering dialogue scenes with a shallow depth of field on a longer lens is ideal, but the wider the lens the more difficult it gets.

Perhaps the biggest challenge is that virtual production isn’t as flexible as you might be led to believe. If a director changes their mind about a shot after the event, it remains difficult to fix in post because the backgrounds are baked in.

“If you have a director who doesn’t know what they want or you’re on a short prep schedule, don’t try to do virtual production because you’ll get burned. Be really aware that you don’t have an alpha channel. There is no matte. So you’re gonna wind up with a big old rotoscoping bill if you change your mind.”

His advice? Prep, prep and prep. “Virtual production is not a panacea. It’s a great new tool that does certain things like reflective objects really well. But it does other things horribly, like changing your mind.”


Tuesday, 11 July 2023

It’s Definitely a New Media Experience in the MSG Sphere

NAB

More than just another giant screen or an upgrade to 4D cinema, the latest Las Vegas attraction is being touted as a new experiential entertainment format.

article here

“We are redefining the future of entertainment through Sphere,” MSG Entertainment executive chairman and CEO James L. Dolan says. “Sphere provides a new medium for directors, artists, and brands to create multi-sensory storytelling experiences that cannot be seen or told anywhere else.”

“This will be a quantum leap forward in the sense of what a concert can be,” U2’s The Edge told Andy Greene at Rolling Stone. “Because the screen is so high-res and so immersive, we can actually change your perception of the shape of the venue. It’s a new genre of immersive experience, and a new art form.”

U2 will be the opening act for the Sphere on September 29, the first of a largely sold out 25-date residency running through the end of the year.

The 366-foot-tall, 516-foot-wide dome aims to reinvent every aspect of the live event experience. It is the culmination of seven years of work, with a budget that reportedly stretched to $2.3 billion.

Virtual reality without the goggles was the elevator pitch, MSG Ventures CEO David Dibble recalls to Rolling Stone.

“We thought, ‘Wouldn’t it be great to have VR experiences without those damn goggles?’ That’s what the Sphere is,” says Dibble.

It had to have the world’s highest resolution screen, and so it does at 16K by 16K. There is no commercial camera capable of recording at that resolution without having to stitch together images from a camera array. So MSG Entertainment built its own camera system and a whole postproduction workflow, which together comprise a system it calls Big Sky.

Naturally, the Big Sky single-lens camera boasts a 316-megapixel sensor capable of a 40x resolution increase over 4K cameras. It can capture content at up to 120 frames per second in its 18K square format, and at even higher frame rates at lower resolutions. The team designed a custom media recorder to capture all that data, including uncompressed RAW footage at 30 gigabytes per second, with each 32-terabyte media magazine holding approximately 17 minutes of footage.
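Those figures hang together; a quick back-of-the-envelope check in Python (assuming decimal units and the constant 30 GB/s RAW rate quoted) lands on the stated magazine duration:

    # Sanity check of Big Sky's quoted magazine capacity.
    # Assumes decimal units (1 TB = 1000 GB) and a constant 30 GB/s RAW rate.
    data_rate_gb_per_s = 30        # uncompressed RAW, as quoted
    magazine_tb = 32               # capacity per media magazine
    seconds = magazine_tb * 1000 / data_rate_gb_per_s
    print(f"{seconds:.0f} s = {seconds / 60:.1f} minutes")
    # -> 1067 s = 17.8 minutes, matching the "approximately 17 minutes" figure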

According to David Crewe at Petapixel, who saw the tech first hand, “since the entire system was built in-house, the team at Sphere Studios had to build their own image processing software specifically for Big Sky that utilizes GPU-accelerated RAW processing to make the workflows of capturing and delivering the content to the Sphere screen practical and efficient. Through the use of proxy editing, a standard laptop can be used, connected to the custom media decks to view and edit the footage with practically zero lag.”

Specialist lenses have been built, too, including one with a 150-degree field of view, which is true to the view of the Sphere where the content will be projected, and another with a 165-degree field of view designed for “overshoot and stabilization,” particularly useful when the camera is in rapid motion or in a helicopter.

The venue also houses a 164,000-speaker audio system that can isolate specific sounds, or even limit them to certain parts of the audience. This was designed by German start-up Holoplot following investment in the company by MSG.

According to Rolling Stone, the patented audio technology they created allows them to beam waves of sound wherever they want within the venue in stunningly precise fashion. This would allow, for example, one section of an audience to hear a movie in Spanish, and another side to hear it in English, without any bleed-through whatsoever, almost like fans are wearing headphones. “It can also isolate instruments,” says Dibble. “You can have acoustics in one area, and percussion in another.”
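Holoplot’s algorithms are proprietary, but the underlying idea of steering sound from a dense speaker array is classic delay-and-sum beamforming: delay each speaker so that all the wavefronts arrive at a chosen point together. A minimal sketch, with purely illustrative geometry:

    import numpy as np

    # Minimal delay-and-sum beam steering sketch: the general principle behind
    # directing sound from a speaker array. Holoplot's actual processing is
    # proprietary; the array layout and target here are invented.
    SPEED_OF_SOUND = 343.0  # m/s at room temperature

    def focus_delays(speaker_positions, target):
        """Per-speaker delays (s) so wavefronts arrive at `target` together."""
        distances = np.linalg.norm(speaker_positions - target, axis=1)
        travel_times = distances / SPEED_OF_SOUND
        # Delay the closer speakers so every emission arrives simultaneously.
        return travel_times.max() - travel_times

    # A 16-element line array at 10 cm spacing, focused on a seat 20 m away
    # and 3 m off-axis.
    speakers = np.column_stack([np.arange(16) * 0.10, np.zeros(16), np.zeros(16)])
    seat = np.array([3.0, 20.0, 0.0])
    print(np.round(focus_delays(speakers, seat) * 1000, 3))  # delays in ms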

The venue can seat 17,600 people, and 10,000 of them will be in specially designed chairs with built-in haptics and variable amplitudes: Each seat is essentially a low-frequency speaker. There’s also the option to shoot cold air, hot air, wind, and even aromas into the faces of fans.

“There’s a noise-dampening system that we used in the nozzles of our air-delivery system that NASA found really interesting,” Dibble tells Rolling Stone. “They were like, ‘Do you mind if we adapted that for the space program?’ We went, ‘No, knock yourself out.’”

Director Darren Aronofsky (The Fountain, The Whale) was commissioned to shoot Postcard From Earth, the first piece of cinematic content for the Sphere, with the Big Sky camera wielded by Oscar-nominated cinematographer Matthew Libatique.

“At its best, cinema is an immersive medium that transports the audience out of their regular life,” Aronofsky told The Hollywood Reporter. “The Sphere is an attempt to dial up that immersion.”

He added, “Like anything, there are some things that Sphere works particularly well with and others that present new problems to solve. As different artists play with it, I’m sure they’ll find innovative ways to use it and affect audiences in different ways.”

He adds, “We just recently figured out how to shoot with macro lenses and we filmed a praying mantis resting on a branch. Imagine what that may feel like when we present it 20 stories high.”

The venue could house events like mixed martial arts and will also be a centerpiece of the Formula One Grand Prix in October. MSG has announced plans to build similar venues in London and elsewhere.

It is too early to say but perhaps the highly bespoke nature of the venue and the workflow required to produce “experiences” for it may work against it. Will the technology prove more restrictive than flexible?

The Edge made this comment to Rolling Stone: “Unfortunately, because of the amount of time and expense in creating some of these set pieces visually, it’s quite hard to be as quick on our feet and spontaneous as we might have been on other tours.

“But we still are determined that there will be sections of the show that will be open to spontaneity.”

 


Behind the Scenes: Wimbledon, The Championships

IBC

The Championships, Wimbledon 2023 may feature more UHD and High Dynamic Range coverage than ever this year, but it’s the volley of editorial firepower being served up that is the real story of the Championships’ technical production.

article here

“We know millions of people are watching the linear output but many are also doing so while on their phones, so we’ve designed and expanded coverage to fulfil a diverse range of audience needs,” said Georgina Green, Broadcast and Production Manager, Wimbledon. “There is so much going on at Wimbledon across both weeks that we essentially aim to get as much content out to everybody as quickly as possible.”

SW19 is dormant from a broadcast point of view outside of the annual fortnight, but the team at the All England Lawn Tennis Club’s (AELTC) in-house host broadcast division, Wimbledon Broadcast Services (WBS), spends the full year on development. The team is led by Paul Davies, who has all but lost his voice when IBC is invited to meet them in the middle of the Championships.

That’s okay though, since Davies has more than capable deputies in Green and Broadcast Technical Manager, James Muir. “Wimbledon is the pinnacle of the sport and we aim to match that as the host broadcast service, whether that’s the technical quality of pictures, quality of picture in the edit, the camera angles we offer, or new mic positions,” Muir said. “It’s about making it as good as it can be, whether that’s for a broadcaster in Australia or a smaller digital-centric rights holder in Asia.”

The value of Wimbledon to the AELTC as a business was laid bare when Covid cancelled the Championships in 2020, shrinking revenue to just £3.8 million, compared to a £292m turnover in 2018/19. Armed with new contracts, including with ESPN until 2035, Australian streaming service Stan Sport, and the BBC extended until 2027, the Club returned to a £288m revenue pot for the last financial year (the next report, for 2022, is due at the end of this month).

To maximise the value of rights holders’ investments and to ensure that bids for the next round of rights remain keen, the onus is on WBS to keep pace with changing audience demand. That means more attention to digital and social media, but it also means more behind-the-scenes coverage. Sports documentaries like the Netflix series Break Point have created mainstream audience expectation for stories beyond the action on the court.

“The onus is on us to facilitate that access for rights holders and make sure we can give them as much as they want but with an element of control and trust on behalf of the Club so that they [the AELTC] can see the benefit of it,” said Muir.

This manifests itself most obviously in a new pool called Access All England produced alongside the World Feed’s on-court action and made available to rights holders to cherry pick from for their own presentation.

It features, for example, new camera positions and audio of the players arriving at Wimbledon (via tunnel) and more of their journey throughout the day from practice court to lunch to locker room. This is bolstered by a wealth of beauty cams and ENG crews wandering among the summer-dressed throngs of the Wimbledon complex. It also includes footage and interviews of broadcast personnel inside the Media and Broadcast Centre.

Whisper is Wimbledon’s new production partner for this year and next, signed in part because of its track record in producing sport with a bias toward entertainment as much as action. It is producing the World Feed, international highlights, a creative preview film and an official film of The Championships, as well as Access All England.

Wimbledon Threads, produced by Whisper, is a strand of stories looking at fashion and clothing. The Purple Carpet is a series of short interviews with celebrities and those in the Royal Box. Second Serve is a second take on the day using different camera angles.

“We’re producing all that in broadcast quality putting it up on (MAM network) Mediabank and making it available in all formats - 16x9, 9x16, 1:1,” said Green. “Our digital team are making sure that what they shoot (on mobile phones) goes into Mediabank for use in socials. We’re working more collaboratively with digital and marketing teams to bring it all together.”

There’s even metaverse-style activation to attract the next generation, who will be weaned on games not BBC One. These include an online branded Fortnite race game featuring Andy Murray, and a Roblox experience.

One measure of success will be ratings. Last year’s men’s final, Novak Djokovic vs Nick Kyrgios, peaked on BBC One at 7.5m and was also streamed live 2.6m times on iPlayer and BBC Sport online. In addition, the volume of hours consumed by audiences on TV was the highest since 2016, the year Murray lifted the title.

“Getting through it cleanly is always the priority,” said Muir. “Then there’s broadcaster feedback, and so far it is very positive about what we are producing for them. Another indicator is whether, when there are big significant moments, we covered them correctly with the right editorial. We spend a year planning and there are always learnings to take away and improve for next year.”

Two flagship courts in UHD HDR

Last year Centre Court was the only court in UHD HDR and this was processed as a separate workflow to protect the HD feed. The big change this year is that No.1 Court is also UHD HDR, handled within a single workflow for all formats. The 16 other courts continue to be available in either 1080p HDR or 1080p SDR.

Sam Broadfoot, Technical Project Manager, NEP UK, commented: “With the need to continue to offer rights holders HD SDR feeds as we have in previous years, we’re now using 224 channels of conversion, but we’re working in just one workflow and it means the setup of our trucks is similar.

“We’ve also found that the quality of the converted SDR feeds has improved as well, since the material is now being captured with a higher dynamic range.”
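The article doesn’t detail NEP’s conversion chain, but Broadfoot’s point is the general one that an SDR feed derived from an HDR capture can roll highlights off smoothly rather than clipping them. A minimal sketch of the idea using a simple Reinhard-style tone curve (illustrative only, not NEP’s actual processing):

    import numpy as np

    # Why an HDR-originated SDR feed can look better: highlights from the
    # wider captured range are rolled off smoothly instead of clipping.
    def hdr_to_sdr(linear, white_point=4.0):
        """Roll off scene-linear HDR values so `white_point` maps to 1.0."""
        x = np.asarray(linear, dtype=float)
        tone_mapped = x * (1 + x / white_point**2) / (1 + x)
        return np.clip(tone_mapped, 0.0, 1.0)

    # Middle grey, reference white and a highlight two stops over white:
    # the highlight is compressed into range rather than clipping hard.
    print(hdr_to_sdr([0.18, 1.0, 4.0]))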

The increase in UHD-HDR feeds is only part of NEP Group’s full global production ecosystem in play at The Championships. NEP’s full suite of solutions includes broadcast facilities and OB trucks, connectivity, live display and other broadcast services supporting the World Feed.

Mediabank, NEP’s MAM solution, is used for remote access to match highlights and other content to be ingested, managed and distributed for rights holders.

NEP Connect is providing a 10G link to Oslo from IMG, with further support from NEP Netherlands, which is supplying 1PB of onsite storage. Additional broadcast services from NEP include 36 EVS VIA machines, 58 host Sony cameras, 150 talkback panels and over 90 km of cable installed each year. More than 300 NEP broadcast engineers, technicians and crew members are onsite supporting the host broadcast and other rights holders.

Robotic cameras

Seven courts are equipped with the automatic camera system Tr-Ace, from NEP division Fletcher. Tr-Ace cameras use image recognition and LIDAR to automatically track players on the court, meaning a single operator can control and manage the system for all seven courts.
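Fletcher hasn’t published how Tr-Ace works internally, but the core geometry of any auto-tracking head is converting a tracked position (here, hypothetically fused from LIDAR and image recognition) into pan and tilt angles:

    import math

    # Textbook geometry only: turn a tracked player position into pan/tilt
    # angles for a robotic head. Not a description of Tr-Ace's internals.
    def pan_tilt(camera_xyz, player_xyz):
        dx = player_xyz[0] - camera_xyz[0]   # across the court
        dy = player_xyz[1] - camera_xyz[1]   # depth axis, toward the court
        dz = player_xyz[2] - camera_xyz[2]   # height
        pan = math.degrees(math.atan2(dx, dy))
        tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
        return pan, tilt

    # Camera 6 m up behind the baseline; player near the far service line.
    print(pan_tilt((0.0, 0.0, 6.0), (3.0, 18.0, 1.0)))  # ≈ (9.5°, -15.3°)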

Aerial Camera Systems (ACS) is supplying 46 specialist cameras to WBS, eight more than last year. These are mostly Sony HDC-P50s with SMARTheads. The new areas covered are the players’ arrivals area and some behind-the-scenes shots.

“It is important to WBS that each court looks exactly the same from its technical coverage,” explained Matt Coyd, Sales Director, ACS.

Centre Court features five robotic cameras, including one on a 10m baseline track sitting behind the players and tracking their horizontal movement. It is built into a special hide which is hard to spot on air.

Centre also has two compact cameras, one for each player, fitted discreetly to the Umpire’s chair, plus remotes at camera position 11 and in the northeast corner of the stadium.

No.1 Court is the same minus the NE corner remote, No.2 has two positions, and most of the other courts have at least one robotic camera taking a wide master from a high pole or the side of a building, all with bespoke mounts.

There are also two more track systems: a 25m run on the southern court and a 36m run on the broadcast centre roof covering the northern courts.

Various robotic SMARTheads capture beauty shots from the trophy balcony and clubhouse (which sports a 100:1 box lens), the players’ balcony, a crowd cam and even a ‘flower cam’ – the latter among those in UHD HDR. The press conference area also has a P50.

The practice area is also covered with robotic systems enabling rights holders to provide live coverage of players warming up.

Serving new data

Joining established data gathering and analytics partners SMT (scoring), Hawkeye (ball tracking) and IBM (ball speed and AI-driven highlights compilations) is a new addition, TennisViz, part of sports analytics company Ellipse Data, which is also home to the CricViz and SoccerViz apps.

It ingests the raw ball and player tracking data from Hawkeye and turns it around in less than a second into a range of new data points and insights that it claims have never been available before.

It does this for every point in every match and is used to support the TV broadcast and digital media coverage and, separately, to provide granular analysis for players and coaches.

One of the new metrics is Shots Played In Attack, a key aspect of the game for which there has never been an objective measure calculated in real time, according to Thomas Corrie, Head of Performance and lead analyst, TennisViz.

“Deciding whether a player is in attack or defence is not as simple as pinpointing their court position,” Corrie, a former LTA coach, explained. “The opponent’s position - left, right, forward or back - needs to be accounted for as well as the quality of the ball received.

“If I’m striking a ball and you are out wide then that gives me an advantage in that point,” he elaborated. “It’s not as simple as saying that if you are up the court you are in attack. It’s about the contact point at which you play.

“We consider the quality of the incoming shot because even if I am inside the service line playing a volley, if I pick that ball up from my toes I am defending even though I am at the net. Or I could be playing a volley at the net but on the stretch.”

A Conversion Score shows the percentage of time when a player is in attack that they go on and win the point. “You won’t necessarily win if you are the more aggressive player, so you have to be clinical and convert those chances,” Corrie explained.

This can be correlated by another metric, the Steal Score, showing when points are won when the player is in defence.

“The average for Steal Score in the gentlemen’s draw is 31%; for the ladies’ draw it is 33% of points. But some players – like Alcaraz, Djokovic and Swiatek – win approximately 40-42% of points in defence. That will appear on screen and it will be the commentators’ job to educate the audience that it is not normal for a player to win above a third of their total points when in defence.”
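From the definitions given, both metrics reduce to a win percentage over points filtered by the attack/defence label. A minimal sketch with a hypothetical record format (TennisViz derives the attack/defence label itself from Hawkeye tracking data):

    # Conversion Score = share of points won when in attack;
    # Steal Score = share of points won when defending.
    # The point records below are invented for illustration.
    points = [
        {"state": "attack",  "won": True},
        {"state": "attack",  "won": False},
        {"state": "defence", "won": True},
        {"state": "defence", "won": False},
        {"state": "defence", "won": False},
    ]

    def score(points, state):
        relevant = [p for p in points if p["state"] == state]
        return 100 * sum(p["won"] for p in relevant) / len(relevant)

    print(f"Conversion Score: {score(points, 'attack'):.0f}%")   # 50%
    print(f"Steal Score: {score(points, 'defence'):.0f}%")       # 33%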

TennisViz also measures shot quality. It does this by breaking each shot down into dozens of data points, from the basic serve, return, forehand and backhand to the speed, height and spin of the ball as it crosses the net, its depth into court, its width and its bounce angle. It records this for every shot hit and the impact it has on the opponent, and the algorithm offers up an instant score out of ten.

“A dropshot is not measured against the same quality parameters as a forehand drive, for example,” Corrie said. “Different types of forehand shot are also measured differently to each other and in context of the impact on the other player. The game has different nuances and this is reflected in the score.”

Every shot is aggregated so that over the course of the match stats can provide information about the quality of any type of shot.

TennisViz algorithms take account of different playing surfaces. The Wimbledon application is trained on 5 million shots from the last two Wimbledon championships. The information and insights are presented as lower third graphics on screen but the next stage, perhaps for 2024, is to use the data points to build CG highlights to be used in pre- and post-game production.

Behind the Scenes: Wimbledon 2023 - BBC

Major rights holders the BBC and ESPN essentially take the rushes of the World Feed but apply a generous serving of their own presentational cream.

The BBC has over 70 vision feeds produced by WBS available in its NEP-supplied production truck and supplements this with its own jib-cam behind Court 18 for those sweeping shots over Henman Hill towards St Mary’s Church. It is fielding two ENG crews with radio-cams to reflect more of the atmosphere of the event outside of the courts.

In this endeavour they are aided by new lead presenter Clare Balding. The BBC’s main studio position is in the Broadcast Centre. Three other positions are deployed for instance for weather forecasts and crowd colour.

The BBC has also made a change to its highlights format this year. This was traditionally a live programme that tended to get delayed in the schedule or not broadcast at all because priority was given to late finishing live matches.

This year’s hour-long highlights show is post-produced onsite, transmitting every night at 9pm, and is also available on iPlayer and the Red Button to guarantee viewers can see it.

 

Monday, 10 July 2023

So What Does Everyone Else Think About AI? (It’s the Beginning or the End or Both)

NAB

AI is out in the wild and being used most extensively for creative experiments, according to a new survey.

article here

People are generating music and videos, creating stories, and tinkering with photos using free AI engines like ChatGPT.

Above all, people have simply been using AI systems to answer questions — suggesting chatbots like ChatGPT, Bing, and Bard may replace search engines, for better or worse.

The report, “Hope, Fear and AI,” from The Verge and Vox Media, polled 2,000 Americans about their attitudes toward artificial intelligence.

One finding is particularly clear: AI is expanding what people can create. In every category polled, people who used AI said they used these systems to make something they couldn’t otherwise, with artwork being the most popular category within these creative fields.

“This makes sense given that AI image generators are much more advanced than tools that create audio or video,” note the survey authors.

There is general awareness of the ethical issues around AI and art, but less clarity about what to do about it. For example, most people think artists should get compensated when an AI tool clones their style, but a majority also don’t want these capabilities to be limited. Indeed, almost half of respondents said they’d tested the system by generating art in the style or voice of a writer, artist or other well known figure.

A new survey from The Verge and Vox Media shows broad support for regulations on AI. Cr: The Verge

More than three-quarters of respondents agreed with the statement: “Regulations and laws need to be developed regarding the development of AI.”

These laws are currently in the works, with the EU AI Act working its way through final negotiations and the US recently holding hearings to develop its own legal framework.

 

The report highlights that there’s strong demand for higher standards in AI systems and disclosure of their use. Strong majorities are in favor of labeling AI-generated deepfakes, for example. But many principles with wide support would be difficult to enforce, including training AI language models on fact-checked data and banning deepfakes of people made without their consent.

The use of generative AI tools doesn’t stretch much beyond experimentation at this stage and in fact only one in three survey respondents have used them. When they do, brand recognition for ChatGPT is highest, though few people are familiar with the companies and startups behind the tools.

That said, people have high expectations for AI’s impact on the world — beyond those of other emergent technologies. Nearly three-quarters of respondents said AI will have a large or moderate impact on society. That’s compared to 69% for electric vehicles and a paltry 34% for NFTs.

More than 60% of survey participants predicted job losses as a result of AI and other societal dangers, including threats to privacy and government (ranked at 68%) and corporate misuse (67%).

“These dangers are weighted more heavily than potential positive applications, like new medical treatments (51%) and economic empowerment (51%). And when asked how they feel about the potential impact on their personal and professional life and on society more generally, people are pretty evenly split between worried and excited. Most often, they’re both.”

Fifty-six percent of respondents think “people will develop emotional relationships with AI,” and roughly half expect that a sentient AI will emerge at some point in the future (two-thirds don’t have an issue with companies trying to make one).

Yet, nearly 40% think that AI will wipe out human civilization.

Perhaps that’s why more people are worried than not.


Tuesday, 4 July 2023

DMC: A Pan Tone for Cinema

IBC

The efforts made to accurately represent diverse skin tones on screen are finally breaking through in the form of Digital Melanin Cinematography (DMC).

article here

From the roots of its invention, the fabric of cinema has contained bias. Stemming from a deliberate decision to prioritise the aesthetic beauty of Caucasian skin over darker skin tones in the chemical composition of colour film stocks, the accuracy of how non-white people look on screen has largely gone unchecked. The bias is also ingrained in digital cinema systems, perpetuating the false assumption that darker skin tones require more light or are harder to film.

A group of South African filmmakers and scientists aim to change that by creating a new universally accepted standard in the approach to filming, photographing and processing melanin rich skin.

“This inherent bias has turned film into a political weapon in that [film] was never made for non-white communities,” said Mandla Dube, director and cinematographer (Silverton Siege) who has pioneered Digital Melanin Cinematography (DMC) with fellow filmmaker Ndumiso Mnguni.

DMC, they explain, is a study of how the appearance of the skin of people of the African and Indian diaspora is affected in media. The results of its research are expected to be presented in a white paper at IBC this year. One of its aims is to counterbalance the weight of R&D that has led to prevailing standards for capturing skin tones on film.

“The film industry has been around for more than 100 years, during which people have oriented around a body of knowledge,” Dube explained. “But [non-Caucasians] were not active in producing film stock. Film was expensive and not easily accessible, limiting representation and how we could portray ourselves in our own stories. When the industry migrated to digital that research simply didn’t exist. The status quo continued.”

DMC: Improving on manufacturer LUTs

They have partnered with South Africa’s Council for Scientific and Industrial Research (CSIR) to devise a tool that cinematographers can use to more accurately calibrate a digital sensor to photograph skin tones of all hues. Digital cine cameras with 17 stops or more of dynamic range and wide colour spaces should have the sensitivity to capture any skin tone, yet when lensing their own work Dube and Mnguni avoid using manufacturer LUTs (the default Look Up Tables that come with digital cine cameras).

“We’ve found they don’t necessarily map around African skin tones accurately. So, we’re building up our own profile. One of our attempts is to build a colour chart that can sample skin tones at a higher rate and help the camera render a wide variety of skin tones.”
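Digital Melanin’s tool is still unpublished, but the standard technique an expanded skin-tone chart enables is fitting a colour-correction matrix by least squares, so that recorded patch values land on measured references. A sketch with invented patch data:

    import numpy as np

    # Fit a 3x3 colour-correction matrix from chart patches: the classic
    # calibration an expanded skin-tone chart would feed. All patch values
    # below are invented for illustration.
    recorded = np.array([    # camera RGB for each chart patch (linear)
        [0.32, 0.21, 0.16],
        [0.45, 0.30, 0.22],
        [0.18, 0.11, 0.08],
        [0.60, 0.42, 0.33],
    ])
    reference = np.array([   # measured ground-truth RGB for the same patches
        [0.35, 0.22, 0.15],
        [0.48, 0.31, 0.21],
        [0.20, 0.12, 0.07],
        [0.63, 0.44, 0.31],
    ])
    ccm, *_ = np.linalg.lstsq(recorded, reference, rcond=None)  # (3, 3)
    print(np.round(recorded @ ccm, 3))  # corrected values approach reference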

Skin is the largest organ of our bodies and something all of us are very conscious of. It provides us with important non-verbal information such as perceived ideas of age, health, and cultural background.

“Understanding how pigment is created and perceived through the human experience gives the study of melanin cinematography a foundation before translating into something that sensors can understand and reproduce accurately and beautifully,” he said.

They have conducted tests with different camera systems and compiled a database for the software which could be applied on different projects.

“We are using multiple data sources,” said Mnguni. “Some we think are good skin tone renditions and others are bad. It’s important not to get just one perspective, which would echo one’s own bias, but to achieve a universal sense of agreement with a larger sample. But where is the data going to come from, as far as digital melanin skin tone is concerned, unless we feed it?”

The filmmakers hope SMPTE and The Academy of Motion Picture Arts and Sciences will take notice. The Academy’s ACES colour image encoding system is itself nearly a decade old and is arguably due for an upgrade. Discussions would also be important within British and American cinematographer societies.

DMC: Industry collaboration is key

Reference monitors would also need calibrating to the same standard in order for cinematographers and colourists to see the results of their work. Ideally, all display panels from TVs to smartphones would also be built to take account of digital melanin’s skin tone accuracy.

“Clearly, this is not going to happen overnight,” said Dube. “We will need the collaboration of engineers, filmmakers and studios globally if we are going to be intentional about changing the rendering of all skin tones.”

There has been positive feedback from some camera and grading systems manufacturers including Sony and Blackmagic though none has yet adopted Digital Melanin’s schema into their product.

Google made improvements to the camera on its Pixel smartphone in 2021 to better capture darker skin tones. ‘Real Tone’ ensures that the auto white balance in photos is improved, so that darker skin tones don’t look paler than in real life. Announcing the development, Google said it would focus on making images of people of colour “more beautiful and more accurate.”
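Google hasn’t published Real Tone’s internals, but the failure mode it addresses is easy to show with the classic grey-world auto white balance assumption, that the scene averages to neutral, which pushes skin that dominates the frame toward grey. Values are illustrative only:

    import numpy as np

    # Naive grey-world auto white balance: scale each channel so the image
    # mean becomes neutral. When a warm skin tone fills the frame, the
    # "correction" drains the warmth, paling darker skin.
    def grey_world(image):
        means = image.reshape(-1, 3).mean(axis=0)
        return image * (means.mean() / means)

    frame = np.full((4, 4, 3), [0.45, 0.30, 0.22])  # warm tone fills frame
    print(grey_world(frame)[0, 0])  # channels forced equal, i.e. grey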

Just as important as technology change for Dube and Mnguni is to provoke dialogue among the cinema community in the hope of establishing best practices for the curation of images that respect all skin tones.

“We’re not saying that African skin has never been rendered beautifully through cinema history,” said Dube, “but we are saying there’s a lack of consistency of standards and that we have a framework that will yield the best results. Digital Melanin is a chance to interject new information into the curation of the digital negative and become part of the growing body of knowledge in film.”

They argue that when cinematographers prepare to shoot a project with the intent of skin tone accuracy it is a process of trial and error. The proposed DMC tool would give filmmakers a starting point that removes the guesswork.

“We’re not taking away from the filmmaker’s own creative process but giving them a good basis of truth from which to begin,” said Mnguni.

DMC: Positive progress to date

Their work has already made an impact. Sony Pictures’ The Woman King was shot in 2021 in South Africa by Polly Morgan ASC BSC with Digital Melanin’s guidance.

“Historically some filmmakers haven’t done the best job in lighting properly for dark skin,” Morgan explained. “From a light meter to a camera sensor, every exposure tool [cinematographers use] is based on 18% middle grey. Since 18% grey is matched with lighter skin tones, when you are exposing, everything is related to Caucasian skin. The aim of Digital Melanin is to help cinematographers everywhere to light dark skin with accurate tonality. It is important for everybody to have this conversation and not be shy of shooting black skin or nervous about broaching this topic.”
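Morgan’s point about 18% grey can be made concrete: exposure tools place every tone by its log2 ratio to middle grey, so a meter reading drags all skin toward that single anchor. A worked example with illustrative reflectance values:

    import math

    # A tone's photographic placement is its log2 ratio to 18% middle grey.
    # Reflectance values are illustrative, not a claim about any real skin.
    def stops_from_middle_grey(reflectance, middle_grey=0.18):
        return math.log2(reflectance / middle_grey)

    for label, refl in [("brighter skin", 0.36), ("middle grey", 0.18),
                        ("darker skin", 0.09)]:
        print(f"{label}: {stops_from_middle_grey(refl):+.1f} stops")
    # brighter skin: +1.0, middle grey: +0.0, darker skin: -1.0 — every
    # reading is referred back to the same 0-stop anchor.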

Netflix is also supporting Digital Melanin’s initiative. The streamer has invested $160 million in African originals since 2016 including commissioning Dube’s Silverton Siege.

“Netflix is talking to us about how to integrate our ethos into work shot for Netflix Africa,” said Dube. “Naturally, a lot of content shot here now mostly features African skin tones in front of camera.”

The ‘Digital Melanin’ approach doesn’t just concern lighting but encompasses a production-wide embrace of wanting to see people correctly on screen. Make-up artists, for example, have a role to play in learning about the best application for different skin tones.

“How people are portrayed in cinema has a profound effect on how people see themselves,” said Dube. “Growing up I was watching Caucasian people rendered in a way that looked beautiful and larger than life. That has an effect on your self-esteem. How someone looks aesthetically on camera can have a huge impact on how they perceive their sense of self-worth. This is gradually starting to change and we want to show how we can be part of moving that further upstream. This is the philosophy of DMC.”

Other notable cinematographers like Tommy Maddox-Upshaw, ASC (White Men Can’t Jump) are making vocal and visual statements about this nuanced and global approach to tonality.

“When I look at shows, a lot of the Black folks look monochromatic and that’s not right. I have four sisters and we are all different hues of brown. I am a different complexion to my daughter.”

He extended this sensibility universally. “People in the UK have a different skin pigmentation from Caucasians in South Africa or the Mediterranean. All digital cameras interpret skin tones a certain way but my take is that I should be the one in control of manipulating skin tone if I want to.”

Cinematographers, encouraged by film schools, will still often take their cues from the classics of European art. Caravaggio, Rembrandt and Vermeer are revered for their lighting, true, but also their portraits of white skin tones.

“What does that say to me as a black cinematographer when working with African skin tones?” asked Mnguni. “Which painter do I refer to as a baseline?”

Inherent bias

Historically, the industry standard for capturing images was centred around ‘Shirley cards’, which were used to calibrate skin tones and light. They only featured Caucasian people up until the 1970s, when photographers shooting commercials raised concerns over not being able to capture the ranges and variations of colour found in wood and chocolate.

Shirley cards were distributed by Eastman Kodak to photo labs and featured photos of “similar-looking alabastrine, brunette white women,” according to Kaitlyn McNab, writing for Allure.

“Shirley became the standard for colour correction, the yardstick used for processing by technicians, and now, a symbol of the skin colour bias deeply rooted within the world of photography,” she writes.

While FujiFilm began to alter its colour stocks to better reproduce Asian skin tones, the R&D from Kodak, the world’s dominant film stock manufacturer, persisted into the transition to digital.

 

Dion Beebe ACS ASC / The Little Mermaid

British Cinematographer

The Little Mermaid is a complicated, VFX-heavy production in which the main challenge was to blend mostly CGI-created underwater sequences with the largely live-action shoot for above-water scenes.

article here

The latest in Walt Disney Pictures’ live-action retellings of its 2D animated classics is among its most complex productions yet. The Little Mermaid is a tale of two halves: one largely live-action shoot above the water and the other largely CGI creation underwater. Finding a way to shoot, light and blend those story worlds preoccupied director Rob Marshall (Chicago, Pirates of the Caribbean: On Stranger Tides) and cinematographer Dion Beebe ACS ASC (Chicago, Gemini Man) in extensive prep for the movie.

 “This was a complicated story to tell,” says Beebe. “We spent a good year trying to figure out the methodology. There are a lot of ways to tackle a movie like this and a lot of technology to support these ideas but as with any project, for me it’s about the story and the director’s preferences and strengths. What do they tend to lean into and what sort of technology will they best respond to?

“Since Rob’s background is theatre and dance and The Little Mermaid is a musical, timing is everything for him, especially in the musical numbers. We needed a methodology that would give him control over all live action elements to best serve his vision for the project.”

They were days out from starting principal photography in March 2020 when COVID forced production to shut down. They didn’t recommence rolling at Pinewood Studios until that December, but the extra months were used to dive deeper into previz and boards.

“Because the complexity of what we were doing was so tech-heavy, having that additional time working remotely meant we were very prepared when principal photography began.”

Aiming to protect the live action as much as possible, Beebe and Marshall agreed to film actors including Halle Bailey, Javier Bardem and Melissa McCarthy in rigs against blue screen rather than use performance capture.

“Our actors could interact with each other, and we could control timing, choreograph their movement and how the camera moved around them. We were shooting as much of that as we could live knowing we were only extracting torso, faces and arms and that the lower half of the body would be tails.”

Three-pronged approach

To Beebe the film was effectively divided into three types of production: the above-water world, “a fairly straightforward shoot with fantasy period elements”; below the sea, “where we embraced a lot of VFX”; and a third mode in which CG characters Flounder, Sebastian and Scuttle were folded into the live-action plates.

“Because this incorporated puppeteering on set it was like another type of film. We’d film Ariel with the puppeteers and pre-recorded voice dialogue in those sequences so we could control reactions, eyelines, and timing.”

Most work was devoted to blending the story transitions between ocean and land. Shooting on the large canvas of the ARRI Alexa 65, Beebe felt that Hawk anamorphics were right for above-water shots. Below the surface he switched to spherical Leica Thalia primes.

“I felt it important that the VFX team had the ability to control depth under the water. If I’m shooting anamorphic wide open and my depth of field drops off at the ear of a character in the foreground then adding another character into that frame, particularly one that has dialogue, would be harder in anamorphic.”

Preproduction involved research and tests, for the look of the underwater sequences in particular. Beebe studied BBC documentary Blue Planet for references to how underwater photography should look in terms of visibility, depth of field and colour.

“We spent a lot of time developing a template for how water would look at different depths, building in ideas of what particles are in the water, water density and the fall off from depth of field we wanted.”

In this endeavour he recruited DI colourist Michael Hatzer, whose collaboration with Marshall and Beebe began on Into The Woods (2014) and continued with Mary Poppins Returns (2018). His work with Beebe predates this; they first met when Hatzer colour timed the science fiction film Equilibrium (2002). Now at Picture Shop in Hollywood, Hatzer was invited by Beebe to join The Little Mermaid early in production for screen tests for the main cast, including Halle Bailey.

“We shot a lot of hair, make-up and wardrobe tests with Rob and Dion at Pinewood,” Hatzer explains. “Throughout pre-production the driving aim was to figure out how we were going to combine the above-water material with the underwater shots. We concluded that we needed two different workflows for two different colour spaces.”

They had to take account of the colour spectrum, which changes when filming underwater and varies with depth.

“When Ariel is at her shelf area underwater with Scuttle and Flounder we know that is close to the surface and it is quite clear and bright,” Beebe says. “In King Triton’s domain Atlantica it’s a little cooler in the shadows and visibility starts to lessen a bit. When we dive deep into Ursula’s world the red spectrum is largely eliminated and we’re into deep blues and purples. So, Michael and I developed ideas of how we’d tackle each realm in the underwater world.”

Creative collaborations

The red hue of hair chosen for Ariel is deliberately darker than in the original animation. In conjunction with VFX, hairstylist Camille Friend commissioned three custom shades of golden-orange, 30-inch hair extensions to encase Bailey’s own locks.

The DP adds, “I will always shoot tests in prep and then bring Michael into that process so I can take the test through the DI and really start to look at ideas for final colour.”

This also included tests for above water scenes such as in the castle, seat of Prince Eric’s kingdom, where there’s a lot of candlelight and firelight sources.

Consequently, they devised an ACES colour space for the underwater sequences – a pipeline the VFX vendors preferred – while above-water material was handled in LogC3, the wide-gamut log encoding recorded by ARRI cameras.

“We had to integrate and blend the colour space of the two worlds before the grade,” says Hatzer, who built an ACES workflow that did just that in Baselight at Picture Shop. “Once we figured out how to do that we gave Dion and the VFX companies different LUTs to work with. I would give Dion the option to use the LUT so that his DIT could apply it on set and that way we’d arrive at dailies which were pretty close to what they wanted.”

Beebe picks up the story, “For the underwater world we developed looks in terms of how cool we wanted to be in the shadows, the density of water. As I move into different scenes I could apply the most appropriate LUT to the sequence. These looks would then be applied to dailies.  Knowing we had a long process of VFX and post to go through before it was finalised the tests were available for the VFX team to reference for colour and movement.”
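The first step of the blend Hatzer describes is decoding LogC3 material to scene-linear before it can share a space with the ACES-managed underwater work. ARRI publishes the curve; this sketch uses the standard EI800 constants (the full ACES path also requires a gamut transform, omitted here):

    # ARRI's published LogC3 (EI800) decode from code value to scene-linear.
    # A sketch of the first step in blending LogC3 material with ACES work.
    CUT, A, B = 0.010591, 5.555556, 0.052272
    C, D = 0.247190, 0.385537
    E, F = 5.367655, 0.092809

    def logc3_to_linear(t):
        """Decode a LogC3 (EI800) code value to scene-linear."""
        if t > E * CUT + F:
            return (10 ** ((t - D) / C) - B) / A
        return (t - F) / E

    print(round(logc3_to_linear(0.391), 3))  # ≈ 0.18, i.e. middle grey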

Lighting the underwater sequences was another challenge, not least because the filmmakers wanted to bake practical light into scenes featuring performance.

“We knew we wanted movement in the light,” Beebe says. “Underwater is never static, the characters are never static, there’s always movement of people. It’s like floating in space and the camera is moving on multiple axes all the time. Whether day or night the underwater is still lit from above.”

Water trays are often used to illuminate subjects above ground and near water. These sit on the floor, sometimes with broken mirrors in them, and a light skipped off the surface reflects back a ripple effect.

In this case though, the characters were going to be below the surface and at different depths for much of the time. With gaffer Dave Smith, Beebe devised a tray system that could work from above.

Trays with a clear Perspex bottom were rigged 30ft up in the ceiling of the stage. Moving theatrical lights were directed through the water to subjects below “as if lit through a giant liquid filter,” Beebe describes.

The SFX team then designed a way for gaffer and DP to stir and control the water’s motion remotely from the stage floor. Multiple trays were hung to cover multiple characters. “You could disturb the water and shine a light through it to emulate the movement of caustic light underwater. We could change the effect as we moved between the different realms.”

Sardinian sequence

One particular live-action sequence was shot day for night on location in Sardinia. In the story, Prince Eric returns, at night, to the beach where Ariel had rescued him. Filming this involved testing many weeks ahead of photography to discern the best time of day.

“The beach we chose had a rock outcrop that we really wanted as a signature feature so the audience could instantly locate themselves in the story,” says Beebe. “In the scene Eric is lost in thought, looking out at the ocean. The idea of trying to light a huge night sequence on a beach where we had to feel the ocean’s presence wasn’t realistic, even to shoot at dusk. So, I proposed to Rob we do it day for night.

“I’ve done quite a lot of day for night, for example on Gemini Man with Ang Lee and I know you have to be considerate of any practical sources and the sun direction. Because of all the VFX enhancements you can add to it, day for night is a very useful approach when you’re in these huge environments that you can’t possibly light. It allows you to shoot with a lot of depth, see a lot of landscape and still create the idea of a night exterior.”

The tricky part was incorporating flaming torches which would illuminate the scene at night. They used extremely bright light balls throwing nine kilowatts of light that would flicker and create a little bit of interaction on Jonah Hauer-King, playing Eric.

In the grade, as part of the test, Hatzer created an overcast atmosphere to block out the sun, brought down the highlights and vignetted areas to give the filmmakers a more accurate idea of what the final picture would look like. The VFX teams used this as a guide to create the final look.

“Dion has such a keen eye and shoots beautifully,” Hatzer testifies. “I’ve worked with him for so long I have a good sense of what he wants the image to look like, where he likes his exposure to sit, where he likes shadows to be in relation to highlights. We will do very little tweaking here and there but, essentially, I am trying to add contrast or a certain saturation to colours or a little light into eyes or to bring out the beauty of skin tones. I am not recreating magical looks or anything that isn’t in camera. Rob and Dion know exactly what they want.”

While the DI was set up in Hollywood, Hatzer, along with conform editor Everette Weber and assistant colourist Kevin Schneider, moved to Picture Shop’s facility in Tribeca, New York for two and a half months of finishing. Marshall, editor Wyatt Smith and the sound mix were based out of New York, making it far easier for the director to oversee the work. Beebe flew there too to supervise the grade.

Aside from the 2D 4K theatrical ‘hero’ grade, Picture Shop produced versions for Dolby Vision, the stereo 3D conversion and an HDR pass for home deliverables.

 

  


Monday, 3 July 2023

BTS: Evacuation

IBC

In 2021 an epic operation to rescue 15,000 people from Kabul as the Taliban closed in sparked headlines across the globe. A new documentary gives rare insight into the emotional impact on army and air force personnel through first-hand interviews and on-the-ground footage.

“Soldiers don’t sit around waiting on their bergens to do documentaries,” said Bayard Barron, head of communications for the British military. “You’ve got to stop them doing something which the taxpayer is paying for and ask them to invest hours in interviews.”

article here

More pertinently, getting the army to speak on the record requires sign-off from senior military bosses and the Ministry of Defence.

This is the background to Evacuation, a three-part Channel 4 documentary reliving the last-ditch rescue of thousands of British nationals and Afghans from Kabul airport in August 2021.

Told from the Army’s point of view and largely avoiding political criticism of the chaotic retreat, the film features rare candid displays of emotion from serving personnel as they recount the desperate and alarming situation they encountered.

First hand accounts

Diana Bird, a Squadron Leader in the RAF Police, gave one of the eyewitness testimonies and said she wouldn’t have agreed if producers Wonderhood Studios were going to portray them as heroes.

“This is a story that is important to tell properly,” Bird said at Sheffield Doc Fest. “It was explained to me that [the documentary] was not going to be the hyper-masculine macho war type of thing you would normally see with the military but about the softer side of what we do as the armed forces. If we were going to do that and not make us all out to be heroes, then I was happy to be part of it.”

Also contributing his first-hand account is Private Ahmad Fahim, an Afghanistan-born interpreter for British military commanders. He was the only member of the army who understood the urgent questions and screams of Afghans on the Kabul airfield.

“Initially, when I was told about the project, I wanted to pretend [the evacuation] never happened but then I thought it would be more accurate if I presented my story myself rather than hearing it from someone else,” he said at Sheffield Doc Fest. “I was worried about taking part [in the documentary] because we haven’t as a group really spoken about what happened and you don’t want to come across as something you are not. It also brought up a lot of memories.”

During the evacuation, Fahim was just 100 metres away when a suicide bomber detonated a device which killed over 180 people.

“To be honest I didn’t know I was going to share that much. [Director James Newton] approached it as more of a chat than an interview which helped me to open up more.”

Over 2,000 army personnel were involved in Operation Pitting, one of the largest British military undertakings since World War II.

Winning MoD buy-in

“I was wary about coming into an institution and being given people that are vetted and pre-selected when we are here to tell an honest story,” said Newton, also on stage in Sheffield. “We went to the barracks and started talking to people ten minutes at a time, furiously comparing notes and trying to work out what actually happened, since everybody has a different perspective.”

The casting process was also research. He continues, “You want to find people who have a story to tell; for example, those who have served in Afghanistan, those who have a relationship with the country, those who were changed by the experience. The strongest thing that came out was that everybody has been affected by the experience.”

Katharine Patrick, Head of Factual at Wonderhood Studios, negotiated access with the MOD. “Through my own contacts with the military I started to have informal chats exploring areas we could look at. I asked if there was any headcam footage of the evacuation. We did some digging and found out there was a combat camera team filming out there. We then talked about how we could obtain that footage.”

At that point they put in a formal proposal to the MOD and were invited in to pitch. Barron said the Army had lots of requests for access to the story and that Wonderhood’s bid was helped because it had Channel 4 on board.

“We want to appeal to as diverse an audience as possible, a youth audience,” he explained.

“The military has an internal audience to an extent; we’ve got veterans and there are those people who like watching troops marching up and down the Mall but we need to stretch ourselves to audiences in new places without making idiots of ourselves. The point with Channel 4 is the kind of audiences that it and the More4 app are going to bring.”

He also sought a commitment from the producers that there wasn’t going to be a political angle to the story.

“We know that the whole Afghan deployment is something that [people] can have a debate about,” Barron said. “The embers of it are still glowing hot in parliament as we speak. The military is a department of state so therefore there’s a political angle to any consideration but we don’t want to get involved in [media] that is going to skew the picture politically because we are not a political entity.

“So, the easiest thing for us to do is offer those people who can give authoritative testimony and try to get agreement from the production company not to chase a political piece – which we did with Wonderhood. We just wouldn’t be allowed to get involved in it otherwise.”

Doc as recruiting sergeant

While Evacuation eschews direct assessment of the UK’s wider presence in Afghanistan and the argument that the Foreign Office was asleep at the wheel as the Taliban swept into Kabul, the military is presented as unprepared for the scale and immediacy of the task.

The first episode, for example, is subtitled ‘Did We Leave It Too Late?’

“I got a phone call asking me if I were to go to an unknown location to evacuate an unknown number of people in an unknown period of time with an unknown threat - so how many people would I need?” recalled Bird at the outset of the documentary.

She led a small advance party of twenty to Kabul to assess the situation, or as she describes it: “I basically took a sixth-form field trip to Kabul – most were 19-year-olds.”

At one point, trapped in Kabul by advancing Taliban forces, she wondered whether “it was a trip too far.” She was later diagnosed with PTSD.

Even the two-person combat camera crew appears makeshift. One of them admits to having zero prior experience of handling a camera. “Beyond taking pictures on holiday on an iPhone I’d never used a video camera before,” he said. “Ben had to teach me, even to press record.”

Barron views the documentary as a recruiting sergeant for the Forces but one that would only work in reaching crucial Millennial audiences if the story they told was authentic. In greenlighting the approach, he admits to taking a risk.

“Unsurprisingly, most people in the army, air force or navy - and particularly those who’ve served 30 years - are fiercely proud of serving. They want to defend its reputation and that brings a conservatism toward media.

“But at the same time [the army] does do good things [as well as things that knock its reputation]. I have taken a little bit of personal career risk on this since not everybody [in senior military positions] has seen [Evacuation]. Hopefully you can see that our people have given quite a lot in a way we don’t usually give. [But] if you don’t know much about [the British Army] then the thing that will shine through is that the military is composed of people just like you.

“Ultimately, [the general population] is where most people are recruited from. We are trying to bring down barriers [to recruitment and to enhance the MOD image] and this is a powerful way of doing it.”

He added, “I would say to military people [concerned about Evacuation] that you are in the military to do good things and to uphold the international rules-based system and democracy. Well democracy can’t function without a media that has access and that can criticise. You can’t have it both ways.”