Thursday 30 November 2023

Q&A: Sohonet on Connecting Storytellers, Integrating 5th Kind & the Road Ahead

my interview / words copy written for Sohonet


article here

Sohonet says its reason for being is connecting storytellers. And in fact, it has connected thousands of companies and over 100,000 creators working across the globe.

"We work alongside studios and post teams to build a range of tools that remove technical obstacles in their workflows, so nothing gets in the way of creating great content," explains Sohonet's Chuck Parker, who joined the Sohonet board in 2013 and has been full time as chairman and CEO since 2014. "We are committed to our vision of revolutionizing the way storytellers create content by making collaboration more seamless and secure."

We reached out to Parker to find out how Sohonet is evolving with the industry, where their drive to empower creatives came from, and more…

How did Sohonet start, and how has it changed over the years?
The heritage of Sohonet is contained in our name. In 1995, a few friends in post met at their local watering hole in Soho [London]. They represented five computer graphics companies, and they needed to find a way to send large digital files to each other quickly and securely. They also knew that if they didn't make it, nobody else would. And so Sohonet began.

At that time, the state-of-the-art was ATM [Asynchronous Transfer Mode]. Following an MBO [management buyout] in 2001, Sohonet took on its first significant investment in 2012 and grew its service offering on the West Coast of the US. Since our early, experimental days, we've grown our media network into a multi-Emmy Award-winning global powerhouse and the largest private network for the M&E industry.

In 2014, Sohonet launched its first software products with our accelerated file transfer tool FileRunner. We then released ClearView Flex in 2017 and ClearView Pivot in 2020 — for sub-100ms latency remote collaborative over-the-shoulder experiences. The pandemic accelerated the way the industry viewed tools like these, taking them from a "nice to have" for occasional use to an essential and embedded part of everyone's working life.

Sohonet acquired 5th Kind earlier this year. Why was this an important move for you, and what does it bring to your users?

There is a convergence happening in the industry. As both production and post workflows move to the cloud, the ability to manage production assets at scale with people in many different places will require a more sophisticated toolset. We believe the correct approach is production asset management (PAM) and that there are several improvements we can make for storytellers in how they collaborate.

Even five years ago, everything from set to post would happen in the four walls of a building on a hard drive and tape. Asset management was a physical job that constrained the flexibility for creative labor and meant that clients had to come onsite in person.

The pandemic turned that on its head. It's normal now to have teams distributed across different cities and multiple time zones working on a shot together, grading or mixing it remotely as if all the creatives were in the same room. There are tools that solve that as a point solution — ClearView Flex being a great example — but the minute that session is over that workflow goes back to its disconnected state.

The right PAM approach means those assets can move securely to review and approve and then onward, so that every workflow is in sync. For instance, when editorial is complete, the asset can flow to a dubbing specialist for final mix, VFX shots can flow back into the process, and assets are available to streamline integrated marketing for trailers and press kits or merchandise development and integration with retail.

With the converged trends of post moving to cloud, virtual production and remote collaboration, it is critical for the industry to embrace a secure, synchronized, data-first and superfast workflow.

You previewed several 5th Kind product integrations at IBC. How is that going, and are there plans for further integrations into your existing portfolio?

Yes, we have created two derivative products from that portfolio. Storylink is production asset management for studios, large-scale productions, franchises and multi-season episodics, and ClearView Rush is a review tool for dailies.

And there are indeed plans for further integration. For instance, when a director or DP can't attend a live ClearView session, we plan to enable the session to be recorded so that the key creative can make their notes offline and share back. We anticipate having that ready early next year.

Our file transfer tool FileRunner is also being integrated into Storylink to enable a smoother flow of data in and out of the system securely and at speed. We actually showed this on our stand at IBC in September.

Can you tell us a little more about who's using Sohonet, and what they're using your tools for?

There are three elements to Sohonet's business. One is first-class managed production services. These are essential on-set communications, such as phones, internet, Wi-Fi and connectivity, for which we are partnered with 300 premium stages around the world, including Pinewood, Trilith and Shadowbox. We also service another 400 stages with scalable connectivity, essentially connecting those premium stages to every large distribution studio from Disney brands to Warner Bros. Discovery brands, Apple, Netflix, Amazon, NBCU, Paramount and more.

The vast majority of VFX and post houses in the industry are connected to our network. That means industry participants can push large-scale data, meaning multi-terabyte plates for VFX, with confidence and with speed. And half of our business comes from a very wide range of production companies, VFX and post houses that use our collaboration solutions for remote flexible work. The primary relationships that 5th Kind has built for its products are with Marvel, Universal and Warner Bros. Discovery, as well as forward-leaning gaming companies like Riot Games, which are creating episodic and feature content from their existing IP.

Any sneak peeks for our readers? Can you share a bit about your product roadmap for the next six months to a year?

Our product roadmap for the next six months to a year is centered on redefining and streamlining collaboration workflows for media pros. We've already unveiled several important updates at IBC. And looking ahead, we are highly focused on integrating the user experience and security across our product portfolio, making it easier for storytellers to quickly find, access and manipulate assets. This approach aligns with our mission and will add substantial value to our offerings in the coming months.

Our industry is readying to emerge from a challenging period. How are teams looking to bounce back from the strikes and ensure a swift recovery?

It feels like there's a similar energy to the post-COVID surge, but instead of staggered lockdown endings worldwide, we're now starting to see everyone planning in parallel for the return of production. While the return to work after the strikes is going to have some similarities to the pandemic, it's a fundamentally different situation. The pandemic emphasized remote work and the development of new production protocols, whereas the strikes brought productions and post production to a standstill. There's no fear about returning to work; everyone is eager to rush back to sets, editing rooms and facilities. Producers and post managers must be prepared for this, and all stakeholders in the industry are going to be striving for a smooth and rapid re-engagement to regain full speed, a commitment shared by everyone at Sohonet. The big remaining uncertainty for all of us is how the returning production volume will compare to pre-strike levels. Adding to this uncertainty is the state of post production supply, with many VFX and post companies forced to significantly reduce their workforce.

Looking further down the road, what trends are you following? And what does the broader future look like for Sohonet?

As you can imagine, the team here at Sohonet has been dreaming of and planning for our industry's cloud journey for quite some time. We believe our unique capabilities will allow us to combine high-speed connectivity from 700+ stages with our products that will enable the transfer of original camera files from the set to the cloud, allowing productions to work seamlessly and securely while saving time and money.

We are excited about the future and the role we're going to play in it. We look forward to not only connecting talent to tools and to each other, but to also helping the industry unlock the power of the massive amounts of data and metadata that are produced in today's content workflows. As our industry reinvents itself in an era of powerful new tools and the collaborative creative possibilities of the cloud, we are ready to continue to literally connect the dots.

 


Wednesday 29 November 2023

Jaron Lanier: We Need AI Regulation and Data Provenance (ASAP)

NAB

Tech guru Jaron Lanier has added his voice to those calling for regulation in AI, arguing that it is in the best interest of society — and that of Big Tech.

article here

As part of that regulation, Lanier, who now works at Microsoft, also argues for all data used by AI models to have its origin and ownership declared, to counter the threat from misinformation and deepfakes.

“All of us, Microsoft, OpenAI, everybody in AI of any scale is saying, we do want to be regulated. [AI] is a place where regulation makes sense,” Lanier told Bloomberg’s AI IRL videocast. “We want to be regulated because everybody can see [that AI] could be like the troubles of social media, times a thousand. We want to be regulated. We don’t want to mess up society. We depend on society for our business. You know, markets are fast and creative. And you don’t get that without a stable layer created by regulation.”

Speaking to the idea of “data dignity,” Lanier explained that this is the notion that creators should be compensated, especially if their data is being used to train algorithms.

Provenance

“In order to do it, we have to calculate and present the provenance of which human sources were the most important to give an AI output. We don’t currently do that. We can though. We can do it efficiently and effectively,” Lanier says. “It’s just that we’re not yet. And it has to be a societal decision to shift to doing that.”

He admits to being “scared” of the potential for misinformation caused by unregulated AI use interfering with politics but feels the answer to deep fakes is provenance. “If you know where data came from, you no longer worry about deep fakes. The provenance system has to be robust.”

Lanier’s bizarre title at Microsoft is “Prime Unifying Scientist,” something he admitted was a humorous attempt to encompass everything he does, like an octopus.

“I have come to resemble one, or so my students tell me, and I’m also very interested in their neurology. They have amazing nervous systems. So we thought it would be an appropriate title.”

However, this gives him something of a free-roaming role both inside and outside the company. He was at pains to point out that he was not speaking here in an official Microsoft capacity.

In fact, Lanier has become a fierce critic of the industry he helped build, but he wants to challenge it to do better from within.

“To be an optimist, you have to have the courage to be a fearsome critic. It’s the critic who believes things can be better. The critic is the true optimist, even if they don’t like to admit it. The critic is the one who says this can be better.”

Open Source Concerns

For example, he doesn’t think the open source model for AI or Web3 makes any sense. He poured scorn on the idea that open source would democratize and decentralize the internet and its reward system.  

“I think the open source idea comes from a really good place and that people who believe in it, believe that it makes things more open and democratic, and honest and safe. The problem with it is this idea that opening things leads to decentralization is just mathematically false. Instead of decentralization, you end up with hyper-centralization and monopoly. And then that hub is incentivized to keep certain things very secret and proprietary, like its algorithms.”

Instead, he advocates for a market economy, in which people and businesses pay to use technology, like AI. He hints that doing so would fund data provenance and retain data integrity.  

Lanier says he doesn’t agree with OpenAI co-founder Sam Altman on everything, including his notion of a universal cryptocurrency: “I think that some criminal organization will take that over, no matter how robust he tries to make it.”

The Benefits of Speaking Up

He says being able to criticize from within Big Tech is actually beneficial for Microsoft’s own business.

“I’ve tried to create a proof of that, where I can say things that are not official Microsoft. Look, I spend all day working on making Microsoft stuff better. And I really am proud that people want to buy our stuff and want to buy our stock. I like our customers. I like working with them. I like the idea of making something that somebody likes enough to pay you money for it. That to me is the market economy.”

Lanier wants to persuade colleagues at Meta and Google to speak their minds more, too. 

“If the other tech companies had a little bit of [free] speech in them, it might actually be healthy for them. I think it would actually improve the business performance of companies like Google and Meta. You know, they’re notoriously closed off. They don’t have people who speak, and I think they suffer for that, [even if] you might not think so because they’re big successful companies. I really think they could do more.”

He says there are four or five other execs at Microsoft with public careers outside the company who speak their mind. 

“I think it’s been a successful model. Do I agree with absolutely everything that happens in Microsoft? Of course not. I mean, listen, it’s as big as a country, you know.”

 


The Precision Editing Required for David Fincher’s Assassin in “The Killer”

NAB

Critics are hailing David Fincher’s The Killer as his most experimental film since Fight Club: “a subjective, cinematic tour de force,” says Bill Desowitz at IndieWire, in which we get inside the mind of Michael Fassbender’s titular assassin after he experiences his first misfire in Paris.

article here

The movie, now streaming on Netflix, is divided into six “chapters,” each with its own look, rhythm, and pace tied to Fassbender’s level of control and uncertainty. According to the film’s editor, Fincher regular Kirk Baxter, ACE, the editorial process necessitated the creation of a visual and aural language to convey subjective and objective points of view for tracking Fassbender.

Baxter (Zodiac, The Social Network) goes into detail about working on each chapter with IndieWire. We learn that the opening sequence set in Paris took the most time for Baxter to assemble because it was stitched together from different locations including interiors shot on a New Orleans stage.

“I love the whole power attack, the stretching of time, the patience of what it takes to do something properly,” Baxter said. “And I love that it’s grounded in the rule of physics and how practical it is that each detail in order to do something correctly deserves the same amount of attention.”

Later, in a chapter set in New Orleans, the Killer exacts revenge on a lawyer. The setup prep is slow as he cunningly enters the lawyer’s office dressed as a maintenance worker.

“It was one of the hardest things to put together,” Baxter tells IndieWire. “It’s a little like a Swiss watch in terms of how exacting it can be in his control. David had like 25 angles in the corridor, but when you put it all together, I love how that scene unfolds by playing both sides of the glass [between the office and corridor]. Typically, he’s gonna say as little as possible and his stillness controls the pace, and when he gets fed up, these little, tiny subtle looks from him are letting you know that’s enough and where this conversation stops.”

The nighttime fight between the assassin and a character called The Brute in the latter’s Florida home is depicted as a contest between two warriors in the dark. Speaking to Dom Lenoir, host of The Editing Podcast, Baxter explains how he and Fincher choreographed this fight as well as talking more broadly about the director’s shooting style.

“David does always provide a lot of coverage [and] that gets misinterpreted as a lot of takes [but] what he’s extremely good at is making sure that I’ve got the pieces to be able to move around as needed, or to keep something exciting. It means I can edit pretty aggressively and use just the best pieces of everything. David knows these rhythms he shoots for an editor. So, if it’s a really long scene, you will find in the wide shot that they’ll often be blocking, for example, somebody coming into the room. You sort of work your way [into the scene].”

Baxter says all that matters to him once in production is the material Fincher has captured. “I will read the scene again so that I understand the blueprint of it. You know what its intention is, but then it can be thrown away because David can evolve beyond what the script was based on, whether a location or how our actors are performing. He’ll recalibrate and readjust.”

Although The Killer proceeds on a fairly linear trajectory (hey, like a bullet…) Baxter says appearances can be deceptive when it comes to cutting.

“I found it to be one of the more challenging movies to make because it’s not juggling a bunch of different character lines or going back and forth from past to present and that sort of thing,” Baxter told IndieWire. “It’s just a straight line, but the exposure of that [means there’s] nowhere to hide. It’s like everything is just under the spotlight and you’re not having dialogue and interaction to kind of dictate your pace. It’s a series of shots and everything has to be manipulated in order to give it propulsion, or how you slow it down.”

He continues this train of thought with Lenoir, “It was a challenging movie to make from my perspective because you are showing an expert on the fringes of society but he’s still a person that operates with precision. You’re trying to illustrate that by showing precision. And it is just a lot of fiddling to make things seem easy.”

He also discusses something you may not notice on a first watch: The Killer doesn’t seem to blink. It doesn’t just happen in this film either but in other Fincher movies, where Baxter says he has consciously selected shots of actors not blinking.

“I don’t think that it was an effort to remove them through the film,” he says. “It’s just the nature of how his performance was. But there’s been an effort to remove them in previous films when they’re all kind of landing off rhythm. It’s mostly about when you get into the meat of a scene and you’re in close ups and you want something delivered with intention and purpose.”

Audio was crucial to The Killer as well. Rather than be smoothed out in the background with the edge taken off all transitions, Fincher and sound designer Ren Klyce wanted the audio to be driven by point of view. The rules of the film’s soundscape are established in the opening sequence. Given that the protagonist is not predisposed to be chatty, we learn as much from his internal monologue as from his methodical movements.

“We crawl into his ears and sit in the back of his eye sockets instead of how it’s being presented,” Baxter describes to IndieWire. “From the moment when the target turns up, it was David’s idea to try a track that was what he plays in his headphones. And when you have his POV, we turn the track up to four, and when you’re back on him, the track drops down, and you get the perspective of it playing in his ear.”

They devised rules for how to apply his voiceover but realized they couldn’t have voiceover and music at the same time because there would be too much “sonic noise” for the audience.

“So one’s got to occupy one space and one take the other. The logic said to us what’s blaring in his ears and when he’s in a monologue is when we’re looking at him. That was the rule of what was subjective and what was objective,” says Baxter.

“We tried the notion of ‘vertical’ sound cuts,” Fincher explains. “By which I mean, you’re coming out of a very quiet shot and cutting into a street scene and – boom! — you pick up this incredibly loud siren going by. You’re continually aware of the sound.”

This makes for an unusual but effective experience. For instance, there’s a scene in a Parisian park where the sound of a fountain constantly moves around depending on the featured character’s POV.

Was matching that vertical sound cutting hard?

“I guess even when you’re creating chaos, you’re trying to affect it in your own way,” Baxter tells RedShark News. “You’re always seeking your own version of the perfect way to do this.”

Jennifer Chung, ACE, was one of the assistant editors on the film — part of a 14-strong editing department. She also spoke with RedShark News about the tools they used.

“Obviously we use Premiere, and we heavily use Pix also,” she says. “We do a lot of our communication in post through Pix, especially during production during the dailies grind, where we’re uploading not only the dailies but selects that are coming out so that we can get that to David.”

Adobe After Effects is also used extensively, with the team using Dynamic Link to round-trip content out of Adobe Premiere and back in. Some of the assistants also script, so Python, or even Excel in some cases, was also deployed to help automate some of the critical processes.
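The article doesn't say what those scripts actually did, but as a purely hypothetical sketch of the kind of dailies housekeeping an assistant might automate in Python (the RED-style clip-naming pattern and the per-roll folder scheme here are assumptions, not this team's real pipeline), it could look something like this:

```python
import re

# Hypothetical RED-style clip name, e.g. "A001_C004_1112A8.R3D":
# camera/roll ID, clip number, then a serial suffix. The production's
# actual naming convention is not documented in the article.
CLIP_RE = re.compile(r"^(?P<roll>[A-Z]\d{3})_(?P<clip>C\d{3})_\w+\.R3D$", re.IGNORECASE)

def plan_dailies_moves(filenames):
    """Return (moves, unmatched): a mapping of camera file -> per-roll
    destination path, plus any names that don't fit the clip pattern."""
    moves, unmatched = {}, []
    for name in filenames:
        m = CLIP_RE.match(name)
        if m:
            # Group clips by roll so uploads stay organized per camera roll.
            moves[name] = f"{m.group('roll').upper()}/{name}"
        else:
            unmatched.append(name)
    return moves, unmatched

moves, unmatched = plan_dailies_moves(
    ["A001_C004_1112A8.R3D", "A001_C005_1112B2.R3D", "notes.txt"]
)
# moves sorts each .R3D file into an "A001/" folder; "notes.txt" is flagged as unmatched.
```

In practice a check like this would run during the dailies grind, before pushing selects up for review, so stray or misnamed files get flagged early rather than after upload.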

The Killer was shot in 8K using the RED V-RAPTOR and, according to Chung, initially proved a little tricky to grade in HDR.

“We definitely had some kinks we had to figure out early on,” Chung says. “We all needed HDR monitors, but we didn’t have HDR monitors at home, though we had HDR monitors at the office. We also use a lot of Dynamic Links in Premiere, and we were having some color space issues going from Premiere to After Effects back to Premiere, but because we have such a close relationship with Adobe, we were able to figure that out.”

  

Tuesday 28 November 2023

Vamos Vegas!

interview and copy written for RED Digital Cinema

article here

Las Vegas provides the stunning backdrop to a new Formula 1 Grand Prix and a golden opportunity to showcase Red Bull and RED Camera’s power and pedigree.

This November, Formula One races in Las Vegas, Nevada, for the first time in over 40 years, presenting a rich backdrop to tell the latest chapter in the story of Red Bull Racing. Vamos, Vegas! is the fourth in the series of F1 Road Trip marquee films produced by Red Bull Media House and follows a winning formula.

“Shoot some of the best action from the highest performing car in the world and mix it with light-hearted storytelling while juxtaposing performance action against iconic landmarks and environments,” explains Nick Schrunk, who has directed a number of high-octane short films for Red Bull including Race to Miami ahead of the Miami Grand Prix.

“Ultimately, we are making an action film, but our take was how can we make it cinematic and unlike any other sports film?”

Creative development began with the aim of embracing the diversity of Las Vegas. This included executing a full 15-person pit-stop on the Las Vegas strip, a race with a 1,000-horsepower trophy truck in the surrounding desert and driving inside an actual active casino. A storyline featuring Mexican driver Sergio Perez (who goes by the name Checo) and Red Bull team principal Christian Horner set in a casino elevator unites the elements into one grand day-night journey through Sin City.

Schrunk explains, “These iconic locations have been shot many times before, so our task was to tell our story using the language of cinema. We wanted to do it without artificially handicapping ourselves to an anamorphic frame, which might be challenging to compose for. I didn’t want vintage glass with any idiosyncrasies and surrealism that might make this tricky or, even worse, make it seem that we didn’t shoot everything for real.”

It was a four-camera shoot featuring a RED V-RAPTOR VistaVision mounted on a camera car, two V-RAPTOR VV ground cameras and a lightweight RED KOMODO for the FPV drone.

“We wanted to keep it all in the RED family because the post workflow is very streamlined and it’s a format we are very familiar with,” Schrunk explains.

“I was keen to use the aesthetic of the V-RAPTOR’s large format sensor to capture a pronounced depth of field and to cover what we knew would be lots of wide shots. We didn’t want the distortion that’s more apparent with Super 35mm but instead to keep everything sharp corner to corner – and that was helpful in storytelling.”

Schrunk also appreciated the sensitivity of the camera’s sensor in being able to cover night and daylight and interior locations.

“You couldn’t have two more contrasting environments than a neon-lit dark city street one day and the next we’re out in the blazing heat of the desert with sun beating down.

“Our Director of Photography Will Roegge and our first ACs could really commit to one format that would work across everything we planned. The flexibility of the V-RAPTOR allowed us to do this and is ultimately why we chose it.”

They shot 8K full frame with the additional resolution allowing them freedom to manipulate picture in post. “We wanted every bit of resolution for the down-sampling advantage and so we can move the frame around as needed to find the best composition.”

The V-RAPTOR was paired with a set of MasterBuilt lenses including a 25-125mm zoom on the camera car. A set of Laowa lenses, modified by removing some coating to match the rest of the show, was chosen for the KOMODO on the FPV drone.

“We are proud of the fact that there are no additive elements in post. There’s no CG car, there’s no fake casino. It’s all done for real. That is a real F1 car perched precariously on the roof of Caesars Palace. Even the smoke as the magician disappears was essentially comped from the same shot.”

Running an F1 car inside a casino had never been attempted before, not least because of the sonic roar of the car’s engines and the difficulty of gaining access. The hotel rooms on the Wynn’s entire lower floor were booked out by the casino so as not to disturb guests.

In a lakebed to the south of the city they set up a racecourse where Trophy Truck racer and Red Bull athlete Bryce Menzies challenges Checo to a duel.

“It’s a beautiful place but the dust turned to dirt explosions as the cars ripped through it,” Schrunk recalls. “Preplanning was essential so that we could shoot an authentic street-style race that was advantageous to camera whether that was time of day or capturing the best angles.”

The race was so carefully planned that they were able to shoot the entire sequence in just 45 minutes during golden hour.

“We did lots of testing and had GPS coordinates for the whole course so we knew we could put the camera car in a safe place where it wouldn’t be crossing the other vehicle’s racing line.”

Running an F1 car in a race configuration is very challenging, especially in this extreme environment, where the more time you take, the more chance there is of technical malfunction.

“Preproduction mitigated the risk of dirt and safety issues and allowed us to put all our focus towards perfect light at the end of the day and on getting the shots we wanted.”

The other big set piece shoot was on the Vegas strip. Naturally, for a car with an incredible 0-60mph acceleration of just 2.5 seconds and a straight-line speed in excess of 220mph, the strip was shut down for the night. The team liaised with the Bellagio resort to programme its famous fountain for a ready-set-go race with the F1 car, repeated several times in two-minute chunks over a two-hour period.

“Safety was the paramount concern. We had a 60-person team stationed along the strip to make sure there were no issues and they each had to give the OK on the radio before we greenlit the car.”

As the only production team globally to use a real Formula 1 car for filming and to insist on practical production, the Red Bull team have established a gold standard in a realm often dominated by CGI.

This commitment to shooting on public streets with real drivers and a real car has presented unique production challenges but also earned critical acclaim, including a recent 2023 Sports Emmy win for "outstanding camera work - short form".

Schrunk adds, “We've curated a camera plan that reflects the realities of filming in public, high-stakes environments, often limited to a single, defining take. As we prepare for the world's largest sporting event of 2023, the Las Vegas F1 GP, we are riding the wave of America's escalating interest in Formula 1.”

 


Monday 27 November 2023

Ari Wegner ASC ACS / Eileen

British Cinematographer

Ari Wegner ASC ACS takes on the psycho-drama of Eileen and the wintry conditions of a New Jersey shoot.

article here

Reuniting for the first time since Lady Macbeth in 2016, director William Oldroyd and cinematographer Ari Wegner ASC ACS take on another psychological period drama and literary adaptation centering on a strong female protagonist. This time the character arc is reversed; while Lady Macbeth is ultimately trapped despite murder, Eileen arguably finds an escape.

“When William sent me the script I also read the book (by US author Ottessa Moshfegh) and I really loved her voice and the style of writing which was hypnotic, very dark and seductive,” says Wegner. “There’s a lot of overlap with Lady Macbeth. I love characters that you fall in love with and then they challenge you by doing things that you kind of wish they wouldn’t.”

You can trace that theme, as well as Wegner’s willingness to challenge audience expectations, in her other work including The Power of the Dog and The Wonder.

Eileen is taken from Moshfegh’s 2015 Man Booker Prize-shortlisted novel of the same name. The story follows an unhappy 24-year-old woman working at a prison in Boston in the 1960s whose friendship with a child psychologist takes a sinister turn. The script is by Moshfegh and her partner Luke Goebel.

US production companies Likely Story and Fifth Season were joined by Film4 in funding the film, which stars Thomasin McKenzie (Last Night in Soho) and Anne Hathaway (Les Miserables) and premiered at Sundance. It moves through the gears from Douglas Sirk melodrama with hints of film noir to full-on Hitchcockian psycho-drama, not least with Hathaway’s femme fatale dressed as a smart blonde.

“From a visual perspective the challenge was to closely track where the audience’s mind is in relation to our main character. You’ve got to help them visually, manipulate them, to fall in love with Eileen knowing you are going to take them to a very challenging place.”

Much of the discussion she had with Oldroyd in prep was around lighting for Eileen and how the house where she lives with her abusive father (played by Shea Whigham) should look. “Looking great is a priority but the overriding one was helping the audience form a very strong attachment with Eileen, for them to be on her team and want what she wants knowing that later the audience will be conflicted in their response.”

Yet even Hitchcock might not have gone so far as the perversity of this story’s final act. “There’s definitely a Hitchcock vibe going on,” Wegner says, “but we didn’t talk about specific films we wanted to emulate. It’s quite a unique film in structure. It does have fairly strong genre flavours but then about three quarters of the way through it turns into something quite different.”

She and Oldroyd also spent a good deal of time discussing how to make the tonal transitions and in particular how the film’s final section would work. “The basement scene takes up a huge amount of screen time and unfolds almost in real time. It’s also dramatically unexpected, our visuals change and relationships get shaken up. There are not many films that do that and even if there were they would probably not have helped us learn anything.”

As any filmgoer knows, bad things happen in basements. Wegner and her director were keen to avoid those clichés. “We decided to do the opposite, which is to light so that you can see everything and there’s nowhere to hide. In contrast to the amber colours we use throughout the rest of the picture, here we select a green-white tone which also has a kitchen-like industrial brightness. I’m so glad we did it because I think the result is a lot scarier than the classic swinging light bulb.”

She adds, “I love what it did to the skin tones and the dress that Eileen is wearing in that scene. Costume designer Olga Mill referred to her dress as a Christmas present, that Eileen looks like a Christmas ornament, which is then contrasted with bloodiness against the white. I’m glad we went that way and didn’t go with classic Hitchcock.”

The scene was shot with two cameras (A camera by Blake Johnson) to capture reactions to the character’s monologue and unlike other scenes in the film was not photoboarded until rehearsals with the actors on location.

Setting the mood

For the period setting Wegner looked at a huge amount of photography from the time, not unlike her prep for The Power of the Dog. “It was all about getting your mind as deep into that world as possible with a view that by the time you shoot if there’s anything out of place you will notice it,” she says.

They shot in New Jersey (to access tax credits) with locations dressed for a story set in Massachusetts. “Given that I’m not familiar with either of those states I had to get my head around cheating the locations so that it would make sense to an American,” Wegner says.

Beach locations were tricky to find, and the state of New Jersey prohibits shooting in any correctional facility (or even setting foot in the car park of a prison), so they had to get creative. “We found an elementary school partially closed for the holidays and did a bit of construction within that and we also shot in a courthouse holding area.”

They shot over Christmas 2021 into 2022 when it was “bitterly cold but there was no snow”. Since the story demanded snow they had to use SFX, but juggling this along with high-cost items like period cars was a challenge for the film’s strict budget. “The way William and I thought about it was if you don’t see off the edges of the frame then the illusion is maintained. The moment you see off the edges the whole thing falls apart. So defining where those edges were meant photoboarding every shot and sharing those with SFX and art department. We had very little room for manoeuvre.”

Shooting a period film is a very different proposition to a contemporary drama because of the relative lack of freedom to improvise decisions like camera positions on the day (unless you have a very large budget). Although this puts a lot of pressure on the DP to nail the visuals up front, Wegner says she enjoys the creative challenge. “I say I enjoy it now,” she laughs, “but probably at the time, a little less so. I do enjoy parameters and boxes to work in. I find infinite choice more overwhelming than projects with boundaries.”

Shooting on film would have exceeded the budget, so the next best option was to shoot on the ARRI Alexa Mini paired with the same set of Bausch + Lomb Super Baltar primes she used on Lady Macbeth. She also selected an Angénieux 25-250 HR, a lens she says she was already in love with. “It’s the kind of zoom that is mostly in the kit to be a back-up lens but there’s something about it that really suited the period. When I put the lens on it felt like time travel looking through the eyepiece.”

As part of the visual style she and Oldroyd deploy a number of slow zooms. “I enjoy a slow zoom,” she says. “It suits the psychological thriller aspect of the story. For some reason a slow zoom conjures up thoughts of going into someone’s mind for an audience.”

Genre flavours

Deliberately shot in the dark and dreary winter months to mirror the claustrophobia of Eileen’s smalltown existence, the script called for numerous night shots and dark interiors. “For the night interiors and to get into the genre flavours, I tried to bring in a lot of FX shadows, leaves and branches and to break up the light in a way that adds to the wintry feel and extra creepiness,” Wegner explains. “We were working with various shades of amber from strong sodium vapor to tungsten with patches of strong reds and I had either hard lights with branches outside [the windows] or bouncing off reflective surfaces.”

She says her general approach to lighting focuses on setting an atmosphere that takes into account the drama and tension of the moment and also of story time and place. “I’ve often thought that you want the lighting to tell you what the temperature of the air is or what it smells like,” she says. “This was a cold time of year. Things never seem to be completely dried out and the sun goes down early. For someone like Eileen, Christmas was not always cheerful. The cosiness of a warm home to return to doesn’t exist for her. Instead, the whole place reeks of darkness and wetness. We deliberately wetted the exteriors and used as much snow as we could muster.”

The final sequence set in the difficult terrain of a national park was the most physically taxing part of the shoot. The sequence required them to shoot at night through to dawn into day and, without the budget for huge fixtures, Wegner worked wonders with a fog machine and headlights to silhouette characters. “It was a very tightly planned dusk-for-dawn schedule over two-to-three days using two cameras, two cars, sometimes a stunt driver, and we used every minute of those three dawns after getting the shot list down to the bare minimum we would need to tell this story.”

The forest and frozen swamp “were not terribly conducive to moving fast – especially when everyone is half frozen – but the result was incredibly beautiful. It was worth the discomfort.”

She graded remotely with Nat Jencks at Postworks in New York and did something she had not done before, which was to add grain and halation as a first step. “That process is often left to the end but doing so up front meant our eyes were accustomed to it versus sitting with the footage in high resolution for two weeks and then making that major change. For me, it was all about texture and getting the level of grain right. Nat was fantastic in helping us push the distortion further in the direction we were already going.”

The film ends enigmatically on a static shot from a bridge overlooking traffic on a highway with the camera still rolling. The credits appear over the shot rather than fading to black, inviting the audience to ruminate on Eileen’s fate.

“We wanted a shot that basically said she was merging into the world but is anonymous and has successfully disguised herself ‘as a normal person’, as Rebecca (Anne Hathaway) might say. It wasn’t in the original script and the novel is perhaps more definitive about what happens next, but we always liked the idea of her hitchhiking into the city while allowing the audience to project what they want onto her future. She has been to a dark place and has been transformed.”

Sunday 26 November 2023

How Influencer-Generated Content Has Become Core to Brand Strategies

NAB

Influencer-generated content is now core to brand strategies, with marketers increasingly savvy about the differences between creators and influencers and how to measure their performance.

article here

A recent study conducted by creator marketing platform LTK underlines the profound impact of creator marketing, an industry now estimated at $21 billion globally.

Next year, worldwide, marketers are expected to spend more than $32 billion on influencer marketing. Influencer spend is now outpacing traditional ad investment, with 80% of brands saying they increased creator budgets in 2023, per the report.

Some 92% of brands plan to increase their spending on creators in 2024, and 36% plan to spend at least half of their entire digital marketing budget on creators.

Because of what LTK calls the “significant trust” creators have built with their communities, the majority of brands it surveyed said consumers are turning to creators the most compared to social media ads and celebrities.

An overwhelming majority of brands (98%) are using creator content for channels beyond just social media, highlighting its versatility and reach.

Indeed, when asked where their marketing dollars are shifting, creator marketing and connected TV shared the top position overall for investment growth, beating out channels like paid search and paid social.

The study also found that dollars are being moved from digital ads to creator marketing because creator marketing has proven more efficient in side-by-side, all-cost measurement.

Marketers, however, are becoming more discerning about the difference between influencers and creators.

“As marketers have got more comfortable with the creator economy, influencers have become the go-to for performance marketing, while creators are considered more for branding purposes,” says Krystal Scanlon, writing at Digiday.

Marketers are feeling the pressure to be super transparent and efficient about their purchases and the reasons behind them. This means they’re getting specific about when it’s better to collaborate with an influencer versus a creator.

Lindsey Bott, senior content manager at Ruckus Marketing, told Scanlon, “Previously, influencer involvement might have organically emerged in ongoing discussions. Now, we’re seeing brands come to us more frequently with well-defined briefs or specific suggestions right from the outset.”

The days of pay-for-reach deals are long gone, it seems. In fact, influencers increasingly have specific metrics, such as engagement rate, CPM, CPE, clicks, click-through rate and conversions, tied to them.

For example, Bott’s team has observed clients gravitating toward influencers due to their established reach and engagement metrics, emphasizing performance-driven results.

Conversely, there’s a growing interest in creators who prioritize crafting genuine, narrative-based content that closely aligns with a brand’s values and campaign themes.

“They’re unbelievable storytellers who can really shape perception,” says Keith Bendes, VP of strategy at Linqia, as reported at Digiday. Unlike influencers, creators usually don’t have the same set of metrics tied to them.

 “Over time, as marketers understand how a specific creator’s content performs when repurposed on their social channels or paid media, they may start to benchmark specific benchmarks for that creator’s assets,” said Lindsey Gamble, associate director at influencer marketing platform Mavrck.

According to Scanlon, this shift underscores how brands are distinguishing between utilizing audience influence and cultivating content that profoundly connects with their intended audience.

“Creators have evolved into valuable assets for brands, capable of driving substantial business impact,” Rodney Mason, VP and head of marketing at LTK, writes at Adweek. “As we move into 2024, creator marketing is fundamentally shifting how brands engage with consumers. Those marketers who embrace the rise of creators will find themselves at the forefront of this transformative wave. The time to invest in creators and their unique ability to influence, engage and build trust with consumers is now.”

In a recent webinar, “The Next Wave of Creator Marketing: 2024 Forecast,” LTK’s director of strategy, insights and brand partnerships, Ally Anderson, shares more detail about how “creator guided shopping” is becoming the foundation for marketing efforts and is now influencing consumers through all aspects of their discovery journey.


What Comes Next for the Creator Economy? (Um, Apart from $480 Billion Dollars)

NAB

While there is near universal agreement about the growing size and importance of the creator economy, estimates vary widely. For example: Citi estimates there are more than 120 million content creators generating $60 billion of revenue, a figure which it estimates is growing at about 10% per year. 

article here

Goldman Sachs Research has a very different estimate, saying the total addressable market of the creator economy could roughly double in size over the next five years to $480 billion by 2027 from $250 billion today. Meanwhile, it estimates there are presently 50 million global creators, growing at 10-20% per year — far fewer than Citi’s figure.

In a national poll of 5,854 Americans, market researcher Keller identified 27 million people, or 14% of 16- to 54-year-olds, working as “influencers” in the US economy.

However, there is consensus that growth has not stopped and will be driven by investment in influencer marketing and the rise of ad-revenue-share models, particularly in short-form video on platforms like Instagram, TikTok, and YouTube.

As Goldman Sachs puts it, creators earn income primarily through direct branding deals to pitch products as an influencer; via a share of ad revenues with the host platform; and through subscriptions, donations and other forms of direct payment from followers. Brand deals are the main source of revenue at about 70%, according to its data.

eMarketer’s Insider Intelligence forecasts that in 2024, US influencer-marketing spend will hit $5.89 billion, and that its growth will “remain in the double digits through 2025.”

“The funds are not drying up anytime soon and we are seeing more and more people becoming creators,” Shannae Ingleton Smith, president and CEO of Kensington Grey Agency tells Amanda Perelli at Business Insider. “It’s a viable career space and in many cases pays more than the top tech jobs. Where the advertising dollars are, to me, is a great indication of sustainability.”

Since its inception in the mid-2000s, the creator economy has also grown to encompass a range of professionals who work for creators. These range from managers to video editors, as well as tech execs who have built platforms and companies to help creators make money and build audiences.

“Social media was a tool for interacting with friends, but now it includes a vast ecology of people making money from posts or advertising,” Cristina Criddle explains at the Financial Times.

She interviews Kate Lingua-Marina, a creator known by her handle @SiliconValleyGirl, who explains that she made her first video in 2014 while applying to universities in the United States. She decided to document her journey — and her views exploded to the point that she now has three YouTube channels and a vlogging channel.

“I used to film everything myself,” Lingua-Marina says. “These days I have videographers who help me from time to time depending on the type of content that I’m creating. I have several editors to help me with editing. I have someone who helps me post on platforms. I have a manager who’s responsible for working with brands.”

Top earners have built large teams, like the roughly 250-person operation assembled by MrBeast, who Forbes estimated made $82 million between June 2022 and June 2023.

But not everyone can be a MrBeast. In fact, no one should be mistaken that becoming an influencer is an easy way to make money.

Only about 4% of global creators are deemed professionals, meaning they pull in more than $100,000 a year, finds Goldman Sachs.

A recent survey of 689 creators by the influencer-marketing platform Mavrck found about 51% made less than $500 a month. In the survey, nearly a quarter of creators said they earned more than $2,000, and about 4% said they earned more than $10,000 per month.

Keller’s research found that 6% of Americans are full-time creators earning an average of $179,000 per year, while the average income across all creators is $93,000 per year. More than half of creators make less than $10,000 annually and a third make less than $2,000.

“While the livelihood of the 11.6 million full-time creators (in the States) is a robust $179K/year, the total number of creators is larger than most estimates, likely based on the one third of them who earn less than 2K a year,” the researcher notes.

The creator economy is also facing mixed financial signals. After a flat 2022, YouTube ad revenues were up around 5% by the third quarter of 2023. Creators received just over half of the ad revenue generated on their channels. On the other hand, investment in the creator economy has dropped sharply, with total funding for US startups falling 50% last year compared to 2021.

Revenue and funding going into platforms have decreased quite dramatically.

Criddle says, “One key problem for the creator economy is that creator traffic and wealth tends to be concentrated among the very few, such as MrBeast. Only 4% of creators are defined as professionals earning at least $100,000 a year.”

While the creator economy might be moving away from past explosive growth, there is evidence consumers remain willing to pay for quality content.

“The days of wild growth might be over or at least on hold but that’s not going to stop the millions of creators out there,” Criddle says. “There is enough demand, enough supply and now is the time when the focus should shift from quantity to quality.”

AI Comes To the Creator Economy

The latest innovation driving the creator economy forward is artificial intelligence.

This year, YouTube unveiled new AI tools and features aimed at simplifying content creation. According to Business Insider, the industry is betting on AI not to replace creators, but to increase productivity and bring more opportunities for people to make content.

Rising AI startups in the creator economy like Crate, an AI platform helping creators streamline the creative process, and Midjourney, an AI model that can generate images, are winning over investors.

Keller’s survey found half of creators saying they want to start working with AI, and that virtual reality/augmented reality is #2 on the list of tech they’d like to engage with in the future.

In a recent survey of 2,000 influencers by membership platform Creator Now, 90% said they were using ChatGPT during the content creation process, and 31% said they were using Midjourney. The top reason cited for using AI was to increase the speed of content creation. AI tools can edit TikTok or YouTube videos in a fraction of the time it takes today.

“AI is a game changer,” says a creator speaking to the Financial Times. “The first time we used it was to create a script. I had to change some things, but it was right there in front of me in 60 seconds. If I create an AI version of myself, if AI creates scripts, then my job is to decide which content goes out there and which topic my AI prototype is talking about. Good creators are becoming producers.”

Tuesday 21 November 2023

Francis Lawrence and Jo Willems Rewrite the Rules for “The Hunger Games: The Ballad of Songbirds and Snakes”

NAB

Let the games begin — again. The Hunger Games are back, this time as a prequel telling the story of young Coriolanus Snow, who will grow up to be the tyrannical dictator ruling the sci-fi dystopia of Panem in the four previous hit films.

article here

Also returning is director Francis Lawrence, of whom Jacob Hall at Slashfilm says, “Through his lens, what could’ve been a boilerplate YA series has leaned into the aggressive, the political, and the deeply moving.”

While Gary Ross directed the first film in the series — adapted from Suzanne Collins’ dystopian novel of the same name — Lawrence came aboard for the sequel, 2013’s Catching Fire, and stayed to helm the climactic two-part finale, 2014’s Mockingjay – Part 1 and 2015’s Mockingjay – Part 2.

“I thought I was done,” Lawrence tells A.frame‘s John Boone, “but not because I didn’t want to do more. I thought I was done because Suzanne, the author, was like, ‘I’ve been on this thing for 10 years. I’m going to write plays. I’m going to do other stuff. I’m done.’ Which I could totally understand. I’d been on the movies for three, four years, so I certainly wanted to do something else for a minute, too.”

In 2019, Lawrence and franchise producer Nina Jacobson received a call from Collins. “She said, ‘Hey, I know this is a bit of a surprise, but I’m almost done with a new book.'” 

That book is The Ballad of Songbirds and Snakes, adapted by Lawrence into a script that took him two years to write. 

“I think the only thing that intimidated me is that I feel like people are conditioned to believe that a Hunger Games movie is over when the games are over,” he tells Boone.

“So there’s just this feeling that people have, ‘Oh, you build up to the games. You get to the games. The games are over. Movie done.’ And the truth is, all the questions that are set up at the beginning of the movie are not answered by the end of these games, and there’s still a fair amount of story to tell. 

“I found that very exciting. I liked that there was a different structure, that it wasn’t just ending with the games, that the games are just part of a much larger story — especially for Snow. But I knew that we were going to have and will always probably have a bit of a hump, just because people are conditioned to feel that.”

There’s a vogue for lengthy cinema experiences, and at 157 minutes this movie is no exception. The Hunger Games has form, though, in splitting the final book of the trilogy into two films. Lawrence was adamant he didn’t want to do that again.

“We got so much s**t for splitting Mockingjay into two movies — from fans, from critics,” he tells Boone. “And weirdly, I understand it now. It’s episodic television or something.

“You can either binge it or you wait a week and a new episode comes out, but to say, ‘You have an hour-and-a-half-long episode of TV and now you have to wait a year for the second half,’ that’s annoying, and I get it. So, that was not going to happen on my watch this time around.”

Creating Coriolanus Snow and Volumnia Gaul

He says the biggest challenge in nailing the narrative was to have the audience root for Snow (Tom Blyth), in the beginning, “empathizing with him, while making sure that the elements of the need for ambition, some of the greed, some of maybe the genetic darkness that’s in him from his father, that all those seeds are planted. So eventually, in his descent into darkness, you find it sort of truthful.”

That arc is reminiscent of characters like Anakin Skywalker’s transition to the dark side over the course of Star Wars episodes one to three, but Hall tries to draw Lawrence into making explicit parallels with Donald Trump.

“It’s a real 2023 mood for a movie to be about how this person you really like is actually a fascist. It feels very timely right now,” he says.

Lawrence replies, tactically, “Yes, yes, for sure. But we get to see him formed into one. He doesn’t start as one.”

For Viola Davis’ character of chief gamemaker Dr. Volumnia Gaul, the director’s reference was Gene Wilder’s Willy Wonka.

“One could consider her a villain in this movie, but she does think she’s doing the right thing and what she’s doing is important,” he said to Boone. “She certainly is a very specific voice, philosophically, in the movie. But the Willy Wonka reference was more that her joy is actually in the creativity of the work that she’s doing, which informs the hair, the makeup, the wardrobe. That joy and the odd, creepy creations reminded me a little of the sinister underpinnings of Gene Wilder’s Wonka. That was my reference for her, which she totally got!”

He adds, “I have to admit, I was a little nervous bringing that up to her, but she totally got it and completely went for it.”

Also returning is Belgian cinematographer Jo Willems, with whom Lawrence has worked since starting out shooting music videos (graduating to work for the biggest names in the business, including Justin Timberlake, Pink and Lady Gaga). Willems shot the second through fourth movies in The Hunger Games franchise and Red Sparrow, also directed by Lawrence — all starring Jennifer Lawrence.

While Catching Fire was shot on 35mm with anamorphic lenses, “over the years we progressed our style, we went into digital and then ended up shooting large format,” he tells Gordon Burkell at Filmmaker U. “We always try to get more and more intimate with the characters and we have just ended up shooting wider and wider lenses. Even though they are sci-fi movies, I try and work in a naturalistic way.”

He continues, “I also like shooting in very natural light, so a large part of the movie, where you end up in all these natural light landscapes, I think they look stunning.”

The director says he enjoys post-production more than shooting, which he finds really stressful. “I wake up every morning with a knot in my stomach because you really only have that day to try to get the scenes that are assigned to that day,” he says in a first-person essay written for MovieMaker Magazine. “So many things could go wrong — could be somebody’s personality, could be somebody’s sick, could be something’s broken, or something’s not working, or we didn’t plan something correctly, or it’s raining and you need sun. I find it so constantly stressful.

“But post, once you have all the material, you come home and the lifestyle is much more civilized again, and you sit with your editor and you go through it all, and then you see the movie come together in a whole new way. And there’s something really gratifying about that.”

Presumably, if this film is a hit, there will be another story set in the franchise to come. 

“I would totally do another one, but it’s all up to Suzanne,” he says to Boone. “It’s the same as after the Mockingjays. I said, ‘I would come back 100 percent if asked.’ But it’s got to come from the mind of Suzanne, because she truly is the author of these things. But also, she writes from theme and writes from a real idea, and I think that’s what gives these stories their substance and their relevance.”