Thursday, 18 January 2024

Hollywood and GenAI: I Think (Hope) This is the Beginning of a Beautiful Friendship

NAB

The deepest fear that Hollywood creatives have about AI is that it will suck jobs from the industry and all the life from storytelling. The reality is that Hollywood creatives mostly believe that generative AI is nowhere near good enough to produce final product without huge human involvement.

article here

That AI will profoundly impact content creation is a given. Film historian David Thomson compares GenAI to the advent of cinematic sound.

But opinions differ as to the extent and value of GenAI’s impact.

AI Content Possibilities

Katie Notopoulos at Fast Company outlines the extremes. She quotes Edward Saatchi, founder of production company Fable Studio, predicting a future where there’s a “Netflix of AI” that allows viewers to pick from an array of customized episodes of their favorite shows.

“You could also speak to the television to say, ‘I’d like to have a new episode of the show, and maybe put me in it and have this happen in the episode,’” said Saatchi, whose company is developing an AI-generated animated series.

On the flip side, Adam Conover, writer and board member at WGA West, told Fast Company, “Maybe there will be some AI-generated chum that shows up on Twitch and people have it on in the background while they do their homework — but that’s not going to compete with movies.

“Movies are: ‘I want to go sit in the dark. I want to watch the hottest person in the world say the funniest things in the world and ride a real f***ing motorcycle off a cliff.’ That’s what people want.”

Expected AI VFX Efficiencies

Without doubt AI will increasingly come into play as a time (and cost) saving tool by automating and simplifying complex tasks, most notably in VFX.

Lon Molnar, chief creative officer of VFX company Monsters Aliens Robots Zombies (MARZ) tells Fast Company that smaller-budget movies and shows will have easy access to Marvel-quality effects — in five to 10 years.

That’s still a way off and in any case fits into the wider, ongoing trend of tech advances, from digital cameras to YouTube, “democratizing” filmmaking.

Around half of U.S. entertainment industry workers polled by YouGov and Variety Intelligence think that GenAI will be used for processes like sound effects, autocompleting code to assist in game programming and developing 3D assets and artwork for storyboards — within three years.

At a basic level, generative AI could be used to save money on expensive reshoots, even on the tightest of budgets.

With AI, you could “generate a video model based on all the footage from your scene, and then generate new shots based on the photography that you captured,” filmmaker Paul Trillo tells Fast Company. “That’s going to rewrite the rules of postproduction.”

AI Scripting? Wait and See

However, the same YouGov/Variety poll found just 18% of U.S. entertainment workers believing that GenAI will be able to effectively write film and TV scripts anytime soon, ranking the lowest of any creative task.

Certainly, chatbots such as ChatGPT are capable of producing output in the manner of a screenplay. “It’s less plausible that AI can yet, soon or ever succeed at producing a complete, coherent and production-ready script without at least some human assistance,” said Variety’s Audrey Schomer.

It is more likely that existing large language model (LLM)-based AI tools will assist writers in speeding script development, for example by exploring alternative storylines or generating ideas.

In this sense, “an LLM might better operate as a muse, brainstorming aid or sounding board,” suggests Schomer.

LLM-based tools could help writers rapidly ideate and iterate story concepts, including providing possible settings and scene locations; character names, identities and backstories; and plot points and narrative arcs.

At the same time, studios might experiment with using LLMs for ideation, generating basic concepts for pilots and movies that could be expanded into treatments and scripts, Schomer suggests.
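As a loose illustration of the “muse” workflow Schomer describes (purely hypothetical; the function and prompt template below are not any real product’s API), a writers’ room tool might do no more than assemble a structured brainstorming prompt to hand to whatever chat-style LLM is available:

```python
# Hypothetical sketch: composing a structured ideation prompt for a
# chat-style LLM. The model call itself is deliberately left out;
# this only shows the kind of scaffolding such a tool might build.

def build_brainstorm_prompt(show: str, task: str, constraints: list[str]) -> str:
    """Assemble a brainstorming prompt for a given show and writing task."""
    lines = [
        f"You are a brainstorming aid for the writers of '{show}'.",
        f"Task: {task}",
        "Constraints:",
    ]
    # One bullet per writer-supplied constraint.
    lines += [f"- {c}" for c in constraints]
    lines.append("Offer five distinct options, each in one sentence.")
    return "\n".join(lines)

prompt = build_brainstorm_prompt(
    show="Untitled Heist Drama",
    task="Suggest alternative midpoint plot twists.",
    constraints=["keep the ensemble cast intact", "stay in the present day"],
)
print(prompt)
```

The point of the sketch is that the LLM stays in the sounding-board role: the writer sets the task and constraints, and the model only proposes options to react to.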

The Impact of “AIGC”

Studio execs hoping to create more content for less money might be inconvenienced (in the short term at least) by the deal struck with the WGA and by reluctance to be hit with copyright lawsuits.

The fear remains, however, that the overriding economic impact of AI will inevitably lead to a future of mass-generated ersatz content, or as The Economist’s Alexandra Suich Bass puts it, “we’ll all be watching synthetic entertainment generated by robots and acted out by CG versions of beloved stars, a hollow version of the films we loved.”

Just as the internet led to an explosion of “user-generated content” being posted to social media and YouTube, generative AI will contribute to reams of videos proliferating online. Some predict that as much as 90% of online content will be AI-generated by 2025.

GenAI may result in more derivative blockbusters and imitation pop songs, but the technology could also generate original ways of storytelling. For instance, AI could be the catalyst for new types of personalized and interactive stories.

With de-aging tech, screenwriters can craft more ambitious time-skipping narratives — something we might see in Robert Zemeckis’ forthcoming feature Here, starring Tom Hanks and Robin Wright.

“With the ability to render convincing cityscapes, historical dramas can roam far beyond the handful of carefully scouted locations that usually serve as their sets,” Notopoulos suggests.

Cristóbal Valenzuela, who runs AI software development company Runway, calls AI a “new kind of camera,” offering a fresh “opportunity to reimagine what stories are like.”

As Trillo explained, “I’m less interested in using AI to make things I can shoot with a camera than creating imagery I couldn’t create before.”

Monday, 15 January 2024

For Seth MacFarlane and Tom Costantino, ACE - ClearView Flex Is Pure Logic for TED

interview and copy written for Sohonet

Ted, the foul-mouthed teddy bear voiced by creator Seth MacFarlane, has returned to screens in a new 7-part event series for Peacock, thanks to a collaborative workflow enabled by Sohonet ClearView Flex.

article here

The live-action TV series is produced by NBCU’s Universal Content Productions (UCP), MRC Television and Seth MacFarlane’s Fuzzy Door. MacFarlane co-writes, directs and reprises his vocal performance for Ted. He works closely with a tight-knit editorial crew headed by Tom Costantino, ACE, lead editor and co-producer on the project.

Costantino began working with the system in March 2020 in the middle of Season 3 of The Orville, MacFarlane’s sci-fi comedy drama for Hulu.

“Many of the studios were caught out by Covid and no one really knew how it was going to affect work. It became very apparent to us that we had no way to continue working and keep everyone employed unless we found a solution.”

The Orville’s VFX producer and co-VFX supervisor Brooke Noska had previously worked with FuseFX, one of the show’s vendors and a ClearView Flex user. FuseFX assisted the show to build a remote networked facility based on multiple ClearView Flex Pro boxes for editing and VFX.

“Brooke and I were trying to figure out how to keep it all going in our respective departments and how to keep it going for Seth. Then, we had a Sohonet demo.

“We’d tried similar systems with Evercast but Seth found it too difficult. There was a delay, and you had to have a headset on. But with ClearView Flex you could be on Facetime audio and see this beautiful picture in real time. It was like being in the room.”

So efficient, in fact, that the team has retained similar flexible workflows for Season 3 of The Orville and now on Ted. During post, the show’s media was held on Avid Nexis at Fuzzy Door while Costantino was typically in one location and MacFarlane in another.

“We can all jump onto ClearView Flex and it’s so smooth it’s like nothing ever happened,” he says. “Given how busy we all are and being able to do work in the margins is just transformative. During a quick lunch break or if Seth has a half hour somewhere to do a VFX review, as long as he has the app, or an Apple TV or a laptop we can be working. I never have to be in the same room as my showrunner.”

The team used ClearView Flex for all VFX reviews, including final reviews, and for all editing and spotting sessions.

“We have also used it for remote sound mixing for those people who can’t be here in person,” he explains.

“For example, we did a spotting session with 15 people and only five of us were in the building. We work at medium to super high resolution. CVF can definitely handle high rez, but Seth and I are just so superstitious that I edit on medium resolution. There’s no scientific reason to do so - it’s more of a hangover from the days of working from my house!”

He continues, “Even though we all got back together we still have three of the boxes (two for VFX) because it has made us efficient in ways we cannot even calculate. That’s to say nothing of the features which we use internally, such as the sketch-ups for on-screen annotation.”

Ted, which debuted on Peacock on January 11, 2024, also stars Alanna Ubach and Scott Grimes.

“I treat comedy the same way I’d treat drama,” shares Costantino, who has also cut multiple episodes of CSI: Miami and 48 Hours. “Sometimes you look for the best performance. Sometimes, if it’s a good old-fashioned joke, you look for the best read. Mostly, you are cutting a scene for its emotional truth and the jokes will come through, but you gotta believe in the characters to do that. In the case of improv, as with drama, you always have your magic bin of clips, throwaways and lines, which is your toolbox to punch things up. My advice with comedy is to cut tight to keep the energy up. That’s the golden rule.”

Costantino finishes by noting, “I don’t know how we could ever go back to working without Sohonet. It is an important part of our natural workflow.”

 


Front Street Pictures: ClearView Flex

interview and copy written for Sohonet 

article here

Front Street Pictures in Vancouver has produced and provided production services on a broad catalogue of live-action, scripted content across multiple genres for more than two decades. The company houses a full suite of services including development, legal, budgeting and scheduling, as well as post-production facilities and the management of all aspects of delivery, for clients including Hallmark and Lifetime (movies of the week) and Paramount and Sony Pictures (features).

Sepideh Merchant, director of Post-Production, joined the company in 2019 and immediately saw a need to improve upon the already established workflow. Coincidentally, this was a few months prior to the global pandemic that would then force companies to take on a more remote-focused process, which lined up perfectly with the direction Front Street was going. 

Trial and Error: Finding a Reliable Streaming Solution

“When I arrived, a priority was to upgrade the post-production system from hard drives to more of a networked storage solution. It was imperative, especially when the pandemic hit, that we shift to a more remote-focused solution and find a way for us to all connect to the office from different locations. Through research I found a remote streaming software that we tried out in order to keep collaborating with our production and distribution teams.”

However, Front Street found it challenging to find a solution that was simple and reliable enough for the teams to collaborate with each other effectively. 

“The system we were using proved slightly unreliable,” she reports. “Setup time was very lengthy. Producers and creative clients were having more trouble than they needed to set up sessions on their laptop or iPad. The vendor suggested that we purchase additional hardware to help with the workflow, which we did, but it didn’t make much of a difference and we often had to call tech support.

“Luckily through word of mouth we learned that Sohonet had a demo of its solution in Vancouver which I attended. I was immediately hooked.”

Securing a box to trial proved to be the answer Front Street was looking for.

“The best thing about ClearView Flex is that it is true plug and play. Clients just need to sign in and away you go.”

The real test occurred during production of a Lifetime movie. They began the project using the existing streaming software and were experiencing familiar trouble with set up and frame delay.

“When we had the opportunity to work with ClearView Flex we did so side by side with the streaming software and immediately saw the benefits. ClearView Flex was ahead by miles. It was so seamless and perfect and a superb experience for our producers.”

Front Street uses ClearView Flex extensively for offline editorial (editing and mixing) on many of its in-house productions. More recently, Front Street became one of the first Canadian facilities to employ the NDI-enabled version of the technology.

“We jumped on that when it was released by Sohonet because we run so many productions through here that it made sense for us to upgrade and enable multiple productions to schedule sessions.

“The NDI version means we can run multiple different productions day to day, all connected to one box.”

“It works beautifully, and we’ve had no issues at all,” she adds.

 @Sohonet https://www.sohonet.com/article/front-street-pictures-the-quest-for-a-true-plug-and-play-streaming-solution-leads-to-clearview-flex

BTS: Poor Things

IBC

A story about re-activating a dead woman with her unborn baby’s brain was always going to make for a strange film, but if weird is what you want, Poor Things will not disappoint.

article here

From its phantasmagorical costumes and steampunk production design to the gothic visuals of its cinematography, the latest film from Greek director Yorgos Lanthimos is a sensory treat.

Based on Alasdair Gray’s book, the script is by Tony McNamara who created the fizzing period romp The Great. It inverts the classic Frankenstein tale by making the ‘monster’ a very perceptive and beautiful woman, and her love interests, potential monsters.

“It’s a pretty unusual scenario to start from, and knowing his work with Tony, and their kind of humour, I knew it would be an ambitious take about sexual, emotional and spiritual awakening, with a lot of layers,” says Robbie Ryan, ISC BSC, who was Oscar-nominated for Lanthimos’ Queen Anne drama The Favourite in 2018.

The coming-of-age story follows Bella (Emma Stone), a young woman who is brought back to life by her protective guardian, the unorthodox scientist Dr. Godwin Baxter (Willem Dafoe) in an alternate Victorian era that flits between Lisbon, Alexandria and Paris.

The heroine lives in a “dystopian version of a Merchant Ivory film, with the idea of a grand tour,” according to McNamara in the film’s production notes.

Lanthimos set his fellow heads of department a task: to make it all as hand-crafted and free of digital trickery as possible. Among other things, this entailed shooting entirely in a studio the old-fashioned way, with gigantic sets, on 35mm film, using resuscitated filmmaking techniques.

“Yorgos really wanted to create from whole cloth,” Ryan says. “Things weren’t meant to feel real or verité. It's got its own angle, its own quirk.”

Ryan is best known for his work with Andrea Arnold on Fish Tank (2009), American Honey (2016) and forthcoming release Bird as well as for shooting Ken Loach’s Palme d’Or winning I, Daniel Blake (2016) and the director’s last movie The Old Oak.

Unlike all those movies, as well as The Favourite, Poor Things was strictly studio-bound. With Lanthimos, Ryan discussed how to film scenes as if they were on location – with no lights, flags or equipment on set other than the camera. Among other things, it meant having to pre-light everything from outside the windows or on studio ceiling rails.

“From my perspective, [the challenge] was to try to light all those worlds as if it was a normal location,” Ryan said.

Shooting on film

First and foremost, Lanthimos wanted to shoot on 35mm film and, moreover, to use an older film stock only ever made in 16mm. The filmmakers approached Kodak, who agreed to create a unique 35mm version of Ektachrome 100D 5294 colour reversal stock. A number of the film’s opening scenes are filmed in black and white, joining Oppenheimer (DP Hoyte Van Hoytema FSF NSC ASC), Maestro (DP Matthew Libatique ASC) and Asteroid City (DP Robert Yeoman ASC) in the list of 2023 titles made using a storytelling hybrid of 35mm colour and B&W stocks.

“It’s a beautiful celluloid to work with,” says Ryan. “It was quite a selective process in terms of what was shot on Ektachrome, depending partly on the set partly on lighting.”

He went through the schedule with Lanthimos and marked-up which scenes to shoot on Ektachrome.  “When Bella goes on her journey, the kaleidoscope of colour comes out,” he adds.  “We used the various textures, contrast and colour of the different film stocks to enhance the look and atmosphere of multiple sets and different scenes.”

Processing of the Kodak Double-X 5222 B&W footage was done at Magyar Film Lab in Budapest. The exposed Ektachrome was developed at Cinegrell Berlin.

4K film scanning of the different film stocks was completed by Cinelab, along with the creation of film print deliverables. The DI grade was performed by Greg Fisher at Company 3 in London.

Lenses tell their own story

Ryan also created the film’s language with use of various vintage lenses. Tests with Lanthimos were extensive, including going through 50 sets in a day to find the right one.

These included antique 58mm and 85mm Petzval lenses, first made around the turn of the twentieth century for stills portraiture, and a number of wide-angle optics, including an 8mm Nikkor lens used for Baxter’s laboratory, a wide-angle 10mm Zeiss lens and an extreme 4mm Optex prime.

Of the Petzvals he explained, “You've got this really beautiful bokeh where the fall off from the focus is very shallow. The focus is all over the place and the centre usually is the only thing that's in focus. It creates really beautiful swirly optics.”

If the moment needed something “a bit more mad”, he’d pop the Optex 4mm on the camera. As it was designed for 16mm cinematography, it's the wrong lens for the 4-perf film 35mm format they were shooting.

“But it gave a lovely vignette with dark edges, a kind of porthole into another world, that Yorgos really liked,” Ryan told Kodak. "It does not bulge and bend the image so much as the 6mm lens we used on The Favourite, but has a huge depth-of-field, where pretty much everything in the frame was in focus from a face or object just a few inches in front of the lens right out to infinity.”

He also shot using zooms, which was a new technique for Ryan. He operated A camera and had to perfect his skills at focus pulling.

“Yorgos hates the idea of conventional film coverage. So, we blocked scenes with all sorts of dolly and crane moves that, for example, started on a close-up, zoomed back and then tracked over to a different character, which might perhaps then intercut with an extreme wide.”

The decision to frame for a 1.66:1 VistaVision aspect ratio, a departure from the standard widescreen format, allowed him to shoot intimate closeups and use the additional height to achieve more abstract shots.

Some scenes were shot on the compact Beaumonte VistaVision camera, designed to work on Steadicams, which has its own quirks. During the re-animation scene, when Bella wakes up, the camera started running out of power, which meant the physical film reel going through it was running slower too, producing a “weird animation” they ended up using in the final picture. “It looks like it’s sped up, and it was only purely by a mistake.”

Studio bound

Poor Things was filmed in August 2021 in Hungary, mainly at Origo Studios in Budapest, where elaborate interior sets, created by production designers Shona Heath and James Price, were built on stage, along with exteriors of Paris and Lisbon on different backlots.

The filmmakers started to look at cities like Budapest and Prague to use as locations, but inspired by the films of the 1930s, Lanthimos began exploring the idea of constructing their own world from scratch.

Heath found a lot of her inspiration from the satirical drawings of Albert Guillaume during the Belle Epoque era in Paris, which were futuristic for their time.

“We always tried to imagine that this story was set in a past time, but with the vision of the future,” Heath explains in the film’s notes.

Production took over numerous soundstages, where they built the complete worlds of London and Baxter’s House, the ocean liner ship, the Paris square and brothel and the Alexandria hotel and slums. For the city of Lisbon, they used the largest sound stage in continental Europe at Korda Studios in Budapest. Painted backdrops and back projection rounded out the world shot in camera.

Augmenting these old-school techniques was a virtual production screen, which provided the deliberately fantastical backdrop to scenes aboard the cruise ship. The heightened, almost surreal colour palette and unabashed artificiality of films like Rainer Werner Fassbinder’s Querelle (1982) were an influence.

Alasdair Gray was also a painter who illustrated the novel’s text. That set Lanthimos off on his visual exploration of the book’s themes which he describes as fundamentally about a woman’s freedom in society.

It’s a political film, McNamara contends, “The idea of patriarchy and of young women liberating themselves from being objectified has become so important in society. I hope that comes through.”

Produced by Film4, Element Pictures, TSG Entertainment and Searchlight Pictures, the film is on UK release in January.

 

Past and Present Intersect in Steve McQueen’s “Occupied City”

NAB

Occupied City is the second recent feature film, following The Zone of Interest, to address the Holocaust without resorting to overused imagery. The four-hour feature documentary by British director Steve McQueen concerns the Nazi occupation of Amsterdam during World War II but doesn’t use archive footage, talking heads or dramatized scenes.

article here

It is based on Atlas of an Occupied City: Amsterdam 1940-1945, a historical encyclopedia written by McQueen's wife, the historian and filmmaker Bianca Stigter.

“Bianca had written this extraordinary book, and it's all her research over the last 20 years or more," explained the director to Aframe. "It's not the first book you'd ever think we'd translate into a movie. It's not an obvious choice."

Using the text of Atlas as narration, McQueen (who won Best Picture with 2013’s 12 Years a Slave) juxtaposes the history of the city, with explanatory narration by Melanie Hyams, against footage of life in Amsterdam today, which he shot over the course of several years beginning in 2019 and through the pandemic lockdowns.

“What I wanted was, as you would do in a city, you get lost,” McQueen told IndieWire’s Filmmaker Toolkit podcast, adding that the film was a bit like an English garden. “Unlike a French garden, which is all about the avenues; it’s very symmetrical, very formal. An English garden [has] more to do with wandering and the contemplating and lots of ideas come from those places of wandering and pondering.” 

Stigter describes the film as more of a free wandering through the city, whereas the book is set up more practically, like a guidebook.

In one scene, the elderly owner of an apartment in which Occupied City filmed shows the crew her country line-dancing. Under Hyams’ narration of what happened there during the war, the owner’s joyful dancing adds the suggestion that she, too, might have her own story of the Nazi occupation.

“There’s something excessive about the movie because — besides from what you see, you also think, ‘What do these people [we’re seeing] have in their heads [from that time]?’” Stigter told IndieWire.  

McQueen, who lives in Amsterdam with his Dutch wife, found the experience of living in a city that had once been Nazi occupied an unsettling one.

“My daughter's school was once an interrogation center. Where my son went to school was a Jewish school, so these things were in my every day,” he told Aframe. “When it's sinking into your pores, you start thinking about it. Coming from London, not having grown up in an occupied city but being here now, it felt like I was living with ghosts. It's almost like an archaeological dig. This is recent history within the last 85 or 90 years, and I thought this could be fascinating. It is two existences: My presence and another presence.”

Initially, McQueen thought he’d find some archive footage from Amsterdam in WWII to project on top of the present day footage, but then decided to use narration based on Stigter’s text and to merge the two things together.

“There's optimism in [Hyams’] voice, even though there was a dispassionate sort of description of what was going on,” he told NPR's Asma Khalid. “And that was because I didn't want to manipulate the audience. It was about the audience bringing the information, receiving the information for the first time.”

He described the process of shooting on 35mm – his favoured medium – as a ritual. “It’s so precious, this footage, and it actually adds to the tension of being careful about how you approach the moment,” he told the New York Film Festival.

“It was shooting without a tightrope, in a way,” he added to Aframe. “Young people today shoot digitally; they spray the whole area, shooting for 60 hours and cutting it down to half an hour. You can't do that with film. The process of making a film and working with Lennert Hillege, the DP, the sound people, and others, it was a beautiful ritual every time we took the camera. I think that was extremely helpful in capturing things, because everyone was very focused.”

Addressing the length of the film, McQueen said it couldn't be told in an hour and a half. “It needed that contemplation, needed meditations to sort of get into the psyche of the cinema experience, and that time was very important for us,” he told NPR.

Stigter said, "It's essential to have ways to bring history to the fore. We have documentaries, books, and feature films, and this is trying to tell you things about the past in a different way. That's also why the length is important. It turns it more into a meditation or an experience than a history lesson."

McQueen, who began his career making video installation art, is also preparing a “36-hour sculptural version” as an art piece. “There are 36 hours of edited footage,” he informed Aframe. “From that 36 hours of edited footage, we took out these four hours, because making a feature film is a very different experience than making the sculptural element of it. Certain things are repeated in that, but you don't want to do that in a feature film. In some ways, after a particular moment, it condenses itself, and then you decide what you want to keep in and what you want to take out to make it a certain kind of journey.”

Occupied City ends with a bar mitzvah ceremony because it was important to McQueen and Stigter to show the persistence of Jewish life in Amsterdam.

In a presentation at the New York Film Festival, Stigter said, “For me the last scene is also very important to show something of contemporary Jewish life in the city, and that was a very beautiful and hopeful conclusion for the movie.”

“I often think watching a movie is like a religious experience,” McQueen added to Aframe. “You're trying to create meaning in what you see. In this case, the more you know, the less you know.”

He continued this theme with NPR, saying, “When you go to the movies, people try to connect the dots and try to make sense of things. But the lesson learned from this situation is that nothing makes sense. How can you even fathom or get to an understanding of how, for example during this war, 6 million people died? Try and make sense of that.”


Friday, 12 January 2024

Entertainment Industry Enters Age of Austerity

Streaming Media

article here

Almost every major media tech company you can think of is shedding jobs so, rather than wondering which companies are laying off staff, it’s probably easier to ask, "Who’s hiring?" 

More than 260,000 global technology-sector employees were laid off in 2023 across 1,186 companies, some 100,000 more than the 154,336 made redundant in 2022, according to data compiled by the website Layoffs.fyi.

The most recent is Amazon, which has announced job cuts in its Prime Video and MGM Studios divisions, together with 500 staff, a third of the total, at gaming platform Twitch.

“We still have work to do to rightsize our company,” Twitch CEO Dan Clancy said in a blog post. “For some time now the organization has been sized based upon where we optimistically expect our business to be in three or more years, not where we’re at today.” 

In 2023, across its wider retail business, Amazon axed another 27,000 jobs. That’s while Amazon founder Jeff Bezos made over $7.9 million an hour, every hour of the day, in 2023, according to Fortune.

Amazon is far from the only company having to issue PRs that euphemistically talk of streamlining operations, downsizing or making the most of resources. Google made $76.3 billion in revenue in the third quarter of 2023 alone, according to its most recent figures, with a net income of $19.7bn, yet it still felt impelled to axe a reported 600 people at the start of the new year. That’s on top of the 10,000 it made redundant last January.

Google's AR team is believed to have been affected. In a post on X, the Alphabet Workers Union described the job cuts as "another round of needless layoffs." 

We’re not even at the end of January and already a quarter of the workforce (1,800 people) has been axed at game engine maker Unity, despite three rounds of layoffs in 2023. Meanwhile, media measurement firm VideoAmp is losing nearly 20% of its staff.

Chip maker Qualcomm, networking giant Cisco and streaming company Roku also announced job cuts recently. Microsoft slashed more labor in July 2023, adding to the 10,000 cuts it made a year ago. Niantic, the company behind “Pokemon Go,” made 230 layoffs in June. Meta and Twitter/X’s billionaire owners have also eliminated human resources in the last two years.

Epic Games, maker of Fortnite and Unreal Engine, reduced its headcount by 16% (around 830 people) in September, despite making billions of dollars in revenue.

“For a while now, we’ve been spending way more money than we earn, investing in the next evolution of Epic and growing Fortnite as a metaverse-inspired ecosystem for creators,” wrote CEO Tim Sweeney. “I had long been optimistic that we could power through this transition without layoffs, but in retrospect I see that this was unrealistic.” 

Aside from hubris, the common denominator impacting media tech is the global economic downturn and consumers much more reluctant to part with cash for entertainment.

Nowhere has this hit home harder than at the studios and streamers, which have had to come to terms with the failure of a business model that prized subscriber numbers over revenue.

Streamers 

All the streamers are on a profit drive, and that means slashing people and entire shows from the bottom line. Disney took out 7,000 jobs, with CEO Bob Iger touting a “significant transformation” for the company. WBD cut hundreds of jobs, including at CNN; United Talent Agency, NBCUniversal and Paramount Global laid off employees too. Netflix has not been immune: it trimmed its drama executives at the end of last year, having made deeper cuts to its animation division.

Ed Barton, Research Director at Caretta Research, says the entire industry is shifting from chasing raw subscriber growth to proving that premium content streaming is profitable and sustainable. “The spending on content and subscriber acquisition has to be controlled and right-sized for how much revenue they generate,” he said. “Previously streamers seemed to adopt a bit of a ‘spray and pray’ attitude to spending. As long as subs kept growing, everything was fine. Those rules no longer apply.”    

Barton thinks some companies are running out of patience. “Amazon spent nigh on a billion dollars for Twitch nine years ago and effectively bought a machine which loses money. They've had plenty of time and resources to work out how to monetise games’ streaming profitably and it's not happening. At some point management looks at the return they're getting from these platforms and decides the capital is best allocated elsewhere, such as in the highly profitable cloud computing business.”  

More Cuts Coming

The worst decline in traditional TV advertising in 15 years has resulted in fears of more job cuts at UK broadcasters on top of cuts to commissioning budgets.  

The rot has set in. Annual declines in traditional TV ad spend are predicted until at least 2028, and while broadcasters like ITV are attempting to steer audiences and advertisers over to their streaming platforms, streaming ad revenues are forecast to be worth less than a third of the £3.5bn traditional linear TV ad market by the end of this year. 

To compound the problem facing legacy broadcasters, they now face competition for ad spend from all the international streamers including Netflix and Amazon Prime. 

Don’t expect any bounce back either. The age of austerity seems here to stay. 

“The trend is long term,” said Barton. “The companies which own these platforms need to demonstrate that streaming makes money without unrestrained spending on content, marketing and subscriber acquisition. The entire TV industry has bet the farm on streaming, deliberately relegating broadcast and pay TV in their content and investment priorities, despite living large off the success of these markets for decades.  

“They have to prove that streaming can step up and fill this massive hole that they helped dig and if they don't, they will be smaller and less relevant businesses going forward with less power to attract the best talent and make the biggest impacts on audiences.”  

With Paramount on the block, media consolidation has not stopped, and any wave of mergers and acquisitions fuels job losses.  

While most redundancies are not a direct result of AI, the use of intelligent automation is on the rise. Tech optimists hold that AI will create new roles to replace the ones it eliminates, but others remain fearful of being machined out of a livelihood. 

Online language app Duolingo, for example, cut 10% of its contractor workforce at the end of 2023, saying it would use AI to streamline content production and translations previously handled by humans. 

Barton said, “Success in streaming will be more concentrated than in TV and if you aren't one of the winning platforms, you won't be in a position to maintain a resource intensive cost base. So less people and more automation (if they can afford the transformation process) is possible though it's more likely that they will be taken out by consolidation.” 

Tuesday, 9 January 2024

BTS: Ferrari

IBC

Michael Mann’s new film contrasts the frenetic action of racing sports cars with the more formal staging of Enzo Ferrari’s interpersonal rivalries and driving ambition.

article here

Just as the template for Days of Thunder was Top Gun, so Ferrari borrows camerawork straight out of Top Gun: Maverick, though the two films are worlds apart in all other respects.

Michael Mann’s new film is a period drama about motorsport mastermind Enzo Ferrari’s determination to win the automotive business race while pushing his test drivers to the limit.

“We were going to be driving these cars extremely fast over kilometres of country so we needed cameras that would be lightweight and robust enough,” explains director of photography Erik Messerschmidt ASC of the film’s signature racing scenes which included restaging the 1,500km motorsport endurance race Mille Miglia.

“DoP Claudio Miranda is a great friend of mine and he had used the Sony Venice very successfully on Top Gun: Maverick. We weren’t going to use green screen either. Michael wanted the cars to approximate the speeds that the drivers used to drive them.”

As on Top Gun: Maverick, Messerschmidt used the Venice in Rialto mode, where the sensor block is separated from the camera body. The cars had mounts built into the tubular chassis so that the crew could quickly fit six to nine cameras on board, variously on the hood, wheel rims, bumpers and passenger seat.

“These cameras weren’t suctioned to the car; they were bolted rigid to the frame. Even so, a 25lb camera outside a body panel would significantly change the handling of the car for the stunt team, and we wanted to get the cars really close to each other with cameras hanging off the side, so weight and space were a huge issue.”

That wasn’t the only reason to choose the Venice. He also wanted a camera with internal ND filters. “That was something I desperately felt I needed because we were going to have situations shooting multiple cameras simultaneously, day or night, as these cars raced, and I didn’t want to slow Michael down with filtration changes. I could keep the iris where I wanted it without disturbing the actors. That was the initial reason why the Venice was chosen.”

As the cars sped around the countryside in Northern Italy, Messerschmidt was able to monitor the feeds live via long range transmission from antennas to a video village arranged by engineers from US firm RF Films.

For the racing sequences he deployed a number of zooms “compressing the space” to accentuate the feeling of speed and the tight geography of the cockpit and corners.

“What was important to Michael in the race scene was to get across what it felt like to be in these machines. He wanted the smell of gasoline, the grease from the engine and dust from the road, the extreme rattling of the metal. He wasn’t interested in capturing the smooth running of the cars from aerials or camera-cars running alongside. It was intended to capture the experience of being that driver.”

Seeing red

Ferrari the brand is synonymous with the colour red, but Mann stipulated that the only time the audience sees red in his film is when the race cars are on screen.

“There is a little bit of ox-blood wallpaper in one of the apartments but there is no red at all other than the cars, a deliberate choice that Michael wanted,” says Messerschmidt.

Away from the track action, the film’s colour palette recalls summertime in Tuscany and Emilia-Romagna, especially Modena, famous as the base of sports car makers like De Tomaso, Lamborghini, Maserati and Ferrari.

“To me it has this hay, honey yellow to it,” observes Messerschmidt who spent time in the region on recce. “The buildings are painted with yellow plaster and oranges and brighter earth tones so the sun will hit the buildings and reflect off all this colourful natural light.”

Inspiration also came from studying Italian Renaissance painters like Titian, Caravaggio and Tintoretto, though, as Messerschmidt acknowledges, a 20th-century master was influential too.

“[Director of photography] Gordon Willis ASC is a hero of mine and it would be hard for me as an American to make a movie set in the countryside of Italy without thinking about The Godfather.”

“There is a kind of simplicity in terms of how Willis lit The Godfather that was attractive to me,” he adds. “I like the idea of distilling the environment down to just a couple of light fixtures, asking, what is the least we can do in this room? And that also supports Michael’s shooting style. He wants the freedom to move the camera a lot and he doesn’t want lighting rigs to get in the way of that. So there’s a practical consideration too, to support the director.”

Mann, creator of Miami Vice and director of films such as Heat and Collateral, had spent three decades working on this movie, during which time he made a number of research trips to Modena.

“Michael is a photographer and he has a photographic brain,” says Messerschmidt. “He is an image maker. He took dozens of photos, he has files and files of historical records and newsreel footage and stills of Enzo Ferrari. It was an incredible assortment of media. I would go into his office and he’d show me what he wanted the movie to be. He had a very clear idea.”

One of those ideas was to use a probe lens called the Skater Scope to get extreme close-ups of his actors. Unlike most probe lens systems, which have an integrated lens optic, the Skater is essentially a periscope-like extension onto which the DP can mount their own lens.

Messerschmidt explains, “This pulls the lens away from the camera body about 25cm and it changes the optics to give you quite a bit of macro close focus. Michael likes to put the lens very close to the actor. We put it on Steadicam occasionally and it meant you could put the lens to someone’s eyeball but the operator is at arm’s length away. We could fly around Adam [Driver] or get the lens right behind someone’s ear or into someone’s face. It’s a very unique, specific look.”

It complicated work for the DP because the lens is slow (losing two stops) and required extra light on the sets to compensate. “To get any resolution out of it you have to shoot at around 6.5K and you need a relatively high-speed camera,” he adds.

Most of the film was lensed using Panavision Panaspeeds, the same set the DP had used to shoot the 2022 period war film Devotion.

“I love modern lenses because I like them to be consistent. I am not someone who is necessarily attracted to the idea of vintage lenses. It is hard when you change lenses and one requires a different f-stop, or one lens exhibits a pink hue and the next is green. Even though you can fix it, it drives me nuts. I am definitely in the camp which appreciates modern lenses. Sometimes sharpness is an issue, but it’s nice to start at that point.”

On Devotion, Dan Sasaki [Panavision’s lens scientist] had detuned the Panaspeeds so they exhibited “aggressive spherical aberration with halation in the highlights,” he recalls. “I really loved it and so we did the same on Ferrari.”

RED cameras, the Komodo and V-Raptor, were also used for sequences shot in actual vintage cars, such as the open-wheel car shots at the film’s beginning, where even the Rialto was too big to fit.

The DP’s favourite shot, though, is not from the track but from a quieter moment in which Ferrari’s wife Laura (Penelope Cruz) holds him accountable for his actions.

“She is stately and centred and starts the scene sitting. Then Enzo comes in and orbits her and walks away. She is very strong in the scene, static. We lit her with a very simple top light. The way that Michael staged that scene with Penelope’s performance and Adam moving in and out of the light is my favourite shot.”