Monday, 11 December 2023

AI for M&E: Things Are Going to Get Complicated

NAB

Few industries will be more directly impacted by generative AI than Media & Entertainment, and as the technology moves into its second year, the battle lines are being drawn.

article here

Broadly, the battle will be fought in three arenas: over the legal right to use AI; between open source and proprietary AI tool developers in the tech sector; and between Hollywood studios and the legion of employees from A-list talent to production crew.

The resolution of the strikes, for both actors and writers, has only punted the issue a couple of years down the road. Ultimately, it would seem, M&E is going to be shaken up for better and for worse.

“The last decade in film and TV was defined by the disruption of content distribution and the next decade will be defined by the disruption of content creation,” summed up industry analyst Doug Shapiro in an extensive post on Medium explaining how all aspects of production would be impacted.

In its special report, “Generative AI in Film & TV,” Variety found the tech already beginning to disrupt traditional methods, with generative AI tools currently used to automate some creative tasks. Its impact stands to be positive, Variety concluded, “as it eliminates rote work, speeds project timelines and allows productions to pursue previously impossible creative paths prohibited by constraints on cost, time and even physical reality.”

At the same time, Variety notes that its use promises to reduce the need for certain processes and workers to achieve the same level of output. Spelled out: That’s job losses.

Shapiro breaks down production costs for movies, including the roughly 50% that goes to “below-the-line” crew and production, of which 25-30% is post-production (and of this percentage, mostly VFX). All in all, roughly two-thirds of these costs are labor, he says. “It is a sensitive topic for good reason, but over time GenAI-enabled tools promise (and threaten) to replace large proportions of this labor.”

Practical use cases are already cropping up across all stages of the TV and film production process. These include story development, storyboarding/animatics, pre-visualization, B-roll, editing, VFX and localization services.

How far will this all go?  Even making the relatively conservative assumption that TV and film projects will always require both human creative teams and human actors, Shapiro says future potential use cases include: the elimination of soundstages and locations, the elimination of costumes and makeup and even “first pass editing.”

“In the future, it is likely that editing software will make a first pass at an edit, which can then be reviewed by a human editor,” he suggests. “Similarly, it’s easy to envision an editing co-pilot or a VFX co-pilot that could create and adjust VFX in response to natural language prompts.”

You can argue, as Shapiro does, that we have a “visceral negative reaction” to anything that’s supposed to look human but doesn’t, the so-called “uncanny valley.”

“In which case we will still need human actors, possibly for a long time — but it would also mean that every other part of the physical production process would be subject to being replaced synthetically.”

All of this will likely have a profound effect on production costs. “Over time, the cost curve for all non-Above The Line costs may converge with the cost curve of compute,” is Shapiro’s plausible, if disheartening, conclusion.

The potential for lower production costs would seem a silver lining for Studios, but it also presents a daunting change management challenge.

“Studios should start either by experimenting with non-core processes or developing skunkworks studios to develop ‘AI-first’ content from scratch,” Shapiro says.

Peter Csathy in TheWrap thinks the major studios, faced with mounting Wall Street pressure to transform their business models, will begin to focus on generative AI “to increase output and cut costs.” Early experiments he suggests will include hyper-automation in visualization and initial uses of “Synthetic Performers.” 

Streamers like Netflix, “with Big Tech DNA coursing through their veins,” will lead the way, he says.

Legislation to Tackle and Protect

The EU, the US Congress and individual US states will pass significant AI legislation that directly impacts the M&E industry in the next 12 to 18 months. President Joe Biden’s recent Executive Order points the way.

“Congress will demand that the Big Tech companies behind GenAI give some basic level of transparency about the material on which their large language models are trained,” says Csathy. “Regulators will also try to get ahead of the game — a stark contrast to when they were largely absent when social media rose in popularity and importance (and caused significant harm).”

Csathy expects the creative community to do its best to keep AI companies honest by implementing so-called “forensic AI tech” like watermarking to identify whether relevant creative works were “scraped” or not. That in turn will promote “opt in” solutions for AI training.

Startups to Rival Big Tech

The battle between proprietary AI and open source AI is at its fiercest. Broadly speaking, this is the battle between Big Tech and smaller startups, and it is being fought in the market. Perhaps OpenAI/ChatGPT’s lasting legacy will be in opening up the first bona fide market for AI. In fact, AI — as the musical chairs at OpenAI have shown — is no longer controlled by scientists in the lab but by Wall Street.

CambrianAI analyst Alberto Romero, in his blog The Algorithmic Bridge on Substack, characterizes the debate like this: “The open-source scene is vibrant, full of enthusiasts who firmly believe AI shouldn’t be in the hands of the few and are working relentlessly to make their vision of a better, democratized world through AI a reality. They have detractors who think AI, as a (potentially) very powerful (and thus dangerous) technology, shouldn’t be available for anyone to use.”

He adds, “If the open-source community wasn’t pushing as hard as it is, closed businesses would capture all the value.”

Open source-based startups are also growing in number and in quality of output.

“They’re catching up with the best models, such as GPT-4,” says Romero. “While closed-source LLMs [like ChatGPT] generally outperform their open-source counterparts, the progress on the latter has been rapid with claims of achieving parity or even better on certain tasks. This has crucial implications not only on research but also on business.”

He thinks that the era of extremely large models dominating AI was just a phase and it’s coming to an end.

“Small and cheap is the future,” he says. “Open-source AI is becoming a powerful counterforce to Big AI as more people realize that this tech shouldn’t be in the hands of a few — it’s catching up.”

Csathy thinks Big Tech companies like Alphabet will try to have it both ways. “Desperate to keep up with OpenAI (and Microsoft) Alphabet will relentlessly march on with its AI development while trotting out its new SynthID watermarking solution to quell the creative masses,” he predicts.

“Alphabet throws these bones to the creative community, while its stock price rockets upward and the entertainment industry struggles to monetize amidst its continuing transfer of wealth to the Big Tech players that disrupt it.”

 


Friday, 8 December 2023

Bring Your Own AI if You Plan to Transform the Workplace

NAB

The media industry will regain confidence in 2024, fueled by the rise of generative AI and stabilizing advertising revenue, according to industry analyst Forrester. Google, Meta, and TikTok in particular are poised for a strong 2024.

article here

In its “Predictions 2024: Media And Advertising” report, the analyst foresees that generative AI “will transform Google into the next Google.” It explains that as Google harnesses GenAI, it will help the company sustain dominance as the number one search engine. Forrester conducted a survey with its ConsumerVoices Market Research Online Community and found that 73% of online adults would rely on Google to verify suspect responses from ChatGPT.

“In 2024, Google will leverage its credibility and commercialize its C4 data set to deepen its moat as the crawler and repository of reliable information,” Forrester states.

It predicts that TikTok will gain the lion’s share of linear TV budgets for Gen Z-minded marketers. Citing research that 86% of B2C marketing executives in the US are prioritizing better ways to reach Gen Zs and Millennials, it adds that these audiences are spending their entertainment time on “nonpremium video and gaming environments,” with around 40% of young adults in the US and the UK saying that they’re on TikTok constantly.

In an effort to connect with Gen Z, Chips Ahoy has already moved most of its linear TV budget to social and digital channels. TikTok, not TV and CTV, will dominate media budgets for marketers trying to reach this influential audience.

Enter the Era of Intentional AI

AI dominates Forrester’s list of trends and predictions. The real question is whether GenAI, and AI in general, will live up to the massive amount of hype we’ve seen to date.

Unsurprisingly, Forrester’s answer is yes. 2024 will be another banner year for AI overall, it states, ushering in a new era of “intentional AI,” where gimmicks and technical experimentation give way to more focused and strategic initiatives.

This trend is already underway. Forrester says 67% of enterprises are embedding GenAI into their overall AI strategy.

“Every organization right now is asking themselves how they can use AI to improve their business,” said senior analyst Andrew Hewitt in an accompanying podcast, “Predictions 2024: Where Will AI Go Next?”

“And I think the overall consensus is that they want to be able to use AI in a way that’s very personalized to their specific business and allows them to drive outcomes for their business.”

While that is certainly the end goal for many organizations, Forrester also found that many are struggling to put that together. As a result, it is starting to see organizations contend with a new concept: bring your own AI (BYOAI). In other words, employees bring their own consumer versions of AI tools.

“Of course, the most popular is ChatGPT and using that in different parts of their work,” Hewitt said. “What’s ultimately happening is that while organizations are striving to provide that kind of corporate-sanctioned AI capability or develop that strategy, they’re not able to do it fast enough. That brings in the consumer-oriented services that we believe many employees are going to be using over the course of the next year.”

Forrester’s formal prediction is that in 2024, 60% of workers will use their own AI to perform their jobs and tasks. That’s more than half of the workforce using some form of AI to do a substantive part of their job.

Added analyst Kim Herrington, “That could be a generative AI system or could also mean AI that’s embedded in an application that maybe isn’t sanctioned by the business or that that employee owns themselves. We predict that 60% of workers are going to actually bring their own AI, similar to the movement around bring your own device, and use that for their work over the next 12 months.”

Herrington went on to say that employees will use those tools to automate big portions of their job, whether that’s content generation or summarization of articles or decision support or using it in a sales scenario.

“We foresee that organizations or employees specifically, are going to be successful and improving their productivity over the next 12 months. At the same time, it also introduces a lot of risks from a legal perspective, from a security perspective and from a privacy perspective.

“We’re probably going to see some big blunders from organizations in terms of unsanctioned use of AI by the workforce, leading to some negative business impact, whether that’s a privacy violation, a security infringement, or legal jeopardy.”

Ultimately, that will end up driving organizations towards corporate sanctioned AI capabilities, Forrester argues. “While they’ll definitely build their own policies to manage ‘bring your own AI’ in the workplace, ultimately, it’s going to push them towards developing and delivering a corporate sanctioned version of AI that the workforce can use without jeopardizing security management or legality of the overall AI system itself,” Herrington added.

The rapidly growing and widespread use of AI in the workplace also requires new training programs for professionals. Forrester predicts that 60% of data and analytics professionals will get prompt engineering training in 2024. Prompt engineering is the practice of creating and refining the instructions given to an AI model to get the desired responses.
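To make that concrete, here is a minimal sketch of what prompt refinement can look like in practice. It uses the OpenAI Python client purely as an illustration; the model name and the prompts are invented for the example and are not drawn from Forrester’s report.

```python
# Minimal prompt-engineering sketch: the same request, progressively
# refined with a role, audience, constraints and output format.
# Assumes the OpenAI Python client; model name and prompts are
# illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Vague prompt: the model has to guess audience, length and tone.
draft = ask("Write about our new streaming service.")

# Refined prompt: role, audience, constraints and format are explicit,
# which is the "creating and refining instructions" such training covers.
refined = ask(
    "You are a marketing copywriter. Write three 30-word social posts "
    "announcing a new ad-supported streaming tier, aimed at Gen Z viewers. "
    "Use an informal tone and end each post with a call to action."
)
print(refined)
```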

Yet only 33% of US and UK data and analytics employees say that their organizations currently provide training on how best to communicate with chatbots or intelligent agents via prompts.

“In order to capitalize on AI, not only are businesses going to have to fund AI developments, but they’re also going to have to budget for AI search, training and creation of those different prompts,” said Herrington, as well as budget for data communicators to “evangelize the AI tooling” and act as analytics translators to help people adopt those new technologies.

Deepfakes Dominate Misinformation

With national elections coming up in the US and around the world in 2024, there’s increased concern about generative AI’s role in influencing elections. Deepfake ads will become the primary accelerant for election misinformation, said senior analyst Mo Allibhai, although he noted there was a high bar for bad actors to extend their reach.

“Setting up a publisher website, then actually being admitted to an ad exchange and then drawing audiences to that publisher website… It’s pretty expensive as an endeavor.”

The good news is that Forrester thinks generative AI-created disinformation will fail to alter the course of any national elections, because the real challenge lies in the distribution of disinformation, not its creation.

 


Power to the People: A Social History of the Internet

NAB

The idea that the history of the internet is as significant, maybe more significant, when told through the lens of the users and creators rather than the CEOs of Silicon Valley is the crux of a new book by Washington Post columnist Taylor Lorenz.

article here

In Extremely Online: The Untold Story of Fame, Influence, and Power on the Internet, Lorenz says it is users and creators who hold the power when it comes to social media.

In a conversation with Brock Johnson, host of the WBUR podcast Endless Thread, she explains why she wanted to tell the other side of social media history.

“I think it’s so underwritten and for the majority of the rise of social media, there weren’t reporters covering it. It’s kind of crazy to describe how small this beat remains. At least in 2020, there were more reporters covering Facebook alone as a company than all of internet culture.”

Traditional media have been notoriously blind to shifts in social media, she argues, “and refuse to adapt to them.”

Most people think of the rise of social media as dominated by “Silicon Valley men that really saw the future before anyone else and that’s not true,” she says. “Actually, many times they had absolutely no idea what they were doing or they were sort of saved by specific communities that adopted their products.”

Lorenz argues that “social products” aren’t like other tech products in the sense that the user base is the product. The users have a massive amount of influence over the success of a product because at the end of the day the product is the social network platform that users themselves cultivate.

Put another way, the true value of Facebook or Instagram or — dare we say — Twitter/X is the people who use it.

She maintains that users constantly exert their power on the platform’s erstwhile overseers.

“Look at things like the @ sign or the hashtag or the retweet,” she explains. “These were user-driven behaviors that the product then integrated. YouTube itself started as a dating site, but it was the way that users uploaded videos that the company actually leaned into and sort of adapted to and became this widely successful video sharing platform.”

In the book, Lorenz divides social media into two camps: the entertainment model and the Facebook model.

“In the beginning, there was this entertainment driven model of social media, which was like people using it for fame and attention and to build audiences,” she elaborates. “This was very much the MySpace model. The Facebook model of social media was all about a walled garden. It capped your friends list at 5,000 people, because they didn’t want people using it for fame. It was more about manifesting your IRL connections on the internet through this highly curated experience.”

The Facebook model acted as a bridge to attract people online but, ultimately, the entertainment model of social media has won.

“This is where we have these private spaces for group chats and direct messaging, and things like Snapchat. And then you have the public facing side of things, which is TikTok, basically. If you go back and read MySpace’s marketing materials and compare it to how TikTok markets itself today, they’re shockingly similar.”

Asked how a more equitable and powerful creator economy could be built, Lorenz prescribes first taking the content creator industry seriously.

“[We] need to recognize it as labor and cover it as a labor story. People still think influencing is mostly women taking selfies online. It’s this trivialization of women’s work and of a very female dominated industry. I mean, women built the creator economy. They’re never credited with it. They never get the respect they deserve,” she says.

“If you look at the most highly paid content creators, it’s almost all men. And not only is it all men, it’s mostly white men; there are almost no people of color. LGBTQ people also pioneered this industry and have largely been pushed out of certain areas of it.”

Quizzed on how TikTok treats LGBTQ creators, Lorenz says all major social media platforms behave the same.

“It’s not like TikTok is uniquely censoring LGBTQ people. Look at YouTube. Notoriously de-platformed LGBTQ creators, restricts their reach, says that their content isn’t family-friendly enough. Same thing with Twitch,” she says.

“Same thing for women. Same thing for people of color. All of these marginalized groups struggle on these social platforms because their content is deemed not brand safe. They get mass reported. Nobody cares about their struggles on YouTube or Instagram seemingly. They care about making TikTok the villain because it’s easier to make TikTok the villain than deal with the systemic issues inherent in our landscape.”

The platforms themselves need far greater accountability to stop that happening. “It’s ridiculous, the amount of power that they have,” she says.

Unfortunately, the social tech landscape right now is dominated by Meta, Google, and TikTok (ByteDance) with “no way for smaller apps that are more responsible to compete and to grow audiences at the scale that Meta and Google have.”

Lorenz also calls out the “intense lobbying” power that US social media giants have that “squash the competition so effectively.”

She pins the blame on the US government’s lack of oversight. “[Members of] Congress quite literally have stock in these companies. They want these companies to succeed and they’ve refused oversight. It’s very anti-competitive. Now of course, look at them freak out about TikTok. Not because there’s any inherent problem with TikTok, really. I mean, they pretend that it’s about Chinese ownership. Really, it’s about questioning Facebook and Google’s supremacy in this country.”

 


Wednesday, 6 December 2023

Gavin Struthers ASC BSC / Invasion

British Cinematographer

Gavin Struthers ASC BSC on shooting docs, shooting drama and adapting the look for Apple TV+ show Invasion. 

After a decade making blue-chip British drama like Shameless, Secret Diary of a Call Girl, Holby City, Downton Abbey, Doctor Who and Last Tango in Halifax, director of photography Gavin Struthers ASC BSC switched to photographing US productions like Da Vinci’s Demons, Marco Polo and The Witcher, all while leaning on an earlier grounding in documentaries.

article here 

“What is great about docs is that the drama unfolds in front of you, so you need to make decisions on the fly,” says Struthers. “You are storytelling in real time. When I went into drama, I found it tricky at the beginning. It’s almost too much time, too many options. You have to stand back and see the whole story. You know how a drama is going to end whereas in docs you don’t know what is important to an audience until it’s happening.” 

Struthers has a unique perspective not just on the nature of docs versus TV drama but on the different approaches of productions destined for UK broadcasters or US networks and streamers. 

“Americans tend to understand the importance of good visuals more than producers of a UK terrestrial show. You are allotted a bigger budget and because of the influence of Hollywood your opinion as a DP is listened to more. If it’s not working for the cinematography, they will change the script or a location first. They take a positive approach to getting it done. There’s also more of a team effort to keep that production quality high.” 

Working on shows like Black Sails (Starz), Struthers learned that preparation by DGA directors and production designers was as serious as if they were working on studio features.

“My job begins with page turns with directors where we go through story and pick the dramatic beats in each scene. We discuss at length what a potential location can offer. When we go on recces – they call it ‘casting a location’ – and you don’t feel it would work for the script, then the production will move on. Then comes the problem solving – prelighting a stage, pointing out where I want practicals, deciding where generators should go. We try to deal with any issues that arise so that principal photography, in theory, goes smoothly.”

Most directors, he says, want a location to “offer a sense of drama” which makes one wonder what he makes of the vogue for shooting against virtual backgrounds. 

“A volume definitely has its uses but is no replacement for real locations,” he says. “For example, on Invasion, a volume was very useful for in-car travelling work because it allows the director to do multiple takes with multiple cameras without having to run cars on a low loader or chase the daylight.  

“Also, on the day of shooting, the director will get the best out of the performers in that environment, but it doesn’t mean it’s cheaper. You still have to shoot plates and test in the space to see how the car reflects light. You end up employing a lot more people and having a much bigger footprint on the schedule based on the fact you will get better performances. I wouldn’t say it was something I would push for unless I know it’s a tricky scene to shoot live.” 

Feeling the cinematic force 

Struthers was five when he saw Star Wars at the cinema, an experience which cemented an already burgeoning interest in photography encouraged by his late father. By the age of nine, Struthers was developing and printing his own photographs in the family bathroom.  

“I always had a camera on me, but I was more interested in practical special effects,” he says. 

He built Airfix models and put lights and batteries in them to create alpha mattes of the type that ILM had pioneered for George Lucas. “I was aware of exposure and film stock quite early on, but I was driven by a love of ILM and practical matte paintings. It wasn’t until film school that I decided to study cinematography.” 

The summer before going to NFTS found Struthers working in the call centre of a medical insurance company to pay off university debts. It was “pretty stressful” but inspired an idea for a doc that he pitched to Channel 4. 

“It was the period of drama-docs like 999 and Airline and I thought that what I was doing combined holidays with emergencies, so I wrote up a pitch and sent it to four production companies. Touch Productions bought the idea and hired me as their consultant.” 

Four Days in August was made as part of C4’s Cutting Edge strand and proved formative for Struthers’ career shooting documentaries and drama. Touch owner and director Malcolm Brinkworth guided Struthers on the shoot by writing little symbols on the back of his shoulder while operating camera. “‘Turnover’ would be a circle or ‘cut’ would be a line so I learned quite quickly from him what he found important and where his cut points were. I took that through to drama where you still need to know how it is going to be edited.”  

Invasion  

He shot a lot with ARRI Alexa, including The Alienist and three seasons of Black Sails, but has used RED cameras “pretty consistently” since 2016 when he shot the first season of The Witcher on Monstro 8K VV. He shot Superman & Lois on Monstro and switched from Sony Venice to RED when invited to photograph S2 of Apple TV+ sci-fi Invasion.

“I love the sensor. For me it’s still the most cinematic and filmic around. The other reason I like it is I can have a studio camera (MONSTRO VV 8K), a crash cam and film with drones (using the smaller KOMODO body) all within the same camera system, so I don’t have to swap working spaces and workflow or codecs or lens mounts.”

The impact of a large-scale alien invasion in Invasion is told from the point of view of characters living in different parts of the world. Season 1 used captions like ‘Brazil’ or ‘Japan’ to signpost changes in location but showrunner Simon Kinberg spoke to Struthers about changing this up for season 2. 

“I was hired before the director and production designer, so Simon was my main collaborator when discussing how to break the series down. His chief instruction was that he wanted the pace to be a lot quicker. He wanted our characters to go on action road trips where they were proactive rather than reactive. That meant faster cutting and ideally no text captions.” 

Using lens choices and filters Struthers devised different looks for each setting to give the audience an instant understanding of where they are. A set of Ultra Panatars created images with high contrast; using Auto Panatars gave a softer look. Both are anamorphic so he used spherical Zeiss Ultra Speeds for scenes in Europe. 

“It worked. You could crosscut between them, speeding up the edit, and not be confused about where you were in the world.” 

 


Monday, 4 December 2023

Behind the Scenes: Society of the Snow

IBC

To faithfully recreate a 50-year-old real life plane crash and remarkable tale of survival, the filmmakers behind Society of the Snow combined LED and green screens with multiple practical sets of the plane’s fuselage and put them all 2000+ metres up a mountain.

article here

That the mountain they used was in Spain’s Sierra Nevada rather than the Andes didn’t detract from the effort in terms of organisation, getting the crew and filming equipment there, and adapting to constant changes in the weather.

“We wanted the audience to feel they were really at the Valle de las Lágrimas (Valley of Tears),” explained cinematographer Pedro Luque of the crash location to IBC365. “When the survivors saw the movie they were amazed at how accurate the production was.”

Society of the Snow depicts the crash of Flight 571 and its aftermath — from the day Uruguay’s Old Christians Club rugby team left for a match in Santiago, Chile, to 72 days later when only 16 of them finally came home. Their story has been called ‘the miracle of the Andes’ and was previously filmed as Alive in 1993 by Disney.

Reclaiming the story from Hollywood was one reason director Juan Antonio [J.A.] Bayona (The Impossible) wanted to retell it.

“I had to film Jurassic World: Fallen Kingdom and The Rings of Power to earn the right to direct this story as it was meant to be — in its original language, in the places where it happened, and with the ambition with which we approached the project,” he told Netflix, which eventually funded the project.

“It’s a story that runs really deep in Uruguayan mythology,” said Luque, himself Uruguayan and working for the first time with Bayona. “Everybody knows about it. There is some pride about surviving against the odds but let’s not forget this was a big tragedy for more than half of those who were on the plane. Even those who returned were scarred and while they went on to have families it was tough for them. Unlike Alive this movie is not about heroism. It’s about what really happened.”

Script Development

Bayona adapted the 2008 book ‘La Sociedad de la Nieve’ by author and journalist Pablo Vierci, who was a college classmate of the plane crash survivors. It’s a compilation of interviews with all the survivors, forming an intensely personal recollection of what happened rather than a blow-by-blow account.

“I knew from the start that this was a mutating story, not written in stone, and that we were going to discover how to tell it in the journey of making it,” Luque added.

Developing the script, the production shot tens of hours of interviews with the survivors, opening the possibility of inserting some footage into the film and making it a more conventional docu-drama.

Instead, the filmmakers felt they could better convey the truth of the story by fusing docu-style camerawork with cinema aesthetics. Luque himself is responsible for the photography of films such as Don’t Breathe (2016) and The Girl in the Spider’s Web (2018).

“J.A. and I love classic movies—from Hitchcock to Spielberg to Lawrence of Arabia – but our instinct was also to have a camera ready to pick up and shoot in an instant either because of the changing weather on top of the mountain, or to be super ready for the actors.”

They shot mostly chronologically, during which time the actors stopped eating, just as the passengers were forced into starvation. [The actors had a nutritionist but were deliberately losing weight.]

“The most important thing was to be ready for them and at the same time to imbue this story with a cinematic eye,” said Luque. “It’s a very fine line between realism and the poetry of cinema.

“When we’d finished the film, a friend of Bayona’s said ‘You guys have invented a new genre of expressionistic documentary.’”

Luque described the mountains as “super overpowering, trapping you” and treats them as a character in this storytelling. “These are giants that don’t allow any life to exist and at any moment can act irrationally. That’s something you can feel, but it’s very difficult to convey on camera.”

All of this thought led to them adopting a language that alternates between close-ups and wide angles. “Even inside the plane we use wide shots where everything is in focus because we wanted to capture the feeling that everybody was caught in the same situation.”

Luque took a small crew to the Italian Alps with some actors and wardrobe department and shot with Sony Venice, Alexa (Mini LF and 65) and Reds variously with Leica lenses, Cookes, Arri DNAs and Panavision. Then, in a blind test projected for Bayona, the director chose the Mini LF with Panavision T-Series anamorphic glass expanded for full frame.

A metallic, sterile look

Luque spent six days in the Valley of Tears noting the dramatic and subtle light changes.

“You have a very hard, harsh light and a white that blinds you and suddenly there’s this beautiful pink sheen over everything and the nights are super bright when there’s a moon. There’s a beautiful range of emotions to pick from.”

With the costume and art departments he decided to take the colour green out of everything that happens on the mountain, because it represents life and nature. The mountain is represented by a world of blues and whites. “When the snow gets shiny it has this silvery metallic quality to it,” he said.

“Our characters eat and sleep and talk and they’re all in the same location, so I wanted to have a journey with the colour. One of the things we wanted was to have a very organic look. It couldn’t look digital. That’s a matter of taste but also because it’s a period piece it had to look filmic.”

Luque set the look first with Camera Test Colorist Kike Cañadas at Deluxe Spain and then with his colleague Chema Alba through the DI and grade using a Blackmagic Design Resolve workflow throughout.

“We didn’t want it to appear like ‘found footage’ but as if the film could have been shot then,” Alba told IBC365. He supervised the transfer of the film to 35mm Kodak negative at Cinelab London and then rescanned it back to digital in Resolve where it was regraded. The photochemical step added in some texture and they used tools in the software to dial more in.

“We used to do this 20 years ago but in 2023 this workflow was a present for me. The whole Resolve workflow was amazing. Even though this film was not the most difficult to colour grade we didn’t want to lose that doc feel. I tried to keep the grade basic and balanced to connect to the reality of the film and introduce nothing artificial.”

Production on a mountain

In the bid for authenticity and realism they shot most of the film in the mountains of the Sierra Nevada ski resort in Granada, Spain. There they built a 300ft x 300ft reproduction of the mountain made with scaffold and foam in a hangar on a parking lot 1,000 metres up, a larger set constructed at 2,000 metres and an upper set at 3,000 metres, high enough to risk altitude sickness if not acclimatised.

A specialist mountain crew led by Eivind Holmboe spent two weeks in the Andes shooting background plates and photogrammetry of the Valley of Tears across a 10km panorama. This second unit footage was displayed on a 90 x 21 ft LED volume in the hangar, fitted with 140 ARRI SkyPanels on the roof. The DP and his lighting team could simulate the angle, colour temperature and exposure of the sun captured in the Andes within the environment on set.

“We were trying to match plates shot in the summer from a Valley in the southern hemisphere with photography being shot live in the northern hemisphere in the spring,” explained Luque.

Of the complex production arrangement he explained, “Because we were shooting chronologically we had to jump from interior to exterior and backlot to interior. Even shooting interiors, we wanted to look out of a window, or when a character comes out of the fuselage, we needed that immediate environment to be real.”

This entailed combining digital set extensions with practical sets of the plane fuselage in the context of the real mountainous environment. “We were able to shoot up to 4km of actual rock and snow in front of us combined with rear screen footage of the Andes for set extensions together with interiors shot on the backlot,” he explained.

Since the Sierras were largely devoid of snow when they filmed in early 2022, artificial snow composed of polymer, crushed plastic or cellulose was piped in.

The stage built at 2000m housed an iron and concrete bunker inside which was a fuselage on a hydraulic crane that could be moved up or down by 6 metres to mimic the snow levels before, during and after a huge avalanche and some of the storm scenes.

Another fuselage was transported to the top of the valley, in a location difficult to access for heavy machinery like cranes or dollies, or for a large crew, which made it more like shooting a documentary. Here, Luque mostly used natural light augmented by a couple of HMIs and mirrors.

The crash restaged

When it came to recreating the plane crash, the SFX team talked about how a chain reaction concertinaed the seats, pushing from the back all the way to the front.

They called it ‘the accordion’ and it was the last scene they shot.

“There was a lot of debate about how to depict the crash because to this day nobody knows exactly what happened,” Luque said. “If the plane had flown 10ft lower everyone would have been killed on impact, but 10ft higher and they would have been killed crashing into the next mountain. And because the plane was pulling up, all the power of the engines was extracting air from the inside, so it was not as pressurised as it would have been if cruising; when it did impact it didn’t explode.”

They decided to show the crash from the passengers’ point of view. In other words, with little in the way of explanation or understanding. The plane, having lost its wings, slides down the mountainside and slams to a final halt.

Production built another half of a plane fuselage on a gimbal to reproduce movement, speed and levitation, and to give the actors a physical experience of the accident. The whole fuselage was placed on top of “lungs” so they could mimic the vibrations of a plane on take-off, hitting pockets of weather.

They also built pieces of the plane set - a wall, the carpet, the bathroom with another wall - “little sets that would give us small shots,” said Luque, around which they put green screen and LED screens with a projection of the mountain.

“We did a lot of previz and made a very thorough shot breakdown. For instance, a shot list might require us to take out seats 20-26 and piece 2a from the wall and to bring in a dolly with a 32mm lens. It was very precise in that sense. We shot more than ended up being in the final cut.”

Closer to home

Luque was born eight years after the event but recalls, as a teenager, reading the 1974 book by British author Piers Paul Read on which Alive was based.

“Uruguay is a small country and this kind of event forms the character of our people,” he said.

He revealed that a friend of his wife is a daughter of one of the survivors and that his mother-in-law had a friend who died on the mountain.

“It is close, really close to home.”

In Alive, he added, the dead were given different names. “In this film we use their real names. This was very important for the survivors and the families of the dead. When we screened the movie for them, they told us it was very healing to watch this with the relatives of those who didn’t make it because finally they could feel what happened there.”

 


Sunday, 3 December 2023

Taryn Southern: AI Is Your Full Stack Creative Team and You Are the Director

NAB

“In the time it takes for me to even finish the sentence, AI can create a 4K photograph, craft a pitch deck, produce pop songs,” says content creator and AI artist Taryn Southern. “We knew the robots would eventually come for our jobs but how do we feel about it encroaching on the last bastion of humanity, our creativity?”

article here

Southern has released the world’s first solo pop album composed with AI and directed an award-winning film on the future of human and artificial intelligence. In a video released by Vimeo she shares how you can use AI to craft powerful stories that inspire action and impact.

Like many other creators, her message is that AI is a tool for creation, for imagination, and for saving time and money.

“The only thing that I think we need to fear at this moment is complacency,” she says. “If we don’t learn how to work with the tools, if we don’t learn how to build and synthesize our own ideas with them, then AI could be a serious threat. But if we want to push creative boundaries, if we’re willing to learn and adapt quickly and iterate, I do believe we will thrive.”

She breaks down the steps to creativity. These include “exposure” to what we’ve been taught or educated in; an “operational” component, which is “how we actually get from point A to point B” and also, for a lot of artists, a point of great friction; and our ability to “synthesize,” which means taking an insight from one domain and applying it to another domain.

The fourth component of creativity, for Southern, “is that elusive, magical moment that we all just crave as creatives — illumination. It’s the unexpected ‘A-ha’ moment that happens in the shower when you’re least expecting it.”

Every creator will have their own relationship to these four ideas. She then proceeds to detail how AI can be used to augment or “fill in the holes of our own creative expertise,” which is currently typically done in collaboration “with real life people.”

That’s fine if you have a budget or are part of a business, but what if you’re a DIY creator? That’s where AI really scores, she says.

“Now with AI, you have access to all of these skill sets and perspectives and tools… and you get to collaborate with them on your own time, no budget required.”

She asks us to think of AI as our “full stack creative team” in which you are the director.

“To be a great director, you need to have clarity and specificity in your direction. You also need to have meta-awareness. You’ll be able to filter through the good ideas from the bad ideas, allowing for novel ideas to seep in, and also to separate and improve on each component part of your project before assembling it all back together. That’s really what an incredible director does.”

In order to work effectively with AI, you need to train it on the various components. That means giving some thought to the project goals (am I trying to sell a product? Am I trying to build a brand? Am I writing a musical?); the constraints (resources, time and budget) and the audience you are targeting.
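As a toy illustration of that step, the goals, constraints and audience can be captured once and prepended to every prompt, so each AI “team member” works from the same context. This is a sketch of the general idea, not Southern’s own workflow, and the project details in it are invented.

```python
# A toy sketch of briefing an "AI team": capture goals, constraints and
# audience once, then wrap every task in that standing context.
# The project details below are invented for illustration.
from dataclasses import dataclass

@dataclass
class ProjectBrief:
    goal: str
    constraints: str
    audience: str

    def frame(self, task: str) -> str:
        """Prepend the standing project context to a task prompt."""
        return (
            f"Goal: {self.goal}\n"
            f"Constraints: {self.constraints}\n"
            f"Audience: {self.audience}\n\n"
            f"Task: {task}"
        )

brief = ProjectBrief(
    goal="Build a brand around a DIY music channel",
    constraints="One creator, no budget, two days per video",
    audience="18-30 year-old bedroom producers",
)

# Every request to any tool in the "AI team" carries the same context.
print(brief.frame("Draft a 60-second script introducing the channel."))
```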

It’s all a standard-issue checklist for any serious content creator. Then it’s about selecting your “AI team” from the hundreds of off-the-shelf algorithms out there, capable of everything from generating images from text to transcribing video.

“You are going to find your own just by experimenting. And once you have your AI team members, you can actually select the tools that you feel best represent their skill sets,” she says.

“Once you’ve gone back and forth with ChatGPT, and you have a finished script, you can ‘gut check’ your work to ensure it meets your goals and that of your audience. You can also ask GPT to identify if there are any biases or critical missing pieces of information that you should be aware of,” she continues.

“Finally, it’s time to move on to production. Now I’m not a cinematographer; I have very few skills here. So of course, I started by asking GPT about specific lenses that I can use to inform the starting images in my work. I can take that information over to Midjourney and insert it into my prompts there.”

Southern thinks that, by 2030, AI will enable us to listen to music tailored to our cognitive and emotional needs. “Storylines will shift for our intended physiological states, art will change in real time to optimize our brainwaves,” she predicts.

“So much will be happening, and it will be combined in real time with our physiological signals. But AI is not a silver bullet for creativity. The impact of this tech means we will see a lot of content, a lot of noise and information, and a lot of misinformation and copyright issues,” she says.

“On the other hand AI also will allow us to accomplish incredible feats of the human imagination and empower new ways of being and thinking across the human experience and across our storytelling. It’s really just up to us as storytellers to determine what we want to use it for.”

Friday, 1 December 2023

Lifeline for UK VFX Facilities in Promised Tax Break

IBC

article here

Can facilities survive the current lull until new proposed enhanced tax relief kicks in?

The promise of a more internationally competitive tax scheme targeted at the UK’s VFX industry can’t come a moment too soon for facilities, but lobby group UK Screen Alliance will be holding this and future governments’ feet to the fire to ensure the bill is passed by 2025.

“A number of companies are hanging on with white knuckles onto this timeline,” Neil Hatton, CEO of UK Screen Alliance, told IBC365. “We can see the light at the end of the tunnel with the end of the strikes, and the light is brighter now we have a promise of greater tax relief. But we’ve got to get there first.”

The affirmation of the government’s commitment to supporting the VFX sector in the Autumn Statement is a welcome boon to the beleaguered sector.

“What was announced was a significant step forward in that the previous promise from the government delivered on budget day and repeated in the Creative Industries Sector Vision in July was that it would look at the case for increased support for VFX,” Hatton said.

Looking at something isn’t the same as doing anything about it, of course, but Hatton points to the more definitive statement in the foreword to the Call for Evidence on the UK visual effects sector.

“I can confirm that we will provide more additional tax relief for expenditure on VFX to boost the international competitiveness of the UK’s offer,” the Chancellor stated.

“That’s a firm promise that action will be taken,” said Hatton.

Although the implementation date is April 2025, “it will pass quickly enough with the pace of consultation and legislation”. Indeed, the Call for Evidence is now proceeding on an “absolute breakneck schedule”.

Normally such consultations are given three months, but this one has been allotted six weeks, two of them interrupted by Christmas.

“We’ve got a lot of work to do,” said Hatton who aims to get it submitted before the holiday break.

To compile the submission, UK Screen Alliance is engaging with its members, production companies and large content groups, notably the eight major studios and streamers.

The latter group are specifically being asked questions in the Call for Evidence paper aimed at gauging their appetite for change and where the best places for that change would be.

These questions include:

‘At what point in the production cycle do you choose your VFX studio?’; ‘How does the 80% cap on qualifying expenditure impact your decisions on where to spend your money on VFX work?’; ‘Please provide information about how many productions meet or exceed the 80% cap on qualifying spend’; and ‘How would removing the cap only in relation to VFX spend impact your decisions about where to place VFX?’

As is clear from this framing, the government seems to have grasped the import of addressing the territorial cap.

This has been in place since the introduction of film tax relief in 2006: productions can claim a 25% rebate on up to 80% of their global budget if it is spent in the UK. Once that 80% of the budget has been spent in the UK, you cap out of tax relief.
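A quick worked example shows how the cap bites; the budget figures here are invented for illustration, while the 25% rate and 80% cap are as described above.

```python
# Illustrative arithmetic for the 80% territorial cap on UK film tax
# relief. Budget figures are invented; the 25% rate and 80% cap are as
# described in the text.
RELIEF_RATE = 0.25
CAP = 0.80

def uk_rebate(global_budget: float, uk_spend: float) -> float:
    """25% rebate on UK spend, capped at 80% of the global budget."""
    qualifying = min(uk_spend, CAP * global_budget)
    return RELIEF_RATE * qualifying

# A £100m production spending £85m in the UK caps out at £80m of
# qualifying spend: the last £5m, and any VFX placed on top of it,
# earns no extra relief, so that work tends to move abroad.
print(uk_rebate(100_000_000, 85_000_000))  # 20,000,000.0 (capped)
print(uk_rebate(100_000_000, 70_000_000))  # 17,500,000.0 (under the cap)
```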

As Hatton pointed out, “If you’re coming to the UK to use our excellent crews, locations and studio infrastructure you are already spending that kind of money, so the remaining 20% - which is the most likely to be spent on VFX - is also the most portable part of your budget and will go to VFX studios in other parts of the world that offer incentives for that.”

France, Canada (particularly Montreal) and certain states in Australia all have attractive incentives that VFX shows can and do take advantage of.

An analysis of UK production between 2017 and 2019 reveals that £1bn of VFX expenditure on projects qualifying for UK tax relief was carried out overseas – approximately half of all VFX work carried out on UK-qualified productions in that period.

“The cap is getting in the way,” Hatton said.

The second issue is the rate itself. “Even if a production could do VFX within the 80% cap, our rate of 25% does not compare to the 30%-50% you could get if you shop around.

“We need to do something with the rate.”

The UK does of course have some core non-financial benefits to international filmmakers including a large number of world class artists and technicians working in the English language (“an advantage not to be sniffed at,” said Hatton).

“So, in a sense we need to get close to the financial benefits to make the money argument go away and then we can compete on our creativity and innovation.”

While addressing the cap is “fundamental”, UK VFX facilities want the government to go further.

“We feel that the 80% cap needs to be coupled with an increase in the rate in order to make this really fly. There will be some impact from just removing the cap, but we’re not going to maximise the impact that we could have without increasing the rate.”

UK Screen calculate that the net cost to the Treasury of introducing both changes is “nothing”.

“It creates economic activity which creates tax receipts, so they cover off the cost of providing the incentive - so really this is a no brainer. This is growth in jobs and economic value for no net cost to the Treasury.”

He added, “I think the government are minded to remove the 80% cap. Everything else is what we are proposing and we have no guarantee that they will.”

Studios will begin soon, if they are not already, to plan locations for productions that will shoot in 2025.

“The fact something is in the offing will pique studio interest but they’re not going to jump on a possibility,” said Hatton. “They need something definite.”

Timeline and Labour

Once the consultation closes there will then be a ‘design’ period where the Treasury comes forward with a firm proposition.

The likelihood is this would be published in time for the Spring Budget, after which there will be another consultation with a view to getting it passed through Parliament in autumn 2024.

That’s notwithstanding a General Election in the interim. While the Treasury officials handling the technicalities of this don’t change, the political will might, which is why UK Screen is also lobbying the Labour party.

Hatton said he will engage with the Shadow DCMS team, itself the subject of a recent reshuffle.

“This is all just mitigation,” he said. “The legislation might go through even though the implementation is beyond the next general election.”

Impact on VFX shops

None of this can come a moment too soon for VFX facilities. Many have had a torrid summer because of the actors’ strike, the impact of which will extend a long way into 2024.

“The impact on VFX is delayed because of the shutdown in photography, which is only now restarting. While we are relieved that the strike is over, the problem is that it will take several months before that work rolls into post-production.

“We’ve got a long journey to go and some significant bumps in the road. A number of companies are holding their breath on whether they are going to get through this period. For those able to stay in good shape on other side I think there are reasons to be cheerful.”

No race to the bottom

The studios seem to hold all the cards. Presuming the UK ups its incentives, won’t Montreal or other hubs then go one better?

“We’re not intending to beat the rates available,” Hatton said. “We have other advantages we can compete on. What we don’t want to do is go to the bottom of market. What we do want is to put ourselves back in the window of fair competition.”

UK Screen Alliance has a track record of success. It helped lobby for animated features, not just animated TV, to be eligible to claim a credit rate of 39% (29.25% net after the credit is taxed at the 25% corporation tax rate, and itself a 4.25% rise on the previous rebate), which comes into effect on January 1st. Hatton reports anecdotal evidence of an “uptick in interest in using the UK as a destination for animated feature production.”

Closing the skills gap in the regions

UK Screen is further proposing that the government introduce a 5% differential for productions posting VFX outside London.

“We know there is significant latent talent outside London that could be developed,” Hatton said. “The costs of operating outside the capital are lower and this would be a boost to levelling up.”

Its research suggests that while 95% of VFX jobs are in London only 20% of people employed doing them are actually from the capital.

“Many are from overseas, but in terms of VFX work as a magnet to pull talent from across the UK, we think it shows that if the jobs had been available locally, people would work more locally. If we can create the fiscal environment to change that, that’s what we’d like to do.”

Hatton accepts this is quite a radical proposal but insists that VFX companies headquartered in London would be receptive to the move, possibly establishing outposts in places like Liverpool or Leeds or the nations.

“VFX is mostly not a client-attend business. We showed at the outset of the pandemic that 10,000 people in the UK could be moved to remote working in just a month, so in terms of technological deployment this is quite easy to do. The issue is: do we have the local skills base?”

UK Screen has prepared a skills plan that would do that. If it gets the full package, its plan would see the creation of 3,000 jobs, including 1,000 apprentices supported by the main UK VFX vendors.

“It is an ambitious plan for growth,” insisted Hatton. “We are not just going to the Treasury and asking for more money. This is a plan about investment for growth. We want to make sure the UK is the first-choice destination in the world for productions looking to place VFX.”