Tuesday, 11 April 2023

Bringing back the undead

copy written for RED Digital Cinema 

Everyone knows zombies never truly die, but it was still an unexpected treat for fans of The Walking Dead to see characters killed off in previous episodes brought back to ‘life’ in a series of commercial spots aired during the show’s finale.

article here

To celebrate TWD’s 177th and final episode, four infamous characters who died over the series' iconic 12-year run were resuscitated for five ads aired on AMC in a first-of-its-kind marketing moment.

The creative campaign was the brainchild of Ryan Reynolds’ Maximum Effort, which, together with AMC Networks’ Content Room, executed the concept in just two shoot days, capturing all five spots on RED V-RAPTOR XL.

Each of the distinctive ads, for Autodesk, Deloitte, DoorDash, MNTN and Ring, was directed by Maximum Effort’s Bryan Rowland and photographed by LA-based commercial cinematographer Nick Mahar.

“Maximum Effort’s modus operandi is like SNL’s in the sense that they can move very quickly in coming up with inventive ideas and shooting them within days,” says Mahar. “They tend to write up to the last minute and ask for a very fast turnaround which gives the creative an energy that I love to match with camera.”

Mahar has been a passionate RED user since the start of his career and regularly uses different models on projects for clients Marvel, Meta and Google. You could say the camera’s selection for The Walking Dead campaign was a no-brainer.

“Both Bryan and I have been shooting RED for years and on many Maximum Effort projects because we love the color science and versatility of RED RAW,” he says. “That was the case here too but we had other factors in play. From the get-go, Bryan wanted to shoot anamorphic for its characteristics so we needed a large format sensor to capture all the lenses’ glory. We had some heavier VFX in the spot for DoorDash and Bryan was doing the post work, so I wanted to make sure we used his favorite format. The biggest reason to choose V-RAPTOR XL was its internal ND exposure management, which was going to be key to moving very quickly and adjusting on the fly.”

All five commercials were shot across two locations, in order to turn around finished spots in time to air with the finale.

“We had less than a week to prep which we did meticulously at my favorite rental house, Rare Breeds, so we would be ready to go,” says Mahar. “This included location scouting for the DoorDash spot near Northridge, Calif. It was pouring rain and foggy the first time we visited. The second time it was bright and sunny but on shoot day there were 60mph winds so we couldn’t put up huge silks. We planned to have the sun backlighting us to maintain consistency, and the ¼ stop control of ND in the camera was pivotal for this.”

The four other spots were shot on three different sets on a soundstage pre-lit by Mahar, Gaffer Team Bashet and Key Grip Brandy Tannahill.

“The entire team is always on their ‘A’ game and ready for whatever gets thrown at us. The sets were basically ready to go besides some minor lighting tweaks when we arrived that day. Bryan is an incredible director who knows exactly when we have the shot, so we flew through our days. Generally, we shot two cameras including V-RAPTOR VV but if we only did one then the V-RAPTOR XL VV was our go-to.”

The spot for Deloitte involved an office setting with lots of heads in jars on the wall. It starred Laurie Holden who reprised her character of Andrea in The Walking Dead.

“We wanted this to look upscale but apocalyptic with bright colors through windows and all the heads backlit,” Mahar explains. “It was important for us to match The Walking Dead vibe but make it more polished.”

Shooting on RED V-RAPTOR XL 8K VV allowed them to capture in 7K 6:5 at 2x anamorphic. “We wanted the anamorphic look but not something too dirty where everything outside the center frame is distorted. That’s a cool look but harder for VFX. We were after a cleaner anamorphic look.

“We needed to pack a lot into a day so speed was of the essence. That’s why adjusting exposure with the built-in ND meant we had no lag when changing angles. For me, having built-in ND with the V-RAPTOR XL body was a dream I’ve been wanting with my RED for years. Instead of having to go find a filter, load it to the matte box, make sure exposure is correct - I can simply dial it in. It meant we could move fast and pick off shots.”

Mahar adds, “Ryan Reynolds’ Maximum Effort team is one of my favorite creative agencies to work for. Bryan Rowland has been shooting on RED since the company was founded and is the reason they are able to move ridiculously fast compared to other companies. I love the adventure and challenges it brings; this shoot was no different. There’s never a dull moment!”

 


Why It’s Time for Hollywood to Think Seriously About AI

NAB

AI is being introduced to the creative industries at pace and, some fear, at the risk of a loss of control. On the one hand, ChatGPT, Midjourney, DALL-E and others are being marketed as tools to aid the creative process by speeding up time-sapping processes and providing a spark for ideation.

article here 

Not everyone has bought into this narrative, however, and now writers are following artists in speaking out against the wholesale introduction of AI without due consideration for its impact.

“After a cautious approach to ChatGPT-type products, guilds and creators are becoming more vocal about limiting AI’s influence in entertainment,” reports J. Clara Chan in The Hollywood Reporter.

Creators like Cassey Ho, who’s behind the popular fitness brands Blogilates and Popflex, say they’re wary of supporting AI tools that can easily exploit the work of artists.

“I like the idea of it being a co-pilot, but when it’s riding off the backs of creatives, I don’t feel good about it,” Ho said at SXSW, as reported by THR.

The same anxieties around credit and compensation extend into the inner workings of Hollywood, where unanswered questions about AI’s ability to transform the future of entertainment have already informed discussions at unions like the Writers Guild of America and SAG-AFTRA as writers and actors, among others, seek to protect their work and right to compensation.

“Human creators are the foundation of the creative industries and we must ensure that they are respected and paid for their work,” SAG-AFTRA said in a statement on March 17. “Governments should not create new copyright or other IP exemptions that allow AI developers to exploit creative works, or professional voices and likenesses, without permission or compensation. Trustworthiness and transparency are essential to the success of AI. SAG-AFTRA will continue to prioritize the protection of our member performers against the unauthorized use of their voices, likenesses and performances.”

The Writers Guild is also in the midst of negotiations with studios around the use of AI in the writing process, likening tools like ChatGPT to research material like, for instance, Wikipedia. “The WGA’s proposal to regulate use of material produced using artificial intelligence or similar technologies ensures the Companies can’t use AI to undermine writers’ working standards including compensation, residuals, separated rights and credits,” the guild wrote.

Earlier this month, the US Copyright Office declared that AI-”assisted” works could be eligible for copyright protection. It stated: “Based on the Office’s understanding of the generative AI technologies currently available, users do not exercise ultimate creative control over how such systems interpret prompts and generate material.”

Yet this hasn’t assuaged many creatives.

“There’s a fine line between when is something inspiring someone versus when is someone just ripping off or absolutely treading protected intellectual property,” insisted Candle Media’s chief development officer, Brent Weinstein, at SXSW. “AI is going to force us to examine that fine line and rules will be written, and we will all adapt to a new world order.”

Writers won’t be the only ones affected by this new trend. Directors should also be concerned, writes Jason Hellerman at No Film School.

On the positive side, AI could be used to create virtual sets, which could help directors visualize scenes and make decisions about camera angles and lighting before filming begins. AI could also be used to analyze and edit footage, making the post-production process more efficient and cost-effective.

However, AI could potentially replace human directors altogether. “We would instead have computers trying to tell us about the human experience or estimating emotions they are not complex enough to feel. This could lead toward an overreliance on tropes or the points of view of the people who created the AI, which may not be reflexive as a whole.”

When it comes to producing, AI could be used to help producers with tasks such as predicting audience response, optimizing marketing strategies, and even identifying potential investment opportunities.

AI algorithms could analyze audience data to predict which types of films or TV shows are likely to be successful, helping producers make more informed decisions about what projects to pursue. AI could also be used to analyze marketing data and make recommendations about how to reach and engage audiences more effectively.

“In reality, this kind of intelligence might completely eliminate producers,” says Hellerman. “Who needs someone to make calls to package when a computer can send form emails to agents or use its metrics to decide which projects it should be greenlighting?”

To underline the point, Hellerman reveals that the article under his name was largely written by AI, albeit tuned and polished by the author. ChatGPT even mimicked the No Film School website format.

From writing and directing to producing and marketing, AI is being used in various ways to make Hollywood more efficient and effective. “However, with these advancements come potential risks and challenges, such as the loss of creative control and the homogenization of output,” Chan suggests.

The fact is, contends Hellerman, “when giant corporations buy a bunch of Hollywood companies, they are looking for ways to strip the movie and TV process down. How can we employ fewer people and maximize profits? Well, I think they will do it with computer-generated stories and positions.

“That spells less creativity and originality and work for us all.”

 


Monday, 10 April 2023

Citibank: The Creator Economy Will Be Worth $75 Billion by 2024

NAB

The burgeoning creator economy — of content creators monetizing their content directly with fans — is estimated to be worth $75 billion by next year and has caught the attention of big finance.

article here

The findings of banking group Citi may not be revelatory but the fact that conventional finance has charted the market lends weight to arguments made by Web3 exponents that change in the distribution of labor and reward is afoot.

In its report, “The Creator Economy — Getting Creative and Growing (Citi GPS),” Citi calculates there are over 120 million content creators worldwide and expects 9% growth per year through 2024.

Estimating the figure is challenging due to platform overlap. For example, a gamer that creates a live stream on Twitch may also have a YouTube channel and a Patreon account. And some YouTube channels are associated with traditional media firms (like a film or a blockbuster video game).

However, even with these complexities, there are stark differences between platforms. Citi estimates YouTube has nearly 100 million channels (excluding traditional media). Roblox has about 10 million developers. Etsy has about four million sellers. There are currently more than four million podcasts. At the other end of the spectrum, Substack only has a few thousand writers.

Around half of the revenue stems from ad-based video platforms, like YouTube. The other half is spread across a wide array of industries: publishing, education and podcasting, among others. Platforms charge fees that can vary from less than 10% of creators’ revenues (Patreon, Spotify and Unity) to as high as 85% (Roblox). Citi thinks the variance often depends on the value the platform provides across core functions it identifies as creation, hosting, distribution/promotion, and monetization. The more of these functions a platform performs, the higher the fees.

The lion’s share of the revenue is captured by a very small portion of the content creators. Far more than 80% of the revenue is created by far less than 20% of the creators.

How much a creator can make in net revenue after paying fees to the various platforms varies considerably. The report suggests that a writer for Substack generates, on average, $25,000 per year. The average creator that uses Patreon generates $6,000 per year. At the other end of the spectrum, a developer that creates items for Roblox generates, on average, just $50 per year.

Citi concludes that a creator’s average net revenue is inversely proportional to the number of creators that use the platform. That is, while Substack writers generate the most net revenue, the platform has very few writers. YouTube is at the other extreme. While there are many creators on YouTube, the average net revenue per creator is quite modest. Revenue per creator on the other platforms — Etsy, Twitch or Roblox — falls in between.

Going forward, the bank highlights three areas worth watching.

First, traditional social media firms may begin to share some of the spoils with content creators. That is, platforms like Twitter may begin to emulate YouTube’s business model. Or, as Instagram and Facebook push further into e-commerce, they will create opportunities for content creators to share in the economic spoils.

Second, Web3 tools — like blockchain and crypto — will allow consumers to finance, own, and trade content rather than simply paying creators to consume it.

Digital wallets are one way that new Web3 tools can help creators with monetization. Using a subscription model linked to a digital wallet, readers can access creator content stored on a decentralized storage layer. The subscription paves the way for creators to build a wallet-based community that enables far richer engagement than the passive consumption more common with Web2 tools.

“Whether these new tools will be embraced by legacy creator economy platforms or new platforms remains to be seen. But, Web 3.0 tools will certainly result in new innovations that are apt to benefit creators and consumers alike,” the report states.

Third, artificial intelligence may alter the creator economy in several ways including by helping with content creation, helping brands find the right influencer or helping consumers find the right content.

“We expect AI to be used to help bring some order to the highly fragmented creator economy ecosystem. For example, AI will likely be used to help consumers find the right content and can also be used to help brands find the right influencer.”


AI for the Entertainment Industry: What Could Happen vs. What’s Actually Happening

NAB

Previous revolutions that mechanized the labor force tended to disproportionately hit the working class.

Here’s the news: History doesn’t always repeat itself.

article here

“This time the robots aren’t invading textile factories or threatening blue collar workers on assembly lines,” says Jeremy Fuster at The Wrap. “In 2023, they’re storming executive and creative suites, going after educated, white collar workers, what used to be considered the most protected class.”

Ironically, in the near future, the employees with the most job security in Hollywood may very well be the ones who work with their hands, like key grips, electricians and craft service caterers.

“The rest of us will have to learn to either somehow adapt our current jobs to the new AI matrix or — more likely — start looking for other occupations,” Fuster says.

That’s a doomsday scenario but other views are available.

“I think some people in Hollywood are panicking unnecessarily,” Ben Grossman, founder and CEO of VR/AI studio Magnopus, tells Fuster. “But there are a lot of others who look at AI and say, ‘Well, great, this is an opportunity for me to actually make it home in time for dinner for a change.’ Because the demand for content is so high right now, a lot of people are working seven days a week.”

“There’s a lot of fear of the unknown,” agrees Scott Mann, co-founder of AI startup Flawless and director of Fall. “People are frightened of new technologies. But AI has the potential to actually strengthen Hollywood. The industry has been suffering for a long time, but AI could be the solution that saves it. It could be the tool that empowers and enables us all.”

 

In a three-part look at “AI and the Rise of the Machines” The Wrap delves a little deeper into the implications for Hollywood.

Practical Magic

The first part of the series examines the implications of ChatGPT on flesh-and-blood screenwriters. Most don’t seem alarmed.

“I mean, is it possible that we would one day see a film that was entirely written that way?” ponders Sera Gamble, showrunner for Netflix’s You. “But the technology is not there yet.”

Gamble and other writers see AI as a potential helper or a tool to help game out plot points in ways that would otherwise require countless hours of human toil.

“It won’t come up with amazingly original story leaps, but it is helpful to just lay out the most obvious story path so you can tweak from there,” former Amazon and Disney executive Roy Price posted on Twitter.

Elsewhere, the University of Southern California’s Entertainment Technology Center, or ETC@USC, is working on an AI tool that allows content creators to extract features of the content to speed editing.

“Right now, when you shoot a bunch of content, someone has to sit in front of the rushes and tag certain moments,” explains Yves Bergquist, director of the Center’s AI and Neuroscience in Media Project. “We’re making videos searchable by shot types, emotional arcs of the characters, scenes, objects, talent and colors. That should really help producers go through the content a lot faster.”

It’s a time- and money-saving win for everybody involved in the film production process — except, of course, for the person who’s currently getting paid to sit in front of rushes and tag moments the old-fashioned way.

“I mean, there’s not going to be no impact,” Bergquist admitted. “There will be impact in a lot of jobs that are very menial, that don’t involve super-high technical knowledge or super-high creative ability. Probably these jobs are going to take a hit. But I don’t think there’s going to be much job displacement. People are just going to need to educate themselves and ramp up on how AI can help them.”

What about actors — should they be worried about an entirely new synthetic star taking over Tinseltown with no on-set tantrums? Or perhaps they could benefit from sending in their digital twin to preserve their looks on screen when the real thing wrinkles with age.

“People are starting to have those conversations,” says Grossman. “It’s conceivable that in the relatively near future… you could have a famous actor like Michelle Yeoh, and you train an AI on how she looks, how she acts, what she sounds like, and then give guardrails around what her performance should be. That’s what everyone is working towards.

“Right now, it makes more sense in the metaverse and the gaming world because the bar for quality is so high in film and television. But soon we’ll have a level of quality that could be applied in a TV commercial or a movie. Without doubt, that’s going to happen.”

Screenwriters and Actors Guilds Respond

Part two of The Wrap’s report takes a closer look at how the industry’s labor guilds are responding to automation.

Duncan Crabtree-Ireland, national executive director of actors union SAG-AFTRA, believes that if proper guardrails are put in place, AI can be a benefit rather than a threat to its members.

“We definitely recognize that there are real risks to jobs, but past history has shown that resisting technology or pretending it doesn’t exist or hoping things don’t change doesn’t work,” he said. “We need to be ahead of the curve and have a say in how this technology will be used.”

The Writers Guild of America, meanwhile, has made AI part of its recently started contract negotiations, though such talks are mostly to protect members from a point in the future when AI programs like ChatGPT become powerful enough to generate a full script.

The Screen Actors Guild plans to secure those protections by enforcing already existing federal and state laws as well as rules within its own contracts with studios regarding fair use of media and artists’ consent.

In a statement, SAG-AFTRA declared that AI performances based on an actor’s voice and/or likeness fall under the guild’s jurisdiction, and per the National Labor Relations Act, studios wishing to acquire the rights to recreate an actor in AI must negotiate with the guild.

“In addition, any use or reuse of recorded performances is limited by our collectively bargained contract provisions, including those requiring consent and negotiation of compensation,” SAG-AFTRA added.

“If a company decides to start licensing or using AI content based on a performer’s work as part of training datasets for AI engines, then there’s a whole broader social question going on about what that means. Even copyright owners have deep questions about that,” Crabtree-Ireland said.

“One of the reasons why I have a good deal of confidence that we will arrive at the same conclusion with the studios on this is that principle applies just as much to them as it does to us,” he continued. “They don’t want other companies scraping the internet for content created by the major studios and using that as part of training datasets for AI to create other content outside of their systems. So really, the copyright rights that are sort of a key part of this and our contractual rights are very much aligned.”

He continued, “As long as our members are armed with knowledge of how they can take advantage of this new tech and how it can exploit them, AI can be a net positive for actors. This is just the next step of what we’ve always done in this guild, and that’s keep up with the times.”

The WGA wants to ensure that studios “can’t use AI to undermine writers’ working standards including compensation, residuals, separated rights and credits.”

As part of the proposal, the WGA would permit studios to suggest to writers that they refer to AI-generated writing when writing or rewriting a script, but that AI writing cannot be used as the core source material for an adaptation to “create MBA-covered writing or rewrite MBA-covered work, and AI-generated text cannot be considered in determining writing credits.”

David Goodman, current negotiating committee co-chair at the WGA, told The Wrap that he believes copyright concerns surrounding AI weigh heavily on studios, and that this is a major reason why there hasn’t yet been an attempt to use AI in a screenwriting capacity.

 “AI has to read human-made work to understand how to write in a specific style or like a specific author, and most of that is copyrighted. Outside of our own response as a union protecting writers, trying to greenlight a project with an AI-generated screenplay would be an easy target for several lawsuits,” he said.

“But for us, our members have told us that they want this addressed immediately, and it’s already in our MBA that scripts have to be written by a WGA member. That still stands, even with artificial intelligence.”

The third part of the series looks at AI’s implications for journalists.

Will AI threaten the core of our news-gathering culture, which already has been challenged in the past two decades by the Internet?

“Journalists and especially publishers of journalism need to know what AI models are good at doing and what they’re bad at doing,” says Jeremy Gilbert, the Knight Chair in Digital Media Strategy for Northwestern University’s Medill School of Journalism. “They are bad at facts. They are bad at math.”

Assuming the technology is better trained on journalistic output and accuracy, generative AI could be used to create different versions of an article depending on the reader’s individual needs and preferences.

“There are clearly ways that journalism can’t be using AI because it cannot be depended on yet to act like a real journalist,” said Sisi Wei, editor-in-chief at The Markup. “But that doesn’t mean there aren’t many exciting and extremely helpful ways that journalists can use AI as a part of the journalistic process.”

Based on the quotes a reporter has and the kind of story they want to tell, they could, for example, direct a large language model like GPT-4, the latest version of OpenAI’s artificial intelligence model, to churn out a longer or shorter version, or a more linguistically complicated or simpler version, tailored to individual users’ needs. This would make often complex topics like technology and politics more accessible to people.

While Wei said she doesn’t trust AI to generate articles for The Markup — or even for low-stakes writing like automated Little League game articles — she noted that it can be used as a good brainstorming partner.

Even though journalists need to keep in mind that not all the information churned out might be correct, the reporter may find one or two of the suggestions interesting and then ask the model to suggest well-known experts on those topics.

“I think there’s a lot of researching that it can do in a very conversational kind of way and then it’s up to you to go validate that information and then actually go do your real reporting after that,” Wei said.


The Creator Economy Is… Going Through Changes

NAB

Creators were widely touted as being well placed to weather an economic downturn, but they may have to take a longer-term view of their prospects, according to VC firm Antler.

article here

Its latest annual report into the nascent creator economy finds the future uncertain, as platforms struggle to secure large investments and many of the long-tail creators fail to earn significant incomes. Despite these challenges, many people are still optimistic about becoming creators in the future. But Antler believes the industry is moving toward consolidation rather than rapid growth — for now.

“For the first time since its inception, the creator economy is facing a difficult phase,” says Antler’s Ollie Forsyth in a preface to the report. “Brands are reducing their marketing budgets, creator platforms are making a number of layoffs and shutting down, venture capital is becoming more cautious about investing in this area, and the number of platforms achieving unicorn status has plateaued.”

He adds, “Despite the headwinds, the number of people becoming creators is increasing — motivated by their quest for independence, flexibility, creative freedom, and uncapped earning potential.

“We believe the new creator economy will weather this storm in the coming year through two priorities: community building enabling creators to build closer relationships with their fans and creators diversifying their income streams.”

After ramping up rapidly during the COVID-19 pandemic, the creator economy has stuttered this past year.

Per the report, investment in creator economy startups has dropped significantly, with only $180 million invested in Q4 2022 in the US, compared to the $500 million invested in the space in each quarter since Q1 2021.

Antler attributes the decline to factors including the threat of a recession, widespread layoffs, and startups raising at tempered valuations.

Platforms have also seen a substantial decrease in funding in Q4 2022, according to The Information’s Creator Economy Database.

The creator economy is not failing, Antler insists. “It is evolving and changing direction.

Indeed, the creator economy’s market size remains undiminished at $100 billion, the VC firm calculates.

“Creators are still likely to play a significant role in shaping future economies and may even go on to create successful and well-known brands. It is possible that some creators will become unicorn founders — it’s just a matter of time.”

In the near term, though, the report prepares creators to expect a reduction in income, “particularly those who depend on brand partnerships.”

Forsyth warns, “Burnout is a persistent problem for many creators, but many are becoming more aware of the economic conditions and are working to diversify their income sources and audience base.”

As a result, it expects to see platforms make more transparent and binding commitments to creators.

“This year we must see a rise in funding for platforms, which will be tough given the current fundraising environment; [we must see] platforms being more flexible on how they cater to creator needs such as upfront payments; more transparency around how platforms charge creators; and [we must see] the option for creators to own and have direct access to their fans.”

It also anticipates creators will take advantage of generative AI, for example to turn their content (text, video, or voice) into any language “while still being in the creator’s voice.”

While only 33% of the 30+ creators Antler surveyed for this report confirmed they have started using a generative AI product — and some said they were still unsure what it is — Antler expects its use to skyrocket this year.

“I think many of the creator platforms launched in the past few years with sky-high valuations may come to learn that the addressable market is much smaller than they originally believed,” Megan Lightcap, an investor at Slow Ventures, says in the report. “There are only so many full-time ‘professional’ creators who drive meaningful economic value for these platforms.”

“There is still very little transparency — creators don’t know how much they could be making, what platforms they should be using, how they should be defining their brand,” states Faraz Fatemi, investor at Lightspeed Ventures. “In addition, brand budgets have pulled back across the board, lowering the viability of a key creator monetization lever.”

Short-form video content platforms will most likely add e-commerce and shopping opportunities as an avenue for creators to earn additional incomes.

For example, YouTube Shorts now has 1.5 billion monthly active users contributing to 30 billion daily views.

“The opportunity is huge,” says Forsyth. “How these platforms monetize outside of advertising and e-commerce is yet to be seen; however, those are two huge revenue opportunities for platforms to be paying attention to.”

At the same time, creator Sandy Lin warns of the downsides of short-form content that creators need to be mindful of: “Today creators are realizing TikTok and short-form content is not the best way to create an engaging community, monetization isn’t consistent, and constantly creating short-form content is leading to more creator burnout than ever. Creators are reverting back to YouTube to create more engaging audiences. We are heading toward TikTok generation creators treating YouTube as their main platform.”


Friday, 7 April 2023

NAB 2023 Preview: If Cloud Is The Answer, What Is The Question?

IBC

Cloud economics, AI disruption, environmental action and extended reality (XR) are the big themes to watch at NAB 2023.

One of the benefits of the move to cloud is that workflows and costs can be streamlined. However, the very term ‘cloud’ masks huge complexity for CTOs to grapple with, whether in a post house or live content producer setting.

article here

That’s why the prevailing approach is to keep a foot in both camps.

An NAB conference session titled ‘If Cloud Is The Answer, What Is The Question?’ tackles the practicalities of cloud transition. It argues that on-prem “is not necessarily the relic it might have been even a few years ago”. While acknowledging the many advantages of cloud, speakers, including representatives from Ross Video, will identify a general misunderstanding about the problems cloud actually solves, and equal uncertainty about the new issues it creates.

For example, many postproduction owners are wary of cloud economics. David Klafkowski, Founder & CEO, Racoon, said unexpected monthly bills from public cloud “can be pretty fierce” and that “controlling cloud cost is a science in itself” especially for small to mid-sized facilities.

Envy, one of the biggest UK edit houses, said its clients want to work flexibly and that includes going into a suite. Jai Cave, Technical Operations Director, said, “If clients want to offline remotely they can, but three days out of five they want the ability to come into a suite. That’s why, due to our size and the way we work day to day, we are the client’s public cloud.”

At the other end of the equation are massive projects able to take advantage of cloud’s economies of scale. None has been bigger recently than The Lord of the Rings: The Rings of Power, which used cloud to crunch production times and unite hundreds of remote VFX crew, as will be detailed by Jesse Kobayashi, the project’s VFX Producer, at another NAB 2023 session. For all the groundbreaking efforts to perform crafts like colour grading in the cloud, it’s worth bearing in mind that the cost of it all was underwritten by AWS – a luxury that other productions do not share.

Also showcased will be the work of first-time production company NoneMore which used cloud workflows to create the Oscar winning animated short The Boy, the Mole, the Fox, and the Horse.

There’s even a presentation about the cloud workflows used by BBC Studios to deliver live feeds of the Queen’s funeral to international rights holders with limited or non-existent satellite links.

Vendors to watch include LucidLink, whose network-attached storage was used on The Boy, the Mole; Blackbird, which is turning its attention to the millions of digital content creators; and Avid, which might well show a cloud evolution of its familiar edit interface.

NAB 2023 Preview: Virtual production matures at pace

LED screens for virtual production will be conspicuous on the showfloor, but it is the dozens of tools designed to enhance the new filmmaking methodology that will be of more interest. These range from image-based lighting systems that play back content in tune with skin tones (Quasar Science) to set-to-post camera file management software from Adobe (Frame.io) and Sohonet, which recently acquired Fifth Kind for this purpose.

GhostFrame will demonstrate its patented ability to derive four independent images from a single camera frame on the stands of partners Vizrt, ROE, Kinoflo and disguise. This means for example that a presenter can see their autocue or an AR object marker while viewers see only the intended environment.
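GhostFrame’s multiplexing is patented hardware, but the underlying idea, time-slicing the LED wall’s output so that differently synced cameras each record only one of several interleaved feeds, can be sketched in a few lines. This is a purely illustrative model, not GhostFrame’s actual implementation; the feed names are hypothetical:

```python
# Illustrative model of time-multiplexed frame separation: the wall cycles
# through four feeds at high speed, and each camera, shuttered to one phase,
# samples only its own slice of the cycle.

NUM_FEEDS = 4

def wall_output(t, feeds):
    """The LED wall shows feed (t mod NUM_FEEDS) on display tick t."""
    return feeds[t % NUM_FEEDS]

def camera_capture(phase, feeds, ticks):
    """A camera synced to one phase records only that feed's slices."""
    return [wall_output(t, feeds) for t in range(ticks) if t % NUM_FEEDS == phase]

feeds = ["environment", "autocue", "green_screen", "ar_markers"]
# The broadcast camera on phase 0 never sees the presenter's autocue:
print(camera_capture(0, feeds, ticks=12))   # ['environment', 'environment', 'environment']
print(camera_capture(1, feeds, ticks=12))   # ['autocue', 'autocue', 'autocue']
```

In the real system the separation happens at display refresh rates far above the camera’s frame rate, which is why viewers at home see only the intended environment.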

Also look for MRMC to showcase Unreal Ride, an interactive user experience that combines the virtual and the physical, using motion control to create a unique video for guests to take away.

Innovation in the area is happening so fast it is causing a headache for professionals trying to keep up. While more than three-quarters of US filmmakers in a recent survey expect to do at least some work using VP technology this year (57% said they anticipated doing “a lot more”), feedback also suggests that the industry needs to do more to educate users about the technology.

“Training in Virtual Production is much needed from a technical perspective and how to apply it in filmmaking,” said Jonny Persey, director at Met Film School, which works with VP stages at Garden Studios. “The technology and the methodology are changing so fast, which is incredibly exciting in terms of its potential for storytelling but also means everyone is learning on the job.”

NAB 2023 Preview: Sustainability - M&E under scrutiny

After efforts to benchmark the carbon footprint of high-end TV productions, including BAFTA albert in the UK, PEAR in the US and Carbon’Clap in France, attention is now turning to the entirety of a programme’s lifecycle.

With more defined start and end boundaries, calculating Scope 1 greenhouse gas emissions (those from sources a production directly controls) is easier. Accounting for the indirect emissions that occur in the upstream and downstream activities of an organisation (Scope 3) is harder to address.

Whether or not you agree with a recent Sustainability in Video Entertainment report that the video streaming industry’s annual carbon footprint now exceeds that of aviation, it underlines the findings of other studies that the bulk of CO2 in M&E stems from data centres upstream and streaming to devices downstream.

In addition, the pace of AI/ML development and the use of applications like ChatGPT is burning more energy than other forms of computing. A report by Bloomberg blames major data centre owners Google, Microsoft and Amazon, and GPU developer Nvidia, for a lack of transparency about the true carbon cost of AI.

NAB is airing the topic in the panel sessions ‘M&E sustainability in the cloud’ and ‘Why sustainability in media matters’, during which it will also present a series of Sustainability Awards. It is to be hoped that tough questions will be asked, since surely every company, from camera battery maker to content delivery network, needs to do more.

NAB 2023 Preview: AI - the latest disruptor

The genie is not going back in the bottle. Attention in Hollywood has turned from whether to use generative AI towards how to use it in earnest. Opportunities abound, from automating editing and VFX workflows to letting ML models loose on creating entire scenes from text or image prompts.

Sohonet said it trained an AI on 25 years of production data to help it forecast production volumes three months ahead. The results were useful, to an extent. What the AI didn’t factor in, because it hadn’t been trained on the relevant data, is the looming writers’ strike. Generic AI models can’t yet adjust in real time for the impact of news events.

A more pressing concern, highlighted by Josh Glick, associate professor of film and electronic arts at Bard College, is that studios using algorithm-driven predictive analytics may end up ironing out diversity of form, story, and talent.

“It damages the possibility of getting more films out there made by women and by people of colour when studios are just trying to make a film with the least amount of risk,” Glick warned.

AI is not new if viewed with the longer lens of disruptive technology.

“The startup that will solve AI-based rotoscoping won’t be the guys with a workforce of cheap labour in India,” said Sohonet chairman & CEO Chuck Parker. “It will be the startup in India with no labour which solves the problem. They can afford to innovate without destroying their own balance sheet, in contrast to the incumbent who continues to think that cheap labour is a competitive advantage.”

This pattern will repeat at an individual level. He said: “You could be an editor, colourist, audio mixer, production designer - everybody has to figure out how AI is going to be co-opted into your job. The only inevitability is that right now someone else with an AI tool is figuring out how to take yours.”

At NAB, using AI/ML to automate workflows and jumpstart the creative process will be part of the conversation.

NAB 2023 Preview: XR all around you

An NAB session titled ‘Immersive Storytelling’ ponders what’s next for virtual reality (VR), augmented reality (AR) and extended reality (XR). It’s a question that has perplexed some of the biggest tech giants, which continue to pour millions of dollars into research.

As Matthew Ball, a tech investor and metaverse evangelist, recently said, “In 2023, it’s difficult to say that a critical mass of consumers or businesses believe there’s a ‘killer’ AR/VR/MR experience in market today; just familiar promises of the killer use cases that might be a few years away.”

Ball admits that the technology has proved harder than many of the best-informed and most financially endowed companies expected.

He and others, like Meta’s Mark Zuckerberg and Apple, still believe that a smart glasses-style device will provide the best gateway to the metaverse, but no one has been able to design one that packs sufficient battery and compute power, a high-resolution screen and the dozens of sensors required into a package as comfortable as a smartphone.

Perhaps that’s because, as Epic Games’ CEO Tim Sweeney pointed out, we may need not just new technology but actual new science to build an AR platform that’s a substitute for the mobile.

Consumer electronics VR and XR devices might be some way off, although that is unlikely to stop Paramount futurist Ted Schilowitz and Sony Pictures’ SVP, Virtual Reality, Jake Zim from stargazing in the NAB session.

Until the killer AR wearable is invented, immersion will continue to amplify at scale around live events and physical venues. For an example, attendees need look no further than along the Strip to the Venetian, where the MSG Sphere, featuring two of the world’s largest spherical screens, is being readied to open later this year.

Thursday, 6 April 2023

Who Should Own the Data (That You Generate)?

NAB

Each of us is producing exponentially more data than ever before, but do we have any power over it at all? If data (outside health) is the most valuable asset we hold, shouldn’t we be more concerned about taking back control?

article here 

These are big questions which Brittany Kaiser, co-founder of the Own Your Data Foundation, believes can be answered. Delivering a keynote address at SXSW, she urged a concerted effort to reset our relationship with the big tech ad machines governing our private information by embedding ownership of our data in Web3 technologies.

She’s not the first to chart our recent troubled history with data being siphoned off by Silicon Valley giants like Google and Facebook.

“Technology has been designed to be inherently extractive,” she said, “to extract as much value from individuals and pull that value up to the top of the supply chain, [where] multibillion dollar or even trillion dollar technology companies are mostly made up of our digital assets, our personal data, our behavioral data, everything about us.

“But somehow we, as the producers of that, don’t have any access to its value. How in this multitrillion dollar value chain do the producers of most of the value not really have access to that monetary value, let alone the process of the supply chain?”

It’s not just data on what we watch or what we shop for, either. Even when we give permission for apps to work on our devices, we’ve probably given them access to our calendar, GPS, photos and videos, even our camera and microphone, when we’re not using the app and when we have no trust basis with those organizations.

“This is why data rights is one of the most important topics in legislation, in regulation and human rights, in education, and of course, in the design of new technologies.”

Kaiser said there is a movement, of which she is part, to make technology more ethical with more individual empowerment, though admits the conversation at government levels has only just started.

She walked through the steps she thought needed to happen for us all to take back control. This begins with the ability to opt out, which is now possible in Europe and being introduced in the US.

The next step is consent and permission, so that we understand and agree to the purpose for which our data is being used.

“We should be able to revoke that consent as well,” Kaiser said. “So the next step is accountability. A lot of the data architecture that is used in current technology has a lack of accountability, because if data is transferred, if data is shared or if data is deleted, often it’s not possible to tell that that has happened. Using current technologies, it’s very difficult to have that actual accountability.”
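Kaiser’s accountability point is essentially about tamper evidence: with a conventional database, a transfer or deletion can be silently rewritten out of the record. A hash-chained audit log (the basic primitive behind blockchain’s trust guarantees) makes that detectable. A minimal sketch, illustrative only and not any specific platform’s design:

```python
import hashlib
import json

def append_event(log, event):
    """Append an event, chaining it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"event": event, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps({"event": event, "prev": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)

def verify(log):
    """Recompute every link; any rewritten entry breaks the chain."""
    prev = "0" * 64
    for entry in log:
        expected = hashlib.sha256(
            json.dumps({"event": entry["event"], "prev": prev}, sort_keys=True).encode()
        ).hexdigest()
        if entry["hash"] != expected or entry["prev"] != prev:
            return False
        prev = entry["hash"]
    return True

log = []
append_event(log, "data shared with partner A")
append_event(log, "data deleted on request")
print(verify(log))                          # True
log[1]["event"] = "no deletion occurred"    # silent tampering...
print(verify(log))                          # ...is now detectable: False
```

A real blockchain adds distribution and consensus on top of this, so no single party can quietly regenerate the whole chain.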

The next concept she talked about was ownership. Under most laws around the world, we do not own our personal information, she said. “Our personal information is either owned by the government or it is owned by the company that has collected it from us.”

All of this can be built using Web3 technologies to create a different data architecture.

“I really believe that blockchain technology has the ability to scale trust in a way that nothing else has been able to up to this point. In order to know that I can interact with anyone around the world without having to have a human trust between two people, we can build technologies that protect us so that I don’t need to trust the person I’m interacting with.”

Encryption, she said, can be used to make sure that our personally identifiable information doesn’t need to be shared unless we want it to be. It will ensure that every action we take online can be anonymized while our collective behavioral data can be used by companies and governments.
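One common building block for the anonymization she describes is pseudonymization: replacing personally identifiable fields with a keyed hash, so behavioural data stays linkable per person for aggregate analysis while identities can’t be recovered without the key. A minimal sketch under those assumptions, not Kaiser’s proposed architecture; the key custody and field names are hypothetical:

```python
import hashlib
import hmac

SECRET_KEY = b"held-by-the-user-not-the-platform"   # hypothetical key custody

def pseudonymize(user_id: str) -> str:
    """Replace a real identifier with a stable keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

events = [
    {"user": "alice@example.com", "action": "watched_trailer"},
    {"user": "alice@example.com", "action": "bought_ticket"},
    {"user": "bob@example.com", "action": "watched_trailer"},
]

# The shared dataset keeps behaviour linkable per person but names nobody:
shared = [{"user": pseudonymize(e["user"]), "action": e["action"]} for e in events]
assert shared[0]["user"] == shared[1]["user"]   # same person, same pseudonym
assert shared[0]["user"] != shared[2]["user"]   # different people stay distinct
```

Whoever holds the key controls re-identification, which is exactly the leverage Kaiser argues should sit with the individual rather than the platform.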

“The concept of digital identity means that we are able to use an identity that is not linked to our personally identifiable information in blockchain technology,” she said. “Hopefully we are building as an industry enough tools where this is going to be very simple in the future.”

It’s not that data is by itself evil. “Big data can solve a lot of the world’s greatest problems,” she said. “That’s why most of the big NGOs, United Nations departments, governments, militaries, humanitarian aid organizations are all relying on large-scale data sets and data science and data-driven research. The more data we have, the more that we can see patterns, the more that we can predict what is going to happen before it does and intervene.

“So it is of the utmost importance that we as individuals, that our governments and technology companies, start to take these issues incredibly seriously so that we can make sure that the architecture of our digital lives starts to become more congruent with the ability for us to protect our rights.”