Monday, 13 February 2023

How Synthetic Media is Completely Changing… All Media

NAB

Synthetic media, sometimes referred to as “deepfake” technology, is already revolutionizing the creative process as both artists and artisans conscript AI for production assistance. From videos to books and customizable stock images and even cloned voice recordings, one can now create an infinite amount of content in seconds with the latest generative AI technology.

article here

Israeli firm Bria helps users to make their own customizable images, in which everything from a person’s ethnicity to their expression can be easily modified. It recently partnered with stock photo image giant Getty Images.

“The technology can help anyone to find the right image and then modify it: to replace the object, presenter, background, and even elements like branding and copy. It can generate [images] from scratch or modify existing visuals,” Yair Adato, co-founder and CEO at Bria, told The Media Line. 

Another Israeli startup, D-ID, offers Creative Reality Studio, which enables users to make videos from still images. It is working with clients including Warner Bros.

“With generative AI we’re on the brink of a revolution,” Matthew Kershaw, VP of commercial strategy at D-ID, told The Media Line. “It’s going to turn us all into creators. Suddenly instead of needing craft skills, needing to know how to do video editing or illustration, you’ll be able to access those things and actually it’s going to democratize that creativity.”

Kershaw believes that soon, people will even be able to produce feature films at home with the help of generative AI.

In fact, analyst firm Gartner has put a timeline of just seven years on that: it expects that by 2030 a major blockbuster film will be released with 90% of it generated by AI, from text to video.

AI in marketing

The rise of synthetic media has made it easier than ever to produce deepfake audio and video. Microsoft researchers, for instance, recently announced that their new AI-based application can clone a person’s voice with just seconds of training. Called VALL-E, the app simulates a person’s speech and acoustic environment after listening to a three-second recording.

Such generative models are potentially valuable across a number of business functions, but marketing applications are perhaps the most common.

DALL-E 2 and other image generation tools are already being used for advertising. Heinz, Nestle and Mattel are among the brands using the technology to generate images for marketing.

In fact, the global generative AI market is expected to reach $109.37 billion by 2030, according to a report by Grand View Research.

Jasper, for example, a marketing-focused version of GPT-3, can produce blogs, social media posts, web copy, sales emails, ads, and other types of customer-facing content.

At the cloud computing company VMware, writers use Jasper to generate original content for marketing, from email to product campaigns to social media copy. Rosa Lear, director of product-led growth, told Thomas Davenport at HBR that Jasper helped the company ramp up its content strategy, and that the writers now have time to do better research, ideation, and strategy.

Kris Ruby, owner of a PR agency, also tells Davenport that her company is now using both text and image generation from generative models. When she uses the tools, she says, “The AI is 10%, I am 90%” because there is so much prompting, editing, and iteration involved.

She feels that these tools make one’s writing better and more complete for search engine discovery, and that image generation tools may replace the market for stock photos and lead to a renaissance of creative work.

By 2025, Gartner predicts that 30% of outbound marketing messages from large organizations will be synthetically generated, up from less than 2% in 2022. 

 

Coming soon: Generative Synthetic Media

The next evolution is being dubbed Generative Synthetic Media (GSM). Shelly Palmer, Professor of Advanced Media at the Newhouse School of Public Communications, defines this as data-driven synthetic media, created in near real time and surfaced in place of traditional media.

This could happen very soon, following “an explosion” of applications built on top of large language models (LLMs), such as GPT-3, BLOOM, GLaM, Gopher, Megatron-Turing NLG, Chinchilla, and LaMDA.

Again, it is in marketing and advertising where the biggest impact will be felt first. This would range from AI-driven data collection targeting specific audiences to the creation of tailored ads, all in near real time.

Palmer supposes that a natural language generation (NLG) platform would generate a script, hyper-personalize the content (rather than targeting larger audience segments), automatically optimize it for social media, email, or display ads, and then continuously monitor performance, ensuring that the content remains relevant and effective.
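The generate–personalize–optimize–monitor loop Palmer describes can be sketched in a few lines. This is a purely hypothetical illustration, not any real product’s API: `generate_script` stands in for an NLG/LLM call, and the per-channel copy limits are made-up placeholders.

```python
from dataclasses import dataclass

@dataclass
class Ad:
    audience_id: str
    channel: str   # "social", "email", or "display"
    copy: str

def generate_script(brief: str) -> str:
    # Placeholder for an NLG/LLM completion call.
    return f"Ad copy for: {brief}"

def personalize(script: str, audience_id: str) -> str:
    # Hyper-personalization: one variant per individual, not per segment.
    return f"[{audience_id}] {script}"

def optimize_for_channel(copy: str, channel: str) -> str:
    # Illustrative per-channel constraints (placeholder values).
    limits = {"social": 280, "email": 2000, "display": 90}
    return copy[: limits[channel]]

def run_campaign(brief: str, audiences: list[str], channel: str) -> list[Ad]:
    script = generate_script(brief)
    ads = [Ad(a, channel, optimize_for_channel(personalize(script, a), channel))
           for a in audiences]
    # A real system would continuously monitor performance and re-run the
    # loop so the content "remains relevant and effective."
    return ads
```

In practice, the monitoring step would feed engagement metrics back into the brief or the prompts, closing the loop in near real time.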

“It will not be long until the ideas described in this article are simply table stakes,” Palmer declares. “This will start with generative AI creating ad copy and still images (GPT-3 and Midjourney APIs), and then we’ll start to hear voice-overs and music. Next we’ll start to see on-the-fly deepfake videos, and ultimately, all production elements will be fully automated, created in near real time, and delivered.”

He thinks this will take “more than a year, less than three years.”

As it stands today, to use generative AI effectively you still need human involvement at both the beginning and the end of the process. As a rule, creative prompts yield creative outputs. The field has already spawned a prompt marketplace in which one can buy other users’ prompts for a small fee.

Davenport thinks that “prompt engineer” is likely to become an established profession, at least until the next generation of even smarter AI emerges.

 

Deepfake concerns

Synthetic media is also sometimes referred to as “deepfake” technology, which brings with it more worrying connotations. These concerns range from identification of authorship and copyright infringement to the ethically muddy waters of fake news and of training AIs on datasets biased in terms of race or gender.

LLMs, for example, are increasingly being used at the core of conversational AI or chatbots. Facebook’s BlenderBot and Google’s BERT “are trained on past human content and have a tendency to replicate any racist, sexist, or biased language to which they were exposed in training,” reports Davenport. “Although the companies that created these systems are working on filtering out hate speech, they have not yet been fully successful.”

 

AI in law

An Australian court ruled in 2021 in favour of AI inventorship (i.e., an AI system could be named as the inventor on a patent application). However, this has since been overturned on appeal in the Federal Court. Law firm Dentons expects to see many developments and changes globally on this issue.

In the US, the White House Office of Science & Technology Policy has published a blueprint for an AI Bill of Rights, proposing national standards regarding personal data collected by companies and AI decision-making. According to Dentons, regulation of AI decision-making is likely to see continued focus from the federal government following that publication.

The EU is going further still, seeking to benchmark restrictions on the use of AI just as it successfully did with GDPR in its jurisdiction. Expected to be finalised in 2023, the EU AI Act will categorise AI systems as posing unacceptable, high, or low/minimal risk.

As explained by Dentons, unacceptable-risk AI systems include, for example, subliminal, manipulative or exploitative systems that cause harm.

The law firm says, “We look forward to 2023 being a fruitful year in terms of the increase in scope for AI deployment, and also inevitable regulation, with the possible exponential increase in legal disputes relating to AI.”

 

 


Friday, 10 February 2023

Beacon of Light: Roger Pratt BSC

British Cinematographer

article here

The BSC is bestowing its Lifetime Achievement Award in celebration of Roger Pratt BSC. Spanning over 45 films, the DP’s work is a towering list of the epic and the intimate. A founding father of British indie film in the 1980s and equally at home on the biggest studio movies, he counts among his memorable achievements Terry Gilliam’s Brazil (1985), Neil Jordan’s Mona Lisa (1986), Mike Leigh’s High Hopes (1988), and Tim Burton’s Batman (1989).

He made four films with Lord Richard Attenborough including Shadowlands (1993), In Love and War (1996) and Grey Owl (1999), and two Harry Potters, The Chamber of Secrets (2002) and The Goblet of Fire (2005). Pratt also shot The Avengers (1998) and Wolfgang Petersen’s Troy (2004). He earned two BAFTA nominations, for Lasse Hallström’s Chocolat (2000) and Jordan’s The End of the Affair (1999), for which he was also Academy Award nominated.

“I love working and talking with him,” Gilliam tells British Cinematographer. “It has taken too long for directors of photography to recognise Roger in this way.”

Born in 1947 in Leicester, the son of a parish vicar, it seems that movies became a religion for the teenage Pratt after viewing 16mm ‘fact and faith’ films in his father’s church.

“Dad always said that from that moment he understood something of the potential of film,” says May Phillips, Pratt’s daughter. “The mechanics of the projector as much as the films themselves sparked his interest in cinema. He joked that he owes his career to divine intervention.”

At Loughborough Grammar School he made the 8mm B&W short Green and Dying about the school’s students and teachers, single-handedly writing, shooting, and cutting the film which was favourably reviewed in a local magazine.

After a gap year with the VSO in Mali, he went to Durham University then headed back south to study at the London Film School on a grant from Leicester City Council. Working days at Humphries Film Laboratories (and nightshifts in a garage) to earn his union card, Pratt eventually found himself at Chippenham Films, an outfit producing corporate videos for members of the Monty Python troupe.

Python partnership

It was as clapper loader on Monty Python and the Holy Grail in 1975 where he first met Gilliam, who co-directed with Terry Jones.

“We were filming the Bridge of Death sequence (in Glen Coe, Perthshire) and needed a dramatic shot looking up at the bridge with the mountains in the distance,” Gilliam recalls. “I stuck the camera on the edge of the cliff, but the lens wasn’t wide enough. We were a long way from the road, the light was going. It was terrible. This guy said, ‘Just give me a moment’ and in a few minutes, while we were still faffing around, he had run all the way down the mountain, forded the river, run up the other side, into the camera truck, grabbed the right lens and here it was. We stuck it on the camera and got the shot. That was the moment I fell in love with Roger.

“I was convinced that I wanted people like Roger around because I was still learning how to make films and I needed people with some experience. He was also a magnet for good people.”

Having cameoed as the clapper loader toward the end of Grail, Pratt was also cast by Gilliam as ‘Man Living in a Barrel’ while focus pulling for Terry Bedford on comic fantasy Jabberwocky (1977).

It was another step along the traditional route to cinematographer before Pratt debuted as DP on Roger Christian’s The Sender (1982).

“I was pushing him to be a full-time cinematographer, but he was hesitant to move forward too quickly,” Gilliam says. “He was not concerned with advancing his career so much as about learning the craft and feeling very comfortable in himself. I totally respected that. Too many people jump before they can fly.”

Gilliam invited Pratt to photograph the short The Crimson Permanent Assurance, intended to form part of Python’s sketch feature The Meaning of Life (1983).

“When we edited the whole thing, it was clear that Crimson had a very different rhythm to the rest of the film,” he explains. “The others wanted it to be cut shorter – it was like dealing with a studio! I said ‘No’ so we pulled it out and ran it as a short preceding the main feature.”

Pratt’s career now took off. Having assisted on Mike Leigh’s debut feature Bleak Moments while at film school, the director called on him to lens Meantime (1983), a drama produced for Channel 4, and then High Hopes, a drama of a working-class family living in Kings Cross.

“Roger is dry, funny, intelligent and not your average cinematographer,” Leigh says in a My Life Films production of Pratt’s life. “Meantime was a tough shoot on a tiny budget but there was a great rapport between us and really where my memory and relationship with Roger began.”

Problem solving

Bigger gigs followed, notably with Gilliam on cult classic Brazil, The Fisher King (1991) and 12 Monkeys (1995).

“The thing about Roger is that we didn’t spend a lot of time talking on set about the process. We’d meet at our houses, as we did before 12 Monkeys, and on location to talk our way through mood and time or shadow often so I had more flexibility about where to place actors in the shot.

“Filmmaking is about problem solving and he was always ahead of the problems before they raised their ugly head. It meant I could concentrate on whatever else I needed to concentrate on. Roger’s efficiency made that possible.”

Gilliam adds, “For Brazil, I very much wanted to do a German expressionist film and Roger helped me decide on angles, including Dutch angles, and colour. He introduced me to warm and cold light and mixing them in the same shot which is not something a lot of people did.”

This was used to great effect in 12 Monkeys as a palette to distinguish the characters imprisoned underground from the twilight zone of scenes above ground. For the haunting dreams of main character Cole, Pratt worked with the interpositive and internegatives to almost destroy the image for parts of the sequence.

“We printed very light and made an interpositive and internegative which we then overexposed even more,” Pratt told American Cinematographer at the time. “When Cole finally gets to the airport at the end of the film, the look goes back to being high key, with all of the detail put back in." 

To light the vast expanse of Grand Central Station for a set piece in The Fisher King, Pratt came up with the idea of hanging a mirrorball above the central concourse.

“Suddenly the scene is lit like a disco, and I’ve got a thousand people waltzing and the lights are spinning everywhere,” Gilliam marvels. “It was utterly magical.”

Gilliam further credits the DP for introducing him to ultra-wide lenses, which are the signature look of a Gilliam film. “That is what happened on Grail,” he says. “Not only had Roger run through a river but he delivered this 14mm wide angle which I thought was a great approach to portraying characters in an environment and in a space, not just in a closeup.”

He elaborates, “What I like about ultra-wides is that they are claustrophobic and agoraphobic at the same time. I like the way a wide lens bends architecture and forces perspectives so the scale of things is altered. Also, the whole scenario is around you when you look through the camera.

“Of course, this makes lighting much more complicated. There’s no place to hide a light. With a long lens you can put a light everywhere and focus on the area you are dealing with. With other DPs it was always a trickier process but Roger made it seem so easy.”

All the while, Gilliam and Pratt developed a friendship, playing racquetball among other spare time pursuits.

“I don’t tend to have cherished moments - I usually only remember the disasters,” Gilliam says, “but with Roger I remember a lot of good things. He is very much a close friend.”

Industry icon

Oliver Stapleton BSC recalls a problem lighting beer. He had just graduated from film school in 1980 and was a trainee in Neal’s Yard, Covent Garden, opposite the Python offices. He recalls being offered a beer commercial to shoot but had no clue how to light it.

“None. I was there in the office one morning and thought, I bet Roger Pratt knows. I didn’t know him, but he had a reputation as somebody who knew what he was doing because he’d gone the traditional route of learning on the job. I walked over the road, introduced myself and told him my problem. He then spent a good hour showing me lots of tricks such as how to put a soft light to project into the beer to get the colour just right. The next day I went off to shoot the commercial and did exactly what he told me.

“It was so amazing and generous of him to help me - a nobody - to survive another day in the film industry.”

Four years later, Stapleton was shooting Absolute Beginners at Shepperton while, on the next stage, Pratt was filming Mona Lisa. “I never forgot how generous he was to me. We were competitors, I guess, but I was never aware of anything other than respect.”

Sir Roger Deakins CBE BSC ASC reflects on Pratt’s career. “In 1984, Roger Pratt began shooting a film with the initial title of 1984 1/2 as I was finishing up working with Michael Radford on his version of 1984,” he says. “I have been a great admirer of Roger’s work since seeing the amazing results of his collaboration with Terry Gilliam on this film, a combination of Orwell and Fellini, which was released in 1985 as Brazil.”

“I’m delighted that Roger Pratt is to be honoured with the BSC lifetime achievement award this year,” adds Billy Williams OBE BSC. “In focusing on the drama and the storyline [he] has created the visuals for a variety of so many memorable films. I send him my congratulations and warmest good wishes.” 

Praise for Pratt also comes from outside of the cinematography sphere. Kenneth Branagh, with whom Pratt collaborated on 1994’s Mary Shelley’s Frankenstein, says: “One of the few authenticated 100% top-to-toe geniuses in his field I have met. I couldn’t have done it without you. Love and admiration.”

“Roger is one of the creative forces behind not only ‘80s cinema but going on through the ‘90s and up until today,” notes producer Stephen Woolley, talking to BAFTA in 2019. “Roger personified the often-clichéd term ‘painting with light’, and as a producer I feel privileged to work with him.”

In an inscription in a book given to Pratt while making The Fisher King, the late Robin Williams writes: ‘Roger, if you have the smoke, I’ve got the lines. Thank you again, you are incredible.’

Wonderful union

Chuck Finch gaffered for Pratt on 17 movies, including Shadowlands, Inkheart, and The End of the Affair. They met on Brazil, where Finch was Best Boy.

“He is a brilliant cinematographer with a wonderful eye who made lighting look easy,” Finch says. “He gave me some great tips. He’d say, ‘If you can’t paint with light, light with paint.’ In other words, if you want to put a shadow on a wall and you can’t get a flag, then paint the background dark.”

It’s the type of trick that evolved out of German expressionism and was picked up by cinematographers working on American thrillers in the 1940s where it became part of classic noir. In fact, Pratt’s affinity with film noir lighting is picked up time and time again in reviews for his work in films from Mona Lisa to Harry Potters.

The first time Finch heard Pratt use the expression was prepping the bell tower sequences in Batman and lighting Gotham City on the vast Pinewood backlot. Pratt had hired Finch for this, his first gaffer job.

“He never struggled to convey what he wanted because lighting for him was all about tonal separation,” Finch says. “Once you understand that, lighting is easy. But Roger was meticulous.”

Prepping 12 Monkeys in Philadelphia, Finch spent a day with the DP breaking down the light required for each set.

“It was a great learning curve that I continue to carry with me 50 years later. He was my mentor.” On location in Malta for Troy, the pair dined out every night, even staying in the same hotel, so close as friends had they become. “We used to arrange to meet of an evening and we’d chat about film and life, sport, family. About everything and nothing. We were like a married couple really.”

For Pratt, the art of cinematography is as much about the practical as the artistic. “People do have the most flowery language to talk about cinematography, the art of it,” he told The New Yorker.

“For me, there are many problems on a film that have nothing to do with theories of colour or highfalutin aesthetics. Because my job is concerned with big lumps of lights and metal cameras and laboratories, it’s something that makes half of me very pragmatic. I look at myself as a technician… Photography relies on science… Photographs, they’re really just chemicals in labs, aren’t they? Lights on paper. Images in silver halide. But they turn into live things.”

 

 


Wednesday, 8 February 2023

Why Live Production Has to Up Its Sustainability Game

 NAB

article here 

Amid tightening carbon reduction policies to achieve corporate sustainability goals, the live production sector is facing more challenges than most. Excuses and exemptions for the unique aspects of an outside broadcast are no longer acceptable when there are tech solutions available that can drive down emissions while keeping the same quality on-air.

In a white paper exploring the topic, Sony asks “Can live production be sustainable?” and finds that the sector is not yet living up to its sustainability potential.

“In a corporate environment increasingly prioritizing ESG commitments, there are questions surrounding live production’s seemingly low drive towards sustainability in comparison to the wider broadcast industry,” Sony asserts.

“The persisting view is that live production cannot lend itself to this without compromising its output, or at the very least putting it at risk.”

Sony acknowledges that a complete overhaul is risky and expensive, but says there are solutions.

It first identifies the issues then supplies some answers.

The starting point has to be measurement. Companies need to know — “tangibly and with certainty” — the emissions they are producing and through which practices. “Without this knowledge and transparency, the impact of sustainable practices themselves are difficult to quantify, and for those looking to make them, harder to justify.”

There are programs, such as BAFTA’s albert initiative, that can benchmark productions, but there’s an important gap in the data. This gap concerns transport and logistics, the biggest and most polluting cost for any live show. Sony says this is perhaps the biggest challenge to the industry’s sustainability efforts.

“Yet, there is no reliable quantification of their impact. With transportation representing almost a quarter of Europe’s greenhouse gas emissions, 70% of which is road transport, it’s undeniably a significant area of concern,” says Sony.
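The measurement step Sony calls for is, at its core, activity data multiplied by emission factors. The sketch below is illustrative only: the factor values are placeholders, not official conversion figures, and a real audit would use a recognised scheme such as albert’s calculator.

```python
# Placeholder emission factors in kg CO2e per unit of activity.
# These numbers are illustrative, NOT official conversion values.
EMISSION_FACTORS_KG_CO2E = {
    "truck_km": 0.9,             # per km of OB truck travel
    "diesel_generator_l": 2.7,   # per litre of generator diesel
    "grid_kwh": 0.2,             # per kWh of grid electricity
}

def production_footprint(activity: dict[str, float]) -> float:
    """Sum each measured activity (km, litres, kWh) times its factor."""
    return sum(EMISSION_FACTORS_KG_CO2E[k] * v for k, v in activity.items())

# A hypothetical shoot: 1,200 truck-km, 400 l of diesel, 350 kWh of grid power.
shoot = {"truck_km": 1200, "diesel_generator_l": 400, "grid_kwh": 350}
total_kg = production_footprint(shoot)
```

Even a crude model like this makes the white paper’s point visible: with plausible inputs, the transport term dominates the total, which is exactly the data gap Sony highlights.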

The white paper notes that uncertainty around local power supplies at a location leads productions to bring in backups to avoid worst-case scenarios.

“This energy is often underutilized,” it says, adding, “power supplies differ — some filming locations might rely on a fossil-fuel powered generator, whilst others may be set up with more sustainable power sources.”

Similarly, SDI cabling continues to be relied on because of its proven “fail-safe” performance. The last thing a live producer wants is signal blackout. SDI, as opposed to 5G and IP delivery, is an on-site solution which continues to be favored for the sake of consistency and reliability. “Consequently, live production infrastructure continues to output associated carbon emissions.”

Forty-ton OB trucks themselves are deemed the culprit for the bulk of a production’s emissions.

So what can live producers do today that will seriously cut back on carbon? Streamline by going modular, into the cloud and centralizing is Sony’s solution.

“Until now, live production has been defined by preparation for the most complex set ups, while in reality utilizing and needing a fraction of those resources,” the white paper argues. “By nature of this thought process, elasticity has not been a key design philosophy, and systems are built with only the most difficult scenario in mind.

“The result is overprovision, where OB trucks built to broadcast the Champions League final are also used for filming five-a-sides. When live production processes are broken down into modules it will provide operational benefit in the long term.”

Furthermore, by leveraging cloud-enabled IP technologies for the core processing capabilities of formerly monolithic OB trucks, operations can be more agile as the truck is split into functional modules that interconnect through IP, combined and separated based on the individual needs of the production.

In other words, teams can use resources more efficiently, using what they need when they need it, and no more.

Centralizing processes is a solution that doesn’t necessarily rely on strong connectivity or modularity, and is therefore “within close reach for most live production operators,” according to the vendor.

Sony explains that sending production content to a single location that houses the existing team, for processing and broadcast, can both speed up processes and cut down on the logistical impact of OB trucks.

“Production format also comes into play — from recurring studio environments to on-location reporting. Regardless of content, the more productions can leverage a single location, the more efficient utilization of resources.”

Remote work is another path towards sustainability targets. Offering staff the versatility to work on multiple productions in the same time period can help cut down the emissions associated with on-site presence.

Some of these concepts are probably being embraced by every large live event broadcaster, but perhaps not all together or at the speed the planet needs.

As Sony says, “focusing on financial success is no longer enough in order to achieve success.”

It is only with the collective effort of all the individual businesses within the industry that we will move the dial on sustainability.

 

“Kendrick Lamar Live in Paris” Brings Cinematic Production to the Streamed Concert

NAB

Camera technology that started out in the upper echelons of cinema has now become so accessible that digital cine cameras and lenses are being used to photograph sports and music concerts too.

article here

Normally, such cameras are used sparingly, for cinematic depth-of-field cutaways in live sports or in glossily post-produced concert footage.

The video production of the recent Kendrick Lamar tour took this to another level by using multiple digital cinema cameras in a livestreamed outside broadcast.

“Concerts shouldn’t look like a sporting event,” said the production’s director, Mark Ritchie. “They shouldn’t be filmed like one either.”

Perhaps that isn’t surprising given an artist of Lamar’s caliber. The Big Steppers: Live From Paris, part of Lamar’s “Big Steppers Tour,” was streamed live exclusively on Amazon Music and Prime Video from the Accor Arena in Paris this past October.

“We didn’t want to just use a prefab camera plot,” Ritchie explains. “We really wanted to understand what would be dynamic, what would be a great storytelling device, what lenses would feel more immersive versus objective.”

The amount of technology used for the shoot was astonishing, as detailed in the Sony case study. An ARRI Trinity went from the stage to the floor for specifically choreographed moments. Two additional Steadicams, one on stage for fluid live moments, and one in the audience, captured moments with fans. They had a robotic rail-cam system “that acted like a sniper,” able to boom up and boom down precisely while maintaining a beautiful frame above stage height.

They also had a spidercam for very specific cinematic moments, a 25-foot tower camera and Technocrane gliding slowly over the audience that captured waves of hands as it made its way to the stage.

Principal photography was from 16 Sony Venice cameras and Sony’s new cinematic pan-tilt-zoom camera, the FR7.

Ritchie used the Venice at 6K in full-frame, along with lenses like Signature Zooms or Fujinon Premistas and primes.

“The beauty of full-frame is you can see a nice wide shot of a stadium or an arena, but stay focused on the person right there in front of you,” he said. “To be able to control someone’s attention with more shallow depth of field in certain moments is critical to the narrative. I can show you 80,000 people and a massive stage, and by using a shallow depth of field I can ensure the audience stays laser focused on the artist while still offering an epic sense of depth and grandeur.”

He also used Venice in Super 35 mode, allowing him to employ longer cinema zooms and converted broadcast lenses that can offer both tight and wide coverage from all angles.

“One of the biggest challenges in live spaces is distance to the subject,” says Ritchie. “Feature films happen between eight and 20 feet. However, it’s often challenging to maintain the inner ring of close coverage in a live space, especially when you have massive stages and catwalks in excess of 120 feet, while trying not to impede on the audience experience. Having that second ring of coverage is crucial to maintain coverage throughout the film.”

Live Grade LUTs were applied, adjusting exposure and black levels and accounting for any variances between lenses and the environment, which as you can imagine means battling with constantly changing extreme contrasts, bright LED screens, and highly saturated lighting.

“We’re doing that with 16 to 20 cameras in the live space where every one of these needs to be as close to perfect as possible,” adds Ritchie.

“When you’re shooting for a film, you have the luxury of time and an edit. You can just shoot Log and tweak the exposure and color later. But in the live space it’s real-time. In line LUT boxes apply our base look and our truck RCPs control Iris as well as subtle variances between cameras. The cinematographer, DITs, LD and video engineers are all working in perfect sync, safeguarding the image through every crucial step.”
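The in-line LUT boxes Ritchie mentions essentially perform a per-pixel table lookup between camera and output, fast enough to run on every frame. The sketch below is a deliberately minimal 1D illustration; real live pipelines use 3D LUTs on dedicated hardware, and the gamma value here is only a rough stand-in for a Rec. 709-style encode curve.

```python
def build_lut(gamma: float, size: int = 256) -> list[int]:
    """Precompute a 1D lookup table mapping 8-bit code values through a
    power curve (a crude stand-in for a real grading LUT)."""
    return [round(255 * (i / (size - 1)) ** gamma) for i in range(size)]

def apply_lut(frame: list[list[int]], lut: list[int]) -> list[list[int]]:
    # Per-pixel table lookup: no math at runtime, just indexing,
    # which is why LUTs suit real-time live grading.
    return [[lut[p] for p in row] for row in frame]

lut = build_lut(gamma=0.4545)   # approximate Rec. 709-style encode exponent
frame = [[0, 128, 255]]         # one row of sample pixel values
graded = apply_lut(frame, lut)
```

The point of precomputing the table is that the creative decision (the curve) is made once, while the per-frame cost stays at a single array index per pixel, which is what lets the base look be applied "in line" at live speeds.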

 


Tuesday, 7 February 2023

What We Can Learn From 2022’s Most-Streamed Content

NAB

Americans streamed more than 19 million years’ worth of content last year, a total that was at least partly driven by original film and drama, according to Nielsen. Among the most popular hits were Netflix’s Stranger Things and Disney+ animated feature Encanto.

article here

In the ratings agency’s end-of-year streaming rankings, Netflix shows lock out the top ten. Stranger Things came out on top of both original and acquired content as the most streamed TV show in 2022, amassing 52 billion minutes viewed across a total of 34 episodes (spanning all four seasons).

The teen scarer was also the streamer’s second series to cross the billion-hour viewing mark, after Squid Game, notes Fierce Video.

The dominance of original content is underscored even more by the fact that there are only 34 episodes of Stranger Things, while there are 192 episodes of The Office, finds Nielsen.

The overall streaming figure of 19 million years is up 27% over 2021 (15 million years’ worth) but not quite achieving the earlier pandemic record-highs of 2020.

Another notable Netflix title on the originals ranking was Wednesday, taking third place at 18.6 billion minutes streamed despite debuting in late November with just 36 days of availability on Netflix to make the cut for this chart. Ozark came in second in the original-only list (31.3 billion minutes) but fourth place in the overall ranking.

Netflix locks out the top 10 streaming episodic shows, with Amazon Prime’s The Boys coming in at 11 and The Lord of the Rings: The Rings of Power at number 15 (9.4 billion minutes).

When it comes to original movies on streamers, Disney is the winner.

Encanto was the most streamed movie in 2022 with 27.4 billion minutes viewed and taking fifth in Nielsen’s original and acquired streaming ranking. Turning Red (11.4 billion minutes), Moana (8.6 billion minutes) and Hocus Pocus 2 (5.7 billion minutes) were other big hits for Disney+.

As Nielsen’s chart shows, it’s far from all about originals. The long-running procedural drama NCIS was the second most-watched show in 2022, gaining 38.1 billion minutes viewed across 356 episodes.

Nielsen said, “This highlights the immense attraction that library content holds for viewers, who spent billions of minutes throughout the year watching popular titles like Grey’s Anatomy, Bluey, Seinfeld, Criminal Minds and The Simpsons.”

Is There a Dark Cloud Over Cloud Storage?

NAB

article here 


The majority of organizations will spend more on expanding cloud storage capacity in 2023, despite many of them blowing their budgets in 2022.

The “ugly truth” is that enterprises are spending almost as much on storage fees as they are on storage capacity, finds storage vendor Wasabi, which compiled the “2023 Cloud Storage Index Executive Summary Report,” pointing to significant improvements to be gained in billing/fee structures and multicloud deployments.

Wasabi analyzed survey results from 1,000 IT decision-makers worldwide to provide insight into how corporations across sectors including energy, finance, and media are strategizing cloud storage.

Its data confirms the relentless pace of data growth in the cloud, with 84% of respondents expecting the amount of data they store in the public cloud to increase this year.

Today, organizations allocate an average of 14% of their total IT budgets to public cloud storage services. Wasabi expects this proportion to expand as overall IT budgets grow slowly or remain relatively stagnant in 2023 and more dollars are allocated to high-growth segments like cloud infrastructure.

However, more than half (52%) of organizations exceeded their budgeted spend on cloud storage in 2022, illustrating a significant pain point that many users may look to address this year.

The worst offenders were new adopters: 72% of respondents who adopted public cloud storage services in the past 12-24 months exceeded their budget.

The reasons organizations exceeded their budgets include higher-than-expected data operations fees (e.g., cross-region replication, object tagging, transfer acceleration).

Also, migration of “additional applications/data” to the cloud was higher than originally anticipated. Others reported higher API call fees (e.g., reads/writes) and higher data retrieval fees than expected.

“Fees can be notoriously difficult to predict,” said Andrew Smith, senior manager of strategy and market intelligence at Wasabi. “As a result, they are a major reason why more than half of organizations we surveyed said that they exceeded their budgeted spending on cloud storage services in 2022. Understanding the cloud storage bill was the number one challenge associated with cloud storage migration. The survey data also sheds light on one of the industry’s unfortunate truths: A large proportion of storage bills are allocated to various fees. Specifically, respondents said storage fees account for 48% of their total cloud storage bill on average.”
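Wasabi’s point about fees is easy to illustrate with a toy bill. In the sketch below, all prices are hypothetical (loosely modeled on typical hyperscaler-style rate cards, not any provider’s actual pricing); it simply shows how egress and API fees can approach the cost of the stored capacity itself:

```python
# Illustrative cloud storage bill: capacity charges vs operational fees.
# All rates are hypothetical, chosen only to demonstrate the fee/capacity
# split that Wasabi's survey highlights (fees ~48-49% of the average bill).

def monthly_bill(stored_tb: float, egress_tb: float, api_calls_m: float) -> dict:
    CAPACITY_PER_TB = 21.0   # $/TB-month stored (hypothetical)
    EGRESS_PER_TB = 90.0     # $/TB transferred out (hypothetical)
    API_PER_MILLION = 5.0    # $ per million requests (hypothetical)

    capacity = stored_tb * CAPACITY_PER_TB
    fees = egress_tb * EGRESS_PER_TB + api_calls_m * API_PER_MILLION
    total = capacity + fees
    return {"capacity": capacity, "fees": fees, "fee_share": fees / total}

# 100 TB stored, 20 TB egressed, 40 million API calls in one month
bill = monthly_bill(stored_tb=100, egress_tb=20, api_calls_m=40)
print(f"Capacity: ${bill['capacity']:,.0f}, fees: ${bill['fees']:,.0f} "
      f"({bill['fee_share']:.0%} of the bill)")
```

With these made-up rates the fee share lands at about 49% of the total, which is why usage patterns (egress, API calls, retrievals) are so hard to budget compared with raw capacity.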

Wasabi’s survey also confirms that many enterprises are using more than one public cloud infrastructure provider: 57% of organizations use more than one public cloud storage provider.

“Nothing groundbreaking here, but what is interesting are the reasons why many organizations have adopted multiple cloud providers for storage, and what they believe the key benefits and challenges of this type of strategy are,” noted Smith.

Almost 90% of those surveyed indicated that they had migrated storage from on premises to the public cloud within the last year. Interestingly, the top reasons driving migration were not cost related. Instead, users were spurred by the need for better infrastructure resilience, durability, and scalability.

 


Monday, 6 February 2023

The America’s Cup plots course to 1 billion viewers

SVG Europe

article here

When the 37th edition of the America’s Cup (AC37) sets sail in October 2024, it will do so packed to the gunwales with sensors, mics and cameras in an effort to deliver a worldwide audience of one billion viewers.

The world’s most prestigious yacht racing competition moves to Barcelona in 2024 for a challenger series beginning in August to decide which team will compete for the title against defending champions Emirates Team New Zealand.

Announced challengers include Alinghi Red Bull Racing from Switzerland, Luna Rossa Prada Pirelli from Italy, American Magic New York Yacht Club from the USA and K-Challenge from France. 

The 36th edition from Auckland was viewed by 942 million people worldwide. “We will look to beat that by a considerable margin at Barcelona 2024,” declares Stephen Nuttall, the event’s Head of Television. “We intend this to be the most watched America’s Cup of all time.” 

Nuttall – a former senior director for YouTube in EMEA and group commercial director of Sky – explained that the event means to achieve this with technological innovation on the audiovisual side to match the high-tech design and engineering of the boats’ carbon fibre hulls.

“Certainly, we’re intending to have more cameras than we’ve had in the past. We intend to produce UHD HDR coverage – something that hasn’t been done before in sailing, let alone the America’s Cup. It will be in surround sound. It will be a whole new level of coverage.”

As you might imagine, the technical challenges of covering the Cup are considerable, starting with the fact that there can be no cables on the water: every camera is wireless.

“We want to open up the experience of being on an America’s Cup yacht to the widest possible audience,” Nuttall said. “That means more than 10 cameras, a plethora of sensors and 12 waterproofed microphones on every boat and on the sailors, so that viewers can feel what it’s like to be part of the crew.”

The onboards are complemented by two helicopters, each with a gyro-stabilised camera and two similarly equipped chase boat catamarans.  At least one of the camera boats and all chase boats will use a hydrogen fuel cell for power. 

F1 on water 

“We’re looking at putting cameras and mics on the team chase boats to create an F1 pit lane-type experience, but on water,” Nuttall reveals. “That hasn’t happened before.”

There’s a strong crossover between sailing and F1. Some teams, such as Alinghi and INEOS, have established connections with Formula One teams – Red Bull Racing and Mercedes respectively. Dan Bernasconi, the lead designer for Team New Zealand, formerly worked at McLaren.

Nuttall said, “The America’s Cup brings together the best engineers, designers and innovators each team can find to create the fastest possible boat within the designated design rules. It is the spearhead for cutting-edge innovation across industries. The same principles apply for the ACTV team: it is never a case of doing what has been done before with the TV coverage, but innovating and creating new frontiers of broadcast technology.”

The media team will deploy cameras and sensors not just to televise the event but to officiate the competition accurately. If a yacht strays outside the boundary or gets too close to or in the way of its competitor, a penalty is applied that may determine the outcome of the race or indeed the entire contest.   

“We have to do this on a grand scale and with great complexity. An America’s Cup race takes place within an area 4km long and 1km wide, with multiple laps in each head-to-head encounter. We have a fixed time for each race to satisfy spectators and broadcasters, so we need to adjust the course to create the spectacle.”

With some campaigns costing over €100m and with a history stretching back to 1851, the stakes are high: “We can’t afford to make a mistake,” he said. “The America’s Cup is a 172-year-old start-up. Every time there’s a new winner of the trophy, this creates an opportunity to reset the rules of the game.”

Data in six dimensions 

For AC37, the fundamentals of the yachts are the same but the crew is reduced from 11 to eight. The AC75 is an extreme, high-performance boat class: 25m long, the yachts ‘fly’ on foils above the water at up to 100km/h – four times the speed of the wind.

“Nothing is fixed, yet we have to measure the boat in six dimensions: its x, y and z planes and the dynamics of roll, pitch and yaw to a hundredth of a degree, and calculate how far in millimetres the boats are above that ever-moving fluid surface.

“We also track the position of each boat and each of the objects on the racecourse in real time. You can imagine, given the size of the boats, that a very small pitch forwards magnifies into a massive movement at the top of the mast, which is eight storeys high. So it’s really important for umpiring purposes and for the TV graphics that we measure everything very accurately. It is a phenomenal challenge.”
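Nuttall’s point about small attitude changes magnifying at the masthead is basic trigonometry. The sketch below assumes a mast of roughly 26.5m (the article gives only “eight storeys”; the exact AC75 rig height is an assumption) and shows why pitch has to be measured to a hundredth of a degree:

```python
import math

# Lateral masthead movement for a given pitch angle, modelling the rig as
# rotating rigidly about its base. Mast height is an assumption (~26.5 m,
# i.e. roughly "eight storeys"); it is not stated in the article.
MAST_HEIGHT_M = 26.5

def masthead_displacement(pitch_deg: float) -> float:
    """Lateral masthead movement in metres for a small pitch rotation."""
    return MAST_HEIGHT_M * math.sin(math.radians(pitch_deg))

# A hundredth of a degree moves the masthead by millimetres;
# half a degree moves it by nearly a quarter of a metre.
for pitch in (0.01, 0.1, 0.5, 1.0):
    print(f"{pitch:5.2f} deg pitch -> {masthead_displacement(pitch)*1000:7.1f} mm")
```

Even at the hundredth-of-a-degree resolution Nuttall describes, the masthead moves several millimetres, which is why attitude sensing at this precision matters for both umpiring lines and graphics overlays.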

This generates 125 million data points a day. “If you’re a hardcore fan of sailing you can visit our website and build your own dashboard to follow the race in the way you want to follow it. A more casual viewer of sailing will be delighted by the world feed. 

“The America’s Cup, uniquely, is a sailing event that transcends sailing,” he says. “That is our dual challenge. It means we have to think about getting coverage on as many screens as possible, including on free-to-air networks, on our YouTube channel and website. We had 55 broadcast partners last time, covering 190 countries, and record numbers of TV viewers, but it was the online audience that grew the TV audience by 19%. Online is very material growth and a key to the event being open and accessible.”

Content is already being generated in the run up to the event. Media ‘Recon’ teams are embedded with each challenger and the Defender as they prepare for the Cup in their home locations before arriving in Barcelona. 

“There’s an almost daily update that goes out on social media telling fans of sailing what is happening, what the teams are experimenting with in the design of their boat, what training the teams are doing. It’s enabled us to have content all through the cycle as opposed to just the peak.” 

Mission Impossible documentary 

A Netflix-style behind-the-scenes documentary series will further aim to energise audience interest. Currently in the early stages of production, the series has some heavyweight names attached. It is being produced by Skydance, the film studio behind Top Gun: Maverick and Mission: Impossible, and is produced and directed by an Oscar and Emmy Award-winning team led by Jimmy Chin and Elizabeth Chai Vasarhelyi, whose previous credits include Free Solo and The Rescue. The show’s co-producers also produced The Last Dance, ESPN’s 30 for 30 and The Redeem Team.

“Free Solo is a good example of what we want to do with the doc series for the Cup because it’s a climbing film that’s not about climbing. It’s about one man’s struggle to fulfil a life-long ambition. That is very relatable to the America’s Cup. The series will tell how the 100 people in each team devote four years to trying to win probably the hardest-to-win trophy in all sport.”

Nuttall is in the process of working through tenders for provision of all onboard media equipment. In weight terms this equates to 135kg of media kit for each of the six competitor boats. 

For the onshore outside broadcast operation they are again talking to various parties including “local businesses, international businesses, world class operations and we’ll make some announcements about that soon.” 

He was speaking at ISE, a trade show for the audiovisual industry, hosted at the Fira in Barcelona.

The city will become the first venue in the world to host both an Olympic Games and an America’s Cup event.