Friday 30 June 2023

Max Wang on the making of Ping Pong: The Triumph

for RED

article here

Table tennis, also known as ping pong, is considered the national sport of the People’s Republic of China. The Chinese table tennis team has not only dominated the Olympic Games, but also reigned supreme at the world championships and world cups for decades. Indeed, there have only been a few rare occasions when the nation did not manage to land a clean sweep of gold in major championships. But there was one time, back in 1989, when Sweden usurped China, in a shock which ran to the core of the national psyche, if the new film Ping Pong: The Triumph is to be believed.

This sporting drama, based on the seismic loss at the World Table Tennis Championships and the team’s redemption five years later, is cast by co-directors Deng Chao and Yu Baimei as a tale of the underdog rising to meet a challenge.
Deng himself plays the team's chief coach, Cai Zhenhua, striving to lead his team to win the Swaythling Cup against arch rival Sweden at the World Team Table Tennis Championships in 1995.

“We knew a lot of the drama was going to center on the tables so I had to find interesting ways of conveying the fast moving action and of varying the coverage,” explains cinematographer Max Wang, who previously worked with the directors on Looking Up and Devil Angel. “That led me to think of using FPV drones and overhead cable cams.”

Wang devised a one-minute-40-second single shot that begins with the Trinity operator standing on a GF16 crane before stepping down to walk with the lead actor, using the Trinity stabilizer to follow him down a street.

“In addition, there was a sequence scripted where we needed to suggest how desolate and shocked the players felt after losing a tournament,” Wang says. “For this we thought of deploying bodycams so that the camera is close-up on the players’ faces as they exit the stadium. For these reasons alone we needed a small, lightweight, very high-quality digital cine camera. For me, there was only one choice.”

Wang used multiple RED V-RAPTOR and MONSTRO VV cameras along with RED’s KOMODO. “In most scenes we operated single cam, but for the ping pong scenes I used every camera at my disposal. I assigned my operators to cover different angles and, if we had a spare camera, I’d prepare for the next scene either on a crane or a dolly.”

KOMODO was mounted on the drone for FPV shots. “For me, it is the holy grail of lightweight cameras,” he says.

Born in Taipei, Taiwan, and having moved to the US to study at AFI, Wang didn’t grow up in mainland China during the early nineties, when the movie is set. He says this gave him the advantage of being more objective and able to adapt to the directors’ vision when determining the period look.

“Normally I am the one with lots of ideas for the director about visual look and tone but here directors Yu and Deng had very strong ideas. It meant that I was able to design a look with objectivity rather than it being based on my memories of the early 1990s. We had a long conversation about how it should look. Out of this we decided to use softer lenses. I interpreted this by selecting a range of vintage Lomo primes for the RAPTOR plus a set of super-lightweight Kowa anamorphics.”

The V-RAPTOR’s capability to shoot high speed was another important consideration for the filmmakers. “You can shoot 250fps comfortably with the V-RAPTOR and it was just so flexible since it was always available as an option should Deng or Yu require,” Wang says.

Extreme high-speed shots were captured on a Phantom at 1,000fps, with footage “blending in superbly” with the RED material, reports Wang. Several scenes, such as one in a park, were set at night and lit to keep the real location atmosphere of the street-lamp sources. Others, such as those in the gymnasium where the Chinese team practices, were filmed to mimic the fluorescent lighting typical of such places, illuminated by powerful daylight sources through windows.

"The RED holds up so well in any kind of light. In the park we pumped ISO 1,600 and I knew we could get the detail in the picture. In some scenes, such as when our new team members get off the bus to the practice hall and again in the gymnasium mixes extreme highlights of strong sun with deep shadow and both areas of the picture hold up extremely well.”

Photography took place in 2022 in China when the world was experiencing the challenges of shooting under strict coronavirus prevention measures. Main location work was in Beijing and Xiamen, which is a three-hour flight from Beijing.

“It was a nerve-wracking time since we couldn’t gather many people in one place at the same time,” says Wang. “That’s when communication with my directors in pre-production was so important. When we got to the shoot we knew exactly what we wanted to achieve.”

Tuesday 27 June 2023

Charlie Brooker Speaks Out on Generative AI

IBC

article here

The doyen of dystopia explains how he played with AI to create the latest series of Black Mirror.

Although shooting wrapped on the sixth season of Black Mirror before ChatGPT was launched last November, the show’s ever prescient creator had already toyed with Generative AI and found it wanting.

“Obviously the first thing I did was ask it to come up with a Black Mirror episode to see what it would do,” he told GQ in a pre-season interview. “What it came out with was simultaneously too generic and dull for any serious consideration. There’s a generic quality to the art that it pumps out. It’ll be undeniably perfect in five years, but at what point it’ll replace the human experience? I don’t know if that’ll ever come.”

He related the same story to The Hollywood Reporter and was even more blunt: “The first thing I did was type ‘generate Black Mirror episode’ and it comes up with something that, at first glance, reads plausibly, but on second glance, is shit.”

With this latest series, the first since the pandemic, Brooker wanted to throw out some of the core assumptions of what a Black Mirror episode is.

“When we started doing the show, there weren’t many dystopian sci-fi shows around,” he relates in the programme’s production notes. “These days, you can’t hurl a smartphone across a room without hitting three dystopian sci-fi shows.”

There’s been a conscious effort to rip up the rule book, keep it unpredictable for viewers, and maybe expand its remit.

“It allows us to disorientate the audience. Also, it meant I didn’t have to think, ‘What’s the episode of Black Mirror about NFTs,’ which is an idea that depressed me greatly.”

When Black Mirror isn’t a technology satire, it’s a media satire and one episode in particular merges the two. ‘Joan Is Awful’ explores the idea of a TV network that pumps out AI-generated content targeted at individual people. In the episode, Annie Murphy plays a digital media CEO who turns on her TV one night to find that her favourite app, Streamberry (with deliberate echoes of Netflix itself), has used her personal data to dramatise everything she did that day.

“This is stuff that’s absolutely on their minds at the moment,” he explained to Esquire. “As writers, the thought that we could end up soon with automatically generated entertainment that is endlessly targeted directly at individuals is horrifying.”

The episode, like so much of Black Mirror, is eerily prescient and inspired by his own experiments with Generative AI.

“I toyed around with Chat GPT to come up with a Black Mirror storyline and it will spit out something that at first glance gives you a kind of sudden spike of dread,” he said in conversation with BBC Culture Editor Katie Razzall. “You think ‘That’s it. I’m out of a job… this thing has replaced me because it looks convincing.’ But really what it’s doing is sort of emulating. It’s an impersonator. For now, all it can do is be like Rory Bremner.

“I know that sounds like I’m dissing Rory Bremner,” he continued. “But it’s mashing up stuff that other people have already done. It’s just hoovering up content and repackaging it. And actually, once you sort of sit there and look at it, it’s very derivative. You still need a human to come in and make it usable.”

Brooker thinks the danger for creatives is commissioning executives using Gen-AI to generate “a bit of IP” that doesn’t actually work until you get a human writer in to “make it something that’s saleable or usable. And of course, that person doesn’t get paid or wouldn’t get paid as much.”

Black Mirror might present as a dystopian set of warnings from a switched-on Nostradamus (remember the pig and politician episode in Series 1 on Channel 4 which coincided with Prime Minister Cameron’s own porcine past) but Brooker insists he is no luddite.

Because of the inadequacies of current Generative AI as he finds it, Brooker is optimistic about the value of writers and the longevity of human creativity.

“I think the best, most surprising ideas emerge when you’re relaxed and ostensibly supposed to be thinking about something else, and suddenly your mind wanders into a room it hadn’t planned on entering,” he observes in the production notes. “Something like ChatGPT can’t do that. It has no genuinely original ideas of its own: it hoovers up material other people have already written – without paying them for the privilege – and attempts to pass itself off as human.

“I can see its eventual potential value as part of a human writer’s toolkit – a sort of author’s equivalent of the ‘autofill’ tool in Photoshop – but certainly for now the stuff it creates only looks imaginative to people with no imagination of their own.”

Sir Paul McCartney has apparently used AI to polish and augment John Lennon’s vocals from a demo recorded in 1979. Brooker suggests that using AI to generate more Beatles music, whether by McCartney or anybody else, is to be welcomed, but that the value and interest we derive from it stems from The Beatles’ history and our curiosity about them.

“What’s of interest is that it’s about these human beings who made all this stuff that we love, and their personalities and their stories is actually what we’re actually interested in there. I don’t know if creatively it will take the place of human creators because I think we’ll always be interested in people in that narcissistic kind of way.”

Referring back to the ‘Joan Is Awful’ episode, he talks about the streaming platform featured in the series, Streamberry.

“Put it this way, if we weren’t showing the series on Netflix, we would be being sued by Netflix for how it looks in terms of corporate identity,” he said in the programme notes. “We asked if we could ape the front end of Netflix, and they said yes. It’s not quite biting the hand that feeds because it’s funny. It meant we could throw in lots of easter eggs in the show. You could argue that Black Mirror is a fictional universe within the Streamberry platform.

“If anyone asks me, ‘Is it a shared universe?’ I can say yes, and that can also be my get-out clause for any inconsistencies across the series. I wrote (another series 6 episode) ‘Loch Henry’ before ‘Joan Is Awful’ and had to go back and retrofit the idea of Streamberry into the show.”

 


Monday 26 June 2023

Neal Stephenson: Gaming Has (Almost) All the Tech We Need to Make the Metaverse

NAB

With the arrival of Apple’s new Vision Pro XR headgear it’s worth asking if we are any nearer to building a next-gen version of the internet.

article here

It’s too soon to know, sci-fi author and technologist Neal Stephenson told the XR community in a presentation at AWE Live, although the building blocks of the metaverse are taking shape. View the full talk in the video below.

These building blocks include the sophistication of game engines and “the fact that those game engines can be downloaded and used for free,” advances in the power, and falling cost, of the hardware needed to render three-dimensional imagery in real time, and — just as importantly — the user base.

“The people who’ve learned how to navigate 3D environments by playing video games must be past the one billion mark by this point,” he said.

However, if the next generation of the internet is to be built on a model that decentralizes power and reward (which is a good thing in Stephenson’s mind), then support for those creators needs to be addressed.

“Right now, the skillset that’s needed in order to create metaverse experiences is basically what you see in the game industry. People who know how to use game engines and who know how to create the assets that feed into those game engines. We need to create the economic basis for them to get rewarded if they succeed in creating metaverse experiences that lots of people are enjoying.”

Lamina 1 is Stephenson’s own attempt to do this. It’s a blockchain on which to build the infrastructure of an open and decentralized metaverse that puts technology in the service of humans, not the other way around.

There’s no escaping the obligatory question these days on AI and its potential impact on the creative process. Are we getting close to being able to ingest an adventure book into AI, The Lord of the Rings for example, and generate a 3D world based on that story with characters and then be able to play it in VR?

Stephenson (of course) knows the folk at Weta VFX in New Zealand, which helped make The Lord of the Rings movies for Peter Jackson.

“If you watch those movies, one of the things that makes them great is the personal attention and care that’s lavished on every single detail of every costume and every prop,” he says. “So I don’t think we’re going to see work of that quality coming out of AI. Just because it requires that you do original thinking and come up with something different.”

Of Snow Crash and its Nostradamus-like foresight, he says, “I’d say I was the first to use that word [metaverse], not the first to imagine that. I mean, as I’m sure you know, with your background in the field, there were people thinking about similar systems before I wrote the book. Habitat being one example.

“The metaverse as described in Snow Crash was my best kind of guess as to what a mass medium based on 3D computer graphics might look like, but the metaverse per se in the book is neither dystopian nor utopian, or at least that’s how I meant to portray it.

“In the opening pages of the book our initial exposure to the metaverse is kind of very mass market, lowest common denominator, sort of crude, obvious, like the kind of the worst of television. But later on, as we get farther into the book, we see that people have used it to make beautiful works of art. And we see that there are some people who have lavished a lot of time and attention on making homes in the metaverse that are exquisite works of art.”

 


Behind the Scenes: Glastonbury 2023

IBC

article here

With more than 40 hours of TV plus 85 hours of live radio in addition to digital streams from the five biggest festival stages, Glastonbury 2023 delivered record coverage live from Worthy Farm. For the BBC teams assigned to produce it, however, Glasto is far more than one giant promo.

“It is a massive commitment from the BBC in terms of cash and time on air and a reflection of just how important Glastonbury is as an arts festival to the culture of the nation,” said Peter Taylor, BBC Studios, Head of Operations. “We don’t think it’s possible to tell the story of Glastonbury from just one point of view. An arts festival this big is experienced by people in so many different ways so it’s our job to try and give a flavour of that for those who can’t be there.”

Taylor has been involved in the BBC’s Glastonbury productions for the last twenty years. He and Alison Howe, Executive Producer for BBC Studios, work on the event on and off all year round.

The build of the operation, including trackways, fencing and production offices, is installed two weeks out, but the technical outside broadcast rig only rocks up on the Tuesday before broadcasting begins on Thursday evening. Even then it can take 5-6 hours for OB trucks to travel from the entrance gate to their correct position.

This year’s presentation operation was delivered by festival stalwarts Timeline TV. Its largest scanner on site handled presentation for BBC One and BBC Two. A second Timeline truck managed presentation for BBC Three and BBC Four, and a smaller one was dedicated to iPlayer. It supplied two radio cameras for coverage across the site and a roving cable camera for filming in the Pyramid stage pit. Timeline also supplied cameras for the presentation positions, including at The Park, near Worthy Farm looking over the Pyramid stage, and a teepee in the main broadcast hub used primarily for iPlayer presentation.

“It is an enormous machine,” said Gareth Wildman, Timeline’s Head of OB. “If you break it all down no individual part is particularly complicated but the scale of it is considerable when you consider how much we are broadcasting from a farm in a field.”

New for 2023, following a tender for the operation, are Cloudbass, which provided OB facilities for three stages including the Pyramid, and Vivid Broadcast, which had responsibility for the Woodsies (formerly John Peel) and The Park stages.

Feeds from all five broadcast stages were routed to Timeline’s on-site hub for production. The on-site editing operation, run from four Avid suites, produced all the non-live content clipped into various programmes and dotted throughout the weekend’s TV schedule. For this, Origin Broadcast supplied EVS systems and trucks.

The entire site is ringed with a fibre network first installed in 2013. “One thing I’ve learned is that farmers like to move dirt around and dig holes and fill them in,” said Taylor. “Our fibre infrastructure has been hit on many occasions by a tractor ripping it out.”

That was reinstalled last winter along with a complete back-up circuit, both of which are managed on site by Timeline. Twenty-two circuits comprising various stage feeds are transferred off-site over two diverse IP networks to Bristol and to London for transmission (backed up by a satellite uplink managed by Timeline).

The connectivity also handled talkback between stages and carried BBC radio audio back to its on-site hub. Bandwidth is also shared with the festival (for internet access and electronic payment) as data transfer across the whole site has become much more important.

“The upgrade to the fibre network features more connections to various places and some relocation of fibre termination points,” said Wildman. “The festival is quite organic. Things change all the time.”

Handling UHD HDR

All coverage of the Pyramid stage, which hosted headliners Arctic Monkeys, Guns N’ Roses and Elton John, is UHD HDR to feed the dedicated UHD channel on iPlayer which first launched in 2022.

“Since the majority of viewers are still HD we have to make sure the picture quality is maintained for that audience,” said Taylor. “This will likely stay the same for many years. We’ve only just moved off SD for some areas.”

“The important thing is that the pictures are racked and exposed for those viewers watching in HD SDR, but the benefits of extra colour and extra exposure are available for those watching in UHD.”

A BBC-developed LUT is applied to the UHD signal, which is then down-converted for HD viewers. The same workflow was used for the Coronation and major sports events.
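
For illustration only, the shape of that down-conversion step can be sketched in a few lines of Python driving ffmpeg. The LUT file name (hlg_to_sdr.cube) is a hypothetical stand-in for the broadcaster-supplied LUT, which is not published here, and a real live chain would apply the mapping in hardware or in the playout path rather than as a file transcode; the principle of one graded HDR source feeding both the UHD output and a derived HD SDR output is the same.

import subprocess

# Illustrative sketch only: down-convert a UHD HDR master to HD SDR via a 3D LUT.
# "hlg_to_sdr.cube" is a hypothetical file name standing in for a broadcaster-supplied LUT.
def downconvert_uhd_to_hd(src: str, lut: str, dst: str) -> None:
    """Apply a 3D LUT for the colour mapping, then rescale 2160p material to 1080p."""
    cmd = [
        "ffmpeg", "-y",
        "-i", src,
        "-vf", f"lut3d=file={lut},scale=1920:1080",  # LUT first, then the resolution change
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        "-c:a", "aac",
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    downconvert_uhd_to_hd("pyramid_uhd_master.mxf", "hlg_to_sdr.cube", "pyramid_hd_sdr.mp4")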

“It’s becoming normal but still requires some extra work and quite a lot of additional kit to make it work,” Taylor said. “If you get the core right then the peripheral will sort itself out.”

Each stage also had its own giant screen display operated by teams separate from the BBC. This featured shots from additional cameras, but resources were shared in order to keep the amount of infrastructure down.

“Rather than filling the venue with extra cameras there’s a lot of interchange,” Taylor said. “There’s an awful lot of conversion and monitoring between what we are able to take from them and what we give to them to cut into their mix.”

The quality of pictures from the Pyramid stage has improved since the introduction of HDR. The stage faces away from the sun and has traditionally proved tricky to film.

“It’s a black box into which the camera points,” said Taylor. “When you come wide it is completely silhouetted by a sunset - which can look absolutely glorious but honing those pictures so that they convey what’s going on onstage while exposing for the sun is not without its challenges. The teams we work with are used to that and HDR makes it zing even more.”

Other stages present the reverse problem, with a huge amount of sunlight shining directly into them so the audience is backlit. “If you think of those festival shots of hands in the air, waving flags on a lovely summer’s afternoon, that is what people are tuning in for,” Taylor said. “A chunk of British summer.”

‘Vanilla’ camera plan

Glastonbury’s reputation has grown as coverage has expanded. Last year Sir Paul McCartney’s performance on BBC One reached a peak audience of 3.9 million and Diana Ross’ performance peaked at 3.8 million. Although the ‘authored’ programming on the main channels remains solid, it is streaming which shows the event’s televisual future. Last year’s event saw record-breaking digital audiences. Streams increased by 116% on iPlayer and 205% on BBC Sounds compared with 2019.

There are around 64 official BBC cameras on site, which is not dissimilar to previous years. Acts on the Pyramid stage are recorded with 12 cameras, which a few of the headline acts may augment with cameras of their own.

“Because there are so many different acts and stages we have to have kit that works for any band. So, each stage has a ‘vanilla camera set up’ if you like.”

Howe hints at some “additional toys” this year but the Glasto team are generally more restricted in terms of camera gear than at a major sports venue. There’s no wirecam, for example, due to health and safety.

“I quite like not having a zillion cameras,” Howe said. “Glastonbury coverage operates on a level appropriate to the environment we are in. Its position in the landscape with the artists and the crowd lends Glastonbury a unique visual identity. We want the viewer at home to feel part of it and when you have the right operators and right directors you don’t need lots of tricks to do that.”

The camera and director teams cover ten bands per day on each stage. “When each new band comes on stage it should feel like a new chapter in the story of Glastonbury 2023,” said Taylor. “For us, just as important as capturing the artists themselves is capturing the interaction that happens between a band and the audience. We want it to have very much a live-event feel.

“We also have teams filming everything away from the stages such as the Kidzfield (for under 12s) or the Greenpeace Field or any of the myriad other elements on the site.”

Sound mix with artist involvement

Because of the scale of the live broadcast across radio and TV, the Glastonbury production uses one stereo sound mix, which is produced by BBC Radio. Even the UHD channel can only deliver in stereo but, as Taylor points out, “what would you gain by having 5.1 or Atmos audio through rear speakers? That’s the bit you don’t want to hear, the audience coughing and cracking open a beer.”

Representatives from the acts regularly join the BBC sound mobile truck to advise on what their artists expect.

“More artists want to spend more time on the prep with us each year because it’s important to everybody concerned that Glastonbury is a success,” said Howe. “Some artists want to talk through every element and join rehearsals and have a lot of meetings and specify things. Those acts in the middle of a major touring schedule tend to be ‘in the zone’ while others performing live as a one-off will approach it differently.”

Summer solstice – what could go wrong?

After a hiatus during the 1990s when Channel 4 took over (recorded) coverage, the BBC has had a quarter of a century of uninterrupted broadcasts of the festival. It’s a valuable relationship, said Howe.

“The BBC team is part of the whole festival now so if one of our team finds themselves in a tricky situation – say, the weather renders transport from one area to another difficult - we know who to call,” said Taylor. “We’re not seen as outsiders being a nuisance.”

The summer solstice fell on Wednesday 21 June, giving the whole festival an added vibe that the gods were shining down.

“I’ve done enough Glastonburys to realise you can’t [rely on] the weather but so much effort goes into the show from so many people you want that last little element to tie it all together,” Howe said.

“It would just be nice if the weather joined in.” Fortunately for the team – and the festival-goers - the glorious weather held for the weekend – wellies not required this year!

 


Wednesday 21 June 2023

“Black Mirror:” Charlie Brooker Finally Sees the Reflection

NAB

After years of exploring society’s dark absurdities, the sixth season of Netflix’s dystopian anthology series “Black Mirror” gazes at its own reflection.

article here 

Like all good sci-fi, Black Mirror reflected our present into the future, but in the four years since the last run of episodes on Netflix the world seems to have become so dystopian that you couldn’t make it up.

The pandemic forcing everyone indoors, the riots on Capitol Hill fed on social media conspiracy, the rise and rise of generative AI, entrepreneurs commercializing space, and, of course, the metaverse.

The Emmy-winning anthology series is back and writer-creator-showrunner Charlie Brooker has been talking about how he took the opportunity to mix things up.

“It feels like the dystopia is lapping onto our shores at the present moment,” he told GQ’s Brit Dawson of the five-episode instalment.

“I definitely approached this season thinking, ‘Whatever my assumptions are about Black Mirror, I’m going to throw them out and do something different,’” Brooker said.

This included more comedy, particularly in the episodes “Joan Is Awful” and “Demon 79,” a horror story subtitled “Red Mirror” that draws on staples like Hammer and the work of Dario Argento.

“I sort of circled back to some classically Black Mirror stories as well,” Brooker said. “So it’s not like it’s a bed of roses this season. They’re certainly some of the bleakest stories we’ve ever done.”

He’s also perhaps not as wary of the future or of technology as his Black Mirror persona might suggest. He recalls how frightening it was in the 1980s during the height of the nuclear cold war.

“That didn’t quite happen! The other thing I would say, I do have faith in the fact that the younger generation seem to have their heads screwed on and seem to be pissed off. So that’s going to be a tsunami of people, it’s just that they’re not at the levers of power yet,” he says.

“We have eradicated lots of diseases and generally lots of things are going well that we lose sight of but it’s just a bit terrifying if you think democracy is going to collapse. That and the climate breaking down.”

In another pre-season interview with Amit Katwala at Wired, Brooker continues, “I am generally pro-technology. Probably we’re going to have to rely on it if we’re going to survive, so I wouldn’t say [Black Mirror episodes] necessarily warns, so much as worries, if you know what I mean. They’re maybe worst-case scenarios.”

Three of the five episodes are set in the past, with seemingly no connection to the evils of the internet from past seasons.

“I think there was a danger that Black Mirror was becoming the show about consciousness being uploaded into a little disc,” Brooker explains to Emma Stefansky at Esquire. “Who says I have to set this in a near-future setting, and make it all chrome and glass and holograms and, you know, a bit Minority Report?” he asks. “What happens if I just set it in the past? That opens up all sorts of other things.”

However, one episode does openly “worry” about a near-future in which AI takes control of our lives in ways we hadn’t imagined. “Joan Is Awful” is about a streaming service called Streamberry — cheekily mirroring Netflix and clearly with the streamer’s consent — that makes a photoreal, AI-generated show out of a woman’s life.

It was specifically inspired by The Dropout, the ABC mini-series about Theranos founder and convicted fraudster Elizabeth Holmes, along with the possibility of all of us having the ability to generate personalized media using AI. Except in Black Mirror’s take this is another example of Big Tech using our private data for the entertainment of others.

People prefer viewing content “in a state of mesmerized horror,” the CEO of Streamberry says in the episode.

“Obviously the first thing I did was ask it to come up with a Black Mirror episode to see what it would do,” Brooker told GQ. “What it came out with was simultaneously too generic and dull for any serious consideration. There’s a generic quality to the art that it pumps out.

“That was the first wave [of generative AI], when people were going, ‘Hey, look at this, I can type “Denis Nilsen the serial killer in the Bake Off tent” into Midjourney’ and it’ll spit out some eerily, quasi-realistic images of that, or ‘Here’s Mr. Blobby on a water slide,’ or ‘Paul McCartney eating an olive.’

“It’ll be undeniably perfect in five years, but at what point it’ll replace the human experience? It does feel now like we’re at the foothills of new, disruptive technology kicking in again.”

The episode “Loch Henry” is set in the present day, following a pair of documentary filmmakers who plan to give a shocking hometown murder the lurid true crime treatment. “Loch Henry” relies on VHS tapes to build its narrative, instead of a smartphone app or webcam.

“It’s a weird one, because it is about the archive of the past that people are digging into,” Brooker tells Stefansky. “But it is also about the way all that stuff is now hoovered up and presented to you on prestige TV platforms — that we’re mining all these horrible things that happened and turning it into a sumptuous form of entertainment.”

He continues, “There’s nothing more frustrating than when you’re watching a true crime documentary, and it starts to dawn on you somewhere around Episode 3: They’re not going to tell me who did this. Not what I want. I want to see an interview with the killer. Go and generate one on ChatGPT.”

At Wired, Katwala speculates that maybe the next step is personalized content about personalized content. Society and social media have been moving in this direction for years, he says.

“One of the supposed benefits of generative AI is that it will enable personalized content, tailored to our individual tastes: your own algorithmically designed hell, so horribly well-targeted that you can’t tear your eyes away.”

 But, he wonders, what happens to cultural commentary when everyone is consuming different stuff?

The irony is that while hyper-personalized content might be great for engagement on streaming platforms, it would be absolutely terrible for landmark shows like Black Mirror and Succession, which support a whole ecosystem including websites like Wired and NAB Amplify.

“We siphon off a portion of the search interest in these topics, capitalizing on people who have just watched something and want to know what to think about it. This helps explain the media feeding frenzy around the Succession finale and why I’m writing this story about Black Mirror even though we ran an interview with the creator yesterday,” Katwala argues.

“In a way, you could see that as the media’s slightly clumsy attempt to replicate the success of the algorithm.”

 


Vying for Eyes: Investments in the Attention Economy

NAB

Grabbing people’s attention in the first few seconds has become the de facto metric for brands and advertisers, one that applies across media from TV to TikTok. When the average person is exposed to as many as ten thousand ads a day, the aim is to hook audiences quickly to generate brand awareness, but how do we cut through the noise and effectively grab a consumer’s attention?

article here

A panel of experts at the 2023 NAB Show took to the stage to share their views in a session entitled “How to Stand Out in a 3 Second World.” 

“The three seconds should be inviting someone in, opening the door, and then, like, keeping them on long enough so that they actually stay and meaningfully engage with the content,” said Clare Stein, executive creative director of ATTN.

ATTN has a creative checklist to gauge whether a piece of content is gaining attention. “It’s not a science, it’s not something that we’re like bureaucratically crossing things off, but it can help.”

Stein suggests creating a “curiosity gap” right upfront in the sense of explaining to the audience what they’re going to get in the video, but not giving it all away. “Otherwise, they’ve kind of gotten what they’ve needed, and they can move on. But you also want to show you’re providing some real value to the audience.”

Another best practice is to elicit human emotion as a compelling way to open a video.

“I see a lot of clients who want to open a video with beautiful aerial drone shots to set the tone, and maybe that’s great in a longer-form documentary, [but] that is the worst possible way to start a video [on social]. You need to give the viewer someone to connect with. I think it’s just like human psychology.”

Her third tip is to provoke and surprise. “It’s really hard to break through if you’re not doing something different,” Stein said.

She cites a viral video for Adidas promoting a line of shoes the company had made from recycled ocean plastic. “Our opening clip was a squid trapped in plastic that was really close up, you couldn’t quite tell what was going on. And we had a lot of internal debates of, like, is this a good opening? Like, it’s not a person, we can’t really tell its nature, we can’t really tell what’s going on. But because of that I think it caused people to continue watching the video and ultimately led to its success.”

 

Chris Di Cesare, head of creative programming at Dice Creates, said that judging the success of a campaign ultimately means “becoming a part of the cultural zeitgeist.”

Added Ian Grody, chief creative officer at Giant Spoon, “When the work that we do is getting picked up in The New Yorker and it’s on the news and people are talking about it on social, that to me is true, meaningful success. It’s success that has resonated in a truly authentic way. And it’s not something that we fool ourselves into believing is success. It’s something that we’ve earned.

“We are building these Trojan horses that contain brand messages,” Grody continued, “but [they] need to arrive in the form of culture that you seek out, that you would pay to watch, pay to attend, instead of the thing that you pay to skip.”

The panel also touched on the impact of AI, viewing it generally as a force to be harnessed. When anyone can generate content in the style of, say, Wes Anderson, or Marvel or Star Wars, then “increasingly the emerging coin of the realm is going to be originality,” says Grody.

“Originality is going to become the most precious commodity in the advertising space, full stop. It’s going to be those people who can really operate at a high level, generating original work that are going to continue to break through as the playing field is levelled.”

Stein said she was excited and interested to see how creative agencies and brands can harness the “decentralization of creativity that exists when anyone can have a voice.”

She added, “One of the trends I really like seeing on TikTok is people making commercials for products and brands that sometimes are funny, and they’re parodies. Sometimes they’re really good.”

Agreeing with this, Grody said, “the fact that so many people who were creatively disenfranchised before now have the opportunity to make the stage and to have their work seen is a wonderful place for brands. [Brands] can continue to elevate co-creation opportunities because now you have all of these potential partners out there who are willing to participate, who are hungry for even more exposure, and brands can be collaborators, brands can be amplifiers.”

In order to do all of this effectively, agencies are advised to populate their team with “people who live and breathe the culture that you end up wrapping that story in,” said Grody.

“I have made it a huge priority, as have other people within the organization, to defy the advertising industrial complex, and reach out to different patches and pluck this incredible Ocean’s 11 of experts with unique passions that are meaningful to our clients. So, when we’re working on Activision, we have gamers on our team. The answer to me is that it all comes down to your people, who you staff, and making sure that you have the right people to meet all of these challenges.”

 


Tuesday 20 June 2023

The multiplexity of moving playout to cloud

InBroadcast

p30-31 June Issue here 

While the shift to cloud is inexorable, broadcasters face a multiplicity of challenges, from cost and integration to security and training. One constant is the need for a technical partner to guide the way. Leading vendors tackle the issues head on.

Transitioning to cloud playout involves several common challenges for broadcasters. Mark Strachan, Head of Media Practice for Telstra Broadcast Services lists these as: 

Bandwidth and Connectivity: Sufficient and cost-effective bandwidth, both to and from the cloud, is crucial for file-based and live broadcasts. Availability and affordability of high-speed connections can be a hurdle. 

Cost Considerations: While cloud playout offers scalability and flexibility, compute storage costs and data transfer expenses (egress) can be significant. These costs may impact the overall total cost of ownership, potentially favouring on-premise broadcast models. 

Vendor Management: Successful migration to the cloud requires seamless integration of multiple vendors. Managing these relationships and ensuring smooth collaboration can be complex. 

Monitoring and Quality Control: Cloud-based broadcast necessitates effective monitoring and quality control systems across diverse cloud resources to maintain broadcast standards. Real-time monitoring of multiple cloud resources, performance metrics, and content delivery requires specialised systems and skill to implement effectively.

Skill Requirements: Cloud-based broadcast operations demand a new skill set from traditional broadcasters. This requires upskilling or acquiring additional talent with expertise in cloud technologies. 

Strachan points to TBS’ cloud native Media Production Platform as a solution. This platform caters to various broadcast workflows, including playout, remote production, asset management, and media/standards conversion. 

“By leveraging TBS's extensive connectivity solutions, the Media Production Platform enables seamless transmission of live and file-based content from the source into the cloud and onward to its final destination. As a fully managed service, TBS acts as a single point of contact, handling integration with underlying vendors and mitigating complexities. 

“Moreover, TBS leverages its operational scale to negotiate reduced costs on various cloud components, helping broadcasters optimize their expenditure. With the Media Production Platform, customers can rely on TBS's expertise, minimising the need to upskill their workforce for managing cloud-based broadcast operations.” 

One of the key challenges of moving broadcast playout to the cloud is cost. Many broadcasters still invest in on-premises technology because they think the TCO is lower than cloud-based solutions.  

“Cloud usage is a lot lighter than when these workflows first migrated to the cloud,” says Rob Gambino, Head of Advertising and Personalisation Strategy at Harmonic. “Moreover, the cost of compute and storage has come down appreciably. Using a cloud-based playout solution is now cost comparable to an on-premises solution, or in some cases can be more cost efficient when using the latest playout technologies such as channel assembly. 

“Another issue is workflow support. Many broadcasters have complex workflows that have historically only been supported by on-premises solutions. Workflow support has been increasing year over year as cloud-based playout solutions have improved. Solutions like Harmonic’s VOS360 Media SaaS now support a wide variety of workflows, enabling video service providers to distribute to affiliates, over the air, or direct to consumers from a single channel origination workflow.” 

Scheduling is also a challenge. As broadcasters launch more D2C channels, scheduling can be time-consuming and require substantial manpower. Modern, cloud-based scheduling solutions support many workflow automation features that make scheduling significantly less manpower-intensive. Furthermore, new technology like AI/ML library categorization and automatic schedule generation can almost completely remove the need for human intervention in the channel origination process.
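
To make that concrete, here is a purely illustrative sketch of automatic schedule generation, not any vendor’s product: the titles, categories and daypart policy below are invented for the example, and a real system would draw them from a categorised media library.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Asset:
    title: str
    category: str
    duration_min: int

# A toy categorised library; a real system would pull this from a MAM with AI/ML-assigned tags.
LIBRARY = [
    Asset("Morning Magazine Ep 12", "daytime", 45),
    Asset("Cookery Shorts", "daytime", 15),
    Asset("Evening Drama Ep 3", "peak", 60),
    Asset("Late Film", "late", 120),
    Asset("Music Videos Block", "late", 30),
]

def daypart(hour: int) -> str:
    """Invented daypart policy: which category of content belongs at which hour."""
    if 6 <= hour < 18:
        return "daytime"
    if 18 <= hour < 23:
        return "peak"
    return "late"

def build_schedule(day_start: datetime, hours: int = 24):
    """Greedy draft schedule: rotate through assets whose category matches the daypart."""
    schedule, clock, i = [], day_start, 0
    end = day_start + timedelta(hours=hours)
    while clock < end:
        candidates = [a for a in LIBRARY if a.category == daypart(clock.hour)]
        asset = candidates[i % len(candidates)]
        schedule.append((clock, asset.title))
        clock += timedelta(minutes=asset.duration_min)
        i += 1
    return schedule

if __name__ == "__main__":
    for slot, title in build_schedule(datetime(2023, 6, 20, 6, 0)):
        print(slot.strftime("%a %H:%M"), title)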

Finally, monetisation is always a concern. Gambino says broadcasters are increasingly feeling the pressure to maximise the monetisation of their channels. Cloud-based playout solutions can integrate directly with ad insertion systems where they live and scale best — in the cloud — to deliver personalised advertising experiences directly to viewers.  

“Harmonic leads the charge in this area with its VOS360 Media SaaS and new VOS360 Ad stand-alone, server-side ad insertion SaaS that enables targeted addressable advertising at scale for video streaming.” 
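
The basic mechanics behind server-side ad insertion can be sketched generically (this is an illustration only, not how VOS360 Ad is implemented): the origin splices personalised ad segments into each viewer’s HLS media playlist at a cue point, marking the joins with discontinuity tags so the player handles the switch cleanly. The segment names and durations below are invented for the example.

# Illustrative only: splice ad segments into an HLS media playlist at a cue point.
# Segment URIs and durations are invented for the example.
CONTENT_SEGMENTS = [("content_001.ts", 6.0), ("content_002.ts", 6.0),
                    ("content_003.ts", 6.0), ("content_004.ts", 6.0)]
AD_SEGMENTS = [("ad_personalised_001.ts", 5.0), ("ad_personalised_002.ts", 5.0)]

def build_playlist(cue_after: int) -> str:
    """Return an HLS media playlist with the ad break spliced in after `cue_after` content segments."""
    lines = ["#EXTM3U", "#EXT-X-VERSION:3", "#EXT-X-TARGETDURATION:7", "#EXT-X-MEDIA-SEQUENCE:0"]
    for i, (uri, dur) in enumerate(CONTENT_SEGMENTS):
        if i == cue_after:
            lines.append("#EXT-X-DISCONTINUITY")  # entering the ad break
            for ad_uri, ad_dur in AD_SEGMENTS:
                lines += [f"#EXTINF:{ad_dur:.3f},", ad_uri]
            lines.append("#EXT-X-DISCONTINUITY")  # returning to content
        lines += [f"#EXTINF:{dur:.3f},", uri]
    lines.append("#EXT-X-ENDLIST")
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_playlist(cue_after=2))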

Broadcasters contemplating the transition to cloud need to address seven key questions, says Pavlin Rahnev, PlayBox Neo CEO.

1. Does your proposed choice of cloud-playout system integrate easily with your existing ingest, production, post-production, scheduling, presentation and playout workflow? 

2. Can the proposed system be operated within a unified graphic user interface rather than forcing your staff to work with multiple GUIs? 

3. Do you gain the freedom to operate on-premises and in-cloud playout as an integrated hybrid? 

4. Does your proposed cloud-based system support 24/7 automation? 

5. Can the entire system be accessed remotely by a securely connected administrator? 

6. Will the same system integrate easily with your existing back-up infrastructure or any desired reserve layer? 

7. Is the proposed solution cost-efficient? 

He then talks about how PlayBox Neo Cloud2TV delivers on these issues. “Cloud2TV can be configured to match any style and scale of TV media application from a single terrestrial, satellite or internet-streamed channel up to fully international networks transmitting in multiple languages. It is designed for fully automatic round-the-clock playout while retaining the ability for live content to be transmitted at any time. 

“Cloud2TV offers PlayBox Neo’s huge global customer base an easy and efficient way to supplement their Channel-in-a-Box-based (CiaB) and AirBox Neo-20 server-based systems. It also interfaces easily with third-party playout products. Extra TV channels can be added at very short notice. It has proved highly successful for channel managers seeking to accommodate start-up program channels and event-specific red-spot services.” 

Finding the fine balance between the flexibility of the cloud and the reliability of on-premises infrastructure is the challenge identified by Daniel Robinson, Head of R&D at Pebble. There is a misconception that the cloud is more cost-effective (you only pay for what you use), he says, but it’s not that straightforward.

“Broadcasters may have content that is not time-critical, which works well at SD and even HD, and is available to OTT customers only. Channels like this can be made up of pre-prepared videos and pre-rendered graphics, with simple automation. It is feasible to operate these entirely in the cloud, where content is uploaded once and stays there.  

“At the other end of the scale are channels delivering quality live content, both over the air and OTT – for example, sports or 24/7 news channels which are time-critical. Add in HD and UHD resolutions, a wider colour gamut for HDR content, or complex real-time graphics, and committing fully to the cloud may make less economic sense. Nonetheless, these channels can benefit from a hybrid approach, improving redundancy, analytics, or archiving via the cloud. For now, the flexibility of hybrid cloud workflows is likely to be the best route to take.”

Pebble offers its own cloud-first “self-healing, service-oriented” broadcast technology platform, Oceans. Robinson says it offers dynamic scalability for workflow needs and is deployable on-premises as well as in the cloud.

“We recognise our solutions need to serve customers who want to continue to operate with the confidence that on-premises workflows provide and Oceans offers that. There aren’t many broadcasters who are ready to jump into a full cloud solution. But they still want to leverage the benefits of working in the cloud in a way that suits them and their operations. That is the sweet spot of effective, reliable hybrid solutions, and it will remain so for some time.” 

According to Peter Wharton, TAG’s Chief Strategy Officer, the main obstacles to moving playout (or any traditional on-premises workflow) to the cloud are education and cost-effectiveness.

“Migration to the cloud creates operational agility and avoids CAPEX infrastructure investment in a media industry beset by change and uncertainty,” he says. “Linear channels, once thought to be dying, are experiencing a rebirth. But moving or launching these channels in the cloud only makes sense if the cloud is economically comparable with on-premises operations. So how do you create a cloud playout system that meets these objectives?” 

Wharton explains that, initially, vendors shifted from purpose-built master control and server hardware to CiaB architecture. This was software-defined playout running on COTS server hardware and a video card for I/O.   

“With the emergence of IP, these solutions adopted IP I/O and often ran in virtual machines. As cloud playout grew, the natural progression was ‘lift-and-shift’ this software to the cloud, which unfortunately is the exact opposite of a cloud optimized playout solution. 

“Cloud playout demands a highly deconstructed playout service with each component running optimally; running the core playout service in the smallest instance size possible, or even prerendering the content and not running any instance. It means using on-demand and serverless cloud compute for media management, live signal ingest, graphics redundancy and other components not essential in the core playout service. Deconstructed playout reduces cloud costs by 80-95%, making cloud playout more economical than its on-premises equivalent.”
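
To put rough numbers on that claim, the sketch below is a deliberately simplified cost model. The hourly prices and duty cycles are assumptions invented for illustration, not TAG’s or any cloud provider’s figures; the point is only that an always-on “lift-and-shift” instance is billed for every hour of the month, while a deconstructed design pays continuously for a small core and only occasionally for everything else.

# Illustrative only: compare a monolithic 'lift-and-shift' playout instance with a
# deconstructed design. All prices and utilisation figures are invented assumptions.
HOURS_PER_MONTH = 730

# Lift-and-shift: one large instance running the whole channel 24/7.
lift_and_shift = 3.50 * HOURS_PER_MONTH              # assumed $3.50/hr, always on

# Deconstructed: a small core playout instance always on, plus on-demand compute
# that only runs when needed (media prep, live ingest, graphics rendering).
core_playout   = 0.20 * HOURS_PER_MONTH              # small instance, always on
media_prep     = 1.00 * 40                           # ~40 hrs/month of batch transcode
live_ingest    = 0.60 * 60                           # ~60 hrs/month of live events
graphics_burst = 0.80 * 20                           # ~20 hrs/month of heavy graphics
deconstructed  = core_playout + media_prep + live_ingest + graphics_burst

saving = 1 - deconstructed / lift_and_shift
print(f"lift-and-shift: ${lift_and_shift:,.0f}/month")
print(f"deconstructed : ${deconstructed:,.0f}/month")
print(f"saving        : {saving:.0%}")

With these invented figures the saving works out at roughly 90%, in the same region as the 80-95% Wharton quotes, though the real number obviously depends on the channel and the provider.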

TAG, a monitoring solution provider, has taken a similar approach to cloud monitoring. Its solution supports all signal formats while matching instance sizes to workloads for optimal cost-effectiveness.

TAG uses adaptive monitoring that it claims reduces cloud costs by 80%. Its bridge technology ‘deconstructs’ traditional monitoring by separating inputs and mosaic outputs to further reduce costs while simplifying orchestration.  

Wharton explains, “Bridge enables multi-region, multi-cloud and hybrid ground-cloud monitoring optimised to customers’ workloads. In addition, our new Content Matching enables workflows to be monitored end-to-end by exception, improving accuracy and performance while simultaneously allowing far more channels to be monitored. This new technology dramatically reduces workflow complexity and eyes-on-glass and enables media companies to deliver quality content with fewer resources and more confidence.”
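
As a generic illustration of what monitoring by exception means in practice (a sketch only, not TAG’s Content Matching), the idea is to compare measured signal metrics against thresholds and surface only the channels that breach them, instead of keeping every feed on a mosaic wall. The channel names, metrics and thresholds below are invented.

# Illustrative only: monitoring by exception. Channel names, metrics and thresholds
# are invented; a real probe would feed live measurements into exceptions().
THRESHOLDS = {"black_frames_pct": 5.0, "audio_silence_s": 10.0, "bitrate_kbps_min": 2000}

def exceptions(channel_metrics: dict) -> list:
    """Return alerts only for channels that breach a threshold, rather than showing every feed."""
    alerts = []
    for channel, m in channel_metrics.items():
        if m["black_frames_pct"] > THRESHOLDS["black_frames_pct"]:
            alerts.append(f"{channel}: black frames {m['black_frames_pct']:.1f}%")
        if m["audio_silence_s"] > THRESHOLDS["audio_silence_s"]:
            alerts.append(f"{channel}: audio silent for {m['audio_silence_s']:.0f}s")
        if m["bitrate_kbps"] < THRESHOLDS["bitrate_kbps_min"]:
            alerts.append(f"{channel}: bitrate down to {m['bitrate_kbps']:.0f} kbps")
    return alerts

if __name__ == "__main__":
    sample = {
        "channel_1": {"black_frames_pct": 0.1, "audio_silence_s": 0.0, "bitrate_kbps": 8000},
        "channel_2": {"black_frames_pct": 7.4, "audio_silence_s": 12.0, "bitrate_kbps": 1500},
    }
    for alert in exceptions(sample):
        print("ALERT:", alert)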

Data security, latency and infrastructure transition are the principal headaches challenging broadcasters, according to Aveco.

“There are concerns about potential data breaches, loss of sensitive information, and compliance with data protection laws,” notes product manager Martin Mach. “There’s concern that any significant delay in the live broadcast will affect the viewer’s experience and, of course, moving from a traditional broadcasting model to a cloud-based one can be expensive, time-consuming and requires careful planning.”

Other industry issues Mach highlights include the skills gap, vendor lock-in and customer support from cloud providers “not attuned to the immediate needs of on-air operations.” 

That’s where Aveco’s more than three decades of experience in broadcast automation come into play. “Our technology has robust security measures, encryption, and audits to protect your data,” he says. “Our cloud management tools effectively manage your costs while ensuring resources are available as needed.” 

When it comes to skills, Aveco offers “comprehensive training resources” to prepare broadcast teams for the cloud era. “Having a DevOps engineer available means never having to lose sleep,” he says.  

Since Aveco is a device-agnostic company using open standards, “we provide customers with complete flexibility. We also provide a hybrid approach, moving some services to the cloud while others remain on-premise, managing risk and ensuring business continuity during the transition period. With Aveco, your move to the cloud is more than a transition - it’s an upgrade.”

The same trio of ‘security’, ‘latency’ and ‘flexibility’ is top of mind for Grass Valley when it comes to helping clients shift to cloud playout.

“Systems must be secure and protect against unauthorised access, offering the same security and compliance standards as traditional on-premises systems,” says Steve Hassan, senior director of the firm’s playout operations. “Solutions must be capable of delivering content with extremely low latency to meet the requirements of live broadcast. True cloud playout should offer flexibility for deployment across public and/or private cloud, providing the customer with the ability to build resilience as they require.”

Grass Valley address these through ‘Playout X’ on the AMPP (Agile Media Processing Platform) SaaS solution. “AMPP is built from the ground up to be secure, reliable, cost-effective and easy to manage and use, proving itself with Tier 1 customers around the world,” says Hassan. 

AMPP implements industry standard security measures including metadata encryption, identity authentication and authorisation, with independent testing of the platform. Customers retain management of their own media and channels.  

The system provides patented low-latency streaming and timing, supporting live dynamic events in Playout X and, when it comes to flexibility, AMPP leverages cloud technologies and a micro-service-based architecture. Hassan explains, “Broadcasters can easily self-deploy channels and upgrades without downtime, choosing placement of processing, enabling them to adapt to changes and pay only for what they use, when they use it.

“Playout X enables true site, region and platform diversity, with remote operations via HTML5 based user interfaces. Elastic scalability supports any size operations, across a mix of channel complexity and formats – including UHD, HDR, compressed and uncompressed I/O. Grass Valley enable the biggest live broadcast events in the world and Playout X customers are not investing into a limited ‘point product’, so can easily add other AMPP or Alliance Partner solutions to their system.” 

Qvest is concerned with the human resources involved in cloud integration. Frank Mistol, MD of Qvest Stream, says unfamiliarity with the system can leave people feeling overwhelmed by navigating and operating a new user interface. Additionally, there is concern about limited flexibility when integrating with existing systems and workflows.

 

“At Qvest, we empathise with these concerns and are dedicated to addressing them head-on. Our approach involves providing independent and service-oriented cloud solutions that ensure a seamless integration process. Moreover, we go the extra mile to maintain and enhance flexibility for broadcasters, ensuring they can adapt and optimise their operations according to their unique needs.” 

With makalu, Qvest have developed a cloud playout product that is specifically designed to overcome these concerns. “makalu offers broadcasters a user-friendly interface, making it easier for them to transition to the cloud environment and operate the system effectively, even if they have no prior experience with cloud-based systems,” says Mistol. “Furthermore, makalu ensures a seamless integration into existing systems, empowering broadcasters to effortlessly adapt and customise the solution according to their specific requirements. Our focus is on scalability and cost-effectiveness, providing a solution that seamlessly aligns with the financial capabilities of any individual company.”

Mistol adds that the company has been delighted by the outstanding feedback from experts following the release of the latest comprehensive functional upgrade of makalu in April.

“We strive to meet the needs of broadcasters while ensuring a smooth transition to cloud playout, ultimately maximising productivity and minimising concerns and costs.”

  

For service providers like Planetcast International that have customers needing 99.99% uptime SLAs, there is a requirement to provide extensive diagnostics and correction as well as comprehensive redundancy planning. Over time, network infrastructure has continued to improve, offering higher bandwidth and lower-latency connections. Additionally, advancements in technologies such as 5G networks and fibre-optic connections have the potential to enhance the reliability and performance of cloud-based playout.

 

“While cloud-based playout can offer scalability and flexibility, it also introduces new cost structures. Broadcasters need to consider factors such as cloud service fees, data transfer costs, storage expenses, and ongoing operational costs,” says Venu Iyengar, COO Digital at Planetcast. “Evaluating the financial viability of cloud playout, comparing it with traditional on-premises infrastructure, and determining the optimal business model becomes crucial. Media brands can benefit from tech partners that provide predictability and financial control through insightful cost modelling.” 

Broadcasters are subject to various regulatory requirements specific to their regions, such as closed captioning, watermarking, content localisation, and advertising regulations. Migrating to the cloud requires ensuring compliance with these regulations, which may vary from country to country.  

“Adapting cloud-based playout systems to meet specific compliance standards is essential,” says Iyengar. “Planetcast understands these complexities through running post-production services for majors including Amazon Prime and Viacom.” 

He says cloud service providers have invested significantly in improving security measures to address data security and privacy concerns including compliance certifications to ensure the protection of sensitive content and compliance with regulations. 

Cloud providers also offer scalable and cost-effective storage solutions, allowing broadcasters to efficiently manage and store their media assets. With features like content delivery networks (CDNs) and object storage, broadcasters can easily distribute and retrieve content globally. Cloud playout systems themselves are designed to integrate with existing broadcast workflows and infrastructure. Open standards and APIs facilitate seamless integration with automation systems, CDNs, and other components of the broadcast ecosystem, enabling a smooth transition to the cloud.

“Planetcast’s content supply-chain management system, Contido, has been designed to ensure seamless workflow management and adaptation through a single dashboard, simplifying operational management. We continue to develop systems and services that reduce the complexity of workflow management, removing one of the greatest deterrents to cloud migration.”