Tuesday 28 December 2021

2021 Review – Production And Workflows

Broadcast Bridge

An obvious place to look is how COVID-19 has reshaped the workflows of productions across the industry. Remote working and geographically dispersed teams linked online and via cloud are now embedded.

https://www.thebroadcastbridge.com/content/entry/17878/2021-review-production-and-workflows

Microsoft thinks this has accelerated to the point that the M&E industry is probably five years ahead of where it thought it would be.

“What you see is companies looking to reimagine their workflows because the lines between media types are disappearing,” Simon Crownshaw, CTO for Media & Entertainment at Microsoft, said.  “These transformations are ramping up at an amazing rate.”

With traditional working practices upended, costs rising and technology advancing, startup post-production firms are eschewing physical facilities in favour of a cloud-first approach. Anyone launching a post-production business today would at the very least offer the flexibility for clients and staff to work from home.

Start-up post houses in 2021 included Assembly, which has facilities in New York City (a Los Angeles facility opens in 2022) and a cloud backbone that offers clients a choice between in-person and remote creative sessions.

Others include Racoon, launched in February by the former management team of one of the largest and most successful bricks-and-mortar facilities, London-headquartered The Farm. Racoon is an online platform providing all the offline, finishing and mastering services you would expect from a traditional facility, but in a potentially more efficient, agile and sustainable manner.

“It is very difficult to change your business model if you are having to fight legacy real estate and technology,” said founder and CEO David Klafkowski. “If I had 50 rooms and I was looking to re-engineer my editorial backbone I would actually engage the services of a company like Racoon because it would be more efficient than buying in equipment.”

He is careful to avoid calling Racoon a facility, even though it does have premises on Frith Street. However, the only hardware on premises is monitoring; everything else sits in a data centre connected by dark fibre.

“You can’t start a business and be entirely virtualised. I knew we would have to have some real estate to bridge the gap. The industry can be slow and reluctant to change but also there is a need for people to get together and discuss issues face to face. It just doesn’t need to happen anywhere near as much as it used to.”

Also new is BeloFX a cloud-first facility founded in Canada and the UK by a group of former senior executives at VFX powerhouse DNEG. “We are remote and cloud first—everything we design is with that in mind,” explains Graham Jack, CTO. “That doesn’t mean we won’t have our own infrastructure or premises but the starting point is to make it work in a decentralised way and open up opportunities for remote collaboration. It would have been challenging to do this five years ago but the technology has now reached a tipping point.”

BeloFX’s set-up mirrors the functions of a traditional facility, with virtual workstations running in the cloud connected to artists’ machines via Teradici PCoIP. This is the standard decentralised workflow adopted by dozens of post houses to remain in business over the last 18 months.

Launching as a decentralised facility is also an opportunity to recast the conventions around employment, so that talent no longer needs to live near, or commute into, central London.

“Artists have already restructured their lives and reorganised how they live their day around child care and how they interact with families,” says BeloFX COO Ellen Walder. “People have embraced flexible working. We definitely want to embrace new technology that makes it possible to build a diverse and inclusive company, one that’s merit based. We believe this breaks down the barriers to employment and progression that there has been in some companies in the past. We are passionate that this is the model of the future.”

Many editors working in drama and entertainment have worked from home and preferred it. A number of them have now relocated from cities. That is creating a dynamic where, in a new hybrid world, editors will want to meet in person at the start of a production to get the editorial shape together, while the bulk of editorial itself is done remotely using tools like Sohonet ClearView Flex for over-the-shoulder sessions.

Remote Live Production

Remote live production has become all the rage in the past 18 months for reasons that should be obvious. This doesn’t necessarily mean that every workflow has shifted to the cloud, but that is the logical next step. Even BT Sport, one of the world’s premier sports production teams, is doing R&D on lifting its current remote collaborative workflows into the cloud.

The cost and efficiency benefits of distributed workflows were long touted as reasons to get out of the facility. The last period has proven that the normal operation of live production – even something as previously wedded to the OB as vision mixing – can be carried out with negligible impact on quality by staff working in a central office or from home.

Rather than dedicating that team to one event, a cloud infrastructure with a distributed workflow lets multiple people work on multiple events happening one after another.

Another benefit to distributed workflows and cloud production is the ability to copy-and-paste complete workflows once you’ve created a template that works for you.

You can take the workflow you’ve used for one show and re-create it for another event using another data centre. It allows you to spin up multiple productions and multiple events without having to worry about starting over every single time.
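The template idea can be sketched in a few lines of code. This is purely illustrative: the workflow fields, feed names and region names below are hypothetical, not any vendor's actual API.

```python
# Illustrative sketch only: "copy-and-paste" a proven production workflow
# template into a new event in a different data centre. All names are made up.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ProductionWorkflow:
    event_name: str
    region: str      # target data centre / cloud region
    inputs: tuple    # contribution feeds
    outputs: tuple   # distribution targets

# The template proven on one show...
template = ProductionWorkflow(
    event_name="template",
    region="eu-west",
    inputs=("cam-1", "cam-2", "commentary"),
    outputs=("broadcast", "ott"),
)

# ...re-created for another event in another data centre, without starting over.
next_event = replace(template, event_name="saturday-match", region="us-east")
```

The point is that everything except the event-specific fields carries over unchanged, which is what makes spinning up back-to-back productions cheap.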

Final distribution is another benefit of cloud production. If you’re going to streaming destinations, whether overseas or domestic, final distribution is definitely easier in the cloud. You’re already on the internet. You can’t get any closer to it.

There are drawbacks too. Getting familiar with a production workflow based on GUI access can be challenging, as can figuring out what your inputs are and how they get in, and what your outputs are and how to get them out.

Skills Gap

This speaks to the issue of cultural upheaval as staff are retrained in new technologies. It almost goes without saying that the pandemic accelerated the introduction of automated and remote production workflows and web-based technologies, but a lingering side effect has been to exacerbate an industry-wide skills gap.

Education is needed to bring previously different cultures together as one team. Broadcast engineers need to get their heads around programming, and OB crew need to feel comfortable with virtual control panels that aren’t physically attached to the hardware driving them.

IT specialists on the network side need to understand core broadcast skills such as why PTP is so vital or why there is such a concern about accurate orchestration, the impact of latency and the implication of lost packets.

Level Playing Field

To an extent, the acceleration of adoption and the advancing sophistication of remote web-based production and distribution tools have levelled the playing field between what were nominally tier 1 and tier 4 sports productions.

Fans of course don’t know or care about how their sport gets to them – they expect top quality whether that’s broadcast or viewed OTT.

That said, the challenge will be how to manage new remote production workflows alongside the traditional OB. Major live events like next year’s FA Cup Final and the World Cup Finals will use remote production but still maintain significant OB facilities on site.

Perhaps the main leveller for change in this regard will come from the clamour to get serious on sustainability.

Adoption of cloud brings the prospect of flexible working, scalability, collaboration, efficient upgrading, deeper analytics and yes, lower carbon emissions and lower cost.

Getting Serious On Sustainability

But as we emerge from the pandemic it is on all of us to substantiate these claims. Media organizations must turn their attention to adjusting their business models further and faster to tackle the threat of climate change.

It’s going to be next to impossible to claim to be carbon negative without transforming workflows on a greater scale than ever before. “Some of the technologies required won’t even have been invented yet, and it’s not something that M&E in general has begun engaging with on a large scale,” says Microsoft’s Crownshaw. “Having spent time with leaders across the industry, I think it’s starting to become a critical issue.”

The UN’s COP26 summit capped a year which drove climate change to the top of news agendas and awareness of its urgency means companies can no longer pay lip service to sustainability and get away with it.

The leading broadcasters are becoming very concerned about doing all they can to demonstrate their carbon-reducing credentials. This includes the technology and operational capability of their managed service providers. Every organization must seek to eliminate or reduce the environmental costs of doing business. Decarbonizing the supply chain is a sensible place to start.

Predicting 2022 - Supply Chains Move To Cloud

The Broadcast Bridge

As the pandemic pushes remote technologies to the fore, cloud production is undergoing a baptism of fire. Forced to solve the physical separation of its workforce, the video ecosystem has rapidly accelerated its shift to cloud-based production environments and a new way of working.

“This trial by fire has shown that new techniques can be deployed, enabling new use cases and cost efficiencies inside a typically conservative industry,” according to analyst Rethink Research.  

https://www.thebroadcastbridge.com/content/entry/17879/predicting-2022-supply-chains-move-to-cloud

Rethink calculates that Cloud Production revenues will soar from $601 million in 2020 to $2.48 billion by 2026 as broadcasters shift investment.
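Those figures imply a compound annual growth rate in the mid-20s percent. A quick back-of-the-envelope check (the CAGR is our arithmetic from the two published figures, not a number Rethink quotes):

```python
# Implied compound annual growth rate from Rethink's two forecast figures.
start_revenue = 601e6   # 2020, USD
end_revenue = 2.48e9    # 2026 forecast, USD
years = 6

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 27% per year
```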

Vendors have to take note too. Rather than an all-you-can-eat fixed price, the business of the future needs the flexibility of a pay-as-you-go model to respond to the way consumption habits – both consumer and business to business – are changing. Businesses need to better match variable costs to variable revenues.

“With few exceptions, any suppliers must transition to a SaaS business model as fast as possible [especially those with external investor shareholders],” says analyst Josh Stinehour at Devoncroft. “Any technology professional in the media technology sector (customer or supplier) dithering about moving to the cloud is likely harming their organisation’s shareholder value.”

He adds, “A successful SaaS business model requires more than a reference in sales collateral and an uncompelling purchase option in the price list. It is a lifestyle.”

SaaS – in which the entire application stack, including the application itself, is delivered as a service – remains a mainstay of the overall cloud computing market.

Other related categories include Infrastructure as a service (IaaS), which delivers infrastructure components such as compute, network, storage, and virtualisation as services; and Platform as a service (PaaS), which in addition to the IaaS layers, also provides an application development environment via an application programming interface (API) for developers to write to before they deploy their apps onto the platform.

Analyst Omdia valued the PaaS market at $32 billion in 2021 (across all industry sectors), meaning PaaS grew 39% year on year, compared with 32% for IaaS and 28% for SaaS.

“PaaS is clearly proving to be the most popular delivery mode for cloud computing of late, enjoying the fastest growth rate overall,” states its report Ahead in the Cloud. 

Omdia’s separate ICT Enterprise Insights survey for 2020/21 highlights that M&E-specific applications — such as MAM and playout — will be hosted both on-premise and with multicloud service providers going forward. Rather than rely on one cloud service provider (a risky business, putting all eggs in one basket), media companies want to take advantage of the strengths of each.

Per Omdia: “AWS excels in the breadth and depth of its services, while Microsoft’s dominance in office productivity makes it a favoured destination for certain types of workload, and Google’s strength in AI gives it an edge for any analytical application leveraging AI. Such multicloud environments are also quite often hybrid, with at least some functionality still residing on the customer’s premises.”

Since most M&E enterprises have invested heavily in on-premises infrastructure, writing off such investments will not be feasible in the short term. The most likely scenario is the development of hybrid multicloud infrastructures, with some functionality remaining on the customer’s premises while other parts move into the cloud in some combination of public cloud/IaaS or PaaS, on-premises, and SaaS.

MovieLabs, the US studio-funded think tank, has also identified cross-cloud interoperability as important for hosting large-scale film and TV productions in the cloud. MovieLabs wants to find a way for assets such as raw footage to be ingested into one CSP in one part of the world, worked on by editorial in another location and streamed to creatives working elsewhere.

Opening Up To New Operating Models

The technology is moving at pace and taking business models with it. In part this is an accelerated response to the changing nature of work - particularly the recent proliferation of freelance and remote workers. More organisations are shifting from rigidly hierarchical structures to ones that permit greater organisational agility in order to quickly reorganise teams and respond to change.

Broadcast investment is no longer about monolithic hardware stacks and permanent facilities but software and services running on commodity machines that can be scaled and customised at will.

Similarly, tech vendors and service suppliers need to offer their products as a Service to cater to the way that media organisations want to prioritise operational over capital expenditure.

With systems running on hybrid cloud deployments, utilising the full potential of private and public multi-cloud solutions combined with software-defined workflows, the industry stands ready to deliver the quality and flexibility required.

MAM Moves To Cloud

Only a year ago, one of the biggest decisions for anyone looking to implement, replace or upgrade part or all of their media supply chain was where to put it – on-premise, in a private cloud, in the ‘public’ cloud, or some hybrid of the three. Today, for the majority of media organisations, it is almost taken for granted that at least some of the infrastructure and services will be cloud-based.

What or how much remains on-premise depends largely on legacy storage or systems that need to remain operational. Cloud-based platforms are already helping to reduce start-up costs and diminish risk. With minimal initial investment, systems can be spun up and, especially when integrating other systems, sandbox and staging environments can be managed cost-effectively.

There are already excellent examples of existing cloud-native supply chains, such as A+E Networks, Discovery, ViacomCBS and WarnerMedia, embracing this journey to the cloud. They recognise that moving to the cloud not only helps them become more efficient, but it gives them newfound agility to pursue new opportunities much faster.

In 2022 look to more broadcast supply chains to migrate to the cloud, from ingest through delivery. For many organisations the goal is to push content directly into the cloud from the point of ingest or acquisition, and never have it come back to the ground until it reaches the viewer’s device. This kind of efficiency and agility will allow broadcasters to scale the number of channels and services they can offer dramatically.

A large piece of the market will however continue with on-prem storage for some time. MAM workflows have to straddle them. With remote working now being so fundamental, it is important for users to be able to execute their broadcast workflows regardless of where they are working from, and regardless of where the media is located. 


Wednesday 22 December 2021

The New Prime Time Is… Always

NAB

VOD, AVOD, BVOD, SVOD, good old linear. It’s all converging. Let’s just call it TV — here are five trends that will make it happen, courtesy of measurement platform TVSquared.

https://amplify.nabshow.com/articles/the-new-prime-time-is-always/

Let’s be clear from the outset: TVSquared has a vested interest in promoting cross-platform TV measurement.

However, its observation that advertising is no longer about investing during a certain time/daypart or in a certain show just because of ratings or perceived popularity can’t be faulted. Instead, it’s about reaching and engaging with the total TV audience — regardless of when, where or how they watch TV.

“A ‘set it and forget it’ TV buying strategy has already proven unsustainable in such a dynamic, cross-platform TV universe,” says Marlessa Stivala, senior manager of content marketing at TVSquared. “In 2022 it needs to be left behind entirely. Advertisers continually measuring and optimizing, including mid-campaign, is now a must.”

The ideal “media mix” for brands in terms of identifying ideal platforms and audiences remains an experiment, and that’s a good thing. “Striking this balance will allow them to target broader audiences as well as more hyper-targeted segments,” Stivala says.

Legacy measurement and currencies have held back the industry for too long, TVSquared argues. The connected TV ecosystem has evolved to the point “where it requires new, future-proofed approaches.” So, in 2022, multiple currencies will be the reality, powered by cross-industry collaboration to find consistent ways to count and ascribe value for all forms of TV.

While there have been initial steps toward this, the new year will bring about a greater, industry-wide effort to clearly define measurable units (such as households, devices, various types of data sources, etc.), the tech provider believes. Meanwhile, advertisers have increasingly been armed with real-time, cross-platform insights (including reach and frequency, incremental reach and outcomes).

“While these changes are taking shape now, they will become table stakes in 2022. It’s simple; these always-on measurement and attribution capabilities are critical to providing advertisers with advanced consumer insights, which allow them to sell more products and, in turn, invest more in converged TV.”

Yeah, “The Matrix” Isn’t a Metaphor Anymore

NAB

https://amplify.nabshow.com/articles/how-the-matrix-isnt-a-metaphor/

“You take the blue pill, the story ends, you wake up in your bed and believe whatever you want to believe,” says Laurence Fishburne’s Morpheus in The Matrix. “You take the red pill, you stay in Wonderland and I show you how deep the rabbit hole goes.”

That scene and those lines have seeped into our collective consciousness over the last couple of decades, more so even than the bullet time sequence, which in and of itself proved highly influential in advancing visual effects. Online, to be “red-pilled” is now a verb, meaning to awaken to a vast conspiracy that only a select few can see.

The trailer for The Matrix Resurrections also uses Jefferson Airplane’s “White Rabbit” to ram the hallucinogenic point home.

Today, the film’s influence is everywhere: “from fashion and philosophy to the shape of our technological anxieties, the proliferation of conspiracy theories, and the political tumult of the past five years,” writes Samuel Earle in The New Statesman, sensibly skating over the two sequels from 2003, which joylessly pulverized audiences and created their own kind of cinematic dystopia.

“The directors, Lilly and Lana Wachowski, foresaw contemporary tensions online: between the internet’s tendency towards freedom and conformity, anarchy and authoritarianism. More remarkably, through the sheer force of the movie’s prescience and popularity, they shaped those tensions.”

It is the film’s transgender allegory that has become more apparent over time. “That was the original intention but the world wasn’t quite ready,” said Lilly Wachowski, who came out as trans along with her sister after the films were released.

Earle notes that the film’s title referred to an early word for the internet, that the rebels use phone lines to move between real and virtual worlds, and that inside the simulation you can manifest a truer version of yourself. The movie’s diverse and androgynous cast suggests a world beyond gender and race. When Neo meets fellow super-hacker, and future lover, Trinity (Carrie-Anne Moss) in person, he is surprised: “I always just thought you were a guy.” “Most guys do,” she replies.

Speaking to the BBC last year, Lilly Wachowski said, “The Matrix stuff was all about the desire for transformation but it was all coming from a closeted point of view.

“We had the character of Switch — who was a character who would be a man in the real world and then a woman in the Matrix.”

Lilly said she doesn’t know “how present my transness was in the background of my brain as we were writing” The Matrix. “But it all came from the same sort of fire that I’m talking about.”

The original film also captured the mood of excitement about the possibilities inherent in the nascent internet and paranoia about the rise of the machines.

Earle notes that the fear that reality is a hoax pre-dates the internet — Plato’s allegory of the cave and RenĂ© Descartes’ ‘evil demon’ are famous examples — but, popularized by The Matrix, it is now a cultural mainstay.

Yet the reality or unreality of our world was not the central concern of The Matrix, he says. While it’s filled with philosophical references, the most overt is to the French theorist Jean Baudrillard.

The directors made Baudrillard’s “Simulacra and Simulation” (1981) required reading for the cast: Neo holds a copy, and Morpheus quotes the theorist. Baudrillard was even asked to assist on the sequels but refused. The simulacrum hypothesis deserved better than to become a reality, he said.

Writing when the internet was in its infancy, Baudrillard’s principal idea was that under a deluge of what we now call “content” — news articles, photos, movies, adverts, television — anything as singular and concrete as “reality” ceases to exist. Representations of the world saturate society.

Earle extrapolates this idea to our current politics where conspiracy theories flourish: “Donald Trump’s rise remains one of the starkest symptoms of our collective descent down the rabbit hole. Trump was a conspiracist who called every truth into question: from the size of his inauguration crowd to his predecessor’s country of birth, to the weather on any given day.”

The red pill was even appropriated as a symbol of the alt-right, and an entire industry now surrounds the idea that reality is a hoax imposed by a politically correct, feminist cabal determined to subjugate men.

“Trump was cast as Neo. In edited clips, he dodges bullets marked ‘fake news,’ ‘Hillary Clinton’ or ‘CNN.’ TheRedPill, a notoriously misogynistic forum on the social media site Reddit, became a hotbed for support. The forum’s creator, later revealed to be a Republican lawmaker, used the alias ‘Morpheus Manfred.’ ”

The newest installment, The Matrix Resurrections, arrives into a world riddled with paranoia and sapped of whatever techno-optimism once existed. It’s also a world, writes Earle, where the system hardly permits original films, let alone novel futures.

An article in The Guardian also bemoans Hollywood’s abandonment of original film-making for box-office certainties. There is trepidation over whether Resurrections will be any good — although surely it can be no worse than Revolutions and Reloaded.

Fans will take solace in the belief that Lana Wachowski, who directed the new movie without her sister, would not return unless she thought it was creatively worthwhile.

As reported by Looper, a (conspiracy) theory has recently taken off online that suggests that James Cameron’s Terminator movies are in fact prequels to The Matrix, and that the Wachowskis intentionally wrote The Matrix to take place in a future after Skynet has taken over the planet.

The reason why the victorious machine race then establishes the human simulation is explained in the film by Agent Smith (Hugo Weaving). He explains how the machines first simulated “a perfect human world” without suffering, but humans rejected it. “Which is why the Matrix was redesigned to this,” Smith says, dryly. “The peak of your civilization.”

To that we say, welcome to the metaverse. Red pill or blue?

 


Behind the Look: The Matrix Resurrections

for RED

In the two decades since we first entered The Matrix, there’s been a quantum leap in computing and filmmaking technology. Visionary filmmaker Lana Wachowski made sure she considered both those changes before going back down the rabbit hole.

https://www.red.com/the-matrix

“There have been huge advances in computing that called for an updated visual representation of the virtual world of The Matrix,” explains Daniele Massaccesi, co-cinematographer with John Toll, ASC on The Matrix Resurrections. “Lana wanted to create a synthetic world that would be believable to humans in 2021. It is therefore photoreal and full of color.”

The Matrix Resurrections is the long-awaited next chapter in the sci-fi franchise that began in 1999 with one red pill and has populated cultural consciousness ever since. Wachowski directs from a screenplay written by Wachowski, David Mitchell & Aleksander Hemon, based on characters created by The Wachowskis. Keanu Reeves and Carrie-Anne Moss reprise their roles as Neo and Trinity.

The original film was shot with a green filter, a tonal key borrowed from the green code typical of industrial computer monitors of the time. It was also shot, like its two sequels, on film. For The Matrix Resurrections, Wachowski wanted to take advantage of the versatility of digital.

“We knew we had to cater for significant VFX, but Lana also wanted to shoot as much in camera as she could,” says Massaccesi. “In addition, she has evolved the way she shoots movies and prefers to use Steadicam a lot of the time to feel more involved in the creation of the story on set. Pausing to reload film reels would just delay the creative efficiency.”

Massaccesi has operated camera for Toll on Wachowski projects including Cloud Atlas, Jupiter Ascending and Sense8. They collaborated with the director to reimagine the groundbreaking ‘bullet time’ visual effect of The Matrix for Resurrections.

“In prep for Resurrections, we discussed using an array of 100 cameras shooting at 120 frames per second,” Massaccesi says. “In tests we found RED’s RANGER HELIUM was the best at capturing high frame rates. Camera reliability was another important consideration, plus we were going to need dozens of camera bodies making RED the perfect camera choice.”

As production developed, the initial plans for volumetric capture were altered in favor of different ways to tell the story. Arrays of 15 RED RANGER bodies were deployed for certain shots on set. This included a shot of Trinity screaming, with multiple shadowy perspectives of her face captured by the camera array showing in the same frame.

In the film’s most spectacular homage to the bullet time sequence, two actors [Reeves and Neil Patrick Harris] appear to move at different speeds within the same frame.

Explains Massaccesi, “Lana wanted one actor to be moving super fast and one moving super slow, and yet be in the frame together interacting. It was our job to try to find a solution for that, and importantly, one that would still enable the scene to be shot as normal without disruption.”

The answer was to use a stereoscopic rig. Instead of having each RANGER in parallax as if to shoot 3D, the cameras were aligned to shoot an identical view, with one recording 6K 120fps and the other 6K 8fps. The footage was then blended in post to create an 11-minute-long scene played back at normal film speed 24fps.
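The arithmetic behind the effect is simple: footage captured faster than the 24fps playback rate appears slowed down, and footage captured slower appears sped up. A small sketch using the frame rates quoted above (the helper function is ours, for illustration):

```python
# Apparent speed of motion when footage captured at one frame rate is
# played back at another: playback_fps / capture_fps.
def apparent_speed(capture_fps: float, playback_fps: float = 24.0) -> float:
    """Multiple of real time at which the motion appears to run on playback."""
    return playback_fps / capture_fps

print(apparent_speed(120))  # 0.2 -> the 120fps camera yields 5x slow motion
print(apparent_speed(8))    # 3.0 -> the 8fps camera yields 3x fast motion
```

Blended into one 24fps frame, the two streams give the super-slow and super-fast performances described above.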

“This is not something we could have practically achieved using a camera array which would demand a couple days work just to set up one shot. The setup also gave the actors greater freedom to move. They weren’t locked onto a green screen unable to perform opposite each other. For Lana, this was all about stepping away from having to postprocess everything and keep as much in camera as possible.”

Similarly, Wachowski tasked her creative team to build interactive lighting fixtures into the film’s production design. One of these illuminated the portals through which characters in the film travel when inside the Matrix.

“On set we placed LEDs around the rim of a portal door just so it glowed each time it was activated in the film and a character passed through it. This real-world lighting helps VFX to keep the whole scene feeling real.”

Having finished the major chase scenes and exteriors in San Francisco, the production moved to Babelsberg Studios in Berlin before COVID halted activity in March 2020. By the time resumption was greenlit in August 2020, Toll felt unable to travel to Europe because of the pandemic and recommended to Wachowski that Massaccesi take over the reins.

“I was slightly nervous to be honest,” Massaccesi admits. “When we’d finished in San Francisco, we felt we’d accomplished quite a lot, but we still had three quarters of the movie to shoot.”

Wachowski knew the film was in good hands: Massaccesi had already built an intimate working relationship with the director and the Italian crew over the course of their previous films together. On Sense8, they evolved a technique for almost physically attaching the director to the back of Massaccesi’s Steadicam so that they move together as they film. They shot over 98 percent of The Matrix Resurrections using this method.

“It’s a process that is not only very efficient, but I think gives Lana a direct and immediate sense of the actor’s performance,” Massaccesi explains. “Being with me on set means she sees and feels what’s going on without any need to pause and look at a monitor. We created a bubble together where all the creative decisions were made. She can immediately tell the actors whether they should change this or that. Although we’ve discussed the scene and I know what to do, she might change her mind and whisper in my ear as we film ‘go right’ or ‘go left’ or ‘look at this’ or ‘point the camera’ there. It’s kind of remote control!”

He adds, “Because Lana wanted to be on Steadicam all the time, that meant we didn’t want to spend much time changing lenses. We relied on Zeiss zooms for most of the film which, combined with shooting at 6K, allowed us to quickly resize the frame if we wanted to without cutting the camera. That’s helpful for the actors too, because it meant we could make longer takes.”

Massaccesi used all his experience to judge the scene’s lighting by eye during the shot and was able to monitor the LUT on a small monitor attached to the Steadicam. In keeping with the original, Toll had set two principal looks for the film to differentiate the human and the synthetic world.

“The Matrix in our film is modern and colorful like a postcard,” says Massaccesi who supervised the DI in Berlin, “while reality is cooler, darker with more contrast. The RED RANGER gives us more information than we need and the freedom to create different looks. For a scene in which Neo confronts Morpheus for karate training, the tone is set according to the warm colors of sunset.”

Supplementing the RED RANGERs, Toll and Massaccesi used RED KOMODOs for stunt work and additional coverage.

“We loved the KOMODO,” says Massaccesi. “It’s such a small body you can place it anywhere. We used it a lot in vehicles. On a movie like this with lots of fast motion, gunfire and explosions, you want the action to be as hard and as high resolution as possible. The global shutter eliminates that rubber motion you get from cameras with a conventional shutter. It’s a terrific camera.”

He continues, “For one scene, set in an office with all the fire alarms activated, we filmed for three days with a full blast of water coming off the buildings and we had no issue with the camera. We never had a problem with cold or heat. In another stunt scene in San Francisco, the camera body of a KOMODO melted in a vehicle explosion but we saved the footage. It just shows how strong and reliable RED cameras are.”

The toughest shot for Massaccesi was capturing Reeves and Moss leaping from a building 43 floors up. The actors trained for a month and were safety wired for the jump, which they made in San Francisco.

“The actors are running then jumping and I had the camera leading them in front and then following them down. It’s always a tricky kind of stunt shot. You need to be incredibly careful with the wires and the speed, plus you don’t want to miss it! The first few jumps they were a bit nervous which is quite natural, but they did it, we got it, and it’s a tremendous finale.”

 

IP Video Delivery in a 2021-22 Snapshot

InBroadcast

With 2021 in the rearview, we asked a cross section of executives to give us their perspective on the headline trends in IP Video Delivery and to cast ahead to how this is likely to evolve in technology and business. 

https://europe.nxtbook.com/nxteu/lesommet/inbroadcast_202112/index.php#/p/48

TAG Video Systems, Kevin Joyce, Zer0 Friction Officer  

“Not long ago, IP was just an emerging protocol, now it is the main form of delivery and widely embraced throughout the globe. In 2021, IP trends included migration to the cloud even as vendors and users alike explored its capabilities and limitations. This significant shift in operation merited a review of pricing methods and has resulted in a fundamental, and appealing, change in fee structures as well as a substantial increase in asset utilization in the media industry.  

If the past two years taught us nothing else, they taught us that we must be flexible and prepared for the unpredictable. Agility and the ability to pivot on a moment’s notice will be a top priority in 2022 as consumer demands continue to evolve and challenge vendors to develop technologies at an accelerated pace. Open, flexible solutions that enable rapid change, support growth and drive vendors to reach new levels of workflow agility and business enablement will continue to lead the market.”

Cerberus Tech, Chris Clarke, CEO 

Naturally, the impact of the pandemic has resulted in widespread interest in IP technology to futureproof content delivery infrastructure. Overall, there has been a significant increase in awareness of the benefits of broadcast IP, both in terms of the downstream cost-profile and the scalability of the cloud. There have been many more RFIs and RFPs with IP delivery sections, particularly for live sports events, which have suffered from unpredictable scheduling in recent months. We’ve also noted cautious approaches to ST 2110 deployments among major broadcasters – a trend set to become more widespread in 2022.

More customers state that they are looking for a multi-cloud solution for IP video delivery. Currently only one of the major cloud providers deploys media services and customers are in need of uniformity across the deployment. Moving forward, it seems likely that a significant number of customers will explore building their own services or actively look for partners that offer multi-cloud compatible infrastructure. 

Ateliere Creative Technologies, Arjun Ramamurthy, Global CTO 

“In 2021, consumer viewing habits continued to change, with more content and viewing options available than ever. More people began watching content online after a show aired to see only the highlights, without having to sit through an entire program. And even if people did watch the first-run broadcast, they still wanted something easily shareable with their social networks. It’s all about packaging and boiling down content into bite-sized pieces for people who want their water-cooler talk, or their TikTok shares. Content owners are increasingly turning to cloud-based platforms which support the assembly and delivery of these components to consumers while still allowing them to retain full creative control.

The traditional linear IP video delivery model will continue to evolve towards more componentized, just-in-time encoding and delivery. With a componentized model, we can deliver parts and pieces of content quickly and easily without requiring full programs in 30 different versions to be pre-formatted and then stored at a CDN site.  

Not everyone has the time or the interest to sit and watch a full programme. Instead, they want it componentised for consumption during their Uber ride or in an autonomous car: ‘I only have 10 minutes during this drive, this is the content I want to watch.’ This componentised content can be pushed to a user’s device if they’ve subscribed to a provider’s highlights, or made available for them to retrieve whenever they want.”

 

Cobalt Digital, Chris Shaw, COO 

“There has been an increase in programming delivered over IP in both 2020 and 2021. News coverage, TV shows and films have accounted for the most significant growth, but the use of streaming services for video games, computers, podcasts, audiobook listening applications, and consumer-produced media has escalated substantially as well. And the pandemic made digital meeting applications a necessity to maintain business and social communications. Supporting all of these is an explosion of streaming technology that incorporates high-speed and low-latency capabilities to reward those spending time on social media, messaging services, and network TV with a superior viewing experience.

VR and AI are now adding to the possibilities, with multiple levels of control giving the end user the ability to select their own preferred view, audio reproduction and more. Moving forward this need will only grow. Data capture capabilities will enable providers to identify individual end-user preferences. We’re already seeing viewing habits tracked by streaming companies, with online merchants identifying shopping and browsing behaviour and filling users’ screens with preferred products and entertainment choices. While we are still exploring this brand-new space, there appear to be no limits to what can be achieved over the next few years.”

Exterity, Colin Farquhar, SVP Sales 

“In 2021, we’ve observed increased demand for secure distribution, which has encouraged us to focus on the expansion of Secure Reliable Transport (SRT) support across products. There’s also been greater interest surrounding enhanced codec capabilities, with partners looking to benefit from improved quality, reduced bitrates and robust content security as they integrate into new applications. 

Once again, we saw a continued ramping up of video consumption this year, which has boosted demand for transcoding and CDN interfacing to deliver content to more devices.  

VITEC’s acquisition of Exterity made big news in the industry. We firmly believe this is a real positive for the market and our customers, and it has allowed us to expand our range of integrated products available. It’s an exciting time and we look forward to 2022 with optimism. 

An issue making global headlines during these past 12 months has been that of climate change, emissions, and sustainability. VITEC has committed to changing the trajectory of its carbon footprint, and ‘net-zero’ has been a key focus. VITEC products now deliver more for less, with dedicated hardware optimising both performance and power consumption. This is a huge topic, and we expect to hear more about it from across the industry in 2022.” 

 

MediaKind, Stuart Boorn, VP Product Management  

“IP video adoption is increasing at pace. Whenever a customer is fitting out a production studio or building a contribution infrastructure, IP technology is always present. Broadcasters are enabling more of their services for live streaming, or even streaming-only for some services using technologies like HbbTV.  

At MediaKind our deployments around SMPTE ST 2022-5 and SMPTE ST 2110 with NMOS are becoming increasingly commonplace in the market. It’s not essential for every deployment, but what was a technology discussion three years ago is becoming standard infrastructure. Interoperability is now much more reliable, after the early days of intensive interoperability sessions to ensure we could work with other vendors.

As the world of production and playout evolves to the cloud, the IP headend is following. Although this has extra dependencies on cloud costs, SaaS adoption and the rate of technology refresh, it is inevitable. This will lead to an acceleration in the adoption of IP across the entire video chain, from the headend to production and playout. Our challenge is to ensure our customers get the commercial and flexibility benefits of IP and cloud for live video as this technology adoption accelerates.” 

 

Net Insight, Per Lindgren, CTO 

“In 2021, we saw the media industry building on the innovation in remote and distributed production that was delivered in 2020. Over the last year, media companies continued to transform both technologically and culturally. Following the challenges of 2020, organizations embraced flexible production workflows that give them the agility and scalability to produce a bigger volume of content more efficiently.  

Our customers tell us they want cutting-edge technology that transforms their business today and gets them ready for the future. They want to ensure business continuity under challenging circumstances and a remote and distributed production model can give them peace of mind. 

In 2022, IP and cloud acceleration will continue. We’ll see more IP studios and the combination of high-end ST 2110 and NDI production environments. Remote and distributed production will continue to rise, enabling media companies to transition to OPEX, work with the best talent across the world, save on costs, and increase efficiencies. Viewers can also expect richer content experiences powered by 4K and AI-controlled autonomous cameras that bring consumers to the centre of the action.

2020 was the year of challenges and transition, 2021 the year of recovery, and 2022 will be the year when media companies will embrace innovation and strategize about their future.” 

 

Telestream, Luann Linnebur, Product Marketing and SMPTE Fellow

“During 2021, live OTT events have become possible through lower latency, increased reliability of delivery, and enhanced monitoring. In addition to OTT live delivery, content creation technologies have evolved. Decentralized production of content has become more common, allowing creatives broader choices and lowering costs through shared use of production tools. The ability to validate and monitor content as it moves through production and LAN/WAN domains has helped to drive this adoption and will become even more important going forward.

Next year and beyond, we’ll see an increase in interoperable, bookable, cloud-based solutions. Content creators and service providers will continue to utilize hardware, virtualized or cloud-native tools, and be able to manage and monitor them through common interfaces. As content is delivered, the need for a global view, in light of increased content viewing options, will mean an increase not only in the need to monitor and validate that the correct content ran when intended, but that the intended quality reaches the audience that has selected it, and that ad fill rates improve.

The OTT advertising market is in a period of sharp growth. US AVOD revenues will reportedly triple between 2020 and 2026 to $31bn. Through correlated data that looks at viewer experience along with content at every step in the delivery process, issues can be resolved before viewers seek alternate offerings and revenues can be improved through more targeted ad delivery and better proofs of performance.  Visibility and quick issue resolution will be key to these complex new offerings.”   

 

PlayBox Neo, Pavlin Rahnev, CEO 

“Automated server-based playout continues to be the preferred mode of operation throughout the industry. A key driver is the need for maximum flexibility in where, when and how broadcasters choose to work.  

“Secure IP-based operation has long been a core feature of our technology, allowing easy control of the entire playout process regardless of physical distance. Our research shows that ingesting large amounts of incoming content streams is a significant challenge for many channel operators. The recently introduced PlayBox Neo Capture Suite speeds up the ingest workflow of television networks, post-production facilities and playout service providers. We have also expanded the feature set of our server-based AirBox Neo-20 modular playout system. Suitable for on-premises or IP-linked remote control, AirBox Neo-20 allows playlist scheduling to be performed weeks ahead of actual transmission.

“We see a strong trend towards hybrid combinations of server-based and cloud-based playout. Our Cloud2TV offering forms a robust platform for secure MAM and archiving, accessible by accredited personnel anywhere in the world. Cloud2TV is fully compatible with existing PlayBox Neo solutions and can be used to extend their capabilities and functionality. The only hardware an operator requires is an internet-linked desktop or mobile computer. Control is via an intuitive web-based interface with administrator-adjustable rights assignment, TV channel management, action-logging and notifications.”

LiveU, Ronen Artman, VP Marketing 

“LiveU has seen tremendous growth across the IP bonding space in 2021, a space that’s increasingly broad as adoption of our technology spreads to all corners of the industry. It’s across the sports sector that we’ve seen a particularly spectacular increase in use. This is not only for dynamic, value-add additional material - now we’re seeing rising use for main feeds across top-tier sports.

It’s clear that real-time, or close to real-time, cloud services will continue to grow in use and complexity, enabling end-to-end, cloud-based workflows. LiveU is very active in the development of these services, either via tightly integrated solutions with key industry partners like Avid, Blackbird, Grabyo, Grass Valley and Vizrt, or through ongoing internal solution development, with the recent launch of orchestration platform Air Control one example.  

It’s now not a question of whether it can be done but how best it can be done. This also applies across remote production. Our technology and the industry are ready. Now it’s up to our imaginations.” 

 

Zixi, John Wastcoast, SVP Strategic Alliances and Marketing 

“Live IP video adoption is rapidly accelerating, with all major broadcasters pivoting away from satellite and stating dates for entire transition plans. The drivers have been virtualization and the desire and necessity to operate remotely, the regionalisation and localisation of content, rapidly evolving monetization opportunities, C-Band capacity migrating to 5G, as well as hybrid network workflows over the Internet, in the cloud and on fibre. The low risk associated with Opex allows services to be tested without worrying about long-term losses.

Looking ahead: Real-world deployments of 5G in B2B television use cases will increase. The high bandwidth, ultra-low latency wireless network providing universal edge will enable mobile consumers to create and receive 4K live content, as well as mobile-to-CDN workflows.  

Like other verticals, M&E is adopting AI and ML to overcome data overload and realize autonomous broadcast operations. Networks generate too many false alarms, making it hard to tell what is and is not important, but with new tools broadcasters can visualize all the data so that a human can understand it and be given better root cause analysis of the true sources of instability and failure. These analytics deliver predictive maintenance that is orders of magnitude cheaper and easier, as well as improving confidence in new configurations.”

 


Behind the Scenes of Lana Wachowski’s The Matrix Resurrections

IBC

Cinematographer Daniele Massaccesi explains how the visual representation of The Matrix has been updated, including the iconic ‘bullet time’ effect

https://www.ibc.org/features/behind-the-scenes-of-lana-wachowskis-the-matrix-resurrections/8241.article

The Matrix was as prophetic as it was popular on release in 1999. It won four Academy Awards including for VFX, reset the bar on high-concept sci-fi cinema and spawned a meme (red pill or blue) that resonates ever stronger as the facsimiled digital dystopia of the metaverse edges closer.

A tall order, then, for creators Lana and Lilly Wachowski to repeat; an attempt that many fans felt fell short in the two back-to-back sequels, Reloaded and Revolutions, of 2003.

Almost 20 years on, and the computing landscape has changed, advancing filmmaking technology with it. These developments called for an updated visual representation in Lana Wachowski’s return to The Matrix Resurrections.

“Resurrections is designed to reflect the leap forward in computer interface and to give us a different look and a more natural colour palette,” says Daniele Massaccesi, co-cinematographer on the project.

The original, which was released six years before Apple launched the iPhone, illustrated the ability of characters to travel within the Matrix through phone handsets and dial-up ISDN lines. In Resurrections they travel through portals.

The tonal look of The Matrix was green to mimic corporate computer screens of the time while the real world was given an inhospitable, drab blue tinge.

“This time our look for the Matrix world is more colourful, like a postcard, while the depiction of the real world remains cooler and darker with more contrast,” explains Massaccesi. “In Resurrections, the Matrix has been designed so humans in the real world find the simulacra to be believable. It is therefore photoreal and full of colour.”

While The Matrix and its two sequels were shot on film, Wachowski wanted to take advantage of the versatility of digital and worked with Red Ranger Helium cameras shooting 6K.

Homage to bullet-time

“Principal photography was only 88 days – for a movie of this size that’s very quick,” says Massaccesi, who began the project as A camera operator for John Toll, ASC. When Covid halted production after location shoots in San Francisco, with all interiors left to shoot at Studio Babelsberg, the American felt he couldn’t resume due to the health risks of travel. Massaccesi, an Italian with vast experience including work with Toll on Wachowski projects such as Cloud Atlas, Jupiter Ascending and Sense8, took over.

This included reimagining the iconic ‘bullet time’ visual effect of the original which appeared to show action in slow-motion while the camera moves through the scene at normal speed. It was achieved using arrays of cameras and postprocessing, and that was the starting point this time for director and camera team.

“Working with Lana and John in prep we discussed using arrays of 100 cameras shooting at 120 frames per second,” Massaccesi says. “We decided not to go this route, partly because we felt we wanted to tell the story in a different way and also because Lana’s filmmaking has changed over the course of movies since. She prefers much more fluidity in her process and to work with Steadicam a lot. She didn’t want to pause production to do complex VFX. She wanted to shoot as much in-camera and on Steadicam as she could.”

Arrays of 15 Ranger bodies were however deployed on certain shots. This included a scene in a café where Trinity (Carrie-Anne Moss) is screaming and shown with multiple shadowy perspectives of her face in the same frame.

In the film’s most spectacular homage to the bullet time sequence, actors Keanu Reeves and Neil Patrick Harris appear to move at different speeds within the same frame.

“Lana wanted one actor to be moving superfast and one moving super slow and yet be in the frame together, interacting,” explains Massaccesi. “It was our job to try to find a solution for that and importantly one that would still enable the scene to be shot as normal without disruption.”

The answer was to use a stereoscopic rig. Instead of having each camera in parallax as if to shoot 3D, the cameras were aligned to shoot an identical view with one recording 6K 120fps and the other 6K 8fps. The footage was then blended in post to create an 11-minute-long scene played back at normal film speed 24fps.

“This is not something we could have practically achieved using a camera array, which would demand a couple of days’ work just to set up one shot,” he adds.
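The dual-speed effect comes down to simple frame-rate retiming: footage captured at 120fps and played back at 24fps appears five times slower, while 8fps footage played back at 24fps appears three times faster. A minimal sketch of that arithmetic (the helper function is purely illustrative; the frame rates are the ones quoted for the shoot):

```python
def retime_factor(capture_fps: float, playback_fps: float = 24.0) -> float:
    """How much slower the action appears when footage shot at capture_fps
    is played back at playback_fps: >1 means slow motion, <1 means sped up."""
    return capture_fps / playback_fps

# Camera A: 120fps played at 24fps -> action appears 5x slower.
slow = retime_factor(120)  # 5.0
# Camera B: 8fps played at 24fps -> action appears 3x faster.
fast = retime_factor(8)    # ~0.33
```

Blending the two retimed streams in post, as the production did, then yields a single 24fps scene in which the two actors move at visibly different speeds within the same frame.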

“I think in the original she found the VFX a slow process with a lot of passes to get it right. She is stepping away from that and wants her film to be more realistic in the sense of finding more of the solutions on set. Clearly there are a lot of VFX in this film [led by Framestore and DNEG] but the trick was always to find something practical when shooting a scene for the VFX to reference.”

One of these ideas was putting a rim of light over the portals that catch the actor’s hair and face as they move between the worlds.

Intimate Steadicam operation

On Sense8, Wachowski and Massaccesi had evolved a technique where the director would shadow the Steadicam operator during filming. It’s unusual, since most directors would stand back from the set monitoring the shots. But Wachowski was so enamoured of the process that they shot close to the entire film using this method.

Wachowski has described the technique as forming “this weird, four-legged creature with one eye”.

“It’s a process which is not only very efficient but I think gives Lana a direct and immediate sense of the actor’s performance,” Massaccesi reveals. “Being with me on set means she sees and feels what’s going on without any need to look at a monitor. She can immediately tell the actors whether they should change this or that. Although we’ve discussed the scene and I know what to do, she might change her mind and whisper in my ear as we film ‘go right’ or ‘go left’ or ‘look at this’ or ‘point the camera’ there. It’s kind of remote control!”

Supplementing the Rangers, the DPs used Red Komodos for stunt work and additional coverage. “On a movie like this with lots of fast motion, gunfire and explosions you want the action to be as hard and as high resolution as possible,” Massaccesi says. “The global shutter eliminates that ‘rubber motion’ you get from cameras with a conventional shutter.”

It also proved nigh on indestructible. Massaccesi says this was not an easy movie to shoot. “For one scene set in an office with all the fire alarms activated we filmed for three days with a full blast of water coming off the buildings and we had no issue with the camera. We never had a problem with cold or heat. In another stunt scene in San Francisco the camera body of a Komodo melted in a vehicle explosion but we saved the footage.”