Wednesday, 24 February 2021

SMPTE 2110: Is it fit for the future of broadcast?

IBC

As live production moves to the cloud, can SMPTE Standard 2110 evolve to retain broadcast quality standards or should the industry wholeheartedly embrace web technologies? 

https://www.ibc.org/trends/smpte-2110-is-it-fit-for-the-future-of-broadcast/7301.article

Broadcast production is at a crossroads and CTOs have a decision to make: should their new studio or mobile facility be built using SMPTE standard 2110 or something that might be more suitable for the cloud computing age?

It’s a problem that has surfaced in recent months as large-scale live production — the area of premium broadcast programming for which 2110 was principally designed — has shut down or resorted to less conventional technologies to keep on air.

For many, ST 2110 still represents the bedrock of professional production and a relatively risk-free way to segue the industry’s legacy SDI base into IP. Others see an existential crisis in which broadcast engineering-based standards are a cul-de-sac and that if traditional players are ever to innovate on par with internet-first streamers they need to change the narrative.

“Broadcast TV must adapt to online entertainment formats faster than the online entertainment formats make broadcast irrelevant,” says Johan Bolin, Edgeware’s chief product and technology officer.

“A growing number of broadcasters are asking themselves do they really need to continue building the broadcast stack along conventional lines or is now the time to embrace web and other technologies more generally in production.”

For many the issue boils down to the engineering mindset. If your starting point is to build a perfect pipeline where all the important performance indicators like frame sync are under full control and can be guaranteed, then this will inevitably fail when working in the cloud.

A STOP2110 website is blunt. It lambasts the standard as a “train wreck”, “worse than SDI” and “old school hardware engineering combined with design by committee.” The site doesn’t suggest any genuine alternative and its author lacks the courage to go public, but it raises the question: why has a dry international specification drawn such ire?

The value of ST 2110
From the analogue era through SDI the industry has used baseband video simply because it’s the highest quality. Uncompressed media also offers the lowest latency because it requires no processing, which matters in live production where you need to interact with your studio and your performers in real time.
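The bandwidth stakes behind that choice are easy to quantify. As a rough sketch (the figures below are mine, not from the article), the raw bitrate of an uncompressed HD signal works out as follows:

```python
# Back-of-envelope bitrate for uncompressed 1080p60 video at 10-bit 4:2:2,
# the kind of payload a 3G-SDI link or an ST 2110-20 stream has to carry.
# Illustrative arithmetic only.

def uncompressed_bitrate(width, height, fps, bit_depth, samples_per_pixel):
    """Raw active-picture video bitrate in bits per second."""
    return width * height * samples_per_pixel * bit_depth * fps

# 4:2:2 chroma subsampling averages 2 samples per pixel (Y plus half-rate Cb/Cr)
bps = uncompressed_bitrate(1920, 1080, 60, 10, 2)
print(f"1080p60 10-bit 4:2:2: {bps / 1e9:.2f} Gbps")  # ~2.49 Gbps
```

Add blanking and you are near the 3 Gbps nominal rate of a 3G-SDI link, which is why uncompressed IP transport demands serious network capacity.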

When it came to devising a means to migrate the industry into IP, these fundamentals were sensibly maintained. Standard 2110, for which SMPTE and co-developers VSF, EBU and AMWA have been awarded a Technical Emmy, reinvents SDI by providing for uncompressed video and precision timing.

It differs significantly from SDI in splitting audio, video and metadata into separate streams (or essences). Instead of having to worry about running the correct type of cable and signal to various locations, broadcasters have far greater versatility to run a responsive studio business.

“The idea that you could separate the flexibility that you needed from the cabling you laid down was considered a goal worth achieving,” says Bruce Devlin, SMPTE’s Standards Vice President.
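A flavour of how that separation looks on the wire: ST 2110 devices advertise each essence with its own SDP description. The fragment below is an illustrative sketch of a single ST 2110-20 video essence; the addresses, ports and session IDs are invented, and audio would travel as a separate ST 2110-30 (AES67) stream with its own description.

```
v=0
o=- 1443716955 1443716955 IN IP4 192.168.10.1
s=Example ST 2110-20 video essence
t=0 0
m=video 50000 RTP/AVP 96
c=IN IP4 239.100.1.1/64
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; exactframerate=60000/1001; depth=10; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017
a=ts-refclk:ptp=IEEE1588-2008:traceable
a=mediaclk:direct=0
```

Because each essence is its own multicast stream, a facility can route audio, video and metadata independently over the same fabric rather than over dedicated cables.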

Since standardisation in 2017, ST 2110 interfaces have been added to core equipment from cameras to multiviewers, enabling broadcasters to build new facilities entirely in IP. BBC Wales’ new facility in Cardiff is one example.

It achieved its aim of unifying the industry around a common suite of IP specs and allows broadcasters to migrate as fast as their investment allows by keeping one foot in the SDI camp.

However, the rocketing rise of OTT streaming and the advance of cloud computing, both exacerbated by Covid-19, have put the future of 2110 under scrutiny, even at SMPTE itself.

“It is not really that 2110 is the wrong standard, it’s that the means of content consumption has started to change rapidly,” Devlin says. “The global pandemic accelerated this when live sports and stage events, all the stuff that 2110 is dedicated to, almost vanished overnight.”

Cloud-based workstations using PCoIP and low-cost, low-bandwidth video transmission have become the norm. Business teleconferencing tools, smartphone cameras and webcams are in routine use in at-home scenarios for both production crew and on-air talent. ST 2110 was not designed for this.

What’s more, the audience has begun to accept what the IABM calls ‘Covid Quality.’

“The use of off-the-shelf collaboration tools may not be ideal, but it keeps the media factories running,” it finds. “Audiences started to accept glitches, streaming issues and for that matter more often than not poor video and audio quality; our expectations for more 4K UHD in 2020 turned into Covid Quality.”

PTP meets floppy timing
It’s not as if things will go back to normal when the pandemic passes. Remote production links contributed over the internet were advancing anyway. Now they are entrenched. Cloud computing and cloud services are becoming ubiquitous.

“We’re having to find ways to use the 2110 ecosystem to connect nano-second accurate studio environments with remote operations over the internet or in the cloud where floppy timing exists,” Devlin says.

The Joint Task Force on Networked Media (JT-NM), which coordinates SMPTE 2110 and the wider development of a packet-based network infrastructure, is investigating ways to connect the wide area network of a production plant with tools, applications and facilities outside of the studio.

However, current cloud connections are not up to the quality standards required for low latency live streaming media. Therefore, SMPTE says research into quality-enhancing technologies, such as retransmission or Automatic Repeat reQuest (ARQ), is crucial to improving the network infrastructure required to deliver broadcast-quality transmissions. 
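A minimal sketch of the retransmission idea follows. The sequence numbering and NACK logic here are invented for illustration; real RIST and SRT implementations add timers, retransmission windows and pacing on top of the same principle.

```python
# Toy selective-retransmission ARQ: a receiver spots gaps in RTP-style
# sequence numbers and asks the sender to resend the missing packets.
# Purely illustrative; not a real protocol implementation.

class ArqReceiver:
    def __init__(self):
        self.expected = 0   # next sequence number we expect
        self.buffer = {}    # packets held for in-order delivery
        self.nacks = []     # retransmission requests to send back

    def receive(self, seq, payload):
        if seq > self.expected:
            # Gap detected: request every packet we skipped over
            self.nacks.extend(range(self.expected, seq))
            self.expected = seq + 1
        elif seq == self.expected:
            self.expected += 1
        # Store the packet whether it arrived in order, early, or as a retransmit
        self.buffer[seq] = payload

    def deliver_in_order(self):
        """Packets handed to the decoder, in sequence order."""
        return [self.buffer[s] for s in sorted(self.buffer)]

rx = ArqReceiver()
rx.receive(0, "frame0")
rx.receive(2, "frame2")   # packet 1 lost in transit
print(rx.nacks)           # [1] -> sender retransmits packet 1
rx.receive(1, "frame1")   # retransmission fills the gap
```

The cost of this recovery is latency: the receiver must hold packets long enough for a round trip and a resend, which is exactly the trade-off SMPTE’s research has to balance for live media.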

“The JTNM say we still need 2110 accuracy within a facility but we don’t necessarily need 2110 perfection between two facilities or between an OB truck and a facility,” says Devlin.

“It’s finding a way to take the gold-plated excellence of 2110 together with parts of the ecosystem which are less gold plated and using them both to produce better in a Covid world.”

One option is to compress media to get it to and from the main production plant or into and out of cloud. The leading scheme is ISO standard JPEG XS, a mezzanine compression that squeezes the bits sufficiently to save on bandwidth but not hard enough to destroy the quality needed for manipulation, like chroma keying, in production. Crucially for live production, JPEG XS exhibits extremely low latency. It is already mapped into the 2110-22 ecosystem and products are launching with JPEG XS capability.
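Rough arithmetic shows why a mezzanine codec matters for contribution links. The figures below are mine, assuming a 10:1 JPEG XS ratio; actual ratios vary by profile and content.

```python
# Why uncompressed UHD won't leave the building, but JPEG XS will.
# Illustrative numbers only; real streams carry packet overheads too.

def gbps(width, height, fps, bit_depth=10, samples_per_pixel=2):
    """Raw video bitrate in Gbps for a given format (4:2:2 by default)."""
    return width * height * samples_per_pixel * bit_depth * fps / 1e9

uhd_raw = gbps(3840, 2160, 60)   # uncompressed UHD 2160p60: ~10 Gbps
xs_10to1 = uhd_raw / 10          # JPEG XS at an assumed 10:1: ~1 Gbps

print(f"uncompressed: {uhd_raw:.2f} Gbps, JPEG XS 10:1: {xs_10to1:.2f} Gbps")
```

At around 1 Gbps a UHD contribution feed fits comfortably on leased IP circuits, while remaining robust enough for chroma keying and other downstream manipulation.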

The BBC also expects ‘hybrid’ architectures to evolve, and its R&D team is looking at how to ensure interoperability. In a blogpost, it says: “ST 2110 isn’t naturally suited to deployment in a cloud, so we expect ‘hybrid’ architectures to evolve, and will be looking at how to ensure interoperability in these. This is likely to include work on ensuring that media identity and timing information is preserved, including where we need to go through compressed channels, such as for contribution from remote studio.”

ProAV interoperability
Also in the works is a proposal to standardise the interoperability of products within the Pro AV sphere. The Internet Protocol Media Experience (IPMX) would encompass many technologies being used by at-home productions, such as robotic PTZ cams and webcams, as well as video conference codecs.

IPMX is based on 2110 and promoted by the Alliance for IP Media Solutions (Aims), which is chief cheerleader for ST 2110 in broadcast. This makes sense since, according to Aims, a quarter of its members sell into both broadcast and AV markets.

The move also recognises that both AV and broadcast are undergoing a transition to IP. The benefits, such as bi-directional cabling and reduced rack space, are similar for both industries. The gear used to produce and distribute content for giant screens at music venues, for digital signage or esports events is also sold into broadcast. And in many cases the quality of AV content exceeds that of broadcast.

The elephant in the room when talking to SMPTE, Aims and ST 2110-supporting vendors like Imagine and Sony is the widely used video-over-IP transport scheme NDI. Developed by NewTek and owned by Vizrt, NDI is a live production protocol considered a non-starter by backers of 2110 because its heavy compression is considered unsuitable for broadcast and its proprietary nature incompatible with open standards.

Innovating production
However, these arguments are precursors to the wider challenge of evolving production to deliver truly personalised, interactive media.

This is generally considered the future of ‘TV’ and is tantalisingly in reach thanks to high-speed high bandwidth technologies like 5G. In comparison, the production of content itself remains in the dark ages and ST 2110 is considered by some to be part of the problem.

“Fundamentally, if TV is to transform it must overcome the brick wall between production and distribution,” says Bolin. “These two domains are separated and 2110 is not the solution.”

He argues that while the upstream process in TV is all about creating content, the downstream process attempts to reach as wide an audience as possible, whether through satellite, cable or DTT and now the internet.

“Upstream has worked with the same production processes and tech stacks for five decades but the growth of the internet has forced broadcasters to increasingly work with internet-based technologies downstream.

“Yet it is incredibly difficult today for viewers to contribute video upstream. This is by design. It is not a consequence of the technology. It is how we have designed the technology.”

Bolin says he wants to see technology that “not only allows” but “encourages the industry to mix and blend downstream and upstream processes” to enable TV formats more tailored to the viewer or concepts that allow viewers to contribute to the live programme.

BBC in the cloud
These are not the thoughts of one maverick vendor. The BBC is thinking along identical lines.

Having started to use IP in production centres like BBC Wales fitted with 2110 it now says, “the content-making capacity and equipment in these facilities is still mostly fixed during the design and fitout stages, meaning large changes can only occur during a re-fit. The business operating model of a current generation IP facility is also fairly inflexible, with large capital expenditure required upfront.”

Those are alarming statements given that they could equally apply to SDI, the prison from which ST 2110 promised escape. Content still has to be created using traditional broadcast equipment in a physical production facility.

BBC R&D is therefore investigating how it can apply the cloud computing technologies which run iPlayer to its production operations.

“R&D are working with colleagues from around the BBC to join up these two areas, enabling broadcast centre-style production operations to occur within a software-defined cloud environment,” it states.

“We think the benefits of this will be huge, making our physical IP facilities even more flexible, and enabling us to deploy fully virtualised production systems on demand. Ultimately, this will help the BBC make more efficient use of resources and deliver more content to audiences.”

Edgeware is making similar explorations. “The idea is to take web-based technologies and the tech stacks and concepts from games development and esports and incorporate those into the TV stack,” says Bolin.

“Broadcast has always been about guaranteed bitrates and guaranteed framerates and guaranteed no drift in time. Video on the internet is about accepting its imperfections, accepting that you will have drift and you will have a problem guaranteeing perfect bit rates. The onus is on the industry to build solutions that mitigate these imperfections.”

No matter how perfect the upstream there will always be imperfection in the downstream. That’s true with SDI or 2110 since the source is always degraded in some form during distribution. Bolin says the industry should prioritise innovation in production and work with the internet’s concept of best-effort distribution.

Indeed, there are a number of protocols for smoothing loss, jitter and latency, such as MPEG DASH, RIST and SRT, which mitigate the internet’s deficiencies.
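The common ingredient in those protocols is a receive-side buffer that trades a little latency for re-ordering time. A toy sketch follows; real implementations key on timestamps and a target delay rather than packet counts, so this is illustration only.

```python
# Minimal jitter buffer: hold packets briefly so late, out-of-order arrivals
# can be put back in sequence before playout. Illustrative sketch only.

import heapq

class JitterBuffer:
    def __init__(self, depth):
        self.depth = depth   # how many packets to hold before releasing
        self.heap = []       # min-heap ordered by sequence number

    def push(self, seq, payload):
        heapq.heappush(self.heap, (seq, payload))

    def pop_ready(self):
        """Release packets in sequence order once the buffer is deep enough."""
        out = []
        while len(self.heap) > self.depth:
            out.append(heapq.heappop(self.heap))
        return out

jb = JitterBuffer(depth=2)
for seq, data in [(0, "a"), (2, "c"), (1, "b"), (3, "d")]:  # arrives out of order
    jb.push(seq, data)
print(jb.pop_ready())   # [(0, 'a'), (1, 'b')] -- in order; two still buffered
```

A deeper buffer absorbs more network misbehaviour but adds delay, which is why every one of these protocols exposes latency as a tunable parameter.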

“We should facilitate innovation rather than seek perfection,” he says. “Today’s best effort is pretty darn good.”

 

 


What the Heck Is (n)K Resolution?

Copywritten for AVID

Leaps in video resolution are unrelenting. Acquisition continues its inexorable march from 4K to 8K—and toward a time when pixel counts will be virtually unlimited. This trend is set to unleash new creative possibilities, spurred on by advances in technology and by consumers' desire for more immersive, photorealistic experiences. Avid calls this trend (n)K resolution, and it describes a future in which creatives are no longer constrained by the number or quality of pixels in a screen.

https://www.avid.com/resource-center/what-the-heck-is-nk-resolution

So, what is driving resolution independence, and what can you expect to see from it down the line? Let's break it down.

MOVING TOWARD RESOLUTION INDEPENDENCE

"Within Avid, we've been following a philosophy of resolution independence," says Shailendra Mathur, vice president of architecture at Avid Technology, in a Z by HP report titled Reshaping Creativity. "That's why we call it (n)K resolution. Any aspect ratio, any resolution. We've gone from SD to HD to UHD, and now we're at 8K. That trend is going to continue."

There are many reasons for acquiring video at the highest resolution, including banking a master copy for sale when the format's market (e.g., the install base of screens) catches up. From VFX to frame resizing, high-end content is routinely produced in post using high resolution and bit rates. Acquiring at the highest resolution produces a better-quality output—even if the end device plays back a lower resolution and bit rate.

The industry's adoption of UHD formats is following a similar trajectory to the transition from SD to HD, though at an accelerated pace thanks to digital-first platforms like YouTube—and the momentum is on track to continue through 8K to 16K, 32K, and beyond.

"The biggest driver is the demand by humans for even more immersive visual content," says Thomas Coughlin, digital storage analyst and author of the annual Digital Storage for Media and Entertainment Report. "Other drivers are computing, networking, and storage technologies that can support the creation and use of ever higher resolution content."

THE DESIRE FOR HIGHER FIDELITY CONTENT

Jeremy Krinitt, senior developer relations manager at NVIDIA, agrees. "There's a strong desire among people to experience content in higher fidelity. This has driven higher resolution requirements, but it's also driving technologies like HDR that can more accurately display colors," he says. "Ultimately, all of this is in the service of storytelling. Whether something is recorded on an old webcam or on the latest 8K camera, it needs to be able to serve the storytelling purposes of the person creating the content."

In Japan, 8K broadcasts have already made the air, a library of 8K resolution content is available on sites like YouTube, and the flagship screens/flat panels of major consumer electronics brands are now 8K. However, the creative demand for super resolutions is targeting emerging immersive applications.

"While flat image resolution may reach a limit, 360° content requires higher resolution, driving the resolution and image quality requirements even further," says Coughlin. "Volumetric computing capture and display technology will require the use of even more captured content."

Another factor impeding the breakout of consumer VR is the bottleneck in both resolution and the ability to deliver high fidelity to all parts of the viewing experience, including peripheral vision. VR requires wrapping the participant in a photorealistic experience with a minimum of 8K resolution content delivered to both eyes.

RESHAPING REALITIES WITH VOLUMETRIC VIDEO

"Viewing through two eyes is the natural thing to do, and stereoscopic VR takes us into the next level," says Mathur. "It's a wholly immersive experience." Mathur believes we won't be satisfied with entertainment "until we can offer an alternate reality that matches how our senses work."

This vision is in the early stages of being built by computing giants Apple, Microsoft, Google, and NVIDIA. Otherwise known as spatial computing, it conceptualizes a next-generation, 3D version of the internet that seamlessly blends the physical world with the digital in an extended reality.

"When you create things spatially, you can explore them as either a virtual reality or an augmented reality experience," says Nonny de la Peña, founder and CEO of Emblematic Group, in the Z by HP report. "I think that the idea of the separation between [AR and VR] technologies is going to go away."

As Reshaping Creativity observes, spatial computing offers gesture control—currently only practical in VR—in the 3D world, allowing users to interact with virtual interfaces and objects by reaching out and touching them. Resolution, along with image attributes like HDR and high frame rate, is central to this future.

8K PRODUCTION HAS ARRIVED

The ecosystem to produce 8K has arrived. Feature films like the 2020 Netflix release Mank are part of a growing number of productions being shot in 8K, in part for production and in part for archive. RED, ARRI, Sony, and Blackmagic all offer cameras built for high-resolution acquisition, up to 12K in Blackmagic's case, and the release of more cameras able to record at these higher resolutions is inevitable.

Higher resolutions are entering the mix beyond television and film, too. The first applications will be in digital out-of-home advertising and large entertainment venues, such as the MSG Sphere being built in Las Vegas.

Experiencing images at higher resolutions whets the appetite for pushing visual limits even further. Techniques that capture volumetric video of a 3D space may help create content for VR head-mounted displays, and eventually for free-standing holography, such as those being developed at Light Field Lab.

"I have heard talk that something like the holodeck from Star Trek could require more than 520K video," jokes Coughlin.

Yet NVIDIA CEO Jensen Huang has told IBC that the combination of cloud-native and photorealistic tools with path tracing and material simulation, powered by NVIDIA GPUs and AI algorithms, could bring that holodeck to life.

(n)K RESOLUTION IS ON ITS WAY

None of this will be easy. It is predicated on continued advances in compression technology with AI solutions, cloud storage, 5G edge computing processors, and networking bandwidth.

According to Krinitt, it's not just about processing: innovations will rest on higher efficiency and capabilities from networking technology. This has implications for CTOs and IT teams looking to future-proof their infrastructure.

"Since resolution and other important video requirements, such as bits per pixel, will drive ever higher storage capacities, anticipating this need and building for this level of scale will be an important element in future-proofing post-production and archiving architectures," says Coughlin.
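To put numbers on Coughlin's point (the arithmetic below is mine, not his): an hour of uncompressed 8K footage, before any camera codec is applied, already lands in double-digit terabytes.

```python
# Illustrative storage arithmetic for one hour of uncompressed 8K 60fps
# footage at 10-bit 4:2:2. Real camera codecs compress well below this,
# but the raw figure shows what archives must anticipate.

def terabytes_per_hour(width, height, fps, bit_depth=10, samples_per_pixel=2):
    bits_per_second = width * height * samples_per_pixel * bit_depth * fps
    return bits_per_second * 3600 / 8 / 1e12   # bits -> bytes -> TB

print(f"{terabytes_per_hour(7680, 4320, 60):.1f} TB/hour")  # ~17.9 TB
```

Each doubling of linear resolution quadruples this figure, which is why storage scaling is inseparable from the (n)K trend.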

It's quite a vision for the industry. Resolution independence opens new possibilities for creatives to tell stories at whatever combination of resolution, color gamut, dynamic range, frame rate, and even dimension they wish, automatically scalable up or down to the viewer's screen, environment, or pleasure. (n)K resolution is on its way—and it might be here sooner than you think.

 

Monday, 22 February 2021

Powering Extreme E’s Remote Live Production

copywritten for BASE Media Cloud 

A multi-cloud distribution platform from Base Media Cloud and Veritone helps off-road racing series Extreme E store, manage and share assets with multiple global partners.

https://www.broadcast-sport.com/2021/02/22/feature-powering-extreme-es-remote-live-production/

Imagine a Red Bull air race on the ground. There are certain gates that teams need to pass through, but how they get through them is down to the skill of the male and female drivers on terrain that varies from desert to deforested jungle to deserted glacier.

That’s the premise of the all-electric, rally-style Extreme E, the progressive FIA-backed SUV racing series which launches next month.

With 30 percent of the planet’s CO2 emissions coming from transport, Extreme E exists to showcase the performance of electric vehicles, and to accelerate their adoption.

As such it needs to marry urgent environmental messaging with as lean a production footprint as possible.  That’s particularly challenging for a live broadcast given that the locations are remote and infrastructure-free.

“We want to shine the spotlight on the climate crisis that we’re facing all over the world through the lens of an adrenaline filled action sport,” explains Dave Adey, head of broadcast and technology for Extreme E. “We’re employing remote production with minimal production staff on site and no spectators at the track, so for us content and fast turnaround is imperative.”

There are four constituent elements to the Extreme E production designed by production partners Aurora Media Worldwide and North One. All race camera sources including drones and onboards are uplinked from a lightweight TV compound on site. Car telemetry is managed by Barcelona-based Al Kamel Systems with AR and VR overlays from NEP in The Netherlands. Everything is backhauled to the gallery in London for production of live coverage across each race weekend plus highlights shows, a 30-min race preview and 300 VOD films for digital.

Given the scale of production, Extreme E needed a system that would allow them to manage content, including the ability to upload from anywhere into a centralised secure storage location. They also needed to be able to manipulate, search, view and download content; and to give this functionality to its authorised media partners.

“We need to find any of the content instantly so the user interface needs to be intuitive and the metadata schema rich but precise,” Adey says. “Once you find the clip you want to be able to view it with a proxy version online. We then may want to manipulate that content or create clips or transcode to different file formats. The system we chose had to do all of this and more.”
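The search-by-metadata requirement Adey describes can be sketched as a minimal asset index. The field names and values below are invented for illustration; a real DMH schema would be far richer.

```python
# Toy asset index: every clip carries structured metadata, and search is a
# filter across those fields. Names and values are hypothetical.

assets = [
    {"id": "x1", "event": "Desert X Prix", "type": "highlight", "team": "A"},
    {"id": "x2", "event": "Desert X Prix", "type": "onboard",   "team": "B"},
    {"id": "x3", "event": "Ocean X Prix",  "type": "highlight", "team": "A"},
]

def search(index, **filters):
    """Return assets matching every supplied metadata field."""
    return [a for a in index if all(a.get(k) == v for k, v in filters.items())]

results = search(assets, event="Desert X Prix", type="highlight")
print([a["id"] for a in results])   # ['x1']
```

The richer and more consistent the schema, the narrower a query can be, which is what makes “instant” retrieval workable across hundreds of VOD assets.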

Extreme E chose to use a sports multi-cloud Digital Media Hub (DMH) comprising a cloud-native storage and content distribution platform developed and managed by Base Media Cloud with Veritone’s AI-powered asset management system.

After transmission, all live programming and all the rest of the content, including VTs, highlights and digital, is uploaded to the DMH for rights holders to search, view and use.

“The DMH provides a dual purpose: to make content easily available to rights holders; and provide a rich suite of assets that rights holders can use to enhance their own content,” explains Adey.

 “A key benefit of a cloud-native solution is that the distribution of content is much more cost effective. I don’t have to put up a satellite feed to do a highlights program. Instead, we can create those programs in London, upload them into our content management system and make them immediately accessible via accelerated download for any of our rights owners and media partners around the world.

“It’s also really important that we have very high and very clear, environmental credentials which the multi-cloud sports media solution from Base Media Cloud and Veritone gives us.”

More than 70 broadcasters have bought rights to Extreme E including Discovery, Sky Sports, Fox Sports, BBC, ProSieben Maxx, Disney ESPN and TV Globo. The series launches in April in the deserts of Saudi Arabia and will continue in Senegal, Greenland, Brazil and Tierra del Fuego. 

 

Thursday, 18 February 2021

Remote Collaboration is a Fact of Life as Post Production Finds a Home in the Cloud

copywritten for Sohonet 

Chuck Parker, CEO, Sohonet, on Storytellers gaining confidence in remote workflows — and how the technology will ultimately get better and cheaper over time, resulting in a “new normal”  – just as effective but with better work-life balance.

https://www.sohonet.com/our-resources/blogs/remote-collaboration-is-a-fact-of-life-as-post-production-finds-a-home-in-the-cloud/

The collective scramble that our industry colleagues from Avid editors to VFX artists were forced to undertake in the early stages of the pandemic has given way to a universal acceptance and relatively standardized mode of remote working. With health and safety protocols likely to remain in effect for many months to come (and warnings that we’ll be social distancing and wearing masks well into 2022), it’s clear that we’ll be in a hybrid work scenario for some extended period of time. The practice of putting the customer into the darkened room while the artist drives the session remotely to preserve pandemic protocols is likely to work into our industry’s “muscle memory” in 2021.

By the end of 2021, the industry will have experienced 21 months of remote collaboration. What began as a necessity will most likely remain in place even as teams are allowed to travel or return to the office. We are forecasting the bulk of 2021 to be remote, with artists and creative execs traveling to special darkened rooms sporadically and often alone except for those joining remotely.  

Far from diminishing, this trend will continue as people realize that such tools solve practical problems for the content creation process and improve everyone’s quality of life. Remote collaboration is beneficial to artists. Talented creatives no longer have to live in expensive cities like New York, LA or London to access work. Any location which meets your family’s needs and work-life balance is on the table. Remote collaboration enables the work to move to you, and, while we yearn for togetherness, raises the question of whether we will ever return to a single creative suite with the number of physical participants and frequency we once did.

Post production moves to the cloud

The VFX end of our industry began the move to cloud at scale in 2015, driving improved rendering costs and time efficiencies and introducing new workflows.  In 2020, the pandemic kicked remote use cases for creative tools into overdrive, resulting in more post production processes moving to the cloud, boosting remote distributed collaboration (i.e. lots of team members in lots of different places).

However, while there are many, many more artists using remote tools hosted with public cloud providers, there are still at least two major hurdles for our industry to solve before the physical trappings of our existence ebb away.  First, the simple economic hurdle has to be solved.  Thousands of industry participants have already invested in creative workstations and other tech gear which are already deployed in machine rooms and data centers all over the world.  That “sunk capital” problem will likely take 18-30 months to work itself out, arriving at a future where the majority of new purchases are in a SaaS model vs. the currently most common capital expenditure model (capex).

Second, we need to solve the technical challenges of “critical review output”. That is, we need to provide the same video and audio fidelity and “over the shoulder” responsiveness that our industry demands of our in-suite experiences from the cloud-delivered equivalent solutions. This is not a simple problem because video and audio fidelity requirements demand large streaming payloads which in turn create more challenges for latency and contention on the network — after all, there is a public internet in between the artist and their cloud-hosted tool set, duplicated to every viewer of their live stream. Pushing the output of those tools to dispersed artists and creative execs around the world at scale so that it works right every time won’t be easy.

So while the timing may be harder to predict, our industry’s direction of travel is certainly accelerating towards a future where Storytellers will continue to gain confidence in the remote workflows they are forced to utilize today, which will get better and cheaper over time, resulting in our “new normal” which is likely to be every bit as effective as the old way of physically being together, but with the work-life balance benefits of being where life needs you to be at that point in time.

 


The complexities of moving to the cloud

IBC

Broadcasters and content owners are moving more of their operations into public cloud but this global trend masks a range of complexities. 

https://www.ibc.org/trends/the-complexities-of-moving-to-the-cloud/7287.article

Media organisations are moving at different paces and there is no one-size-fits-all technical approach. While some have already moved lock, stock and barrel to the cloud, others face investment, training and technology dilemmas about how best to proceed.

“Cloud is a one-way street and something broadcast CTOs have to embrace,” says Baskar Subramanian, Co-founder, Amagi. “It’s a question of what to move and how fast.”

The poster child for this is Discovery which began its wholesale move to AWS Cloud in 2016.

“Previously we’d have to buy a load of servers and install them, we’d need a file transfer system, and we wouldn’t know what the return on investment would be over time,” explains Simon Farnsworth, CTO of Broadcast Technology & Operations, at an SDVI-hosted webinar. “Now, we can very accurately cost things like major new projects. It has become a lot less emotional and more binary since we can accurately predict cost.”

He estimates Discovery’s cloud-based supply chain has already saved the company $100m. Cloud is also claimed to have shaved $1bn in synergies from Discovery’s 2018 acquisition of Scripps Networks.

“Historically, [when Discovery entered] new territories we had siloed ops teams with siloed tech stacks and siloed workflows but we’re able to standardise that now,” Farnsworth says. “All the content for Discovery+ is in the cloud and it’s just a question of feeding it through to our own operated platforms or to affiliates. We need to be fast. What [cloud] has allowed us to do is generate the same amount of content while investing a truck load in new product.”

Comcast Technology Solutions, perhaps the largest service provider in the world, is about to make a major acceleration in moving its own supply chain (though not yet including Sky) to the cloud.

“We want to move into the cloud for flexibility and speed,” explains Bart Spriester, VP and GM of Content and Streaming. “To provide services to spin up and down and we need it to be usage based. We need to remove the integration lead time of on-prem solutions and remove capital approval cycles and slow software deployment.”

In 2020 the company syndicated 66 million minutes of content to partners like Cox and Rogers and more than 170 affiliates. “With this volume we need to take a lot of friction out of the system,” Spriester says. “We think there will be a huge benefit to moving this out to public cloud infrastructure.”

The benefits of moving the supply chain to the cloud are clear. They include the ability to scale up and down rapidly and to pay for resources only when required. Enterprise-scale operations can be run with greater efficiency and accuracy than before.

“Changing from a custom on-premises environment where different processes are done on different vendors’ kit to using common tools in an open source environment gives a much more consistent view of the operational state of the platform,” explains Tony Jones, principal technologist, MediaKind. “The way you build and configure systems is declarative, meaning that you tell system components what state you want them to be in and the system works out how to get there.”
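The declarative pattern Jones describes can be sketched as a reconciliation loop: you state the desired configuration and a controller converges the running system toward it. The component names below are invented for illustration.

```python
# Kubernetes-style reconciliation sketch: compare declared state against
# observed state and emit the actions needed to close the gap.
# Component names are hypothetical, not from any real broadcast stack.

desired = {"encoder": "running", "packager": "running", "playout": "running"}
actual  = {"encoder": "running", "packager": "stopped"}

def reconcile(desired, actual):
    """Return the actions needed to move `actual` to the declared state."""
    actions = []
    for component, state in desired.items():
        if actual.get(component) != state:
            actions.append((component, "set to " + state))
    return actions

print(reconcile(desired, actual))
# [('packager', 'set to running'), ('playout', 'set to running')]
```

Because the loop is deterministic, re-running it against the same declared state always produces the same result, which is what makes cloud operating costs predictable in the way the article goes on to describe.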

It is the deterministic behaviour of systems in a cloud environment that means broadcasters can predict with far greater certainty exactly what operating a service should cost.

Add to that a microservices approach to development and deployment and broadcasters can upgrade equipment and introduce new features far faster, more economically and more flexibly than before.

“If broadcasters want to have a healthy future in competition with SVOD vendors they have got to think along those lines,” urges Jones.

One destination, many paths
These arguments may be well known but getting there is not straightforward for the majority of broadcasters. In the negative column are cost, complexity and cultural inertia.

“People are on different pathways,” says Peter Sykes, Strategic Technology Development Manager at Sony Europe. “At one end you have more traditional organisations making the move from SDI to IP as a first step, while others are now moving to combine IP with cloud. Media companies know they have to reach new audiences but can’t increase resources and in some cases are having to reduce capital outlays.”

This financial squeeze is one reason for a phased migration to cloud. Many broadcasters dip a toe in first by moving disaster recovery operations. This has accelerated since Covid-19 underlined the necessity of business continuity.

Another step might be to take less critical workflows like media processing and VOD to cloud. For others it makes sense to move complete sub-systems into a public cloud environment rather than a component-based approach. These systems are typically operated as-a-service by external providers.

“They could move a complete broadcast chain encompassing playout, compression and multiplexing or ABR packaging as one functional unit,” Jones says. “There’s not really any value to the broadcaster in building that themselves, but if they choose to take it prepackaged it’s an operationally easier environment and there’s just one [vendor] to talk to if there’s a problem.”

The pace also differs depending on delivery technology. “The traditional DTH anchor of broadcast delivery is moving slower than OTT DTC service launches, which are more likely to be cloud deployments,” says Richard Mansfield, MediaKind’s Streaming Director. “Broadcasters not ready to migrate their entire infrastructure are making this their first step.”

Arguably, the biggest issues hindering broadcasters’ moves to the cloud surround skills and mindset.

“The primary issue is cultural mindset more so than technology,” says Subramanian. “It’s a question of being comfortable with a particular way of doing things and a reluctance to do things differently.”

Broadcasters used to plugging in individual components using SDI, or its IP equivalent SMPTE 2110, face difficulties working out how to apply that approach to the cloud.

“Imagine you picked a handful of vendors, one of whom deploys into AWS virtual machines, one deploys into a Kubernetes environment and another one into a Kubernetes service in a cloud provider,” posits Jones. “How you integrate that as a complete system is a nightmare and probably beyond most broadcast engineers.

“Not only do different vendor applications need to interface together but you also need to consider whether the deployment environments they work in are compatible with each other,” he adds. “A lot of legacy software apps that were built to run on premises have been adapted for the cloud but are not cloud native. There are no standards for this deployment. It’s a wild west.

“We have seen some big network operators that have been able to grasp that change – but it does take quite a big investment.”

IT training needed
Related to this is the need for a whole new set of IT skills required of broadcast engineers.

“Broadcasters launching OTT services in the cloud are often doing so using an IT team,” says Mansfield. “In the long run, this separation is insane. They are essentially doing the same thing as the broadcast team but delivering to a different output medium. To be successful those teams need to be merged together as one operation.”

For Subramanian the answer lies in better education about the total cost of ownership of cloud workloads. “The finance team, the operations and tech departments are all used to a capex model which, when suddenly switched to an opex-driven model, catches them off guard. In some senses the cloud complicates life because there are so many different pieces of the puzzle.”

In one simple illustration, buying a server on-premises versus running a server in the cloud cannot be compared like for like. “With the cloud model you need to consider the networking gear, the data centre, the air con, power and performance,” he says.
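As a back-of-envelope illustration of that like-for-like trap, the sketch below amortises a server’s purchase price and adds the overhead categories Subramanian alludes to, then sets that against a purely usage-based bill. Every figure and name here is hypothetical, purely for illustration:

```python
# Back-of-envelope TCO sketch: an on-prem server's sticker price omits
# networking, data-centre space, cooling and power, so it cannot be
# compared like-for-like with a usage-based cloud bill.
# All figures below are hypothetical.

def on_prem_annual_cost(server_capex, lifetime_years, overheads):
    """Amortised hardware cost plus the hidden annual overheads."""
    return server_capex / lifetime_years + sum(overheads.values())

def cloud_annual_cost(hourly_rate, hours_used):
    """Usage-based opex: pay only for the hours actually consumed."""
    return hourly_rate * hours_used

on_prem = on_prem_annual_cost(
    server_capex=12_000,
    lifetime_years=4,
    overheads={"networking": 800, "rack_space": 1_200,
               "power_and_cooling": 1_500, "maintenance": 1_000},
)
cloud = cloud_annual_cost(hourly_rate=1.10, hours_used=5_000)

print(f"on-prem ≈ ${on_prem:,.0f}/yr vs cloud ≈ ${cloud:,.0f}/yr")
```

Which column wins depends entirely on utilisation: scale `hours_used` up toward 24/7 operation and the usage-based model loses its edge, which is exactly why the finance, operations and tech teams need a shared model rather than a sticker-price comparison.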

Aside from Kubernetes, which is an orchestration layer adopted by all major cloud platforms, Subramanian agrees that the internet is fracturing away from the broadcast safety net of unified standards. He doesn’t think this is a problem. “There will be a plurality of standards that we all need to support including NDI, SRT, RIST and Zixi but this multiplicity breeds innovation.

“Fundamentally what is missing from the whole ecosystem is better education to create business models. We have seen customers cross that bridge once they understand the significant benefits.”

Somehow, Discovery seems to have done this. “We managed to flip the [internal] conversation from finance looking at the bottom line to looking at metrics,” explains Farnsworth. “How much volume is flowing through? What is the reliability like? What is the cost so we can start delivering KPIs? It’s a much more straightforward conversation and means we can concentrate on creating a better consumer experience rather than how we make it work technically.”

Thursday, 11 February 2021

Stitch Editing Finds ‘Bullet Proof’ Supervised Finishing Solution

Copy written for Sohonet

Stitch Editing and Bacon VFX on their hunt for a high-quality remote review tool that didn’t compromise the client’s viewing experience  —  and how they landed on ClearView Flex.

https://www.sohonet.com/our-resources/blogs/stitch-editing-finds-bullet-proof-supervised-finishing-solution/

Boutique post-production shops pride themselves on offering a unique creative environment in which clients can engage with craft artists. That is at risk if social distancing forces a halt to normal service and places a potentially damaging strain on relationships. 

Stitch Editing faced such a dilemma last March and it took a little while to overcome. The Santa Monica-based boutique, which has a sister facility in Soho, London, works on a range of projects, from commercials, music videos and online content to full-length features, often in concert with in-house finishing arm Bacon Visual Effects.

“When Covid-19 hit town and we told everyone to work remotely, our initial feeling was that we could power through this since it would only be a few weeks or a couple months,” says Mila Davis, executive producer, Stitch Editing. “After about three weeks, it became clear we needed a longer-term solution.” 

Stitch had started out by using video conference apps to communicate and then streaming platforms to view materials as well as experimenting with dedicated remote video systems. 

“None of them were sufficient even in a best-case scenario to meet our needs,” Davis says. “Some clients and projects require that we work with outside vendors, like MPC, The Mill or Company 3, and when we began interacting with them using their remote video solutions, we were able to experience from the client’s perspective what worked best.”

Mitch Gardiner, VFX supervisor and senior Flame artist at Bacon Visual Effects, admits to being highly sceptical of finding a solution that would provide the same collaborative experience his clients were accustomed to.

“When lockdown happened, I was genuinely concerned about not having face-to-face interaction with clients. The only thing worse than not having that would be attempting to provide supervised sessions in an incomplete or compromised way that could result in frustration for our clients.” 

He elaborates, “By offering a finishing division within an editorial boutique we’ve always insisted that the client’s experience be the kind of dedicated personal and creative process they expect from a boutique while providing the same efficiency and technical capacity as the largest facilities. 

“If we were going to be supervised remotely, we had to find a solution that would be as seamless and professional as what the largest studios were using. The experience had to be no different from the client’s perspective. Therefore, the idea of any sort of ‘consumer grade’ solution where content was potentially visible to other people or had quality or latency issues was not acceptable.

“I wanted something designed for broadcast-quality review with a guaranteed security infrastructure. Even then, one of the professional solutions we tried dropped the connection in the middle of the session when one of their servers went down. That was also a non-starter.” 

So concerned was Gardiner to insulate clients from a compromised experience that he was at first resistant to live-streaming sessions. 

“I really wanted to protect the client’s experience when I was at the wheel. I didn’t want any latency or quality issues or connectivity frustrations to reflect poorly on their experience in the session. 

“At the beginning of the lockdown, I’d work unsupervised and post for the client, they would then review and send to their client. I’d get feedback and make any changes and we’d repeat the process again and again. It got to the point where one of my clients, who I’ve worked with in a normal supervised workflow for 10 years said they needed a more efficient system, and they didn’t have time for a lengthy posting review process. That’s when our search for a better way forward really began in earnest.” 

Gardiner was looking for two principal things in a remote review service: high picture quality and ultra-low latency.  

“We’re frequently getting down to the level of individual pixels in the review process,” he explains. “The client needs to know if what they are evaluating is in the footage or is a result of compression on their end. To have a system where you can guarantee a certain level of quality and increase that for a more granular level of review is very important. 

“The second requirement is low latency. Other systems we tried had anywhere from a 2-8 second delay in picture between the Flame and the client’s monitor.  We are constantly moving between frames and the client is giving feedback about what they see on their end – for there to be a delay quickly becomes maddening. It’s like trying to play soccer with key players reacting several seconds behind the ball. It’s just not workable. 

“ClearView Flex was the only service able to achieve the same result with our supervised remote sessions as we had in our studio. The fact that there is no delay in the signal is critical to me. It’s been working fantastically well.” 

Since introducing Sohonet’s solution, Stitch and Bacon have worked on projects for Honda, Wells Fargo, Electronic Arts, LG and others. They also cut the seven-minute film ‘No Strings Attached’ for Moschino’s spring/summer 2021 collection, produced by Alex Winter and featuring puppets from Jim Henson’s Creature Shop.

ClearView Flex will remain a permanent fixture not just to work around Covid-19 health protocols but for simple convenience and efficiency.  

“There are certainly times where there is no substitute for the one-on-one client relationship with an artist in the room, but there will always be some sessions that can’t be or don’t need to be supervised in person,” Gardiner reflects. “I think our clients have found remote workflows and supervision to be surprisingly sustainable. Going forward, I would expect a permanent split between the traditional in-person sessions and a supervised remote workflow for most projects.” 

It has also been beneficial for the Flame artist, not least in terms of giving him back hours in the day to be more productive. 

“After spending 18 years more or less daily in our suites, I haven’t set foot in Santa Monica in the last 10 months and it’s been eye-opening to me. After collaborating with the creative teams in this new workflow, while they take time to present to their client, the break can allow me to regroup on another project or even spend a few minutes with my family. For a job that can mean 16- or 18-hour days, that has been incredibly transformative for the way it feels to be busy.

“Prior to this I’d have two to three hours a day commuting to the office. When there, if you had downtime with clients in the room you couldn’t easily switch gears to another project. Even on a busy day there could be downtime that I couldn’t make use of as much as I’d like – perhaps to work on a passion project or take care of facility matters. ClearView has taken that downtime out of the equation.” 

Like many Flame artists, Gardiner has worked hard his entire career to build a reputation and a trust with clients that risked being undermined by sessions frustrated by inadequate technology. 

“Supervised finishing has always been very integral to what I do and how I do it,” he says. “A good Flame artist develops a strong relationship with clients so producers, creatives and directors choose to work with you based on earned trust and respect. You are not a commodity that can simply be replaced by another individual. So, for me, to embrace any sort of modification to how that relationship develops is significant.

“It took finding ClearView Flex for me to be comfortable that we had a broadcast level solution for supervised effects, finishing and color that we could present to clients as a bullet proof way of working.” 

 

ProAV: A Resurgent India

AV Magazine

Growing smart cities and corporate hubs are driving proAV in India

P25-27 https://edition.pagesuite-professional.co.uk/html5/reader/production/default.aspx?pnum=25&edid=90122f3e-f671-411b-88dd-541542391e74&isshared=true%E2%80%A6

India is among the world’s fastest growing economies and is, in fact, the third largest proAV market, but a late-summer wave of Covid-19 forced a downward revision of the country’s economic outlook. The IMF now projects a 10.3% GDP decline for 2020, significantly worse than the 4.5% decline predicted in June, which itself was down from the April projection of a modest 1.9% increase.

The good news is that India’s long-term economic fundamentals are strong. AVIXA anticipates a strong (8.8%) rebound in 2021 and full recovery in 2022.  Long-term growth is expected to stay nearly as dynamic, settling in at 7.4% per year from 2023-2025.

“The brightest aspect of proAV in India is its long-term growth, which is expected to slightly exceed GDP and increase at 8% a year from 2023-2025,” says Jonathan Seller, senior director development, APAC, AVIXA.

Several AV firms predict that business will return to pre-Covid levels by the end of 2021. “We’re investing a lot of time and effort in joint marketing and promotions to support our local partners,” says MediaStar Systems regional sales director Mark Stanborough. “Unlike other markets, though, where we’ve seen a real push for expanding the use of digital signage, we’re not seeing that in India. It was already quite heavily used and judging from the orders we’ve started to receive in the last couple of months, demand remains at the same level.”

Hospitality is expected to pick up towards the second half of 2021, and tourism is expected to witness an immediate jump as people across India are keen to travel, reports Karan Kathuria, director (Asia, Oceania and SAARC), Renkus-Heinz. “Likewise, the house of worship market is asking for advanced audio and video streaming solutions due to the impact the pandemic has had on Indians who have a very strong religious connection.”

While the market has started to recover, challenges remain given the current uncertainty. “Spending has moderated in the corporate vertical for AV systems,” reports Sanket Sawant, sales director, SAARC and ANZ, Atlona. “As workers return to offices, businesses have put thought into how their AV infrastructure will need to change to keep people safe and healthy. We expect this will have a direct impact on increased corporate AV spend through 2021.”

Girish Narayanan of Resurgent, a PSNI Certified Solution Provider founded in Bangalore with offices nationwide, comments, “The pandemic had its effects, however, India’s economy continues to grow. It is just a matter of time before India’s private and public sector bounce back with its full power. Covid has brought a major chunk of the population closer to new technologies. AV has increasingly become a part of our daily lives now.”

He explains that the Indian government has become an early adopter of digital technology, which will open opportunities for companies like Resurgent. “As the government’s focus on e-governance gathers pace, rapid distribution of key messages to the public will be a basic requirement for all institutions, creating numerous opportunities for the AV industry,” he says. “We can provide high-end AV design and build integration services by bringing global standards to the Indian market.”

One product of India’s strong pre-2020 growth is a burgeoning middle class—and all the AV demands that brings alongside.  Hospitality, retail, and venues and events verticals will be buoyed in the long term by this increase in middle-income consumers.

Increasingly networked

Connectivity is another AV driver here, suggests Seller; “As more of the country gains high-speed internet access, expect spending increases in solution areas like content distribution, media and conferencing and collaboration.”

“India is increasingly networked to itself and the rest of the world,” agrees Samuel A. Recine, VP of Sales, AV/IT Group, Matrox. “The beauty of the constantly improving breadth of solutions to tackle communications and presentation challenges is that it is becoming easier to reach all sectors of the economy.”

He adds, “India has an above-average software development capability. As AV/IT environments increasingly blur, these skills are particularly strategic. Customers and integrators in India are also concerned with proprietary implementations by manufacturers and are pushing for progressive thinking about products and platforms being more useful and interoperable with each other through open standards.”

While many leading AV companies have established a presence in India, it is a little early to state that India is an ‘organised’ proAV market.  “The uniqueness of India is its diversity and scalability,” says Kathuria. “With 65% of the population based in rural areas – which is only now experiencing a boom in data and digitalisation – it is still too early to say the market is ‘mature’ in regard to AV.”

The government’s flagship projects, ‘Make In India’, launched in 2014, and ‘Digital India’, in 2015, are beginning to deliver on the country’s ambition of becoming a global tech superpower.

Kathuria says, “With internet access now reaching every corner of India, data has become a commodity. This is an initiative the Indian government has driven toward very seriously as it dovetails with its vision of smart cities.”

Currently there are 100 smart city projects, with 12 fast-tracked for development. “Keep an eye on smart cities,” says Sawant. “We’re also seeing projects in Tier 2 and Tier 3 cities which call for an upgraded AV infrastructure. This also opens up new projects for tourism, such as the Statue of Unity now taking shape in the state of Gujarat (a statue of Indian statesman Vallabhbhai Patel and, at 182 metres, the world’s tallest). All of this is important for the government, which is focused on bringing more tourists and visitors to India.”

A “significantly large deployment” of AV technologies is an integral part of these smart cities, agrees Kathuria; “All global players should be looking at this endeavour. India is further being perceived as one of the most viable service industry investment hubs, with many of the world’s largest companies now having some presence in the country related to manufacturing, assembling, software or digitalisation.”

The government plans to invest heavily in infrastructure, mainly in highways, renewable energy, and urban transport over the next two years. Its smart cities will be supplemented with Transit Oriented Development, public transport and last mile connectivity. Last October, the Airports Authority of India (AAI) announced plans to upgrade runways at seven domestic and international airports across the country by March 2022.

“Because public address systems form the backbone for all major transport hubs, Renkus-Heinz believe that providing pro audio solutions that offer agility, redundancy and scalability to the transport sector will be crucial in the year ahead,” says Kathuria.

Education home and away

AVIXA records fresh investment by central and state governments in digital classroom technologies. “The long-term goal is to find consistency and efficiency in the development and delivery of the school curriculum,” says Seller. “During lockdown, government K-6 students were taught via terrestrial television and radio.  The current academic year has been completely remote, with private schools K-12 and colleges and universities using online technologies for tasks and assessment. Pupils from kindergarten to university are now continuing distance education using online platforms.”

The trade body itself has been conducting a series of virtual training sessions for certification, while Infocomm India 2020 went virtual as well. The show saw a “large number” of participants engaged across three days of conferences, live product demos and “AI driven business matchmaking.”

Atlona has also won business enabling remote education. Its PTZ cameras, extenders and switchers have increasingly been specified to support hybrid learning environments and online lectures. The firm has won a significant project with a technology institute that is on the verge of installation, which it will announce in the new year.

There may be considerable demand for remote comms but connectivity is a limiting factor outside the major metros and business parks. Metro areas are no longer necessarily the go-to locations for populations that used to migrate from rural or tier 2 and tier 3 cities in search of employment. Atlona says it has won business integrating with software and cloud-based conferencing services like Zoom and Teams, both in corporate and education environments.

“Most IT companies are adopting a work-from-home model but the work-from-home culture is new to India beyond a very small percentage of the population,” Sawant says.

Similarly, Renkus-Heinz recognises the “huge upsurge” in online education, but says parents are realizing the value that in-person coursework adds towards the complete development of students. “The trend of students and teachers embracing new technology means we are also hearing interest in new technology within classrooms as well,” says Kathuria. “This makes the education sector a hot vertical going forward.”

 

Growing centres of capital

The pandemic has negatively impacted micro, small and medium enterprises, while larger organisations seem to have weathered the storm. Evidence from MediaStar suggests activity is driven by corporate multinationals and global banks opening new offices in the larger cities. Mumbai, Chennai, Hyderabad and Delhi are hot spots, while Bengaluru has become a tech hub where many distributors, AV partners and consultants are based. This corporate activity provides good opportunities for office and boardroom fit-outs.

“On a global scale London and New York have always led the way when it came to demanding new AV tech, but the financial hubs in Mumbai and Pune give both of those cities a run for their money,” says Stanborough. “There is also an interest in British-designed solutions, which are regarded highly.”

Corporate projects are growing more complex. “We’re seeing more large enterprise deployments involving UC platforms,” says Sawant. “As a result, we’re seeing greater demand and reliance on training, certification, and best practices for AV standards.”

The influx of large offices has led to a higher standard of specifications and requirements for AV installations. Andy Lee, who covers the region for Datapath, says distributors and integrators have become more professional as a result.  “As the AV industry recovers from Covid we expect to see demand for higher standards in the region increase further.

 “The opportunities are vast and the rewards could be great but you have to take into account local cultural differences, the distance from your HQ, time difference and competition from Indian-made as well as cheap Asian imported products,” he advises. “Be aware that there are not one or two major cities to potentially visit and cover. There are potentially five or six and cross-country takes time.”

Covid-19 has meant less location shooting for the country’s industrious film and TV sector with demand “moving heavily” towards virtual production and virtual studios, according to Tom Rockhill, disguise CSO. “Demand for disguise’s xR technology has spiked massively in 2020, and we’re working closely with our local partners and Epic Games to provide the best possible backbone for LED stage environments.”

Peerless-AV’s business has grown more than 50% year on year in India by targeting “high quality, premium” products at a niche category of high-end customers, mainly in the corporate space. It has just appointed Pro Radio Networks as its new local distributor, which is already helping Peerless win projects with multinationals moving overseas branches to major cities in India.

“The majority of tenders received are from multinationals in the region, and the technical expertise falls to integrators and consultants who are specifying solutions direct to end customers,” says Justin Joy, Peerless-AV’s senior sales manager. “There’s a lot more business in India to be had but it’s harder to penetrate this customer base owing to their price conscious culture and use of local fabricators resulting in a lower quality product. The corporate sector is highly advanced compared to retail and government, currently.”

India clearly has considerable potential, but also challenges associated with that potential. “We are looking at 2021 as a year of immense opportunity,” says Kathuria. “We see India as a very unique and strategic market, and we’re committed to delivering much needed engineering expertise and services to the Indian AV Industry, on par with global standards.”