Friday 30 December 2022

Sports streaming battle is joined

InBroadcast

 

This was the year in which sports moved to the centre of major streaming content providers' strategies. We asked executives at leading vendors to identify the key trends in live sports over the last 12 months, and to look ahead to 2023 and share the business issues and innovations to watch in this fast-moving space.

article here

 

 

Olly Parker, Chief Commercial Officer, Blackbird

In the past year we’ve seen a steady increase in the use of data to partially automate or speed up video creation workflows. This helps augment workflows around the higher-quality content that humans are still needed to create, as opposed to fully automated, rules-based systems. Better use of data can also permeate beyond the creation process and is helping organisations to address multiple delivery end-points in a more streamlined and consistent way. It’s commonplace to see multiple teams creating video for multiple platforms, especially around live events. What businesses have been driving, through better metadata support, are workflows with as much shared resource - technical and human - as possible.

2023 is going to see a renewed focus on technical and infrastructural efficiencies, particularly in relation to use of the cloud and how to manage a multi-vendor space. We are all well past doubting the benefits that can come from both public and private cloud infrastructure and services, but there are many situations where an ‘all-in’ approach to any platform isn’t the most efficient, cost-effective or sustainable. Hybrid deployments that can take advantage of the best available technology and adding new cloud workflows on top of existing on-premise tech stacks can bring major benefits to media companies.

 

Ian Godfrey, President of TSL Inc. and Head of Control Systems

“We continue to see international monetisation of live sports content as a significant business driver for our customers in this segment. Like all businesses, live sports content providers are looking to grow their revenue. One method to do so, is to grow viewership outside of the traditional regions for any given sporting event.

 

Recent examples include the NFL expanding their international footprint by playing games in three countries outside the US, including Germany for the first time, and the NBA continuing their Global Games after a pause during the pandemic. With this large geographical expansion, we have seen increased complexity in the live production workflow. Not just because the event is occurring outside the traditional region, but because content providers want to make the event relevant to broadcast viewers in that emerging market, whilst still maintaining broadcast viewership in the traditional region.

 

Into 2023, expect centralised processing with distributed operations, whether that means live media processing centralised in the cloud, in a VPC, or in a designated media processing centre. We expect to see more live sports productions happening with remote operations and processing centralised away from the venue. Of course, there are events that justify the costs of producing live sports from the venue, but there are so many live sports events to produce and broadcast (all of which deserve the highest quality production money can buy) that it makes sense to centralise the media processing wherever possible, enabling operators to work more efficiently. We know this is possible today for some live broadcasts, but how far can we push the scale and complexity?”

 

 

Kerry Freeman, Head of Sales for IMES - UK & Ireland, Iron Mountain Entertainment Services

“Streaming video services have been at the forefront of everyone’s mind this past year. This focus has largely been a result of the Covid-19 pandemic forcing many to stay at home. Whilst this led to a boom in streaming services, there was little new content being produced, causing providers to look to their archives for content to attract and retain viewers.

One of the biggest gaps in the market was the live sport sector, and so many broadcasters opted to utilise old tournaments, races and matches, showcasing some of the industry’s greatest sporting highlights. These included ITV in the UK replaying the Euro 96 football championship and RTBF in Belgium offering cycling races from the 1990s.

We predict that archiving and restoration will continue to be of massive importance for 2023 as we unlock the vast potential of the content currently residing in store rooms around the world. 

This is at the forefront for broadcasters such as Canal+ Group, which is looking to its tape archives to add value for viewers, with IMES in the process of digitising up to 110,000 hours of sports content from tape into digital assets for streaming services.”

 

Paddy Taylor, Head of Broadcast, Mark Roberts Motion Control

“The key requirement in sports is sustainability and in particular the desire to be carbon neutral. This trend started about three years ago and has been gaining momentum. As we recovered from the effects of the pandemic, we have seen broadcasters looking to protect themselves from the impacts of a new variant or a similar situation whilst balancing the desire to reduce the environmental impact of their productions.

We believe these two things can be interdependent: the use of remote acquisition tools with a smart automation system means that not all camera operators need to get on a plane or drive to a venue. They can control one or more cameras from any suitably fast IP-connected location, and potentially cover multiple games in a day. This will bring significant advantages both in terms of cost savings (no more expensive taxi bills or hotels at peak times) and in reducing the carbon footprint of the production while, crucially, opening the door to more creative innovation.

We regularly talk to producers who want more angles, more motion, and more cameras, but the logistics and cost are usually prohibitive. By using automation tools such as Polymotion Player One, a single operator can manage multiple cameras from a simple pan bar, so as the operator follows the action, all the cameras are looking at the same point with different framing, angles and so on. As a really simple example, the same operator could be running two cameras side by side, one in 16:9 and the other in 9:16 for OTT, but the possibilities are immense.”

 

Thomas Lind, Director of Product Management, Appear TV

 

“The trend towards remote production of live contribution triggered by COVID has continued in 2022. For primary events it is still typically a hybrid combination of on-site and remote production, while secondary events are moving towards fully remote production. Mostly, remote production is done from the site into a central location. Remote production into a public cloud such as AWS or Google Cloud is being evaluated and tested. It is still not clear what the balance between private cloud and public cloud production will be; this will be driven by whichever provides the better business case. For live 24/7/365 operations, we now see a trend of some operators moving away from public cloud back to on-prem private cloud to reduce operational cost.

From a technology point of view, we have seen a large uptake of both JPEG XS and NMOS in remote production. We see JPEG XS becoming the standard for contribution when dedicated fibre connections with guaranteed bandwidth are available. When bandwidth capacity is limited, or for contribution into public cloud, ultra-low-latency AVC/HEVC is used. Since contribution is all about fitting the video into the available bandwidth rather than pushing codec technology to its limit, we see little interest in newer codecs in the contribution space. NMOS is becoming the standard for orchestration of contribution networks.
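A rough sketch of that bandwidth-fitting exercise is below. The figures used (a 10:1 compression ratio and a 1 Gb/s contribution circuit) are illustrative assumptions, not vendor or standards specifications:

```python
# Back-of-envelope check that JPEG XS-compressed feeds fit a contribution
# link. All figures here are illustrative assumptions, not specifications.

def uncompressed_bps(width, height, fps, bit_depth=10, chroma="4:2:2"):
    """Active-picture bitrate in bits/s (ignores blanking and ancillary data)."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bit_depth * samples_per_pixel

def jpeg_xs_bps(width, height, fps, ratio=10):
    """Compressed bitrate at an assumed, visually lossless compression ratio."""
    return uncompressed_bps(width, height, fps) / ratio

link_bps = 1e9  # an assumed 1 Gb/s dedicated contribution circuit
feed_bps = jpeg_xs_bps(1920, 1080, 50, ratio=10)

print(f"1080p50 10-bit 4:2:2 uncompressed: {uncompressed_bps(1920, 1080, 50) / 1e9:.2f} Gb/s")
print(f"Same feed at 10:1 JPEG XS:         {feed_bps / 1e6:.0f} Mb/s")
print(f"Feeds fitting the 1 Gb/s link:     {int(link_bps // feed_bps)}")
```

Under these assumptions, a single 1 Gb/s circuit carries four 1080p50 feeds that would each need roughly a 3G-SDI link uncompressed, which is the economics driving JPEG XS in contribution.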

Into 2023, work continues on evolving the NMOS standard, especially the work being done to add transport stream support, which will help standardise the orchestration of contribution networks. Going forward, it will be interesting to follow how much remote production remains in a private cloud or facility and how much moves into public cloud. This will largely come down to individual operational requirements and the corresponding business case.

So far, we’ve seen JPEG XS being more frequently used in US contribution than in the rest of the world. We believe that the deployment in remote production using JPEG XS will start to escalate in Europe in 2023.”

 

Igor Vitiorets, CTO, SLOMO.TV

“For much of 2022, many countries still had Covid-19 restrictions in place. That supported the growth of the remote production market. The expansion of DWDM/lambda (λ) data transmission networks, their increased accessibility both financially and physically, and the use of JPEG XS hardware codecs have given broadcasters more opportunities to expand their remote production. The ability to receive superior-quality video with minimal delay in a remote video control room, together with savings on travel and reduced staffing requirements, has resulted in an increase in remote production.

Currently, two global factors are having a very big impact on our industry. First, the global electronic component shortage, which affects the production of equipment, is worsening. Some manufacturers quote delivery times of six months or more, and some equipment is becoming more expensive. All this increases CapEx, with equipment depreciated over a longer period of time. The second important factor is the global economic crisis, which is cutting investment in long-term television projects.

For these reasons, the market behaviour of equipment purchasers is changing: they are changing the range of equipment they buy, and declining to launch expensive new projects or implement expensive technologies. On the other hand, they are discovering new brands, solutions and workflows. For many, it comes as a surprise that solutions they had not paid attention to before are efficient and well suited to their tasks.

As a result, the way the industry deals with equipment and finances is changing, and small and medium-sized manufacturers are getting a chance to enter the major market.

One example is the use of an IP video mixer for IP broadcasts using mobile phones (with a specially installed application) as cameras. This solution comes from a decidedly non-broadcast-grade market, but it nevertheless provides quite high-quality video, which can be used by top TV companies as well. It is no secret that mobile phone manufacturers are making amazing progress in improving the quality of their cameras.

Computational photography algorithms make it possible to achieve very high-quality video. Using mobile phones as a video source has a couple of disadvantages: the absence of an SDI output, which is solved by transmitting H.264-compressed video (up to the Hi422P profile, 10-bit 4:2:2) via WiFi, and the absence of zoom optics, which is also not a problem for some types of TV broadcast. But the whole solution provides decent-quality video while being very inexpensive and compact.”

Dave Gill, Chief Technology Officer, AE Live

“It’s all about choice” best describes the state of sports and live production heading into 2023. Broadcasters and rights-holders need creative ways to hyper-target and super-serve their changing audience demographics. Consumers with different viewing habits have more content options than ever, making their valuable eyeballs harder to attract and retain.

Audiences expect immersive, interactive, and graphics-driven experiences, with Extended Reality (XR) or Mixed Reality (MR) as the physical and digital worlds converge.

Virtual Studio environments are at the heart of this convergence, expanding what's possible for delivering a brilliant, photo-realistic home viewing experience. While the technology can be expensive, many costs are up-front as broadcasters don’t have to invest in building multiple studios to cover different topics. Going virtual provides nearly unlimited creative and editorial freedom.

The increasing use of gaming engines is enhancing virtual production technologies. Unreal Engine, for example, works with existing graphics workflows to create impressive effects from a real-time render engine. As these engines’ toolsets continue to improve and deliver real-time services and applications, we’ll see them move beyond virtual studios, AR, and VR to become a key element of all graphics production.

Audiences prefer to drive their own personalised graphic journey. For service providers, that means re-evaluating our offerings, from 9:16 video content delivery to accommodate users scrolling vertically on their phones, to tighter graphics integration with their social media platforms. Our industry’s focus has to be on finding new ways to keep audiences engaged, while also giving them plenty of choice.

Jon Raidel, Global Lead, Live Production in the Cloud, Vizrt

“It’s clear that customers are rethinking workflows and looking for ways to improve their efficiency. They want to produce more content with less complexity – for instance by aiming for a smaller carbon footprint or working without the need to roll out a production truck.

Considering how important quality sports content has become, and not just for big broadcasters covering top-tier sports, it’s crucial that the right technology considers the need to cover all levels of sport. To meet this demand for simple and efficient production workflows, solution providers are looking at the possibilities of cloud production.

Cloud makes workflows simpler, remote production possible, and carbon footprints lower. Take BT Sport, for example, which joined with UEFA to produce the UEFA Youth League tournament fully in the cloud, using Viz Vectar Plus. Since mobile trucks were not needed, camera feeds were sent to the cloud and the games were covered smoothly, with a 25% reduction in carbon footprint.

The efficiency provided by cloud live production points to this solution being a major choice of broadcasters and content creators in the future. Making workflows simple and easily adaptable, not to mention lower in overall cost, is a win-win for any type of coverage.

This past year, when speaking to broadcasters and integrators alike, I noticed recurring concerns – like the expanding control room footprints and the lead time it takes for the infrastructure to be put in place. A key trend we’re going to see is a search for a production set-up and workflow that addresses all these concerns but is also cost-effective and scalable – and the solution is live production in the cloud.”

Phil Myers, CTO, Lawo

A welcome trend for Lawo is that SMPTE ST2110-based IP infrastructures are now going mainstream. In Germany, Belgium and across many other countries, stadiums are connected to permanent ST2110-based networks that link them to the production hub, or even hubs (plural) as well as to other stakeholders, using bi-directional essence transportation. Temporary or long-term bandwidth issues for video essences are increasingly addressed via lossless and efficient compression strategies.

The ultimate goal is to make more efficient use of both processing resources and crews, as well as to drastically reduce carbon emissions and travel times, whether for major global events or routine production scenarios that increasingly involve a variety of locations.

What is certainly needed is a relentless focus on the user experience and a clear vision of where the broadcast industry is headed. Operators rightfully expect plug-and-play connectivity for their IP systems; they need almost instant scalability; they appreciate the ability to configure any and all devices from a single location with just the right amount of parameter complexity; and they wish to protect their network and content from intruders. This is being addressed with Lawo’s HOME management platform for IP infrastructures.

Due to shrinking budgets, and with a view to serving broadcasters even more efficiently, the software-defined approach for video and audio hardware is bound to evolve beyond the current horizon. Finally, staying on top of looming paradigm shifts and shaping them as they materialise will be more important than ever before.

 

James Ransome, Business Development Manager - Sports & Live Events EMEA, Ross Video

“2022 was an exciting year in Sports Broadcast. With real-time information increasingly available to sports fans through digital and social media platforms, fans now expect to have a fully engaging experience from their own homes. This includes real-time data on screens. That means showing people at home and in the stadium, with compelling visuals, how far players have run, how many passes they’ve made, and the speed at which the ball hits the net. We are seeing this with the ongoing World Cup in Qatar with various in-venue and app-based technologies.

We expect this to continue into the future as fans demand further expert analysis of key moments in real-time across sports. With technology like Ross Video’s XPression & Piero, presenters can explain these complex topics quickly and in an engaging way.

In 2023 it’s all about maximising investment, and essentially doing more with less. There’s increased competition for sports rights, and with this in mind, I expect to see broadcasters making decisions based on flexible technology, and the ability to scale up and down. Cloud technologies like Graphite CPC and Hyperconverged Platforms like the Ultrix Acuity and Ultrix Carbonite are going to be at the forefront of live video production in Sports.

It will also be exciting to see the continued adoption of cable camera systems, such as Spidercam, that provide stunning aerial visuals in stadiums across Europe. Events such as the World Cup and the NFL London Series have used Spidercam extensively to give viewers a different perspective.”

Robert Szabo-Rowe, SVP Engineering and Product Management, The Switch

“The media industry is continuing to understand the capabilities of IP-based transmission, and that it can be reliable if the service design is optimised. This has challenged the transmission market in 2022 in areas where broadcasters historically used satellite-based transmission. Although IP-based transmission has a higher latency than dedicated fibre, there are certain types of use cases where this is appropriate.

All of the major streaming platforms now have live content offerings or investments as a way to make customers stick. The streaming giants – from Apple to Amazon to Netflix – are all eyeing and buying up live event rights, and this will certainly continue into 2023. This, in turn, has an impact on the production services market, as these platforms don’t have their own broadcast capabilities despite pitching for top-tier sports rights. So they’re securing production capabilities from third parties.

This also has a consequence for broadcasters and TV networks, which now need content to replace their lost sports assets. Live programming that was perhaps not previously regarded as right for core content on premium-level channels is now being shown, with the upshot that new sports and other areas of content are being opened up and introduced to mainstream audiences.”

Julien Signes, SVP and GM of Video Network & Simon Brydon, Snr Director of Security Business Development at Synamedia

  

“2022 was all about how the future of live sports streaming would play out. We’ve been keeping an eye on the likes of Apple and Amazon Prime Video as they experiment with strategic rights buys in certain markets, while Netflix has also gone on record warming to the idea of offering live sports. But this learning phase has to be quick, so in 2023 we’ll see whether these streaming giants decide to go big or go home.

Sports streaming at scale is not for the faint-hearted, and achieving profitability is no mean feat given the cost of sports rights. Although on paper it sounds like a challenge suited to big streamers with deep pockets, they really don’t like the territory-by-territory rights model.

Whether you are watching a live event on a big screen outside the stadium or in a bar with others, you expect a premium experience and pin-sharp quality. Fan zones are not limited to popular sports - think of the Eurovision Song Contest and the Last Night of the Proms, with all those sequins and all that pomp in glorious detail.

That’s why we believe fan zones will drive the adoption of 8K for their big screens. We first demonstrated live sports streaming in 8K with BT in early 2022 and, through our discussions with operators, we all have high expectations for the next 12-24 months.

The fan event experience is about to get an overhaul too. Imagine a £100 match ticket with a QR code offering a download of an app with a bunch of fun in-stadium features, such as watching the game from a different part of the stadium, replaying a tackle, keeping an eye on a rival’s match, or placing a bet. Meanwhile, F1 fans could view the track from a drone or drop into a car for the driver’s view.

The infrastructure needed includes 5G, WiFi, CDN with WebRTC support, and support for multiple latencies within a streaming service. The augmented fan viewing experience and 8K will also underpin the metaverse as sports brands start experimenting there.”

 

Grigory Mindlin, General Manager of Broadcast at disguise 

“In the past year, extended reality and real-time graphics have become increasingly important to our TV experience, as major broadcasters such as Eurosport, ITV and NBC have started to adopt these innovative new techniques to deliver engaging sports broadcast productions, with disguise at the heart of their workflow.

Earlier this year, we were proud to enhance our production toolkit with the introduction of our Porta cloud-native control system for broadcast graphics and our px high-performance render hardware. Together they allow broadcasters to start using Unreal Engine in all aspects of graphics generation, whether it’s a ticker, an AR graphic or a high-fidelity immersive environment, and to control it all easily from the disguise workflow.

 

Over the next few years I would expect to see all graphics rendered in the cloud and on demand. With the increasing use of AR graphics and more broadcast platforms emerging, they will also be customised for each viewer based on the viewer's preferences.”

Peter Abecassis, Sr. Product Marketing Manager at Grass Valley

“As the cost of premier sporting event broadcast rights increases, media companies are looking to maximise the use of these rights to reach as many audiences as possible. This involves versioning the event for multiple audiences, whether that’s distribution in different languages and regions, or reaching different demographics through different commentators or platforms. A primary example is introducing sports to audiences in other geographies: American football broadcast in Europe, or European football in the US.

In addition, creating extra content using highlight reels and fan reactions for social media is key to extending the reach beyond just the live event itself. Many demographics are only consuming live events in shortened formats or on social platforms and getting these packages prepared in real-time keeps engagement high.

GV Playout X allows rights holders to quickly spin up new channels. This gives them the flexibility to try out new markets and run special broadcasts without a huge upfront investment. Because it is based on AMPP, this cloud-based solution can be spun up quickly and operated from anywhere. Combined with other AMPP solutions, like Elastic Recorder X for ingest, LiveTouch X for replay, and Framelight X for editing and asset management, automated workflows can easily be built to create highlight reels and share them on social media while the event is still taking place. Innovative approaches to repurposing live content for multiple audiences will help the media industry increase the return on today’s broadcast rights.”

 

Gabriel Baños, CEO of Flowics, part of the Vizrt Group

“To keep up with the frenetic pace of live sports production and control costs at the same time, more and more sports broadcasters and production companies have been turning to SaaS-based solutions for their graphics. Cloud-native graphics creation simplifies the overall production workflow and reduces equipment and travel costs. Through a platform like Flowics, broadcasters can streamline live productions and engage fans with interactive experiences for venues, live streams, broadcasts, and digital properties.

We expect the use of cloud-based graphics production to intensify in 2023. Not only is it the perfect complement to the cloud-based workflows that are becoming more common in broadcast facilities today, but it simplifies the process and makes it more efficient. Among the many benefits, SaaS-based graphics tools offer flexibility, easy collaboration, and easy graphics versioning. Because everything happens in the cloud, there’s no need to ship or rent equipment. And broadcasters get access to a larger talent pool while lowering travel expenses.

Speaking of talent, working in the cloud creates better work-life balance for production crews, who are on the road nonstop throughout a typical sports season. Being able to create and operate graphics in the cloud means they can stay home and work remotely instead.”

Thursday 29 December 2022

What Will the Metaverse “Experience” (Experiences) Entail?

NAB

Author and futurist Mark van Rijmenam believes that in a physically and digitally merged world our identity, personality, reputation, and assets “can be used in new ways so that people can create their own unique, magical experiences.”

article here

He outlines in a post on Medium how some of them will change training, education and marketing all for the better.

Education

For instance, “digital twins”, or replicas of factories, can be used to train employees in a safe working environment until they master the skills to go out into the real world.

Jeremy Bailenson, founding director of the Stanford Virtual Human Interaction Lab (VHIL) and founder of the VR training company Strivr, has called education and training the “home run” use case.

“In the metaverse, skills development and training could be revolutionized, drastically reducing the time needed to acquire and develop new skills,” says van Rijmenam.

For example, an AI-enabled digital coach could provide employees professional advice and training assistance. In addition, all objects (like a training manual, machine, or product) could become interactive, providing 3D displays and step-by-step instructions.

That seems reasonable, even inevitable. It must be easier to follow assembly instructions for a new bed via an app that connects to barcodes on each part of the kit than from a paper instruction booklet written (badly) in multiple languages.

“The fact that we have not innovated our teaching methods in the past 100 years is remarkable to me,” writes van Rijmenam, who believes virtual reality can change education as we know it.

“We should embrace the latest technology, from AI coaching to virtual and augmented experiences, to prepare our children for a world that will look fundamentally different by the time they finish school.”

Research has shown that passive teaching methods like mass audience lectures are less effective than participatory teaching methods, which “drastically improve memory retention rates.”


An example might be a history class in VR combined with a discussion with the group after the class has experienced Ancient Rome using virtual reality.

“It would allow students to enter a virtual environment, interact with the teacher and fellow students, pause or play back a scene or session, and notice new things every time they visit or replay a scene,” van Rijmenam imagines.

“We could teach children the world of quantum mechanics by literally stepping into the microscopic world or showing the effects of climate change on any environment. The potential is endless, and it would probably result in a fun learning environment and the best ratings for the teacher and school.”

Marketing

Now, what can be done to change education can also be done for marketing. After all, says van Rijmenam, marketing is about educating future customers about your product, and the best way to do so is to offer them an experience.

And the best way to do that is to involve the creators, the artists, and the influencers who already have an in-depth understanding of the various virtual or augmented reality applications.

“From promoting artistic creativity to community building, we can expect a broad range of marketing innovations in the coming decade as we move from social media marketing to metaverse marketing.”

Metaverse marketing in the immersive internet requires a different perspective when reaching your target group, he says. Brands need to rethink how to create content, how people can interact with that content, and the capabilities and utility of that content.

He highlights four ways that the metaverse will change marketing.

Brands should create unique virtual experiences with low entry barriers, he argues. “This means enabling a seamless experience for your customers to interact with you in an immersive way.”

Connecting with customers is essential, so this means building a presence in the new virtual worlds, ranging from Roblox, Decentraland and The Sandbox to any of the hundreds of new worlds now being created.

Don’t copy physical reality but think out of the box, he prescribes, and create gamified rewards, virtual goods and NFTs “to celebrate your customers” and engender loyalty.

His final point is that, irrespective of the objective (education or marketing), user-generated content will play an increasingly important role in the metaverse. Whether this involves designing and creating the games, immersive songs, volumetric media, educational environments, virtual worlds, art and avatars that will liven up the metaverse, “it will be a creator economy, and UGC will be everything.

“The result is the Experience Era, where everything that we do can be a unique and immersive experience, which will likely make work, education and connecting with brands a lot more fun.”

 


IP Video Delivery in Review

InBroadcast 

p42 article here

We asked select opinion formers to identify the key trend in IP Video Delivery in 2022 and to highlight the major issues for the industry with regards to IP video delivery in the year ahead. Here’s what they told us.

Carl Petch, CTO, Telstra Broadcast Services

Over the last twelve months, a key trend in IP video delivery has been the transition to hybrid delivery, driven by a combination of fibre and Internet delivery, helping to sate the global appetite for live content. Hybrid delivery gives broadcasters greater choice and flexibility while still balancing the books in a way that works for all tiers of live entertainment and sport.

Next year will see the increasing adoption of cloud production tools. This is the continuation of what started eight years ago, when the industry began its journey to full end-to-end IP video delivery. However, security for video remains an issue for the entire industry. Moving forward, how we manage security in the video environment will be crucial. We are heavily focused on IT infrastructure, so that the secure delivery and hand-off of IP video content is seamless and protected.

Mark Horchler, Marketing Director, Products & Solutions, Haivision

When it comes to live video, IP is not only being applied to end-viewer content delivery, but to the entire video chain. This includes the first mile of live contribution and production workflows, both on-prem and in the cloud. All-IP video workflows, from SRT and 5G for contribution, through NDI and ST 2110 for production, to HLS and MPEG-DASH for delivery, with the cloud in between, are finally becoming a reality as the technologies become more interoperable and easier to deploy.

IP technology is also enabling remote collaboration between talent, broadcast engineers, and producers. Born out of necessity during the past couple of years, remote collaboration using IP video streaming and cloud technology is bringing exciting new innovations to the way live content is being produced.

In 2023 we can expect to see more exciting new live events being broadcast entirely with IP technology, from end to end. This will mean greater choice than ever before for consumers, including live sports, music, and comedy specials, delivered across linear TV channels and OTT services alike. Applying IP technology to mobile cameras and transmitters over extremely low latency 5G networks will also bring exciting new angles and viewpoints for viewers at home or on the road.

Steven Bilow, Senior Product Marketing Manager, Telestream

“Our industry is evolving from SDI to IP.  But SDI isn’t going away anytime soon. Advances in switch technology are driving speed and reliability. Broadcasters can now replace SDI with IP at lower cost. But challenges remain, and millions of operational SDI products are still deployed worldwide.

Most of the hurdles to making IP infrastructure a reality have been overcome. Facility designers are now versed in network design, using spine/leaf topologies instead of monolithic designs and planning for the bandwidth and expansion needs of 4K; and with it, more sources, destinations, and subtle anomalies. Proactive, exception-based monitoring and rapid problem resolution will become more critical.

With ST 2110 and ST 2022-6, plus compression like JPEG-XS, IP plays a bigger role in replacing SDI router systems. The trend to deploy products with native ST 2110 connectivity will continue, but ‘hybrid’ will remain the operative design concept in 2023.

Precision Time Protocol (PTP) is standard in IP media networks and serves facilities well. But the equally crucial SDI and analogue equipment must receive the same timing information via black burst or genlock. That need will be slow to diminish.
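For illustration, a minimal slave-only clock configuration for linuxptp's ptp4l in an ST 2059-2-style media network might look like the sketch below. The interval values match the SMPTE profile defaults, but treat the whole fragment as an assumption rather than a recommendation for any specific facility:

```ini
; Sketch of a ptp4l configuration for a media-network slave clock
; (SMPTE ST 2059-2 uses PTP domain 127 by default)
[global]
domainNumber        127
slaveOnly           1
network_transport   UDPv4
delay_mechanism     E2E
logAnnounceInterval -2   ; 4 announce messages per second
logSyncInterval     -3   ; 8 sync messages per second
```

Downstream SDI and analogue gear would still take black burst or tri-level sync derived from the same PTP-disciplined reference, which is exactly the hybrid timing situation described above.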

Finally, with scale comes complexity. The Advanced Media Workflow Association (AMWA) helps temper this complexity with specifications like Networked Media Open Specifications (NMOS).

Looking ahead, cybersecurity concerns will grow. SDI signals had limited connectivity to the outside world and no access to IT infrastructure. IP-based media changed this. Facilities are now more vulnerable to security breaches so protection will become increasingly important.”

Per Lindgren, Group CTO & Head of Sync

“More live content and more remote production sums up a key trend over the past year. The return of high-profile sports events, including the Beijing Winter Olympics and the World Cup in Qatar, alongside growing consumer appetite for live content, meant that media companies, production houses, and rightsholders had more sought-after content to work with. Industry players also had to cater to their audiences wherever, whenever, and however they chose to consume it, especially on digital platforms.

Media organisations increasingly realise that satellite and dark fibre transmission is too costly and inflexible to meet the requirements of the fast-evolving media landscape. The shift to IP media delivery is already demonstrating its potential and will continue to drive new opportunities that will delight audiences and bring more revenue to the industry.

Looking to 2023: While IP and cloud media delivery will define the future of broadcasting, transitioning to IP transport workflows is a process that requires careful strategising and implementation. Completely overhauling existing workflows, processes, and infrastructure to transition fully to IP isn't the most efficient or realistic option for many media companies.

In 2023, we will see more hybrid workflows that leverage the flexibility and scalability of IP while making the most of existing hardware and software investments. Over the next 12 months, media organisations will be planning their move to IP media delivery, bringing more innovation and expertise into their business. We also see IP media security, with new functions such as the IP media trust boundary, becoming critical. It's the responsibility of media tech vendors to become the partners that guide the media industry through its IP transformation journey.”

Matt Hughes, Chief Commercial Officer, M2A Media

“The explosion of content continued in 2022, and with it came increased demand for cost-efficient methods of delivery that are scalable and flexible. At M2A we've seen an uptick in enquiries from sports rights-owners keen to move their video workflows into the cloud, because in doing so they are able to realise the full value of their live video content. Delivery over public cloud can not only increase the number of video feeds an organisation can get to its takers across the globe; combined with an intuitive software solution that an operations team can drive, it can help them scale without increasing engineering workload.

The sports live streaming market is estimated to grow from $18 billion in 2020 to $87 billion by 2028, and with major sports rights deals being snapped up by Big Tech, it's unlikely that we'll see a decrease in these volumes of content. This is really driving innovation and collaboration in the public cloud space. Once broadcasters and rights-owners adapt to using public cloud for acquiring, distributing and routing live video, they then want to do more in the cloud. Transformations that were previously the sole preserve of on-premises workflows, such as dynamic graphics, frame rate conversion, audio commentary and live capture, will really come to the fore in 2023, and we'll continue to see a shift towards fully cloud-based, end-to-end workflows.”


Say Hello to HVOD, Your Current and Future Streaming Business Model

NAB

article here 

As if we needed another acronym to describe internet-connected TV business models… well, we have one. HVOD (hybrid video-on-demand) services are streaming apps that offer both ad-free SVOD and ad-supported (AVOD) tiers, and this, says Samsung, will be the dominant business model going forward.

Samsung has a horse in this race since it makes smart TVs and has its own advertising platform to support the content streaming apps running on them.

“HVOD apps offer consumers more choice by incorporating an ad-supported tier at a free or reduced price,” says Samsung in its new “The Streaming Index – Retention Rules” report. “The hybrid model works to attract consumers based on their interest, engagement, and willingness to pay. As streaming services vie to be one of the limited number of go-to apps that consumers use, the wide adoption of the HVOD model marks an important turning point in the evolution of streaming.”

With nearly all [Samsung] households now watching streaming content, growth for this sector will come from increased usage and time spent. Indeed, viewers are streaming more regularly and for more time, according to data culled from 45 million US Samsung TV owners and an accompanying survey of 1,000 owners conducted this quarter.

In the third quarter of 2022, the average monthly number of streamers on Samsung Smart TVs increased by +17% versus the year prior. Time spent with streaming content increased by +31%. This growth in audience is great news for streaming services. But with the continued increase in streaming choices, acquiring and retaining loyal users becomes more challenging.

Streamers are settling into regular viewing patterns. On average, 23% of streaming services' active monthly users were new in Q3, a decline of 12% compared to a year ago. A little more than 50% of users are retained users, those watching at least once a month, each month. Existing or retained users are crucial to app growth. Retained users also represent more than 70% of an app's viewing time, on average.

Retained users are even more critical for the largest streaming services (the top 20% by average number of monthly users). For those apps, these users represent 69% of the user base and nearly 90% of viewing time, on average. (Tier 2 is the next 20% of apps by user numbers, and Tier 3 is the 60% of apps with the smallest user counts.)

Analyzing the report, Karlene Lukovitz at MediaPost says streaming services face daunting competition. In any given month, nearly half of an app's active users are at risk of churn. On average, churned users are seven times the size of an app's active user base.

Large/Tier 1 apps have a significantly lower churn ratio (1.7) than other apps, but that is still up by 27% versus last year, according to Samsung. Even the largest services have lost nearly twice as many users as their active user bases over the past four months.
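To make those ratios concrete, here is a quick back-of-the-envelope calculation. The 100,000 active-user figure is an invented example; only the 1.7 Tier 1 churn ratio and the 23% new-user share come from the report quoted above:

```python
# Illustrative arithmetic for the Samsung report figures.
# The active-user count is hypothetical; the ratios are from the report.

def churned_users(active_users: int, churn_ratio: float) -> int:
    """Churn ratio is defined as churned users divided by active users."""
    return round(active_users * churn_ratio)

active = 100_000                       # hypothetical Tier 1 app
print(churned_users(active, 1.7))      # Tier 1 churn ratio of 1.7 -> 170000
print(round(active * 0.23))            # 23% new users in the month -> 23000
```

So even a well-retained Tier 1 app of this (invented) size would have shed 170,000 users against 100,000 active ones, which is the sense in which churned audiences dwarf active ones.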

Given economic pressures, a service being free or low cost is now the top reason cited by consumers for trying a new app. The corollary is that the more expensive a service, the more likely it is to be cancelled.

All the major streamers have built out or plan to launch HVOD services. Netflix and Disney are the latest to join the party. Apple TV+ has an AVOD tier planned for 2023.

The HVOD services analyzed in this survey exhibit an average retention rate of 63% among their active audiences and this has grown by 6% over the past year. Both SVOD-only and AVOD-only services (offering only one subscription tier) have lower retained user shares.

Moreover, fully three out of four SVOD subscribers surveyed said that they would switch to a lower-priced, ad-supported option offered by their current provider.

“In other words, it’s no mystery why all Tier 1 streamers will already be HVODs by the end of this year,” says Lukovitz.

When it comes to why consumers use some of their existing apps more than others, content still dominates. The most-cited usage drivers are “service has a deep library of the content I view most” (15%), “service has content not available elsewhere” (15%), “service lets me catch up on shows that I missed on cable or broadcast TV” (14%), and “service has new content frequently” (12%).

However, “service is free or lower-cost than other services” was cited by 10%, tying with “service is easy to use.” “Service does not contain ads” was cited by just 7%.

 


Behind the scenes: The Lord of the Rings - The Rings of Power

IBC

The Lord of the Rings: The Rings of Power is the new reference point for post-production in the cloud. Arguably no show of this scale has worked in the cloud so comprehensively as the Amazon show.

article here 

Across the eight hours of the first series a mammoth 38,090 shots were captured, of which 9,164 were visual effects. By contrast, Marvel movies routinely cater for 2,000-3,000 shots. Footage totalled 1,648 content hours and, since it was shot in 4K UHD, that translated to 860TB of data. The post-production pipeline for the show was built entirely in the cloud, from the ingest of camera raw and metadata, including the whole gamut of image science, through VFX pulls, returns, conform and finishing, to all deliverables.
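As a sanity check on those figures, the implied sustained data rate can be worked out with simple arithmetic (decimal units are assumed here; this is our back-of-the-envelope, not a number from the production):

```python
# Back-of-envelope: 1,648 content hours of 4K material totalling 860 TB.
# Assumes decimal units (1 TB = 1e12 bytes).

hours = 1648
terabytes = 860

seconds = hours * 3600
avg_gbps = terabytes * 1e12 * 8 / seconds / 1e9   # average bitrate, Gbit/s
avg_mb_per_s = terabytes * 1e12 / seconds / 1e6   # average throughput, MB/s

print(f"{avg_gbps:.2f} Gbit/s")    # 1.16 Gbit/s
print(f"{avg_mb_per_s:.0f} MB/s")  # 145 MB/s
```

An average of roughly 1.2 Gbit/s across every captured hour gives a feel for why the show needed dedicated cloud ingest rather than ad-hoc uploads.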

Production was based in Auckland but required 12 VFX shops around the world, which in turn farmed work out to 1,500 artists, plus hundreds more in New Zealand, all linked and collaborating on assets in the cloud.

You can argue that an Amazon show, which over the course of five planned series is budgeted at $1 billion, can afford to put all its content in the cloud. You might also imagine that AWS would provide favourable rates of storage, movement, ingress and egress for the flagship show of another part of its business. 

But putting all of your eggs in one basket (in this case, an S3 bucket) might be deemed a risky venture given that, by their own account, the team had no idea how to build such a global interconnected cloud pipeline when they set out.

The innovation seems to have paid off. 

“That I would be able to manage and produce all of the technical departments from pre-visualization to exhibition, meaning all of the technical trades including camera capture, colour pipeline, editorial, post-production, visual effects, final archive and delivery, would be connected and interconnected was the real key, the power,” said POTR producer Ron Ames.

“This is not about a technology as a standalone thing. It is actually practical, it is useful, it is efficient. It is about making art.

“So when we talk about the Movie Labs 2030 vision it's right now. It's doable and it's actionable and it is useful.” 

Starting with a map 

Author JRR Tolkien described his process of creating the world of Middle Earth as beginning with a map. Trying to make the story fit the other way around, he wrote, “lands one in confusions and impossibilities and in any case it's weary work to compose a map from a story.” 

So, the Rings team started with a map also. “We recognised that we were going to have to plan this very carefully,” Ames said. “We moved to New Zealand at the end of 2019 and started to build a cloud production map.”

This came into its own after three months of shooting, when Covid hit and production was forced to close. They reverted to cloud-based production from home.

“We were doing cloud production anyway but Covid made it a requirement and sent us to the next level of development.”

Perhaps the most important aspect of the pipeline was metadata management. It was the glue that enabled hundreds of people to work together on assets at the same time in multiple locations.

“We think of these things as dry and technical but they're not - they're art making,” Ames said. “Underlying all of this are thoughts and ideas, notes, about a note of music or a single line of dialogue. Every one of those was a potential asset.” 

Metadata is most precious 

Metadata collection was standardised, centralised and comprehensive. Lens and camera metadata was captured on set automatically from ARRI DNA smart lenses.

Each department had its own iPads and its own software that fed into QTAKE and was then ingested via Autodesk’s review tool Moxion into every frame. The metadata was logged and tracked in ShotGrid and stayed with the shot all the way through editorial, sound mixing and beyond.

“We could always identify the nature of that frame, where it came from and how it was connected to the rest,” Ames said. “Each department only cared for the metadata that was important to them so we knew the information was accurate.” 

Remarkably, all footage was whisked from set to cloud without any on-the-ground LTO tape backups.

“We were actually concerned that we would lose data or that somehow or other it might be corrupt because there's no checksum,” said Ames.   

But none of that was required. Instead, they used just one S3 bucket, “one bucket to rule them all”, with push and pull permissions for all artists and vendors.
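A 'one bucket' setup with per-vendor push and pull permissions could be sketched as an AWS IAM policy along these lines. The bucket name and prefix are hypothetical, since the production's actual configuration is not public:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VendorPushPull",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::one-bucket-to-rule-them-all/vfx/vendor-a/*"
    },
    {
      "Sid": "ListOwnPrefixOnly",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::one-bucket-to-rule-them-all",
      "Condition": { "StringLike": { "s3:prefix": "vfx/vendor-a/*" } }
    }
  ]
}
```

Scoping each vendor to its own prefix is one standard way a single shared bucket can still give every party only the push and pull rights it needs.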

The 9,000+ visual effects shots were divided among twelve vendors including ILM, Weta, Plains of Yonder, Outpost, Method Studios, Atomic Arts, Rodeo FX, Rising Sun Pictures, Cause and FX and Cantina, all of different sizes and with different internal pipelines, spread all over the world.

Ames said, “The price of admission to join us was a willingness to explore new technology, to share assets in a standard way so that nobody could put a gadget or a gizmo on a visual effect that meant somebody else couldn't use that asset. So by creating a standard we determined that we could actually do this. 

“We doubled, even tripled, our data because we had never done this before but by the end we had total faith. We never lost a frame, we never lost an asset. All of it was trackable and easily shared with our vendors.” 

None of the production team had paper scripts. Nothing was kept in any form other than digital, and everything could be shared immediately via iPads. When producing under Covid protocols, team members needed to be separated. They put monitors around the stage “like a TV network” so all three shoot units could collaborate at a distance using iPads.

Collaborative production 

Since no one can know which asset might have value, they tracked every single iteration from the very beginning. Concept art, set drawings, tests: all materials created at every stage of the show were shared on the cloud platform. This meant that departments like marketing could access work much earlier than normal.

“The ability to share in this democratic way changes everything,” Ames said. “I cannot describe the power of it. Walking onto a set and having a cut or a diagram or a note from the showrunners - anything that was necessary was available at all times at the push of a button, securely, with permissions.

“I could ask someone to show me every asset to do with King Durin III and it would show me a list of a set drawing, a beard layout, a costume note, all of which could be easily found and shared.” 

Into post  

They created a common asset methodology using USD (Universal Scene Description, originally devised by Pixar as a means of tracking 3D objects).

“You can look at every one of our assets and know precisely what it is and it could be shared with any other vendor,” Ames said. “And no vendor could get into trouble.” 

4K colour pipeline  

With the showrunners, DPs and directors in New Zealand watching a reference monitor, a colourist in Idaho was able to run real-time collaborative DI sessions.

“We could watch the colour just as if we were in the room with him,” Ames said. “All the files were cloud-based with no local files anywhere.” 

Archive  

Ames has worked on digital restoration projects for Powell and Pressburger’s classic The Red Shoes and for Martin Scorsese’s Gangs of New York, and knows only too well the perils of storing nitrate film.

“When I asked for the answer prints to Gangs of New York no one, including Marty, knew what the colour was supposed to look like. If you’ve ever tried to restore something from LTO tape it's near impossible. Now, we have the technology to keep assets forever in the cloud with colour information and everything that is required.”

Localisation and QC 

The team were even able to work from home on the massive QC and language localisation project by spinning up virtual machines and moving giant files onto and around the cloud platform. 

Season 2 

Storage of every asset in the cloud also gives the team a jump-start on Season 2, which is already filming. “Every single piece of artwork that we created, every piece of cut footage, music, sound, concept art is all at our fingertips.” 

“When we're on set and a director is chasing the light and it's dirty and it's cold, we have every possible asset at our fingertips and can share it. When we had crews spread across New Zealand and artists all over the world, we were able to connect and communicate efficiently, and scale as necessary to meet all of our goals and deadlines cost-efficiently.”

At the end of Martin Scorsese’s The Aviator (on which Ames worked), Leonardo DiCaprio, playing Howard Hughes, stands in front of the mirror and says to himself over and over, “The wave of the future.”

“That's what we did every single day,” Ames said. “We knew that this is the wave of the future and the future is now.” 

Company 3 Synapse 

Company 3 was the leading technical partner. It built the whole post-production environment in the cloud for POTR.  

“We took the traditional facility and put it into the cloud,” said Weyron Hendriques, SVP of product development at Company 3. “On the finishing side we also have HDR and SDR reference-quality streams distributed over AWS. We used the whole technology to collaborate on the show with a colourist in Idaho and conform on the US West Coast, liaising with facilities in New York, Auckland and London.”

Company 3 integrated its workflow automation system Synapse into AWS. The concept is to provide uniform, high-quality content to downstream post-production pipelines. The tool handles dailies ingest for all on-set capture, syncs the sound with picture, checks the quality of what was shot and makes QuickTimes for editorial.

“Modern productions use many camera platforms with multivariate colour spaces, picture geometries and metadata formats,” explained Hendriques. “Synapse connects the metadata to all of the camera source files. This allows the system to create editorial, VFX and finishing deliverables with the correct colour and geometry.”

On POTR, on-set work and dailies were in Auckland, with conform in Sydney. Dailies of camera raw and metadata were uploaded to a Sydney AWS bucket. Synapse harvested the camera raw and metadata from the AWS bucket and built its database in the Synapse core in Los Angeles.

For VFX, the editorial team would submit a visual effects pull request into Synapse. Synapse orchestrated virtual machines in Sydney to process the VFX pulls from the camera raw, then delivered the processed pulls to each vendor’s cloud bucket. The process made the round trip in reverse for VFX returns.