Thursday, 15 October 2020

Remote working sees Blackbird soar higher

InBroadcast 

Ian McDonough, CEO, Blackbird 

 

http://europe.nxtbook.com/nxteu/lesommet/inbroadcast_202010/index.php?startid=24#/p/24

 

You’ve worked at the top of some of the biggest media companies in the world including Viacom and BBC Worldwide. Why did you join Blackbird?  

 

I enjoyed the corporate world for many years. I was running a $150m P&L which sounds big but was tiny in the blue-chip media landscape. Over time it became a little too bureaucratic and had lost its edge for me. I was really keen to back myself in a business where I could make many more of the key decisions without swathes of board approvals. I looked at private equity but Blackbird was one step better. Here was a listed UK company with a proven, world-leading technology gem at its core at a time when the industry was moving to the cloud. It required a reboot: a focussed strategy, clear branding and a strong front office team to take it forward. I was confident I could deliver on that. 

 

 

What were the ‘gems’ as you saw them? 

 

The company had developed an incredible web-based video codec from scratch and built a fully featured, sophisticated editing toolset around it. I didn’t know quite how special it was until I took it to the biggest tech companies on the West Coast. At Google, Apple and Microsoft, the universal response was ‘Wow! We’ve not seen that before!’. That told me that instead of being a strong, niche company, this could be a world beater. That was when I was persuaded to invest in the business directly and persuaded my family to do the same. 

 

 

That was in 2017. What key developments have happened since? 

 

The main goal for us was moving from being an artisan product in Soho servicing a small part of the post industry to being a much more industrial product under a new brand and company name. 

A big breakthrough was striking a major three-year deal with A+E Networks which demonstrated how hundreds of users every single day were accessing their archive, adding value to it and publishing content directly from Blackbird. 

Another deal, a five-year agreement with a major New York-based global news organisation closed in November 2019, took us to the next stage. This was for the production and delivery of live news, daily, to a worldwide audience. 

In the background we were also winning multiple smaller deals with a number of OEM operations – Deltatre, IMG, TownNews – across a wide variety of sports and 50 US TV news stations. This gave us an insight into how the business side of things was going to take off by working through third parties, so integrating Blackbird into other systems became a priority. 

 

 

What has been the impact of COVID on Blackbird and its customers? 

 

Ironically, one of the things we didn’t talk about much in the first two years of Blackbird was remote working. Our technology is incredibly fast and efficient and we’re able to scale easily because anyone can use us in any browser on any laptop over bandwidth as restricted as 2 Mb/s. But culturally, remote working was not high on the agenda. Since COVID-19, remote has become the most important aspect.  

 

The competitors in this area are traditional on-premise solutions who have virtualised cloud offerings. To operate such a system in virtual instances requires 30-50 Mb/s of bandwidth, local and cloud storage and a very fast GPU on the ground. Blackbird’s footprint is insignificant in comparison. This allows people to work from home. Any home.  

 

The longer-term move toward cloud has been on the cards for a number of years, but a dramatic cultural shift needed to happen, and COVID has been that shift. People don’t want to go back to five days a week, many would prefer a hybrid model, and it all means tools like ours which are cloud-native will be increasingly in demand.  

 

The genie is out of the bottle on flexible working. As a CEO and someone who has run large teams, I admit to being sceptical about working from home, but now we’ve seen how productive teams can be when individuals are handed responsibility with the right tools and left to their own devices. In future we will all see a better work-life balance. 

 

How important is sustainability to Blackbird? 

 

Sustainability manifests in three ways. First, users of Blackbird, such as editors, don’t need to travel to location or get on a plane. Secondly, we don’t need any specific hardware, which means the carbon emitted in manufacturing, distributing and ultimately disposing of equipment is very low. But really what is key is that we can move high bit rate content around at a fraction of the energy and power consumption of competitors. For example, if you just want to publish seven minutes of highlights from a 90-minute match, with Blackbird you can edit from anywhere and only those seven minutes need to move; the rest of the media can stay where it is. Because our software is so powerful, we can be part of the sustainability programmes of our customers and our investors.  
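As a back-of-envelope illustration of that last point (the 50 Mb/s mezzanine bitrate below is an assumption for the sake of the arithmetic, not a Blackbird figure):

```python
# Back-of-envelope comparison: moving a full match vs. only the highlights.
MEZZANINE_MBPS = 50  # assumed contribution bitrate, megabits per second

def gigabytes(minutes: int, mbps: float = MEZZANINE_MBPS) -> float:
    """Size of a clip at the assumed bitrate, in gigabytes."""
    return minutes * 60 * mbps / 8 / 1000  # megabits -> megabytes -> GB

full_match = gigabytes(90)   # ~33.8 GB if the whole match has to move
highlights = gigabytes(7)    # ~2.6 GB if only the edited highlights move

print(f"Full match: {full_match:.1f} GB, highlights only: {highlights:.1f} GB")
print(f"Data moved: {highlights / full_match:.0%} of the full file")
```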

 

 

Can you tell us about Blackbird’s latest wins? 

 

We have built on the idea of third parties selling Blackbird and struck a significant partnership with Tata Communications, one of the largest telcos in the world. 

In the last month we have also secured direct deals with Sky News Arabia and esports customers Riot Games and VENN TV, plus a win with a live events company in New York which will see Blackbird used for fast turnaround live coverage and social media publication of the Democratic National Convention and the US Open golf in August. 

 

What is on your product roadmap? 

 

The key areas to work on are data in and out, which includes AI such as speech-to-text metadata as well as video sources, and systems integration into OEMs. We are always improving the sophistication of our tools and pushing the codec and technology forward, but we know that there is a sweet spot where we are a fantastic, interoperable, web-based tool for super-efficient, rapid-turnaround, high-quality production. No-one else does what we do.  

 

How do you view the new normal? 

Optimistically. Our tool is very versatile and well designed for numerous use cases. For example, the National Rugby League in Australia was one of the first contact sports to return to action after lockdown and we were able to help their team execute production and, crucially, to do more work with fewer people all working safely and remotely. We have had several other sports clients go live since then including a deployment by the National Hockey League to assist with the NHL’s ‘return to play’ plan. Blackbird’s vital use in remote production not only conveys all the thrill of the action on the field but keeps the production team physically distanced and working safely from home. This will not only become the new normal for living with the virus but become a permanent fixture because of the productivity and efficiency gains it delivers.  

 

The cost of COTS

InBroadcast

Virtualized integrated playout systems are enabling broadcasters to shift from the playout centre to the cloud without having to redeploy, retool or maintain expensive, underutilised systems.   


https://t.co/07eGEXD1zy?amp=1

 

Automation is vital for efficiency; it has to be entirely reliable; and it has to be in continuing development to meet the challenges of new formats, new platforms and new delivery requirements.  Now technology has evolved to the point where it’s possible to deliver linear, non-linear and live content from the public cloud. Radical and continuing gains in off-the-shelf compute power, coupled with rapidly evolving IP standards for video and audio streams, are unleashing the potential of virtualisation.   

Hardware devices have been completely replaced by software modules, with SDI transport replaced by IP streams. This means the complete playout chain can be virtualised and deployed in a private data centre or public cloud.  
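The timing discipline that dedicated SDI hardware used to provide now has to come from software. As a minimal, generic sketch of the idea (the destination address is a placeholder and the dummy payload stands in for real SMPTE ST 2110 or MPEG-TS packetisation):

```python
import socket
import time

FPS = 25                        # PAL-style frame rate
FRAME_INTERVAL = 1 / FPS        # 40 ms per frame
DEST = ("203.0.113.10", 5000)   # placeholder destination address/port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def next_frame() -> bytes:
    """Stand-in for a real playout engine fetching the next encoded frame."""
    return b"\x00" * 1316  # dummy payload: 7 x 188-byte TS packets

deadline = time.monotonic()
while True:
    sock.sendto(next_frame(), DEST)
    # Pace against an absolute deadline so timing errors don't accumulate -
    # the software equivalent of genlocked SDI output.
    deadline += FRAME_INTERVAL
    time.sleep(max(0.0, deadline - time.monotonic()))
```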

“This technological transformation opens up a world of possibilities,” says Ciarán Doran, Exec VP, Pixel Power. “For instance, using a pop-up broadcast channel you can test a new service for six months without incurring high initial start-up fees. With a virtualized platform you can make adjustments as you go and get to market faster than your rivals.” 

Pixel Power Gallium Automation and StreamMaster Integrated Playout virtual machines can be spun up to play out content in sync with the main facility within 15 minutes from launch, he says.  
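Pixel Power hasn’t published the mechanics of that spin-up, but on a public cloud the generic pattern looks something like this hedged sketch using AWS’s boto3 SDK (the AMI ID and instance type are placeholders, not Pixel Power specifics):

```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Launch a playout VM from a pre-baked machine image. In a real deployment
# the image would already contain the playout stack and channel config.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI with playout software
    InstanceType="g4dn.xlarge",       # GPU instance type, an assumption
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "role", "Value": "disaster-recovery-playout"}],
    }],
)
print("Launched:", response["Instances"][0]["InstanceId"])
```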

As with all Pixel Power solutions, this is enterprise-scale software ready to run on standard IT servers, virtualized in a data centre or in the public cloud. Virtualised, software-centric integrated systems make it possible, for example, for an automation system to master control sophisticated graphics, switching and server ports in a unified way through a single interface, helping to significantly reduce broadcasters' overall equipment costs.   

“Your equipment options should not be constrained once you have committed to a particular automation platform,” Doran says. “The virtualised automation solution should give broadcasters the freedom to test, evaluate and execute business plans as and when needs dictate and without having to rip and replace bare metal.”    

Gallium FACTORY, for example, will fully automate the creation of promo versions as well as offer nonlinear, store-and-play content delivery operations such as IPTV, VOD, mobile and digital viewing applications.  

“Free of the traditional signal chain, running on COTS and leveraging software modules to be configured and reconfigured on command, Gallium FACTORY consigns automation lock-in to history.” 

Most vendors are launching integrated open automated playout systems with the flexibility to be deployed on-prem or in the cloud (or hybrids of the two). For example, Pebble Beach’s flagship automation system is Marina. Marina systems are built using modular service blocks, so the installation can be customised. Pebble Beach can provide constant synchronisation with disaster recovery applications and remote devices which protects the integrity and security of the system. Scalability is guaranteed with the ability to add an almost unlimited number of channels, operator positions and devices while the system remains live. Marina also uses a single system-wide database which allows for highly efficient media management. 

Orca is the vendor’s virtualised IP channel solution. It’s a software-only implementation of Pebble’s Dolphin integrated channel device. Dolphin and Orca share the same underlying architecture and operate under the control of Marina, making it easy to mirror channel templates for simultaneous playout to SDI as well as IP. 

Pebble Beach is also launching a new cloud-based, service-oriented technology platform, Oceans. This is intended to help broadcasters move or upgrade their broadcast workflows and services into the cloud, all from a single unified interface.  

“Oceans is designed to enable broadcast teams to handle their complex workflows in simple and intuitive manners, thanks to common core services across multiple functional applications,” states the firm. “Oceans provides greater visibility and control of the playout infrastructure. Regardless of scale, Oceans will also enable broadcasters to expand deployment as new services are released and allow them to write and integrate their own functionality using open and secure APIs.” 

Qvest Media’s cloud-based playout SaaS is called q.air. This combines cloud applications from various manufacturers for ingest, playout, automation, and graphics in one scalable package. These applications include HMS Media Solutions’ playout automation Makalu, Singular.Live’s graphics software Singular and the live video streaming application Wowza. Makalu also supports dynamic ad insertion into live OTT streams and digital broadcasts which helps generate additional advertising revenues. 

Q.air is orchestrated with the multicloud management platform Qvest.Cloud and runs on the AWS cloud infrastructure. To complicate matters a little, Qvest.Cloud is now being rebranded as qibb, the development for which is being handed over to an independent but Qvest-owned company. 

Peter Nöthen, group CEO, explains, “By transforming qibb into an independent company [called Techtriq], we can bundle resources better than before. The current focus is on products for content archiving, cloud playout and channel disaster recovery in the cloud. Further modules for content production and editing as required in fictional production or in live sports for instance will follow shortly.” 

The qibb package offers an integrated cloud app store for clients looking for a simple way to gradually move their IT infrastructure, software applications and workflows to the cloud.  

“With qibb ultimate, third-party software products can be efficiently orchestrated, managed, and analyzed either on-premise, with a single cloud provider, in a multicloud environment or in a hybrid model. System resources can be scaled according to demand as typical for the cloud while clients have the full control over running processes and user rights at all times.” 

Belgian media company DPG Media’s daily newspaper Het Laatste Nieuws (HLN) recently launched its new digital channel HLN Live using qibb ultimate. 

Czech developer Aveco claims to be the industry’s largest independent automation provider. Its range includes ASTRA MCR, the “only” master control automation that handles on-premises as well as remote stream splicing and cloud playout, all in the same user interface. 

For master control on-premises and cloud-based hybrid playout, Aveco has teamed with Harmonic for integration of Harmonic’s VOS 360 end-to-end video cloud infrastructure as a service platform. ASTRA MCR manages hybrid on-premises and cloud-based playout using a single user interface enabling users to manage on-air operations in multiple locations. 

“The ability to control on-premises, remote and cloud-based playout functions from a single interface brings new capabilities to broadcasters and media organizations who desire the best of all worlds,” says Pavel Potuzak, Aveco’s CEO. “For example, users can deploy SDI and 2110 playout on-premises, run remote stream-splicing ad insertion and handle cloud-based channel playout OTT, all managed from Aveco's automation screen. Media companies pick which operations happen where, based on their business model and services desired by their viewers.” 

Another example is where broadcasters use their on-premises playout for their main TV channels, while using the cloud for disaster recovery, for temporary channels and for niche/low cost channels. 

Softron says OnTheAir Video 4 “is probably one of our biggest releases ever with a host of new features to enhance both the user experience and the performance”. As an example, it is adding a built-in Character Generator for adding animated logos, tickers and lower thirds into a playlist. 

OnTheAir Video, which is optimised for the Mac, can be used for automated playout using its own scheduler or it can serve as a clip store for live news operations, local broadcast or live shows.  

“We have improved the built-in scheduler of OnTheAir Video for easy broadcast scheduling,” the company states. “We have also added features to be used by major broadcasters, such as a new integration with Wide Orbit Traffic software for advanced master control operation. OnTheAir Video can now be used as a playout automation solution with features that were until now available only with expensive solutions.” 

BroadStream offers the OASYS Integrated Playout solution, recently installed at public broadcast station KRSU, Claremore, Oklahoma. The timing for the launch wasn’t ideal given social distancing rules. According to Kevin Shoemaker, Chief Engineer at KRSU: “We know, moving forward, that if necessary our OASYS system and staff can perform all necessary functions remotely and our ability to broadcast will never be in jeopardy. The change from traditional Master Control to OASYS is a huge step ahead for us.” 

Based on a software model, OASYS runs on commercial off-the-shelf hardware, making it easier to support and maintain. The approach also reduces the overall requirement for hardware, multiple support contracts and multiple manufacturers. Its software modules can be tailored around solutions for SDI, IP and UHD, ingest and recording, program preparation, captioning, scheduling and more. 

PlayBox Technology has evolved its channel-in-a-box systems to the cloud under the brand name Cosmos.  The company explains: “The distributive virtualised architecture of the playout engine from Cosmos provides a fast and trusted cloud-based playout solution enabling broadcasters and service providers to spin up both OTT and traditional TV channels in a few minutes thus lowering the cost of ownership.” 

Channels can be hosted from data centres or from an MCR over a private or public cloud. It facilitates playout of file-based and live services and supports AES67 along with the main SMPTE standards. 

“Cosmos virtualises the process of channel management under control from a standard enterprise computer via a firewall-protected secure internet connection. Programme playout can be automated to any required extent while always retaining the freedom to insert live content.” 

iTX is Grass Valley’s integrated playout offering. It is claimed to be the world’s most widely-deployed TV playout platform for broadcast television. Among its attributes is IP/SDI format flexibility and scalability for future readiness, along with workflow tools for greater process automation and lower OPEX.    

For applications with live content that are “highly reactive”, Grass Valley’s Morpheus Automation is capable of scaling from small, single-channel systems to very large systems. Its scalability is a result of a modular architecture that allows users to tailor make a system based on the services and components required for the playout operation. Morpheus supports Grass Valley’s ICE Integrated Playout for SDI to IP and Masterpiece, the firm’s 12G-SDI master control switcher, along with a vast list of third-party devices. 

GV has made a clutch of recent iTX sales into India including at news channels News J, in Chennai; Hindi-language broadcaster Swaraj Express; and Marathi language channel, Lokshahi News. 

VEDA Automation is the playout platform from France’s SGT capable of managing local, thematic and premium channels as well as multi-channel playout centres.  It features a fully redundant client-server architecture to run multi-playout systems in 24/7 operations. It is compatible with systems from Harmonic, Grass Valley, Imagine Communications, Evertz and more and works with traffic systems including MediaGenix and LORA. 

The VEDA Automation client optimizes operations client-side by monitoring playlists and reducing clicks and popups. It’s possible to perform live automatic recording from a playlist and to set alerts for highlighting ad quota overruns, adapted to local regulations. 

wTVision develops integrated solutions for MCRs, offering scalable, flexible and customizable answers to channel demands. Its playout automation setups, also available in the cloud, have ChannelMaker at their heart and are extremely flexible. The product’s plug-in based architecture can adapt to fit a client’s existing structure or use wTVision’s suite of applications for ingest and trimming, asset management or 3D graphics. Its open architecture means ChannelMaker can be integrated with a wide range of third-party broadcast devices and solutions including Blackmagic Design, Grass Valley, Imagine, WideOrbit, Ross, Vizrt and Harmonic. 

To keep its partners on air, wTVision developed a web-based playout automation solution that makes it possible to monitor and broadcast all channels while keeping everyone safe.  

“wTVision has provided us with a remote integrated broadcasting environment guaranteeing the safety of our staff while maintaining a high delivery capacity and a high-end quality broadcast capability, across our linear and digital platforms,” stated Jorge Pavão de Sousa, MD at Eleven Sports Portugal, one of the many channels to which wTVision provided its safe playout solution. 

 

Behind the scenes: Project Power

IBC

VFX supervisor Ivan Moran grounds the superpowers of new Netflix sci-fi in plausible pseudo-science. 

In Netflix’s recently released sci-fi action film Project Power, anyone can potentially tap into their own unique super power, a conceit that gave the film’s VFX team tremendous scope to wield their own. 

https://www.ibc.org/trends/behind-the-scenes-project-power/6815.article

“We found ourselves in a really enviable place with almost a blank canvas to explore the drug and its effects,” explains Ivan Moran (Arrival, Ghost in the Shell). “A lot of the characters were changed based on the advice or designs we came up with.” 

Like most VFX Supervisors, Moran’s involvement in the storytelling begins with interpreting the director’s vision into concepts that can work technically and within a budget. He was instrumental in guiding the production away from wall-to-wall CGI and toward shooting as much practically as possible. 

“When I first met directors Ariel Schulman and Henry Joost it was clear that most of the movie plays as a kidnap crime thriller and they wanted a gritty, visceral reality to the VFX to match. There’s no fantasy world or alien planet to hide behind where you can take a more stylistic license. This had to be real.” 

Moran took inspiration from work he had done as a compositor on Two-Face for The Dark Knight (2008). 

“In my personal experience I find it easier to copy reality rather than completely invent it. If you create a shot out of nothing and it’s 100 percent CG, you risk falling into the uncanny valley where it might look real but doesn’t feel right. In contrast, when you work with real elements you can build from highlights and shadow detail. Two-Face was half real, half not, so that was the approach I took.” 

Starring Jamie Foxx, Joseph Gordon-Levitt and Dominique Fishback, Project Power depicts a murky world where a new designer drug can, for good or for ill, unlock five minutes of latent power in the person taking it.  

“The script suggested that individual super powers were derived from the animal kingdom in some way,” Moran says. “The problem we had at the beginning was why does everybody get a different power and how does the pill actually make it work? The directors tasked me with coming up with some pseudo-science to explain this.” 

Working with art director Jonathan Opgenhaffen and production designer Naomi Shohan, Moran’s research led them to root the drug’s side effects in actual science – that every living creature is subject to subatomic vibrations.  

“Each one of us experiences subatomic vibrations slightly differently, and that makes us unique,” he says. “Rather than altering the body itself, the power pill amplifies what is already there. It chemically changes your physiology to such an extent that it manifests itself physically.” 

This idea was coupled with the notion that all of us share DNA along the chain of evolution with every other type of creature or matter in the universe. 

“The power pill awakens remnant DNA, those animal basic instincts, and amplifies their innate characteristics,” says Moran. “For example, the character in our story who radiates fire signifies that he is boiling with rage on the inside.” 

Animal instincts 
One character started out in the script as an invisible man but Moran’s pseudo-science suggested they base the design on that of a cuttlefish which has the ability to camouflage according to its surroundings. 

Another key sequence references an armadillo’s leathery protective shell. In this sequence there’s a full-frame shot of Frank (played by Gordon-Levitt) staring into the camera as he takes - and survives - a bullet point-blank to the head.  

“The last thing we wanted was to create a fully-CG face of Frank/Joseph, particularly because his face fills the frame. We tried all sorts of ways to do this and settled on the old school method of firing an airgun at him. What gave it away was that his hair flapped up, so we had to replace it with CG hair.” 

This was filmed in slow-motion using a Phantom Flex 4K at 900 frames per second but not even that was fast enough to capture the muzzle flash or speeding bullet of a real gun so these elements were added digitally, referencing gun shots filmed at over 70,000 fps. 
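The arithmetic behind that choice is simple: at 900 fps each frame covers just over a millisecond, which is too coarse for a muzzle flash. A quick check, with the flash duration an assumed figure purely for illustration:

```python
# Why 900 fps misses a muzzle flash but ~70,000 fps catches it.
flash_ms = 0.5  # assumed muzzle flash duration in milliseconds

for fps in (900, 70_000):
    frame_ms = 1000 / fps
    print(f"{fps:>6} fps -> {frame_ms:.3f} ms per frame "
          f"({flash_ms / frame_ms:.1f} frames of flash)")
# 900 fps:    ~1.1 ms per frame, i.e. half a frame of flash - easily missed
# 70,000 fps: ~0.014 ms per frame, i.e. ~35 frames capturing the flash
```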

In keeping with the film’s aesthetic realism Moran devised an ingenious solution for an eye-catching sequence in which the character played by rapper Machine Gun Kelly becomes a superheated body of fire.  

Not only did this involve overcoming the perennial film-making challenge of convincingly setting a cast member on fire, but the sequence took place over 100 shots in nine different locations and culminates with the character being dunked underwater.  

“The hardest problem with fire is lighting. If you shoot an actor and digitally put fire on them it will always look fake because the actor is not illuminated by the fire itself.” 

Moran and the prosthetics and electrics teams developed programmable strips of flickering LED lights which were applied as a layer to the actor’s body, encased in a head-to-toe prosthetic suit.  

“The process took up to eight hours to install, and MGK chose to wear it three days in a row rather than go through the application repeatedly,” recalls Moran. “It was about 20-30 panels with wiring, battery packs and transmitters underneath the prosthetics including two on his collar bone to illuminate his face. We ran a looping fire gif so we could swirl ‘fire’ around his body, extinguish and reignite it and control colour temperature.” 

As well as providing a flickering, fire-like light source on the actor, the suit also illuminated walls and objects as MGK ran past them. They also had a stunt actor - actually on fire this time - recreate the scene to provide real-life reference. 

“It is ground-breaking. LED suits have been used before but rarely on camera. We just warped the in-camera effect in post so they look like coals on a camp fire as if illuminated through his skin. It would not be possible without that interactive light source.”

Powering up 
The most challenging scene was the finale in which Jamie Foxx’s character Art lays bare his explosive inner power. 

“It’s easily the most challenging sequence I have ever attempted to devise, plan and film in my career,” says Moran. “The shots were filmed at night, on location, in rain, at extremely high camera speeds which was immensely challenging given the lack of light.” 

The pseudo-science for Art’s power was based on how infrasonic soundwaves interact with physical surfaces and can theoretically change the state of water into plasma. 

“If you put two halves of a grape into a microwave, and run away, it will turn into plasma,” says Moran of the real-world effect he was looking for. “I don’t recommend you try that at home.” 

He continues, “We’ve seen so many explosions on film from nuclear bombs on down but we had to make this one different. We didn’t want the energy blast to come magically out of him. We want the audience to understand the physics and chemistry of the explosion actually happening.” 

The problem was complicated by having to explain his idea to directors, VFX team, cast and crew. “I could close my eyes and imagine how it was going to look but no one else could. I’d explain it to my team and they’d ask ‘is this really going to work out?’ I had to trust my gut and experience more than ever before because I didn’t want it to look like a normal explosion. That’s why it became a drawn-out slow-motion ballet which is visually telling you that Art is creating this explosion.”

Bang for your buck 
Close-ups were shot on the Phantom at 900fps, wider shots on a Sony Venice at 200-300fps which caused immediate lighting issues shooting such high speeds at night. Stunt performers were flung around the set for later digital augmentation, dozens of background details were added so VFX could distort them to show the shockwave exploding away from Art. The actors were also instructed to move in slow motion to enhance the effect. 
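The lighting problem is quantifiable. Assuming a 180-degree shutter, exposure time halves every time the frame rate doubles, so each jump in camera speed costs stops of light, as this rough sketch shows:

```python
import math

def stops_lost(fps: float, base_fps: float = 24.0) -> float:
    """Stops of light lost vs. 24 fps, assuming a 180-degree shutter
    (exposure time = 1 / (2 * fps)) and everything else unchanged."""
    return math.log2(fps / base_fps)

for fps in (24, 300, 900):
    print(f"{fps:>4} fps: shutter 1/{2 * fps} s, "
          f"{stops_lost(fps):.1f} stops less light than at 24 fps")
# 300 fps costs ~3.6 stops and 900 fps ~5.2 stops - hence the huge
# amount of light needed to shoot slow motion at night.
```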

Framestore (London and Montreal) completed around 400 shots on the show including all the most complex character effects. Image Engine in Vancouver and Outpost in Montreal added 600 ‘invisible’ FX including CGI of the large tanker in the finale. Distillery VFX in Vancouver worked on a car crash sequence. 

Moran works for Framestore but was loaned out to Netflix as overall VFX Supe on location in New Orleans for the show. 

“There are supervisors who are permanently freelance but increasingly a studio will pick a lead house to do most of the VFX and that house will nominate a supervisor to go client side and run the project from a VFX standpoint,” explains Moran. “The advantage is that if you are in house you know the pipeline and crew like the back of your hand. Vendors set up slightly differently to arrive at results, so close knowledge of a lead supplier gives you a shorthand to making key decisions.” 

He adds, “We didn’t use cloud on this show but it’s a trend that will definitely happen. The main hurdle is security. It’s not that cloud is insecure, but certain studios with major VFX projects are acutely sensitive to security. Suddenly the cloud makes even more sense as a necessity when artists have to work from home. They are basically windowing into remote servers, which causes slow-downs and lags, but cloud software and connectivity will become much more efficient.” 

Moran enjoys his own super power which, as with all the best VFX supervisors, is a mix of creativity and technical mastery. 

“At school I was really good at science and geeky but I was also into arty, floaty, theatrical stuff,” he says. “I struggled to choose a career. There seemed no way to combine them. I thought of being an actor or a scientist. So, I’m incredibly fortunate that this career offers a mind meld of both.  

“I’ve done a lot of photography and creative imagineering, but at the same time you have to take a brief and technically disseminate it to the team. That involves working out what is technically achievable and cost efficient, and involves a lot of to and fro to arrive at solutions. It’s this strange mix of the two disciplines that I love. 

“Over and above that, everyone always wants to do something different, so we’re constantly coming up with new schemes to produce something that no-one has seen before.” 

Wednesday, 14 October 2020

The evolution of distributed live production

Copywritten for Net Insight

https://netinsight.net/resource-center/blogs/kenth-evolution-of-distributed-live-production/

Over the past few months, agile and more streamlined production workflows have quickly become the norm. The media industry has adapted to fewer or no on-site staff and the need to produce from the home. This has accelerated the migration towards distributed productions, which depend on connected and elastic technological solutions.

Covid has also generated a lot of buzzwords – including the ones used above. There is a need for clarity about the practical implementation of remote distributed live productions.

We can take our cue from the latest webinar Open up to the Cloud featuring experts from Grass Valley and Net Insight.

It asks us to imagine the use case of a three-camera production of a touring car race. The commentator for the event is working from home, as are the race analysts and all operators, while the presentation comes from a studio. The director is in one country, the switcher operator in another, and the producer is also at home.

All video feeds are encoded and sent to the cloud wrapped in ARQ-based, industry-standard retransmission protocols (Zixi, SRT, RIST). These protocols assure that the streams arrive on time and at the quality required for the production.
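All three protocols share the same underlying idea: the receiver spots gaps in the packet sequence numbers and requests retransmission within a fixed latency budget. A toy sketch of that NACK-based ARQ loop at the receiver (not the actual wire format of Zixi, SRT or RIST, which each differ):

```python
import socket
import struct

RECV_ADDR = ("0.0.0.0", 6000)  # placeholder listening address
HEADER = struct.Struct("!I")   # 4-byte sequence number prefixed to payload

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(RECV_ADDR)

expected = 0
received = {}  # seq -> payload, held until gaps fill or latency expires

while True:
    packet, sender = sock.recvfrom(2048)
    seq = HEADER.unpack_from(packet)[0]
    received[seq] = packet[HEADER.size:]
    # Any gap between the next expected sequence number and what just
    # arrived is a loss: NACK each missing number so the sender can
    # retransmit it before the playout deadline passes.
    for missing in range(expected, seq):
        if missing not in received:
            sock.sendto(b"NACK" + HEADER.pack(missing), sender)
    expected = max(expected, seq + 1)
```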

A cloud-based transport and switching solution for live video such as Nimbra Edge ingests the streams. It ensures synchronization and makes media available to a media processing platform such as Grass Valley’s AMPP. Importantly, both Nimbra Edge and GV AMPP are cloud-agnostic. They can be deployed on any of the major cloud providers, on a private cloud, or a hybrid of the two.

The combination of GV AMPP and Nimbra Edge enables you to connect multiple locations and inter-cloud locations across the globe. It enables you to deliver a user experience on par with anything done in a traditional manner.

There are several ways of perfecting this interconnection. One of the simplest is to use Net Insight’s Edge Connect software which is designed to cloud-enable any third-party device or workflow.

Once handed off to GV AMPP, our sports production team can access all the apps that it has specifically requested for this workflow. In this case, the functionality is a switcher, clip player, audio mixer, multiviewer, and conversion tool.

All production staff working from home can switch the show live as if they were in a central hub. They can produce various programme and localized versions before handing the feeds back to Nimbra Edge. That’s useful because not only does Nimbra Edge take care of all inputs, but in the same system, at the same time, it takes care of output distribution too. The feeds are entirely selectable by the producers in accordance with rights holder contracts.

Far from the monolithic production blocks of old, this is a fully microservices-based infrastructure – giving the producer the power to call up apps on-demand.
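In practice, “calling up an app” in a microservices gallery reduces to an API request against an orchestration layer. The sketch below is hypothetical: the endpoint, payload and token are invented for illustration and are not a documented GV AMPP or Nimbra Edge API:

```python
import requests

ORCHESTRATOR = "https://example-orchestrator.invalid/api/v1"  # hypothetical
TOKEN = "operator-token"  # placeholder credential

def launch_app(app_type: str, event_id: str) -> dict:
    """Ask the (hypothetical) orchestration layer to spin up a production
    app - switcher, clip player, mixer - for a specific event."""
    resp = requests.post(
        f"{ORCHESTRATOR}/apps",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"type": app_type, "event": event_id},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# The producer's on-demand gallery: one call per function needed today.
for app in ("switcher", "clip-player", "audio-mixer", "multiviewer"):
    print(launch_app(app, "touring-car-race"))
```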

Among the many benefits of this approach is the ability to shift CAPEX to OPEX, using infrastructure-as-a-service to pay for just the resources you need for as long as you need them.

It means a technical director can work on several events in a single day, regardless of where in the world they are or where the events are taking place. On-air talent can cover multiple games without having to travel. This is the evolution of the centralized to a virtualized and decentralized gallery in action.

What we’re seeing is just the beginning.

Advances in media processing technology will deliver even greater power to the cloud. Producers will be able to interconnect to an array of production solutions and a community of marketplaces in the cloud revolutionizing how we create media today.

Monday, 12 October 2020

The genie is out of the bottle on production freedom and operational resilience

Copywritten for Blackbird

https://www.blackbird.video/uncategorized/the-genie-is-out-of-the-bottle-on-production-freedom-and-operational-resilience/


The outbreak of the pandemic sent everyone hurrying to keep the lights on. Remote production went from ‘nice to have’ niche to universal necessity. As we progress to what we hope is the aftermath, the global experiences of the C-level executives and of their employees mean that the world will never be the same again.

That’s because there’s been a fundamental shift in our attitude to technology – one which values both operational resilience and end user freedom above all.

The crisis initiated a cultural shock in which the long held norms of expensive office real estate and the spiralling carbon footprint of travel are being radically recalibrated. Flexible working, positive work-life balance and agile solutions for business continuity are here to stay. The genie has truly been let out of the bottle.

Companies now realise that whether it’s a second, third or fourth Covid wave, or indeed a whole new type of pandemic, such events can seriously impact their ability to operate and even put their overall survival at risk. By building in operational resilience, the effects of any such situation can be significantly mitigated.

Cloud video editing platform Blackbird wasn’t conceived around a pandemic but it’s no accident it can work in one. It was conceived around concepts of resilience and freedom – freedom from location, freedom from proprietary systems and hardware and the resilience to continue to operate with very little resource. 

Light footprint packing a powerful punch

The freedom from heavy infrastructure is key. Blackbird simply doesn’t require it, whether that’s bespoke hardware and equipment or heavy-duty bandwidth and power. It is ideal for those companies looking to reduce their technological footprint while packing a powerful punch.

For example, Blackbird’s competitors in the professional editing space, like Adobe Premiere and Avid, require a second piece of software, Teradici, to run as a virtualised system. That needs at least a 30Mb/s bandwidth connection, along with a local GPU, plus significant local storage and cloud storage to make it operable.

To add an editor with Blackbird you simply need a laptop and a domestic internet connection of just 2Mb/s. There’s no need for heavy editing suites, expensive bandwidth or lots of hardware and storage. Since no source media is moved, the ingress and egress costs are negligible and there are massive energy and time efficiencies.
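To put those bandwidth figures side by side, here is some back-of-envelope arithmetic (the eight-hour shift length is an assumption):

```python
# Rough data-per-editor comparison based on the bandwidth figures above.
HOURS = 8  # assumed length of an editing shift

def gb_per_shift(mbps: float, hours: int = HOURS) -> float:
    """Data streamed over a shift at a given bitrate, in gigabytes."""
    return mbps * 3600 * hours / 8 / 1000  # megabits -> gigabytes

proxy_editing = gb_per_shift(2)     # ~7.2 GB per editor per shift
virtual_desktop = gb_per_shift(30)  # ~108 GB per editor per shift

print(f"Proxy editing: {proxy_editing:.0f} GB/shift; "
      f"virtualised desktop: {virtual_desktop:.0f} GB/shift "
      f"({virtual_desktop / proxy_editing:.0f}x more)")
```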

Remote production, which pre-pandemic was euphemistically called ‘at home’ production, is now literally about working from home for talent, producers, operators, engineers. Video workflows are not just contributed to a central hub but truly decentralised.

Return to Play Plan

When lockdown hit and its hundreds of production staff were unable to travel into the office, A+E Networks doubled its capacity with Blackbird, allowing all production staff to access and create content from home. Suddenly without live sports, Arsenal F.C. uses Blackbird as a remote tool to access a central repository of content in order to satisfy, excite and fulfil the desires of the club’s tens of millions of fans.

One area of live sports that could continue to function was that of esports. However, production staff at Riot Games in Los Angeles are still unable to travel to their central facility and so use Blackbird to produce the world’s most popular esports tournaments remotely from their homes.

When live action sports resumed with socially distanced workflows, Australia’s National Rugby League (NRL) used Blackbird to rapidly clip, edit and publish match highlights to a variety of social platforms within as little as 30 seconds. In June, the National Hockey League (NHL) deployed Blackbird to provide remote video production capabilities to assist with the League’s Return to Play Plan.

Resilience and freedom are concepts that run through everything that Blackbird does.

With Blackbird, enterprise-scale production can continue to be performed from literally anywhere (a studio, your home, the office, a coffee shop) where there’s even minimal bandwidth. Facilities can keep working 24/7 and across time zones wherever talent is located for true distributed, collaborative production. Why travel to a production when production comes to you? Blackbird not only enables Covid-safe production to ride out the pandemic, it is permanently ‘green’, allowing carbon-neutral targets to be hit today.

Future-proofing has never been more apposite. At a time when organisations of all types are forced to operate on a decentralised basis, more and more content owners are turning to Blackbird to free their video production and equip themselves for whatever is thrown at them next.

Wednesday, 7 October 2020

Nvidia: The metaverse is coming

IBC

Jensen Huang, CEO of red-hot computer processing developer Nvidia, says the world is entering the age of AI. 

We are entering the age of AI, one which promises super-scale virtual worlds, autonomous vehicles and a swift end to global pandemics. It’s one in which artificial intelligence actually writes software and Nvidia claims to have cracked the code. 

https://www.ibc.org/trends/nvidia-the-metaverse-is-coming/6864.article

The tech company is best known for making specialist graphics processors for gamers. Recently it became the most valuable American chip company, darling of investors and a major force in robotics, scientific computing, data centres and the 5G edge. Its AI smarts will soon potentially be inside billions of smartphones, servers, PCs and consumer electronics. 

That follows the company’s US$40 billion swoop for British chip designer Arm Holdings from Softbank. The proposed takeover would be the silicon industry’s biggest deal ever. If ratified (and there are anti-competition hurdles to jump) the deal would pose a formidable threat to Intel, according to the WSJ. 

Apple, Qualcomm, Microsoft and Samsung are among companies licensing designs from Arm.  

“A few weeks ago we announced our intention to acquire Arm, the most popular CPU in the world,” said chief executive officer Jensen Huang, during a keynote to the virtual GPU Technology Conference earlier this week.  

“Together we can offer Nvidia accelerated AI computing technologies to the ARM ecosystem reaching computers everywhere.” 

Huang is not averse to hyperbole. The Taiwanese-American founded Nvidia in 1993 with just $40,000. Its designs for accelerating the graphics quality of video games brought instant success. In 2006, its invention of CUDA, a programming model that enabled multiple complex mathematical calculations to run in parallel, made its graphics processing units (GPUs) a staple in cloud computing, AI and data. It has shipped over 1 billion CUDA-capable GPUs.  

Its software development kits have been downloaded 20 million times. There are 6,500 start-ups building on Nvidia and more than 2 million Nvidia developers. The number of its GPUs used by major cloud providers like AWS and Azure now exceeds that of all cloud CPUs.  

“Within three years, Nvidia GPUs will represent 90% of the total cloud inference compute,” he said. “We are past the tipping point.” In July, the company overtook Intel on the NASDAQ to reach a market value of $251 billion. 

So, let’s take the hyperbole seriously. “If the last twenty years was amazing, the next twenty will seem nothing short of science fiction,” he prophesied. “The metaverse is coming.” 

First coined by author Neal Stephenson in his 1992 novel Snow Crash, and elaborated on by sci-fi like Ready Player One, the metaverse is conceived as the next-generation internet.  

“It is one where humans as avatars and software agents interact in a 3D space,” said Huang. “A VR successor to the internet.” 

World-building games like Minecraft and Fortnite presage the beginning of the metaverse. “Though they seem like games today, inhabitants of these early metaverses are building cities, gathering for concerts and events and connecting with friends,” said Huang. “Future worlds will be photorealistic, obey the laws of physics (or not) and be inhabited by human avatars and AI beings.” 

Not just a place to game, the metaverse “is where we will create the future… before downloading the blueprints to be fab’ed (fabricated with 3D printing) in the physical world,” he said.  

Nvidia’s ambition is similar to that of Magic Leap, Google and Apple, which have also articulated versions of a 3D internet, populated by avatars and governed by spatial computing. It wants a controlling share of this online future by launching Omniverse.

Described as a remote platform for simulation and collaboration, the Omniverse “fuses the physical and virtual worlds to simulate reality in real time with photorealistic detail,” Huang said. “The Omniverse is a world where AI agents are created. It’s a sim for robots. A place where robots can learn how to be robots.” 

Cloud-native and photoreal with path tracing and material simulation, the Omniverse allows designers, artists and even AIs to connect in a common world. “This is the beginning of the Star Trek Holodeck, realized at last.” 

Now in open beta, the tech has been evaluated by users including Industrial Light & Magic and is based on technology originated at Pixar called Universal Scene Description. This is a format for universal interchange between 3D applications from Epic (Unreal Engine), Adobe and Autodesk (Maya).  

Virtual production technology company Lightcraft Technology has also evaluated the platform. Its chairman, Bill Warner (who is also the founder of Avid), says they decided to base their entire product line on the technology. 

“Omniverse represents the platform of the future for all aspects of virtual production,” Warner said. 

Francois Chardavoine, VP of technology at Lucasfilm and ILM, testifies of Omniverse: “The potential to improve the creative process through all stages of VFX and animation pipelines will be transformative.” 

However, even the Omniverse and the metaverse pale beside the firepower of AI. According to Huang, “As mobile cloud matures, the age of AI is beginning. AI is the most powerful technology force of our time.” 

He argued, “Computing is the technology of automation. Software algorithms automate. Automation drives productivity and growth for industries. Automation at large scale leads to breakthroughs. [But] we are limited by our ability to write software. Finally, after decades of research into deep learning, the abundance of data and the powerful computation of GPUs came together in a big bang of modern AI. Now software can write software. AI is the automation of automation.” 

He acknowledged that the software written by an AI is very different to that written by a human (“it is vastly more parallel and a thousand to millions of times more compute intensive”) but claimed the ability to power the computers needed for it are in Nvidia’s wheelhouse. 

“AI requires a whole reinvention of computing, from algorithms to the whole ecosystem. The age of AI has begun and Nvidia is advancing it.” 

An example of its god-making power is medical research. He suggested that with AI powered by Nvidia GPUs, scientists can use simulation to accelerate the search for cures to diseases.  

He announced Nvidia would build the fastest supercomputer in the UK. Cambridge-1, in Cambridge, would have 400 petaflops of AI performance, putting it in the top 30 supercomputers in the world. Research partners include AstraZeneca and the NHS. 

It recently launched Ampere, a GPU architecture which fuses programmable shading, ray tracing and AI, capable of generating photoreal graphics at the highest frame rates and solving the traditional trade-off video game developers had to make. 

“It’s the fastest upgrade in our history. Everything we ship is instantly sold out.” 

He also launched a cloud streaming video AI platform whose first application is video conferencing. This will take advantage of an Nvidia-developed ‘conversational AI’ called Jarvis whose neural text-to-speech is described as “human like.” 

“Using AI we can perceive the important features of a face, send only the changes of the features and reanimate your face at the receiver – to save bandwidth,” he said. “AI can reorientate your face to look as if you’re making eye contact with each person on the call individually. It can realistically animate an avatar based on just the words you are speaking.” 

He added, “We have an opportunity to revolutionise video conferencing and invent the virtual presence of tomorrow.”  
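The scale of the saving Huang describes follows from what is sent per frame. A rough sketch of the payload arithmetic (the landmark count and precision are assumptions; Nvidia has not published the wire format):

```python
# Raw video frame vs. facial-landmark payload, compared per frame.
WIDTH, HEIGHT = 1280, 720
BYTES_PER_PIXEL = 1.5  # 4:2:0 chroma subsampling

raw_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL  # ~1.38 MB uncompressed

LANDMARKS = 68          # assumed facial keypoint count
BYTES_PER_POINT = 2 * 4  # x, y as 32-bit floats

feature_frame = LANDMARKS * BYTES_PER_POINT   # 544 bytes

print(f"Raw frame: {raw_frame / 1e6:.2f} MB, "
      f"landmarks: {feature_frame} bytes "
      f"(~{raw_frame / feature_frame:,.0f}x smaller, before codec gains)")
```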

More than 30,000 developers have registered for the virtual GTC to check out more than one thousand talks on topics including autonomous machines, VR and virtualisation. There are also webinars related to latest AI developments in media and entertainment driven by Nvidia tech. Among them is cloud-based tool Kamua which automates resizing, cropping and repurposing videos from desktop to mobile apps like TikTok, Snapchat and Instagram. It claims to have users who have cut manual workflows from 16 hours to 15 minutes by embracing its automation software. 
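The geometric core of that desktop-to-mobile repurposing is finding a 9:16 crop window inside a 16:9 frame. A generic sketch of the aspect-ratio maths (not Kamua’s algorithm, which also tracks subjects automatically):

```python
def crop_window(src_w: int, src_h: int, target_ratio: float,
                center_x: float = 0.5):
    """Largest crop with the target width/height ratio, positioned
    horizontally around center_x (0..1). Returns (x, y, w, h)."""
    crop_h = src_h
    crop_w = round(crop_h * target_ratio)
    if crop_w > src_w:  # target wider than the source: crop height instead
        crop_w, crop_h = src_w, round(src_w / target_ratio)
    x = round(center_x * src_w - crop_w / 2)
    x = max(0, min(x, src_w - crop_w))  # clamp inside the frame
    y = (src_h - crop_h) // 2
    return x, y, crop_w, crop_h

# 16:9 HD frame to a 9:16 vertical clip, subject tracked at 60% across:
print(crop_window(1920, 1080, 9 / 16, center_x=0.6))  # (848, 0, 608, 1080)
```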

IDenTV provides advanced video analytics based on AI capabilities powered by computer vision, automated speech recognition and textual semantic classifiers. 

Another demo, from Taiwanese start-up A.V. Mapping, shows how AI enables composers to use its video and music mapping platform “to shorten the music-for-video search by nearly two thousand times.”       

Thursday, 1 October 2020

Netflix’s Tiny Creatures: Pushing Narrative With HDR Color Grade

Post Perspective 

Netflix’s Tiny Creatures is a natural history drama with an anthropomorphic approach to telling stories that show the harrowing life-and-death challenges that small animals face.

https://postperspective.com/netflixs-tiny-creatures-pushing-narrative-with-hdr-color-grade/

“This is a new genre,” says Jonathan Jones, creative director of Ember Films, as well as a director and DP. “There’s nothing else like it around. It’s not a documentary and it’s not completely fiction. I call it an animal drama.”

The eight 25-minute episodes tell various stories about the lives of creatures in a narrative style more common to a feature film.

Jones called on more than 20 years of experience working with and photographing animals for natural history programming when planning Tiny Creatures. His credits include the flagship BBC series Planet Earth II, for which he won an Emmy Award, as well as Seven Worlds, One Planet and National Geographic’s One Strange Rock.

“I understand what motivates these animals, whether they are nocturnal or diurnal, and the science behind their physiology,” he says. “I put that experience into the script so that our filming approach could be plausible and technically achievable.”

Mini Movies
This grounding allowed Jones to push each story’s drama using the tropes of cinema. A rolling ball chasing a little mammal, for instance, recalls Raiders of the Lost Ark. Water pipes burst as a mouse escapes in the nick of time. A hawk looms over a kangaroo rat like a dinosaur in Jurassic Park. And a scene with an opossum in a hen house has nightmare overtones.

With producing partners Blackfin and Momentum, Jones developed the script and formulated narrative arcs for the little creatures, giving them a voice based on their real-world habitats. For example, Episode 1 plays out like a Western, only the stars aren’t a cowboy and a bank robber — they are a rat and a snake.

The landscapes were shot in various environments at locations throughout the US. Episode 1 was filmed entirely on location in Arizona, but for the other episodes the animals were filmed at Ember Films’ studio in England, where the habitats were rebuilt as sets (e.g., New York apartments, wood lodges, sewer systems, barnyards).

“We painstakingly storyboarded every shot to be frame-accurate so that the entire shoot would unfold just as we scheduled it,” he says. Video backplates, including footage from drones plus high-resolution stills, were captured at each location “to make everything sing visually.”

Double-Pass Technique
These elements were used as a style guide to recreate the set and tune the lighting to mimic the real environment. For example, Ember used 10-foot by 6.5-foot print enlargements mounted outside the windows of the stage, rather than greenscreen, to reveal the external world.

Other in-camera tricks included blurring depth of field so that the eye concentrates on the foreground and only subconsciously sees the background; adding physical mid-ground sections between studio-shot foreground and background plates; and making rapid cuts so the viewer doesn’t have time to interrogate the picture.

However, it was another SFX technique that enabled the shots of heroic creatures appearing to be in dire peril of capture by predators, including a snake, an owl and a hawk. Jones explains how the “double-pass” technique ensured the animals’ safety. “We photographed one animal and then made that animal safe. With the camera angle locked, and provided nothing in the set was moved, we filmed the second animal so that when we put the frames together, it looked like a predator chasing its prey.”
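With the camera locked off, combining the two passes is conceptually a straightforward matte composite. A minimal NumPy sketch, assuming mattes have already been pulled (real compositing adds edge treatment, grain matching and so on):

```python
import numpy as np

def double_pass_composite(pass_a: np.ndarray, pass_b: np.ndarray,
                          matte: np.ndarray) -> np.ndarray:
    """Combine two locked-off plates. `matte` is a float array in [0, 1],
    1.0 wherever the animal from pass_a should appear. Both plates share
    the same (H, W, 3) shape."""
    m = matte[..., np.newaxis] if matte.ndim == 2 else matte
    return pass_a * m + pass_b * (1.0 - m)

# e.g. the predator plate over the prey plate, held together by the
# locked camera angle (random arrays stand in for real frames here):
h, w = 1080, 1920
plate_predator = np.random.rand(h, w, 3)
plate_prey = np.random.rand(h, w, 3)
matte = np.zeros((h, w))
matte[:, :960] = 1.0  # toy matte: predator occupies the left half
frame = double_pass_composite(plate_predator, plate_prey, matte)
```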

This was only possible with a workflow that brought post production to the set. “To ensure that the two frames match when stitched together, rushes from each take are edited in the moment,” he says. “Some shots would have eight or nine layers of 8K footage to composite into one shot.”

His tool of choice was the Red DSMC2 camera with the Helium sensor. “I wanted to shoot the highest resolution possible,” notes Jones, who shot at 6K to 8K. “For slow-motion, we used the Phantom Flex4K often at 1000fps.”

The data requirements were eye-watering. Everything was kept raw for post in Adobe Premiere. Each day’s shoot averaged 3TB to 4TB. Each episode totaled about 80TB running online, mirrored for backup and archived on LTO. Adding to the challenges of this production, the episodes set in Washington and Louisiana were filmed at the same time with dual post workflows.

HDR Master
Tiny Creatures is also mastered in Dolby Atmos and in HDR. The color pipeline was devised and managed by Toby Tomkins, colorist and co-founder at London boutique Cheat.

“Jonathan really wanted to push the narrative with the HDR grade, which was really fun,” Tomkins recalls. “For example, when there was a shift from night to daytime or dark interior to sunny exterior, he really wanted the audience to feel the change, with iris-closing dynamics to really push a heightened realism not possible in SDR.”
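The headroom Tomkins describes comes from the HDR transfer function itself: SDR grades top out around 100 nits, while the PQ curve (SMPTE ST 2084) used for HDR masters maps code values all the way to 10,000 nits. A sketch of the PQ EOTF shows the range:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal value -> luminance in nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value (0..1) to display luminance (cd/m2)."""
    p = signal ** (1 / M2)
    return 10_000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for v in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {v:.2f} -> {pq_eotf(v):8.1f} nits")
# A 1.0 code value maps to 10,000 nits, while SDR's whole range sits
# below ~100 nits - which is why a cut from dark interior to sunlit
# exterior can carry a genuinely eye-adjusting jump in HDR.
```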

Alongside Tomkins was Cheat colorist Jack McGinity, who developed the base look together with Jones and graded three episodes as well. Cheat calls on Blackmagic Resolve for color grading.

Tomkins continues, “The stars of the show were the animals, and we leveraged the enhanced dynamic range and color volume to amplify the texture and colors of the animals to add depth and really make them ‘pop.’ We ultimately made the decision to prioritize the HDR grade, as we believed most viewers would benefit from the enhanced experience.”

Lighting
Shooting with the Red and Phantom cameras at the same time necessitated a lot of light to compensate for the fast frame rates. To that end, they employed large HMI lights as distant sun sources, but many of the scenes were lit by LED Dedolights, which emit no heat and can have their color temperature changed with ease. Other fixtures included ARRI SkyPanels for softer light and circular Rotolights.


“Circular rather than square lights are useful for highlighting the eyes and looking like a reflection of the sun,” Jones shares. “The animals are our superstars, and the temperature of the lights was always monitored.”

He continues, “If an animal wasn’t feeling ready on set, we would never push it. We always had three things scheduled at any one time as a backup in case we couldn’t go with plan A. The biggest challenge was the logistics in taking a different approach for every animal and for each episode.”

Finishing Under Lockdown
The color grade was a complex job, but due to the global pandemic, it had to be completed under lockdown conditions. “Fortunately, we had already set the tone and look for the series with Toby,” Jones relates. “When COVID forced social distancing, we used Streambox to communicate, and I had a color HDR monitor manually calibrated with Toby’s Resolve. It took longer to do reviews, but it’s testament to Toby and his skill that we made it work.”

Actor Mike Colter (Luke Cage), located in the US, had voiceover work on Episode 8 to finish when the work-from-home mandate struck. Jones says, “He recorded the episode on his iPhone while sound-proofed in his daughter’s closet using a new microphone we bought online. He had to stop at the end of each page and upload it.

“Everyone on this production worked so hard to make it what it is. This is a new genre using some of the highest-resolution cameras around, with a truly cutting-edge post workflow. Everything was a real challenge but the potential for extending the format geographically in the US or anywhere in the world, as well as changing the scale of the creatures, is exciting.”

Jones concludes, “For me, though, it all comes back down to trying to engage families in wanting to ask questions about the natural world. Hopefully by enjoying this shorter, higher-octane show, they will dig further into whole seasons of amazing shows like Our Planet and Planet Earth.”