Tuesday, 7 November 2017

Graphics & Editing roundup


Avid moves to the cloud, Adobe offers collaborative editing and Blackmagic’s focus is on real-time composites.

The product strategy of most production and post-production vendors continues to shift functionality and services into the cloud. Perhaps the biggest thing in Avid’s favour is that its brand offers a familiar and trusted on-ramp for content creators, facilities and broadcasters moving to the cloud.

At IBC, the company debuted MediaCentral, a browser-accessed suite of tools that promises to help create, distribute, and manage content using one common platform.
According to Chairman & CEO Louis Hernandez, Jr., Avid has spent half a billion dollars on re-engineering its product for the cloud.

He said, “We’ve not just completed a full transformation; we have created a platform that is flexible and open and agile enough to allow us to move forward.”

The MediaCentral production suite is the firm’s centrepiece. It costs from $20,000, includes Media Composer and Nexis storage, and has modules and apps for editorial, news, graphics, and asset management as well as an array of media services and partner connectors.
Nor is it exclusively an Avid domain. One of those ‘partner connectors’ could be Grass Valley or Adobe, should Premiere be your craft tool of choice.

“MediaCentral delivers a unified view to all media – whether stored on Avid or another platform,” said Hernandez. “The user experience is common for everyone.” Avid is now rolling out web-based applications like search, edit, logging and publishing.

Media Composer VM debuted, letting a user access a cloud-based version of Media Composer from a tablet or computer. Media Composer Cloud Remote is an application that runs on the local computer but accesses content located in the cloud.

Microsoft is Avid’s preferred cloud provider. Dana Ruzicka, VP and chief product officer, said “The move to cloud is imminent for almost all of our customers – but it’s not a lift and shift. It’s an evolution based on their business needs. This really is the beginning of the cloud era for our industry.”

Pro Tools, news workflows and archiving are already available via Azure. Avid is also allowing users to tap into Microsoft’s machine learning tools, for example, facial recognition and social media analytics.

Avid also unveiled Maestro Sports, described as an all-in-one broadcast graphics and video control system targeting live sports production. Its selling points include operation by a single person, so broadcasters can save on costs. An operator can integrate tracked-to-field graphics and augmented reality to highlight and analyze plays, as well as visually tie sponsors into the game. From the same UI, the operator can create virtual formations and starting lineups, 9-metre circles, distance-to-goal indicators, and even virtual elements such as team logos, 3D objects, and advertising.
The system includes a video server, for graphics and video rendering, and playout as well as connection to external storage such as Avid NEXIS in the studio.  

Blackmagic Design’s main focus at IBC was on the Ultimatte 12, the latest version of the keyer, a brand Blackmagic acquired this time last year.

Blackmagic prefers to describe Ultimatte 12 as an advanced real-time compositing processor. It is for creating composites with both fixed cameras and static backgrounds, or with automated virtual set systems. It can also be used for on-set pre-visualization, allowing actors and directors to see the virtual sets they’re interacting with while shooting against green screen.

Ultimatte 12 also features one-touch keying technology that analyzes a scene and automatically sets over a hundred parameters to make live compositing easier. Based on a new hardware processor, the unit costs under $10,000 and is available now. Ultimatte 12 is controlled via Smart Remote 4, a touch-screen remote that connects via Ethernet. Up to eight Ultimatte 12 units can be daisy-chained together and connected to the same remote, which costs an additional $3,855.

Announced at NAB and now released are DaVinci Resolve 14 and DaVinci Resolve Studio 14. Enhanced editing capabilities, multi-user abilities, more native OFX plug-ins and, certainly not to be glossed over, the inclusion of Fairlight all combine to make Resolve and Resolve Studio 14 a major post production tool. The price of Resolve Studio has been lowered from $995 to $299, making the jump from the still-free Resolve to the more feature-laden Studio easier.

New features coming to Adobe Premiere Pro via Creative Cloud include shared projects, which are like bins in Avid. These allow different subprojects to all be stored inside a master project. This means editors can work on different scenes at the same time while all having access to each other’s subprojects, with different permissions set for who can read/write versus who can only read and refresh changes. Other useful new features include more marker colours, close gaps and font previews. This could be a ‘game changer’ for Adobe’s place in the multi-editor workflow of large feature films and TV shows, which Avid has dominated.

VR integration between Premiere Pro and After Effects is made possible by Immersive, a toolset which includes Adobe’s new VR Comp Editor in After Effects, with the ability to transform 360-degree footage into flat rectilinear shots. In other words, it allows editors to review their timeline and use keyboard-driven editing for trimming and markers while wearing the same VR headsets as their audience. It will work either on screen or, more appropriately, with an Oculus Rift or HTC Vive headset. Immersive effects apply Blur, Glow, Sharpen, Denoise and Chromatic Aberration filters in a VR-aware version, allowing for proper results that a flat image filter could not produce.
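Adobe doesn’t publish the maths behind the VR Comp Editor, but the equirectangular-to-rectilinear projection that underlies this kind of 360-to-flat conversion is well established. A minimal NumPy sketch, with the function name and parameters invented for illustration: each output pixel is treated as a ray from a virtual pinhole camera, the ray is rotated by the view’s yaw and pitch, and the ray’s longitude/latitude index back into the equirectangular source frame.

```python
import numpy as np

def equirect_to_rectilinear(frame, yaw=0.0, pitch=0.0, fov=90.0,
                            out_w=640, out_h=360):
    """Sample a flat (rectilinear) view out of an equirectangular 360° frame.

    Hypothetical helper for illustration; nearest-neighbour sampling only.
    """
    H, W = frame.shape[:2]
    f = 0.5 * out_w / np.tan(np.radians(fov) / 2)  # pinhole focal length, px

    # Pixel grid of the output view, centred on the optical axis.
    x = np.arange(out_w) - out_w / 2
    y = np.arange(out_h) - out_h / 2
    xv, yv = np.meshgrid(x, y)

    # Rays through each output pixel (camera looks along +Z).
    dirs = np.stack([xv, yv, np.full_like(xv, f, dtype=float)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

    # Rotate rays by pitch (about X) then yaw (about Y).
    p, q = np.radians(pitch), np.radians(yaw)
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(p), -np.sin(p)],
                   [0, np.sin(p),  np.cos(p)]])
    Ry = np.array([[ np.cos(q), 0, np.sin(q)],
                   [0, 1, 0],
                   [-np.sin(q), 0, np.cos(q)]])
    dirs = dirs @ (Ry @ Rx).T

    # Ray direction -> spherical coords -> source pixel indices.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])       # -pi .. pi
    lat = np.arcsin(np.clip(dirs[..., 1], -1, 1))      # -pi/2 .. pi/2
    src_x = ((lon / np.pi + 1) / 2 * (W - 1)).astype(int)
    src_y = ((lat / (np.pi / 2) + 1) / 2 * (H - 1)).astype(int)
    return frame[src_y, src_x]
```

Production tools add bilinear or bicubic filtering and GPU acceleration, but the projection geometry is the same.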

In addition, audio will be determined by orientation or position and exported as ambisonics audio for VR-enabled platforms such as YouTube and Facebook. VR effects and transitions are now native and accelerated via the Mercury playback engine.
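Orientation-dependent audio of the kind described here is typically delivered as first-order ambisonics: a B-format sound field (W, X, Y, Z channels) is rotated against the listener’s head yaw, then decoded to speakers or headphones. A hedged sketch, with invented function names and a deliberately simple virtual-microphone stereo decode, not Adobe’s actual implementation:

```python
import numpy as np

def rotate_bformat(w, x, y, z, yaw_deg):
    """Rotate a first-order B-format field about the vertical axis.

    Compensating for head yaw means rotating the sound field the
    opposite way; W (omni) and Z (height) are unchanged by yaw.
    """
    t = np.radians(yaw_deg)
    xr =  np.cos(t) * x + np.sin(t) * y
    yr = -np.sin(t) * x + np.cos(t) * y
    return w, xr, yr, z

def decode_stereo(w, x, y, spread_deg=90.0):
    """Decode to stereo with two virtual cardioid mics at +/- spread/2."""
    a = np.radians(spread_deg / 2)
    left  = 0.5 * (w + np.cos(a) * x + np.sin(a) * y)
    right = 0.5 * (w + np.cos(a) * x - np.sin(a) * y)
    return left, right
```

Platforms such as YouTube accept this four-channel B-format directly and perform the rotation and binaural decode themselves as the viewer turns their head.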

“Video is the fastest growing medium in communications and entertainment,” said Bryan Lamkin, SVP of Digital Media at Adobe. “Adobe is breaking new ground in collaboration, social sharing and analytics to accelerate workflows for all video creators.”

Brainstorm’s InfinitySet 3 virtual set solution now includes the graphics editing and creation features of Aston, so it can edit, manage and create 2D/3D motion graphics when required.
The incorporation of Aston graphics allows for direct editing not only of Aston projects but of their individual elements, animations, StormLogic, data feeds and so on, as if they were in Aston. This means that in broadcast operation, or while on-air, InfinitySet 3 operators will be able to adjust any object in the scene, even those with added properties such as animations or external data feeds, without having to do so in a separate Aston application.

The package supports third-party render engines such as the Unreal Engine. As it is integrated with the Brainstorm eStudio render engine, this allows InfinitySet 3 to control in real-time the Unreal Engine’s parameters such as 3D motion graphics, lower-thirds, tickers, and CG.

The product also comes with technologies such as 3D Presenter, TeleTransporter, HandsTracking, and FreeWalking, all of them using Brainstorm’s unique TrackFree technology. Especially interesting is a VirtualGate feature, which allows for the integration of the presenter not only in the virtual set but also inside additional content within it, so the talent in the virtual world can be ‘teletransported’ to any video with full broadcast continuity.

Sixty has a slick interactive TV graphics system that layers on top of existing high-end graphics displays (from companies such as ChyronHego, Vizrt, or Ross). This creates a simplified workflow for producers while offering a highly interactive graphical display for users on a touch-screen tablet or phone.

Users can access loads of advanced analytics and Sixty fully integrates with data services from Opta, STATS, and more. The software also places monetisation at the forefront, allowing for producers to automate ads tied to events such as goals and even prompt sales of products such as jerseys or other items that are tied to real-time moments in the stream.

ChyronHego has a freshly minted integration of its Silver robotic camera head with the RoboRail straight camera rail system from Mo-Sys for a camera tracking and augmented reality graphics solution. Silver is part of ChyronHego's range of virtual studio/AR trackers that provide real-time, precise camera motion within 2D or 3D computer-generated backgrounds.

Installed on the RoboRail, mounted vertically or horizontally on a wall, ceiling, floor, or even the anchor's desk, the system makes it easy for news productions to enliven their newscasts with visually compelling AR graphics. MOS interfaces with newsroom computer systems enable ChyronHego's CAMIO graphic asset management server to deliver the AR and virtual set graphics to air, together with the right assets and camera motions.

Google has spent $30 billion on building out its cloud services and is now pitching them to media companies. “For us, cloud is about scale and there is no limit to what media can do,” explained Jeff Kember, Technical Director Media, Google Cloud, at IBC. “We have an end-to-end production workflow from camera raw and debayering through image recognition, editing and on to archive and disaster recovery.”

The editing he mentioned refers to Google’s ability to apply machine learning to data in the cloud. For example, it has a research application allowing production teams to automatically generate highlights reels from uploaded rushes – potentially shortcutting the bespoke systems of Avid or EVS.

“Our goal is not to replace human editors but to make the process more automated,” said Kember of what he called ‘intelligent clipping.’
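Google hasn’t detailed how its ‘intelligent clipping’ works, but a common baseline for automated highlights is to score footage per second (for instance with an image-recognition model) and then pick the top non-overlapping windows. A purely illustrative sketch, with an invented function name and a greedy selection strategy:

```python
def top_highlights(scores, clip_len=10, n_clips=3):
    """Greedily select up to n_clips non-overlapping windows of
    clip_len seconds with the highest summed per-second interest score.

    Hypothetical sketch; real systems use learned models, shot-boundary
    detection and smarter selection than this greedy pass.
    """
    # Score every possible window of clip_len consecutive seconds.
    windows = []
    for start in range(len(scores) - clip_len + 1):
        windows.append((sum(scores[start:start + clip_len]), start))
    windows.sort(reverse=True)  # best-scoring windows first

    # Greedily keep windows that don't overlap an already-chosen one.
    chosen = []
    for score, start in windows:
        if all(abs(start - s) >= clip_len for s in chosen):
            chosen.append(start)
        if len(chosen) == n_clips:
            break
    return sorted(chosen)  # clip start times in timeline order
```

The per-second scores would come from the cloud-side machine learning Kember describes; the editor then reviews the candidate clips rather than scrubbing the full rushes.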

Its cloud rendering was used by the facility MPC in London, together with the shooting stage in LA, in the creation of The Jungle Book. “VFX companies are using Google for simulation and beginning to expand their entire production workflow into the cloud. Ultimately, digital intermediate colour correction will go this way too. If you have just released a 2K version of a show and need to do a 4K version you can pull this data from the cloud in minutes, perform all the DI, master the HDR and use Google to distribute.”
