Friday 28 August 2020

Continuity and invention in a crisis

 AV Magazine


As the media and entertainment industry gradually unlocks from months of quarantine it is doing so into a world which has been permanently changed. Above and beyond the social distancing and sanitary measures which will be routine for an indeterminate period, the film and TV business faces an uncertain future which stretches from the foundations of production to the economics of consumption.

https://www.avinteractive.com/features/av-live/continuity-invention-crisis-27-08-2020/

Post production and VFX companies were not required to close during the lockdown. Aside from principal photography, all of the post aspects of programme creation were able to continue, albeit generally slower than normal, and with a degree of success which will see many of the processes remain in place long after lockdown.

“Everyone says that necessity is the mother of invention, but Covid-19 has taught us that necessity is actually the mother of adoption,” says Shaun Wilton, director, Anna Valley. “Cloud software, remote workflows, extended reality, live streaming – none of these are new technologies, they’ve all been available for quite some time and were slowly rolling out across the industry, but the pandemic has massively accelerated their adoption.”

There has probably never been more need for content than during lockdown. We’ve all been watching more of it than ever before but, conversely, the stuff we like to watch most – sports and soaps – has seen production grind to a halt, so in our lust for content we’ve been forced to find other things to watch. Most of that content has been streamed rather than broadcast, and a lot of it has relied on technology that was previously not considered suitable for global audiences.

“Whereas fifteen years ago live content was broadcast, from a studio, via an OB truck, over satellite, in 2020 producers and audiences have adapted to content streamed over an internet connection, from a studio-in-a-box, set up in someone’s living room. And sometimes you can’t tell the difference,” says Wilton.

Editing
The bread and butter of post production is thankfully the easiest to deliver in a remote scenario. Media can be worked on in proxy (lower resolution) form accessed from the cloud over the public internet.
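Conceptually, the proxy step is simple enough to sketch. Here is a minimal illustration in Python, assuming ffmpeg is installed and using invented paths and settings:

```python
# Minimal sketch: generate low-resolution proxies for remote offline editing.
# Assumes ffmpeg is on the PATH; paths, codec and scale are illustrative only.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("camera_originals")   # hypothetical full-resolution media
PROXY_DIR = Path("proxies")             # small enough to stream over the public internet
PROXY_DIR.mkdir(exist_ok=True)

for clip in SOURCE_DIR.glob("*.mov"):
    proxy = PROXY_DIR / f"{clip.stem}_proxy.mp4"
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=-2:540",          # quarter-ish resolution keeps bitrates low
        "-c:v", "libx264", "-crf", "28",
        "-c:a", "aac", "-b:a", "128k",
        str(proxy),
    ], check=True)
```

Editors cut against these lightweight files, and the facility relinks the finished sequence to the full-resolution originals for the final conform.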

London facilities Rapid Pictures and Radiant Post Production, for example, were able to link storage on premises into a data centre over fast 10GigE fibre. They used the data centre, run by storage vendor ERA, as a hub to spin up 20 remote workstations at the homes of clients and those of its editors so that projects including inserts for Channel 4’s Sunday Brunch could continue to air.

“We managed to run the offline edits without the need to send out any equipment to clients’ homes,” explains Ben Plumb, managing director at Rapid/Radiant. “We can send them a link to Teradici client software (a PC-over-IP display protocol that enables access, browse, playback and sharing of media) and supply log-in details which gets them to a Media Composer offline on our network.”

Media-specific video conferencing technology such as Evercast lets you mimic the edit bay experience, with both parties seeing the same screen and output.

Grading
It isn’t difficult to grade remotely either, but the Achilles heel of colour correction is the need for precisely calibrated monitoring. This matters less in the build-up stages of the grade, where directors, executive producers and DPs increasingly prefer to view sequences on a laptop or iPhone while they are busy working elsewhere, out of sheer convenience. Errors can creep in at the final review and approval stage, especially if High Dynamic Range is required.

“Small inconsistencies or changes can have knock-on effects,” advises Fergus McCall, head of colour, The Mill NY. “That said, it could depend on the final destination for the content. If the production is bound for an online service like YouTube or Vimeo, for a corporate or commercial client, then it could be appropriate to review on that exact platform – as the intended audience would see it.”

What it is categorically not possible to do, he insists, is to make final decisions on long-form feature or high-end drama projects. “For these, a tablet isn’t going to cut it.”

A route around this is provided by Sohonet’s ClearView Flex, which permits shared viewing of remote screens with virtually no lag. This can be combined with Moxion for viewing HDR rushes on set, but only in offline mode unless all parties have the same expensive, exactly tuned monitors. This gap could be closed as more affordable high-quality OLED TVs supporting Dolby Vision HDR come to market.

Audio mixing
Voiceovers, ADR and final mixes can also be done remotely. In fact, some mixers have been working at their home studios for years as clients rarely attend sessions until the final tweaks.

“Under Covid-19, sign-off can all be done remotely over the phone, via Source Now, Skype, Source Live or the Farm’s web-based collaboration tool, Fred Live,” says Nick Fry, head of audio at The Farm. “For some content this works really well but for dramas and high-end docs it’s not always a great fit. We’ve used various platforms to facilitate remote working in our audio workflows. To get the media on to our local systems we used Media Shuttle. This let us connect to Farm servers and download media. For the mix, no extra tech was needed.”

Under Covid conditions the biggest issue is that broadband speeds to the home are just not quick enough, particularly for uploading media. “It just isn’t viable to stream a live Pro Tools session either through Avid’s own cloud offering or via the Farm’s network through a virtual private network – certainly not the large sessions we have,” says Fry.
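The arithmetic makes the point. Assuming, purely for illustration, a 100 GB session and a 10 Mbit/s domestic uplink (both figures invented, but representative):

$$t = \frac{100\ \text{GB} \times 8{,}000\ \text{Mbit/GB}}{10\ \text{Mbit/s}} = 80{,}000\ \text{s} \approx 22\ \text{hours}$$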

Soundwhale is a remote audio collaboration app that lets engineers match sound to picture, and lets actors with no audio experience record their lines, with no new hardware or additional specialised software required.

A simple stereo mix can be done effectively with high-quality desktop speakers at home, but the further you move up the scales of surround mixing – through 5.1 or 7.1, and into specialist areas such as delivery for Dolby Atmos or 3D soundscapes for cinematic-quality audio – the more you’d want a facility with multi-speaker monitoring set up for precise, positional mixing.

Toward cloud
If nothing else, the crisis has proven the prudence of having a business continuity policy centred on routing media to remote locations. The end game is to transition the functions of storage and compute power that currently reside at dedicated facilities into the cloud.

“We envisage finishing post remaining an on-premise service with offline being remote where it offers an advantage to clients,” says Jai Cave, technical operations director at Envy, one of Soho’s largest post groups. “We see demand increasing for a shared approach to remote offlines, where clients spend part of the week in the facility, then part of the week at home, giving them personal flexibility whilst still prioritising creative in-person communication.”

A similar strategy is in play at documentary specialist Roundtable. Its servers and workstation hardware will remain on-prem and accessible remotely.

“It gives us the best of both worlds in having a facility for clients as needed and the flexibility to work anywhere,” says Jack Jones, technical director and colourist.

Roundtable has invested in Intel NUCs, mini-PCs attached to the facility’s network that allow an editor working remotely to access data held at the facility. “It means media on our machines can be accessed by clients in full resolution, in multiscreen and with true keyboard interaction,” Jones says. “We might have ten offlines running on physical hardware and run more on an ad hoc basis in the cloud without the client being aware of any difference in user experience.”

Other facilities have already made the full switch to cloud. Untold Studios launched as the world’s first completely cloud-based creative hub in 2018, running all desktops for editorial, VFX and finishing in AWS with fast connectivity from its premises to the cloud over a private Sohonet pipe.

“We’re never going to get into a situation where we have to turn work away because we don’t have capacity,” says head of technology, Sam Reid. “Our studio is not constrained by the physical location of our data, so we don’t have to have artists in the studio working on content – they can be anywhere in the world. That was as true before the pandemic as it is now.”

Mindset change
Beyond short-term social distancing, the pandemic will permanently shift workflows away from centralised organisations to the flexible aggregation of resources and talent located anywhere on the globe.

“The experience has opened the eyes of clients to what is feasible,” says Plumb. “Some had been a bit reluctant to try remote before but now they’ve seen it in action, it will give everyone greater flexibility to work from home or a home office in future. It won’t be appropriate for every project, but remote will definitely be part of our offering going forward.”

Nonetheless, it is the realtime interaction of director or DP with colourist, audio mixer or picture editor that is hardest to replicate, making bricks-and-mortar facilities essential to the creative process for some time to come.

“There is no substitute for being in the same room to get the creative juices flowing,” says TV and feature film editor, Steve Mirkovich. “Those off-the-cuff comments, nuances and brainstorming – the spontaneity – that gets lost remotely.”

Virtual sets & integrated production
The limitations that the pandemic presents have also provided opportunities to address global sustainability.

“Social distancing has forced us to adopt cloud workflows so media is uploaded directly to the cloud from set, edited remotely and reviewed by multiple stakeholders without anyone needing to leave their homes,” says Wilton. “Similarly, with travel restrictions making it almost impossible to shoot in foreign locations, more productions are turning to mixed and extended reality to recreate environments that they can’t physically visit, further reducing our industry’s carbon footprint.”

Hanne Page, segment marketing manager, events at Barco agrees: “With Covid, the full benefits of distributed production and remote collaboration have been confirmed. One specific evolution that has been trending is hybrid event productions, where the advanced techniques of remote and distributed broadcasting have been further adopted and flexibly injected into live event productions.”

Integrated production systems like NewTek’s can be driven with a skeleton crew, or even a single crew member. Automation is becoming a big driver in this space as it takes the pressure off these reduced crews.

“We are seeing a wider range of locations being used, instead of a traditional gallery or studio location, in order to reduce travel – all of this of course requires a decent degree of connectivity between sites,” says Liam Hayter, NewTek’s senior solutions architect.

With the adoption of unified communications (UC) and video conferencing (VC), in tandem with remote presenters and crews, there’s also been a big uptake in the use of virtual sets and environments to recreate the studio’s look and feel.

“As people are working in UC/VC environments day in day out, it becomes difficult to engage audiences and deliver entertainment or messaging if everything constantly looks and feels the same – it is visual fatigue which virtual environments can really help break,” says Hayter.

Virtual production technologies combine camera tracking and content rendered in realtime, creating mixed reality environments which permit a presenter, as well as the audience, to see and interact with the content around them.

“The disguise xR workflow is perfect for avoiding non-essential contact, mitigating the risks posed by traditional approaches to filming immersive visuals which would involve high-level, realtime in-camera shoots, green screen and other VFX,” says Tom Rockhill, CSO at disguise.

One project completed under lockdown that went viral was a live performance by Katy Perry of her single Daisies on the season finale of American Idol. The “seamless” extension of the real-world LED screens to the virtual environments could only be done by switching between camera perspectives and the LED content using the disguise workflow, he claims.

“The goal of mixed reality is to create more compelling broadcasting and a more engaged audience and presents a watershed moment in new technologies that will throw a lifeline to many industries in the current crisis.”

Tuesday 25 August 2020

This is how Netflix’s Project Power characters gained their superpowers

RedShark News

Sometimes being a filmmaker is as close as one can get to having a super power. Artists can use their imagination to conjure up characters and stories or create sounds and images in contexts that we’ve not experienced before.

https://www.redsharknews.com/this-is-how-netflix-project-power-characters-gained-their-super-powers

VFX Supervisor Ivan Moran (Salt, Mowgli: Legend of the Jungle) was called on to use all his technical and creative nous when he landed the gig on Netflix’s entertaining sci-fi feature Project Power.

“I joined the project very early on which meant that I had a hand in a lot of the initial concepts,” he tells RedShark News. “My overriding goal was to avoid creating full body or full-face digital characters but to shoot as much as possible using real world elements and augment that reality.”

Moran, who works for Framestore, acted client-side as Netflix’s overall VFX supervisor on the show, while Framestore VFX supervisors João Sita and Matthew Twyford delivered 400-plus shots that took in expertise from the studio’s concept art, vis dev and VFX teams.

“We made sure to root the drug’s side effects in actual science – from subatomic vibrational physics theory through to potential innate DNA from animal biology,” explains Moran, who took on the task of researching the story’s “pseudo-science”. The aim, he says, was not only to dazzle the audience but to help them suspend their disbelief by giving each power a plausible, real-world backdrop.

The art of vis dev

Directed by Ariel Schulman and Henry Joost from a script by Mattson Tomlin (who also had a hand in writing upcoming DC reboot The Batman), the film stars Jamie Foxx and Joseph Gordon-Levitt in a world where a new designer drug can, for good or for ill, unlock five minutes of latent power in the person taking it.

The VFX team began by designing the pill, including the drug’s reactive eye effect — a key indicator that the character had succumbed to its superpowers. Framestore’s vis dev team rebuilt the human eye with its intricate iris strands and blood vessels using fluid dynamics and volumes.

“The challenge was to make it look like it’s freezing in a way that isn’t terrifying — it was massively complicated and looks stunning,” says Twyford.

The film was shot on ARRI Alexa LF and Alexa Mini, the favoured imagers of DP Michael Simmonds, who wanted to lend the show a “deliberate accidental” style in keeping with the film’s realistic aesthetic. “His style is documentary-like, very visceral,” Moran says. “We rarely changed the composition in post.”

Some VFX sequences required high-speed photography using the Phantom Flex 4K and the Sony Venice. A key one early in the film sees a full-frame shot of Frank (Gordon-Levitt) staring into the camera as he takes – and survives – a bullet point-blank to the head. The slow-motion sequence was filmed at 800 fps. Gunsmoke was simulated by referencing real gunshots filmed at over 70,000 frames per second.
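For a sense of scale (assuming a standard 24 fps playback rate, which the article doesn’t specify), the slow-motion factor is simply the capture rate divided by the playback rate:

$$\text{slowdown} = \frac{f_{\text{capture}}}{f_{\text{playback}}}: \qquad \frac{800}{24} \approx 33\times, \qquad \frac{70{,}000}{24} \approx 2{,}917\times$$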

“The challenge was to precisely recreate the timing of the ripples and the hair movement from the plate which was achieved on set with a high-velocity blast of air,” says Sita. Referencing an armadillo’s leathery protective shell for the sequence’s distinctive look, the team sculpted the folds and ripples in CG, adding full digital hair onto Frank’s head, so as to perfectly time the hair lift and to disguise the constant wind through the actor’s hair from the air gun used on set.

In another scene, Frozen Girl (Jazzy De Lisser) sees her body transform into ice. This was shot in a single two-minute take comprising 2,000 frames and took nine months to complete.

“The amount of detail in that shot is incredible,” says Moran. “Jazzy is constantly moving, trying to escape from a circular tank while ice crystals grow over her skin and her breath fogs the glass.”

The actor’s performance was body-tracked. The team had to remove the reflections of the camera crew from the glass while also matching the multiple flashing light sources from the room outside the tank.

“We played the shot in realtime and every single frame had 70 lights that needed to be manually art directed, coloured, calibrated and matched from camera references so that all ice on her body could properly interact with her skin,” says Sita.

Another sequence sees the chameleonic Camo Guy robbing a bank. This character’s powers were inspired by the cuttlefish, which has the ability to mimic its surroundings.

“The work on the character was tested before main photography as a 2D based look which proved that Camo Guy would have an interesting effect, but lacked some of the lighting and more dynamic patterns in its cells,” Sita explains. “We called on FX and lookdev to achieve that additional motion and subsurface look with the texture and reflections embedded in his body in compositing. We matched the plate for lighting and adjusted his skin wetness and specularity to get a more interesting shape.”

The superpower of the character called Biggie (Rodrigo Santoro) sees him almost double in size, transforming into an imposing, asymmetrical beast-like figure that towers over his surroundings.

“The plates were shot in a miniature set,” explains Sita. “This helped us work out how big the actor should be, and how he’d be interacting in an environment where he looks much taller.”

Working with these plates, Framestore’s tracking and CFX teams performed detailed plate augmentations as they matched Biggie’s on-set prosthetics with a series of CG body parts and muscle jiggles. The FX team created gruesome tumours and intricate veins under his skin, deforming Biggie’s face to one side to finalise the grotesque transformation.

Another 600 shots of mostly ‘invisible’ effects, including a giant container ship, were created out of Canadian facilities Outpost VFX, Distillery and Image Engine.

Saturday 22 August 2020

Picture monitoring: HDR comes to the fore

 InBroadcast

p26 August http://europe.nxtbook.com/nxteu/lesommet/inbroadcast_202008/index.php?startid=26#/p/26


HDR is becoming integral to the production and distribution chain while picture quality monitoring in an IP world no longer ends at playout.


HD-HDR looks set to become the preferred production format for many broadcasters as they plan their forward development strategies. If it wasn’t for the lack of live sports, Sky’s launch of HDR in the UK would have got a lot more attention. You’ll need to subscribe to Sky Q and have an HDR telly, but you can view Disney+ shows in HDR now, with live sport – including Olympics highlights, assuming the Games go ahead next summer – to follow. What’s more, an increasing number of shows are being delivered in UHD HDR, notably for Disney+, Prime and Netflix, with monitoring critical all the way up the chain from camera to post.

“When someone tells you your pictures look ‘a bit green’ or ‘a bit soft’ or ‘a bit hot’ when they’re looking at proxy images on a cheap laptop from halfway across the room it can be a bit annoying,” says Neil Thompson, a Sony Independent Certified Expert, who presents a great guide to the basics of HDR monitoring at the CVP Common Room.

“If you have proper monitoring and measurement tools, and know how to use them, you can tell them where to stick their opinions,” he says. “Assuming your proper monitoring tools don’t show they were right!”

 

The market is being flooded with HDR-capable picture monitoring solutions, most of them able to work with the main HDR profiles – PQ from Dolby and Hybrid Log-Gamma (HLG).

 

TVLogic’s F-10A is a 10-inch field monitor with HDMI 2.0 and HDCP 2.2 to support 4K HDR at 60Hz and various video signals (3G/HD/SD-SDI, HDMI). The unit’s three 3G-SDI ports provide plenty of connection options. In addition, it provides various professional functions such as camera LUTs, ARRI metadata display, waveform/vectorscope and HDR emulation of PQ, HLG and S-Log3.

The company’s LUM-242H is a 24-inch UHD monitor that offers 10-bit colour, a 178-degree viewing angle and Wide Colour Gamut (WCG). It features 1,500 nits of peak luminance, a high contrast ratio of more than 1,000:1 and an HDR emulation function (PQ, HLG, S-Log3).

The LUM-310X is a 31-inch 4K HDR reference monitor with a “true 4K” (4096x2160) resolution Dual Layer LCD offering a maximum luminance of 1,000 nits and a 1,000,000:1 contrast ratio. This, says TVLogic, makes it possible to view deep blacks and ultra-sharp details. It supports 4K/60p through single-link 12G-SDI, Quad 3G-SDI and HDMI 2.0.

Leader’s LV5900 waveform monitor had a high-profile outing as part of BT Sport’s live 8K broadcast demo from the UK to Amsterdam last September.

“With the source feed an enormous 48 gigabits per second, a lot depends on the delivery network and choice of codecs,” said Leader Europe Business Development Manager Kevin Salvidge. “The LV5900 allows OB technicians to check video and audio signal parameters at any point in the production and delivery chain to ensure raw and encoded signal feeds conform as closely as possible to the maximum deliverable quality.”

For those not looking to deploy 8K services, the LV5900 can simultaneously display two 4K sources. A new Leader-developed focus detection algorithm applies nonlinear super-resolution technology even to low-contrast images for which focus errors were previously difficult to detect.

Leader has also developed new options for its LV5600 waveform monitor and the equivalent LV7600 rasterizer for UHD HDR grading and finishing. The new feature allows operators to check chroma levels beyond the BT.709 or DCI-P3 gamut. A Cinezone HDR display gives camera crews the opportunity to monitor and adjust HDR images and optimise UHD focus in real time.

“This simplifies the task of identifying the reproduction errors which can occur when transmitting video content produced in BT.709, DCI-P3 or BT.2020 wide colour gamut or when converting content from BT.2020 to narrow colour gamut,” explains Salvidge. “With the colorimetry zone feature switched on, chroma signals exceeding the legal gamut are made immediately apparent using a false-colour substitute.”
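Leader’s implementation is proprietary, but the false-colour idea itself is easy to sketch: any pixel whose values fall outside a legal range is replaced with a highlight colour so it leaps out on the monitor. A toy Python version (the simple RGB range check here stands in for a proper chroma evaluation in a specific colour space):

```python
# Hypothetical sketch of false-colour gamut flagging. Real scopes evaluate
# chroma against a colour-space gamut; this toy version treats anything
# outside the 8-bit video legal range [16, 235] as illegal.
import numpy as np

def false_colour_illegal(frame, low=16, high=235):
    """frame: H x W x 3 uint8 array; returns a copy with illegal pixels in magenta."""
    out = frame.copy()
    illegal = ((frame < low) | (frame > high)).any(axis=-1)  # any channel out of range
    out[illegal] = (255, 0, 255)                             # false-colour substitute
    return out

# Example: a horizontal gradient; only the extreme ends get flagged.
frame = np.linspace(0, 255, 256, dtype=np.uint8).reshape(1, 256, 1).repeat(3, axis=-1)
flagged = false_colour_illegal(frame)
print(flagged[0, 0], flagged[0, 128])   # [255 0 255] (flagged) vs untouched mid-grey
```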

Broadcast studio dock10 recently added an LV5600 to its UHD HDR live production system, checking incoming feeds from multiple studio cameras to ensure uniform alignment and colour matching.

AJA’s realtime HDR monitoring and analysis platform, HDR Image Analyzer, not only comes with 12G-SDI single-cable connectivity but now supports up to 8K/UHD2 HDR.

Developed in exclusive partnership with Colorfront, the sub-$20,000 product offers waveform, histogram and vectorscope monitoring and analysis of HDR and WCG content at resolutions up to 8K for production, post, QC and mastering. It supports scene-referred ARRI, Canon, Panasonic, RED and Sony camera colour spaces, with automatic colour space conversion based on the Colorfront Engine.

“Since its release, HDR Image Analyzer has powered HDR monitoring and analysis for a number of feature and episodic projects around the world. It became clear that a 12G version would streamline that work, so we developed the HDR Image Analyzer 12G,” says the company’s Nick Rashby.

Postium’s OBM-H series supports daylight monitoring of PQ, HLG and S-Log3 in HD broadcast and motion picture applications. One function compares HDR and SDR on the display simultaneously, while another permits viewing the highlight and shadow detail of a scene at the same time. The OBM-H210 sports a 21-inch 1920×1080 resolution LCD with a maximum luminance output of 1,500 cd/m².

Its most recent 4K monitor – the OBM-X241 – is designed for on-set HDR production and editorial applications and is claimed as the first 24-inch monitor to offer full 4K-DCI resolution, 1000 nit peak luminance, and dual-SFP input. “This makes it ideal for DITs monitoring on-set SDR and HDR camera feeds as well as desktop editorial and QC applications,” says president Sung Il Cho.

Broadcast production control

For critical image evaluation in broadcast production control environments, Ikegami offers the HQLM-3125X, a 31-inch UHD monitor with specs including 4,096 × 2,160 pixel 10-bit resolution and 1,000 candela per square metre. Compliant with BT.2020 WCG, the monitor employs a ‘double-LCD-layer’ panel delivering a 1,000,000:1 contrast ratio with ‘true’ black reproduction. Ikegami says a layer of light-modulating cells allows pixel-by-pixel control of backlight intensity, reducing light leakage and black float while reaching a black level of 0.001 cd/m². Standard features include two 12G/3G/HD-SDI inputs and outputs, three 3G/HD-SDI inputs and outputs plus an HDMI input. The HQLM-3125X can thus accommodate three simultaneous UHD feeds.

Blackmagic Design’s SmartScope Duo 4K model includes built-in waveform monitoring “when you need to analyse image quality more accurately than simply looking at the picture.” For US$795 you get seven of the scopes most widely used in broadcast and post (waveform, vectorscope, RGB parade, YUV parade, histogram, audio phase and level). You can monitor video on one of the dual 8-inch LCDs while running a scope such as waveform, histogram or audio on the other, seeing the video you’re monitoring alongside a realtime scope measurement on the same unit. You can also download software to the SmartScope Duo 4K to add new types of scopes as they become available.

The unique Picture-by-Picture function of Postium’s OBM-P series allows two different SDI input signals to be displayed on the screen simultaneously. “This function is very convenient for making instant adjustments to two input sources, because there is no need to individually adjust the different characteristics of two monitors,” the company explains.

The OBM-P series embraces seven models from 18-inch to 55-inch that accept signals up to 1080/60p, displayed at native HD resolution, and are equipped with standard 3G/HD-SDI input interfaces as well as HDMI 1.3a and analogue inputs.

Remote and IP-based broadcast monitoring

The coronavirus has made broadcasters dramatically reassess not only how programmes are produced, but also how distribution operations are managed. The move to IT- and IP-based broadcast infrastructures had already made it possible for technicians to access systems from wherever they are. This option is now widely available, meaning broadcast centres can still run efficiently while allowing self-isolation and social distancing guidelines to be observed.

"Many broadcasters were already considering remote workflows but now they are a necessity, not only to keep networks running but also in ensuring that their output continues to meet the highest standards,” says Erik Otto, CEO, Mediaproxy.

By mirroring the Mediaproxy LogServer logging and analysis systems from the master control room or network operations centre into the homes of technical operators, it is possible to maintain the same level of checking and audience experience, the company explains. Through low-latency links, technicians can monitor outgoing streams using exception-based monitoring on Mediaproxy’s interactive Monwall multiviewer. Exception-based monitoring uses IP ‘penalty boxes’ that allow broadcasters and multiple-system operators to deal with QC and compliance more efficiently.
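Mediaproxy doesn’t publish its internals, but the principle behind exception-based monitoring is easy to illustrate: rather than an operator eyeballing every feed, only streams whose measured parameters breach a threshold are surfaced. A hypothetical sketch, with all names, fields and thresholds invented:

```python
# Hypothetical sketch of exception-based monitoring: streams breaching a
# threshold are promoted to a "penalty box" for operator attention.
from dataclasses import dataclass

@dataclass
class StreamStatus:
    channel: str
    packet_loss_pct: float   # measured over the last interval
    freeze_seconds: float    # duration of frozen picture, if any
    silence_seconds: float   # duration of silent audio, if any

THRESHOLDS = {"packet_loss_pct": 0.5, "freeze_seconds": 2.0, "silence_seconds": 5.0}

def penalty_box(streams):
    """Return (channel, reason) pairs for every stream in an exception state."""
    flagged = []
    for s in streams:
        for field, limit in THRESHOLDS.items():
            value = getattr(s, field)
            if value > limit:
                flagged.append((s.channel, f"{field}={value} > {limit}"))
    return flagged

# Only the second feed breaches, so only it reaches the multiviewer's penalty box.
feeds = [StreamStatus("CH1", 0.1, 0.0, 0.0), StreamStatus("CH2", 1.2, 0.0, 0.0)]
print(penalty_box(feeds))   # [('CH2', 'packet_loss_pct=1.2 > 0.5')]
```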

Ross stresses the importance of having a low-latency stream of a multiviewer that can be accessed by all off-premise operators. One quick way to get a low-latency multiviewer in front of multiple people is to use a third-party SDI/HDMI capture device for Mac or PC from a vendor such as AJA or Blackmagic.

Next, with embedded audio included on the stream, operators can use video conferencing software (e.g. Teams, Zoom) and set the capture device as the webcam input. Video conferencing software is built to handle multiple simultaneous users and operates close to realtime, which keeps the latency low.

Ross’ free DashBoard software makes it simple to control and monitor hardware products, and many of its own software applications can be accessed through remote applications. Additionally, it has cloud-ready solutions such as Inception and Streamline hosted through web servers accessed via VPNs.

Rohde & Schwarz’s software-based PRISMON monitoring and analysis solution enables broadcasters to evaluate the viability of transitioning master control and playout to the cloud.

US network PBS used the PRISMON to measure the latencies associated with cloud-based encoding and transport and to synchronize numerous feeds of the same content from different signal processes running in the cloud for side-by-side visual comparisons and evaluations.

The unit also provided objective measurement of the performance of various cloud-based encoding solutions, highlighting artifacts such as ‘bad’ pixels and charting errors on a frame-by-frame basis to enable data-driven comparisons with the performance of the actual hardware encoders used at PBS.

“We recognize that the quality of a broadcaster’s signal is of the utmost importance and developed the PRISMON to provide broadcasters with objective signal measurements so they can be confident in their end product regardless of whether it’s coming from the cloud or on premise,” said R&S Head of Monitoring Development Bjoern Schmid.

Bridge Technologies has debuted Integrated Services Monitoring (ISM), a turnkey suite of tools, which puts automated eyeballs on all acquisition, delivery and production media streams.

“ISM provides exactly the set of data gathering tools broadcasters need throughout the chain – from production to delivery, uncompressed to compressed,” explains Simen Frostad, Chairman. “No other company in the network monitoring and analysis space can offer such a complete, comprehensive set of automated capabilities that deliver such a high level of understanding throughout any content distribution chain.”

ISM is able to monitor both uncompressed and compressed data across various formats and in resolutions up to 8K. It allows for full motion, colour-accurate, very low-latency video to be made available from any source to any application or user – anywhere in the world, whether fixed studio, remote OB van or headend environment, such that a geographically-dispersed team can work together on the same project.

“Individual, disparate solutions become a less and less tenable approach within an industry that is now trying to do a lot more with a lot less,” Frostad adds. “ISM takes the incredible analytical ability embedded in every part of the Bridge Technologies portfolio, and deploys it in a way that captures and utilizes data in multifaceted ways to assist with accurate decision making across the whole media chain, end-to-end.”

Densitron has added three models to its UReady 2RU Control Surface range. These are large, high-resolution screens that provide a wide-angle viewing experience and can be customised as requirements change. Human Machine Interface (HMI) options can be added through the use of proprietary “tactile-touch” buttons.

Says Bazile Peter, Global Product Manager, HMI Solutions: “Whether this is a high-performance multi-touch touchscreen, a touchscreen with transparent buttons, or a combination of buttons and touchscreen, [Densitron’s] modular approach ensures that a vast array of control and monitoring requirements is met.”

Friday 21 August 2020

Wanted: Robot actors for new $70m sci-fi film

RedShark News

Hollywood’s latest stab at creating non-human humanoid stars aims to be the first to rely on a fully autonomous artificially intelligent actor.

https://www.redsharknews.com/wanted-robot-actors-for-new-70m-sci-fi-film

The film, which has the working title ‘b’ and a budget of $70 million, already has its robot star, and some scenes featuring it have already been shot.

LA-based outfit LIFE Productions has signed ‘Erica’, the creation of Osaka University roboticist Hiroshi Ishiguro, to ‘play’ an artificially intelligent woman who can port into the body and mind of any human host. The film follows her creators’ efforts to gain control of her as she becomes self-aware.

Even the film’s chief financier, Matthew Helderman of BondIt Media Capital, admitted to the New York Times that the hackneyed set-up was “a dime-a-dozen sci-fi plot that wouldn’t have made it on his radar” if it hadn’t been for the unique selling point of its star.

Erica, though, gives the film some money-can’t-buy publicity. First unleashed in 2015, Erica was, alarmingly, modelled by Ishiguro on images of Miss Universe pageant finalists.

She has reportedly performed in plays, sung in malls and even delivered the news.

Since then, her AI and movements have advanced and for this role the robot has even been trained how to perform via method acting techniques. Given that she has no life experiences to draw on, producer Sam Khoze explained to The Hollywood Reporter that, “She was created from scratch to play the role. We had to simulate her motions and emotions through one-on-one sessions, such as controlling the speed of her movements, talking through her feelings and coaching character development and body language.”

She apparently has a “synthesized British voice” with “a slight metallic tone that sounds like she’s speaking into a pipe”. When she walks, the motion of her air-compressor joints makes it look as though she’s performing a sped-up or slowed-down version of ‘the robot’. For that reason, a majority of her scenes will be filmed while she’s sitting down.

Given those limitations it seems unlikely Erica the actor will be convincing enough to bridge the ‘uncanny valley’ where non-human humanoids always seem to fail. Given that, in the film, she is intended to be a robot and not a human perhaps this won’t matter.

Arguably one of the most convincingly empathetic robots on film, albeit one that made little concession to looking humanoid, was Robby from 1956’s Forbidden Planet. Its dry wit was a forerunner of Marvin, Douglas Adams’ depressive ‘Paranoid Android’ from the 1981 TV adaptation of The Hitchhiker’s Guide to the Galaxy.

Set in 1983 but filmed in 1973, Michael Crichton’s dystopian adult theme park movie Westworld featured a very convincing Yul Brynner as the self-aware gunslinging machine.

Forgotten (though not bad) Al Pacino comedy drama Simone (pronounced ‘Sim One’ and styled S1m0ne) couldn’t live up to its pre-release hype about being the first film to feature an entirely synthetic human character as lead. The film gets around this by having Pacino’s failing film director character create a completely CGI actress who becomes a superstar and even wins Academy Awards. Despairing that society could be so shallow as to fall for this smoke and mirrors, he attempts to kill her off.

Other examples include Robert Rodriguez’s manga-infused, performance-captured lead in Alita: Battle Angel and the more serious holography of K’s artificially intelligent girlfriend in Blade Runner 2049.

In what seems another gimmick, James Dean has been resuscitated to play a new role in Anton Ernst and Tati Golykh’s proposed Vietnam war feature Finding Jack. Production house Magic City Films plans to use photos and videos to recreate Dean, who died in 1955, in CGI.

Erica the robot is named as star on ‘b’s IMDb page but there’s no director attached. If the film doesn’t succeed at least as entertainment, you could write the reviews for the actors paired opposite Erica now.

The film is still in pre-production and the producers are casting around for other robots to fill supporting roles, and even for an unnamed crew position. If nothing else, they will be coronavirus-immune.

Thursday 20 August 2020

How The Voice's Editors and AEs Manage Assets in an Unscripted Post Workflow

copywritten for Avid 

https://www.avid.com/customer-stories/the-voice-asset-management?fbclid=IwAR1-hNYVSoefg1PSV_tolTTu_IsYzmJrwyDtNlgZWYlncrMC2sB09wc3HOE

Reality TV has come to dominate primetime broadcast schedules the world over, but creating a winning formula for post production presents a unique set of hurdles for the editing team. Broadly, the issues are:

  • Volume: Large editing teams are working with towering heaps of footage to compose reality shows. This makes impeccable asset management a necessity for an efficient workflow.
  • Unscripted: The nature of the genre means that editors find the story and effectively write the show through their selects of on-the-fly footage.
  • Deadlines: The requirement to meet fast turnaround times means the pressure is always on.

Through trial and experimentation, The Voice's post team has established a sophisticated workflow for managing their assets amid these challenges.

THE TALENT BEHIND THE VOICE

NBC's Emmy® award-winning show has been entertaining audiences since 2011, with 18 seasons and counting. With a solid organizational framework, editors can apply their talent to finding the story, maintaining story arcs, and hitting emotional beats. That may be true across all genres, but the meticulous approach required of unscripted TV editors is extraordinary.

The Voice's editing team will draw from 50 to 60 terabytes of original camera media to compile the first few episodes of each season. This raw material comes from 20 to 30 cameras, many of which film simultaneously from different angles. Then, the production's team of assistant editors (AEs) ingest, transcode, and view the material.

The AEs' organizational process cannot be overlooked, says Supervising Editor Robert M. Malachowski, Jr., ACE, who has been with the production since the first season. "[AEs] are the centralized hub for everything that happens in post. . . . Without them, the editors would not be able to do their jobs at all."

By taking a look at their workflow, we've pulled out six tips that can help other unscripted TV editors design their own plan for managing assets in the post process.

1. DISCIPLINED ORGANIZATION OF RAW FILES

On The Voice, the goal is to produce one two-hour show and a separate one-hour show weekly during the series run. It's the AEs' job to first review the media and prep it for the editors. The raw material includes stage performances, interviews, high-speed camera footage, behind-the-scenes footage, coach cams, and "follow-home" video, where select contestants are filmed with family and friends in their hometown.

The AEs will eventually receive a camera log detailing the shots filmed and what happened throughout the day, as well as a transcript, but they try to work ahead of this to start the editors on assembly as quickly as possible. Usually, AEs need to turn that footage around within 24 hours.

"We pretty much go into it completely blind," shares Lead Assistant Editor Alan Macchiarolo. While the AEs can view a live feed of performances shot on the stage, they have to wait for a first look at all other material until the files are ingested. "We essentially have to go through every piece of footage that is shot for the show," Macchiarolo says.

Video files are organized by camera (e.g., for stage, reality, artist interview, or B-roll) and camera type. These groups break down into even more specific categories—for example, into all camera angles for a contestant's performances or all of an artist's interviews—and are then delivered to the editors. The majority of the footage stays organized in smaller chunks so the editors don't have to sort through all the material from scratch, explains Lead Assistant Editor Joe Kaczorowski.

Since the transcriptions typically arrive 24 to 48 hours after the editors have received the footage—which is 24 hours after the AEs have already handed it off—the AEs then go back into the organized footage to align dialogue and transcripts, giving the editors an additional tool to search through the material.

2. LOGGING LEAVES NO ROOM FOR INTERPRETATION

Logging is arguably the most critical part of media management for unscripted TV shows. Poor taxonomy or inconsistent application can cause all manner of problems down the line.

"Our mind-set . . . is we service the back end and then move it forward," Malachowski explains. They consider how they can stay compliant to smooth production at every stage: from offline to online, VFX to audio mixing, and collaboration with outside post houses, he says.

Logging starts with a tape name (e.g., the specific camera, the day the clip was shot) followed by a general descriptor, such as whether it's B-roll or a performance, explains Lead Assistant Editor Vinnie DeRamus. A consistent convention lets everyone know what to search for. For example, the online editor can identify whether the camera is an ARRI or a Sony simply from the tape name, says Malachowski.
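The show's exact scheme isn't spelled out beyond this, but a toy version makes the idea concrete. Every token below is invented for illustration; the point is that a machine-checkable convention leaves nothing open to interpretation:

```python
# Toy illustration of a strict clip-naming convention. The actual scheme used
# on The Voice is not published; every token here is invented.
import re

# e.g. "SNY_A01_D03_PERF" -> Sony camera A01, shoot day 3, performance footage
CLIP_NAME = re.compile(
    r"(?P<camera_make>ARR|SNY)_"        # camera family, recoverable by the online editor
    r"(?P<camera_id>[A-Z]\d{2})_"
    r"D(?P<shoot_day>\d{2})_"
    r"(?P<descriptor>PERF|BROLL|INT)"   # performance, B-roll or interview
)

def parse_clip_name(name):
    """Reject anything that breaks the convention rather than guessing."""
    m = CLIP_NAME.fullmatch(name)
    if m is None:
        raise ValueError(f"Clip name breaks the logging convention: {name!r}")
    return m.groupdict()

print(parse_clip_name("SNY_A01_D03_PERF"))
# {'camera_make': 'SNY', 'camera_id': 'A01', 'shoot_day': '03', 'descriptor': 'PERF'}
```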

With archival footage and creative calls in the field for high-speed material, The Voice post team deals with a lot of mixed-format footage, including PAL. The AEs have to identify this footage and pass that info along to the editors and online editors. Then, an editor can look at a piece of media and know its true frame rate, the running frame rate, and whether the footage is original media or has been processed or transcoded for any reason.

"At any point we have five to eight AEs in here working on [material], and they all have to use the exact same terminology," says Kaczorowski. "It's not left up for interpretation that way. And all the editors are used to seeing that terminology. It's laid out exactly the same, so it shows up in a bin the same way. Everybody sees the same thing over and over again. They know what they're looking for season after season."

3. BE PREPARED TO ADAPT

Every workflow is different, so there's no cookie-cutter approach. But arriving at a tailor-made solution may involve trial and error—plus the flexibility to evaluate and adapt.

"We definitely did not nail this from the get-go, and every season pops up with new challenges," Malachowski says. For the first season, they followed the template set for other Mark Burnett productions and quickly learned that didn't work for The Voice because its story wasn't told in a linear fashion.

During The Voice, they can take an act or scene from one episode and move it to a different act or drop it into another episode entirely, he explains. The producers can request that contestants who auditioned on the first day be placed into the second or third episode if that improves the storytelling. Initially, the team had all the media grouped for individual episodes; when they started shifting things between different episodes, that process broke down.

They also tried various other workflows before alighting on the asset management system that The Voice uses today. A single project or library acts as a repository for all the media, while editors creatively work within their own individual projects. This allows for faster turnover by the AEs, according to Malachowski, because all they have to do is update one bin. The editors return their finished cuts back to that one shared project.

The library project is locked so that only the AEs can update the media. "This means the editors don't have to worry about accidentally deleting or changing anything," Malachowski says. "It becomes a very safe environment for them to just grab what they need, pull it into other projects, and play with it."

4. KEEP COMMUNICATION CHANNELS OPEN

The Voice has between 20 and 22 editors working at its peak, each responsible for different tasks as the season unfolds. Once the team starts developing what will be in an episode, the editors divide into smaller teams working to form a single episode.

Episodes will go through a series of internal and network reviews before a supervising editor makes an absolute fine cut and the result is locked, approved, and sent to final finish and online. There are no hard-and-fast procedures that govern all of this back-and-forth, but Malachowski recommends keeping the editorial team in constant communication.

"There's no stupid question," Malachowski says. If anyone needs to check whether a clip has been used, they should feel completely free to ask. He likens the show to a game of Boggle, where a last-minute network request can force editors to rework material that was already in place.

"That's why we've got so many layers. The editors are tasked with creating and coming up with the best stories that they can. The finishing editors are making sure that story is complete over an episode. The supervising editors are in charge of making sure the story of each episode is continuous over the entire season. . . . Everybody is looking for finer and finer detail as we get up to the very end and actually release it for broadcast."

The Voice even has a 50-page show "Bible" available for reference that outlines all of the critical processes and procedures.

5. THINK YOU'VE GOT EVERYTHING? THINK AGAIN

Reality TV hinges on moments, and the right one is worth a thousand words. It's often the little glances and other nonverbal cues—the ones that, inconveniently, won't appear in an automated transcript—that can help sell a certain part of the story or bring out a character. How do you ensure that those vital pieces of story aren't missed?

Some editors will rewatch things, Malachowski explains. After making a near-final cut of a performance, he'll review the original material, watching each of the coach cameras during the performance to cherry-pick moments. He also might ask some of the AEs to go back in and look for certain teases or cold opens. They would supply the editors with a bin of six to a dozen additional shots. Malachowski said a request can even be as specific as "I need Blake looking left to right with a blue background."

The advice here is to stay alert and "keep an absolute watchful eye," in Malachowski's words. If your show is as nonlinear as The Voice, double down on this. "You have to really keep on your toes," he warns.

6. CREATE A SUPPORTIVE TEAM ENVIRONMENT

In such a pressurized environment, with hard deadlines constantly rolling around, even personalities who thrive on adrenaline need to take a few moments away from the grind. Building a supportive work culture is essential to keeping the editorial machine running smoothly.

"What helps is knowing what role we all play in the process, and knowing what each of our strengths are," DeRamus says. "We each attack any issue based on our strengths, and we know that we can depend on [the rest of the team] for support."

"There's a lot of humor in the bays; there's [also] a lot of tension," acknowledges Malachowski. "But at the end of the day, everybody keeps in the back of their minds that, at the end of the season, we all still want to be friends. It can definitely be a challenge . . . but having the respect for your fellow editors and AEs helps a lot."

Wednesday 19 August 2020

Sohonet Helps Gentle Giant Studios To Tell Big Stories

copywritten for Sohonet

Gentle Giant on everything from scanning full-body digital doubles for VFX to creating a fantasy starship for Porsche AG and Lucasfilm — and the cloud-based storage and delivery systems they’re using.

https://www.sohonet.com/2020/08/18/sohonet-helps-gentle-giant-studios-to-tell-big-stories/

From The Matrix and Avatar to the Harry Potter franchise and Jurassic World: Fallen Kingdom, to Wonder Woman 1984, The Batman, Top Gun: Maverick and Morbius, Gentle Giant Studios’ products and services permeate the entertainment industry, providing cutting-edge 3D scanning, modelling and 3D printing for greater realism or flights of the imagination.

Applications include recreating real-world objects and characters, constructing virtual sets, devising elaborate fantasy worlds, and 3D-printing objects that would be impossible to cost-effectively construct by other means.

The Burbank-based studio is a 25-year stalwart of the industry consistently working with all the major studios and streaming networks as well as record labels, museums, clothing designers and tent-pole brands. Gentle Giant Studios pioneered the integration of 3D scanning and 3D printing in the entertainment industry lending unequalled authenticity to projects covering a broad range of end products.

“Sohonet is integral to everything we do,” says Kim Lavery, Global Executive Producer. “We use its high performance, cost-effective storage [FileStore+] connected by Sohonet FastLane every day for smart, efficient digital delivery to and from clients and internally across our Studio for our sculptors and digital designers.”

Gentle Giant’s experts provide a host of services for a wide range of entertainment media, such as 3D scanning, modelling, concepts and printing for related product development.


“Typically, a project starts with the Digital department,” Lavery explains. “We capture data including Lidar and full-frame photogrammetry and internally process it for delivery to our different departments – Fine Art, Prototyping and 3D Printing – for creation of art collectables, action figures, products, etc.”

“We go full circle from physical objects and people into the digital realm and back out to physical assets. Our digital data goes right from the production development process into pre-production, production and post.”

For The Mandalorian, Gentle Giant took its Juggernaut mobile photogrammetry studio to the set and scanned the cast for downstream creation of maquettes for the Art Department and full-body digital doubles for VFX and Costume.

“Sometimes A-list actors are fitted with under briefs so we can get a 100% accurate scan of the body. The data can be used for VFX, or to 3D print mannequins for Costume. Sometimes the talent is scanned in full costume, hair and make-up which we capture for creation of CGI digital doubles. Our photogrammetry is of such high resolution we can get right into the pores of the skin.”

All of this generates significant volumes of rich data which needs to be piped from set to the team at Gentle Giant Studios, all under the strictest of security protocols.

“Sohonet’s cloud-based storage Filestore+ is a really big help in enabling us to have that at our fingertips,” Lavery says. “We use FileStore+ for back up, short-term project parking and archiving. We have heaps of data collected over more than 20 years which is a huge asset for clients wanting to repurpose the data for future use.”

FileStore+, powered by RStor, is Sohonet’s low-cost alternative to Amazon S3 for disaster recovery (DR) and cloud-first workflows, providing media teams with resilient, secure, high-performance storage at the lowest price point on the market.

Gentle Giant was recently asked to dig into the vaults stored on FileStore+ to create new merchandise featuring WWF wrestlers and other major brands.

 “Previously we had a network attached storage device but the amount of data we are now being asked to store and transfer means that we needed to upgrade to infrastructure with unlimited scalability,” Lavery says. “With Sohonet FileRunner, transferring large files just became much easier and more secure.”

She continues, “We explored the options really carefully for security reasons. All the major studios are understandably super conscious of confidentiality. Gentle Giant is Marvel and Disney-approved because of the way we have set up our cloud-based storage and delivery systems. Sohonet Media Network, Sohonet Fastlane and Filestore+ are superior to other platforms we have used in every way.”

Recent projects include the design and physical creation of a fantasy starship for Porsche AG and Lucasfilm Ltd; the scanning and creation of a 3D mannequin of music artist Katy Perry for the VFX in a forthcoming music video, as well as prototyping and manufacturing promotional purses for Wonder Woman 1984.

The Studio’s implementation of Sohonet also gave it a head-start when the pandemic hit and forced the industry to work from home. Lavery says, “We were prepared for the shutdown. We were able to expand our remote workflows really quickly. It was a seamless transition that created no issues for our clients.”

“Fortunately, we had enough data in the can to continue working throughout the quarantine period. Our artists were able to access data held on servers at our facility and in the cloud using Sohonet. It worked extremely well.”

Now that principal photography is slowly resuming in LA and elsewhere, Lavery predicts a new renaissance in the processes for supporting the production of arts and entertainment content.

“Going into lockdown we had seven productions on our books that all went on hiatus. They didn’t cancel, just paused. When they restart, we expect to get slammed and it’s right around the corner.”

The Gentle Giant team are now getting familiar with the new protocols and guidelines in place, and Lavery adds that they have passed the studio vetting and “look forward to getting back to what will be the new normal for everyone.”