Monday, 20 April 2020

What would have happened in Vegas

Televisual

http://www.televisual.com/blog-detail/What-would-have-happened-in-Vegas_bid-1067.html

In more normal times, attendees would now be tramping their way through the maze of the Las Vegas Convention Center as they took in all that was new and on its way in TV and movie technology.
  
Of course, this year the annual NAB TV tech show, like so many other events, has fallen victim to the COVID-19 pandemic and been cancelled.

But before what happened happened, and when a NAB cancellation was no more than an outside possibility, Televisual asked Adrian Pennington to quiz the manufacturers planning their Las Vegas visits about the tech that would be in the spotlight.

And although NAB is cancelled, the trends that will define television technology for the year ahead are certainly not. 

“The continued move to IP will also be one to watch, particularly for linear and occasional use scenarios regarding contribution and distribution,” says Ray Thompson, director of broadcast and media solutions marketing, Avid. “It’s getting to the tipping point as technology improves, adoption increases and comfort levels ease around the security of content in flight. With IP, content distribution can be achieved at scale to more properties in different bit rates and protocols. This trend will benefit workflows across the board, from news, sports, live production, and post production.”

IP also enables workflows which, in the past, were not economically feasible, by making use of the cloud and remote live workflows over any distance. The benefits, which include greater collaboration, lower production costs and enhanced work/life balance, now extend to public health.

Use of virtual meeting and online collaboration tools like Zoom has surged as a result of COVID-19. Blackbird, Cinegy and Grass Valley are among vendors promoting remote production solutions.

Swansea-based Quicklink is to launch a cloud-supported version of its Skype video calling solution. “A journalist could sit at home and interview someone located elsewhere live to air while a colleague edits the video online (in Adobe Premiere) and in realtime,” says CEO Richard Rees. “That edit could be passed to a control room for wider channel distribution. The whole environment is now virtualised. We believe this is the future.”
 
AJA was the first high-profile name to withdraw from NAB citing coronavirus risks. Instead, it will transition all of its planned NAB events, including its annual NAB press conference, to web-based video conferences.

While several productions have already embraced higher resolutions and HDR throughout the content creation and delivery chain, consumers are still slowly adopting 4K or 8K displays, and HDR-capable devices. 

“Given this landscape, productions need the flexibility to work with and combine content spanning a range of resolutions as well as both HDR and SDR,” says AJA Director of Product Marketing Bryce Button. “NAB will offer a platform for companies to unveil tools that answer the combined demand from HD SDR to 8K/UltraHD2 HDR, and everything in between.”

“Maturation of HDR is now firmly on the agenda whereas before it was a bit peripheral,” observes Atomos CMO Dan Chung.

Atomos’ Neon range of on-set and in-studio monitor-recorders is part of this picture. The appliances can be synced together to provide a system of field monitors with a consistent image, all controlled by the Atomos App.
 
“Our proposition has never been for the top end $30k reference grade monitor but for indie productions who need quality monitoring without breaking the budget,” says Chung. 

The availability of hardware to capture, edit and store 8K makes the high-resolution format “unavoidable” at NAB, says Chung.  “If you go 8K you need ProRes RAW since this allows you to get a manageable file size and all the benefits of working with raw data,” says Chung. “NAB will show that all the necessary parts of the 8K puzzle are in place.”

Canon’s announcement, in February, of an 8K DSLR was a game-changer in that respect. “Not so long ago if you wanted to shoot 8K anywhere near affordably you had to shoot Red,” Chung remarks. “Now you can do so on a prosumer camera. Canon has clearly laid down a marker that others are sure to follow.”

Details including price, release date and even sensor are sparse but Canon says the full-frame EOS R5 will feature a blistering 20fps electronic shutter, dual memory card slots, and a Body Image Stabilisation function to provide shake-free movies. 

Panasonic will be previewing its 8K ROI (Region of Interest) multi-camera system, which can produce four different HD signals, or ‘virtual cameras’, from a single 8K feed. It is intended to reduce operating costs in sports broadcasting and studio applications and is due to debut this summer.
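To make the ROI idea concrete, the sketch below shows how four HD ‘virtual cameras’ can be cut from a single 8K raster. It is a minimal illustration of the principle only; the region coordinates are hypothetical and nothing here reflects Panasonic’s actual implementation.

```python
import numpy as np

FRAME_8K = (4320, 7680)  # 8K UHD raster: rows x cols
HD = (1080, 1920)        # each 'virtual camera' is a plain HD window

def crop_roi(frame: np.ndarray, top: int, left: int) -> np.ndarray:
    """Cut one HD region of interest out of the full 8K frame."""
    h, w = HD
    return frame[top:top + h, left:left + w]

frame = np.zeros((*FRAME_8K, 3), dtype=np.uint8)  # stand-in for an 8K feed
virtual_cameras = [
    crop_roi(frame, 400, 500),    # e.g. presenter close-up
    crop_roi(frame, 400, 4500),   # e.g. guest close-up
    crop_roi(frame, 2600, 500),   # e.g. desk two-shot
    crop_roi(frame, 2600, 4500),  # e.g. wide of the set
]
assert all(vc.shape == (1080, 1920, 3) for vc in virtual_cameras)
```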
 

 
6K the new 4K

While 8K is still exotic, there is a trend toward shooting beyond 4K for high-end documentary and cine work. 6K resolutions provide the ability to pan and zoom into the image and, after the picture has been through a debayer process, you end up with a higher-quality 4K final image.

Among cameras offering this are Canon’s EOS C500 Mark II (5.9K); Sony’s PXW-FX9 XDCAM, which boasts a 6K full-frame sensor plus colour science inherited from the Venice cine camera; Blackmagic Design’s 6K, S35-sized sensor version of its BMPCC camera, which allows users to fit EF lenses without the need for a converter; and Panasonic’s Lumix S1H full-frame mirrorless camera.

The next product from RED Digital Cinema is rumoured to be branded Komodo and to feature a 6K S35 sensor recording to CFast cards, perhaps targeting the Canon C300, Sony FS7 and BMPCC market.


Cooke metadata tracking

Having presented its newest Anamorphic/i Full Frame 85mm Macro at BSC in January, Cooke Optics is lining up a new set of Macros designed by the same team to be slightly wider than full frame.

The company recently discovered what it thought were inaccuracies in its metadata tracking system, particularly when paired with shading and distortion mapping techniques. After investigation, it turned out it wasn’t the /i3 system but the cameras that were out of sync.

“None of the cameras were time accurate which was astonishing,” Cooke’s Les Zellan says. “They were spending a lot of processing power on the picture side but recording lens data was an afterthought.”

Zellan has since persuaded camera makers to fix this in new models. Cooke is also working with a partner to provide “extremely detailed distortion mapping down to the pixel level” for all its anamorphic as well as spherical ranges.

Lens metadata captured on set is increasingly important for improving VFX creation and DI calibration. Post production can sync the lens data to camera data to produce a more natural-looking 3D model of the shot significantly faster than using traditional manual processes and guesswork.
 

 
LEDs and games engines

Ncam CEO Nic Hatch expects to see continued interest in the integration of games engines for rendering in-camera VFX, where live action is shot against an LED video wall displaying CG or digital background plates.

The technology was used to most advanced effect on The Mandalorian, but similar set-ups are working their way into production of lower-budget episodic shows and live sports studio presentations.

“We’ll see more integration from broadcasters of games engines into their virtual studios,” Hatch predicts.

An essential element of these systems is the tracking of camera moves synchronized with changing perspectives of the LED content, which is where Ncam’s technology comes into play.

“Everyone is trying to understand how to shoot with LED walls,” he says. “You need camera tracking so that the LED reacts in the right way to the angle of view of the camera.”

MKII, the latest version of its camera tracking product, was to be officially launched in Vegas. The camera bar features Intel RealSense technology and captures spatial data that then feeds back to the Ncam Reality server.

“We’ve made the software a lot simpler for operators; the hardware is significantly smaller and lighter for handheld and stabilized rigs, and it’s easier to mount and doesn’t get in the way when trying to balance a camera. Ultimately, we will be doing all processing on the camera, which enables us to go wireless.”

Remote live

For remote live workflows, IP was to be pretty universal at NAB. In this regard, Panasonic was planning to present its software-based IP switcher. This relies on CPU and GPU processing, allowing users “to achieve performance levels currently impossible with traditional hardware-based products,” says field marketing specialist Oliver Newland.

The product is both resolution and format-independent, and can be integrated into a full IP-based environment. Newland expects it to become commonplace within live production environments very soon.

The firm was also planning to demonstrate a solution for VR and AR studio work with Brainstorm. PTZF (pan, tilt, zoom and focus) tracking data is fed from Panasonic cameras into Brainstorm’s virtual set system. The real-time graphics will be boosted by the ability to automatically track the movements of live performers using BlackTrax beacons.

EVS is promising a major announcement soon about an “exciting new live replay experience”. Also promoted by the Belgian live production specialist is the production server XT-VIA, which supports work in UHD-4K, upscaling of HD to 4K, and HDR in all resolutions. Twenty-six XT-VIA servers running on XNet-VIA and interconnected through three XHub-VIA switches were deployed at this year’s Super Bowl. EVS also planned to show off Overcam, an AI-based system which uses smart tracking to automate key camera positions, allowing for more coverage at lower operational cost.

5G links

While numerous bonded cellular link vendors from Dejero to LiveU have upgraded their camera-back or backpack mobile transmitters to take advantage of 5G, Sony has devised a left-field option in the guise of a smartphone.

The proposed Xperia PRO is still in development and has all the trimmings of a flagship Android device including the 5G-ready Qualcomm Snapdragon 865 chipset and a 6.5-inch 21:9 4K HDR OLED display. What sets it apart will be the ability to livestream images over mmWave 5G connections from professional camcorders such as Sony’s PXW-Z450 via a micro HDMI port.

“It’s a game-changer for working on location,” says Sony. “5G mmWave is a new era for business broadcasting… capable of remarkably low latency—essential if you’re broadcasting live content such as sports or news.”

Friday, 17 April 2020

Jörg Widmer Shines A Light Into Darkness in Terrence Malick’s A Hidden Life

RED Digital Cinema

The resilience of Franz Jägerstätter and his family is the very soul of A Hidden Life, and their spirit lingers long after the curtain falls. In writer-director Terrence Malick’s shattering true story, set during the Nazi invasion of Austria, the fluid, disembodied camera movement and the juxtaposition of stunning light and landscape with man-made horror are essential to the power of the story.
“This is an important story that needs to come to life,” says Jörg Widmer, cinematographer and camera operator who has worked for directors Wim Wenders, Ridley Scott, Roman Polanski and Michael Haneke. “This is about a person who is a hero, but who would never end up in the history books. It’s about faith in your beliefs and in humanity not being consigned to the darkness.”
The hero is an Austrian farmer who was imprisoned and sentenced to death for refusing to fight for the Nazis. A Hidden Life is told through the eyes of Jägerstätter (August Diehl) and his wife Fani (Valerie Pachner).
“I read the book and I read the letters they sent to each other (on which the book is based) when he was in the army and in captivity. I was smitten,” Widmer adds.
Widmer is a long-time collaborator with Malick, having operated Steadicam or camera on Tree of Life, To the Wonder, Knight of Cups and Song to Song as part of Emmanuel Lubezki’s (AMC, ASC) crew.
“When Terry said he wanted to shoot with the speed of a documentary, I instinctively understood what he meant,” the DP recalls.
Unlike Malick’s recent films, which were shot largely on 35mm (mixed with 65mm and digital formats), the primary requirement for natural light and to keep the camera rolling dictated that A Hidden Life be a digital project.
“We would have takes lasting an hour without a break so that the actors could move freely,” Widmer explains. “The idea is that they could improvise, repeat, go again or we could shoot the scene differently. If the camera flows continuously it helps the actors stay in the moment.”
He adds, “We shot some scenes twice – one version handheld and one static — and left the decision about which to use to the edit. Sometimes if the camera follows people smoothly it draws the audience into a scene. Following kids, you are able to put the camera on the ground and let them play, and then take it in hand and run with it. Sometimes it was to make movement more violent and less comfortable.”
This approach demanded that the camera team be just as nimble. “One advantage with digital is that we could shoot scenes so much longer than having to change film mags,” he says.
After tests, Widmer picked the RED EPIC DRAGON camera for its maneuverability and capacity to record stark contrast while preserving details in the highlights and shadows.
“With RED, I found I had full control of the image in a way that I don’t have on other cameras,” Widmer describes. “The whole system is lightweight and easy to use, so we could prepare the cameras in a setup which allowed us to change from Steadicam to slider or handheld in less than a minute.”
Widmer paired the REDs with ARRI Master Primes, mostly using a 12mm and 16mm lens, and occasionally an Ultra Prime 8R T2.8.
“We were always looking for backlight so we needed a package that could take a lot of contrast without flaring and with a huge range of latitude. With Master Primes we could have the sun in shot without creating too much flare, which I didn’t like for this story.”
In fact, Widmer shot with RED EPIC DRAGON at 6K, switching to RED EPIC-W HELIUM 7K when it became available during principal photography. He also deployed two camera bodies alternately fitted with different optical low-pass filters (OLPFs) to optimize the light for darker interior scenes and for daylight exteriors.
Everything was shot on location to authenticate the natural aesthetic. Locations included a working mill and blacksmith shop as well as several real prisons, including Hoheneck, a notorious Stasi prison near Stollberg, Germany. A few scenes were shot in St. Radegund, where the events depicted actually took place – including some interiors of the Jägerstätter house – and among orchards, along rural pathways and fields, as well as in the mountains of South Tyrol, northern Italy.
 “Our locations and schedule were pre-set by the sun,” says Widmer. “If it was raining, we embraced it.”
Artificial lighting was rarely used. Lighting gear consisted mostly of bounce boards and blacks. The barns, for example, were always shot when the openings of the buildings provided sunlight.
“I wouldn’t recommend it for every film, but it was right for this, since being free in nature is part of the subtext of our story,” he adds.
The RED IPP2 pipeline allowed Widmer to preserve the details in the skies and windows as well as in darker locations such as the prison cells.
“In my experience RED offers greater possibilities than other digital cameras to treat the image later. It gives you a larger range for post. IPP2 helps you to get better roll off in the highlights and to better contrast light with darkness.
“Our principal aim with the look was to achieve contrast,” he continues. “We didn’t work with a LUT and we didn’t have any primary colors, except for the red of the swastika flags, which deliberately stick out. With RED, therefore, we took care of contrast and proper exposure and manipulated the natural light as best we could knowing that as soon as we grade (at CinePostproduction Berlin), we could pull everything out in the grass, the sky, and the natural textures.”
Widmer was the sole operator, with first AC Alexander Sachariew performing “an amazing job” pulling focus from 3 inches to infinity.
“When you walk through a landscape, you see everything to the left and right, and even if you focus in front of you, you are aware of what is happening all around,” explains Widmer of the startling use of the ultra-wide angle 8R. “We didn’t want to make the choice for the audience for what they see. Even in close-up you see background and mid-ground. I think, I hope, this sense of depth imparts a lot of emotion to the characters.”
Widmer says he has seen the film four times and that the editing (by Rehman Nizar Ali, Joe Gleason and Sebastian Jones), sound mix (by supervising sound editor Brad Engelking) and James Newton Howard’s score move him on each occasion.
“Whatever [Franz] does he can’t do it right because the outcome will always be fatal,” he says. “This is such an important film, particularly for this time, because it concerns humanity and values that we all need now.”
He adds, “We shot so much material and so many beautiful shots that weren’t used but sometimes you have to kill your darlings. This is no time for vanity. The story is well told and I’m extremely proud to have been involved.”


Wednesday, 15 April 2020

When it comes to editing video, no-one wants a bandwidth hog

Copywriting for Blackbird
Since #WorkingFromHome has become the norm, the internet’s capacity to accommodate a sudden and sustained shift in demand is being stress tested. To be precise, it is residential fixed lines which are under pressure as never before, as we stream more daytime videos of live fitness training and children’s activities to the TV.
The streaming media giants are doing their bit for the common good. Netflix is dialling down the quality of its video by 25 percent in Europe for the next month to reduce the strain on the Web. Alphabet has followed suit by defaulting all YouTube videos to SD – not just in Europe but worldwide.
Between them, they account for a fifth of all traffic that passes over the internet. According to Sandvine’s Global Internet Phenomenon Report, Netflix accounted for 12.6 percent of all internet traffic in 2019 with YouTube close behind on 8.7 percent.
In figures updated just recently, the coronavirus outbreak has actually pushed YouTube past Netflix, sometimes by nearly double the volume.
The reason? Where normal consumption of YouTube is divided between mobile networks, work, or school networks, and random WiFi hotspots, this time it is all centred on the home.
As the report’s author neatly observes, rather than being distributed among many different locations, users are concentrated on a single network – it is equivalent to all highways but one being closed, and all traffic being routed through the one road that remains.
Netflix recommends a minimum connection speed of 25 Mbits/sec if you wish to watch its highest quality. In response to current circumstances it has effectively turned off its 4K UHD stream. Most of us won’t notice the difference.
Likewise, when videoconferencing is used to maintain office communications, most employers and staff are just happy to connect. Clearing the background of any incriminating detail is of more concern than video quality, provided buffering doesn’t impede communication.
If your business continuity is video however, then any reduction in standards can be crippling. Some cloud editing technologies require upwards of 50 Mbits/sec. Who has that piped into their home at the best of times, let alone with the competing demands of the rest of the family, the neighbourhood, and beyond?
Here’s where Blackbird scores. Its ultra-efficient codec converts your full-resolution media into a lightweight proxy, streaming content to remote users securely over the internet at just 2 Mbits/sec. Full-resolution content remains on servers wherever you want it, while content can be logged, clipped, edited, captioned and published very easily and quickly – all from Blackbird’s web platform by users, wherever they are.
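As a rough sanity check on that claim, the arithmetic below compares the data consumed per hour by a 2 Mbits/sec proxy stream against the 50 Mbits/sec figure quoted above for other cloud editors; the totals are simple unit conversions, not measured values.

```python
def gb_per_hour(mbits_per_sec: float) -> float:
    """Convert a sustained bitrate in Mbit/s into gigabytes per hour."""
    return mbits_per_sec * 3600 / 8 / 1000

for name, rate in [("Blackbird proxy stream", 2), ("50 Mbit/s cloud edit", 50)]:
    print(f"{name}: {gb_per_hour(rate):.1f} GB per hour")
# Blackbird proxy stream: 0.9 GB per hour
# 50 Mbit/s cloud edit: 22.5 GB per hour
```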
Since the Blackbird technology was originally conceived for the future of video over the internet, it’s no wonder it is light years ahead of the crowd. Customers including Deltatre, IMG, A+E Networks and the US Department of State have never experienced any latency with Blackbird because it has been super optimised for the cloud from day one.
To reiterate, this is all possible now on bandwidth as low as 2 Mbits/sec.
Why get a bandwidth hog when you can fly with Blackbird?

Tuesday, 14 April 2020

Managing Monster 8K Media

Creative Planet
For many, the idea of recording 8K video understandably conjures up images of unmanageable file sizes, long transfer times, huge piles of hard drives and slow proxy workflows, not to mention a black hole in the budget.
What’s more, with the biggest showcase for 8K TV—the Tokyo Olympics—delayed, the demand for content delivered in 8K is likely to stay in the bottle a little longer.
Leaving aside for one moment the fact that HDR and HFR are far more valuable than resolution to the consumer’s eye, there are benefits to an 8K production which an increasing number of projects are taking advantage of.
Mank, directed by David Fincher and lensed by Erik Messerschmidt, ASC, which was acquired in 8K using the RED Monstro in monochrome, and Money Heist, the Netflix drama whose fourth season is shot at 7K to accommodate HDR in a 4K deliverable, are just two of the most recent.
You can’t sell productions made in less than 4K to Netflix and other streaming services now. One day soon, some will mandate 8K from the start, and Netflix will already have its fair share of 8K content in the bank.
Even if the final output is only going to be 4K/UHD, shooting in 8K gives you many options in post that you do not have when starting in 4K. These include downscaling, cut/crop (pan and scan) and headroom for VFX.
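The reframing headroom is easy to quantify. Assuming an 8K UHD (7680x4320) master and a UHD (3840x2160) deliverable, a quick calculation shows how far you can punch in or slide the output window before any upscaling is needed:

```python
SRC_W, SRC_H = 7680, 4320  # 8K UHD master
OUT_W, OUT_H = 3840, 2160  # UHD deliverable

max_punch_in = SRC_W / OUT_W     # 2.0x zoom before the image must be upscaled
pan_room_x = SRC_W - OUT_W       # 3840 px of horizontal reframing room
pan_room_y = SRC_H - OUT_H       # 2160 px of vertical reframing room
print(max_punch_in, pan_room_x, pan_room_y)  # 2.0 3840 2160
```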
“Before making the decision to capture a project in 8K, producers and cinematographers need to consider the project’s long-term goals,” says Bryce Button, director of product marketing, AJA Video Systems. For instance, capturing in 8K makes a lot of sense if there will be future use for the material.
“And even if not currently working in 8K nor planning to move to 8K in the future, 8K acquisition can also be hugely beneficial for capturing background plates for VFX and for virtual sets in live broadcast,” Button continues. “Having a larger raster for the background gives producers the confidence that as they zoom, pan and tilt around the background video plate or set, they’ll be delivering the cleanest possible imagery.”
When selling your shots, for example to stock footage outlets, 8K still manages to command considerably higher prices and is much rarer, so there is a chance to sell more and make more money at the same time. 8K is still a unique selling point and as Barry Bassett, the MD at London-based camera rental house VMI puts it, “That means bragging rights.”

Acquisition Options

“If you can acquire in 8K, there is no good reason not to do it,” urges Jan Weigner, Co-Founder & CTO at broadcast and post software developer Cinegy. “This is the same question that we were supposed to ponder when the switch to 4K happened. Currently camera rental cost for 8K can be higher, but in terms of total production costs, your budget would have to be seriously constrained or require many simultaneous cameras to not be able to shoot in 8K.”
Producing in 8K is no different to 4K: The availability of hardware to capture, edit and store 8K makes the high-resolution format unavoidable. There are also now tools to answer the demand from HD SDR to 8K HDR, and everything in between.
“All the necessary parts of the 8K puzzle are in place,” says Atomos CMO Dan Chung.
All current NLEs handle 8K, at least if you are using the latest version.
The main costs that will hit your pocket are camera rental or purchase and the proper lenses to go with it. RED cameras are pretty much the only option for an 8K TV or feature workflow, but there should be healthy competition at the rental houses. Other options, such as Sony, Ikegami and Sharp 8K TV cameras, might use the latest 8K Canon lenses, and that can be costly.
Canon’s announcement, in February, of an 8K DSLR was a game-changer in that respect. “Not so long ago if you wanted to shoot 8K anywhere near affordably you had to shoot RED,” Chung remarks. “Now you can do so on a prosumer camera. Canon has clearly laid down a marker that others are sure to follow.”
Details including price, release date and even sensor are sparse but Canon says the full-frame EOS R5 will feature a blistering 20fps electronic shutter, dual memory card slots, and a Body Image Stabilization function to provide shake-free movies.
“There’s a misconception that 8K is vastly more expensive than it actually is,” says Button. “Generally, moving to 8K is an incremental cost, especially if you’re already working in 4K or have worked in stereo 3D. The biggest expense often comes with storage and moving large volumes of data, but the strides made by the industry to support 4K and S3D have provided a strong foundation to support the data needs that 8K workflows require.”

Recording and Monitoring Options

By nature, 8K is a massive format and is therefore inherently data-intensive. As such, in certain circumstances it may be advantageous to avoid shooting fully uncompressed 8K video and instead seek out codecs that keep data sizes manageable while preserving the balance between data size and perceived quality.
“As with any project, it’s crucial to always start with the end in mind,” advises Button. “If uncompressed footage is a necessity for everything from video effects needs to deep color work, uncompressed will always offer a range of advantages.
However, he notes, many projects – whether for broadcast or other delivery methods – may be better served using codecs specially designed for editing and grading, where media and workload savings on workstations can be incredibly advantageous.
“Apple ProRes, for example, has been tuned to specifically provide resolution details and color depth that are more than acceptable while keeping media bandwidth and storage requirements appropriate and minimizing CPU strain.”
In terms of monitoring, 8K displays are just beginning to surface and are still scarce, but as Weigner points out, so are inexpensive, cinema-quality, reference-grade HDR 4K screens.
“You could use UHD/4K monitors or TVs and just zoom in when necessary,” he says. “Brand-name 8K TVs sized 65” or even 75” can be bought well below US$3,000 and they usually have a decent enough image that can be tuned manually to meet certain TV production demands.”
AJA offers audio and video I/O solutions like the KONA 5 to facilitate downconversion and monitoring of 8K content on 4K displays in real-time, whether for editing or other tasks. AJA says it is working very closely with major NLE and color grading companies to ensure that its Desktop Software and KONA I/O cards provide a seamless 8K creative experience whether working on macOS, Windows, or Linux workstations.
For many projects, the codec will be defined by what the camera produces, unless one uses an external recorder.
The Atomos Neon line of cinema monitors and recorders comes with a 4K master control unit, but the firm has additionally announced an 8K master control unit, which can upgrade every Neon to an 8K recorder. The unit allows for recording and monitoring 8K video at 60 fps. Both ProRes and ProRes RAW are supported straight from the camera sensor.
“If you go 8K you need ProRes RAW since this allows you to get a manageable file size and all the benefits of working with raw data,” says Chung.

Shooting RED

Users of RED cameras will be familiar with Redcode RAW, the proprietary recording format. Redcode is always compressed – there is no uncompressed version – and is claimed by RED to be visually lossless; there is no chroma subsampling or color space attached to the R3D RAW files. Visually lossless is usually good enough for any type of post-production, including green screen work.
For example, using a Weapon Helium or Monstro at 8K 24fps to a 240GB RED Mini-Mag would mean an average data rate of 259MB/s and just 16 minutes of record time (per mag). Upping the compression to 10:1 would double the record time and halve the data rate. At the highest compression of 22:1, the figures would be 59MB/s and 69 minutes. You can calculate your own figures from the RED website: https://www.red.com/recording-time
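Those record times follow directly from mag capacity divided by data rate. The snippet below reproduces the figures above under that simple model (it ignores filesystem overhead, so treat the output as approximate):

```python
MAG_GB = 240  # RED Mini-Mag capacity

def record_minutes(mag_gb: float, mb_per_sec: float) -> float:
    """Approximate record time for a mag at a sustained data rate."""
    return mag_gb * 1000 / mb_per_sec / 60

print(round(record_minutes(MAG_GB, 259)))  # ~15-16 min at low compression
print(round(record_minutes(MAG_GB, 130)))  # ~31 min at roughly 10:1
print(round(record_minutes(MAG_GB, 59)))   # ~68-69 min at 22:1
```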
Netflix recommends a Redcode value of between 5:1 and 8:1. UK rental house Proactive has done some useful groundwork on recording 8K with newer RED cameras like the Monstro, Helium and Gemini.
It concludes that the majority of productions shooting Red use 8:1 as it offers “a fantastic balance between quality at the highest level, and practical data rates for the production to handle.”
The big surprise, though, finds Proactive, is that if you use the Monstro in 8K at 8:1 as your standard compression level, it actually becomes much more manageable than the raw formats from RED’s competition, even some ProRes formats. This becomes even more obvious when you go down to the 5K Gemini sensor. It found that at 8:1, the Gemini actually has smaller file sizes in 5K 16-bit RAW than the Sony Venice does in 4K XAVC-I, which isn’t a RAW format.

The Codecs

Cinegy’s codec, Daniel2, specifically targets 8K and higher-resolution production. Weigner claims it is up to 20x faster than Apple ProRes or Avid DNxHR.
“With Daniel2, 8K production is as fast and easy as HD production, albeit requiring considerably more storage,” he asserts. “But since the days of HD we have also seen storage costs decrease massively while storage speed, thanks to the advent of SSDs, has increased dramatically. Putting these factors together allows 8K production on inexpensive laptops or computers costing well below $2000 with standard NLE software such as Adobe Premiere.”
Weigner says that he edits 8K on a three-year-old Dell laptop without any issues or speed problems. This, of course, uses the Daniel2 codec accelerated by GPU inside Adobe Premiere and exported using H.264 or HEVC for distribution using Cinegy’s GPU accelerated export plugin.
“This may not satisfy high-end workflows, but will be sufficient for the average news, sports or even documentary production,” he says. “Editing these long-GOP formats is much tougher. But depending on the NLE, with the use of on-the-fly proxies or render caches and hardware acceleration using graphics cards, this does not need to be the case.”
Arguably, making a production in 8K will future-proof it to mitigate any risk and make it more attractive for sale in the long term.
“In the end this all depends on the type of production and how many cameras are needed and how much you will shoot using which codec and so on,” Weigner says. “Making clever decisions to begin with will reduce a lot of pain, headaches and ultimately cost.”

Smart Solutions for Quibi's Dummy

Panavision

Panavision and Light Iron help DP Catherine Goldschmidt pioneer a pipeline for Dummy, a made-for-mobile production requiring both 16:9 and 9:16 framing.

Short-form streaming-video platform Quibi provides premium content specifically designed for consumption on a mobile device. This presented the production of its first scripted show, Dummy, with a unique set of creative and workflow issues.  
In addition to the standard 16:9 aspect ratio, director of photography Catherine Goldschmidt had to create visuals for a vertically oriented 9:16 presentation. Working with marketing executive Mike Carter at Panavision, the team at Light Iron, and DIT Peter Brunet, Goldschmidt helped steer the Dummy team toward a new workflow that would satisfy both framing requirements through the entire pipeline, from on-set capture to final color.
Originally developed as a TV pilot, Dummy was rewritten into ten 10-minute segments to fit into the Quibi format. Produced by Deadbeat co-creator Cody Heller and directed by Tricia Brock, the series is a buddy comedy about an aspiring writer (Anna Kendrick) and her boyfriend's sex doll. “From my perspective, it was like we were shooting an indie feature,” Goldschmidt explains. “The script totaled around 100 pages broken into 10 episodes to be shot over 18 days on practical locations across L.A.”
Quibi’s platform allows viewers to use their smartphones in landscape or portrait orientation, seamlessly adapting the viewing experience whenever the phone is flipped. Technically, this requires two versions of the show with different aspect ratios and a single soundtrack that syncs with both, all of which streams simultaneously to the mobile device. “The brief was to frame in 16:9 and in 9:16 for two separate deliverables,” Goldschmidt says. “The initial recommendation from Quibi was to shoot in 16:9 and then crop-in for the 9:16 — resulting, of course, in a much tighter shot.
“When I thought this through, it was apparent that if you made the conventional wide and two-shot in 16:9, then as soon as the phone is flipped, all your careful framing goes out of the window,” she continues. “I just wanted to see if there was a better way.”
After consulting with peers who had previous experience framing content for mobile as well as with colleagues at Panavision, Goldschmidt opted for a new approach. “No one had yet done what we were about to do,” she says. “We were given a demo of the way the app was going to work and the tests they had shot. We then shot our own tests, facilitated by Mike Carter and [Panavision technical marketing director] Dominick Aiello, which we took all the way through the pipeline, including dailies at Light Iron and sitting down with final colorist Nick Hasson. Dual framing requires a lot of thought and not a little imagination.”
The first decision was to shoot full frame, working under the principle that it is better to work with more information than with less, particularly when it comes to the grade. 
“I was initially thinking I could shoot a common top — a shared headroom between the frames — which would make it easier for the boom operator or to light from overhead,” Goldschmidt recalls. “But the more I scouted, the more I realized how much floor space there would be in 9:16. How could you position a shot that wouldn’t destroy both frames or make one unusable? This was the main challenge for myself, the camera operators and everybody to wrap our heads around.”
That’s when Goldschmidt suggested using the director’s viewfinder solution Artemis Prime. “I have been an avid user of Artemis as a scout tool on my phone,” she notes, “and I was delighted that Mike Carter was able to lend us an Artemis Prime finder to scout with during preproduction in order to demonstrate and visualize the two frames properly with the director, the producers and all my collaborators.”
The Artemis Prime viewfinder allowed Goldschmidt the flexibility of customizing multiple frame lines in situ while scouting, and to preview the results on an iPad. “The ability to take a lens to the location, move around and truly picture the scene happening in both frames was essential for me, the director and everyone I was working with, allowing us to be confident in our choices. It was a delicate dance to satisfy both frames, which are pure opposites.”
For her camera, the cinematographer adds, “We selected the Sony Venice and cropped the 6K sensor to a 4K square — and within that, we framed for 16:9 and 9:16 in a crucifix orientation. This way both frames were the same resolution, and the subject size stayed relatively equal in both frames.  The two frames are unique, but both share the same middle portion. The crucifix position of the two frames and the ability to orient them the way we wanted was key.”
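A short sketch makes the geometry clear. Assuming a working square of 4032 px (an illustrative figure; the Venice’s 6K sensor is 6048x4032, and the production’s exact crop isn’t specified here), the 16:9 and 9:16 frames are centred in the square and share its middle portion:

```python
SQUARE = 4032  # side of the shared 1:1 working area (assumed for illustration)

def centred_rect(w: int, h: int, square: int = SQUARE):
    """Return (left, top, width, height) of a frame centred in the square."""
    return ((square - w) // 2, (square - h) // 2, w, h)

short_side = SQUARE * 9 // 16                  # 2268 px
landscape = centred_rect(SQUARE, short_side)   # 16:9 uses the full width
portrait = centred_rect(short_side, SQUARE)    # 9:16 uses the full height
# Both frames overlap in a central 2268x2268 region - the shared middle -
# so subject size stays equal whichever way the phone is held.
print(landscape)  # (0, 882, 4032, 2268)
print(portrait)   # (882, 0, 2268, 4032)
```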
Although she would be cropping the large-format image horizontally, Goldschmidt was still using the entire height of the sensor, and therefore needed lenses that would cover the sensor vertically. “I tested a range of different Panavision large-format lenses and chose Panaspeeds primarily because I liked how fast they were and what they did for the depth of field,” she says. “I was conscious of the final smaller viewing format and wanted to use depth of field to have more control over where to direct the audience’s attention.” She rounded out the set with Primo 70s, adjusted slightly to match the Panaspeeds, and a pair of zooms that, in the end, were rarely used.
To generate the three frame lines needed to compose in camera, the crew first turned on the Venice’s user frame line for the 1:1 square. Additionally, the A and B cameras each wore a Convergent Design Odyssey 7Q monitor, which served as frame-line generators for the 16:9 and 9:16 guides. “By placing the 7Q at the start of the signal chain,” Brunet explains, “we could pass those frame lines along to the operator’s eyepiece as well as to our Teradek transmitter for everyone else to monitor on set.”
Goldschmidt and Brock each had two 7” monitors, set up side-by-side to show the feeds from the A and B cameras. “I trusted Peter with the largest image/monitor on set, so he could let us know about focus issues or things we couldn’t see,” the cinematographer says. “I wanted the director and myself to have as close to the viewer’s experience as possible.”
Dailies were processed by Light Iron in Los Angeles, where Greg Pastore served as dailies colorist for the series. Deliverables included 1:1 Avid media for editorial, preserving both the 16:9 and 9:16 frames, as well as a 16:9 extraction for studio dailies review. When it came time to online the final episodes, the Resolve timelines were designed around the same 1:1 square input but were extracted and scaled from the original 6K Venice media to automatically create the two aspects from the editorial cuts. With the 1:1 square available in both editorial and the final grade, Goldschmidt was able to reframe the 9:16 image as necessary.
“Attending the final grade with Nick was essential,” Goldschmidt says. “We’d grade the 16:9 in DaVinci Resolve, then go back and transfer the same initial grade to the 9:16 so we weren’t starting from scratch. Then we’d tweak the 9:16 parts of the frame we hadn’t seen before. Sometimes we’d see things or change things and we’d have to go back again to the 16:9, but for the most part the process was fairly streamlined.” Working in the 1:1 square was equally advantageous to Hasson. He notes, “The 1:1 working area allowed all the tracking windows to automatically apply to both aspects because the pixel dimensions were the same, just extracted differently on the source frame.”
Dummy was also the first project on which Goldschmidt knew from the outset that she would be finishing in HDR. She notes, “We monitored SDR on set, so it was important for me to sit with Nick ahead of time to look at the tests and know where we were going.”
Visual effects, which were required for elements of the show involving the sex doll, added to the complexity. Rather than employing separate pipelines for the 16:9 and 9:16 aspect ratios, the all-inclusive 1:1 frame was carried through editorial and visual effects.
“I couldn’t have done it without Mike, Nick, Peter and the whole Light Iron team,” Goldschmidt reflects. “I’m very grateful I was able to stay with the project all the way through final color, because although we were making definitive choices on set, the very nature of our specific workflow allowed those choices — such as the framing — to be massively changed later.
“The ‘1:1 square’ methodology worked well for our show, and I was really happy that we went that route in the end,” she adds. “In the final grade, we all got a kick out of seeing the show in 9:16 and realizing the unique beauty of some of those frames, which I don’t think would have happened if we had merely cropped in for that version. The final result gives the viewer more in every frame, no matter how they choose to hold the phone — and in this case, more is more!”

Thursday, 9 April 2020

The show must go on

InBroadcast


A review of technologies enabling production companies and broadcasters to deliver high-quality content to viewers while optimising costs and resources and eliminating travel.
Whilst the world grapples with the coronavirus emergency, we are seeing not only how people modify their behaviour but also how businesses must modify theirs. Events are being cancelled, and travel is being scaled back and replaced with teleconferencing. Many corporations have sent staff home to work where it is possible to do so.
This is all made possible because we as a society already have much of the technology to facilitate flexible working. Give your office-based staff a laptop and access to the internet, and they are ready to sit in their home office or at their kitchen table.
“What has changed in the last few weeks is that working remotely is no longer a work-life balance argument, or a nice-to-have, it is now a question of business continuity,” says Jan Weigner, CEO, Cinegy. “The crisis is forcing companies to reevaluate their ways of working and finally act upon it. The technological infrastructure is in place and we have the tools ready to go – from acquisition over production to distribution, all can be handled remotely and / or in the cloud.”
With bases in the UK, mainland Europe, the Middle East, Australia and North America, Never.no’s teams are able to service regional customers without the risk of the virus affecting workflows or production needs. Bee-On, its cloud-based audience engagement platform, runs on AWS for access anywhere with a web browser and internet connection, “so there is no need for production teams to be managed under one roof,” CEO Scott Davies says.
“Individual projects can be pre-planned and packaged with audience generated content and dynamic visualisations prior to delivery / broadcast of live or pre-recorded content. Viewers continue to watch, more-so during a crisis, so content producers need to continue programming and deliver captivating content, with audience engagement a priority – Bee-On can help deliver this.”
He adds, “We’re seeing a need for packaged end-to-end solutions that utilise cloud production and seamlessly integrate ‘off-the-shelf’ graphics and compatibility with native broadcast graphics for a wide range of programming, such as news, live events and popular chat shows. Gone are the days where production is managed and delivered from one hub.”
Demand for Quicklink’s video call management system has never been higher, according to CEO Richard Rees. The firm is releasing a completely browser-based cloud supported workflow with automated Panasonic PTZ camera and lighting.
“A journalist could sit at home and interview someone located elsewhere live to air while a colleague edits the video online (in Adobe Premiere) and in realtime,” says Rees. “That edit could be passed to a control room for wider channel distribution. The whole environment is now virtualised. We believe this is the future.”
VSN has added new capabilities for remote interoperability to its VSN NewsConnect web plugin for news production. These were on the cards for a NAB release, but recent events have made them more relevant.
VSN NewsConnect, which brings together a number of third party tools required for news production, now enables users to control multiple studios in different locations, even if the systems used in the studios are different.
 “What this means is that a journalist can simply send a news item to any studio and NewsConnect will automatically ensure that the delivered content matches the format requirements of the receiving devices,” said Patricia Corral, marketing director. “This remote interoperability is very useful in enabling news to be repurposed to the requirements of local broadcasters without worrying about technical compatibility.”
Pixel Power’s work is currently mainly based around large projects for refurbishment or replacement of playout and production infrastructure; projects with long timescales, so the current viral outbreak isn’t yet causing any major changes in demand.
“Our technology can be virtualized and deployed in data centre or public cloud, with remote access operation from anywhere in the world,” explains James Gilbert, CEO. “This is not something that can be done as an impulse reaction to the current situation - this capability has to be architected and designed into the product from the beginning.”
Once the outbreak subsides, the evolution of remote, decentralised working practices is likely to accelerate. “The industry is already moving towards remote, decentralised working practices because of the ecological and economic benefits,” Gilbert says. “The ability of staff to work from any location is core to that concept and whilst it is an obvious advantage during the current outbreak where staff may be required to, or choose to, work from home, I do not feel the pace of change will be accelerated - there are already enough drivers for it.”
Collaborative workflows with someone sitting next to you or on the opposite side of the world are in the DNA of storage solutions specialist GB Labs.
“We’ve fostered cloud integration for years and, therefore, have always offered a remote workflow,” says Dominic Harland, CEO/CTO. “Obviously, there will be many other challenges with this ongoing situation, but GB Labs is confident that accessing content securely and quickly will not be one of them.”
He thinks current events will accelerate solutions to enable a faster response to any future crisis. “The next two to three months is not long enough to develop, test and bring to market anything exceptional, but we are definitely looking at developing new products and new solutions. Whether this becomes a real-world advantage that the customer will want to buy after the outbreak subsides, well, that’s a different question.”
Each Bridge Technologies product has transformative potential in the field of remote broadcast and production, but none more so than its Widglets API. This leverages the full value of the data collected by its VB440 – video, audio and ancillary – not only for network performance monitoring but also for a multitude of other workflows and applications. Full-motion, colour-accurate, ultra-low-latency video, for example, can be made available from any source to any application or user.
“Being browser based, all that is required is a laptop and a network connection,” explains Tim Langridge, Head of Marketing. “Each geographically dispersed user receives feeds from multiple cameras with multiple waveform vectorscopes and streams via a single HTML5 video monitor view. Not only does this result in incredible technical improvements in production and improved decision making, but it also logistically frees up immense amounts of room in OB vans or MCRs – making them more efficient, affordable and adaptable.”
Blackbird has seen a significant increase in sales enquiries since the containment phase began. “Enterprises need effective technology solutions to enable their workforces to operate efficiently whilst working at home or remotely,” says CEO, Ian McDonough. “Blackbird is a fully featured video editor available in any browser and can operate at low bandwidth. It's the perfect solution for the majority of live and file-based video production workflows.”
Essentially Blackbird can be used by anyone, any time, anywhere and this flexibility is enormously attractive to enterprises looking to drive massive productivity efficiencies through their operations. It also runs on bandwidth as low as 2Mb/s which is ideal given the pressure in traffic over the network – a situation which has caused Netflix and YouTube to throttle back their bitrates.
“As teams become used to de-centralised video production and enterprises enjoy significant infrastructure savings together with a flexible globally distributed workforce untethered to source content, we anticipate an accelerated adoption of Blackbird,” McDonough adds.
For live sports workflows, there are few production partners more experienced than Gravity Media. In February it wrapped its 2,000th remote production, in this case a Pac-12 Networks broadcast of the USC Trojans’ 65-56 win over the Washington State Cougars.
This impressive number includes ‘At Home’ centralized productions that were undertaken under the Proshow Broadcast (acquired by Gravity Media in July 2018) and Gearhouse Broadcast brands.
The benefits of this remote approach are obvious, with REMIs offering a cost-efficient modern workflow that is operationally flexible and durable. By centralizing the control room, video switching, audio mixing, graphics, replays and show production can all be done ‘At Home’ in the broadcast centre. This means that smaller, more affordable purpose-built mobile units can be used at the venue. Only video and audio acquisition hardware such as engineered cameras, microphones and announcer headsets, as well as comms hardware, a transmission interface and engineering support are required on site.
Company president Michael Harabin says, “The potential for creating quality programming at an attractive price has never been greater, and we now have over 2,000 proof points that showcase its consistent effectiveness and our ability to deliver.”
Sweden’s Intinor specialises in helping companies overcome the challenges of remote production. “As we are currently in a lock-down on travel for personnel, the benefits of remote production could be felt all the more keenly,” says Daniel Lundstedt, regional sales manager. “Instead of having to arrange for operators to travel on location, broadcasting companies could instead work with local talent, with equipment being all that needs to be shipped rather than staff members.”
Intinor is already able to make going live, from anywhere, very easy, without marshalling a small (but expensive) army to make it happen. It’s all down to the “supreme mobility” of its Direkt link remote production pack. With an Intinor Direkt receiver or router in a control room, captured audio and video from a camera or mixer connected to a backpack can be streamed over the public internet to a Direkt router and then re-streamed using other protocols, transcoded, or output to SDI or NDI.
Mobile Viewpoint has a heritage in remote production solutions, especially for live streaming. CEO Michel Bais says the company has proven it can reduce costs for production companies by removing the need to send a wealth of resources to an event.
“As we see companies trying to reduce their carbon footprint, it has emerged that it is not only cost savings that are driving these innovations,” he tells InBroadcast. “In line with this philosophy, we have developed remote cameras that allow sports games to be live streamed but without the need for a camera crew or an onsite production team.”
With the IQ-Sports Producer, an entire field of play can be recorded with a single 4x4K camera, while AI is used to create a virtual zoom of the play by automatically following players and the ball. Games can be live streamed in real time and in different format versions depending on whether it is for web streaming or for higher-quality broadcasts requiring HD-SDI workflows, all at a fraction of the cost of an on-site production team.
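A toy version of that virtual-zoom idea is sketched below: smooth a stream of tracked ball positions so the virtual camera doesn’t jitter, then clamp an HD crop window inside the panoramic raster. The panorama size and smoothing factor are assumptions for illustration, the tracker itself is taken as given upstream, and none of this is Mobile Viewpoint’s code.

```python
PANO_W, PANO_H = 7680, 2160  # stitched panoramic raster (assumed size)
CROP_W, CROP_H = 1920, 1080  # the HD 'virtual camera' window

def clamp(v: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, v))

def virtual_camera(track, alpha: float = 0.15):
    """Yield top-left crop corners following smoothed (x, y) ball positions."""
    cx, cy = track[0]
    for x, y in track:
        cx += alpha * (x - cx)  # exponential smoothing keeps camera moves gentle
        cy += alpha * (y - cy)
        left = clamp(cx - CROP_W / 2, 0, PANO_W - CROP_W)
        top = clamp(cy - CROP_H / 2, 0, PANO_H - CROP_H)
        yield int(left), int(top)

corners = list(virtual_camera([(1000, 900), (1400, 950), (5200, 1200)]))
```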
vPilot is another AI-driven solution from Mobile Viewpoint that can be used for remote newsrooms. A combination of cameras using 3D sensors and audio cues means round-table discussions can be set up without the need for a camera team or an onsite director. “Both IQ-Sports Producer and vPilot can be managed remotely with cameras that can be semi-permanently installed to create quality and cost-effective programming,” Bais says.
Net Insight’s plug-and-play solution Nimbra extends the production workflow to reach remote venues anywhere on the globe, with the same ease of operations as for traditional in-house productions.

Nimbra is a high-quality multi-service media transport over IP platform supporting both native video and audio in addition to standard IP/Ethernet. Built-in video processing, low-latency JPEG 2000 and MPEG-4 encoding, as well as unique features for equipment control and synchronisation, make it a great choice for remote production. Users include SVT and TV2 Denmark.

“100 percent reliability is key for remote live production and our solution offers mechanisms to assure the content is delivered with perfect quality regardless of network issues,” the company states. “Enterprise customers can use the solution to deliver live video content to support internal communications and working remotely.” 
All of Cinegy’s software solutions lend themselves to flexible working practices. “We have long been a proponent of virtualization and IP – and what is the cloud if nothing more than using someone else’s computer, hosted somewhere else?” says Weigner.
“Give your office-based staff a laptop, access to the internet and access to Cinegy software – locally or in the cloud – and they are ready to remotely produce content using Cinegy Desktop, remotely play out content with Cinegy Air, and remotely monitor channels with Cinegy Multiviewer. Whether our customer is at home or at another location and needs to set up a pop-up channel in the cloud doesn’t matter.”
“Our customers who already embraced our workflows are more prepared and ready to deal with the new business practices that are emerging,” he argues. “Being ready for this business process change is markedly harder than being ready for a technology change. In this case, circumstances are dictating that there must be change. The barriers are being lowered and it is time to embrace it.”