Thursday, 5 November 2020

Taking the lead on sustainability now

Copy written for Blackbird 

Even as the world concentrates on living with Covid-19, an even more urgent threat to our existence looms: the climate crisis.

https://www.blackbird.video/uncategorized/taking-the-lead-on-sustainability-now/

Microsoft is one of a growing number of organisations to have announced its intent to be carbon negative by 2030. It’s an ambitious plan which the video industry would do well to heed.

According to Bafta’s production sustainability body, Albert, an hour of TV typically generates about 14 tonnes of CO2. That’s just production: it doesn’t include transmission or distribution. To put that into some context, an hour of TV has the same impact as running three homes for one year.

Change is an obligation

The environment is a hot-button issue for consumers. Sustainability is fundamental to how organisations in all sectors should operate going forward. For instance, every time data is moved from A to B there is a carbon cost.

To put some statistics on this: Cisco believes global internet video traffic will increase by a third each year through 2022 with live internet video, led by platforms like Twitch and YouTube, growing at an astonishing rate of 73% in that period. Video streaming will constitute 79% of all mobile network traffic by 2022.
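As a back-of-the-envelope check on what those compound rates imply, here is a minimal Python sketch. The starting volume of 1.0 units and the three-year horizon are illustrative assumptions, not Cisco figures; only the growth rates come from the forecast above.

```python
# Compound annual growth: volume after n years at a fixed yearly rate.
def compound(volume, annual_rate, years):
    """Return `volume` grown for `years` at `annual_rate` (0.73 = 73%)."""
    return volume * (1 + annual_rate) ** years

# All internet video growing by a third each year vs live video at 73%.
total_video = compound(1.0, 1 / 3, 3)
live_video = compound(1.0, 0.73, 3)

print(round(total_video, 2))  # ~2.37x the starting volume after 3 years
print(round(live_video, 2))   # ~5.18x - live video more than quintuples
```

The point of the comparison: even a "modest" 33% annual rate more than doubles traffic in three years, which is why the carbon cost of moving video compounds so quickly.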

All of this has a direct negative environmental impact across manufacturing, energy use, cooling, content transmission, storage and caching.

Technology can help. Reducing the amount of video (data) travelling across the internet – whilst maintaining the viewing experience – can eliminate the need for heavy-duty bandwidth connections and for bespoke hardware.

Improvements in videoconferencing and telepresence can reduce the carbon-heavy cost of corporate travel. Remote, distributed, produce-anywhere production is not a nice-to-have concept but an essential business continuity and environmentally conscious workflow for any content producer.

It’s clear to us that the broadcast industry, from international sports properties like Formula 1 to global news organisations like the BBC, is taking sustainability seriously. 

An ultra green, sustainable technology built for the new world

For its part, Blackbird is committed to conserving natural resources in all that it does – delivering real, tangible environmental benefits to customers and society.

This is hard-wired into its corporate policy and includes embedding relevant environmental, social and governance matters into its culture and work practices.

Tech-wise, Blackbird’s solution means no new hardware manufacture since the platform works in any browser, eliminating the need to buy bespoke editing hardware, no matter the scale of production demand.

There’s no packaging either. Being software-based means Blackbird is available digitally, eliminating all hardware installation at a stroke. Furthermore, power, storage, compute and transport energy inefficiencies are shrunk from a production’s footprint overnight thanks to the ultra-efficient cloud-based Blackbird codec.

These credentials set the standard for all other video production applications to follow.

We are proud to be recognised for this endeavour by being shortlisted for the Video Tech Innovation Awards 2020 in their Sustainability category.

Long before the current crisis, the prospect of climate catastrophe was presenting the media and entertainment industry with a do-or-die ultimatum. Coronavirus has provided us all with a once-in-a-generation chance to reset the agenda and take forward our collective responsibility to reinvent content and broadcast with environmental goals front and centre.

Wednesday, 4 November 2020

Using the Arri ALEXA 65 to adapt a classic

RedShark

A literary classic of gothic romance, adapted by Alfred Hitchcock into a celebrated Best Picture Oscar winner, is not necessarily a remake you would associate with the director of Kill List – but Ben Wheatley is not one for being pigeonholed.

https://www.redsharknews.com/using-the-arri-alexa-65-to-remake-a-classic

With Kill List, Sightseers and A Field in England the British auteur was staking out a startling career in horror and macabre comedy, but with his 1970s-set JG Ballard adaptation High-Rise, kinetic epic shoot-out Free Fire and kitchen sink drama Happy New Year, Colin Burstead, Wheatley has expanded his repertoire.

Rebecca, scripted by Jane Goldman (X-Men: Days of Future Past), is made for Netflix with probably Wheatley’s biggest budget to date. It’s a sumptuous period drama that gives him an opportunity to direct a film that harks back to the days of studio-bound Hollywood glory.

“I think Ben was quite keen on legitimacy,” says Laurie Rose BSC, who has shot all eight of Wheatley’s features. “Ben produces and writes a lot of his own work but I think he took this on because it was working with someone else’s script and to make progress in terms of budget in what he is allowed to do.”

The filmmakers were also attracted to a story that may be less familiar to audiences than they think.

“We’ve all got a sense of the Hitchcock version in our minds but I had to revisit it,” says Rose. “It is astonishing and beautiful but out of date in terms of the writing and the way that it was made in the studio system.”

Wheatley says that what he really loved “was that du Maurier had a scheme, which was to smuggle in something quite sinister inside the wrapping of something that looks like a romantic story. You’re lulled into that false sense of security before it’s pulled away from you.”

They had to steer away from emulating Hitchcock’s 1940 version starring Laurence Olivier so as not to infringe copyright. “Our version is far more faithful to the darkness of the book,” he says. “In our Rebecca, there’s a modern element to the gaslighting story (deliberately causing someone to doubt their sanity) wrapped in a tale of posh people with servants and big houses - and it goes wrong.”


Rebecca actually immerses the viewer in a Russian doll of genres. What begins as a sweeping romance in sun-kissed Monte Carlo moves into darker psychological territory after the newlyweds, Maxim de Winter (Armie Hammer) and his second wife (Lily James), arrive at Manderley — his imposing estate (and the object of one of literature’s most famous opening lines) — where the young woman finds herself battling the haunting legacy of Maxim’s first wife, Rebecca, not to mention Mrs. Danvers (Kristin Scott Thomas), the sinister housekeeper bent on keeping her former mistress’s memory alive.

Rose’s approach was to capture the Hollywood studio-bound elegance of the 1940s. He recalls two particular scenes — at polar ends of the film — that showcase this in different ways.

“We worked with a really beautiful light in France, so it was a real opportunity to revel in that sunshine,” he says. “The scene on the beach where they’re talking about bottling memories, just showed the intimacy of things. The sun was very low and shallow and using the sun flare was just beautiful. Everything just about hit at the right time.”

His other particular favourite scene was a showdown with the main characters in the library.

“It was a shot that was so indicative of a film from the forties. It was done in long single takes. We were on a dolly, basically running live. So, as they came into the room, we backed up and then we moved in and would move out and across. It was hyper-mobile, almost like a live performance very much like the scenes you might get in a Hollywood film from the ‘40s. We hit these very precise marks that kept the dynamic of the scene up for actors. The dialogue was rapid fire and it was all just beautifully choreographed.”

This more formal style, working with premium production design, is unusual for Rose and Wheatley, who have tended to favour (partly because of limited budgets) a more run-and-gun handheld approach. Rose seized the chance to shoot with the giant sensor of the Arri ALEXA 65.

“I knew I wanted to shoot Arri 65 digital with full-frame DNA lenses and [producers] Working Title were keen to support it. Our B-cam was an Alexa LF, partly so we could save a bit on data [compared with using the 65 as his sole camera].”

The Arri ALEXA 65 shoots 6K and the LF is a 4.5K sensor. Rose shot 5.5K on the 65 to reduce the difference in resolution between the two.

“There was a wide portrait 50mm that I liked - sweet in the centre and softened out super quickly around the edges. Also a red-dot variant 80mm I fell in love with for the indescribable way it fell off and the way it flared. I wore it at every opportunity. You end up with a real voice in the lenses.” 

He doesn’t like to bake in a hard look, preferring to remain true to the costumes, landscapes and interiors and exteriors of the heritage house locations.

“I don’t tend to lay over anything unauthentic. I always monitor with a very, very light touch Rec. 709 because I know I can shoot within that colour space and know I can add a lot in post if needed. It means my neg is safe and I know I can hold my highlights and there’s plenty in the shadows.”

‘Last night I dreamt I went to Manderley again’ is one of the most evocative lines in literature. If ever there was a location as character, it is this. But with so many of England’s most famous estates having been featured in numerous productions over the years, the task of making Manderley seem new and never-before-seen was tricky. Production Designer Sarah Greenwood designed a composite of eight different country houses and estates, including Cranborne Manor (Wiltshire), Hatfield House (Hertfordshire), Mapperton House (Dorset), and Osterley House (Isleworth).

“Manderley had to be a mysterious behemoth of a house that no-one could begin to make sense of,” says Rose. “It made it a little difficult to track shots because we had to Frankenstein someone’s movements. The actor walks out of a room in one place and into a room in a different location but nobody will have the blindest idea.”

Rebecca’s own boudoir is a “fantastical, fantasy space” with higher ceilings, more gothic than the rest of the house, decorated with art deco, dark wood and silver and tarnished mirrors. “It’s dreamy and ethereal,” he says.

Wheatley’s upcoming projects include Tomb Raider 2 with Alicia Vikander.

Tuesday, 3 November 2020

Bringing back the crowds

Broadcast

When elite sports resumed without spectators, broadcasters and major leagues weighed the value of adding back artificial crowds. While US sports have gone full Disney with animated packed houses, Europe’s broadcasters have been more conservative about blurring the boundary between fact and fake. Now that we know fans could be excluded for the entire 2020-21 season, the goal is to increase the community feel and the connection between clubs and broadcasters on the one side and their fanbase, particularly paying subscribers, on the other. 

p31 Winter issue https://edition.pagesuite-professional.co.uk/html5/reader/production/default.aspx?pubname=&edid=16d6fab9-b66c-473a-b459-75179ed12a2a

Virtual fan experiments have evolved from rudimentary cardboard cut-outs in the stands to pitchside LEDs displaying thousands of fans over Zoom, pioneered at a Danish league match in May.  

The concept has become more sophisticated. The NBA and NFL, for example, are using Microsoft Teams to live stream fans using webcams and smartphones onto giant video boards at the venue. The NBA settled on a bank of 320 virtual spectators on 17-foot video boards located behind the teams’ benches and at the ends of the court as the best visual experience – neither too big nor too small. 

Fans are also encouraged to ‘cheer’ by clicking on a logo of their team on the NBA app, an action represented by graphics on screens at the sport’s Covid biosphere in Orlando – incidentally part of the Disney World resort. 

Since the NFL’s return in September, select games have been accompanied by a Fan Mosaic of 30 home club fans displayed on stadium screens and in the broadcast. These fans see a dual-screen display of the live game next to a gallery view of fellow fans. Each fan video feed is isolated and mixed into the Fan Mosaic display.  

The WWE has gone further. Up to a thousand virtual fans are dropped into a giant matrix and displayed on screens that form the entire backdrop to the wrestling bout.  

“Perhaps more than any other sport, wrestling depends on that back and forth interaction with the audience. Without that you miss out on the whole dynamic,” says Tom Shelburne, Director of Sales & Business Development, Pixotope - part of The Future Group, whose technology processes and synchronizes the individual video-conferenced feeds. 

Beginning in July, Fox Sports’ MLB broadcasts have carpeted stadia like Wrigley Field in CG crowds. The technology unites camera tracking from Sports Media Technologies (SMT), virtual graphics designed by Silver Spoon and real-time graphics processing from Pixotope, built on Unreal Engine. The result allows thousands of fans to be dynamically created, controlled and synched with the live pictures with between two and eight frames of delay. 

“Fox acknowledge this is not real but what they’re doing is giving the viewer a much better experience,” says Shelburne. “We can add 42,000 unique individuals. We can change the colour of their clothing, make them sit, stand, jump, cheer, do high fives or wave. We can alter the density of the audience. Even the shadow of the sun during the live game is taken into account to light the CG crowd.” 

Reaction has been mixed. Much of the response on Twitter is consistent with this from @RomeVanLara2: ‘This is so stupid. Same with fake noise. Are we incapable of dealing with reality of empty stadiums?’ 

Others are more appreciative of the effort. “If Fox Sports didn’t see a positive outcome to it they would have stopped after the first couple of games,” says Shelburne. “Instead, they carried it over to the NFL [Fox is adding the same tech to its NFL broadcasts] and into their entertainment division as well. So, they are seeing a real value to this. 

“We’ve seen our competitors try to deliver this virtual fan experience and it comes across like Wii characters. Anyone can deliver a virtual experience but not everyone can make it photoreal. If you move the camera around the field and the crowd is misaligned even slightly, the illusion falls apart.” 

Fox is also hoping the augmented reality can open new in-game advertising opportunities. “We’ve not gone live yet, but we have tested how sports can make a return on investment,” says Shelburne, who admits that the technology is expensive. “For example, we can make everyone wear a red jersey and white hat in Coke branding or have them flip a card over to spell out a branded message. We could have a Coke bottle virtually pop out of a Jumbotron and have soda flowing over the stadium.”  

Elsewhere, NFL sponsor Bud Light has created a special Showtime cam which, after a touchdown, puts the spotlight on fans from the Fan Mosaic and fan tweets on LED screens installed at each end zone. 

Similar intimate relationships between the game and its advertisers (especially alcohol brands) are still kept at arm’s length in Europe, which is one reason why cricket, football and rugby remain shorn of the CG razzmatazz – for now. 

“The UK market is quite entrenched in what it’s used to seeing and doesn’t respond well to what they perceive as more gimmicky simulations,” says Nick Moody, Executive Producer, Sunset+Vine. “US sports tend to be a lot more driven by data and graphics and therefore busier on screen. Audiences there are more willing to accept virtual fans into their production than we are culturally in the UK. It’s expensive to do it well and I’m not sure it really gives us much of an enhanced experience.” 

Before ‘Project Restart’ the Premier League were presented with a number of options by the broadcasters to shake up presentation. These included micing up the referees (as in rugby union), 360-degree replays, subs being interviewed during match play and a new tactical camera feed.  

“Not many came to fruition for various reasons,” says Moody. “We’ve managed to get an extra camera in the tunnel for EPL matches but micing up the captains at coin toss didn’t really give us anything. Nor did having the managers record a short pre-match piece to iPhone on the coach journey. Ultimately, as is evident by the regular positive virus tests among EPL players and staff, it’s about keeping everyone safe. Adding too much infrastructure risks undermining that.” 

That’s also the reason why BT Sport’s trailed introduction of a flagship 8K live service on its Ultimate tier has been delayed. “We are confident we can do it,” says Jamie Hindhaugh, Chief Operating Officer. “There will be more 8K events coming soon.” 

The broadcaster’s creative response to Covid included installing ‘Fan Parks’ as part of its behind-closed-doors live matchday presentation. Feeds of thirty-two fans (16 per club) are displayed in its studio for presenters and pundits to interact with live. Another initiative, Watch Together, allows fans to watch, see and chat with friends in a split screen view during the match via the BT Sport app (similarly, Sky Sports offers Fanzone). It has also taken audio stems of manager and player chat from the live feed to build into highlights reels. 

“I think our audiences tend to expect more authentic and natural environments,” says Hindhaugh.  

Faced with scepticism in some quarters, audio simulation has proved the most enduring success. “It’s counter-intuitive but there’s no doubt the game looks better with enhanced audio,” says Moody. “The production team have a set of effects (sourced from EA Sports’ FIFA video game) for events like a near miss or goal and another set of archive audio atmospheres specific to each ground, each team and even historic games between each club. These effects are mixed live.” 

Given the choice of enhanced audio or ‘purist’ sound, 70-80% of BT Sport viewers are choosing the former. “I was wary about enhanced sound but it’s been a clear success,” says Hindhaugh. “When you’re watching at home you tend not to always focus on the TV. That sound helps signpost when to look up. 

“We are always looking for ways to bring fans closer to the game and be part of the conversation. The longer fans aren’t allowed back to grounds, the more important it is for us to give them options to enjoy as rich a connection as possible.” 

Monday, 2 November 2020

Behind the Scenes: The Trial of the Chicago 7

IBC

Phedon Papamichael ASC unlocks the visual puzzle of Aaron Sorkin’s politically charged legal drama.

https://www.ibc.org/trends/behind-the-scenes-the-trial-of-the-chicago-7/6943.article

It’s 1968 and the United States is in turmoil. Martin Luther King Jr. is gunned down by an assassin in Memphis, Robert F. Kennedy is shot and killed in LA. The Vietnam War is at its height, with over 30,000 American casualties and 1,000 more US troops killed each month. In August, scores of antiwar protestors gather outside the Democratic National Convention in Chicago and are tear-gassed and beaten by police and the National Guard. 

The following year, eight antiwar activists are put on trial for conspiring to incite a riot facing charges brought by a new Republican administration aiming to stifle and silence the movement. 

Sound familiar? “The script didn’t change to mirror the times, the times changed to mirror the script,” says Aaron Sorkin in the production notes for The Trial of the Chicago 7, which he originally wrote in 2007, intending it for Steven Spielberg. “Just as Fred Hampton (leader of the Black Panthers in 1968) was killed by the police in the middle of the trial, George Floyd, Rayshard Brooks, Breonna Taylor, and countless others are similarly tragically killed [today]. Suddenly protestors are met with tear gas, riot clubs.” 

Sorkin has said that until Spielberg brought his attention to it he was unaware of the Chicago 7, but the volatile cocktail of injustice, protest and repression couldn’t be more timely. Netflix has intentionally brought forward the release ahead of the US election. 

The West Wing creator has lassoed a heavyweight cast including a trio of Brits in Eddie Redmayne, Sacha Baron Cohen and Mark Rylance, alongside Jeremy Strong, Watchmen star Yahya Abdul-Mateen II and Michael Keaton. 

Phedon Papamichael ASC, Academy Award-nominated for Alexander Payne’s Nebraska, is the film’s cinematographer. He says the biggest challenge was working with a script that was intricately structured and dense in dialogue but visually imprecise. 

“Aaron’s writing is very nonlinear,” Papamichael tells IBC365. “Not just jumping forward or back in time but constantly criss-crossing these timelines. We’d sometimes just record one line and then move to another line that won’t necessarily be one that progressively follows from it.” 

The actual trial took place over six months (starting September 1969) and involved 200 witnesses. In telling what is essentially a legal drama, the trick was to find ways of making the story cinematic. 

“We’re creating this visual puzzle and filming little impressions and vignettes to support certain lines and moments and beats of the trial in order to help break it up visually so you don’t feel like you’re stuck in a courtroom.”  

Sorkin had structured the courtroom scenes and the flashbacks very specifically, with most of the intercuts written in. Nonetheless, tracking this on the page proved tricky when it came to the logistics of filming. It fell to Papamichael to create and organise the coverage – the reaction shots and atmospheres which are needed editorially to tell the story. 

“With Aaron it’s all about the word and the structure of this overlapping puzzle that he sees in his head,” says Papamichael. “He is not a visual director like James Mangold or David Fincher. He relies heavily on the cinematographer and designer to take care of all that.” 

He continues, “If he saw the actor speaking the line on camera he was good with that take. But I know we need reactions of the jury and of the prosecutor to build the scene. He doesn’t even look at the screen of the monitor on set. He literally closes his eyes and just listens to it. For Aaron, it’s all about the rhythm. He knows exactly what he wants; you show him that and often he’ll think he doesn’t need anything else – which may be true most of the time – but you have to find a way to shoot so it’s not just two hours of talking heads in a courtroom.” 

Since the script didn’t spell out the chronology of the trial, Papamichael worked with the script supervisor to strip the flashbacks and jumps forward into a calendar in order to schedule the shoot.  

“Since the trial takes place over such a long period I wanted to convey that passage of time. So on the opening day of the trial I play it sunny, and as we go through the winter months I play it moodier. I had to assign different moods and assign specific grammar for specific witnesses. So certain scenes would be better if it were raining, others if it were overcast.” 

Lacking the budget for a huge crowd of extras on call (the film cost $35 million, which is moderate for a film of this scale) Papamichael first shot the courtroom scenes from one direction and then reshot them from a different angle when they had the extras to fill the room.  

The courtroom scenes are more composed and static in contrast to scenes set outside the court which are shot handheld and are energised by documentary footage of the riots. 

“Recreating the protests at the actual locations in Chicago enables us to create a structure that gets the movie away from being a traditional courtroom drama,” Papamichael explains. “Since we are also limited in terms of extras in the crowd [there were around 10,000 people on the streets of Chicago in 1968 and the film had 175 on set] the best way to handle that was to merge our cameras into the scene like an actual documentary crew and not create big objective wide shots.” 

Papamichael filmed with the same camera and lens package that served him so well on sixties-set drama Ford v Ferrari (aka Le Mans ’66). This was an ARRI Alexa LF and Mini LF combined with Panavision anamorphics specially configured to fit the large format sensor. 

“Because all the characters have their own agendas, I wanted to be able to connect them and get everyone’s reaction. The large format really lent itself in the 2.40:1 aspect ratio to covering these multiple characters [defendants and lawyers] sitting in a row in the court room.”  

The film is intercut by editor Alan Baumgarten, ACE (editor of Sorkin-directed Molly’s Game), with archive of the events culled from televised coverage, amateur Super 8 and police film of the riots. 

Some of the footage was taken from Haskell Wexler’s Medium Cool, a cinema verité-style drama that takes place in Chicago in the summer of 1968 and combines fictional and non-fictional content. The film served as one of Papamichael’s inspirations during the shoot.

“The footage that we generated in our recreation of the riots is more kinetic and documentary-like, but we’re not intentionally making our characters part of a documentary,” Papamichael says. “This is still a movie. 

“In the film, we show a 4-second vignette, then you are back in the courtroom and a guy says two lines, then you move to documentary found footage and then you go to another timeline. The pacing of Aaron’s storytelling really lends itself to be able to throw all these pieces together. It’s not like we have a 20-minute sequence of the riots.”  

The large ensemble cast assembled for Chicago 7 proved a handful. Papamichael contrasts the experience to working with director James Mangold.   

“We don’t like to preconceive too much… we look at what is happening in the moment. So, with Joaquin Phoenix on Walk The Line we never really knew what we were going to do – the inspiration comes from the performance. 

“Every director is different,” he adds. “In Chicago 7, I ended up in charge of blocking the movie and assigning who gets what shots because Aaron didn’t really want to talk to them about that. So, it’s like ‘don’t I get a close up?’ and ‘isn’t this scene about me?’ – and I was very much caught in the middle. Which is fun… I love working with actors, but there’s lots of strong individual personalities on this one.” 

He elaborates: “Some actors preferred to improvise. Mark Rylance is a theatre director and actor with a very different approach – he’s very focussed. Jeremy is a super-method actor and Sasha is just goofing around between takes. So, bringing that crazy group together was hard. Having to capture it all and make it work for Aaron, and for the actors not to be super frustrated… that was the particular challenge on this movie.” 

The Netflix effect 
The film began life at Paramount, intended for theatrical release. When Covid struck and cinemas closed, the studio off-loaded it to Netflix, which brought forward its release. 

“I haven’t shot for TV in 30 years and under different circumstances I’d be disappointed this was on Netflix. I shoot for the big screen and I want people to experience it in theatres,” says Papamichael. “But probably more people will view it this way on Netflix. 

“There was definitely a great urgency to get it out before the election. It’s an advantage to our film in one way that everything that’s going on [in the US] makes it relevant, but it’s also so tragic.” 

Friday, 30 October 2020

10,000ppi OLED displays could be here soon

RedShark News

VR hasn’t exploded into everyone’s consciousness at quite the speed envisaged, with everything from clunky, wired headgear to poor quality content to blame. Global spend on VR and AR is reckoned by analyst IDC to be $18.8 billion this year, rising at 77% a year until 2023, but such predictions factor in tech advances.

https://www.redsharknews.com/10000ppi-oled-displays-could-be-here-soon

One of those is better displays, the poverty of which has dogged VR experiences to date. Two new developments promise a fix to the ‘screen door effect’, the visibility of the black grid surrounding individual pixels which can look as if you’re seeing the world through a mesh. You get the same effect if you press your nose to the telly – and in VR that is effectively what you’re doing. 

One solution is to up the resolution. Researchers at Samsung and Stanford University have come up with a new design for OLED displays that could deliver 10,000 pixels per inch (ppi), thereby wiping out such visual artifacts.

“We’ve taken advantage of the fact that, on the nanoscale, light can flow around objects like water,” said Mark Brongersma, who is a professor of materials science and engineering and senior author of the research paper. “The field of nanoscale photonics keeps bringing new surprises and now we’re starting to impact real technologies. Our designs worked really well for solar cells and now we have a chance to impact next generation displays.”

Brighter and better colour accuracy

In addition to having a super-massive pixel density, the new ‘metaphotonic’ OLED displays would also be brighter and have better colour accuracy than existing ones, and they’d be much easier and more cost-effective to produce as well.

The technology uses films to emit white light between reflective layers, one silver and another made of reflective metal with nano-sized corrugations. This ‘optical metasurface’ changes the reflective properties and allows specific colours to resonate through pixels. The design allows for much higher pixel densities than in the RGB OLEDs on smartphones, but doesn’t hurt brightness to the degree you see with white OLEDs in some TVs.

The design of the corrugations makes large-scale manufacturing viable. Samsung says it’s already working on a full-size display featuring 10,000PPI.

Meanwhile, separate developments are attempting to close the screen door by boosting light levels in OLED displays. Many worry that OLEDs won’t be able to reach the luminance levels needed for AR and VR applications, especially for AR in high ambient light. A paper (accessed via Insight Media) led by electronics manufacturer Kopin suggests that OLED microdisplays have the potential to reach the 30,000-nit level.

The paper outlines four ways to create an AR image: DLP (digital light processing), LCOS (liquid crystal on silicon), a scanning RGB laser system or OLED microdisplays. LCOS and DLP solutions can use high-brightness LEDs to achieve luminance of more than 100,000 nits. However, these technologies have issues such as limited contrast ratios, colour breakup, slow operating speed, the need for an illumination source, and complex, bulkier optics. Laser scanning has been implemented in the HoloLens 2, but there have been many reports of very poor image quality. OLED microdisplays fabricate the OLED emitting layers on top of a CMOS silicon backplane that drives the pixels, and offer the contrast and speed performance needed for AR/VR. However, low luminance has been a serious concern.

Eliminating lag

Kopin goes on to explain that to eliminate motion lag, as well as to accommodate fast head movements for gaming applications, a common technique is to output light during only part of the frame time. This is called the duty cycle. Typical duty cycles used in VR headsets are only 10%, meaning a 1,000-nit display must now output 10,000 nits for a very short period (the time period depends on the frame rate and gets shorter at higher frame rates). For AR applications, the luminance requirement will be even higher because of the bright ambient light level and the low efficiency of see-through optics. Typical office or home lighting can push the display luminance (before the optics) requirement to much higher than 10,000 nits.
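The duty-cycle arithmetic above can be sketched in a few lines of Python. The 10% duty cycle and 1,000-nit target come from the paper as described; the 90Hz and 120Hz frame rates are illustrative assumptions.

```python
# Peak luminance the panel must emit during its brief 'on' window
# to deliver a given perceived (time-averaged) luminance.
def required_peak_nits(perceived_nits, duty_cycle):
    return perceived_nits / duty_cycle

# How long the panel is actually lit per frame, in milliseconds.
def on_time_ms(frame_rate_hz, duty_cycle):
    return (1000.0 / frame_rate_hz) * duty_cycle

print(required_peak_nits(1000, 0.10))  # 10000.0 nits, as in the paper
print(on_time_ms(90, 0.10))            # ~1.11 ms lit per frame at 90 Hz
print(on_time_ms(120, 0.10))           # ~0.83 ms at 120 Hz
```

Note how the on-window shrinks as frame rate rises, which is why higher frame rates push the instantaneous luminance requirement up rather than down.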

Outdoor applications can push the display luminance requirement much higher – perhaps more than 1 million nits would be needed. However, Kopin suggests that 30,000 nits for a microdisplay may satisfy many AR application needs with the use of higher-efficiency see-through optics in combination with photochromic lenses to reduce the ambient light level.

“As a result, it makes sense to contemplate a path to 30,000 nits for an OLED microdisplay,” the paper concludes, “especially for high-ambient applications. However, only limited prototypes have been shown so far, with many challenges remaining to fully commercialize a monolithic microLED display solution suitable for VR or AR applications.”

Even if the display is improved, the big downside remains squeezing the higher-resolution data over mobile or fixed-line broadband.

Ubiquitous 5G networks and devices are promised to sort this out, though there are murmurings that 5G may not be quite enough even as new specifications are rolled out. For some, 6G is needed to finish what 5G started, and dates of 2026 are already pencilled in for a first draft 6G standard, with commercial rollout some time after the end of this decade.

Thursday, 29 October 2020

Riding the 5G Wave

Streaming Media

Wireless telecommunications is one of the few industries to have thrived since the COVID-19 pandemic engulfed the world. At a basic level, many of us have resorted even more than before to mobile devices to communicate with friends and family, stream video, or work from home. Government track-and-trace systems to curtail the virus depend on that same mobile connectivity. More than that, though, 5G is seen as instrumental in leading economies out of the dire straits many are in.

https://www.streamingmedia.com/Articles/Editorial/Featured-Articles/Riding-the-5G-Wave-143577.aspx

"To the extent there was any doubt about the importance of connectivity, that doubt has been completely erased," says Alex Rogers, EVP and president of Qualcomm Technology Licensing in a video interview on the company's blog. "While broadband has done well, you need connectivity that solves all the problems. You need it to be ubiquitous, reliable, you need throughput, speed, and security. You will see 5G become essential everywhere. It's going to be in very high demand."

Despite the pandemic and the recession it spurred, 5G is still on track to become the quickest generation of wireless cellular technology to be widely adopted. According to Omdia's projections, 5G will have nearly 2 billion subscribers by the end of 2024, 6 years into the cycle versus 8 for 4G LTE.

Globally, there are now 82 5G commercial networks, a number that's expected to reach 206 by the end of 2020, per TeleGeography. In addition, there are more than 100 commercial 5G device models available worldwide, according to the "Ericsson Mobility Report" from June 2020. Omdia projects that 5G connections will reach 238 million globally by the end of 2020, of which North America will account for 10 million, spanning seven network rollouts.

 The number of 5G connections in North America will hit nearly 300 million by 2024, as LTE continues a slow but steady decline.

"Globally, 5G remains the fastest-growing generation of wireless cellular technology ever, even as the world is gripped with a pandemic," says Chris Pearson, president of industry trade association 5G Americas. "In North America, we are seeing consistent, strong uptake of new 5G subscribers as new devices have been released that can take advantage of low-band and millimeter wave [mmWave] frequencies. At the same time, new network capabilities are being added."

At the peak of the lockdown, mobile networks held up remarkably well to the strain of additional data traffic as work-from-home data usage spiked dramatically. For instance, AT&T reported a 22% increase in its core network traffic and a 30% increase in wireless voice minutes. The new combined T-Mobile/Sprint saw mobile hotspot usage spike 60%, while tethering was up 57% for T-Mobile and 70% for Sprint.

These increases in traffic seem to have persisted. OTT streaming has soared as we have stayed at home, and the shift to working from home has driven the need for additional bandwidth.

"Work from anywhere is going to be the new normal, not just from home," Rogers says. "Companies are already evaluating [how] to push the enterprise out to a wireless connected environment."

Nokia reports that peak traffic "normalizes" at 25%–30% above pre-pandemic levels and that aggregate traffic volumes continue to be more than 25% over pre-pandemic levels. A survey from IBM found that 54% of people want to continue working from home even after the pandemic has passed. 

"There is no doubt COVID-19 has had a huge impact on our industry, however, in the midst of the pandemic, Verizon has been able to maintain and, in some cases, accelerate its 5G deployment by being nimble, flexible, and downright scrappy," says Heidi Hemmer, VP of network engineering at Verizon. This has been possible by focusing efforts on 5G antenna attachments where social distancing has been easier to maintain and by taking advantage of the dramatic drop in road traffic to extend the company's hours of operation when working outside on fiber trenching and laying.

"Our engineers conducted virtual site walks with municipalities, providing pictures, videos, and access to our engineers remotely, and we worked with municipalities to deploy digital permitting solutions (to submit applications for licensing without entering an office)," says Hemmer.

So 5G has moved from hype to reality. Nonetheless, as GSMA pointed out in March, 4G is still king. Its "The Mobile Economy 2020" report finds that 4G will continue to grow, accounting for 56% of global connections by 2025.

In addition, consumer devices still lag behind 5G network rollout, curtailing usage. Most mobile devices in the market are not 5G-enabled, according to Ampere. In fact, there is not currently an iPhone model that supports 5G—despite 57% of U.S. internet users owning an Apple smartphone—as reported in Ampere's Q1 2020 consumer survey.

"Currently, devices which are 5G-enabled are also higher cost, which will also limit the short-term uptake and wider market uptake," says Ampere research manager Daniel Gadher. "However, as with any new tech when it first launches, prices will be high due to low economies of scale for manufacturing. With the network coverage having been scaled up nationwide, it will lead to greater scale, and prices of devices should come down, as long as consumer demand is there, supporting wider uptake."

NSA to SA

All operators initially launched non-standalone (NSA) network architectures, which tie 5G radio access networks (RAN) to 4G LTE equipment at the core. The next step is to migrate to standalone (SA) networks, which boast 5G in both the RAN and the core.

In North America, the first to reach this mark is T-Mobile, fueled by its completion in April of the merger with Sprint, and in partnership with Cisco, Nokia, and Ericsson. T-Mobile says its new 5G SA network has been tested to deliver up to 40% lower latency and 20%–30% improvements in download and upload speeds over prior performance.

T-Mobile has passed AT&T to become the nation's second-largest carrier and to claim the "5G coverage crown," boasting that its 5G network is more than twice the size of AT&T's and more than 10,000 times the size of Verizon's.

In the near term, T-Mobile explains that SA allows it to unleash its entire 600-MHz footprint for 5G without anchoring to mid-band LTE, with a signal that can cover hundreds of square miles from a single tower and penetrate deeper into buildings than before.

Verizon and AT&T suddenly find themselves playing catch-up—but not for long. Verizon plans to start moving traffic onto its new 5G SA core in the second half of this year, with full commercialization in 2021.

"Verizon was the first carrier in the world to launch a commercial 5G mobile network with a commercially available 5G-enabled smartphone in early April 2019," asserts Hemmer. "To date, we have launched our 5G Ultra Wideband network in parts of 36 cities using mmWave spectrum—a keenly differentiated experience from low-band 5G—and we plan to reach 60 markets this year. Our Dynamic Spectrum Sharing work is on track, which will pave the way for the most efficient use of spectrum to deploy the nationwide coverage layer of 5G on our other spectrum assets. We will launch 5G nationwide by the end of the year. 

"We have also completed successful trials of our 5G SA core. Built with a strategically different architecture of virtualization from the ground up, this will provide the foundation [that] a non-cloud native core simply will not support," says Hemmer.

"Our strategy of deploying 5G in both sub-6 (5G) and mmWave (5G+) spectrum bands will provide the best mix of speeds, latency, and coverage that are needed to enable revolutionary new capabilities to fuel 5G experiences," says Chris Sambar, AT&T's EVP of technology operations. "Our competitors are still working to provide that same mix, which for them could take months or even years. What we offer is available to consumers and businesses today, and we're not slowing down."

AT&T announced in its Q2 2020 financial results an additional $1 billion invested to purchase 5G spectrum, "showing its commitment to growing its 5G coverage nationwide," according to Gadher. "Typically, the major U.S. carriers have focused deployment in highly populated major cities, with rural deployment being slower."

3GPP Release 16 and SA

SA architectures are based on the latest release from the standardization body 3GPP. Release 16, finalized in March, paves the way for deployment of fully virtualized networks using 5G SA cores and the facilitation of edge computing, network slicing, and massive IoT.

Release 16 introduces enhanced ultra-reliable low-latency communication (eURLLC) to deliver millisecond latency, time-sensitive networking, and improvements to "high-power high-tower" transmissions for supporting higher mobility and better coverage of terrestrial TV. Also introduced in Release 16 is high-reliability 5G positioning, which enables a broad set of 5G IoT use cases, such as asset tracking.

According to a recent Nevion global survey of broadcasters, 82% believe that cellular networks like 5G will eventually replace traditional broadcast distribution as the preferred way to access TV content. More than a third (37%) expect this to happen within 2 years.

Looking further out, 3GPP's Release 17 (due in summer 2021) includes enhancements to NR Broadcast and Multicast, a mixed mode for enabling dynamic switching between unicast and multicast, both in the downlink and the uplink. It will also feature 5G NR-Light, targeting new efficiencies for lower-complexity devices such as industrial cameras, higher-end wearables, and lower-tier smartphones.

Future Applications

Applications for 5G capabilities are gaining ground, although most remain experimental or theoretical. 5G Americas' Pearson suggests that 5G live streaming at sporting events or concerts could bring "instantaneous feedback from thousands of mobile device users around the world in new ways to bring a mobile crowd into the experience."

For individual consumers, Hemmer says Verizon's existing 5G Ultra Wideband running on mmWave spectrum has already achieved peak speeds of a gigabit or more, allowing an enhanced, immersive NFL experience at the Super Bowl; production partnerships with Disney; enhanced gaming with partners such as Bethesda Gaming; and more.

"Many of the use cases we are seeing emerge are with our business partners," Hemmer says. "Corning is implementing smart manufacturing solutions. We recently lit up the first 5G-enabled hospital with Verizon 5G with the VA at their hospital in Palo Alto and plan to test how 5G could enhance AR/VR applications for medical training [and] enable telemedicine and remote patient monitoring."

Ericsson president and CEO Börje Ekholm summed it up neatly in an online keynote: "While 4G gave us the app economy, 5G will be the greatest open innovation platform ever."

That is predicated on 5G SA cores, which T-Mobile declares to be the future of wireless connectivity, bringing 5G closer to reaching its true potential, with faster speeds, lower latency, and massive connectivity. "SA, especially when coupled with core network slicing in the future, will lead to an environment where transformative applications are made possible—things like connected self-driving vehicles, supercharged IoT, real-time translation … and things we haven't even dreamed of yet," according to T-Mobile. 

Yet not all 5G standalone cores are created equal. Hemmer says, "Not all cores are designed to be able to fully take advantage of the more robust technologies such as network slicing and Mobile Edge Compute. By building the Verizon 5G core with cloud-native containerized architecture, we will be able to achieve new levels of operational automation, flexibility, and adaptability."

In the interview with Qualcomm's Rogers, he says the defining difference between 4G and 5G is not throughput or capacity but (with the new 3GPP releases) a drive into verticals and different industries: "Vehicle-to-vehicle and vehicle-to-X is not possible without standards. Smart buildings, smart cities, ports, utilities, [and] other infrastructure connected through 5G and the management of these facilities will be revolutionized based on 5G.

"As you push computing to the edge of the network," Rogers adds, "you are going to see new form factors and XR, augmented and VR experiences using new devices we've never really seen before."

A 5G SA cloud-native virtualized core, in combination with built-in AI/machine learning, will enable the dynamic allocation of the appropriate resource (network slicing). It will also allow for automated network configuration changes, including the ability to scale up or scale down network function capacity to provide the right service levels and network resources needed for each use case.

For example, network slicing is expected to play an important role in providing guaranteed QoS, which is critically important in terms of bandwidth and latency and is required for high-value content production such as sports. Operators can take advantage of network slicing to offer differentiated network services for content production.

As deployment continues and the ecosystem builds up around the technology, video applications will evolve. Some use case examples provided by Verizon include rendering high-end gaming graphics on low-cost, portable devices; creating 360 degrees of sound for a headset, allowing the user to fully experience surround sound in a virtual, mobile environment; and developing dynamic 3D image recognition to overlay virtual information on real-world objects in near real time.

The world is going to need these capabilities as it digs itself out of the COVID-19-induced economic hole and builds a stronger economy. Indeed, telcos are expected to be pivotal in driving the global economy forward as the world emerges from the initial phase of the pandemic.

"[Telcos] will be key in enabling a new digital society," says ABI Research. "Beyond the obvious conclusions that we are likely to see, including more remote working, more virtual meetings, and more virtual teams, … a raft of new solutions could accelerate GDP growth and all will require a robust level of support from the telco community."

 

Tuesday, 27 October 2020

Recreating the Edit Suite: How Post Teams Are Keeping Teamwork Alive While Collaborating Remotely

copywritten for AVID 

Filmmaking is a collaborative art. With the current remote work paradigm changing how the industry gets things done, many creatives are missing the easy collaboration and camaraderie they’re used to having in a face-to-face environment.

http://www.avidblogs.com/recreating-the-edit-suite-how-post-teams-are-keeping-teamwork-alive-while-collaborating-remotely/

Technology goes a long way toward bridging collaboration gaps in remote post-production workflows, but it’s hard to replicate the spontaneity of having everyone physically present. Even the sheer enjoyment of bonding over a project can get lost when your team is remote.

Since it’s going to be a while before you can stand over someone’s shoulder in the edit bay again, addressing the challenge of remote video editing collaboration will help to chart the course for the team’s cohesion and creative inspiration for the long term.

“We are an industry of creative storytellers, where creative communication is essential in producing a quality product,” says Jai Cave, technical operations director at UK post facility Envy.

As Tessa Treadway, VP of post at Film 45, puts it, “While technologies allow us to connect our media, our greatest challenge today is figuring out how to connect our minds.”

Here are some techniques editors are using to keep communication flowing freely across remote video editing workflows as they adapt to the industry’s new normal.

Highlight Remote Desktop Solutions

To keep editorial connected across a remote post-production workflow, collaboration systems allow teams to stream content to each other and discuss changes in real time.

“It’s hard to compete with the magic and momentum of live collaboration,” says Brad Thomas, cofounder and COO of Evercast. “Having to pass content back and forth and wait for feedback puts a huge damper on the creative process. But under the right circumstances, it can be done.”

He explains that Evercast works with an ultra-low latency experience (a nearly imperceptible 200 milliseconds) to facilitate live collaboration on video content. The goal is to create a channel for “natural communication” that feels “just like you’re sitting in the same physical space, shoulder to shoulder.”

“Instead of having to upload and download files and pass notes back and forth, Evercast enables you to simply ‘hop into’ a virtual room from your Google Chrome browser and interact with your team while experiencing high quality video and audio from the editor’s workstation,” he says.

Lisa Bromwell, ACE, says she became used to treating desktop sharing app TeamViewer as a stand-in for her edit room while remotely finishing two episodes of Netflix drama Shadow and Bone.

“My lame joke was to call my assistant (Paul Alderman) and ask him to step into my ‘room’,” she says. “Once he was logged on, we could look at things together, talk about the cut, look at something in the source monitor, look at the timeline. While it is not ideal, it does approximate standing in the room together and talking over either technical or creative issues.”

Lead the Team by Example

Aside from providing reliable tech, production heads can improve remote editing collaboration by scheduling predictable group meetings and encouraging everyone to connect as questions arise.

“It’s easy to feel isolated and invisible when working remotely, so it’s crucial to have a ‘virtual office door’ to knock on at any time—via Slack, texting, or a simple phone call,” says Treadway. “We all need to be accessible to the team. I believe connection and camaraderie is created by the team, not the physical space we occupy.”

Treadway says it’s the leadership’s responsibility to provide effective communication tools, structure, and creative platforms to nurture this connection.

Film 45, the Santa Monica-based, Emmy Award-winning production and post company led by filmmaker Peter Berg, holds daily morning virtual meetings. There, the team distributes information, shares ideas and challenges, and reports on any personal or professional “wins.”

“This meeting becomes a daily rally, and we see how the team extends beyond our living rooms and into a full network of peers,” reports Treadway. “Slack has been an excellent tool for unscheduled communication for one-on-one or groups, and simulates impromptu conversations.”

Promote Softer Collaboration

It’s also possible to foster team spirit with more informal techniques. For Steve Mirkovich, ACE, patience and understanding is a must. He says, “As the lead editor on [Sony Pictures] feature Escape Room 2, I feel I need to be the cheerleader. I believe staying positive and keeping things in perspective will help us all to get through this very weird and challenging time.”

While editing the feature remotely, Mirkovich has spent a lot of time talking with director Adam Robitel using Zoom meetings and Evercast sessions. “Working remotely can sometimes be clunky and slow until bugs are worked out,” he cautions. “Patience and focus are key.”

For Bromwell, the lack of interpersonal communication is the hardest part about working from home. She makes a point of regularly talking with her assistant about things other than the job:

“Current events, his life, his dog . . . things that naturally come up when you see each other at the office but get oddly lost when you’re working remotely and the tendency is to keep the focus on the work.”

She adds that pre-COVID, the team on Shadow and Bone had a standing “whiskey hour” every Friday at 6 pm. They have continued that remotely as a Zoom whiskey hour.

“Erin Conley [assistant to showrunner Eric Heisserer] organizes it. It’s a nice way to chill and actually see the faces of the people you’re working with.”

At Film 45, everyone is encouraged to speak at the daily virtual meetings. “This is where we combine work questions with just being human,” says Treadway. “We celebrate birthdays, we show off our pets, we introduce our children when they inevitably walk into the meeting asking mom/dad a question.”

“We don’t ignore the fact that we’re working from home,” she emphasizes. “We integrate it into the process.”

The Productive Positives of Going Remote

With only internet traffic to contend with, remote working can in some cases be more productive and collaborative than going into the office. For example, having the team more available for video calls has flattened out geographical differences between studios.

"Putting aside time zone differences, meeting someone who works on another continent is now as easy as meeting one of our local colleagues, even for quick, unexpected meetings that would normally have required room bookings in multiple studios," says Michele Sciolette, CTO at Cinesite, on the studio's site. "Some members, particularly from our support teams, suggest that the lack of frequent interruptions is making them more productive."

Jack Jones, technical director at London documentary specialist Roundtable, says their assist teams can work “flawlessly” by connecting to media and desktops in the facility over a virtual private network. “The assists are transcoding, ingesting rushes, or troubleshooting crashed machines. They can do all their jobs remotely without issue. In turn, that opens up the ability to have fewer staff members on site. Where physical space is restricted post-COVID, it makes sense to use the capacity you have for clients.”

Whether working under COVID conditions or in a post-pandemic world, the industry's global experience of collaborative remote video editing would be extremely difficult to reverse, and reversing it wouldn't be worthwhile. "Working remotely can actually enhance creativity," insists Thomas, "because it allows creatives to work wherever and whenever inspires them." As remote video editing collaboration continues, post houses will have to keep pivoting and evolving their approach.

“One thing is certain,” Treadway says. “It’s the combination of tech and team that creates a successful remote environment.”