Wednesday, 7 July 2021

How Are Video Professionals Responding to Remote Workflows?

NAB

As working-from-home (WFH) and flexible working become the new norm, video production teams across countless industries have had to think on their feet, get creative, and adopt new approaches to triumph over ever-emerging obstacles.

https://amplify.nabshow.com/articles/how-are-video-professionals-responding-to-remote-workflows-heres-a-report/

“We’ve made incremental changes to almost everybody’s workflow,” Eric Lund, senior manager of post technology at Warner Media Studios, explained to IPV. “We’ve made everything better and better, and we’ve discovered new things.”

When the pandemic began, Warner Media’s digital services production team seized the opportunity to flex their adaptability. For instance, they were approached about editing a Zoom-based talk show — a challenging project in and of itself, but the team needed to have it on-air in only about five days.

After the production team shot it remotely on their Zoom systems, they sent it over to Warner Media Studios. They cut the show and did the sound design in New York, color corrected and finished it remotely in Atlanta, and then sent it to a finishing editor, who prepped it all for air and got it out under the deadline.

The team also completed digital premieres for three television shows and two movies, as well as a launch for HBO Max. The HBO Max launch included a full-day event and a 25-minute internal video, involving around 40 interviews.

“We ramped up very quickly, and we were able to get essentially everyone back to work. We just said, all right — here is the new reality, let’s do it!” said Jacob Anderson, Manager, Asset Management, Warner Media Studios.

COPA90’s Smart Cloud Move

Sports broadcasters have always drawn heavily upon archive footage — it’s helpful for reels, highlights, and more. But current circumstances have made archive footage all the more important, as teams repurpose it to make new, engaging content for fans. However, many sports broadcasters struggle to organize and provide their remote-working team with easy, speedy access to archive footage.

According to Barry Flanigan, Chief Product Officer of soccer fan site COPA90, it’s more vital than ever to have the right tools to utilize and deploy the archive effectively. “That will very definitely continue beyond this crisis. That’s something that we’ve always planned to do with our cloud infrastructure.”

As long ago as 2018, ahead of the FIFA World Cup in Russia, the team decided to move the majority of their content workflows into the cloud. This made sense — most of COPA90’s ‘human interest’ sports content is evergreen, continuing to resonate even when it’s a few years old.

When lockdown measures hit, this prep-work put the team in a prime position. Much of their footage was already in the cloud, tagged with metadata, enabling easy access. This made it easier for the team to repackage archive content at a time when shooting fresh content wasn’t really possible.

Flanigan says, “I think we made a smart decision early on to move to the cloud, which is now paying dividends at a time like this.”

WFH Expansion

Perhaps no one knows more about the necessity of preparing for the future of video than traditional broadcasters. To compete with OTT platforms, broadcasters must adapt, stay flexible, and seek new avenues for success. This means increasing and diversifying their long-term outputs.

“These times have allowed us to really step up in a lot of ways and see areas where we’ve become stagnant and areas where we didn’t even know we wanted to expand,” says Michelle Conroy, production manager at NYC facility Crew Cuts.

There’s no doubt that the shift to WFH culture has revolutionized the broadcasting industry. “Before the pandemic I think the industry would have gone through quite a lot of pain in trying to convince people to edit from home,” says Brian Leonard, Head of Engineering, IMG.

However, Leonard notes that thanks to this transition, the door has been flung open to welcome new possibilities, such as broadcasters expanding their editing capabilities beyond what was possible in their previous physical spaces: “In theory, where before we had 50 suites on-site, with people working from home, we could go up to 60.”

Notably, the broadcasting industry has also seen a shift during these times in the types of content audiences are willing to engage with. Recorded Zoom conversations and cell phone videos have become widely accepted forms of content.

According to Leonard, this trend has also enabled more inclusion and diversity: “We’re talking to single moms who wouldn’t have been able to come into the studio. We’re bringing in a more diverse clientele.”

Of course, for broadcasters looking to capitalize on all of the lucrative trends and possibilities, it’s crucial to build an infrastructure that will stand the test of time.

As Sesame Street editor Memo says, “Our business is lucky that it happened in 2020, when there’s enough of a remote infrastructure that we could actually make it work.”

The common denominator in all these examples is working with a media asset management system, specifically the Curator software from IPV.

Among its attributes are the ability for productions to extend access to media files remotely via the cloud, to automate proxy creation and save on bandwidth, to track assets, and to use rich metadata to search content. These aren’t unique to IPV, but the case studies of its clients are: “In these ‘unprecedented times’ and beyond, there’s no question that MAM will play a critical role in video production’s long-term future.”
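To picture what the proxy workflow buys a remote team, here is a back-of-the-envelope sketch in Python (the bitrates are illustrative assumptions, not figures from IPV or Curator): editors pull lightweight proxies over the cloud while the full-resolution masters stay in central storage.

```python
# Rough bandwidth comparison: full-resolution masters vs. editing proxies.
# Both bitrates are illustrative assumptions, not IPV/Curator figures.
MASTER_MBPS = 400  # e.g. a mezzanine-grade UHD master
PROXY_MBPS = 8     # e.g. an H.264 editing proxy

def gb_per_hour(mbps: float) -> float:
    """Convert a bitrate to data volume per hour of footage, in gigabytes."""
    return mbps / 8 * 3600 / 1000

print(f"master: {gb_per_hour(MASTER_MBPS):6.1f} GB per hour of footage")
print(f"proxy:  {gb_per_hour(PROXY_MBPS):6.1f} GB per hour of footage")
print(f"a proxy workflow moves ~{PROXY_MBPS / MASTER_MBPS:.0%} of the data")
```

On these assumed numbers, only the final conform ever needs to touch the masters, which is why proxy-based workflows scale to editors working from home on domestic connections.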

 

Immersive Video Looks (and Feels) Like the Future

NAB

Like the Holy Grail, the quest for immersion is often the producer’s and creative media technologist’s end game. Achieve peak immersivity and audiences will flock to an experience that in some way goes beyond the two-dimensional rectangle of the screen.

https://amplify.nabshow.com/articles/immersive-video-looks-and-feels-like-the-future/

Immersion means deeper viewer involvement in the content — the game, the story (the video service, the advertising, the sponsors). It’s usually held to mean a presentational upgrade.

Greater resolution is front and center of the immersive experience but it’s not just about pixels. All the UHD attributes come into play: dynamic range, frame rate, color gamut, bit depth and enhanced audio.

Movie theaters have been locked in a battle with TV and home cinema for decades, offering technological responses built on greater visual spectacle and sensory overload. Examples range from widescreen VistaVision to 3D and haptic seating.

In live sports, the goal is to immerse the fan in the game, to get them closer to an experience of being there at the stadium with the best seat in the house. Experiments here include virtual reality and the ability to interact with the live stream by selecting camera angles.

Computer gaming takes this further by presenting players with first-person perspectives and the tools to participate in the action.

YouTube presents alternatives, from a single-shot fly-through of a Minnesota bowling alley captured on a racing drone that went viral, to the trend for atmospheric music videos designed to more deeply connect a viewer or listener with a place or time.

Developments still in R&D but aimed at creating greater involvement between the viewer and the content include location-based virtual experiences such as The Illuminarium — which just opened in Atlanta — socially connected online virtual experiences afforded by the Metaverse, and, in the long run, video holography.

The term is broad and, like the Grail pursued by the Arthurian knights of yore, the ultimate in experiential entertainment will likely always remain just out of reach.

 

Up, Up, Up: SVODs Show Producers the Money

NAB

If you thought spending on content was impacted by last year’s COVID-ravaged schedules, then think again. Budgets for streaming content just keep on going up.

https://amplify.nabshow.com/articles/up-up-up-svods-show-producers-the-money/

Fresh data from London-based fin-tech platform Purely reveals that the total global pot of money for producing and licensing new content rose 16.4% to $220.2 billion in 2020 and it’s on course to top that by the end of 2021.

What’s more, Netflix has been knocked off its perch as the number one big spender. It’s now in third place behind Disney, whose 2020 content spend totaled $28.6 billion, and the combined Warner Media-Discovery empire, whose content spend totaled $20.8 billion. Netflix’s outlay was a paltry $15.1 billion.

Presented in the form of an infographic created by digital publisher Visual Capitalist, the Purely Streamonomics data shows how audience demand, content expenditure, and TV budgets all reached all-time highs.

Based on current trend lines, Purely expect production spending to top $250 billion by the end of 2021 — and to keep rising beyond that as Warner Media-Discovery, Amazon-MGM and Televisa-Univision start to impact the business.

While the actual number of films that went into production dropped last year, and TV series across the board experienced various shooting delays depending on location, more cash than ever was committed to content last year. Purely suggest this is a reflection of continually rising production budgets, of accounting methods for delayed titles that had already received the green-light, and of greater rights-buying activity as platforms sought to plug their programming gaps with content made by others.

What is remarkable about these record numbers is that the industry’s spending has yet to bump up against any natural ceiling.

According to Wayne Marc Godfrey, founder and CEO of Purely, “Streaming is not just displacing traditional sources of entertainment revenue such as pay-TV and linear broadcasting, it is actually expanding the global marketplace for video. The big question then becomes whether there are enough good stories out there, and talent to tell them, to keep fueling this transformation.”

Since 2019, the number of global customers subscribing to streaming video platforms has grown from 642 million to more than 1.1 billion, a 71% leap that has been turbo-charged by months of enforced lockdowns at home, per Purely.

It’s not just the most familiar global SVOD platforms leading growth. Joining the hunt for monthly customers are several regional champions, including France’s Salto, Scandinavia’s Viaplay, India’s Eros Now and ALT Balaji, as well as the Chinese mega-platforms iQIYI, Tencent Video and Youku.

“All these SVOD platforms are fighting over the bragging rights to distinctive content as they spend billions of dollars trying to attract and then retain subscribers for the long-term,” Purely states.

It expects the total number of subscribers to reach 1.6 billion by 2025 — representing about a fifth of the planet’s total projected population.

There is blowback from households wanting to cut the cost of their SVOD bundles. Rather than turn to linear TV or cable networks, they are likely to sign up for free, ad-supported streaming services.

AVOD is particularly prevalent in Asia, where it accounts for the majority of sector revenues outside China and where ad-funded players such as YouTube have become the destination for professional content in Korea, Japan and Southeast Asia.

Not only is the deluge of new streaming platforms transforming consumption, it is also turning the economic calculus for film and TV producers on its head.

The research shows that, in the US, average budgets across scripted, unscripted, daytime and kids rose 16.5% in 2020. In TV and film, this budget inflation has largely been created by the fight for talent exclusivity, especially with regard to contracting top names and locking them in for future seasons of a show, together with “the necessary hike in production costs in order to deliver lavish, glossy and impactful shows that act as subscription drivers to a platform,” such as Disney+’s The Mandalorian or WandaVision.

The cost of introducing and monitoring COVID protocols in 2020 also added 20%-30% to production budgets, per the report. Even if these costs subside, industry talk of introducing “green production initiatives” could see a further 5-10% added over time. Either way, costs keep creeping up.

“It is only a short matter of time before we see the $50 million television episode,” says Godfrey, who urges producers to “follow the money.”

“Now that the tables have well and truly turned, a domestic public service broadcaster or local linear network should no longer be the main goal for an ambitious production business. I think the time has come for [producers] to take their biggest and best ideas — whether scripted or unscripted — to the streamers.”

 


Tuesday, 6 July 2021

Aspect Ratio Rethink: When and Why to Shoot 4:3

NAB

TV shows are made to be screened in 16:9, and films are shown in cinemas in widescreen formats too. So why are films and TV shows increasingly being made in old-school 4:3?

https://amplify.nabshow.com/articles/aspect-ratio-rethink-when-and-why-to-shoot-43/

We explain.

In the early days of cinema, silent films were shot with an aspect ratio of 1.33:1. When sound came along, the additional space for the soundtrack moved the ratio to 1.37:1, a standard enshrined by the Academy. Its ubiquity was cemented with the arrival of TV, which handily suited the box-like 4:3 format by which both ratios are colloquially known.

At the same time, cinema started to adopt wider aspect ratios (CinemaScope, VistaVision) to differentiate itself from TV. The standard theatrical aspect ratios – 1.85:1 or 2.39:1 – later fed into the reformatting of TV screens to 16:9 (1.77:1/1.78:1), since when wide has remained the standard, “with 4:3 aspect ratio pixels a relic of the 20th century,” writes Rafael Abreu in a handy potted history of the subject: https://www.studiobinder.com/blog/what-is-4-3-aspect-ratio/
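The arithmetic behind those ratios is simple enough to check for yourself. A minimal sketch (plain Python; the pixel frame sizes are illustrative, not tied to any particular camera or standard):

```python
from math import gcd

def aspect(width: int, height: int) -> str:
    """Reduce a pixel resolution to its simplest ratio plus decimal form."""
    d = gcd(width, height)
    return f"{width // d}:{height // d} ({width / height:.2f}:1)"

print(aspect(1920, 1080))  # 16:9 (1.78:1), the modern TV standard
print(aspect(1440, 1080))  # 4:3 (1.33:1), the silent-era / early-TV shape

# Pulling a 4:3 frame out of a 16:9 capture means cropping the sides:
w, h = 1920, 1080
crop_w = round(h * 4 / 3)
print(f"4:3 crop of {w}x{h} keeps a {crop_w}x{h} window")  # 1440x1080
```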

More recently, many filmmakers have chosen to return to the boxier 4:3 format – why?

After all, modern audiences and many esteemed filmmakers (Chris Nolan, Quentin Tarantino) seem to hold that bigger and wider is better and more spectacular. Only one film shot (mostly) in 4:3 (1.37) has made over US$100 million at the box office: Wes Anderson’s The Grand Budapest Hotel (2014).

As Fandor’s excellent overview points out, since 4:3 is riskier financially, it’s clearly a very important artistic choice by the filmmaker.

Often the decision is as simple as evoking a certain time period.

Films like The Artist (best picture Oscar winner in 2011) used 1.37 since it chimed with a story about early Hollywood, told silent and in black-and-white.

In The Grand Budapest Hotel, DP Robert Yeoman shot 1.37 for the scenes featuring concierge Gustave to signify the time period of the 1930s and 1940s. It’s a technique he also uses on Anderson’s latest, The French Dispatch.

“We loved the compositional possibilities that this aspect gave us and so carried this over to The French Dispatch,” Yeoman says. “Occasionally we would change to a wider ratio for emphasis, just as we did using colour.”

Even if a film takes place in an era after the emergence of widescreen format, 4:3 can still be used to enhance the feeling of an earlier time, notes Fandor’s Jacob Swinney.

It’s a period choice

Jonah Hill’s Mid90s (2018) for example was shot in 4:3 to convey nostalgia for the titular era.

Director Paweł Pawlikowski and DP Łukasz Żal filmed Ida (2013) in 4:3 not just to hint at the 1960s storyline but also because they wanted the characters to dominate the frame.

Even in wide shots, Ida is shown larger than life in a way not possible with a wider format.

They went with a similar aesthetic choice for 2018’s Cold War which landed Żal an Oscar nomination. Here the square aspect ratio lent itself to painterly composition with Citizen Kane (shot 1.33:1) providing Żal with a template for building depth of field.

“The format allows you to suggest things beyond the frame,” Żal explains. “At the beginning of the film everything has depth and we place the camera high up with as many layers as possible.”

Ida inspired director Paul Schrader to shoot 2017 drama First Reformed in 4:3. He explained that the aspect ratio “was about taking things away from you, taking the edges off the frame. When you start withholding things people become a little uncomfortable. Out of that disparity you can start to weave another reality.”

Confines the characters

Other recent movies which have used a square frame to evoke claustrophobia, or horror, include the harrowing first-person concentration camp drama Son of Saul and Charlie Kaufman’s study of depression and anxiety, I’m Thinking of Ending Things (lensed again by Żal).

The boxy aspect ratio also emphasises vertical lines, forcing more of the human body into the frame.

For psychological horror The Lighthouse, director Robert Eggers and cinematographer Jarin Blaschke selected a nearly square (1.19:1) aspect ratio to confine the characters (Willem Dafoe, Robert Pattinson) inside the vertical lighthouse.

This seems to be the thought process of director Zack Snyder (of whom more below) and of Andrea Arnold, who has shot her last three films, including American Honey, in 1.37.

Referencing this, Arnold stated: “My films are usually about one person and their experiences of the world. So, I'm mainly following them around, filming them quite closely. And it's a very beautiful frame for one person. It frames them with a huge amount of respect.”

Cinematographer George Steel crafted the aerial vistas of Tom Harper’s ballooning saga The Aeronauts (2019) using spherical Panavision lenses in 1.85, contrasted with a cropped 2.39 for the flashback scenes on terra firma.

These are all artistic choices, but as Swinney observes, “using 4:3 surely seems more intimate and personal for drama” than the ultra-widescreen of, say, The Hateful Eight.

“Ironically, while we may be accustomed to wide images on the big screen, outside of the cinema it seems we consume the majority of content on our phones which caters to boxier aspect ratios,” he says.

Changing aspect ratios mid-story

Trailers for widescreen movies are cropped down to a square for Instagram or Facebook. While this trend may be frustrating for some filmmakers, others are using it creatively to their advantage.
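How much picture that square crop throws away is easy to quantify. A quick sketch (plain Python; the 4096x1716 frame is an illustrative 2.39:1 example, not a figure from any production discussed here):

```python
def square_crop(width: int, height: int) -> tuple[int, int]:
    """Return the side length and x-offset of a centered 1:1 crop."""
    side = min(width, height)
    return side, (width - side) // 2

w, h = 4096, 1716  # an illustrative 2.39:1 frame
side, x0 = square_crop(w, h)
print(f"1:1 crop: {side}x{side}, starting {x0} px from the left edge")
print(f"{100 * (1 - side / w):.0f}% of the original width is discarded")
```

On those assumed dimensions, nearly three fifths of the width is lost, which is why framing for multiple deliverables increasingly has to be planned at the shooting stage rather than fixed in post.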

Steven Soderbergh’s feature experiment Unsane (2018) is screened with a ratio of 1.56:1 (close to the Academy ratio), perhaps because he had no other choice given he was shooting on an iPhone, but it suits a story of creeping paranoia and mental illness.

Season 1 of Amazon drama Homecoming used changing aspect ratios to relay different story timelines. The past is shown in 16:9, while flash-forwards are told in an unsettling pillar-boxed 1:1.

 “There was something claustrophobic and limited about that box aspect ratio that I thought fit what [the main character played by Julia Roberts] was going through in that [future] storyline,” series creator Sam Esmail said. 

The 1:1 ratio also led to DP Tod Campbell’s favorite shot in the series, when an investigator goes down a flight of stairs. That might not sound compelling, but he frames the descent within the square ratio, dropping the camera down the stairwell with a graceful elegance. “Oh my god, it’s ‘Vertigo,'” he said. “I’m so in love with that shot. It’s literally the desktop to my computer.”

These choices stem from the indie filmmaking scene but have now bled into the biggest budget, highest profile episodic drama and studio features.

Marvel and DC join the party

Disney+ series WandaVision used a variety of aspect ratios for different parts of its narrative, which begins in 4:3 in homage to TV of the ‘50s, ‘60s, and ‘70s. By the time Wanda has punched through the forcefield (called the Hex) surrounding Westview into the Marvel Cinematic Universe in episode 3, the aspect ratio has dramatically shifted from 1.33:1 (4:3) to the cinema screen ratio of 2.39:1, explains DP Jess Hall. At the same time, the warm tones of the family sitcom are left behind for the cooler sheen of the MCU.

Then there’s the Snyder Cut of Justice League. The DC Universe blockbuster was originally released by Warners in 2017 and was this year recut and ‘restored’ by the director.

During Justice Con, Snyder explained that re-watching the parts of Batman v Superman: Dawn of Justice filmed for IMAX “really got me obsessed with the big square. My intent was to have the entire [Justice League] film play in a gigantic 1.43:1 aspect ratio on a giant IMAX screen."

There was another reason behind the sizing, too. Superhero characters tend to be shown in vertical compositions (unless it’s Superman flying), so for Snyder, using the 1.43 aspect ratio emphasises this.

"I really started just compositionally falling in love with that concept," Snyder said.

But since the theatrical version was released with a different aspect ratio, some restoration work went into readying the Snyder Cut for release. "Everything's composed and shot that way, and a lot of the restoration is trying to restore the full frame. It's like literally a restoration project at this point, because there were certain scenes that were just fucked up by the crop, so we have to kind of like fix it, a little bit," he added.

Snyder also talked about the aspect ratio with The New York Times. "It's in the same aspect ratio as First Cow [Kelly Reichardt’s 1820s set drama]. Those two movies share some common DNA, I think. I would love that in a double feature, First Cow and the Snyder Cut of Justice League.”

You can argue that deviating from standard aspect ratios, especially within the same story, is a gimmick if not used wisely. Filmmakers could be tempted to play with aspect ratio as a means to differentiate their film without grounding the choice in the story.

It’s a tool and, like stereo 3D, editing, jerky camera-work or warped lens effects, anything that distracts from the audience’s involvement in the story isn’t working.

 

Quantum computing's 1000 qubit leap by 2023

RedShark News

For decades, quantum computing has been viewed as a futuristic technology: it would change everything, if it ever moved from the fantastical to the practical. The cogs now appear to be shifting – and fast.

https://www.redsharknews.com/quantum-computings-1000-qubit-leap-by-2023

The CEO and co-founder at quantum software platform Classiq thinks quantum computing is set to become the next global “space race.”

Meanwhile, IBM say we’re about to enter the Quantum Decade. Big Blue reckons quantum will help drive “the most significant computing revolution in 60 years.”

The building blocks of quantum computing are already emerging. Last week came news of an operating system that allows the same quantum software to run on different types of quantum computing hardware. 

This week California’s Rigetti Computing announced plans to build a quantum computer with 80 qubits by the end of the year. The significance of this is that it uses a modular architecture that solves some of the key scaling challenges associated with fault-tolerant quantum computers.

For perspective: in 2019, Google demonstrated a calculation on its 53-qubit quantum machine that sampled the outputs of a random number generator in 3 minutes and 20 seconds. Even the most powerful supercomputer would potentially take thousands of years to complete the same calculation.

Simulating a 100-qubit quantum computer would require more classical bits than there are atoms on planet Earth, according to IBM.
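The arithmetic behind that kind of claim is easy to reproduce: a classical simulator must track one complex amplitude for every possible joint state of the qubits, and that count doubles with each qubit added. A minimal sketch (plain Python; the 16 bytes per amplitude is an assumption based on double-precision complex numbers):

```python
# A brute-force classical simulation of n qubits stores 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number (assumption)

for n in (53, 100):
    amplitudes = 2 ** n
    petabytes = amplitudes * BYTES_PER_AMPLITUDE / 1e15
    print(f"{n} qubits -> 2^{n} = {amplitudes:.2e} amplitudes, "
          f"~{petabytes:.2e} PB of classical memory")
```

At 53 qubits the state vector already runs to roughly 144 petabytes on these assumptions, which gives a feel for why brute-force simulation stops being practical so quickly.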

IBM itself predicts quantum computing hardware will reach the order of 1,000 qubits by 2023 and that this will lead to practical application, “characterized by systems executing error-corrected circuits and widespread adoption” as soon as 2030.

IBM’s own focus is on the integration of quantum computing with artificial intelligence and classical computing into hybrid multicloud workflows. The intersection of classical bits, qubits, and AI ‘neurons’ — not quantum computing alone — is driving the future of computing, it says.

Classical versus Quantum match-up

To counter naysayers who dismiss quantum “as too arcane, a far-off, far-out pursuit for academics and theorists,” IBM recently shared details of a quantum versus classical computing showdown.

“We're exploring a very simple question,” explained its research team. “How does the computational power differ when a computer has access to classical scratch space versus quantum scratch space?”

We wouldn’t be hearing about it had quantum not won. It delivered a 100% success rate versus 87.5% for classical compute.

“We show that qubits, even today's noisy qubits, offer more value than bits as a medium of storage during computations," said the researchers. 

IBM stresses that classical computer bits (which can store information as either a 0 or a 1) will still be needed even as scientists explore the probabilistic states of qubits.

“Quantum computing will not replace classical computing, it will extend and complement it,” IBM state. “Even for the problems that quantum computers can solve better, we will still need classical computers. Because data input and output will continue to be classical, quantum computers and quantum programs will require a combination of classical and quantum processing.”

Different ways to skin a (Schrödinger's) cat

There are many different ways to build a quantum computer, depending on the type of qubit technology used. Quantum hardware companies are currently trying to scale up their quantum computers and include enough error-corrected qubits to reach quantum advantage.

Finnish start-up IQM, for example, uses superconducting qubits in its quantum hardware, similar to Google and to Rigetti. Canadian start-up Xanadu takes a different approach, using photons rather than electrons to carry information. PsiQuantum and Orca Computing are also developing photonic hardware; PsiQuantum expects a commercial quantum computer as soon as 2025. Another route to quantum hardware, called ion-trap, is being pursued by IonQ, Universal Quantum, and Oxford Ionics.

“We don’t yet know which type of hardware will win the race and be able to run commercially relevant applications,” says software developer Riverlane, which is why its quantum OS is designed to be cross-platform.

A thousand qubits is just the start. Forecasters envisage systems calculating with 1 million qubits in the not-too-distant future.

There’s no doubt there’s a race to what is called ‘Quantum Advantage’. This is when a computing task can be performed more efficiently, more cost-effectively, or with better quality using quantum computers – and therefore when there is money to be made.

Getting to Quantum Advantage will not happen overnight, admits IBM, although its subsequent argument sounds like a sales pitch for more PCs.

“Organisations that enhance their classical computing capabilities and aggressively explore the potential for industry transformation will be best positioned to seize Quantum Advantage.”

The types of applications for future state quantum computing include discovering new drugs, managing financial risk, and re-engineering supply chains. Those backing quantum computing hope that as the technology emerges it will accelerate solutions to increasingly complex societal, macroeconomic, and environmental problems on a global scale.

Where have we heard that before? Ah yes, just a few years back when telco operators and cheerleaders at the GSMA wanted governments to open up frequencies and invest in 5G networks.

We’re still waiting for those grand claims to come to pass.

 


Saturday, 3 July 2021

Quantum Computing just got desktop sized

RedShark

Quantum computing is coming on leaps and bounds. Now there’s an operating system available on a chip, thanks to a Cambridge University-led consortium whose vision is to make quantum computers as transparent and well known as the Raspberry Pi.

https://www.redsharknews.com/quantum-computing-just-got-desktop-sized

This “sensational breakthrough” is likened by the Cambridge Independent Press to the moment during the 1960s when computers shrank from being room-sized to sitting on top of a desk.

Around 50 quantum computers have been built to date, and they all use different software – there is no quantum equivalent of Windows, iOS or Linux. The new project will deliver an OS that allows the same quantum software to run on different types of quantum computing hardware.

The system, Deltaflow.OS (full name Deltaflow-on-ARTIQ), has been designed by Cambridge University startup Riverlane. It runs on a chip developed by consortium member Seeqc using a fraction of the space necessary in previous hardware. Seeqc is headquartered in the US with a major R&D site in the UK.

“In its most simple terms, we have put something that once filled a room onto a chip the size of a coin, and it works,” said Dr. Matthew Hutchings, chief product officer and co-founder of Seeqc in a press statement. 

“This is as significant for the future of quantum computers as the microchip itself was for commercialising traditional computers, allowing them to be produced cost-effectively and at scale.” 

Quantum computers store information in the form of quantum bits, or qubits. Like Schrödinger's cat (a thought experiment that would not have had the same colloquial impact had Schrödinger chosen an inanimate object), qubits can exist in two different information states at the same time.

But for quantum computers to be truly powerful they must be able to scale up to include many more qubits, making it possible to solve some seriously challenging problems. 
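To make the superposition idea concrete, here is a minimal state-vector sketch (Python with NumPy; purely illustrative, and nothing to do with how Deltaflow.OS itself works): a Hadamard gate puts a qubit into an equal mix of 0 and 1, and every extra qubit doubles the state that has to be tracked.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # the |0> state

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0
print(np.abs(psi) ** 2)  # [0.5 0.5]: a 50/50 chance of measuring 0 or 1

# Scaling: n qubits require a state vector of 2**n amplitudes.
for n in (2, 10, 30):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```

That exponential growth in the state is both the source of quantum computing's power and the reason scaling the control hardware is so hard.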

“Where it took a rackful of electronics to control the qubits, now it’s available on a chip the size of a penny,” Hutchings explained. “All the functionality is on a chip, so we’ve solved the issue for the quantum era.” 

Riverlane has a grand vision: an operating system that makes quantum software portable across qubit technologies – scalable to millions of qubits.

“That teases the highest possible performance out of every qubit – even for applications like error correction that need fast feedback loops,” its mission states.

Deltaflow-on-ARTIQ is the first step towards achieving this vision.  

What will quantum computing be used for? 

With enough qubits, quantum computers can process complex calculations at very high speeds. 

One application for this vast processing power is simulating digital versions of chemical compounds, to test out theories and predict real chemical reactions without the use of a physical lab. According to Riverlane, bringing a new drug to market takes around $1bn and many years of research, tests, and clinical trials. Quantum computing could offer a short-cut.

The benefits of that, post-COVID-19, are clear.

Another benefit: better batteries. The UK government has ambitious aims to achieve net-zero emissions by 2050.  In a similar way to drug development, quantum computers can be used to create a ‘virtual lab’ environment that enables a much faster, less expensive, and more robust way to screen battery materials. This sustainable method will allow for improved research and development towards a cleaner future. 

Quantum developments in logistics, weather prediction, cybersecurity, and finance are expected. 

Other members of the Riverlane group include Hitachi Europe, Oxford Quantum Circuits, chip designer Arm and the National Physical Laboratory. They will evolve their technology and develop firmware for their quantum processors that will later interface with Deltaflow.OS. 

Hitachi Europe, for example, is building a quantum computer based on the same microprocessor technology currently found in laptops, cars and mobile phones.

Riverlane is not the only quantum specialist in Cambridge. Cambridge Quantum Computing says it is building tools for the commercialisation of quantum technologies whose long-term impact will be profound.   

In fact, there’s a bit of a race on to transform quantum computers from experimental technology into commercial products. The name for this is “quantum advantage” and it’s been likened to a gold rush – as in this article at Nature.

As Riverlane says, “Quantum computing is a high-risk industry that has the potential to lead to huge rewards. The huge amount of private and government investment suggests that it’s not a question of if, but when, with some estimating that we could reach quantum advantage in just five years.” 

If the flourishing quantum ecosystem continues to grow at pace, then perhaps the revolution might happen sooner than we think. 

By the way, Riverlane is hiring. 

 

Thursday, 1 July 2021

How Arena TV brought advanced remote production to The FA Cup Final

IBC

The BBC’s coverage of the FA Cup Final in May was remote-produced, with production teams working away from Wembley at Arena TV’s newly outfitted production hub and utilising CoreTX. This new Arena solution is a decentralised, cloud-based production service that includes a Production Centre based at Arena’s HQ in Redhill, Surrey, and the ability to pop up CoreTX galleries anywhere.

https://www.ibc.org/how-arena-tv-brought-advanced-remote-production-to-the-fa-cup-final/7693.article

In either case, the source media is held on site, with feeds remote-controlled from surface applications over a Net Insight Nimbra media router, bringing all the benefits of cost, time and carbon savings.

“It’s a different way of producing a programme where key editorial teams are detached from the venue teams, but it went really, really well,” says Phil Bigwood, Executive Producer, BBC TV Sport.

Beginning of the journey

“Our journey to remote production began back in 2016 when we built and launched our first all-IP OB vehicle,” explains Peter Love, Director of Operations at Arena. “It opened up the potential for us to connect multiple IP-enabled cameras on location and remote the feeds over a Net Insight Nimbra system to a central location without needing to ship the large amount of equipment you’d traditionally have on-site.”

With the seeds planted, Arena developed CoreTX, which it believes will change the way sport is produced.

Evolution

Arena’s remote production solution has evolved in concert with the BBC, and in particular through a contract with the broadcaster to produce the presentation for FA Cup matches.

To meet the BBC’s drive to reduce the environmental cost of sending kit and crew to venues, Arena worked with studio facilities provider Dock10 to send remote presentation feeds and select ISOs from FA Cup stadiums into a pre-existing fully routed and monitored MCR at MediaCityUK in Salford. Presentation was switched at Dock10 and transmitted from Dock10.

This remote gallery solution was first used during the fourth round of the FA Cup in February 2020 and subsequently for the 2020 FA Cup Final aired live on BBC One. This application had the added benefit of enabling BBC Sport’s production team to be based in a Covid-safe gallery at Dock10, while reducing the amount of crew and kit sent to Wembley Stadium.

“Essentially what we were able to do was to provide the production team with the same facilities they would have on site uncompromised by being accessed at a remote gallery,” explains Daf Rees, Arena’s Head of Special Projects. “We were able to do a live test a few days before we went on air on that first job in February 2020. Afterwards the match directors said ‘Well that was just like being on the truck wasn’t it?’ We were all very happy with it and we’ve been refining and expanding the solution ever since.”

The pandemic brought a new urgency to this initiative. “Before Covid, remote production had always been a swear word,” says Love. “Production teams and directors wanted to go to site. Outside broadcast was what they were used to, so it was an uphill struggle to push the idea of remote forward. The pandemic has accelerated these workflows right across the broadcast industry.”

CoreTX on the road

The next phase for Arena was to build a gallery as a flypack for clients to put wherever they want, rather than having to use an already kitted facility. All feeds are remote-surfaced into the CoreTX mobile gallery, meaning the presentation team is able to control all the hardware, including CCUs, which remain on site. The presentation team has the same full-size sound desk and full-size vision mixer, plus all the talkback panels and replay controls they are familiar with, and access to every single source that the OB truck has on site.

“Our goal with CoreTX is to provide the same resources and same experience as a client would normally have on-site, but with all the advantages of a remote workflow,” Love says. “Instead of sending a separate presentation truck we are able to send just one while using the sound and vision cores much more efficiently.”

He continues: “The crew have got more space, the overall environment is more comfortable, you’re not limited to the OB compound in bad weather. The gallery can be installed quickly, fired up on demand and switched off after the event saving on power and energy. There’s no need for permanent equipment and no need to keep the machines running 24/7.”

CoreTX Production Centre

Simultaneously, Arena was building out the heart of CoreTX. This is a centralised production hub based at its Redhill HQ to handle live productions. The aim is to maximise resources by allowing production teams to support multiple events in a day.

CoreTX at Arena houses large galleries, scalable graphics and VT replay facilities, presentation studios and edit suites, all connected by two geographically diverse 40Gbps fibre links to BT Tower and Telehouse North.

The solution requires only the camera heads to be deployed at the remote location, reducing the bandwidth required by 90%. Using this workflow, the production team can manage everything from camera matching, colouring and shading through to editing and graphics from a home base, establishing a true centralised production.

Confidence guarantee

All of this is underpinned by Net Insight Nimbra 600. “Nimbra is a very robust piece of equipment that enables us to send diverse feeds with confidence,” says Rees. “We can simulate any distance within the kit without actually running signals down lengths of cable. We can verify whether signals handshake or not. Does a sound desk surface or vision mixer or KVM work over tens or hundreds of kilometres away? We can test all this ahead of time.”
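The delay being simulated is dominated by simple physics: light in fibre travels at roughly two thirds of its vacuum speed, so each kilometre of path adds a predictable slice of latency. A rough sketch of that arithmetic (plain Python; the path lengths and the 200,000 km/s figure are illustrative approximations, not Arena's or Net Insight's measured numbers):

```python
SPEED_IN_FIBRE_KM_S = 200_000  # light in glass: roughly 2/3 of c

def round_trip_ms(km: float) -> float:
    """Approximate fibre round-trip delay, ignoring switching and equipment hops."""
    return 2 * km / SPEED_IN_FIBRE_KM_S * 1000

FRAME_MS = 20  # one frame at 50 fps

# Illustrative path lengths, not measured circuit distances.
for label, km in [("Redhill <-> Wembley", 40),
                  ("London <-> Salford", 300),
                  ("London <-> Sydney", 17_000)]:
    rtt = round_trip_ms(km)
    print(f"{label:20s} ~{rtt:7.2f} ms round trip "
          f"({rtt / FRAME_MS:.2f} frames at 50 fps)")
```

On these rough figures a UK hop adds well under a frame of delay, which squares with the match directors' verdict that the remote gallery felt just like being on the truck.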

This proof of concept led the BBC to greenlight remote production of the FA Cup Final, in which all feeds from Wembley were controlled at Arena.

“Telling a client that you’re using the Nimbra to remote their production is one thing, showing them its performance is another,” says Love. “We simulated the distance between Redhill and the stadium using the Nimbra, so we were thoroughly confident it would run the real distance.”

He continues: “Another advantage of the Nimbra is for venues that don’t necessarily have point-to-point connectivity. You could, for example, send a second link over the public internet. It’s not something a client necessarily wants to do yet because they are used to having dedicated connections, but we are bringing the option online because we believe this dramatically opens up opportunities for live production. Ultimately, we want to do it down a 1Gb pipe with back-up over 5G. Nimbra can take us down that journey.”

CoreTX and the future of live event production

The CoreTX technology is set up to handle everything from small six-camera OB vans up to 60-camera full-scale OB operations. It’s also capable of supporting decentralised production worldwide.

“CoreTX is highly scalable both as a production centre and with location-based elements,” explains Love. “Whether that is augmenting client facilities through the addition of our cloud services, or by working with fellow independent studios looking to enhance their capability for a particular project. We can use any resource in the cloud or in trucks on the road and studios located anywhere and draw it all together. There’s no reason someone can’t fire up a studio in Australia and have a production team in London.”

There’s now a real push to make sure everyone within the supply chain is focused on meeting sustainability targets. CoreTX is Arena’s solution for this.

“The key with CoreTX is that any part or all of the OB truck can be remote if you want it to be. There is so much you can do,” Love concludes.