Monday, 12 July 2021

Sure Seems Like Cinema and Streaming Can Coexist

NAB

Consumers don’t distinguish between content sent FTA or OTT — it’s all just television. So could the way we feel about cinema be headed this way too? The choice to watch fresh movie releases day and date in theaters or on screens at home is here to stay, at least while COVID lasts, and probably beyond.

https://amplify.nabshow.com/articles/sure-seems-like-cinema-and-streaming-can-coexist/

The numbers are starting to stack up.

The opening weekend results for Marvel Studios’ Black Widow are the latest test case. It recorded the largest domestic box office debut since the pandemic began and the largest domestic opening weekend since Star Wars Episode IX: The Rise of Skywalker in December 2019, amassing $80 million.

The film topped that with another $78 million in overseas markets and earned another $60 million from Disney+ subscribers paying $30 (£20 in the UK) via its Premier Access program, for a cumulative $215 million. Disney was so pleased with its hybrid cinema-SVOD strategy that in an unusual move it issued figures for the film’s streaming performance.

The take-away is that theatrical versus streaming is no zero-sum game. SVOD’s gain need not be proportionate to exhibition’s loss. Given the right product (for Black Widow a strong female lead, an action franchise and decent reviews; see also A Quiet Place Part II), audiences welcome the chance to choose whether to watch on an IMAX screen or a tablet.

Strong recent box office showings clearly demonstrate an appetite for returning to theaters. Covid fears undoubtedly played a part in some choosing to pay a similar amount direct to Disney at home. Audiences want the big budget immersion of something made for cinema whether they view it on the biggest screen or not. The studio wins both ways.

Exhibitors might argue that they miss out on vital concessions revenue from punters staying at home. On the other hand, that Black Widow was successfully “event-ized” as a must-see by being exhibited in theaters surely boosted the coffers of both cinema chain and Mouse House.

Also factor in the 100+ million accounts that Disney has amassed for Disney+ worldwide. If you’re already a subscriber it is easier to pay for a one-off. That almost frictionless purchase is not the case with less ubiquitous streaming platforms which may instead choose to offer new titles as part of the bundle in order to build their base.

For example, Warner Media/Discovery’s HBO and HBO Max share 44 million subs, in part boosted by offering Godzilla vs. Kong at no additional cost to subscribers on the day it released in theaters.

Disney has the mass to make Premier Access work. It’s not clear that other studios do.

Netflix also offers critical mass (north of 200 million households worldwide) but its revenue model is not dependent on theatrical. Certain prestige titles will be given the cachet of a cinema outing to boost awards profile and salve the conscience of A-list directors, but exhibition is not a deal breaker and Netflix remains locked out of Cannes.

Movie release and production budget strategies have been ripped up by the pandemic. Talent agents are pondering how to give their clients points on a picture when the traditional means of judging a film’s performance in various windows has been smashed. Meanwhile platform owners are in the habit of releasing streaming figures only when they have a hit. That lack of transparency is playing havoc with the basis on which to value a movie going forward.

There does seem to be symbiosis between theaters and streaming that will see cinema bounce back, if not to pre-pandemic heights then close to it, but the old certainties such as they were are gone for good.

 

Oh, Yes, Remote Media Workflows Are Here To Stay

NAB

The COVID-19 pandemic has prompted a dramatic reshaping of media workflows. Tasks, processes and business functions never seriously considered as candidates for being done outside the TV studio have moved wholesale offsite as organizations implemented strategies to reduce the risk of employees contracting the virus.

https://amplify.nabshow.com/articles/oh-yes-remote-media-workflows-are-here-to-stay/

Widespread adoption of offsite working looks set to continue post COVID-19.

One of the latest surveys charting this trend is from remote production tech firm Teradici in conjunction with TV Technology.

It found a wide swath of employees have been affected, ranging from management and others in the business office to those in the newsroom, production and master control rooms, traffic department and on air.

Many newsroom workflows are now being done remotely. More than eight of 10 respondents said digital workflows, including social media and web distribution of news, as well as video editing are being done offsite. Seven in 10 respondents said title creation, newsroom computer functions and editorial meetings are also being done remotely.

With one year of experience working during the pandemic, the prevailing attitude is that remote workflows will remain once the pandemic has passed.

Reasons range from improved job satisfaction and productivity to reducing the real estate footprint of the studio and the ability to attract fresh talent to the operation.

Significantly, over 75% of respondents said that more than half of their organization’s staff could work remotely, further pointing to a future in which more employees execute their duties from home.

The survey also revealed a fairly even mix exists among organizations that prefer their employees remotely access existing workstations on site and those that prefer they work in the cloud. However, twice as many respondents said their organization’s preference depends on what workflow is involved.

Further, there appears to be a high degree of uncertainty over whether or not the economics are right at the moment to move media workflows to the cloud.

However, regardless of how they do it, it appears media organizations — and broadcast and cable TV in particular — have made a breakthrough of sorts in attitudes about and implementation of remote workflows. While not as apparent as the changeover from black-and-white TV to color or from SD television to HDTV, the transformation of how the work of television gets done into a model largely rooted in remote workflows may one day prove no less significant.

 

Friday, 9 July 2021

FileRunner Keeps the Engine Running at Ignition Creative

copywritten for Sohonet

https://www.sohonet.com/our-resources/blogs/filerunner-keeps-the-engine-running-at-ignition-creative/

The enforced closure of cinemas and knock-on delay in movie releases was taken in its stride by Ignition Creative. The creative production agency shared in the huge uncertainty of studios having to rethink theatrical campaigns amid fears of evaporating exhibition revenue, but helped clients pivot to new strategies.

Ignition’s 40+ team, based in London and LA, provide essential marketing services to support film and TV title releases from major studios. Its work includes the cutting and finishing of trailers for cinema projection to creative execution of trailers and other assets for TV, Facebook, YouTube and the web.

“the continuation in production under lockdown and pent up release of content has provided a lifeline for the industry.”

“The past year has been one of huge uncertainty for everybody in the film business and especially film marketing as situations kept changing,” says Nathaniel Durman, IT Manager, Ignition Creative. “That said, the continuation in production under lockdown and pent up release of content has provided a lifeline for the industry.”

Sohonet FileRunner was already used for part of Ignition’s business catering to in-house content creation for clients. Sohonet’s internet connectivity is core to workflows at its Soho facility.

“For this type of production work we were regularly sending 100GB files and sometimes terabytes of data around and FileRunner is best at handling content of such large sizes at speed,” Durman says. “Combined with Sohonet internet connectivity and we have a perfect synergy of service.”

FileRunner was only used for around 10 percent of Ignition’s workload prior to Covid-19. Since March 2020 it has been in routine demand.

“FileRunner has this two-factor authentication system which sounds so simple when you say it, but it’s really extremely effective and highly secure. More than anything, FileRunner is completely reliable. We’ve never done work from home on this scale or switched to remote overnight, but FileRunner was a massive part of that reorientation. It’s a really seamless piece of software that runs in the background and has kept us going.”

Over the course of the year, Ignition Creative has transferred more than 10 terabytes of data using FileRunner.

“While remote collaboration works really well there’s nothing like having that creative energy of everyone bouncing ideas off each other in an office,” Durman says. “Also, it means we can get back to performing sound mixes and colour grading in professional suites and with colour-accurate monitors. Plus, we know clients are eager to get back to attending sessions.”

He says there will be a happy medium between working in the facility with home office production. “Everyone is really comfortable using FileRunner and more importantly there has been no interruption for any client. It has never caused us a problem and it will be essential to our workflows in future.

“Although there is still a degree of uncertainty about unlocking from lockdowns due to health concerns, we are basically as confident and as comfortable working from home as in the office, and that’s because Sohonet has stabilized the transition.”

 


Thursday, 8 July 2021

New Immersive Experiences: Inside the Illuminarium

NAB

What museums are to art, cinemas to movies and concert halls to music, Illuminariums are to experiential entertainment. That at least is the grand claim of a new visitor attraction that launched July 1 in Atlanta, in a project which could expand to a city near you if successful.

https://amplify.nabshow.com/articles/new-immersive-experiences-inside-the-illuminarium/#.YObFs-vT5wQ.twitter

It also fits into the growing desire for, and market in, virtual experiences shared in concert with other people, and therefore plugs right into the Metaverse.

Illuminariums are “reprogrammable immersive theaters that surround visitors in a sensory space of sight, sound and scale,” according to operator Legends, which already runs over 150 venues and attractions around the world.

The Illuminarium concept is backed by $100 million in funding from various investors. 

The heart of the enterprise is an 8,000-square-foot room featuring a 350-foot-long, 22-foot-high projection screen offering a 240-degree field of view. On this, visitors can view a 50-minute film called Wild which emulates being on safari.

“We are in many ways VR without the glasses,” says Alan Greenberg, Illuminarium’s CEO. “VR’s a singular experience, hard to share, hard to talk to people, hard to have a ‘wow’ moment with somebody. We’re not strapping a computer on your back and fitting goggles on your face.”

A second location will open in Las Vegas next January, with another at Mana in Miami planned for fall 2022, and further homes under consideration including New York City, Chicago, Toronto, Montreal, LA and Austin.

According to Fast Company, the actual experience of visiting Illuminarium will involve a timed entrance into an 8,000-square-foot room where the walls and floor will be covered in the projected safari film. Broken down into distinct chapters covering different parts of Africa, the film itself is nonlinear, and able to be entered at any point in its roughly 50-minute run time. Visitors can walk throughout the space or find a place to sit.

It's the kind of attraction that has been a staple of theme parks since at least the 1950s. It’s reminiscent of French theme park Futuroscope which opened in 1987 housing multiple moving image experiences. It included attractions with screens on the floor and walls, motion simulators, projection domes and an IMAX screen showing the first IMAX dramatic film ‘Wings of Courage’ starring Val Kilmer (prior to then IMAX films were docu-style immersions of natural wonders or space). 

RadicalMedia CEO Jon Kamen says the sheer scope of its video projection and cutting-edge interactive elements will make the Illuminarium experience unique.

“It’s only in the last few years that you could really contemplate doing what we’re doing,” he says.

This includes haptic effects in the floor that will make visitors “feel the rumble of a lion walking nearby”. Responsive elements in the bar space will feel “like a flock of birds that bursts from a tree when visitors approach.”

Other elements, drawing on decades of sensory cinematic gimmicks, include “dust” that might kick up as you walk by, and even “authentic scents” that will let you get a whiff of your virtual surroundings, says Variety.

The kit list 

Techwise, the $10 million film was shot by production company RadicalMedia on location in South Africa, Botswana, Kenya and Tanzania using a specially built array of six cameras. Images are stitched together to form the panorama and projected using Panasonic 4K laser projectors.   

Panasonic has also made “a unique lens for Illuminarium to produce an enhanced immersive experience”. According to the firm, Panasonic's engineers collaborated to create for Illuminarium an ultra-short throw lens with minimal offset and loss of light.  

Other manufacturers involved in the venture include Holoplot, which provides “proprietary beamforming and wavefield synthesis technology with the ability to localize and isolate sound” across the venue; Ouster, whose OS0 ultra-wide view lidar sensor responds to guests’ movements; and Powersoft, which is responsible for the haptic infrasound floor.

Adjacent to its venue in Atlanta, Illuminarium is building an R&D and post-production center called The Illuminarium Lab. Panasonic, Holoplot and XR specialist Disguise are involved here in developing future experiences.

 

Like walking into a film  

The technological hurdles were only part of the challenge, says RadicalMedia CEO Jon Kamen. Creating a film for this type of space—with people moving through, entering and leaving at different times, and only able to see a small amount of the entire experience at any one time—called for a new kind of storytelling. 

“It’s a bit of a mind-bender, to be honest,” he says. “You have a much bigger physical responsibility, because anybody in the room can be looking in any direction at any time. It’s a completely different discipline of filmmaking.” 

Venue designer David Rockwell, founder of Rockwell Group, added perches and areas where visitors can peer out into the space or walk right up to elements of the film.  

“To tell a story spatially you have to leave seams in the story for audiences to find themselves in it,” he says. “If it’s a hermetically sealed, complete story, there’s very little opportunity for people to bring themselves into it.” 

Exit through the gift shop 

The core business model of the Illuminarium is depressingly familiar. The press release says visitors can also “experience” The Illuminarium Café, offering an outdoor patio “facing the BeltLine”. The café is being sold as an “extension of the immersive experience content, serving authentic dishes, beer, and wine from the African continent.” 

A $50 “all-inclusive” ticket gifts you a $10 voucher for the café or gift shop which will “retail a wide variety of gifts inspired by the safari experience.”  

Further, Illuminarium venues will be promoted as nightlife destinations with a bar “letting visitors experience different virtual settings, from a Tokyo city street to fantastical dreamscapes.” 

All this is perfect for Vegas, where Paris and Venice themes barely disguise shopping malls, but it may not live up to the hype if visitors come expecting experiences that truly immerse them in the atmosphere of places they have never been.

According to the PR, “Each environment evolves throughout the night to deliver ever-changing visual destinations rendered in real-time. Guests may toast friends as they float on billowing clouds that overlook a glowing sunset; the following night they may encounter the fluorescent light animations and holograms decorating the surfaces of a futuristic street in Tokyo.”

I’m reminded of the distinctly underwhelming World Showcase at Disney World’s Epcot, which claims to represent different cultures like Mexico, France, Italy and the UK but whose pavilions are little more than shells for overpriced cafes and gift shops.

Not stopping there 

Illuminarium expects to have 25 to 30 of its venues open in the world's “great megacities and mega tourism locations” within the next five years.  These will be launched as joint ventures.  

After ‘Wild’, its next production, ‘Spacewalk,’ will let visitors “stroll across the Moon and Mars.” Also planned is a film about the depths of the world’s oceans.

“It’s not a movie,” says Greenberg of the space trip. “You’re going to be able to walk on the surface of the Moon!” 

For that you’d need a gravity field about 1/6th that of the Earth. It can be done (see NASA flight sims) but I’d wager the experience will be more like a glorified terrestrial planetarium. 

 

 

Invest in Digital Skills and AI to Improve Creativity

NAB

As AI advances into all industries it will eliminate millions of jobs but will generate more employment than it consumes. That general finding from the World Economic Forum last year would seem to hold true for the creative industries too, where there’s growing understanding that AI/ML can help talent get to their core competency — creativity — faster by speeding up the time-consuming mundane stuff.

https://amplify.nabshow.com/articles/invest-in-digital-skills-and-ai-to-improve-creativity/

According to the World Economic Forum report, the rapid acceleration of automation and economic uncertainty caused by the pandemic will shift the division of labor between humans and machines, causing 85 million jobs to be displaced and 97 million new ones to be created by 2025.

Many of these are tech jobs requiring skills in artificial intelligence, blockchain, data security and emerging coding languages.

Those are all relevant skills to the future of media, which is why media organizations had better take note of a survey from LinkedIn, which reports that resilience and digital fluency are the skills most prized by learning and development professionals.

Resilience here means the ability to adapt to rapid-fire change — a clear consequence of the pandemic. Digital fluency means having the technology skills to operate effectively in an increasingly digital world. It includes everything from understanding how to communicate with video to advanced artificial intelligence.

This trend is global. Resilience and digital fluency landed the #1 or #2 spots across every country LinkedIn surveyed, including the US and Canada, France, Australia, Southeast Asia, and India.

Companies and governments are stepping up to the plate to retrain millions of their staff for the digital economy. As cited in the report: JPMorgan Chase added $350 million to its existing $250 million plan to upskill its workforce. Amazon is investing over $700 million to provide upskilling training to their employees. PwC is spending $3 billion to upskill all of its 275,000 employees over the next three to four years; Microsoft (LinkedIn’s parent company) said it would upskill 25 million people with LinkedIn Learning programs.

“The goal is not to replace the animator but to get it to the point where the animator can bring it to the next level. AI has a lot of potential to help express our creative potential by simplifying a lot of frustrating tasks and accelerating work and enabling artists to get that aha! moment as quickly as possible.”

— Roy C. Anthony, DNEG

AI/ML has entered media and entertainment to automate speech to text captioning or to analyze and reduce the cost of storage. Algorithms are also considered foundational to next-gen video compression schemes.

On the creative side, AI tool-sets such as Adobe Sensei are already helping creatives speed the process of video assembly. Colourlab Ai can quickly match footage, taking the pain out of the more laborious aspects of grading and allowing colorists to make the most of their time by focusing on the more creative aspects of the task.

Other AIs can assist with much of the routine work of visual effects. Researchers at the University of Toronto have built an AI that uses audio as an input to drive character animation for multiple languages.

“The goal is not to replace the animator but to get it to the point where the animator can bring it to the next level,” argues DNEG’s global head of research Roy C. Anthony. “AI has a lot of potential to help express our creative potential by simplifying a lot of frustrating tasks and accelerating work and enabling artists to get that aha! moment as quickly as possible.”

A 2020 study by London Research found that content creators only spend about 48% of their time actually creating content, with the rest being spent on administrative tasks associated with content creation.

Marketing and creative teams in particular could benefit from using creative automation tools to overcome time-consuming, low-value tasks — whether it is tweaking banner ads, making small changes to campaign videos, or editing the copy on a Google ad.

“If it’s used for data driven content, it opens new possibilities in terms of creative outputs,” brand and visual designer Pamela Giani tells Creative Review.

“Imagine you want to create 15,000 videos based on real-time data and publish them online on different platforms. That is just impossible to achieve without automation.”

She continues, “I think automation can be great to allow us to spend more time on creative thinking and skip the manual and low skill-based tasks. Because humans make mistakes, I think machines allow for a higher level of consistency and quality.”

Her company, Monzo, won’t be the only one exploring the potential benefits, as well as the downsides, of automation.

It seems that the introduction of AI is best run in parallel with upgrading the digital skills of the workforce. If that happens, there could be a huge upside in freeing up human attention for actually creating.


Behind the Scenes: The Tomorrow War

IBC

During a televised World Cup soccer game in Miami, time-traveling soldiers from the year 2051 appear on the pitch with an urgent message for the planet. In this case the enemy is a terrifying alien species but of course it’s a metaphor for climate catastrophe. Not that The Tomorrow War is too heavy handed about it. Director Chris McKay envisions the Amazon release as a military sci-fi version of It’s a Wonderful Life starring Guardians of the Galaxy’s Chris Pratt.


https://www.ibc.org/trends/behind-the-scenes-the-tomorrow-war/7716.article

 

“Chris [McKay] and I wanted the film to feel relatable, especially regarding the present day, home and family sequences,” explains Larry Fong ASC. “But we also wanted epic scope and breathtaking visuals. So basically, we tried to apply some restraint for half of the film, and then cut loose for the rest.” 

 

Having shot a number of blockbuster science-fiction and fantasy films in the past, such as 300, Watchmen, Batman v Superman: Dawn of Justice and Kong: Skull Island, Fong was no stranger to shooting movies about aliens and post-apocalyptic battlefields.

 

Peter Wenham, the production designer, delivered “fantastic, original concept art” but Fong prefers to apply his own ideas whenever possible.  

 

“I’m always excited to try and come up with new visual approaches rather than be influenced too much by other movies. I prefer to visualize, dream, meditate, whatever you want to call it, as I go through the script or walk the sets and the locations, and bounce these ideas off of the director, production designer, gaffer, key grip, camera operators, DIT… anyone who will listen. The process is to narrow down and distill things into the best ideas and then figure a way to translate them into a tangible approach on set.” 

 

McKay, who made his feature-film debut with the Lego Batman Movie, was able to use his extensive background in animation to direct a live-action movie where the aliens are, for the most part, created in CG.  

 

“Chris wanted the movie to have a classic vintage look with as much in-camera as possible,” Fong says. “We decided on shooting anamorphic early on to achieve the epic scope and grandeur—not only for the 2.39 aspect ratio, but for the texture that anamorphic glass brings. I used Panavision T-Series lenses that were modified a bit to my specs by Dan Sasaki [Panavision’s VP of Optical Engineering]. It was an easy choice to go with ARRI Alexas, which have amazing colour science inherent in their design.”

  

Additional testing with the lenses and the Alexas helped Fong’s team tune them for the desired flare characteristics— “always an important thing for me,” he says. 

  

Additional ultra slow-motion shots were made with a Phantom, and Alexa Minis were deployed for helicopter aerials and several drone shots. GoPros captured some surveillance camera type of shots. The material was principally shot at 2.5K. “The studio, post-production, and VFX were all good with that. I didn’t have any reason to shoot higher.”

 

He used a base LUT supplied by Steve Yedlin, ASC. From there, Fong and DIT Robert Howie modified it to develop each sequence’s preliminary look. Howie also made sure the dailies matched that vision each day. “That’s an important part of the workflow. Fotokem took care of us very well in that regard.”

 

 

On location in Iceland 

 

Because McKay wanted The Tomorrow War to feel real rather than hyper-stylized, he chose to shoot on location and limit the amount of green screen used. That meant shooting in Iceland, which in the story stands in for Russia, on Europe’s largest glacier, Vatnajökull. 

 

Not just on the glacier either – but on its highest peak. “It took at least an hour, in various vehicles, to get there each morning,” says Fong. “My key grip, Gary Dodd, rigged a Technocrane on skis and snowboards which we towed to the top of the peak.” 

 

“The cold in Iceland is absolutely brutal on the camera equipment,” says Fong, who once shot a car commercial there. “It affects all the electronics, and you have to keep the cameras warm because the lenses tend to fog up if you’re not careful.”

 

A descent into an ice tunnel set required an absolute commitment to an approach, he says. “The only way to move the camera through the small, angled space with the actors was with a Technocrane, and we had to determine the exact places for art department to make holes to push the arm though. Grips had to build scaffolding to get the crane base to the correct height. It takes a lot of engineering and measuring to pull that off. 

 

“There are massive crevasses that go down for hundreds of feet, and if you fall into one, you’re gone. It’s also very challenging because the weather is constantly changing and you only have so much daylight available. We had about six hours of shootable time each day, and then the sun would just crawl across the horizon and go down, leaving us in the dark.” 

 

 While much of the film was shot on location, certain key sequences had to be filmed in a studio. 

 

“The light was beautiful…until a storm came in and we were unable to get all our shots. Therefore, we had to complete the scene on stage in Atlanta. Not the ideal situation, but we did our best. My gaffer, Jeff Murrell, came up with the idea of using a 100K SoftSun gelled warm to simulate the low sun we had in Iceland. VFX, editorial, and our colorist went beyond the call of duty to make the sequence work.” 

  

A scene of soldiers entering a long, dark, circular stairwell was shot at Pinewood Atlanta because of the control needed to perfect it.   

 

“The scene starts off so quietly, and then the music completely stops and you feel all of this tension begin to rise as they’re creeping further down the stairs,” Fong says. “Then when the last guy looks up and thinks he sees something moving in the shadows, we used a long, slow zoom to stretch that out. Some people are afraid to use zooms, but Chris and I agreed it was the perfect moment to include one. It makes the whole sequence feel super tense, and it forces the audience to strain to see what’s up at the top of those stairs.” 

 

 

Time jumps 

Time travel has played a role in countless films over the years, but McKay wanted a unique take on the concept.  

 

“We looked at images of the northern lights and the view of Earth from space, and at one point I showed [McKay] images from the Hubble Space Telescope because there’s something kind of intimate and mysterious about them,” says VFX supervisor James Price. 

 

“We designed a force field that forms above the draftees right before they jump in time. The travelers slowly rise up, pierce this membrane, and begin their journey.” 

 

We get the first glimpse of this at the beginning of the movie where dozens of people from the future appear out of thin air in the soccer stadium. 

 

To capture the effect, the special effects team ran tests using an underwater cloud tank to simulate the time-displacement effect. “Then we got the bright idea to use a practical wall of smoke instead,” says SFX supervisor J.D. Schwalm (First Man, Venom).  “We made the smoke thick enough so the camera couldn’t see through it, and then one of our stunt coordinators rigged the actors on wires and flew them through the wall of smoke. We added a bunch of CG electrical currents in post-production, and when you finally see it on screen it looks as though the actors are materializing out of thin air.” 

 

 

The army of White Spikes  

 

The ferocious extraterrestrial species in the film are known as white spikes: ravenous monsters that hunt in packs. McKay envisioned their skin as raw and covered with psoriasis and pock marks.

 

“They’re fundamental to the story and if you get them wrong then everything else can fall apart like a house of cards,” production designer Wenham says. 

 

The filmmakers looked at rhinoceros and hippopotamus skin, studied cheetahs and leopards for the way their joints allow them to run quickly, and examined snakes for their ability to unhinge their jaws. Marine predators and insects were also part of the research. 

  

“Shark eyes were my inspiration for their eyes, because there’s something very eerie about black shark eyes,” says concept illustrator Ken Barthelmey. The creature’s front arms are based on praying mantis claws, and their back plates resemble the shell of a grasshopper. 

 

With the design complete, a full-size creature puppet was constructed, along with several smaller pieces used for specific shots and a couple of animatronic white spikes for something physical for the actors to react to on camera. 

 

Principal photography wrapped two years ago, with release originally scheduled for Christmas 2020. Not only was the release postponed, but much of editorial, including recording the score, had to be done remotely.

 

“Luckily I was able to do the final DI grading in person with colorist Dave Cole, thanks to the protocols in place at Fotokem Burbank,” Fong says. 

 

Production was nearly derailed by the very effects of global warming the film warns about. When scouting locations in Iceland, the production team found several magnificent sites, including the Anaconda Ice Cave and the Blue Diamond Cave, but when they returned a few months later they found the caves had melted away.

 

Wednesday, 7 July 2021

These XR Advances Are Becoming a Reality

NAB

Robotic boots, AR contact lenses and haptic suits are some of the key advances in XR tech that are just around the corner.

https://amplify.nabshow.com/articles/these-xr-advances-are-become-a-reality/

Author and “futurist” Bernard Marr, writing for Forbes, looks at the tech advances coming our way in VR and AR and what these might mean for everyday life in the future.

LiDAR Will Bring More Realistic AR Creations To Our Phones

The iPhone 12 and iPad Pro already come equipped with LiDAR technology, and it’s reasonable to expect other devices will follow suit. LiDAR (Light Detection and Ranging) is essentially used to create a 3D map of surroundings, which can seriously boost a device’s AR capabilities.

“It can provide a sense of depth to AR creations — instead of them looking like a flat graphic. It also allows for occlusion, which is where any real physical object located in front of the AR object should, obviously, block the view of it — for example, people’s legs blocking out a Pokémon GO character on the street.”

This, Marr says, is vital for making AR creations appear more rooted in the real world and avoiding clunky AR experiences.
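The occlusion Marr describes is, at heart, a per-pixel depth comparison: the LiDAR map tells the renderer how far away the real surface is, and the virtual object is drawn only where it sits in front of that surface. A minimal, hypothetical sketch (a toy 1-D scanline, not a real AR API):

```python
# Toy sketch of depth-based AR occlusion. Per pixel: real-world depth comes
# from a LiDAR depth map, virtual depth from the renderer. The virtual pixel
# survives only where the virtual object is nearer the camera than the real
# geometry (e.g. a person's leg in front of a Pokémon GO character hides it).

def composite(real_depth, virtual_depth, virtual_color, background):
    out = []
    for rd, vd, vc, bg in zip(real_depth, virtual_depth, virtual_color, background):
        if vd is not None and vd < rd:   # virtual object is in front: draw it
            out.append(vc)
        else:                            # real object occludes it: keep background
            out.append(bg)
    return out

# One scanline: real surface at 2.0 m, except a person at 1.0 m in the middle.
real  = [2.0, 2.0, 1.0, 2.0, 2.0]
virt  = [None, 1.5, 1.5, 1.5, None]      # character rendered 1.5 m away
color = [None, "char", "char", "char", None]
bg    = ["street"] * 5

print(composite(real, virt, color, bg))
# → ['street', 'char', 'street', 'char', 'street']
```

The middle pixel shows the point of the LiDAR data: the character is hidden there because the real person (1.0 m) is closer than the virtual one (1.5 m), which is exactly what makes the AR object look rooted in the scene rather than pasted on top of it.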

VR with Hand Detection and Eye Tracking

Marr thinks sensors for these functions will be built into the next generation of VR headsets. Because hand detection allows VR users to control movements without clunky controllers, he says, users can be more expressive in VR and connect with their game or VR experience on a deeper level.

“Eye-tracking allows the system to focus the best resolution and image quality only on the parts of the image that the user is looking at (exactly how the human eye does). This taxes the system less, reduces lag and reduces the risk of nausea.”
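The technique Marr is describing is known as foveated rendering: full resolution is spent only around the tracked gaze point, with quality falling off toward the periphery where the eye cannot resolve detail anyway. A minimal, hypothetical sketch (the radii and rates are illustrative, not from any headset spec):

```python
# Toy sketch of eye-tracked foveated rendering: shading quality is a function
# of a pixel's distance from the tracked gaze point, so the GPU concentrates
# effort where the user is actually looking.

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius=100, mid_radius=300):
    """Return the fraction of full resolution used at pixel (px, py)."""
    dist = ((px - gaze_x) ** 2 + (py - gaze_y) ** 2) ** 0.5
    if dist <= fovea_radius:
        return 1.0    # foveal region: full quality
    if dist <= mid_radius:
        return 0.5    # near periphery: half resolution
    return 0.25       # far periphery: quarter resolution

# Gaze at the centre of a 1920x1080 frame:
print(shading_rate(960, 540, 960, 540))   # → 1.0  (where the user looks)
print(shading_rate(960, 340, 960, 540))   # → 0.5
print(shading_rate(100, 100, 960, 540))   # → 0.25
```

Because most of the frame is shaded at a quarter of full resolution, the system does far less work per frame, which is where the reduced lag and lower nausea risk Marr mentions come from.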

The XR Experience Will Be Accessorized

Marr gives the example of wearable robotic boots developed by the startup Ekto VR. “These provide the sensation of walking, to match your movement in the headset, even though you’re actually standing still. In future, accessories like this may be considered a normal part of the VR experience.”

Full-Body Haptic Suits

We already have haptic gloves, which simulate the feeling of touch through vibrations. But what about full body suits? The Teslasuit from British-headquartered company VR Electronics is one example you can buy today. “But they aren’t exactly affordable for everyday VR users,” writes Marr, who suggests that such gear will become cheaper and more effective in time.

Merging the Human Body with XR

The logical extension of an XR wearable is to internalize the device. This may eventually mean chip implants, but before then we’ll get AR contact lenses, themselves the natural successor to AR goggles. California-based startup Mojo Vision is working on AR contact lenses with micro-LED displays that place information directly in the wearer’s eye line.

Marr imagines the uses for the tech over and above helping those with poor vision.

“When demonstrating the prototype to journalists, the lenses displayed pre-loaded information like text messages and the weather report, indicating that AR lenses could help us consume content in new ways. It could also help us enhance our sight in low light conditions or even serve as a teleprompter for speaking events.”

AR lenses could potentially be used to augment the world around us. “If you hate the garish paint job your neighbors have done on the exterior of their home. In the future, your lenses could change it for you, and you’ll see whatever color house you choose,” Marr says.

It’s perhaps a trivial example, but it is not far-fetched, and it will further blur the boundaries between the real world and the virtual one.

Marr seems open-minded about the technology, and aware too of the ethical and data privacy pitfalls that need navigating. On balance he believes the potential benefits of XR far outweigh the challenges.

“At the end of the day, XR is about turning information into experiences, and this can make so many aspects of our lives richer and more fulfilling,” he says. “Certainly for business, XR offers huge scope to drive success, whether that’s creating immersive training solutions, streamlining business processes such as manufacturing, or generally offering customers innovative solutions to their problems.”

We shall see.