Wednesday 31 January 2024

HPA Tech Retreat and MovieLabs share vision for profound change

copy written for HPA

As HPA’s Tech Retreat approaches its 30th anniversary in 2025, the event appears stronger and more relevant than ever. With brisk, record-setting ticket sales and programming that touches on the industry’s most important topics, the retreat remains prescient.

article here 

“The HPA Tech Retreat has evolved way beyond Hollywood and beyond post production,” according to Leon Silverman, HPA founder and past president, who currently serves on the HPA board, in addition to his role as MovieLabs’ Advisor for Strategy and Industry Relations. “The Tech Retreat is increasingly recognized as an important international event reflecting the increasingly important role that technology plays in enabling creativity as well as the entire media ecosystem.”

This global reach and technological evolution are nowhere more apparent than in the MovieLabs 2030 Vision. What started as a high-level vision of thought leadership five years ago from the major motion picture studios has become a roadmap not just for Hollywood but the entire M&E industry from scripted content to broadcast to indie creators.

“We’re approaching five years into our odyssey towards ‘ProductionLandia’ – an aspirational place where media creation workflows are interoperable, efficient, secure-by-design, and seamlessly flexible,” Silverman said. “It is the destination – and the 2030 Vision is our roadmap to get there. What we are finding is that this Vision resonates well beyond Hollywood and that the HPA Tech Retreat has become the vital meeting place for fellow travellers to learn all about MovieLabs’ work while MovieLabs gains visibility into pioneering developments that would otherwise go under our radar.”

On the Thursday morning of the HPA Tech Retreat, beginning at 8:45 AM, MovieLabs has a full morning of programmed sessions demonstrating that the 2030 Vision is not just aspirational but is becoming reality, with several principles of the vision put into practice and deployed today across multiple workstreams and companies. The MovieLabs sessions will illustrate that the goals of the 2030 Vision are not unique to the studios but are shared across the industry.

“At MovieLabs, what we really love about the HPA Tech Retreat is learning how deeply the concepts of the 2030 Vision are resonating. The Tech Retreat enables attendees to catch up, but I’d go further and say that if you are not up to speed on the 2030 Vision, then you are not a part of the industry’s conversation.”

The table is set with a “State of the Vision” mini keynote by Mark Turner, Program Director, Production Technology at MovieLabs, which outlines where we are on the road to “ProductionLandia”, highlighting industry progress, identifying gaps that need to be filled, and providing updates on important MovieLabs initiatives.

Next, Turner will host two “speed date” case studies: Avid and Amazon will discuss lessons learned from their joint efforts to create a 2030 Vision-aligned “Studio in the Cloud” and how they plan to move forward. Underlining the extent to which the MovieLabs 2030 Vision is resonating worldwide, Japanese post production house IMAGICA will explain how it is beginning to implement software-defined workflows in the context of the 2030 Vision.

While the topic of ontology might not be at the top of many people’s industry radar, the vital role that metadata and common data models like the MovieLabs Ontology for Media Creation (OMC) play in enabling an interoperable industry will be the focus of the next MovieLabs session, led by MovieLabs CTO Jim Helman. A presentation from France-based digital asset management company Perfect Memory will demonstrate the role OMC plays in streamlining workflows and enabling interoperability. Australasia-based Rebel Fleet will provide a case study on the role OMC and metadata played on the HBO production Our Flag Means Death. This session will also feature an important industry discussion presented by Google and Paramount, which will make the case for an open standard for the exchange of metadata based on OMC.
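
To make the interoperability argument concrete, here is a minimal, purely hypothetical sketch of the kind of shared, machine-readable production record a common data model enables. The field names below are illustrative only and are not the actual OMC-JSON schema; they simply show why two tools that agree on one model can exchange the same record without translation.

import json

# Hypothetical, simplified record loosely inspired by the idea of a shared
# production ontology. These field names are illustrative, NOT the real
# OMC-JSON schema.
shot_record = {
    "entityType": "Asset",
    "identifier": [{"scope": "example.studio", "value": "shot-042-take-03"}],
    "name": "EXT_HARBOUR_NIGHT_042_T03",
    "context": {
        "productionScene": "42",
        "task": "dailies-review",
        "participants": ["editorial", "vfx-vendor"],
    },
}

# Any tool that understands the shared model can parse the same record,
# which is the interoperability case in a nutshell.
print(json.dumps(shot_record, indent=2))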

The last MovieLabs session of the day will also be led by MovieLabs CTO Jim Helman. Joining Jim will be representatives from studios and key industry tool providers who will not just be talking about interoperability but demonstrating and revealing significant new products and industry initiatives.

“These sessions, as well as the rest of this year’s Tech Retreat, will be important to all those looking not just to be up to speed, but to be in the ‘room where it happens.’ This year will be a full immersion course that will provide deep context across the many industry technology topics that are impacting, disrupting and empowering our industry,” Silverman noted. “Especially now, at this critical and challenging time in our industry – miss it at your career peril.”

Artificial Intelligence is a massive topic reflected in the HPA agenda where attendees will be guided through some of the latest AI technologies and ways to approach its use. This includes an HPA Supersession ‘Deconstructing an AI Created Animated Short’, a presentation on the ‘Use of Gen AI as a Screenwriting Partner and Preproduction Assistant’ and the future of ‘The Computable Studio’ delivered by Yves Bergquist of the ETC. A vital studio perspective on AI’s role in content creation is provided by Tony Guarino, EVP of Worldwide Technical Operations at Paramount Pictures.

Round tables, illustrative case studies and Q&As form the backbone of a packed four-day HPA Tech Retreat conference program that deeply explores the nuances of technological concepts embracing production, virtual production and content creation in all its forms.

“Most conferences and trade shows focus on product marketing. Simply not the case at the HPA Tech Retreat, which is authentically about sharing deep technical knowledge among peers,” Silverman said. “Its success is thanks to the generosity of our attendees who are committed to moving the industry forward. Hundreds of the smartest, most influential and geekiest people in our industry come to the desert to learn from each other in formal and informal discussions across four days from breakfast to the bar way past midnight. There is not one day that can be missed.”

Tuesday 30 January 2024

Behind the Scenes: Occupied City

IBC

article here

Steve McQueen’s new documentary juxtaposes the past with the present to bring the facts of the Holocaust into the light.

It seems we’ve now arrived at a new phase of storytelling about the Nazi era. While most of those with personal experience of the Holocaust have passed on, we are left with the trickier task of recalling history.

The Zone of Interest, directed by Jonathan Glazer, is one such attempt to keep that history potent, and fellow British filmmaker Steve McQueen has produced another fresh perspective.

In his mammoth feature documentary Occupied City, McQueen literally puts the past in perpetual dialogue with the present. The film, shot entirely in present-day Amsterdam, contains no archive stills, colourised footage, stock imagery or talking heads – none of the material of a routine documentary about WWII. Instead, his camera roams modern streets, alighting on seemingly random everyday scenes by canals, buildings, playgrounds and houses while a narration tells us what was happening in that exact same spot when the city was under Nazi occupation.

“In Amsterdam, you have a city where many of the buildings people used in the ’40s are still here, still of the same scale,” McQueen states in the film’s production notes. “They are being used in different ways from the ’40s, but not that much has visibly changed.”

McQueen lives in Amsterdam and discovered that his daughter’s school used to be an SS interrogation centre. More than 1,000 brass plates embedded in Amsterdam streets are inscribed with the names of those murdered by the Nazis.

“It’s almost like Pompeii in a way. The past is right there, physically, within our present.”

He made the film with his Dutch wife, the director and producer, Bianca Stigter, who had authored a historical encyclopedia of the occupation called Atlas of an Occupied City: Amsterdam 1940-1945.

They conceived the film as being almost like an archaeological excavation, bringing the past out into the current city and reactivating 80-year-old stories. The film puts the viewer in the unusual position of having to negotiate two different elements: what you’re seeing and the information you’re hearing, both of which are very strange.

“Out of that negotiation, I think a third thing emerges and I don’t know what that is exactly, or how to describe it, but it’s what I was after,” McQueen said. “As a viewer, I think sometimes you follow the voiceover, sometimes you are drawn into the images, but then, something else happens in your mind, where the connections are coming together. That third thing is maybe where other people’s stories from decades ago interact with our own inner stories. There’s dismay and sorrow in watching the film, but it’s also beautiful and inspiring, because it makes you think about how memories of these vital historical events are sustained by the living.”

McQueen grew up in London and while “living with ghosts”, as he has put it, may be integral to the psyche of Amsterdam natives, the idea of the present interacting with the past, the living and the dead, is what triggered his interest.

Initially, he thought of using archive footage to project on top of the present day footage, but then decided to use narration based on Stigter’s text and to merge sound (past) and vision (present) together.

Behind the Scenes: Occupied City - Film Choices

Cinematographer Lennert Hillege filmed on 35mm which McQueen preferred over digital in order to make every shot count.

“The standard way of shooting documentaries isn’t for me,” he said. “I’m interested in possibility and finding the moment. And that is about trust, about waiting, and having the skill to sort of see a thing before it happens, and to see the evidence of things that are invisible. It’s almost like a Miles Davis philosophy, where it’s more about the silence between the notes than the notes themselves. And that’s what I was after in this film. Embracing the unexpected as if you knew it was always there.”

Anne Frank’s house is the country’s major tourist attraction, but there were 800,000 people living in Amsterdam in 1940, so there are potentially 800,000 stories to tell. For that reason, and also because her address on Prinsengracht is now a “static” museum, the annex where she hid is not included (though there are quotes from her diary in the film).

McQueen, who began his career making video installation art, is also preparing a “36-hour sculptural version” as an art work. Of the four-hour long cinema cut he said, “It needed to be a journey. It takes time to familiarise yourself with the feeling of a city and there has to be room for that. It’s a very different thing from watching interviews. You kind of go onto another mode, and it’s okay to drift in and out.”

Similar to Glazer’s approach with The Zone of Interest, McQueen is careful not to push the audience in a particular way by manipulating what they should feel. The narration, for example, is deliberately pitched to be almost non-committal and matter of fact.

“I think it helps the viewer to draw their own pictures in their mind, because they’re not given any particular emphasis or dramatic sort of leaning,” he said.

The facts can speak for themselves.

Behind the Scenes: Occupied City - Covid Conditions

In 2020, when they shot most of the film, the visible landscape of Amsterdam changed perhaps in ways it hadn’t since the 1940s. It became a ghost town under Covid conditions, as if time had stood still.

“As I started editing and looking back at the footage, it was clear the film is partially a document of this time, of its strangeness and its peril,” he said. “And the stories of the occupation became timelier. It felt like everything in this moment had very high stakes and everything was heightened by several notches. And it is equally about learning from the past. It was very unsettling to be making a film about the occupation and all the denialism that went on, while seeing this resurgence of fascism, racism, and anti-Semitism. It’s a reminder of how things can develop.”

Occupied City ends with a bar mitzvah ceremony because it was important to McQueen and Stigter to show the persistence of Jewish life in Amsterdam.

“I didn’t have an ending until a friend of mine’s son was having a bar mitzvah and that was the last thing we shot, and it was a gorgeous way to close the film. To see these young people, with all their possibility, and to have the Rabbi saying to the younger brother ‘you’re next,’ I think it takes us beyond the present and shows how the past will continue to survive.

“We can do that by making sure [fascism] doesn’t happen again. That’s why Bianca wrote her book. That’s why I made this film. The hope is in the future of these kids you see in the film, the hope of what they might be. We’re trying to clear the path for these young people. That’s what you can do.”

Monday 29 January 2024

Deepfakes, Disinformation, Data Leaks: Being Online Is… Not Great

NAB

In 2024, we will face a grim digital dark age, as social media platforms transition away from the logic of Web 2.0 and toward one dictated by AI-generated content, says Gina Neff, executive director of the Minderoo Centre for Technology and Democracy at the University of Cambridge. Writing for Wired, she says online trust will reach an all-time low thanks to unchecked disinformation, AI-generated content, and social platforms pulling up their data drawbridges.

article here

Her view is echoed in a new report by the World Economic Forum, which highlights the risk of AI-generated mis- and disinformation exacerbating a cost-of-living crisis and socio-political polarization.

The WEF’s 2024 Global Risks Report is based on the views of 1,500 global risks experts, policy-makers, and industry leaders. It finds that the world’s top three risks over the next two years are false information, extreme weather, and societal polarization.  

Cr: World Economic Forum

The threat posed by mis- and disinformation takes the top spot in part because open access to increasingly sophisticated technologies is proliferating, disrupting trust in information and institutions.

“The boom in synthetic content that we’ve seen in 2023 will continue, and a wide set of actors will likely capitalize on this trend, with the potential to amplify societal divisions, incite ideological violence, and enable political repression,” said Saadia Zahidi, MD and head of the Centre for the New Society and Economy at the WEF.

What’s more, false information and societal polarization are linked, with potential to amplify each other. Zahidi said, “Polarized societies may become polarized not only in their political affiliations, but also in their perceptions of reality. That can have a profound impact on many crucial issues ranging from public health to social justice and education to the environment.”

These trends are occurring at a time of heightened economic hardship for many people around the globe. Together, this “potent mix” of economic distress, false information, and societal divisions can create challenges for many societies, “providing fertile ground for continued strife, uncertainty, and erratic decision-making,” the WEF warns.

This has broad repercussions for the long-term outlook. A decade from now, according to the WEF’s Global Risks Report, the top three risks are all related to the climate emergency: extreme weather, change to Earth systems, and biodiversity loss. Mis- and disinformation stays high on the agenda at number five, followed by other adverse outcomes of AI technologies at number six, and involuntary migration at number seven, while societal polarization also stays in the top 10.

In response to the uncertainties surrounding generative AI and the need for robust AI governance frameworks, the Forum has launched the AI Governance Alliance.

The aim of the Alliance is to unite industry leaders, governments, academic institutions, and civil society organizations to champion responsible global design and release of transparent and inclusive AI systems.

Benjamin Larsen, the WEF’s Lead on AI and ML, says, “Sustained dialogue lays the groundwork for greater cooperation and a potential reversal of digital fragmentation.”

Neff laments the shutdown of access to user data on social media sites like Twitter or Facebook. “Companies have rushed to incorporate large language models into online services, complete with hallucinations (inaccurate, unjustified responses) and mistakes, which have further fractured our trust in online information,” she says.

To clean up online platforms and prevent the excesses of polarization, she calls for the adoption of the STAR framework (Safety by Design, Transparency, Accountability, and Responsibility), which she says would ensure that digital products and services are safe before they are launched; increase transparency around algorithms, rule enforcement, and advertising; and hold companies both accountable to democratic and independent bodies, and responsible for omissions and actions that lead to harm.

The EU’s Digital Services Act is another step in the right direction of regulation, but its capacity to ensure that independent researchers can monitor social network platforms will take years to be actionable. The UK’s Online Safety Bill — slowly making its way through the policy process — could also help, but again, these provisions will take time to implement.

Until then, Neff says, “the transition from social media to AI-mediated information means that, in 2024, a new digital dark age will likely begin.”

 


Wednesday 24 January 2024

ISE 2024 - Preview Convergence, VP and Star Wars

IBC

If you think ISE is about the convergence of AV with broadcast, think again. Show organisers say this has already happened. Even the IABM has a speaking slot.

article here

What is the director of the new Star Wars movie doing at a trade show where exhibitors promote smart heating solutions and visitors want to know how to design wireless networks for schools?

You won’t find a clue in the name – Integrated Systems Europe – which is itself no longer as relevant as it was when the show launched in 2004. But you don’t have to look too far to understand that the underlying technology used to distribute digital media around corporations, shopping malls and educational institutions is broadly the same as that used to produce and display filmed entertainment. Strip it all back and everything is IT and on a network.

That technology has converged to the point where there is barely a semiconductor wafer between its use in film, TV or any other digital creation. Virtual production, for example, is one of the most visible technology crossovers between AV and film/TV and it is featured heavily at the ISE show.

What has also changed in recent years is that content for museums, municipal sound and light shows, art exhibits or monumental commercial venues like the Sphere is sophisticated, immersive and increasingly interactive, giving the traditional 2D rectangular frame of narrative entertainment more than a run for its money.

The stories and the audiences may be different, but even those lines are blurring.

The Star Wars director is Sharmeen Obaid-Chinoy, a Pakistani double Oscar winner for documentary short films (such as Saving Face) who is the first woman to direct an instalment of the Disney franchise (her film, starring Daisy Ridley, is in pre-production and scheduled for release in 2026). She is giving a keynote at ISE 2024 next month, where she will talk about the importance of storytelling and how technology is transforming lives.

Her presence is a coup for ISE and whoever had the idea to pitch the gig to her should be applauded. It helps the organisers market the show as one about content creation even though the majority of its vendors are not directly connected to the entertainment industry and in some cases very far removed indeed.

As ISE puts it, storytelling is fundamentally linked to the audiovisual industry and influences every element of this year’s show. Almost every facet of the AV landscape is driven by storytelling – it says – including immersive experiences at visitor attractions, inspirational presentations by corporate leaders, and advertising campaigns delivered over digital signage.

It’s the second year of the show’s conference strand dedicated to Content Production & Distribution, and it provocatively asks whether brands and corporations are becoming the new broadcasters. It’s rhetorical too, since the conference is subtitled unequivocally, ‘Brands: The New Broadcasters’.

“The days of saying there’s a bit of convergence going on between AV and broadcast are gone,” says Ciarán Doran, chair of the Content Production & Distribution Summit and a former marketing exec for stalwart broadcast tech vendors such as Rohde & Schwarz. “It’s happening, it is here and that’s the essence of this conference.”

The conference will explore how brands and corporates are creating and distributing “incredible” content direct to viewers, some working with high-end tech and professional broadcast facilities “to reach audience numbers traditional TV broadcasts simply couldn’t attract.”

A basic corporate VHS training video will no longer cut it. Gen Z, the new workforce brought up using screens and playing video games, demands more.

Doran cites the example of a major fashion brand that recently streamed a fashion show to hundreds of millions of viewers, and WeTransfer which won a 2022 Academy Award for a short film commissioned by its WePresent digital arts platform.

For a long time now select corporates have had access to production budgets that an indie TV producer could only dream of. What’s interesting, Doran says, is that corporates are now beginning to break ground with the type of content they are producing in order to engage with audiences.

On the flip side, the ISE conference will also ponder to what extent traditional TV broadcasters are now seeking professional AV technology to enable more efficient and cheaper production.

“It’s no longer ‘let's dumb something down so that it'll fit into that market’,” Doran says. “They don't need to do that anymore because AV broadcast is now reaching up to acquire the content quality, both technical and creatively, of that service provider.”

Speakers include those more typically associated with the broadcast world – Jigsaw24 (kit hire), Chyron (live graphics), ARRI (cameras) and even the broadcast manufacturers’ trade body IABM. Michael McKenna, CEO and director of VP at Final Pixel, will discuss his work with Oracle Red Bull Racing on Formula 1’s first virtual production shoot. Spanish director David Cerqueiro talks about creating branded content, from corporate communications to a mini feature film. Sessions on the creation of eXtended reality content are now a staple of ISE, just as they are at more broadcast-related events.

Brands can’t do it alone though. To engage with younger audiences in particular they need to partner with influencers or content creators who will often use off-the-shelf technology like an iPhone and Adobe Photoshop to shoot and package content before streaming to their followers on social media. Increasingly creators will use combinations of AI tools like Midjourney and ChatGPT to create more content more efficiently and perhaps skip the manual camera and edit stage altogether.

Naturally, the role of AI in storytelling is on the ISE agenda. Digital artist Jeroen van der Most gives a keynote entitled ‘Breaking Boundaries with Creative AI’.

“I will explore how we can innovate art – using AI to change it from something static into something more fluid,” he explains. He then goes on to say that he will use AI to “build closer, deeper relationships with non-human entities”. Van der Most says he will take the audience on a journey into his mind using AI – “it’ll be a bizarre trip where you’ll encounter some weird things.” No kidding.

The content creation part of ISE is growing, and pretty fast too, but it is still a relatively small side of the AV industry and therefore of the show itself. There are huge areas devoted to the nuts and bolts of putting together networks for anything from smart buildings and luxury homes to retail chains, restaurants and hotels.

It’s worth recalling the genesis of ISE, twenty years ago, as an integrated systems show. It was formed as a venue to gather the vendors and practitioners of an AV industry that was nascent and in some cases unprofessionalised. The show was also a political union of two different markets within a broader AV umbrella – the corporate or commercial side and the private or residential side. These two – the commercial and the residential – remain distinct, but the skills and technologies began to overlap in ‘integration’. This is where various and disparate components were combined within the fabric of a space to create a new audiovisual environment. That was the core of ISE then and remains its biggest strength. In a way it makes ISE unclassifiable and therefore malleable enough to co-opt adjacent industries, as it is trying to do with broadcast.

“Integration in 2004 was a big deal,” explains Dan Goldstein who is chief marketing officer at one of ISE’s owners, the commercial AV trade body AVIXA. “It required a lot of hard work to do technically. As digitisation gathered pace, the industry and by extension ISE, was more about solutions to business problems. Now AV is more about ideas and ISE reflects that. It’s a very creative event, one that forces you to challenge assumptions about where tech is going.”

This year’s show, held in Barcelona, is said to be the biggest yet in terms of show floor space, at around 66,000m2, and the organisers won’t be disappointed if it receives a similar uplift in visitors, to around 66,000, some 8,000 more than 2023. Pre-registrations are reportedly around the 100,000 mark, but it would be extraordinary if the 81,000 record attendance of 2019 were broken.

Some things don’t change though. Last year just 15% of attendees were women, while men accounted for 82% – sadly reflective of the AV industry as a whole. Not even the presence of Obaid-Chinoy is going to change those numbers any time soon.

 



Is This New Era of Spatial Computing Really… New? Or Are We Just Remaking the Metaverse?

NAB

The latest buzzword is spatial computing, a term adopted by Apple to describe its latest consumer electronics “wearable,” Vision Pro. But as much as companies like Apple, Sony and Siemens might claim that this initiates a new era, there will be those wondering if this is not the metaverse by another name.

article here

So scarred is the tech industry by the failure of the metaverse to take off (and so synonymous with Mark Zuckerberg’s Meta has the name become) that the 3D internet – the successor to flat, text-heavy web pages – appears to have been essentially rebranded.

Futurist Cathy Hackl offers this subtle distinction: “Meta is on a mission to build the metaverse, and Quest 2 is their gateway. Apple seems to be more interested in building a personal-use device. One that doesn’t necessarily transport you to virtual worlds, but rather, enhances the world we’re in.”

The term spatial computing has been around at least as long as the term metaverse but is being given a new lease of life by the second coming of augmented reality (AR), virtual reality (VR) or mixed reality (MR) glasses or goggles; the collective term for those acronyms is XR or eXtended reality.

Snap, Sony and Siemens are just some of the companies with new XR wearables due to launch over the next few months. Undoubtedly, all will be a step up in comfort and tech specifications over the early round of such hardware, which was led by Google Glass, Meta’s Oculus and Magic Leap.

Apple’s Magical Step Into the Metaverse

“The era of spatial computing has arrived,” said Apple CEO Tim Cook, promoting the Apple Vision Pro. In the same sentence he described it as having a “magical user interface [which] will redefine how we connect, create, and explore.”

Let’s get beyond the smoke and mirrors. There’s no “magic” in the Vision Pro other than a brand name for apps (think Magic Keyboard and Magic Trackpad).

The tech community has, however, been keenly looking toward Apple to bring such a product to market. Having defined and popularized consumer tech categories including the tablet and the smartphone, Apple was always the best bet to take XR wearables mainstream.

Encounter Dinosaurs, a new app by Apple that ships with Apple Vision Pro, makes it possible for users to interact with giant, three-dimensional reptiles as if they are bursting through their own physical space.

One reason why Cook and others prefer the term spatial computing is because there is greater confidence that this iteration of the tech can better blend the actual and the digital world with seamless user interaction.

As Cathy Hackl put it, spatial computing is an evolving 3D-centric form of computing that blends our physical world and virtual experiences using a wide range of technologies, thus enabling humans to interact and communicate in new ways with each other and with machines, as well as giving machines the capabilities to navigate and understand our physical environment in new ways.

From a business perspective, says Hackl, it will allow people to create new content, products, experiences and services that have purpose in both physical and virtual environments, expanding computing into everything you can see, touch and know.

It is an interaction not based on a keyboard but on voice and on gesture. As Apple puts it, the Vision Pro operating system “features a brand-new three-dimensional user interface controlled entirely by a user’s eyes, hands, and voice.”

It’s not “Minority Report” just yet, but you can see where this is headed. Here’s Apple’s description: “The three-dimensional interface frees apps from the boundaries of a display so they can appear side by side at any scale, providing the ultimate workspace and creating an infinite canvas for multitasking and collaborating.”

Its screen uses micro-OLED technology to pack 23 million pixels into two displays. An eye-tracking system combining high-speed cameras and a ring of LEDs that “project invisible light patterns onto the user’s eyes” facilitates interaction with the digital world. No mention is made of having to sign away your right to privacy — this being a pretty invasive aspect of the technology. Do you want Apple to know exactly what you are looking at? If so, expect hyper-personalized adverts pinged to your Apple ecosystem when you do.

Or as Hackl — a tech utopian — writes: “AR glasses will turn one marketing campaign into localized media in an instant.”

Apple’s Competition

Such features are not exclusive to Apple. A new head-mounted display from Sony, designed in collaboration with Siemens and due later this year, also has 4K OLED Microdisplays and an interface called a “ring controller” that allows users to “intuitively manipulate objects in virtual space”. It also comes with a “pointing controller” that enables “stable and accurate pointing in virtual spaces, with optimized shape and button layouts for efficient and precise operation.”

The device is powered by the latest XR processor by Qualcomm Technologies. Separately, Qualcomm has unveiled an XR reference design based on the same chip that features eye tracking technology. The idea is that this will provide a template for third party manufacturers to build their own XR glasses.

The Sony and Apple headgear are aimed at different markets. Both are hardware gateways to the 3D internet — or the metaverse, even if Apple studiously avoids referencing this and Sony only does so when talking about industrial applications.

Apple Vision Pro is targeting consumers, even if early adopters will have to be relatively well-heeled to fork out the $3,500 asking price ($150 more for special optical inserts if your eyesight isn’t 20/20).

This Changes… Some Things

Chief applications include the ability to capture stills or video on your latest iPhone, which users will be able to play back in Spatial 3D (i.e. with depth) on their Vision Pro. The same video and stills will appear two-dimensionally flat when viewed on any other device.

FaceTiming someone will also be possible in a new 3D-style experience within the Vision Pro goggles. According to Apple, this “takes advantage of the space around the user so that everyone on a call appears life-size.” To experience that, users will be able to choose their own “persona” (a term Apple chose to differentiate from Meta’s colonization of “avatar”).

In addition, Apple has loaded Vision Pro with TV and film apps from rivals Disney+ and Warner Bros’ MAX (but not Netflix) to be viewed “on a screen that feels 100 feet wide with support for HDR content.” As a reminder, the screen is millimeters from your face.

Within the Apple TV app, users can access more than 150 3D titles, though details are not provided. It could be that these are experimental 3D showcase titles or stereoscopic conversions, in a revival of the fad a decade ago for stereo 3D content.

More significantly, Apple Immersive Video launched as a new entertainment format “that puts users inside the action with 180-degree, 3D 8K recordings captured with Spatial Audio.” Among the interactive experiences on offer in this format is Encounter Dinosaurs.

No details were given of how this content is created or at what production cost, but Sony’s new XR glasses are targeting the creative community.

Indeed, Sony is marketing its development as a Spatial Content Creation System and says it plans to collaborate with developers of a variety of 3D production software, including in the entertainment and industrial design fields. The device includes links to a mobile motion capture system with small and lightweight sensors and a dedicated smartphone app to enable full-body motion tracking.

In Sony speak, it “aims to further empower spatial content creators to transcend boundaries between the physical and virtual realms for more immersive creative experiences.”

Where Is This Headed?

Spatial computing unshackles the user’s hands and feet from a stationary block of hardware and connects their brains (heads first) more intimately with the internet.

Hackl thinks Vision Pro is the beginning of the end for the traditional PC and the phone.

“Eventually, we’ll be living in a post-smartphone world where all of these technologies will converge in different interfaces. Whether it’s glasses or humanoid robots that we engage with we are going to find new ways to interact with technology. We’re going to break free from those smartphone screens. And a lot of these devices will become spatial computers.”

She thinks 2024 will be an inflection point for spatial computing.

“Eventually you’ll have a spatial computing device that you can’t leave the house without,” she predicts, “because it’s the only way that you can engage with the multiple data layers and the information layers and these virtual layers that will be surrounding the physical world.”

She admits that right now “there’s a bit of chaos” and that Apple Vision Pro may not be the breakthrough everyone expects in its first iteration.

“To me, the announcement of Apple offers a convergence of the idea of seamless interaction, breaking through the glass and a transformation from social media-driven AI to personal, human AI,” she says. “Will all that happen with the release of Apple’s first headset? No, and I wouldn’t expect it to. That’s a lot to put on one company’s shoulders. But Apple is different from other headset makers which gives us an opportunity to see a different evolution of AR.”

 


“Presence:” Shooting Steven Soderbergh’s Ghost Story (From the Ghost’s POV)

NAB

Steven Soderbergh’s latest film is a riff on the haunted house genre, with the twist that the spooky story is told from the ghost’s point of view.

article here

In Presence, a family moves into a new home only to recognize an unsettling feeling that something or someone else is also there. The script is by regular Soderbergh collaborator David Koepp, from an idea by the auteur, and the film stars Lucy Liu, Chris Sullivan, Callina Liang, and Julia Fox.

The story may be familiar, but it is the filmmaker’s camerawork that makes the movie stand out from the horror pack.

“The camera drifts through spaces, hovers around actors, races up and down stairs, and looks out windows — usually in single takes that constitute the entirety of a scene,” describes Bilge Ebiri for Vulture. “Nobody can see this presence, but they do occasionally sense it.”

Lots of films will occasionally cut away to the ghost’s or killer’s or monster’s point of view for some visual flair and added tension. Ebiri notes that Dario Argento perfected the idea in his giallo classics. Sam Raimi turned it into the ultimate lo-fi aesthetic in his Evil Dead movies. Stanley Kubrick riffed on it in The Shining.

“The technique is nothing new,” says Ebiri. “But Soderbergh doesn’t use it as an occasional directorial indulgence, instead maintaining the ghost’s-eye perspective throughout the whole movie. The camera’s presence, the question of where it goes and why, and which characters it focuses on, thus all go from stylistic questions to narrative and thematic ones.”

Previously Soderbergh has been disparaging about creating narrative films told entirely from a first-person perspective, but he now appears to have changed his mind. During the post-screening Q&A at Sundance, as reported by TheWrap, he admitted:

“I had real questions about the choice that was at the center of this, because I’ve been very vocal about the fact that first person POV VR is never going to work as a narrative. They want to see a reverse angle of the protagonist with an emotion on their face experiencing the thing. I’ve been beating this drum for a long time — it’s never going to work.”

“The only way to do it was you never turn around,” Soderbergh continued. “It was really fun because there was no other plan. That’s it. You live or die by that.”

As TheWrap’s Drew Taylor observed, the Oscar-winning Ocean’s 11 and Traffic filmmaker is prone to experimentation, like shooting features on iPhones or self-releasing entire shows through his website.

“Presence is another big stylistic swing that connects. Ironically, it’s not hard to imagine the film becoming a hit on VR headsets.”

As is now his custom, Soderbergh is also behind the camera, acting as his own DP under the pseudonym Peter Andrews, which adds another meta layer of analysis for film theory buffs.

“The unseen figure of the ghost becomes an expression of the filmmaker’s power over the frame, evoking the sadistic-voyeuristic nature of cinema in general and genre cinema in particular,” says Ebiri.

He used the newest iteration of the Sony DSLR, which was small and light enough for him to carry for those extended takes, and wore martial arts slippers while sliding around the house to make as little noise as possible.

This presented two challenges, he told Amy Taubin at Filmmaker Magazine. “It probably weighed 10 or 12 pounds, which is fine unless you have to hold it out in front of you for eight minutes. Then it gets hard. A couple of takes are that long, especially the penultimate shot, where there’s a lot going on. My arms are turning to cement.”

The other challenge was the stairs. “I was in that house for a month and there’s no version of me going up and down those stairs without having to look at my feet. What that meant was I had to do a series of rehearsals where I got a sense of where to aim the camera and where I could just feel the level of tilt and pan that would result in the correct composition without looking through the lens. But sometimes I’d get it wrong, and I would ruin a take halfway through.”

The director also edits (billing himself as Mary Ann Bernard) and told the post-screening Sundance audience, as reported by The Hollywood Reporter, that for him this is the most creative part of the process.

“There’s no analogous sort of tasks in any other art form. You’re bringing it all together, all the elements,” he said. “Sound, picture — it is the best. It’s the reward for being on set. The power of it still amazes me. How you can change the intention of something just by reordering shots or holding them at a certain length, pulling out lines, giving a line to somebody else.”

Soderbergh has apparently experienced some sort of unsettling behavior in his own house where a previous owner had been killed.

“The circumstances surrounding it were very murky, but everybody on the block was convinced that this was not a suicide, as the police described it, but that her daughter had killed her,” he revealed to Filmmaker. “As the son of a parapsychologist, I found that fascinating.”

Also during the Q&A, he talked about the film’s climax, likely to be its most talked-about scene. It was scripted by Koepp and was one that not even Soderbergh expected. Taubin describes the sequence as “a horrific close-up of an act of violence that mesmerizes the camera — just as horror films mesmerize their audience.”

Anyone who has seen Michael Powell’s Peeping Tom may be able to guess what Soderbergh was trying to achieve.

“When I read that scene for the first time, I thought, ‘Wow, that’s, that’s rough,’ and was immediately aware of the sort of philosophical point that you just made. What I didn’t know was how amplified and extreme it would be when we were actually there doing it. My decision ultimately was that it had to be excruciatingly intimate. Which is, you know, 14 inches from a very personal form of murder.”

He also admitted in his interview with Taubin that he is not an aficionado of horror films: “You see a lot of horror films that feel very much like single-use plastic, where you don’t really think about them again.” But what attracted him to the genre was “this whole idea of directorial presence. It’s what the whole thing is built on.”

Nonetheless, critics don’t consider Presence a frightening movie, noting the absence of conventional jump-scares.

THR critic David Rooney judged it “an enormously satisfying watch for haunted house movie fans, favoring sustained anxiety over big scares and practical effects over digital trickery.”

Variety’s Owen Gleiberman has problems with the conceit. He writes, “we’re just watching a shoestring movie shot with a rather nosy and flamboyant visual style. All very stylish and percussive. But if he had made a version of this movie without the ghost-as-camera-eye conceit, it would have been more or less the same movie.”

He adds that Presence amounts to “another half-diverting, half-satisfying Soderbergh bauble, only this time he’s the ghost in the machine.”

 


Monday 22 January 2024

BTS: True Detective: Night Country

IBC

The fourth instalment of HBO’s crime drama True Detective is set in a remote outpost of Alaska, but the show was filmed almost entirely in Iceland, where cinematographer Florian Hoffmeister BSC used an innovative infrared technique to capture the perma-dark snowbound landscape.

article here

“We were doing prep for the show in Iceland in August when the sun doesn’t set until midnight but planning to shoot in October for a story that is set in almost endless darkness,” he says. “Iceland is such amazing scenery which you are very excited by but everyone had to remind themselves that we will not see it because it’s going to be dark.”

True Detective: Night Country stars Jodie Foster as Detective Danvers, who along with her partner Evangeline Navarro (Kali Reis) sets out to solve the case of six men who vanish without trace from an arctic research centre.

Crucially for the story’s supernatural and horror element, it is set in a region so far north that it experiences polar night, when during December and January people will only see four to five hours of daylight on average.

“Not only were our exteriors going to be dark but they would also be white, which makes a substantial difference when it comes to lighting. When you light night exteriors for mystery and danger you tend to light the people, and the light on their faces will naturally fall off into darkness. But if you have a white surface, the landscape will illuminate faster than you will ever see a face. I had to really recalibrate my lighting approach.”

Conventionally you would light night exteriors with lights mounted on cranes but the potential for gusty to gale force winds in Iceland made this impractical.

“It can become windy very quickly and when it does you do have to evacuate the area. Even shooting a 45-minute drive from Reykjavík you are in effect extremely remote. And remember, it is dark. So when you are told to leave a place, you have to do so; it would be dangerous otherwise.”

In order to keep the production from stopping on windy nights, Hoffmeister deployed different techniques. Some scenes were shot minimally, using just the light from an actor’s headlamp and showing them disappearing into complete darkness.

Other times they used the few hours of daylight to shoot sequences, such as shots of cars driving, to show more of the landscape. Hoffmeister describes this light as “like an endless sunrise and endless sunset all at once because the sun never really comes up past the horizon.”

For an Episode 1 sequence where Rose (Fiona Shaw) is skinning a wolf outside her house, they wanted “to play with the magic of the sky and a landscape,” he says. “Everything else we shot right on the cusp of darkness.

“Then there is the feeling of isolation and the vastness of nature that we need to convey, so I thought about where we need to see the landscape. We can’t put the lights high in the air, but neither could we fill in from the front with conventional lights because then the actors’ faces would look terribly bright and washed out in the foreground. So what to do?”

He came up with an idea partially inspired by DP Hoyte van Hoytema’s work on Nope. Hoffmeister explains: “For some scenes he had basically created two versions of the same frame by using two cameras at the same time. I decided to modify this and use a stereo 3D rig fitted with one colour camera and one infrared camera. Around the infrared camera we arrayed infrared lights in a semi-circle.”

The show was shot on ARRI Alexa 35 with Panavision Ultra and Super Speeds, but for the stereo rig they opted to use Alexa Mini LFs with Sigma glass. One of the Mini LFs had its infrared filter removed. The colour camera would photograph people and their flashlights but would not register the IR lighting, while the IR camera would capture the surrounding landscape. Feeds from both cameras were blended in post to create depth. The rig came from Stereotec in Munich under the supervision of Florian Maier, and Hoffmeister’s gaffers across the show were John Dekker and Sigurdur ‘Bahama’ Magnusson.
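
To make the principle easier to picture, here is a minimal sketch, assuming two already-aligned plates (one from the colour camera, one from the infrared camera), of how IR detail could be composited into the shadows of a colour frame so the landscape reads without over-brightening faces. It is only an illustration of the idea described above, not the finishing pipeline used on the show; the file names and gain value are hypothetical.

# Illustrative only: crude composite of a colour plate and a co-registered
# infrared plate. Assumes both frames are the same size and already aligned.
import cv2
import numpy as np

def blend_ir_landscape(color_path: str, ir_path: str, ir_gain: float = 0.6) -> np.ndarray:
    color = cv2.imread(color_path).astype(np.float32) / 255.0          # BGR colour plate
    ir = cv2.imread(ir_path, cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

    # Faces and flashlights are already exposed in the colour plate, so weight
    # the IR contribution toward the shadows only.
    luma = cv2.cvtColor((color * 255).astype(np.uint8), cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
    shadow_mask = np.clip(1.0 - luma * 2.0, 0.0, 1.0)                  # 1 in darkness, 0 in highlights

    # Add monochrome IR detail (the moonlit-looking landscape) into the shadows.
    ir_fill = (ir * ir_gain * shadow_mask)[..., None]                  # broadcast to 3 channels
    return (np.clip(color + ir_fill, 0.0, 1.0) * 255).astype(np.uint8)

# Hypothetical usage:
# frame = blend_ir_landscape("color_plate.png", "ir_plate.png")
# cv2.imwrite("blended.png", frame)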

“The aspect I was interested in when choosing the Alexa 35 is that ARRI has introduced a new feature called Textures, allowing you to burn in parts of the look. So we built a LUT [with ARRI’s head colourist Florian 'Utsi' Martin] and we took part of the LUT and built a texture which was then burnt into the image. I feel this is the closest in terms of workflow you can come digitally to photochemical image manipulation. Your decisions stay within the files rather than just being added in post.”
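
As a rough illustration of what “baking the look into the files” can mean in practice, the sketch below applies a 3D LUT from a .cube file directly to a frame, so the graded look travels with the media rather than living only in a downstream grade. This is a toy example under simple assumptions (nearest-neighbour lookup, 8-bit images, hypothetical file names), not ARRI’s Textures feature or the show’s actual workflow.

# Minimal sketch: parse a .cube 3D LUT and apply it to an image with
# nearest-neighbour lookup (real tools interpolate and work on log footage).
import numpy as np
import cv2

def load_cube_lut(path: str):
    size, table = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(("#", "TITLE", "DOMAIN")):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            else:
                table.append([float(v) for v in line.split()])
    # .cube data is stored with red varying fastest, so reshape as [b][g][r]
    lut = np.array(table, dtype=np.float32).reshape(size, size, size, 3)
    return lut, size

def apply_lut(image_bgr: np.ndarray, lut: np.ndarray, size: int) -> np.ndarray:
    rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
    idx = np.clip(np.round(rgb * (size - 1)).astype(int), 0, size - 1)
    graded = lut[idx[..., 2], idx[..., 1], idx[..., 0]]   # index as [b, g, r]
    return cv2.cvtColor((graded * 255).astype(np.uint8), cv2.COLOR_RGB2BGR)

# Hypothetical usage:
# lut, n = load_cube_lut("show_look.cube")
# cv2.imwrite("frame_graded.png", apply_lut(cv2.imread("frame.png"), lut, n))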

For interiors the DP looked to the photographs of the Estonian photographer Alexander Gronsky, who captures the impact of the environment.

“He had taken a series of stills in a Russian mining town in the Arctic circle and I noticed that the highlights were screaming super bright and the darkness was super dark. I felt that the dynamic range of the Alexa would keep colour rendition even in the highest highlights.

“Naturally, if you live in darkness you will light everything artificially. My idea for the public spaces in True Detective, like the ice rink and the police station, is that it is here that people create their day. When people return home they don’t choose moody lighting. They switch on all the lights.”

While Norway was considered as a proxy for Alaska, the production chose Iceland not only for its similar ice-capped wilderness but also for its film infrastructure.

“It would have taken quite a big effort to move a ship the size of this show to Alaska, whereas Iceland has a small but articulate and active film community,” says Hoffmeister.

Interstellar, Captain America: Civil War and The Midnight Sky are just some of the features to have shot there. The crew could base themselves in Reykjavík and not have to travel far outside the capital to film scenes that required genuine remoteness. The country’s favourable film tax incentives played a part in the decision too.

Beginning in September 2022, they shot for two months on stages there plus another 50 days of location shoots, half of which were night exteriors.

“I spent a lot of time trying to build a LUT and for that I needed to shoot a test in darkness with snow. We did this in September, when the only way to find ice or snow was to go up on a glacier at 11pm at night. We took all the gear and waited for darkness, and by 12am we had to evacuate because a glacier is too dangerous to be on at night.”

There actually is an Innns in Iceland (population 65), but the fictional town of Ennis, Alaska is composed of locations from all over the country, including a former school building that was part of an old American air force base, and Dalvík, a village in the north. Ennis high street is on the road to Reykjavík airport.

Robert Hunter Baker led a second unit to shoot some plates in Alaska including aerials of cars on snowbound highways.

Hoffmeister, who shot all six episodes, was attracted to the project primarily because of writer, director and showrunner Issa López.

“The brand and the legacy of True Detective was equal parts attractive and intimidating but what I found interesting was the personal creative arc you can make by working with one writer-director on this project. It is unusual for streaming TV.”

The series is shot through with supernatural elements, and the finale in particular, set mostly in the arctic research station, recalls the science fiction of a film like Solaris in its blend of reality and imagination.

“The supernatural was always a strong presence from the start. The investigation and mystery is the narrative drum, but we move beyond genre with the spiritual aspect and the belief that somebody dead can be alive. I personally felt we shouldn’t differentiate visually between them and that the dead should be seen as real, just as the characters see them.”

He continues, “You can read the supernatural element as the disconnection between humans and nature and the disconnect between people in terms of their relationships. I’m not sure if this would have been carried over so well if there were not the presence of Issa as showrunner and director.”

Hoffmeister was excited to work with the iconic Foster, who he admits partly inspired his love of cinema in films like Taxi Driver and The Silence of the Lambs.

“I was very excited to work with her. When anybody asks me how she was, I say she is exactly how we all want her to be. Super professional and very welcoming. Her performance is amazing but there is a spirit of ensemble that she helped to lift throughout the whole show.”

ends

 

 

Friday 19 January 2024

How the Creative Team for “True Detective: Night Country” Made Such a Multilayered Mystery

NAB

Mexican horror filmmaker Issa López may not have been an obvious pick to spearhead Season 4 of True Detective, the latest in HBO’s anthology series.

article here

López, who created, wrote and directed all six episodes of True Detective: Night Country, is best known for her 2017 crime film, Tigers Are Not Afraid, which earned rave reviews from critics and gained a cult following after its relatively small opening in the US.

“If I can bring back the feeling of two characters that are carving entire worlds of secrets within them, and trying to solve a very sinister crime in a very strange, eerie environment that is America, but it also doesn’t feel like America completely, and then I sprinkled some supernatural in it — I think we’re going to capture the essence of True Detective,” López told TheWrap’s Loree Seitz.

Season 4 of True Detective introduces the franchise’s first female detective duo: Liz Danvers (Jodie Foster) and Evangeline Navarro (Kali Reis), former partners who reunite to investigate the disappearance of six men working at the Tsalal Arctic Research Station in the small town of Ennis, Alaska.

During lockdown, López had been developing her own murder mystery scripts when HBO came calling. “I believe they saw Tigers Are Not Afraid, which is very gritty and ultra-real and violent, but at the same time has elements of the supernatural and [is] very atmospheric,” she told Matt Grobar at Deadline. “So I think that [they saw] something in that movie, they were like, ‘Oh, this could be an interesting point of view for True Detective.’”

Showrunner/writer/director/EP Issa López on the set of “True Detective: Night Country”

On developing her idea for the format created by Nic Pizzolatto, she revealed that David Fincher’s Seven was a big influence. Like True Detective, it has two different odd-couple characters who come together to solve a mystery.

Isabella Star LeBlanc as Leah Danvers in “True Detective: Night Country.” Cr: Max

“I’m sure that was one of the references that informed Pizzolatto’s writing, at least unconsciously, so I was thinking of Seven. It was two detectives, a forgotten corner of America with its own system of culture and rituals, and it just clicked massively with True Detective. It didn’t take a lot of effort.”

The new series shares with the first season an undercurrent of the supernatural, but it also layers in the politics of environmental change, of marginalized communities and, most clearly, of male violence against women.

“The environmental theme came when I started to understand the inner workings of northwest Alaska and the industries and the conflicts in the area,” she said to Grobar. “You just start to create this town, and the forces that pull energy inside it. Mining is a huge deal in that area of Alaska, and there’s constant conflict around the benefits of a burgeoning energy industry, but at the same time, the damage that it creates in an environment where people need the environment to survive. So, it’s just rich grounds to create the story.”

Finn Bennett as Officer Peter Prior in “True Detective: Night Country.” Cr: Max

After focusing on women who had been taken and killed in two of the four films López had previously directed, spotlighting the story of missing Iñupiaq women felt like the “natural continuation” of her interests.

“It doesn’t matter if it’s Mexico, the US or Canada … this violence doesn’t care for borders,” López said.

When it came to casting, López knew she wanted at least one of the two main characters to come from a native community, and was impressed by Reis, a professional boxer who broke into acting with the 2021 film Catch the Fair One. Reis is of Cherokee, Nipmuc and Seaconke Wampanoag ancestry.

Erling Eliasson as Travis in “True Detective: Night Country.” Cr: Max

“I knew that one of the characters had to be indigenous because I don’t believe in bringing agents of justice to fix the situation in the native community. I simply don’t believe in that,” López said. “It was a challenge because there were not indigenous stars in the tradition of ‘True Detective,’ but that needs to change,” she said.

“Now we have a Lily Gladstone [Killers of the Flower Moon] and now we have a Kali Reis,” López said. “This is the year that changes.”

López worked closely with Germany-based DP Florian Hoffmeister (Tár, Pachinko, Great Expectations) on each episode. Hoffmeister told Mia Funk and Halia Reingold at the Creative Process podcast why the story’s wider scope appealed to him.

“It’s about the transient nature of life up in the polar North,” he said. “Permanent settlement has only been possible since the Industrial Revolution, because normally the indigenous cultures were living and communicating with the land in a whole different way.”

He describes his experience of filming in a region (Iceland stood in for Alaska) where for months on end daylight is restricted to just a few hours a day.

“Further north you get no sunlight at all. And it was an interesting creative decision to embrace how lighting has a whole different utility and necessity than just regular light. If I come home at night here in Berlin, I might switch on a few lights, but if you live in darkness, your relationship with lighting changes. If you live in darkness, you tend to over-light.”

The locations in True Detective: Night Country included an ice rink and a police station, which he lit as if every light were on, “because you’re literally craving light, and you don’t want your workspace during the day to be moody and dark.”

Since the murder mystery genre tends to have moody lighting, this presented an interesting conundrum. “If you go to the supermarket, and it’s minus-20, you will keep your car running while you’re inside. Because if you switch the car off the engine might freeze,” he explained. “So there’s a whole different way to deal with what we take very commonly as the achievements of our industrialized living environment. I wanted that to be reflected in the lighting. So [scenes] in the police station and ice rink are really bright, basically switched everything on.”

They started prepping the series in summer in Iceland during which time it barely got dark because of the region’s “endless summer” but finished shooting in almost perma-darkness in the winter months.

“If you light at night in a snowfield, the first thing to burn and [overexpose] will be the snow. So the whole lighting chain outside had to be tackled differently,” he said. “I think there’s some really exciting footage where we shot right on the blink (of darkness) where we can still see and you get the scale of the landscape, but it’s almost disappearing into blackness.”

Hoffmeister also suggests that one of the themes of the show is this feeling of disconnect. “It feels like the end of the world because I think you have this disconnection between us and the environment. And the biggest contrast with the indigenous people that used to live there is that obviously they had to live connected with their surroundings, but we don’t. Not only are we disconnected from our environment, but we are also disconnected from each other.”