IBC
article here
If virtual production is to sell the illusion of what’s
being filmed, the LED lighting and background environment must merge with physical
sets and practical lighting as seamlessly as possible.
An LED volume not only provides an extension of the scene environment;
it essentially acts as a massive light box. Light emitted by the walls can be
used to create dynamic reflections that interact with the set and actors in
real time. This lighting can be adjusted and fine-tuned using light cards as
well as colour and brightness controls.
“While the volume is a great base source of lighting, we
highly recommend pairing it with traditional practical lighting for the best
result,” says Jamie Sims, VP Projects Manager at MARS Volume. “This is where a
skilled Unreal Operator can make a huge difference. Our Unreal Operators and VP
Supervisors work hand in glove with DOPs and gaffers to achieve the creative
vision.”
Dan Hall, VP
Supervisor at Slough’s Virtual Production Studios by 80six, says, “Candles, lamps, even fish tanks are
fantastic examples of practical lights because they’re subtle and give you an
accurate representation of how light will work in a room. Additionally, it
takes the eye away from the background, which should not be the focal point.”
Soft and hard lighting
LED panels are ideal for creating soft lighting, which generates
soft-edged shadows, but they can’t
produce hard light: crisp, hard-edged shadows, spotlights or
‘beauty lighting’. This is where creative collaboration with the Gaffer
and DoP on a production is crucial to achieving the required look.
“While LED screens
are an excellent source of interactive lighting and reflection, they lag behind
today’s practical LED fixtures in colour rendition,”
says Sam Kemp, Virtual Production Technical Lead at Garden Studios.
Hard light is produced by a point-source light, such as a
tungsten Fresnel or an LED point-source fixture. Consequently, a volume without
any additional fixtures can't produce hard light, and daylight scenes
therefore require practical fixtures to 'sell' the idea of direct
sunlight.
Kemp notes, “Practical
fixtures can replicate hard sources such as sunlight and also help to fill the spectral deficiencies of RGB LED
panels. Standard lighting communications control like DMX can be used from the engine for synced effects.”
Image Based Lighting
Image Based Lighting (IBL) is a form of pixel mapping that
uses calibrated photographic or video RGB colour information to generate
subject and environment lighting. The technique – which some practitioners describe
as a philosophy – uses images and lighting displayed on LED sets to produce
realistic reflections and ambient lighting in a scene.
“The three main benefits are accuracy, time saving and control,”
says Tim Kang, Principal Engineer, Imaging Applications at lighting vendor
Aputure. “The biggest one for me is control. We’ve been chasing naturalism in
lighting for 100 years but have only been approximating the real world.
With IBL you can get the naturalism you want, and you can control the
variables much more directly.”
Garden Studios has
been using IBL since 2021,
primarily for driving and VFX-heavy scenes. It has recently developed a workflow for tracking
hard sources, allowing a sun source to automatically move around a car
driving down winding lanes.
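The geometry behind that tracked-sun idea is simple to sketch. The Python toy below (names and values are illustrative, not Garden’s actual code) keeps a hard source at the sun’s bearing relative to a turning car; in a real workflow the heading would come from the plate’s tracking data rather than a hard-coded list.

```python
# Toy version of a tracked 'sun': the environment sun sits at a fixed
# world azimuth while the car's heading changes, so the light rig must
# hold the sun's bearing in the car's frame of reference.
def relative_sun_bearing(sun_azimuth_deg: float, car_heading_deg: float) -> float:
    """Bearing the hard source should hold, relative to the car's nose."""
    return (sun_azimuth_deg - car_heading_deg) % 360.0

# A car turning through winding lanes while the sun stays at 120 degrees:
for heading in (0.0, 30.0, 75.0, 120.0):
    print(f"heading {heading:5.1f} -> light at {relative_sun_bearing(120.0, heading):5.1f}")
```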
“The key is finding
a good balance between IBL and traditional lighting controls; between the VP
team and the Desk Op,” says Kemp. “Image based lighting doesn't really
apply to specific sources when talking about practical fixtures (such as a
normal light on a stand) so much as to the conceptual control of those sources, such
as mapping the colour and intensity of a video to a light fixture’s output
colour.”
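As a concrete illustration of that mapping concept (a minimal sketch, not Garden’s actual pipeline), the snippet below averages a region of a plate frame and sends the result to a hypothetical three-channel RGB fixture as DMX over sACN (E1.31), using the Python sacn and Pillow libraries; the universe, channel and file name are invented.

```python
# Minimal pixel-mapped lighting: average a region of a video frame and
# send it to an RGB fixture as DMX over sACN (E1.31).
# Universe, channel and frame path are hypothetical.
from PIL import Image, ImageStat
import sacn

UNIVERSE = 1          # hypothetical sACN universe patched to the fixture
FIXTURE_CHANNEL = 0   # first DMX channel of a simple 3-channel RGB fixture

def average_colour(frame_path, box):
    """Mean RGB of the frame region the fixture should 'copy'."""
    region = Image.open(frame_path).convert("RGB").crop(box)
    return [int(c) for c in ImageStat.Stat(region).mean]

sender = sacn.sACNsender()
sender.start()
sender.activate_output(UNIVERSE)
sender[UNIVERSE].multicast = True

try:
    r, g, b = average_colour("frame_0001.png", box=(0, 0, 256, 256))
    dmx = [0] * 512
    dmx[FIXTURE_CHANNEL:FIXTURE_CHANNEL + 3] = [r, g, b]
    sender[UNIVERSE].dmx_data = tuple(dmx)
finally:
    sender.stop()
```

A production system would run this per frame, synced to playback, with calibrated rather than raw pixel values.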
An accurate colour
pipeline is key to matching colours, and this includes the pipeline for IBL. Allowing adequate time to
complete camera calibration leads to a smoother shooting experience.
“Garden Studios
calibrates its screens’ colour pipeline so virtual fixtures
lighting virtual content will correctly match their physical equivalents,”
explains Kemp. “A colour meter helps
match lighting from LED panels (e.g. from a ceiling panel) to physical fixtures,
as does using DMX modes such as CIE-XY (a universal colour
space representing the colour spectrum visible to the 'average human'). Newer fixtures can define a source colour
space when using RGB modes for pixel mapping.”
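The conversion behind a CIE-XY mode is standard colorimetry. A minimal sketch, assuming an sRGB source (real panels and fixtures have their own native gamuts, which is exactly why calibration matters):

```python
# Convert an 8-bit sRGB sample to CIE xy chromaticity (D65 white point),
# the device-independent coordinates a CIE-XY DMX mode expects.
def srgb_to_linear(c):
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_to_xy(r8, g8, b8):
    r, g, b = (srgb_to_linear(v) for v in (r8, g8, b8))
    # sRGB -> CIE XYZ (D65) matrix
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = X + Y + Z
    if s == 0:  # pure black has no chromaticity; return the D65 white point
        return 0.3127, 0.3290
    return X / s, Y / s

print(srgb_to_xy(255, 128, 0))  # a warm orange, roughly (0.54, 0.41)
```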
It's not always as straightforward as it sounds since
identical LED panels might have been produced in different batches and therefore
emit light differently.
“Assuming that the colour pipeline has been set correctly
for the Volume, we can pixel map lighting fixtures from the environment to
ensure accurate colour replication,” says Hall. “But trying to match an LED
panel and a lighting fixture, which are in no way identical, is extremely hard
as they display different colour gamuts. You must ensure your colour pipeline is
set correctly and then dial it in by eye. You have to trust your trained eye to
see what looks right or not.”
Virtual and real camera team collaboration
The clear advice to production is to pair the DOP, Gaffer and
Production Designer with the Virtual Production Supervisor at the earliest
stage possible.
“We always recommend a pre-light before a shoot so that the
gaffer and DOP can run through all of the shots and lock off any variables
before the shoot day,” says Sims. “Working in a Volume gives you so many
possibilities, but with that we find that leaving the experimentation to shoot
day is an unwise strategy - as it can lead to the time on a shoot day running
away. A pre-light day is highly recommended to find what works, confirm
approaches and lock everything off so that when it comes to shoot, everything
can be achieved quickly and smoothly.”
It is also important for the Production Designer to be “synced”
with the Virtual Production Supervisor from an early stage in production. Sims explains,
“This is to ensure that the virtual set can be married up to the physical set
that is being built. This becomes especially important when trying to make the
line between virtual and physical set seamless. Once the set is built and in
situ the VP team can then colour match the virtual environment to the physical
set.”
Matching practical set and fixtures with virtual assets
Some of the biggest challenges on a virtual production set
become abundantly apparent when trying to extend the physical elements
of an environment seamlessly into the virtual world. The complexity of the
challenge depends entirely on what you are trying to bring together and
the illusion you are trying to create.
Sims cites the example of attempting to convincingly marry physical
and virtual sets for the outside of a building. “You need to match up straight
solid lines and subtle block colours, so anything that isn’t bang on perfect or
colour matched will be glaringly obvious. This also means your camera tracking
needs to be inch-perfect to avoid jumping or unwanted shaking.”
Less challenging are environments where the line
between physical and virtual isn’t as strict, for example a sandy desert.
Colour matching is vital here to sell the illusion.
“To overcome these challenges, we have to underscore the
importance of the pre-light day, and getting up close and personal with your VP
team at your volume stage. Construction collaboration is key here. The more
time the VP Supervisor has to colour match with the set in position the better.
Set build days and pre-light days allow for this care and consideration to be
taken.”
Fighting on a freight train
Garden recently
shot a fight scene on a moving freight train with its custom lighting controller,
using a combination of IBL mapping, DMX cues and OSC variables (Open Sound
Control/OSC is a protocol for networking sound synthesisers and other devices
for musical performance or show control).
“As the train moves
around corners and through a tunnel, a hard-source light array kept the sun in
the correct relative position, flickering behind trees, and pixel-mapped LED
tubes gave full-spectrum soft fill on the talent, automatically changing intensity
in the tunnel,” Kemp explains. “Close-up
fill lights were manually set; everything else could be fully automated.”
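To give a flavour of the OSC side, here is a hypothetical sketch (not Garden’s controller) using the python-osc library to send exactly these kinds of cue variables; the host, port and address patterns are invented.

```python
# Hypothetical OSC cues in the spirit of the freight-train setup: as plate
# playback reaches the tunnel, fade the fill intensity and hold the 'sun'.
# Host, port and OSC addresses are illustrative; python-osc is assumed.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.50", 8000)  # lighting controller (example)

def enter_tunnel(duration_s=2.0, steps=20):
    """Ramp the soft fill down to tunnel level over a short fade."""
    for i in range(steps + 1):
        level = 1.0 - 0.8 * (i / steps)       # 100% -> 20% intensity
        client.send_message("/fill/intensity", level)
        time.sleep(duration_s / steps)

# Keep the hard 'sun' source at the correct relative bearing for this frame.
client.send_message("/sun/azimuth_deg", 212.5)
enter_tunnel()
```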
80six worked on a recent car shoot where the windscreen was
taken out and there was therefore no LED ceiling for the shoot.
“Traditionally, when you shoot through a windscreen while
someone is driving, there will be reflections of the sky on the windscreen,”
Hall notes. “Because the shoot we were doing was as if the camera were inside
the car and we only shot out of the lateral windows, we didn’t require an LED
ceiling because there was no reflective surface.
“We put an old-school light on a revolving wheel that spun in time with the plate playback to create the illusion of orange streetlights passing overhead. The colour of the orange sent to the fixture was selected from the footage of the driving plate.”
IBC
article here
If the UK’s creative industries are to continue to add
hundreds of billions of pounds in value to the country’s economy then much will
rely on the success of a new network of tech labs exploring the future of
media.
“The UK’s creative lifeblood is creative IP,” says James
Bennett, director of CoSTAR National Lab. “How does that creative IP live on
different platforms and reach audiences beyond screens in hybrid spaces? We
need to see creative applications in 5G and 6G that work together with AI
neural networks to create a future of holographic imagery, innovative live
performance and enhanced mixed reality experiences.”
The CoSTAR Network is the evolution of the government-funded
Creative Industries Clusters Programme, which ended in 2023 having spent £56
million to drive innovation and growth across the UK’s creative industries.
Four of those clusters (led by universities in Dundee, Edinburgh, Belfast and
York) are participants in CoSTAR, which has been awarded a £75.6 million grant over
six years by the UKRI Infrastructure Fund, delivered by the Arts and Humanities
Research Council.
University R&D powering creative innovation
Each of the Labs is equipped with a private 5G network,
compute power for AI and the latest equipment for virtual and mixed reality
production, though each has a different focus.
Just as importantly, the Labs are supported by a team of leading researchers
with expertise in the use of immersive and virtual technologies.
“We're turning the traditional academic route of engaging
with industry on its head,” says Bennett. “Historically industry comes to
universities. At CoSTAR, we are embedding University researchers in the heart
of the industry.”
Bennett ran the StoryFutures Cluster project which saw the
creation of nearly 150 projects exploring novel storytelling formats and
audience experiences. “We've spent the last five years making sure that
university research is at the service of industry via the Creative Clusters
programme,” he says. “We had proof of concept that if you put R&D from
universities in the service of creative industries, you get growth and
innovation and new products and services.”
Four Labs have launched, with a fifth National Lab opening at
Pinewood next year. This is intended as a convergent media production hub and
will coordinate efforts to bring the network’s research and infrastructure to
bear on projects.
“It's been a really tough time for our creative sectors and
a long wait since the first round of Creative Industries Clusters ended,”
says Bennett. “We know there is a huge appetite among our creative sector to
have opportunities to innovate and to be supported by world class research. In
really uncertain times, what CoSTAR provides for them is a safe space to make
serious attempts at innovation.”
Flexible funding to deliver value
SMEs and start-ups can apply to two Access Programmes that allow UK
companies and partners to access the infrastructure and expertise of the entire
CoSTAR network. Typically, applications will either be based around an existing
project where R&D might supercharge development, or around using a lab as a
sandbox for pilots and prototypes.
“What we look for in each is essentially whether there is an
innovative idea that's got a clear route to growth that's underpinned by an
ethical, sustainable and inclusive approach to that growth that we can actually
support,” Bennett says.
The Access Programme fund is worth £7 million over three
and a half years, but Bennett says the value to recipients is double that. “The
real value of the seven million cash is much more like fourteen million because
the infrastructure itself is being provided for free, including the staffing,
to support companies’ R&D. So, when a company gets a cash grant from us,
that is then match funded with access to the infrastructure.
“We can only grow the UK’s creative industries if we have
innovative SMEs and startups that are experimenting in the space. CoSTAR will
offer opportunities for large organisations to work with SMEs, but the
lifeblood will be getting lots of SME innovation through the door, seeing
what's possible, accessing the kit and people.”
CoSTAR Screen Lab
In March, Ulster University unveiled the CoSTAR Screen Lab
virtual production facility at its Studio Ulster campus in Belfast Harbour
Studios.
“Northern Ireland has long punched above its weight in
screen production,” says Declan Keeney, Co-Founder & CEO of Studio Ulster
and Director of the CoSTAR Screen Lab. “We're seeing the creative industries
replacing the heavy industries here, clustered around the harbour. We have
about 1200 AAA crew here and a nascent but fast-growing creative technology
sector. These are well paid creative technology jobs. CoSTAR Screen Lab will
accelerate the development of breakthrough techniques that will redefine how content
is created.”
Studio Ulster itself is a wholly owned subsidiary of the
University and a large commercial facility with two LED Volume stages, which
have hosted the four-part BBC Factual documentary Titanic Sinks Tonight.
The facility is wired to the SMPTE ST 2110 IP standard, giving it the
power to run 32 channels of 8K compressed video at any one time. A third ICVFX
stage, installed by Los Angeles-based NantStudios, opens in April.
CoSTAR Screen Lab is designed into the core of the building.
It offers facilities for ICVFX, robotics and a 5G private network. There are 3D
and 4D volumetric scanners capable of ingesting multiple images a second from a
250-camera array (the fourth dimension, alongside height, width and depth, is
time: the scans are sequential).
Over and above these state-of-the-art toys, the Lab offers
access to expertise, particularly in AI, computer vision systems, cognitive
robotics and ambisonic audio. Last year Invest Northern Ireland and the NI
Department for the Economy invested £16.3 million in an Artificial Intelligence
Collaboration Centre (AICC) based at Ulster University in partnership with
Queen's University Belfast. Michaela Black, Professor of AI, and Daryl Charles,
Professor of AI and Computer Games, are among the academics on site.
Also on campus is Professor Greg Maguire, former technical
animation supervisor at Walt Disney Feature Animation, Lucasfilm Animation and
ILM, where he worked on Avatar. Maguire is also founder and CEO of Belfast
animation company Humain, which builds technology to create digital humans.
This Lab has issued a funding call to support creative and
innovative use cases for 5G across the screen and performance sectors,
facilitating multi-site collaboration.
“The CoSTAR Screen Lab is about getting local and national
companies to make proof of concepts and develop capabilities in their
companies,” says Keeney.
“The investment point is very high for this technology but
if you have access to a facility like the Lab and the world class expertise we
have in the building, all of a sudden you're empowered to take your idea to the
next stage.”
CoSTAR Live Lab – understanding the experience
The Live Lab based at West Yorkshire’s Production Park near
Wakefield will explore immersive, multisensory, and interactive technologies in
the live environment.
Production Park already boasts one of Europe's largest
campuses of companies dedicated to innovation in live performance. It
hosts large stages where artists like Pink, Metallica, Beyoncé and Foo Fighters
have come to set up their arena tours before taking the show on the road. Among
the companies established on site are Tait, the global staging, scenic and
automation supplier for live events; sound reinforcement specialist L-Acoustics; and
LED display vendor ROE Visual. Its facilities include markerless performance
capture in partnership with Vicon.
“The artists that come to Production Park are not just here
to rehearse, they ideate their entire tour here,” explains Live Lab Co-Director
Helena Daffern. “They turn up with the seed of an idea – ‘I want to be catapulted
in on a giant giraffe’ or whatever their ambition is for their tour – and it gets
developed and designed here.”
Daffern explains that the Live Lab’s foundational research
is not just around the live performance industry but the very “concept of
liveness” itself.
“That's where the network really comes into its own because
the way we engage with audiences across all different types of media is
changing. We want to interact with our digital world in a different way. The
research we can do across the CoSTAR Network will let us explore the human
experience of how we interact with screen and gaming technologies. That’s why
the network is so important: it allows us to share knowledge and
innovate in an efficient way rather than in silos.”
There are even lab spaces dedicated to user experience, which
explore biometrics such as heart rate and skin conductance. “We want to understand
what really drives the human experience and response to new technologies,” says
co-director Gavin Kearney. “By using visual tracking from a camera turned on an
audience can we infer from their facial features exactly what they're feeling
and emoting? It's these types of technologies that help drive the new
generation of immersive experiences.”
One stage at Production Park featuring a 28-channel
loudspeaker array installed by L-Acoustics will be offered for experimentation
in immersive audio.
“For example, we could take an audience within Live Lab and
run tests on 50 people or we could bring them in to experience one of the Arena
venues with a 10,000 capacity,” he says. “That's the wonderful thing about this
Lab - everything is scalable.”
Another avenue of exploration is connecting performers with
audiences over the internet. “We’re looking at the technologies that will
enable shared virtual environments to happen in a meaningful way,” says
Kearney. “Our dedicated spaces allow us to test new technologies under
controlled conditions so we can vary things like the codecs, bandwidth, latency
conditions and so on. We can think about each of the individual technologies in
turn and then converge them to create something unique.”
Wakefield Council recently poured £3.2m into expanding the
studios to include four additional production studios tailored for live music
and film, as well as new facilities for The Academy of Live Technology.
Live Lab is currently inviting applications for a 'New
Frontiers for Live Performance' pilots and prototypes programme.
Applicants can apply for cash funding of up to £13K to contribute to the
costs of their R&D project. In total, the support package on offer is
valued at over £100K per project.
Daffern adds, “If the network is successful it will have
succeeded in bringing together different strands of R&D, shared knowledge,
resource and facility.”
CoSTAR National Lab – into the metaverse?
The CoSTAR National Lab at Pinewood will offer virtual
production technology, a 236m² sound stage and labs featuring spatial audio,
volumetric capture and multisensory devices, as well as a private 5G/6G network.
“This is where convergent media experiences are going to
live,” says Bennett. “We will look around the corner to what is coming in
converged media landscapes where it's hybrid physical and virtual or real time
interactions across different devices. We’re also thinking about the built
environment as a canvas on which these creative experiences and creative IP can
live.”
BT is providing the telco network at the site, while
Disguise is providing its RenderStream technology, which enables real-time
streaming of data between media servers and rendering engines and is commonly
used in virtual production, live events and immersive experiences. CoSTAR also
has agreements with a number of unnamed “large organisations” to become
partners, with news to be announced.
The focus is not just entertainment. Bennett says they are
doing work around “accessibility and wayfinding” that will provide new forms of
e-commerce.
“A lot of our future landscape gets imagined by Hollywood
and features holographic images and AI generated audio visuals coming at us
from all angles. One of the really interesting pieces we're putting together is
how you actually create an environment where we may have huge amounts of
sensory experiences bombarding us yet be able to block things out and focus on
particular areas. How do we create experiences that enable people to enjoy the
next wave of the metaverse?”
CoSTAR Realtime Lab – connectivity and AR
With its main site at Water's Edge in Dundee, with
close links to the UK video games sector, and a second facility at Edinburgh
College of Art, the Realtime Lab run out of Abertay University will specialise
in virtual production, integrating CGI, motion capture and AR. It is equipped
with a Mo-Sys tracking system, ROE Carbon LED panels and Brompton processors.
While Scotland’s screen sector can look to benefit from the Lab, Abertay’s
demonstration of real-time geographically dispersed production over 5G has
already caught the eye.
There are plans to evolve this experiment over 5G and
nascent 6G networks with Pinewood when that lab launches in 2026.
CoSTAR Foresight Lab – skills for the future
Led by Goldsmiths, University of London, the Foresight Lab is
a thinktank scanning the creative industries sector-wide, with a focus on key
areas including decarbonisation and advising on the regulatory framework to
support growth and innovation.
Board members include ILM, DNEG, BBC R&D, the RSC,
Framestore, USC School of Cinematic Arts and Microsoft.
“They've been leaning into the debate around AI and
copyright, which is being reviewed by the government, for example,” says
Bennett. “It has a 20,000-company business tracker to look at emerging trends:
where public and private money is being spent globally on CoSTAR
technologies, where market activity is going, where the skills gaps are and
where intervention is needed most urgently. That provides a really good context
for what is happening in the sector.”
Among the other questions it is researching: How extensive
is the use of convergent technologies (including artificial intelligence) by
firms working in CoSTAR sectors? What are audience experiences and expectations
for products and services using CoSTAR technologies? And what essential data
structures and metadata elements should be collected for CoSTAR technology
productions?
“CoSTAR is the next obvious step for UK Creative Industry,”
says Bennett. “The Creative Industries generate six percent of UK GVA (worth
£124 billion in 2024), but only receive around one percent of R&D spend.
Now we are making cutting edge infrastructure available within academia where
industry can access it. Fundamentally we are putting world-class research
at the service of creative industries to grow innovation in an ethical and
sustainable framework.”
interview and text written for Sohonet
article here
House of Parliament, an independent VFX and creative studio, was founded in early 2020 with a vision of reimagining the concept of the traditional studio. Five years on, the company is a serial award winner working on the highest profile projects. That reputation includes delivering nine commercials for Super Bowl 2024 in just one month. In the fast-paced world of visual effects (VFX) and creative production, innovation and adaptability are crucial to Parliament’s success.
Underpinned by over twenty years of experience in high-end
visual effects, Parliament are experts in consulting on, creating and executing
visual content to the highest level.
Parliament’s animated work has appeared in prominent
productions such as Taylor Swift’s self-directed 2024 VMA Video of the Year for
'Fortnight' featuring Post Malone, and 'Smoke and Mirrors,' which won
conceptual artist Beatie Wolfe the 2024 Prix Ars Electronica 'Golden Nica'. The
studio is also a finalist for VFX Company of the Year at the Ad Age Creativity
Awards for campaigns including Apple’s 'Flock' directed by Ivan Zacharias for
Smuggler, and LAY'S 'The Little Farmer' directed by Taika Waititi for Highdive
and Hungryman.
The Power of Collaboration for Speed and Scalability
The Parliament pipeline was built, managed and resourced by
their technology partners at Gunpowder, and is designed to exploit the latest
developments for scale, speed and collaboration.
“We are effectively their CTO,” says Tom Taylor, founder of
Gunpowder, a leading systems integrator specializing in
cloud virtualization solutions. Their role in visualizing and implementing
Parliament’s workflow is extensive. “We build the pipelines, we operate the
render farm, we help them scale and we help with all the upgrades. We manage
billing to ensure projects remain on budget and that the infrastructure is on
tap as required and costs don’t spiral.”
The key to the success of House of Parliament’s VFX
workflow is the virtualized version of Sohonet's real-time review
tool ClearView Flex (aka VFlex). Taylor says: “It is exceptionally easy to
set up and use, which producers love. Since ClearView Flex gives peace of mind
to their clients it makes Parliament happy, and it reduces a lot of engineering
time for us.”
Solving Critical Connections
A key issue was solving the critical connections for
interactions between clients and artists working from home. “We’d jury-rigged
open-source tools to get streams at a high enough quality to clients remotely,”
reports Taylor. “To be honest we were not consistently successful. Sometimes it
would work well, sometimes it would falter. And it always required an engineer
to set up and do some tweaking during the session. We found ourselves
constantly trying to make it work. We did not want the clients to notice,
and it was getting to that point.”
In 2022 Gunpowder reached out to Sohonet. Taylor
explains, “I knew at the time, the virtualized version of ClearView Flex
(VFlex) was operating in AWS, but Parliament was on Google Cloud.
Sohonet arranged for us to beta test a version of VFlex in Google and we set it
up. From day number one it was like night and day.
“The producer suddenly had control. It was easy enough and
clear enough that they could then manage the sessions. The clients were happy
because it looked great, and they were also using a tool that they were
familiar with. You can’t overestimate the importance of this. Lots of clients
had used ClearView Flex all over the world and they were excited to use it when
we presented it to them.
“The clients wanted it. We wanted it. Sohonet delivered it
for us - in Google, specifically - so that we could move forward. We've got
very smart engineers who tried to build this but in the end for peace of mind
of the clients and for ourselves we ended up using VFlex and we haven't looked
back.”
The result: smoother collaboration, less downtime, and
happier clients.
Benefits of VFlex
VFlex has become an essential part of Parliament’s daily
workflow, allowing artists to work with Autodesk Flame, Houdini, and Maya in a
virtualized environment. Its reliability in maintaining color accuracy and
quality across devices has significantly enhanced client satisfaction and
streamlined the creative process.
“Now, we don’t need engineers to set up
sessions,” says Taylor. “We are no longer relying on open-source tools that
risk disrupting our workflow. Think of it this way: we had a whole chain of
plug-ins that were our version of VFlex. To get that chain working took a lot
of effort. And, if any one of those pieces got updated it would quite often
break something else in the chain. We were operating in a very unstable
structure for sending daily reviews out to clients, crossing our fingers to
see if it would work.
“With VFlex, it’s 180-degrees different. It's a known product and clients are
very comfortable with it. They know that if they’re watching a ClearView stream
that it’s going to be excellent quality, and we know it's not going to lag.
Plus, it’s going to be color accurate.”
Ensuring artists and clients are seeing the same thing is a
perennial issue with distributed workflows, but not when VFlex is part of the
solution.
“Colorimetry is notoriously tricky when you have some people
on an iPad, others on an iPhone or laptop and sitting on the other side of the
world. Getting that consistency of viewing experience is exceedingly
difficult,” Taylor says. “VFlex gives us peace of mind. We know that the source
signal is consistent across any device that the client wants to connect from.”
The Cloud-First Mentality
House of Parliament launched in March of 2020 with a roster
of high-profile projects signed and ready to go, notably including
production on multiple Super Bowl commercials. With everything set, the
global pandemic enforced lockdown just one week later.
“They weren't able to get a lease on office space or obtain
infrastructure or equipment,” says Taylor. “We had to scramble, fast, and
figure out how we were going to do this.”
Cloud postproduction studios were not a new concept at that
time, but none had abandoned on-premises workstations entirely. Out of necessity,
Parliament had to pioneer a cloud-first mentality.
Gunpowder tackled the problem head-on, talking with cloud
providers and using available infrastructure. In a matter of weeks, they had
built an alpha cloud studio that enabled Parliament to scale out to 100 artists
across different regions and get the commercials done and dusted for Super Bowl
LV.
Post Super Bowl, still in the pandemic, Gunpowder reviewed
the infrastructure and began to evolve it. “The first few months were
definitely a scramble,” Taylor recalls. “We needed this to work irrespective of
the issues we encountered. It was trial by fire.”
Scale for Super Bowl
While no two projects are the same, Gunpowder built a core
pipeline for Parliament that can scale. VFlex is integral to each one.
Taylor says: “Each department has a volume control in it, if
you will, and depending on a job’s ebbs and flows we turn it up or down. That
can be multi-region. It can be different countries. If they want to hire a
specific designer who's in Australia to produce a certain look, we can get that
person in front of the project within minutes. We are literally able to grab a
slider bar and drag it up and get 100 extra machines online in three minutes.”
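That “slider” maps naturally onto a managed instance group resize. Below is a hedged sketch on Google Cloud (Parliament’s provider), with the project, zone and group names invented; it assumes the google-cloud-compute client library and appropriate credentials, and is illustrative rather than Gunpowder’s actual tooling.

```python
# Sketch of the 'slider' idea: resize a render-node managed instance
# group on Google Cloud. Project, zone and group names are hypothetical.
from google.cloud import compute_v1

def scale_render_farm(target_size: int):
    client = compute_v1.InstanceGroupManagersClient()
    op = client.resize(
        project="example-vfx-project",          # hypothetical project ID
        zone="us-central1-a",                   # hypothetical zone
        instance_group_manager="render-nodes",  # hypothetical MIG name
        size=target_size,
    )
    op.result()  # block until the resize request is accepted
    print(f"Render farm scaled to {target_size} machines")

scale_render_farm(100)  # the '100 extra machines' of Taylor's example
```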
This flexibility enabled Parliament to more than triple in
size to accommodate the increase in work, involving over 300 artists, 2 PB of
data, and thousands of hours of rendering to complete nine spots ahead of Super
Bowl LVIII 2024—all over the course of just six weeks.
“One of the nicest compliments we received from Parliament
was that they didn't even have to think about doing this. The key to VFlex is
that it is easy to set up. It just works. Producers love to use it, and it
makes our clients happy.”
Template for Success
Parliament recently opened a design department and is
working with Gunpowder to explore the integration of real-time workflows.
“Design and post workflows are traditionally kept separate but we’re bringing
the two together so that our 3D artists can benefit from being able to model
quickly in tools like Unreal and then bring those tools back into Maya.”
Separately, Gunpowder has taken the cloud template and
applied it for clients outside media and entertainment, in verticals spanning
sports, architecture firms, toy manufacturers and more.
“We not only help legacy creative VFX studios accelerate
their transition to dynamic cloud-based operations and workflows, but our goal
is also to free production teams to concentrate on delivering their best
creative work, by taking care of the cloud infrastructure and management.”
House of Parliament’s partnership with Gunpowder exemplifies how cloud-based solutions can redefine creative production. By focusing on robust infrastructure and reliable client interactions, the studio has set a benchmark for the VFX industry, showcasing how innovation and collaboration lead to success.
Enginelab and the new breed of cloud postproducer
article here
It takes a brave soul to launch a new VFX facility given
the meltdown at one of the industry’s largest, but creative entrepreneurs
conversant with cloud economics are confident that there are good opportunities
to be grasped.
UK startup Enginelab is the latest of a new breed of
postproduction company designed around facilities in the cloud and powered by
AI.
Two of its three founders come from Untold Studios, which
broke ground in 2018 by establishing the world's first cloud-native creative
studio with a template of cloud render nodes and virtual workstations.
Sam Reid was CTO of the initial Untold team, helping grow the
company from a handful of employees to several hundred and bringing international
business to its creative services from commercial brands, pop artists, studios
and streamers.
“I've learned a thing or two about how to work in the
cloud and how to make the cloud work for media and entertainment,” says
Reid. “We're cautiously optimistic that increased volumes of work are coming
back into the market and that new studios are going to pop up that will need
next generation technology, solutions and workflows to support them.”
Describing Enginelab as a full-service independent
technology business, he adds, “We don't need edit suites. We're not going to be
hiring artists. We're going to be providing the infrastructure for studio
businesses and we’re going to be the technology experts they can call upon for
guidance and leadership.”
Joining Reid in the venture are Daniel Goller, a colleague and senior
developer from Untold, and Matt Herman, who founded roto
and paint shop Trace VFX before selling it to Technicolor in 2016. Subsequently, Herman took animation and
visual effects outfit Psyop from multiple on-prem studios to a fully cloud and
remote operation, expanding the business by opening lightweight facilities in Mexico
City, Berlin and Hamburg.
“Because we have [set up facilities] once before we should
be able to do it again but a lot quicker,” Reid says. “We’re also going to use
AI to help us do that.”
Specifically, Enginelab will use AI to automate processes.
“AI helps with technical manipulation, the really boring, mundane jobs that an
artist would have to do, so they can focus more on their craft,” Reid explains.
“I’ve spent a lot of time at Untold evangelising and implementing AI
workflows. Now I’m keen to unlock efficiencies in workflows for other
businesses. For example, AI can write code a lot more efficiently and a lot
better too.”
It’s not too much of a stretch to suggest that the recent collapse
of Technicolor is the end of the line for post models with volumes of real estate, thousands
of employed staff and huge overheads. They are being replaced by leaner
organisations where infrastructure is for hire, tailored per project and
scaled up or down as required.
“It's all very well shutting everything down and minimising
spend, but you need to be able to quickly kick it back into motion when you get
a big project that needs lots of render nodes, for example,” Reid says. “You
also have to be comfortable doing it, because it’s one thing knowing you can do
it, but you need to have the team around you who know how to do it properly
so you don't end up with huge bills and in situations you find very
difficult to get out of.”
In his obituary to Technicolor, Michael Elson, COO at MPC
from 1998 to 2008, said The Mill was “founded by visionaries and powered by
super talent, ravaged by neglect”. MPC, he said, was “killed by a management
so adrift it’s criminal”. Of Technicolor itself Elson concluded, “A corporate
behemoth was never equipped to deal [with] a world that requires you to be
light on your feet and adaptable.”
Reid and Herman are alumni of The Mill, both having started
their careers there in the engineering departments. They are determined not to
make the same mistakes as its parent.
“It’s about staying lean and not falling into a trap of huge
overheads by being able to adapt to dips in work,” Reid says. “Cloud technology
helps with that because you can be very in control of the costs.”
He adds, “I really enjoyed working at The Mill and it’s sad
to see what's happened to it. It's where I fell in love with technology. One
thing I’ve learned is that when our backs are against the wall everyone bands
together. You can see it happening right now. We're having some really
interesting conversations with people about setting up new studios and
hopefully we'll be able to help them.
“The future is definitely much less about having a physical
presence and owning kit. The facilities are disposable to be honest.
“People are the assets and always have been in this
industry. We need to protect them because they are the ones that drive value.”
Enginelab are optimistic that the industry as a whole has
turned a corner after the last few years of Covid, strikes and economic
downturn. They also have an eye on the 29.25%
tax credit for UK VFX that comes into effect on 1 April 2025 (and is backdated
for activity after 1 January 2025), though the legislation probably won't
receive Royal Assent until late March.
“There is zero chance it will fail at this point,” Neil Hatton,
CEO of UK Screen Alliance, tells IBC365. “HMRC, however, won't issue guidance
until it's written in law and there are signs that this is causing some clients
to hang back on commitment until they are 100% certain of what is claimable.”
Reid highlights the increasing global and transient nature
of the workforce and shifts in locating productions to soak up different tax
benefits.
“We hope to see a lot more studios come to the UK especially
for films and HETV work. The key to success in 2025 is being able to work with
pockets of people around the world. Our challenge is how to make it a seamless
and frictionless process.”
They aspire to emulate the business model of Untold which
spans longform as much as shortform work.
“With a longform project you are looking at many
months to potentially years of work, so things like managing the data become a lot more of a challenge and more of a
focus point. Advertising can be started and finished within a few weeks. The
challenge here is to be very efficient and render shots quickly.
“We should be able to set up a very secure environment for
creatives to focus on what they do best while we make the technology work
really hard for them. Those artists could be in Boston or Cape Town just as
easily as they might be north of London.”
Having established a relationship with AWS at Untold, Reid
says it starts as Enginelab’s preferred cloud provider. “If a customer wants to
use a different cloud provider then we'll be agnostic. I'm not a cloud
salesman, I'm a technologist. We want to work with businesses to craft them the
best technology solution that could be in the cloud or it could be on prem or
it could be both.”
“If it's a full cloud environment with render, storage and
workstations there for maximum efficiency, we can also help businesses work
together. If more people use the same platform we can create some smart
automations and ways of sharing data.
“For example, a big feature film might want to engage us to host their data and
we would securely serve data and functionalities out to different vendors on
that show. Certainly, there will be a power in numbers if everyone is using the
same infrastructure.”