Accelerated VFX workflows, video game characters you can converse with, and auto-generated visual experiences from sound for XR headsets are just some of the AI innovations devised by start-ups as part of a recent Digital Catapult programme.
Government-backed deep tech innovation organisation Digital
Catapult partners with industry and academia to find advanced technological
solutions that benefit businesses UK-wide. In a recent scheme, Digital Catapult
allocated funding to explore how start-ups could use AI to boost the creative
economy in the South West of England.
The BBC, AWS and Nvidia are among the external partners in
the scheme, which also connects entrepreneurs with the latest academic research.
“We design programmes that are specifically going to work to
address the most cutting-edge challenges for the sector and that play to the
strengths of individual businesses,” explains Sarah Addezio, Senior Innovation
Partner and Programme Lead at Digital Catapult. “Because this is public
funding, our selection of projects is a very rigorous process.”
To date, 18 projects spread over two rounds have received
£50k of funding each, plus the opportunity to participate in a 16-week
accelerator that offers technical and business support to build a new product
or develop an existing prototype. A second funding mechanism for R&D
projects in collaboration with academia delivers up to £200k.
“We’ve cascaded £2.9m worth of funding to 29 businesses in
the South West region,” says Addezio. “As a result, we've seen new jobs created
and £800k of investment achieved by the businesses we’ve supported in follow-on
funding.”
The first round of funding centred on tooling for the
creative industries; the second was themed more specifically on AI.
Lux Aeterna puts GenAI through its paces
Bristol-based VFX shop Lux Aeterna has explored the
capabilities and pitfalls of GenAI models for a variety of VFX processes. As
the company’s Creative Technologist/VFX Artist James Pollock explains: “If
you’re doing huge photoreal scenes with lots of volumetrics, like cloudscapes,
this has huge computational costs for the company. We’ve devised a process using
Intel Open Image Denoise that is able to reduce render times to four
hours per frame, where previously they were four times that.”
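Lux Aeterna has not shared its pipeline code, so the following is only a minimal sketch of how Intel Open Image Denoise is typically wired into a renderer’s output stage, using OIDN’s standard C++ API; the resolution and the auxiliary albedo/normal passes are illustrative assumptions, not details of the studio’s setup.

#include <OpenImageDenoise/oidn.hpp>
#include <cstddef>

int main() {
    const int width = 1920, height = 1080;

    // Create and commit a denoising device (CPU or GPU, auto-selected)
    oidn::DeviceRef device = oidn::newDevice();
    device.commit();

    // Buffers for the noisy beauty pass, auxiliary passes and the result
    const std::size_t bytes = std::size_t(width) * height * 3 * sizeof(float);
    oidn::BufferRef color  = device.newBuffer(bytes);
    oidn::BufferRef albedo = device.newBuffer(bytes);
    oidn::BufferRef normal = device.newBuffer(bytes);
    oidn::BufferRef output = device.newBuffer(bytes);

    // ... fill color/albedo/normal from a low-sample-count render ...

    // "RT" is OIDN's generic ray-tracing denoise filter
    oidn::FilterRef filter = device.newFilter("RT");
    filter.setImage("color",  color,  oidn::Format::Float3, width, height);
    filter.setImage("albedo", albedo, oidn::Format::Float3, width, height);
    filter.setImage("normal", normal, oidn::Format::Float3, width, height);
    filter.setImage("output", output, oidn::Format::Float3, width, height);
    filter.set("hdr", true); // beauty passes are high dynamic range
    filter.commit();
    filter.execute();

    // Surface any device errors rather than shipping a corrupt frame
    const char* message;
    if (device.getError(message) != oidn::Error::None)
        return 1;
    return 0;
}

The saving Pollock describes comes from rendering volumetrics at a fraction of the usual sample count and letting the denoiser reconstruct a clean frame, rather than from the denoiser itself accelerating the renderer.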
The facility has also explored ways of using GenAI to
increase detail on 2D aerial maps that can be used to create 3D landscapes,
such as alien planets and ancient Earth settings, and to quickly generate room
interiors for 3D models of buildings.
“If you had a 3D building with a thousand windows, you could
put a unique interior in each of those windows and change the perspective on
them as you move past, just as in real life, but without the need
for modelling that interior space,” he adds. “You can do that through GenAI.”
Such practical applications of GenAI are being researched in
tandem with the question of whether their use on a commercial project is ethical and legal.
“Digital Catapult ran workshops on responsible innovation
which were a really great way of thinking about ethics, because sometimes that
topic can appear a bit high level,” says Pollock. “Many GenAI tools have been
trained on images scraped from the internet with significant legal and ethical
question marks over their use in a professional commercial environment.”
As such, Lux Aeterna does not currently use GenAI models like
Stable Diffusion in the creation and delivery of its work, and it declares which
AI tools have been used in projects for clients including the BBC.
Pollock says: “It’s about providing a consistent framework
for responsible use based on the realities of these technologies and the real
risks, rather than guesswork.”
The studio made the sci-fi short film Reno as a
case study into both the creative and legal implications of working with GenAI.
Paul Silcox, VFX Director at Lux Aeterna, says: “With Reno, we are
describing every piece of every AI model that we’ve used along the way, how
we’ve used it and what our experience has been with it. A lot of what
we’re learning is going to be fed back into the industry.”
The education process has also shifted the perception of
AI from a job threat to an assistant for VFX artists.
“We can use AI to generate interiors and change them at the
drop of a hat, say, from a Chinese design to Victorian architecture,” Silcox
says. “The ability to do that adds a huge number of options to a VFX pipeline,
and before it just wouldn’t have been possible. It wouldn’t matter how many
people you employed, you wouldn’t have been able to do some of the things we can
now with generative techniques.”
In another example, a shot that would normally take four
days to rotoscope was done in a day with AI. “That didn't take four days of
work away from a rotoscoping artist,” he adds. “The shot would not have been
achievable or cost-effective any other way.”
Lux Aeterna’s research is now focused on building synthetic data
sets in order to train models in-house for applications like denoising or
upscaling.
“Using machine learning to drive efficiencies in the
creative process within Houdini is of interest to us and to [Houdini developer]
SideFX themselves. Another focus is on using AI and data sets to maximise the
unique skillset we have in creating digital twins for visual effects.”
Meaning Machine’s game-conscious characters
Experimental games studio Meaning Machine is developing
natural language models to enable game players to converse with characters in
more meaningful ways. It won funding from Digital Catapult in 2023 to explore
the concept and used it to create a game demo called Dead Meat, in
which players talk to characters to solve a murder mystery.
“You literally have the freedom to say anything, which is
pretty much unheard of in games up until now,” says Ben Ackland, Co-Founder and
Tech Lead at Meaning Machine. “It’s an example of what is possible today.”
Out of this emerged the concept of ‘game-conscious
characters’, the term Meaning Machine uses to describe characters that
understand what’s going on in the game.
“They are conscious of the way the narrative unfolds; for
example, of who has been killed, what events have happened, where every player
is,” Ackland says. “It combines game data with AI to ensure that non-player
characters can adapt their script to anything the player does, even when the
player does something totally unexpected.”
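Meaning Machine has not published its implementation, but the mechanism Ackland describes, grounding a language model in live game data, can be sketched roughly as below. All the types and names here are hypothetical, for illustration only: the idea is that the game serialises its event log and character state into the model’s prompt, so replies can only draw on things that have actually happened.

#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical stand-ins for the game's own data model
struct GameEvent { std::string description; };
struct NpcState  { std::string name, location, mood; };

// Serialise live game data into a prompt so the language model's reply
// stays consistent with what has actually happened in the game
std::string buildNpcPrompt(const NpcState& npc,
                           const std::vector<GameEvent>& events,
                           const std::string& playerUtterance) {
    std::ostringstream prompt;
    prompt << "You are " << npc.name << ", currently in " << npc.location
           << ", feeling " << npc.mood << ".\n"
           << "Events you are aware of:\n";
    for (const GameEvent& e : events)
        prompt << "- " << e.description << "\n";
    prompt << "Stay in character and refer only to the events listed above.\n"
           << "The player says: \"" << playerUtterance << "\"";
    return prompt.str(); // hand this to whatever language model the game uses
}

int main() {
    NpcState butler{"the butler", "the drawing room", "nervous"};
    std::vector<GameEvent> events{
        {"The victim was found dead in the library at 9pm"},
        {"The player searched the kitchen and found a bloodied knife"}};
    std::cout << buildNpcPrompt(butler, events, "Where were you at 9pm?") << "\n";
}

Rebuilding this prompt every turn is what keeps the character ‘conscious’ of the unfolding narrative: the model never answers from a fixed script, only from the current state of the world.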
This year, the company received a larger grant to develop
the technology in partnership with teams at the University of Bristol. It is also
continuing a mentoring and business relationship with Nvidia, which was initiated
by contacts at Digital Catapult.
“We have a deeper working relationship with Nvidia and are
working on something which they’ll be sharing in the new year,” says Ackland.
Digital Catapult has also guided the nascent business on investor readiness and
commercialisation ahead of plans to license the technology to game developers.
Ackland explains that the advent of AI has the potential to
kickstart a golden age of creative experimentation, as long as creative people
remain at the helm. “The conversations you can have with characters in games
have not really kept up with other aspects of gameplay. You’ve got very complex
physics systems creating free-to-roam worlds but the narrative that you can
have with characters has remained static. Even role-playing games like Baldur’s
Gate 3, with its very deep narrative experience, are still
pre-scripted and not that reactive to what’s actually going on in the game.
“The more conscious the characters are of what’s going on in the game, the more
opportunities there are through their dialogue to actually relate and react and
reflect what’s going on,” he says. “As AI enters our lives with things like AI
voice assistants, players are going to expect games to become more intelligent
and more wrapped around their experience, more targeted at them and
more personalised.”
As that happens, the player gains control over the game
narrative, with implications for storytelling that extend beyond the video game
industry.
“AI is only as powerful as the humans driving it,” Ackland
concludes. “And those humans could well be the writers or the game designers,
but also the players. One of the key things we found on Dead Meat was
that the more the player is willing to give to the game, the more they get out
of it. The player helps the emergent narrative evolve and the game gives back
in the sense it has more to work with to deliver that experience.”
New wave AI
The current wave of Digital Catapult funding is focused
squarely on AI-driven innovations for the creative industries. The nine
start-ups in the programme include Fictioneers, a division of WPP-owned
agency AKQA, which is using the BBC Archives to build a prototyping tool for
interactive podcasts.
Octopus Immersive Lab is already using AI to create
responsive, generative visualisations of audio in real time, to be experienced
in XR headsets. It is working to tailor this for the BBC.
Audio design and technology outfit Black Goblin is
developing a platform that uses machine learning to allow content creators to
design sound alongside audio professionals. The platform will automate the
generation of sound effects from visual content, also in conjunction with BBC
engineers.
Finishing and remastering specialist Nulight Studios says it
will use generative AI for video and audio production. Its first tool will
automate the identification and replacement of unwanted objects (such as lens
dirt, street lights, and radio collars on animals) in video, which the company
says will be particularly beneficial for the region’s natural history filmmakers.
Outcomes from these projects are expected later this year.