Thursday, 25 April 2024

When It Comes to Creativity, Humans and Machines Can Co-Exist (But It Won’t Always Be Easy)

NAB

As uncertainty and debate heat up about the impact of generative AI on the creative arts, Adobe wants to defend its position in developing AI tools integrated into Photoshop and its own GenAI model, Firefly, while also reinforcing its brand as a champion of the creator.

article here

“I truly believe that to be human is to be creative, that creativity is a core part of who we are, whether or not you consider yourself a creative person,” says Brooke Hopper, principal designer for emerging design for AI/ML at Adobe, speaking to Debbie Millman, designer and educator at The School of Visual Arts and host of the podcast Design Matters.

While AI is in its infancy, it is easier to cling to the idea that creativity is essential to what it means to be human. As the technology advances, however, what passes for art, imagination or lived experience may become indistinguishable from the output of a machine.

“It’s our emotions, our point of view, our life experiences,” says Hopper. “It’s spontaneity, it’s deciding when and where to break the rules. And so I do think that there’s a coexistence of humans and machines [where] humans do what humans are good at and ultimately, the machines are learning from us.”

Adobe can speak from a position of some strength here, since it decided several years ago to support and build a pathway for tracking how AI has altered images and video content, while training its own AI tools on data that it owns or that has been cleared for use by third parties.

“We have to give them that data,” she says, while also anthropomorphizing the machines. “They’re not at this point in time making up data on their own. They’re simply taking the data that we feed them, breaking it down, and then recreating it from noise.”
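
To make the “breaking it down and recreating it from noise” description concrete, here is a minimal diffusion-style sketch in Python (NumPy only). It is not Firefly’s or any Adobe implementation; the “oracle” noise predictor below is a stand-in for the neural network that a real model would learn from its training data.

```python
# Toy sketch of the "break it down, recreate it from noise" idea: a
# diffusion-style process corrupts data with noise step by step, then a
# predictor reverses the corruption. Illustrative only, not Firefly.
import numpy as np

rng = np.random.default_rng(0)
T = 50                                 # number of noising steps
betas = np.linspace(1e-4, 0.05, T)     # noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

x0 = np.sin(np.linspace(0, 2 * np.pi, 64))   # stand-in for an "image"

def forward_noise(x0, t):
    """Sample x_t from q(x_t | x_0): progressively destroy the data."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1 - alpha_bars[t]) * eps, eps

def oracle_noise_predictor(x_t, t):
    """Stand-in for the trained network: it 'cheats' by peeking at the known
    clean signal x0, purely so this demo runs end to end."""
    return (x_t - np.sqrt(alpha_bars[t]) * x0) / np.sqrt(1 - alpha_bars[t])

# Forward: by the last step the data is near-pure noise.
x, _ = forward_noise(x0, T - 1)

# Reverse: denoise step by step back toward the data.
for t in reversed(range(T)):
    eps_hat = oracle_noise_predictor(x, t)
    x = (x - betas[t] / np.sqrt(1 - alpha_bars[t]) * eps_hat) / np.sqrt(alphas[t])
    if t > 0:
        x += np.sqrt(betas[t]) * rng.standard_normal(x0.shape)

print("mean abs reconstruction error:", np.abs(x - x0).mean())
```

In a production model the oracle is replaced by a network trained on enormous numbers of such noising examples, which is exactly why the provenance of that training data matters.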

Hopper acknowledges the issues that come from feeding the machine data that humans have created.

“One thing to remember is these machines rely on information that humans put out into the world. And humans are biased, whether we try to be or not, we are. Therefore the machines are. We need to do things in order to mitigate that bias.”

Adobe advocates training AI on data that is licensed, has verified ownership, and isn’t copyrighted. It began the Content Authenticity Initiative in 2019 to help avoid some of the deepfake issues that are now surfacing with regularity.

By embedding metadata in the content that’s being created and being able to tag content with “do not train” credentials, it hopes to “actively pursue ways that we can make sure that there are artists’ protections and that creators are being protected.”
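
As a rough illustration of what provenance and “do not train” tagging can look like in practice, here is a hedged Python sketch. The record layout, field names and sidecar-file approach are invented for this example; they are not the actual C2PA / Content Credentials format that Adobe’s initiative specifies.

```python
# Illustrative sketch only: a simplified provenance record in the spirit of
# Content Credentials. Field names ("do_not_train", "edits") are invented
# for this example and are NOT the real C2PA / Content Credentials schema.
import hashlib
import json
from pathlib import Path

def write_credential(image_path: str, creator: str, do_not_train: bool) -> Path:
    """Write a sidecar JSON record binding provenance info to the image bytes."""
    data = Path(image_path).read_bytes()
    record = {
        "asset_sha256": hashlib.sha256(data).hexdigest(),  # ties record to exact bytes
        "creator": creator,
        "do_not_train": do_not_train,          # creator's opt-out preference
        "edits": ["generative_fill"],          # example of disclosing AI edits
    }
    sidecar = Path(image_path + ".credential.json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar

def may_train_on(image_path: str) -> bool:
    """A crawler-side check: respect the opt-out if a credential is present."""
    sidecar = Path(image_path + ".credential.json")
    if not sidecar.exists():
        return True  # no credential found; policy decision left to the crawler
    record = json.loads(sidecar.read_text())
    data = Path(image_path).read_bytes()
    if record["asset_sha256"] != hashlib.sha256(data).hexdigest():
        return False  # credential doesn't match the file; treat as untrusted
    return not record["do_not_train"]
```

In the real system the provenance record is cryptographically signed and typically embedded in the asset itself, which is what makes tampering detectable.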

As other Adobe execs have indicated, there is only so much responsibility a supplier of AI tools is prepared to accept. Consumers need to accept their fair share of responsibility for questioning whether content is “faked.”

Hopper says, “The general population [should be] educated about how to spot a deepfake, how to know if a website is not secure, because the technology to create deepfakes is getting better. And unfortunately, it’s the same technology that’s helping people create new and different content.”

Hopper backs moves by Congress and state legislatures to enshrine protections against deepfakes in law. “But if nothing is done legally, then morality is always a little bit of a slippery slope.”

The argument from Adobe is that human creativity will never be usurped by AI; that it remains a tool to be used as part of a human-led creative process. Hopper is an artist herself, and says she would like to use GenAI tools to help her print 3D designs.

“That’s not to say that I’m going to become a professional 3D artist by any means, but it allows me to work in mediums and media that I wouldn’t be able to, or would have struggled to, previously. And that’s what I’m super excited about.”

Generative AI, she insists, is “super useful” within the ideation phase. “Imagine being able to generate even more ideas and more different directions to be able to come to such a better end goal.”

And the one thing that differentiates human-made from AI-generated content, she says, is our own ability to break the rules.

“Machines don’t know when and how to break rules. They follow the rules. So that’s what we lean into. One of the biggest design principles is you have to learn the rules in order to break the rules, and breaking the rules is what makes something creative and enjoyable. It’s that serendipitous rule breaking that feeds into creativity.”

Just now, your basic GenAI tool cannot “think.” It will spew infinite versions, each one different, of whatever input we give it, based on the data we feed it. That may well change. But Adobe and Hopper look on the bright side. What else can they do?

“In the next 10 years we’re going to see an explosion of more creativity and content and, I think, more awareness. I’m really excited about the possibilities of more immersive design and experiences,” Hopper says. “Like, what happens when you’re potentially interacting with the artist in [a gallery or museum] piece, or you become part of the piece?”

 

