NAB
No issue exercises more minds in the industry just now than AI but as
legendary scriptwriter William Goldman once said, “Nobody knows anything.”
That’s because AI is a runaway train, speeding ahead of existing
protections such as those in copyright and contract law, which actors
and writers are demanding be rewritten so that the use of their personal data
is rewarded in perpetuity. AI tools like ChatGPT are also black boxes, even to
the folk at OpenAI who developed them: no one seems sure how they actually
work, let alone what they are capable of.
It feels like a maelstrom right now although more than one commentator
has pointed out that what is actually happening is a good old-fashioned fight
for rights between labor and capital.
In which case, a good deal of what is being played out in the name of AI
is a continuation of existing trends and inequalities that could be dealt with
by existing law, or extensions of it.
Competition
law specialists Geradin Partners write in a blog post, “many of the problems
experienced by media organizations are neither new nor specific to GenAI.
Therefore, in many cases, the solution may not necessarily consist in adopting
new rules, but in sensibly revising and extending the scope of existing rules.”
Another
lawyer, Gregor Pryor of Reed Smith, explained, “AI is pushing
existing legal concepts to their limits, inventing new ones and generally
questioning the relationship between our legal systems and machines in an
unprecedented manner.”
A factor determining the economic impact of generative AI, for example,
is who owns data and models. Geradin Partners notes that ownership of data and
models is often highly centralized, leading to market concentration.
At a recent
government hearing, Senator Cory Booker said, “One of
my biggest concerns about this space is what I’ve already seen in the space of
Web2, Web3 is this massive corporate concentration. It is really terrifying to
see how few companies now control and affect the lives of so many of us. And
these companies are getting bigger and more powerful.”
Alarmingly, OpenAI boss Sam Altman confirmed at the Senate hearing:
“What’s happening in the open source community is amazing, but there will be a
relatively small number of providers that can make models at the cutting edge.”
Given the monopoly of such power, it is likely that concerns about
abuses of dominance and “gatekeeping” will arise, thinks Geradin Partners,
adding that practices such as tying/bundling, self-preferencing, default
settings, and refusal to grant access to data may be employed to strengthen
existing ecosystems. Examples include bundling search or social networks with
generative AI tools, tying cloud services packages to AI services, etc.
Ensuring that generative AI evolves in a manner that is conducive to
intellectual property rights (IPR) protection is arguably one of the greatest
challenges to address from the perspective of the media and creative industries.
“It is widely accepted that inadequate IPR protection chills content
creativity and innovation,” says the law firm. There are established rules for
text and data mining, but Geradin wonders whether, as generative AI services
proliferate and their popularity increases, these remain fit for purpose.
Two of the most debated issues when discussing the regulation of the
digital economy — not just AI — are the lack of transparency underpinning how
technologies and applications work and the limited accountability of their
providers. In the EU, the main instruments that (will) establish rules seeking
to promote transparency and accountability in the digital economy are the
Digital Services Act and the AI Act.
This is
happening as the industry pushes for regulations. Michael Nash, chief digital
officer for Universal Music Group, tells Winston Cho at The
Hollywood Reporter that the practice of training machine learning
models on copyrighted works without permission from or payment to
UMG’s artists “enables us to have a very important seat at the table around the
evolution and use of these models, particularly with respect to developing new
licensing opportunities.”
He underscores that the point of adopting AI is to “put these tools in the
hands of artists” to see “how far their vision can take this technology.”
The Society of Composers and Lyricists, whose members create scores and songs
for film, TV and theater, maintains that AI firms should have to
secure consent from creators for the use of their works to train AI programs
and compensate them at fair market rates for any new work created from them,
on top of providing proper credit, Cho reports. The SCL
stresses that any regulatory framework should not grant copyright protection to
AI-generated works since doing so could flood the market with them, diluting
the value of original pieces.
According
to THR, UMG has been sending requests to take down AI-generated
songs, but is fighting “an entire online community dedicated to making, sharing
and teaching others how to create AI music.”
On Discord, members of a server called AI Hub released an album in April
called UTOP-AI — a play on an upcoming project from Travis Scott — featuring
the AI-generated voices of the rapper along with Drake, Playboi Carti and Baby
Keem. It got nearly 200,000 views on YouTube and Soundcloud in just three hours
before it was flagged for copyright infringement by Warner Music Group.
Tech companies entrenched in the M&E industry are taking a cautious
approach. Some studios, Pixar among them, are building their own generative AI
models but training them on their own back catalog of films. Others like
Shutterstock, Valve and Adobe are flagging copyright concerns as a selling
point.
Like Disney
and other large studios, however, Adobe and Shutterstock are in the fortunate
position of owning databanks of images, concept art and videos to train new AI
models. Because they can be absolutely sure where the assets used come from,
they can offer indemnification against any copyright
lawsuits brought against their users.
“It is an
extremely smart marketing technique for the companies, as they are both
highlighting a fundamental issue with AI and highlighting how, by their nature,
that issue will never arise for its users,” says Chris
Sutcliffe at The Drum.
Valve,
meanwhile, said it would not be hosting games that use AI-generated assets on
its Steam platform. The company clarified to Victoria
Kennedy at Eurogamer that its decision to
delist a game created by a solo developer due to its use of AI-generated assets
was not simply its opinion on the tech, but a reflection of how it interprets
the current copyright laws.
A team of
14 legal experts across disciplines has just published a paper on generative AI
in Science magazine. One of the key questions that emerged was
whether copyright laws can adequately deal with the unique challenges of
generative AI.
“Generative
AI might seem unprecedented, but history can act as a guide,” three of the
paper’s authors conclude in an essay written for The
Conversation. For example, the US Copyright Office has stated unequivocally that
only humans can hold copyrights.
Matters are more complicated when outputs resemble works in the training
data. If the resemblance is based only on general style or content, it is
unlikely to violate copyright, because style is not copyrightable.
Even here there could be a solution. Since copyright law tends to favor
an all-or-nothing approach, scholars at Harvard Law School have proposed new
models of joint ownership that allow artists to gain some rights in outputs
that resemble their works.
The issue is complex in the extreme, but writers and actors (possibly
directors and producers and production designers and concept artists and
costume and make-up designers down the line) understandably want some assurance
now that they are not going to be taken for a ride in future.
It almost requires an AI to compute all the possible consequences of how
the technology might be used, and that use is being interpreted by humans with
feelings and emotions and fears and values which are in some ways
unpredictable and inconsistent with binary logic.
Back to
Goldman. Would an AI really have devised the sparse but brilliant script
for Butch Cassidy and the Sundance Kid, the film that set the
template for every buddy movie since?
“Kid, the
next time I say, ‘Let’s go some place like Bolivia,’ let’s go some
place like Bolivia.”