How AI Raises Challenges to Protecting Creators’ Work 


The rise of generative artificial intelligence (AI) technology is posing new challenges to intellectual property law.

The technology is raising concerns about whether existing laws adequately protect creators from threats posed by the booming industry. At the same time, lawmakers are aiming to strike a balance, putting rules of the road in place with patent law in mind while keeping the U.S. competitive on a global scale.

The Senate Judiciary subcommittee on intellectual property held a hearing Wednesday on concerns around patents and innovation, escalating lawmakers’ focus on a wide range of concerns about AI.

During a Judiciary subcommittee hearing on AI threats last month, senators on both sides of the aisle expressed concerns over how AI language models are trained and how that training affects artists and writers.

Growing concerns among lawmakers

Concerns about intellectual property took center stage at Wednesday’s follow-up hearing, as lawmakers weigh how to regulate the industry while keeping the U.S. competitive on a global scale.

Sen. Chris Coons (D-Del.), the chairman of the intellectual property subcommittee, said it is “critical that we include IP considerations in ongoing AI regulatory frameworks.” 

“We should change our patent eligibility laws so that we can protect critical AI innovations,” he said. 

Ranking member of the subcommittee Sen. Thom Tillis (R-N.C.) said the U.S. needs to consider how to regulate in a way that ensures the nation remains a leader in the industry. 

Mike Huppe, CEO of SoundExchange, a nonprofit collective rights management organization that collects and distributes digital performance royalties for sound recordings, told The Hill it is important for lawmakers to understand how AI systems work and the threats they pose. 

“We’re at a point in time where thoughtful legislation and thoughtful guardrails and thoughtful regulation can have a real impact,” Huppe said. 

“Now is the time to be having these discussions, before AI gets too far away from us to have the impact we want those regulations to have,” he added.

How ChatGPT changed the game

The broad concerns around generative AI have been a growing focus for Congress and regulators this year since the launch of OpenAI’s ChatGPT chatbot. The fears stretch beyond intellectual property to potential national security threats, workforce displacement and the spread of dangerous misinformation.

Senators will also convene for three bipartisan briefings on AI in order to “deepen our expertise in this pressing topic.”  

ChatGPT and rival products, as well as image-based generative AI systems such as OpenAI’s DALL-E, are large models trained on massive datasets of text and images. On the intellectual property front, much of the debate has focused on copyright law and how it may apply to the data those models are trained on.

Robert Brauneis, a professor and co-director of the intellectual property program at the George Washington University Law School, said most of the litigation over potential copyright infringement cases against AI companies will be centered on how the “fair use” exception to copyright law is interpreted. 

Who owns AI-created work?

The fair use exception allows the unlicensed use of copyright-protected work in certain circumstances, evaluated under four factors: the purpose and character of the use, the nature of the copyrighted work, the amount and substantiality of the portion used, and the effect of the use upon the potential market for or value of the copyrighted work, according to the U.S. Copyright Office.

Courts may weigh other factors as well, deciding each dispute on a case-by-case basis, according to the Copyright Office.

That legal battle has not yet played out in the courts, Brauneis said.

“I don’t expect Congress to step in until at least the first round of litigation is kind of through, and so that means probably five years at least,” he said. 

“Congress has a lot of things to be worried about, other than copyright, and copyright isn’t the kind of issue that drives most voters to the polls,” Brauneis added. 

As cases make their way through the courts, he said, there could be an argument in favor of training models on existing work by analogy to how humans learn: human artists learn by viewing other artists’ work, and authors learn by reading the works of others.

And with human creators, the copyright analysis doesn’t ask whether the creator has learned from another’s work, but rather whether the output is substantially similar to a previous work. The question turns on the similarity of the output, not the input, he said.

Based on that line of thinking, the argument could be made that “if that’s the way it works with human beings, why should it work any differently with computers,” he said. 

On the other hand, Brauneis said an argument against fair use is that the output of the generative AI is likely to be competing in the same market as the input. 

“If I’m an illustrator, and OpenAI is using my illustrations to train their machine how to generate illustrations, I’m going to find myself competing against that AI,” he said. 

The booming generative AI industry also poses a new question on the output side over who, or rather what, qualifies as a creator eligible for copyright.

At the moment, copyright law holds that only works created by humans are eligible for protection. The question has come up in the past, such as with animal skin prints or when a monkey took a selfie, Huppe said.

If a piece of work is fully generated by AI, U.S. law dictates that it cannot receive a copyright, he said. At the other end of the spectrum, a recording that has only a “little piece of AI in it” may be recognized as a regular recording, he said.

“There’s a big range within that, from 100 percent generative AI to just using a tiny little AI plugin. Where on that spectrum does something become copyrightable under U.S. law, that’s going to be an interesting thing to see develop as well,” Huppe said.

This article was published by The Hill.
