
New AI Lawsuit Filed by Authors Against Anthropic Targets Expanding Licensing Market for Copyrighted Content

Authors accuse Anthropic of illegally copying their books to train AI chatbot Claude, sparking legal debate on AI copyright infringement and fair use.
Aug 21, 2024
Authors Sue Anthropic Over Copyright Licensing

Over the last two years, there has been a growing market for licensing copyrighted material to train artificial intelligence systems. OpenAI was the first to make deals with publications such as Axel Springer, News Corp., and the Associated Press, with others following suit.

When artificial intelligence firms first faced litigation for widespread infringement, these agreements were not yet in place. Now, lawsuits are increasingly pointing to this licensing market to argue that AI companies illegally use creators’ works.

In a proposed class action filed on Monday evening, authors accused Anthropic of illegally downloading and replicating their books to power its AI chatbot, Claude. The lawsuit alleges that the Amazon-backed company “seized a licensing market for copyright owners.”

Unless Congress intervenes, the courts will likely decide on the legality of using copyrighted works in training datasets. The question will probably be answered by considering fair use, which offers protection for using copyrighted material to create a secondary work as long as it’s “transformative.” It remains a primary point of contention as the technology moves toward mainstream adoption.

The authors’ lawsuit indirectly challenges AI firms’ claims that their actions are covered by fair use. According to the complaint, by not obtaining licenses for the content used to build Claude, Anthropic is undermining an existing market that other AI companies have established.

The allegations appear aimed at weakening a fair use defense, which was narrowed when the Supreme Court issued its decision in Andy Warhol Foundation for the Visual Arts v. Goldsmith. In that case, the majority held that assessing whether an allegedly infringing work was sufficiently modified must be balanced against the “commercial nature of the use.” The authors and other creators in similar suits seek to use that ruling to argue that AI companies’ actions illegally restrict their ability to profit from their material.

The lawsuit states, “Instead of using training material from pirated collections of copyrighted books similar to modern-day Napster, Anthropic could have sought and obtained a license to make copies of those books. Instead, the company made a deliberate decision to take shortcuts and rely on stolen material to train their models.”
