October 13, 2024

Authors sue Claude AI chatbot creator Anthropic for copyright theft


SAN FRANCISCO, California: A group of authors has sued a San Francisco-based company that bills itself as a more responsible, safety-focused AI developer, alleging copyright infringement through the use of pirated books.

The authors alleged that the startup Anthropic committed “large-scale theft” by training its popular chatbot Claude on pirated copies of copyrighted books.

While competitor OpenAI, maker of ChatGPT, has been facing similar lawsuits for more than a year, this is the first from writers to target Anthropic and its Claude chatbot.

But this week’s lawsuit in a San Francisco federal court alleged that Anthropic’s actions “have made a mockery of its lofty goals” by tapping into repositories of pirated writings to build its AI product.

A trio of writers, Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson, filed the lawsuit seeking to represent a class of similarly situated authors of fiction and nonfiction.

Anthropic is already fighting an earlier lawsuit by major music publishers alleging that Claude regurgitates copyrighted song lyrics.

The authors’ case joins a growing number of lawsuits filed against developers of AI large language models in San Francisco and New York.

OpenAI and its business partner Microsoft are already battling a group of copyright infringement cases led by household names such as John Grisham, Jodi Picoult, and ‘Game of Thrones’ novelist George R. R. Martin; and another set of lawsuits from media outlets such as The New York Times, Chicago Tribune and Mother Jones.

What links all the cases is the claim that technology companies ingested huge troves of human writings to train AI chatbots to produce human-like passages of text without getting permission or compensating the people who wrote the original works. The legal challenges come from writers, visual artists, music labels, and other creators.

Anthropic and other technology companies have argued that training AI models falls under the “fair use” doctrine of U.S. law, which allows limited uses of copyrighted materials, such as for teaching, research, or transforming the copyrighted work into something different.

However, the lawsuit against Anthropic accused the company of using a dataset called The Pile, which includes a trove of pirated books. It also disputed the notion that AI systems learn the way humans do.
