US judge allows company to train AI on copyrighted literary works | Technology News


The ruling sides against writers who alleged the company trained its AI model on their work without their consent.

A United States federal judge has ruled that Anthropic made "fair use" of the books it used to train its artificial intelligence (AI) tools without the authors' permission.

The decision is favorable to AI developers at a time when the implications of AI are being debated by regulators and policymakers, and the industry is using its political influence to push for a looser regulatory framework.

"Like any reader aspiring to be a writer, Anthropic's LLMs (large language models) trained upon works not to race ahead and replicate or supplant them, but to turn a hard corner and create something different," said US District Judge William Alsup.

A group of writers had filed a proposed class-action lawsuit against the company over the use of their books.

But Alsup said the AI system did not violate the safeguards in US copyright law, which are designed "to enable creativity and to enhance scientific progress."

He found that the AI's output was "exceedingly transformative" and therefore protected as "fair use."

Alsup, however, ruled that Anthropic's copying and storage of more than seven million pirated books in a "central library" infringed the authors' copyrights and was not fair use.

The doctrine of fair use, which allows limited use of copyrighted material for creative purposes, has been employed by tech companies as they build generative AI. Technology developers often draw on existing material to train their AI models.

Nevertheless, there is intense debate over whether AI will facilitate greater artistic creativity or enable the mass production of cheap imitations that render artists obsolete, to the benefit of large companies.

The lawsuit, filed by authors Andrea Bartz, Charles Graeber and Kirk Wallace Johnson, alleged that Anthropic's practices amounted to "large-scale theft" and that the company sought to profit from the human expression and ingenuity behind those works.

While Tuesday's decision was seen as a victory for AI developers, Alsup said Anthropic must still face trial in December over the alleged theft of the pirated works.

The judge wrote that the company "had no right to use pirated copies for its central library."


