Anthropic, one of the leaders in the field of generative AI, has presented detailed counterarguments to accusations by several major music publishers and copyright holders that it illegally used copyrighted songs.
Earlier, the music giants Universal Music, Concord Music, and Abkco Music filed a lawsuit against the technology startup. They alleged that, to train its Claude chatbot, Anthropic unlawfully downloaded song lyrics from the internet, and that the neural network then reproduced fragments of those works in conversations with users.
Responding to the plaintiffs' motion for a preliminary injunction, which would effectively block the operation of the Claude chatbot, Anthropic raised several arguments that have already appeared in other disputes over the use of texts to train neural networks.
First, the company argued that the use of song lyrics in building the Claude language model is transformative, since the texts serve not as copyrighted works in themselves but as data from which the neural network learns the characteristics of human language. Moreover, song lyrics make up only a tiny fraction of the overall dataset on which Claude learned to understand and generate natural language.
Second, the company stated that licensing the entire volume of texts needed to properly train such neural networks would be both technically impossible and economically unfeasible. Alternative ways of collecting data for training AI have not yet been developed, though the situation may change in the future.
In addition, Anthropic effectively shifted part of the responsibility onto the plaintiffs themselves. The company argued that the lyrics were extracted from Claude through deliberate "attacks": representatives of the music business, in dialogue with the chatbot, purposefully prompted it to reproduce protected fragments in order to document violations. Thus, according to Anthropic, the content companies themselves contributed to the alleged infringements, while Claude merely responded to requests automatically.
The startup also challenged the music industry's claims of irreparable harm from Anthropic's actions. Judging by the plaintiffs' own position, the losses can be compensated with money, which means the harm is not irreparable. On that basis, the company called a ban on the use of its neural networks an excessively radical measure.
Beyond these arguments, Anthropic raised the issue of jurisdiction, arguing that the lawsuit was filed in an improper venue. The company stated that it has no business ties to Tennessee, where the suit was filed, and that all of the disputed actions took place outside that state. According to Anthropic, any disputes should be heard in California courts, where the company's headquarters and principal operations are located.