According to the lawsuit, OpenAI trained its AI models on The New York Times' copyrighted works without permission.

The New York Times has launched a lawsuit against OpenAI and Microsoft, alleging that millions of their articles were improperly used to train AI models that have become direct competitors in the current information and news environment.

The lawsuit says OpenAI is “using The Times’ content without payment to create products that replace The Times and steal its audience.” The legal action highlights growing concerns about the use of copyrighted material in the development of artificial intelligence tools. If The Times prevails in court, the ruling could shape the future landscape of digital content and intellectual property rights.

“OpenAI and Microsoft have built businesses worth tens of billions of dollars by taking the combined works of humanity without permission,” the suit reads, adding that copyright law “is designed to protect protectable elements of expression such as style, word choice, arrangement and presentation of facts.”

Large language models (LLMs), such as the ones behind OpenAI’s ChatGPT, are at the center of this controversy. LLMs are trained on massive data sets, including text from books, websites, and articles, to understand and produce language in a human-like way. They do not retain specific articles or data; instead, they use the text to learn patterns and the structure of information. This training allows them to create content across a variety of topics and styles, and to enter areas traditionally reserved for human experts.
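The distinction between storing articles and learning statistical patterns can be illustrated with a toy sketch. This is not how OpenAI's models work in detail — GPT-style models are large neural networks — but a minimal bigram model shows the same principle: the training text is discarded, and only word-to-word transition statistics remain.

```python
from collections import Counter, defaultdict

# Toy illustration (not OpenAI's actual method): a bigram "language model"
# counts which word follows which in the training text. After training,
# only these counts remain -- not the original document.
corpus = "the model learns patterns the model predicts the next word".split()

transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1  # record one observed transition

def predict_next(word):
    """Return the word most often seen after `word` during training."""
    followers = transitions[word]
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))  # "model" -- the most frequent follower of "the"
```

Real LLMs replace these counts with billions of learned parameters, which is precisely what the lawsuit disputes: whether patterns distilled from copyrighted text still constitute copying.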

However, The New York Times claims that OpenAI gave its articles special weight when shaping the character of its models. “While defendants engaged in wide-scale copying from many sources, they gave Times content particular emphasis when building their LLMs, revealing a preference that recognizes the value of those works,” the suit reads.

Considering the millions of works on which OpenAI’s LLMs have been trained, it is perhaps unsurprising that this is not the first legal challenge for OpenAI or the broader generative AI community. Recently, a group of high-profile authors, including Pulitzer Prize winners Taylor Branch, Stacy Schiff, and Kai Bird, filed a lawsuit led by writer Julian Sancton against OpenAI over similar claims of unauthorized use of their work. The lawsuit highlights a growing trend of creators and experts pushing back against AI’s free access to their intellectual property.

The value of originality

The landscape of generative AI is not limited to text. The advancement of AI art has been controversial, with several lawsuits filed over the copyright implications of AI-generated artwork in fields such as film, music, and illustration. However, some of these cases have been dismissed, indicating that the legal understanding of AI’s creative abilities and their relationship to existing copyright law is complex and evolving.

The New York Times’ lawsuit is particularly significant because The Times is the first major media organization to directly challenge the tech giants over alleged unauthorized use of its content. The lawsuit does not specify a monetary amount, but suggests the infringement resulted in significant damages, necessitating substantial compensation and corrective action.

“Without extensive copyrighted material available to exploit, there would be no ChatGPT,” the lawsuit claims. “Defendants’ commercial success was possible because they copied and digested protected and copyrighted expression contained in billions of pages of actual text, across millions of copyrighted works, all without paying authors and rights holders a penny.”

The broader implications of this lawsuit extend to how AI companies can continue to access and use existing content. The legal challenge raised by the New York Times against OpenAI and Microsoft has set the stage for a broader conversation about the intersection of technology, law, and creative rights. The lawsuit highlights content creators’ concerns about the threat of AI-based competition.

“If The Times and other news organizations are unable to produce and protect independent journalism, a vacuum will be created that no computer or artificial intelligence can fill,” The Times claims in the lawsuit, warning that the cost would be huge: for The Times, fewer journalists; for society, less of the independent reporting it relies on.

Edited by Stacey Elliott.
