Daily Mail prepares for legal battle with Google over AI copyright
This is not the end of the debate, though, as the borders blur in generative AI tools that can take a human-made work as a base and modify it to create a new artwork. Thaler has previously suggested that AI-generated content can be checked for originality by manually looking through patent filings, or by using Google reverse image search for artworks. However, unlike other recent AI copyright cases, where the aim is to establish precedent that will make it easier to commercialise partly AI-generated works, Thaler wants to establish his AI as the sole, valid author. A US court has ruled that AI-generated art cannot be copyrighted, in a case related to one currently heading through the UK Supreme Court.
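The reverse-image-search idea amounts to duplicate detection: compare a generated image against a corpus of known works and flag near-matches. As a hedged illustration only (not Thaler's or any search engine's actual method), the sketch below uses a tiny "average hash" over 8×8 grayscale grids; all function names and the threshold are hypothetical, and production systems would use far more robust perceptual hashing.

```python
# Hypothetical sketch: flag a generated image as potentially derivative by
# comparing a crude "average hash" against known works. Images here are
# plain 8x8 grids of grayscale values (0-255); real pipelines use robust
# perceptual hashing or reverse image search.

def average_hash(pixels):
    """Return a 64-bit hash: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def looks_derivative(candidate, known_works, threshold=10):
    """Flag the candidate if any known work's hash is within the threshold."""
    c = average_hash(candidate)
    return any(hamming(c, average_hash(w)) <= threshold for w in known_works)

# A work compared against itself is trivially flagged; a tonally inverted
# gradient flips every hash bit, so it is not.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
inverted = [[255 - p for p in row] for row in original]
print(looks_derivative(original, [original]))  # True
print(looks_derivative(inverted, [original]))  # False
```

The Hamming-distance threshold is the key design choice: too tight and trivial edits evade detection, too loose and unrelated works are flagged, which is precisely why such automated checks can only supplement, not settle, an originality dispute.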
It’s probably no exaggeration to say artificial intelligence (AI) exploded into the public consciousness in late 2022 and early 2023. ChatGPT quickly captured the public imagination because of its ease of use and broad potential application. Organisations will also need to address the evolving threat landscape created by the adoption of generative AI, as they ingest far greater volumes of data to train their models and cyber attackers are reported to be using AI to create more sophisticated attack methods. The EU is progressing the world’s first bespoke legislation for AI regulation, with its AI Act expected to be passed later this year. Diverging from the EU, the UK is aiming to take a more pro-innovation approach – and to lead on safety, hosting the first major global summit on AI safety later this year.
The fast-growing field of generative AI has raised novel intellectual property issues. The Copyright Office has also rejected an artist’s bid for copyrights on images generated through the AI system Midjourney despite the artist’s argument that the system was part of their creative process. The intersection of generative AI and copyright laws presents a multitude of concerns and challenges. The primary concern lies in the potential for AI to inadvertently infringe upon copyright-protected material as it consumes more of such data for the development of algorithms. Balancing this risk with the tremendous potential for innovation that AI offers is a complex task, but a necessary one.
At stake are both compensation and control – with artists understandably believing they should have the right to decide how their work is used and to be financially compensated for its use. So let’s take a look at some of the issues involved and some of the arguments that could have an impact on the future roles of both AI and art in society. The rapid rise of chatbots such as ChatGPT and Bard has sparked particular concern in the news industry due to fears that tech firms have already carried out copyright violations on a vast scale. Earlier this year, Getty Images sued the owner of image-generation tool Stable Diffusion, accusing it of stealing more than 12 million of its copyrighted photos to help train its software. As a further point of concern, generative AI systems can be used to identify bugs in code, facilitating potential cyber-attacks, including ransomware attacks.
Publishers must be fairly compensated for the tremendous value their content contributes to the development of generative AI technology. Emerging technologies such as AI must respect publishers’ intellectual property (IP), brands, reader relationships, and investments made in creating quality journalistic and creative content. Unfortunately, history has shown that new generative AI technologies are rarely limited by regulation in the creative space. So perhaps, instead of constraining the growth of AI music, managing its outcomes and leveraging it for enhanced creativity may be the more pragmatic answer. In this regard, human-crafted music remains paramount, as affiliations with cultural icons facilitate a deeper connection with audiences.
This was originally introduced in EU Directives on software and databases but has since been applied more broadly to encompass copyright works beyond software and databases (see, for example, the Painer and Cofemel judgments). The “author’s own intellectual creation” is generally regarded as requiring a higher standard of originality than the English case law standard. Many commentators consider that AI-created works that lack a human author cannot meet this higher standard. Since AI-generated content does not have a human author in the traditional sense, determining copyright ownership and establishing the content’s protected status can be challenging.
Small Developers and Licensing
It’s helped businesses to overcome challenges such as processing terabytes of data and has taken over multiple mundane tasks from human workers. Tools like ChatGPT are capable of writing articles, stories, and poems at an often convincingly human level. These AI models that create new content are called ‘generative AI’ or ‘creative AI’. Art-focused generative AI models can compose impressive pictures, songs, and even videos. Yet, AI-driven music creation poses competition for creators, and raises concerns about IP protection and monetisation. While existing laws in most countries shield melodies, lyrics and chords, safeguarding thousands of AI-generated variations and preventing unauthorised usage presents challenges.
This came up recently in the case of an AI-generated comic book that was denied copyright protection in the US. As the use of artificial intelligence (AI) to generate new works has expanded rapidly, the US Copyright Office has sought to keep pace by issuing guidance on the application of copyright law to such works. On April 20, 2023, the Copyright Office published a letter (the MLC Letter) sent to the CEO of the Mechanical Licensing Collective (MLC) providing guidance on how the MLC should handle royalty distributions for musical works created through the use of generative AI. The Copyright Act provides that a ‘mechanical licensing collective’ designated by the Register of Copyrights collect and process such royalties.
AI cannot generate copyrightable material, say US judges
There are many proposed uses for the technology, but its impressive capabilities raise important questions about ownership of the content. This may be some bias on my part, but I find this far less compelling than the previous argument. It’s also revealed another problem – as the work generated does not have a human author, it’s not eligible for copyright.
For this reason, it is important that, at least in the near future, AI is monitored by humans. The focus of this article is on the legal issues related to content-generating AI such as ChatGPT and DALL-E. There is wide disparity in the scope of exceptions in national copyright laws permitting copying for the purpose of training AI. There is also the risk that, after being trained on copyrighted material, legally or otherwise, an AI produces material that itself infringes copyright.
Will Copyright Issues Get Tougher When Humans and AI Work Together?
Where a literary, dramatic, musical or artistic work is made by an employee in the course of their employment, their employer is the first owner of any copyright in the work – subject to any agreement to the contrary. This article is for general information purposes only and does not constitute legal or professional advice. It should not be used as a substitute for legal advice relating to your particular circumstances. ChatGPT clearly thinks that copyright material should be excluded from the training dataset if the copyright holder cannot be contacted. This reminds me of a debate I was involved in recently online regarding StabilityAI introducing support for an “opt-out” tag that could be used to prevent your content from being scraped into their dataset. Meanwhile, several lawsuits have alleged that the way generative AI systems are trained on vast amounts of existing material amounts to large-scale piracy.
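An opt-out tag of this kind could, in principle, be honoured by a scraper before a page is added to a training dataset. The sketch below is a hypothetical illustration only: the `noai` directive name mirrors community proposals for such tags, and is an assumption rather than StabilityAI's documented mechanism.

```python
# Hypothetical sketch of a scraper honouring an opt-out signal before adding
# a page to a training dataset. The "noai" directive name follows community
# proposals for opt-out meta tags; it is an assumption, not StabilityAI's
# documented mechanism.
from html.parser import HTMLParser

class OptOutScanner(HTMLParser):
    """Look for <meta name="robots" content="...noai..."> style directives."""

    def __init__(self):
        super().__init__()
        self.opted_out = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        content = (a.get("content") or "").lower()
        directives = [d.strip() for d in content.split(",")]
        if name == "robots" and "noai" in directives:
            self.opted_out = True

def may_ingest(html_page):
    """Return False when the page carries an opt-out directive."""
    scanner = OptOutScanner()
    scanner.feed(html_page)
    return not scanner.opted_out

print(may_ingest('<meta name="robots" content="index, noai">'))  # False
print(may_ingest('<html><head><title>ok</title></head></html>'))  # True
```

Note that, like robots.txt, such a tag is purely advisory: it only constrains crawlers that choose to check for it, which is exactly why its legal weight is part of the debate.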
- This article will explore the major shortcomings of the Arrangement Model in attributing copyright to AI-generated works.
- It doesn’t rely on direct instructions carefully written into a program by a programmer, which provide precise steps for the machine to follow to complete the task.
- Augusto Preta is a consultant, economist and market analyst, with long-established experience in the field of content media and digital markets.
This murkiness complicates the matter further, highlighting the potential pitfalls of feeding more copyright-protected material into AI systems without robust regulatory safeguards. Understanding both the functionality of generative AI and the nuances of the relaxed UK copyright laws is the foundation for addressing concerns about copyright infringement in the era of AI. Secondary infringement can occur if a user exploits output obtained from an AI where the creation of that output was itself a copyright infringement. In this situation, it is possible that the user could be liable for copyright infringement.