Developments in artificial intelligence (“AI”) have excited people across industries for years; in 2017, PwC projected that AI would contribute over US$15 trillion to the global economy by 2030. Recent advances in generative AI, however, carry particular implications for the entertainment industry.
Stories about ChatGPT, a program that generates conversational text, are on the tip of everyone’s tongue; more than 2 million images per day are being created in response to users’ written prompts on DALL-E; and Google has created an AI tool called MusicLM that can translate written prompts into music. These developments signal new opportunities in entertainment as well as important legal considerations as we navigate the future of AI. This article outlines a few of these key points, many of which will be explored further in later publications.
What is AI?
AI refers to the concept of computer systems and machines being able to perform tasks that would ordinarily require human intelligence. Machine learning is the process of using data sets and algorithms to improve the accuracy of AI over time. For example, the Netflix and TikTok algorithms use machine learning to make predictions about what you want to see. Generative AI refers to the “creative” types of AI applications (such as the examples above), which are trained on large data sets to produce their own unique output, whether in the form of writing, images or music.
Applications in entertainment
Given the above, it is not hard to imagine the applications of AI in the entertainment industry and the related legal issues to navigate.
AI can act as a support in the creative process to introduce speed and efficiency into tasks like writing and editing, whether that is in relation to books, film and television or music. Video games can be programmed to learn and respond in unique ways to player behaviour. Visual and special effects in films and video games could be completely AI-generated.
But this raises the question: if AI is informed by input examples prior to creating its output, what intellectual property rights do the creators of those original inputs retain? If an AI program’s work looks eerily similar to your own, what are your rights? And who is responsible for any infringement? Getty Images, the stock photo company, has filed lawsuits in both the US and UK against Stability AI, the company behind Stable Diffusion (an image-generation tool similar to DALL-E), and the results should shed some light on these issues.
Further, who owns the copyright in AI-generated work: the programmer, the user, or the program itself? Other concerns include privacy and personality rights, which may be triggered when an output depicts a real person. These questions require analysis of copyright concepts like authorship, originality and fair dealing, which are better left for another article.
Beyond copyright issues in creative works, AI has had and will continue to have major implications for the business side of entertainment. For example, AI will assist in targeted marketing and advertising through big-data analysis and algorithms. In relation to employment, businesses will start considering the use of “robot labour,” as well as AI tools in the hiring process. New AI technology acquisitions will raise commercial and legal considerations, and businesses concerned about ESG standards will want to vet and maintain the reliability, transparency and accuracy of the tools they use. Lastly, attention to privacy and cybersecurity will be essential to the proper handling of AI tools, which rely on processing large amounts of data.
The future of AI
At this time, Canada does not have a comprehensive regulatory scheme in place for the AI sector. However, change may be on the horizon: in June 2022, the federal government tabled the Artificial Intelligence and Data Act as part of Bill C-27. Regulators around the world are moving in a similar direction. The EU is working on its AI Act, and in the UK, lawmakers recently walked back a proposed exception to copyright law that would have allowed AI developers to train AI systems on copyrighted content without permission from the owners. As already mentioned, lawsuits are underway, and court decisions may assist in filling legislative gaps.
Beyond the law, optimists may point to the music industry as evidence that technology and creative minds can find ways to address some of the potential downsides of generative AI: streaming platforms like Spotify emerged out of the difficult copyright landscape artists faced in the wake of Napster. With the appropriate controls in place, we can harness the value and opportunities inherent in AI to create a positive impact in the world of entertainment.
Miller Thomson’s Artificial Intelligence (AI) team
Our Artificial Intelligence (AI) lawyers are dedicated to supporting clients in every aspect of AI creation and implementation. With our diverse expertise, we can help businesses take advantage of the exciting developments in this field while protecting them from the risks associated with it. Please reach out to a member of our team with any questions.
Disclaimer
This publication is provided as an information service and may include items reported from other sources. We do not warrant its accuracy. This information is not meant as legal opinion or advice.