Music Creators Want Consent in the AI Age, But Developers Find Safe Havens Abroad
It's creatives vs. computers as artists fight to protect their copyrights.
When Universal Music Group emailed Spotify, Apple Music and other streaming services in March asking them to stop artificial-intelligence companies from using its labels’ recordings to train their machine-learning software, it fired the first Howitzer shell of what’s shaping up as the next conflict between creators and computers. As Warner Music Group, HYBE, ByteDance, Spotify and other industry giants invest in AI development, along with a plethora of small startups, artists and songwriters are clamoring for protection against developers that use music created by professionals to train AI algorithms. Developers, meanwhile, are looking for safe havens where they can continue their work unfettered by government interference.
To someday generate music that rivals the work of human creators, AI models use machine-learning to identify and mimic the characteristics that make a song irresistible, like the sticky verse-chorus structure of pop, the 808 drums that define the rhythm of hip-hop or the meteoric drop at the heart of electronic dance music. These are distinctions human musicians spend their lives learning, whether through osmosis or formal music education.
Machine-learning is exponentially faster, though; it typically involves feeding millions, even billions, of so-called “inputs” into an AI model to build its musical vocabulary. Because of the sheer scale of data needed to train current systems, that data almost always includes the work of professionals, and, to many copyright owners’ dismay, almost no one asks their permission to use it.
Countries around the world take different approaches to regulating what’s called the text and data mining of copyrighted material for AI training. And some territories are concluding that fewer rules will lead to more business.
China, Israel, Japan, South Korea and Singapore are among the countries that have largely positioned themselves as safe havens for AI companies in terms of industry-friendly regulation. In January, Israel’s Ministry of Justice defined its stance on the issue, saying that “lifting the copyright uncertainties that surround this issue [of training AI generators] can spur innovation and maximize the competitiveness of Israeli-based enterprises in both [machine-learning] and content creation.”
Singapore also “certainly strives to be a hub for AI,” says Bryan Tan, attorney and partner at Reed Smith, which has an office there. “It’s one of the most permissive places. But having said that, I think the world changes very quickly,” Tan says. He adds that even in countries where exceptions in copyright for text and data mining are established, there is a chance that developments in the fast-evolving AI sector could lead to change.
In the United States, Amir Ghavi, a partner at Fried Frank who is representing open-source text-to-image developer Stability AI in a number of upcoming landmark cases, says that though the country has a “strong tradition of fair use … this is all playing out in real time,” with decisions in cases like his set to establish significant precedents for AI and copyright law.
Many rights owners, including musicians like Helienne Lindvall, president of the European Composers and Songwriters Alliance, are hoping to establish consent as a basic practice. But, she asks, “How do you know when AI has used your work?”
AI companies tend to keep their training process secret, but Mat Dryhurst, a musician, podcast host and co-founder of music technology company Spawning, says many rely on just a few data sets, such as LAION-5B (as in 5 billion data points) and Common Crawl, an openly available repository of web-crawl data. To help establish a compromise between copyright owners and AI developers, Spawning has created a website called HaveIBeenTrained.com, which helps creators determine whether their work is found in these common data sets and, free of charge, opt out of being used as fodder for training.
These requests are not backed by law, although Dryhurst says, “We think it’s in every AI organization’s best interest to respect our active opt-outs. One, because this is the right thing to do, and two, because the legality of this varies territory to territory. This is safer legally for AI companies, and we don’t charge them to partner with us. We do the work for them.”
The concept of opting out was first popularized by the European Union’s Copyright Directive, passed in 2019. Though Sophie Goossens, a partner at Reed Smith who works in Paris and London on entertainment, media and technology law, says the definition of “opt out” was initially vague, its inclusion makes the EU one of the strictest jurisdictions when it comes to AI training.
There is a fear, however, that passing strict AI copyright regulations could result in a country missing the opportunity to establish itself as a next-generation Silicon Valley and reap the economic benefits that would follow. Russian President Vladimir Putin believes the stakes are even higher. In 2017, he stated that the nation that leads in AI “will be the ruler of the world.” The United Kingdom’s Intellectual Property Office seemed to be moving in that direction when it published a statement last summer recommending that text and data mining be exempt from opt-outs in hopes of becoming Europe’s haven for AI. In February, however, the British government put the brakes on the IPO’s proposal, leaving its future uncertain.
Lindvall and others in the music industry say they are fighting for even better standards. “We don’t want to opt out, we want to opt in,” she says. “Then we want a clear structure for remuneration.”
The lion’s share of U.S.-based music and entertainment organizations — more than 40, including ASCAP, BMI, RIAA, SESAC and the National Music Publishers’ Association — agree and recently launched the Human Artistry Campaign, which lays out seven principles for AI best practices intended to protect creators’ copyrights. No. 4: “Governments should not create new copyright or other IP exemptions that allow AI developers to exploit creators without permission or compensation.”
Today, the idea that rights holders could license their works for machine-learning still seems far off. Among the potential solutions for remuneration are blanket licenses or something like the blank-tape levies used in parts of Europe. But given the patchwork of international law on the subject, and the complexity of tracking down and paying rights holders, some worry these fixes are not viable.
Dryhurst says he and the Spawning team are working on a concrete solution: an “opt in” tool. Stability AI has signed on as its first partner for this innovation, and Dryhurst says the newest version of its text-to-image AI software, Stable Diffusion 3, will not include any of the 78 million artworks that opted out prior to this advancement. “This is a win,” he says. “I am really hopeful others will follow suit.”