The money and attention swirling around artificial intelligence have attracted a pitch almost as old as copyright itself: in the midst of uncertainty, precarity, and technological change, an individual author can leverage their copyrights to stop change, to cash in, or both. Among the folks dining out on this idea is Ed Newton-Rex, UK-based propagator of the “Statement on AI Training” as well as the “Is This What We Want?” silent album, and founder of the “Fairly Trained” non-profit. Catering to technopanics with copyright delusions, Newton-Rex and co. are selling snake oil to artists and policymakers alike.
The AI Statement preys on artists’ uncertainty about the AI future by proclaiming that “unlicensed AI is a major, unjust threat to [their] livelihoods.” The alleged major threat is AI-powered competition: writers, artists, and musicians will get fewer jobs because other people (who are assumed not to be writers, artists, or musicians) will use AI to create works that are better, cheaper, or both. Newton-Rex argues that because an AI tool could be used to compete with the creators on whose works it was trained, the training process can’t be a fair use.
Their proposed solution is licensing. By their lights, “fairly trained” means training only on licensed or public domain data. Licensing, they say, gives copyright holders the opportunity to withhold consent or to demand payment for the use of their works in training.
Here’s the contradiction: If you think AI poses a “threat” because it will enable more competition for creative jobs, what is it about licensing that would prevent that outcome? Indeed, these same advocates for licensing say it’s entirely possible to train fully capable AI tools using only licensed materials. (This is false, at least for now, but leave that aside.) In that case, licensed AI tools would be just as capable of enabling competitors to make better, cheaper creative work. Licensing does nothing to reduce the so-called “threat.” No single work or body of work has much value to AI training, so withholding permission won’t stop the machine, and granting it will earn each individual artist a pittance. The “fairly trained” crowd is selling a non sequitur: an allegedly existential threat, resolved by giving the alleged victims a worthless veto or a few micropennies.
The only real winners in the “fairly trained” scenario are the usual copyright maximalist suspects – mega-corporations and aggregators who hold enough copyrights to extract more than a few dollars from fearful developers, with individual creators either shut out or paid a few crumbs from the table. The victims in this scheme are the public, including creators themselves, who are worse off in a world with fewer, less innovative, more expensive AI tools to help power their own work, because non-profit researchers, start-ups, and small and mid-sized technology developers can’t afford to pay the copyright protection money.