Artists are understandably concerned about the possibility that automatic image generators like Stable Diffusion will undercut the market for their work. We live in a society that does not support people who are automated out of a job, and being a visual artist is an already precarious career.
In this context, it’s natural to look to copyright law, because copyright is supposed to help ensure that artists get paid for their work. Unfortunately, one copyright theory advanced in a class-action lawsuit by some artists against Stable Diffusion is extremely dangerous for human creators. Other theories—both in that lawsuit and another suit by Getty Images—propose to alter and expand copyright restrictions in ways that would interfere with research, search engines, and the ability to make new technology interoperate with old.
This legal analysis is a companion piece to our post describing AI image-generating technology and how we see its potential risks and benefits. We suggest that you read that post first for context, then come back to this one for our view on how the copyright questions play out under U.S. law.
Copyright law is supposed to embody a balance between giving artists a sufficient incentive to create, by granting them control of some of the ways their art can be used, and giving the public the right to use and build on that art in new and interesting ways. Here, the question is whether those who own the copyright in the images used to train the AI generator model have a right to prohibit this kind of use.
To answer that question, let’s start with a few basic principles.
First, copyright law doesn’t prevent you from making factual observations about a work or copying the facts embodied in a work (this is called the “idea/expression distinction”). Rather, copyright forbids you from copying the work’s creative expression in a way that could substitute for the original, and from making “derivative works” when those works copy too much of the original’s creative expression.