Now that computer-generated imagery is accessible to anyone with a weird idea and an internet connection, the creation of “AI art” is raising questions (and lawsuits). The key questions seem to be 1) how it actually works, 2) what work it can replace, and 3) how the labor of artists can be respected through this change.
The lawsuits over AI turn, in large part, on copyright. Those copyright issues are so complex that we’ve devoted a whole separate post to them. Here, we focus on the thornier non-legal issues.
How Do AI Art Generators Work?
There are two distinct phases in the life of an AI art generator. First comes the training data that teaches it what a “dog” is or, more abstractly, what “anger” looks like. Second come the outputs the machine produces in response to prompts. Early on, when the generator has not had enough training, those outputs only loosely reflect the prompts. But eventually, the generator will have seen enough images to figure out how to respond properly to a prompt (much the way people learn, too). AI-generated creative content can run the gamut from “prompt based on an image I saw in a fever dream” to “very poorly written blog post.”
How Does an AI Art Generator “Learn”?
AI art generators depend on “machine learning.” In a machine learning process, a training algorithm takes in an enormous set of data and analyzes the relationships between its different aspects. An AI art generator is trained on images and on the text that describes those images.
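To make the idea of image-and-text training data concrete, here is a toy sketch in Python. The field names and example entries are hypothetical; real training sets contain billions of image-caption pairs scraped from the web.

```python
# Hypothetical sketch of the kind of training data an AI art
# generator learns from: each example pairs an image with text
# that describes it. Real datasets hold billions of such pairs.
training_pairs = [
    {
        "image_url": "https://example.com/photos/retriever.jpg",
        "caption": "a golden retriever catching a frisbee",
    },
    {
        "image_url": "https://example.com/art/storm.jpg",
        "caption": "an oil painting of an angry sea under storm clouds",
    },
]

# Training repeatedly shows the model these pairs so it learns which
# visual features tend to co-occur with which words ("dog", "anger").
for pair in training_pairs:
    print(pair["caption"], "->", pair["image_url"])
```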
Once it has analyzed the relationships between the words and features of the image data, the generator can use this set of associations to produce new images. This is how it is able to take text input—a “prompt”—like “dog” and generate (that is, “output”) arrangements of pixels that it associates with the word, based on its training data.
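As a rough illustration of that prompt-to-output step, here is a minimal sketch using the open-source Hugging Face diffusers library with a publicly released Stable Diffusion checkpoint. The specific model name is just one example; commercial generators expose their own interfaces.

```python
# Minimal sketch of prompting a trained text-to-image model,
# using the open-source Hugging Face diffusers library.
import torch
from diffusers import StableDiffusionPipeline

# Load a publicly released checkpoint (one example of many).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # runs much faster on a GPU

# The "prompt" is plain text; the output is an arrangement of
# pixels the model associates with those words.
image = pipe("a dog playing in the snow").images[0]
image.save("dog.png")
```

Swapping in a different prompt, random seed, or underlying checkpoint changes the result, which previews the point below: outputs reflect the data and choices baked into the system.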
The nature of these “outputs” depends on the system’s training data, its training model, and the choices its human creators make.
For instance: a model trained by feeding it images labeled with text that appeared close to those
[…]