AI's recent surge in popularity generated art that sometimes looked like someone else's. How can gamers use it ethically?
The Problem
Because what we term "AI" is actually a set of Large Language Models (LLMs), the "intelligence" part of "Artificial Intelligence" is really us. LLMs use data sets to generate their content, much of it publicly sourced from what's freely accessible on the Internet. And that's where AI art gets into trouble.

AI-generated art blends its data set into something recognizably similar to user-entered parameters but (according to AI developers) uniquely different. The problem is that the art is often TOO similar: so similar that it looks just like an artist's work, right down to faked signatures.
Which raises a legitimate concern: if AI art can effectively mimic an artist's style for free, will anyone still pay the artist?
How Did We Get Here?
Part of the problem is that artists advertise their work by sharing it for free on the Internet. In the physical world, an artist might hang art at a booth; potential customers walk away with only the memory of that art, not a copy.

But on the Internet, everything is copied for future reference. Google's image searches can dig deep into sites to find pictures independent of their creators' sites. That said, Google doesn't store copies (a fact that was critical in a court decision). Pinterest, however, does.
Pinterest doesn't just store a thumbnail graphic, it stores a full-sized copy. By merely pinning a graphic, users are unwittingly giving Pinterest advertising revenue and potentially violating copyrights. The most common example is the "phantom pin," in which a pin no longer links back to its source site, keeping a photo on the Internet long after the artist has revoked permission.
Unfortunately, court cases have not swung in favor of artists, ruling that it's the people pinning the content, not the site, who are the problem. This is all coming to a head because some art LLMs use Pinterest as a data set, thereby creating content "inspired" by artists who never consented to their art being used in the first place.
What to Do About It
The biggest problem with AI art is the kind that's generated from scratch. This is the type that uses Pinterest to generate its images. Fantasy art in particular is dominated by Magic: The Gathering, and it's not uncommon to ask an AI for a monster only to be served up what looks like card art.

Similarly, it's nearly impossible to give a creature spider-like characteristics without getting Spider-Man's red-and-black web pattern and large white eyes. Spider-Man is so popular as art that he has effectively replaced what real spiders look like on the Internet, warping AI's perception of what "spider-like" means.
The obvious answer for game developers is to not use AI-generated art. Paizo won't. Wizards of the Coast won't. Most other major RPG publishers won't. This is important because these statements aren't just a commitment to artistic ethics: they mean these companies will continue paying artists for their work.
But there are other ways that art can be ethically sourced. One is to use AI to modify existing art so it looks like a different style. I'm particularly fond of taking art I've created (and own) and asking an AI to make it look more realistic. Similarly, you can apply these types of AI filters to works that were intentionally released into the public domain with clear licenses. Used this way, AI can turn clipart into three-dimensional monsters and characters, or turn a standard creature into something more exotic: a bull can become a metal gorgon, a bird a phoenix, a human bard an undead bard.
For game masters who are using art for their home games, AI art can act as a tool to illustrate what's happening in a game: character portraits, maps, landscapes, monsters, and magic items.
For artists, offering free content to potential customers now comes with significant risk. It's always been possible for users to simply steal art, but thanks to AI it can now be stolen at scale without being traced back to its original owners. AI isn't currently required to show its homework, and until it is, there's a legitimate argument that posting anything for free is no longer worth the risk. A login or paywall may be increasingly necessary for artists to balance advertising their services with protecting their work.
Unfortunately for many artists, it may already be too late. Even if you take your art down today, Pinterest has saved it without your consent, and LLMs are using that data to build their art without proving where they got it. For publishers, declaring when and where AI art is used (or not used) is an important first step.
But the group most influential in the future of AI art is us. Perhaps the best we can do is ask for AI art to be labeled and then make our own decisions about whether or not to purchase it.