Robots Can’t Sign Art: The Unethical Nature of AI Art

The act of pressing a simple button seems like a minute challenge to take on in order to prove you are not a robot. In other variations of this situation, we pick out fire hydrants, electric poles, buses, and other vehicles in a grid of blurred images that robots supposedly cannot parse, forcing them to reveal themselves for what they are. For us humans, handling CAPTCHA prompts to differentiate ourselves from machines is almost a game. The game changes, however, when we realize that even the most human of creations can become ambiguous in its origin.

  • AI art works through a process known as machine learning, which allows a program to “learn” how to make “art” from databases that contain human-made pieces.
  • These art databases are sourced from various publicly accessible platforms that do not necessarily entail any consent or permission from artists for their art to be used or reproduced.
  • IT’S NOT ABOUT MAKING ART. IT’S ABOUT AVOIDING MAKING IT. As these AI models rise to fame for their applications, more and more people, corporations included, are looking into using them to generate creative and visual content.
  • Through AI, art credit is not given where it is due. It is given to nameless machines whose non-existing hands can’t even draw hands YET. Money is made, but not for the artists who contributed to making it.

The notion of art as a human product is currently being challenged by the existence of AI art generators. Their implications for art, and their applications in all aspects of living, may be greeted with child-like excitement over how magnificent such technology is. But as in every sci-fi film involving robots, we can’t always expect a happy ending. The novelty and brilliance of artificial intelligence (AI), especially from a technological perspective, are not in question. It is the great deal of exploitation of human craft that raises the question of ethics in the discourse of AI art. What a machine borrows from artists, it does not return. AI art thus becomes the glorified mask for art theft in the 21st century.

Thus far, the discussion on robots and artificial intelligence has been quite tame because, like the science itself, these technologies are budding at best. There are occasional speculations on what AI could mean for the future, with more extreme discourses on what could happen if machines become sentient, though these scenarios are usually left to the imaginations of sci-fi writers. However, as AI-generated art surfaced on the internet through mainstream applications, including TikTok filters and free-for-all generators online, the discussion practically exploded and polarized people across online platforms.

An interesting point brought up in these conversations is that some AI art spreading on social media seemingly includes illegible signatures on the edges of the pieces, as if an artist actually signed them (Escalante-De Mattei, 2022). Can robots sign art? Like most internet phenomena, these AI-signed pieces have a simple explanation, one that also sheds light on the sinister nature of artificial intelligence art. This discussion will delve into how AI art works, how AI art can be used, and how exactly it can be considered art theft.


First, we must establish how AI art works. Conventional algorithms are written to deliberately perform certain actions and follow explicit rules. AI art algorithms, however, are not written to make the machine follow rules; instead, they are allowed to “learn.” Machine learning lets an AI program grasp certain aesthetics by analyzing thousands of images that already exist (Elgammal, 2019). The program then “creates” images that correspond to the aesthetics it has learned.

There are several types of AI that can create art, most of which are still developing, but these models generally perform select tasks, including processing pictures, recognizing artistic elements, editing existing images, or producing new art. Many popular AI art generators on the market use a GAN, or Generative Adversarial Network, which pits two neural networks against each other: the generator attempts to make original-looking art, while the discriminator, drawing on a database of existing images, judges whether a given image is real or generated. Each network improves by trying to beat the other. Other well-known text-to-image programs include DALL-E 2 and Imagen (Du Plessis, 2022).
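The generator-versus-discriminator tug-of-war can be sketched in a toy form. The snippet below is a minimal illustration, not any real art generator: every name and number here is an invented assumption. A one-line “generator” learns to mimic a simple one-dimensional “real” distribution (standing in for human-made art), while a logistic “discriminator” tries to tell real samples from generated ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def real_batch(n):
    # Stand-in for "human-made art": samples clustered near 4.0.
    return rng.normal(4.0, 1.0, n)

# Generator g(z) = a*z + b, starting far from the real data.
a, b = 1.0, 0.0
# Discriminator d(x) = sigmoid(w*x + c), initially undecided.
w, c = 0.1, 0.0
lr = 0.01

for step in range(2000):
    z = rng.normal(0.0, 1.0, 64)   # random noise in
    x = real_batch(64)             # real samples
    g = a * z + b                  # fake samples out

    d_real = sigmoid(w * x + c)
    d_fake = sigmoid(w * g + c)

    # Discriminator ascends log d(real) + log(1 - d(fake)):
    # learn to score real samples high and fakes low.
    w += lr * np.mean((1 - d_real) * x - d_fake * g)
    c += lr * np.mean((1 - d_real) - d_fake)

    # Generator ascends log d(fake): shift its output toward
    # whatever currently fools the discriminator.
    push = (1 - sigmoid(w * g + c)) * w
    a += lr * np.mean(push * z)
    b += lr * np.mean(push)

# After training, the generator's output mean has drifted from 0
# toward the real data's mean of 4.0.
fake_mean = float(np.mean(a * rng.normal(0.0, 1.0, 10000) + b))
```

The point of the sketch is the feedback loop: the generator never sees the “real art” directly; it only sees the discriminator’s judgments, which were themselves learned from the real data. Real systems replace these two tiny functions with deep networks trained on billions of images.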

Regardless of the AI type, the baseline is that machines learn by being fed. They are fed images that allow them to develop an understanding of how certain art is generated. With further modification of the algorithms, one can simply instruct these machines to produce a set of images corresponding to certain language prompts. This is quite a marvel of technology, but at what cost?

Where things get sketchy

To answer the initial question about AI signatures: no, robots cannot sign art pieces. That’s one less reason to worry about robot invasions. Nonetheless, AI art still poses a problem, as it exploits art created by humans without providing credit or even obtaining consent. As established, AI programs are fed in order to learn. They are fed databases of existing art so they can create “art” of their own.

LAION-5B is one of the data sets used by popular AI art apps, including the infamous Lensa AI. It is maintained by LAION, a German nonprofit, and gathers billions of images from art sites and public websites, each captioned for easier categorization. The platforms this database draws from include publicly accessible sites such as Google Images, DeviantArt, Getty Images, and Pinterest. As any Pinterest board maker who just wanted mood boards and fancam collections knows, sites like Pinterest allow users to upload their own art, photography, and creations for the public to view. However, public visibility does not mean that the artist consents to the reproduction and reuse of their images. AI art generators therefore train on databases of art whose creators never necessarily permitted their work to be used as machine learning material (Penava, 2023).

So this is how AI can “sign” its masterpieces: it can’t, really. It just regurgitates an amalgamation of pieces made by artists who sign their work, producing art that also appears to be signed.

Unfortunately, artists are playing a losing game on the legal side of this conversation. AI apps and databases carry their own legal claims that keep the entire process of AI art generation in ambiguity. Additionally, one can hardly build a foolproof case that their art was stolen when the apps train on billions of images online, drawn from possibly more than thousands of artists.

The irony of it all is that AI programs can produce art that explicitly imitates the styles of specific artists. As mentioned earlier, programs like DALL-E 2 take language prompts and create images that come close to the given descriptions. As such, a user can include an artist’s name in the prompt, steering the AI toward that artist’s work as reference to create a piece in a similar style.

An example of this scenario is the desecration of Kim Jung Gi’s work by AI. Kim Jung Gi, a renowned Korean illustrator known for his elaborate manhwa, or comic book, art style, passed away in October 2022. Following his death, an online user fed his pieces into an AI model that churned out images eerily similar to his work. Others were quick to point out that this was unacceptable and disrespectful to Kim Jung Gi’s family, his life, and his body of work (Deck, 2022).

Source: Rest of World

The fact that AI models claim they do not directly replicate art or artists, even as they produce pieces that can resurrect a late artist’s style, speaks volumes about the unethical nature of this entire process. The exploitation is at once ambiguous and blatant.

Where things get even sketchier

Artificial intelligence has many applications, and most of the studies and products built on it are geared toward helping people. Yet in societies as capitalistic as most can get, that purpose is blurred by the potential for financial gain that artificial intelligence can bring in the future. We already see a glimpse of this scheme in the unethical AI art generators on the market. AI art can be, and already is being, used for commercial purposes.

As an anonymous Twitter user put it, “It’s not about making art, it’s about avoiding making it.” By extension, it is also about making money off of it. Corporations would rather churn out a five-second piece through a cheap AI art subscription than pay actual artists to do the work. It is even sadder to consider that many of the artists these models stole from are also trying to make a living with their art. However indirect the scheme, money still flows to corporations and AI apps from art that artists made. This is no longer the petty cash artists could have earned on commission but the billions of dollars companies could make with the help of AI art.

Recently, even popular personalities such as musician and influencer Jacob Sartorius were criticized for using AI art, in his case to produce the cover art for his latest single, High. An artist under the username @evergreenqveen pointed out that her art of pink clouds and a moon, posted in 2021, looked very similar to Jacob’s cover.

Left: @evergreenqveen’s art; Right: High by Jacob Sartorius cover art

Jacob and his team explained that it was an accident and that they had used AI to generate the album cover. Whether the art was directly stolen remains unclear. Still, it is extremely questionable that an artist as big as Jacob could openly admit to using AI to produce cover art. Either way, an artist like @evergreenqveen, who was stolen from, was still not properly compensated in the process.

Analyzing the brush strokes

As the machine is fed more information and more stolen art, it only gets better over time. That’s how machine learning works. In AI art’s earlier stages, people argued that AI art was not even as good as art made by humans, as if that should foster tolerance for AI art theft. The point does not stand, because AI art can only improve. What artists take years to master is rapidly and easily learned by AI generators. This contrast in skill learning time should not sit right with anyone.

A piece of AI art is remarkable, convenient, and quick. A piece of art made by an artist takes time and thought. AI art strips the humanity from art. Art is defined as the “expression or application of human creative skill and imagination” (Oxford Languages, n.d.). Once we categorize images made by artificial intelligence models as art, we defeat the purpose of creating art as human expression.

AI art, therefore, is not art, nor is it a concept that should passively exist without our objections. It steals from artists around the world while masking itself as a fun pastime or even an innovation with various applications. That may be true, but one of its most evident applications today is art theft for the gain of the people who use it, not the people who actually make art.

Through AI, art credit is not given where it is due. It is given to nameless machines whose non-existing hands can’t even draw hands. Spoiler alert: it probably will not take long before it starts to learn how to draw hands. I am not a robot. I am not a thief. Artists are not robots. Artists are victims of thieves. AI is a robot. Most importantly, AI is a thief. 

Source: Know Your Meme


References

Deck, A. (2022, November 30). AI-generated art sparks furious backlash from Japan’s anime community. Rest of World.

Du Plessis, L. (2022, September 9). What is AI Art? How Artists Use AI, and How To Generate Your Own. Domestika.

Elgammal, A. (2019, June 14). AI Is Blurring the Definition of Artist. American Scientist.

Escalante-De Mattei, S. (2022, December 9). Artists Voice Concerns Over The Signatures In Viral Lensa AI Portraits.

Oxford Languages. (n.d.). Art. Oxford Languages.

Penava, E. (2023, January 24). AI Art Is in Legal Greyscale. The Regulatory Review.

Weekman, K. (2022, December 8). Here’s Why People Are Speaking Out Against AI Art Apps Like Lensa. BuzzFeed News.
