The Eagle
Delivering American University's news and views since 1925
Friday, June 21, 2024

Op-Ed: AI art is art theft and should be a crime

As uncertainty around AI grows, artists need legal protections

The following piece is an opinion and does not reflect the views of The Eagle and its staff. All opinions are edited for grammar, style and argument structure and fact-checked, but the opinions are the writer’s own.

A U.S. District Court ruled last August that art generated by artificial intelligence can’t be copyrighted. In a separate case, a judge came close to dismissing a lawsuit brought by three artists who sued AI companies for using their work. Lawmakers and courts have failed to protect artists and their work from being stolen by AI companies for profit.

These two cases come as AI rapidly evolves and conversations around how to manage it become more important by the day. As AI models continue to adapt, lawmakers must set guidelines to protect artists’ work. 

AI models are trained to create images using real artists’ work. When they do, artists receive no compensation for work that is essentially copied and stolen from them. Generating art this way should be a crime, at least while AI models are trained like this.

Policymakers need to set legal guidance, and the creators of these AI models need to be held accountable.

Currently, companies like Midjourney and Stability AI, maker of Stable Diffusion, train their generative AI models by compiling massive numbers of images and artworks from the internet, a practice known as data scraping. These images are then fed into the AI, giving it the ability to generate art of its own.

The models distill images into a series of data points. While the images pulled from the internet are not stored permanently, temporary copies are made. Many of these images are copyrighted by their original artists, which supposedly protects the works from being reproduced electronically without permission.

Artists are not compensated or asked for permission when AI companies take their art and use it to train AI models. These models come into direct competition with artists, who often work independently on a commission basis. AI-generated art threatens artists’ livelihoods as they try to find work.

These AI models belong to companies. They may be in the research stage for now, and some may be free for the public to use, but this will not last. Many programs, like ChatGPT, already have paid versions. The Midjourney art-generating AI model is no longer free to use. When these tech companies profit from their AI models, they are profiting from the artworks made by the artists whose works they used to train those models. And artists have not consented to take part in this process of creation.

AI is a product, not a person, and it’s certainly not an artist. But there is a way to fix all of this.

AI companies could train their models with free-to-use stock art images, or they could pay for the art they use to train their AI models. Until a more ethical way to train AI models emerges, the current way these models are made must be stopped.

Research and exploration into AI should not end. There is a valuable place for AI. But humans must decide what role it will play and how it should be used — this includes how AI is taught. 

Zachary Olson is a senior in the School of Communication at American University.

This article was edited by Jelinda Montes, Alexis Bernstein and Abigail Pritchard. Copy editing by Isabelle Kravis and Charlie Mennuti.


All Content © 2024 The Eagle, American University Student Media