My Unpopular Stance on AI

I'm neither pro nor anti-AI. But...

[Image: A gorgeous woman with long brown hair in a magical forest]
AI image generated by author via Nightcafe

It’s trendy these days to despise or adore AI.

Not as many people have an in-between stance. Or at least, the people in between may hesitate to speak up, because both the anti and pro-AI crew have such loud and fierce voices.

But now I will be loud and fierce about my more nuanced views of AI.

Let’s start with the toughest question:

Is AI Stealing From Artists?

Before we begin: I learned in a critical thinking course that the words we choose really impact how we feel about something.

For instance, are we performing “abortions” or are we “murdering” a child? Similarly, “terrorist” and “freedom fighter” have very different vibes.

For AI, do we call it “stealing,” “crawling,” and “scraping,” or do we call it “training” and “machine learning”?

You can see how different the words feel, right? Stealing, crawling, and scraping give us a visceral disgust reaction.

But training and learning sound much more benign.

If you think this is manipulating language to alter perception, you may be right!

Yet, both sides could argue that people are spreading pro- or anti-AI sentiments through their word choices.

Recently, I came across a fascinating debate on Reddit about whether AI art is “stealing” from artists or not.

In my opinion, both sides presented some good arguments. I personally found the “not theft” side more persuasive, though you may disagree and that’s okay.

Here were the main points of the debate:

  • AI learns from the images it sees just like a human brain does.

Some people argue that AI “brains” are fundamentally different from human ones: AI stores bits of mathematical data from its training sets, i.e., it is stealing data from artists’ work.

Other folks argue that this is how the human brain works too. We see works of art, and store data about them in our brains through our neural circuitry.

Why should storing data in AI systems be any different from storing data in our neural systems?

This reminds me of the philosophical debate in cognitive science: Can computer brains be equivalent to human brains? The only difference is whether the hardware is made of metal or biological tissue, right?

Some argue that AI has no soul or consciousness, though. Others counter that the human “soul” or “consciousness” is just a combination of brain structure, neurons firing, and so on.

This sounds like the religious debate on whether souls exist…

Full disclosure: I do believe in souls, and I believe that AIs (currently) don’t have one. However, whether AI has a soul or not isn’t relevant to whether it’s “stealing” or not.

In fact, the paradox is that a thief has agency and consciousness. If an AI doesn’t have agency or consciousness, how can it even be a thief?

Now here’s the point I find most convincing:

  • You need at least 10% overlap between an AI image and an original image for it to qualify as plagiarism. Yet nobody can find AI images with that degree of overlap.

There’s some nuance to this argument. In the Reddit thread, the original poster (OP) asked people to show them an example where the AI image is very similar to the original.

Some people showed images with an artist’s signature on them, and claimed these were plagiarized.

But as the OP pointed out, where is the original image with this artist’s signature? It looks like the AI just created a signature because it saw many paintings with signatures.

Interestingly, nobody could show an AI image side by side with an original image whose signature it had copied.

Somebody shared an example where someone used another person’s artwork as the base, so the AI used this image and created a derivative from it. The original artist accused them of plagiarism.

Wow! Of course you shouldn’t use someone’s image as a base without their permission. And of course the AI generation will look similar to the original, because it was the base.

There’s a big difference between using an artwork as the base image, versus training AI on millions of images including this one.

Next, there were people who deliberately generated copies to show the OP.

Someone generated Totoro in the style of Studio Ghibli, for instance. The Totoro looks mangled and distorted, but is similar enough to be recognizable.

They also generated a copy of one of Andy Warhol’s famous paintings.

However, it’s one thing to deliberately generate a character (or famous painting) in the style of a particular artist.

It’s another thing to prompt something generic. E.g. “A chinchilla in a park.”

On the topic of imitating an artist’s style, however, the OP made an interesting argument:

  • Why do we judge AI more harshly than humans? Humans are allowed to imitate another artist’s style. And they’re not required to credit other artists for their influence. But when AI does the same, it’s accused of stealing.

Good point. I had never thought about how we judge AI more harshly than we do humans. Okay, some people still judge humans, like saying that Suzanne Collins “copied” Battle Royale when she wrote The Hunger Games, even though the two books are so different beyond their surface similarities.

Some say that no ideas are new anymore anyway. Sure, there can be significant overlap between stories (or artworks). Some explicitly copy the original, or are parodies. But we can call this “transformative”: derived from the original, yet still different from it.

Heck, there are people saying that fan art and fanfiction fall into the same category. Technically “copying” but in a fruitful way. (I write Pokemon fanfic, so I’m not judging.)

Then how about AI?

I hear people say that AI doesn’t have feelings, experiences, memories, biases, beliefs, and so on. So there’s no creative transformation of what it “learns” into something new, no “putting its own spin” on art.

At first, I agreed with this argument. But the OP of the Reddit post surprised me again.

They argued that AI does create its own images, its own “spin” on things. For instance, you show it countless pictures of dogs and of sunsets. Then you tell it to generate a picture of a dog in a sunset.

You could see the resulting image as some purely mathematical, randomized combination of data. (They do use “seeds”, after all.) But you could also see it as the AI creating a unique combination based on what it has experienced during its training.
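If you’re curious what those “seeds” actually do, here’s a minimal sketch using the open-source diffusers library with a Stable Diffusion checkpoint. The library, model name, and prompt are my own illustrative choices, not anything from the Reddit thread. The point is simply that the same prompt with the same seed reproduces the same image, while a different seed gives a different “spin” on the same idea.

```python
# Minimal sketch of seeded text-to-image generation (illustrative only).
# Assumes the open-source "diffusers" library and an example Stable Diffusion
# checkpoint; swap in whatever model you actually use.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
device = "cuda" if torch.cuda.is_available() else "cpu"
pipe = pipe.to(device)

prompt = "a dog sitting in a park at sunset"

# The seed fixes the random starting noise: same seed + same prompt
# reproduces the same image; a different seed gives a different result.
for seed in (42, 1234):
    generator = torch.Generator(device=device).manual_seed(seed)
    image = pipe(prompt, generator=generator).images[0]
    image.save(f"dog_at_sunset_seed_{seed}.png")
```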

Again, the debate feels more philosophical than scientific here. I don’t think there’s one right answer, though you may feel one side is more compelling than the other, depending on your own feelings and experiences.

Do I Think That AI Is Stealing My Work?

The short answer is no.

I’m no visual artist, but I’m a writer. Like many other writers, you could say AI has “stolen,” “scraped,” or “crawled” over my work.

Those word choices sound awful.

But I realized that, when someone generates a post or article with AI, I don’t feel stolen from.

No, I don’t condone people generating articles and claiming them as their own. I’m iffy about people using AI to generate social media posts too, though it’s still better than generating entire articles.

Kids using AI to generate research papers for school doesn’t feel good, either.

But I still don’t feel stolen from. The kid’s research paper has little to no overlap with my own writing.

I don’t think it’s okay to cheat on school assignments like that, partly because the kid won’t learn how to write for themselves.

But cheating on school assignments is not the same as stealing my writing.

If someone copied several chapters from my book into their own book, then yes, that’s absolutely stealing and plagiarism.

When a kid’s research paper was written by an AI trained on gazillions of writings online, my influence on their paper is virtually zero.

You may argue that, outside of school, I don’t write research papers anyway, so of course I wouldn’t care.

All right. Then let’s talk about something I do write often: Blog posts about LGBTQ+ issues, based on my personal experiences.

Somebody could generate an article in my style on an LGBTQ+ topic.

Setting aside the fact that I’m not famous enough for anyone to care about my style, it would be a joke for someone to try to imitate me. They would need my personal experiences to copy me properly.

Perhaps the AI could fabricate lived experiences of queer and trans folks. But they would be made up stories. I believe my writing style is quirky (and imperfect) enough that it’s hard for AI to truly mimic me.

The article may be interesting or even entertaining, given how advanced ChatGPT can be nowadays.

I wouldn’t support it, because claiming an article was written by its human author when it was actually written by AI is deception.

However, the LGBTQ+ article would still be so different from my own writing that I wouldn’t feel copied. I wouldn’t feel stolen from.

Similarly, I write dragon fantasy fiction. Sure, you can ask AI to generate fantasy fiction about dragons in my style. I’d like to see what it comes up with. But I’m confident enough in my writing skills not to feel threatened by AI.

So no, even when it comes to AI training on my writing, it doesn’t bother me anymore, since I don’t see it as stealing. There is nowhere near 10% overlap between the AI work and my writing. I also feel that the AI can’t compete with my writing skills anyway, so I’m not afraid.

If anything, the more people get used to AI writing, the more human writing will stand out, with its quirks, imperfections, idiosyncrasies, etc. that AI struggles to mimic. More people will desire human writing. AI will ironically push us to write in more interesting, entertaining ways.

AI here reminds me of the wolves in Mongolia. They chase the horses, which forces the horses to run faster and faster, or else the wolves will catch them.

Thankfully this is not a life and death situation. Unlike the poor Mongolian horses, you won’t die if you get overtaken by AI. You just need to adapt and distinguish yourself from AI. No more boring, textbook-like articles!

The True Problems of AI

All that said, even though I don’t think AI is “stealing,” that doesn’t mean it’s free of problems.

Here are some examples:

  • Cheating in school by generating research papers.
  • People not fact checking and just believing everything the AI tells them.
  • Uploading AI-generated pictures onto stock photo sites without disclosing that they were AI generated. (I.e., deception.)
  • Impersonating authors by generating books that falsely claim to be written by them.
  • Using AI to generate articles and books for money, which is also deception.
  • Rich corporations laying off artists by having AI do most of the work instead.

Now I want to talk further about the last point.

Some time ago, I saw a funny meme on Reddit saying that all the problems of AI are just problems of capitalism.

The original poster argued that artists only feel threatened by AI because they need to make money from their art to survive. But in a society where you don’t need to work for food and shelter, artists would simply see AI as an interesting tool.

Let’s put aside the fact that not all artists rely on their work for income (let alone full-time income), and that no society actually lets you live that kind of leisurely existence.

But I do think a big, maybe the main, objection to AI is that it takes jobs away from artists and writers.

The Reddit post sparked some passionate debates, with artists and non-artists chiming in with their thoughts.

One artist believes that certain fields, such as simple digital drawings, are more likely to be replaced by AI, while others, such as app interface design, are less likely to be.

Interestingly, someone proposed that AI can generate more “unimportant” things, such as the backgrounds in a video game. But an artist who draws video game backgrounds spoke up. They said that background illustrations are more complex than you think, and AI often gets the details wrong.

It’s easy for some folks to say, “Adapt or die.” But it’s harsh to say that to someone who studied their craft for decades.

As an example, someone said they used to be a successful clothes maker, but as commercially made clothing became popular, they lost business. Commercially made clothes are way cheaper than handmade clothes, even if the quality is poorer and they wear out more easily.

But can you blame people for wanting to buy the cheaper, though crappier clothing? Especially in this tough economy.

The same thing is happening with AI art. A hidden issue that not enough people talk about is financial privilege.

Poor vs Rich People When It Comes to AI Art

It’s easy to sit on a moral pedestal when you’re rich.

A friend, let’s call her Lara, writes fiction on Substack. One reader messaged her, saying that they loved her stories, but won’t support her unless she stops using AI images.

Yikes! Lara said that since she posts quite often (sometimes up to 20 posts a month), she doesn’t have the money to hire an artist many times a month.

In contrast, I know a popular Substack writer who is very outspoken against AI. He hires an artist to illustrate his weekly blog post images.

It’s amazing that he can hire an artist for his blog posts. But guess what? He is a retired neuroscientist and professor, as well as a very successful Substacker (lots of paid subscribers). So his finances are way above what most people can afford.

In fact, he was thinking of hiring private tutors to homeschool his kids.

Wow! Most people do not have the money to hire private tutors to homeschool their children. He’s really on a whole other level, socioeconomically.

What About Cheaper Art Options?

You may wonder why we poorer folks can’t just rely on cheaper, or even free options.

Okay, let me get into the nitty gritty.

There are stock photo sites, including free ones such as Unsplash. Sure you can use them. But depending on what you’re writing about, you might find them unsatisfactory.

I use stock photos for most of my nonfiction blog posts. But for my fiction, especially fantasy fiction, they just don’t look compelling enough, in my opinion. Or there aren't enough choices suitable for fantasy fiction.

Another option is to buy cheap(ish) premade covers for book publishing. I did look through premade covers, but they didn’t fit the vibe I wanted.

Plus, even though we call these “cheap,” not everyone can afford them.

They could be anywhere from $20-50 on the low end. A friend wanted to try a premade cover for her book, but gave up when she saw the price tag. She’s unemployed and lives with family members, so she doesn't have much money for things outside of bare necessities.

Likewise, if we think about Fiverr designers, they are “cheap” compared to most other places. A friend told us he got his cover design for $50.

Another friend replied, “I wish I could afford to pay $50 for a book cover.”

Some people are truly very poor. It’s not fair to compare them to those who can afford the fees.

I hear some people advise us not to hire artists on Fiverr, because of the low quality.

Goodness gracious. If you’re rich enough to afford better, that’s great! But not everyone can.

The most I’ve managed was 99Designs’ lowest (bronze) tier, which cost $200+ for a book cover design.

I can’t do that often, though. I wish I could pay an artist $1000+ like some people do. But I can only pay a few hundred bucks at a time, like once or twice a year at most.

You might be surprised to hear this, since I’m a therapist and you’d expect us to make a lot of money, right?

Nope. Therapist incomes vary hugely. I’m early in my career, I only have a small client load, and I have high business expenses.

If you look at my gross earnings, I make more than minimum wage (though I’m still in the lowest income tax bracket).

If you subtract my business expenses, then I make below minimum wage!

I’m only surviving because my parents are supporting me. So I’m far from making big bucks and spending $1000+ on cover art, unfortunately.

A Frustrating False Dichotomy

When it comes to cover art and finances, there’s something else I never hear anyone talk about:

You can use AI and pay artists.

I had some Wattpad book covers where I generated an AI image and then hired an artist on Fiverr to turn it into a great cover for me. So I used AI and I hired artists.

I even did it for a novel I published on Amazon. For context, I first wanted to find a stock photo of my character (a five-year-old boy), but I just couldn’t find one that was a good fit. So I gave up and generated one with Nightcafe AI. I also bought a stock photo of a starry space background.

Then I hired a graphic designer on Fiverr to make a beautiful cover for me. I also hired a book formatter to make the interior look great.

Both of these Fiverr artists were what I’d call “high-end Fiverr”: they had great review ratings and charged higher fees. I also gave them big tips at the end. In total, I spent a few hundred dollars on that book cover and interior design.

[Image: A science fiction novel book cover showing a young blond boy against a space background]
My book cover, where I both hired artists and used AI.

So yes, I used an AI image. But I also spent a few hundred dollars hiring artists.

Of course, I wish I were rich and could spend another $500+ to hire an illustrator too. But nope, I just didn’t have the money.

Some people may judge me for using any AI at all. Some may never want to read or buy anything from me as a result.

If it bothers them that much, they certainly don't have to buy or read anything from me.

But I hope more people can see the nuances of AI. Consider financial privilege, and that you can both hire artists and use AI.

Also, as mentioned above, I don’t think AI is actually “stealing,” since there isn’t the 10% overlap needed for a plagiarism accusation.

Artists Use AI, Too

Moreover, even some artists use AI. Some artists explained that AI helps them with the boring grunt work parts, while they can do the more sophisticated, refined parts.

Heck, when I held a design contest on 99Designs, almost all the designers used an AI art generator! When I told them that my writing platform banned AI images, most of the designers left my contest.

One designer even messaged me to complain, saying that they had never heard of a writing platform that bans AI art, and that I would be better off publishing on a different platform.

Suffice it to say that the whole “AI is evil and harms artists” assertion isn’t entirely true. It really depends on the context, especially since some artists use AI too, with their clients’ permission.

And not just visual artists. Recently, I learned that some book editors use AI to help generate editorial letters, with their clients’ permission, of course. I was shocked to hear it. But if they just use the AI to organize their thoughts into the letter, and their client is okay with this, then what’s the problem?

I have an artist friend who said she doesn't use AI for client work. She only uses AI (with heavy editing) for her blog posts. And yes, with the AI disclosure in the image caption.

To me, it makes a big difference whether or not you disclose in the caption that an image is AI generated. It’s like how someone can generate a story for fun and clearly label it as AI generated.

There’s also a difference between selling AI art directly for money, versus using AI images for book covers and blog post display pictures. I.e. You’re not selling the AI image itself; you are selling your writing, which you wrote yourself.

In image licensing, it makes a difference whether you use an image as just a small part of your work (e.g., a book cover) or as the main product (e.g., a print or poster).

So why would you treat people who sell AI art directly the same as people who only use AI for a book cover or blog post pic, when they wrote the book or blog post themselves?

Again, it’s one thing if a rich publishing house uses AI instead of hiring a full team of artists.

It’s another if someone still in the lowest income tax bracket uses AI while hiring Fiverr artists.

Final Thoughts

So as you see, my views on AI are neither wholly positive nor wholly negative.

What’s often missing in these discussions is the difference between the AI technology itself and the people using the AI.

AI is just a tool, but the way a person uses it could be good or bad (or somewhere in between). There are also external factors, such as the person’s financial privilege.

What do you think? Do you agree with my stance on AI?


If you also believe in questioning popular wisdom to form your own views, connect with me here!