Who's Threatening Your Job: AI or Your Boss?

WALL-E. Photo by Lenin Estrada.

"AI is taking my job".

Freelance writers have been singing this refrain for months at this point, and for good reason. The Writers Guild didn't go on strike for fun - there's a real concern about Large Language Models putting writers out of work.

I think there's a real conversation to be had about how our jobs may evolve to include AI, and how those of us who choose not to adapt to that change will likely be left behind.

I also think the cases currently making their way through the federal courts over copyright claims (AI used my work to learn; therefore, anything this AI model produces is copyright infringement) are a fascinating chance to talk about what really counts as intellectual property and what we have a right to use when we're training large language models.

The issue we all seem less eager to talk about, however, is that Large Language Models like ChatGPT have incredible potential to help humanity solve problems and get more creative if used properly. Looked at through that lens, why react to AI this way at all?

It comes down to how we treat writers and other creatives in the first place.

I was going to write this morning about writing and one of the most common things I see with clients: having a goal (I'd like more website copy) but not a "why" behind that goal (I'd like more website copy because I want to increase brand awareness by ranking higher for x, y, and z keywords, and down the road, I'd like to convert that awareness into sales by building out my content pillars, internal links, and backlinks).

That post would have been valuable, but as I was reading Rachael Kay Albers' newsletter this morning on how to market ethically at all - one of the core tenets of my business model - it got me thinking about AI again.

She laid out a few of the things we ethical marketers face when we decide to try and do a little better:

  • Leaving money on the table

  • Deciding to pay people fair wages

  • Being patient. I, for example, don't want to pay people scraps. So instead, I do all of the work that goes into building a business myself until I can outsource it at a fair market value (or more).

Writers and artists dealing with the dawn of AI are facing a problem as old as time - their bosses don't want to pay them what they're worth, and now those bosses have found a replacement so much cheaper it's almost funny.

On the other end of the spectrum, freelance writers dealing with clients who would rather die than publish AI work have to jump through hoops with AI detectors that function like Magic 8 Balls.

So instead of talking about the potential AI holds, the narrative around AI (at least in creative circles) lumps the technology in with the people who would replace us with it because it's cheaper for them and lets them take maximum profit home.

The question here is, can we learn to treat each other like human beings? Can we let go of the need to have as much money as possible?

Maybe, just maybe, AI could replace a lot of us writers and actors and artists and do it while using our writing and art and films as learning material. The counterargument is that AI could never replicate us, and I largely agree, based on what I know about Large Language Models and how they learn.

However, there's clearly a fear that AI is coming for the jobs of people who have dedicated their lives to the creative arts. The other day, I saw a sign outside a local gallery that said, "Original Art - Not AI!"

Perhaps these copyright infringement lawsuits wouldn't be necessary if we simply made peace with leaving money on the table and decided to pay people fair wages instead of replacing them with AI.

Is that too much to ask?

Maybe freelance writers would stop jumping through inane hoops to prove to their clients that they're human if Google weren't so tied up in knots about AI content all over the internet and the diminishing quality of content.

The quality of content diminished long before AI, thanks largely to Google's SEO requirements, especially in the keyword-stuffing days. Many bloggers and writers no longer write for both the search engine and the user. Content is written so that the search engine will index and rank it, without considering whether the consumer - the human on the other side of the screen - will actually enjoy it.

Color me cynical, but AI is not the great tragedy of creativity at this point. Constantly trying to appease an algorithm is.

I won't try to make the proverbial argument that you should hire someone like me (you should) and not just use AI. It's not that black and white. I'm learning to use Large Language Models for a reason: I'm interested in what they bring to the table.

I want to have conversations about what counts as intellectual property, whether AI is an unavoidable part of the future of writing, and whether creativity is diminishing because of AI - or for some other reason. I want to talk about whether creativity is something we even value anymore.

Whether or not we're paid to do it, humans are inherently creative and will always find ways to create.

I hope I'll always be paid to write, but even if I'm not, I always will.

What do you think about AI? Is it a threat to our creativity and livelihoods or a misunderstood and misused piece of tech?
