Some Thoughts About Generative AI and Synthetic Voices

Artificial Intelligence tools, especially those that generate text or images from a prompt, along with synthetic voices that mimic human narrators, have become a hot topic in the indie author world this year. As an author, I've had these tools pushed at me continually. Google, Apple, and Amazon have invited me to have a synthetic narrator read my books. Software firms are pushing “writing” software that uses large language models to generate paragraphs or even entire chapters based on a prompt. Many independent cover designers and authors have taken to using tools like Midjourney to create illustrations and images.

It. Is. Everywhere.

Large corporations view these tools as a way of making labor more efficient (MBA-speak for eliminating jobs). For software companies, these tools are a source of revenue. Creative people, including writers, may view AI tools as a way of getting access to things they otherwise couldn't afford. For example, I can't afford to hire a professional narrator without losing money on every book, but I could make a synthetic audiobook for free. Thus all sides of the industry, from the powerful CEO to the small fry, will find no shortage of incentives to use AI.

And there's a difference between legality and ethics. AI tools may or may not be legal; we're still waiting for the courts to sort that out. Yet the ethical implications trouble me. Most (if not all) generative AI tools were developed by ingesting huge numbers of artistic works without permission or compensation. Unscrupulous technologists like to use the word “learning” to de-fang what they are doing here, but it's doublespeak. When most of us think “learning,” we conceptualize the word in a human context. It's as if these people are saying: This technology is just a little boy! You wouldn't prevent a little boy from learning, would you? Only software isn't a person any more than a corporation is. Mass scraping of artistic works so a corporation can build tools designed to replace human labor isn't “learning.” Double meanings only serve to obfuscate what is truly happening, and I choose not to be fooled by that baloney.

I also have a strong emotional reaction to these technologies. For me, art is about human expression, and to strip away that human core is a deeply icky prospect. People who throw a prompt into a box to get an idea are not creating those ideas. People who jot down a bullet point and have software write a paragraph of description for them are not writers, not even if they swap a word or two. I'm sorry to report that popping a frozen burrito in the microwave doesn't make me a chef. Not even if I drizzle some delicious hot sauce on top. Others may disagree with my opinions, and such is life, but I'm not interested in debating what is evident to me, deep down in my bones. Nor am I going to knock on anyone's door to bully them. But I don't have one whit of interest in what these people produce.

Others are free to do as they like. But I have to wonder: Why don't you care about other people?

To be an artist (or a human, really) is to exist in an ecosystem. Over the last thirty years, too many of us have shrugged as “innovations” scooped out the pulsing center of our communities. We shopped at the corporate mega store instead of the neighborhood shop run by the guy we knew, and we shrugged at self checkout, allowing our consciences to be soothed by the empty promises to retrain those cashiers. I know I did. I've been way too complacent about this shit. Now, AI companies want to replace not just human artists, but people working the drive-through, and office workers, and the gods-know who else.

I can already hear some designer-jeans-wearing apologist shouting from a distance: We're not replacing anyone! We're helping!

Hmm. Helping whom, exactly?

Even if you think the technology is swell, even if you have no moral objection to the way it was made, do you not care that the people you're displacing have nowhere else to go? Tech CEOs will mutter about things like Universal Basic Income, maybe, but that's a dodge, a set of magic words thrown into the air. If you actually cared about that solution, you'd put it in place before you gave ten thousand jobs to fancy autocomplete and kicked your workers onto the street.

Why don't we care about other people? I mean, a professional narrator would put my books in the red, so I'm not hiring one, but I can still care about narrators. I can't always afford a cover artist, but I can DIY my covers rather than buy cheap, AI-produced art built on the backs of artists I've never met. Every use of synthetic voices and generated imagery serves to normalize and strengthen the means of our peers' destruction, doesn't it? And remember, these companies could have built their models with permission. They could have asked. But I think they knew what the answer would be, and when it comes to exploitation, the morally bankrupt never ask. They take. And they count on the rest of us to gleefully snap up the first few rounds of freebies that result, making us compliant participants in the destruction of our own ecosystem.

And yet... we can care about each other! It's a thing we're allowed to do. In a sense, it's the only power we have. It is possible to see a technology and say to ourselves “Yeah, the cons outweigh the pros, right now.” AI pushers want us to believe that wide adoption of this shit is inevitable, but that's the grand mythos of the “disruptive” tech firm. First, you believe the overwhelming narrative that says you have no power, and then you go along quietly. They're very, very good at this.

Also (please allow me a side note here), when it comes to outsourcing my writing to fancy autocomplete, I have to ask: Why in the heck would I? Why would I outsource my joy? I'd no more outsource my writing, my craft, than I would buy a piece of software to love my family for me, or to hug my husband for me, or to attend a friend's party for me. Art is connection. Art is struggle. Art is love.

I realize that there will always be those who view a painting or a story or a piece of music solely as a consumer product, a widget to be produced at the lowest cost. That mindset is endemic in the “Get Rich in Self Publishing” movement, replete with an endless parade of talks on how to create “minimum viable products” or “low content books” and master the dark arts of Amazon Ads to click your way to success. Meanwhile, those pushing AI tools to authors try to muddy the waters by claiming that all spell checkers are AI, and so are assistive devices for the disabled, and therefore you're already doing it, right? WHY ARE YOU SO AFRAID OF PROGRESS, TECHNOPHOBES? IT'S ONLY $39.95 A MONTH TO MAKE YOUR WRITING DREAMS COME TRUE. Only a fool expects truth from the mouth of a snake, and here we indie authors are, together in the pit among the hissing hordes, surrounded.

Ugh, I say.

Alas, I'm not here to tell anyone what to do. I can only share how this shit-show makes me feel. And here's where I've landed, personally:

I won't use AI tools to generate text or images. When sourcing stock images to use for my covers, I'll try to find those that were made by a human, although it can be difficult to tell the difference as things aren't labeled, and I'll probably get it wrong sometimes. When I hire a cover artist, I'll ask that they not use any AI images, and I'll avoid those that focus on the stuff. Also, I won't use synthetic voices. I think the minimum bar for any AI-driven technology needs to be: was your tool built with consent from its contributors? Beyond that, I'd also like people in my industry to be able to afford rent, thank you very much. My writing doesn't make much money, so these choices will mean more hassle for me, and not being able to have things that other authors have, and that's okay. I can live with that.

My big worry? What happens when distributors start forcing these things on small-time authors? But I suppose I'll deal with the future when it arrives.

We're all familiar with the dark side of the human condition. Our penchant for weakness. Some people are so eager for a shortcut that they don't care who gets exploited or left behind along the way. It's easier to remain ignorant, always. To get ours, and to let the details work themselves out. I'm not immune to stupid, selfish decisions. I don't always get things right, and I'm just muddling my way through, like everyone else. How will technology change the arts, in years to come? I don't know, but I know what writing means to me, and I hope to keep on doing it with integrity.

Anyway, these are my thoughts about Generative AI in the arts. Best of luck as you formulate your own.