AI in the Writing Industry
- Jayden Thompson
- Jul 20, 2025
- 6 min read

So a while back it was discovered that Meta has been using a database called LibGen, which consists of pirated books, to train their AI. A search tool has cropped up where people can see if certain books are in this database, and I’ve been seeing authors all over Instagram and Threads in complete shock after finding their own books inside it. Now, I was never a fan of AI—especially when it was being used in the writing process. I’ve debated for a while about writing an article on my thoughts but was hesitant to jump into such a controversial topic. But after the reveal of the LibGen database, the AI debate has ramped up, and I’ve decided that the situation has hit a little too close to home for me to stay quiet about it any longer.
Buckle up, y’all. It’s ranting time.
Now, after hearing about this LibGen database, the first thing I did was find the search tool and see if my books were there. I’m a self-published author with two books on the market, and the idea of seeing pirated versions of my novels being used to train AI horrified me. Thankfully, I didn’t see anything after searching my name—my books aren’t popular enough to warrant anyone wanting to pirate them in the first place—which was a relief, but I was still upset after seeing countless authors on Threads finding out their books had not only been stolen, but used to help train the very thing so many of us authors are against.
See, the thing is, most authors dislike the use of AI for books. We put in a lot of time and effort to write our novels, only to see others taking a shortcut by having AI do it for them. It’s unethical to say that you wrote a book when all you did was type a prompt into a computer. As an indie author, I feel this more than anyone. I did everything myself—writing, editing, cover design, formatting, and marketing. Every bit of it was done 100% by me—no AI at all.

Writing a book is a dream that many people have, and I get the appeal of using AI to make a story come to life. But there is something different about it when you are the one pulling the strings. Seeing my story go from an idea scribbled on a sticky note to an actual book that I can hold in my hand is more than just amazing—it’s exhilarating. I put in months of effort, carving out time every day to sit and write so that this story could come to fruition. And when it was complete, you better believe that I formatted each individual page by hand. I did hours of research on how to self-publish, spent days playing around with Canva until I got a cover I was happy with, and started a YouTube channel to market my book.

And when it was all said and done and I was sitting in my room with an actual copy of my book in my hand, I was so proud of myself for all of the work I did to get there. Whenever I see my reports for the month and see how many books I sold, even if it’s only one or two, I feel like I have earned every cent because of just how much effort I put into it.
But not everyone is willing to put in that same amount of effort. I’ve heard so many excuses for why people use AI for writing. “It’s a shortcut!” “I’m just streamlining the process!” “I do most of the work myself but use AI for the tricky parts!” “I just had AI give me the idea!”
The thing is, it doesn’t matter how much of the process you used AI for—if you used it at all, you are taking a shortcut. Writing isn’t something that you can streamline. It’s a long and difficult process. There are days when the words aren’t flowing, parts that need to be rewritten several times to get right, scenes that are just difficult to get down on paper. Part of the reason that finishing your story is so rewarding is because you pushed through all that—you pushed through the writer’s block and bad days and difficult scenes and actually wrote the book instead of taking a shortcut by using AI. I can’t imagine that I would feel as good about my story if I knew that I let a computer write part of it, even if it was just one scene. Marketing it as my book just wouldn’t sit right if I knew there was a part that I hadn’t actually written. The same thing goes for cover design. As someone who designed her own cover—and went to a lot of trouble to do so—I don’t like the idea of someone slapping a prompt into a generator and coming out with a cover that they put barely any effort into. It just irritates me when I spend so much time doing everything to the best of my ability only to look up and see people cheating their way through the writing and publishing process.
I think the worst instance of this I’ve ever heard was when a writer on Threads was discussing how she used AI to streamline her writing process. She said that she used ChatGPT to rewrite certain parts of her story to give them more emotion because she was bad at writing those scenes. That never sat right with me. ChatGPT is nothing more than a computer program; how is it supposed to be able to convey emotion better than a human writer? It can’t feel happiness or grief or anger. People feel those things—and it’s people who are able to write the best stories. Good books make their readers feel something, and I stand firm in the belief that it is human voices conveying human emotions that make for the best stories, not something that was generated artificially.
The only reason AI is even able to come close to portraying human emotion is the network of databases it pulls from. Take the incident with Meta’s AI training. ChatGPT and other artificial intelligences are only intelligent because they are given information to use. They pull from the internet to provide their answers. In Meta’s case, their AI is pulling from thousands of books that it has put in its database to study writing styles and word choices to learn how humans write and communicate. To me, that’s horrifying. A computer system studying human behavior to mimic it seems like something right out of a dystopian novel. Not to mention the legal ramifications of it—if AI cannot create anything new and instead pulls from what is already written, is that not plagiarism? Is it not unethical to write a book with AI when it pulls its material and writing style from other authors?
And don’t even get me started on the fact that Meta’s database is using pirated books. It’s bad enough that they’re using books to train their AI, but the fact that they’re using stolen works and absolutely do not have the authors’ permission is wild. One author I saw on Threads even stated that her books have an anti-AI clause on the copyright page and she still found her books in the LibGen database!
I’ve heard a lot of people say that AI is here to stay and that the industry should adapt. And it’s certainly trying to—every grammar checker and novel-writing software site seems to have AI functions these days. IngramSpark recently announced their AI ad campaigns. Kindle Direct Publishing is now offering audiobooks featuring AI voices. Canva and other design software sites have jumped onto the train as well. But just because AI is here to stay doesn’t mean that we should let it feel welcome in our community. Not only is it unethical in many ways, but it is detrimental to the value of literature. It takes away from what it means to be a writer. AI is stripping the written word of the very thing that makes it so powerful, which is the human voice behind it. Because without that human voice, without that raw emotion and untamed creativity, why would it be a story worth reading?