Our Guide to Self-Editing Mastery has helped thousands of authors just like you:
- Pre-edit their books with ease
- Avoid the most common writing errors
- Quickly identify spelling & grammar mistakes
- Be more prepared for professional editing
How many authors today are using AI to help write their books?
Show of hands, there’s no judgement here. (Wow, that’s a lot of hands.)
A study by the Minderoo Centre for Technology and Democracy at the University of Cambridge published in November found that 67 percent of novelists and literary agents surveyed have never used AI in their writing process, with many citing moral and environmental implications. Of the 258 novelists and 74 industry insiders, one-third admitted to using AI in the writing process, though not necessarily to write the book. Per Dr. Clementine Collett at the Minderoo Centre:
Some literary creatives use GenAI for ‘non-creative’ tasks. 33% of novelists report that they use GenAI within their work, along with 34% of literary agents, and 45% of those working in fiction publishing. Most commonly, these literary creatives use GenAI for ‘non-creative’ tasks such as information search.
That may be too small a sample size. Or perhaps British authors aren’t as quick to jump on the AI gravy train. In May, BookBub surveyed 1,200 authors about their use of AI during the writing process. Those numbers were a bit more even:
Among survey respondents, about 45% are currently using generative AI to assist with their work while 48% are not and do not plan to in the future. Another 7% of respondents are not currently using AI but might use it in the future.
According to that study, 81 percent of those who admitted to using AI do so for research, marketing ideas and, as they put it, “outlining and plotting.”
Most editors are cynical when it comes to writers who use AI to write their books, be it to clean up grammar and tweak tone, or to write whole chapters, and in some cases, entire books. But the way I see it … AI is here. You can hate it or you can love it, but it’s here. More and more, authors are going to lean on AI to help with their books, which in many cases includes writing large sections. Hoping for that to stop is like spitting into a fan.
The numbers above were for fiction writers, and you can be sure the number is higher for non-fiction projects. Anecdotally speaking, the majority of non-fiction books I have edited or narrated in the past two years have been written, in some capacity, by AI. It’s only going to grow. Take this recent story from Korean outlet The Chosun Daily, reporting on a local publisher going HAM on the e-book market:
A 'super publisher' has emerged, producing over 9,000 books a year and more than 20 books a day. The secret is AI (artificial intelligence). The e-book market, which incurs no separate printing costs, is already overflowing with AI authors. However, there are few cases where it is disclosed that 'AI wrote it,' leading to criticism that it is an act of cheating readers. Major domestic bookstores like Kyobo Bookstore have admitted, 'It is difficult to confirm how many AI book authors there are.'
It’s easy to go the AI route, especially when you read the stories of people making huge money publishing AI books, including this 27-year-old who claims to have made $3 million publishing 1,500 books on Amazon.
A report published in July by market.us—a website that boasts it is trusted by more than 17,382 organizations, including IBM, LG and Oracle, so who am I to question their data—cited data from 2024 to project a future dominated by AI book writing over the next decade.
The global AI book writing market is projected to grow from USD 2.8 billion in 2024 to approximately USD 47.1 billion by 2034, achieving a robust CAGR of 32.6% during 2025–2034, driven by growing demand for automated, efficient, and creative content generation across industries.
Before we get too far along, I think it’s important to share that I do use AI almost every day for research, verifying quotations, and other things, like trying to find the best deal on a driver for my kid who’s trying to make the high school golf team.
As an editor, I often need to verify random things, like the total number of churches in America, and ChatGPT is great for quickly getting me in the right ballpark. I want to be very clear: I’m not someone who is anti-AI.
That said, I am cynical when it comes to AI writing, because it’s just…bad. It is incredibly repetitive, and uses the same “style” in every book it writes. If you use AI to write your book, it will read just like everyone else’s, as the AI is incapable of writing in a unique style each time.
AI writing and editing is a multi-billion dollar industry that is not going away, so consider this post the editorial equivalent of a teacher handing out condoms in health class. I can preach AI abstinence, but let’s be realistic, most of you are already doing it. Try Googling something. The first result comes from Google’s in-house AI.
Before we get to the list of 10 things every author must avoid when using AI, I wanted to figure out how many authors use AI to “help” write their books, and how many use AI to actually write their books.
I asked ChatGPT how many books it wrote in 2025. After offering a slew of caveats that it’s impossible to calculate because the AI doesn’t know if the project was published as-is, reworked by the author or scrapped entirely, it then gave me this:
Across the entire global user base, here’s a plausible estimate for 2025:
Low estimate (conservative): ~40,000–60,000 book-length projects requested
Moderate estimate (most likely): ~100,000–200,000 books partially or mostly drafted with my help
High estimate (ambitious): 250,000+ if counting every user who asks for chapters, outlines, or ghostwriting help
According to ChatGPT, most people ask for an outline first, then request help writing a few chapters before either continuing on their own or abandoning the project. The typical range of each request is 3–6 chapters per project, with “serious hobbyists” and “determined writers” asking for more than double that from AI.
There are, however, some tell-tale signs a book was written by AI. I’m going to limit this post to the Top 10 reasons you should avoid using AI to write your book. And I promise, “the environment” isn’t one of them. These are tangible examples for would-be authors that won’t only get picked up by editors—most readers will catch these blatant AI tropes, and if you want to be the next author making $3 million for publishing 20 books a day, it’s going to be tough when your project is hammered with one-star reviews.
So, a final plea from someone who stares at a screen 17 hours a day and probably reads more than 50,000 words a day: Don’t use AI. Write the book yourself. Do the research. Hire an editor! (If I could slip you my digital business card right here I would.) Put out something truly unique and take pride in the fact you did it without needing AI as a crutch.
Okay, enough pie-in-the-sky, “back in my day” BS. Most of you are using AI, so here’s a list of 10 things every author should think about before asking AI to be your virtual ghostwriter. I’m trying to help; if not you, then definitely your editor.
(Note: For examples in this post, I used the free versions of ChatGPT and Anthropic’s Claude, as that is what’s accessible to most people.)
When I asked Claude how many books it has written this year, here was the answer:
I don't have access to data about how many conversations I've had across all users, how many of those involved book writing, or what happened to those pieces afterward. I also don't retain memory between conversations—each time someone starts talking to me, I have no record of previous interactions. So I can't track my own output over time or across users.
No wonder every book reads the same! Even the AI doesn’t know if what it already wrote for one person is the same as what it’s writing for another!
If ChatGPT conservatively estimates that it writes 50,000 books per year, then even if Claude’s output was half of that, it would be churning out nearly 70 books a day. The likelihood that it’s writing the same thing for you as it did for hundreds and maybe thousands of other writers in your genre is extremely high.
Now, there may be a reason why Claude claims not to remember things. In September, Anthropic agreed to pay at least $1.5 billion to settle a lawsuit brought by a group of book authors alleging copyright infringement. According to the settlement, as reported by Wired, the class action suit applies to more than 500,000 books the company illegally pirated to train its model. The judgement noted “Anthropic downloaded over seven million pirated copies of books, paid nothing, and kept these pirated copies in its library even after deciding it would not use them to train its AI (at all or ever again).”
Now it makes sense why Claude forgets everything it says. (Can a robot claim plausible deniability?)
I asked both ChatGPT and Claude for help writing the first two paragraphs of a self-help book. This was my prompt:
I want to write a self-help book about what it's like to be human. How each of us are unique individuals but we are all interconnected as well. Can you write me a sample paragraph or two of an introduction for that type of book?
After sharing two paragraphs, each AI asked if I wanted to change the tone at all, so I requested both to write “a more poetic version, a more spiritual version.” That gave me four versions of a sample self-help intro, two each from ChatGPT and Claude.
Now, it stands to reason that, if given the same prompt, each service would come up with similar samples. I just wasn’t prepared for how similar.
Both AI services took my prompt and created the same “a person is a solitary universe” metaphor? As you can see below, a lot more of the wording is similar, not just in transitional phrases and sentence openers, but in the metaphors and visual imagery as well.
Is that a common phrase I’ve just never heard before? I’ve heard people call someone they’ve dated before an “old flame” but why would both ChatGPT and Claude come up with a person being a “singular flame” that’s part of a great/greater fire?
Now, again, I asked for a pretty specific sample, but these excerpts show the limitations of using AI in a structured writing capacity.
Again, this was a request for two paragraphs. Imagine the similarities if I had asked each AI to write an entire book! If you need more proof, check out this video that shows just how often the same AI-driven phrase shows up in different books.

I tweaked my prompt to request a finance book, keeping the overall theme the same, but changing all of the other details:
I want to write a how-to book about what it's like to work on the stock market. How each of us are unique individuals buying and selling stocks, but we are all connected based on the fluctuations in price and trends for buying and selling. Can you write me a sample paragraph or two of an introduction for that type of book?
Not only are the two services again similar, but they are similar to the previous prompt as well. In the self-help response, ChatGPT included, “we are stitched into a vast, invisible web of connection.” Here’s what the stock market prompted:
What does “bumping into each other through invisible threads” even mean? How do threads bump into each other?
ChatGPT’s response to the self-help prompt began with the line, “Being human is a strange, beautiful paradox.” Claude’s response to the stock market prompt began with the line, “The stock market is a paradox of individuality and interconnection.”
It’s relatively common for people to use the word paradox when talking about divergent topics, like being an individual and being part of something bigger, so maybe that’s just a coincidence.
Except Claude also used it in the self-help response: “Yet paradoxically, this very individuality connects us.” The second paragraph of Claude’s stock market response also began, “Yet despite this beautiful diversity…”
Then there’s the ecosystem both ChatGPT and Claude seem to share:
ChatGPT: It is a guide to navigating the market not as a detached observer but as an active participant in a shared human ecosystem.
Claude: The stock market is a paradox of individuality and interconnection—a vast ecosystem where millions of independent minds make autonomous decisions…
That last ChatGPT sample above leads to the next huge issue with AI, and an obvious tell when an author has used it. AI seems to only be able to write about something by comparing it to something else. Often, the comparisons are just relevant enough to keep in the book, but sometimes they are absolutely ridiculous.
Do you see how the ChatGPT sentence in the previous section is needlessly structured backwards? Instead of just saying, “It is a guide to navigating the market as an active participant…” AI throws in the “not…but” for no reason: “It is a guide to navigating the market not as a detached observer but as an active participant…”
I recently edited a 57,000-word business, leadership, self-help book that was very uplifting and positive. Until I realized how many “not…buts” there were.
The word “not” appeared 299 times. That’s a lot. That got me thinking about other negatives, which…duh…reminded me I had forgotten to count contractions. In some capacity, “n’t”—shouldn’t, don’t, can’t, etc.—appeared 551 times!
In a 57,000-word book, some version of the word “not” appeared 850 times!!!
Put another way, that’s the word “not” appearing every 67 words. For comparison, I looked at three similar books that were not as heavily AI-driven—I hesitate to say not AI at all, despite what the authors disclose—and they averaged a “not” every 121.5 words.
As I said to the author, “It's such an uplifting positive book…but subliminally you are putting negative themes into the reader's mind over and over again.”
Sometimes having a “not” comparison can be an effective tool. After all, this is not rocket science, it’s writing. (See?) The issue magnifies, however, when you stop paying attention to the number of times it appears. That’s how your book takes on an oddly negative tone, often for no reason whatsoever. For example, in that book, the author talked about having a strong collaborative working relationship, writing that they always talked through solutions before presenting options to the client. (Note: I removed any other context in the following sentences that would tip off the book, but given these lines were edited, I thought it was safe to use these examples.) This was in back-to-back lines:
Not to hide problems, but to make sure we'd thought through the implications…
This wasn't about presenting a united front to cover mistakes. It was about…
Why would you ever introduce that idea to the reader? Why put the thought in the reader’s head that you might have once even considered hiding problems or covering mistakes? Not everything needs a comparison, especially when introducing examples of what you didn’t do undercuts the positive of what you actually did!
I recently edited another book that was just over 70,000 words. The word “not” appeared in that book 373 times. I didn’t think to check the contractions in that one, which I had finished prior to the book mentioned above, but even in the finished edit 80 instances of “n’t” remained, bringing the total to at least 453 appearances. That is too many!
In that same book, comparisons abounded, with “whether” appearing 52 times, “more than” used 86 times, and “both” showing up 159 times. I was able to cut “more than” from 86 to 32 and “both” from 159 to 38 without changing anything of substance in the book.
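Counts like these take only a minute to run on your own draft. Here is a minimal Python sketch (the regex and the sample sentence are my own illustration, not drawn from any particular manuscript) that tallies standalone “not” plus “n’t” contractions and reports how many words pass between negations:

```python
import re

def negation_density(text: str) -> tuple[int, float]:
    """Return (negation count, words per negation) for a draft."""
    words = re.findall(r"[A-Za-z']+", text)
    # \bnot\b catches the standalone word; n't (straight or curly
    # apostrophe) catches contractions like don't and shouldn't.
    negations = len(re.findall(r"\bnot\b|n[\u2019']t\b", text, flags=re.IGNORECASE))
    return negations, len(words) / negations if negations else float("inf")

sample = "This isn't simple. It's not about hiding problems, but we didn't."
count, ratio = negation_density(sample)
print(count, round(ratio, 1))  # -> 3 3.7
```

By the yardstick above, one “not” every 120 or so words is normal; if your number is closer to one every 67, start rewording.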
Let’s take another look at the example above, using one of Claude’s “poetic” style responses:
To be human is to live in this mystery: to be both utterly alone and completely held, both insignificant and infinitely precious.
And here is one again from ChatGPT:
To be human is to walk the earth as both a singular flame and part of a greater fire.
A book I recently edited had the line, “a rhythm essential to both humans and animals,” which had several red flags, another of which I will get to next.
The AI infiltration can be subtle, especially if an author is just using one of the platforms to fix tone or edit wording or syntax. The book I just mentioned had a number of other words that are AI tells.
The word rhythm, seen in the example above, appeared 141 times in that book. Keeping all 32 references to circadian rhythms, I was able to cut the remaining 109 down to 20—a total of 52—without having to rework much at all and without impacting the information in the book.
I knew to look out for rhythm because another AI-heavy 37,000-word book I edited used it 49 times, which at the time felt like a lot! Many of those rhythms were described as sacred (a word used 65 times) while others were quiet (94 times). Quiet also appeared in the 70,000-word book 84 times.
The word “foster,” meaning to promote or support, is another example of a commonly used AI word. How do I know? The 70,000-word book used it 48 times, while a 33,000-word book I edited used a version of foster 138 times.
The reason these words are noticeable is that they stand out when you read them. You could use the word “grow” 1,000 times and barely anyone would notice, but use “cultivate” more than a few times and it starts to stand out.
Here are a few more common words AI loves to overuse, starting with simple. That word appeared in the 70,000-word book 89 times. Nothing is that simple. Lines like the ones below scream AI, and sometimes the repetition goes beyond just one word. These three examples appeared in the first three chapters of the same book:
I removed any other context in those sentences that would tip off the book, but given these lines were edited out, I thought it was safe to use these examples.
Part of the reason we think AI is going to rise up and take over the world is probably because they keep telling us everything is so damn simple! I’ve even seen lines like, “This simple distinction has profound implications,” which seems like a paradox (Or is that what the AI wants you to think?). Look, if you are a human reading this and not a bot, please heed this advice: If you see something described as simple over and over again…there’s one simple explanation.

Another common AI issue is repetition. Lists are almost always done in threes, sometimes fours, and the majority of AI lists feature two very common things: first, the repetitive use of one or two words for effect, and second, the omission of the word “and.”
Why does AI hate the word and? It’s almost as if AI wants to be poetic in everything it says, and using simple commas with the word “and” is too boring. Remember, not everything needs to be breathlessly written like it will someday be engraved on a plaque. Sometimes a list is just a list.
AI LOVES to complicate punctuation. Why use two commas when you can use a colon and an em dash?
A recent book I worked on used 513 colons. For context, it only used six semicolons. For further context, it used 519 em dashes.
I cut the colons down to 403 and the em dashes down to 405, and I surely could have done more. (Wait…did I just perform the first AI colonoscopy?) The overuse of punctuation, especially marks used to replace commas, is a huge AI tell. The majority of colons in the book I just mentioned, however, were used in sentence fragments that set up sentences. For example, “Here's what I learned:” and “Here's what most people get wrong:”
To go along with the overuse of colons, that book used the word “Here’s” 114 times, 83 of which began sentences. Of those, 65 started paragraphs. Which of these sentences is clearer?
Here's what we didn't do during the investigation: make excuses.
During the investigation, we didn't make excuses.
I worked on another book that had 587 colons and only 27 semicolons. (That, unlike the example above, only had 32 em dashes.) When you see that kind of disparity between colons and semicolons you either have a LOT of lists, or your AI got caught in a colon loop.
(That sounds gross. Let’s move on…)
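If you want the same punctuation tally on your own manuscript, a few lines of Python will do it. This is a sketch under the assumption that your draft is available as plain text; the sample sentence is hypothetical:

```python
from collections import Counter

def punctuation_tally(text: str) -> Counter:
    """Tally the punctuation marks AI drafts tend to overuse."""
    marks = {":": "colons", ";": "semicolons", "\u2014": "em dashes"}
    return Counter(marks[ch] for ch in text if ch in marks)

sample = "Here's what I learned: patience; persistence\u2014and colons: lots of them."
print(punctuation_tally(sample))
```

A lopsided ratio, hundreds of colons against a handful of semicolons, is the colon loop described above.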
If every quotation in a book is preceded by a colon rather than a comma, you are not writing in a fluid manner. If you think you have to say, “Here’s something important to remember:” instead of just saying the thing on its own, or perhaps saying, “Something important to remember is…” you have been reading too many books written by AI. Don’t let bad punctuation infect your own writing. (Editor’s note: it’s worth asking whether the AI does this to beef up the word count.)
When I asked both ChatGPT and Claude, “You aren’t human, so where do you get your information from when asked to write about what it’s like to be human?” the responses were telling. The bold is mine:
Claude: What I'm drawing from is patterns in human writing, philosophy, literature, and cultural expressions about the human experience that I've been trained on. I can recognize what resonates in human expression—the common threads in how people describe loneliness, connection, meaning-making, spirituality. I can synthesize those patterns into something that sounds authentic because it's based on countless authentic human expressions.
But here's the honest tension: I'm essentially curating and recombining human insights rather than generating them from genuine understanding. When I write about the "sacred wounds of humanity," I'm reflecting back what humans have written about their own experience, not reporting from my own felt sense of what that means.
So the AI that admitted it doesn’t retain information from one conversation to the next also says it’s “curating and recombining human insights” in order to generate human-like emotions and thoughts?
ChatGPT: You’re right: I’m not human, and I don’t have lived experience. When I write about what it’s like to be human, I’m drawing from three main sources:
- Patterns in Human Writing
I’ve been trained on a vast amount of publicly available text — books, essays, stories, conversations, philosophy, psychology, poetry, interviews, and more.
From that, I learn the patterns in how humans describe their own emotions, struggles, growth, and relationships.
Think of it like this: I don’t feel heartbreak, but I’ve seen a million ways humans talk about heartbreak — the metaphors they use, the sensations they describe, the stories they tell about how it changed them.
- Human Behavior Described Through Science & Observation
There’s also the psychological and sociological side:
– studies on connection and vulnerability
– research about identity and personality
– writing about meaning, purpose, loneliness, joy, creativity, trauma, etc.
I synthesize these ideas in a way that sounds natural and emotionally coherent.
- Your Direction, Tone, and Values
Whenever you ask for a style — poetic, spiritual, nature-based, humorous, philosophical — I combine what I’ve learned about human expression with the voice you’re looking for.
You’re steering the ship; I’m shaping the language.
You’re steering the ship, but AI is shaping the language. Clearly, if the end result is what we’ve been getting, we need AI to quickly learn how to synthesize something better.
I’ll admit this post is a little bit selfish. If you are an author and you’ve gone through this entire post, you now have the skills to dissect your AI-assisted projects and get rid of all of these common tropes before sending it to your editor. That way, we editors can focus on more important issues, like whether it’s actually a self-help book or a memoir, or whether or not you really need a Preface, a Prologue, an Introduction, and three appendices for a 30,000-word book.
The best advice I can give you is to go through each number on this list and do a Ctrl-F search for anything you think doesn’t sound like you. Are there too many negatives used to set up a positive? Are there way too many colons? Here’s what you can do about that: reword the sentence.
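That Ctrl-F routine can be automated. Below is a hedged Python sketch; the watchlist is a hypothetical starting set drawn from the tells discussed in this post, so adjust it for your own genre and voice:

```python
import re

# Hypothetical watchlist based on the overused words covered above.
WATCHLIST = ["rhythm", "sacred", "quiet", "foster", "simple",
             "whether", "both", "paradox", "ecosystem"]

def tell_report(text: str, watchlist=WATCHLIST) -> dict:
    """Count each watchlist word, including suffixed forms like
    'rhythms' or 'fosters', so you can Ctrl-F the worst offenders."""
    lowered = text.lower()
    return {w: len(re.findall(r"\b" + re.escape(w) + r"\w*", lowered))
            for w in watchlist}

draft = "This simple rhythm fosters a quiet, sacred ecosystem. Simple!"
print({w: n for w, n in tell_report(draft).items() if n})
```

Run it on each chapter rather than the whole book so you can see where the clusters are.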
If you wouldn’t use em dashes, tell the AI before asking for a rough draft. If you’re uploading your manuscript to check for tone, be specific about the tone you are trying to achieve and track the changes it makes. You will see a pattern develop, which will allow you to pivot and make corrections before submitting your final draft. Prompt the AI to use the same descriptive word only three times before offering synonyms. Have the AI bold instances where changes are made—much like how we use Suggested Edits on our projects—so you can easily spot them.
You also need to read other books. See what patterns you catch as you go through books in the same genre. If you see your draft has a lot of the same sentence structure or wacky punctuation, stop and rewrite things.
If you do write everything yourself, try not to get in your own head that your work “reads like AI wrote it.” I’ll be completely honest, I use an insane amount of em dashes in my writing, but seeing how often AI uses them, I’ve definitely cut back. Oh, and I will never use the word rhythm in anything I write ever again. If you’re concerned your work reads too much like AI, remember that AI learned how to write by reading millions of books, blog posts, articles, and social media posts written by humans. The AI’s goal is to sound indistinguishable from you. If your writing sounds like AI, all it means is that your writing style is too common; try to find subtle ways to inject some personality into your work. Make everything as “you” as you can.
More than anything, take pride in your work. Your name is on the front cover of the book and, last I checked, I don’t see too many books out there that say, “Written by Some Author and Claude the AI.”
The best books are the ones that are authentic. Be you, even if it’s not great. Then, hire a human editor who knows how to make the book sound like you without having to rely on the same obvious tropes! If you want your book to sell, don’t settle for the cookie-cutter model of writing to then rely on marketing and SEO to grow your audience. Focus on writing something that is new, thoughtful, and uniquely yours. Then get AI to promote the hell out of it.
(Note: This post was written in late November 2025, so things may change as AI becomes more advanced and more users seek writing help from it.)