I’m bored by the question of whether AI content is any good. There’s not much room for nuance in that debate. I think there’s a more interesting, and useful, question: When is AI content the right choice?
I’m going to zoom in on those two questions in this article, the one that excites me and the one that I find passé, and what they say about our broader approach to content.
Context is everything in this debate
Last week Ryan Law from Ahrefs published an article called “AI content wasn't good enough, and now it is.” I’m a fan of Ryan’s work. I agreed with a lot of what he wrote in that piece but I took issue with this idea of AI content being “good enough.”
I agree with his article’s core premise: AI is capable of writing content that’s comparable in quality to a lot of human writer output. (And great human writers still have a big edge on AI writers.)
But when he talks about AI now being “good enough,” I find myself asking: good enough for what? Context and use cases are important. Are we talking about listicles written by less experienced content writers? Sure, AI can compete, no doubt. But I haven’t seen any AI-generated news content that competes with the work of great (human) journalists.
When we ask about use cases rather than making general value judgments, we’re able to answer more strategic questions as marketers. What are we actually trying to achieve? How might we best execute? These are the kinds of questions that lead to more nuanced, effective campaigns.
“AI content” is a dangerously broad category
Not all AI content is created equal. Output quality varies wildly, and that variance is usually linked to production methods. If your content is the product of a one-line prompt entered into ChatGPT, the result will almost certainly be crappy. If it’s the product of intricate workflows that incorporate original research and brand-specific guidelines, the result will be far more distinctive.
In this way, it’s actually quite similar to traditional human writing. If a writer whips out an essay in ten minutes, with no editorial guidance besides a proposed title, then the piece probably won’t have much to say. If that same writer puts some effort into research, analysis, and story components then the final product will almost certainly be leagues beyond the quickie draft.
AI is good enough for intro-level articles
If you’re creating a listicle or a “What is…” post, AI can handle the assignment. I think Ryan and I are aligned on this point: AI is definitely sufficient for generating content like this. 👇

There's nothing in a “skyscraper” post that is beyond the capabilities of a decent Claude generation. I think that’s true for essentially any industry and level of technical expertise.
It’s good enough to generate traction in search
You can rank with AI content. You can show up in AI search with it. AI content can contribute to pipeline.
Check out the Google search results below. One of the pages listed in this SERP screenshot was generated entirely with AI. It contributes to pipeline and drives plenty of positive business results for that particular business.

👆 Can you guess which one was generated by AI? I bet you can’t. (No, I'm not going to tell you which one it is.)
If I didn’t tell you that one of these posts was AI content, would you even notice?
It’s not good enough for interesting opinion pieces
AI content doesn’t do nuance very well. Claude will provide opinions if you ask it to, but those opinions will, by definition, be second-hand. Let’s look at something that, I’d bet you, was written by a human — or, at least, heavily edited by a human.
I found this blog post about OpenClaw security. You may or may not agree with exactly what's written here, but the argument it puts forth is specific, intricate, and opinionated.

I don't exactly know how this one was produced but, based on the specificity of opinion and the idiosyncratic flow of text, I’d bet you dollars to donuts that a human did some hands-on shaping of this piece, even if they used AI a bit.
(I invite these folks to reach out to me if this was in fact written with AI, because I will be very impressed.)
Things that you really care about, context that you are trying to build for people, your perspectives, your data… AI is still not quite good enough to deliver on that stuff. Think about the content that really moves your own opinion: research reports, testimonials, case studies… The most effective, efficient production workflow for that is human-powered.
Think of it as a dial, not a switch
Every content campaign now potentially includes some mix of human and AI labor. So the question is no longer simply, “Should we use AI?”
Increasingly it’s this: “How much AI should we incorporate for this task?”
AI isn’t a switch that you flip on and off. It’s a dial that you turn up and down.
Even if you’re writing an article entirely yourself, you might use AI to transcribe the interviews you conducted. That’s a tiny bit of AI. In other scenarios you might decide to run programmatic campaigns that churn out content with little human attention beyond a quick sanity check.
Example: AI-assisted content
Here’s an article we've worked on for a very technical topic. It's pretty detailed.

The team we sent it to gave it a positive review, along with some requested changes. But this was just one milestone on the way to publishing the final piece. We brought editors in to revise the AI-generated material for accuracy, voice, and brand alignment.
The final product won’t be purely human-generated. It won’t be simply AI content either. It’s a hybrid. Again, binaries fail us when we try to really understand this technology. AI is flexible, and marketers are right to lean into that flexibility.
AI’s copycat tendencies aren’t purely bad either
Originality is not AI’s strong suit. By design, it’s generating answers based on data collected elsewhere. Rather than talk about this as a shortcoming, I try to talk about it as a value-neutral fact. Once we accept it, we can work with it, and use it to ambitious ends.
As marketers, our goal is to create interesting, useful, original content for a specific audience. Brands need to propose solutions that no one else has offered up yet. You can’t do that without real, human input. Humans are the ones who generate the original ideas. Often, AI might not have much to offer beyond spellcheck.
But AI is incredibly useful at taking your knowledge base and reframing it for individuals and their specific inquiries. After you have a library of that original content, you can use AI to find new narrative angles for those original ideas.
It adapts your original content to answer new queries
For example, I’ll open up Google Search Console and scroll down to see the queries that are leading people to my website.
Here are some that I just spotted for the ércule website:
- “Tool to see if ChatGPT mentions your brand”
- “How to see brand mentions in GPT, Google Search Console versus Google Analytics”
- “How does ercule specifically target developers in its marketing strategies?”
- “How does ercule’s content strategy differ from other marketing agencies?”
- “What kind of content strategy does ercule implement for its clients?”

What these queries show me is that people want more detailed information about my brand and our problem-solving strategies. They want us to respond to these longer, more specific questions — so my team needs to create content that addresses them.
This is a new use case: responding directly to long-tail queries. Before AI came along, we wouldn’t have been able to address all of these user needs.
My content library holds years of content that our in-house writers have composed. We use it as input to an AI workflow. With that data, the AI can generate unique pieces that speak to these new queries. The content it creates retains the original insights of ércule writers, follows our chosen formats, and matches our style.
So the supposed weakness of AI (i.e. its lack of perspective or originality) is useful here. We don’t need it to be original. We need it to be an obedient copywriter. Of course, our in-house writers will need to review these drafts and likely polish them up a bit.
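That workflow can be sketched in a few lines. This is a minimal, hypothetical illustration (the function names and the naive keyword-overlap retrieval are my own stand-ins, not ércule’s actual system): it pairs a long-tail query with the most relevant excerpts from an existing content library, then assembles a prompt that constrains the model to the library’s original insights.

```python
import re

def tokens(text: str) -> set[str]:
    """Lowercase a string and split it into a set of word tokens."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def score(query: str, doc: str) -> int:
    """Naive relevance: count words shared between query and document.
    A real workflow would likely use embeddings instead."""
    return len(tokens(query) & tokens(doc))

def build_prompt(query: str, library: list[str], top_k: int = 2) -> str:
    """Select the top_k most relevant excerpts and wrap them in a prompt
    that tells the model to stay within the library's original insights."""
    ranked = sorted(library, key=lambda doc: score(query, doc), reverse=True)
    excerpts = "\n\n".join(ranked[:top_k])
    return (
        "Answer the question below using ONLY the excerpts provided.\n"
        "Match their voice and format; do not invent new claims.\n\n"
        f"Excerpts:\n{excerpts}\n\n"
        f"Question: {query}\n"
    )

# Hypothetical library excerpts, standing in for years of original content.
library = [
    "ercule targets developers by publishing technical deep dives.",
    "Our content strategy pairs original research with SEO analysis.",
    "We track brand mentions across ChatGPT and Google Search Console.",
]

prompt = build_prompt("How does ercule target developers?", library)
```

The prompt then goes to whatever model you use; the point is that the model is asked to reframe, not originate.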
We need dynamic strategies to deal with dynamic tools
AI is not going to replace marketers like you and me anytime soon but it’s not going away either. Avoiding AI like the plague is not a realistic choice anymore. A more nuanced approach is required.
That’s why I find the “dial” shorthand useful. The ércule team now talks about how much AI to dial up (or down) in a given project.
I want to be flexible about where AI fits and where it does not. Human-generated content provides vital insight and irreplaceable newness to a content library and to your content system as a whole. AI is downstream of that original content. Together they enable our library to achieve breadth as well as depth.

