The study that's making content teams nervous
Semrush published new research at the start of April 2026 that every marketing team running an AI content operation needs to read. They analysed 42,000 blog pages across 20,000 keywords and used an AI detector to classify each page as human-written, AI-assisted, or fully AI-generated.
The headline number: human-written content appeared in the number-one position 80% of the time. Purely AI-generated content appeared there 9% of the time.
That's nearly a ninefold gap at position one. And the gap widens the closer you get to the top of the page.
The part most people aren't talking about
The same study surveyed 224 SEO professionals, and 72% of them believe AI content ranks as well as or better than human-written content. The data says the opposite at the top of the page.
That's a striking disconnect. The people running AI content programmes are significantly more optimistic about performance than the ranking data justifies. There are two explanations. Either AI content is genuinely improving and the survey respondents are seeing it, or teams aren't rigorously comparing AI and human content performance in the same way Semrush did.
A second study from DigitalApplied tracked 4,200 articles over 16 months and found something more specific. Purely AI-generated content ranked an average of 23% lower than human-written content targeting the same keywords. But AI-drafted content with substantive human editing came within 4% of fully human writing.
The variable is not whether AI wrote the first draft. It's whether a human improved it before publishing.
🔑 Key finding
AI-assisted content (AI draft + human edit) performs within 4% of fully human-written content. Unedited AI content ranks 23% lower on average and acquires 61% fewer editorial backlinks over time.
Why the backlink gap matters more than the ranking gap
The ranking gap between AI and human content is real but manageable with good editorial process. The backlink gap is harder to fix. The DigitalApplied research found that AI-only content acquired 61% fewer editorial backlinks than human-written articles on comparable topics.
Backlinks remain one of the top ranking signals. And they compound — pages that earn links in months one through three continue to benefit from those links for years. Unedited AI content that fails to attract links doesn't just underperform now. It falls further behind over time.
The same study recorded a second finding: the ranking gap between AI and human content widened over the 16-month observation period. At month three, the average gap was 14%. By month 16, it had grown to 31%. Human content accumulates authority; pure AI content tends to plateau, then decline after algorithm updates.
What this means for your AI content workflow
If you're using AI to write first drafts and then publishing without meaningful human review, this data is a direct warning. The warning isn't that AI content gets penalised: Google has consistently said it evaluates quality, not production method. It's about what "quality" actually means in practice.
Quality signals that AI drafts consistently lack:
- First-person experience or specific examples from actual work
- Original data, proprietary research, or unique case studies
- Editorial perspective — a genuine point of view, not a balanced overview
- Specificity that only comes from doing the thing being written about
These are also exactly the signals Google's E-E-A-T framework (Experience, Expertise, Authoritativeness, Trustworthiness) is designed to reward. The reason human-written content outperforms AI at position one is not that it's human. It's that it tends to have these qualities, and unedited AI content tends not to.
The workflow that actually holds up
The Semrush and DigitalApplied studies point to the same conclusion: the human review step is not optional. But it also doesn't need to be a full rewrite. Here's what substantive human editing looks like in practice:
- Add one specific example from your own experience. Even a sentence. It signals E-E-A-T and makes the piece distinct from every other AI article on the same topic.
- Sharpen the intro. AI intros are structurally correct but predictable. A human-written opening is the most distinctive part of any article. Spend five minutes here.
- Remove anything generic. If a sentence could appear in any article about this topic without changing anything, cut it or replace it with something specific.
- Add a direct opinion. AI generates balanced overviews by default. Pick a position and state it clearly. This is the hardest thing to fake and the most likely to earn links.
None of this takes an hour. Done well, it takes 15–20 minutes per article. That's the investment that moves you out of the 23% underperformance zone and to within 4% of fully human content.
One more data point worth knowing
While Google still rewards human writing at the top of organic search, the SEO statistics tell a different story for AI-powered search. Research from Ahrefs found that Google's AI Overviews are more likely to cite AI-generated content than human-written content. Two different systems, two different incentive structures. What gets you to position one in organic search is not identical to what gets you cited in an AI Overview.
The practical implication: your content needs to serve both. Human-edited depth and expertise for traditional rankings. Structured, direct, clearly attributed answers for AI citation. The good news is these aren't in conflict — clear structure and authoritative sourcing improve both.