The Internet Is No Longer Just for Humans—AI Bots Love Markdown

Benni Mack

For years, we optimized websites for two roles: editors who create content and human readers who consume it. That model no longer reflects reality. 

Amid discussions about declining traffic, SEO saturation, or the rise of “GEO,” we often overlook the most obvious change: the majority of visitors are no longer people.

Multiple reports estimate that automated traffic will account for around 80% of all web traffic in 2025, and real-world log file analyses back this up. AI crawlers, autonomous agents, and LLM-powered systems have become a first-class audience of the web.

Not “bad bots”. Not scrapers.

But systems explicitly designed to understand content. 

This is what Dries Buytaert calls the third audience: after editors and human readers, machines now consume our content too. Once you accept that this audience exists, the real question becomes uncomfortable: Are we still publishing content in a way that makes sense for them? It’s not that your users are no longer interested in your products or services; they just consume them differently.

Why HTML Is a Terrible Knowledge Format for AI

HTML was never meant to be a knowledge interchange format. It is a presentation format.

Modern websites ship deeply nested DOM trees, megabytes of CSS, JavaScript-driven layouts, navigation layers, tracking scripts, cookie banners, and UI noise. Humans filter this visually. AI has to parse it. And the internet has to transfer it from one system to another.

That’s why more and more people are rediscovering an old idea: Separate content from presentation.

In his post, Dries explains how he added Markdown content negotiation to his own website instead of forcing AI systems to reverse-engineer meaning from HTML. Similar approaches are discussed across the ecosystem. Developer-focused articles describe how serving Markdown drastically improves AI consumption of documentation.

Even large documentation platforms like Read the Docs are actively evaluating Markdown-first or alternate Markdown representations, and initiatives like llms.txt explicitly mention content negotiation as a foundational building block for AI-friendly websites. The pattern is always the same: give machines the content, not the website.

TYPO3 Has Been Ready for This All Along

TYPO3 was never about HTML pages. It was always about structured content, semantic fields, clear ownership, and controlled output. The frontend was just one possible channel. Whether content is consumed on a website, in an app, via an API, or by an AI assistant was never supposed to matter.

At b13, we strongly believe that content is the real asset. If your content is well structured, cleanly authored, and stored in a system that respects separation of concerns, distribution becomes a technical detail.

AI didn’t break that model.

It validates it.

Content Negotiation, Used Properly

Content negotiation is not new. It is part of HTTP.

A client can request a resource and express a preference for a specific representation. Browsers usually ask for text/html. AI agents increasingly prefer simpler, semantically clean formats.

With proper content negotiation, you don’t create a second website. You expose the same resource in multiple representations:

  • HTML for humans
  • Markdown for LLMs

Search engines, crawlers, and agents can discover these alternatives through standard mechanisms like <link rel="alternate">. No AI-specific endpoints. No custom sitemaps. Just the web working as intended.
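To make this concrete, here is a minimal client-side sketch of the idea, assuming a server that honors an Accept header of text/markdown (the exact media type, URL, and discovery markup below are illustrative, not prescriptive):

```python
import requests

URL = "https://example.com/some-page"  # hypothetical page URL

# A browser-style request: the default representation is HTML.
html = requests.get(URL, headers={"Accept": "text/html"})
print(html.headers.get("Content-Type"))   # e.g. "text/html; charset=utf-8"

# An AI agent can ask the very same URL for a Markdown representation instead.
md = requests.get(URL, headers={"Accept": "text/markdown"})
print(md.headers.get("Content-Type"))     # e.g. "text/markdown; charset=utf-8"
print(md.text[:200])                      # same content, minus the presentation layer

# Crawlers can also discover the alternative from standard markup in the HTML, e.g.:
#   <link rel="alternate" type="text/markdown" href="https://example.com/some-page.md">
```

Same URL, same content, two representations; which one you get depends on what you ask for.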

This is not an SEO trick. It is protocol-level correctness.

“AI Bots Love Markdown”—A TYPO3 Experiment

Inspired by these ideas, we built a small TYPO3 extension: b13/ai-bots-love-markdown

Not as a gimmick, but as an experiment.

The constraints were strict:

  • no duplicate content
  • no new editorial workflows
  • no additional maintenance
  • no backend coupling

The extension exposes your existing TYPO3 content as Markdown. Each page can be requested as a Markdown representation, enriched with YAML front matter for metadata. TYPO3 advertises this representation via content negotiation and alternate links.
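For illustration, a Markdown representation of a page could look roughly like this; the front matter fields shown here are hypothetical and depend on your page properties:

```markdown
---
title: "The Internet Is No Longer Just for Humans"
description: "Why structured content and content negotiation matter for machine readers."
language: "en"
---

# The Internet Is No Longer Just for Humans

For years, we optimized websites for two roles: editors who create content
and human readers who consume it. That model no longer reflects reality.
```

No navigation, no cookie banner, no layout markup: just the content and its metadata.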

Editors don’t notice it. Integrators don’t maintain it. AI systems actively prefer it.

What We Observed on b13.com

We enabled the extension on b13.com for testing, and it has been running on our production servers for more than a week. Within 24 hours, bot traffic shifted noticeably from HTML to Markdown. No announcement. No promotion. Just discovery.

That shift confirms what many suspected:

AI systems are already optimized to detect and prefer cleaner content representations, which is exactly what we want. 

Making b13 knowledge easier to read, understand, and reuse aligns perfectly with our mission of sharing expertise.

This Is Not About Chasing Trends

This is not about “optimizing for AI”.

It’s about going back to fundamentals:

  • structured content
  • clean separation of concerns
  • multiple output channels
  • correct use of web standards

Markdown just happens to be the format AI understands best right now. Tomorrow it might be something else. TYPO3 doesn’t care—as long as the content model is sound.

And that’s the real point.

Ready to Try It?

The extension works with TYPO3 v13 and v14 and is available in Composer mode.

Install it, enable the Site Set, and observe what happens in your server logs.
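For a Composer-based project, the setup can be as small as this (the package name is the one above; how you enable the Site Set depends on your site configuration):

```bash
# Add the extension to a Composer-based TYPO3 project
composer require b13/ai-bots-love-markdown

# Then enable the extension's Site Set in your site configuration
# and watch your access logs for requests negotiating Markdown.
```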

If you’re still running TYPO3 without Composer, or want to think more strategically about multi-channel content, AI-readability, or long-term content architecture, this is exactly the kind of conversation we have at b13. 

Sometimes the future doesn’t require new content—just better ways to share the content you already have.