Last issue I asked you to check your /robots.txt. A few of you wrote back — you found it, you read it, and then you asked the obvious next question: what's /llms.txt?
That's what this issue is about.
WHY THIS WEEK IS EXACTLY THE RIGHT TIME
One week ago, Cloudflare launched "Markdown for Agents" — a feature that converts HTML to Markdown in real time at the CDN edge when an AI agent requests it. Their benchmarks: a page that costs 16,000 tokens to process as HTML costs 3,000 tokens as Markdown. An 80% reduction.
This isn't incremental. It means AI agents processing thousands of pages per query just became economically viable. The plumbing for an agentic web is being built right now, at the CDN layer.
The businesses that have already introduced themselves to that infrastructure are building a lead that compounds quietly.
THE PROBLEM
When a human visits your website, they navigate menus, skip the footer, find what they need. Messy, but it works.
When an AI agent visits your website — because someone asked it "find me a good HVAC company" or "what's the best project management tool for a small team?" — it hits your HTML: navigation, cookie banners, JavaScript, sidebars, marketing copy. All noise. The actual useful content is buried.
Google handles this by crawling your site over days, indexing everything, serving cached results. AI agents don't work like that. They need to understand your business right now, while answering a user's question. They have a limited context window and seconds to work with.
llms.txt is a file that lives at /llms.txt on your domain. It's a clean, structured summary of who you are and what you do — written for the machine doing the routing, not the human browsing the result. Think of it as introducing yourself to AI agents before they have to guess.
844,000+ websites have already done it — including Stripe, Cloudflare, and Anthropic's own Claude documentation.
THE GAP IS BIGGER THAN YOU THINK
We spent the last several weeks auditing businesses across 21 industries — HVAC companies, law firms, dentists, restaurants, marketing agencies, auto repair shops, electricians. 497 businesses total.
The average AI readiness score: 49 out of 100.
But the gap between industries tells the real story.
Marketing agencies average 72/100. They know about llms.txt. They know about structured data. They've implemented it because their clients ask about it and their employees follow this space. They're not better businesses — they're just closer to the information.
Electricians average 31/100. HVAC companies average 39/100. Veterinary clinics average 38/100.
Same economy. Same AI agents doing the routing. Very different visibility.
The electrician who scores 31/100 isn't doing anything wrong. They're running a legitimate business, doing good work, and their website looks fine to a human visitor. But to an AI agent trying to answer "who's the best electrician near me?" — they're nearly invisible. The structured signals aren't there. The introduction hasn't been made.
The marketing agency that scores 72/100 didn't get lucky. They systematically made themselves visible to the systems doing the routing.
That's the gap. And it's not closing on its own.
WHAT LLMS.TXT ACTUALLY IS
The format is simple by design. It lives at /llms.txt on your domain. It's a plain text file written in Markdown that tells AI agents:
What your business does
Who you serve
Where to find your key pages — services, pricing, contact, about
What makes you different
Here's how Stripe does it:
# Stripe

> Stripe is a technology company that provides financial infrastructure for businesses.

## Payments

- Stripe Payments (stripe.com/payments): Accept payments online and in person globally.
- Payment methods: Offer popular local payment methods around the world.

## Billing

- Stripe Billing (stripe.com/billing): Manage subscriptions and recurring revenue.
That's it. Your name. One sentence about what you do. Your main services with links. It takes 20 minutes for a professional to write a good one — less than that to write a basic one.
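To make that concrete, here's a minimal sketch for a hypothetical local business. The name, URLs, and services below are placeholders, not a real company:

```markdown
# Acme Heating & Air

> Acme Heating & Air is a residential HVAC company serving the Springfield area since 1998.

## Services

- Repairs (example.com/repairs): Same-day furnace and AC repair.
- Installation (example.com/installation): New system installs with free estimates.

## Company

- About (example.com/about): Licensed, insured, family-owned.
- Contact (example.com/contact): Phone, hours, and service area.
```

Swap in your own name, pages, and one honest sentence per service, and you have a basic file.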
844,000 sites have it. Most of your competitors probably don't.
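If you want to spot-check a competitor yourself, a few lines of Python will do it. This is a rough sketch using only the standard library; it just builds the well-known URL and reports whether the site answers with a 200:

```python
import urllib.request
from urllib.parse import urlparse


def llms_txt_url(domain: str) -> str:
    """Build the /llms.txt URL from a bare domain or a full URL."""
    if "//" not in domain:
        domain = "https://" + domain
    parsed = urlparse(domain)
    return f"{parsed.scheme}://{parsed.netloc}/llms.txt"


def has_llms_txt(domain: str, timeout: float = 5.0) -> bool:
    """Return True if the site serves something at /llms.txt."""
    req = urllib.request.Request(
        llms_txt_url(domain),
        headers={"User-Agent": "llms-txt-check/0.1"},  # identify the script politely
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        # DNS failure, timeout, 404, etc. all count as "no file found"
        return False
```

Run `has_llms_txt("stripe.com")` against a handful of competitors and you'll see the gap firsthand.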
THE HONEST CAVEAT
Here's what nobody tells you: no major AI platform has officially confirmed they read llms.txt files.
So why bother?
Three reasons:
The trajectory is clear. AI coding tools like Cursor and GitHub Copilot actively use llms.txt to understand APIs. As agentic browsing matures, general-purpose agents are likely to follow.
Cloudflare is building the infrastructure. "Markdown for Agents" — launched February 2026 — is CDN-level support for the agentic web. The big players are treating this as real infrastructure, not a thought experiment.
The cost is 20 minutes. If it helps — and the trajectory suggests it will — you'll be glad you did it. If it doesn't, you lost 20 minutes.
But — and this matters — llms.txt is one signal. There are eight.
The businesses scoring 72/100 in our dataset didn't get there from an llms.txt alone. They have structured data. They have clean meta descriptions. They have AI crawlers allowed in their robots.txt. They have contact and about pages that are easy to find. The llms.txt is the introduction — but the rest of the signals are the handshake.
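On the robots.txt point: as a sketch, explicitly allowing the major AI crawlers looks like this. The user-agent tokens below are the ones OpenAI, Anthropic, and Perplexity currently publish; check each vendor's documentation for the current names before relying on them:

```text
# Allow AI crawlers to read the site
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Default rules for everyone else
User-agent: *
Allow: /
```

If your robots.txt blocks these bots with `Disallow: /`, no llms.txt file will save you — the agent never gets in the door.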
WHERE DO YOU STAND?
You can find out in about 30 seconds.
We built a free Agent Readiness Auditor at aaoweekly.com/audit. Enter your domain. Get your score across all eight signals, a grade, and specific recommendations for what's missing.
The average across 497 businesses is 49/100. Most people who run it are surprised by what they find — both what's working and what isn't.
Run it. See where you land.
Next issue: structured data (JSON-LD) — the signal that 62% of businesses are missing, why it matters more than llms.txt right now, and what it actually looks like when you have it.
Subscribe free at aaoweekly.com. Forward this to a business owner who should know their score.
