What is llms.txt and Why Your Site Needs One
llms.txt helps AI agents understand your site. Learn how to create one and what to include.
What is llms.txt?
llms.txt is a plain-text file you place at the root of your domain that helps AI agents understand your site — what it does, how to use it, and where to find its most important content.
Think of it as the AI-native equivalent of robots.txt. While robots.txt tells crawlers which paths they can visit, llms.txt tells AI systems what your site means and how it should be used.
The format was proposed by Jeremy Howard (fast.ai) and is gaining adoption among API services, developer tools, and content sites that want to be reliably surfaced in AI-powered search and agent workflows.
Why AI agents need it
When a user asks an AI assistant a question, the assistant may search the web and synthesize answers from multiple pages. Without guidance, it has to guess which page to look at, what your API does, and whether your content is trustworthy.
llms.txt answers these questions upfront:
- What does this site do? A 1–2 sentence description.
- Who should use it? The intended audience.
- What are the key pages? Direct links to documentation, API specs, pricing.
- How can it be used programmatically? Links to OpenAPI specs or machine-readable endpoints.
An agent that can read your llms.txt in one request has enough context to either answer the user directly or know exactly where to navigate next — without crawling every page.
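On the agent side, that one-request step can be sketched in a few lines. This is a minimal illustration, not an official parser — the parsing rules (H1 for the name, blockquote lines for the description, markdown links for resources) follow the format described below, and the sample text is abbreviated:

```python
import re

SAMPLE = """# paylog
> Tracks MPP and x402 spending across AI services.

## API
- [Spending report](https://paylog.dev/api/v1/report): GET endpoint
"""

def parse_llms_txt(text: str) -> dict:
    """Pull the H1 site name, blockquote description, and markdown links."""
    lines = text.splitlines()
    name = next((l[2:].strip() for l in lines if l.startswith("# ")), None)
    desc = " ".join(l[2:].strip() for l in lines if l.startswith("> "))
    links = re.findall(r"\[([^\]]+)\]\((https?://[^)]+)\)", text)
    return {"name": name, "description": desc, "links": links}

info = parse_llms_txt(SAMPLE)
# info["name"] is the site name; info["links"] holds (title, url) pairs
```

From that single parse, an agent has the site's purpose and a list of candidate URLs to follow — no crawl required.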
What to include
A minimal llms.txt has three parts:
- H1 — site name
- Blockquote — one-paragraph description
- Sections with markdown links — key resources grouped by topic
Optional but useful:
- Pricing summary
- Auth/payment requirements
- API endpoint list with one-line descriptions
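Put together, a minimal skeleton looks like this (the site name, URLs, and pricing line are placeholders):

```markdown
# Example Site

> One-paragraph description of what the site does and who it's for.

## Docs
- [Getting started](https://example.com/docs): Setup and first request
- [API reference](https://example.com/api): Endpoint list with parameters

## Pricing
- Free tier: 100 requests/day; paid plans listed at /pricing
```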
Example: paylog.dev's llms.txt
Here's the actual llms.txt from paylog.dev:
```markdown
# paylog

> paylog.dev tracks your MPP (Machine Payments Protocol) and x402 spending across AI services.
> Call /api/v1/report with your Tempo wallet address to see a breakdown of what you've spent and where.
> No signup required. Pay-per-use via MPP ($0.001 USDC) or x402 ($0.001 USDC on Base).

## API
- [Spending report (Tempo/MPP)](https://paylog.dev/api/v1/report): GET /api/v1/report?wallet=0x...&from=YYYY-MM-DD — returns spending breakdown by service
- [Spending report (Base/x402)](https://paylog.dev/api/v1/x402/report): GET /api/v1/x402/report?wallet=0x... — returns x402 spend on Base
- [Insights](https://paylog.dev/api/v1/insights): GET /api/v1/insights?wallet=0x...&from=YYYY-MM-DD — cost optimization tips
- [OpenAPI spec](https://paylog.dev/openapi.json): Machine-readable spec with x-payment-info (AgentCash Discovery format)

## Pricing
All API endpoints are pay-per-use. No API key required.
- /api/v1/report — $0.001 USDC via MPP (Tempo) or x402 (Base)
- /api/v1/insights — $0.001 USDC via MPP (Tempo)

## Optional tools
- [AI Bot Score](https://paylog.dev/score): Free tool to audit your site's robots.txt and AI crawler configuration
- [Blog](https://paylog.dev/blog): Guides on robots.txt, llms.txt, x402, and AI-era site configuration
```
This file is publicly accessible at https://paylog.dev/llms.txt and is fetched by agents as part of the AgentCash Discovery flow.
Where to place it
Put llms.txt at your domain root — the same directory as robots.txt:
https://yourdomain.com/llms.txt
If you need dynamic generation, some frameworks (like Next.js with the App Router) can serve the file from a route handler. For most sites, a static file is fine.
How to write it
Keep it short
AI agents retrieve and process llms.txt as part of a larger request chain. A file under 2KB is ideal. A 50KB marketing page converted to markdown is not useful.
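To keep yourself honest about the 2KB guideline, a size check is a one-liner you can run locally or in CI. This is a sketch; the file path and the exact limit are assumptions:

```python
import os

MAX_BYTES = 2 * 1024  # ~2KB guideline from above

def llms_txt_within_budget(path: str = "llms.txt", limit: int = MAX_BYTES) -> bool:
    """True if the file exists and fits the size budget."""
    return os.path.exists(path) and os.path.getsize(path) <= limit
```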
Use plain markdown
The format is intentionally simple — headers, blockquotes, and links. Avoid tables, embedded HTML, or complex nesting.
Link to machine-readable resources first
If you have an OpenAPI spec or a structured API, link to it near the top. Agents can use structured specs more reliably than prose descriptions.
Update it when your API changes
llms.txt is not automatically generated from your code. Add it to your release checklist when you add or remove endpoints.
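One way to catch that drift automatically is to diff your OpenAPI paths against the llms.txt text in CI. A sketch, assuming your spec is available as JSON and that every path should appear verbatim in the file:

```python
import json

def paths_missing_from_llms(openapi_text: str, llms_text: str) -> list:
    """Return OpenAPI paths that llms.txt never mentions."""
    spec = json.loads(openapi_text)
    return [p for p in spec.get("paths", {}) if p not in llms_text]

# Hypothetical example: the spec gained /api/v1/insights,
# but llms.txt still only documents /api/v1/report.
spec = '{"paths": {"/api/v1/report": {}, "/api/v1/insights": {}}}'
llms = "- GET /api/v1/report — spending breakdown"
missing = paths_missing_from_llms(spec, llms)
# missing == ["/api/v1/insights"]
```

Failing the build when `missing` is non-empty turns "update llms.txt" from a checklist item into an enforced step.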
Check your configuration
paylog.dev/score checks whether llms.txt exists on your domain and includes it in your AI-readiness score. It's a quick way to confirm the file is accessible and served correctly.