Declare a policy in /agents.txt
MetricSpot looks for /agents.txt — a new draft standard for telling autonomous AI agents what your site allows, what it costs, and where the action endpoints are.
What this check does
Fetches https://yourdomain.com/agents.txt and verifies the file exists and parses as a valid agents.txt document. The check passes silently if the file is present; it fails (informationally — this isn’t a blocker for most sites) when the file is missing and your audit profile is “AI-ready.”
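The pass/fail logic can be sketched as a small classifier over the fetch result (an assumed reconstruction for illustration, not MetricSpot's actual implementation):

```python
def check_agents_txt(status: int, content_type: str, body: str) -> str:
    """Classify an /agents.txt fetch result as pass, warn, or fail."""
    if status != 200:
        return "fail"  # file missing: informational failure on AI-ready profiles
    if "text/plain" not in content_type:
        return "warn"  # served, but with an unexpected MIME type
    if not body.strip():
        return "warn"  # present but empty: no policy actually declared
    return "pass"
```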
Why it matters
agents.txt is to autonomous AI agents what robots.txt is to crawlers — a public file at a well-known URL that declares your site’s policy.
The difference: robots.txt is binary (allow / disallow per path). Autonomous agents need more structured information:
- Which actions are allowed. Can an agent place an order? Read a customer’s invoice? Cancel a booking?
- Which require authentication. Most write actions need a logged-in session; the agent needs to know whether to expect a sign-in flow.
- What it costs. Some platforms charge per-API-call for agent traffic; some block agents that haven’t paid.
- Where the action endpoints live. Agents prefer structured JSON endpoints over scraping HTML — agents.txt can point them at /api/ or a schema.org/Action manifest.
- Citation policy. Whether the agent should link back to your site when quoting you in an answer.
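Because the format is line-oriented key/value pairs in the robots.txt tradition, a consumer can read it with a few lines of code. A minimal parser sketch (the directive names are taken from the example later on this page; the draft spec may add more):

```python
def parse_agents_txt(text: str) -> list[tuple[str, str]]:
    """Parse agents.txt into (directive, value) pairs.

    Comments (#) and blank lines are skipped; directive names are
    treated as case-insensitive and normalised to lowercase.
    """
    records = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if not line or ":" not in line:
            continue
        key, value = line.split(":", 1)  # split only once: values may contain URLs
        records.append((key.strip().lower(), value.strip()))
    return records
```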
The format is still evolving (the agentstxt.org proposal is the most-cited reference). Early adopters get a free signal that they’re agent-friendly, which AI platforms can prefer when there’s ambiguity.
How to fix it
Create /public/agents.txt (or wherever your server serves static files from). A minimal example:
# agents.txt — autonomous-agent policy for example.com
# See https://agentstxt.org/
Contact: hello@example.com
Sitemap: https://example.com/sitemap.xml
# Public read: anyone, no auth, no rate limit beyond the sitewide one
User-agent: *
Allow: /
Allow: /docs/
Allow: /blog/
Allow: /pricing/
# API surface for structured access (preferred over HTML scraping)
Api: https://example.com/api/openapi.json
# Account-gated actions
User-agent: *
Auth-required: /account/, /api/billing/, /api/audits/
Cost-per-call: free
# Citation policy: link back to the source page
Citation-policy: link
# Block training crawlers; allow live-fetch agents
User-agent: GPTBot
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: ChatGPT-User
Allow: /
User-agent: PerplexityBot
Allow: /
User-agent: Claude-Web
Allow: /
Astro / Next.js / static sites — drop the file in public/ and it ships at /agents.txt automatically.
WordPress — upload to the document root via FTP or use a plugin that registers a virtual route at /agents.txt.
Server-rendered apps — add an explicit route that returns the file with Content-Type: text/plain.
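For server-rendered apps, the route can be as small as this stdlib sketch (the static/agents.txt path is an assumption about your layout; Flask or Express equivalents are one route each):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

AGENTS_FILE = Path("static/agents.txt")  # hypothetical location; adapt to your app

class AgentsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/agents.txt" and AGENTS_FILE.exists():
            body = AGENTS_FILE.read_bytes()
            self.send_response(200)
            # Explicit text/plain so agents don't get a framework's HTML default
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)
```

The key detail is the explicit Content-Type header; frameworks that default unknown routes to text/html will confuse strict consumers.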
Pair it with the standard signals.
- robots.txt — the binary allow/disallow for traditional crawlers.
- llms.txt — a curated index of your highest-value content for AI training and answer engines.
- Allow AI crawlers — the robots.txt-side decision for GPTBot / ClaudeBot / PerplexityBot / Google-Extended.
The three files complement each other: robots.txt for crawl scope, llms.txt for content quality signal, agents.txt for action policy.
Audit yourself:
curl -sI https://yourdomain.com/agents.txt
Expect 200 OK and content-type: text/plain.
Frequently asked questions
Is agents.txt an official standard?
Not yet. It’s a draft proposal at agentstxt.org, gaining traction with AI platforms and crawler operators. The format is settling; the URL location (/agents.txt) is stable. Adopting now is forward-compatible.
Will not having agents.txt hurt my SEO?
No. agents.txt is opt-in metadata for autonomous agents, not a search-ranking signal. It only matters if you want to communicate explicit policy to agents that respect the file.
How is agents.txt different from llms.txt?
llms.txt (per the llmstxt.org proposal) is a content index — your best pages, summarized, for LLMs to use as a training/grounding signal. agents.txt is a policy file — what actions are allowed, what’s gated, what costs money. Use both.
Last updated 2026-05-11