AI SEO Foundations

Module 2 — Technical Foundations

Lesson 6: Crawlability, robots.txt, llms.txt, and AGENTS.md

Last updated: 2026-04-27

Before an agent can read a page, it has to be allowed to fetch it, find it, and parse it efficiently. Three tiny files at the root of a site — robots.txt, sitemap.xml, and the newer llms.txt — control most of that. A fourth, AGENTS.md, has emerged as a way to give coding agents a fast on-ramp.
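The fetch-permission part of this is mechanical enough to check in code. The sketch below uses Python's standard `urllib.robotparser` against an illustrative robots.txt; the domain, paths, and rules are made up for the example, though GPTBot is a real AI crawler user-agent token.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: block one AI crawler from a section,
# allow everyone else, and advertise the sitemap.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /private/

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# An agent identifying as GPTBot is refused the disallowed section
# but may fetch everything else.
print(rp.can_fetch("GPTBot", "https://example.com/private/page"))  # False
print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))     # True

# The Sitemap line is how crawlers discover the URL inventory.
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```

Note that robots.txt is advisory: well-behaved crawlers run exactly this kind of check before fetching, but nothing enforces it, which is why the other files (sitemap.xml for discovery, llms.txt and AGENTS.md for orientation) matter once access is granted.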

[Blackboard: Four small files do most of the routing]