llms.txt
TL;DR
llms.txt is a text file on your website that tells AI systems which content they may use and how they should understand your brand.
llms.txt is an emerging standard, proposed in 2024, that acts as a counterpart to robots.txt but specifically for large language models. The file lives in your domain root and contains a structured description of your site, your key pages, AI usage rules, and instructions for how AI systems should cite your content correctly.
Why it matters
Without llms.txt, AI models have to reverse-engineer your site to understand who you are and what your content is about. With a good llms.txt, you give them the right context directly, increasing correct citations and preventing misattribution.
How to use it
- 01 Place the file at /llms.txt in your domain root.
- 02 Start with a short description of your organization and products.
- 03 List your most important pages with absolute URLs and context.
- 04 Add usage permissions (e.g. citation allowed, training disallowed).
- 05 Update on major site changes or new core pages.
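Putting the steps above together, a file might look like the following. This is a hedged sketch: the company name, URLs, and section names are illustrative, loosely following the markdown layout (H1 title, blockquote summary, H2 link sections) used in the llms.txt proposal.

```markdown
# Example Co

> Example Co builds project-management software for small teams.

## Key pages

- [Pricing](https://example.com/pricing): Plans and pricing details
- [Docs](https://example.com/docs): Product documentation

## Usage

- Citation: allowed, please link to the source page
- Training: not permitted
```

The "Usage" section reflects the permissions idea described above; since the standard is still emerging, AI crawlers may interpret (or ignore) such rules differently.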
Example
Scriberank itself publishes an llms.txt at scriberank.com/llms.txt, declaring that we are an AI content platform, stating our pricing, and listing which pages are relevant for AI to cite.
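If you want to sanity-check your own file, a small script can skim its structure. Below is a minimal sketch in Python (the parsing rules and sample content are illustrative, assuming the markdown layout of H1 title, blockquote summary, and H2 sections):

```python
def parse_llms_txt(text: str) -> dict:
    """Split an llms.txt markdown document into its top-level parts.

    Returns the H1 title, the blockquote summary (if any), and a
    mapping of H2 section names to their raw body lines.
    """
    title, summary = None, None
    sections: dict[str, list[str]] = {}
    current = None
    for line in text.splitlines():
        if line.startswith("# ") and title is None:
            title = line[2:].strip()
        elif line.startswith("> ") and summary is None:
            summary = line[2:].strip()
        elif line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        elif current is not None and line.strip():
            sections[current].append(line.strip())
    return {"title": title, "summary": summary, "sections": sections}


# Hypothetical sample content for demonstration.
sample = """# Example Co
> Example Co builds project-management software.

## Key pages
- [Pricing](https://example.com/pricing): Plans and pricing
"""
parsed = parse_llms_txt(sample)
print(parsed["title"])           # Example Co
print(list(parsed["sections"]))  # ['Key pages']
```

A check like this catches the most common mistakes (missing title, empty sections) before AI crawlers ever see the file.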
Related terms
AI Search
GEO (Generative Engine Optimization)
GEO is the practice of optimizing content so that AI systems like ChatGPT, Gemini and Perplexity cite your brand in their answers.
Technical SEO
Schema markup
Schema markup is structured data (JSON-LD) that tells search engines and AI models what a page actually is — an article, product, FAQ or recipe.
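As a minimal sketch, JSON-LD for an article looks like the snippet below (the headline, name, and date are illustrative placeholders; the `@context` and `@type` values follow schema.org):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What is llms.txt?",
  "author": { "@type": "Organization", "name": "Example Co" },
  "datePublished": "2024-01-01"
}
</script>
```

The block sits in the page's HTML head or body, where both search engines and AI crawlers can read it alongside the visible content.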