Been experimenting with how LLMs crawl and interpret content. Realized most sites aren’t optimized for them at all. The popular ones, though, have already started adding an llms.txt file.
So I built a free tool that converts a site’s sitemap.xml into a markdown-style llms.txt file (kind of like robots.txt, but for LLMs). It lists the useful pages for AI agents to learn from or link to, in a plain, readable format.
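For anyone curious what the conversion boils down to, here’s a rough sketch in Python. This is not the tool’s actual code; the sitemap URL is a placeholder and the title-from-slug trick is just a stand-in for real page titles:

```python
import urllib.request
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder, not a real sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch and parse the sitemap
with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Build a markdown-style llms.txt: a title plus a bulleted list of pages
lines = ["# Example Site", "", "## Pages", ""]
for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    path = urlparse(url).path.strip("/")
    # Crude title guess from the URL slug; a real generator would fetch page titles
    title = path.split("/")[-1].replace("-", " ").title() if path else "Home"
    lines.append(f"- [{title}]({url})")

with open("llms.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

That’s the gist of the format: a title, a section heading, and a list of links an agent can actually follow.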
Try it out: https://keploy.io/llmstxt-generator
Inspired by the idea that the next SEO battle isn’t just about human-readable content, but about AI-readable structure.
Would love feedback or ideas on what to add next.