Control how search engine bots crawl your website. Hide sensitive folders, declare your Sitemap location, and protect your SEO ranking.
Robots.txt is a simple text file located in your website's root directory. It tells search engine crawlers (also called spiders) which parts of your site they may or may not crawl, which makes it a cornerstone of technical SEO.
This command, `User-agent: *`, targets "All Bots". Any rule written below it, such as `Disallow: /admin/`, applies to every crawler that visits your site, from Google to Yandex.
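As a quick sketch, a file using this rule might look like the example below; the `/admin/` folder is just an illustration of a path you may want crawlers to skip:

```txt
# Apply the rules below to every crawler
User-agent: *
# Keep bots out of the admin area (example path)
Disallow: /admin/
```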
Adding a `Sitemap: https://...` line at the bottom of your robots.txt lets search engines discover new content on your site much faster. Even if you don't submit a sitemap through Search Console, bots can still find it here.
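Putting both parts together, a complete robots.txt could look like the sketch below; the example.com URL is only a placeholder for your own sitemap address:

```txt
User-agent: *
Disallow: /admin/

# Placeholder URL: replace with your site's actual sitemap location
Sitemap: https://www.example.com/sitemap.xml
```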