Visually configure crawl rules and generate a valid robots.txt file
1. Add user-agent groups: Each group targets specific crawlers (e.g., Googlebot, Bingbot, or * for all). Optionally set a crawl-delay (honored by some crawlers such as Bingbot, but ignored by Googlebot) and add allow/disallow rules.
2. Configure rules: For each group, add paths to allow or disallow. Use * as a wildcard (e.g., Disallow: /admin/* blocks every path under /admin/). Note that wildcards are an extension honored by major crawlers, not part of the original robots.txt standard.
3. Add sitemaps: Enter full sitemap URLs (e.g., https://example.com/sitemap.xml) to help crawlers discover your content faster.
4. Generate & download: Preview the robots.txt content, then copy or download the file. Upload it to the root of your site so it is reachable at https://yourdomain.com/robots.txt — crawlers only look for it there.
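A minimal sketch of what a generated file might look like, and one way to sanity-check it before uploading using Python's standard urllib.robotparser. The paths and sitemap URL are illustrative placeholders, not output from this tool:

```python
import urllib.robotparser

# Example robots.txt: a wildcard group with a crawl-delay and an admin
# block, a Googlebot-specific group, and a sitemap declaration.
ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Allow: /

User-agent: Googlebot
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Admin pages are blocked for generic crawlers, the rest is allowed.
print(parser.can_fetch("*", "https://example.com/admin/settings"))    # False
print(parser.can_fetch("*", "https://example.com/blog/post"))         # True

# Googlebot matches its own group, so only that group's rules apply.
print(parser.can_fetch("Googlebot", "https://example.com/private/x")) # False
```

Note that when a crawler matches a specific user-agent group (here, Googlebot), it follows only that group's rules and ignores the * group, so repeat any universal rules inside each specific group if you want them to apply everywhere.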