Cluster C3: Network & Security
Robots.txt testing checklist
A single incorrect robots directive can block crawling of critical pages across every locale. A pre-release checklist covering robots rules and crawler simulation prevents accidental deindexing during migrations and platform changes.
Pre-release robots checks
- Validate that locale paths remain accessible and that disallow rules cover private paths only.
- Verify sitemap declaration and canonical links remain crawlable.
- Simulate crawler access for top landing pages and key tool routes.
- Review environment-specific robots behavior before production deploy.
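The crawler-simulation step above can be sketched with Python's standard-library `urllib.robotparser`. The robots.txt body, domain, and routes below are hypothetical placeholders; substitute your own staging rules and top landing pages.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical staging robots.txt: private paths disallowed, sitemap declared.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /internal/
Sitemap: https://example.com/sitemap.xml
"""

# Top landing pages and key tool routes to simulate (placeholder URLs).
ROUTES = [
    "https://example.com/en/",
    "https://example.com/de/pricing",
    "https://example.com/tools/converter",
    "https://example.com/admin/dashboard",  # private: should stay blocked
]

def simulate(robots_txt: str, routes: list[str]) -> dict[str, bool]:
    """Return {url: allowed} for a generic crawler (User-agent: *)."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {url: rp.can_fetch("*", url) for url in routes}

if __name__ == "__main__":
    for url, allowed in simulate(ROBOTS_TXT, ROUTES).items():
        print(f"{'ALLOW' if allowed else 'BLOCK':5} {url}")
```

Running a script like this in CI for each environment makes the "review environment-specific robots behavior" check mechanical rather than manual.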
Practical input/output example
Input
User-agent: *
Disallow: /
Output
crawler access: blocked
indexing risk: critical
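The blocked verdict above can be reproduced with `urllib.robotparser`: a blanket `Disallow: /` under `User-agent: *` denies every path to every crawler. The domain below is a placeholder.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# The wildcard group applies to all crawlers, so every URL is blocked.
print(rp.can_fetch("*", "https://example.com/"))             # False
print(rp.can_fetch("Googlebot", "https://example.com/en/"))  # False
```

Flagging any `can_fetch` result of False on a public landing page as a release blocker catches exactly this class of mistake before deploy.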