
Commit 9aad1d3

osterman authored and claude committed
feat: Make robots.txt environment-aware using DEPLOYMENT_HOST
- Add generate-robots-txt.js script that uses the DEPLOYMENT_HOST env var
- Fall back to atmos.tools (production) when DEPLOYMENT_HOST is not set
- Add robots.txt to static/.gitignore since it's now generated at build time
- Run generation in the prebuild step alongside the themes.json copy

This ensures non-production deployments (staging, preview) have the correct sitemap URL in robots.txt, matching the pattern used in algolia/reindex.sh.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
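The env-var fallback described in the commit message can be sketched with a one-liner; the hostname below is a hypothetical preview URL, not one taken from this repo:

```shell
# Hypothetical preview host; production builds leave DEPLOYMENT_HOST unset,
# and the script then falls back to atmos.tools.
DEPLOYMENT_HOST="pr-123.example.com" node -e '
const host = process.env.DEPLOYMENT_HOST || "atmos.tools";
console.log(`Sitemap: https://${host}/sitemap.xml`);
'
```

With the variable unset, the same snippet prints the production URL, which is why production deploys need no extra configuration.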
Parent: f6d182b · Commit: 9aad1d3

File tree: 3 files changed (+28 −1 lines)

website/package.json

Lines changed: 1 addition & 1 deletion

```diff
@@ -4,7 +4,7 @@
   "license": "Apache-2.0",
   "scripts": {
     "docusaurus": "docusaurus",
-    "prebuild": "cp ../pkg/ui/theme/themes.json static/themes.json",
+    "prebuild": "cp ../pkg/ui/theme/themes.json static/themes.json && node scripts/generate-robots-txt.js",
     "start": "BROWSER=open docusaurus start",
     "build": "docusaurus build",
     "postbuild": "cp build/llms.txt static/llms.txt && cp build/llms-full.txt static/llms-full.txt",
```
website/scripts/generate-robots-txt.js

Lines changed: 26 additions & 0 deletions

```diff
@@ -0,0 +1,26 @@
+#!/usr/bin/env node
+
+/**
+ * Generates robots.txt with environment-aware sitemap URL.
+ * Uses DEPLOYMENT_HOST env var with fallback to atmos.tools (production).
+ */
+
+const fs = require('fs');
+const path = require('path');
+
+const DEPLOYMENT_HOST = process.env.DEPLOYMENT_HOST || 'atmos.tools';
+
+const robotsTxt = `# Algolia-Crawler-Verif: 10F61B92D9EB1214
+
+# Allow all crawlers to index all content
+User-agent: *
+Allow: /
+
+# Sitemap location
+Sitemap: https://${DEPLOYMENT_HOST}/sitemap.xml
+`;
+
+const outputPath = path.join(__dirname, '..', 'static', 'robots.txt');
+fs.writeFileSync(outputPath, robotsTxt);
+
+console.log(`Generated robots.txt with sitemap URL: https://${DEPLOYMENT_HOST}/sitemap.xml`);
```

website/static/.gitignore

Lines changed: 1 addition & 0 deletions

```diff
@@ -1,2 +1,3 @@
 glossary.json
+robots.txt
 themes.json
```

0 commit comments
