Set Up robots.txt for Production and Staging Environments
Overview
We need to set up robots.txt files for both our production and staging websites. This will help search engines index the production site correctly and prevent them from indexing the staging site.
Tasks
- Create robots.txt for Production
  - File Location: Inside the public folder of the project.
  - Content:

    ```
    User-agent: *
    Disallow: /admin/
    Disallow: /api/
    Allow: /
    Sitemap: https://www.devops-dynamics.com/sitemap.xml
    ```

  - Purpose: This file will guide search engines on what to index on the production site.
- Create robots.txt for Staging
  - File Location: Create an API route at src/pages/api/robots.ts.
  - Content:

    ```ts
    import type { NextApiRequest, NextApiResponse } from 'next';

    export default function handler(req: NextApiRequest, res: NextApiResponse) {
      const isStaging = process.env.NEXT_PUBLIC_ENV === 'staging';
      // Staging blocks all crawlers; production allows everything except /admin/ and /api/.
      const robotsTxt = isStaging
        ? `User-agent: *
    Disallow: /`
        : `User-agent: *
    Disallow: /admin/
    Disallow: /api/
    Allow: /
    Sitemap: https://www.devops-dynamics.com/sitemap.xml`;
      res.setHeader('Content-Type', 'text/plain');
      res.send(robotsTxt);
    }
    ```

  - Purpose: This dynamic route will block search engines from indexing the staging site.
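  - Verification (optional): a minimal smoke test for the handler. This is a sketch; the fake response object, import path, and assertions are our assumptions, not part of this issue:

    ```ts
    // Smoke test for the robots handler (sketch). Only the two response
    // methods the handler actually calls are faked.
    import type { NextApiRequest, NextApiResponse } from 'next';
    import handler from './robots'; // adjust the relative path as needed

    function render(env: string | undefined): string {
      process.env.NEXT_PUBLIC_ENV = env;
      let body = '';
      const res = {
        setHeader: () => res, // the handler only sets Content-Type
        send: (text: string) => { body = text; },
      } as unknown as NextApiResponse;
      handler({} as NextApiRequest, res);
      return body;
    }

    console.assert(render('staging') === 'User-agent: *\nDisallow: /');
    console.assert(render('production').includes('Sitemap: https://www.devops-dynamics.com/sitemap.xml'));
    ```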
- Add Custom Route Handling
  - File Location: Update next.config.mjs.
  - Content:

    ```js
    // next.config.mjs is an ES module, so use `export default`
    // rather than CommonJS `module.exports`.
    export default {
      async rewrites() {
        return [
          {
            source: '/robots.txt',
            destination:
              process.env.NEXT_PUBLIC_ENV === 'staging' ? '/api/robots' : '/robots.txt',
          },
        ];
      },
    };
    ```

  - Purpose: This ensures that the correct robots.txt is served based on the environment (production or staging).
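  - Note: rewrites() is evaluated when the app is built or the server starts, so NEXT_PUBLIC_ENV must already be set in that deployment's environment. As a sketch, the destination choice can be pulled into a helper and asserted (the helper name is ours, not from this issue):

    ```ts
    // Sketch: the same destination logic used in next.config.mjs.
    export function robotsDestination(env: string | undefined): string {
      return env === 'staging' ? '/api/robots' : '/robots.txt';
    }

    console.assert(robotsDestination('staging') === '/api/robots');
    console.assert(robotsDestination('production') === '/robots.txt');
    console.assert(robotsDestination(undefined) === '/robots.txt'); // unset falls back to production behavior
    ```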
- Set Environment Variable
  - In Staging: Add NEXT_PUBLIC_ENV=staging to the .env file.
  - Purpose: This variable determines whether the environment is staging, so the correct robots.txt is served.
- Deploy and Test
  - Production: Ensure that https://www.devops-dynamics.com/robots.txt and https://devops-dynamics.com/robots.txt work correctly.
  - Staging: Ensure that https://staging.devops-dynamics.com/robots.txt blocks all crawlers.
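  - One way to script these checks after each deploy (a sketch; assumes Node 18+ for the global fetch, and the expected substrings follow from the robots.txt contents above):

    ```ts
    // Post-deploy smoke check (sketch).
    const checks: Array<{ url: string; expect: (body: string) => boolean }> = [
      { url: 'https://www.devops-dynamics.com/robots.txt', expect: (b) => b.includes('Sitemap:') },
      { url: 'https://devops-dynamics.com/robots.txt', expect: (b) => b.includes('Sitemap:') },
      { url: 'https://staging.devops-dynamics.com/robots.txt', expect: (b) => b.trim() === 'User-agent: *\nDisallow: /' },
    ];

    async function main(): Promise<void> {
      for (const { url, expect } of checks) {
        const res = await fetch(url);
        const body = await res.text();
        console.log(`${res.ok && expect(body) ? 'PASS' : 'FAIL'} ${url}`);
      }
    }

    main().catch(console.error);
    ```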
Acceptance Criteria
- Production robots.txt allows search engines to index the site, except for specific directories.
- Staging robots.txt blocks all search engines from indexing the site.
- The setup is verified after deployment.