Implement robots.txt Configuration for Production and Staging Environments #8

@suyashbhawsar

Description

Overview

We need to serve a robots.txt file on both the production and staging sites: production should let search engines crawl and index everything except a few private paths, while staging should be blocked from crawling entirely.

Tasks

  1. Create robots.txt for Production

    • File Location: public/robots.txt (a static file in the project's public folder).
    • Content:
      User-agent: *
      Disallow: /admin/
      Disallow: /api/
      Allow: /
      
      Sitemap: https://www.devops-dynamics.com/sitemap.xml
      
    • Purpose: Guides crawlers on the production site, keeping /admin/ and /api/ out of search results while allowing everything else.
  2. Create robots.txt for Staging

    • File Location: Create an API route at src/pages/api/robots.ts.
    • Content:
      import type { NextApiRequest, NextApiResponse } from 'next';
      
      export default function handler(req: NextApiRequest, res: NextApiResponse) {
        const isStaging = process.env.NEXT_PUBLIC_ENV === 'staging';
      
        // Build the body from an array of lines: a template literal indented
        // to match the surrounding code would leak that indentation into the
        // served file.
        const lines = isStaging
          ? ['User-agent: *', 'Disallow: /']
          : [
              'User-agent: *',
              'Disallow: /admin/',
              'Disallow: /api/',
              'Allow: /',
              '',
              'Sitemap: https://www.devops-dynamics.com/sitemap.xml',
            ];
      
        res.setHeader('Content-Type', 'text/plain');
        res.send(lines.join('\n'));
      }
    • Purpose: This dynamic route blocks all crawlers on staging and serves the production rules otherwise (a minimal unit check for it is sketched right after this task list).
  3. Add Custom Route Handling

    • File Location: Update next.config.mjs.
    • Content:
      // next.config.mjs is an ES module, so `export default` is required;
      // `module.exports` would throw here.
      /** @type {import('next').NextConfig} */
      const nextConfig = {
        async rewrites() {
          const isStaging = process.env.NEXT_PUBLIC_ENV === 'staging';
          return {
            // beforeFiles rewrites are matched before files in public/; a
            // plain rewrite array is matched after them, so public/robots.txt
            // would always shadow the staging rewrite.
            beforeFiles: isStaging
              ? [{ source: '/robots.txt', destination: '/api/robots' }]
              : [],
            afterFiles: [],
            fallback: [],
          };
        },
      };
      
      export default nextConfig;
    • Purpose: In staging, /robots.txt is rewritten to the dynamic API route; in production no rewrite applies and the static public/robots.txt is served as-is.
  4. Set Environment Variable

    • In Staging: Add NEXT_PUBLIC_ENV=staging to the staging environment's .env file (or the host's environment settings). The rewrite in next.config.mjs reads this variable at build time, so it must be set when the staging build runs, not only at runtime.
    • Purpose: This variable selects the staging behavior in both the rewrite and the API route.
  5. Deploy and Test

    • Production: Ensure that https://www.devops-dynamics.com/robots.txt and https://devops-dynamics.com/robots.txt both return the permissive production rules.
    • Staging: Ensure that https://staging.devops-dynamics.com/robots.txt returns Disallow: / for all crawlers (a fetch-based check is sketched under the acceptance criteria below).
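
As referenced in task 2, here is a minimal sketch of a unit check for the robots handler, using Node's built-in test runner (Node 18+). The fake res object stubs only the two NextApiResponse methods the handler calls, and the import path for the handler is an assumption based on the file location above; adjust it to the test file's real location.

  // Sketch only: run with `node --test` after compiling with tsc (or via
  // ts-node). The import path below is an assumption.
  import { strict as assert } from 'node:assert';
  import { test } from 'node:test';
  import type { NextApiRequest, NextApiResponse } from 'next';
  import handler from '../src/pages/api/robots';

  test('staging robots.txt blocks all crawlers', () => {
    process.env.NEXT_PUBLIC_ENV = 'staging';

    let body = '';
    const headers: Record<string, string> = {};
    // Stub just the parts of NextApiResponse this handler touches.
    const res = {
      setHeader: (name: string, value: string) => { headers[name] = value; },
      send: (payload: string) => { body = payload; },
    } as unknown as NextApiResponse;

    handler({} as NextApiRequest, res);

    assert.equal(headers['Content-Type'], 'text/plain');
    assert.ok(body.includes('Disallow: /'));
  });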

Acceptance Criteria

  • Production robots.txt allows search engines to crawl the site, except the /admin/ and /api/ paths.
  • Staging robots.txt blocks all search engines from crawling the site.
  • The setup is verified on both environments after deployment (one possible check is sketched below).
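
A sketch of a post-deployment check, assuming Node 18+ for the global fetch and run as an ES module (for top-level await). The hostnames come from the tasks above; the substring checks are deliberately loose and only confirm that each environment serves the expected rules.

  // Post-deployment smoke check; hostnames taken from the tasks above.
  const checks: Array<{ url: string; mustInclude: string }> = [
    // Production (both hostnames) should serve the permissive rules.
    { url: 'https://www.devops-dynamics.com/robots.txt', mustInclude: 'Allow: /' },
    { url: 'https://devops-dynamics.com/robots.txt', mustInclude: 'Allow: /' },
    // Staging should disallow everything.
    { url: 'https://staging.devops-dynamics.com/robots.txt', mustInclude: 'Disallow: /' },
  ];

  for (const { url, mustInclude } of checks) {
    const res = await fetch(url);
    const text = await res.text();
    const ok = res.ok && text.includes(mustInclude);
    console.log(`${ok ? 'PASS' : 'FAIL'} ${url}`);
  }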
