A robust solution for migrating large Contentful spaces without hitting rate limits.
Migrate thousands of assets and entries between Contentful spaces by intelligently splitting them into manageable batches. Perfect for moving content between regions, environments, or organizations.
New to this tool? Start with the Proven Migration Workflow - our battle-tested, step-by-step guide that has successfully migrated 10,000+ assets (12.6GB) and 25,000+ entries with 4,000+ circular dependencies.
What's inside:
- ✅ Complete workflow from export to validation restore
- ✅ Two proven import methods (Official CLI vs. Custom Script)
- ✅ Draft cleanup (2,000+ drafts) and validation stripping (50+ content types)
- ✅ Assets-first import strategy
- ✅ Brute force publishing for massive circular dependencies
- ✅ Real production examples with timelines (12-14 hours total)
📖 Read the Proven Workflow Guide →
- Batch Processing: Automatically split large exports into configurable batch sizes
- Client-Side Rate Limiting: Token bucket algorithm enforces API rate limits (10 req/sec, 36K req/hour)
- Draft Cleanup: Identify and remove invalid/orphan draft entries before migration
- Smart Publishing: Multiple publishing strategies for handling circular dependencies
- Tag Management: Tag and filter draft entries for selective publishing
- Brute Force Publishing: Repeatedly publish drafts until circular dependencies resolve
- Smart Relationships: Maintains asset-entry relationships across batches
- Resume Support: Automatically resume failed or interrupted migrations
- Progress Tracking: Detailed logs and state management
- Validation: Post-migration validation to ensure data integrity
- Retry Logic: Configurable retry attempts with exponential backoff
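The client-side rate limiting above can be pictured as a token bucket: tokens refill continuously at the allowed rate, and each API call consumes one. The sketch below is illustrative only; the actual implementation in bin/rateLimiter.js may differ.

```javascript
// Minimal token-bucket sketch: `capacity` tokens, refilled at
// `refillPerSec` tokens per second. A request proceeds only once
// a token is available, which caps the sustained request rate.
class TokenBucket {
  constructor(capacity, refillPerSec) {
    this.capacity = capacity;
    this.tokens = capacity;
    this.refillPerSec = refillPerSec;
    this.last = Date.now();
  }
  refill() {
    const now = Date.now();
    this.tokens = Math.min(
      this.capacity,
      this.tokens + ((now - this.last) / 1000) * this.refillPerSec
    );
    this.last = now;
  }
  // Resolves once a token has been consumed.
  async take() {
    for (;;) {
      this.refill();
      if (this.tokens >= 1) { this.tokens -= 1; return; }
      await new Promise(r => setTimeout(r, 50)); // wait for refill
    }
  }
}
```

With `new TokenBucket(10, 10)` a short burst of up to 10 requests is allowed, then sustained traffic settles at 10 req/sec, matching the limits in the config.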
- ✅ Migrating 1,000+ assets and entries
- ✅ Moving content between Contentful regions (US → EU)
- ✅ Copying content between organizations
- ✅ Environment cloning with large datasets
- ✅ Avoiding "Too Many Requests" (429) errors
Importing large Contentful exports (4,000+ assets, 10,000+ entries) in a single run typically causes:
- Rate limiting errors (429 Too Many Requests)
- Failed imports
- Lost time and frustration
This tool:
- Splits your export into batches (500-700 assets each)
- Maintains relationships between assets and entries
- Imports batches sequentially with delays
- Retries failed batches automatically
- Validates migration success
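The retry behavior can be sketched as a wrapper with exponential backoff; `withRetries` is a hypothetical helper whose option names deliberately mirror the `maxRetries` and `retryDelay` keys in batch-config.json:

```javascript
// Sketch of the retry policy: the delay doubles on each failed
// attempt, starting at `retryDelay` ms, for up to `maxRetries`
// retries after the initial attempt.
async function withRetries(fn, { maxRetries = 3, retryDelay = 5000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err; // give up
      const delay = retryDelay * 2 ** attempt; // exponential backoff
      await new Promise(r => setTimeout(r, delay));
    }
  }
}
```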
- Node.js >= 18.0.0 (LTS recommended)
- npm or yarn
- Contentful Management Token (CMA)
# 1. Clone the repository
git clone https://github.com/faisalbasra/contentful-batch-migrator.git
cd contentful-batch-migrator
# 2. Install dependencies
npm install
# 3. Set up configuration
npm run setup # Interactive setup wizard (or manual setup below)
# Manual setup (alternative to setup wizard):
cp config/batch-config.example.json config/batch-config.json
cp config/cascade-config.example.json config/cascade-config.json
# Edit config files with your Contentful credentials
# 4. Run the migration
npm run split # Step 1: Split export into batches
npm run import # Step 2: Import batches sequentially
npm run validate # Step 3: Validate migration success
# 5. Publish content (if needed)
npm run cascade-publish # Smart dependency-aware publishing
# OR
npm run publish-all # Brute force publishing (for circular dependencies)

First time user? Check out the Getting Started Guide for a detailed walkthrough.
See all available commands:
npm run help # Show all available commands
# Migration Commands
npm run cleanup-drafts # Analyze and remove invalid drafts
npm run split # Split export into batches
npm run import # Import all batches
npm run import:cli # Import using CLI (recommended for assets)
npm run validate # Validate migration
npm run resume # Resume failed import
npm run resume:cli # Resume failed CLI import
# Publishing Commands
npm run publish-assets # Publish all draft assets
npm run publish-assets:dry-run # Preview asset publishing
npm run cascade-publish # Smart publish with dependency resolution
npm run cascade-publish:dry-run # Preview cascade publish
npm run publish-all # Brute force publish all drafts
npm run publish-all:dry-run # Preview brute force publish
npm run tag-drafts <tag> [opts] # Tag/untag draft entries
# Cleanup Commands
npm run clean # Remove batches directory
npm run clean:all # Remove batches and export
npm run clean-space [opts] # Delete entries/content types/assets from space
npm run clean-space:dry-run # Preview space cleanup

Edit batch-config.json:
{
"batchSize": 400,
"sourceFile": "./contentful-export/exported-space.json",
"sourceAssetsDir": "./contentful-export",
"outputDir": "./batches",
"targetSpace": {
"spaceId": "YOUR_TARGET_SPACE_ID",
"environmentId": "master",
"managementToken": "YOUR_CMA_TOKEN",
"host": "api.contentful.com"
},
"importOptions": {
"uploadAssets": true,
"skipContentPublishing": false,
"delayBetweenBatches": 180000,
"maxRetries": 3,
"retryDelay": 5000
},
"rateLimits": {
"enabled": true,
"requestsPerSecond": 10,
"requestsPerHour": 36000,
"verbose": true
}
}

| Option | Description | Default | Recommended |
|---|---|---|---|
| batchSize | Assets per batch | 400 | 400-700 |
| delayBetweenBatches | Wait time between batches (ms) | 180000 | 180000-300000 |
| maxRetries | Retry attempts per batch | 3 | 3-5 |
| retryDelay | Initial retry delay (ms) | 5000 | 5000-10000 |
| rateLimits.enabled | Enable client-side rate limiting | true | true |
| rateLimits.requestsPerSecond | Max requests per second | 10 | 10 |
| rateLimits.requestsPerHour | Max requests per hour | 36000 | 36000 |
📖 Rate limiting details: docs/RATE-LIMITING.md
First, export your content from the source Contentful space:
npx contentful-export \
--space-id SOURCE_SPACE_ID \
--management-token SOURCE_TOKEN \
--export-dir ./contentful-export \
  --download-assets

📖 Detailed guide: docs/EXPORT-GUIDE.md
If your export contains draft entries with missing required fields or orphan drafts, clean them before importing:
npm run cleanup-drafts

Output:

📊 Analysis Summary:
─────────────────────────────────────────────────────────
Total Entries: 11985
├─ Valid Published Entries: 11850
├─ Valid Draft Entries: 100
├─ Invalid Drafts: 25 ⚠️
└─ Orphan Drafts: 10 ⚠️

Total Assets: 4126
├─ Valid Published Assets: 4100
├─ Valid Draft Assets: 20
└─ Invalid Asset Drafts: 6 ⚠️

Total Items to Remove: 41 🗑️
What it does:
- Identifies draft entries with missing required fields
- Finds orphan drafts (content type doesn't exist)
- Detects invalid asset drafts (missing files)
- Creates draft-cleanup-report.json with detailed analysis
- Generates cleaned export: contentful-export/exported-space-cleaned.json
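As a rough illustration of the orphan-draft check (not the actual cleanup-drafts.js code): a draft whose content type is absent from the export can be detected like this, assuming the standard contentful-export JSON shape:

```javascript
// An entry is a draft when it has never been published
// (no sys.publishedVersion); it is an orphan when its content
// type id is not present in the export's contentTypes list.
function findOrphanDrafts(exportData) {
  const knownTypes = new Set(exportData.contentTypes.map(ct => ct.sys.id));
  return exportData.entries.filter(entry =>
    !entry.sys.publishedVersion &&
    !knownTypes.has(entry.sys.contentType.sys.id)
  );
}
```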
Next: Update your batch-config.json to use the cleaned file:
{
"sourceFile": "./contentful-export/exported-space-cleaned.json"
}

Split your large export into batches:

npm run split

Output:
🚀 Starting Contentful Export Splitter...
📊 Source data summary:
   - Assets: 4126
   - Entries: 11985
📦 Created 7 batches
✅ Splitting completed successfully!
Creates batches/ directory with subdirectories for each batch.
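Conceptually, the splitter works something like the sketch below: content types travel with the first batch, assets are chunked by `batchSize`, and entries come last so every asset they link to already exists in the target space. This is a simplified model; the real split.js may batch differently:

```javascript
// Illustrative splitter over a contentful-export-shaped object.
function splitExport(data, batchSize = 400) {
  const batches = [];
  // Assets are chunked; the content model rides with the first batch.
  for (let i = 0; i < data.assets.length; i += batchSize) {
    batches.push({
      contentTypes: i === 0 ? data.contentTypes : [],
      assets: data.assets.slice(i, i + batchSize),
      entries: [],
    });
  }
  // Entries go in a final batch: all referenced assets exist by then.
  batches.push({ contentTypes: [], assets: [], entries: data.entries });
  return batches;
}
```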
Import all batches sequentially:
npm run import

Features:
- Automatically imports content model in first batch
- Waits between batches (prevents rate limiting)
- Retries failed batches
- Saves progress state
Expected time: 3-5 hours for ~4,000 assets (with rate limiting enabled)
Verify the migration was successful:
npm run validate

Output:

✅ Content Types   Source: 60 | Target: 60 | Diff: 0
✅ Entries         Source: 11985 | Target: 11985 | Diff: 0
✅ Assets          Source: 4126 | Target: 4126 | Diff: 0

🎉 Validation passed! All data migrated successfully.
📖 Detailed guide: docs/IMPORT-GUIDE.md
After importing with skipContentPublishing: true, you need to publish draft entries. Choose the appropriate publishing strategy based on your needs:
Publishes entries in dependency order (entries with no dependencies first, then their dependents):
# Preview first
npm run cascade-publish:dry-run
# Publish
npm run cascade-publish
# Skip tagged entries
npm run cascade-publish -- --skip-tag skip-publish

Features:
- ✅ Analyzes entry dependencies
- ✅ Publishes in waves (depth-first)
- ✅ Handles circular references gracefully
- ✅ Can skip tagged entries with --skip-tag
- ⚠️ May skip entries with circular dependencies
Output:
Analyzing dependencies...
Found 11985 draft entries
Publishing wave 1 (depth 0): 5000 entries
Publishing wave 2 (depth 1): 4000 entries
Publishing wave 3 (depth 2): 2500 entries
⚠️ Skipped 485 entries with circular dependencies
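The wave-based ordering can be pictured with a small sketch (a hypothetical helper, not the cascade-publish.js source): entries whose links are all satisfied form the next wave, and anything left when no wave makes progress is part of a cycle and gets skipped:

```javascript
// `deps` maps entry id -> ids of entries it links to.
// Links to ids outside the map (already-published items, assets)
// count as satisfied.
function planWaves(deps) {
  const waves = [];
  const done = new Set();
  let remaining = Object.keys(deps);
  while (remaining.length > 0) {
    const wave = remaining.filter(id =>
      deps[id].every(d => done.has(d) || !(d in deps))
    );
    if (wave.length === 0) break; // only circular dependencies remain
    wave.forEach(id => done.add(id));
    remaining = remaining.filter(id => !done.has(id));
    waves.push(wave);
  }
  return { waves, skipped: remaining };
}
```

Running this on a graph with a cycle (e.g. c ↔ d) leaves c and d in `skipped`, which is exactly the set brute-force publishing then takes over.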
Attempts to publish all drafts without analyzing dependencies. Run repeatedly until all are published:
# Preview first
npm run publish-all:dry-run
# Run repeatedly until complete
npm run publish-all
npm run publish-all # Run again
npm run publish-all # Keep running until failures reach 0

Features:
- ✅ No dependency analysis required
- ✅ Handles circular dependencies through repetition
- ✅ Each run publishes what it can
- ✅ Failed entries are saved to failed-entries.json
Output:
Total draft entries: 4000
✅ Successfully published: 1200
❌ Failed to publish: 2800
⏭️ Skipped (already published): 0

💡 TIP: Run this script again to retry failed entries.
How it works:
- First run: Publishes entries with no unpublished dependencies (~30%)
- Second run: Publishes entries whose dependencies were published in run 1 (~40%)
- Third run: More entries get published (~20%)
- Continue: Until all entries are published or failures stop decreasing
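The fixpoint idea behind these repeated runs can be sketched as a loop that retries the failed set until a pass makes no further progress (illustrative only; `tryPublish` stands in for the real CMA publish call):

```javascript
// Keep re-attempting the failed set. Stop when everything is
// published, or when a full pass publishes nothing new (which means
// the remaining failures won't resolve by repetition alone).
async function publishUntilStable(ids, tryPublish) {
  let pending = [...ids];
  for (;;) {
    const stillFailing = [];
    for (const id of pending) {
      try { await tryPublish(id); }
      catch { stillFailing.push(id); }
    }
    if (stillFailing.length === 0 || stillFailing.length === pending.length) {
      return stillFailing; // done, or no further progress possible
    }
    pending = stillFailing;
  }
}
```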
Tag problematic entries and skip them during publishing. Both cascade-publish and publish-all support --skip-tag:
# 1. Tag all current drafts with 'skip-publish'
npm run tag-drafts skip-publish -- --dry-run # Preview
npm run tag-drafts skip-publish # Tag them
# 2a. Use cascade publish (skipping tagged)
npm run cascade-publish -- --skip-tag skip-publish
# OR
# 2b. Use brute force publish (skipping tagged)
npm run publish-all -- --skip-tag skip-publish
# 3. Run repeatedly if using brute force
npm run publish-all -- --skip-tag skip-publish
# 4. When ready, untag and publish the remaining ones
npm run tag-drafts skip-publish -- --remove # Remove tag
npm run publish-all # Publish remaining

Tag Management Commands:
# Tag draft entries
npm run tag-drafts <tag-name> # Tag all drafts
npm run tag-drafts <tag-name> -- --dry-run # Preview tagging
npm run tag-drafts <tag-name> -- --remove # Remove tag from drafts
# Publishing with tag filtering
npm run cascade-publish -- --skip-tag <tag-name> # Smart publish, skip tagged
npm run publish-all -- --skip-tag <tag-name> # Brute force, skip tagged
# Example: Tag with 'circular-dep'
npm run tag-drafts circular-dep
npm run cascade-publish -- --skip-tag circular-dep
# Or use: npm run publish-all -- --skip-tag circular-dep

Use cases:
- Mark entries known to have circular dependencies
- Skip problematic entries temporarily
- Publish clean entries first, handle complex ones later
- Test publishing on a subset of entries
- Combine with cascade for efficient dependency-aware publishing
Publish all assets before entries (assets have no dependencies):
# Preview
npm run publish-assets:dry-run
# Publish all draft assets
npm run publish-assets

Then proceed with entry publishing using one of the strategies above.
| Scenario | Recommended Strategy |
|---|---|
| Clean migration, no circular deps | Cascade Publish |
| Known circular dependencies (100-1000 entries) | Brute Force Publish |
| Many circular dependencies (1000+ entries) | Selective with Tags (Cascade or Brute Force with --skip-tag) |
| Want to skip problematic entries | Cascade or Brute Force with --skip-tag |
| Assets only | Asset Publishing |
| Mixed approach | Assets β Cascade (skip tagged) β Brute Force for remaining |
Create cascade-config.json for publishing scripts:
{
"spaceId": "your-space-id",
"environmentId": "master",
"managementToken": "CFPAT-your-management-token",
"host": "api.contentful.com"
}

Note: Use api.eu.contentful.com for EU spaces.
Sometimes you need to clean up a space before re-importing or to start fresh. Use the built-in clean-space script for selective cleanup.
Remove all entries but keep content types and assets:
# Preview first
npm run clean-space:dry-run
# Execute cleanup
npm run clean-space

What gets deleted:
- ❌ All entries (published + draft)

What stays:
- ✅ Content types (kept)
- ✅ Assets (kept)
Use case: Remove all content but keep the content model and assets for fresh import.
Remove entries and content types, but keep assets:
# Preview first
npm run clean-space -- --dry-run --content-types
# Execute cleanup
npm run clean-space -- --content-types

What gets deleted:
- ❌ All entries
- ❌ All content types

What stays:
- ✅ Assets (kept)
Use case: When you want to re-import the content model and entries but keep existing assets, e.g. when cleaning a target space before re-import.
Remove everything - entries, content types, AND assets:
# Preview first
npm run clean-space -- --dry-run --content-types --assets
# Execute cleanup
npm run clean-space -- --content-types --assets

What gets deleted:
- ❌ All entries
- ❌ All content types
- ❌ All assets
Use case: Complete fresh start.
npm run clean-space [options]
Options:
--dry-run Preview what will be deleted (no actual deletion)
--content-types Delete content types after entries
--assets Delete assets as well
  --batch-size <n>   Number of concurrent operations (default: 10)

- ✅ Supports EU & US endpoints - Reads from cascade-config.json
- ✅ Confirmation prompt - Shows space details and asks for Y/N confirmation
- ✅ Dry-run mode - Preview before deleting
- ✅ Rate limiting - 10 req/sec to respect API limits
- ✅ Auto-unpublish - Unpublishes before deletion
- ✅ Progress tracking - Real-time progress updates
- ✅ Safe by default - Only deletes entries unless flags are provided
# 1. Clean entries + content types (keep assets)
npm run clean-space -- --content-types
# 2. Re-import
npm run import
# 3. Publish
npm run publish-assets
npm run cascade-publish

# 1. Clean only entries (keep model + assets)
npm run clean-space
# 2. Import new content
npm run import

# 1. Backup first (optional but recommended)
npx contentful-export \
--space-id YOUR_SPACE_ID \
--management-token YOUR_TOKEN \
--export-dir ./backup
# 2. Clean everything
npm run clean-space -- --content-types --assets
# 3. Import from scratch
npm run import

- DESTRUCTIVE OPERATION - Cannot be undone
- Test on staging first - Always test cleanup on a non-production environment
- Backup before cleanup - Create an export before running cleanup
- Check cascade-config.json - Ensure it points to the correct space (EU or US)
- Use dry-run first - Always preview with --dry-run before actual deletion
- Deletes published content - This removes both draft AND published content
The script uses cascade-config.json for space credentials:
{
"spaceId": "69zmfo9ko3qk",
"environmentId": "master",
"managementToken": "CFPAT-your-management-token",
"host": "api.eu.contentful.com"
}

Make sure this file points to the correct space before running cleanup!
When running cleanup (not in dry-run mode), you'll see space details and be asked to confirm:
🔌 Connecting to Contentful...
✅ Connected
📋 TARGET SPACE DETAILS:
================================================================================
Organization: Your Organization Name
Space Name: Your Space Name
Space ID: 69zmfo9ko3qk
Environment: master
API Host: api.eu.contentful.com
================================================================================
⚠️ WHAT WILL BE DELETED:
================================================================================
Mode: LIVE CLEANUP
✅ All entries (published + draft)
✅ Content types (will be deleted)
❌ Assets (will be kept)
================================================================================
⚠️ WARNING: This operation is DESTRUCTIVE and cannot be undone!
⚠️ Make sure you have a backup before proceeding.
Are you sure you want to proceed? (Y/N):
Type Y or yes to proceed, or N to cancel.
To clean your EU space (entries + content types, keep assets):
# 1. Verify cascade-config.json points to EU space
cat cascade-config.json
# 2. Preview what will be deleted (no confirmation needed)
npm run clean-space -- --dry-run --content-types
# 3. Execute cleanup (will ask for confirmation)
npm run clean-space -- --content-types
# You'll see space details and must type Y to proceed
# 4. Re-import
npm run import

If import fails or is interrupted:
npm run resume

Automatically detects where to resume and continues.
contentful-batch-migrator/
├── bin/                          # Executable scripts
│   ├── rateLimiter.js            # Token bucket rate limiter
│   ├── cleanup-drafts.js         # Remove invalid/orphan drafts
│   ├── clean-space.js            # Delete entries/content types/assets from space
│   ├── split.js                  # Split large exports into batches
│   ├── import.js                 # Import batches with rate limiting
│   ├── import-cli.js             # Import batches using Contentful CLI
│   ├── import-direct.js          # Direct import without batching
│   ├── validate.js               # Validate migration success
│   ├── resume.js                 # Resume interrupted migrations
│   ├── resume-cli.js             # Resume interrupted CLI migrations
│   ├── cascade-publish.js        # Smart publish with dependency resolution
│   ├── publish-all-drafts.js     # Brute force publish all drafts
│   ├── tag-drafts.js             # Tag/untag draft entries
│   ├── publish-assets.js         # Publish all draft assets
│   ├── strip-validations.js      # Remove content type validations
│   └── restore-validations.js    # Restore content type validations
├── docs/                         # Documentation
│   ├── EXPORT-GUIDE.md           # Detailed export instructions
│   ├── IMPORT-GUIDE.md           # Detailed import instructions
│   ├── RATE-LIMITING.md          # Rate limiting details
│   └── TROUBLESHOOTING.md        # Common issues and solutions
├── batch-config.json             # Batch import configuration
├── batch-config.example.json     # Batch import config template
├── cascade-config.json           # Publishing configuration
├── cascade-config.example.json   # Publishing config template
├── package.json                  # Dependencies and scripts
├── README.md                     # This file
├── CONTRIBUTING.md               # Contribution guidelines
├── LICENSE                       # MIT License
└── contentful-export/            # Your exported data (not in repo)
    ├── exported-space.json
    └── [asset directories]
Scenario: Migrate 10,000+ assets (12.6GB) and 25,000+ entries with 4,000+ circular dependencies
# 1. Export from US space
npx contentful-export \
--space-id us-space-123 \
--management-token US_TOKEN \
--export-dir ./contentful-export \
--download-assets
# 2. Clean invalid drafts (recommended)
npm run cleanup-drafts
# Cleaned 2,000+ draft entries
# 3. Strip validations
npm run strip-validations
# Stripped 187 validations from 52 content types
# 4. Configure target (EU space)
cp batch-config.example.json batch-config.json
# Edit batch-config.json with EU space credentials
# Set "skipContentPublishing": true
# 5. Import using CLI (assets first)
npm run import:cli
# Takes ~4-5 hours for 10,000+ assets
# 6. Validate
npm run validate
# All checks pass ✅
# 7. Publish assets
npm run publish-assets
# ~17 minutes for 10,000+ assets
# 8. Publish entries with cascade
npm run cascade-publish
# ~42 minutes for 21,000 entries (4,000 skipped due to circular deps)
# 9. Brute force publish circular dependencies
npm run publish-all # Run 8-12 times until complete
# ~80 minutes total for 4,000 entries
# 10. Restore validations
npm run restore-validations
# Restored 187 validations to 52 content types

Result: Successfully migrated and published 35,000+ items in ~12 hours!
Scenario: Same migration but using custom script with advanced rate limiting
# 1-3. Same as Example 1 (export, clean drafts, strip validations)
# 4. Configure for custom script
cp batch-config.example.json batch-config.json
# Enable rate limiting:
# "rateLimits": { "enabled": true, "requestsPerSecond": 10 }
# 5. Split into batches
npm run split
# Output: Created 26 batches (400 assets each)
# 6. Import with custom script (all-in-one: assets + entries)
npm run import
# Takes ~8-10 hours with built-in rate limiting and state tracking
# 7. Validate
npm run validate
# All checks pass ✅
# 8-10. Same publishing steps as Example 1
npm run publish-assets
npm run cascade-publish
npm run publish-all # Repeat 8-12 times
npm run restore-validations

Result: Same successful migration with more granular control and better resume capability!
Scenario: Mark problematic entries and publish clean ones first
# 1-6. Same as Example 1 (export, clean, import, validate)
# 7. Identify and tag problematic entries
# After investigating failed-entries.json from a test run
# Manually tag ~500 problematic entries in Contentful UI with 'skip-publish'
# 8. Publish assets
npm run publish-assets
# 9. Cascade publish (skipping tagged - respects dependencies)
npm run cascade-publish -- --skip-tag skip-publish
# Published 20,500 entries in dependency order
# Skipped 500 tagged + 4,000 circular ones
# 10. Brute force remaining circular deps (except tagged)
npm run publish-all -- --skip-tag skip-publish
# Run 1: Published 50, Failed 35
npm run publish-all -- --skip-tag skip-publish
# Run 2: Published 30, Failed 5
npm run publish-all -- --skip-tag skip-publish
# Run 3: Published 5, Failed 0
# 11. Eventually handle the 500 tagged ones
npm run tag-drafts skip-publish -- --remove
npm run publish-all
# Run several times until completeResult: Clean entries published efficiently with cascade, circular deps resolved with brute force, problematic ones handled separately!
Solution: Increase delay between batches
{
"importOptions": {
"delayBetweenBatches": 300000 // 5 minutes instead of 3
}
}

Solution: Clean invalid drafts before importing
npm run cleanup-drafts
# Review draft-cleanup-report.json
# Update batch-config.json to use cleaned file

- Check logs: batches/logs/batch-XX-errors.log
- Resume import: npm run resume
- If it persists, reduce batch size
- Check failed batches: batches/import-state.json
- Review error logs
- Retry failed batches: npm run resume
Problem: Migration is too broken to fix, or you want to start over
Solution: Use the built-in space cleanup script
# Clean everything except assets (fastest way to retry)
npm run clean-space -- --content-types
# Then re-import
npm run import

When to use:
- Import created corrupted data
- Want to test different import strategies
- Content model changes require fresh import
- Migration failed multiple times and recovery is too complex
Features:
- β Supports EU & US API endpoints (reads from cascade-config.json)
- β Shows space details before deletion (org, space name, space ID, environment)
- β Requires Y/N confirmation prompt
- β Dry-run mode for safety
- β Selective cleanup (entries, content types, assets)
- β Auto-unpublish before deletion
See: Step 7 in Usage section for detailed cleanup options
Problem: publish-all keeps failing with same entries after many runs
Solutions:
# 1. Check failed-entries.json for patterns
cat failed-entries.json | grep "error" | sort | uniq -c
# 2. Tag problematic entries and skip them temporarily
npm run tag-drafts circular-dep
npm run publish-all -- --skip-tag circular-dep
# 3. Manually investigate and fix in Contentful UI
# - Break circular references
# - Publish dependencies manually
# - Then retry: npm run publish-all

Problem: 100% failure rate in publish-all
Possible causes:
- Assets not published yet → Run npm run publish-assets first
- Wrong configuration → Verify cascade-config.json credentials
- Permissions issue → Check management token has publish permissions
- Network/API issues → Wait and retry
Problem: tag-drafts fails or tags not appearing
Solution:
# 1. Verify the tag was created
# Check in Contentful UI β Settings β Tags
# 2. Ensure tag ID doesn't have special characters
# Use simple names: skip-publish, circular-dep, problematic
# 3. Check metadata in Contentful API
# Tags should appear in entry.metadata.tags

Problem: Publishing takes hours
Solutions:
- Skip analysis: Use publish-all instead of cascade-publish (no dependency analysis overhead)
- Filter by tag: Tag and skip entries you know will fail
- Parallel runs: If you have multiple environments, publish them in parallel
- Rate limiting: The 100ms delay (10 req/sec) is safe, but you can reduce it to 50ms (20 req/sec) in the code if needed
📖 Full guide: docs/TROUBLESHOOTING.md
Test with a small batch first:
{
"batchSize": 100 // Small batch for testing
}

Then monitor the first batch import closely before proceeding with the full migration.
Contributions are welcome! Please see CONTRIBUTING.md for details.
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Built for handling large-scale Contentful migrations
- Uses contentful-import and contentful-management
- Inspired by real-world migration challenges
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Contentful Support: Contentful Help Center
- contentful-import - Official Contentful import tool
- contentful-export - Official Contentful export tool
- contentful-cli - Contentful command line tools
Tested with:
- ✅ 10,000+ assets (12.6GB)
- ✅ 25,000+ content entries
- ✅ 2,000+ draft entries (cleaned before migration)
- ✅ 50+ content types with 187 validations
- ✅ 700+ tags
- ✅ 4,000+ circular dependency entries
Import Performance:
- Average batch import: 20-30 minutes per batch (with rate limiting)
- Full migration (26 batches): 8-10 hours (with custom script + rate limiting)
- Assets-only import (CLI): 4-5 hours for 10,000+ assets (12.6GB)
- Content-only import: 2-3 hours for 25,000+ entries
- Success rate: 100% (with retries)
Publishing Performance:
- Asset publishing: ~17 minutes for 10,000+ assets (10 req/sec)
- Cascade publish: ~42 minutes for 21,000 entries (10 req/sec)
- Brute force publish: 8-12 iterations for 4,000+ circular dependencies (~10 min per iteration, ~80 min total)
- Tagging: ~7 minutes per 1,000 entries (10 req/sec)
Overall Migration Time:
- Method A (CLI Import): 11-12 hours total
- Method B (Custom Script): 13-14 hours total
- Client-side rate limiting - Token bucket algorithm to respect API limits
- Draft cleanup utility - Identify and remove invalid/orphan drafts before migration
- Cascade publish - Smart publishing with dependency resolution
- Brute force publish - Handle circular dependencies through repetition
- Tag management - Tag and filter draft entries for selective publishing
- Webhook integration - Trigger notifications on migration completion
- Parallel batch imports - Import multiple batches simultaneously
- Incremental migrations - Sync only changed content
- Management Token: Keep your CMA token secure, never commit it
- Test First: Always test on a staging environment
- Backup: Create a space snapshot before importing
- Rate Limits: Respect Contentful's API rate limits (10 req/sec, 36K req/hour)
- Asset Files: Ensure all asset files are downloaded locally
- Publishing Configuration: Create cascade-config.json for publishing scripts (separate from batch-config.json)
- Circular Dependencies: Use brute force publish for entries with circular references
- Draft Publishing: Always set skipContentPublishing: true during import, then publish separately
- Space Cleanup: Use npm run clean-space for selective cleanup (supports EU & US endpoints)
- Cleanup is Destructive: Space cleanup operations cannot be undone - always use dry-run first
# Clean entries only (keep content types & assets)
npm run clean-space:dry-run
npm run clean-space
# Clean entries + content types (keep assets)
npm run clean-space -- --dry-run --content-types
npm run clean-space -- --content-types
# Clean everything (entries + content types + assets)
npm run clean-space -- --content-types --assets

# Publish assets (always do this first)
npm run publish-assets
# Publish entries with smart dependency resolution
npm run cascade-publish
# Cascade publish with tag filtering
npm run cascade-publish -- --skip-tag skip-publish
# Preview before publishing (dry run)
npm run cascade-publish:dry-run

# Brute force - run repeatedly until all published
npm run publish-all
npm run publish-all # Run multiple times
# Check progress
cat failed-entries.json | wc -l # Count remaining failures

# Tag all drafts
npm run tag-drafts skip-publish
# Publish everything except tagged
npm run publish-all -- --skip-tag skip-publish
# Remove tag when ready
npm run tag-drafts skip-publish -- --remove
npm run publish-all # Publish the previously tagged ones

# Workflow 1: Standard (no circular deps)
npm run publish-assets && npm run cascade-publish
# Workflow 2: With circular deps
npm run publish-assets
npm run cascade-publish
# Some will be skipped, use brute force for remaining
npm run publish-all # Repeat until failures = 0
# Workflow 3: Selective with cascade (tag problematic ones first)
npm run tag-drafts problematic
npm run publish-assets
npm run cascade-publish -- --skip-tag problematic
# Clean entries published, now handle problematic ones
npm run tag-drafts problematic -- --remove
npm run publish-all # Brute force for remaining
# Workflow 4: Selective with brute force only
npm run tag-drafts problematic
npm run publish-assets
npm run publish-all -- --skip-tag problematic
# Handle problematic ones separately later

# Workflow 5: Clean and re-import (keep assets)
npm run clean-space -- --content-types
npm run import
npm run publish-assets
npm run cascade-publish
# Workflow 6: Complete fresh start
npm run clean-space -- --content-types --assets
npm run import
npm run publish-all
# Workflow 7: Clean content only, keep model
npm run clean-space
npm run import:direct # Import without splitting

Made with ❤️ for the Contentful community

If this tool helped you, please ⭐ star the repo!