The application is designed as a command-line-first tool: all key parameters are supplied as arguments for security and flexibility.

```bash
# Analyze a specific database
dotnet run -- --endpoint "https://your-cosmos-account.documents.azure.com:443/" --database "YourDatabase" --output "C:\Reports"

# Analyze all databases in the account
dotnet run -- --endpoint "https://your-cosmos-account.documents.azure.com:443/" --all-databases --output "C:\Reports"

# Show help
dotnet run -- --help
```

Available Options:
- `--endpoint`, `-e`: Cosmos DB endpoint URL (required)
- `--database`, `-d`: Specific database name to analyze
- `--all-databases`: Analyze all databases in the account
- `--output`, `-o`: Output directory for reports
- `--auto-discover`: Auto-discover Azure Monitor settings
- `--help`: Show help information
The application creates a timestamped folder for each analysis run:
```text
C:\Reports\
└── CosmosDB-Analysis_2025-08-21__14-30-45\
    ├── DatabaseA-Analysis.xlsx
    ├── DatabaseB-Analysis.xlsx
    ├── DatabaseC-Analysis.xlsx
    └── Migration-Assessment.docx
```
File Structure:
- Timestamped folder: `CosmosDB-Analysis_YYYY-MM-dd__HH-mm-ss`
- Excel files: `{DatabaseName}-Analysis.xlsx` (one per database)
- Word document: `Migration-Assessment.docx` (combined assessment)
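Because the timestamp format sorts lexicographically, scripts can locate the newest run without parsing dates. A minimal C# sketch, assuming the `C:\Reports` root and the folder pattern shown above:

```csharp
using System;
using System.IO;
using System.Linq;

// Timestamped names (yyyy-MM-dd__HH-mm-ss) sort lexicographically, so the
// first directory in descending order is the most recent analysis run.
var latest = Directory.GetDirectories(@"C:\Reports", "CosmosDB-Analysis_*")
                      .OrderByDescending(Path.GetFileName)
                      .FirstOrDefault();

Console.WriteLine(latest is null
    ? "No analysis runs found."
    : $"Most recent analysis run: {latest}");
```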
```bash
dotnet run -- --endpoint "https://your-cosmos-account.documents.azure.com:443/" --all-databases --output "C:\MultiDBAssessment"
```

This will:
- Connect to your Cosmos DB account
- Discover all databases automatically
- Analyze each database independently
- Generate a separate Excel report for each database
- Create one combined Word document, using proper Word heading styles for navigation and accessibility
- Produce consolidated assessment results covering all databases
```bash
# Application will prompt for output directory
dotnet run
```

If no output directory is specified via command line or configuration, the application will:
- Prompt you to enter a directory path
- Suggest a default timestamped directory
- Create the directory if it doesn't exist
- Validate write permissions
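A minimal sketch of those checks (the class and method names are illustrative, not the tool's actual code):

```csharp
using System;
using System.IO;

static class OutputDirectory
{
    public static bool TryPrepare(string path)
    {
        try
        {
            // Create the directory if it doesn't exist (no-op otherwise).
            Directory.CreateDirectory(path);

            // Validate write permissions with a throwaway probe file.
            var probe = Path.Combine(path, $".write-test-{Guid.NewGuid():N}");
            File.WriteAllText(probe, string.Empty);
            File.Delete(probe);
            return true;
        }
        catch (Exception ex) when (ex is IOException or UnauthorizedAccessException)
        {
            Console.Error.WriteLine($"Output directory '{path}' is not writable: {ex.Message}");
            return false;
        }
    }
}
```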
```powershell
# Full enterprise assessment across all databases
dotnet run -- `
  --all-databases `
  --output "\\shared\reports\cosmos-migration-$(Get-Date -Format 'yyyyMMdd')" `
  --auto-discover
```

This command will:
- Analyze all databases in the Cosmos DB account
- Auto-discover Azure Monitor settings
- Save reports to a shared network location with date stamp
- Generate comprehensive cross-database analysis
```powershell
# Detailed analysis of a specific database
dotnet run -- `
  --database "production-ecommerce" `
  --output "C:\Reports\EcommerceAnalysis"

# Fast assessment for development environments
dotnet run -- `
  --database "dev-testing" `
  --output ".\dev-reports"

# For CI/CD or scheduled assessments
dotnet run -- --all-databases --output $env:ASSESSMENT_OUTPUT_DIR --auto-discover
```

```text
🔗 Connecting to Azure services...
✅ Cosmos DB account: myapp-cosmos
✅ Database: ecommerce
✅ Azure Monitor workspace: connected
✅ Found 5 containers for analysis
```
What happens: The application authenticates with Azure and discovers your Cosmos DB structure.
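This discovery step can be reproduced with the Cosmos SDK and `DefaultAzureCredential`; a sketch assuming token-based authentication (the endpoint is a placeholder):

```csharp
using System;
using Azure.Identity;
using Microsoft.Azure.Cosmos;

// Keyless auth: DefaultAzureCredential resolves Azure CLI, managed identity,
// or environment credentials, so no account keys appear in configuration.
var client = new CosmosClient(
    "https://your-cosmos-account.documents.azure.com:443/",
    new DefaultAzureCredential());

// Enumerate every database in the account -- the basis of --all-databases.
var iterator = client.GetDatabaseQueryIterator<DatabaseProperties>();
while (iterator.HasMoreResults)
{
    foreach (var db in await iterator.ReadNextAsync())
        Console.WriteLine($"Found database: {db.Id}");
}
```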
```text
🔍 Analyzing Cosmos DB containers...
  📦 users (15,234 documents, 2.1 GB)
  📦 orders (89,567 documents, 5.8 GB)
  📦 products (2,456 documents, 0.3 GB)
  📦 inventory (45,789 documents, 1.2 GB)
  📦 sessions (234,567 documents, 8.9 GB)
```
What happens: Each container is analyzed for:
- Document count and storage size
- Schema detection and field analysis
- Partition key effectiveness
- Indexing policy review
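Schema detection can be approximated by sampling documents and tallying the JSON types seen per field. A simplified sketch of that idea, not the tool's actual sampling logic:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Newtonsoft.Json.Linq;

static class SchemaSampler
{
    public static async Task SampleAsync(Container container, int sampleSize = 100)
    {
        // Track every JSON type observed for each top-level field.
        var fieldTypes = new Dictionary<string, HashSet<JTokenType>>();

        var iterator = container.GetItemQueryIterator<JObject>(
            new QueryDefinition($"SELECT TOP {sampleSize} * FROM c"));

        while (iterator.HasMoreResults)
        {
            foreach (var doc in await iterator.ReadNextAsync())
            {
                foreach (var prop in doc.Properties())
                {
                    if (!fieldTypes.TryGetValue(prop.Name, out var types))
                        fieldTypes[prop.Name] = types = new HashSet<JTokenType>();
                    types.Add(prop.Value.Type);
                }
            }
        }

        // Fields with more than one observed type signal schema drift.
        foreach (var (field, types) in fieldTypes)
            Console.WriteLine($"{field}: {string.Join(", ", types)}");
    }
}
```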
```text
⚡ Collecting performance metrics (6 months)...
  📊 Request unit consumption patterns
  📊 Latency percentiles (P95, P99)
  📊 Throttling events and error rates
  📊 Regional distribution analysis
```
What happens: Historical performance data is collected from Azure Monitor to understand usage patterns.
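Metrics like these can be pulled with the `Azure.Monitor.Query` package; a sketch retrieving six months of RU consumption (the resource ID is a placeholder, and the hourly granularity is an assumption):

```csharp
using System;
using Azure.Identity;
using Azure.Monitor.Query;
using Azure.Monitor.Query.Models;

var metricsClient = new MetricsQueryClient(new DefaultAzureCredential());

// Placeholder resource ID for the Cosmos DB account being analyzed.
var resourceId =
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.DocumentDB/databaseAccounts/<account>";

// Query hourly RU totals over roughly the last 6 months.
MetricsQueryResult result = (await metricsClient.QueryResourceAsync(
    resourceId,
    new[] { "TotalRequestUnits" },
    new MetricsQueryOptions
    {
        TimeRange = new QueryTimeRange(TimeSpan.FromDays(180)),
        Granularity = TimeSpan.FromHours(1),
    })).Value;

foreach (var metric in result.Metrics)
    foreach (var series in metric.TimeSeries)
        foreach (var point in series.Values)
            Console.WriteLine($"{point.TimeStamp:u}  RU: {point.Total}");
```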
```text
🎯 Generating SQL migration assessment...
  🗄️ Platform recommendation: Azure SQL Database
  🗄️ Service tier: Business Critical Gen5 8 vCore
  🗄️ Estimated monthly cost: $2,847.60
  🗄️ Migration complexity: Medium (6.2/10)
```
What happens: AI-driven analysis recommends optimal Azure SQL configuration based on your workload.
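As a rough illustration of how workload metrics might map to a recommendation, the heuristic below invents its thresholds for the example; it is not the tool's model:

```csharp
using System;

static class SqlSizing
{
    public static (string Tier, int VCores) Suggest(double peakRuPerSecond, double dataSizeGb)
    {
        // Assumed rule of thumb: ~1 vCore per 2,500 RU/s of sustained peak load.
        var vCores = Math.Max(2, (int)Math.Ceiling(peakRuPerSecond / 2500));

        // Very large databases point to Hyperscale; latency-sensitive,
        // higher-core workloads point to Business Critical.
        var tier = dataSizeGb > 1024 ? "Hyperscale"
                 : vCores >= 8       ? "Business Critical"
                 : "General Purpose";

        return (tier, vCores);
    }
}
```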
```text
📊 Calculating Data Factory migration estimates...
  ⏱️ Estimated duration: 14.2 hours
  💰 Estimated cost: $47.83
  🔧 Recommended DIUs: 8
  🔧 Parallel degree: 6
```
What happens: Migration timeline and costs are calculated based on data volume and complexity.
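At its core the estimate is data volume divided by effective copy throughput. A back-of-envelope sketch; the per-DIU throughput and per-DIU-hour rate are assumptions, so verify against current Azure Data Factory pricing:

```csharp
using System;

static class CopyEstimator
{
    const double MbPerSecPerDiu = 4.0;    // assumed effective copy throughput per DIU
    const double PricePerDiuHour = 0.25;  // assumed data-movement rate; verify current pricing

    public static (double Hours, double Cost) Estimate(double dataGb, int dius)
    {
        var seconds = dataGb * 1024 / (MbPerSecPerDiu * dius);
        var hours = seconds / 3600;
        return (hours, hours * dius * PricePerDiuHour);
    }
}
```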
```text
📄 Generating assessment reports...
✅ Excel report: Reports/CosmosDB_Assessment_20250820_143022.xlsx
✅ Word summary: Reports/CosmosDB_Assessment_20250820_143022.docx
```
What happens: Professional reports are generated for technical and executive audiences.
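Multi-worksheet Excel output like this can be produced with a library such as ClosedXML; the sketch below is one option, since the tool's actual reporting library isn't specified here:

```csharp
using ClosedXML.Excel;

using var workbook = new XLWorkbook();

// One worksheet per report area, mirroring the layout described below.
var summary = workbook.AddWorksheet("Summary");
summary.Cell(1, 1).Value = "Container count";
summary.Cell(1, 2).Value = 5;

var containers = workbook.AddWorksheet("Containers");
containers.Cell(1, 1).Value = "Name";
containers.Cell(1, 2).Value = "Documents";
containers.Cell(2, 1).Value = "users";
containers.Cell(2, 2).Value = 15234;

workbook.SaveAs("CosmosDB_Assessment.xlsx");
```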
The Excel report contains multiple worksheets:
- High-level metrics: Container count, data volume, cost estimates
- Complexity assessment: Overall migration difficulty score
- Key recommendations: Top 3-5 actionable items
- Per-container details: Document counts, storage, performance
- Schema complexity: Field types, nesting levels, arrays
- Partition key effectiveness: Hot partitions, distribution
- Historical trends: 6-month RU consumption patterns
- Latency analysis: P95/P99 response times
- Throttling events: When and why throttling occurred
- Platform recommendation: SQL Database vs Managed Instance vs VM
- Sizing details: vCores, storage, service tier
- Cost breakdown: Compute, storage, backup costs
- Migration timeline: Detailed time estimates per container
- Resource requirements: DIU recommendations
- Cost projections: Detailed cost breakdown
The Word document provides:
- Executive summary for stakeholders
- Key findings and recommendations
- Next steps for migration planning
- Risk assessment and mitigation strategies
| Score | Level | Description |
|---|---|---|
| 1-3 | Low | Simple schema, minimal transformations needed |
| 4-6 | Medium | Moderate complexity, some data transformation required |
| 7-8 | High | Complex schema, significant transformation work |
| 9-10 | Very High | Extensive restructuring and custom migration logic needed |
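A scoring heuristic in the spirit of the table above might weigh nesting depth, array usage, and schema drift; the weights below are invented for illustration:

```csharp
using System;

static class Complexity
{
    public static int Score(int maxNestingDepth, double arrayFieldRatio, int schemaVariants)
    {
        var score = 1.0;
        score += Math.Min(maxNestingDepth, 5) * 1.2;   // deep nesting -> more transformation work
        score += arrayFieldRatio * 3;                  // arrays usually become child tables
        score += Math.Min(schemaVariants - 1, 3);      // schema drift adds mapping effort
        return (int)Math.Clamp(Math.Round(score), 1, 10);
    }
}
```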
Context: Online store with user profiles, product catalog, and order history
Assessment Command:

```bash
dotnet run -- --database "ecommerce" --analysis-months 12
```

Expected Results:
- Users container: Low complexity (simple profile data)
- Products container: Medium complexity (nested categories, arrays)
- Orders container: High complexity (embedded line items, complex relationships)
Recommendations:
- Normalize nested order data into separate tables
- Consider Azure SQL Database Business Critical tier
- Implement staged migration approach
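Normalizing the nested order data (the first recommendation above) means splitting each document's embedded line items into parent and child rows, as in this sketch with hypothetical `OrderDoc` and `LineItem` shapes:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical document shape: one Cosmos DB order with embedded line items...
record OrderDoc(string Id, string CustomerId, List<LineItem> Items);
record LineItem(string Sku, int Quantity, decimal Price);

// ...and the normalized relational rows it becomes (Orders / OrderLines tables).
record OrderRow(string OrderId, string CustomerId);
record OrderLineRow(string OrderId, string Sku, int Quantity, decimal Price);

static class OrderNormalizer
{
    // One document fans out to one parent row plus one row per line item.
    public static (OrderRow Order, List<OrderLineRow> Lines) Normalize(OrderDoc doc) =>
        (new OrderRow(doc.Id, doc.CustomerId),
         doc.Items.Select(i => new OrderLineRow(doc.Id, i.Sku, i.Quantity, i.Price)).ToList());
}
```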
Context: Time-series data from IoT sensors
Assessment Command:

```bash
dotnet run -- --containers "sensor-data,device-metadata" --analysis-months 3
```

Expected Results:
- Very high data volume with simple schema
- High RU consumption for writes
- Time-based partitioning requirements
Recommendations:
- Consider Azure SQL Hyperscale for large datasets
- Implement time-based table partitioning
- Use columnstore indexes for analytics
Context: SaaS application with tenant-isolated data
Assessment Command:

```bash
dotnet run -- --enable-deep-analysis --analysis-months 6
```

Expected Results:
- Complex tenant isolation patterns
- Varied data volumes per tenant
- Different access patterns by tenant size
Recommendations:
- Design tenant isolation strategy for SQL
- Consider elastic pools for smaller tenants
- Implement row-level security
Create a PowerShell script for regular assessments:
```powershell
# scheduled-assessment.ps1
param(
    [string]$Database,
    [string]$OutputPath,
    [int]$AnalysisMonths = 1
)

$timestamp = Get-Date -Format "yyyyMMdd"
$reportDir = "$OutputPath\$Database-$timestamp"

dotnet run -- `
    --database $Database `
    --analysis-months $AnalysisMonths `
    --output-dir $reportDir `
    --format json

# Upload reports to Azure Storage
az storage blob upload-batch `
    --source $reportDir `
    --destination "assessments" `
    --account-name "myassessments"
```

For pipeline-based scheduling:

```yaml
# azure-pipelines.yml
trigger: none

schedules:
- cron: "0 2 * * 0"   # Weekly on Sunday at 2 AM
  displayName: Weekly Cosmos assessment
  branches:
    include:
    - main

jobs:
- job: CosmosAssessment
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - task: DotNetCoreCLI@2
    displayName: 'Run Cosmos Assessment'
    inputs:
      command: 'run'
      projects: 'CosmosToSqlAssessment.csproj'
      arguments: '--database production --output-dir $(Build.ArtifactStagingDirectory)'
  - task: PublishBuildArtifacts@1
    displayName: 'Publish Assessment Reports'
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'cosmos-assessment'
```
- Limit document sampling: `dotnet run -- --max-samples 1000`
- Reduce analysis period: `dotnet run -- --analysis-months 1`
- Analyze specific containers: `dotnet run -- --containers "critical-container"`
- Disable deep analysis: `dotnet run -- --no-deep-analysis`
For containers with millions of documents:

```bash
# Use statistical sampling
dotnet run -- \
  --max-samples 10000 \
  --enable-statistical-sampling \
  --confidence-level 95

# For memory-constrained environments
dotnet run -- \
  --batch-size 100 \
  --memory-limit 2GB \
  --enable-streaming
```

- Timeout errors: Increase timeout settings in configuration
- Memory issues: Reduce sample sizes and enable streaming
- Permission errors: Verify Azure RBAC assignments
- Network issues: Check firewall and VPN configurations
```bash
# Test individual components
dotnet run -- --test-cosmos-connection
dotnet run -- --test-monitor-connection
dotnet run -- --test-report-generation

# Enable detailed logging
dotnet run -- --log-level Debug --log-to-file
```

- Schedule regular assessments to track changes over time
- Use version control for configuration files
- Archive assessment reports for historical comparison
- Review recommendations with your SQL DBA team
- Test migration approaches in development environments
- Monitor performance after migration completion