Enterprise-grade data engineering, MLOps, and governance plugins for Claude Code
Build production-grade data platforms on Databricks with AI-powered automation. This marketplace provides comprehensive plugins for data engineering, MLOps, and governance workflows.
- 15 Commands: Complete pipeline lifecycle from planning to deployment
- 18 Specialized Agents: Expert code review and optimization
- 8 Skills: Reusable architecture patterns and templates
- 3 MCP Servers: Deep Databricks integration
- Model training and deployment automation
- Feature store management
- MLflow experiment tracking
- Model monitoring and drift detection
- Unity Catalog access control
- Compliance checking and reporting
- Data lineage tracking
- Audit log analysis
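As a flavor of what monitoring and drift detection involve, here is a minimal, dependency-free sketch that flags drift when a live feature's mean departs from a training baseline by more than a few standard errors. The function name and threshold are illustrative only and not part of the plugin's API:

```python
from statistics import mean, stdev

def detect_mean_drift(baseline, live, threshold=3.0):
    """Flag drift when the live mean departs from the baseline mean
    by more than `threshold` baseline standard errors."""
    if len(baseline) < 2 or not live:
        raise ValueError("need at least 2 baseline points and 1 live point")
    standard_error = stdev(baseline) / (len(baseline) ** 0.5)
    if standard_error == 0:
        # Constant baseline: any deviation at all counts as drift.
        return mean(live) != mean(baseline)
    z_score = abs(mean(live) - mean(baseline)) / standard_error
    return z_score > threshold
```

Production drift detection (as in the plugin's monitoring features) would compare full distributions, not just means, but the shape of the check is the same: baseline statistics in, boolean alert out.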
```bash
# Recommended: install via npx
npx claude-plugins install @vivekgana/databricks-platform-marketplace/databricks-engineering

# Or add the marketplace in Claude
/plugin marketplace add https://github.com/yourcompany/databricks-platform-marketplace
/plugin install databricks-engineering
```

```bash
# Set up Databricks credentials
export DATABRICKS_HOST="https://your-workspace.cloud.databricks.com"
export DATABRICKS_TOKEN="your-token-here"

# Optional: configure specific resources
export DATABRICKS_WAREHOUSE_ID="your-warehouse-id"
export DATABRICKS_CLUSTER_ID="your-cluster-id"
```

```bash
# 1. Plan a new data pipeline
claude /databricks:plan-pipeline "Build customer 360 with real-time updates"

# 2. Implement the pipeline
claude /databricks:work-pipeline plans/customer-360.md

# 3. Review before merging
claude /databricks:review-pipeline https://github.com/your-org/repo/pull/42

# 4. Deploy to production
claude /databricks:deploy-bundle --environment prod
```

| Command | Description | Category |
|---|---|---|
| `plan-pipeline` | Plan a data pipeline with architecture and cost estimates | Planning |
| `work-pipeline` | Execute the implementation systematically | Development |
| `review-pipeline` | Multi-agent code review | Quality |
| `create-data-product` | Design data products with SLAs | Data Products |
| `configure-delta-share` | Set up external data sharing | Sharing |
| `deploy-bundle` | Deploy with Asset Bundles | Deployment |
| `optimize-costs` | Analyze and reduce costs | Optimization |
| `test-data-quality` | Generate data quality tests | Testing |
| `monitor-data-product` | Set up monitoring | Observability |
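All of these commands talk to a workspace, so they rely on the `DATABRICKS_*` variables exported during configuration. A hedged sketch of the kind of pre-flight check a plugin could run before issuing API calls (the helper name is ours, not part of the marketplace):

```python
import os

REQUIRED = ("DATABRICKS_HOST", "DATABRICKS_TOKEN")
OPTIONAL = ("DATABRICKS_WAREHOUSE_ID", "DATABRICKS_CLUSTER_ID")

def load_databricks_config(env=None):
    """Collect the Databricks settings the commands rely on,
    failing fast when a required variable is unset."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED if not env.get(name)]
    if missing:
        raise EnvironmentError("missing required variables: " + ", ".join(missing))
    config = {name: env[name] for name in REQUIRED}
    # Optional resources are included only when actually set.
    config.update({name: env[name] for name in OPTIONAL if env.get(name)})
    return config
```

Failing fast here gives a clear error message instead of an opaque authentication failure deep inside a pipeline run.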
- PySpark Optimizer: Performance tuning and best practices
- Delta Lake Expert: Storage optimization and time travel
- Data Quality Sentinel: Validation and monitoring
- Unity Catalog Expert: Governance and permissions
- Cost Analyzer: Compute and storage optimization
- Delta Sharing Expert: External data distribution
- Data Product Architect: Product design and SLAs
- Pipeline Architect: Medallion architecture patterns
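The medallion architecture the Pipeline Architect works with is a layered refinement flow: bronze ingests raw records as-is, silver cleans and types them, gold aggregates for consumption. A toy, Spark-free illustration (the function names and schema are invented for the example):

```python
def bronze(raw_rows):
    """Land raw records untouched, tagging them with ingestion metadata."""
    return [dict(row, _ingested=True) for row in raw_rows]

def silver(bronze_rows):
    """Drop invalid records and enforce types."""
    cleaned = []
    for row in bronze_rows:
        if row.get("customer_id") is None:
            continue  # quarantine rows without a key
        cleaned.append({
            "customer_id": str(row["customer_id"]),
            "amount": float(row.get("amount", 0)),
        })
    return cleaned

def gold(silver_rows):
    """Aggregate to a consumption-ready view: total spend per customer."""
    totals = {}
    for row in silver_rows:
        totals[row["customer_id"]] = totals.get(row["customer_id"], 0.0) + row["amount"]
    return totals
```

In a real pipeline each layer is a Delta table and the transforms run on Spark, but the bronze → silver → gold contract is the same.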
- Medallion Architecture: Bronze/Silver/Gold patterns
- Delta Live Tables: Streaming pipeline templates
- Data Products: Contract and SLA templates
- Databricks Asset Bundles: Multi-environment deployment
- Testing Patterns: pytest fixtures for Spark
- Delta Sharing: External data distribution setup
- Data Quality: Great Expectations integration
- CI/CD Workflows: GitHub Actions templates
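To make the data-quality skill concrete, here is a minimal, dependency-free sketch of the kind of checks a generated suite performs. In practice the plugin integrates Great Expectations; these helper names are illustrative, not the plugin's API:

```python
def check_not_null(rows, column):
    """Fail any row where `column` is missing or None."""
    failures = [i for i, row in enumerate(rows) if row.get(column) is None]
    return {"check": f"not_null({column})", "passed": not failures, "failures": failures}

def check_unique(rows, column):
    """Fail any row whose `column` value repeats an earlier row's value."""
    seen, failures = set(), []
    for i, row in enumerate(rows):
        value = row.get(column)
        if value in seen:
            failures.append(i)
        seen.add(value)
    return {"check": f"unique({column})", "passed": not failures, "failures": failures}
```

Returning the failing row indices rather than a bare boolean is what makes such checks useful in monitoring: the result can be logged, alerted on, and triaged.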
```bash
# Build a complete data platform
claude /databricks:scaffold-project customer-data-platform \
  --architecture medallion \
  --include-governance \
  --enable-delta-sharing

# Create a streaming pipeline
claude /databricks:generate-dlt-pipeline \
  --source kafka \
  --sink delta \
  --with-quality-checks

# Set up feature engineering
claude /databricks:create-data-product feature-store \
  --type feature-platform \
  --with-monitoring
```

- Getting Started Guide
- Configuration Reference
- Commands Reference
- Agents Reference
- Skills & Templates
- Examples & Tutorials
- API Documentation
```bash
# Run all tests
npm test

# Run unit tests only
npm run test:unit

# Run integration tests
npm run test:integration

# Run with coverage
pytest tests/ --cov=plugins --cov-report=html
```

```bash
# Clone the repository
git clone https://github.com/yourcompany/databricks-platform-marketplace.git
cd databricks-platform-marketplace

# Install dependencies
npm install
pip install -r requirements-dev.txt

# Validate plugin configurations
npm run validate

# Format code
npm run format

# Lint code
npm run lint

# Build documentation
npm run docs
```

- Documentation
- Slack Community
- Issue Tracker
- Email Support
```bash
# Check for updates
claude /plugin update databricks-engineering

# View changelog
claude /plugin changelog databricks-engineering
```

- 2.5k+ stars on GitHub
- 10k+ installations
- Used by 500+ enterprises
- 95% user satisfaction
- Auto Loader advanced patterns
- Lakehouse Federation support
- Scala and R language support
- Advanced cost optimization algorithms
- AI-powered query optimization
- Data mesh governance patterns
MIT License - see LICENSE for details
We welcome contributions! See CONTRIBUTING.md for guidelines.
Built with ❤️ by Ganapathi Ekambaram