diff --git a/CLAUDE.md b/CLAUDE.md
index 6026ad5..9a84e00 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -6,7 +6,7 @@ Skills for AI coding assistants (Claude Code, etc.) that provide Databricks-spec
 ```
 skills/
-├── databricks/ # core skill: CLI, auth, data exploration
+├── databricks-core/ # core skill: CLI, auth, data exploration
 │   ├── SKILL.md
 │   └── *.md (references)
 └── databricks-apps/ # product skill: app development
@@ -14,7 +14,7 @@ skills/
     └── references/
 ```

-Hierarchy: `databricks` (core) → `databricks-apps` (product) → `databricks-apps-*` (niche)
+Hierarchy: `databricks-core` (core) → `databricks-apps` (product) → `databricks-apps-*` (niche)

 ## Development
diff --git a/manifest.json b/manifest.json
index 1526908..74657f0 100644
--- a/manifest.json
+++ b/manifest.json
@@ -1,25 +1,12 @@
 {
   "version": "2",
-  "updated_at": "2026-03-25T23:11:20Z",
+  "updated_at": "2026-03-31T18:36:08Z",
   "skills": {
-    "databricks": {
-      "version": "0.1.0",
-      "description": "Core Databricks skill for CLI, auth, and data exploration",
-      "experimental": false,
-      "updated_at": "2026-03-22T09:19:20Z",
-      "files": [
-        "SKILL.md",
-        "data-exploration.md",
-        "databricks-cli-auth.md",
-        "databricks-cli-install.md",
-        "declarative-automation-bundles.md"
-      ]
-    },
     "databricks-apps": {
       "version": "0.1.1",
       "description": "Databricks Apps development and deployment",
       "experimental": false,
-      "updated_at": "2026-03-22T09:19:20Z",
+      "updated_at": "2026-03-31T18:35:18Z",
       "files": [
         "SKILL.md",
         "references/appkit/appkit-sdk.md",
@@ -33,11 +20,24 @@
         "references/testing.md"
       ]
     },
+    "databricks-core": {
+      "version": "0.1.0",
+      "description": "Core Databricks skill for CLI, auth, and data exploration",
+      "experimental": false,
+      "updated_at": "2026-03-31T18:34:21Z",
+      "files": [
+        "SKILL.md",
+        "data-exploration.md",
+        "databricks-cli-auth.md",
+        "databricks-cli-install.md",
+        "declarative-automation-bundles.md"
+      ]
+    },
     "databricks-jobs": {
       "version": "0.1.0",
       "description": "Databricks Jobs orchestration and scheduling",
       "experimental": false,
-      "updated_at": "2026-03-22T09:19:20Z",
+      "updated_at": "2026-03-31T18:35:12Z",
       "files": [
         "SKILL.md"
       ]
@@ -46,7 +46,7 @@
       "version": "0.1.0",
       "description": "Databricks Lakebase database development",
       "experimental": false,
-      "updated_at": "2026-03-22T09:19:20Z",
+      "updated_at": "2026-03-31T18:34:30Z",
       "files": [
         "SKILL.md"
       ]
@@ -55,7 +55,7 @@
       "version": "0.1.0",
       "description": "Databricks Pipelines (DLT) for ETL and streaming",
       "experimental": false,
-      "updated_at": "2026-03-22T09:19:20Z",
+      "updated_at": "2026-03-31T18:35:15Z",
       "files": [
         "SKILL.md",
         "references/auto-cdc-python.md",
diff --git a/scripts/generate_manifest.py b/scripts/generate_manifest.py
index 86fdb25..8fafdd4 100644
--- a/scripts/generate_manifest.py
+++ b/scripts/generate_manifest.py
@@ -53,7 +53,7 @@ def get_skill_updated_at(skill_path: Path) -> str:

 SKILL_METADATA = {
-    "databricks": {
+    "databricks-core": {
         "description": "Core Databricks skill for CLI, auth, and data exploration",
         "experimental": False,
     },
diff --git a/skills/databricks-apps/SKILL.md b/skills/databricks-apps/SKILL.md
index 79267d1..b56a9c6 100644
--- a/skills/databricks-apps/SKILL.md
+++ b/skills/databricks-apps/SKILL.md
@@ -4,12 +4,12 @@ description: Build apps on Databricks Apps platform. Use when asked to create da
 compatibility: Requires databricks CLI (>= v0.294.0)
 metadata:
   version: "0.1.1"
-parent: databricks
+parent: databricks-core
 ---

 # Databricks Apps Development

-**FIRST**: Use the parent `databricks` skill for CLI basics, authentication, and profile selection.
+**FIRST**: Use the parent `databricks-core` skill for CLI basics, authentication, and profile selection.

 Build apps that deploy to Databricks Apps platform.

@@ -17,7 +17,7 @@ Build apps that deploy to Databricks Apps platform.
 | Phase | READ BEFORE proceeding |
 |-------|------------------------|
-| Scaffolding | Parent `databricks` skill (auth, warehouse discovery); run `databricks apps manifest` and use its plugins/resources to build `databricks apps init` with `--features` and `--set` (see AppKit section below) |
+| Scaffolding | Parent `databricks-core` skill (auth, warehouse discovery); run `databricks apps manifest` and use its plugins/resources to build `databricks apps init` with `--features` and `--set` (see AppKit section below) |
 | Writing SQL queries | [SQL Queries Guide](references/appkit/sql-queries.md) |
 | Writing UI components | [Frontend Guide](references/appkit/frontend.md) |
 | Using `useAnalyticsQuery` | [AppKit SDK](references/appkit/appkit-sdk.md) |
@@ -31,7 +31,7 @@ Build apps that deploy to Databricks Apps platform.
 - **App name**: ≤26 characters, lowercase letters/numbers/hyphens only (no underscores). dev- prefix adds 4 chars, max 30 total.
 - **Validation**: `databricks apps validate --profile <profile>` before deploying.
 - **Smoke tests** (AppKit only): ALWAYS update `tests/smoke.spec.ts` selectors BEFORE running validation. Default template checks for "Minimal Databricks App" heading and "hello world" text — these WILL fail in your custom app. See [testing guide](references/testing.md).
-- **Authentication**: covered by parent `databricks` skill.
+- **Authentication**: covered by parent `databricks-core` skill.

 ## Project Structure (after `databricks apps init --features analytics`)
 - `client/src/App.tsx` — main React component (start here)
@@ -49,7 +49,7 @@ Build apps that deploy to Databricks Apps platform.

 ## Data Discovery

-Before writing any SQL, use the parent `databricks` skill for data exploration — search `information_schema` by keyword, then batch `discover-schema` for the tables you need. Do NOT skip this step.
+Before writing any SQL, use the parent `databricks-core` skill for data exploration — search `information_schema` by keyword, then batch `discover-schema` for the tables you need. Do NOT skip this step.

 ## Development Workflow (FOLLOW THIS ORDER)

@@ -123,7 +123,7 @@ npx @databricks/appkit docs ./docs/plugins/analytics.md # example: specific doc
   Optionally use `--version <version>` to target a specific AppKit version.
 - **Required**: `--name`, `--profile`. Name: ≤26 chars, lowercase letters/numbers/hyphens only. Use `--features` only for **optional** plugins the user wants (plugins with `requiredByTemplate: false` or absent); mandatory plugins must not be listed in `--features`.
 - **Resources**: Pass `--set` for every required resource (each field in `resources.required`) for (1) all plugins with `requiredByTemplate: true`, and (2) any optional plugins you added to `--features`. Add `--set` for `resources.optional` only when the user requests them.
-  - **Discovery**: Use the parent `databricks` skill to resolve IDs (e.g. warehouse: `databricks warehouses list --profile <profile>` or `databricks experimental aitools tools get-default-warehouse --profile <profile>`).
+  - **Discovery**: Use the parent `databricks-core` skill to resolve IDs (e.g. warehouse: `databricks warehouses list --profile <profile>` or `databricks experimental aitools tools get-default-warehouse --profile <profile>`).

 **DO NOT guess** plugin names, resource keys, or property names — always derive them from `databricks apps manifest` output. Example: if the manifest shows plugin `analytics` with a required resource `resourceKey: "sql-warehouse"` and `fields: { "id": ... }`, include `--set analytics.sql-warehouse.id=<warehouse-id>`.
diff --git a/skills/databricks-apps/references/appkit/overview.md b/skills/databricks-apps/references/appkit/overview.md
index 5f96cae..df785b4 100644
--- a/skills/databricks-apps/references/appkit/overview.md
+++ b/skills/databricks-apps/references/appkit/overview.md
@@ -23,7 +23,7 @@ See [Lakebase Guide](lakebase.md) for full Lakebase scaffolding and app-code pat

 ## Data Discovery (Before Writing SQL)

-**Use the parent `databricks` skill for data discovery** (table search, schema exploration, query execution).
+**Use the parent `databricks-core` skill for data discovery** (table search, schema exploration, query execution).

 ## Pre-Implementation Checklist
diff --git a/skills/databricks/SKILL.md b/skills/databricks-core/SKILL.md
similarity index 99%
rename from skills/databricks/SKILL.md
rename to skills/databricks-core/SKILL.md
index de4bef9..87b1e2a 100644
--- a/skills/databricks/SKILL.md
+++ b/skills/databricks-core/SKILL.md
@@ -1,5 +1,5 @@
 ---
-name: "databricks"
+name: "databricks-core"
 description: "Databricks CLI operations: auth, profiles, data exploration, and bundles. Contains up-to-date guidelines for Databricks-related CLI tasks."
 compatibility: Requires databricks CLI (>= v0.292.0)
 metadata:
diff --git a/skills/databricks/data-exploration.md b/skills/databricks-core/data-exploration.md
similarity index 100%
rename from skills/databricks/data-exploration.md
rename to skills/databricks-core/data-exploration.md
diff --git a/skills/databricks/databricks-cli-auth.md b/skills/databricks-core/databricks-cli-auth.md
similarity index 100%
rename from skills/databricks/databricks-cli-auth.md
rename to skills/databricks-core/databricks-cli-auth.md
diff --git a/skills/databricks/databricks-cli-install.md b/skills/databricks-core/databricks-cli-install.md
similarity index 100%
rename from skills/databricks/databricks-cli-install.md
rename to skills/databricks-core/databricks-cli-install.md
diff --git a/skills/databricks/declarative-automation-bundles.md b/skills/databricks-core/declarative-automation-bundles.md
similarity index 100%
rename from skills/databricks/declarative-automation-bundles.md
rename to skills/databricks-core/declarative-automation-bundles.md
diff --git a/skills/databricks-jobs/SKILL.md b/skills/databricks-jobs/SKILL.md
index fd4077f..58aec3c 100644
--- a/skills/databricks-jobs/SKILL.md
+++ b/skills/databricks-jobs/SKILL.md
@@ -4,12 +4,12 @@ description: Develop and deploy Lakeflow Jobs on Databricks. Use when creating d
 compatibility: Requires databricks CLI (>= v0.292.0)
 metadata:
   version: "0.1.0"
-parent: databricks
+parent: databricks-core
 ---

 # Lakeflow Jobs Development

-**FIRST**: Use the parent `databricks` skill for CLI basics, authentication, profile selection, and data exploration commands.
+**FIRST**: Use the parent `databricks-core` skill for CLI basics, authentication, profile selection, and data exploration commands.

 Lakeflow Jobs are scheduled workflows that run notebooks, Python scripts, SQL queries, and other tasks on Databricks.
@@ -41,7 +41,7 @@ Verify: `databricks -v`

 ## For AI Agents

-Read the `databricks` skill for CLI basics, authentication, and deployment workflow.
+Read the `databricks-core` skill for CLI basics, authentication, and deployment workflow.
 Read the `databricks-jobs` skill for job-specific guidance.

 If skills are not available, install them: `databricks experimental aitools skills install`
diff --git a/skills/databricks-lakebase/SKILL.md b/skills/databricks-lakebase/SKILL.md
index 41bbadc..c00f4d3 100644
--- a/skills/databricks-lakebase/SKILL.md
+++ b/skills/databricks-lakebase/SKILL.md
@@ -4,12 +4,12 @@ description: "Manage Lakebase Postgres Autoscaling projects, branches, and endpo
 compatibility: Requires databricks CLI (>= v0.294.0)
 metadata:
   version: "0.1.0"
-parent: databricks
+parent: databricks-core
 ---

 # Lakebase Postgres Autoscaling

-**FIRST**: Use the parent `databricks` skill for CLI basics, authentication, and profile selection.
+**FIRST**: Use the parent `databricks-core` skill for CLI basics, authentication, and profile selection.

 Lakebase is Databricks' serverless Postgres-compatible database (similar to Neon). It provides fully managed OLTP storage with autoscaling, branching, and scale-to-zero.
diff --git a/skills/databricks-pipelines/SKILL.md b/skills/databricks-pipelines/SKILL.md
index d4454a3..d2b4a00 100644
--- a/skills/databricks-pipelines/SKILL.md
+++ b/skills/databricks-pipelines/SKILL.md
@@ -4,12 +4,12 @@ description: Develop Lakeflow Spark Declarative Pipelines (formerly Delta Live T
 compatibility: Requires databricks CLI (>= v0.292.0)
 metadata:
   version: "0.1.0"
-parent: databricks
+parent: databricks-core
 ---

 # Lakeflow Spark Declarative Pipelines Development

-**FIRST**: Use the parent `databricks` skill for CLI basics, authentication, profile selection, and data discovery commands.
+**FIRST**: Use the parent `databricks-core` skill for CLI basics, authentication, profile selection, and data discovery commands.

 ## Decision Tree
@@ -199,7 +199,7 @@ Verify: `databricks -v`

 ## For AI Agents

-Read the `databricks` skill for CLI basics, authentication, and deployment workflow.
+Read the `databricks-core` skill for CLI basics, authentication, and deployment workflow.
 Read the `databricks-pipelines` skill for pipeline-specific guidance.

 If skills are not available, install them: `databricks experimental aitools skills install`