AppKit is the recommended way to build Databricks Apps: it provides type-safe SQL queries, React components, and seamless deployment.
Before scaffolding, decide which data pattern the app needs:
| Pattern | When to use | Init command |
|---|---|---|
| Analytics (read-only) | Dashboards, charts, KPIs from warehouse | `--features analytics --set analytics.sql-warehouse.id=<ID>` |
| Lakebase (OLTP, read/write) | CRUD forms, persistent state, user data | `--features lakebase --set lakebase.postgres.branch=<BRANCH> --set lakebase.postgres.database=<DB>` |
| Both | Dashboard + user data or preferences | `--features analytics,lakebase` with all required `--set` flags |
See Lakebase Guide for full Lakebase scaffolding and app-code patterns.
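For example, an analytics-only scaffold might look like this (a sketch: `<ID>` is a placeholder for your SQL warehouse ID, and flag spellings should be confirmed against the `databricks apps manifest` output):

```shell
# Discover available features and their --set keys first
databricks apps manifest

# Scaffold with the analytics feature wired to a SQL warehouse
databricks apps init --features analytics --set analytics.sql-warehouse.id=<ID>
```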
- Scaffold: Run `databricks apps manifest`, then `databricks apps init` with `--features` and `--set` as in parent SKILL.md (App Manifest and Scaffolding)
- Develop: `cd <NAME> && npm install && npm run dev`
- Validate: `databricks apps validate`
- Deploy: `databricks apps deploy --profile <PROFILE>` (⚠️ USER CONSENT REQUIRED)
Use the parent databricks-core skill for data discovery (table search, schema exploration, query execution).
Before writing App.tsx, complete these steps:
- ✅ Create SQL files in `config/queries/`
- ✅ Run `npm run typegen` to generate query types
- ✅ Read `client/src/appKitTypes.d.ts` to see available query result types
- ✅ Verify component props via `npx @databricks/appkit docs` (check the relevant component page)
- ✅ Plan smoke test updates (default expects "Minimal Databricks App")
DO NOT write UI code until types are generated and verified.
Before running `databricks apps validate`:
- ✅ Update the `tests/smoke.spec.ts` heading selector to match your app title
- ✅ Update or remove the 'hello world' text assertion
- ✅ Verify `npm run typegen` has been run after all SQL files are finalized
- ✅ Ensure all numeric SQL values use `Number()` conversion in display code
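The two smoke-test updates above might look like this (a sketch, not the scaffolded file verbatim; "Sales Dashboard" stands in for your actual app title):

```typescript
// tests/smoke.spec.ts -- hedged sketch of the updated smoke test
import { test, expect } from '@playwright/test';

test('app renders', async ({ page }) => {
  await page.goto('/');
  // Was: heading "Minimal Databricks App" -- updated to the app's title
  await expect(
    page.getByRole('heading', { name: 'Sales Dashboard' })
  ).toBeVisible();
  // The scaffold's 'hello world' text assertion was removed here
});
```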
```
my-app/
├── server/
│   ├── server.ts          # Backend entry point (AppKit)
│   └── .env               # Optional local dev env vars (do not commit)
├── client/
│   ├── index.html
│   ├── vite.config.ts
│   └── src/
│       ├── main.tsx
│       └── App.tsx        # <- Main app component (start here)
├── config/
│   └── queries/
│       └── my_query.sql   # -> queryKey: "my_query"
├── app.yaml               # Deployment config
├── package.json
└── tsconfig.json
```
Key files to modify:
| Task | File |
|---|---|
| Build UI | `client/src/App.tsx` |
| Add SQL query | `config/queries/<NAME>.sql` |
| Add API endpoint | `server/server.ts` (tRPC) |
| Add shared helpers (optional) | create `shared/types.ts` or `client/src/lib/formatters.ts` |
| Fix smoke test | `tests/smoke.spec.ts` |
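As an illustration of the optional shared-helpers row above, a minimal `client/src/lib/formatters.ts` might look like the following (the function names are hypothetical, not part of AppKit; note the `Number()` coercion, since SQL numerics may arrive as strings):

```typescript
// client/src/lib/formatters.ts -- hypothetical shared display helpers.
// Coerce with Number() because SQL numeric columns may be strings.

function formatCurrency(value: number | string): string {
  return new Intl.NumberFormat("en-US", {
    style: "currency",
    currency: "USD",
  }).format(Number(value));
}

function formatCount(value: number | string): string {
  return new Intl.NumberFormat("en-US").format(Number(value));
}

console.log(formatCurrency("1234.5")); // $1,234.50
console.log(formatCount(98765)); // 98,765
```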
For type generation details, see: `npx @databricks/appkit docs ./docs/development/type-generation.md`
Quick workflow:
- Add/modify SQL in `config/queries/`
- Types auto-generate during dev via the Vite plugin (or run `npm run typegen` manually)
- Types appear in `client/src/appKitTypes.d.ts`
Step 1: Create SQL file `config/queries/my_data.sql`

```sql
SELECT category, COUNT(*) AS count FROM my_table GROUP BY category
```

Step 2: Use component (types auto-generated!)

```tsx
import { BarChart } from '@databricks/appkit-ui/react';

// Query mode: fetches data automatically
<BarChart queryKey="my_data" parameters={{}} />

// Data mode: pass static data directly (no queryKey/parameters needed)
<BarChart data={myData} xKey="category" yKey="count" />
```

Always use AppKit docs as the source of truth for API details.
```shell
npx @databricks/appkit docs          # show the docs index (start here)
npx @databricks/appkit docs <query>  # look up a section by name or doc path
```

Do not guess paths: run without args first, then pick from the index.
| When you're about to... | Read |
|---|---|
| Write SQL files | SQL Queries — parameterization, dialect, `sql.*` helpers |
| Use `useAnalyticsQuery` | AppKit SDK — memoization, conditional queries |
| Add chart/table components | Frontend — component quick reference, anti-patterns |
| Add API mutation endpoints | tRPC — only if you need server-side logic |
| Use Lakebase for CRUD / persistent state | Lakebase — `createLakebasePool`, tRPC patterns, schema init |
- SQL for data retrieval: Use `config/queries/` + visualization components. Never tRPC for SELECT.
- Numeric types: SQL numbers may return as strings. Always convert: `Number(row.amount)`
- Type imports: Use `import type { ... }` (verbatimModuleSyntax enabled).
- Charts are ECharts: No Recharts children — use props (`xKey`, `yKey`, `colors`). `xKey`/`yKey` auto-detect from schema if omitted.
- Two data modes: Charts/tables support query mode (`queryKey` + `parameters`) and data mode (static `data` prop).
- Conditional queries: Use the `autoStart: false` option or conditional rendering to control query execution.
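The numeric-types rule above can be sketched as follows (the row shape is hypothetical; the point is what happens when a string-typed SQL value is used without conversion):

```typescript
// A hypothetical SQL result row: DECIMAL/BIGINT columns can be
// serialized as strings in query results.
const row = { category: "books", amount: "42.5", count: "7" };

// Without conversion, + concatenates strings: "42.5" + "7" === "42.57"
const wrong = row.amount + row.count;

// With Number(), you get real arithmetic: 42.5 + 7 === 49.5
const total = Number(row.amount) + Number(row.count);

console.log(wrong, total); // 42.57 49.5
```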
- Display data from SQL?
  - Chart/Table → `BarChart`, `LineChart`, `DataTable` components
  - Custom layout (KPIs, cards) → `useAnalyticsQuery` hook
- Call a Databricks API? → tRPC (serving endpoints, MLflow, Jobs)
- Modify data? → tRPC mutations