SQL-like query builder for IndexedDB with Drizzle-style API
API Reference • Examples • Contributing
IndexedDB is a powerful browser-native database, but its low-level API can be cumbersome and complex to work with. Locality IDB simplifies IndexedDB interactions by providing a modern, type-safe, and SQL-like query builder inspired by Drizzle ORM.
If you're weighing Locality IDB against the raw IndexedDB API:
- Type safety: schema-driven types reduce runtime errors.
- Query ergonomics: the SQL-like query builder replaces verbose cursor boilerplate (see the comparison sketch after this list).
- Validation built-in: column type checks and custom validators run automatically.
- Consistency: reusable schema + shared helpers keep data access uniform.
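To make the ergonomics point concrete, here is a hedged sketch comparing a raw IndexedDB lookup with the equivalent builder call. It assumes the `users` table from the Quick Start below (with a unique `email` column) and a `db` instance created the same way; it is an illustration, not a prescribed pattern.

```ts
// Raw IndexedDB: open the database, start a transaction, and read through the 'email' index.
const request = indexedDB.open('my-app-db', 1);
request.onsuccess = () => {
  const rawDb = request.result;
  const tx = rawDb.transaction('users', 'readonly');
  const getReq = tx.objectStore('users').index('email').get('alice@example.com');
  getReq.onsuccess = () => console.log(getReq.result);
};

// Locality IDB: the same lookup as one chained, typed query (db as created in the Quick Start below).
const alice = await db.from('users').where('email', 'alice@example.com').findFirst();
console.log(alice);
```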
- Features
- Installation
- Quick Start
- Core Concepts
- Usage
- API Reference
- Type System
- FAQ / Common Pitfalls
- License
- 🎯 Type-Safe: Full TypeScript support with automatic type inference
- 🔍 SQL-like Queries: Familiar query syntax inspired by Drizzle ORM
- 🚀 Modern API: Clean and intuitive interface for IndexedDB operations
- 📦 Zero Dependencies: Lightweight with only development dependencies
- 🔄 Auto-Generation: Automatic UUID and timestamp generation
- 🎨 Schema-First: Define your database schema with a simple, declarative API
- 🛠️ Rich Column Types: Support for various data types including custom types
- ✅ Built-in Validation: Validation for built-in column types during insert and update operations
- 🔧 Custom Validators: Define custom validation logic for columns to enforce complex rules
- 🔒 Transactions: Execute multiple operations across tables with automatic rollback on failure
- 📤 Database Export: Export database data as JSON for backup, migration, or debugging
- 📥 Database Import: Import exported data with `'merge'`, `'replace'`, or `'upsert'` modes
# npm
npm install locality-idb
# pnpm
pnpm add locality-idb
# yarn
yarn add locality-idb

import { Locality, defineSchema, column } from 'locality-idb';
// Define your schema
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
name: column.text(),
email: column.text().unique(),
createdAt: column.timestamp(),
},
posts: {
id: column.int().pk().auto(),
userId: column.int().index(),
title: column.varchar(255),
content: column.text(),
createdAt: column.timestamp(),
},
});
// Initialize database
const db = new Locality({
dbName: 'my-app-db',
version: 1,
schema,
});
// Wait for database to be ready (optional)
await db.ready();
// Insert data
const user = await db
.insert('users')
.values({
name: 'Alice',
email: 'alice@example.com',
})
.run();
// Query data
const users = await db.from('users').findAll();
const alice = await db
.from('users')
.where((user) => user.email === 'alice@example.com')
.findFirst();
// Update data
await db
.update('users')
.set({ name: 'Alice Wonderland' })
.where((user) => user.id === 1)
.run();
// Delete data
await db
.delete('users')
.where((user) => user.id === 1)
  .run();

Check out the demo in the demo directory for examples covering basic CRUD, transactions, and database export. You can also try the live demo here: locality-idb-demo.vercel.app
Define your database schema using the defineSchema function:
import { defineSchema, column } from 'locality-idb';
const schema = defineSchema({
tableName: {
columnName: column.type().modifier(),
// ... more columns
},
// ... more tables
});

Important: Each table must have exactly one primary key defined using `.pk()`. Having zero or multiple primary keys will result in a runtime error.
// ✅ Valid - single primary key
const validSchema = defineSchema({
users: {
id: column.int().pk().auto(),
name: column.text(),
},
});
// ❌ Invalid - no primary key (runtime error)
const noPkSchema = defineSchema({
users: {
name: column.text(),
},
});
// ❌ Invalid - multiple primary keys (runtime error)
const multiPkSchema = defineSchema({
users: {
id: column.int().pk(),
uuid: column.uuid().pk(), // Error!
name: column.text(),
},
});

Locality IDB supports a wide range of column types:
| Type | Description | Example |
|---|---|---|
| `number()` / `float()` | Numeric values (integer or float) | `column.number()` |
| `int()` | Numeric values (only integers allowed) | `column.int()` |
| `numeric()` | Number or numeric string | `column.numeric()` |
| `bigint()` | Large integers | `column.bigint()` |
| `text()` / `string()` | Text strings | `column.text()` |
| `char(length?)` | Fixed-length string | `column.char(10)` |
| `varchar(length?)` | Variable-length string | `column.varchar(255)` |
| `email()` | Email strings | `column.email()` |
| `url()` | URL strings | `column.url()` |
| `bool()` / `boolean()` | Boolean values | `column.bool()` |
| `date()` | Date objects | `column.date()` |
| `timestamp()` | ISO 8601 timestamps (auto-generated) | `column.timestamp()` |
| `uuid()` | UUID strings (auto-generated v4) | `column.uuid()` |
| `object<T>()` | Generic objects | `column.object<UserData>()` |
| `array<T>()` | Arrays | `column.array<number>()` |
| `list<T>()` | Read-only arrays | `column.list<string>()` |
| `tuple<T>()` | Fixed-size tuples | `column.tuple<string, number>()` |
| `set<T>()` | Sets | `column.set<string>()` |
| `map<K,V>()` | Maps | `column.map<string, number>()` |
| `custom<T>()` | Custom types | `column.custom<MyType>()` |
Most column types support generic type parameters for creating branded types, literal unions, or domain-specific types:
// Basic usage
const age = column.int();
const price = column.float();
const score = column.number();
// Branded types for type safety
type UserId = Branded<number, 'UserId'>;
type ProductId = Branded<number, 'ProductId'>;
const schema = defineSchema({
users: {
id: column.int<UserId>().pk().auto(),
age: column.int(),
},
products: {
id: column.int<ProductId>().pk().auto(),
userId: column.int<UserId>(), // Type-safe foreign key
price: column.float(),
},
});
// ✅ Type safety prevents mixing IDs
const userId: UserId = 1 as UserId;
const productId: ProductId = 2 as ProductId;
// userId = productId; // ❌ Type error!

// Literal unions for enum-like behavior
type Role = 'admin' | 'user' | 'guest';
type Status = 'draft' | 'published' | 'archived';
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
role: column.text<Role>().default('user'),
status: column.string<Status>().default('draft'),
},
});
// Branded types for domain-specific strings
type Email = Branded<string, 'Email'>;
type URL = Branded<string, 'URL'>;
const profileSchema = defineSchema({
profiles: {
id: column.int().pk().auto(),
email: column.varchar<Email>(255).unique(),
website: column.varchar<URL>(500).optional(),
},
});
// or use specialized `email` & `url` methods + types with built-in validation
const advancedProfileSchema = defineSchema({
profiles: {
id: column.int().pk().auto(),
email: column.email().unique(), // Validates emails
website: column.url().optional(), // Validates URLs (internally uses URL constructor)
},
});

const schema = defineSchema({
sessions: {
id: column.uuid().pk(), // Auto-generated UUID v4
// Alternative to `id` above (a table may define only one primary key):
// idWithDefault: column.uuid().pk().default(uuid({ version: 'v6' })), // Replace auto-generated UUID v4 with v6
createdAt: column.timestamp(), // Auto-generated timestamp
defaultTs: column.timestamp().default(getTimestamp()), // Auto-generated timestamp with default using utility built-in function
customTs: column.timestamp().default(new Chronos().toLocalISOString() as Timestamp), // Default timestamp with custom format
},
});

Note:
- Auto-generated values can be overridden by providing explicit values during insert (see the sketch after this list).
- Use the `default()` modifier to set custom default values instead of auto-generated ones.
- Auto-generated values are generated at runtime during insert operations.
- The `onUpdate()` modifier can be used to auto-update values on update operations (e.g. an `updatedAt` timestamp).
- Type extensions for `uuid` and `timestamp` are not applicable since they are already typed.
- For custom UUID versions, use the `uuid` utility from `nhb-toolbox`.
- For custom timestamp formats, use date libraries like `Chronos` or `getTimestamp` (from `nhb-toolbox`), or `date-fns`, to generate ISO 8601 strings.
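As a hedged illustration of the first point above, explicit values can be passed for columns that would otherwise be auto-generated, reusing the `sessions` table from the example above. Whether `uuidV4()` and `getTimestamp()` return the exact branded types the columns expect is an assumption; add a cast if your setup requires it.

```ts
import { uuidV4, getTimestamp } from 'locality-idb';

const session = await db
  .insert('sessions')
  .values({
    id: uuidV4(),                                    // overrides the auto-generated UUID
    createdAt: getTimestamp(new Date('2025-06-01')), // overrides the auto-generated "now"
  })
  .run();
console.log(session.id, session.createdAt);
```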
// Branded booleans for clarity
type EmailVerified = Branded<boolean, 'EmailVerified'>;
type TwoFactorEnabled = Branded<boolean, 'TwoFactorEnabled'>;
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
emailVerified: column.bool<EmailVerified>().default(false as EmailVerified),
twoFactorEnabled: column.boolean<TwoFactorEnabled>().default(false as TwoFactorEnabled),
},
});

// Object with typed structure
interface UserProfile {
avatar: string;
bio: string;
socials: {
twitter?: string;
github?: string;
};
}
// Array of typed elements
interface Comment {
author: string;
text: string;
date: string;
}
// Map with typed keys and values
interface CacheEntry {
value: any;
expires: number;
}
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
profile: column.object<UserProfile>(),
tags: column.array<string>(),
comments: column.array<Comment>(),
permissions: column.set<'read' | 'write' | 'delete'>(),
cache: column.map<string, CacheEntry>(),
},
// Tuples for fixed structures
locations: {
id: column.int().pk().auto(),
coordinates: column.tuple<number, number>(), // [latitude, longitude]
rgbColor: column.tuple<number, number, number>(), // [r, g, b]
},
// List (readonly array)
config: {
id: column.int().pk().auto(),
allowedOrigins: column.list<string>(), // Immutable at type level
},
});

// Numeric accepts both number and numeric strings
const schema = defineSchema({
products: {
id: column.int().pk().auto(),
serialNumber: column.numeric(), // Can be 123 or "123"
largeId: column.bigint(), // For very large integers
},
});
// Branded Numeric types
type SerialNumber = Branded<Numeric, 'SerialNumber'>;
type SnowflakeId = Branded<bigint, 'SnowflakeId'>;
const advancedSchema = defineSchema({
items: {
id: column.int().pk().auto(),
serial: column.numeric<SerialNumber>(),
snowflake: column.bigint<SnowflakeId>(),
},
});

Note: Type extensions are compile-time only and do not affect runtime validation. Use custom validators for runtime type enforcement.
Locality IDB provides powerful type inference utilities:
import type { InferSelectType, InferInsertType, InferUpdateType } from 'locality-idb';
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
name: column.text(),
email: column.text().unique(),
age: column.int().optional(),
createdAt: column.timestamp(),
},
});
// Infer types from schema
type User = InferSelectType<typeof schema.users>;
// { id: number; name: string; email: string; age?: number; createdAt: string }
type InsertUser = InferInsertType<typeof schema.users>;
// { name: string; email: string; age?: number; id?: number; createdAt?: string }
type UpdateUser = InferUpdateType<typeof schema.users>;
// { name?: string; email?: string; age?: number; createdAt?: string }

import { Locality, defineSchema, column } from 'locality-idb';
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
name: column.text(),
email: column.text().unique(),
},
});
const db = new Locality({
dbName: 'my-database',
version: 1,
schema,
});
// Optional: Wait for database initialization
await db.ready();

const user = await db
.insert('users')
.values({
name: 'John Doe',
email: 'john@example.com',
})
.run();
console.log(user); // { id: 1, name: 'John Doe', email: 'john@example.com' }

const users = await db
.insert('users')
.values([
{ name: 'Alice', email: 'alice@example.com' },
{ name: 'Bob', email: 'bob@example.com' },
])
.run();
console.log(users); // Array of inserted users

const schema = defineSchema({
posts: {
id: column.uuid().pk(),
title: column.text(),
createdAt: column.timestamp(),
isPublished: column.bool().default(false),
},
});
const post = await db
.insert('posts')
.values({ title: 'My First Post' })
.run();
// id and createdAt are auto-generated, isPublished defaults to false
console.log(post);
// {
// id: "550e8400-e29b-41d4-a716-446655440000",
// title: "My First Post",
// createdAt: "2026-01-29T12:34:56.789Z",
// isPublished: false
// }

const allUsers = await db.from('users').findAll();

// Predicate-based filtering (in-memory)
const admins = await db
.from('users')
.where((user) => user.role === 'admin')
.findAll();
const activeUsers = await db
.from('users')
.where((user) => user.isActive && user.age >= 18)
.findAll();
// Index-based filtering (optimized) - requires index or primary key
const usersByEmail = await db
.from('users')
.where('email', 'alice@example.com')
.findAll();
// Range queries with IDBKeyRange
const adults = await db
.from('users')
.where('age', IDBKeyRange.bound(18, 65))
  .findAll();

// Include only specified columns
const userNames = await db
.from('users')
.select({ name: true, email: true })
.findAll();
// Returns: Array<{ name: string; email: string }>
// Exclude specified columns
const usersWithoutPassword = await db
.from('users')
.select({ password: false })
.findAll();
// Returns: Array<Omit<User, 'password'>>

const sortedUsers = await db
.from('users')
.orderBy('name', 'asc') // or 'desc'
.findAll();
// Supports nested keys
const sorted = await db
.from('users')
.orderBy('profile.age', 'desc')
  .findAll();

const topTenUsers = await db
.from('users')
.orderBy('createdAt', 'desc')
.limit(10)
  .findAll();

const user = await db
.from('users')
.where((user) => user.email === 'john@example.com')
.findFirst();
// Returns: User | null

// Optimized O(1) lookup using IndexedDB's get()
const user = await db.from('users').findByPk(1);
// Returns: User | null
// Works with select projection
const userName = await db
.from('users')
.select({ name: true, email: true })
.findByPk(1);
// Returns: { name: string; email: string } | null

// Find records using an indexed field (optimized index query)
const usersByEmail = await db
.from('users')
.findByIndex('email', 'alice@example.com');
// Returns: User[]
// Find by numeric index
const youngUsers = await db
.from('users')
.findByIndex('age', 25);
// Returns: User[]
// Works with all query modifiers
const result = await db
.from('users')
.select({ name: true, age: true })
.findByIndex('age', 30)
.where((user) => user.isActive)
  .limit(5);

// Optimized cursor-based sorting (no in-memory sort needed!)
const sortedUsers = await db
.from('users')
.sortByIndex('age', 'desc')
.findAll();
// Combine with limit for efficient pagination
const topTenOldest = await db
.from('users')
.sortByIndex('age', 'desc')
.limit(10)
.findAll();
// Works with select projection
const names = await db
.from('users')
.select({ name: true, age: true })
.sortByIndex('age', 'asc')
  .findAll();

Note: `sortByIndex()` uses IndexedDB cursor iteration for optimal performance, even when a `where()` filter is applied without an index.
const result = await db
.from('users')
.select({ id: true, name: true, email: true })
.where((user) => user.age >= 18)
.orderBy('name', 'asc')
.limit(5)
  .findAll();

// Update with condition
const updatedCount = await db
.update('users')
.set({ name: 'Jane Doe', age: 30 })
.where((user) => user.id === 1)
.run();
console.log(`Updated ${updatedCount} records`);
// Update all matching records
await db
.update('users')
.set({ isActive: false })
.where((user) => user.lastLogin < '2025-01-01')
  .run();

// Delete with condition
const deletedCount = await db
.delete('users')
.where((user) => user.id === 1)
.run();
console.log(`Deleted ${deletedCount} records`);
// Delete multiple records
await db
.delete('users')
.where((user) => user.isDeleted === true)
  .run();

Transactions enable you to perform multiple operations across multiple tables atomically. All operations in a transaction either succeed together or fail together, ensuring data consistency.
// Create a user and their first post atomically
await db.transaction(['users', 'posts'], async (ctx) => {
const newUser = await ctx
.insert('users')
.values({ name: 'John Doe', email: 'john@example.com' })
.run();
await ctx
.insert('posts')
.values({
userId: newUser.id,
title: 'My First Post',
content: 'Hello World!',
})
.run();
});

// Transfer data between tables atomically
await db.transaction(['users', 'posts', 'comments'], async (ctx) => {
// Update user
await ctx
.update('users')
.set({ isActive: true })
.where((user) => user.id === 1)
.run();
// Create post
const post = await ctx
.insert('posts')
.values({ userId: 1, title: 'New Post', content: 'Content' })
.run();
// Add comment
await ctx
.insert('comments')
.values({ postId: post.id, userId: 1, text: 'First comment!' })
.run();
// Query within transaction
const userPosts = await ctx
.from('posts')
.where((p) => p.userId === 1)
.findAll();
console.log(`User now has ${userPosts.length} posts`);
});

try {
await db.transaction(['users', 'posts'], async (ctx) => {
const user = await ctx
.insert('users')
.values({ name: 'Alice', email: 'alice@example.com' })
.run();
// This will cause the entire transaction to rollback
throw new Error('Something went wrong!');
// This never executes
await ctx
.insert('posts')
.values({ userId: user.id, title: 'Post' })
.run();
});
} catch (error) {
console.error('Transaction failed:', error);
// No data was inserted - transaction was rolled back
}

Note:
- Transactions guarantee atomicity: all operations succeed or all fail.
- If any operation fails or an error is thrown, the entire transaction is automatically rolled back.
- Transaction context (`ctx`) provides `insert()`, `update()`, `delete()`, and `from()` methods.
- All operations must be performed on tables specified in the transaction.
Export your database data as JSON for backup, migration, or debugging purposes. The export includes metadata and table data, and automatically triggers a browser download.
// Export entire database with pretty-printed JSON
await db.export();
// Downloads: my-database-2026-02-04T10-30-45-123Z.json

// Export only users and posts tables
await db.export({
tables: ['users', 'posts'],
filename: 'users-posts-backup.json',
});

// Export with custom configuration
await db.export({
tables: ['users'], // Optional: specific tables
filename: 'users-export.json', // Optional: custom filename
pretty: false, // Optional: compact JSON (default: true)
includeMetadata: true, // Optional: include metadata (default: true)
});

The exported JSON file contains:
{
"metadata": {
"dbName": "my-database",
"version": 1,
"exportedAt": "2026-02-04T10:30:45.123Z",
"tables": ["users", "posts"]
},
"data": {
"users": [
{ "id": 1, "name": "Alice", "email": "alice@example.com" },
{ "id": 2, "name": "Bob", "email": "bob@example.com" }
],
"posts": [
{ "id": 1, "userId": 1, "title": "First Post", "content": "..." }
]
}
}

Note:
- Exported files are automatically downloaded in the browser.
- Default filename format: `{dbName}-{timestamp}.json`
- Metadata is included by default but can be disabled.
- Use `pretty: true` (default) for human-readable JSON.
- Use `pretty: false` for compact JSON (smaller file size).
- Export uses a single readonly transaction for a consistent snapshot.
Import previously exported data or raw table data. Supports merge, replace, and upsert modes.
const exported = await db.exportToObject();
// Merge into existing tables (default)
await db.import(exported);

await db.import(exported, { mode: 'replace' });

await db.import(exported, { mode: 'upsert' });

Use cursor-based pagination for efficient scrolling through large tables.
const page1 = await db.from('users').sortByIndex('id').page({ limit: 20 });
const page2 = await db.from('users').sortByIndex('id').page({
limit: 20,
cursor: page1.nextCursor,
});

Note:
- `page()` requires `sortByIndex()` when ordering is needed.
- `page()` does not support combining `cursor` with `where(indexName, query)` at the moment.
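For walking an entire table, the cursor can be fed back until it runs out. A minimal sketch, assuming the `page()`/`nextCursor` shape shown above:

```ts
let cursor: IDBValidKey | undefined = undefined;
do {
  const page = await db.from('users').sortByIndex('id').page({ limit: 100, cursor });
  for (const user of page.items) {
    console.log(user.email); // process each row of the current page
  }
  cursor = page.nextCursor;  // undefined once the table is exhausted
} while (cursor !== undefined);
```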
Stream rows with a cursor to avoid loading everything into memory.
await db.from('users').sortByIndex('id').stream(async (row) => {
console.log(row.email);
});

Note: `stream()` does not support in-memory `orderBy()`. Use `sortByIndex()` instead.
await db.clearAll();

await db.dropTable('users');
// You may need to recreate a Locality instance with an updated schema after dropping.

new Locality<DBName, Version, Schema>(config: LocalityConfig)

Parameters:
- `config.dbName`: Database name (string)
- `config.version`: Database version (optional, default: 1)
- `config.schema`: Schema definition object
Example:
const db = new Locality({
dbName: 'my-database',
version: 1,
schema: mySchema,
});

Gets the current database name.
Returns: The database name
Example:
const db = new Locality({
dbName: 'my-database',
version: 1,
schema: mySchema,
});
await db.ready(); // (optional) for extra safety
console.log(db.dbName); // 'my-database'

Gets the current database version.
Returns: The database version number
Example:
const db = new Locality({
dbName: 'my-database',
version: 2,
schema: mySchema,
});
await db.ready(); // (optional) for extra safety
console.log(db.version); // 2

Gets all table (store) names in the current database.
Returns: Array of table names
Example:
const tables = db.tableList;
console.log(tables); // ['users', 'posts', 'comments']

Gets the list of all existing IndexedDB databases in the current origin.
Returns: Array of database information objects containing name and version
Example:
const databases = await db.dbList;
console.log(databases);
// [{ name: 'my-database', version: 1 }, { name: 'other-db', version: 2 }]

This is an instance method that calls the static `Locality.getDatabaseList()` internally.
Waits for database initialization to complete.
await db.ready();

Creates a SELECT query for the specified table.
const query = db.from('users');

Creates an INSERT query for the specified table.
const query = db.insert('users');

Creates an UPDATE query for the specified table.
const query = db.update('users');

Creates a DELETE query for the specified table.
const query = db.delete('users');

Clears all records from the specified table.
await db.clearTable('users');

Warning: This will remove all data and cannot be undone.
Deletes the entire database (current database).
Note: This method uses the `deleteDB` utility function internally and closes the database connection before deletion.

await db.deleteDB();

Warning: This will remove all data and cannot be undone.
Closes the database connection.
db.close();

Gets the underlying IDBDatabase instance.
const idb = await db.getDBInstance();

Inserts seed data into the specified table.
Note:
- This is a convenience method for inserting initial data.
- It uses the `insert` method internally.
- It does not clear existing data before inserting.
- Accepts only an array of records (for single-record insertion, use `insert().values().run()`).
Parameters:
- `table`: Table name
- `data`: Array of records to insert
Returns: Array of inserted record(s)
Example:
await db.seed('users', [
{ name: 'Alice', email: 'alice@wonderland.mad', },
{ name: 'Bob', email: 'bob@top.com', },
]);
const allUsers = await db.from('users').findAll();
console.log(allUsers);

Executes multiple database operations across multiple tables in a single atomic transaction.
Parameters:
- `tables`: Array of table names to include in the transaction
- `callback`: Async function that receives a transaction context and performs operations
Returns: Promise that resolves when the transaction completes successfully
Transaction Context Methods:
- `ctx.insert(table)`: Insert records within the transaction
- `ctx.update(table)`: Update records within the transaction
- `ctx.delete(table)`: Delete records within the transaction
- `ctx.from(table)`: Query records within the transaction
Example:
// Create user and post atomically
await db.transaction(['users', 'posts'], async (ctx) => {
const user = await ctx
.insert('users')
.values({ name: 'Alice', email: 'alice@example.com' })
.run();
await ctx
.insert('posts')
.values({ userId: user.id, title: 'First Post', content: 'Hello!' })
.run();
});

Important:
- All operations succeed or all fail (atomicity).
- If any operation fails or an error is thrown, the entire transaction is rolled back.
- Only tables specified in the `tables` array can be accessed within the transaction.
- Transactions use IndexedDB's native transaction mechanism.
Exports database data as a JSON file and triggers a browser download.
Parameters:
- `options`: Optional export configuration
- `options.tables`: Array of table names to export (default: all tables)
- `options.filename`: Custom filename (default: `{dbName}-{timestamp}.json`)
- `options.pretty`: Enable pretty-printed JSON (default: `true`)
- `options.includeMetadata`: Include export metadata (default: `true`)
Returns: Promise that resolves when the export completes
Example:
// Export all tables with default settings
await db.export();
// Export specific tables with custom filename
await db.export({
tables: ['users', 'posts'],
filename: 'backup-2026-02-04.json',
pretty: true,
});
// Export without metadata in compact format
await db.export({
pretty: false,
includeMetadata: false,
});

Exported JSON Structure:
/** Exported table data structure */
type ExportedTableData<T extends string, S extends SchemaDefinition> = {
[K in T]: InferSelectType<S[K]>[];
};
/** Exported database data structure */
type ExportData<T extends string, S extends SchemaDefinition> = {
/** Optional metadata about the export */
metadata?: {
/** Database name */
dbName: string;
/** Database version */
version: number;
/** Export creation time */
exportedAt: Timestamp;
/** List of exported table names */
tables: T[];
};
/** Actual exported data, mapping table names to arrays of records */
data: ExportedTableData<T, S>;
};

Exports database data as an object without triggering a download.
Parameters: Same as export(), except filename and pretty are omitted.
Returns: Promise that resolves to an ExportData object.
Example:
const exported = await db.exportToObject();

Imports database data using merge, replace, or upsert modes.
Parameters:
- `data`: Exported data or raw table data object
- `options.mode`: `'merge' | 'replace' | 'upsert'` (default: `'merge'`)
- `options.tables`: Optional list of table names to import
Example:
await db.import(exported, { mode: 'replace' });

Note: `merge` uses `add()` under the hood, so primary key or unique conflicts will abort the transaction.
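A hedged sketch of handling that case: catch the failed `merge` import and retry with `upsert`. This assumes `import()` rejects when the underlying transaction aborts; adapt the recovery strategy to your data.

```ts
try {
  await db.import(exported); // default mode: 'merge'
} catch (error) {
  console.warn('Merge import failed (likely a key conflict), retrying with upsert:', error);
  await db.import(exported, { mode: 'upsert' });
}
```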
Clears all records from all tables in a single transaction.
await db.clearAll();

Drops an object store by name and bumps the database version internally.
You should re-instantiate `Locality` with an updated schema after dropping.

await db.dropTable('users');

Note:
- Automatically triggers a file download in the browser.
- Exported data includes all records from specified tables.
- Use for backup, debugging, or data migration.
- File download works in browser environments only.
Gets the list of all existing IndexedDB databases in the current origin (static method).
Returns: Array of database information objects containing name and version
Example:
import { Locality } from 'locality-idb';
const databases = await Locality.getDatabaseList();
console.log(databases);
// [{ name: 'app-db', version: 1 }, { name: 'cache-db', version: 2 }]

Note:
- This method requires IndexedDB support in the browser.
- Returns an empty array if the browser doesn't support `indexedDB.databases()`.
- Can be called without instantiating the Locality class.
Deletes an IndexedDB database by name (static method).
Parameters:
name: The name of the database to delete
Returns: Promise that resolves when the database is deleted
Example:
import { Locality } from 'locality-idb';
// Delete a database without creating an instance
await Locality.deleteDatabase('old-database');
// Alternative: Get list of databases first
const databases = await Locality.getDatabaseList();
for (const db of databases) {
if (db.name.startsWith('temp-')) {
await Locality.deleteDatabase(db.name);
}
}

Warning: This will permanently remove all data from the specified database and cannot be undone.

Note: This is a static method that can be called without creating a Locality instance. For deleting the current database instance, use the instance method `db.deleteDB()` instead.
defineSchema<Schema extends ColumnRecord, Keys extends keyof Schema>(schema: Schema): SchemaRecord<Schema, Keys>
Defines a database schema from an object mapping table names to column definitions.
Parameters:
schema: Object with table names as keys and column definitions as values
Returns: Schema object with typed tables
Example:
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
name: column.text(),
},
posts: {
id: column.int().pk().auto(),
title: column.text(),
},
});

Creates a single table definition (alternative to defineSchema).
Parameters:
- `name`: Table name
- `columns`: Column definitions object
Returns: Table instance
Example:
const userTable = table('users', {
id: column.int().pk().auto(),
name: column.text(),
});

All column types support the following modifiers:
Marks the column as the primary key.
column.int().pk()

Enables auto-increment (only for numeric columns: int, float, number).
column.int().pk().auto()

Marks the column as unique and creates an index.
column.text().unique()

Creates an index on the column.
column.int().index()

Makes the column optional (nullable).
column.text().optional()

Sets a default value for the column.
column.bool().default(true)
column.text().default('N/A')

Adds custom validation logic to the column. The validation function receives the column value and should return:
- `null` or `undefined` if the value is valid
- An error message `string` if the value is invalid
When it runs: During insert and update operations, before data is saved to IndexedDB.
Error Handling: If validation fails, a `TypeError` is thrown with details about the invalid field.
Precedence: Custom validators override built-in type validation. If you provide a custom validator, the built-in type check for that column will be skipped.
Note:
- Custom validation is not applied to auto-generated values (e.g. auto-increment, UUID, timestamp). But default values are validated if `.default(value)` is used.
- If multiple validators are chained, only the last one is used.
- Built-in type validation still applies to all other columns without custom validators.
- If the column is optional, the validator is only called when a value is provided (not `undefined`).
// Email validation
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
email: column.text().validate((val) => {
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
return emailRegex.test(val) ? null : 'Invalid email format';
}),
age: column.int().validate((val) => {
if (val < 0) return 'Age cannot be negative';
if (val > 120) return 'Age must be 120 or less';
return null; // Valid
}),
username: column.text().validate((val) => {
if (val.length < 3) return 'Username must be at least 3 characters';
if (!/^[a-zA-Z0-9_]+$/.test(val)) return 'Username can only contain letters, numbers, and underscores';
return null;
}),
},
});
// ✅ Valid insert
await db.insert('users').values({
email: 'user@example.com',
age: 25,
username: 'john_doe'
}).run();
// ❌ Throws TypeError: Invalid value for field 'email' in table 'users': Invalid email format
await db.insert('users').values({
email: 'invalid-email',
age: 25,
username: 'john_doe'
}).run();
// ❌ Throws TypeError: Invalid value for field 'age' in table 'users': Age cannot be negative
await db.insert('users').values({
email: 'user@example.com',
age: -5,
username: 'john_doe'
}).run();

Combining with `.optional()`:
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
// Custom validation only runs when value is provided
bio: column.text().optional().validate((val) => {
return val.length <= 500 ? null : 'Bio must be 500 characters or less';
}),
},
});
// ✅ Valid - bio is optional and omitted
await db.insert('users').values({}).run();
// ✅ Valid - bio is provided and valid
await db.insert('users').values({ bio: 'Short bio' }).run();
// ❌ Throws TypeError - bio provided but exceeds 500 chars
await db.insert('users').values({ bio: 'x'.repeat(501) }).run();

Access the ValidateFn symbol (advanced):
import { ValidateFn } from 'locality-idb';
// Access validator function programmatically
const emailColumn = column.text().validate((val) => { /* ... */ });
const validatorFn = emailColumn[ValidateFn]; // Function reference

Sets a function to auto-update the column value during update operations.
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
name: column.text(),
updatedAt: column.timestamp().onUpdate(() => getTimestamp()),
},
});

Note:
- The updater function is called automatically during update operations.
- Important: It overrides any value provided during updates.
- It receives the current value of the column and should return the updated value.
- This is useful for fields like `updatedAt` timestamps that need to be refreshed on each update.
- If multiple updaters are chained, only the last one is used.
- Should not be used with auto-generated indexed columns like primary keys.
- The updated value is validated according to the column's type and custom validators (if any).
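Since the updater receives the current value, it can also derive the next value from it. A small sketch under that assumption (the exact callback signature is inferred from the note above):

```ts
import { defineSchema, column, getTimestamp } from 'locality-idb';

const docsSchema = defineSchema({
  documents: {
    id: column.int().pk().auto(),
    title: column.text(),
    revision: column.int().default(1).onUpdate((current) => current + 1), // bump revision on every update
    updatedAt: column.timestamp().onUpdate(() => getTimestamp()),         // refresh timestamp on every update
  },
});
```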
Access the OnUpdate symbol (advanced):
import { OnUpdate } from 'locality-idb';
// Access updater function programmatically
const updatedAtColumn = column.timestamp().onUpdate(() => getTimestamp());
const updaterFn = updatedAtColumn[OnUpdate]; // Function reference

Selects or excludes specific columns.
// Include specific columns
db.from('users').select({ name: true, email: true })
// Exclude specific columns
db.from('users').select({ password: false })

Filters rows based on a predicate function.
db.from('users').where((user) => user.age >= 18)

Filters rows using an indexed field or primary key.
Type Safety: indexName must be either an indexed field or the primary key.
Performance: Uses IndexedDB's optimized index/key query for efficient lookups.
// Using an indexed field
db.from('users').where('age', IDBKeyRange.bound(18, 30))
// Using primary key
db.from('users').where('id', IDBKeyRange.bound(1, 100))

Sorts results by an indexed field using IndexedDB cursor iteration (avoiding in-memory sorting).
Type Safety: indexName must be a field with an index or the primary key.
Performance: Uses IndexedDB's cursor for optimized sorting. For large datasets, this is significantly more efficient than in-memory sorting.
For sorting on non-indexed fields, use `orderBy()`, which performs in-memory sorting.
// Optimized cursor-based sort
const sorted = await db.from('users').sortByIndex('age', 'desc').findAll();
// Efficient pagination
const page = await db.from('users').sortByIndex('createdAt', 'desc').limit(20).findAll();

Orders results by a specified key. Supports nested keys using dot notation.
db.from('users').orderBy('name', 'asc')
db.from('users').orderBy('profile.age', 'desc')Note: This method performs in-memory sorting. For large datasets, consider using
sortByIndex()with an indexed field for better performance.
Limits the number of results.
db.from('users').limit(10)

Fetches records using cursor-based pagination.
const page1 = await db.from('users').sortByIndex('id').page({ limit: 20 });
const page2 = await db.from('users').sortByIndex('id').page({
limit: 20,
cursor: page1.nextCursor,
});

Notes:
- `page()` does not support in-memory `orderBy()`. Use `sortByIndex()` instead.
- `page()` does not support combining `cursor` with `where(indexName, query)`.
- `nextCursor` may be `undefined` when there are no more results.
Streams records using a cursor for low-memory iteration.
await db.from('users').sortByIndex('id').stream((row) => {
console.log(row.email);
});

Fetches all matching records.
const users = await db.from('users').findAll()

Fetches the first matching record.
const user = await db.from('users').findFirst()

Finds a single record by its primary key value using IndexedDB's optimized get() method.
Performance: O(1) lookup
const user = await db.from('users').findByPk(1);
const post = await db.from('posts').findByPk('some-uuid-string');

Finds records using an indexed field. Only accepts field names that are marked with `.index()` or `.unique()`.
Type Safety: indexName must be a field with an index.
Performance: Uses IndexedDB's index query for optimized lookups.
// Type-safe: 'email' must be indexed
const users = await db.from('users').findByIndex('email', 'alice@example.com');
// Works with IDBKeyRange for range queries
const adults = await db.from('users').findByIndex('age', IDBKeyRange.bound(18, 65));

Note:
- Unique columns are automatically indexed.
- Unique indexes are recommended for this method to ensure a single result.
Counts the number of matching records.
const userCount = await db.from('users').where((user) => user.isActive).count()

Note:
- Uses IndexedDB's optimized `count()` when:
  - No `where()` clause is applied, OR
  - `where()` uses an index or primary key
- Falls back to in-memory counting when `where()` uses a predicate function
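A quick sketch contrasting the two paths (assuming `email` is indexed, as in the earlier schema examples):

```ts
// Indexed where(): served by IndexedDB's native count()
const byEmail = await db.from('users').where('email', 'alice@example.com').count();

// Predicate where(): records are loaded and counted in memory
const adultCount = await db.from('users').where((user) => user.age >= 18).count();
```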
Checks if any matching records exist.
const hasAdmins = await db.from('users').where((user) => user.role === 'admin').exists()

Note: This method internally uses `count()` for checking existence.
Sets the data to insert (single object or array).
db.insert('users').values({ name: 'John' })
db.insert('users').values([{ name: 'John' }, { name: 'Jane' }])

Executes the insert query and returns the inserted record(s).
const user = await db.insert('users').values({ name: 'John' }).run()

Sets the values to update.
db.update('users').set({ name: 'Jane', age: 30 })

Filters rows to update.
db.update('users').set({ isActive: false }).where((user) => user.id === 1)

Filters rows to update using an indexed field.
db.update('users')
.set({ isActive: true })
  .where('email', 'alice@example.com')

Executes the update query and returns the number of updated records.
const count = await db.update('users').set({ name: 'Jane' }).run()

Filters rows to delete.
db.delete('users').where((user) => user.id === 1)

Filters rows to delete using an indexed field.
db.delete('users').where('email', 'alice@example.com')

Executes the delete query and returns the number of deleted records.
const count = await db.delete('users').where((user) => user.id === 1).run()

Generates a random UUID v4 string.
Uses Web Crypto (`crypto.randomUUID` or `crypto.getRandomValues`) when available, and falls back to `Math.random()` when not.
Parameters:
`uppercase`: Whether to return uppercase format (optional, default: `false`)
Returns: UUID v4 string
Example:
import { uuidV4 } from 'locality-idb';
const id = uuidV4(); // "550e8400-e29b-41d4-a716-446655440000"
const upperId = uuidV4(true); // "550E8400-E29B-41D4-A716-446655440000"

Gets a timestamp in ISO 8601 format from various input types.
Can be used to generate current timestamp or convert existing date inputs to use as timestamp.
Parameters:
`value`: Optional date input:
- `string`: ISO date string or any valid date string
- `number`: Unix timestamp (milliseconds)
- `Date`: Date object
Returns: ISO 8601 timestamp string
Remarks: If no value is provided or the provided value is invalid, the current date and time will be used.
Example:
import { getTimestamp } from 'locality-idb';
// Current timestamp
const now = getTimestamp(); // "2026-01-29T12:34:56.789Z"
// From Date object
const fromDate = getTimestamp(new Date('2025-01-01')); // "2025-01-01T00:00:00.000Z"
// From ISO string
const fromString = getTimestamp('2025-06-15T10:30:00.000Z'); // "2025-06-15T10:30:00.000Z"
// From Unix timestamp
const fromUnix = getTimestamp(1704067200000); // "2024-01-01T00:00:00.000Z"
// Invalid input falls back to current time
const fallback = getTimestamp('invalid'); // Current timestamp

Checks if a value is a valid Timestamp string in ISO 8601 format.
Parameters:
value: The value to check
Returns: true if the value is a valid Timestamp, otherwise false
Example:
import { isTimestamp } from 'locality-idb';
isTimestamp('2026-01-29T12:34:56.789Z'); // true
isTimestamp('2026-01-29'); // false (not full ISO 8601)
isTimestamp('invalid'); // false
isTimestamp(123); // false

Checks if a value is a valid UUID string (v1, v4, or v5).
Parameters:
value: The value to check
Returns: true if the value is a valid UUID, otherwise false
Example:
import { isUUID } from 'locality-idb';
// Valid UUIDs
isUUID('d9428888-122b-11e8-b642-0ed5f89f718b'); // true (v1)
isUUID('6ba7b810-9dad-11d1-80b4-00c04fd430c8'); // true (v1)
isUUID('550e8400-e29b-41d4-a716-446655440000'); // true (v4)
// Invalid formats
isUUID('not-a-uuid'); // false
isUUID('12345678-1234-1234-1234-123456789abc'); // false (invalid version)
isUUID(123456789); // false

Checks if a value is a valid email string.
Parameters:
value: The value to check
Returns: true if the value is a valid email, otherwise false
Example:
import { isEmail } from 'locality-idb';
// Valid emails
isEmail('user@example.com'); // true
isEmail('first.last@sub.domain.co.uk'); // true
isEmail('user+filter@example.org'); // true
// Invalid emails
isEmail('plain-string'); // false
isEmail('user@.com'); // false
isEmail('@example.com'); // false
isEmail('user@domain'); // false
isEmail(12345); // false

Checks if a value is a valid URL string.
Parameters:
value: The value to check
Returns: true if the value is a valid URL, otherwise false
Example:
import { isURL } from 'locality-idb';
// Valid URLs
isURL('https://example.com'); // true
isURL('ftp://files.test/path?q=1'); // true
// Invalid URLs
isURL('example.com'); // false (missing protocol)
isURL('http://'); // false (empty domain)
isURL('//cdn.domain/image.png'); // false (`URL` constructor cannot parse it)
isURL(123456); // false

Opens an IndexedDB database with specified stores (low-level API).
Internally used by the `Locality` class. Can be used for custom setups.
Parameters:
- `name`: Database name
- `stores`: Array of store configurations
- `version`: Database version (optional, default: `undefined`)
Returns: Promise resolving to IDBDatabase instance
Example:
import { openDBWithStores } from 'locality-idb';
const db = await openDBWithStores(
'my-db',
[
{
name: 'users',
keyPath: 'id',
autoIncrement: true,
},
],
1
);

Deletes an IndexedDB database by name.
Parameters:
name: The name of the database to delete
Returns: A promise that resolves when the database is deleted
Example:
import { deleteDB } from 'locality-idb';
await deleteDB('my-database');

Warning: This will remove all data and cannot be undone.
Locality IDB includes built-in validation that automatically validates data types for built-in column types during insert and update operations based on your schema definitions.
When you insert or update records, Locality IDB automatically validates that the values match their expected column types:
const schema = defineSchema({
users: {
id: column.int().pk().auto(),
name: column.text(),
age: column.int(),
email: column.varchar(255),
},
});
const db = new Locality({ dbName: 'app', schema });
// ✅ Valid - all types match
await db.insert('users').values({ name: 'Alice', age: 25, email: 'alice@example.com' }).run();
// ❌ Throws TypeError - age must be an integer
await db.insert('users').values({ name: 'Bob', age: 'twenty', email: 'bob@example.com' }).run();
// ❌ Throws TypeError - email exceeds varchar(255) length
await db.insert('users').values({ name: 'Charlie', age: 30, email: 'a'.repeat(300) }).run();

Manually validate if a value matches the specified column data type.
Parameters:
- `type`: The column data type (e.g., `'int'`, `'text'`, `'uuid'`, `'varchar(255)'`)
- `value`: The value to validate
Returns: null if valid, otherwise an error message string
Example:
import { validateColumnType } from 'locality-idb';
validateColumnType('int', 42); // null (valid)
validateColumnType('int', 'hello'); // "'\"hello\"' is not an integer"
validateColumnType('text', 'hello'); // null (valid)
validateColumnType('uuid', '550e8400-e29b-41d4-a716-446655440000'); // null (valid)
validateColumnType('varchar(5)', 'hi'); // null (valid)
validateColumnType('varchar(5)', 'hello world'); // error message
validateColumnType('numeric', 42); // null (valid)
validateColumnType('numeric', '3.14'); // null (valid)
validateColumnType('numeric', 'abc'); // error message

The following column types are validated:
| Type | Validation Rule |
|---|---|
| `int` | Must be an integer |
| `float` / `number` | Must be a number |
| `numeric` | Must be a number or numeric string |
| `bigint` | Must be a BigInt |
| `text` / `string` | Must be a string |
| `char(n)` | Must be a string with exactly n characters |
| `varchar(n)` | Must be a string with at most n characters |
| `bool` / `boolean` | Must be a boolean |
| `uuid` | Must be a valid UUID string |
| `timestamp` | Must be a valid ISO 8601 timestamp string |
| `date` | Must be a Date object |
| `array` / `list` / `tuple` | Must be an array |
| `set` | Must be a Set |
| `map` | Must be a Map |
| `object` | Must be an object |
| `custom` | No validation (always passes, custom validation integration coming soon...) |
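Because validation failures surface as `TypeError` during insert and update (as described above), they can be caught and reported. A hedged sketch; the cast is only there to bypass compile-time checking for the demonstration:

```ts
try {
  await db
    .insert('users')
    .values({ name: 'Dana', age: 'not-a-number' as unknown as number, email: 'dana@example.com' })
    .run();
} catch (error) {
  if (error instanceof TypeError) {
    console.error('Validation failed:', error.message); // e.g. "Invalid value for field 'age' in table 'users': ..."
  } else {
    throw error;
  }
}
```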
Locality IDB provides comprehensive TypeScript support:
import type {
InferSelectType,
InferInsertType,
InferUpdateType,
$InferRow,
} from 'locality-idb';
// InferSelectType: Full row type with all fields
type User = InferSelectType<typeof schema.users>;
// InferInsertType: Insert type with auto-generated fields optional
type InsertUser = InferInsertType<typeof schema.users>;
// InferUpdateType: Partial type for updates (excluding primary key)
type UpdateUser = InferUpdateType<typeof schema.users>;
// $InferRow: Direct inference from column definitions
type UserRow = $InferRow<typeof schema.users.columns>;

Locality IDB uses a branded type for UUID v4 for better type safety:
import type { UUID } from 'locality-idb';
// UUID types are branded with their version
type UserId = UUID<'v4'>; // Branded UUID v4

import type {
GenericObject,
Prettify,
NestedPrimitiveKey,
SelectFields,
} from 'locality-idb';
// GenericObject: Record<string, any>
type MyObj = GenericObject;
// Prettify: Flattens complex types for better readability
type Pretty = Prettify<ComplexType>;
// NestedPrimitiveKey: Extracts nested primitive keys with dot notation
type Keys = NestedPrimitiveKey<{ user: { profile: { age: number } } }>;
// "user.profile.age"
// SelectFields: Projects selected fields from a type
type Selected = SelectFields<User, { name: true; email: true }>;
// { name: string; email: string }

import type {
TransactionContext,
TransactionCallback,
ExportOptions,
ExportData,
ImportOptions,
PageOptions,
PageResult,
} from 'locality-idb';
// TransactionContext: Context object provided to transaction callback
type TxContext = TransactionContext<Schema, TableName, ['users', 'posts']>;
// TransactionCallback: Function signature for transaction operations
type TxCallback = TransactionCallback<Schema, TableName, ['users']>;
// ExportOptions: Configuration options for database export
type ExportOpts = ExportOptions<'users' | 'posts'>;
/*
{
tables?: ('users' | 'posts')[];
filename?: string;
pretty?: boolean;
includeMetadata?: boolean;
}
*/
// ExportData: Structure of exported database data
type Exported = ExportData;
/*
{
metadata?: {
dbName: string;
version: number;
exportedAt: Timestamp;
tables: string[];
};
data: Record<string, GenericObject[]>;
}
*/
// ImportOptions: Configuration options for database import
type ImportOpts = ImportOptions<'users' | 'posts'>;
/*
{
tables?: ('users' | 'posts')[];
mode?: 'replace' | 'merge' | 'upsert';
}
*/
// PageOptions: Cursor pagination options
type PagedOpts = PageOptions;
/*
{
cursor?: IDBValidKey;
limit?: number;
}
*/
// PageResult: Cursor pagination result
type PagedResult = PageResult<User, null>;
/*
{
items: User[];
nextCursor: IDBValidKey | undefined;
}
*/- Schema changes aren’t automatic: any schema change should bump the database
versionso the upgrade path runs. - Index queries require indexes:
where('field', value)only works for primary keys or fields defined with.index()or.unique(). - Predicate filters are in-memory:
where((row) => ...)filters client-side after fetching rows, so prefer indexes for large datasets. - IndexedDB is browser-only: calls will fail in SSR/Node environments without a shim.
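A short sketch covering the first and last points above (the `nickname` column and the version bump are hypothetical; the `typeof indexedDB` guard is one common way to avoid SSR crashes):

```ts
import { Locality, defineSchema, column } from 'locality-idb';

// Schema after a change — e.g. a newly added optional 'nickname' column.
const updatedSchema = defineSchema({
  users: {
    id: column.int().pk().auto(),
    name: column.text(),
    nickname: column.text().optional(), // new column
  },
});

// Only touch IndexedDB in the browser; skip during SSR/Node.
if (typeof indexedDB !== 'undefined') {
  const db = new Locality({
    dbName: 'my-app-db',
    version: 2, // bumped from 1 so the upgrade path runs
    schema: updatedSchema,
  });
  await db.ready();
}
```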
- GitHub: nazmul-nhb/locality-idb
- npm: locality-idb
- Author: Nazmul Hassan
Made with ❤️ by Nazmul Hassan
If you find this package useful, please consider giving it a ⭐ on GitHub!