34 changes: 34 additions & 0 deletions .github/workflows/typescript.yaml
@@ -0,0 +1,34 @@
name: typescript

on:
  push:
  pull_request:

jobs:
  gcache-ts:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
      - uses: pnpm/action-setup@v4
        with:
          version: 10.33.0
      - uses: actions/setup-node@v6
        with:
          node-version: "24"
          cache: "pnpm"
      - run: pnpm install --frozen-lockfile
      - run: pnpm ts:gcache:typecheck
      - run: pnpm ts:gcache:test
      # Dependabot-triggered runs cannot access repository secrets, so this
      # upload would fail with an empty Codecov token.
      - name: Upload TypeScript Coverage Reports
        if: ${{ github.actor != 'dependabot[bot]' }}
        uses: codecov/codecov-action@v5.5.2
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          files: packages/gcache-ts/coverage/lcov.info
          flags: gcache-ts
          name: gcache-ts
          fail_ci_if_error: true
          verbose: true
      - run: pnpm ts:gcache:build
5 changes: 5 additions & 0 deletions .gitignore
@@ -158,3 +158,8 @@ cython_debug/
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/

# Node / TypeScript
node_modules/
packages/*/dist/
packages/*/coverage/
13 changes: 13 additions & 0 deletions package.json
@@ -0,0 +1,13 @@
{
  "name": "gcache-monorepo",
  "private": true,
  "packageManager": "pnpm@10.33.0",
  "workspaces": [
    "packages/*"
  ],
  "scripts": {
    "ts:gcache:build": "pnpm --filter @rungalileo/gcache build",
    "ts:gcache:test": "pnpm --filter @rungalileo/gcache test",
    "ts:gcache:typecheck": "pnpm --filter @rungalileo/gcache typecheck"
  }
}
247 changes: 247 additions & 0 deletions packages/gcache-ts/README.md
@@ -0,0 +1,247 @@
# @rungalileo/gcache

TypeScript port of GCache. Milestone 5 ships explicit enabled contexts, stable key construction, local/Redis TTL caching, runtime config providers, gradual rollout ramp controls, Prometheus-ready observability, and Redis watermark-based targeted invalidation with fail-open behavior.

## Install

```bash
pnpm add @rungalileo/gcache
```

## Quick start

```ts
import { GCache, GCacheKeyConfig } from "@rungalileo/gcache";

const gcache = new GCache();

const getUser = gcache.cached({
  keyType: "user_id",
  useCase: "GetUser",
  id: ([userId]: [string]) => userId,
  defaultConfig: GCacheKeyConfig.enabled(60),
})(async (userId: string) => {
  return db.fetchUser(userId);
});

// Caching is disabled by default.
await getUser("123");

// Enable caching for one async scope.
const user = await gcache.enable(async () => {
  return await getUser("123");
});
```

## Redis-backed TTL cache

Pass a Redis client that implements a small command surface, or a lazy client factory, to enable the read-through chain:

```ts
import { GCache, GCacheKeyConfig } from "@rungalileo/gcache";

const gcache = new GCache({
  redis: {
    client: redisClient, // implements get, del, flushAll/flushall, and setEx/setex/set({ EX })
    keyPrefix: "gcache:",
  },
});
```

When caching is enabled, reads flow through:

```text
local cache -> Redis cache -> fallback function
```

- Local hits return immediately.
- Local misses try Redis and populate local on a Redis hit.
- Redis misses call the fallback and write both Redis and local.
- Redis read/write/delete/flush failures are logged, counted in metrics, and fail open; fallback results still return when fallback succeeds.
- Missing per-layer config disables that layer, records a disabled reason, and falls through to the next layer/fallback.
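
The bullets above can be condensed into a plain-TypeScript sketch of the read-through chain. The shapes here (plain `Map`s standing in for the local and Redis layers, and the `readThrough` helper itself) are illustrative, not the library's API:

```ts
type Fetch<T> = () => Promise<T>;

// Minimal read-through sketch: local -> Redis -> fallback.
async function readThrough<T>(
  key: string,
  local: Map<string, T>,
  redis: Map<string, T>, // stand-in for a Redis client
  fallback: Fetch<T>,
): Promise<T> {
  const localHit = local.get(key);
  if (localHit !== undefined) return localHit; // local hit returns immediately

  const redisHit = redis.get(key);
  if (redisHit !== undefined) {
    local.set(key, redisHit); // populate local on a Redis hit
    return redisHit;
  }

  const value = await fallback(); // Redis miss: call the fallback function
  redis.set(key, value); // write both layers on the way out
  local.set(key, value);
  return value;
}
```

The real implementation additionally wraps each layer access in try/catch so Redis failures fail open, as described above.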

You can also provide `createClient` for lazy client construction:

```ts
const gcache = new GCache({
  redis: {
    createClient: async () => createRedisClient({ url: process.env.REDIS_URL }),
  },
});
```

Redis payloads use a TypeScript-specific JSON envelope, not the Python pickle format:

```ts
type RedisValueEnvelope = {
  version: 1;
  createdAtMs: number;
  expiresAtMs: number;
  encoding: "utf8" | "base64";
  payload: string;
};
```

`payload` is produced by the cached function's serializer, or by `JsonSerializer` by default. Custom serializers can return either `string` or `Buffer`; Buffer payloads are base64 encoded in the envelope.
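
As a sketch of how the envelope could be populated — the type matches the definition above, but `wrapPayload` itself is a hypothetical helper, not an exported function:

```ts
// Envelope type repeated from above so this sketch is self-contained.
type RedisValueEnvelope = {
  version: 1;
  createdAtMs: number;
  expiresAtMs: number;
  encoding: "utf8" | "base64";
  payload: string;
};

// Wrap a serializer result: Buffer payloads are base64 encoded,
// string payloads stay utf8.
function wrapPayload(raw: string | Buffer, ttlSec: number): RedisValueEnvelope {
  const now = Date.now();
  const isBuffer = Buffer.isBuffer(raw);
  return {
    version: 1,
    createdAtMs: now,
    expiresAtMs: now + ttlSec * 1000,
    encoding: isBuffer ? "base64" : "utf8",
    payload: isBuffer ? raw.toString("base64") : raw,
  };
}
```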

## Targeted invalidation and watermarks

Mutable Redis-backed use cases can opt into targeted invalidation by setting `trackForInvalidation: true` on the cached function and calling `invalidate(keyType, id)` after writes:

```ts
import { CacheLayer, GCache, GCacheKeyConfig } from "@rungalileo/gcache";

const gcache = new GCache({ redis: { client: redisClient } });

const getUser = gcache.cached({
  keyType: "user_id",
  useCase: "GetMutableUser",
  id: ([userId]: [string]) => userId,
  trackForInvalidation: true,
  // Strongly invalidated mutable data should usually disable local cache.
  defaultConfig: new GCacheKeyConfig({
    ttlSec: { [CacheLayer.REMOTE]: 300 },
    ramp: { [CacheLayer.REMOTE]: 100 },
  }),
})(async (userId: string) => db.fetchUser(userId));

await updateUser("123", patch);
await gcache.invalidate("user_id", "123");
```

Invalidation writes a Redis watermark at `{encodedUrnPrefix:encodedKeyType:encodedId}#watermark`. Tracked Redis cache entries use the same Redis Cluster hash tag, for example `{urn:user_id:123}?locale=en#GetMutableUser`, so the value key and watermark key live in the same slot. Key components are percent-encoded before joining so delimiters inside IDs or args cannot collide with delimiters in the key format. Components may not contain `{` or `}` because those characters would corrupt the hash tag.
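
A sketch of that key construction, using `encodeURIComponent` as a stand-in for the library's component encoder and treating the `?args` segment as a preformatted query string — both assumptions, but the output matches the documented key shapes:

```ts
// Percent-encode each component before joining, so delimiters inside
// IDs cannot collide with the key format's own delimiters.
function hashTag(keyType: string, id: string): string {
  const enc = encodeURIComponent;
  return `{${enc("urn")}:${enc(keyType)}:${enc(id)}}`;
}

// Watermark key shares the hash tag with its value keys, so both
// land in the same Redis Cluster slot.
function watermarkKey(keyType: string, id: string): string {
  return `${hashTag(keyType, id)}#watermark`;
}

function valueKey(keyType: string, id: string, args: string, useCase: string): string {
  return `${hashTag(keyType, id)}?${args}#${useCase}`;
}
```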

A cached Redis value whose `createdAtMs` is older than or equal to the watermark is treated as stale and refreshed through fallback. `invalidate(keyType, id, { futureBufferMs })` can extend the watermark into the future during write races; while the watermark is still in the future, fallback results are returned but not written to Redis or local cache.
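
The staleness and write-suppression rules can be stated as two small predicates — a sketch of the documented semantics, not library code:

```ts
// A value created at or before the watermark is stale and must be
// refreshed through fallback.
function isStale(createdAtMs: number, watermarkMs: number | null): boolean {
  return watermarkMs !== null && createdAtMs <= watermarkMs;
}

// While the watermark is still in the future (futureBufferMs active),
// fallback results are served but not written back to any cache layer.
function mayWriteCache(nowMs: number, watermarkMs: number | null): boolean {
  return watermarkMs === null || watermarkMs <= nowMs;
}
```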

Watermarks use `DEFAULT_WATERMARK_TTL_SEC` (4 hours) by default. You can override it with `redis.watermarkTtlSec`, but it must exceed the maximum Redis cache TTL for invalidation-tracked data; otherwise a watermark can expire before old cached values do.

Local cache limitation: targeted invalidation is enforced by Redis watermarks. Existing local cache hits are not synchronously invalidated across processes, so strongly invalidated mutable data should disable the local layer (or use very short local TTLs only when stale reads are acceptable).

## Runtime config and ramp controls

Every cached function can provide a decorator-local `defaultConfig`; a `cacheConfigProvider` can override it at runtime. If the provider returns `null`, GCache falls back to the cached function's `defaultConfig`. If neither exists, or a layer's TTL/ramp is missing or disabled, only that layer is skipped.

```ts
import { CacheLayer, GCache, GCacheKeyConfig } from "@rungalileo/gcache";

const gcache = new GCache({
  cacheConfigProvider: async (key) => {
    if (key.useCase === "GetUser") {
      return new GCacheKeyConfig({
        ttlSec: { [CacheLayer.LOCAL]: 30, [CacheLayer.REMOTE]: 300 },
        ramp: { [CacheLayer.LOCAL]: 100, [CacheLayer.REMOTE]: 25 },
      });
    }
    return null; // use the cached function's defaultConfig
  },
  rampSampler: ({ key, layer }) => deterministicPercentFor(`${key.urn}:${layer}`),
});
```

`ramp` values are percentages from 0 to 100. `0` disables the layer, `100` enables it, and intermediate values use `rampSampler`; the default sampler is random. Provider errors fail open and execute the fallback function.
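
A sketch of the per-layer ramp decision this describes (`layerEnabled` and the sampler signature are illustrative, not the library's API):

```ts
// Sampler returns a percentage in [0, 100).
type Sampler = () => number;

// 0 disables the layer, 100 enables it, and intermediate values
// enable it only when the sample falls below the ramp percentage.
function layerEnabled(
  rampPercent: number,
  sample: Sampler = () => Math.random() * 100, // default sampler is random
): boolean {
  if (rampPercent <= 0) return false;
  if (rampPercent >= 100) return true;
  return sample() < rampPercent;
}
```

Injecting a deterministic sampler, as in the `rampSampler` option above, makes gradual rollouts reproducible in tests.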

## Enabled context

The TypeScript port uses Node `AsyncLocalStorage` to mirror Python's `with gcache.enable():` safety model.

```ts
await gcache.enable(async () => {
  await getUser("123"); // cached

  await gcache.disable(async () => {
    await updateUser("123", patch); // uncached reads here
  });

  await getUser("123"); // cached again
});
```

- Default is disabled.
- Enabled state is async-scope-local, not process-global.
- Nested `enable` / `disable` scopes restore the previous behavior when the callback completes.
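
The same scoping behavior can be reproduced with `AsyncLocalStorage` directly; this is a sketch of the mechanism, not the library's internal implementation:

```ts
import { AsyncLocalStorage } from "node:async_hooks";

const enabledStore = new AsyncLocalStorage<boolean>();

// Default is disabled when no scope has been entered.
const isEnabled = (): boolean => enabledStore.getStore() ?? false;

// run() sets the store for the callback's async scope only and
// restores the outer value automatically when the callback completes.
const enable = <T>(fn: () => Promise<T>): Promise<T> => enabledStore.run(true, fn);
const disable = <T>(fn: () => Promise<T>): Promise<T> => enabledStore.run(false, fn);
```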

## Explicit key builders

TypeScript does not have safe Python-style function argument introspection after transpilation/bundling. Use explicit key builders instead:

```ts
const searchPosts = gcache.cached({
  keyType: "user_id",
  useCase: "SearchPosts",
  id: ([userId]: [string, number, string]) => userId,
  args: ([, page, filter]) => ({ page, filter }),
  defaultConfig: GCacheKeyConfig.enabled(60),
})(async (userId: string, page: number, filter: string) => {
  return db.searchPosts(userId, page, filter);
});
```

## Metrics

GCache registers Prometheus metrics by default via `prom-client`. Metric names intentionally follow the Python package where practical:

| Metric | Type | Labels | Description |
| --- | --- | --- | --- |
| `gcache_request_counter` | Counter | `use_case`, `key_type`, `layer` | Cache-layer requests that reached an enabled layer |
| `gcache_miss_counter` | Counter | `use_case`, `key_type`, `layer` | Cache misses |
| `gcache_disabled_counter` | Counter | `use_case`, `key_type`, `layer`, `reason` | Cache skips (`context`, `missing_config`, `invalid_ttl`, `ramped_down`, `config_error`) |
| `gcache_error_counter` | Counter | `use_case`, `key_type`, `layer`, `error`, `in_fallback` | Cache/fallback errors, with `in_fallback` separating cache plumbing failures from application fallback failures |
| `gcache_invalidation_counter` | Counter | `key_type`, `layer` | Delete/invalidation calls, labeled by the layer touched |
| `gcache_get_timer` | Histogram | `use_case`, `key_type`, `layer` | Cache get latency in seconds |
| `gcache_fallback_timer` | Histogram | `use_case`, `key_type`, `layer` | Time spent in the underlying function |
| `gcache_serialization_timer` | Histogram | `use_case`, `key_type`, `layer`, `operation` | Redis serializer dump/load latency |
| `gcache_size_histogram` | Histogram | `use_case`, `key_type`, `layer` | Serialized Redis payload size in bytes |

Use a custom registry or prefix when embedding GCache in an app with its own metrics endpoint:

```ts
import { Registry } from "prom-client";
import { GCache } from "@rungalileo/gcache";

const registry = new Registry();
const gcache = new GCache({
  metricsRegistry: registry,
  metricsPrefix: "myapp_", // myapp_gcache_request_counter, etc.
});

app.get("/metrics", async (_req, res) => {
  res.type(registry.contentType).send(await registry.metrics());
});
```

For non-Prometheus telemetry, inject a `GCacheMetricsAdapter` through `new GCache({ metrics })`. Pass `metrics: false` to disable metrics entirely. GCache reuses existing collectors in a registry so repeated instances with the same prefix do not throw duplicate-registration errors.

## Milestone 5 scope

Included:

- Local TTL cache
- Redis TTL cache
- Local → Redis → fallback read-through chain
- Lazy Redis client factory support
- Timestamped, versioned Redis envelope
- JSON and custom serializer support for Redis values
- Duplicate and reserved use-case validation
- `delete` and `flushAll` across configured layers
- Fail-open behavior for key/config/cache errors
- Runtime config provider with fallback to cached-function `defaultConfig`
- Per-layer TTL and ramp controls
- Injectable ramp sampler for deterministic rollout tests
- Missing config disables only the relevant layer and falls through
- Prometheus metrics with duplicate-registration safety
- Custom metrics adapter/registry/prefix hooks
- Cache-vs-fallback error classification through the `in_fallback` label
- Serialization latency and cached payload size metrics for Redis values
- Logger injection for cache operational failures
- `trackForInvalidation` on cached functions
- `invalidate(keyType, id, { futureBufferMs })` Redis watermark API
- Redis Cluster hash-tagged value/watermark keys for invalidation-tracked entries
- Configurable Redis watermark TTL via `redis.watermarkTtlSec` with `DEFAULT_WATERMARK_TTL_SEC`
- Future-buffer behavior that avoids cache writes during active invalidation windows

Not included yet:

- Framework middleware helpers/integrations
- `cachedObject`
- Expanded examples
- Release hardening
40 changes: 40 additions & 0 deletions packages/gcache-ts/package.json
@@ -0,0 +1,40 @@
{
  "name": "@rungalileo/gcache",
  "version": "0.1.0",
  "description": "TypeScript port of GCache with explicit-context local and Redis TTL caching.",
  "license": "MIT",
  "type": "module",
  "main": "./dist/index.cjs",
  "module": "./dist/index.js",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js",
      "require": "./dist/index.cjs"
    }
  },
  "files": [
    "dist",
    "README.md"
  ],
  "scripts": {
    "build": "tsup src/index.ts --format esm,cjs --dts --clean",
    "typecheck": "tsc --noEmit",
    "test": "vitest run --coverage",
    "test:watch": "vitest"
  },
  "devDependencies": {
    "@types/node": "^24.10.1",
    "@vitest/coverage-v8": "^4.0.14",
    "tsup": "^8.5.1",
    "typescript": "^5.9.3",
    "vitest": "^4.0.14"
  },
  "engines": {
    "node": ">=18.17"
  },
  "dependencies": {
    "prom-client": "^15.1.3"
  }
}