Conversation
📝 Walkthrough

Adds a migration subsystem: a CI workflow to build/push a migration Docker image, a new relayer API endpoint for atomic account creation, and a migration package with providers, DB adapters, Bull queues, CLI commands, Docker Compose, and related utilities.
Sequence Diagram(s)

```mermaid
sequenceDiagram
participant Migrator as Migrator CLI
participant Firebase as Firebase (export/local)
participant Mpc as MPC Provider
participant FastAuth as FastAuth Contract
participant MQ as MpcUserQueue
participant AKQ as AddAccessKeysQueue
participant SQ as SignQueue
participant NEAR as NearProvider
participant RQ as RelayQueue
Migrator->>Firebase: iterateAll users
Firebase->>MQ: user object
MQ->>Mpc: claimOidcToken() / userCredentials()
Mpc-->>MQ: oidc token, oldPublicKey
MQ->>FastAuth: getPublicKey(provider, providerId)
FastAuth-->>MQ: newPublicKey(s)
MQ->>AKQ: enqueue(oldPublicKey, newPublicKeys, jwt)
AKQ->>NEAR: getAccountIdsByPublicKey()
NEAR-->>AKQ: accountIds
AKQ->>NEAR: getAccountNonce() / hasFullAccessKey()
NEAR-->>AKQ: nonce / boolean
AKQ->>SQ: enqueue(serializedDelegateAction, jwt)
SQ->>Mpc: sign(delegateAction, jwt)
Mpc-->>SQ: signature
SQ->>RQ: enqueue(serializedDelegateAction, signature)
RQ->>NEAR: relayMetaTransaction(receiverId, delegateAction, signature)
NEAR-->>RQ: tx hash / outcome
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Suggested labels
Suggested Reviewers
Pre-merge checks and finishing touches

❌ Failed checks (2 warnings)
✅ Passed checks (1 passed)
Actionable comments posted: 22
🧹 Nitpick comments (27)
scripts/migration/src/cmd/debug/ExportDatastore.ts (2)
1-6: Avoid hard-coding Datastore projectId to make the script reusable

Right now the projectId is fixed to `fast-auth-000`, which makes this script easy to mis-use against the wrong project and harder to run in different environments (dev/stage/prod). Consider reading the projectId from an environment variable or CLI flag (e.g. `GOOGLE_CLOUD_PROJECT` or `DATASTORE_PROJECT_ID`), with a sane default for local debugging.
31-37: Make the debug entrypoint safer and more flexible

The top-level IIFE always runs and exports a hard-coded set of kinds in JSON. This works for ad-hoc use, but:
- If this module is ever imported elsewhere, it will immediately start exporting data.
- Kinds and format have to be edited in code instead of passed via CLI/environment.
Consider:
- Wrapping the IIFE in an `if (require.main === module)` guard (or ESM equivalent) so it only runs when executed directly.
- Reading `kinds` and `format` from process args or env vars, keeping `"EncryptedUserCredentials-dev"` as a default example in comments. A sketch of this is shown right after this list.
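A minimal sketch of that shape, assuming the script uses `@google-cloud/datastore` directly and that the export logic can be wrapped in a helper (the `exportKinds` name, env-var names, and JSON-only output are illustrative, not the actual code in the PR):

```typescript
import { Datastore } from "@google-cloud/datastore";

// Hypothetical helper wrapping the existing export logic.
export async function exportKinds(projectId: string, kinds: string[]): Promise<void> {
    const datastore = new Datastore({ projectId });
    for (const kind of kinds) {
        const [entities] = await datastore.createQuery(kind).run();
        console.log(JSON.stringify({ kind, entities }, null, 2));
    }
}

// Only run when executed directly, never on import.
if (require.main === module) {
    const projectId = process.env.DATASTORE_PROJECT_ID ?? "fast-auth-000"; // default for local debugging
    // e.g. EXPORT_KINDS="EncryptedUserCredentials-dev"
    const kinds = (process.env.EXPORT_KINDS ?? "EncryptedUserCredentials-dev").split(",");
    exportKinds(projectId, kinds).catch((e) => {
        console.error(e);
        process.exit(1);
    });
}
```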
scripts/migration/src/utils/timeout-promise.ts (1)

13-24: Consider explicit typing for the timeout variable.

The `timeout` variable is implicitly typed and may cause TypeScript to infer `number | undefined`. While the code works correctly, explicit typing improves clarity. Apply this diff for explicit typing:
```diff
 export async function timeoutPromise<T>(promise: Promise<T>, ms: number): Promise<T> {
-    let timeout;
+    let timeout: NodeJS.Timeout;
     const rejectedPromise = new Promise<T>((_, reject) => {
         timeout = setTimeout(() => {
             reject(new TimeoutPromiseError());
         }, ms);
     });
```

scripts/migration/src/utils/bytesJsonStringify.ts (1)
6-6: Replace `any` type with `unknown` for better type safety.

Using `any` bypasses TypeScript's type checking. Consider using `unknown` to maintain type safety while accepting any input. Apply this diff:
```diff
-export function bytesJsonStringify(input: any): Buffer {
+export function bytesJsonStringify(input: unknown): Buffer {
     return Buffer.from(JSON.stringify(input));
 }
```

scripts/migration/src/cmd/Migrate.ts (1)
6-8: Consider batch enqueueing for better performance.

Sequential awaiting of each `add()` operation may be slow for large user counts. Consider using `Promise.all()` or `Promise.allSettled()` to enqueue users in batches. Example batched approach:
```typescript
const users = mpcDatabase.findAll();
const BATCH_SIZE = 100;
for (let i = 0; i < users.length; i += BATCH_SIZE) {
    const batch = users.slice(i, i + BATCH_SIZE);
    await Promise.allSettled(batch.map(user => mpcUserQueue.add({ user })));
    console.log(`Enqueued batch ${Math.floor(i / BATCH_SIZE) + 1}`);
}
```

docker/migration.Dockerfile (1)
13-17: Clarify the build stage purpose.

This Dockerfile only runs linting and sets a WORKDIR but doesn't build or compile the migration scripts. If the migration scripts need to be built (e.g., TypeScript compilation), add a build step. If they run directly via ts-node or similar, document this in a comment.
```diff
 # Lint migration script
 RUN --mount=type=secret,id=turbo_token,env=TURBO_TOKEN \
     npx turbo run lint --filter=@fast-auth/migration
+
+# Build migration script (if needed)
+# RUN npx turbo run build --filter=@fast-auth/migration

 WORKDIR /project/scripts/migration
```

scripts/migration/config-example.json (1)
1-19: Document that example keys must be replaced.

The example configuration contains placeholder private keys. Consider adding a comment at the top of the file or in a README to clarify that these values are examples and must be replaced with actual credentials before use.
```diff
 {
+    "_comment": "Example configuration - replace all keys and URLs with actual values before use",
     "oldMpc": {
         "url": "http://localhost:3000",
```

Alternatively, create a separate `scripts/migration/README.md` with setup instructions.

scripts/migration/src/cmd/Dashboard.ts (1)
12-36: Add error handling and make port configurable.

The dashboard lacks error handling for queue initialization and server startup failures. Additionally, the hardcoded port (3009) should be configurable via environment variables.
(async () => { - const migrateConfig = initMigrationConfig(); - - const mpcQueue = new Queue(MPC_USER_QUEUE_NAME, migrateConfig.queue); - const addAccessKeyQueue = new Queue(ADD_ACCESS_KEY_QUEUE_NAME, migrateConfig.queue); - const signQueue = new Queue(SIGN_QUEUE_NAME, migrateConfig.queue); - const relayQueue = new Queue(RELAY_QUEUE_NAME, migrateConfig.queue); + try { + const migrateConfig = initMigrationConfig(); + const port = parseInt(process.env.DASHBOARD_PORT || '3009', 10); + + const mpcQueue = new Queue(MPC_USER_QUEUE_NAME, migrateConfig.queue); + const addAccessKeyQueue = new Queue(ADD_ACCESS_KEY_QUEUE_NAME, migrateConfig.queue); + const signQueue = new Queue(SIGN_QUEUE_NAME, migrateConfig.queue); + const relayQueue = new Queue(RELAY_QUEUE_NAME, migrateConfig.queue); - const serverAdapter = new ExpressAdapter(); - serverAdapter.setBasePath("/admin"); + const serverAdapter = new ExpressAdapter(); + serverAdapter.setBasePath("/admin"); - createBullBoard({ - queues: [new BullAdapter(mpcQueue), new BullAdapter(addAccessKeyQueue), new BullAdapter(signQueue), new BullAdapter(relayQueue)], - serverAdapter: serverAdapter, - }); + createBullBoard({ + queues: [new BullAdapter(mpcQueue), new BullAdapter(addAccessKeyQueue), new BullAdapter(signQueue), new BullAdapter(relayQueue)], + serverAdapter: serverAdapter, + }); - const app = express(); + const app = express(); - app.use("/admin", serverAdapter.getRouter()); + app.use("/admin", serverAdapter.getRouter()); - app.listen(3009, () => { - console.log("Running on 3009..."); - console.log("For the UI, open http://localhost:3000/admin"); - console.log("Make sure Redis is running on port 6379 by default"); - }); + app.listen(port, () => { + console.log(`Running on ${port}...`); + console.log(`For the UI, open http://localhost:${port}/admin`); + console.log("Make sure Redis is running on port 6379 by default"); + }); + } catch (error) { + console.error('Failed to start dashboard:', error); + process.exit(1); + } })();apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts (1)
2-5: Consider using camelCase for consistency.

The properties use snake_case naming (`account_id`, `oauth_token`, `signed_delegate_action`), which is inconsistent with TypeScript/JavaScript conventions. If this matches the external API contract, consider using `@Expose()` decorators to map between camelCase properties and snake_case JSON keys.

```typescript
import { Expose } from 'class-transformer';

export class CreateAccountAtomicRequest {
    @Expose({ name: 'account_id' })
    accountId: string;

    allowance: number;

    @Expose({ name: 'oauth_token' })
    oauthToken: string;

    @Expose({ name: 'signed_delegate_action' })
    signedDelegateAction: SignedDelegateAction;
}
```

scripts/migration/docker-compose.yml (2)
3-58: Consider adding restart policies and health checks.

The services lack restart policies and health checks, which can impact reliability. Consider adding `restart: unless-stopped` and appropriate health checks for each service to ensure they recover from failures and are monitored properly. Example for the migrate service:

```yaml
migrate:
    image: migration
    container_name: migration
    command: "pnpm run migrate"
    volumes:
        - ./config.json:/project/scripts/migration/config.json
    depends_on:
        - redis
    restart: unless-stopped
    healthcheck:
        test: ["CMD", "pnpm", "run", "health"]
        interval: 30s
        timeout: 10s
        retries: 3
```
4-8: Consider adding Redis persistence configuration.

Redis is configured without persistence settings. If the container restarts, all queue data will be lost. Consider adding a volume mount for Redis data persistence.

```diff
 redis:
     image: redis:7.2
     container_name: redis
     ports:
         - "6379:6379"
+    volumes:
+        - redis-data:/data
+    command: redis-server --appendonly yes
+
+volumes:
+    redis-data:
```

scripts/migration/src/utils/firebase.ts (1)
1-11: Complete the JSDoc comments.

The JSDoc comments lack parameter and return value descriptions, reducing code maintainability.
Apply this diff:
```diff
 /**
- *
+ * Extracts the Firebase ID from an internal account ID URL.
  * @param internalAccountId
+ * @param internalAccountId The internal account ID in URL format (e.g., "https://....:firebaseId")
+ * @returns The extracted Firebase ID
+ * @throws Error if the Firebase ID cannot be parsed from the input
  */
```

scripts/migration/src/database/MpcDatabase.ts (2)
11-11: Consider making the database property private or readonly.

The `database` property is public and mutable, allowing external code to modify the data structure directly, which could lead to unintended side effects.

```diff
-    database: MpcUser[];
+    private database: MpcUser[];
```
21-23: Return a defensive copy to prevent external mutations.

The `findAll()` method returns a direct reference to the internal array, allowing callers to mutate the database. Consider returning a copy or marking the return type as readonly.

```diff
 findAll(): MpcUser[] {
-    return this.database;
+    return [...this.database];
 }
```

Or update the return type to indicate immutability:

```diff
-findAll(): MpcUser[] {
+findAll(): readonly MpcUser[] {
     return this.database;
 }
```

.github/workflows/build-migration.yaml (1)
40-43: Consider using full commit SHA for image tags.

The image tag uses only the first 8 characters of the commit SHA, which could theoretically lead to collisions in very large repositories or long-running projects.
Consider using the full SHA or combining with other identifiers:
```diff
   - name: Sets env vars
     run: |
-      echo "DOCKER_IMAGE_NAME=${{ secrets.DOCKER_USERNAME }}/${{ github.event.repository.name }}-migration:${GITHUB_SHA:0:8}" >> $GITHUB_ENV
+      echo "DOCKER_IMAGE_NAME=${{ secrets.DOCKER_USERNAME }}/${{ github.event.repository.name }}-migration:${{ github.sha }}" >> $GITHUB_ENV
```

scripts/migration/src/cmd/bootstrapMigration.ts (1)
44-50: Consider returning firebaseDatabase for consistency.

The function creates `firebaseDatabase` but doesn't return it, which may be needed by other components for debugging or monitoring purposes. Consider adding it to the return object.

```diff
 return {
     mpcDatabase,
+    firebaseDatabase,
     relayQueue,
     signQueue,
     addAccessKeysQueue,
     mpcUserQueue,
 };
```

apps/relayer/src/modules/relayer/relayer.controller.ts (1)
39-39: Inconsistent endpoint naming convention.

The new endpoint uses underscore notation (`/create_account_atomic`) while other endpoints in the codebase and REST conventions typically use kebab-case or camelCase. Consider using `/create-account-atomic` for consistency.

```diff
-@Post("/create_account_atomic")
+@Post("/create-account-atomic")
```

apps/relayer/openapi-spec.json (1)
1-1: Consider adding security definitions to the OpenAPI spec.

The OpenAPI spec lacks global security field definitions and per-operation security requirements, as indicated by static analysis. This makes it unclear which endpoints require authentication and what authentication mechanisms are supported.
While the spec includes `securitySchemes` (bearer and X-API-KEY), consider adding security requirements to operations that need authentication:

```json
{
    "paths": {
        "/api/relayer/fast-auth/create_account_atomic": {
            "post": {
                "operationId": "createAccountAtomic",
                "security": [{"bearer": []}, {"X-API-KEY": []}],
                ...
            }
        }
    }
}
```

Based on learnings
scripts/migration/src/config/migrationConfig.ts (1)
34-39: Improve error handling for config loading.

The generic catch block silently swallows all errors, which could hide issues like incorrect file paths, permission problems, or invalid JSON syntax.
Consider logging more details:
```diff
 } catch (e) {
-    console.warn("Could not read config, using default");
+    console.warn(`Could not read config.json, using defaults. Error: ${e instanceof Error ? e.message : String(e)}`);
 }
```

scripts/migration/src/queue/AddAccessKeysQueue.ts (1)
40-44: Improve error detection for missing access keys.

Using string matching on error messages is fragile and may break if the RPC provider changes error messages.
Consider checking for a more specific error type or code if available:
```diff
 } catch (e) {
-    if (e?.toString().includes("Can't complete the action because access key")) {
+    // Check for specific NEAR error type if available, e.g., e.type === 'AccessKeyDoesNotExist'
+    if (e?.type === 'AccessKeyDoesNotExist' || e?.toString().includes("Can't complete the action because access key")) {
         this.log(`skipping account ${account}: access key ${job.data.oldPublicKey} not found`, "warn");
         continue;
     } else throw e;
```

scripts/migration/src/provider/FastAuthProvider.ts (1)
18-27: Improve argument validation logic.

The Uint8Array detection at line 19 is not robust and could produce false positives. The check `args.byteLength !== undefined && args.byteLength === args.length` doesn't reliably distinguish Uint8Arrays from other objects. Use a more reliable check:
```diff
 private validateArgs(args: any) {
-    const isUint8Array = args.byteLength !== undefined && args.byteLength === args.length;
-    if (isUint8Array) {
+    if (args instanceof Uint8Array || ArrayBuffer.isView(args)) {
         return;
     }
     if (Array.isArray(args) || typeof args !== "object") {
         throw new Error("invalid arguments");
     }
 }
```

scripts/migration/src/database/FirebaseDatabase.ts (2)
24-30: Constructor does synchronous I/O and trusts unvalidated JSON

For a CLI migration script this is probably fine, but `fs.readFileSync` + bare `JSON.parse` in the constructor means instantiation can throw low-level `ENOENT`/syntax errors without context, and `this.export` may be structurally wrong without any check. Consider:

- Wrapping the read/parse in a `try/catch` that includes `exportFile` in the error message.
- Optionally validating the parsed value against `FirebaseExport` (e.g. with a small runtime schema) before assigning, so bad exports fail fast with a clear message. A sketch of this is shown after this list.
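A rough sketch of that constructor, assuming a minimal `FirebaseExport` shape; the real type, fields, and class name in `FirebaseDatabase.ts` may differ:

```typescript
import fs from "fs";

// Assumed minimal shape of the Firebase auth export; adjust to the real FirebaseExport type.
interface FirebaseExport {
    users: Array<{ localId: string }>;
}

function isFirebaseExport(value: unknown): value is FirebaseExport {
    return typeof value === "object" && value !== null && Array.isArray((value as FirebaseExport).users);
}

export class SafeFirebaseDatabase {
    private readonly data: FirebaseExport;

    constructor(exportFile: string) {
        let parsed: unknown;
        try {
            parsed = JSON.parse(fs.readFileSync(exportFile, "utf-8"));
        } catch (e) {
            throw new Error(`could not read firebase export "${exportFile}": ${e instanceof Error ? e.message : String(e)}`);
        }
        if (!isFirebaseExport(parsed)) {
            throw new Error(`"${exportFile}" does not look like a Firebase auth export (missing "users" array)`);
        }
        this.data = parsed;
    }
}
```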
36-40: Include the requested id in the not‑found error
`findById` throwing `"firebase user not found"` is fine, but including the `id` makes large migrations much easier to debug:

```diff
-        if (!user) throw new Error("firebase user not found");
+        if (!user) throw new Error(`firebase user not found: ${id}`);
```

scripts/migration/src/provider/MPCProvider.ts (2)
7-52: Tighten request/response types instead of using `unknown`

Most of these fields are always strings in your code (`oidc_token`, `frp_signature`, `frp_public_key`, `user_credentials_frp_signature`, `oidc_token_hash`, etc.), but the types are `unknown`. That loses type safety with no real benefit. Consider:

- Using `string` for all JWTs, FRP signatures, public keys and hashes.
- Keeping `create_account_options` as `unknown` or a more specific shape if you know it.
- Aligning `CreateNewAccountRequest`/`CreateNewAccountResponse` with what the MPC service actually returns instead of generic `unknown`.

This will give earlier feedback if something is wired incorrectly and improves IDE help without changing runtime behavior. A sketch of the tightened shapes follows below.
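For illustration only, the tightened shapes could look roughly like this; every field not named in the comment above is an assumption about the MPC service contract and should be checked against the backend:

```typescript
// Sketch: align these with the MPC service's actual request/response contract.
export interface ClaimOidcTokenRequest {
    oidc_token_hash: string;
    frp_signature: string;
    frp_public_key: string;
}

export interface UserCredentialsRequest {
    oidc_token: string;
    frp_signature: string;
    frp_public_key: string;
}

export interface CreateNewAccountRequest {
    near_account_id: string;         // assumed field name
    create_account_options: unknown; // intentionally loose until the shape is pinned down
    oidc_token: string;
    user_credentials_frp_signature: string;
    frp_public_key: string;
}

export interface CreateNewAccountResponse {
    type: string;                    // assumed; mirror what the service actually returns
    create_account_options: unknown;
}
```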
96-105: Make `request` generic to preserve response typing

`userCredentials`, `sign`, `createNewAccount`, and `claimOidcToken` all declare precise response types, but `request` itself returns `Promise<any>`, so there's no compile-time check that the JSON matches those shapes. You could keep the same runtime behavior and tighten types by making `request` generic:

```diff
-    private async request(path: string, request: object): Promise<any> {
+    private async request<TResponse>(path: string, request: object): Promise<TResponse> {
         const res = await fetch(`${this.baseUrl}${path}`, {
             method: "POST",
             headers: {
                 "Content-Type": "application/json",
             },
             body: JSON.stringify(request),
         });
         if (res.status >= 300) {
             throw new Error(await res.text());
         }
-        return res.json();
+        return res.json() as Promise<TResponse>;
     }
```

and then:

```typescript
return this.request<UserCredentialResponse>("/user_credentials", { ... });
```

This keeps everything as-is at runtime but gives you much better static guarantees.
Also applies to: 112-120, 126-132, 204-218
scripts/migration/src/provider/NearProvider.ts (2)
78-92: Two public-key→account lookup endpoints; consider consolidating

You expose both:

- `getAccountIdsByPublicKey` → `/v1/kitwallet/publicKey/${publicKey}/accounts`
- `_getAccountIdsByPublicKey` → `/v1/public-key/${publicKey}/accounts`

If both endpoints are meant to be equivalent, consider:
- Settling on one of them and dropping the other, or
- Making one a clear fallback with a comment (e.g. “legacy kitwallet endpoint”).
Right now it's not obvious which one callers should use, and the underscore prefix suggests "private" even though it's `async` and exported as part of the class API. A possible consolidation is sketched below.
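One way to do that is to expose a single public method and demote the kitwallet route to a documented fallback. Sketch only; `fetchJson` here stands in for the provider's existing helper and the class is a placeholder, not the real `NearProvider`:

```typescript
// Sketch: one public lookup, with the legacy kitwallet endpoint kept only as a fallback.
class PublicKeyAccountLookup {
    constructor(private readonly fetchJson: (path: string) => Promise<string[]>) {}

    async getAccountIdsByPublicKey(publicKey: string): Promise<string[]> {
        try {
            return await this.fetchJson(`/v1/public-key/${publicKey}/accounts`);
        } catch {
            // Legacy kitwallet endpoint, until the primary route is confirmed equivalent.
            return await this.fetchJson(`/v1/kitwallet/publicKey/${publicKey}/accounts`);
        }
    }
}
```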
59-72: Timeout wrapper is good; consider preserving underlying error detail

The `fetchJson` wrapper with `timeoutPromise` is a solid pattern, but the catch block replaces any underlying error (network failure, 5xx with message, etc.) with a generic string. If you ever need to debug indexer issues, it may help to preserve or at least log the original error message alongside the path:

```typescript
} catch (e: unknown) {
    let errorMsg =
        e instanceof TimeoutPromiseError
            ? `could not fetch json due to timeout ${path}`
            : `could not fetch json ${path}: ${e instanceof Error ? e.message : String(e)}`;
    throw new Error(errorMsg);
}
```

Not required, but this tends to pay off quickly when something goes wrong in production.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (36)
- `.github/workflows/build-migration.yaml` (1 hunks)
- `.github/workflows/pull-request.yml` (1 hunks)
- `apps/relayer/openapi-spec.json` (1 hunks)
- `apps/relayer/src/modules/relayer/relayer.controller.ts` (2 hunks)
- `apps/relayer/src/modules/relayer/relayer.service.ts` (2 hunks)
- `apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts` (1 hunks)
- `docker/migration.Dockerfile` (1 hunks)
- `pnpm-workspace.yaml` (1 hunks)
- `scripts/migration/.eslintrc.js` (1 hunks)
- `scripts/migration/.gitignore` (1 hunks)
- `scripts/migration/README.md` (1 hunks)
- `scripts/migration/config-example.json` (1 hunks)
- `scripts/migration/docker-compose.yml` (1 hunks)
- `scripts/migration/package.json` (1 hunks)
- `scripts/migration/src/cmd/Dashboard.ts` (1 hunks)
- `scripts/migration/src/cmd/Migrate.ts` (1 hunks)
- `scripts/migration/src/cmd/Work.ts` (1 hunks)
- `scripts/migration/src/cmd/bootstrapMigration.ts` (1 hunks)
- `scripts/migration/src/cmd/debug/CreateAccount.ts` (1 hunks)
- `scripts/migration/src/cmd/debug/ExportDatastore.ts` (1 hunks)
- `scripts/migration/src/config/migrationConfig.ts` (1 hunks)
- `scripts/migration/src/database/FirebaseDatabase.ts` (1 hunks)
- `scripts/migration/src/database/MpcDatabase.ts` (1 hunks)
- `scripts/migration/src/provider/FastAuthProvider.ts` (1 hunks)
- `scripts/migration/src/provider/MPCProvider.ts` (1 hunks)
- `scripts/migration/src/provider/NearProvider.ts` (1 hunks)
- `scripts/migration/src/provider/NearSigner.ts` (1 hunks)
- `scripts/migration/src/queue/AddAccessKeysQueue.ts` (1 hunks)
- `scripts/migration/src/queue/MpcUserQueue.ts` (1 hunks)
- `scripts/migration/src/queue/Queue.ts` (1 hunks)
- `scripts/migration/src/queue/RelayQueue.ts` (1 hunks)
- `scripts/migration/src/queue/SignQueue.ts` (1 hunks)
- `scripts/migration/src/utils/bytesJsonStringify.ts` (1 hunks)
- `scripts/migration/src/utils/firebase.ts` (1 hunks)
- `scripts/migration/src/utils/timeout-promise.ts` (1 hunks)
- `scripts/migration/tsconfig.json` (1 hunks)
🧰 Additional context used
🧠 Learnings (8)
📚 Learning: 2025-09-10T13:23:58.048Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 571
File: packages/shared/api/src/index.ts:33-33
Timestamp: 2025-09-10T13:23:58.048Z
Learning: The packages/shared/api package exports source files directly (./src/index.ts) rather than compiled output because it's auto-generated by openapi-typescript-codegen and this export strategy is intentional for this shared package.
Applied to files:
scripts/migration/tsconfig.json, scripts/migration/package.json
📚 Learning: 2025-08-14T14:53:11.507Z
Learnt from: AdriaCarrera
Repo: Peersyst/near-mobile PR: 520
File: turbo.json:0-0
Timestamp: 2025-08-14T14:53:11.507Z
Learning: In turbo.json, package.json files can be legitimate outputs when build tasks modify them during execution (e.g., version updates, metadata injection). In such cases, including package.json in the outputs array ensures proper cache restoration of the modified files.
Applied to files:
scripts/migration/tsconfig.json
📚 Learning: 2025-02-27T10:24:14.276Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 58
File: packages/shared/blockchain/core/src/signers/signer.ts:26-27
Timestamp: 2025-02-27T10:24:14.276Z
Learning: The Signer interface uses `Record<string, any>` for the transaction parameter in `signTransaction` method to allow implementing classes to add their own specific types extending this base type, providing flexibility for different blockchain implementations.
Applied to files:
scripts/migration/src/provider/NearSigner.ts
📚 Learning: 2025-04-07T12:08:24.795Z
Learnt from: obnol
Repo: Peersyst/near-mobile PR: 154
File: packages/frontend/auth/package.json:0-0
Timestamp: 2025-04-07T12:08:24.795Z
Learning: The BiometricsService is correctly re-exported in /packages/frontend/auth/src/data-access/services/index.ts with the line `export * from "./biometrics.service";`, validating the export path in package.json.
Applied to files:
scripts/migration/package.json
📚 Learning: 2025-09-01T14:25:59.510Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/test/mocks/wci/chains/nervos/providers/nervos-account-scanner.provider.mock.ts:1-6
Timestamp: 2025-09-01T14:25:59.510Z
Learning: In the Nervos blockchain implementation, scanning functionality has been separated from the account provider into a dedicated account scanner provider. The NervosScannedAccountAddressMock and NervosScannedAccountMock now belong in the nervos-account-scanner.provider.mock.ts file, not the account provider mock file.
Applied to files:
scripts/migration/src/cmd/debug/CreateAccount.ts, scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface extends IAccountProvider, which defines getAddressFromPublicKey(publicKey: string): string. This method is properly inherited and implemented in NervosAccountProvider by calling getAddressAndScriptFromPublicKey and returning the first element of the tuple. Calls to nervosAccountProvider.getAddressFromPublicKey() are valid and correct.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface extends IAccountProvider, which defines getAddressFromPublicKey(publicKey: string): string. This method is properly inherited and implemented, so calls to nervosAccountProvider.getAddressFromPublicKey() are valid and correct.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface has both getAddressFromPublicKey(publicKey: string): string and getAddressAndScriptFromPublicKey(publicKey: string): [string, Script] methods available. The getAddressFromPublicKey method exists and is correctly implemented.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
🧬 Code graph analysis (16)
scripts/migration/src/cmd/Dashboard.ts (1)
scripts/migration/src/config/migrationConfig.ts (1)
initMigrationConfig(32-66)
apps/relayer/src/modules/relayer/relayer.controller.ts (2)
apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts (1)
CreateAccountAtomicRequest(1-6)apps/relayer/src/modules/relayer/response/sign.response.ts (1)
SignResponse(3-6)
scripts/migration/src/config/migrationConfig.ts (1)
packages/shared/cli/src/cli.mjs (1)
__dirname(12-12)
scripts/migration/src/queue/RelayQueue.ts (2)
scripts/migration/src/provider/NearProvider.ts (1)
NearProvider(23-208)packages/auth0/src/actions/authorize-app.action.js (1)
SCHEMA(12-182)
scripts/migration/src/cmd/bootstrapMigration.ts (10)
scripts/migration/src/config/migrationConfig.ts (1)
initMigrationConfig(32-66)scripts/migration/src/database/MpcDatabase.ts (1)
MpcDatabase(10-24)scripts/migration/src/database/FirebaseDatabase.ts (1)
FirebaseDatabase(24-41)scripts/migration/src/provider/MPCProvider.ts (1)
MPCProvider(69-219)scripts/migration/src/provider/NearProvider.ts (1)
NearProvider(23-208)scripts/migration/src/provider/FastAuthProvider.ts (1)
FastAuthProvider(6-69)scripts/migration/src/queue/RelayQueue.ts (1)
RelayQueue(14-41)scripts/migration/src/queue/SignQueue.ts (1)
SignQueue(15-40)scripts/migration/src/queue/AddAccessKeysQueue.ts (1)
AddAccessKeysQueue(18-69)scripts/migration/src/queue/MpcUserQueue.ts (1)
MpcUserQueue(19-72)
scripts/migration/src/cmd/Work.ts (2)
scripts/migration/src/cmd/bootstrapMigration.ts (1)
bootstrapMigration(16-51)scripts/migration/src/queue/Queue.ts (1)
process(30-41)
scripts/migration/src/queue/AddAccessKeysQueue.ts (3)
scripts/migration/src/queue/RelayQueue.ts (2)
QUEUE_NAME(7-7)JobParams(9-12)scripts/migration/src/queue/SignQueue.ts (3)
QUEUE_NAME(8-8)JobParams(10-13)SignQueue(15-40)scripts/migration/src/provider/NearProvider.ts (1)
NearProvider(23-208)
apps/relayer/src/modules/relayer/relayer.service.ts (2)
apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts (1)
CreateAccountAtomicRequest(1-6)apps/relayer/src/modules/relayer/response/sign.response.ts (1)
SignResponse(3-6)
scripts/migration/src/provider/NearSigner.ts (2)
packages/auth0/src/actions/authorize-app.action.js (1)
transaction(237-237)packages/sdks/browser/src/providers/auth0/utils/index.ts (1)
encodeTransaction(13-18)
scripts/migration/src/queue/MpcUserQueue.ts (5)
scripts/migration/src/queue/AddAccessKeysQueue.ts (3)
QUEUE_NAME(10-10)JobParams(12-16)AddAccessKeysQueue(18-69)scripts/migration/src/database/MpcDatabase.ts (1)
MpcUser(3-8)scripts/migration/src/database/FirebaseDatabase.ts (1)
FirebaseDatabase(24-41)scripts/migration/src/provider/FastAuthProvider.ts (1)
FastAuthProvider(6-69)scripts/migration/src/utils/firebase.ts (2)
firebaseIdFromInternalAccountId(5-11)firebaseProviderToAuth0Provider(17-24)
scripts/migration/src/provider/FastAuthProvider.ts (1)
scripts/migration/src/utils/bytesJsonStringify.ts (1)
bytesJsonStringify(6-8)
scripts/migration/src/cmd/Migrate.ts (2)
scripts/migration/src/cmd/bootstrapMigration.ts (1)
bootstrapMigration(16-51)scripts/migration/src/queue/Queue.ts (1)
process(30-41)
scripts/migration/src/provider/MPCProvider.ts (1)
packages/auth0/src/actions/authorize-app.action.js (1)
SCHEMA(12-182)
scripts/migration/src/queue/SignQueue.ts (3)
scripts/migration/src/queue/RelayQueue.ts (3)
QUEUE_NAME(7-7)JobParams(9-12)RelayQueue(14-41)scripts/migration/src/provider/MPCProvider.ts (1)
MPCProvider(69-219)packages/auth0/src/actions/authorize-app.action.js (1)
SCHEMA(12-182)
scripts/migration/src/cmd/debug/CreateAccount.ts (1)
scripts/migration/src/provider/MPCProvider.ts (2)
MPCProvider(69-219)userCredentials(83-89)
scripts/migration/src/provider/NearProvider.ts (2)
scripts/migration/src/provider/NearSigner.ts (1)
NearSigner(8-52)scripts/migration/src/utils/timeout-promise.ts (2)
timeoutPromise(13-24)TimeoutPromiseError(1-5)
🪛 actionlint (1.7.8)
.github/workflows/build-migration.yaml
47-47: the runner of "docker/build-push-action@v4" action is too old to run on GitHub Actions. update the action's version to fix this issue
(action)
🪛 Checkov (3.2.334)
apps/relayer/openapi-spec.json
[high] 1: Ensure that the global security field has rules defined
(CKV_OPENAPI_4)
[high] 1: Ensure that security operations is not empty.
(CKV_OPENAPI_5)
🪛 Gitleaks (8.29.0)
scripts/migration/src/config/migrationConfig.ts
[high] 48-48: Identified a Private Key, which may compromise cryptographic security and sensitive data encryption.
(private-key)
🔇 Additional comments (14)
pnpm-workspace.yaml (1)
11-11: LGTM! The migration package is correctly added to the workspace configuration following the established pattern.
scripts/migration/.gitignore (1)
1-2: LGTM! Ignoring local data and configuration files is appropriate for a migration tool that likely handles environment-specific settings and temporary state.
scripts/migration/.eslintrc.js (1)
1-10: LGTM! The ESLint configuration appropriately disables `no-console` for CLI tooling and relaxes JSDoc requirements for internal migration scripts.

scripts/migration/package.json (2)
5-5: Fix module export configuration mismatch.

There's an inconsistency between `"main"` and `"exports"`:

- `"main"` points to compiled output: `"./dist/index.js"`
- `"exports"` points to source: `"./src/index.ts"`

Modern Node.js module resolution prioritizes the `"exports"` field, which means consumers would attempt to import TypeScript source files directly, likely causing runtime errors. Apply this diff to align the exports configuration:

```diff
 "exports": {
-    ".": "./src/index.ts"
+    ".": "./dist/index.js"
 }
```

Also applies to: 47-49
⛔ Skipped due to learnings
Learnt from: AgustinMJ Repo: Peersyst/near-mobile PR: 571 File: packages/shared/api/src/index.ts:33-33 Timestamp: 2025-09-10T13:23:58.048Z Learning: The packages/shared/api package exports source files directly (./src/index.ts) rather than compiled output because it's auto-generated by openapi-typescript-codegen and this export strategy is intentional for this shared package.Learnt from: obnol Repo: Peersyst/near-mobile PR: 154 File: packages/frontend/auth/package.json:0-0 Timestamp: 2025-04-07T12:08:24.795Z Learning: The BiometricsService is correctly re-exported in /packages/frontend/auth/src/data-access/services/index.ts with the line `export * from "./biometrics.service";`, validating the export path in package.json.
38-38: Dependency versions verified.

Based on the security and version checks of the versions specified in package.json:
Summary of Verification Results:
- `@near-js/transactions` latest version is 2.3.1, compatible with the specified `^2.0.2`
- `near-api-js` latest version is 6.3.0
- bull latest version is 4.16.5, and Bull is currently in maintenance mode with only bug fixes, while BullMQ is the modern rewritten implementation in TypeScript
- No security vulnerabilities were found in the NEAR repository or for these packages
No issues were found. All specified package versions are current and free from known vulnerabilities.
Versions verified as current and free from known vulnerabilities.
All three packages have recent versions available on npm with no known security advisories. Bull is in maintenance mode but remains safe to use. The specified version ranges are compatible with latest releases.
.github/workflows/pull-request.yml (1)
29-36: Verify the push: true configuration is intentional.

The `build-migration-script` job has `push: true` while `build-relayer` has `push: false`. This means migration images will be pushed to the registry on every PR, while relayer images won't. Verify this is intentional and aligns with your CI/CD strategy, as pushing on every PR can consume significant registry storage. If this was unintentional, consider aligning with the relayer job:
```diff
 build-migration-script:
     needs: [build-base]
     uses: ./.github/workflows/build-migration.yaml
     secrets: inherit
     with:
         target: integration
-        push: true
+        push: false
         base_image: ${{ github.sha }}
```

docker/migration.Dockerfile (1)
8-11: No issues found. The base image properly includes workspace metadata.

The base.Dockerfile executes `COPY . /project`, which copies the entire monorepo including `package.json`, `pnpm-workspace.yaml`, and `pnpm-lock.yaml` into the base image. When migration.Dockerfile inherits from the base image and runs `pnpm install`, all necessary workspace files are already present in `/project`, allowing pnpm to correctly resolve dependencies for the `scripts/migration` workspace package.

scripts/migration/src/provider/NearSigner.ts (2)
20-35: LGTM! Transaction signing implementation is correct.

The `signTransaction` method correctly encodes the transaction, computes a SHA-256 hash, signs it with the ED25519 key, and constructs a properly formatted `SignedTransaction`. The use of `base_encode` for the hash follows NEAR conventions.
11-13: Add error handling for invalid private key format.

The constructor performs a type assertion without validation. If the private key string is not in the expected `ed25519:...` format, `KeyPair.fromString()` will throw an error, making it difficult to diagnose the issue. Apply this diff to add validation:
```diff
 constructor(privateKey: string) {
+    if (!privateKey.startsWith('ed25519:')) {
+        throw new Error('Private key must be in ed25519:... format');
+    }
     this.keyPair = KeyPair.fromString(privateKey as `ed25519:${string}`);
 }
```

⛔ Skipped due to learnings
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 58
File: packages/shared/blockchain/near/src/signers/near-private-key.signer.ts:7-10
Timestamp: 2025-02-27T10:23:16.083Z
Learning: KeyPair.fromString() from near-api-js already performs internal validation and throws an error if an invalid key is provided, making additional validation checks redundant. The type definition in the library is incorrect - it should accept string rather than KeyPairString.

scripts/migration/src/queue/RelayQueue.ts (1)
26-40: LGTM! The deserialization and relay logic is correct. Error handling properly checks for transaction failures and throws with context.
scripts/migration/src/queue/Queue.ts (1)
1-44: LGTM! The abstract Queue class provides a clean abstraction over Bull with proper error handling and logging. The pattern of delegating to `_process` in subclasses is well-designed.
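For context, a stripped-down sketch of the pattern being praised here; this is not the actual `Queue.ts` implementation (its options, logging, and typing are richer), just the general shape of a Bull wrapper that delegates to `_process`:

```typescript
import Bull from "bull";

export abstract class BaseQueue<T> {
    protected readonly queue: Bull.Queue<T>;

    constructor(name: string, redisUrl: string) {
        this.queue = new Bull<T>(name, redisUrl);
    }

    async add(data: T): Promise<void> {
        await this.queue.add(data);
    }

    process(concurrency = 1): void {
        this.queue.process(concurrency, async (job) => {
            try {
                await this._process(job);
            } catch (e) {
                this.log(`job ${job.id} failed: ${e instanceof Error ? e.message : String(e)}`, "error");
                throw e; // let Bull mark the job as failed / retry it
            }
        });
    }

    protected log(message: string, level: "info" | "warn" | "error" = "info"): void {
        console[level](`[${this.queue.name}] ${message}`);
    }

    protected abstract _process(job: Bull.Job<T>): Promise<void>;
}
```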
scripts/migration/src/queue/SignQueue.ts (1)

28-39: LGTM! The signing flow correctly deserializes the delegate action, signs it via the MPC provider, and forwards the result to the relay queue.
scripts/migration/src/provider/MPCProvider.ts (1)
53-68: Borsh serialization patterns are correct; focus on backend interoperability testing

The serialization approach here is sound: `near-api-js/lib/utils` re-exports `borsh`'s functions, so importing from either source is equivalent. Borsh 2.0's `serialize()` API officially supports both scalar type strings (e.g., `"u32"`) and `Schema` objects, making your mixed usage pattern (`serialize("u32", this.salt)` and `serialize(U8_32, new BytesBox(...))`) correct per spec.

The code structure is valid, but the security concern remains: FRP signature helpers encode `[salt, oidc/delegate bytes, 0x00, frp_public_key]` for signing, and any deviation will break backend interoperability. Add unit tests asserting that `frpSignature`, `frpSignatureSign`, and `frpSignatureCredentials` produce known-good hex outputs for fixed test vectors (salt, keyPair, jwt, delegateAction) derived from backend test cases to catch serialization mismatches early.

Also applies to: 138-159, 165-197
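A possible shape for such a test; the helper name and call signature below are taken loosely from the review and must be adapted to how `MPCProvider` actually exposes these helpers, and all fixture values are placeholders to copy from the backend's own test vectors:

```typescript
import { describe, expect, it } from "@jest/globals";

// Hypothetical import/signature: adapt to how the FRP helpers are exposed in MPCProvider.ts.
import { frpSignature } from "../src/provider/MPCProvider";

// Placeholder fixtures: copy salt, key, jwt and the expected hex from backend test cases.
const SALT = 0; // use the provider's real salt constant
const JWT = "<fixed oidc token test vector>";
const FRP_SECRET_KEY = "ed25519:<fixed test key>";
const EXPECTED_HEX = "<known-good signature hex from backend tests>";

describe("MPCProvider FRP signature serialization", () => {
    it("matches the backend's known-good frpSignature vector", () => {
        const signature = frpSignature(SALT, JWT, FRP_SECRET_KEY);
        expect(Buffer.from(signature).toString("hex")).toBe(EXPECTED_HEX);
    });
});
```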
scripts/migration/src/provider/NearProvider.ts (1)
160-185: Confirm relayer account configuration is correct

In `signAndSendTransaction`, verify that `sender` (derived from `NearSigner.getAddress()`, the hex encoding of the relayer's public key) matches your deployed relayer account on chain:

- If using an implicit account (account id = hex-encoded public key), ensure that account is created and funded.
- If using a named account (e.g. `relayer.near`), pass that account id instead of the hex encoding.

The `sendTransactionUntil(signedTransaction, "FINAL")` method is valid and officially supported in `near-api-js` v5.
```yaml
      # Build docker image
      - name: Build docker image
        uses: docker/build-push-action@v4
```
🛠️ Refactor suggestion | 🟠 Major
Update docker/build-push-action to a newer version.
The workflow uses docker/build-push-action@v4, which is outdated and may not work on current GitHub Actions runners. Update to v6 for compatibility and latest features.
Apply this diff:
- name: Build docker image
- uses: docker/build-push-action@v4
+ uses: docker/build-push-action@v6
with:

Based on learnings
🧰 Tools
🪛 actionlint (1.7.8)
47-47: the runner of "docker/build-push-action@v4" action is too old to run on GitHub Actions. update the action's version to fix this issue
(action)
🤖 Prompt for AI Agents
.github/workflows/build-migration.yaml around line 47: the workflow pins
docker/build-push-action@v4 which is outdated; update the action reference to
docker/build-push-action@v6 (replace the v4 tag with v6) to use the newer
release and ensure compatibility with current GitHub Actions runners, then
commit and run the workflow to verify the build/push step succeeds.
| @@ -1 +1 @@ | |||
| {"openapi":"3.0.0","paths":{"/api/metrics":{"get":{"operationId":"index","parameters":[],"responses":{"200":{"description":""}}}},"/api/relayer/fast-auth":{"post":{"operationId":"createAccount","summary":"Create account","parameters":[],"requestBody":{"required":true,"content":{"application/json":{"schema":{"$ref":"#/components/schemas/CreateAccountRequest"}}}},"responses":{"201":{"description":"","content":{"application/json":{"schema":{"$ref":"#/components/schemas/SignResponse"}}}}},"tags":["fast-auth-relayer"]}}},"info":{"title":"relayer","description":"Relayer","version":"0.0.0","contact":{}},"tags":[],"servers":[],"components":{"securitySchemes":{"bearer":{"scheme":"bearer","bearerFormat":"JWT","type":"http"},"X-API-KEY":{"type":"apiKey","in":"header","name":"X-API-KEY"}},"schemas":{"SignRequest":{"type":"object","properties":{}},"SignResponse":{"type":"object","properties":{}},"SignAndSendDelegateActionRequest":{"type":"object","properties":{}},"CreateAccountRequest":{"type":"object","properties":{}}}}} No newline at end of file | |||
| {"openapi":"3.0.0","paths":{"/api/metrics":{"get":{"operationId":"index","parameters":[],"responses":{"200":{"description":""}}}},"/api/relayer/fast-auth":{"post":{"operationId":"createAccount","summary":"Create account","parameters":[],"requestBody":{"required":true,"content":{"application/json":{"schema":{"$ref":"#/components/schemas/CreateAccountRequest"}}}},"responses":{"201":{"description":"","content":{"application/json":{"schema":{"$ref":"#/components/schemas/SignResponse"}}}}},"tags":["fast-auth-relayer"]}},"/api/relayer/fast-auth/create_account_atomic":{"post":{"operationId":"createAccountAtomic","summary":"Create account","parameters":[],"requestBody":{"required":true,"content":{"application/json":{"schema":{"$ref":"#/components/schemas/CreateAccountAtomicRequest"}}}},"responses":{"201":{"description":"","content":{"application/json":{"schema":{"$ref":"#/components/schemas/SignResponse"}}}}},"tags":["fast-auth-relayer"]}}},"info":{"title":"relayer","description":"Relayer","version":"0.0.0","contact":{}},"tags":[],"servers":[],"components":{"securitySchemes":{"bearer":{"scheme":"bearer","bearerFormat":"JWT","type":"http"},"X-API-KEY":{"type":"apiKey","in":"header","name":"X-API-KEY"}},"schemas":{"SignRequest":{"type":"object","properties":{}},"SignResponse":{"type":"object","properties":{}},"SignAndSendDelegateActionRequest":{"type":"object","properties":{}},"CreateAccountRequest":{"type":"object","properties":{}},"CreateAccountAtomicRequest":{"type":"object","properties":{}}}}} No newline at end of file | |||
Complete the CreateAccountAtomicRequest schema definition.
The schema for CreateAccountAtomicRequest has empty properties, which means the OpenAPI spec doesn't document the expected request structure. Based on the implementation in apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts, the schema should include: account_id, allowance, oauth_token, and signed_delegate_action.
This prevents API consumers from understanding the required fields and their types. Please regenerate the OpenAPI spec to include the complete schema definition.
🧰 Tools
🪛 Checkov (3.2.334)
[high] 1: Ensure that the global security field has rules defined
(CKV_OPENAPI_4)
[high] 1: Ensure that security operations is not empty.
(CKV_OPENAPI_5)
🤖 Prompt for AI Agents
In apps/relayer/openapi-spec.json around lines 1 to 1, the
CreateAccountAtomicRequest schema is empty; update the
components.schemas.CreateAccountAtomicRequest entry to include the expected
fields from
apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts
(account_id, allowance, oauth_token, signed_delegate_action) and regenerate the
OpenAPI spec — for example define account_id as string, allowance as string (or
numeric if implementation expects number), oauth_token as string, and
signed_delegate_action as an object (or reference the existing delegate action
schema) so consumers can see required fields and their types.
```typescript
@Post("/create_account_atomic")
@ApiOperation({ summary: "Create account" })
@ApiBody({ type: CreateAccountAtomicRequest })
async createAccountAtomic(@Body() body: CreateAccountAtomicRequest): Promise<SignResponse> {
    return this.fastAuthRelayerService.createAccountAtomic(body);
}
```
Add input validation for the new endpoint.
The createAccountAtomic method doesn't validate the input body, unlike other endpoints in this controller (e.g., line 20 calls validate(body)). This could allow invalid or malicious data to reach the service layer.
Apply this diff to add validation:
@Post("/create_account_atomic")
@ApiOperation({ summary: "Create account" })
@ApiBody({ type: CreateAccountAtomicRequest })
async createAccountAtomic(@Body() body: CreateAccountAtomicRequest): Promise<SignResponse> {
+ validate(body);
return this.fastAuthRelayerService.createAccountAtomic(body);
}

Also ensure that the `CreateAccountAtomicRequest` class includes appropriate validation decorators (e.g., from class-validator).
🤖 Prompt for AI Agents
In apps/relayer/src/modules/relayer/relayer.controller.ts around lines 39 to 44,
the new createAccountAtomic endpoint is missing input validation; update the
controller method to call validate(body) (or await validate(body)) before
passing to fastAuthRelayerService.createAccountAtomic, and return/throw the
resulting validation errors consistently with the other endpoints; additionally
ensure the CreateAccountAtomicRequest DTO is annotated with appropriate
class-validator decorators (e.g., IsString/IsEmail/IsOptional/ValidateNested
etc.) and any nested types use Type decorators so validation runs correctly.
```typescript
async createAccountAtomic(body: CreateAccountAtomicRequest): Promise<SignResponse> {
    const functionCallAction = body.signed_delegate_action.delegate_action.actions[0].FunctionCall;
    const [sigAlgo, sigData] = body.signed_delegate_action.signature.split(":")
    const signedDelegate = actionCreators.signedDelegate({
        delegateAction: buildDelegateAction({
            senderId: body.signed_delegate_action.delegate_action.sender_id,
            receiverId: body.signed_delegate_action.delegate_action.receiver_id,
            actions: [
                functionCall(
                    functionCallAction.method_name,
                    JSON.parse(Buffer.from(functionCallAction.args, "base64").toString()),
                    functionCallAction.gas,
                    functionCallAction.deposit,
                )
            ],
            nonce: body.signed_delegate_action.delegate_action.nonce,
            maxBlockHeight: body.signed_delegate_action.delegate_action.max_block_height,
            publicKey: PublicKey.from(body.signed_delegate_action.delegate_action.public_key),
        }),
        signature: new Signature({
            keyType: sigAlgo === "ed25519" ? 0 : 1,
            data: base_decode(sigData),
            // 245, 37, 132, 230, ... 63, 170, 6
        }),
    });
    return await this.signAndSendTransaction(body.signed_delegate_action.delegate_action.sender_id, [signedDelegate]);
}
```
Add input validation for createAccountAtomic.
The method accesses nested properties and performs operations without validating the input structure. Malformed requests will cause cryptic runtime errors or security issues.
Add validation before processing:
async createAccountAtomic(body: CreateAccountAtomicRequest): Promise<SignResponse> {
+ // Validate input structure
+ if (!body.signed_delegate_action?.delegate_action?.actions?.[0]?.FunctionCall) {
+ throw new BadRequestException('Missing or invalid delegate_action.actions[0].FunctionCall');
+ }
+ if (!body.signed_delegate_action.signature || typeof body.signed_delegate_action.signature !== 'string') {
+ throw new BadRequestException('Missing or invalid signature');
+ }
+ if (!body.signed_delegate_action.delegate_action.sender_id || !body.signed_delegate_action.delegate_action.receiver_id) {
+ throw new BadRequestException('Missing sender_id or receiver_id');
+ }
+
const functionCallAction = body.signed_delegate_action.delegate_action.actions[0].FunctionCall;
const [sigAlgo, sigData] = body.signed_delegate_action.signature.split(":")
+ if (!sigAlgo || !sigData) {
+ throw new BadRequestException('Invalid signature format, expected "algorithm:data"');
+ }
+ if (sigAlgo !== "ed25519" && sigAlgo !== "secp256k1") {
+ throw new BadRequestException(`Unsupported signature algorithm: ${sigAlgo}`);
+ }
+
+ let parsedArgs;
+ try {
+ parsedArgs = JSON.parse(Buffer.from(functionCallAction.args, "base64").toString());
+ } catch (e) {
+ throw new BadRequestException('Invalid base64 or JSON in function call args');
+ }
+
const signedDelegate = actionCreators.signedDelegate({
delegateAction: buildDelegateAction({
senderId: body.signed_delegate_action.delegate_action.sender_id,
receiverId: body.signed_delegate_action.delegate_action.receiver_id,
actions: [
functionCall(
functionCallAction.method_name,
- JSON.parse(Buffer.from(functionCallAction.args, "base64").toString()),
+ parsedArgs,
functionCallAction.gas,
functionCallAction.deposit,
                )
```typescript
export class CreateAccountAtomicRequest {
    account_id: string;
    allowance: number;
    oauth_token: string;
    signed_delegate_action: any;
}
```
🛠️ Refactor suggestion | 🟠 Major
Add validation decorators and improve type safety.
The request class lacks validation decorators, which are essential for request DTOs in NestJS applications. Additionally, signed_delegate_action is typed as any, reducing type safety.
+import { IsString, IsNumber, IsNotEmpty, ValidateNested } from 'class-validator';
+import { Type } from 'class-transformer';
+
+// Define a proper type for signed delegate action
+interface SignedDelegateAction {
+ delegateAction: any; // or more specific type
+ signature: string;
+}
+
export class CreateAccountAtomicRequest {
+ @IsString()
+ @IsNotEmpty()
account_id: string;
+
+ @IsNumber()
allowance: number;
+
+ @IsString()
+ @IsNotEmpty()
oauth_token: string;
- signed_delegate_action: any;
+
+ @ValidateNested()
+ @Type(() => Object)
+ signed_delegate_action: SignedDelegateAction;
}

Committable suggestion skipped: line range outside the PR's diff.
```typescript
const [issP1, issP2, sub] = job.data.user.internal_account_id.split(":");
const issP2Parts = issP2.split("/");
const aud = issP2Parts[issP2Parts.length - 1];
```
Validate internal_account_id format before parsing.
The code splits internal_account_id by : and / without validation. If the format is malformed, accessing array elements will produce undefined, leading to an invalid JWT.
Consider adding validation:
+ if (!job.data.user.internal_account_id || typeof job.data.user.internal_account_id !== 'string') {
+ throw new Error('Invalid internal_account_id');
+ }
const [issP1, issP2, sub] = job.data.user.internal_account_id.split(":");
+ if (!issP1 || !issP2 || !sub) {
+ throw new Error('internal_account_id must have format issP1:issP2:sub');
+ }
const issP2Parts = issP2.split("/");
+ if (issP2Parts.length === 0) {
+ throw new Error('issP2 must contain at least one segment');
+ }
const aud = issP2Parts[issP2Parts.length - 1];
```typescript
            newPublicKeys.push(providerPublicKey);
        }

        await this.addAccessKeysQueue.add({ oldPublicKey: oldPublicKey.toString(), newPublicKeys, jwt });
```
Fix type mismatch in newPublicKeys.
The newPublicKeys array contains PublicKey objects, but AddAccessKeysQueue.JobParams expects newPublicKeys: string[]. This will cause a type error.
Apply this diff:
- await this.addAccessKeysQueue.add({ oldPublicKey: oldPublicKey.toString(), newPublicKeys, jwt });
+ await this.addAccessKeysQueue.add({ oldPublicKey: oldPublicKey.toString(), newPublicKeys: newPublicKeys.map(pk => pk.toString()), jwt });
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
await this.addAccessKeysQueue.add({ oldPublicKey: oldPublicKey.toString(), newPublicKeys: newPublicKeys.map(pk => pk.toString()), jwt });
```
🤖 Prompt for AI Agents
In scripts/migration/src/queue/MpcUserQueue.ts around line 70, newPublicKeys is
an array of PublicKey objects but the AddAccessKeysQueue.JobParams expects
string[]; map newPublicKeys to strings (e.g., newPublicKeys.map(pk =>
pk.toString()) or appropriate serialization) before passing into
addAccessKeysQueue.add so the payload matches the expected string[] type and
avoid the type error.
```ts
 * @returns The stringified input as a buffer.
 */
export function bytesJsonStringify(input: any): Buffer {
    return Buffer.from(JSON.stringify(input));
```
Add error handling for circular references in JSON.stringify.
JSON.stringify throws a TypeError when encountering circular references. Consider adding error handling or documenting this limitation, especially since this utility is used in migration workflows where data integrity is critical.
Apply this diff to add error handling:
```diff
 export function bytesJsonStringify(input: unknown): Buffer {
-  return Buffer.from(JSON.stringify(input));
+  try {
+    return Buffer.from(JSON.stringify(input));
+  } catch (error) {
+    throw new Error(`Failed to stringify input: ${error instanceof Error ? error.message : String(error)}`);
+  }
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
export function bytesJsonStringify(input: unknown): Buffer {
  try {
    return Buffer.from(JSON.stringify(input));
  } catch (error) {
    throw new Error(`Failed to stringify input: ${error instanceof Error ? error.message : String(error)}`);
  }
}
```
🤖 Prompt for AI Agents
In scripts/migration/src/utils/bytesJsonStringify.ts around line 7,
JSON.stringify can throw a TypeError on circular references; wrap the stringify
call in a try/catch, detect a circular-reference TypeError (or any error) and
rethrow a clearer, migration-contextual error (including the original error
message) so callers get actionable info, then continue to return
Buffer.from(theString) when successful; alternatively, if desired, use a safe
stringify replacer (cycle handler) and document the behavior in the function
comment.
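If throwing on circular input is not desirable, the replacer-based alternative mentioned in the prompt above is another option. A rough sketch, with the `[Circular]` placeholder being an arbitrary choice that callers would need to accept:

```typescript
// Sketch only: drop circular references instead of throwing, using a WeakSet as the cycle guard.
export function bytesJsonStringifySafe(input: unknown): Buffer {
    const seen = new WeakSet<object>();
    const json = JSON.stringify(input, (_key, value) => {
        if (typeof value === "object" && value !== null) {
            if (seen.has(value)) {
                return "[Circular]"; // replace already-visited objects rather than recursing
            }
            seen.add(value);
        }
        return value;
    });
    return Buffer.from(json);
}
```

Whether silently replacing cycles is acceptable depends on how the resulting buffer is consumed downstream.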
```ts
export const firebaseProviderToAuth0Provider = (provider: string): string => {
    switch (provider) {
        case "google.com":
            return "goauth-2";
        default:
            throw new Error("firebaseProvider unknown " + provider);
    }
};
```
🧩 Analysis chain
Limited provider support may cause migration failures.
The function only supports the google.com provider and will throw for any other provider (e.g., facebook.com, github.com, apple.com). This could cause migration failures for users who authenticated with a non-Google provider.
Verify whether the migration data includes users with non-Google providers:
If other providers exist, consider extending the mapping or documenting the limitation.
🏁 Script executed:
```shell
#!/bin/bash
# Description: Check for non-Google providers in Firebase export data
# Search for provider IDs in Firebase export
if [ -f "scripts/migration/data/firebase-export.json" ]; then
  echo "Checking for provider IDs in firebase-export.json..."
  jq -r '.users[].providerUserInfo[].providerId' scripts/migration/data/firebase-export.json 2>/dev/null | sort -u
else
  echo "Firebase export file not found at expected location"
fi
```

Length of output: 51
I'll investigate further to understand the actual scope and impact of this limitation.
Now let me search the codebase directly to understand where this function is used and whether there's any context about which providers should be supported.
🏁 Script executed:
```shell
#!/bin/bash
# Search for usages of firebaseProviderToAuth0Provider function
rg -n "firebaseProviderToAuth0Provider" --type ts --type tsx --type js --type jsx -C 3
```

Length of output: 27
---
🏁 Script executed:
```shell
#!/bin/bash
# Search for usages of firebaseProviderToAuth0Provider function
rg -n "firebaseProviderToAuth0Provider" -C 3
# Also search for any provider mapping or configuration
echo "---"
echo "Searching for provider-related files/constants:"
fd -e ts -e js | xargs rg -l "provider" | head -20
```

Length of output: 3062
Now let me examine the exact context around the usage to understand the full impact:
---
🏁 Script executed:
```shell
#!/bin/bash
# Get more context around the usage in MpcUserQueue.ts
rg -n "for (const provider of firebaseUser.providerUserInfo)" -A 10 -B 5 scripts/migration/src/queue/MpcUserQueue.ts
```

Length of output: 0
---
🏁 Script executed:
```shell
#!/bin/bash
# View the MpcUserQueue.ts file directly
cat scripts/migration/src/queue/MpcUserQueue.ts | sed -n '55,75p'
```

Length of output: 806
Function will crash for users with non-Google authentication providers.
Verification confirms the issue: at scripts/migration/src/queue/MpcUserQueue.ts (lines 62–67), the code loops through all providers in firebaseUser.providerUserInfo and calls firebaseProviderToAuth0Provider() for each without error handling. Since Firebase supports multiple providers (Google, Facebook, GitHub, Apple, etc.) and the function only handles "google.com", this will throw an error and crash the migration for any user with non-Google authentication.
The function needs to either support additional provider mappings or filter/skip unsupported providers before processing.
🤖 Prompt for AI Agents
In scripts/migration/src/utils/firebase.ts around lines 17–24, the
firebaseProviderToAuth0Provider function currently only maps "google.com" and
throws for any other provider which crashes the migration; change the function
to return a nullable/undefined value (or an explicit "unsupported") instead of
throwing for unknown providers and add mappings for common Firebase providers
(e.g., facebook.com, github.com, apple.com) if Auth0 equivalents are known; then
update callers (like scripts/migration/src/queue/MpcUserQueue.ts) to skip
providers when the function returns undefined/unsupported so non-Google users
are ignored rather than causing an exception.
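One way to combine both suggestions (extra mappings plus skip-on-unknown) is a lookup table that yields undefined for unsupported providers. Only the google.com → goauth-2 mapping comes from this PR; the commented entries are placeholders that would have to be confirmed against the actual Auth0 tenant before enabling.

```typescript
// Sketch only: unknown providers return undefined so callers can skip them instead of crashing.
const FIREBASE_TO_AUTH0_PROVIDER: Record<string, string | undefined> = {
    "google.com": "goauth-2",
    // "apple.com": "apple",       // placeholder: confirm the Auth0 connection name first
    // "facebook.com": "facebook", // placeholder: confirm the Auth0 connection name first
};

export const firebaseProviderToAuth0Provider = (provider: string): string | undefined =>
    FIREBASE_TO_AUTH0_PROVIDER[provider];
```

Callers (for example the provider loop in MpcUserQueue) would then log and continue when the result is undefined.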
| "extends": "@shared/tsconfig/package", | ||
| "compilerOptions": { | ||
| "outDir": "./dist", | ||
| "lib": ["ES2022", "DOM"] |
🧩 Analysis chain
Verify the need for DOM lib in a Node.js migration script.
The "DOM" library is included but migration scripts typically run in Node.js environments and don't require browser APIs. Unless specific DOM types are needed, consider removing it to avoid accidentally using browser-only APIs.
🏁 Script executed:
```shell
#!/bin/bash
# Check if DOM-specific types are used in the migration codebase
rg -n --type ts -g '!node_modules/**' '\b(HTMLElement|Document|Window|Element|addEventListener|querySelector|fetch|XMLHttpRequest|localStorage|sessionStorage|navigator)\b' scripts/migration/src/
```

Length of output: 599
I need to verify the Node.js version target for this project to complete the analysis.
---
🏁 Script executed:
```shell
#!/bin/bash
# Check Node.js version requirements in package.json
cat package.json | grep -A 2 "engines"
# Also check tsconfig.json to see if there are any other lib settings or comments
echo "---"
cat scripts/migration/tsconfig.json
```

Length of output: 185
Let me check the shared tsconfig to understand the inheritance chain:
---
🏁 Script executed:
```shell
#!/bin/bash
# Find the shared tsconfig that migration config extends
fd -n tsconfig.json | xargs grep -l "@shared/tsconfig/package" 2>/dev/null
# Also look for the actual shared config location
fd -n "tsconfig" -path "*/shared/*" 2>/dev/null | head -20
```

Length of output: 230
Let me correct that command:
---
🏁 Script executed:
```shell
#!/bin/bash
# Find tsconfig files in shared/packages
find . -path "*/shared/*" -name "tsconfig*.json" 2>/dev/null | head -20
# Also search more broadly for the package config
find . -name "tsconfig.json" | xargs grep -l "compilerOptions" 2>/dev/null | grep -i shared
```

Length of output: 987
---
🏁 Script executed:
```shell
#!/bin/bash
# Check the parent tsconfig that migration extends from
cat packages/shared/tsconfig/tsconfig.package.json
# Also check if DOM appears in base tsconfig
echo "---"
cat packages/shared/tsconfig/tsconfig.base.json | head -30
```

Length of output: 820
Remove the unnecessary "DOM" lib from the migration script's tsconfig.
Verification confirms that "DOM" is explicitly added only in scripts/migration/tsconfig.json:5 and is not inherited from parent configs. Code analysis found zero DOM API usage—only fetch(), which is available in Node.js 18+ and doesn't require the DOM lib. Removing it prevents accidental use of browser-only types and maintains clarity for Node.js-specific scripts.
🤖 Prompt for AI Agents
In scripts/migration/tsconfig.json at line 5, the "DOM" entry in the "lib" array
is unnecessary for Node.js-only migration scripts and should be removed to avoid
pulling in browser-only types; edit the file to remove "DOM" from the array so
it becomes e.g. ["ES2022"] (or just drop the DOM element if other libs remain),
save the file, and run TypeScript build/check to confirm no missing types.
Actionable comments posted: 6
♻️ Duplicate comments (19)
scripts/migration/README.md (1)
1-1: Add comprehensive documentation for the migration system.This README contains only a title. The migration system includes complex components (Bull queues, NEAR integration, MPC providers) that require comprehensive documentation including purpose, setup instructions, usage examples, architecture overview, configuration requirements, and safety considerations.
scripts/migration/src/cmd/debug/CreateAccount.ts (2)
4-8: Remove hardcoded credentials from source code.The script contains hardcoded JWT tokens and private keys, which poses a security risk. Move these to environment variables or a local config file.
9-16: Fix logging logic and add error handling.The
claimOidcTokenis called once but its result is logged inside the loop repeatedly. Additionally, the entire sequence lacks error handling for potential failures fromuserCredentialsandcreateNewAccount.scripts/migration/src/cmd/debug/ExportDatastore.ts (1)
13-28: Implement pagination to export all datastore entities.The current implementation only fetches the first page of results from
datastore.runQuery(). For large kinds, this will silently export incomplete data. Loop usinginfo.endCursorand checkinfo.moreResults !== Datastore.NO_MORE_RESULTSuntil all pages are fetched.scripts/migration/src/utils/firebase.ts (1)
17-24: Limited provider support will crash migration for non-Google users.The function only maps
"google.com"and throws for any other provider. This will crash the migration when processing users with Facebook, GitHub, Apple, or other authentication providers, as verified inMpcUserQueue.tswhere all providers are processed without error handling.scripts/migration/src/database/MpcDatabase.ts (1)
13-22: Add error handling for credentials parsing.The constructor parses
GOOGLE_CLOUD_CREDENTIALS_JSONwithout error handling. If the environment variable contains invalid JSON or is missing, this will either throw an unclear error or silently use an empty object, leading to authentication failures later.Apply this diff to add proper validation:
constructor( projectId: string, private readonly kind: string, private readonly pageSize = 1_000, ) { + const credentialsJson = process.env.GOOGLE_CLOUD_CREDENTIALS_JSON; + if (!credentialsJson) { + throw new Error("GOOGLE_CLOUD_CREDENTIALS_JSON environment variable is required"); + } + + let credentials; + try { + credentials = JSON.parse(credentialsJson); + } catch (error) { + throw new Error(`Failed to parse GOOGLE_CLOUD_CREDENTIALS_JSON: ${error.message}`); + } + this.datastore = new Datastore({ projectId, - credentials: JSON.parse(process.env.GOOGLE_CLOUD_CREDENTIALS_JSON || "{}"), + credentials, }); }scripts/migration/src/utils/bytesJsonStringify.ts (1)
1-8: Consider handling stringify failures (e.g., circular references).
JSON.stringify(input)will throw on circular structures; in a migration context that can be hard to diagnose. You might want to wrap it in a try/catch and rethrow with a clearer message (e.g., including the key path or mentioning circular refs), or at least document that the input must be JSON-serializable.scripts/migration/src/cmd/Migrate.ts (1)
3-11: Add basic error handling and logging around the migration run.Right now any failure in
bootstrapMigration,iterateAll, ormpcUserQueue.addwill surface as an unhandled rejection, and there’s no visibility into progress (user count, enqueue status).Wrapping the body in a try/catch with minimal logging and exiting non‑zero on error would make this script much safer to run in CI/ops contexts, e.g. log how many users were enqueued and print the error stack before
process.exit(1).scripts/migration/src/cmd/Dashboard.ts (1)
32-35: Fix the admin URL in the log message.The server listens on port
3009, but the log instructs users to openhttp://localhost:3000/admin. Update the URL to match the actual port so operators don’t chase the wrong endpoint.scripts/migration/src/cmd/Work.ts (1)
3-20: Validate CLI args and harden error handling in the worker launcher.
const workersRaw = process.argv[2];is used without checking; if omitted,workersRaw.split(",")will throw.- Failures in
bootstrapMigration()or any*.process()are not caught or logged.- The “Unknown worker” error could list valid options to aid operators.
Consider:
- Verifying
process.argv[2]is present and non-empty; otherwise print a short usage message and exit non‑zero.- Wrapping bootstrap +
Promise.all(processors)in a try/catch that logs the error (including stack) and exits with code 1.- Expanding the unknown-worker error to enumerate allowed values:
all, mpcUser, addAccessKeys, sign, relay.scripts/migration/docker-compose.yml (1)
10-10: Pin the migration image version for all services.All services using the
migrationimage lack a version tag, which can lead to unpredictable behavior if the image is updated. This affects the migrate, dashboard, and all worker services.Apply consistent versioning across all services:
- image: migration + image: migration:latestOr better yet, use a specific version tag like
migration:1.0.0or an immutable digest.Also applies to: 20-20, 30-30, 38-38, 46-46, 54-54
.github/workflows/build-migration.yaml (1)
47-47: Update docker/build-push-action to v6.The workflow uses an outdated version of the action that may not work on current GitHub Actions runners.
Apply this diff:
- uses: docker/build-push-action@v4 + uses: docker/build-push-action@v6apps/relayer/src/modules/relayer/relayer.controller.ts (1)
39-44: Add input validation for consistency and security.The
createAccountAtomicendpoint doesn't validate the input body, unlike thesignandsignAndSendDelegateActionendpoints (lines 20, 28). This inconsistency creates security risks and makes the API behavior unpredictable.Apply this diff to add validation:
@Post("/create_account_atomic") @ApiOperation({ summary: "Create account" }) @ApiBody({ type: CreateAccountAtomicRequest }) async createAccountAtomic(@Body() body: CreateAccountAtomicRequest): Promise<SignResponse> { + validate(body); return this.fastAuthRelayerService.createAccountAtomic(body); }Also ensure that the
CreateAccountAtomicRequestclass includes appropriateclass-validatordecorators.apps/relayer/src/modules/relayer/relayer.service.ts (1)
109-135: Add comprehensive input validation and error handling.The method accesses deeply nested properties and performs operations without validation, which will cause runtime exceptions with malformed requests.
Key issues to address:
- Validate nested property access (lines 110, 114, 115)
- Validate signature format and algorithm (line 111, 129)
- Add try-catch for base64/JSON parsing (line 119)
- Validate actions array length (line 110)
See the past review comment for detailed validation code.
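For reference, a defensive parsing step along those lines could look roughly like the sketch below. The base64-encoded JSON shape and the field names are assumptions drawn from the issues listed above, not the service's confirmed payload format.

```typescript
// Illustrative only: validate the signed delegate action before touching nested fields.
interface ParsedSignedDelegate {
    delegateAction: { actions: unknown[] };
    signature: string;
}

export function parseSignedDelegateAction(raw: string): ParsedSignedDelegate {
    let decoded: unknown;
    try {
        decoded = JSON.parse(Buffer.from(raw, "base64").toString("utf8"));
    } catch (e) {
        throw new Error(`signed_delegate_action is not valid base64-encoded JSON: ${e}`);
    }
    const candidate = decoded as Partial<ParsedSignedDelegate> | null;
    const actions = candidate?.delegateAction?.actions;
    if (!Array.isArray(actions) || actions.length === 0) {
        throw new Error("signed_delegate_action.delegateAction.actions must be a non-empty array");
    }
    const signature = candidate?.signature;
    if (typeof signature !== "string" || signature.length === 0) {
        throw new Error("signed_delegate_action.signature must be a non-empty string");
    }
    return { delegateAction: { actions }, signature };
}
```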
apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts (1)
1-6: Add validation decorators and improve type safety.The request class lacks validation decorators, which are essential for NestJS DTOs. Additionally,
signed_delegate_actionis typed asany, reducing type safety and making the code prone to runtime errors.Apply validation decorators and define proper types. See the past review comment for detailed implementation.
scripts/migration/src/cmd/bootstrapMigration.ts (1)
15-51: Add error handling for initialization failures.The bootstrap function performs multiple operations that can fail (config loading, file I/O, network connections) without error handling. This can lead to cryptic error messages and inconsistent state during startup.
Wrap the initialization sequence in a try-catch block with clear error messages:
export const bootstrapMigration = () => { try { const config = initMigrationConfig(); // ... rest of initialization return { relayQueue, signQueue, addAccessKeysQueue, mpcUserQueue, firebaseDatabase, }; } catch (error) { console.error('Failed to bootstrap migration:', error); throw new Error(`Migration bootstrap failed: ${error instanceof Error ? error.message : String(error)}`); } };apps/relayer/openapi-spec.json (1)
1-1: Complete the CreateAccountAtomicRequest schema definition.The
CreateAccountAtomicRequestschema has empty properties, preventing API consumers from understanding the required request structure. Based on the implementation, it should include:account_id,allowance,oauth_token, andsigned_delegate_action.scripts/migration/src/queue/MpcUserQueue.ts (1)
67-67: Fix type mismatch in newPublicKeys.The
newPublicKeysarray containsPublicKeyobjects, butAddAccessKeysQueue.JobParamsexpectsstring[]. This will cause a type error.Apply this diff:
- await this.addAccessKeysQueue.add({ oldPublicKey: oldPublicKey.toString(), newPublicKeys, jwt }); + await this.addAccessKeysQueue.add({ oldPublicKey: oldPublicKey.toString(), newPublicKeys: newPublicKeys.map(pk => pk.toString()), jwt });scripts/migration/src/config/migrationConfig.ts (1)
45-75: Remove hardcoded private keys and load them from config/env with a hard fail if missing
bypassJwt.privateKeyPem,oldMpc.tmpPrivateKey, andnear.relayerPrivateKeyare currently hardcoded in source. Even for tooling/migrations, committing private keys is a critical security risk (leakage via VCS, logs, backups) and was already flagged in a previous review.Instead, read these from
config.jsonand/or environment variables, and throw a clear error if any are missing. For example:-export const initMigrationConfig = (): MigrationConfig => { - let config = {}; +export const initMigrationConfig = (): MigrationConfig => { + let config: Partial<MigrationConfig> = {}; @@ - return { - firebase: { - file: path.join(__dirname, "../../data/firebase-export.json"), - }, - bypassJwt: { - privateKeyPem: - "-----BEGIN PRIVATE KEY-----\nMIIEvAI...IbiFCFzqBw0UQ==\n-----END PRIVATE KEY-----\n", - jwtIssuer: "https://securetoken.google.com/near-mobile-99e38", - jwtAudience: "near-mobile-99e38", - }, - oldMpc: { - url: "https://mpc.mainnet.sig.network", - tmpPrivateKey: "ed25519:MHz6wrZAegmeCXT6frJCJLYtGy39pz48JFeu2R1zKpeS74PVbaCXnZ1h8cCNzt9yyr2rq9DFswJWx7gWD5gRgFd", - salt: 3177899144, - }, - near: { - relayerPrivateKey: "ed25519:MHz6wrZAegmeCXT6frJCJLYtGy39pz48JFeu2R1zKpeS74PVbaCXnZ1h8cCNzt9yyr2rq9DFswJWx7gWD5gRgFd", - networkId: "mainnet", - nodeUrl: "https://rpc.near.org", - indexerUrl: "https://api.nearblocks.io", - }, + const cfg = config as Partial<MigrationConfig>; + + const bypassJwtPrivateKeyPem = + cfg.bypassJwt?.privateKeyPem ?? process.env.BYPASS_JWT_PRIVATE_KEY_PEM; + const oldMpcTmpPrivateKey = + cfg.oldMpc?.tmpPrivateKey ?? process.env.MPC_TMP_PRIVATE_KEY; + const nearRelayerPrivateKey = + cfg.near?.relayerPrivateKey ?? process.env.NEAR_RELAYER_PRIVATE_KEY; + + if (!bypassJwtPrivateKeyPem) { + throw new Error("bypassJwt.privateKeyPem must be provided via config.json or BYPASS_JWT_PRIVATE_KEY_PEM"); + } + if (!oldMpcTmpPrivateKey) { + throw new Error("oldMpc.tmpPrivateKey must be provided via config.json or MPC_TMP_PRIVATE_KEY"); + } + if (!nearRelayerPrivateKey) { + throw new Error("near.relayerPrivateKey must be provided via config.json or NEAR_RELAYER_PRIVATE_KEY"); + } + + return { + firebase: { + file: cfg.firebase?.file ?? path.join(__dirname, "../../data/firebase-export.json"), + }, + bypassJwt: { + privateKeyPem: bypassJwtPrivateKeyPem, + jwtIssuer: cfg.bypassJwt?.jwtIssuer ?? "https://securetoken.google.com/near-mobile-99e38", + jwtAudience: cfg.bypassJwt?.jwtAudience ?? "near-mobile-99e38", + }, + oldMpc: { + url: cfg.oldMpc?.url ?? "https://mpc.mainnet.sig.network", + tmpPrivateKey: oldMpcTmpPrivateKey, + salt: cfg.oldMpc?.salt ?? 3177899144, + }, + near: { + relayerPrivateKey: nearRelayerPrivateKey, + networkId: cfg.near?.networkId ?? "mainnet", + nodeUrl: cfg.near?.nodeUrl ?? "https://rpc.near.org", + indexerUrl: cfg.near?.indexerUrl ?? "https://api.nearblocks.io", + }, @@ - queue: { - redis: "redis://localhost", - }, - ...config, + queue: { + redis: cfg.queue?.redis ?? "redis://localhost", + }, };Also consider documenting recommended secret management (Vault/Secrets Manager, etc.) so these values are injected at runtime rather than committed.
🧹 Nitpick comments (12)
scripts/migration/src/database/MpcDatabase.ts (1)
28-49: Pagination implementation is correct; consider adding error handling.The pagination logic correctly uses
Datastore.NO_MORE_RESULTSto iterate through all pages, addressing the pagination concerns found in other files.Consider adding error handling for Datastore operations to provide better diagnostics if query execution fails:
async iterateAll(callback: (user: MpcUser) => Promise<void>): Promise<void> { let more = true; let nextPageCursor = null; while (more) { - const query = this.datastore.createQuery(this.kind).limit(this.pageSize); - - // Add cursor if continuing pagination - if (nextPageCursor) { - query.start(nextPageCursor); - } - - const [entities, metadata] = await this.datastore.runQuery(query); - for (const entity of entities) { - await callback(entity); - } - - // Pagination info - nextPageCursor = metadata.endCursor; - more = metadata.moreResults !== Datastore.NO_MORE_RESULTS; + try { + const query = this.datastore.createQuery(this.kind).limit(this.pageSize); + + // Add cursor if continuing pagination + if (nextPageCursor) { + query.start(nextPageCursor); + } + + const [entities, metadata] = await this.datastore.runQuery(query); + for (const entity of entities) { + await callback(entity); + } + + // Pagination info + nextPageCursor = metadata.endCursor; + more = metadata.moreResults !== Datastore.NO_MORE_RESULTS; + } catch (error) { + throw new Error(`Failed to iterate ${this.kind} at cursor ${nextPageCursor}: ${error.message}`); + } } }scripts/migration/src/utils/timeout-promise.ts (1)
13-23: Type the timeout handle explicitly for better TS safety.
let timeout;is implicitlyany. To keep types robust across Node/browser targets, consider:let timeout: ReturnType<typeof setTimeout>;This keeps
clearTimeout(timeout)type-safe without changing behavior.docker/migration.Dockerfile (1)
2-17: Make working directory explicit before install/lint steps.This Dockerfile likely relies on the base image’s default
WORKDIRforpnpm installandnpx turbo. To avoid surprises if the base changes, consider moving theWORKDIRdeclaration up (e.g., right afterFROM) so all subsequent RUN steps execute in the intended project root.scripts/migration/docker-compose.yml (1)
3-8: Add health check for Redis dependency.Services depend on Redis but lack health checks, which may cause workers to start before Redis is ready to accept connections.
Apply this diff to add a health check:
redis: image: redis:7.2 container_name: redis + healthcheck: + test: ["CMD", "redis-cli", "ping"] + interval: 5s + timeout: 3s + retries: 5 ports: - "6379:6379"Then update service dependencies:
depends_on: - - redis + redis: + condition: service_healthyscripts/migration/src/database/IFirebaseDatabase.ts (2)
1-8: Consider extracting ProviderUserInfo type for reusability.The inline type for
providerUserInfocould be extracted as a separate type to improve reusability and type safety across the codebase.Apply this diff:
+export type ProviderUserInfo = { + providerId: string; + rawId: string; +}; + export type FirebaseUser = { localId: string; email?: string; - providerUserInfo: { - providerId: string; - rawId: string; - }[]; + providerUserInfo: ProviderUserInfo[]; };
10-13: Document error handling behavior for interface methods.The interface doesn't specify how implementations should handle errors (e.g., user not found in
findById, callback failures initerateAll). Adding JSDoc would clarify expectations.Add documentation:
+/** + * Interface for Firebase user database operations. + */ export interface IFirebaseDatabase { + /** + * Find a user by their ID. + * @param id - The user's localId + * @returns Promise resolving to the FirebaseUser + * @throws Error if user not found or database error occurs + */ findById(id: string): Promise<FirebaseUser>; + + /** + * Iterate over all users in the database. + * @param callback - Async function called for each user + * @returns Promise that resolves when iteration completes + * @throws Error if iteration fails; callback errors should be handled by caller + */ iterateAll(callback: (user: FirebaseUser) => Promise<void>): Promise<void>; }scripts/migration/package.json (1)
32-32: Pin TypeScript to a specific version.Using
"latest"for TypeScript can cause build inconsistencies and unexpected breaking changes across environments.Apply this diff to pin TypeScript:
- "typescript": "latest", + "typescript": "^5.7.0",scripts/migration/src/queue/MpcUserQueue.ts (1)
43-49: Consider validating user.localId before JWT creation.The code uses
job.data.user.localIdas the JWT subject without validation. IflocalIdis empty or undefined, it could produce an invalid JWT.Consider adding validation:
async _process(job: Bull.Job<JobParams>): Promise<void> { + if (!job.data.user.localId) { + throw new Error('user.localId is required'); + } const pk = createPrivateKey({ key: this.normalizePem(this.privateKeyPem), format: "pem", type: "pkcs1" });scripts/migration/src/queue/Queue.ts (1)
3-41: Base queue abstraction looks solid; consider making logging/concurrency configurableThe wrapper around
Bull.Queueis clean and error handling is appropriate. To keep this generic layer reusable, you might want to (optionally) inject a logger instead of usingconsoledirectly and expose concurrency or processor options viaprocess()/constructor so different queues can tune throughput without modifying the base class.scripts/migration/src/queue/SignQueue.ts (1)
28-39: Signing flow and handoff toRelayQueuelook consistentThe deserialize → sign → enqueue pipeline matches the serialization used in
MPCProvider.sign, so the base64 payloads should round‑trip correctly. If you ever need easier debugging, you might consider adding some lightweight logging (e.g., job id and maybe a shortened delegate action hash) before pushing torelayQueue, but functionally this looks good.scripts/migration/src/provider/MPCProvider.ts (1)
204-218: Consider adding a timeout and richer error context to MPC HTTP requests
request()currently does a barefetchand only checksres.status >= 300. For long‑running or stuck MPC endpoints, this risks hanging indefinitely and producing opaque failures.You might want to:
- wrap
fetchin the sametimeoutPromisepattern used inNearProvider, and- include status code and possibly response body snippet in the thrown error for easier diagnosis.
For example:
// pseudo-code const res = await timeoutPromise(fetch(...), 20_000); if (!res.ok) { const body = await res.text(); throw new Error(`MPC ${path} failed (${res.status}): ${body}`); }This would make migration failures much easier to understand and recover from.
scripts/migration/src/provider/NearProvider.ts (1)
78-92:_getAccountIdsByPublicKeyappears unused; consider removing or documenting fallback usageThere are two lookup methods:
_getAccountIdsByPublicKeyhitting/v1/kitwallet/publicKey/...(private, unused here).getAccountIdsByPublicKeyhitting/v0/public_key/...(public, used by callers).If
/v1/kitwalletis a legacy/fallback endpoint you plan to keep, it’d be good to document when to use it; otherwise, consider removing_getAccountIdsByPublicKeyto avoid dead code.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yamlis excluded by!**/pnpm-lock.yaml
📒 Files selected for processing (38)
.github/workflows/build-migration.yaml(1 hunks).github/workflows/pull-request.yml(1 hunks)apps/relayer/openapi-spec.json(1 hunks)apps/relayer/src/modules/relayer/relayer.controller.ts(2 hunks)apps/relayer/src/modules/relayer/relayer.service.ts(2 hunks)apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts(1 hunks)docker/migration.Dockerfile(1 hunks)pnpm-workspace.yaml(1 hunks)scripts/migration/.eslintrc.js(1 hunks)scripts/migration/.gitignore(1 hunks)scripts/migration/README.md(1 hunks)scripts/migration/config-example.json(1 hunks)scripts/migration/docker-compose.yml(1 hunks)scripts/migration/package.json(1 hunks)scripts/migration/src/cmd/Dashboard.ts(1 hunks)scripts/migration/src/cmd/Migrate.ts(1 hunks)scripts/migration/src/cmd/Work.ts(1 hunks)scripts/migration/src/cmd/bootstrapMigration.ts(1 hunks)scripts/migration/src/cmd/debug/CreateAccount.ts(1 hunks)scripts/migration/src/cmd/debug/ExportDatastore.ts(1 hunks)scripts/migration/src/config/migrationConfig.ts(1 hunks)scripts/migration/src/database/IFirebaseDatabase.ts(1 hunks)scripts/migration/src/database/LocalFirebaseDatabase.ts(1 hunks)scripts/migration/src/database/MpcDatabase.ts(1 hunks)scripts/migration/src/database/RemoteFirebaseDatabase.ts(1 hunks)scripts/migration/src/provider/FastAuthProvider.ts(1 hunks)scripts/migration/src/provider/MPCProvider.ts(1 hunks)scripts/migration/src/provider/NearProvider.ts(1 hunks)scripts/migration/src/provider/NearSigner.ts(1 hunks)scripts/migration/src/queue/AddAccessKeysQueue.ts(1 hunks)scripts/migration/src/queue/MpcUserQueue.ts(1 hunks)scripts/migration/src/queue/Queue.ts(1 hunks)scripts/migration/src/queue/RelayQueue.ts(1 hunks)scripts/migration/src/queue/SignQueue.ts(1 hunks)scripts/migration/src/utils/bytesJsonStringify.ts(1 hunks)scripts/migration/src/utils/firebase.ts(1 hunks)scripts/migration/src/utils/timeout-promise.ts(1 hunks)scripts/migration/tsconfig.json(1 hunks)
🧰 Additional context used
🧠 Learnings (13)
📚 Learning: 2025-09-01T14:25:59.510Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/test/mocks/wci/chains/nervos/providers/nervos-account-scanner.provider.mock.ts:1-6
Timestamp: 2025-09-01T14:25:59.510Z
Learning: In the Nervos blockchain implementation, scanning functionality has been separated from the account provider into a dedicated account scanner provider. The NervosScannedAccountAddressMock and NervosScannedAccountMock now belong in the nervos-account-scanner.provider.mock.ts file, not the account provider mock file.
Applied to files:
scripts/migration/src/cmd/debug/CreateAccount.tsscripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-19T14:39:32.188Z
Learnt from: ShinningPikachu
Repo: Peersyst/near-mobile PR: 582
File: apps/api/src/config/push-notification.config.ts:18-23
Timestamp: 2025-09-19T14:39:32.188Z
Learning: When users indicate they're keeping hard-coded credentials temporarily for testing purposes and plan to remove them later, acknowledge their decision and provide guidance on secure alternatives rather than being insistent about immediate removal.
Applied to files:
scripts/migration/src/cmd/debug/CreateAccount.ts
📚 Learning: 2025-04-25T08:26:35.531Z
Learnt from: JordiParraCrespo
Repo: Peersyst/near-mobile PR: 214
File: packages/frontend/account/src/domain/controllers/account.controller.ts:37-40
Timestamp: 2025-04-25T08:26:35.531Z
Learning: Error handling with try-catch blocks is not needed in the `signAndBroadcastTransaction` method of the `AccountController` class in the NEAR mobile codebase, as errors are likely handled at a higher level or allowed to propagate naturally through the promise chain.
Applied to files:
scripts/migration/src/cmd/debug/CreateAccount.tsapps/relayer/src/modules/relayer/relayer.service.ts
📚 Learning: 2025-09-12T07:12:41.650Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 77
File: packages/frontend/token/src/ui/queries/token.queries.ts:11-14
Timestamp: 2025-09-12T07:12:41.650Z
Learning: In packages/frontend/token/src/ui/queries/token.queries.ts, the detail function accepts a chain parameter but does not include it in the query key, which could lead to cache key collisions if currencyId is not globally unique across chains.
Applied to files:
scripts/migration/src/cmd/debug/CreateAccount.ts
📚 Learning: 2025-03-14T12:05:34.665Z
Learnt from: JordiParraCrespo
Repo: Peersyst/near-mobile PR: 93
File: packages/shared/blockchain/src/wci/chains/near/clients/staking/near-staking-client.ts:30-45
Timestamp: 2025-03-14T12:05:34.665Z
Learning: The error handling and BigInt validation approach suggested for NearStakingClient.getValidatorStakingBalance method was declined as it does not apply to this codebase's requirements.
Applied to files:
apps/relayer/src/modules/relayer/relayer.service.ts
📚 Learning: 2025-02-27T10:24:14.276Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 58
File: packages/shared/blockchain/core/src/signers/signer.ts:26-27
Timestamp: 2025-02-27T10:24:14.276Z
Learning: The Signer interface uses `Record<string, any>` for the transaction parameter in `signTransaction` method to allow implementing classes to add their own specific types extending this base type, providing flexibility for different blockchain implementations.
Applied to files:
scripts/migration/src/provider/NearSigner.ts
📚 Learning: 2025-02-27T10:23:16.083Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 58
File: packages/shared/blockchain/near/src/signers/near-private-key.signer.ts:7-10
Timestamp: 2025-02-27T10:23:16.083Z
Learning: KeyPair.fromString() from near-api-js already performs internal validation and throws an error if an invalid key is provided, making additional validation checks redundant. The type definition in the library is incorrect - it should accept string rather than KeyPairString.
Applied to files:
scripts/migration/src/queue/MpcUserQueue.ts
📚 Learning: 2025-08-27T15:08:02.153Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 544
File: apps/mobile/src/modules/npro/containers/npro-pre-staking-token/npro-pre-staking-token.tsx:19-22
Timestamp: 2025-08-27T15:08:02.153Z
Learning: The config system in this codebase uses strong TypeScript typing where useConfig("npro.stakingStartsAt") returns string (not string | undefined), guaranteeing the value will be present at runtime and eliminating the need for defensive guard clauses.
Applied to files:
scripts/migration/src/config/migrationConfig.ts
📚 Learning: 2025-09-10T13:23:58.048Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 571
File: packages/shared/api/src/index.ts:33-33
Timestamp: 2025-09-10T13:23:58.048Z
Learning: The packages/shared/api package exports source files directly (./src/index.ts) rather than compiled output because it's auto-generated by openapi-typescript-codegen and this export strategy is intentional for this shared package.
Applied to files:
scripts/migration/tsconfig.jsonscripts/migration/package.json
📚 Learning: 2025-04-07T12:08:24.795Z
Learnt from: obnol
Repo: Peersyst/near-mobile PR: 154
File: packages/frontend/auth/package.json:0-0
Timestamp: 2025-04-07T12:08:24.795Z
Learning: The BiometricsService is correctly re-exported in /packages/frontend/auth/src/data-access/services/index.ts with the line `export * from "./biometrics.service";`, validating the export path in package.json.
Applied to files:
scripts/migration/package.json
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface extends IAccountProvider, which defines getAddressFromPublicKey(publicKey: string): string. This method is properly inherited and implemented in NervosAccountProvider by calling getAddressAndScriptFromPublicKey and returning the first element of the tuple. Calls to nervosAccountProvider.getAddressFromPublicKey() are valid and correct.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface extends IAccountProvider, which defines getAddressFromPublicKey(publicKey: string): string. This method is properly inherited and implemented, so calls to nervosAccountProvider.getAddressFromPublicKey() are valid and correct.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface has both getAddressFromPublicKey(publicKey: string): string and getAddressAndScriptFromPublicKey(publicKey: string): [string, Script] methods available. The getAddressFromPublicKey method exists and is correctly implemented.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
🧬 Code graph analysis (10)
scripts/migration/src/cmd/Dashboard.ts (1)
scripts/migration/src/config/migrationConfig.ts (1)
initMigrationConfig(37-76)
scripts/migration/src/cmd/Migrate.ts (1)
scripts/migration/src/cmd/bootstrapMigration.ts (1)
bootstrapMigration(15-51)
scripts/migration/src/cmd/debug/CreateAccount.ts (1)
scripts/migration/src/provider/MPCProvider.ts (2)
MPCProvider(69-219)userCredentials(83-89)
scripts/migration/src/database/RemoteFirebaseDatabase.ts (3)
scripts/migration/src/database/IFirebaseDatabase.ts (2)
IFirebaseDatabase(10-13)FirebaseUser(1-8)scripts/migration/src/queue/Queue.ts (1)
process(30-41)apps/relayer/src/modules/near/near-client.service.ts (1)
provider(27-29)
apps/relayer/src/modules/relayer/relayer.controller.ts (3)
apps/relayer/src/modules/relayer/requests/create-account-atomic.request.ts (1)
CreateAccountAtomicRequest(1-6)scripts/migration/src/provider/MPCProvider.ts (1)
SignResponse(25-27)apps/relayer/src/modules/relayer/response/sign.response.ts (1)
SignResponse(3-6)
scripts/migration/src/cmd/Work.ts (2)
scripts/migration/src/cmd/bootstrapMigration.ts (1)
bootstrapMigration(15-51)scripts/migration/src/queue/Queue.ts (1)
process(30-41)
scripts/migration/src/queue/MpcUserQueue.ts (5)
scripts/migration/src/queue/AddAccessKeysQueue.ts (3)
QUEUE_NAME(10-10)JobParams(12-16)AddAccessKeysQueue(18-69)scripts/migration/src/database/IFirebaseDatabase.ts (1)
FirebaseUser(1-8)scripts/migration/src/provider/MPCProvider.ts (2)
MPCProvider(69-219)userCredentials(83-89)scripts/migration/src/provider/FastAuthProvider.ts (1)
FastAuthProvider(6-69)scripts/migration/src/utils/firebase.ts (1)
firebaseProviderToAuth0Provider(17-24)
scripts/migration/src/config/migrationConfig.ts (1)
packages/shared/cli/src/cli.mjs (1)
__dirname(12-12)
scripts/migration/src/queue/RelayQueue.ts (1)
scripts/migration/src/provider/NearProvider.ts (1)
NearProvider(23-208)
scripts/migration/src/provider/NearProvider.ts (2)
scripts/migration/src/provider/NearSigner.ts (1)
NearSigner(8-52)scripts/migration/src/utils/timeout-promise.ts (2)
timeoutPromise(13-24)TimeoutPromiseError(1-5)
🪛 actionlint (1.7.9)
.github/workflows/build-migration.yaml
47-47: the runner of "docker/build-push-action@v4" action is too old to run on GitHub Actions. update the action's version to fix this issue
(action)
🪛 Checkov (3.2.334)
apps/relayer/openapi-spec.json
[high] 1: Ensure that the global security field has rules defined
(CKV_OPENAPI_4)
[high] 1: Ensure that security operations is not empty.
(CKV_OPENAPI_5)
🔇 Additional comments (15)
pnpm-workspace.yaml (1)
11-11: LGTM!The addition of the migration package to the workspace is correctly configured.
scripts/migration/.gitignore (1)
1-2: LGTM!The gitignore patterns appropriately exclude generated data and configuration files that may contain sensitive information.
scripts/migration/.eslintrc.js (1)
1-10: LGTM!The ESLint configuration is appropriate for migration tooling, allowing console usage for CLI scripts and relaxing JSDoc requirements for internal utilities.
scripts/migration/src/utils/firebase.ts (1)
5-11: LGTM!The regex-based extraction logic correctly parses the Firebase ID from the internal account ID format, with appropriate error handling for invalid inputs.
scripts/migration/tsconfig.json (1)
1-8: Migration tsconfig wiring looks consistent with shared setup.Extends the shared package config, sets a local
outDir, and excludes build artifacts and Jest config — all reasonable for this package..github/workflows/pull-request.yml (1)
29-36: LGTM! Migration script build job configured correctly.The new
build-migration-scriptjob follows the same pattern asbuild-relayerand correctly depends onbuild-base. Thepush: truesetting enables the image to be pushed for integration testing.apps/relayer/src/modules/relayer/relayer.controller.ts (1)
32-37: Note: Existing createAccount endpoint also lacks validation.For consistency, the existing
createAccountendpoint (line 35) should also callvalidate(body)like the other endpoints. This appears to be a pre-existing issue but should be addressed to maintain a uniform validation strategy across all endpoints.scripts/migration/docker-compose.yml (1)
29-60: Workers do not require Google Cloud credentials.The worker services (mpc-user-worker, add-access-keys-worker, sign-worker, relay-worker) only invoke NEAR and MPC providers. They do not access any Google Cloud resources. The
MpcDatabaseclass, which is the only code that usesGOOGLE_CLOUD_CREDENTIALS_JSON, is commented out inbootstrapMigration.tsand therefore never instantiated by any service—migrate, workers, or otherwise. Workers will function correctly without this environment variable.Likely an incorrect or invalid review comment.
scripts/migration/src/provider/NearSigner.ts (1)
9-52: LGTM! Clean signing implementation. The NearSigner class correctly implements NEAR transaction signing (a rough sketch of the flow follows the list):
- Uses KeyPair.fromString which validates internally (per learnings)
- Follows the proper signing flow: encode → SHA-256 → sign with ed25519
- Returns hex-encoded public key as address (correct for implicit accounts)
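A minimal sketch of that flow, not the class itself: it assumes the transaction has already been Borsh-encoded and takes the encoded bytes as input, using near-api-js KeyPair and Node's crypto.

```typescript
import { KeyPair } from "near-api-js";
import { createHash } from "crypto";

// Hash the encoded transaction with SHA-256, then sign the hash with the ed25519 key.
export function signEncodedTransaction(encodedTx: Uint8Array, secretKey: string) {
    const keyPair = KeyPair.fromString(secretKey); // fromString validates the key internally
    const hash = createHash("sha256").update(encodedTx).digest();
    const { signature, publicKey } = keyPair.sign(hash);
    // Hex-encoding the public key yields the implicit account id used as the signer address.
    return { signature, implicitAccountId: Buffer.from(publicKey.data).toString("hex") };
}
```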
scripts/migration/src/database/LocalFirebaseDatabase.ts (1)
23-43: LGTM! Efficient streaming implementation.The
iterateAllmethod correctly uses streaming to handle large JSON exports without loading everything into memory. The finally block ensures proper cleanup even if the callback throws.scripts/migration/src/queue/RelayQueue.ts (1)
26-40: LGTM! Correct deserialization and error handling.The
_processmethod correctly:
- Deserializes the base64-encoded DelegateAction
- Relays the meta-transaction through NearProvider
- Throws with helpful context (hash + status) on failure
scripts/migration/src/database/RemoteFirebaseDatabase.ts (1)
15-25: LGTM! Correct mapping from Firebase UserRecord.The
findByIdmethod correctly maps Firebase Admin's UserRecord to the FirebaseUser type, properly handling optional email and provider data.scripts/migration/src/queue/AddAccessKeysQueue.ts (1)
31-68: LGTM! Correct nonce management and delegate action flow.The processing logic correctly:
- Fetches accounts associated with the old public key (including implicit account)
- Manages nonces incrementally for multiple key additions per account
- Skips accounts where keys don't exist or already have full access
- Serializes delegate actions properly for the signing queue
scripts/migration/src/provider/FastAuthProvider.ts (1)
33-48: AligngetPublicKey’s return type with the actual contract result
viewFunctionparses the raw result and returnsany, andgetPublicKeyjust forwards that value while declaringPromise<PublicKey>. In practice, the FastAuth/MPC contract will most likely return a string or JSON object rather than anear-api-jsPublicKeyinstance, so callers may get a mismatched type at runtime.Consider decoding explicitly and constructing a
PublicKey(or adjusting the return type if you really want a string):private async viewFunction(/* ... */): Promise<unknown> { this.validateArgs(args); const encodedArgs = bytesJsonStringify(args); const result = await this.connection.provider.query<CodeResult>({ /* ... */ }); return JSON.parse(Buffer.from(result.result).toString("utf8")); } async getPublicKey(provider: string, providerId: string): Promise<PublicKey> { const value = await this.viewFunction({ contractId: this.mpcContractId, methodName: "derived_public_key", args: { path: `jwt#https://${this.fastAuthDomain}/#${provider}|${providerId}`, predecessor: this.fastAuthContractId, domain_id: 1, }, }); const keyStr = typeof value === "string" ? value : (value as any).public_key; return PublicKey.from(keyStr as string); }This makes the contract→SDK boundary explicit and avoids surprises for downstream code.
Also applies to: 55-68
scripts/migration/src/provider/MPCProvider.ts (1)
57-67: Verify custom schemas andserialize.serializeusage against MPC/NEAR specsThe custom schemas
U8_32/U8_ARRAYand the calls like:const saltSerialized = serialize.serialize("u32", this.salt); const oidcSerialized = serialize.serialize(U8_32, new BytesBox(await this.hashToken(jwt)));are a bit unconventional for
near-api-js’sserializehelpers andborsh.Schema. It would be easy to accidentally drift from the format the MPC backend expects (e.g., if it expects a specific Borsh layout or a Map‑based schema).Please double‑check these schemas and
serialize.serializecall patterns against:
- the MPC service spec for
frpSignature*payloads, and- the exact version of
near-api-js/borshused in this repo.If the library provides more explicit helpers for fixed‑size arrays/u32s, using those directly would reduce ambiguity.
Also applies to: 165-197
```json
{
  "oldMpc": {
    "url": "http://localhost:3000",
    "tmpPrivateKey": "ed25519:MHz6wr...gRgFd",
    "salt": 3177899144,
    "bypassPrivateKeyPem": "-----BEGIN PRIVATE KEY-----\nMIIEvAI...IbiFCFzqBw0UQ==\n-----END PRIVATE KEY-----\n"
  },
  "near": {
    "relayerPrivateKey": "ed25519:28u4nNG...MoK",
    "networkId": "testnet",
    "nodeUrl": "https://rpc.testnet.near.org",
    "indexerUrl": "https://api-testnet.nearblocks.io"
  },
  "fastAuth": {
    "mpcContractId": "v1.signer-prod.testnet",
    "contractId": "fast-auth-beta-001.testnet",
    "domainUrl": "https://domain.com"
  }
}
```

No newline at end of file
Avoid committing real-looking private keys in example config.
This example includes what look like actual private keys / PEM material. Even if these are dummy or already-rotated, it’s safer to replace them with obvious placeholders (e.g. "ed25519:YOUR_TMP_PRIVATE_KEY_HERE", "-----BEGIN PRIVATE KEY-----\n...REDACTED...\n-----END PRIVATE KEY-----\n") and document that real values must come from secure config (env, secrets manager, non-committed file).
Keeping realistic secrets in-tree increases the risk of accidental reuse and complicates secret-scanning/compliance.
🤖 Prompt for AI Agents
In scripts/migration/config-example.json around lines 1 to 19, the example file
contains realistic-looking private keys and PEM material; replace those values
with clear placeholders (e.g. "ed25519:YOUR_TMP_PRIVATE_KEY_HERE", "-----BEGIN
PRIVATE KEY-----\n...REDACTED...\n-----END PRIVATE KEY-----\n") and add a short
comment or note in the file header indicating that real secrets must not be
committed and should be supplied via environment variables or a secrets
manager/non-committed config file; ensure no real key fragments remain and run
secret-scan to verify before committing.
```ts
async findById(_: string): Promise<FirebaseUser> {
    throw new Error("Method not implemented.");
}
```
Document or implement findById for interface completeness.
The findById method throws "Method not implemented" but is required by the IFirebaseDatabase interface. If local file lookups aren't supported, consider documenting this limitation in a comment. Otherwise, implement the method (possibly by caching during iteration or scanning the file).
Do you want me to generate an implementation that scans the export file to find a user by ID?
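For the record, a lookup that reuses the streaming iterator could be as small as the sketch below. The helper name is made up, the import path is assumed from this package's layout, and a full-file scan is O(n) per call, so callers doing many lookups would want caching.

```typescript
// Hypothetical standalone helper; could equally live inside LocalFirebaseDatabase.findById.
import { FirebaseUser, IFirebaseDatabase } from "./IFirebaseDatabase"; // path assumed

export async function findUserById(db: IFirebaseDatabase, id: string): Promise<FirebaseUser> {
    let found: FirebaseUser | undefined;
    await db.iterateAll(async (user) => {
        if (user.localId === id) {
            found = user; // keep the match; iterateAll still streams the rest of the file
        }
    });
    if (!found) {
        throw new Error(`User ${id} not found in local Firebase export`);
    }
    return found;
}
```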
```ts
constructor() {
    admin.initializeApp({
        credential: admin.credential.cert(JSON.parse(process.env.FIREBASE_SERVICE_ACCOUNT_JSON || "{}")),
    });
}
```
Validate FIREBASE_SERVICE_ACCOUNT_JSON before initialization.
The code falls back to "{}" if the environment variable is missing, which will cause Firebase initialization to fail silently or with a cryptic error later.
Apply this diff to fail early with a clear message:
```diff
 constructor() {
+    const serviceAccountJson = process.env.FIREBASE_SERVICE_ACCOUNT_JSON;
+    if (!serviceAccountJson) {
+      throw new Error('FIREBASE_SERVICE_ACCOUNT_JSON environment variable is required');
+    }
     admin.initializeApp({
-      credential: admin.credential.cert(JSON.parse(process.env.FIREBASE_SERVICE_ACCOUNT_JSON || "{}")),
+      credential: admin.credential.cert(JSON.parse(serviceAccountJson)),
     });
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
constructor() {
    const serviceAccountJson = process.env.FIREBASE_SERVICE_ACCOUNT_JSON;
    if (!serviceAccountJson) {
        throw new Error('FIREBASE_SERVICE_ACCOUNT_JSON environment variable is required');
    }
    admin.initializeApp({
        credential: admin.credential.cert(JSON.parse(serviceAccountJson)),
    });
}
```
🤖 Prompt for AI Agents
In scripts/migration/src/database/RemoteFirebaseDatabase.ts around lines 5-9,
don't fall back to "{}" when FIREBASE_SERVICE_ACCOUNT_JSON is missing or
invalid; instead check process.env.FIREBASE_SERVICE_ACCOUNT_JSON exists and is
non-empty, attempt JSON.parse inside a try/catch, and if missing or parsing
fails throw a clear, early Error indicating the env var is required and whether
parsing failed (include the parse error message); only call admin.initializeApp
with the parsed credential object after these validations succeed.
```ts
async iterateAll(_: (user: FirebaseUser) => Promise<void>): Promise<void> {
    throw new Error("Method not implemented");
}
```
Document or implement iterateAll for interface completeness.
The iterateAll method is not implemented but required by the IFirebaseDatabase interface. If remote iteration isn't supported, document this limitation. Otherwise, implement it using Firebase Admin's listUsers() API.
Do you want me to generate an implementation using admin.auth().listUsers() to iterate all remote users?
🤖 Prompt for AI Agents
In scripts/migration/src/database/RemoteFirebaseDatabase.ts around lines 31-33,
the iterateAll method is currently unimplemented; replace the throw with an
implementation that uses Firebase Admin's admin.auth().listUsers() API to page
through all users (using nextPageToken), invoking the provided async callback
for each FirebaseUser, await each callback (or batch/limit concurrency if
desired), handle and surface errors (reject the Promise on fatal errors) and
return when no more pages remain; ensure you import/admin client is available
and add brief JSDoc stating it paginates via listUsers().
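Sketched against the Firebase Admin SDK's listUsers(maxResults, pageToken) pagination, with the FirebaseUser mapping mirroring the findById implementation in this file; error handling is left to the caller.

```typescript
import * as admin from "firebase-admin";

// Shape assumed to match IFirebaseDatabase's FirebaseUser in this package.
type FirebaseUser = {
    localId: string;
    email?: string;
    providerUserInfo: { providerId: string; rawId: string }[];
};

// Page through all remote users and invoke the callback sequentially for each one.
export async function iterateAllRemoteUsers(
    callback: (user: FirebaseUser) => Promise<void>,
    pageSize = 1000,
): Promise<void> {
    let pageToken: string | undefined;
    do {
        const result = await admin.auth().listUsers(pageSize, pageToken);
        for (const record of result.users) {
            await callback({
                localId: record.uid,
                email: record.email,
                providerUserInfo: record.providerData.map((p) => ({
                    providerId: p.providerId,
                    rawId: p.uid,
                })),
            });
        }
        pageToken = result.pageToken; // undefined once the last page has been read
    } while (pageToken);
}
```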
```ts
private async signAndSendTransaction(
    receiverId: string,
    actions: Action[],
): Promise<{
    hash: string;
    result: FinalExecutionOutcome;
}> {
    const sender = this.relayerSigner.getAddress();
    const senderPublicKey = this.relayerSigner.getPublicKey();
    const senderPublicKeyString = senderPublicKey.toString();

    // Do not use Promise.all to ensure client service updates recent hash only once
    const nonce = (await this.getAccountNonce(sender, senderPublicKeyString)) + 1n;
    const recentBlockHash = await this.getRecentBlockHash();

    const tx = createTransaction(sender, senderPublicKey, receiverId, nonce, actions, base_decode(recentBlockHash));

    const { signedTransaction, hash } = this.relayerSigner.signTransaction(tx);

    console.log(`Sending transaction ${hash}`);
    const result = await this.near.connection.provider.sendTransactionUntil(signedTransaction, "FINAL");

    console.log(`Sign to ${receiverId} done with hash ${hash}`);

    return { hash, result };
}
```
🧩 Analysis chain
🏁 Script executed:

```shell
# Find the NearProvider file and understand its usage
find . -name "NearProvider.ts" -type f
```

Length of output: 110

🏁 Script executed:

```shell
# Search for imports and usage of NearProvider
rg "NearProvider" --type ts --type tsx -B 2 -A 2
```

Length of output: 89

🏁 Script executed:

```shell
# Check if signAndSendTransaction is called in concurrent scenarios
rg "signAndSendTransaction" --type ts --type tsx -B 3 -A 3
```

Length of output: 89

🏁 Script executed:

```shell
# Look at the broader context of the migration script
git ls-files "scripts/migration/src" | head -20
```

Length of output: 981

🏁 Script executed:

```shell
# Check if there's any error handling or retry logic in the file
sed -n '1,300p' scripts/migration/src/provider/NearProvider.ts | tail -100
```

Length of output: 3299

🏁 Script executed:

```shell
# Check how signAndSendTransaction is used via relayMetaTransaction
rg "relayMetaTransaction" --type ts -B 2 -A 2
```

Length of output: 837

🏁 Script executed:

```shell
# Examine the RelayQueue to see if it uses NearProvider
cat -n scripts/migration/src/queue/RelayQueue.ts
```

Length of output: 1567

🏁 Script executed:

```shell
# Check Work.ts to understand the execution model
cat -n scripts/migration/src/cmd/Work.ts
```

Length of output: 1126

🏁 Script executed:

```shell
# Check if there are multiple process/worker instances
rg "fork\|spawn\|worker\|cluster" scripts/migration/src --type ts
```

Length of output: 44
Be aware of potential nonce races if multiple relayer workers process jobs concurrently
signAndSendTransaction derives the nonce as:
```ts
const nonce = (await this.getAccountNonce(sender, senderPublicKeyString)) + 1n;
```

This is fine as long as there's only a single writer for the relayer key. However, Work.ts supports multiple concurrent relay queue workers, and each job calling relayMetaTransaction → signAndSendTransaction will derive nonces independently. If multiple workers process jobs simultaneously, they can compute the same nonce and cause "invalid nonce" or Transaction <...> already processed errors.
Consider:
- centralizing nonce management (e.g., via a single relayer service or distributed lock), or
- adding retry logic around sendTransactionUntil for nonce-related failures (a rough sketch follows the AI-agent prompt below).
🤖 Prompt for AI Agents
In scripts/migration/src/provider/NearProvider.ts around lines 160 to 185,
signAndSendTransaction computes nonce once and can cause nonce races when
multiple relayer workers run concurrently; fix by adding nonce-management or
retry logic: implement a short retry loop around signing+send that catches
nonce-related failures (invalid nonce, transaction already processed) and on
failure re-fetches the current account nonce (getAccountNonce), increments it,
re-creates/signs the transaction and retries with exponential backoff (limit
attempts), or alternatively replace per-call nonce derivation with a centralized
nonce allocator (single process or a simple atomic counter persisted/shared) so
only one writer increments nonces; ensure retries only trigger for nonce-related
errors and other errors are propagated.
```ts
} catch (e) {
    if (e?.toString().includes("Can't complete the action because access key")) {
        this.log(`skipping account ${account}: access key ${job.data.oldPublicKey} not found`, "warn");
        continue;
    } else throw e;
}
```
🧩 Analysis chain
🏁 Script executed:

```sh
# Check for near-api-js imports and error handling patterns
rg -t ts "near-api-js|error|Exception" scripts/migration/src/queue/AddAccessKeysQueue.ts -B 2 -A 2
```

Length of output: 493

🏁 Script executed:

```sh
# Look for error handling patterns in the entire codebase
rg -t ts "catch.*e\?" scripts/migration/src/ -A 3 | head -50
```

Length of output: 44

🏁 Script executed:

```sh
# Check if there are custom error types defined
fd -t ts "error|exception" scripts/migration/src/ --exec grep -l "class.*Error\|interface.*Error" {} \;
```

Length of output: 231

🏁 Script executed:

```sh
# Look for near-api-js imports and how they're used
rg -t ts "from.*near-api-js|import.*near" scripts/migration/src/ -B 1 -A 1
```

Length of output: 5135

🏁 Script executed:

```sh
# Get more context around the error handling in AddAccessKeysQueue.ts
sed -n '30,55p' scripts/migration/src/queue/AddAccessKeysQueue.ts
```

Length of output: 1546

🏁 Script executed:

```sh
# Check NearProvider for error handling patterns
grep -n "catch\|throw\|Error" scripts/migration/src/provider/NearProvider.ts -B 2 -A 3 | head -60
```

Length of output: 1024

🏁 Script executed:

```sh
# Search for the exact error string to see if it's documented or common
rg "Can't complete the action" scripts/migration/
```

Length of output: 205

🏁 Script executed:

```sh
# Check near-api-js package.json to understand versions and error handling
find . -name "package*.json" -exec grep -l "near-api-js" {} \;
```

Length of output: 253

🏁 Script executed:

```sh
# Look for error handling in getAccountNonce or related methods
grep -n "getAccountNonce\|getNonce\|getAccessKey" scripts/migration/src/provider/NearProvider.ts -B 2 -A 10
```

Length of output: 1871

🏁 Script executed:

```sh
# Check the package.json for near-api-js version
cat scripts/migration/package.json | grep -A 5 -B 5 "near-api-js"
```

Length of output: 364

🏁 Script executed:

```sh
# Look for any error type definitions or status codes
rg "status|code|type" scripts/migration/src/provider/NearProvider.ts -B 2 -A 2 | head -40
```

Length of output: 1073

🏁 Script executed:

```sh
# One more check: look for any custom error handling utilities in the migration scripts
find scripts/migration/src -type f -name "*.ts" -exec grep -l "Error\|Exception" {} \;
```

Length of output: 481
Error handling relies on fragile string matching that may break with NEAR RPC updates.
The code checks for missing access keys by matching the error message string "Can't complete the action because access key" from the RPC provider. The near-api-js library does not expose structured error types or codes for such failures, making this string-based approach the current workaround. However, this pattern is fragile—if NEAR changes its error message format in a future RPC update, this check will silently fail and the error will propagate instead of being gracefully skipped.
Consider implementing a wrapper around the RPC call that could:
- Log and inspect the full error object to identify stable error attributes
- Monitor NEAR release notes for message format changes
- Switch to structured error handling if the `near-api-js` or `@near-js` packages introduce error type definitions (a minimal helper along these lines is sketched below)
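To make the wrapper idea concrete, a helper could prefer structured fields and only then fall back to a cautious message match. This is a sketch under stated assumptions: the structured field names and the `ACCESS_KEY_DOES_NOT_EXIST` cause string are guesses about how the RPC error surfaces, not a documented near-api-js contract, and should be confirmed against the errors the helper logs.

```ts
// Sketch of an error-normalizing helper (assumed error shape, not part of the PR).
type RpcErrorLike = {
    type?: string;
    cause?: { name?: string };
    message?: string;
};

export function isMissingAccessKeyError(error: unknown, log: (msg: string) => void): boolean {
    const e = error as RpcErrorLike;

    // Log the full error once so stable attributes can be identified over time.
    log(`RPC error while adding access keys: ${error instanceof Error ? error.stack ?? error.message : JSON.stringify(error)}`);

    // Structured checks first; both the field names and the cause string are assumptions.
    if (e?.cause?.name === "ACCESS_KEY_DOES_NOT_EXIST") return true;
    if (typeof e?.type === "string" && e.type.includes("AccessKeyDoesNotExist")) return true;

    // Cautious fallback on the current human-readable message.
    const message = typeof e?.message === "string" ? e.message : String(error);
    return /can't complete the action because access key/i.test(message);
}
```

The catch block in `AddAccessKeysQueue` could then call this helper, `continue` when it returns true, and rethrow otherwise, instead of matching `e?.toString()` inline.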
🤖 Prompt for AI Agents
In scripts/migration/src/queue/AddAccessKeysQueue.ts around lines 40 to 45, the
current catch block relies on fragile string matching of e.toString() to detect
a missing access key; replace this with a small wrapper/helper that normalizes
RPC errors and detects "missing access key" using stable checks (inspect e.type,
e.name, e.kind, e.code and nested properties) and only fall back to a cautious
regex match when none exist; update the catch to call that helper, log the
entire error object (stack/serialized form) at debug/error level for monitoring,
and return/skips the account when the helper reports a missing-access-key
condition, otherwise rethrow the original error.
Actionable comments posted: 2
♻️ Duplicate comments (2)
scripts/migration/src/provider/NearProvider.ts (1)
158-183: Be aware of potential nonce races if multiple relayer workers process jobs concurrently.

The nonce derivation logic at line 170:

```ts
const nonce = (await this.getAccountNonce(sender, senderPublicKeyString)) + 1n;
```

This is safe only with a single writer. However, if multiple relay queue workers process jobs concurrently (as supported by `Work.ts`), they can independently compute the same nonce, leading to "invalid nonce" or "transaction already processed" errors. Consider centralizing nonce management (e.g., via a single relayer service or distributed lock), or adding retry logic around `sendTransactionUntil` for nonce-related failures.

scripts/migration/src/config/migrationConfig.ts (1)
37-74: Remove hardcoded private keys and make config merging safer.

This function currently embeds three private keys directly in source (`bypassJwt.privateKeyPem`, `oldMpc.tmpPrivateKey`, `near.relayerPrivateKey`) and then shallow-merges `config` on top. That has two problems:

- Security (critical): Committing real or placeholder private keys is a hard "no" — they will be scanned by tools and risk leaking in logs, PRs, etc. Secrets should come from env/config at runtime, not from the repo. This was already flagged in a previous review and still needs to be addressed here.
- Brittle merge: `...config` at the top level means a partial `config.oldMpc` / `config.near` / etc. can overwrite the whole default object, leaving required properties `undefined` at runtime even though the type says `string`/`number`.

I'd strongly recommend:

- Parse `config.json` into `Partial<MigrationConfig>`.
- Prefer environment variables over config.json for secrets.
- Require all three secrets at startup and throw a clear error if missing.
- Build the final `MigrationConfig` by field, using `??` to fall back to defaults so partial config sections cannot drop required values.
- Keep `config.json` out of version control and rely on a secret manager to inject env vars in prod.

Concrete suggestion:
```diff
-export const initMigrationConfig = (): MigrationConfig => {
-    let config = {};
-    try {
-        const configFile = fs.readFileSync("config.json", "utf8");
-        config = JSON.parse(configFile.toString());
-    } catch (_) {
-        console.warn("Could not read config, using default");
-    }
-    return {
-        firebase: {
-            file: path.join(__dirname, "../../data/firebase-export.json"),
-        },
-        bypassJwt: {
-            privateKeyPem: "-----BEGIN PRIVATE KEY-----\nMIIEvAI...IbiFCFzqBw0UQ==\n-----END PRIVATE KEY-----\n",
-            jwtIssuer: "https://securetoken.google.com/near-mobile-99e38",
-            jwtAudience: "near-mobile-99e38",
-        },
-        oldMpc: {
-            url: "https://mpc.mainnet.sig.network",
-            tmpPrivateKey: "ed25519:MHz6wrZAegmeCXT6frJCJLYtGy39pz48JFeu2R1zKpeS74PVbaCXnZ1h8cCNzt9yyr2rq9DFswJWx7gWD5gRgFd",
-            salt: 3177899144,
-        },
-        near: {
-            relayerPrivateKey: "ed25519:MHz6wrZAegmeCXT6frJCJLYtGy39pz48JFeu2R1zKpeS74PVbaCXnZ1h8cCNzt9yyr2rq9DFswJWx7gWD5gRgFd",
-            networkId: "mainnet",
-            nodeUrl: "https://rpc.near.org",
-            indexerUrl: "https://api.nearblocks.io",
-        },
-        fastAuth: {
-            mpcContractId: "v1.signer",
-            contractId: "fast-auth.near",
-            domainUrl: "https://domain.com",
-        },
-        queue: {
-            redis: "redis://localhost",
-        },
-        ...config,
-    };
-};
+export const initMigrationConfig = (): MigrationConfig => {
+    let config: Partial<MigrationConfig> = {};
+    try {
+        const configFile = fs.readFileSync("config.json", "utf8");
+        config = JSON.parse(configFile.toString());
+    } catch (error) {
+        console.warn("Could not read migration config.json, falling back to environment/defaults", error);
+    }
+
+    // Prefer environment variables over checked-in config for secrets
+    const bypassJwtPrivateKeyPem = process.env.BYPASS_JWT_PRIVATE_KEY_PEM ?? config.bypassJwt?.privateKeyPem;
+    const oldMpcTmpPrivateKey = process.env.MPC_TMP_PRIVATE_KEY ?? config.oldMpc?.tmpPrivateKey;
+    const nearRelayerPrivateKey = process.env.NEAR_RELAYER_PRIVATE_KEY ?? config.near?.relayerPrivateKey;
+
+    if (!bypassJwtPrivateKeyPem) {
+        throw new Error("bypassJwt.privateKeyPem must be provided via BYPASS_JWT_PRIVATE_KEY_PEM or config.json");
+    }
+    if (!oldMpcTmpPrivateKey) {
+        throw new Error("oldMpc.tmpPrivateKey must be provided via MPC_TMP_PRIVATE_KEY or config.json");
+    }
+    if (!nearRelayerPrivateKey) {
+        throw new Error("near.relayerPrivateKey must be provided via NEAR_RELAYER_PRIVATE_KEY or config.json");
+    }
+
+    return {
+        firebase: {
+            file: config.firebase?.file ?? path.join(__dirname, "../../data/firebase-export.json"),
+        },
+        bypassJwt: {
+            privateKeyPem: bypassJwtPrivateKeyPem,
+            jwtIssuer: config.bypassJwt?.jwtIssuer ?? "https://securetoken.google.com/near-mobile-99e38",
+            jwtAudience: config.bypassJwt?.jwtAudience ?? "near-mobile-99e38",
+        },
+        oldMpc: {
+            url: config.oldMpc?.url ?? "https://mpc.mainnet.sig.network",
+            tmpPrivateKey: oldMpcTmpPrivateKey,
+            salt: config.oldMpc?.salt ?? 3177899144,
+        },
+        near: {
+            relayerPrivateKey: nearRelayerPrivateKey,
+            networkId: config.near?.networkId ?? "mainnet",
+            nodeUrl: config.near?.nodeUrl ?? "https://rpc.near.org",
+            indexerUrl: config.near?.indexerUrl ?? "https://api.nearblocks.io",
+        },
+        fastAuth: {
+            mpcContractId: config.fastAuth?.mpcContractId ?? "v1.signer",
+            contractId: config.fastAuth?.contractId ?? "fast-auth.near",
+            domainUrl: config.fastAuth?.domainUrl ?? "https://domain.com",
+        },
+        queue: {
+            redis: config.queue?.redis ?? "redis://localhost",
+        },
+    };
+};
```

This keeps the nice, strongly typed `MigrationConfig` while avoiding secret leakage and making the config behavior predictable even with partial overrides. Based on learnings, this also lines up better with the strongly-typed config approach used elsewhere in your stack.
🧹 Nitpick comments (2)
scripts/migration/src/provider/NearProvider.ts (2)
24-24: Consider making the request timeout configurable.

The `maxRequestTimeout` is hardcoded to 20 seconds. For production flexibility, consider accepting it as an optional constructor parameter with a default value.

```diff
 export class NearProvider {
-    private readonly maxRequestTimeout: number = 20_000;
+    private readonly maxRequestTimeout: number;
     private readonly relayerSigner: NearSigner;

     constructor(
         private readonly indexerUrl: string,
         private readonly near: Near,
         relayerPrivateKey: string,
+        maxRequestTimeout: number = 20_000,
     ) {
+        this.maxRequestTimeout = maxRequestTimeout;
         this.relayerSigner = new NearSigner(relayerPrivateKey);
     }
```
177-180: Replace console.log with a proper logger.

Production code should use a structured logging library instead of `console.log` for better observability, log levels, and consistency (a possible pino wiring is sketched below).

```diff
-        console.log(`Sending transaction ${hash}`);
+        // TODO: Replace with proper logger (e.g., winston, pino)
+        // logger.info(`Sending transaction ${hash}`);
         const result = await this.near.connection.provider.sendTransactionUntil(signedTransaction, "FINAL");

-        console.log(`Sign to ${receiverId} done with hash ${hash}`);
+        // logger.info(`Sign to ${receiverId} done with hash ${hash}`);
```
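If pino is the choice, the wiring could be as small as the sketch below. The module path and the `service` field are illustrative assumptions; only `pino()` and the leveled log methods are standard API.

```ts
// Hypothetical scripts/migration/src/utils/logger.ts (illustrative, not part of the PR).
import pino from "pino";

export const logger = pino({
    level: process.env.LOG_LEVEL ?? "info",
    base: { service: "fast-auth-migration" }, // attached to every log line
});

// NearProvider could then emit structured events instead of console.log, e.g.:
// logger.info({ hash, receiverId }, "sending transaction");
// logger.info({ hash, receiverId }, "sign done");
```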
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- scripts/migration/.dockerignore (1 hunks)
- scripts/migration/src/config/migrationConfig.ts (1 hunks)
- scripts/migration/src/provider/NearProvider.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- scripts/migration/.dockerignore
🧰 Additional context used
🧠 Learnings (11)
📚 Learning: 2025-08-27T15:08:02.153Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 544
File: apps/mobile/src/modules/npro/containers/npro-pre-staking-token/npro-pre-staking-token.tsx:19-22
Timestamp: 2025-08-27T15:08:02.153Z
Learning: The config system in this codebase uses strong TypeScript typing where useConfig("npro.stakingStartsAt") returns string (not string | undefined), guaranteeing the value will be present at runtime and eliminating the need for defensive guard clauses.
Applied to files:
scripts/migration/src/config/migrationConfig.ts
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface extends IAccountProvider, which defines getAddressFromPublicKey(publicKey: string): string. This method is properly inherited and implemented in NervosAccountProvider by calling getAddressAndScriptFromPublicKey and returning the first element of the tuple. Calls to nervosAccountProvider.getAddressFromPublicKey() are valid and correct.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface extends IAccountProvider, which defines getAddressFromPublicKey(publicKey: string): string. This method is properly inherited and implemented, so calls to nervosAccountProvider.getAddressFromPublicKey() are valid and correct.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-01T14:25:59.510Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/test/mocks/wci/chains/nervos/providers/nervos-account-scanner.provider.mock.ts:1-6
Timestamp: 2025-09-01T14:25:59.510Z
Learning: In the Nervos blockchain implementation, scanning functionality has been separated from the account provider into a dedicated account scanner provider. The NervosScannedAccountAddressMock and NervosScannedAccountMock now belong in the nervos-account-scanner.provider.mock.ts file, not the account provider mock file.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-01T14:25:11.087Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 63
File: packages/shared/blockchain/src/wci/chains/nervos/providers/account-scanner/nervos-account-scanner.provider.ts:75-77
Timestamp: 2025-09-01T14:25:11.087Z
Learning: The INervosAccountProvider interface has both getAddressFromPublicKey(publicKey: string): string and getAddressAndScriptFromPublicKey(publicKey: string): [string, Script] methods available. The getAddressFromPublicKey method exists and is correctly implemented.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-04-25T08:26:35.531Z
Learnt from: JordiParraCrespo
Repo: Peersyst/near-mobile PR: 214
File: packages/frontend/account/src/domain/controllers/account.controller.ts:37-40
Timestamp: 2025-04-25T08:26:35.531Z
Learning: Error handling with try-catch blocks is not needed in the `signAndBroadcastTransaction` method of the `AccountController` class in the NEAR mobile codebase, as errors are likely handled at a higher level or allowed to propagate naturally through the promise chain.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-05T08:27:36.172Z
Learnt from: ShinningPikachu
Repo: Peersyst/near-mobile PR: 558
File: apps/npro-distributor/src/modules/near/near.service.ts:261-314
Timestamp: 2025-09-05T08:27:36.172Z
Learning: The distributeNproToDelegators method in apps/npro-distributor/src/modules/near/near.service.ts processes epochs sequentially within a single execution, but race conditions can occur between concurrent method executions triggered by multiple HTTP requests to the POST /near/npro/distribute endpoint.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-12T07:12:41.650Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 77
File: packages/frontend/token/src/ui/queries/token.queries.ts:11-14
Timestamp: 2025-09-12T07:12:41.650Z
Learning: In packages/frontend/token/src/ui/queries/token.queries.ts, the detail function accepts a chain parameter but does not include it in the query key, which could lead to cache key collisions if currencyId is not globally unique across chains.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-10T13:20:00.983Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 571
File: apps/npro-distributor/src/modules/near/near.service.ts:59-89
Timestamp: 2025-09-10T13:20:00.983Z
Learning: The findNewEpochsToProcess method in apps/npro-distributor/src/modules/near/near.service.ts is designed with a precondition (pre) that the processed_epoch table should not be modified while the method is running, indicating it should only be called in single-threaded environments where concurrency control is handled at the caller level.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-08-29T14:31:52.266Z
Learnt from: AgustinMJ
Repo: Peersyst/near-mobile PR: 552
File: apps/mobile/src/modules/token/screens/npro-pre-staking-token-details/screens/npro-pre-staking-token-details-general/containers/npro-pre-staking-token-details-actions/npro-pre-staking-token-details-acitons.tsx:26-31
Timestamp: 2025-08-29T14:31:52.266Z
Learning: In the NEAR mobile app, setTimeout(250) in navigation functions like handleEarnNproPress (apps/mobile/src/modules/token/screens/npro-pre-staking-token-details/screens/npro-pre-staking-token-details-general/containers/npro-pre-staking-token-details-actions/npro-pre-staking-token-details-acitons.tsx) is intentionally used to make navigation transitions smoother, not a code issue that needs fixing.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
📚 Learning: 2025-09-18T13:10:13.778Z
Learnt from: JordiParraCrespo
Repo: Peersyst/bitcoin-light PR: 85
File: packages/shared/blockchain/src/wci/chains/nervos/providers/dao/nervos-dao.provider.ts:1-1
Timestamp: 2025-09-18T13:10:13.778Z
Learning: In the nervos-dao.provider.ts file in the Bitcoin Light codebase, BigNumber is used on line 117 but needs to be explicitly imported, even though it may be available globally elsewhere in the codebase.
Applied to files:
scripts/migration/src/provider/NearProvider.ts
🧬 Code graph analysis (2)
scripts/migration/src/config/migrationConfig.ts (2)
apps/relayer/webpack.config.js (1)
path (1-1)
packages/shared/cli/src/cli.mjs (1)
__dirname (12-12)
scripts/migration/src/provider/NearProvider.ts (2)
scripts/migration/src/provider/NearSigner.ts (1)
NearSigner (8-52)
scripts/migration/src/utils/timeout-promise.ts (2)
timeoutPromise (13-24)
TimeoutPromiseError (1-5)
```ts
async _getAccountIdsByPublicKey(publicKey: string): Promise<string[]> {
    const accountIds = await this.fetchJson<string[]>(`/v1/kitwallet/publicKey/${publicKey}/accounts`);
    return accountIds.map(this.parseAccountId);
}
```
🧩 Analysis chain
🏁 Script executed:
rg "_getAccountIdsByPublicKey" -t ts -C 3Repository: Peersyst/fast-auth
Length of output: 671
Remove the _getAccountIdsByPublicKey method—it is unused dead code.
This private method uses a deprecated endpoint (/v1/kitwallet) and has no references anywhere in the codebase. The public getAccountIdsByPublicKey method (lines 87-90) provides the current implementation with the updated endpoint (/v0/public_key).
🤖 Prompt for AI Agents
In scripts/migration/src/provider/NearProvider.ts around lines 78 to 81, remove
the unused private method _getAccountIdsByPublicKey which calls the deprecated
/v1/kitwallet endpoint; delete the entire method declaration and any associated
JSDoc/comments, then run lint/tests to ensure no remaining references and update
exports or imports if they become unused.
```ts
    return BigInt(status.sync_info.latest_block_height + 1_000);
}

// @ts-ignore
```
🛠️ Refactor suggestion | 🟠 Major
Add explanation for type suppression or fix the underlying type issue.
The @ts-ignore directive suppresses TypeScript errors without explanation. This can hide legitimate type issues and makes maintenance harder.
Please either:
- Add a comment explaining why the type suppression is necessary (see the sketch after this list), or
- Fix the underlying type issue that requires the suppression
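Either route can stay small. The sketch below shows both: a reasoned `@ts-expect-error` (which, unlike `@ts-ignore`, fails the build once the underlying error disappears) and a narrow local type that removes the need for suppression. The field names mirror the excerpt above; the helper name and the exact compiler message are assumptions for illustration.

```ts
// Option A: keep the suppression but make it self-documenting and self-expiring, e.g.
//   // @ts-expect-error -- <paste the actual tsc error here>; remove once the typings cover this field
//   return BigInt(status.sync_info.latest_block_height + 1_000);
// (@ts-expect-error, unlike @ts-ignore, is flagged as unused once the error goes away.)

// Option B: model just the fields that are used, so no suppression is needed.
interface StatusWithSyncInfo {
    sync_info: { latest_block_height: number };
}

// Hypothetical helper name, shown standalone for clarity.
export function nonceCeilingFromStatus(status: unknown): bigint {
    const { sync_info } = status as StatusWithSyncInfo;
    return BigInt(sync_info.latest_block_height + 1_000);
}
```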
🤖 Prompt for AI Agents
In scripts/migration/src/provider/NearProvider.ts around line 115, there is a
naked "// @ts-ignore" that suppresses TypeScript errors without explanation;
either remove it by fixing the underlying type mismatch (adjust
types/interfaces, add proper type assertions, or refine function signatures so
the expression is correctly typed) or keep it but replace it with a short
explanatory comment above the directive describing exactly which TypeScript
error is being suppressed and why (include the tsc/error message ID or text and
why a safe ignore is necessary) so future maintainers understand the rationale.