
Commit 9428137

refactor: remove unnecessary transactionAsync calls from NoteStore and PrivateEventStore
First of a series of PRs aiming to reduce the usage of `transactionAsync` in PXE to a minimum, if not remove it completely. This one removes a `transactionAsync` call from `NoteStore#applyNullifiers` and `PrivateEventStore#getPrivateEvents`.

It is safe to do so in `applyNullifiers` because it doesn't write to the DB: it only writes to the in-memory staged slice that the store keeps for the duration of a job. It is also safe in `getPrivateEvents` because it doesn't write to the DB either: it only reads committed events. Writes from both stores are issued on `commit`, which is still done in a `transactionAsync` wrapper (note that this particular case will probably remain one of the few legitimate usages of `transactionAsync`).

# Why we need to hunt down `transactionAsync`

The issue with `transactionAsync` is that it works as intended with LMDB stores, but it breaks assumptions in IndexedDB. Browsers are free to auto-commit an IndexedDB transaction whenever a microtask resolves without adding a read or write request to it. In practice, that browser behavior makes it extremely hard to guarantee that whatever runs inside a `transactionAsync` wrapper will not be auto-committed. `transactionAsync` is designed to route all operations across all store collections through the same transactional context. We have to assume that whenever any code inside a `transactionAsync` resolves a promise without generating more work for the DB, the browser can and will commit the transaction, resulting in a `TransactionInactiveError` like the one in the screenshot.

This is especially tricky to deal with because, while aztec-packages has a very thorough CI test harness, 99% of it runs on LMDB on Node, and the issues discussed here affect only browser environments. I took some time to evaluate libs that simulate IndexedDB, but none seem reliable enough. That means the only way to enhance coverage and regression testing of this is to add Playwright headless browser tests, which are slow and heavy.

<img width="1172" height="258" alt="Screenshot 2026-01-27 at 13 15 21 (1)" src="https://github.com/user-attachments/assets/af093110-9a09-4f79-bbde-5c255d09d7d8" />
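To make the failure mode concrete, here is a minimal sketch simulating the auto-commit behavior described above. The `FakeTransaction` class is invented for illustration and is not the real IndexedDB or PXE API: like a browser transaction, it auto-commits at the next event-loop checkpoint once it has no pending requests.

```typescript
// Invented stand-in for an IndexedDB transaction (not a real API): it stays
// alive while requests are chained back-to-back, and auto-commits at the next
// event-loop checkpoint once no request is pending.
class FakeTransaction {
  active = true;
  #pending = 0;

  request(value: string): Promise<string> {
    if (!this.active) {
      return Promise.reject(new Error('TransactionInactiveError (simulated)'));
    }
    this.#pending += 1;
    return Promise.resolve(value).finally(() => {
      this.#pending -= 1;
      // Auto-commit at the next checkpoint unless more requests were queued.
      setTimeout(() => {
        if (this.#pending === 0) {
          this.active = false;
        }
      }, 0);
    });
  }
}

async function demo(): Promise<string> {
  const tx = new FakeTransaction();
  try {
    // Requests chained back-to-back keep the transaction alive...
    await tx.request('read 1');
    await tx.request('read 2');
    // ...but awaiting anything that is NOT a request (a timer here; in PXE it
    // could be proving, hashing, or a network call) lets the checkpoint pass.
    await new Promise(resolve => setTimeout(resolve, 5));
    await tx.request('read 3'); // too late: the transaction auto-committed
    return 'committed';
  } catch (err) {
    return (err as Error).message;
  }
}
```

Any `await` on non-DB work inside such a transaction opens the same window, which is why code wrapped in `transactionAsync` is so fragile in browsers.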
1 parent 97643a6 commit 9428137


2 files changed: +85 −92 lines


yarn-project/pxe/src/storage/note_store/note_store.ts

Lines changed: 28 additions & 33 deletions

```diff
@@ -18,8 +18,6 @@ import { StoredNote } from './stored_note.js';
 export class NoteStore implements StagedStore {
   readonly storeName: string = 'note';
 
-  #store: AztecAsyncKVStore;
-
   // Note that we use the siloedNullifier as the note id in the store as it's guaranteed to be unique.
 
   // Main storage for notes. Avoid performing full scans on it as it contains all notes PXE knows, use
@@ -48,7 +46,6 @@ export class NoteStore implements StagedStore {
   #jobLocks: Map<string, Semaphore>;
 
   constructor(store: AztecAsyncKVStore) {
-    this.#store = store;
     this.#notes = store.openMap('notes');
     this.#nullifiersByContractAddress = store.openMultiMap('note_nullifiers_by_contract');
     this.#nullifiersByNullificationBlockNumber = store.openMultiMap('note_block_number_to_nullifier');
@@ -182,43 +179,41 @@ export class NoteStore implements StagedStore {
    * @throws Error if any nullifier is not found in this notes store
    */
   applyNullifiers(nullifiers: DataInBlock<Fr>[], jobId: string): Promise<NoteDao[]> {
-    return this.#withJobLock(jobId, () =>
-      this.#store.transactionAsync(async () => {
-        if (nullifiers.length === 0) {
-          return [];
-        }
-
-        const notesToNullify = await Promise.all(
-          nullifiers.map(async nullifierInBlock => {
-            const nullifier = nullifierInBlock.data.toString();
-
-            const storedNote = await this.#readNote(nullifier, jobId);
-            if (!storedNote) {
-              throw new Error(`Attempted to mark a note as nullified which does not exist in PXE DB`);
-            }
-
-            return { storedNote: await this.#readNote(nullifier, jobId), blockNumber: nullifierInBlock.l2BlockNumber };
-          }),
-        );
-
-        const notesNullifiedInThisCall: Map<string, NoteDao> = new Map();
-        for (const noteToNullify of notesToNullify) {
-          // Safe to coerce (!) because we throw if we find any undefined above
-          const note = noteToNullify.storedNote!;
-
-          // Skip already nullified notes
-          if (note.isNullified()) {
-            continue;
-          }
-
-          note.markAsNullified(noteToNullify.blockNumber);
-          this.#writeNote(note, jobId);
-          notesNullifiedInThisCall.set(note.noteDao.siloedNullifier.toString(), note.noteDao);
-        }
-
-        return [...notesNullifiedInThisCall.values()];
-      }),
-    );
+    return this.#withJobLock(jobId, async () => {
+      if (nullifiers.length === 0) {
+        return [];
+      }
+
+      const notesToNullify = await Promise.all(
+        nullifiers.map(async nullifierInBlock => {
+          const nullifier = nullifierInBlock.data.toString();
+
+          const storedNote = await this.#readNote(nullifier, jobId);
+          if (!storedNote) {
+            throw new Error(`Attempted to mark a note as nullified which does not exist in PXE DB`);
+          }
+
+          return { storedNote: await this.#readNote(nullifier, jobId), blockNumber: nullifierInBlock.l2BlockNumber };
+        }),
+      );
+
+      const notesNullifiedInThisCall: Map<string, NoteDao> = new Map();
+      for (const noteToNullify of notesToNullify) {
+        // Safe to coerce (!) because we throw if we find any undefined above
+        const note = noteToNullify.storedNote!;
+
+        // Skip already nullified notes
+        if (note.isNullified()) {
+          continue;
+        }
+
+        note.markAsNullified(noteToNullify.blockNumber);
+        this.#writeNote(note, jobId);
+        notesNullifiedInThisCall.set(note.noteDao.siloedNullifier.toString(), note.noteDao);
+      }
+
+      return [...notesNullifiedInThisCall.values()];
+    });
   }
 
   /**
```
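The staged-write pattern that makes this removal safe can be sketched as follows. This is a hypothetical miniature (the `StagedMap` name and shape are invented, not the actual PXE store classes): mutations during a job only touch an in-memory slice keyed by `jobId`, so no transaction is needed until `commit`.

```typescript
// Invented miniature of the staged-store pattern (not the real PXE code):
// per-job writes are buffered in memory and flushed to "committed" state in
// one step on commit.
class StagedMap {
  #committed = new Map<string, string>();
  #staged = new Map<string, Map<string, string>>(); // jobId -> staged writes

  read(key: string, jobId: string): string | undefined {
    // Reads see the job's own staged writes first, then committed state.
    return this.#staged.get(jobId)?.get(key) ?? this.#committed.get(key);
  }

  write(key: string, value: string, jobId: string): void {
    // No DB access here, so no await can trigger an IndexedDB auto-commit.
    let slice = this.#staged.get(jobId);
    if (!slice) {
      slice = new Map();
      this.#staged.set(jobId, slice);
    }
    slice.set(key, value);
  }

  commit(jobId: string): void {
    // In the real stores this flush is the one place still wrapped in
    // `transactionAsync`; here the "DB" is just the committed map.
    for (const [key, value] of this.#staged.get(jobId) ?? []) {
      this.#committed.set(key, value);
    }
    this.#staged.delete(jobId);
  }
}
```

Because the flush is the only DB-touching step, the browser transaction only has to survive a single batch of write requests, which is exactly what IndexedDB's auto-commit model can guarantee.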

yarn-project/pxe/src/storage/private_event_store/private_event_store.ts

Lines changed: 57 additions & 59 deletions

```diff
@@ -131,78 +131,76 @@ export class PrivateEventStore implements StagedStore {
    * @returns - The event log contents, augmented with metadata about the transaction and block in which the event was
    * included.
    */
-  public getPrivateEvents(
+  public async getPrivateEvents(
     eventSelector: EventSelector,
     filter: PrivateEventStoreFilter,
   ): Promise<PackedPrivateEvent[]> {
-    return this.#store.transactionAsync(async () => {
-      const events: Array<{
-        l2BlockNumber: number;
-        txIndexInBlock: number;
-        eventIndexInTx: number;
-        event: PackedPrivateEvent;
-      }> = [];
-
-      const key = this.#keyFor(filter.contractAddress, eventSelector);
-      const targetScopes = new Set(filter.scopes.map(s => s.toString()));
-
-      const eventIds: string[] = await toArray(this.#eventsByContractAndEventSelector.getValuesAsync(key));
-
-      for (const eventId of eventIds) {
-        const eventBuffer = await this.#events.getAsync(eventId);
-
-        // Defensive, if it happens, there's a problem with how we're handling #eventsByContractAndEventSelector
-        if (!eventBuffer) {
-          this.logger.verbose(
-            `EventId ${eventId} does not exist in main index but it is referenced from contract event selector index`,
-          );
-          continue;
-        }
-
-        const storedPrivateEvent = StoredPrivateEvent.fromBuffer(eventBuffer);
-
-        // Filter by block range
-        if (storedPrivateEvent.l2BlockNumber < filter.fromBlock || storedPrivateEvent.l2BlockNumber >= filter.toBlock) {
-          continue;
-        }
-
-        // Filter by scopes
-        if (storedPrivateEvent.scopes.intersection(targetScopes).size === 0) {
-          continue;
-        }
-
-        // Filter by txHash
-        if (filter.txHash && !storedPrivateEvent.txHash.equals(filter.txHash)) {
-          continue;
-        }
-
-        events.push({
-          l2BlockNumber: storedPrivateEvent.l2BlockNumber,
-          txIndexInBlock: storedPrivateEvent.txIndexInBlock,
-          eventIndexInTx: storedPrivateEvent.eventIndexInTx,
-          event: {
-            packedEvent: storedPrivateEvent.msgContent,
-            l2BlockNumber: BlockNumber(storedPrivateEvent.l2BlockNumber),
-            txHash: storedPrivateEvent.txHash,
-            l2BlockHash: storedPrivateEvent.l2BlockHash,
-            eventSelector,
-          },
-        });
-      }
-
-      // Sort by block number, then by tx index within block, then by event index within tx
-      events.sort((a, b) => {
-        if (a.l2BlockNumber !== b.l2BlockNumber) {
-          return a.l2BlockNumber - b.l2BlockNumber;
-        }
-        if (a.txIndexInBlock !== b.txIndexInBlock) {
-          return a.txIndexInBlock - b.txIndexInBlock;
-        }
-        return a.eventIndexInTx - b.eventIndexInTx;
-      });
-
-      return events.map(ev => ev.event);
-    });
+    const events: Array<{
+      l2BlockNumber: number;
+      txIndexInBlock: number;
+      eventIndexInTx: number;
+      event: PackedPrivateEvent;
+    }> = [];
+
+    const key = this.#keyFor(filter.contractAddress, eventSelector);
+    const targetScopes = new Set(filter.scopes.map(s => s.toString()));
+
+    const eventIds: string[] = await toArray(this.#eventsByContractAndEventSelector.getValuesAsync(key));
+
+    for (const eventId of eventIds) {
+      const eventBuffer = await this.#events.getAsync(eventId);
+
+      // Defensive, if it happens, there's a problem with how we're handling #eventsByContractAndEventSelector
+      if (!eventBuffer) {
+        this.logger.verbose(
+          `EventId ${eventId} does not exist in main index but it is referenced from contract event selector index`,
+        );
+        continue;
+      }
+
+      const storedPrivateEvent = StoredPrivateEvent.fromBuffer(eventBuffer);
+
+      // Filter by block range
+      if (storedPrivateEvent.l2BlockNumber < filter.fromBlock || storedPrivateEvent.l2BlockNumber >= filter.toBlock) {
+        continue;
+      }
+
+      // Filter by scopes
+      if (storedPrivateEvent.scopes.intersection(targetScopes).size === 0) {
+        continue;
+      }
+
+      // Filter by txHash
+      if (filter.txHash && !storedPrivateEvent.txHash.equals(filter.txHash)) {
+        continue;
+      }
+
+      events.push({
+        l2BlockNumber: storedPrivateEvent.l2BlockNumber,
+        txIndexInBlock: storedPrivateEvent.txIndexInBlock,
+        eventIndexInTx: storedPrivateEvent.eventIndexInTx,
+        event: {
+          packedEvent: storedPrivateEvent.msgContent,
+          l2BlockNumber: BlockNumber(storedPrivateEvent.l2BlockNumber),
+          txHash: storedPrivateEvent.txHash,
+          l2BlockHash: storedPrivateEvent.l2BlockHash,
+          eventSelector,
+        },
+      });
+    }
+
+    // Sort by block number, then by tx index within block, then by event index within tx
+    events.sort((a, b) => {
+      if (a.l2BlockNumber !== b.l2BlockNumber) {
+        return a.l2BlockNumber - b.l2BlockNumber;
+      }
+      if (a.txIndexInBlock !== b.txIndexInBlock) {
+        return a.txIndexInBlock - b.txIndexInBlock;
+      }
+      return a.eventIndexInTx - b.eventIndexInTx;
+    });
+
+    return events.map(ev => ev.event);
   }
 
   /**
```
