Hi, I'm Sarang, a GSoC contributor.
Uploads can include duplicate transactions (especially across retries), and the current flow may fail or partially commit depending on DB constraints.
I want to add duplicate handling logic to make ingestion idempotent and provide clear processing stats.
Acceptance Criteria:
- Duplicate records are detected by `transaction_id` before insert.
- Duplicate rows are skipped safely without breaking the full batch.
- The API response includes processed/skipped/duplicate counts.
- Logging includes a duplicate summary for observability.
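A minimal sketch of the idea, assuming a Python ingestion path (the names `ingest_batch` and the record shape are hypothetical, not the project's actual API): duplicates are detected against known IDs before insert, malformed or duplicate rows are counted and skipped rather than aborting the batch, and the counts are returned for the API response and logs.

```python
from typing import Iterable

def ingest_batch(records: Iterable[dict], existing_ids: set) -> dict:
    """Hypothetical helper: insert records idempotently, deduplicating
    by transaction_id, and return processing stats for the response/logs."""
    stats = {"processed": 0, "skipped": 0, "duplicates": 0}
    seen = set(existing_ids)  # copy so the caller's set is not mutated
    inserted = []
    for record in records:
        tx_id = record.get("transaction_id")
        if tx_id is None:
            stats["skipped"] += 1       # malformed row: skip, don't fail the batch
            continue
        if tx_id in seen:
            stats["duplicates"] += 1    # already in the DB or earlier in this batch
            continue
        seen.add(tx_id)
        inserted.append(record)
        stats["processed"] += 1
    # In the real flow, `inserted` would be committed in a single transaction here,
    # and `stats` logged as the duplicate summary.
    return stats
```

For example, a batch of `a, a, b` uploaded when `b` already exists would yield one processed row and two duplicates, so a retry of the same upload becomes a no-op instead of a constraint violation.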
I will start working on this and open a PR soon.