Implement Duplicate Transaction Detection During Batch Processing #9

@007-SARANG

Description

I am Sarang, a GSoC contributor.
Uploads can include duplicate transactions (especially across retries), and the current flow may fail or partially commit depending on DB constraints.
I want to add duplicate-handling logic to make ingestion idempotent and to report clear processing stats.

Acceptance Criteria:

- Duplicate records are detected by `transaction_id` before insert.
- Duplicate rows are skipped safely without breaking the full batch.
- API response includes processed/skipped/duplicate counts.
- Logging includes a duplicate summary for observability.
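A minimal sketch of the dedup pass I have in mind, in Python (the function name `process_batch`, the `existing_ids` lookup, and the stats keys are placeholders of my own, not the project's actual API; the real version would query the DB or use an `ON CONFLICT` clause instead of an in-memory set):

```python
from typing import Iterable

def process_batch(records: Iterable[dict], existing_ids: set) -> dict:
    """Skip duplicates by transaction_id instead of failing the whole batch.

    `existing_ids` stands in for whatever lookup the real code does
    (e.g. a SELECT on transaction_id, or INSERT ... ON CONFLICT DO NOTHING).
    """
    stats = {"processed": 0, "skipped_duplicates": 0, "inserted": []}
    seen_in_batch = set()  # also catches duplicates within the same upload
    for rec in records:
        stats["processed"] += 1
        tx_id = rec["transaction_id"]
        if tx_id in existing_ids or tx_id in seen_in_batch:
            stats["skipped_duplicates"] += 1  # skip, don't abort the batch
            continue
        seen_in_batch.add(tx_id)
        stats["inserted"].append(rec)  # placeholder for the real DB insert
    return stats
```

The returned `stats` dict is what the API response and the duplicate-summary log line would be built from.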

I will start working on this and open a PR soon.
