
Avoid generating JSON with duplicate keys #721

Open
@nhmarujo

Description

Describe the bug
Currently, if someone adds a logstash marker with the same name as one of the default fields, the encoder generates JSON containing both keys. Such output is not valid JSON for strict consumers, since duplicate keys are not allowed.

To Reproduce
Create a log statement with a logstash marker whose name you know collides with a default field, then observe the generated JSON. Example:

log.info(append("message", "This will generate another key 'message' on the JSON"), "This will be added on 'message' by default");
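
With that statement, the encoder emits something along these lines (illustrative only; other default fields are omitted and the key order depends on the provider configuration):

{"message": "This will be added on 'message' by default", "message": "This will generate another key 'message' on the JSON"}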

Expected behavior
Invalid JSON (with duplicate keys) causes log ingestion issues with many systems (I hit this with fluentbit). I would expect that, when constructing the JSON, any field that would result in a duplicate key is dropped (only the first one added is included). Additionally, I would suggest printing some sort of warning to draw people's attention to the mapping mistake they introduced, so they can fix it as soon as possible.
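
To illustrate the ingestion problem, here is a minimal sketch (not from the report) showing that a strictly configured Jackson parser rejects such a payload; the sample payload string and class name are hypothetical:

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DuplicateKeyDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical payload mirroring the duplicate "message" key described above
        String logLine = "{\"message\":\"from the log call\",\"message\":\"from the marker\"}";

        ObjectMapper mapper = new ObjectMapper();
        // Opt into strict duplicate handling, as strict downstream consumers effectively do
        mapper.enable(JsonParser.Feature.STRICT_DUPLICATE_DETECTION);

        // Fails with JsonParseException: Duplicate field 'message'
        mapper.readTree(logLine);
    }
}

If the encoder dropped the colliding field before writing (keeping the first key, as suggested above), the payload would parse cleanly in both lenient and strict modes.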

  • logstash-logback-encoder 7.0.1
  • logback 1.2.7
  • jackson 2.13.0
  • java 17
