Optimize Record Hashes #220

@darnjo

Description

There are a couple of things that should be done with hashing when tracking stats:

  • In Node/JS, insertions into a Map slow down once it grows past 2^23 entries (≈8.4 million), which makes the testing process slow. Refactor the state service so that it spreads entries across several maps in round-robin or random order (see the first sketch after this list).
  • The hashes are currently 256 bits, or 32 bytes. Sampling is only on the order of millions of records, so we don't need the full 256 bits to avoid collisions. 128 bits are sufficient and would use half the memory (see the second sketch after this list).
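A minimal sketch of the first item, assuming the state service keys a Map by record hash; `ShardedMap` and the default shard count are hypothetical names/values, not taken from the existing code:

```js
// Hypothetical sharded store: entries are spread across several Maps so no
// single Map approaches the size at which V8 insertions degrade.
class ShardedMap {
  constructor(shardCount = 8) {
    this.shards = Array.from({ length: shardCount }, () => new Map());
    this.next = 0; // round-robin cursor for inserting new keys
  }

  // Check every shard; with a handful of shards this stays cheap.
  has(key) {
    return this.shards.some((shard) => shard.has(key));
  }

  get(key) {
    for (const shard of this.shards) {
      if (shard.has(key)) return shard.get(key);
    }
    return undefined;
  }

  set(key, value) {
    // Update in place if the key already lives in some shard.
    for (const shard of this.shards) {
      if (shard.has(key)) {
        shard.set(key, value);
        return this;
      }
    }
    // New key: insert into the next shard, round-robin.
    this.shards[this.next].set(key, value);
    this.next = (this.next + 1) % this.shards.length;
    return this;
  }

  get size() {
    return this.shards.reduce((total, shard) => total + shard.size, 0);
  }
}

module.exports = { ShardedMap };
```

Since the keys are themselves hash strings, an alternative to round-robin is deriving the shard index from the key's leading bytes, so lookups touch a single shard instead of scanning all of them.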
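A minimal sketch of the second item, assuming records are hashed with Node's built-in `crypto` module; `hashRecord` and the example field names are illustrative only:

```js
const { createHash } = require('crypto');

// Hash a record payload and keep only the first 128 bits (32 hex characters)
// of the 256-bit SHA-256 digest.
const hashRecord = (recordPayload) =>
  createHash('sha256').update(recordPayload).digest('hex').substring(0, 32);

// Usage: distinct records still map to distinct 32-character hashes.
console.log(hashRecord(JSON.stringify({ ListingKey: 'A123', ModificationTimestamp: '2023-01-01T00:00:00Z' })));
console.log(hashRecord(JSON.stringify({ ListingKey: 'B456', ModificationTimestamp: '2023-01-02T00:00:00Z' })));
```

Truncating the hex digest is the smallest change; with sample sizes in the millions, the birthday-bound probability of a 128-bit collision is negligible.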
