Conversation

@gomoripeti
Contributor

Prevents the osiris_retention gen_server from building up a long message queue when stream writes outpace the processing of retention evaluations.

Related to #204
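
To make the mechanism concrete, here is a minimal sketch of the deduplication idea in a plain gen_server. This is an illustration only, not the PR's actual code (the PR itself deduplicates after each evaluation, as discussed in the comments below); the `{eval, StreamDir, Spec}` message shape and the `eval_retention/2` helper are hypothetical names.

```erlang
%% Illustration only, not the PR's actual code: one way to keep the
%% retention process's mailbox short by collapsing duplicate eval
%% requests for the same stream. The {eval, StreamDir, Spec} message
%% shape and eval_retention/2 are hypothetical names.
handle_cast({eval, StreamDir, Spec}, State) ->
    %% keep only the newest pending request for this stream
    Spec1 = drain_duplicates(StreamDir, Spec),
    ok = eval_retention(StreamDir, Spec1),
    {noreply, State}.

%% Selective receive with a zero timeout removes older duplicate
%% requests for the same stream from the message queue.
drain_duplicates(StreamDir, Spec) ->
    receive
        {'$gen_cast', {eval, StreamDir, NewerSpec}} ->
            drain_duplicates(StreamDir, NewerSpec)
    after 0 ->
            Spec
    end.
```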

@kjnilsson
Contributor

I am wondering if just making this process a gen_batch_server and de-duplicating within each batch would be a simpler approach.

@gomoripeti
Contributor Author

I didn't know about gen_batch_server. Will submit that version soon. A subtle difference is that the current PR deduplicates after each single eval is processed, while the batch server approach deduplicates per batch. So if there is an eval request from stream1, then one from stream2, and then, while stream1 is being processed, another request arrives for stream2, the current PR will process stream2 only once, while gen_batch_server will end up with two batches and process stream2 twice. Probably doesn't matter that much.
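
For reference, per-batch deduplication in a gen_batch_server callback could look roughly like this. This is a sketch only: it assumes the batch arrives oldest-first as `{cast, Request}` entries, and the `{eval, StreamDir, Spec}` request shape and the `eval_retention/2` helper are hypothetical names.

```erlang
%% Rough sketch of per-batch deduplication in a gen_batch_server
%% handle_batch/2 callback. Assumptions: the batch arrives oldest-first
%% as {cast, Request} entries; the {eval, StreamDir, Spec} request shape
%% and eval_retention/2 are hypothetical names.
handle_batch(Batch, State) ->
    %% Keep only the newest eval request per stream: later entries
    %% overwrite earlier ones as we fold over the batch in order.
    Evals = lists:foldl(
              fun({cast, {eval, StreamDir, Spec}}, Acc) ->
                      maps:put(StreamDir, Spec, Acc);
                 (_, Acc) ->
                      Acc
              end, #{}, Batch),
    maps:foreach(fun(StreamDir, Spec) ->
                         ok = eval_retention(StreamDir, Spec)
                 end, Evals),
    {ok, State}.
```

Within a batch this evaluates each stream once, but, as noted above, a request that lands in the next batch is evaluated again.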

@gomoripeti gomoripeti force-pushed the retention_backlog branch 2 times, most recently from 0c29efe to 9dc60c0 on January 22, 2026 17:23
@kjnilsson
Contributor

kjnilsson commented Jan 23, 2026

This looks good, thank you.

There is a minor optimisation available: you can ask gen_batch_server to pass the batch list in reverse order (i.e. the order it was built up in; see osiris_writer). This avoids a lists:reverse/1, and you can process the batch as you iterate it (since the newest item will be first), keeping only a deduplication set of the evals already processed so that any older duplicates can be dropped.

Probably not worth the effort but just wanted to mention it is available.
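
A sketch of that variant, under the same assumptions as the batch example above, except that the batch is delivered newest-first (request shape and `eval_retention/2` remain hypothetical names):

```erlang
%% Sketch of the reversed-batch variant. Assumptions as above, except
%% that the batch is delivered newest-first; request shape and
%% eval_retention/2 are hypothetical names.
handle_batch(ReversedBatch, State) ->
    _Seen = lists:foldl(
              fun({cast, {eval, StreamDir, Spec}}, Seen) ->
                      case maps:is_key(StreamDir, Seen) of
                          true ->
                              %% a newer eval for this stream was already
                              %% processed earlier in the reversed batch
                              Seen;
                          false ->
                              ok = eval_retention(StreamDir, Spec),
                              Seen#{StreamDir => true}
                      end;
                 (_, Seen) ->
                      Seen
              end, #{}, ReversedBatch),
    {ok, State}.
```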

@gomoripeti
Contributor Author

Thanks for the tip. Not for performance reasons, but I think the code became a bit nicer with reversed_batch set to true.

@kjnilsson kjnilsson self-requested a review January 23, 2026 13:41
@kjnilsson kjnilsson merged commit 6f54928 into rabbitmq:main Jan 23, 2026
4 checks passed
@acogoluegnes acogoluegnes added this to the 1.11.0 milestone Jan 23, 2026