fix: deduplicate doc counts in term aggregation for multi-valued fields #2854
base: main
Changes from 1 commit
@@ -58,6 +58,49 @@ impl<T: PartialOrd + Copy + std::fmt::Debug + Send + Sync + 'static + Default>
        }
    }

    /// Like `fetch_block_with_missing`, but deduplicates (doc_id, value) pairs
    /// so that each unique value per document is returned only once.
    ///
    /// This is necessary for correct document counting in aggregations,
    /// where multi-valued fields can produce duplicate entries that inflate counts.
    #[inline]
    pub fn fetch_block_with_missing_unique_per_doc(
        &mut self,
        docs: &[u32],
        accessor: &Column<T>,
        missing: Option<T>,
    ) {
        self.fetch_block_with_missing(docs, accessor, missing);
        if !accessor.index.get_cardinality().is_full() {
            self.dedup_docid_val_pairs();
        }
    }

    /// Removes consecutive duplicate (doc_id, value) pairs from the caches.
    ///
    /// After `fetch_block`, entries for the same doc are adjacent, so duplicates
    /// (same doc, same value) are consecutive and can be removed in O(n).
    fn dedup_docid_val_pairs(&mut self) {
        if self.docid_cache.len() <= 1 {
            return;
        }
        let mut write = 0;
        for read in 1..self.docid_cache.len() {
            if self.docid_cache[read] != self.docid_cache[write]
                || self.val_cache[read] != self.val_cache[write]
Collaborator: I think we should check only for duplicate docids, not for duplicate values? Can you extend the tests to capture this?

Collaborator: Ah, never mind, we need to check both, so that the term-id dedup only filters duplicate values on the same docid but still handles multi-values.

Collaborator: It does not cover pairs if the values are not consecutive, e.g. (0, 1), (0, 2), (0, 1). It's a bit more expensive, but I think we could sort the values within each doc_id group before deduplicating.

Author: Fixed. Values within each doc_id group are now sorted before deduplicating, and unit tests were added.
            {
                write += 1;
                if write != read {
                    self.docid_cache[write] = self.docid_cache[read];
                    self.val_cache[write] = self.val_cache[read];
                }
            }
        }
        let new_len = write + 1;
        self.docid_cache.truncate(new_len);
        self.val_cache.truncate(new_len);
    }

    #[inline]
    pub fn iter_vals(&self) -> impl Iterator<Item = T> + '_ {
        self.val_cache.iter().cloned()
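To make the inline thread above concrete, here is a minimal standalone sketch of the approach the author describes (the `dedup_pairs_sorted` helper and the `(u32, u64)` pair representation are hypothetical, not the PR's actual code): sorting values within each doc_id group makes non-consecutive duplicates like (0, 1), (0, 2), (0, 1) adjacent, so a single linear dedup pass can then remove them.

```rust
/// Hypothetical sketch: dedup (doc_id, value) pairs when entries per doc are
/// adjacent, but values within a doc may repeat non-consecutively.
fn dedup_pairs_sorted(pairs: &mut Vec<(u32, u64)>) {
    let mut start = 0;
    while start < pairs.len() {
        // Find the end of the current doc_id group.
        let doc = pairs[start].0;
        let mut end = start + 1;
        while end < pairs.len() && pairs[end].0 == doc {
            end += 1;
        }
        // Sort values within the group so equal values become adjacent.
        pairs[start..end].sort_unstable_by_key(|&(_, val)| val);
        start = end;
    }
    // Equal (doc_id, value) pairs are now consecutive; one pass removes them.
    pairs.dedup();
}

fn main() {
    let mut pairs = vec![(0, 1), (0, 2), (0, 1), (1, 2), (1, 2)];
    dedup_pairs_sorted(&mut pairs);
    assert_eq!(pairs, vec![(0, 1), (0, 2), (1, 2)]);
}
```

The per-group sort costs O(k log k) for a doc with k values, which is the "a bit more expensive" trade-off noted in the thread; blocks where every doc has at most one value are unaffected.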
Collaborator: It's only necessary to deduplicate for multivalue cardinality.
Author: Fixed, thanks for the review.
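To illustrate this last review point, here is a small self-contained model; the enum is assumed to mirror tantivy's columnar `Cardinality` variants, and `needs_dedup` is a hypothetical helper. Full and optional columns yield at most one value per doc, so only the multivalued case can introduce duplicate (doc_id, value) pairs, and the `!is_full()` guard in the diff would also trigger needless dedup work on optional columns.

```rust
/// Assumed to mirror tantivy's columnar `Cardinality` variants.
#[derive(Debug, PartialEq)]
enum Cardinality {
    Full,        // exactly one value per doc: nothing to deduplicate
    Optional,    // at most one value per doc: still nothing to deduplicate
    Multivalued, // zero or more values per doc: duplicates are possible
}

/// Hypothetical helper: dedup is only required for multivalued columns.
/// The diff's `!is_full()` check would also return true for `Optional`.
fn needs_dedup(cardinality: &Cardinality) -> bool {
    *cardinality == Cardinality::Multivalued
}

fn main() {
    assert!(!needs_dedup(&Cardinality::Full));
    assert!(!needs_dedup(&Cardinality::Optional));
    assert!(needs_dedup(&Cardinality::Multivalued));
}
```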