Don't interpret decayed data as we've failed to send tiny values
When we're calculating the success probability for min-/max-bucket
pairs and are looking at the 0th min-bucket, we only look at the
highest max-bucket to decide the success probability. We ignore
max-buckets which have a value below `BUCKET_FIXED_POINT_ONE` so
that we only consider values which haven't substantially decayed.
However, if all of our data has substantially decayed, this filter
causes us to conclude that the highest max-bucket is bucket zero,
even though we should really be considering any bucket with data.
Here we instead fall back to the highest non-zero max-bucket when
no max-bucket has a value above `BUCKET_FIXED_POINT_ONE`.
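
A minimal sketch of the fallback described above, not the actual
rust-lightning implementation. The value of `BUCKET_FIXED_POINT_ONE`,
the bucket-array shape, and the helper name are assumptions for
illustration only:

```rust
// Assumed constant: one "unit" in the buckets' fixed-point encoding.
const BUCKET_FIXED_POINT_ONE: u16 = 32;

/// Picks the highest max-bucket to use when examining the 0th min-bucket.
/// Prefer the highest bucket with a value of at least
/// `BUCKET_FIXED_POINT_ONE` (i.e. not substantially decayed); if no such
/// bucket exists, fall back to the highest non-zero bucket so fully
/// decayed data isn't read as "we always failed to send tiny values".
fn highest_max_bucket(max_buckets: &[u16; 32]) -> usize {
    let mut highest_nonzero = 0;
    let mut highest_undecayed = None;
    for (idx, &val) in max_buckets.iter().enumerate() {
        if val != 0 {
            highest_nonzero = idx;
        }
        if val >= BUCKET_FIXED_POINT_ONE {
            highest_undecayed = Some(idx);
        }
    }
    highest_undecayed.unwrap_or(highest_nonzero)
}

fn main() {
    // All data substantially decayed: without the fallback we'd wrongly
    // conclude the highest max-bucket is bucket zero.
    let mut buckets = [0u16; 32];
    buckets[20] = 3; // decayed, below BUCKET_FIXED_POINT_ONE
    assert_eq!(highest_max_bucket(&buckets), 20);

    // Fresh (undecayed) data is still preferred when present.
    buckets[10] = 64;
    assert_eq!(highest_max_bucket(&buckets), 10);
}
```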