
Update serialization extension to support custom cache-size metric #52

Open

justinmimbs wants to merge 4 commits into master

Conversation

justinmimbs
The current Serialization extension assumes the cache size is always the number of items in the cache. This PR allows serializing LRUs that use custom `by` functions, where `lru.currentsize != length(lru)`.
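For context, a custom `by` function makes the cache budget its capacity by per-value weight rather than by item count. A minimal usage sketch, assuming LRUCache.jl's `maxsize`/`by` keyword arguments and the `currentsize` field referenced in this PR:

```julia
using LRUCache

# maxsize is interpreted in units of by(value); here, bytes via sizeof.
lru = LRU{String,Vector{UInt8}}(maxsize = 1024, by = sizeof)
lru["a"] = zeros(UInt8, 100)
lru["b"] = zeros(UInt8, 200)

length(lru)      # 2 items
lru.currentsize  # 300 bytes, not 2
```

This is exactly the case the old extension mishandled: it serialized `currentsize` where it expected an item count.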

@codecov-commenter commented Feb 19, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 81.97%. Comparing base (1cfbef2) to head (2185bb5).

Additional details and impacted files
@@            Coverage Diff             @@
##           master      #52      +/-   ##
==========================================
+ Coverage   81.92%   81.97%   +0.05%     
==========================================
  Files           3        3              
  Lines         354      355       +1     
==========================================
+ Hits          290      291       +1     
  Misses         64       64              


@@ -29,20 +29,21 @@ function Serialization.deserialize(s::AbstractSerializer, ::Type{LRU{K, V}}) whe
lock = Serialization.deserialize(s)
by = Serialization.deserialize(s)
finalizer = Serialization.deserialize(s)
n_items = Serialization.deserialize(s)
Member
Is there a way to detect that there are no bytes left to deserialize and fall back to `n_items = currentsize` here, to continue supporting deserialization of old serialized files?

Member

(If not, I'm not sure this should be a patch release, since it could unexpectedly break things. I'm not quite sure what folks use LRU serialization for, though: is it for IPC only, or for persistence?)

Author

That's a good point; this will break deserializing old serialized data. I think we can make this backward compatible, and that would be nicer in case there are some persisted LRUs out there.

Any existing file would have `n_items` in the `currentsize` position, so we can continue storing `n_items` there and always recompute `currentsize` on deserialization.
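The backward-compatible scheme described above can be sketched with a toy cache and Julia's stdlib `Serialization`. `ToyCache`, `save`, and `load` here are hypothetical illustrations of the idea, not LRUCache.jl's actual implementation:

```julia
using Serialization

# Toy stand-in for an LRU cache; `by` maps a value to its size contribution.
struct ToyCache
    items::Vector{Pair{String,String}}
    by::Function
end

# Write the item count into the slot where `currentsize` used to live, so
# old readers (which interpreted that slot as an item count) still work.
function save(io::IO, c::ToyCache)
    serialize(io, length(c.items))  # n_items in the legacy currentsize slot
    for p in c.items
        serialize(io, p)
    end
end

# Recompute currentsize from `by` instead of trusting any stored number.
function load(io::IO, by::Function)
    n_items = deserialize(io)
    items = [deserialize(io)::Pair{String,String} for _ in 1:n_items]
    currentsize = mapreduce(p -> by(last(p)), +, items; init = 0)
    return ToyCache(items, by), currentsize
end
```

With `by = sizeof`, a round trip of `["a" => "xx", "b" => "yyy"]` yields two items and a recomputed `currentsize` of 5 bytes, regardless of what the writer stored in that slot.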

Author

I updated the PR. Thanks for pointing this out!

Member

good solution!

Jutho (Collaborator) commented Feb 19, 2025

Thanks; this looks like a great PR. I have no additional comments. I suggest leaving this open one more day in case @jarbus or @lkdvos want to take a look, and then merging and tagging tomorrow.

jarbus (Contributor) commented Feb 19, 2025

This seems good to me as well. Thanks for the great changes!

lkdvos (Contributor) left a comment

This also looks like a great improvement to me; I can't spot any issues here.


Successfully merging this pull request may close these issues.

Serialization does not support non-default by functions
6 participants