Improved output message when using inference cache #1686

Merged
elronbandel merged 6 commits from improve_inference_log into main on Mar 18, 2025

Conversation

yoavkatz
Member

Also fixed an issue where, when all the data was in the cache, an empty list was passed to _infer.

The problem is that some inference models (like WML) assume at least one element is sent to _infer.

Signed-off-by: Yoav Katz <[email protected]>
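
The fix the description implies is a guard before the backend call. Below is a minimal sketch, not the actual patch: only the `_infer` method name and the WML constraint come from this PR, while `infer_with_cache`, the dict-based cache, and the string cache keys are hypothetical illustration.

```python
def infer_with_cache(engine, instances, cache):
    """Return one prediction per instance, consulting `cache` first.

    Hypothetical helper; `engine` is assumed to expose the `_infer`
    method mentioned in the PR, and `cache` is a plain dict.
    """
    missing = [item for item in instances if str(item) not in cache]

    # The fix: only call _infer when something is actually left to infer,
    # since some engines (e.g. WML) assume a non-empty input list.
    if missing:
        for item, result in zip(missing, engine._infer(missing)):
            cache[str(item)] = result

    # The improved output message: report cache hits vs. misses.
    print(f"Inference cache: {len(instances) - len(missing)} hit(s), "
          f"{len(missing)} miss(es)")
    return [cache[str(item)] for item in instances]
```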
yoavkatz requested a review from eladven on March 17, 2025.
elronbandel merged commit f131b94 into main on March 18, 2025 (18 of 19 checks passed).
elronbandel deleted the improve_inference_log branch on March 18, 2025.