
Conversation

@infinity0 commented Jul 10, 2024

This is a more conservative alternative to #3262.

See #3257 (comment) / #3258 for details.

@mashb1t (Collaborator) left a comment

This is in contrast with the comment in https://github.com/infinity0/Fooocus/blob/a3880034e476a4fb5c1c3ba17635b08e5548fe88/ldm_patched/modules/model_management.py#L769; please confirm this makes the handling better instead of worse.
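
For context, the referenced line sits in the cache-clearing path of ldm_patched/modules/model_management.py, which, judging by this exchange, empties the CUDA cache only on NVIDIA because of a note that doing so may be worse on ROCm. The sketch below shows the general shape of the change being discussed; it is an assumption based on that reference and the linked issue, not this PR's actual diff, and the function name soft_empty_cache and the guard details are illustrative.

```python
import torch


def soft_empty_cache(force: bool = False) -> None:
    """Ask PyTorch's caching allocator to hand unused VRAM back to the driver.

    Minimal sketch only, NOT this PR's actual diff: upstream apparently runs
    empty_cache() only on NVIDIA (the comment linked above warns it may make
    things worse on ROCm), while the linked issue reports that without it a
    ROCm build never releases allocated VRAM.
    """
    if not torch.cuda.is_available():
        return
    # torch.version.hip is set on ROCm builds of PyTorch and None on CUDA builds.
    on_rocm = torch.version.hip is not None
    on_cuda = torch.version.cuda is not None
    if force or on_cuda or on_rocm:
        torch.cuda.empty_cache()  # drop cached, currently-unused allocator blocks
        torch.cuda.ipc_collect()  # release CUDA IPC handles held by the allocator
```

Whether unconditionally emptying the cache regresses ROCm performance is exactly what the reviewer is asking to have confirmed.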

@mashb1t linked an issue Jul 10, 2024 that may be closed by this pull request
@mashb1t added the labels "bug (AMD)" (Something isn't working, AMD specific) and "Size S" (small change, basically no testing needed) Jul 10, 2024
@infinity0 (Author) commented:

Need @lllyasviel to comment as I have no idea what the original comment is referring to regarding "worse". Works fine over here.

Labels

bug (AMD): Something isn't working (AMD specific)
Size S: small change, basically no testing needed

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Bug]: ROCm Fooocus doesn't garbage collect allocated VRAM

2 participants