Potential Fix: Pass ignore modules to LoRA loading code #3
LoRA adapters that contain weights for modules other than the language model (like `vision_tower` / `audio_tower`) fail to load. This PR sidesteps the issue by passing an ignore-modules list to the LoRA loading code.

Although this is a potential fix, I am not sure we should do it. vLLM `main` has a `check_unexpected_modules` check that seems to be made for this purpose specifically -- I think the idea is to force LoRA adapters to be configured properly. That makes sense: if there are `audio_tower` / `vision_tower` weights in the LoRA adapter, it probably doesn't make sense to apply only the language adapters. Maybe I am missing some use case.
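For illustration, the filtering this PR does can be sketched roughly as below. This is a minimal, hypothetical sketch: `filter_lora_weights` and the `ignore_modules` parameter are illustrative names, not vLLM's actual API, and the weight dict is a stand-in for real adapter tensors.

```python
def filter_lora_weights(weights: dict, ignore_modules: tuple[str, ...]) -> dict:
    """Drop adapter entries whose module path starts with an ignored prefix.

    Hypothetical helper mirroring the "ignore modules" idea in this PR;
    vLLM's real loading code is structured differently.
    """
    return {
        name: tensor
        for name, tensor in weights.items()
        if not any(name.startswith(prefix) for prefix in ignore_modules)
    }


# Example adapter state dict with both language-model and vision-tower entries.
weights = {
    "language_model.layers.0.q_proj.lora_A": "tensor_a",
    "vision_tower.blocks.0.attn.lora_A": "tensor_b",
}

# Keep only the language-model adapters; vision/audio towers are skipped.
kept = filter_lora_weights(weights, ignore_modules=("vision_tower", "audio_tower"))
```

The alternative the description mentions (a `check_unexpected_modules`-style validation) would instead raise an error when such entries are present, forcing the adapter to be re-exported correctly rather than silently dropping weights.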