Conversation

@varun-sundar-rabindranath commented Oct 9, 2025

LoRA checkpoints that contain adapters for modules other than the language model (such as vision_tower / audio_tower) fail to load. This PR sidesteps the issue by passing an ignore-modules list to the LoRA loading code.

Although this is a potential fix, I am not sure we should do it. vLLM main has a "check_unexpected_modules" check that seems to be made specifically for this purpose; I think the idea is to force LoRA adapters to be configured properly. That makes sense: if there are audio_tower / vision_tower adapters in the LoRA checkpoint, it probably doesn't make sense to apply only the language adapters. Maybe I am missing a use case.
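The workaround described above can be sketched roughly as follows. This is a hypothetical illustration, not vLLM's actual loading code: the function name, the prefix strings, and the checkpoint key layout are all assumptions made for the example.

```python
# Hypothetical sketch of an "ignore modules" filter for LoRA loading.
# Adapter weights whose module path starts with an ignored prefix
# (e.g. vision_tower / audio_tower) are dropped, so only the
# language-model adapters are loaded.

# Assumed prefixes; real multimodal checkpoints may name these differently.
IGNORE_MODULE_PREFIXES = ("vision_tower", "audio_tower")


def filter_lora_tensors(
    tensors: dict,
    ignore_prefixes: tuple = IGNORE_MODULE_PREFIXES,
) -> dict:
    """Keep only adapter tensors not rooted in an ignored module."""
    kept = {}
    for name, tensor in tensors.items():
        # Assumed PEFT-style key layout: "base_model.model.<module_path>...".
        module_path = name.removeprefix("base_model.model.")
        if any(module_path.startswith(p) for p in ignore_prefixes):
            continue  # skip non-language-model adapter weights
        kept[name] = tensor
    return kept
```

The alternative discussed above is the opposite design choice: instead of silently filtering, validate the checkpoint up front and fail loudly when it targets modules the loader does not support.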

Signed-off-by: Varun Sundar Rabindranath <[email protected]>
github-actions bot commented Oct 9, 2025

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, executing a small, essential subset of CI tests to quickly catch errors.

You can ask your reviewers to trigger select CI tests on top of fastcheck CI.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

If you have any questions, please reach out to us on Slack at https://slack.vllm.ai.

🚀

@jeejeelee commented

If I understand correctly, we shouldn't filter out the tower and connector modules, as doing so may lead to generation issues.
