Conversation


@iceload123 commented Jul 25, 2025

Related issue #540

Motivation

Modifications

This is a minor fix for leftover historical code. To ensure compatibility with Flux-tool's LoRA, the loading logic had already been extended to support channel expansion. However, the older code was never removed, so the updated logic never actually took effect. Commenting out the stale assignment resolves the problem; a sketch of what that line did follows.
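For reference, here is a minimal sketch of what the now-removed assignment did: it stacked a new LoRA's lora_A/lora_B weights onto a previously composed one via torch.cat, which only works when every adapter targets the same channel dimensions. The shapes, key names, and the compose helper below are hypothetical and only illustrate the concatenation semantics, not the actual loader code; with the stale line gone, the channel-expansion-aware path takes over.

import torch

# Hypothetical shapes, purely for illustration: two LoRA adapters of rank 4
# applied to a layer with in_features=64 and out_features=128.
rank, in_f, out_f = 4, 64, 128
prev_A, new_A = torch.randn(rank, in_f), torch.randn(rank, in_f)
prev_B, new_B = torch.randn(out_f, rank), torch.randn(out_f, rank)

def compose(k, v, previous_lora):
    # lora_A tensors are stacked along dim 0 and lora_B tensors along dim 1,
    # so the composed pair acts like the sum of the individual low-rank updates.
    if previous_lora is None:
        return v
    return torch.cat([previous_lora, v], dim=0 if "lora_A" in k else 1)

composed_A = compose("blocks.0.lora_A.weight", new_A, prev_A)  # shape (8, 64)
composed_B = compose("blocks.0.lora_B.weight", new_B, prev_B)  # shape (128, 8)
assert (composed_B @ composed_A).shape == (out_f, in_f)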

Checklist

  • Code is formatted using Pre-Commit hooks.
  • Relevant unit tests are added in the tests directory following the guidance in the Contribution Guide.
  • Documentation and example scripts in examples are updated if necessary.
  • Throughput/latency benchmarks and quality evaluations are included where applicable.
  • For reviewers: If you're only helping merge the main branch and haven't contributed code to this PR, please remove yourself as a co-author when merging.
  • Please feel free to join our Slack, Discord or WeChat to discuss your PR.

Comment on lines +191 to +195
"""
composed[k] = (
v if previous_lora is None else torch.cat([previous_lora, v], dim=0 if "lora_A" in k else 1)
)
"""

Suggested change
"""
composed[k] = (
v if previous_lora is None else torch.cat([previous_lora, v], dim=0 if "lora_A" in k else 1)
)
"""

There's no reason to leave it commented out; just remove it completely, since it no longer has any use.
