
Conversation

@HollowMan6 (Contributor) commented Dec 25, 2025

What does this PR do?

Similar to:
https://github.com/huggingface/peft/blob/261366de2e40cde64b702d6b9c527081ad850549/src/peft/mixed_model.py#L192-L201

`enable_adapter_layers` and `disable_adapter_layers` are provided as alternatives if users want to control this manually.

This can be used for volcengine/verl#4673.
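
A minimal usage sketch (not from the PR itself; the `peft`, `model`, and `batch` names and the exact call signatures are assumptions, only the method names come from this changeset):

```python
# Hypothetical usage; only the method names (disable_adapter,
# enable_adapter_layers, disable_adapter_layers) come from this PR.
# `peft` is the PEFT transform instance, `model` the wrapped Megatron
# model, and `batch` an input batch; all are placeholders here.

# Context-manager style: adapters are bypassed inside the block and
# re-enabled on exit, without removing or unloading them.
with peft.disable_adapter(model):
    reference_logits = model(batch)

# Manual style, for cases where the disable/enable points do not nest cleanly.
peft.disable_adapter_layers(model)
reference_logits = model(batch)
peft.enable_adapter_layers(model)
```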

Changelog

  • Add APIs to disable/re-enable adapters temporarily.

GitHub Actions CI

See the CI section in the Contributing doc for how to trigger the CI. An NVIDIA developer will need to approve and trigger the CI for external contributors.

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (Ex: Numba, Pynini, Apex etc)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

If you haven't finished some of the above items, you can still open a "Draft" PR.

Additional Information

  • Related to # (issue)

✨ Presented to you with Mind Lab - A Lab for Experiential Intelligence.

@copy-pr-bot (bot) commented Dec 25, 2025

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@yaoyu-33 (Contributor) commented:

@HollowMan6 lgtm, need unit tests though.

Copilot AI review requested due to automatic review settings December 30, 2025 19:00

Copilot AI left a comment


Pull request overview

This PR adds support for temporarily disabling and re-enabling adapter layers in PEFT (Parameter-Efficient Fine-Tuning) implementations, providing functionality similar to HuggingFace's PEFT library. This allows users to temporarily bypass adapter computations and fall back to the base model behavior without removing or unloading the adapters.

Key changes:

  • Added _adapter_enabled flag to AdapterWrapper base class with enable_adapter_layers() and disable_adapter_layers() methods
  • Implemented model-level enable_adapter_layers(), disable_adapter_layers(), and disable_adapter() context manager in the PEFT base class
  • Updated all adapter forward methods to check the _adapter_enabled flag and return only the base output when disabled (see the sketch after this list)
  • Refactored model walking logic into reusable _walk_model() helper method

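A simplified sketch of that flag-check pattern (illustrative only; apart from `_adapter_enabled`, `enable_adapter_layers`, and `disable_adapter_layers`, the class and attribute names and the forward signature below are placeholders, and the bias/tuple handling of the real wrappers is omitted):

```python
import torch.nn as nn


class AdapterWrapperSketch(nn.Module):
    """Illustrative stand-in for the AdapterWrapper flag-check pattern."""

    def __init__(self, to_wrap: nn.Module, adapter: nn.Module):
        super().__init__()
        self.to_wrap = to_wrap          # frozen base layer
        self.adapter = adapter          # e.g. a LoRA low-rank branch
        self._adapter_enabled = True    # flag introduced by this PR

    def enable_adapter_layers(self) -> None:
        self._adapter_enabled = True

    def disable_adapter_layers(self) -> None:
        self._adapter_enabled = False

    def forward(self, x):
        base_out = self.to_wrap(x)
        if not self._adapter_enabled:
            # Adapter bypassed: fall back to base-model behavior.
            return base_out
        return base_out + self.adapter(x)
```
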
Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 3 comments.

| File | Description |
| --- | --- |
| `src/megatron/bridge/peft/adapter_wrapper.py` | Added `_adapter_enabled` flag and enable/disable methods to the `AdapterWrapper` base class |
| `src/megatron/bridge/peft/base.py` | Added model-level enable/disable methods and the `disable_adapter` context manager; refactored model walking logic into a `_walk_model()` helper |
| `src/megatron/bridge/peft/lora_layers.py` | Updated `LoRALinear` and `TEFusedLoRALinear` forward methods to check the adapter-enabled flag; updated docstring |
| `src/megatron/bridge/peft/dora_layers.py` | Updated `DoRALinear` forward method to check the adapter-enabled flag |
| `src/megatron/bridge/peft/canonical_lora.py` | Updated `LoRALinearSplitQKV` and `LoRALinearSplitFC1UpGate` forward methods to check the adapter-enabled flag |
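
For orientation, a hypothetical sketch of the model-level API described for `base.py` above (the `_walk_model` traversal, the `hasattr` check, and the call signatures are assumptions, not the actual implementation):

```python
from contextlib import contextmanager

import torch.nn as nn


class PEFTSketch:
    """Illustrative stand-in for the model-level enable/disable API."""

    def _walk_model(self, model: nn.Module, fn) -> None:
        # Apply fn to every adapter-wrapped submodule found in the model.
        for module in model.modules():
            if hasattr(module, "_adapter_enabled"):
                fn(module)

    def enable_adapter_layers(self, model: nn.Module) -> None:
        self._walk_model(model, lambda m: m.enable_adapter_layers())

    def disable_adapter_layers(self, model: nn.Module) -> None:
        self._walk_model(model, lambda m: m.disable_adapter_layers())

    @contextmanager
    def disable_adapter(self, model: nn.Module):
        # Temporarily fall back to base-model behavior, restoring the
        # adapters even if the wrapped code raises.
        self.disable_adapter_layers(model)
        try:
            yield
        finally:
            self.enable_adapter_layers(model)
```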


Similar to:
https://github.com/huggingface/peft/blob/261366de2e40cde64b702d6b9c527081ad850549/src/peft/mixed_model.py#L192-L201

`enable_adapter_layers` and `disable_adapter_layers` are alternatives
if users want to control manually.

Signed-off-by: Hollow Man <[email protected]>

Copilot AI left a comment


Pull request overview

Copilot reviewed 6 out of 6 changed files in this pull request and generated 2 comments.



HollowMan6 added a commit to HollowMan6/Megatron-Bridge that referenced this pull request Dec 31, 2025