
Conversation

@vermouth1992 (Collaborator)

What does this PR do?

  • As title

Checklist Before Starting

  • Search for similar PRs. Paste at least one query link here: ...
  • Format the PR title as [{modules}] {type}: {description} (This will be checked by the CI)
    • {modules} include fsdp, megatron, sglang, vllm, rollout, trainer, ci, training_utils, recipe, hardware, deployment, ray, worker, single_controller, misc, perf, model, algo, env, tool, ckpt, doc, data, cfg, reward
    • If this PR involves multiple modules, separate them with a comma, like [megatron, fsdp, doc]
    • {type} is one of feat, fix, refactor, chore, test
    • If this PR breaks any API (CLI arguments, config, function signature, etc.), add [BREAKING] to the beginning of the title.
    • Example: [BREAKING][fsdp, megatron] feat: dynamic batching

Test

For changes that cannot be tested by CI (e.g., algorithm implementation, new model support), validate with experiments and show results such as training curve plots, evaluation results, etc.

API and Usage Example

Demonstrate how the API changes if any, and provide usage example(s) if possible.

# Add code snippet or script demonstrating how to use this

Design & Code Changes

Demonstrate the high-level design if this PR is complex, and list the specific changes.

Checklist Before Submitting

Important

Please check all the following items before requesting a review, otherwise the reviewer might deprioritize this PR for review.

@gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request adds profiling support to the SFT trainer and model engine, which is a great performance analysis feature. The implementation is mostly solid, but I've identified a critical bug in the default profiler configuration that will cause a crash when profiling is enabled. Additionally, there's a significant amount of duplicated code between the standard and Ray-based SFT trainers that should be refactored to improve maintainability. Addressing these points will make the new feature robust and easier to maintain.

step_start: 0

# stop profile mini-batch in training
step_end: null

critical

The step_end value is set to null, which will be parsed as None in Python. This will cause a TypeError in verl/utils/profiler/profile.py during the validation check (self.tool_config.step_end >= 0) and when calculating the active duration for the profiler schedule (self.tool_config.step_end - self.tool_config.step_start).

To fix this, step_end must be an integer. A default value of 1 would be reasonable, allowing the profiler to capture a single step by default.

    step_end: 1
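
For illustration only, here is a minimal, self-contained sketch of the failure mode. The ToolConfig and compute_active_steps names below are hypothetical stand-ins mirroring the fields referenced in this comment, not the actual code in verl/utils/profiler/profile.py:

from dataclasses import dataclass
from typing import Optional

@dataclass
class ToolConfig:
    # hypothetical stand-in for the profiler tool config
    step_start: int = 0
    step_end: Optional[int] = None  # what `step_end: null` in the YAML deserializes to

def compute_active_steps(cfg: ToolConfig) -> int:
    # Both expressions raise TypeError when cfg.step_end is None:
    assert cfg.step_end >= 0              # '>=' not supported between 'NoneType' and 'int'
    return cfg.step_end - cfg.step_start  # unsupported operand type(s) for -: 'NoneType' and 'int'

# compute_active_steps(ToolConfig())                  # would raise TypeError with step_end=None
print(compute_active_steps(ToolConfig(step_end=1)))   # prints 1 with the suggested default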

Comment on lines +93 to +105
        self.profiler_config = omega_conf_to_dataclass(self.config.profiler)

        # check profile interval
        self.profiler_interval = self.config.trainer.profile_interval
        self._validate_profiler_interval()

    def _validate_profiler_interval(self):
        assert len(self.profiler_interval) == 2
        self.start_profile_step = self.profiler_interval[0]
        self.end_profile_step = self.profiler_interval[1]
        assert self.end_profile_step >= self.start_profile_step
        if self.start_profile_step < 0:
            assert self.end_profile_step < 0

high

The logic for handling the profiler configuration, including the _validate_profiler_interval method, is duplicated from verl/trainer/sft_trainer.py. This code duplication makes the codebase harder to maintain and increases the risk of introducing inconsistencies if one file is updated but the other is not.

Consider refactoring the common logic from SFTTrainer in both sft_trainer.py and sft_trainer_ray.py into a shared base class. This would centralize the configuration handling and other shared functionalities, leading to cleaner and more maintainable code.
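
As a rough sketch of what that could look like (the mixin name, method name, and import path below are assumptions for illustration, not existing verl code):

# Hypothetical shared base class; name, module, and import path are assumptions.
from verl.utils.config import omega_conf_to_dataclass  # assumed import location

class ProfilerConfigMixin:
    """Shared profiler-config handling for the SFTTrainer classes in
    sft_trainer.py and sft_trainer_ray.py."""

    def _init_profiler(self, config):
        self.profiler_config = omega_conf_to_dataclass(config.profiler)
        # check profile interval
        self.profiler_interval = config.trainer.profile_interval
        self._validate_profiler_interval()

    def _validate_profiler_interval(self):
        assert len(self.profiler_interval) == 2
        self.start_profile_step, self.end_profile_step = self.profiler_interval
        assert self.end_profile_step >= self.start_profile_step
        if self.start_profile_step < 0:
            # matches the duplicated check: a negative start requires a negative end
            assert self.end_profile_step < 0

Both trainers would then call self._init_profiler(self.config) from their constructors instead of repeating the block quoted above.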

@vermouth1992 changed the title from "[perf] feat: support profiler in model engine ad sft trainer" to "[perf] feat: support profiler in model engine and sft trainer" on Dec 31, 2025