
Conversation

@ananthsub
Contributor

@ananthsub ananthsub commented Jan 27, 2026

What does this PR do ?

Sync with changes from NVIDIA/Megatron-LM#2210

Changelog

  • Add specific line by line info of high level changes in this PR.

GitHub Actions CI

See the CI section in the Contributing doc for how to trigger the CI. An NVIDIA developer will need to approve and trigger the CI for external contributors.

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g., Numba, Pynini, Apex)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

If you haven't finished some of the above items, you can still open a "Draft" PR.

Additional Information

  • Related to # (issue)

Summary by CodeRabbit

  • New Features
    • Added configurable timing verbosity levels for training performance logging. Users can now adjust logging detail through a configuration option to control which performance metrics appear in TensorBoard/W&B, enabling granular visibility into training dynamics.
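The configuration option described above can be sketched as follows. This is a hypothetical illustration: the option name `timing_log_level` comes from this PR's walkthrough, but the `LoggerConfig` dataclass shown here is not the actual megatron.bridge API.

```python
from dataclasses import dataclass


@dataclass
class LoggerConfig:
    """Illustrative logger config (hypothetical, not the real class).

    timing_log_level controls which timers are written to TensorBoard/W&B:
      0  - log no per-timer metrics
      1  - log forward-backward and gradient/optimizer timers
      2+ - additionally log batch-generator and communication timers
    """
    timing_log_level: int = 0


# Default keeps timer logging off; raise the level for more detail.
cfg = LoggerConfig(timing_log_level=1)
```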


Signed-off-by: Ananth Subramaniam <ansubramania@nvidia.com>
@copy-pr-bot

copy-pr-bot bot commented Jan 27, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@ananthsub ananthsub changed the title from "Sync 2210" to "[sync] Reduce Overhead in Timers" on Jan 27, 2026
@ananthsub
Contributor Author

/ok to test b83916c

@coderabbitai

coderabbitai bot commented Jan 27, 2026

📝 Walkthrough

Walkthrough

Modified the training_log function to conditionally select timers for logging based on logger_config.timing_log_level instead of using a fixed set, enabling three tiers of logging detail (levels 0, 1, and 2+) for TensorBoard/W&B output.

Changes

  • Training logging configuration — src/megatron/bridge/training/utils/train_utils.py
    Replaced the hard-coded timer list in training_log with conditional composition: level 0 logs no timers, level 1 logs forward-backward and gradient/optimizer operations, level 2+ additionally logs batch-generator and communication operations.
  • Test configuration — tests/unit_tests/training/utils/test_train_utils.py
    Added timing_log_level = 0 to the mock logger configuration in the test fixture to support the updated training log functionality.
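The tiered timer selection described above can be sketched as a small helper. This is a minimal illustration, not the actual training_log implementation; the timer key names are assumed for the example and the real keys in train_utils.py may differ.

```python
def select_timer_names(timing_log_level: int) -> list[str]:
    """Compose the list of timers to write to TensorBoard/W&B.

    Level 0 logs nothing; level 1 adds core step timers; level 2+
    additionally adds data-loading and communication timers.
    Timer names here are illustrative only.
    """
    names: list[str] = []
    if timing_log_level >= 1:
        # Core step timers: forward/backward compute and optimizer work.
        names += ["forward-backward", "optimizer"]
    if timing_log_level >= 2:
        # Finer-grained timers: batch generation and collective communication.
        names += ["batch-generator", "grads-all-reduce", "params-all-gather"]
    return names
```

At level 0 the function returns an empty list, so no per-timer metrics are emitted, which is where the overhead reduction in the PR title comes from.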

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

🚥 Pre-merge checks | ✅ 3 passed | ❌ 1 failed

❌ Failed checks (1 warning)

  • Test Results For Major Changes ⚠️ Warning — The PR implements major performance-related changes to timer logging, but the PR description contains only a placeholder with no documentation, test results, or performance impact analysis. Resolution: update the PR description to include concrete changes, test results, before-and-after performance numbers, and clarify convergence impact or the upstream sync source.

✅ Passed checks (3 passed)

  • Description Check ✅ Passed — Check skipped: CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed — The title '[sync] Reduce Overhead in Timers' directly relates to the main change: making timer logging conditional on timing_log_level to reduce logging overhead.
  • Docstring Coverage ✅ Passed — Docstring coverage is 100.00%, above the required threshold of 80.00%.


