
migrate to transformers v5 #760

Open
omkar-334 wants to merge 12 commits into roboflow:develop from omkar-334:transv5

Conversation

Contributor

@omkar-334 omkar-334 commented Feb 27, 2026

What does this PR do?

This PR migrates the DINOv2 backbone to Transformers v5.

Resolves #730

Type of Change

  • Refactoring (no functional changes)

Testing

[Screenshot: test run, 2026-02-28] All `not gpu` tests pass
  • I have tested this change locally

Checklist

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code where necessary, particularly in hard-to-understand areas
  • My changes generate no new warnings or errors
  • I have updated the documentation accordingly (if applicable)


codecov bot commented Feb 27, 2026

Codecov Report

❌ Patch coverage is 96.42857% with 1 line in your changes missing coverage. Please review.
✅ Project coverage is 69%. Comparing base (6ee2b3e) to head (0c49b15).
⚠️ Report is 81 commits behind head on develop.

❌ Your project check has failed because the head coverage (69%) is below the target coverage (95%). You can increase the head coverage or adjust the target coverage.

Additional details and impacted files
@@          Coverage Diff           @@
##           develop   #760   +/-   ##
======================================
  Coverage       69%    69%           
======================================
  Files           56     56           
  Lines         6966   6983   +17     
======================================
+ Hits          4807   4823   +16     
- Misses        2159   2160    +1     

@Borda Borda added the enhancement New feature or request label Feb 28, 2026
@Borda Borda requested a review from Copilot February 28, 2026 09:39
Contributor

Copilot AI left a comment


Pull request overview

Migrates RF-DETR’s DINOv2 windowed-attention backbone integration to be compatible with Transformers v5, aligning the project’s dependency constraints with the requested transformers>=5.0.0 support (Issue #730).

Changes:

  • Updated imports/backbone mixin initialization to match Transformers v5 APIs.
  • Removed head_mask plumbing throughout the windowed-attention DINOv2 implementation.
  • Updated package dependency constraint to require Transformers v5.
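Based on the constraints mentioned later in this PR (transformers >=5.0.0,<6.0.0), the `pyproject.toml` change presumably looks like the following sketch; the exact surrounding entries are an assumption:

```toml
# pyproject.toml (sketch; only the transformers bound is stated in this PR,
# the <6.0.0 upper bound is taken from the AGENTS.md update described below)
[project]
dependencies = [
    "transformers>=5.0.0,<6.0.0",
]
```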

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 4 comments.

Files reviewed:
  • src/rfdetr/models/backbone/dinov2_with_windowed_attn.py: refactors the DINOv2 windowed-attention backbone implementation for Transformers v5 API compatibility.
  • pyproject.toml: updates the Transformers dependency to >=5.0.0.
Comments suppressed due to low confidence (1)

src/rfdetr/models/backbone/dinov2_with_windowed_attn.py:864

  • PR description says this is a refactor with no functional changes, but head_mask was removed from the model forward(...) API and its docstrings. If this is an intentional breaking change required by Transformers v5, consider updating the PR description (and/or release notes) to reflect the public API change.
    def forward(
        self,
        pixel_values: Optional[torch.Tensor] = None,
        bool_masked_pos: Optional[torch.Tensor] = None,
        output_attentions: Optional[bool] = None,
        output_hidden_states: Optional[bool] = None,
        return_dict: Optional[bool] = None,
    ) -> Union[Tuple, BaseModelOutputWithPooling]:
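To illustrate the API break flagged above: v4-era call sites that still pass `head_mask` now fail with a `TypeError`. A minimal, self-contained sketch with a stand-in function (not the real model's `forward`):

```python
from typing import Optional

def forward(
    pixel_values: Optional[object] = None,
    bool_masked_pos: Optional[object] = None,
    output_attentions: Optional[bool] = None,
    output_hidden_states: Optional[bool] = None,
    return_dict: Optional[bool] = None,
):
    # Stand-in for the v5-era signature quoted above; head_mask is gone.
    return {"pixel_values": pixel_values}

# Old v4-style call sites break loudly rather than silently ignoring the mask:
try:
    forward(pixel_values="img", head_mask=None)
except TypeError as exc:
    print("removed kwarg:", exc)
```

This is why flagging the removal as a public API change (rather than a pure refactor) matters for downstream callers.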

Borda and others added 10 commits February 28, 2026 10:58
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: OpenAI Codex <codex@openai.com>
…ration

- Add `hidden_states: torch.Tensor` type annotation to
  `Dinov2WithRegistersSdpaSelfAttention.forward`
- Update stale warning message that incorrectly framed v5 as future
- Remove dead `_CHECKPOINT_FOR_DOC` constant (never referenced)
- Add upstream source comments on locally-copied
  `find_pruneable_heads_and_indices` and
  `get_aligned_output_features_output_indices` (both removed from
  transformers public API in v5)
- Update AGENTS.md version constraints to reflect v5 baseline
  (transformers >=5.0.0,<6.0.0; PyTorch >=2.0.0,<3.0.0)
- Add tests: utility function coverage (get_aligned, find_pruneable),
  WindowedDinov2WithRegistersBackbone smoke tests, and
  output_attentions=True SDPA fallback test
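For context on the vendored helpers above, the alignment logic of `get_aligned_output_features_output_indices` is roughly the following (a sketch of the function removed from the transformers public API; the upstream version also validates its inputs, which is omitted here):

```python
def get_aligned_output_features_output_indices(out_features, out_indices, stage_names):
    """Align backbone stage names (out_features) with stage positions (out_indices).

    Sketch of the helper vendored from transformers' backbone_utils;
    input validation from the upstream version is omitted.
    """
    if out_indices is None and out_features is None:
        # Neither given: default to the last stage.
        out_indices = [len(stage_names) - 1]
        out_features = [stage_names[-1]]
    elif out_indices is None:
        # Derive indices from the requested stage names.
        out_indices = [stage_names.index(name) for name in out_features]
    elif out_features is None:
        # Derive stage names from the requested indices.
        out_features = [stage_names[idx] for idx in out_indices]
    return out_features, out_indices

stages = ["stem", "stage1", "stage2", "stage3"]
print(get_aligned_output_features_output_indices(None, None, stages))
# → (['stage3'], [3])
```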

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…aviour

Windowed attention explicitly rejects output_attentions=True with an
AssertionError. The test now asserts that error is raised rather than
expecting attention weights to be returned.
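The guard described above can be sketched as a plain assertion at the top of the attention forward pass (a stand-in illustrating the behaviour the test asserts, not the actual rfdetr code):

```python
def windowed_attention_forward(hidden_states, output_attentions=False):
    # Windowed attention never materializes full attention weights,
    # so requesting them is rejected up front with an AssertionError.
    assert not output_attentions, "output_attentions=True is not supported with windowed attention"
    return hidden_states  # placeholder for the real attention computation

try:
    windowed_attention_forward([1.0], output_attentions=True)
except AssertionError:
    print("rejected as expected")
```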

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
… code

Apache 2.0 Section 4(c) requires retaining all copyright notices from
upstream source. Two functions were copied from different transformers
source files, each with a distinct copyright year/holder:

- find_pruneable_heads_and_indices: pytorch_utils.py
  "Copyright 2022 The HuggingFace Team. All rights reserved."
- get_aligned_output_features_output_indices: backbone_utils.py
  "Copyright 2023 The HuggingFace Inc. team. All rights reserved."

Both notices are now included in the file header alongside the existing
DINOv2 attribution (Copyright 2024 Meta Inc. / HuggingFace Inc.).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Both locally-copied functions now carry a MAINTENANCE comment directing
future editors to update the corresponding copyright line in the file
header if the function is moved or removed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

Labels

enhancement New feature or request


Development

Successfully merging this pull request may close these issues.

Request: Support for transformers >= 5.0.0

3 participants