Codecov Report

❌ Your project check has failed because the head coverage (69%) is below the target coverage (95%). You can increase the head coverage or adjust the target coverage.

Additional details and impacted files

```diff
@@            Coverage Diff            @@
##           develop    #760     +/-   ##
=========================================
  Coverage       69%     69%
=========================================
  Files           56      56
  Lines         6966    6983     +17
=========================================
+ Hits          4807    4823     +16
- Misses        2159    2160      +1
```
Pull request overview
Migrates RF-DETR’s DINOv2 windowed-attention backbone integration to be compatible with Transformers v5, aligning the project’s dependency constraints with the requested transformers>=5.0.0 support (Issue #730).
Changes:
- Updated imports and backbone mixin initialization to match Transformers v5 APIs.
- Removed `head_mask` plumbing throughout the windowed-attention DINOv2 implementation.
- Updated the package dependency constraint to require Transformers v5.
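The dependency bump in `pyproject.toml` presumably looks something like the following sketch; the exact key layout and the `<6.0.0` upper bound are assumptions (the upper bound is taken from the AGENTS.md constraints mentioned elsewhere in this thread):

```toml
[project]
dependencies = [
    # Transformers v5 baseline; upper bound assumed from AGENTS.md
    "transformers>=5.0.0,<6.0.0",
]
```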
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| `src/rfdetr/models/backbone/dinov2_with_windowed_attn.py` | Refactors the DINOv2 windowed-attention backbone implementation for Transformers v5 API compatibility. |
| `pyproject.toml` | Updates the Transformers dependency to `>=5.0.0`. |
Comments suppressed due to low confidence (1)
`src/rfdetr/models/backbone/dinov2_with_windowed_attn.py:864`

- The PR description says this is a refactor with no functional changes, but `head_mask` was removed from the model's `forward(...)` API and its docstrings. If this is an intentional breaking change required by Transformers v5, consider updating the PR description (and/or release notes) to reflect the public API change.
```python
def forward(
    self,
    pixel_values: Optional[torch.Tensor] = None,
    bool_masked_pos: Optional[torch.Tensor] = None,
    output_attentions: Optional[bool] = None,
    output_hidden_states: Optional[bool] = None,
    return_dict: Optional[bool] = None,
) -> Union[Tuple, BaseModelOutputWithPooling]:
```
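Downstream callers that still pass `head_mask` now fail at call time with a `TypeError`. A minimal illustration using a hypothetical stub with the same keyword surface as the signature above (not the real model class):

```python
class ModelStub:
    """Hypothetical stub mirroring the v5-style forward signature (no head_mask)."""

    def forward(
        self,
        pixel_values=None,
        bool_masked_pos=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
    ):
        return pixel_values


model = ModelStub()
model.forward(pixel_values=[1.0])  # fine under the v5-style signature
try:
    model.forward(pixel_values=[1.0], head_mask=None)  # v4-style call
except TypeError:
    pass  # head_mask is no longer an accepted keyword argument
```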
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: OpenAI Codex <codex@openai.com>
…ration

- Add `hidden_states: torch.Tensor` type annotation to `Dinov2WithRegistersSdpaSelfAttention.forward`
- Update stale warning message that incorrectly framed v5 as future
- Remove dead `_CHECKPOINT_FOR_DOC` constant (never referenced)
- Add upstream source comments on locally-copied `find_pruneable_heads_and_indices` and `get_aligned_output_features_output_indices` (both removed from the transformers public API in v5)
- Update AGENTS.md version constraints to reflect the v5 baseline (transformers >=5.0.0,<6.0.0; PyTorch >=2.0.0,<3.0.0)
- Add tests: utility function coverage (`get_aligned`, `find_pruneable`), `WindowedDinov2WithRegistersBackbone` smoke tests, and an `output_attentions=True` SDPA fallback test

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
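For context on the vendored `find_pruneable_heads_and_indices`: it computes which attention heads to drop and which indices along the flattened `(n_heads * head_size)` weight dimension to keep. A simplified, torch-free sketch of that logic (an illustration, not the vendored code, which returns a `torch.LongTensor`):

```python
def find_pruneable_heads_and_indices(heads, n_heads, head_size, already_pruned_heads):
    """Simplified sketch: return the heads to prune and the flat weight
    indices to keep, ignoring heads pruned in earlier calls."""
    # Drop heads that were already pruned previously
    heads = set(heads) - set(already_pruned_heads)
    # Shift requested head indices to account for previously removed heads
    adjusted = {h - sum(1 for p in already_pruned_heads if p < h) for h in heads}
    # Indices into the flattened (n_heads * head_size) dimension to KEEP
    index = [
        i
        for head in range(n_heads)
        if head not in adjusted
        for i in range(head * head_size, (head + 1) * head_size)
    ]
    return heads, index


# Prune head 1 of 3 heads (head_size=2): keep the slices of heads 0 and 2
heads, index = find_pruneable_heads_and_indices(
    [1], n_heads=3, head_size=2, already_pruned_heads=set()
)
# heads == {1}, index == [0, 1, 4, 5]
```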
…aviour

Windowed attention explicitly rejects `output_attentions=True` with an `AssertionError`. The test now asserts that the error is raised rather than expecting attention weights to be returned.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
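The adjusted test pattern can be sketched with a hypothetical stub standing in for the windowed-attention module (the real test presumably exercises the actual backbone in `dinov2_with_windowed_attn.py`):

```python
class WindowedAttentionStub:
    """Hypothetical stand-in mirroring the guard described above."""

    def forward(self, hidden_states, output_attentions=False):
        # Windowed attention cannot return per-window attention maps,
        # so the flag is rejected outright.
        assert not output_attentions, (
            "output_attentions is not supported with windowed attention"
        )
        return hidden_states


def test_windowed_attention_rejects_output_attentions():
    module = WindowedAttentionStub()
    try:
        module.forward(hidden_states=[1.0, 2.0], output_attentions=True)
    except AssertionError:
        return  # expected: the guard fired
    raise AssertionError("expected the output_attentions guard to fire")
```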
… code

Apache 2.0 Section 4(c) requires retaining all copyright notices from upstream source. Two functions were copied from different transformers source files, each with a distinct copyright year/holder:

- `find_pruneable_heads_and_indices` (pytorch_utils.py): "Copyright 2022 The HuggingFace Team. All rights reserved."
- `get_aligned_output_features_output_indices` (backbone_utils.py): "Copyright 2023 The HuggingFace Inc. team. All rights reserved."

Both notices are now included in the file header alongside the existing DINOv2 attribution (Copyright 2024 Meta Inc. / HuggingFace Inc.).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Both locally-copied functions now carry a MAINTENANCE comment directing future editors to update the corresponding copyright line in the file header if the function is moved or removed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
What does this PR do?
This PR migrates the DINOv2 windowed-attention backbone integration to Transformers v5 and updates the dependency constraint accordingly.
Resolves #730
Type of Change
Testing
Checklist