Conversation

@ddilbazTT (Contributor) commented Nov 25, 2025

Ticket

#5813
#6016

Problem description

During the uplift two weeks ago, the default tt-metal/ttnn linear op behaviour changed: the linear op now maps to PyTorch Linear semantics. This means that the linear op bias needs to be 1d (or a view of a 1d tensor, such as 1x1x64).
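For illustration, a minimal sketch of what "effectively 1d" could look like for the bias check, using standard MLIR types; the helper name isBiasEffectively1D is hypothetical and not necessarily what this PR uses:

```cpp
#include "llvm/ADT/STLExtras.h"
#include "mlir/IR/BuiltinTypes.h"
#include "mlir/IR/Value.h"

// Returns true when the bias is rank 1 (e.g. tensor<64xf32>) or a view of a
// 1d tensor such as tensor<1x1x64xf32>, where every dim but the last is 1.
static bool isBiasEffectively1D(mlir::Value bias) {
  auto type = mlir::dyn_cast<mlir::RankedTensorType>(bias.getType());
  if (!type)
    return false;
  llvm::ArrayRef<int64_t> shape = type.getShape();
  if (shape.size() == 1)
    return true;
  return !shape.empty() &&
         llvm::all_of(shape.drop_back(), [](int64_t d) { return d == 1; });
}
```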

@ctodTT relaxed the conditions on the linear op workaround so that all linear ops are broken down into matmul and add. However, this means no linear op runs as an actual linear op, even when it has a 1d bias.

What's changed

The linear op workaround is changed such that:

  • only linear ops with a non-1d bias or a batched b input are broken down into matmul + add
  • the activation is forwarded from the linear op to the matmul op correctly (previously it was dropped)
  • tests previously marked unsupported or skipped are reinstated

A sketch of the revised condition is shown below.
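A minimal sketch of the revised decomposition, assuming hypothetical op and accessor names (ttir::LinearOp, ttir::MatmulOp, ttir::AddOp, getActivationAttr) and the isBiasEffectively1D helper sketched above; the actual logic lives in LinearOpRewritePattern.cpp and may differ in detail:

```cpp
struct LinearOpRewritePattern : mlir::OpRewritePattern<ttir::LinearOp> {
  using OpRewritePattern::OpRewritePattern;

  mlir::LogicalResult
  matchAndRewrite(ttir::LinearOp op,
                  mlir::PatternRewriter &rewriter) const override {
    auto bType = mlir::cast<mlir::RankedTensorType>(op.getB().getType());
    bool batchedB = bType.getRank() > 2;
    bool bias1D = !op.getBias() || isBiasEffectively1D(op.getBias());

    // Supported case: 1d (or 1x...x1xN view) bias and non-batched b input.
    // Leave the linear op alone so it still runs as a real linear op.
    if (bias1D && !batchedB)
      return mlir::failure();

    // Unsupported case: decompose into matmul + add, forwarding the fused
    // activation to the matmul so it is no longer dropped.
    auto matmul = rewriter.create<ttir::MatmulOp>(
        op.getLoc(), op.getType(), op.getA(), op.getB(),
        /*activation=*/op.getActivationAttr());
    mlir::Value result = matmul.getResult();
    if (op.getBias())
      result = rewriter.create<ttir::AddOp>(op.getLoc(), op.getType(), result,
                                            op.getBias());
    rewriter.replaceOp(op, result);
    return mlir::success();
  }
};
```

Under this condition, a linear with a tensor<64xf32> bias and a 2d b input keeps its linear op, while a 32x64 bias or a 3d (batched) b input triggers the matmul + add decomposition.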

Checklist

@codecov-commenter commented Nov 25, 2025

Codecov Report

❌ Patch coverage is 73.80952% with 11 lines in your changes missing coverage. Please review.
✅ Project coverage is 69.38%. Comparing base (0e3246f) to head (05c7744).
⚠️ Report is 24 commits behind head on main.
✅ All tests successful. No failed tests found.

Files with missing lines                               | Patch %  | Lines
...rkarounds/Decomposition/LinearOpRewritePattern.cpp | 73.80%   | 11 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #6014      +/-   ##
==========================================
+ Coverage   69.32%   69.38%   +0.06%     
==========================================
  Files         330      333       +3     
  Lines       50290    50951     +661     
==========================================
+ Hits        34862    35353     +491     
- Misses      15428    15598     +170     

☔ View full report in Codecov by Sentry.

@ddilbazTT changed the title from "Enable linear again" to "[DRAFT PR] Change linear op rewrite pattern conditions" Nov 25, 2025
@ddilbazTT changed the title from "Change linear op rewrite pattern conditions" to "Change linear op rewrite pattern" Nov 25, 2025
@ddilbazTT ddilbazTT marked this pull request as ready for review November 25, 2025 20:15
@ddilbazTT ddilbazTT requested review from a team as code owners November 25, 2025 20:15
@ddilbazTT (Contributor, Author) commented:

@sdjordjevicTT @jserbedzijaTT I believe this PR addresses the missing activations in linears that have activations. However, you might also consider changing the linear-with-activation fusing pattern to absorb the repeat that comes before it; that way, those ops would remain linear ops with a 1d bias.

@ddilbazTT ddilbazTT force-pushed the ddilbaz/revert_linear_fixes branch from 5fa574a to 05c7744 Compare November 25, 2025 20:30