Commit 0741d5c
Fix clang-tidy misc-unused-params warnings (#35433)
# Fix clang-tidy `misc-unused-params` warnings
## Summary
This PR applies fixes for clang-tidy's `misc-unused-params` check, which
flags function parameters (and locals) that are never used. The changes
ensure that function signatures match their actual usage, improving code
clarity and maintainability.
## Changes Made
### 1. Removed Unused Variables
- **`ttnn/cpp/ttnn/operations/data_movement/scatter/tosa_scatter.cpp`**: Removed unused variable `K` (line 132)
- **`ttnn/cpp/ttnn/operations/data_movement/pad/device/pad_rm_reader_writer_multi_core_program_factory.cpp`**: Removed unused variable `nchannel` (line 196)
- **`ttnn/cpp/ttnn/operations/eltwise/unary/device/unary_device_operation.cpp`**: Removed unused variable `arch` (line 80)
### 2. Fixed Function Signature Mismatches
#### Slice Operations
- **`ttnn/cpp/ttnn/operations/data_movement/slice/device/slice_program_factory_tile.cpp`**:
  - Removed the `num_cores` argument from `set_slice_runtime_args_tile()` calls (2 locations)
  - The function signature was updated by clang-tidy, but the call sites weren't
- **`ttnn/cpp/ttnn/operations/data_movement/slice/device/slice_program_factory_tile_tensor_args.cpp`**:
  - Removed the `num_cores` argument from `set_slice_runtime_args_tensor_args()` calls (2 locations)
  - The function signature was updated by clang-tidy, but the call sites weren't
#### Grid Sample Operations
- **`ttnn/cpp/ttnn/operations/pool/grid_sample/grid_sample_prepare_grid.cpp`**:
  - Removed the unused `output_dtype` parameter from the `create_host_buffer_for_grid_preprocessing()` call
  - The function signature was updated by clang-tidy, but the call site wasn't
### 3. Fixed Template Function Calls
- **`ttnn/cpp/ttnn/operations/eltwise/unary/unary_nanobind.cpp`**:
  - Updated bind function calls to remove the second argument (the operation object) and to specify template parameters explicitly using `decltype`
  - Fixed functions: `bind_softplus`, `bind_sigmoid_accurate`, `bind_sigmoid_mode_appx`, `bind_unary_chain`, `bind_identity`
  - These functions use the operation directly from the namespace, so clang-tidy removed the parameter
### 4. Updated Test Code
- **`tests/ttnn/unit_tests/gtests/test_launch_operation.cpp`**:
  - Updated test structs to match the expected method signatures:
    - Added a `const std::vector<Tensor>& input_tensors` parameter to the `compute_output_specs()` methods
    - Added a `const std::vector<Tensor>& input_tensors` parameter to the `create_output_tensors()` methods
    - Added a `const ttnn::MeshCoordinateRangeSet& mesh_coordinate_range_set` parameter to the `create_mesh_workload()` method
  - These changes ensure the test mocks match the actual operation interface requirements
## Testing
- ✅ Build passes with `./build_metal.sh --debug --build-all`
- ✅ All clang-tidy `misc-unused-params` warnings resolved
- ✅ No functional changes; only signature updates to match actual usage
## Notes
- The clang-tidy tool automatically removed unused parameters from
function signatures, but some call sites needed manual updates
- All changes maintain backward compatibility at the API level; only
internal implementation details were modified
- Test code was updated to ensure test mocks correctly implement the
expected interfaces
---------
Co-authored-by: Bryan Wilder Field Lozano <blozano@tenstorrent.com>
557 files changed: +1062, −1101 lines