[docs] feat: improve docstrings in tensordict_utils.py (#1345) #4732
Summary
Functions documented in verl/utils/tensordict_utils.py:
Non-tensor data handling:
- assign_non_tensor_data - Assign a single non-tensor value to a TensorDict
- unwrap_non_tensor_data - Unwrap a NonTensorData to get the underlying value
- get_non_tensor_data - Retrieve and unwrap non-tensor data

TensorDict concatenation/splitting:
- concat_nested_tensors - Concatenate 2D nested tensors
- concat_tensordict_with_none_bsz - Handle TensorDicts with an empty batch size
- concat_tensordict - Concatenate multiple TensorDicts along dim 0
- chunk_tensordict - Split a TensorDict into equal-sized chunks

Data access and manipulation:
- index_select_tensor_dict - Select rows using indices
- union_tensor_dict - Merge two TensorDicts
- make_iterator - Create a mini-batch iterator for training
- assert_tensordict_eq - Assert that two TensorDicts are equal
- get / get_keys - Get values with automatic unwrapping
- pop / pop_keys - Remove and return values

Padding utilities:
- pad_to_divisor - Pad the batch dimension for distributed training
- unpad - Remove padding from a TensorDict
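The non-tensor helpers follow a common wrap/unwrap pattern: values stored in a TensorDict may arrive wrapped in a NonTensorData container, and the accessor unwraps them transparently. A minimal pure-Python sketch of that pattern (the `NonTensorData` stand-in and `unwrap_non_tensor` helper below are illustrative, not verl's actual implementation):

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class NonTensorData:
    # Stand-in for tensordict's NonTensorData wrapper, which carries
    # an arbitrary Python object alongside tensor entries.
    data: Any

def unwrap_non_tensor(value: Any) -> Any:
    # Return the underlying object for wrapped values; pass
    # everything else through unchanged.
    return value.data if isinstance(value, NonTensorData) else value

wrapped = NonTensorData("global_step=42")
print(unwrap_non_tensor(wrapped))    # unwraps to the underlying string
print(unwrap_non_tensor([1, 2, 3]))  # plain values pass through
```

This is why `get`/`get_keys` can offer "automatic unwrapping": callers never need to know whether an entry was tensor or non-tensor data.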
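`concat_tensordict` and `chunk_tensordict` are intended as inverses along dim 0: chunking a batch into equal pieces and concatenating them back should restore the original. Sketching that contract with plain lists (the real functions operate on TensorDicts; `chunk_evenly` is an illustrative name, not verl's API):

```python
from typing import List, Sequence, TypeVar

T = TypeVar("T")

def chunk_evenly(rows: Sequence[T], chunks: int) -> List[List[T]]:
    # Split into `chunks` equal-sized pieces; the batch dimension
    # must divide evenly, mirroring an equal-sized-chunks contract.
    if len(rows) % chunks != 0:
        raise ValueError(f"{len(rows)} rows not divisible by {chunks}")
    size = len(rows) // chunks
    return [list(rows[i * size:(i + 1) * size]) for i in range(chunks)]

batch = list(range(6))
pieces = chunk_evenly(batch, 3)                    # [[0, 1], [2, 3], [4, 5]]
restored = [x for piece in pieces for x in piece]  # concatenation undoes chunking
```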
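The padding utilities handle batches whose size does not divide the data-parallel world size: pad up to the next multiple, run the sharded computation, then strip the padding. The arithmetic is the part worth pinning down; a hedged sketch (function names and the repeat-last-row padding choice are assumptions for illustration, not verl's actual behavior):

```python
def pad_size_for(batch_size: int, divisor: int) -> int:
    # Number of padding rows needed so the batch divides evenly.
    return (-batch_size) % divisor

def pad_rows(rows, divisor):
    # Repeat the last row as padding; since padded rows are discarded
    # by unpad_rows, the exact fill choice is arbitrary. (Assumption:
    # verl may pad differently, e.g. by cycling rows from the start.)
    pad = pad_size_for(len(rows), divisor)
    return rows + [rows[-1]] * pad, pad

def unpad_rows(rows, pad):
    # Drop the padding rows appended by pad_rows.
    return rows[: len(rows) - pad] if pad else rows

padded, pad = pad_rows([1, 2, 3, 4, 5], 4)   # 5 rows -> 8 rows, pad == 3
assert unpad_rows(padded, pad) == [1, 2, 3, 4, 5]
```

Returning the pad count from the padding step is what lets the unpad step be exact, with no divisibility bookkeeping elsewhere.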