🧸 Fix unset tokenizer pad_token #3290

Merged: 9 commits, Apr 22, 2025
trl/trainer/grpo_trainer.py (6 changes: 5 additions & 1 deletion)
@@ -312,7 +312,9 @@ def reward_func(completions, **kwargs):
             Dataset to use for evaluation. It must meet the same requirements as `train_dataset`.
         processing_class ([`~transformers.PreTrainedTokenizerBase`], *optional*, defaults to `None`):
             Processing class used to process the data. The padding side must be set to "left". If `None`, the
-            processing class is loaded from the model's name with [`~transformers.AutoTokenizer.from_pretrained`].
+            processing class is loaded from the model's name with [`~transformers.AutoTokenizer.from_pretrained`]. A padding
+            token, `processing_class.pad_token`, must be defined. If not explicitly set, `processing_class.eos_token` will
+            be used as the default padding token.
         reward_processing_classes (`Union[PreTrainedTokenizerBase, list[PreTrainedTokenizerBase]]`, *optional*, defaults to `None`):
             Processing classes corresponding to the reward functions specified in `reward_funcs`. Can be either:
@@ -418,6 +420,8 @@ def __init__(
         # Processing class
         if processing_class is None:
             processing_class = AutoTokenizer.from_pretrained(model.config._name_or_path, padding_side="left")
+            if processing_class.pad_token is None:
+                processing_class.pad_token = processing_class.eos_token
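
As a standalone illustration of the fallback this hunk adds, here is a minimal sketch; GPT-2 is used only as a well-known example of a tokenizer that ships without a pad token and is not part of this PR:

    from transformers import AutoTokenizer

    # GPT-2's tokenizer defines an EOS token ('<|endoftext|>') but no pad token.
    tokenizer = AutoTokenizer.from_pretrained("gpt2", padding_side="left")
    assert tokenizer.pad_token is None

    # The fallback added in this hunk: reuse the EOS token for padding.
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token

    print(tokenizer.pad_token)  # '<|endoftext|>'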
@LeonEricsson (Contributor Author), Apr 14, 2025:

Should we warn/inform the user of this default behaviour?

Member:

Suggested change:

-            if processing_class.pad_token is None:
-                processing_class.pad_token = processing_class.eos_token
+        if processing_class.pad_token is None:
+            processing_class.pad_token = processing_class.eos_token

Member:

We also want to run this when the processing class is passed, right?
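
Taken together, the suggestion amounts to something like the following sketch (a reconstruction of the suggested control flow, not the exact merged code):

    # Load a default tokenizer only when none was passed in...
    if processing_class is None:
        processing_class = AutoTokenizer.from_pretrained(model.config._name_or_path, padding_side="left")
    # ...but apply the pad_token fallback in either case.
    if processing_class.pad_token is None:
        processing_class.pad_token = processing_class.eos_token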

@LeonEricsson (Contributor Author):

Yes, agreed. But the more I think about it, the less I like the idea of silently setting the padding token... even if it's documented.
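
A hypothetical non-silent variant along the lines discussed in this thread (not what the PR implements) could warn or fail loudly instead:

    import warnings

    if processing_class.pad_token is None:
        if processing_class.eos_token is None:
            # Hypothetical hard failure when no usable fallback exists.
            raise ValueError(
                "The processing class defines neither a pad token nor an eos token; "
                "set `processing_class.pad_token` explicitly."
            )
        # Hypothetical warning so the fallback is visible to the user.
        warnings.warn(
            f"No pad_token set; falling back to eos_token ({processing_class.eos_token!r}).",
            UserWarning,
        )
        processing_class.pad_token = processing_class.eos_token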


         # Reward functions
         if not isinstance(reward_funcs, list):