Commit adbaa23

Fix unexpected 'num_items_in_batch' argument in GPT-NeoX forward (#1850)
Co-authored-by: regisss <15324346+regisss@users.noreply.github.com>
1 parent d0d0172 commit adbaa23

File tree

1 file changed: +1 −0 lines changed

optimum/habana/transformers/models/gpt_neox/modeling_gpt_neox.py

Lines changed: 1 addition & 0 deletions
@@ -269,6 +269,7 @@ def gaudi_gpt_neox_model_forward(
     return_dict: Optional[bool] = None,
     cache_position: Optional[torch.LongTensor] = None,
     token_idx: Optional[torch.Tensor] = None,
+    **kwargs,
 ) -> Union[Tuple, BaseModelOutputWithPast]:
     """
     Copied from GPTNeoxModel.forward: https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt_neox/modeling_gpt_neox.py
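The one-line fix works because a `**kwargs` parameter silently absorbs any keyword arguments the function does not declare. A minimal, self-contained sketch (not the actual Optimum Habana code; the toy `forward_*` functions below are hypothetical stand-ins) of the failure mode a caller passing an extra `num_items_in_batch` keyword would hit, and how `**kwargs` resolves it:

```python
# Hypothetical stand-in for the forward signature BEFORE the fix:
# an unexpected keyword argument raises a TypeError.
def forward_without_kwargs(input_ids, token_idx=None):
    return input_ids


# Hypothetical stand-in for the forward signature AFTER the fix:
# extra keyword arguments (e.g. num_items_in_batch) land in kwargs
# and are ignored instead of crashing the call.
def forward_with_kwargs(input_ids, token_idx=None, **kwargs):
    return input_ids


try:
    forward_without_kwargs([1, 2], num_items_in_batch=8)
except TypeError as exc:
    print("without **kwargs:", type(exc).__name__)  # prints TypeError

print("with **kwargs:", forward_with_kwargs([1, 2], num_items_in_batch=8))
```

The same pattern is common across model `forward` methods: callers (such as a trainer's loss computation) may pass bookkeeping keywords that only some models consume, and a trailing `**kwargs` keeps every signature forward-compatible with them.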
