Error fine-tuning "Phi 4": Phi3ForCausalLM.__init__() got an unexpected keyword argument 'use_flash_attention_2' #163
Unanswered · isabelcabezasm asked this question in Product Feedback and Ideas
Replies: 1 comment
Can you confirm your transformers version?
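For context: in transformers v4.36 the boolean `use_flash_attention_2` flag was deprecated in favor of `attn_implementation`, and later releases dropped the compatibility shim entirely, so tooling that still forwards the old kwarg fails on current versions. Below is a minimal sketch of checking the installed version and loading the model with the newer argument; the hub id `microsoft/phi-4` is an assumption, so substitute whatever checkpoint your pipeline actually uses.

```python
import transformers
from transformers import AutoModelForCausalLM

# The legacy flag was deprecated around transformers v4.36; confirm which
# release the training environment actually runs.
print(transformers.__version__)

# Load with the newer argument instead of use_flash_attention_2=True.
# flash_attention_2 additionally requires the flash-attn package and a
# supported CUDA GPU; "microsoft/phi-4" is an assumed hub id.
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-4",
    torch_dtype="auto",
    attn_implementation="flash_attention_2",
)
```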
Technical Feedback
I've tried to fine-tune the Phi-4 model. The process fails while finalizing the training with this error: "Phi3ForCausalLM.__init__() got an unexpected keyword argument 'use_flash_attention_2'"

Training completed on: Sep 17, 2025 1:30 PM
Duration: 47m 19.52s
Training file: train_jsonl_2025-09-17_083311_UTC
Base model: Phi-4:v8
Virtual Machine: Standard_NC80adis_H100_v5

Task parameters:
Batch size multiplier: 1
Epochs: 1
Learning rate: 0.0003
Desired Outcome
Complete the training successfully.
Current Workaround
We didn't find any workaround.
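For reference, a minimal sketch of the suspected failure mode, assuming the fine-tuning service still forwards the legacy flag to `from_pretrained` (the hub id is again an assumption): on releases where the deprecation shim has been removed, the unrecognized kwarg falls through to the model constructor and raises exactly the `TypeError` reported above.

```python
from transformers import AutoModelForCausalLM

# On current transformers releases the unknown kwarg is passed through to
# Phi3ForCausalLM.__init__ (Phi-4 reuses the Phi-3 architecture), raising:
#   TypeError: Phi3ForCausalLM.__init__() got an unexpected keyword
#   argument 'use_flash_attention_2'
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-4",           # assumed hub id
    use_flash_attention_2=True,  # legacy flag, removed in newer releases
)
```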