This repository has been archived by the owner on Oct 19, 2024. It is now read-only.
Description
Hi all, I noticed that the latest transformers repo doesn't have a Flax version of the Llama model. Are there any solutions for supporting Llama fine-tuning?
There is a previous PR (#923) adding Llama model conversion, but it doesn't appear to support GQA (Llama 2).
Thanks for any suggestions :)
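For context on what the PR above would need to handle: grouped-query attention (GQA) in Llama 2 uses fewer key/value heads than query heads, with each KV head shared by a group of query heads. A minimal JAX sketch of the idea (hypothetical shapes and names, not the transformers API):

```python
import jax
import jax.numpy as jnp

def gqa(q, k, v):
    # q: (seq, n_q_heads, head_dim); k, v: (seq, n_kv_heads, head_dim)
    n_q_heads, n_kv_heads = q.shape[1], k.shape[1]
    # Each KV head serves a group of query heads, so repeat K/V along
    # the head axis until the head counts match.
    group = n_q_heads // n_kv_heads
    k = jnp.repeat(k, group, axis=1)
    v = jnp.repeat(v, group, axis=1)
    scores = jnp.einsum("qhd,khd->hqk", q, k) / jnp.sqrt(q.shape[-1])
    weights = jax.nn.softmax(scores, axis=-1)
    return jnp.einsum("hqk,khd->qhd", weights, v)

key = jax.random.PRNGKey(0)
q = jax.random.normal(key, (8, 8, 16))  # 8 query heads
k = jax.random.normal(key, (8, 2, 16))  # only 2 KV heads (GQA)
v = jax.random.normal(key, (8, 2, 16))
out = gqa(q, k, v)
print(out.shape)  # (8, 8, 16)
```

With `n_kv_heads == n_q_heads` this reduces to standard multi-head attention, which is why a conversion script written for Llama 1 can silently mis-handle Llama 2 checkpoints.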