This repository has been archived by the owner on Oct 19, 2024. It is now read-only.

Any solution to support llama2 finetune? #966

Open
@LeiWang1999

Description

Hi all, I noticed that the latest transformers repo doesn't have a Flax version of the Llama model. Are there any solutions for supporting Llama fine-tuning?

There is a previous PR (#923) that supports Llama model conversion, but it doesn't appear to support GQA (Llama 2).

Thanks for all your suggestions :)
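For reference, the main change GQA requires in an attention implementation is that the key/value projections have fewer heads than the queries, so each KV head is shared by a group of query heads. A minimal standalone JAX sketch (not the PR's code; the function name and shapes here are just illustrative assumptions) of how a Flax port could handle this by repeating KV heads:

```python
import jax
import jax.numpy as jnp

def gqa_attention(q, k, v):
    """Grouped-query attention sketch (hypothetical, for illustration).

    q:    (batch, q_len, num_q_heads, head_dim)
    k, v: (batch, kv_len, num_kv_heads, head_dim), num_kv_heads < num_q_heads
    Each group of query heads shares one key/value head.
    """
    num_q_heads = q.shape[2]
    num_kv_heads = k.shape[2]
    group_size = num_q_heads // num_kv_heads
    # Repeat each KV head so the head dimension matches the query heads.
    k = jnp.repeat(k, group_size, axis=2)
    v = jnp.repeat(v, group_size, axis=2)
    # Standard scaled dot-product attention from here on.
    scores = jnp.einsum("bqhd,bkhd->bhqk", q, k) / jnp.sqrt(q.shape[-1])
    weights = jax.nn.softmax(scores, axis=-1)
    return jnp.einsum("bhqk,bkhd->bqhd", weights, v)
```

When `num_kv_heads == num_q_heads` this reduces to ordinary multi-head attention, which is why a conversion script written for Llama 1 can silently mishandle Llama 2 checkpoints: the KV weight matrices are smaller and need this repeat (or an equivalent broadcast) at load or compute time.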
