Conversation

@shirinyamani

In this PR I've added a complete guide to fine-tuning Mistral-7B, along with answers to the potential questions one could have about the LoRA implementation. Hope this helps the community!
Cheers
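
For context, below is a minimal sketch of the kind of LoRA configuration such a guide typically walks through, assuming the standard `peft` API. The rank, alpha, dropout, and target modules are illustrative defaults commonly used with Mistral-7B, not values taken from this PR's notebook:

```python
from peft import LoraConfig

# A typical LoRA configuration for Mistral-7B (values are illustrative,
# not taken from the PR's notebook).
lora_config = LoraConfig(
    r=16,                    # rank of the low-rank update matrices
    lora_alpha=32,           # scaling factor applied to the LoRA update
    lora_dropout=0.05,       # dropout applied inside the LoRA layers
    bias="none",             # leave the base model's biases untouched
    task_type="CAUSAL_LM",
    # The attention projections are the usual LoRA targets in Mistral-7B.
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
```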

@NazimHAli

The notebook in this PR uses transformers, peft, and bitsandbytes to fine-tune, not mistral-finetune. There are similar notebooks for those libraries, like this one, with fine-tuning steps that you could update with some of your changes.
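
For reference, a minimal sketch of the transformers/peft/bitsandbytes stack being described, assuming their standard APIs; the checkpoint name and quantization settings below are illustrative assumptions, not taken from the PR's notebook:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import get_peft_model, prepare_model_for_kbit_training

model_id = "mistralai/Mistral-7B-v0.1"  # assumed checkpoint name

# Load the base model in 4-bit via bitsandbytes so it fits on a single GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

# Prepare the quantized model for training and attach the LoRA adapters
# (lora_config as sketched in the earlier comment).
model = prepare_model_for_kbit_training(model)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```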
