Replies: 1 comment
The "full version" refers to a model in which the LoRA weights have already been merged into the base model, so it can be used directly; it does not mean the model is …
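The merge described above can be sketched numerically. This is a minimal illustration of the LoRA merge arithmetic using NumPy stand-ins for real model weights; the shapes, rank, and `alpha / r` scaling convention here are illustrative assumptions, not the repository's actual configuration.

```python
# Minimal sketch of "merging LoRA into the base model" (hypothetical
# dimensions and scaling; real model weights are much larger).
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r, alpha = 8, 8, 2, 4   # assumed dims, LoRA rank, scaling
W = rng.normal(size=(d_out, d_in))   # frozen base weight
A = rng.normal(size=(r, d_in))       # LoRA down-projection
B = rng.normal(size=(d_out, r))      # LoRA up-projection

# With an unmerged adapter, inference computes: y = W x + (alpha/r) * B A x
# "Merging" folds the low-rank update into the base weight once:
W_merged = W + (alpha / r) * B @ A

x = rng.normal(size=(d_in,))
y_adapter = W @ x + (alpha / r) * B @ (A @ x)
y_merged = W_merged @ x
assert np.allclose(y_adapter, y_merged)  # same outputs, adapter no longer needed
```

In practice this folding is typically done with tooling such as Hugging Face PEFT's `merge_and_unload`, which produces a standalone checkpoint like the "full version" release discussed here.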
Question 5 of the wiki FAQ asks: "Why use LoRA instead of full-parameter pretraining?" But the "Download" section of the README lists a full version of Llama-3-Chinese-8B. As I understand it, that full version was obtained from Meta's llama-3-8B base model via full-parameter unsupervised Chinese pretraining.
I'm a beginner with limited understanding here; could someone explain? Many thanks.