Abnormal inference when using inference_hf.py #53
Unanswered · Xiaoshu-Zhao asked this question in Q&A
Replies: 0 comments
I want to run Llama-3 locally under WSL2. Following the Hugging Face inference method from the ReadMe, I ran the following code:
I found that inference hangs, and GPU utilization stays very low.
What could be causing this?
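The asker's script was not included in the export. As a point of reference only, the ReadMe-style Hugging Face inference flow being described is roughly the sketch below; the model id, prompt, and dtype here are illustrative assumptions, not the asker's actual code:

```python
# Minimal sketch of Hugging Face inference for a Llama-3-style model.
# Model id, prompt, and dtype are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # assumed model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 so an 8B model can fit on one consumer GPU
    device_map="auto",          # with too little VRAM, accelerate offloads layers to CPU
)

inputs = tokenizer("Hello, who are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

One thing worth checking against symptoms like these: with `device_map="auto"`, inspect `model.hf_device_map` after loading. If any layers landed on `cpu` or `disk`, decoding becomes very slow and GPU utilization stays low, which can look like a hang rather than a crash.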