After full-parameter fine-tuning of Qwen2.5_VL with LLaMA-Factory, the LLaMA-Factory evaluation results and the inference results from a vllm model-serving deployment are inconsistent #6986

Unanswered
Lauriecando asked this question in Q&A
Replies: 1 comment, 3 replies (between @Lauriecando and @hiyouga)