add uv to fix vllm transformers install #59
Conversation
Signed-off-by: JaredforReal <w13431838023@gmail.com>
Pull request overview
This PR updates the installation instructions in the documentation to use uv (a faster Python package installer) instead of pip. The changes aim to fix issues with vLLM and transformers installation by leveraging uv's better dependency resolution.
Changes:
- Replaced `pip` commands with `uv pip` commands throughout both English and Chinese README files
- Added a virtual environment creation step using `uv venv --python 3.12 --seed`
- Added the `--torch-backend=auto` flag to the vLLM installation command
- Added comments noting that uv may not be needed in Docker containers
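Taken together, the changes describe an installation flow along these lines (a sketch assembled from the commands quoted in this PR; the exact README wording may differ):

```bash
# Create and activate a seeded Python 3.12 virtual environment
uv venv --python 3.12 --seed
source .venv/bin/activate

# Install the package with uv's resolver instead of plain pip
uv pip install -e .

# Install vLLM nightly wheels, auto-selecting the torch backend
uv pip install -U vllm --extra-index-url https://wheels.vllm.ai/nightly --torch-backend=auto
```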
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 8 comments.
| File | Description |
|---|---|
| README.md | Updated installation instructions to use uv instead of pip, added venv creation step, and modified vLLM installation command |
| README_zh.md | Chinese version of the same documentation updates as README.md |
```diff
 git clone https://github.com/zai-org/glm-ocr.git
-cd glm-ocr && pip install -e .
+cd glm-ocr
```
The installation instructions now require users to have uv pre-installed, but there are no instructions on how to install uv itself. Users who don't have uv installed will not be able to follow these instructions. Consider adding a prerequisite section or a link to uv installation instructions (e.g., `pip install uv`, or a link to https://docs.astral.sh/uv/getting-started/installation/).
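A prerequisite note could be as short as the following (a sketch; both commands are documented ways to install uv):

```bash
# Standalone installer (Linux/macOS)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Or, if a working pip is already available
pip install uv
```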
On README.md, in the diff context ending with "Launch the service:":
The comment has a grammatical error. "uv may not be need" should be "uv may not be needed". The verb "need" should be in its past participle form "needed" when used with "be".
Suggested change: `# In docker container, uv may not be needed for transformers install`
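The Docker note itself carries a practical point: containers often ship a ready-made Python environment, in which case plain pip may be enough. A minimal sketch of the two paths (the `transformers` package spec below is assumed, since the surrounding diff context is not reproduced here):

```bash
# On a host machine, inside the uv-managed venv:
uv pip install transformers

# In a docker container that already provides its own environment,
# plain pip may be sufficient:
pip install transformers
```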
On README.md, at a second code block also followed by "Launch the service:":
The comment has the same grammatical error: "uv may not be need" should be "uv may not be needed".
Suggested change: `# In docker container, uv may not be needed for transformers install`
On README_zh.md, in the diff context ending with "启动服务:" ("Launch the service:"):
The Chinese comment contains a typo. "或许不在需要" should be "或许不再需要". The character "在" (at/in) should be "再" (again) to correctly express "may no longer need".
Suggested change: `# 在 docker 容器中,或许不再需要 uv 来安装transformers` (i.e., "In the docker container, uv may no longer be needed to install transformers")
On README_zh.md, at a second code block also followed by "启动服务:":
The same typo appears here: "或许不在需要" should be "或许不再需要".
Suggested change: `# 在 docker 容器中,或许不再需要 uv 来安装transformers`
@@ -83,15 +87,16 @@ Deploy the GLM-OCR model locally for full control. The SDK provides the complete
Install vLLM:
`uv pip install -U vllm --extra-index-url https://wheels.vllm.ai/nightly --torch-backend=auto`
The vLLM installation command adds the --torch-backend=auto flag. However, this flag may not be compatible with all versions of vLLM. Consider verifying that this flag is supported in the nightly build being installed, or document the minimum vLLM version that supports this flag to prevent installation errors for users.
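A defensive variant of the install step would address this concern (a sketch; the fallback simply retries without the flag):

```bash
# Prefer the auto-selected torch backend; if the flag is rejected by the
# installed uv/vLLM combination, retry with a plain nightly install.
uv pip install -U vllm --extra-index-url https://wheels.vllm.ai/nightly --torch-backend=auto \
  || uv pip install -U vllm --extra-index-url https://wheels.vllm.ai/nightly
```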
@@ -85,15 +87,16 @@ API 文档:https://docs.bigmodel.cn/cn/guide/models/vlm/glm-ocr
安装 vLLM: ("Install vLLM:")
The vLLM installation command adds the --torch-backend=auto flag. However, this flag may not be compatible with all versions of vLLM. Consider verifying that this flag is supported in the nightly build being installed, or document the minimum vLLM version that supports this flag to prevent installation errors for users.
Suggested change:
`uv pip install -U vllm --extra-index-url https://wheels.vllm.ai/nightly --torch-backend=auto`
`# 如需使用 --torch-backend=auto 参数,请先确认当前 vLLM 版本已支持该参数(可参考 vLLM 官方文档或发行说明)` (i.e., "If you want to use the --torch-backend=auto flag, first confirm that your vLLM version supports it; see the vLLM docs or release notes")
Contribution Guide
We welcome your contributions to this repository. To ensure elegant code style and better code quality, we have prepared the following contribution guidelines.
What We Accept
Any contribution is welcome, as long as your code strictly follows the coding standards below.
Code Style Guide
Good code style is an art. We have prepared a pre-commit hook to enforce consistent code formatting across the project. You can clean up your code following the steps below:
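A typical pre-commit workflow looks like this (a sketch of standard pre-commit usage; the repository's actual hook configuration may differ):

```bash
# Install the pre-commit tool and register the repository's hooks
pip install pre-commit
pre-commit install

# Run every configured hook against the whole codebase
pre-commit run --all-files
```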
If your code complies with the standards, you should not see any errors.
Naming Conventions
Avoid non-descriptive single-letter names such as `a`, `b`, `c`.