add uv to fix vllm transformers install #59

Merged

JaredforReal merged 3 commits into main from fix/install on Feb 5, 2026

Conversation

@JaredforReal (Collaborator):

Contribution Guide

We welcome your contributions to this repository. To ensure elegant code style and better code quality, we have prepared
the following contribution guidelines.

What We Accept

  • This PR fixes a typo or improves the documentation (if this is the case, you may skip the other checks).
  • This PR fixes a specific issue — please reference the issue number in the PR description. Make sure your code strictly
    follows the coding standards below.
  • This PR introduces a new feature — please clearly explain the necessity and implementation of the feature. Make sure
    your code strictly follows the coding standards below.

Code Style Guide

Good code style is an art. We have prepared a pre-commit hook to enforce consistent code
formatting across the project. You can clean up your code by following the steps below:

pre-commit run --all-files

If your code complies with the standards, you should not see any errors.

Naming Conventions

  • Please use English for naming; do not use Pinyin or other languages. All comments should also be in English.
  • Follow PEP 8 naming conventions strictly, and use underscores to separate words. Avoid meaningless names such as
    a, b, c.

Copilot AI review requested due to automatic review settings, February 5, 2026 07:21
Copilot AI (Contributor) left a comment:

Pull request overview

This PR updates the installation instructions in the documentation to use uv (a faster Python package installer) instead of pip. The changes aim to fix issues with vLLM and transformers installation by leveraging uv's better dependency resolution.

Changes:

  • Replaced pip commands with uv pip commands throughout both English and Chinese README files
  • Added virtual environment creation step using uv venv --python 3.12 --seed
  • Added --torch-backend=auto flag to vLLM installation command
  • Added comments noting that uv may not be needed in Docker containers

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 8 comments.

File          Description
README.md     Updated installation instructions to use uv instead of pip, added venv creation step, and modified vLLM installation command
README_zh.md  Chinese version of the same documentation updates as README.md


Comment on lines 42 to +43
  git clone https://github.com/zai-org/glm-ocr.git
- cd glm-ocr && pip install -e .
+ cd glm-ocr
Copilot AI, Feb 5, 2026:

The installation instructions now require users to have uv pre-installed, but there are no instructions on how to install uv itself. Users who don't have uv installed will not be able to follow these instructions. Consider adding a prerequisite section or a link to uv installation instructions (e.g., pip install uv or reference to https://docs.astral.sh/uv/getting-started/installation/).


Launch the service:

Copilot AI, Feb 5, 2026:

The comment has a grammatical error. "uv may not be need" should be "uv may not be needed". The verb "need" should be in its past participle form "needed" when used with "be".

Suggested change:

- # In docker container, uv may not be need for transformers install
+ # In docker container, uv may not be needed for transformers install


Launch the service:

Copilot AI, Feb 5, 2026:

The comment has a grammatical error. "uv may not be need" should be "uv may not be needed". The verb "need" should be in its past participle form "needed" when used with "be".

Suggested change:

- # In docker container, uv may not be need for transformers install
+ # In docker container, uv may not be needed for transformers install


启动服务: (Launch the service:)

Copilot AI, Feb 5, 2026:

The Chinese comment contains a typo. "或许不在需要" should be "或许不再需要". The character "在" (at/in) should be "再" (again) to correctly express "may no longer need".

Suggested change:

- # 在 docker 容器中,或许不在需要 uv 来安装transformers
+ # 在 docker 容器中,或许不再需要 uv 来安装transformers


启动服务: (Launch the service:)

Copilot AI, Feb 5, 2026:

The Chinese comment contains a typo. "或许不在需要" should be "或许不再需要". The character "在" (at/in) should be "再" (again) to correctly express "may no longer need".

Suggested change:

- # 在 docker 容器中,或许不在需要 uv 来安装transformers
+ # 在 docker 容器中,或许不再需要 uv 来安装transformers

@@ -83,15 +87,16 @@ Deploy the GLM-OCR model locally for full control. The SDK provides the complete
Install vLLM:

Copilot AI, Feb 5, 2026:

The vLLM installation command adds the --torch-backend=auto flag. However, this flag may not be compatible with all versions of vLLM. Consider verifying that this flag is supported in the nightly build being installed, or document the minimum vLLM version that supports this flag to prevent installation errors for users.

Suggested change:

- uv pip install -U vllm --torch-backend=auto --extra-index-url https://wheels.vllm.ai/nightly
+ uv pip install -U vllm --extra-index-url https://wheels.vllm.ai/nightly

@@ -85,15 +87,16 @@ API 文档:https://docs.bigmodel.cn/cn/guide/models/vlm/glm-ocr
安装 vLLM: (Install vLLM:)

Copilot AI, Feb 5, 2026:

The vLLM installation command adds the --torch-backend=auto flag. However, this flag may not be compatible with all versions of vLLM. Consider verifying that this flag is supported in the nightly build being installed, or document the minimum vLLM version that supports this flag to prevent installation errors for users.

Suggested change:

- uv pip install -U vllm --torch-backend=auto --extra-index-url https://wheels.vllm.ai/nightly
+ uv pip install -U vllm --extra-index-url https://wheels.vllm.ai/nightly
+ # 如需使用 `--torch-backend=auto` 参数,请先确认当前 vLLM 版本已支持该参数(可参考 vLLM 官方文档或发行说明)

(The added Chinese comment reads: "If you need the `--torch-backend=auto` flag, first confirm that your current vLLM version supports it; see the vLLM documentation or release notes.")