[Upstream Breaking Change] vLLM d7de043d → v0.15.1 - Security CVEs #52


Summary

vLLM has released v0.15.1 with critical security fixes for two CVEs. The current pin d7de043d55d1dd629554467e23874097e1c48993 is vulnerable.

Security Issues Fixed

  • Two CVEs in vLLM's dependencies (aiohttp and Protobuf); see the upstream v0.15.1 release notes for the CVE identifiers
Additional Improvements in v0.15.1

Hardware Support

  • RTX Blackwell (SM120): Fixed NVFP4 MoE kernel support
  • FP8 kernel selection: Fixed CUTLASS group GEMM fallback on SM120

Performance

  • torch.compile cold-start: Fixed regression (Llama3-70B: ~88s → ~22s)
  • MoE forward pass: Optimized layer name computation caching

Dependencies

  • LMCache pinned to >= v0.3.9 (we're at v0.3.13, compatible)
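A quick way to confirm the installed LMCache satisfies the new pin. This is a sketch that assumes the pip distribution is named `lmcache`; adjust the name if your environment installs it differently:

```shell
# Hedged check: the "lmcache" distribution name is an assumption.
python - <<'PY'
from importlib.metadata import version, PackageNotFoundError

try:
    installed = version("lmcache")
except PackageNotFoundError:
    print("lmcache not installed in this environment; skipping check")
    raise SystemExit(0)

# vLLM v0.15.1 requires LMCache >= 0.3.9; this repo pins v0.3.13.
parsed = tuple(int(x) for x in installed.split(".")[:3])
assert parsed >= (0, 3, 9), f"LMCache {installed} is older than required 0.3.9"
print(f"LMCache {installed} satisfies the >= 0.3.9 pin")
PY
```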

Impact

CRITICAL - Security vulnerabilities in dependencies (aiohttp, Protobuf) require immediate upgrade.

Files Affected

According to docs/upstream-versions.md:

  • docker/Dockerfile.cuda line 68 (VLLM_COMMIT_SHA)

Recommended Action

  1. Update docker/Dockerfile.cuda line 68:

    ARG VLLM_COMMIT_SHA=d7de043d55d1dd629554467e23874097e1c48993

    to:

    ARG VLLM_COMMIT_SHA=1892993bc18e243e2c05841314c5e9c06a80c70d

    (the commit SHA for the v0.15.1 tag)

  2. Test the container build

  3. Run E2E tests to confirm compatibility

Upstream References

Generated by Upstream Dependency Monitor
