vLLM takes a lot of time to start; can we use the Docker image? #3872
Unanswered · Stealthwriter asked this question in Q&A
Replies: 2 comments
-
You should be good with picking a vLLM container that suits your needs and loading it as described in https://skypilot.readthedocs.io/en/latest/examples/docker-containers.html#docker-containers-as-runtime-environments
-
We have an example using the vLLM Docker image here: https://github.com/skypilot-org/skypilot/blob/master/llm/vllm/serve-openai-api-docker.yaml Note that RunPod has some limitations, so this YAML may not work there.
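For reference, a minimal sketch of what such a SkyPilot task YAML can look like, loosely adapted from the linked serve-openai-api-docker.yaml example. The accelerator type, model name, and port here are assumptions for illustration, not tested values, and as the reply above notes, RunPod's support for custom Docker images has limitations:

```yaml
# Hypothetical sketch only -- adapted loosely from the linked
# serve-openai-api-docker.yaml example. Accelerator, model, and
# port values are assumptions; adjust for your RunPod account.
service:
  readiness_probe: /v1/models   # OpenAI-compatible endpoint used as health check
  replicas: 1

resources:
  cloud: runpod
  accelerators: A100:1                        # assumption: any GPU RunPod offers
  image_id: docker:vllm/vllm-openai:latest    # vLLM's prebuilt OpenAI-API image
  ports: 8000

run: |
  # Start the OpenAI-compatible API server; --model is a placeholder.
  python -m vllm.entrypoints.openai.api_server \
    --model mistralai/Mistral-7B-Instruct-v0.2 \
    --host 0.0.0.0 --port 8000
```

With a file like this you would launch the service via `sky serve up task.yaml`; if RunPod rejects the Docker runtime image, falling back to a plain VM image with vLLM installed in `setup:` is the usual workaround.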
- Original question (Stealthwriter): Hi, I'm trying to use sky serve with the Docker image of vLLM, and my cloud provider is RunPod. Which YAML should I use?