Add possibility to do batch inference in vllm (#15) #41
Triggered via push on April 26, 2025 at 20:44
Status: Success
Total duration: 2m 9s
python-app.yml

on: push