v1.63.8-nightly
What's Changed
- Delegate router azure client init logic to azure provider by @krrishdholakia in #9140
- Bing Search Pass Thru by @sfarthin in #8019
- [Feat] Add OpenAI Responses API to litellm python SDK by @ishaan-jaff in #9155
- Support credential management on Proxy - via CRUD endpoints `credentials/*` by @krrishdholakia in #9124 (sketched after this list)
- Bump @babel/runtime-corejs3 from 7.26.0 to 7.26.10 in /docs/my-website by @dependabot in #9167
- Bump @babel/helpers from 7.26.0 to 7.26.10 in /docs/my-website by @dependabot in #9168
- fix(azure): Patch for Function Calling Bug & Update Default API Version to `2025-02-01-preview` by @colesmcintosh in #9191
- [Feat] - Add Responses API on LiteLLM Proxy by @ishaan-jaff in #9183 (see the usage sketch after this list)
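Two of the entries above ship new surfaces worth a quick look. For the Responses API (added to the SDK in #9155 and to the proxy in #9183), a minimal sketch, assuming `litellm.responses()` mirrors the OpenAI Responses API shape; the model name and prompt are placeholders:

```python
# Minimal sketch of the new Responses API in the litellm SDK (#9155).
# Assumes litellm.responses() mirrors OpenAI's Responses API shape;
# the model name and prompt below are placeholders.
import litellm

response = litellm.responses(
    model="openai/gpt-4o",
    input="Write a one-sentence summary of what LiteLLM does.",
)
print(response)
```

For the credential CRUD endpoints from #9124, a hypothetical sketch against a locally running proxy; the endpoint paths and payload fields here are assumptions, so treat the proxy docs as authoritative:

```python
# Hypothetical sketch of the credentials/* CRUD endpoints (#9124).
# Endpoint paths, payload fields, and the admin key are assumptions.
import requests

BASE_URL = "http://localhost:4000"             # locally running proxy
HEADERS = {"Authorization": "Bearer sk-1234"}  # placeholder admin key

# Create a stored credential.
requests.post(
    f"{BASE_URL}/credentials",
    headers=HEADERS,
    json={
        "credential_name": "my-azure-credential",
        "credential_values": {"api_key": "...", "api_base": "..."},
    },
)

# List stored credentials.
print(requests.get(f"{BASE_URL}/credentials", headers=HEADERS).json())
```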
Full Changelog: v1.63.7-nightly...v1.63.8-nightly
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.63.8-nightly
```
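Once the container is up, a quick smoke test, assuming an OpenAI-compatible client pointed at the proxy; the API key and model name are placeholders for whatever your instance is configured with:

```python
# Smoke test against the locally running proxy.
# api_key and model are placeholders for your proxy's configuration.
import openai

client = openai.OpenAI(api_key="sk-1234", base_url="http://localhost:4000")
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)
```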
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
---|---|---|---|---|---|---|---|---|---|
/chat/completions | Passed ✅ | 230.0 | 258.44 | 6.16 | 0.0033 | 1842 | 1 | 86.55 | 3971.50 |
Aggregated | Passed ✅ | 230.0 | 258.44 | 6.16 | 0.0033 | 1842 | 1 | 86.55 | 3971.50 |