Check for existing issues
- I have searched the existing issues and checked that my issue is not a duplicate.
What happened?
LiteLLM Proxy fails to send a request to Azure with:
litellm.APIError: AzureException APIError - 'utf-8' codec can't encode character '\ud83e' ... surrogates not allowed
This happens while routing azure/gpt-5.4 with configured model-group fallbacks.
Observed behavior:
- the original request fails with a UnicodeEncodeError on \ud83e
- LiteLLM retries 5 times
- fallback attempts also fail with the same encoding error
- one fallback path additionally emits "No fallback model group found for original model_group=gpt-5", which looks like secondary noise after the original malformed input has already poisoned the request
Expected behavior:
- LiteLLM should sanitize / reject lone surrogate characters before sending the request to the provider
- or escape the offending value before transport/logging
- and fallback handling should preserve the original root cause without cascading misleading fallback errors
This looks related to malformed Unicode input (likely a broken/truncated emoji surrogate) reaching the Azure request serialization path.
Possibly related to prior surrogate/UTF-8 issues such as #8583, but this case is on Azure + proxy fallback flow.
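A minimal sketch of the kind of sanitization requested above, assuming a pre-send hook over outgoing text fields (the function and regex here are illustrative, not existing litellm API):

```python
import re

# Code points U+D800-U+DFFF are UTF-16 surrogates. A lone surrogate is not
# a valid Unicode scalar value, so str.encode("utf-8") raises
# UnicodeEncodeError ("surrogates not allowed") when it hits one.
_SURROGATE_RE = re.compile("[\ud800-\udfff]")

def sanitize_surrogates(text: str) -> str:
    """Replace lone surrogate code points with U+FFFD (replacement char)."""
    return _SURROGATE_RE.sub("\ufffd", text)

broken = "truncated emoji: \ud83e"
clean = sanitize_surrogates(broken)
clean.encode("utf-8")  # no longer raises
```

Running such a pass over message content before serialization (or rejecting the request with a 400) would surface the root cause once, instead of letting the malformed string fail through every retry and fallback.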
Steps to Reproduce
- Run LiteLLM Proxy with an Azure model group such as gpt-5.4.
- Configure fallbacks for that model group.
- Send a request where one of the text fields (message content, tool output, or other forwarded text) contains a lone surrogate character such as \ud83e instead of a valid Unicode scalar value.
- Observe that the Azure call fails before completion with: 'utf-8' codec can't encode character '\ud83e' ... surrogates not allowed
- Observe repeated retries and fallback attempts failing with the same root error.
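The lone surrogate in step 3 can be produced without any litellm involvement: Python's json parser accepts an unpaired \uXXXX surrogate escape and yields a str that then cannot be UTF-8 encoded. A minimal standalone reproduction of the failing encode (field names are illustrative):

```python
import json

# json.loads accepts a lone "\ud83e" escape (e.g. from a client that
# truncated a surrogate pair) and produces an unpaired surrogate in the str.
payload = json.loads('{"content": "broken emoji: \\ud83e"}')
text = payload["content"]

try:
    # This is the same failure seen in the proxy's Azure serialization path.
    text.encode("utf-8")
except UnicodeEncodeError as exc:
    print("encode failed:", exc.reason)  # "surrogates not allowed"
```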
Relevant log output
Alert type: llm_exceptions
Level: High
Timestamp: 03:47:12
Message: LLM API call failed: litellm.APIError: AzureException APIError - 'utf-8' codec can't encode character '\ud83e' in position 325105: surrogates not allowed. Received Model Group=gpt-5.4
Available Model Group Fallbacks=['gpt-5.4-pro', 'gpt-5.2', 'gpt-5.1', 'gpt-5.2-codex', 'gpt-5.1-codex-max', 'gpt-5-pro', 'gpt-5']
Error doing the fallback: litellm.APIError: AzureException APIError - 'utf-8' codec can't encode character '\ud83e' in position 325103: surrogates not allowedNo fallback model group found for original model_group=gpt-5. Fallbacks=[{'gpt-5.2-codex': ['gpt-5.1-codex-max', 'gpt-5.1', 'gpt-5-pro']}, {'gpt-5.2': ['gpt-5.1', 'gpt-5.1-codex-max', 'gpt-5.2-codex', 'gpt-5', 'gpt-5-pro']}, {'gpt-5.4': ['gpt-5.4-pro', 'gpt-5.2', 'gpt-5.1', 'gpt-5.2-codex', 'gpt-5.1-codex-max', 'gpt-5-pro', 'gpt-5']}]. Received Model Group=gpt-5
Available Model Group Fallbacks=None
Error doing the fallback: litellm.APIError: AzureException APIError - 'utf-8' codec can't encode character '\ud83e' in position 325103: surrogates not allowedNo fallback model group found for original model_group=gpt-5. Fallbacks=[{'gpt-5.2-codex': ['gpt-5.1-codex-max', 'gpt-5.1', 'gpt-5-pro']}, {'gpt-5.2': ['gpt-5.1', 'gpt-5.1-codex-max', 'gpt-5.2-codex', 'gpt-5', 'gpt-5-pro']}, {'gpt-5.4': ['gpt-5.4-pro', 'gpt-5.2', 'gpt-5.1', 'gpt-5.2-codex', 'gpt-5.1-codex-max', 'gpt-5-pro', 'gpt-5']}] LiteLLM Retried: 5 times, LiteLLM Max Retries: 5 LiteLLM Retried: 5 times, LiteLLM Max Retries: 5 LiteLLM Retried: 5 times, LiteLLM Max Retries: 5
Model: azure/gpt-5.4
API Base: [redacted Azure endpoint]
Messages: None
Proxy URL: [redacted internal proxy URL]
What part of LiteLLM is this about?
Proxy
What LiteLLM version are you on?
1.82.3
Twitter / LinkedIn details
No response