What happened?
When using `azure_ai/flux.2-pro` for image editing via `/v1/images/edits`, the reference image is silently ignored. FLUX generates a completely new image from the prompt instead of editing the provided reference image.
The root cause is in `litellm/llms/azure_ai/image_edit/flux2_transformation.py`. The `transform_image_edit_request` method sends the base64-encoded image under the JSON key `"image"`, but the Black Forest Labs API on Azure AI Foundry expects `"input_image"`.
The API does not return an error. It simply ignores the unknown `"image"` field and generates a new image from scratch, making the bug hard to detect.
I verified this by patching line 113 from `"image"` to `"input_image"` in a running LiteLLM proxy container. With the fix, the reference image is correctly preserved and edited. Without it, a completely different image is generated.
References:
Steps to Reproduce
- Configure `azure_ai/flux.2-pro` in `config.yaml`:

```yaml
- model_name: flux.2-pro
  litellm_params:
    model: azure_ai/flux.2-pro
    api_base: "os.environ/AZURE_AI_API_BASE"
    api_key: "os.environ/AZURE_AI_API_KEY"
    api_version: "preview"
  model_info:
    mode: "image_generation"
```
- Send an image edit request:

```python
import os

import litellm

response = litellm.image_edit(
    model="azure_ai/flux.2-pro",
    image=open("car.png", "rb"),
    prompt="Make the car red.",
    api_base=os.environ["AZURE_AI_API_BASE"],
    api_key=os.environ["AZURE_AI_API_KEY"],
    api_version="preview",
)
```
- The returned image is a completely new image (different car, different scene) instead of the reference image with the color changed.
- Patch `flux2_transformation.py:113` from `"image"` to `"input_image"`, restart the proxy, and repeat. The reference image is now correctly used.
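Because the API returns HTTP 200 either way, detecting this regression automatically requires comparing the returned image against the reference. A rough pure-Python heuristic (illustrative only; the function name and threshold idea are mine, and a real test might use a perceptual hash instead):

```python
def mean_channel_diff(pixels_a, pixels_b):
    """Mean absolute per-channel difference between two equal-size RGB
    pixel sequences (lists of (r, g, b) tuples), in the range 0-255.
    A freshly generated image typically differs far more from the
    reference than a targeted edit ("make the car red") does."""
    if len(pixels_a) != len(pixels_b):
        raise ValueError("images must be the same size")
    total = sum(
        abs(ca - cb)
        for pa, pb in zip(pixels_a, pixels_b)
        for ca, cb in zip(pa, pb)
    )
    return total / (3 * len(pixels_a))
```

Identical images score 0; a fully unrelated generation scores much higher, so a simple threshold would have caught the silently dropped reference.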
The fix (one-line change in `flux2_transformation.py:113`):

```diff
- "image": image_b64,
+ "input_image": image_b64,
```
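In context, the corrected request body looks like the sketch below. The helper name is hypothetical (the real change is inside `transform_image_edit_request`), and any surrounding fields are omitted:

```python
def build_edit_payload(prompt: str, image_b64: str) -> dict:
    """Minimal sketch of the edit request body after the fix."""
    return {
        "prompt": prompt,
        # Must be "input_image": the BFL endpoint on Azure AI Foundry
        # silently drops an unknown "image" key and generates from scratch.
        "input_image": image_b64,
    }
```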
Secondary issue: `_convert_image_to_base64` discards all but the first image from a list (lines 126-129). The BFL API supports up to 8 reference images via `input_image`, `input_image_2`, ..., `input_image_8`.
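A follow-up fix for the multi-image case could map the list onto those field names along these lines (function name illustrative, not LiteLLM's actual code):

```python
def build_input_image_fields(images_b64: list) -> dict:
    """Map a list of base64-encoded reference images onto the BFL field
    names: input_image, input_image_2, ..., input_image_8."""
    if not 1 <= len(images_b64) <= 8:
        raise ValueError("BFL FLUX image edit accepts 1-8 reference images")
    fields = {}
    for i, b64 in enumerate(images_b64, start=1):
        key = "input_image" if i == 1 else f"input_image_{i}"
        fields[key] = b64
    return fields
```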
Relevant log output
No errors logged. The API returns HTTP 200 with a valid but wrong image.
What part of LiteLLM is this about?
Proxy
What LiteLLM version are you on?
v1.82.3-stable (also verified on current `main` branch, same code, same bug)
Twitter / LinkedIn details
No response