Model
lama
Describe the bug
When sending a POST request to the /api/v1/inpaint endpoint with valid PNG files (image and mask) via curl or Python requests, the server returns HTTP 500 Internal Server Error with UnicodeDecodeError: 'utf-8' codec can't decode byte 0x89 in position 135: invalid start byte. The error occurs consistently, even after converting the mask to a binary (1-bit grayscale) format. Byte 0x89 is the first byte of the PNG signature, so the issue appears to be FastAPI attempting to decode binary PNG data as UTF-8 text during error handling, likely due to improper validation or exception handling in the multipart form-data processing.
Steps to Reproduce:
python -m iopaint start --device cpu --model lama --host 0.0.0.0
Send a multipart form-data request with valid PNG files:
curl -F 'image=@4.png;type=image/png' -F 'mask=@mask_check_binary.png;type=image/png' http://192.168.251.81:8080/api/v1/inpaint --output demo_out.png
Observe the HTTP 500 response; the following error body is written to demo_out.png:
{"error":"UnicodeDecodeError","detail":"","body":"","errors":"'utf-8' codec can't decode byte 0x89 in position 135: invalid start byte"}
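The same request can also be built from Python, as mentioned in the description. A minimal sketch, assuming the endpoint takes multipart fields named image and mask exactly as in the curl command; tiny in-memory PNGs stand in for the real 4.png and mask_check_binary.png, and the request is only prepared locally so the raw body can be inspected:

```python
import io

import requests
from PIL import Image


def png_bytes(mode: str) -> bytes:
    # Generate a tiny valid PNG in memory (stand-in for the real files)
    buf = io.BytesIO()
    Image.new(mode, (4, 4), color=0).save(buf, format="PNG")
    return buf.getvalue()


# Prepare (without sending) the same multipart request the curl command sends
req = requests.Request(
    "POST",
    "http://192.168.251.81:8080/api/v1/inpaint",
    files={
        "image": ("4.png", png_bytes("RGB"), "image/png"),
        "mask": ("mask_check_binary.png", png_bytes("1"), "image/png"),
    },
).prepare()

# The multipart body embeds raw PNG bytes, beginning with 0x89 -- the exact
# byte the server's UnicodeDecodeError complains about when it tries to
# decode the binary payload as UTF-8 text.
assert b"\x89PNG" in req.body
```

Sending the prepared request with requests.Session().send(req) reproduces the 500 response; the assertion above just confirms that the body legitimately contains non-UTF-8 bytes, which a correct multipart parser must handle.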
Expected Behavior
The server should process the inpainting request and return a processed image as a PNG file.
Actual Behavior
The server returns an HTTP 500 error with a UnicodeDecodeError, indicating a failure to handle binary PNG data correctly.
System Info
iopaint: 1.6.0
pytorch: 2.7.1
CUDA: N/A (running on CPU)
Additional details:
Python: 3.11.2
Platform: Debian GNU/Linux 12
Other dependencies:
torchvision: 0.22.1
Pillow: 9.5.0
diffusers: 0.27.2
transformers: 4.48.3
opencv-python: 4.11.0.86
accelerate: 1.7.0
Files:
4.png: PNG image, 1500x1013, 8-bit/color RGB
mask_check_binary.png: PNG image, 1500x1013, 1-bit grayscale
(converted with: convert mask_check.png -threshold 50% mask_check_binary.png)
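For reference, the same 50% threshold conversion can be done with Pillow (which is already a dependency); this is a sketch equivalent to the ImageMagick command above, with to_binary_mask as a hypothetical helper name:

```python
from PIL import Image


def to_binary_mask(img: Image.Image) -> Image.Image:
    # 50% threshold: grayscale pixels at or above 128 become white (255),
    # the rest black (0), then reduce to 1-bit mode ("1")
    return img.convert("L").point(lambda p: 255 if p >= 128 else 0).convert("1")
```

Usage: to_binary_mask(Image.open("mask_check.png")).save("mask_check_binary.png"). Either way the resulting mask is a 1-bit grayscale PNG, and the server fails identically on both the original and the thresholded mask.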
If I use the web UI at http://192.168.251.81:8080/, everything works perfectly.
