🔖 Feature description
Postiz currently uses a fixed OpenAI base URL and model. I am requesting support for customizing these values through environment variables, allowing developers and self-hosters to point Postiz to different OpenAI-compatible providers or local LLM servers.
This would include supporting:
- `OPENAI_BASE_URL`
- `OPENAI_MODEL`
When these environment variables are not provided, Postiz should use the existing default values.
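For example, a self-hosted setup pointing at a local Ollama server might set (values are illustrative only, not defaults Postiz ships with):

```shell
# Illustrative .env overrides for a local OpenAI-compatible server (Ollama here)
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_MODEL=llama3
```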
🎤 Why is this feature needed?
In my use-case, I need to run Postiz with local or alternative OpenAI-compatible APIs such as Ollama, LM Studio, OpenRouter, or custom vLLM deployments.
Right now, the base URL and model are hardcoded, so switching providers is impossible without modifying the source code.
Examples:
- Using local models during development
- Switching to more cost-effective API providers
- Running internal enterprise LLMs
- Testing different models dynamically
ENV-based configuration makes Postiz more flexible without requiring UI changes.
✌️ How do you aim to achieve this?
I want this feature to allow Postiz to read two optional ENV variables:
- `OPENAI_BASE_URL`
- `OPENAI_MODEL`
The app would use these values if available; otherwise, it would fall back to the default OpenAI settings.
No UI changes are required; the behavior remains unchanged unless users override it with environment settings.
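The fallback behavior described above could be sketched roughly as follows. This is an illustrative example only; the function name `getOpenAIConfig` and the default values shown are assumptions, not Postiz's actual internals:

```typescript
type Env = Record<string, string | undefined>;

// Sketch: read optional env overrides, falling back to defaults when unset.
// The default URL/model below are placeholders for whatever Postiz uses today.
export function getOpenAIConfig(env: Env): { baseURL: string; model: string } {
  return {
    baseURL: env.OPENAI_BASE_URL || 'https://api.openai.com/v1',
    model: env.OPENAI_MODEL || 'gpt-4o',
  };
}
```

The resulting `baseURL` and `model` would then be passed wherever the OpenAI client is constructed, so existing installs without these variables behave exactly as before.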
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
- I checked and didn't find a similar issue
Are you willing to submit a PR?
None