This project provides a simple Streamlit chat app to get you started with Azure AI Foundry and the Azure AI Inference client library for Python.
- Supports both text and image inputs
- Seamlessly switch between preconfigured AI models via a dropdown menu
- Select from predefined system prompt templates or enter custom instructions
- Non-persistent chat history managed with Streamlit session state, with options to view or clear it at any time (a minimal sketch of this pattern follows the feature list)
- Print messages and session state to the console for troubleshooting
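The model dropdown, prompt templates, and non-persistent history described above map onto standard Streamlit widgets and `st.session_state`. The sketch below is illustrative only; the model names, prompt text, and widget labels are placeholders rather than the app's actual configuration.

```python
import streamlit as st

# Hypothetical choices; the real app defines its own model list and prompt templates.
MODELS = ["gpt-4o", "gpt-4o-mini"]
SYSTEM_PROMPTS = {"Default": "You are a helpful assistant.", "Custom": ""}

model = st.sidebar.selectbox("Model", MODELS)
prompt_name = st.sidebar.selectbox("System prompt", list(SYSTEM_PROMPTS))

# Chat history lives only in session state, so it disappears when the session ends.
if "messages" not in st.session_state:
    st.session_state.messages = []

if st.sidebar.button("Clear chat history"):
    st.session_state.messages = []

# Replay the stored conversation.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if user_input := st.chat_input("Type a message"):
    st.session_state.messages.append({"role": "user", "content": user_input})
    print(st.session_state.messages)  # dump to console for troubleshooting
    # ...call the selected model here and append the assistant reply...
```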
This project uses the Azure AI Inference client library for Python (`azure-ai-inference`). If preferred, you can modify the code to use the OpenAI Python client library (`openai`) instead.
For a list of supported models, services, and known issues, refer to the Azure AI Inference client library for Python documentation.
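For orientation, a minimal chat completion with `azure-ai-inference` looks roughly like the sketch below. The endpoint, API key, and model name are placeholders; the app itself assembles these from its configuration rather than hard-coding them.

```python
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholders: substitute your Azure AI Foundry endpoint, API key, and deployed model name.
client = ChatCompletionsClient(
    endpoint="https://<your-resource>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<your-api-key>"),
)

response = client.complete(
    model="<your-model-deployment>",
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Hello!"),
    ],
)
print(response.choices[0].message.content)
```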
Use GitHub Codespaces, Dev Containers in VS Code, or set up manually:
- Clone the repository:
  ```bash
  git clone https://github.com/gerbermarco/simple-azure-ai-streamlit-chat.git
  cd simple-azure-ai-streamlit-chat
  ```
- Copy and modify the `.env` file:
  ```bash
  cp .env.example .env
  # Edit .env and update the values with your details
  ```
  Note: Never commit your `.env` file to version control, as it contains sensitive information. (A sketch of how the app might read these values follows the steps below.)
- Install dependencies:
  ```bash
  pip install -r requirements.txt
  ```
- Run the Streamlit app:
  ```bash
  streamlit run app.py
  ```
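As referenced in the `.env` step above, the values you set there are typically read at startup. A minimal sketch of that pattern, assuming `python-dotenv` is available and using hypothetical variable names (use the names defined in `.env.example`):

```python
import os

from dotenv import load_dotenv  # python-dotenv

# Load key/value pairs from the local .env file into the process environment.
load_dotenv()

# Variable names are illustrative; match them to the ones in .env.example.
endpoint = os.getenv("AZURE_AI_ENDPOINT")
api_key = os.getenv("AZURE_AI_API_KEY")
```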