
Simple Azure AI Streamlit Chat Template

This project provides a simple Streamlit chat app to get you started with Azure AI Foundry and the Azure AI Inference client library for Python.

Screenshot

๐Ÿ“ Overview

Features

  • Supports both text and image inputs
  • Seamlessly switch between preconfigured AI models via a dropdown menu
  • Select from predefined system prompt templates or enter custom instructions
  • Non-persistent chat history managed with Streamlit session state, with options to view or clear the session at any time (see the sketch after this list)
  • Print messages and session state to the console for troubleshooting
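
The session-state pattern behind these features can be summarized in a minimal sketch. This is not the exact code in app.py; the widget labels and model names below are illustrative:

```python
import streamlit as st

# Non-persistent chat history: lives only in the current Streamlit session.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Sidebar controls (illustrative labels): model dropdown, clear, and debug view.
model = st.sidebar.selectbox("Model", ["model-a", "model-b"])
if st.sidebar.button("Clear chat history"):
    st.session_state.messages = []
if st.sidebar.checkbox("Show session state"):
    st.sidebar.write(st.session_state)
    print(st.session_state.messages)  # printed to the console for troubleshooting

# Replay the stored conversation, then handle new input.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Type a message"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
```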

Azure AI Inference client library

This project uses the Azure AI Inference client library for Python (azure-ai-inference). If you prefer, you can modify the code to use the OpenAI Python client library (openai) instead.
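
A minimal chat completion with azure-ai-inference looks roughly like the sketch below; the environment variable names, endpoint format, and model name are placeholders, not the repository's exact code:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholder names: take the real endpoint and key from your .env file.
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_KEY"]),
)

response = client.complete(
    model="gpt-4o",  # any model deployed in your Azure AI Foundry project
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Hello!"),
    ],
)
print(response.choices[0].message.content)
```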

For a list of supported models, services, and known issues, refer to the Azure AI Inference client library for Python documentation.

🚀 Quick Start

Use GitHub Codespaces, Dev Containers in VS Code, or set up manually:

  1. Clone the repository:
git clone https://github.com/gerbermarco/simple-azure-ai-streamlit-chat.git
cd simple-azure-ai-streamlit-chat
  2. Copy and modify the .env file (the sketch after these steps shows how the values are typically loaded):
cp .env.example .env
# Edit .env and update the values with your details

Note: Never commit your .env file to version control, as it contains sensitive information.

  3. Install dependencies:
pip install -r requirements.txt
  4. Run the Streamlit app:
streamlit run app.py
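
The values in .env are read into the environment when the app starts. Here is a minimal sketch of that step, assuming the project uses python-dotenv and with illustrative variable names (check .env.example for the real ones):

```python
import os

from dotenv import load_dotenv  # from the python-dotenv package

load_dotenv()  # loads key/value pairs from .env into os.environ

endpoint = os.environ["AZURE_INFERENCE_ENDPOINT"]  # illustrative name; see .env.example
api_key = os.environ["AZURE_INFERENCE_KEY"]        # illustrative name; see .env.example
```

Once running, Streamlit serves the app on http://localhost:8501 by default.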
