This page describes how to run OpenChat Playground (OCP) with models served by Foundry Local.
- Get the repository root.

  ```bash
  # bash/zsh
  REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
  ```

  ```powershell
  # PowerShell
  $REPOSITORY_ROOT = git rev-parse --show-toplevel
  ```
- Make sure the Foundry Local server is up and running.

  ```bash
  foundry service start
  ```
-
Download the Foundry Local model. The default model OCP uses is
phi-4-mini.foundry model download phi-4-mini
Alternatively, if you want to run with a different model, say
qwen2.5-7b, other than the default one, download it first by running the following command.foundry model download qwen2.5-7b
Make sure to follow the model MUST be selected from the CLI output of
foundry model list. -
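Since only aliases that `foundry model list` reports are valid, you can guard the download step with a small check. This is a minimal sketch, assuming the `foundry` CLI is on your `PATH` (it degrades gracefully when it is not), and `ALIAS` is just an example value:

```shell
# Guarded download: only call `foundry model download` when the alias
# appears in the `foundry model list` output.
ALIAS="qwen2.5-7b"
if command -v foundry >/dev/null 2>&1; then
  if foundry model list | grep -q "$ALIAS"; then
    foundry model download "$ALIAS"
  else
    echo "Alias '$ALIAS' not found in 'foundry model list' output." >&2
  fi
else
  echo "The 'foundry' CLI is not installed; skipping the check." >&2
fi
```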
- Make sure you are at the repository root.

  ```bash
  cd $REPOSITORY_ROOT
  ```
- Run the app.

  ```bash
  # bash/zsh
  dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
      --connector-type FoundryLocal
  ```

  ```powershell
  # PowerShell
  dotnet run --project $REPOSITORY_ROOT\src\OpenChat.PlaygroundApp -- `
      --connector-type FoundryLocal
  ```

  Alternatively, if you want to run with a different model, say `qwen2.5-7b`, make sure you've already downloaded it by running the `foundry model download qwen2.5-7b` command, then pass its alias.

  ```bash
  # bash/zsh
  dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
      --connector-type FoundryLocal \
      --alias qwen2.5-7b
  ```

  ```powershell
  # PowerShell
  dotnet run --project $REPOSITORY_ROOT\src\OpenChat.PlaygroundApp -- `
      --connector-type FoundryLocal `
      --alias qwen2.5-7b
  ```
- Open your web browser, navigate to `http://localhost:5280`, and enter prompts.
- Set the Foundry Local service port. The default port OCP uses is `55438`.

  ```bash
  foundry service set --port 55438
  ```

  Alternatively, if you want to run with a different port, say `63997`, other than the default one, set it first by running the following command.

  ```bash
  foundry service set --port 63997
  ```
- Make sure the Foundry Local server is up and running.

  ```bash
  foundry service start
  ```
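To sanity-check that the service actually came up on the configured port, a minimal sketch follows. It assumes the default port `55438` and that a `/v1/models` route is exposed under the OpenAI-compatible `/v1` endpoint used later in this page; adjust both if your setup differs:

```shell
# Probe the Foundry Local endpoint on the configured port.
# PORT mirrors the default used above; the /v1/models path is an assumption.
PORT=55438
if curl -fsS "http://localhost:${PORT}/v1/models" >/dev/null 2>&1; then
  echo "Foundry Local is answering on port ${PORT}."
else
  echo "No response on port ${PORT} - is 'foundry service start' running?" >&2
fi
```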
- Download the Foundry Local model. The default model OCP uses is `phi-4-mini`.

  ```bash
  foundry model download phi-4-mini
  ```

  Alternatively, if you want to run with a different model, say `qwen2.5-7b`, other than the default one, download it first by running the following command.

  ```bash
  foundry model download qwen2.5-7b
  ```

  Note that the model MUST be one listed in the output of `foundry model list`.
- Load the Foundry Local model. The default model OCP uses is `phi-4-mini`.

  ```bash
  foundry model load phi-4-mini
  ```

  Alternatively, if you want to run with a different model, say `qwen2.5-7b`, other than the default one, load it first by running the following command.

  ```bash
  foundry model load qwen2.5-7b
  ```
- Make sure you are at the repository root.

  ```bash
  cd $REPOSITORY_ROOT
  ```
- Build a container.

  ```bash
  docker build -f Dockerfile -t openchat-playground:latest .
  ```
- Run the app. The `{{Model ID}}` refers to the `Model ID` shown in the output of the `foundry service list` command.

  ```bash
  # bash/zsh - from locally built container
  docker run -i --rm -p 8080:8080 openchat-playground:latest \
      --connector-type FoundryLocal \
      --alias {{Model ID}} \
      --endpoint http://host.docker.internal:55438/v1 \
      --disable-foundrylocal-manager
  ```

  ```powershell
  # PowerShell - from locally built container
  docker run -i --rm -p 8080:8080 openchat-playground:latest `
      --connector-type FoundryLocal `
      --alias {{Model ID}} `
      --endpoint http://host.docker.internal:55438/v1 `
      --disable-foundrylocal-manager
  ```

  ```bash
  # bash/zsh - from GitHub Container Registry
  docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
      --connector-type FoundryLocal \
      --alias {{Model ID}} \
      --endpoint http://host.docker.internal:55438/v1 \
      --disable-foundrylocal-manager
  ```

  ```powershell
  # PowerShell - from GitHub Container Registry
  docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
      --connector-type FoundryLocal `
      --alias {{Model ID}} `
      --endpoint http://host.docker.internal:55438/v1 `
      --disable-foundrylocal-manager
  ```
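Since the `Model ID` differs per machine, it can help to put it in a variable before running the container. The following is a sketch for a bash shell: `MODEL_ID` keeps the page's `{{Model ID}}` placeholder, which you must replace with the real value from `foundry service list`, and the command is only echoed so you can inspect it before running it for real:

```shell
# Build the docker run command from variables, then print it for review.
# MODEL_ID is a placeholder - substitute the ID from `foundry service list`.
MODEL_ID="{{Model ID}}"
ENDPOINT="http://host.docker.internal:55438/v1"
echo docker run -i --rm -p 8080:8080 openchat-playground:latest \
  --connector-type FoundryLocal \
  --alias "$MODEL_ID" \
  --endpoint "$ENDPOINT" \
  --disable-foundrylocal-manager
```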
- Open your web browser, navigate to `http://localhost:8080`, and enter prompts.