58 commits
All commits authored by kunal-vaishnavi:

faf284c  Merging all constrained decoding commits (May 11, 2025)
3aafd99  Make constrained decoding examples more generic (May 11, 2025)
504fd94  Rewrite C# LLM example (May 11, 2025)
8b83ed7  Merge branch 'main' into kvaishnavi/guidance-schema (Dec 20, 2025)
d4b840d  Undo changes after syncing with main (Dec 22, 2025)
7b3dcb9  Remove SLM engine example (Dec 22, 2025)
3101872  Undo change in C# file (Dec 22, 2025)
8dbb361  Refactor Python examples (Dec 22, 2025)
aed89b3  Add new tool definitions (Dec 22, 2025)
6afe11f  Refactor C# examples (Dec 25, 2025)
d6ee36d  Remove commented out code (Dec 25, 2025)
34dd37e  Fix some typos (Dec 25, 2025)
3f12e60  Update READMEs (Dec 25, 2025)
c78bc3d  Rename C# examples (Dec 25, 2025)
16e1b1d  Refactor C/C++ examples (Jan 7, 2026)
da11f3d  Improve parity between the examples (Jan 7, 2026)
f857338  Add C# binding for Overlay API (Jan 8, 2026)
57de609  Add more parity between the examples (Jan 8, 2026)
bc72a13  Add changes suggested by C++ linter (Jan 8, 2026)
71082aa  Disable HF token (Jan 8, 2026)
94b07af  Disable HF remote (Jan 8, 2026)
2ec708a  Fix some CI failures (Jan 16, 2026)
4419129  Use bytes for special tokens processing on cross-platform (Jan 16, 2026)
cc5d988  Change number of layers used in ONNX model creation (Jan 16, 2026)
8fca8ea  Use OS replace instead of OS rename (Jan 16, 2026)
6c7b186  Use str instead of int in subprocess (Jan 16, 2026)
cad80cf  Add changes suggested by C++ linter (Jan 16, 2026)
031be13  Update name references to match examples (Jan 17, 2026)
b7eb5ea  Use all layers for Qwen-2.5 0.5B only (Jan 17, 2026)
45ffe03  Reduce max length to fit KV cache in CI's GPU (Jan 17, 2026)
0d42b89  Update CLI args in Python examples (Jan 17, 2026)
f6b26cb  Update CLI args in C# example (Jan 17, 2026)
6926c3f  Add C/C++ and C# example testing in CIs (Jan 17, 2026)
cb64014  Fix spacing in YAML files (Jan 17, 2026)
66bad08  Update CLI args in C examples (Jan 17, 2026)
8116b68  Add missing param in header (Jan 17, 2026)
bb22936  Build C/C++ examples with CMake in CIs (Jan 19, 2026)
f3561b2  Fix library directory path (Jan 19, 2026)
d392772  Use building Java API to build C/C++ examples via build script (Jan 19, 2026)
2cc85d8  Run Java build after Python dependencies are installed (Jan 20, 2026)
5fa6caa  Remove config overlay call in C# for now (Jan 20, 2026)
88eeb31  Downgrade GenAI version in C# examples (Jan 20, 2026)
40358f6  Use dnf instead of apt (Jan 20, 2026)
5cba9f4  Use superuser to update OS packages (Jan 20, 2026)
8e7c29d  Try to fix macOS CI failure (Jan 21, 2026)
4243683  Test only CPU build of C examples (Jan 21, 2026)
46cab0d  Use devel instead of dev for package name (Jan 21, 2026)
5761423  Use Python 3.11 for devel package (Jan 21, 2026)
aa4dfcf  Add Python3 packages for Linux CUDA CI (Jan 21, 2026)
37d604f  Try aliasing Python3 to env Python executable (Jan 21, 2026)
42d1ee3  Provide Python executable to subprocess (Jan 21, 2026)
e070e71  Change path to built C examples for Linux (Jan 21, 2026)
b421a39  Use CPU model instead of CUDA in CI (Jan 21, 2026)
5a4cdc1  Merge branch 'main' into kvaishnavi/guidance-schema (Jan 23, 2026)
a44beae  Remove chat app example (Jan 23, 2026)
0f9059d  Remove Genny example (Jan 23, 2026)
d6745e2  Update Python example docs (Jan 23, 2026)
9272d15  Merge branch 'main' into kvaishnavi/guidance-schema (Jan 27, 2026)
26 changes: 21 additions & 5 deletions .github/workflows/linux-cpu-x64-build.yml
@@ -96,7 +96,7 @@ jobs:
cmake --build --preset linux_gcc_cpu_release
cmake --build --preset linux_gcc_cpu_release --target PyPackageBuild

- name: Install the python wheel and test dependencies
- name: Install the Python wheel and test dependencies
run: |
python3 -m pip install -r test/python/requirements.txt --user
python3 -m pip install -r test/python/cpu/torch/requirements.txt --user
@@ -110,9 +110,14 @@ jobs:
ls -l ${{ github.workspace }}/build/cpu
ls -l ${{ github.workspace }}/build/cpu/wheel

- name: Build the Java API and Run the Java Tests
run: |
set -e -x
python3 build.py --config=Release --build_dir build/cpu --build_java --parallel --cmake_generator "Ninja"

# This will also download all the test models to the test/test_models directory
# These models are used by the python tests as well as C#, C++ and others.
- name: Run the python tests
- name: Run the Python tests
run: |
export ORTGENAI_LOG_ORT_LIB=1
python3 test/python/test_onnxruntime_genai.py --cwd test/python --test_models test/test_models
@@ -123,10 +128,21 @@
cd test/csharp
dotnet test /p:Configuration=Release /p:NativeBuildOutputDir="../../build/cpu/" /p:OrtLibDir="../../ort/lib/" --verbosity normal

- name: Build the Java API and Run the Java Tests
- name: Build the C# Examples
run: |
set -e -x
python3 build.py --config=Release --build_dir build/cpu --build_java --parallel --cmake_generator "Ninja"
export ORTGENAI_LOG_ORT_LIB=1
cd examples/csharp/ModelChat
dotnet build -c Release
cd ../ModelVision
dotnet build -c Release
cd ../HelloPhi4MM
dotnet build -c Release

- name: Test the C# LLM Example with Tool Calling
run: |
export ORTGENAI_LOG_ORT_LIB=1
python3 test/python/special_tokens.py -p test/test_models/qwen-2.5-0.5b/int4/cpu/tokenizer.json -s "<tool_call>" -e "</tool_call>"
./examples/csharp/ModelChat/bin/Release/net8.0/ModelChat -m test/test_models/qwen-2.5-0.5b/int4/cpu/ -e cpu --response_format lark_grammar --tools_file test/test_models/tool-definitions/weather.json --tool_call_start "<tool_call>" --tool_call_end "</tool_call>" --user_prompt "What is the weather in Redmond, WA?" --tool_output --non_interactive --verbose
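The helper invoked above, test/python/special_tokens.py, is not shown in this diff. Judging from its flags (-p for the tokenizer path, -s/-e for the start/end markers), it plausibly verifies that the tool-call markers exist as added tokens in the model's tokenizer.json before the example runs. A hypothetical sketch of such a check, not the repository's actual implementation:

```python
import json
import tempfile

def has_special_tokens(tokenizer_path, start, end):
    """Return True if both markers appear among the tokenizer's added tokens."""
    with open(tokenizer_path, encoding="utf-8") as f:
        tok = json.load(f)
    added = {t.get("content") for t in tok.get("added_tokens", [])}
    return start in added and end in added

# Minimal stand-in tokenizer.json for demonstration (not a real model file).
sample = {"added_tokens": [{"content": "<tool_call>"}, {"content": "</tool_call>"}]}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(sample, f)
    sample_path = f.name

print(has_special_tokens(sample_path, "<tool_call>", "</tool_call>"))  # prints True
```

Running a check like this before the ModelChat example catches a mismatched tokenizer early, instead of failing later when the grammar-constrained decoder looks for the markers.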

- name: Run tests
run: |
Expand Down
2 changes: 1 addition & 1 deletion .github/workflows/linux-cpu-x64-nightly-build.yml
@@ -66,7 +66,7 @@ jobs:
- name: Run Q&A Example
run: |
python3 -m onnxruntime_genai.models.builder -i /data/ortgenai/pytorch/qwen2.5-0.5b-instruct -e cpu -p int4 -o ./example-models/qwen2.5-0.5b-instruct
python3 examples/python/model-qa.py -m ./example-models/qwen2.5-0.5b-instruct -e cpu --input_prompt "what is 10+4?" > output.log 2>&1
python3 examples/python/model-qa.py -m ./example-models/qwen2.5-0.5b-instruct -e cpu --user_prompt "what is 10+4?" --non_interactive > output.log 2>&1
if cat output.log | grep -Eq "14|fourteen"; then
echo "Result seems correct"
else
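The grep-based correctness check in the Q&A step above (pass if the model's output mentions 14 or "fourteen") can be mirrored in Python; a minimal sketch, not part of the repository:

```python
import re

def answer_looks_correct(log_text: str) -> bool:
    """Mirror of the CI shell check: grep -Eq "14|fourteen" on the output log."""
    return re.search(r"14|fourteen", log_text) is not None

print(answer_looks_correct("The answer is 14."))  # True
print(answer_looks_correct("I cannot compute."))  # False
```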
22 changes: 19 additions & 3 deletions .github/workflows/linux-gpu-x64-build.yml
@@ -112,7 +112,7 @@ jobs:
--multiple_repos \
--repository onnxruntimecudabuildx64

- name: Config with Cmake in Docker
- name: Config with CMake in Docker
run: |
set -e -x
docker run \
@@ -125,7 +125,7 @@
-DMANYLINUX=ON \
-DPYTHON_EXECUTABLE=${{ env.PYTHON_EXECUTABLE }} "

- name: Build with Cmake in Docker
- name: Build with CMake in Docker
run: |
set -e -x
docker run \
@@ -136,7 +136,23 @@
bash -c " \
/usr/bin/cmake --build --preset linux_gcc_cuda_release && /usr/bin/cmake --build --preset linux_gcc_cuda_release --target PyPackageBuild"

- name: Install the onnxruntime-genai Python wheel and run python test
- name: Build the Java API and Run the Java Tests in Docker
run: |
set -e -x
docker run \
--gpus all \
--rm \
--user 0 \
--volume $GITHUB_WORKSPACE:/ort_genai_src \
-w /ort_genai_src onnxruntimecudabuildx64 bash -c " \
alias python3=${{ env.PYTHON_EXECUTABLE }} && \
dnf -y update && dnf install -y python3.11-devel && dnf install -y python3-pip python3-setuptools python3-wheel && \
${{ env.PYTHON_EXECUTABLE }} -m pip install -r test/python/requirements.txt --user && \
${{ env.PYTHON_EXECUTABLE }} -m pip install -r test/python/cuda/torch/requirements.txt --user && \
${{ env.PYTHON_EXECUTABLE }} -m pip install -r test/python/cuda/ort/requirements.txt --user && \
${{ env.PYTHON_EXECUTABLE }} build.py --config=Release --build_dir build/cuda --build_java --parallel --cmake_generator Ninja --cmake_extra_defines PYTHON_EXECUTABLE=${{ env.PYTHON_EXECUTABLE }}"

- name: Install the onnxruntime-genai Python wheel and run Python tests
run: |
echo "Installing the onnxruntime-genai Python wheel and running the Python tests"
docker run \
28 changes: 22 additions & 6 deletions .github/workflows/mac-cpu-arm64-build.yml
@@ -108,7 +108,7 @@ jobs:
cmake --build --preset macos_arm64_cpu_release --target PyPackageBuild
continue-on-error: false

- name: Install the python wheel and test dependencies
- name: Install the Python wheel and test dependencies
run: |
python3 -m venv genai-macos-venv
source genai-macos-venv/bin/activate
@@ -117,6 +117,12 @@
python3 -m pip install -r test/python/macos/ort/requirements.txt
python3 -m pip install build/cpu/osx-arm64/wheel/onnxruntime_genai*.whl --no-deps

- name: Build the Java API and Run the Java Tests
run: |
set -e -x
source genai-macos-venv/bin/activate
python3 build.py --config=Release --build_dir build/cpu/osx-arm64 --build_java --parallel --cmake_generator "Unix Makefiles" --macos MacOSX --osx_arch arm64 --apple_deploy_target 12.0 --apple_sysroot macosx

- name: Remove the ort lib and header files
run: |
rm -rf ort
@@ -130,7 +136,7 @@

# This will also download all the test models to the test/test_models directory
# These models are used by the python tests as well as C#, C++ and others.
- name: Run the python tests
- name: Run the Python tests
run: |
source genai-macos-venv/bin/activate
export HF_TOKEN="12345"
@@ -144,11 +150,21 @@
cd test/csharp
dotnet test /p:Configuration=Release /p:NativeBuildOutputDir="../../build/cpu/osx-arm64" --verbosity normal

- name: Build the Java API and Run the Java Tests
- name: Build the C# Examples
run: |
set -e -x
source genai-macos-venv/bin/activate
python3 build.py --config=Release --build_dir build/cpu/osx-arm64 --build_java --parallel --cmake_generator "Unix Makefiles" --macos MacOSX --osx_arch arm64 --apple_deploy_target 12.0 --apple_sysroot macosx
export ORTGENAI_LOG_ORT_LIB=1
cd examples/csharp/ModelChat
dotnet build -c Release
cd ../ModelVision
dotnet build -c Release
cd ../HelloPhi4MM
dotnet build -c Release

- name: Test the C# LLM Example with Tool Calling
run: |
export ORTGENAI_LOG_ORT_LIB=1
python3 test/python/special_tokens.py -p test/test_models/qwen-2.5-0.5b/int4/cpu/tokenizer.json -s "<tool_call>" -e "</tool_call>"
./examples/csharp/ModelChat/bin/Release/net8.0/ModelChat -m test/test_models/qwen-2.5-0.5b/int4/cpu/ -e cpu --response_format lark_grammar --tools_file test/test_models/tool-definitions/weather.json --tool_call_start "<tool_call>" --tool_call_end "</tool_call>" --user_prompt "What is the weather in Redmond, WA?" --tool_output --non_interactive --verbose

- name: Run tests
run: |
22 changes: 19 additions & 3 deletions .github/workflows/win-cpu-arm64-build.yml
@@ -93,10 +93,16 @@
run: |
# Uninstalling LLVM/Clang as it is no longer required and causes issues with numpy installation
choco uninstall llvm --yes
python -m pip install "numpy<2" coloredlogs flatbuffers packaging protobuf sympy pytest
python -m pip install -r test\python\requirements.txt --user
python -m pip install -r test\python\cpu\torch\requirements.txt --user
python -m pip install -r test\python\cpu\ort\requirements.txt --user
python -m pip install onnxruntime-qnn
python -m pip install (Get-ChildItem ("$env:binaryDir\wheel\*.whl")) --no-deps

- name: Build the Java API and Run the Java Tests
run: |
python build.py --config=Release --build_dir $env:binaryDir --build_java --parallel

- name: Run the Python Tests
run: |
python test/python/test_onnxruntime_genai.py --cwd "test\python" --test_models "test\test_models"
@@ -106,9 +112,19 @@
cd test\csharp
dotnet test /p:NativeBuildOutputDir="$env:GITHUB_WORKSPACE\$env:binaryDir\Release" /p:OrtLibDir="$env:GITHUB_WORKSPACE\ort\lib"

- name: Build the Java API and Run the Java Tests
- name: Build the C# Examples
run: |
python build.py --config=Release --build_dir $env:binaryDir --build_java --parallel
cd examples\csharp\ModelChat
dotnet build -c Release
cd ..\ModelVision
dotnet build -c Release
cd ..\HelloPhi4MM
dotnet build -c Release

- name: Test the C# LLM Example with Tool Calling
run: |
python3 test\python\special_tokens.py -p test\test_models\qwen-2.5-0.5b\int4\cpu\tokenizer.json -s "<tool_call>" -e "</tool_call>"
.\examples\csharp\ModelChat\bin\Release\net8.0\ModelChat.exe -m test\test_models\qwen-2.5-0.5b\int4\cpu\ -e cpu --response_format lark_grammar --tools_file test\test_models\tool-definitions\weather.json --tool_call_start "<tool_call>" --tool_call_end "</tool_call>" --user_prompt "What is the weather in Redmond, WA?" --tool_output --non_interactive --verbose

- name: Verify Build Artifacts
if: always()
20 changes: 17 additions & 3 deletions .github/workflows/win-cpu-x64-build.yml
@@ -102,13 +102,17 @@
cmake --build --preset windows_x64_cpu_release --parallel
cmake --build --preset windows_x64_cpu_release --target PyPackageBuild

- name: Install the python wheel and test dependencies
- name: Install the Python wheel and test dependencies
run: |
python3 -m pip install -r test\python\requirements.txt --user
python3 -m pip install -r test\python\cpu\torch\requirements.txt --user
python3 -m pip install -r test\python\cpu\ort\requirements.txt --user
python3 -m pip install (Get-ChildItem ("$env:binaryDir\wheel\*.whl")) --no-deps

- name: Build the Java API and Run the Java Tests
run: |
python3 build.py --config=Release --build_dir $env:binaryDir --build_java --parallel

- name: Run the Python Tests
run: |
python test/python/test_onnxruntime_genai.py --cwd "test\python" --test_models "test\test_models"
@@ -118,9 +122,19 @@
cd test\csharp
dotnet test /p:NativeBuildOutputDir="$env:GITHUB_WORKSPACE\$env:binaryDir\Release" /p:OrtLibDir="$env:GITHUB_WORKSPACE\ort\lib" --verbosity normal

- name: Build the Java API and Run the Java Tests
- name: Build the C# Examples
run: |
python3 build.py --config=Release --build_dir $env:binaryDir --build_java --parallel
cd examples\csharp\ModelChat
dotnet build -c Release
cd ..\ModelVision
dotnet build -c Release
cd ..\HelloPhi4MM
dotnet build -c Release

- name: Test the C# LLM Example with Tool Calling
run: |
python3 test\python\special_tokens.py -p test\test_models\qwen-2.5-0.5b\int4\cpu\tokenizer.json -s "<tool_call>" -e "</tool_call>"
.\examples\csharp\ModelChat\bin\Release\net8.0\ModelChat.exe -m test\test_models\qwen-2.5-0.5b\int4\cpu\ -e cpu --response_format lark_grammar --tools_file test\test_models\tool-definitions\weather.json --tool_call_start "<tool_call>" --tool_call_end "</tool_call>" --user_prompt "What is the weather in Redmond, WA?" --tool_output --non_interactive --verbose

- name: Verify Build Artifacts
if: always()
18 changes: 18 additions & 0 deletions .github/workflows/win-cuda-x64-build.yml
@@ -98,6 +98,10 @@
python -m pip install -r test\python\cuda\ort\requirements.txt
python -m pip install (Get-ChildItem ("$env:binaryDir\wheel\*.whl")) --no-deps

- name: Build the Java API and Run the Java Tests
run: |
python build.py --config=Release --build_dir $env:binaryDir --build_java --parallel

- name: Run the Python Tests
run: |
python test/python/test_onnxruntime_genai.py --cwd "test\python" --test_models "test\test_models" --e2e
@@ -115,6 +119,20 @@
cd test\csharp
dotnet test /p:Configuration=release /p:NativeBuildOutputDir="$env:GITHUB_WORKSPACE\$env:binaryDir\Release" /p:OrtLibDir="$env:GITHUB_WORKSPACE\ort\lib"

- name: Build the C# Examples
run: |
cd examples\csharp\ModelChat
dotnet build -c Release
cd ..\ModelVision
dotnet build -c Release
cd ..\HelloPhi4MM
dotnet build -c Release

- name: Test the C# LLM Example with Tool Calling
run: |
python test\python\special_tokens.py -p test\test_models\qwen-2.5-0.5b\int4\cpu\tokenizer.json -s "<tool_call>" -e "</tool_call>"
.\examples\csharp\ModelChat\bin\Release\net8.0\ModelChat.exe -m test\test_models\qwen-2.5-0.5b\int4\cpu\ -e cpu --response_format lark_grammar --tools_file test\test_models\tool-definitions\weather.json --tool_call_start "<tool_call>" --tool_call_end "</tool_call>" --user_prompt "What is the weather in Redmond, WA?" --tool_output --non_interactive --verbose

- name: Prepend CUDA to PATH and Run tests
run: |-
$env:PATH = "${{ env.cuda_dir }}\\v${{ env.cuda_version }}\\bin;" + $env:PATH
4 changes: 4 additions & 0 deletions .github/workflows/win-directml-x64-build.yml
@@ -114,6 +114,10 @@
python -m pip install -r test\python\directml\ort\requirements.txt
python -m pip install (Get-ChildItem ("$env:binaryDir\wheel\*.whl")) --no-deps

- name: Build the Java API and Run the Java Tests
run: |
python build.py --config=Release --build_dir $env:binaryDir --build_java --parallel

- name: Run the Python Tests
run: |
python test/python/test_onnxruntime_genai.py --cwd "test\python" --test_models "test\test_models" --e2e
2 changes: 1 addition & 1 deletion .gitignore
@@ -27,7 +27,7 @@ models_outputs_cpu
benchmark/python/output
examples/python/genai_models
examples/python/hf_cache
examples/csharp/HelloPhi/models
examples/csharp/ModelChat/models

!test/test_models/hf-internal-testing/
!test/test_models/hf-internal-testing/tiny-random-gpt2*/*.onnx
1 change: 0 additions & 1 deletion .pipelines/stages/jobs/custom-nuget-packaging-job.yml
@@ -157,4 +157,3 @@ jobs:
inputs:
targetPath: '$(Build.ArtifactStagingDirectory)\nuget'
artifactName: $(genai_nuget_package_name)

18 changes: 9 additions & 9 deletions .pipelines/stages/jobs/nuget-validation-job.yml
@@ -143,14 +143,14 @@ jobs:
HuggingFaceRepo: 'microsoft/Phi-3-mini-4k-instruct-onnx'
LocalFolder: 'phi3-mini'
RepoFolder: $(prebuild_phi3_mini_model_folder)
WorkingDirectory: '$(Build.Repository.LocalPath)/examples/csharp/HelloPhi'
WorkingDirectory: '$(Build.Repository.LocalPath)/examples/csharp/ModelChat'
HuggingFaceToken: $(HF_TOKEN)
os: ${{ parameters.os }}

- template: steps/nuget-validation-step.yml
parameters:
CsprojFolder: "examples/csharp/HelloPhi"
CsprojName: "HelloPhi"
CsprojFolder: "examples/csharp/ModelChat"
CsprojName: "ModelChat"
CsprojConfiguration: $(csproj_configuration)
LocalFolder: 'phi3-mini'
ModelFolder: $(prebuild_phi3_mini_model_folder)
@@ -160,14 +160,14 @@
HuggingFaceRepo: 'microsoft/Phi-3.5-vision-instruct-onnx'
LocalFolder: 'phi3.5-vision'
RepoFolder: $(prebuild_phi3_5_vision_model_folder)
WorkingDirectory: '$(Build.Repository.LocalPath)/examples/csharp/HelloPhi3V'
WorkingDirectory: '$(Build.Repository.LocalPath)/examples/csharp/ModelVision'
HuggingFaceToken: $(HF_TOKEN)
os: ${{ parameters.os }}

- template: steps/nuget-validation-step.yml
parameters:
CsprojFolder: "examples/csharp/HelloPhi3V"
CsprojName: "HelloPhi3V"
CsprojFolder: "examples/csharp/ModelVision"
CsprojName: "ModelVision"
CsprojConfiguration: $(csproj_configuration)
LocalFolder: 'phi3.5-vision'
ModelFolder: $(prebuild_phi3_5_vision_model_folder)
@@ -177,14 +177,14 @@
HuggingFaceRepo: 'microsoft/Phi-4-multimodal-instruct-onnx'
LocalFolder: 'phi4-mm'
RepoFolder: $(prebuild_phi4_mm_model_folder)
WorkingDirectory: '$(Build.Repository.LocalPath)/examples/csharp/HelloPhi4MM'
WorkingDirectory: '$(Build.Repository.LocalPath)/examples/csharp/ModelChat4MM'
HuggingFaceToken: $(HF_TOKEN)
os: ${{ parameters.os }}

- template: steps/nuget-validation-step.yml
parameters:
CsprojFolder: "examples/csharp/HelloPhi4MM"
CsprojName: "HelloPhi4MM"
CsprojFolder: "examples/csharp/ModelChat4MM"
CsprojName: "ModelChat4MM"
CsprojConfiguration: $(csproj_configuration)
LocalFolder: 'phi4-mm'
ModelFolder: $(prebuild_phi4_mm_model_folder)