Create a comprehensive C# wrapper around the OpenVINO and OpenVINO GenAI C APIs, making them directly usable from C# applications rather than through basic P/Invoke examples.
- Target Platform: Windows x64, .NET 8.0+
- OpenVINO Version: OpenVINO GenAI 2025.3.0.0.dev20250801 (nightly build with all needed C API changes)
- Modern C# Patterns: Async/await, IAsyncEnumerable, SafeHandle resource management
- Simple Demo: CLI with hardcoded values, device selection, and benchmark mode
- Automatic Setup: Model download from HuggingFace, native DLL deployment
- OpenVINO.NET.Core: Core OpenVINO functionality
- OpenVINO.NET.GenAI: GenAI-specific features (LLM pipelines, streaming)
- OpenVINO.NET.Native: Native library management and MSBuild targets
- P/Invoke Layer: `GenAINativeMethods.cs` with proper marshalling
- SafeHandle Pattern: `LLMPipelineHandle`, `GenerationConfigHandle` for resource management
- Streaming Support: `IAsyncEnumerable<string>` for token-by-token generation
- Fluent API: `GenerationConfig.Default.WithMaxTokens(100).WithTemperature(0.7f)`
- MSBuild Integration: Automatic native DLL deployment via `.targets` files
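Putting the patterns above together, intended usage could look like the sketch below. This assumes the API shape described in this document (`LLMPipeline`, `GenerationConfig`, and `GenerateStreamAsync` are this project's planned surface, not an existing NuGet package):

```csharp
using OpenVINO.NET.GenAI;

// Fluent configuration, as described above.
var config = GenerationConfig.Default
    .WithMaxTokens(100)
    .WithTemperature(0.7f);

// LLMPipeline wraps the ov_genai_llm_pipeline_* C functions behind a
// SafeHandle, so Dispose() releases the native pipeline deterministically.
using var pipeline = new LLMPipeline("models/Qwen3-0.6B-fp16-ov", "CPU");

// Token-by-token streaming via IAsyncEnumerable<string>.
await foreach (var token in pipeline.GenerateStreamAsync("Hello!", config))
{
    Console.Write(token);
}
```

The `using` declaration plus SafeHandle means native resources are released even if generation throws mid-stream.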
- `src/OpenVINO.NET.GenAI/Native/GenAINativeMethods.cs` - P/Invoke declarations
- `src/OpenVINO.NET.GenAI/LLMPipeline.cs` - Main high-level API
- `src/OpenVINO.NET.GenAI/GenerationConfig.cs` - Configuration with fluent API
- `src/OpenVINO.NET.GenAI/OpenVINO.NET.GenAI.targets` - MSBuild targets
- `samples/QuickDemo/Program.cs` - Simple CLI demo with hardcoded values
- Model: `OpenVINO/Qwen3-0.6B-fp16-ov` (1.2 GB download from HuggingFace)
# Build entire solution
dotnet build OpenVINO.NET.sln
# Run QuickDemo (default CPU) - Linux requires LD_LIBRARY_PATH
cd samples/QuickDemo/bin/Debug/net8.0 && LD_LIBRARY_PATH=. dotnet QuickDemo.dll
# Alternative: Build and run from project directory
dotnet build samples/QuickDemo && cd samples/QuickDemo/bin/Debug/net8.0 && LD_LIBRARY_PATH=. dotnet QuickDemo.dll
# Run on specific device (Linux)
cd samples/QuickDemo/bin/Debug/net8.0 && LD_LIBRARY_PATH=. dotnet QuickDemo.dll --device GPU
# Benchmark all devices (Linux)
cd samples/QuickDemo/bin/Debug/net8.0 && LD_LIBRARY_PATH=. dotnet QuickDemo.dll --benchmark
# Windows (no special environment needed)
dotnet run --project samples/QuickDemo

- Model: `OpenVINO/Qwen3-0.6B-fp16-ov`
- Temperature: 0.7f
- Max Tokens: 100
- Top-P: 0.9f
- Devices: CPU, GPU, NPU
- "Explain quantum computing in simple terms:"
- "Write a short poem about artificial intelligence:"
- "What are the benefits of renewable energy?"
- "Describe the process of making coffee:"
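The hardcoded settings and prompts above can drive a simple benchmark loop. A sketch, assuming the wrapper API described earlier in this document (method names are this project's design, not a published package):

```csharp
using System.Diagnostics;
using OpenVINO.NET.GenAI;

string[] devices = { "CPU", "GPU", "NPU" };
var config = GenerationConfig.Default
    .WithMaxTokens(100)
    .WithTemperature(0.7f)
    .WithTopP(0.9f);

foreach (var device in devices)
{
    try
    {
        // Pipeline creation fails if the device is not present on this machine.
        using var pipeline = new LLMPipeline("models/Qwen3-0.6B-fp16-ov", device);
        var sw = Stopwatch.StartNew();
        int tokens = 0;
        await foreach (var _ in pipeline.GenerateStreamAsync(
            "Explain quantum computing in simple terms:", config))
        {
            tokens++;
        }
        Console.WriteLine($"{device}: {tokens / sw.Elapsed.TotalSeconds:F1} tokens/sec");
    }
    catch (Exception ex)
    {
        Console.WriteLine($"{device}: unavailable ({ex.Message})");
    }
}
```

Counting streamed tokens against wall-clock time gives the tokens/sec figure reported in the expected-performance section; first-token latency would be measured separately from the first iteration of the loop.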
- Approach: Simplified to use standard .NET library resolution with recursive search
- Windows: Automatic loading via SetDllDirectory() API
- Linux: Requires LD_LIBRARY_PATH=. when running from output directory
- Libraries: Uses the official OpenVINO GenAI C API (`openvino_genai_c.dll` / `.so`)
- Dependencies: All dependencies deployed via MSBuild targets
- Custom Installations: Set the `OPENVINO_RUNTIME_PATH` environment variable to point to your OpenVINO runtime directory
- OPENVINO_RUNTIME_PATH: Optional environment variable to specify custom OpenVINO runtime directory
- Use this for non-standard OpenVINO installations or different architectures
- Points to the directory containing `openvino_genai_c.dll` (Windows) or `libopenvino_genai_c.so` (Linux)
- Example: `export OPENVINO_RUNTIME_PATH="/opt/intel/openvino/runtime/bin/intel64"`
- If not set, the system auto-discovers libraries in the application directory and its subdirectories
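The resolution order above (explicit `OPENVINO_RUNTIME_PATH`, then recursive search of the application directory) could be implemented with .NET's `NativeLibrary.SetDllImportResolver`. A minimal sketch; the class name and registration point are illustrative, only the library file names come from this document:

```csharp
using System;
using System.IO;
using System.Reflection;
using System.Runtime.InteropServices;

static class NativeLibraryLoader
{
    // Call once at startup, before any P/Invoke into openvino_genai_c.
    public static void Register() =>
        NativeLibrary.SetDllImportResolver(
            typeof(NativeLibraryLoader).Assembly, Resolve);

    private static IntPtr Resolve(
        string name, Assembly asm, DllImportSearchPath? paths)
    {
        if (name != "openvino_genai_c")
            return IntPtr.Zero; // fall back to default .NET resolution

        string fileName = OperatingSystem.IsWindows()
            ? "openvino_genai_c.dll"
            : "libopenvino_genai_c.so";

        // 1. Explicit override via OPENVINO_RUNTIME_PATH.
        var custom = Environment.GetEnvironmentVariable("OPENVINO_RUNTIME_PATH");
        if (custom is not null && File.Exists(Path.Combine(custom, fileName)))
            return NativeLibrary.Load(Path.Combine(custom, fileName));

        // 2. Auto-discover in the app directory and its subdirectories.
        foreach (var candidate in Directory.EnumerateFiles(
                     AppContext.BaseDirectory, fileName,
                     SearchOption.AllDirectories))
        {
            return NativeLibrary.Load(candidate);
        }

        return IntPtr.Zero; // let the runtime throw DllNotFoundException
    }
}
```

A resolver keeps the P/Invoke declarations clean (`[DllImport("openvino_genai_c")]`) while centralizing platform-specific lookup in one place, instead of relying on `SetDllDirectory()` on Windows and `LD_LIBRARY_PATH` on Linux.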
- Modern C# Features: Takes advantage of latest language features and performance improvements
- Enhanced Performance: Benefits from .NET 8 runtime optimizations for better inference speed
- Native AOT Ready: Can be compiled to native code for faster startup times
- Issue: MSBuild targets path warnings on non-Windows platforms
- Solution: Conditional inclusion in the `.targets` file
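The conditional inclusion could look like the fragment below. This is a sketch of one possible `.targets` layout; the `runtimes\win-x64\native` path and item metadata are assumptions, not the project's actual file:

```xml
<!-- Sketch: copy native DLLs only when building for/on Windows,
     so non-Windows builds emit no path warnings. -->
<Project>
  <ItemGroup Condition="'$(OS)' == 'Windows_NT' Or '$(RuntimeIdentifier)' == 'win-x64'">
    <None Include="$(MSBuildThisFileDirectory)..\runtimes\win-x64\native\*.dll">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      <Link>%(Filename)%(Extension)</Link>
    </None>
  </ItemGroup>
</Project>
```

Guarding the whole `ItemGroup` means the glob is never evaluated on Linux, which is what suppresses the warnings.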
- CPU: ~12-15 tokens/sec
- GPU: ~20-30 tokens/sec (if available)
- NPU: ~15-25 tokens/sec (if available)
- First Token Latency: 400-800ms
- Preference: Simple, hardcoded demos over complex configuration
- Focus: Device selection and benchmark comparisons
- Avoid: Overly complex CLI interfaces
- Emphasize: "Just works" experience with automatic model download
- Remember the important configuration, environment and test changes
- Add documentation steps for creating and publishing documentation for the project
- Steps to follow for updating and maintaining project documentation
- Steps to run Whisper pipeline demo locally:
- Clone the OpenVINO repository
- Navigate to the Whisper pipeline demo directory
- Ensure OpenVINO runtime is installed
- Download the Whisper model from the specified source
- Set up the necessary environment variables
- Run the demo with appropriate command-line arguments
- Verify input audio file compatibility
- Check device selection (CPU/GPU/NPU)
- Monitor console output for transcription results
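If the Whisper pipeline is later exposed through this wrapper, usage could mirror the LLM API. This is a purely hypothetical sketch: `WhisperPipeline`, `TranscribeAsync`, and the `LoadPcm16kMono` helper are not part of the project yet, and the model path is illustrative:

```csharp
using OpenVINO.NET.GenAI;

// Hypothetical WhisperPipeline wrapper over the GenAI C whisper functions,
// following the same path/device conventions as LLMPipeline.
using var whisper = new WhisperPipeline("models/whisper-base", "CPU");

// Whisper expects 16 kHz mono PCM samples — this is where the
// "verify input audio file compatibility" step above applies.
float[] samples = LoadPcm16kMono("input.wav"); // helper left to the caller

var result = await whisper.TranscribeAsync(samples);
Console.WriteLine(result.Text); // transcription appears on the console
```

Keeping the Whisper surface symmetric with `LLMPipeline` (SafeHandle ownership, async methods, device string) would let the QuickDemo device-selection and benchmark code be reused as-is.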