A demonstration application showcasing OpenFeature Remote Evaluation Protocol (OFREP) capabilities in a polyglot environment with .NET, Python, Go, and React. This application manages a collection of Le Mans winner cars and includes an AI-powered chatbot.
This demo shows how to implement feature flags using OpenFeature and OFREP in a full-stack polyglot application. Key features include:
- OFREP Integration: Remote feature flag evaluation using the standardized protocol across .NET, Python, and Go
- OpenFeature SDK: Industry-standard feature flagging for the .NET backend, Python service, and React frontend (see the evaluation sketch after this list)
- flagd Provider: Using flagd as the feature flag evaluation engine with OFREP
- Dynamic Configuration: Real-time feature flag updates without redeployment
- Full-Stack Implementation: Feature flags working seamlessly across React UI, .NET API, and Python services
- Kill Switches: Safely toggle features in production environments
- GitHub Models Integration: AI-powered chatbot using GPT-4o via GitHub Models
- GitHub Repository Prompts: Dynamic prompt selection using `.prompt.yml` files
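As a quick orientation, the sketch below shows how the Python service might evaluate a flag through the OpenFeature SDK backed by an OFREP provider. The provider import path, constructor argument, and flagd URL are assumptions; check the provider package you install (commonly published as `openfeature-provider-ofrep`) for the exact API.

```python
# Minimal flag-evaluation sketch with the OpenFeature Python SDK.
# NOTE: the OFREP provider import path and base_url argument are assumptions;
# consult the installed provider package for the exact names.
from openfeature import api
from openfeature.contrib.provider.ofrep import OFREPProvider  # assumed import path

# Point the provider at flagd's OFREP endpoint (port is an assumption).
api.set_provider(OFREPProvider(base_url="http://localhost:8016"))

client = api.get_client()

# Evaluate a boolean flag, falling back to False if evaluation fails.
if client.get_boolean_value("enable-chatbot", False):
    print("Chatbot is enabled for this user")
```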
- Garage.Web: React + Vite frontend for managing car collections, with a floating chatbot UI
- Garage.ApiService: REST API for car data with Entity Framework Core
- Garage.ChatService: Python FastAPI service for AI chatbot using GitHub Models
- Garage.FeatureFlags: Go API for managing feature flag targeting rules
- Garage.ServiceDefaults: Shared services including feature flag implementations
- Garage.Shared: Common models and DTOs
- Garage.AppHost: .NET Aspire orchestration and service discovery
- PostgreSQL: Database for storing car collection data
- Redis: Caching layer for improved performance
- flagd: OpenFeature-compliant feature flag evaluation engine
- GitHub Models: AI model provider for chatbot functionality
This application includes comprehensive telemetry support through .NET Aspire:
- Distributed Tracing: Track requests across all services (.NET, Python, Go)
- Metrics Collection: Monitor application performance, feature flag usage, and chat request counts
- Structured Logging: Centralized log aggregation with trace correlation
Note: All services export telemetry via OTLP to the Aspire dashboard.
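For the Python service, trace export might be wired up roughly as follows (a minimal sketch assuming the standard `opentelemetry-sdk` and OTLP exporter packages; the service name and span names are illustrative):

```python
# Sketch of OTLP trace export from the Python chat service.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Name the service so traces are grouped correctly in the Aspire dashboard.
resource = Resource.create({"service.name": "garage-chatservice"})

provider = TracerProvider(resource=resource)
# The exporter honours OTEL_EXPORTER_OTLP_ENDPOINT, which Aspire injects
# for orchestrated services, so no hard-coded endpoint is needed here.
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("garage.chat")
with tracer.start_as_current_span("chat-request"):
    pass  # handle the request; the span is exported to the dashboard
```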
The demo uses the following feature flags (a sample flagd definition is sketched after the table):
| Flag | Type | Purpose | Default |
|---|---|---|---|
| `enable-database-winners` | bool | Toggle data source (DB vs JSON) | `true` |
| `winners-count` | int | Control number of winners shown | `100` |
| `enable-stats-header` | bool | Show/hide statistics header | `true` |
| `enable-tabs` | bool | Enable tabbed interface (with targeting) | `false` |
| `enable-preview-mode` | string | Comma-separated list of editable flags | `""` |
| `enable-chatbot` | bool | Show/hide AI chatbot (with targeting) | `false` |
| `prompt-file` | string | Select chatbot prompt style | `"expert"` |
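For illustration, the first two flags could be declared in a flagd flag-definition file along these lines (a sketch of the flagd JSON format; the variant names and the actual file shipped with the demo may differ):

```json
{
  "$schema": "https://flagd.dev/schema/v0/flags.json",
  "flags": {
    "enable-database-winners": {
      "state": "ENABLED",
      "variants": { "on": true, "off": false },
      "defaultVariant": "on"
    },
    "winners-count": {
      "state": "ENABLED",
      "variants": { "all": 100, "top-ten": 10 },
      "defaultVariant": "all"
    }
  }
}
```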
The chatbot supports multiple prompt styles via GitHub Repository Prompts (`.prompt.yml` files); a sample file is sketched after this list:
- expert: Detailed Le Mans racing historian with comprehensive knowledge
- casual: Friendly enthusiast for casual conversation
- brief: Quick facts with concise responses
- unreliable: Confidently incorrect information (for A/B testing demos)
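A prompt file for the expert style could look roughly like this (a sketch following the GitHub Models `.prompt.yml` format; the field values are illustrative, not the demo's actual prompts):

```yaml
# Sketch of a .prompt.yml file; field values are illustrative.
name: Le Mans expert
description: Detailed Le Mans racing historian persona
model: openai/gpt-4o
messages:
  - role: system
    content: You are an expert Le Mans racing historian. Answer with detailed, accurate history.
  - role: user
    content: "{{input}}"
```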
- .NET 10.0 SDK or later
- Python 3.12 or later
- Go 1.25 or later (for Feature Flags API)
- Visual Studio, Visual Studio Code with the C# extension, or JetBrains Rider
- Git for version control
- Docker Desktop (for containerized dependencies)
- GitHub PAT with access to GitHub Models (for chatbot functionality)
```bash
git clone https://github.com/open-feature/openfeature-dotnet-workshop.git
cd openfeature-dotnet-workshop
cd src/Garage.AppHost
dotnet user-secrets set "Parameters:github-token" "<your-github-pat>"
dotnet restore
aspire run
```

- Web Frontend: https://localhost:7070
- API Service: https://localhost:7071
- Aspire Dashboard: https://localhost:15888
The application will start with flagd running as a container, providing OFREP endpoints for the React frontend, .NET API service, and Python chatbot to consume feature flags.
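The OFREP call itself is plain HTTP; the sketch below shows what a single-flag evaluation against flagd might look like from Python (the port and targeting key are assumptions, and in practice the OpenFeature providers make this call for you):

```python
# Wire-level OFREP single-flag evaluation, sketched with the requests library.
import requests

OFREP_BASE = "http://localhost:8016"  # flagd's OFREP endpoint (assumed port)

resp = requests.post(
    f"{OFREP_BASE}/ofrep/v1/evaluate/flags/enable-chatbot",
    json={"context": {"targetingKey": "user-123"}},  # evaluation context for targeting rules
    timeout=5,
)
resp.raise_for_status()

result = resp.json()
# The response carries the resolved value plus metadata such as variant and reason,
# e.g. {"key": "enable-chatbot", "value": false, "variant": "off", "reason": "DEFAULT"}
print(result["value"])
```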
The Python chat service (Garage.ChatService) provides an AI-powered chatbot for Le Mans racing questions:
- Framework: FastAPI with Uvicorn
- AI Provider: GitHub Models (GPT-4o)
- Feature Flags: OpenFeature with OFREP provider
- Telemetry: Full OpenTelemetry integration (traces, metrics, logs)
- Prompts: GitHub Repository Prompts format (`.prompt.yml`)
`POST /chat`

- Request: `{ "message": "Who won Le Mans in 2023?", "userId": "user-123" }`
- Response: `{ "response": "...", "prompt_style": "expert" }`

`GET /health`

- Response: `{ "status": "healthy" }`
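Calling the service from Python might look like this (the base URL is an assumption; use the address Aspire assigns to Garage.ChatService):

```python
# Example client call against the chat service's /chat endpoint.
import requests

CHAT_BASE = "http://localhost:8000"  # assumed local address of the chat service

payload = {"message": "Who won Le Mans in 2023?", "userId": "user-123"}
resp = requests.post(f"{CHAT_BASE}/chat", json=payload, timeout=30)
resp.raise_for_status()

data = resp.json()
print(data["response"])      # the chatbot's answer
print(data["prompt_style"])  # prompt style selected via the prompt-file flag
```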
- OpenFeature Documentation
- OFREP Specification
- flagd Documentation
- .NET Aspire Documentation
- GitHub Models Documentation
- GitHub Repository Prompts
- Feature Flag Best Practices
This project is licensed under the MIT License.