A Next.js application that demonstrates dynamic AI-generated software flows using XState state machines. This project showcases how AI can dynamically assemble software from task lists and execute them as state machines, applying dynamic programming principles to LLM reasoning tasks.
- Overview
- Video Demo
- Features
- Getting Started
- Architecture
- How It Works
- Recent Enhancements
- API Documentation
- Contributing
- Development Status
- License
## Overview

X-Reason is an innovative prototype that demonstrates how AI can dynamically assemble and execute software workflows. By converting AI-generated task lists into executable XState machines, it enables non-technical users to compose software through natural language descriptions.
Recent architectural changes (as of v0.3.0):
- Unified Access: Now uses Vercel AI Gateway for all AI providers
- Single API Key: `AI_GATEWAY_API_KEY` replaces provider-specific keys
- Cost Control: Built-in rate limiting and usage monitoring
- Deprecated Keys: `OPENAI_API_KEY`, `GOOGLE_GENERATIVE_AI_API_KEY`, and `XAI_API_KEY` are no longer supported
- Unified AI SDK: All AI provider interactions now use the Vercel AI SDK (`ai`, `@ai-sdk/openai`, `@ai-sdk/google`)
- Three Providers: OpenAI, Google Gemini, and X.AI (Grok) supported
- Removed Routes: The legacy `/api/openai/` and `/api/gemini/` directories have been deleted
- New Endpoint: Use the unified `/api/ai/chat` for all AI interactions
- Centralized Config: All provider setup lives in `apps/x-reason-web/src/app/api/ai/providers.ts`

To migrate from an earlier version:

- Remove old keys from `.env.local`: `OPENAI_API_KEY`, `GOOGLE_GENERATIVE_AI_API_KEY`
- Add the Gateway key: `AI_GATEWAY_API_KEY=your_gateway_key_here`
- Clear browser localStorage to remove old client-side credentials
- Update any custom integrations to use `/api/ai/chat` instead of the legacy provider-specific routes
- Run `pnpm install` to ensure the Vercel AI SDK dependencies are installed
See AI_SDK_VERIFICATION.md for detailed migration instructions.
For detailed technical documentation, see:
- AGENTS.md - Updated agent interaction patterns
- `apps/x-reason-web/src/app/api/reasoning/README.md` - Reasoning engine documentation
## Features

- Dynamic State Machine Generation: Convert AI-generated task lists into executable XState machines
- Multi-AI Provider Support: Seamlessly switch between OpenAI, Google Gemini, and X.AI (Grok)
- Real-time Streaming: Server-sent events for live AI response streaming
- Domain-Specific Workflows: Pre-built demos for chemical engineering (Chemli) and user registration (Regie)
- Modern UI: Tailwind CSS with shadcn/ui components for a clean, responsive interface
- Human-in-the-Loop: Support for pause/resume execution with user interaction
- Persistent Context: Maintain state across execution steps
- Visual Debugging: State machine visualizer for understanding workflow execution
## Getting Started

Prerequisites:

- Node.js 18+
- pnpm (recommended) or npm
- Vercel AI Gateway API key (recommended for unified access to all providers)
  - Get your key from: Vercel AI Gateway
- Clone the repository:

```bash
git clone https://github.com/yourusername/x-reason.git
cd x-reason
```

- Install dependencies:

```bash
pnpm install
# or
npm install
```

Create a `.env.local` file in the `apps/x-reason-web/` directory:
```bash
# Vercel AI Gateway Configuration (REQUIRED)
# Get your key from: https://vercel.com/docs/ai-gateway
AI_GATEWAY_API_KEY=your_gateway_api_key_here

# Optional: Custom Gateway Base URL
# AI_GATEWAY_BASE_URL=https://your-custom-gateway.vercel.app
```

Gateway Benefits:
- Single API key for OpenAI, Google Gemini, and X.AI
- Built-in rate limiting and cost monitoring
- Simplified credential management
- Server-side only (no client-side exposure)
See AI_SDK_VERIFICATION.md for verification steps and troubleshooting.
Start the development server:

```bash
pnpm run dev
# or
npm run dev
```

Open http://localhost:3000 in your browser.
Other available commands:

```bash
# Build for production
pnpm run build

# Run production server
pnpm start

# Run tests
pnpm test

# Lint code
pnpm run lint
```
## Architecture

- State Machine Macro System (`src/app/actions/statesMacros.ts`)
  - Converts task maps to XState configurations
  - Supports pause/resume execution
  - Maintains context across execution steps
  - Enables real-time streaming support
- AI Provider System (`src/app/api/ai/`)
  - Unified interface for multiple AI providers
  - Provider-specific adapters
  - Streaming response support
  - Automatic fallback handling
- Domain-Specific Components
  - Chemli (`src/app/components/chemli/`): Chemical product engineering workflows
  - Regie (`src/app/components/regie/`): Dynamic user registration flows
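The macro idea can be sketched in a few lines. This is a simplified illustration, not the actual `statesMacros.ts` implementation; `machineMacroSketch` and the `Task` shape are hypothetical:

```typescript
// Simplified sketch of the task-map-to-machine idea (hypothetical names,
// not the project's actual statesMacros.ts implementation).
type Task = { id: string; description: string };

// Build a plain XState-style config from an ordered task map: each task
// becomes a state that invokes its implementation and, on success,
// transitions to the next task; the last task transitions to "success".
export function machineMacroSketch(taskMap: Map<string, Task>) {
  const ids = [...taskMap.keys()];
  const states: Record<string, any> = {
    success: { type: "final" },
    failure: { type: "final" },
  };
  ids.forEach((id, i) => {
    states[id] = {
      invoke: {
        src: id, // resolved to the task's implementation when the actor is created
        onDone: { target: ids[i + 1] ?? "success" },
        onError: { target: "failure" },
      },
    };
  });
  return { id: "generatedFlow", initial: ids[0] ?? "success", states };
}
```

The returned object has the general shape that XState's `createMachine` expects, so features like pause/resume and context preservation can be layered on top with XState's persisted-state APIs.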
The application uses the Vercel AI SDK for unified multi-provider support:
- Architecture: Centralized provider configuration in `src/app/api/ai/providers.ts`
- Supported Providers (via Gateway):
  - OpenAI: GPT-5 Mini, GPT-5 Nano, GPT-OSS 120B, GPT-4o Mini, GPT-4.1 Nano
  - Google Gemini: Gemini 2.0 Flash, Gemini 2.5 Flash, Gemini 2.5 Flash Lite
  - X.AI (Grok): Grok 4 Fast (Non-Reasoning), Grok 4 Fast (Reasoning), Grok Code Fast 1
- Features:
  - Gateway-only authentication (`AI_GATEWAY_API_KEY`)
  - Streaming responses via `streamText()`
  - Server-side credential management
  - Runtime provider switching
  - Cost control through the Gateway
All providers are accessed through the Vercel AI Gateway with a single API key.
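With the Gateway, a model is addressed by a single `provider/model-id` string rather than per-provider clients. A hypothetical helper (illustrative names and model ids, not the project's actual exports) might validate and build that string:

```typescript
// Hypothetical allow-list of Gateway model ids (illustrative only; the
// project's real list lives in providers.ts).
const SUPPORTED = {
  openai: ["gpt-4o-mini", "gpt-4.1-nano"],
  google: ["gemini-2.0-flash", "gemini-2.5-flash"],
  xai: ["grok-code-fast-1"],
} as const;

// Build the "provider/model-id" string the Gateway expects, rejecting
// models that are not on the allow-list.
export function resolveGatewayModel(provider: keyof typeof SUPPORTED, model: string): string {
  if (!(SUPPORTED[provider] as readonly string[]).includes(model)) {
    throw new Error(`Unknown model "${model}" for provider "${provider}"`);
  }
  return `${provider}/${model}`; // e.g. "openai/gpt-4o-mini"
}
```

When the Gateway is configured, a string like `"openai/gpt-4o-mini"` is what the AI SDK's `streamText({ model: ... })` call would receive; all requests then authenticate with the single `AI_GATEWAY_API_KEY`.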
Built on XState v5, the system provides:
- Dynamic state machine generation from task lists
- Error handling with retry mechanisms
- Context preservation across states
- Support for long-running async operations
- Human-in-the-loop interactions
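The retry mechanism can be sketched as a transform over a generated state config: on error, re-enter the same state while a bounded counter allows, otherwise fail. This is illustrative only and assumes the machine's context tracks per-state retry counts; it is not the project's actual code:

```typescript
// Illustrative retry wrapper (hypothetical; not the project's actual code).
// Assumes machine context carries a per-state retry counter.
const MAX_RETRIES = 3;

export function withRetry(stateName: string, stateConfig: any) {
  return {
    ...stateConfig,
    invoke: {
      ...stateConfig.invoke,
      onError: [
        {
          // Guarded self-transition: retry while attempts remain. In real
          // XState v5 the counter bump would be an assign() action.
          guard: ({ context }: any) => (context.retries?.[stateName] ?? 0) < MAX_RETRIES,
          target: stateName,
        },
        { target: "failure" }, // retries exhausted
      ],
    },
  };
}
```

XState evaluates transition candidates in order, so the guarded retry is attempted first and the `failure` fallback only fires once the guard returns false.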
## How It Works

X-Reason transforms natural language descriptions into executable software flows:
- Task Generation: AI analyzes user requirements and generates an optimized task list
- State Machine Creation: The macro system converts tasks into XState configurations
- Execution: The state machine executes tasks sequentially with support for:
- User interactions
- External system callbacks
- Async operations (data persistence, notifications)
- Adaptation: AI can modify flows based on context and user behavior
The system can generate customized registration flows based on user context:
```typescript
// AI analyzes user context (location, visit frequency, etc.)
// and generates an optimized task list:
[
  "Collect User Details",
  "Age Confirmation",        // Added for US/Canada users
  "Present Special Offers",  // Added for frequent visitors
  "Select Plan",
  "TOS Acceptance",
  "Persist User Details",
  "Send Registration Event"
]

// The macro converts this to an executable state machine
const registrationMachine = machineMacro(taskMap);
```

## Recent Enhancements

- Vercel AI SDK Integration: Migrated to the unified AI SDK for all provider interactions
- Server-Side Credentials: Removed client credential prompts, all keys managed server-side
- Next.js 15: Upgraded from v14 with Turbopack support
- XState v5: Migrated from v4 with improved APIs
- Multi-AI Providers: Added Google Gemini alongside OpenAI
- Modern UI: Replaced Blueprint.js with Tailwind CSS + shadcn/ui
- Streaming: Real-time AI responses via Server-Sent Events
- Enhanced DX: Better TypeScript support and error handling
## API Documentation

- `POST /api/ai/chat` - Unified streaming chat endpoint for all providers (powered by the Vercel AI SDK)
- `POST /api/reasoning/stream` - Reasoning engine streaming endpoint
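A client call to the chat endpoint might look like the following. The request body shape (`model` plus `messages`) is an assumption for illustration; consult the route handler for the actual schema:

```typescript
// Hypothetical client helper for the unified chat endpoint; the body
// shape is an assumption, not a documented schema.
export function buildChatRequest(prompt: string, model = "openai/gpt-4o-mini") {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  };
}

// Read the streamed response incrementally: server-sent chunks arrive on
// the response body as they are generated.
export async function streamChat(prompt: string): Promise<string> {
  const res = await fetch("/api/ai/chat", buildChatRequest(prompt));
  if (!res.ok || !res.body) throw new Error(`Chat request failed: ${res.status}`);
  const reader = (res.body as any).getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return text;
    text += decoder.decode(value, { stream: true });
  }
}
```

In a React component you would typically append each decoded chunk to state instead of accumulating the full string, so the UI renders tokens as they stream in.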
- `POST /api/state-machine/create` - Generate a state machine from a task list
- `POST /api/state-machine/execute` - Execute a state machine with context
- `GET /api/state-machine/status/:id` - Get execution status
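Assuming these endpoints accept JSON bodies (the field names below are illustrative, not documented schemas), a create-then-execute round trip could be sketched as:

```typescript
// Hypothetical payload builders for the state-machine endpoints; the
// field names ("tasks", "machineId", "id") are assumptions.
export function createPayload(tasks: string[]) {
  return { tasks };
}

export function executePayload(machineId: string, context: Record<string, unknown> = {}) {
  return { machineId, context };
}

// Sketch of the round trip: create a machine from a task list, kick off
// execution, then poll its status.
export async function runFlow(tasks: string[]) {
  const post = (url: string, body: unknown) =>
    fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(body),
    }).then((r) => r.json());

  const { id } = await post("/api/state-machine/create", createPayload(tasks));
  await post("/api/state-machine/execute", executePayload(id));
  return fetch(`/api/state-machine/status/${id}`).then((r) => r.json());
}
```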
## Contributing

We welcome contributions! Please see our Contributing Guide for details.
## Development Status

This is an experimental prototype: expect bugs and breaking changes. The focus is on showcasing innovative AI-driven software composition patterns rather than production stability.
- Some TODO items remain in the codebase
- Limited error recovery in certain edge cases
## License

This project is licensed under the MIT License - see the LICENSE file for details.