# Configuration Guide

Aden Hive is a Python-based agent framework. Configuration is handled through environment variables and agent-level config files. There is no centralized `config.yaml` or Docker Compose setup.

## Configuration Overview
```
Environment variables (API keys, runtime flags)
Agent config.py (per-agent settings: model, tools, storage)
pyproject.toml (package metadata and dependencies)
.mcp.json (MCP server connections)
```

## Environment Variables

### LLM Providers (at least one required for real execution)

```bash
# Anthropic (primary provider)
export ANTHROPIC_API_KEY="sk-ant-..."

# OpenAI (optional, for GPT models via LiteLLM)
export OPENAI_API_KEY="sk-..."

# Cerebras (optional, used by output cleaner and some nodes)
export CEREBRAS_API_KEY="..."

# Groq (optional, fast inference)
export GROQ_API_KEY="..."
```

The framework supports 100+ LLM providers through [LiteLLM](https://docs.litellm.ai/docs/providers). Set the corresponding environment variable for your provider.
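Because provider selection is driven entirely by environment variables, a small helper can report which providers are usable before an agent starts. This is an illustrative sketch, not part of the framework API:

```python
import os

# Environment variables checked for each provider (names from the section above).
PROVIDER_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "cerebras": "CEREBRAS_API_KEY",
    "groq": "GROQ_API_KEY",
}

def configured_providers() -> list[str]:
    """Return the providers whose API key is set in the current environment."""
    return [name for name, var in PROVIDER_KEYS.items() if os.environ.get(var)]
```

Running this before launching an agent gives a quick sanity check that at least one provider is configured.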

### Search & Tools (optional)

```bash
# Web search for agents (Brave Search)
export BRAVE_SEARCH_API_KEY="..."

# Exa Search (alternative web search)
export EXA_API_KEY="..."
```

### Runtime Flags

```bash
# Run agents without LLM calls (structure-only validation)
export MOCK_MODE=1

# Custom credentials storage path (default: ~/.aden/credentials)
export ADEN_CREDENTIALS_PATH="/custom/path"

# Custom agent storage path (default: /tmp)
export AGENT_STORAGE_PATH="/custom/storage"
```
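Code that respects these flags might read them as follows. These helpers are illustrative, not framework API; the defaults mirror the ones documented above:

```python
import os

def mock_mode_enabled() -> bool:
    """True when MOCK_MODE=1, i.e. agents run without real LLM calls."""
    return os.environ.get("MOCK_MODE", "") == "1"

def storage_root() -> str:
    """Resolve the agent storage path, falling back to the documented default."""
    return os.environ.get("AGENT_STORAGE_PATH", "/tmp")
```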

## Agent Configuration

Each agent package in `exports/` contains its own `config.py`:

```python
# exports/my_agent/config.py
CONFIG = {
    "model": "claude-haiku-4-5-20251001",  # Default LLM model
    "max_tokens": 4096,
    "temperature": 0.7,
    "tools": ["web_search", "pdf_read"],   # MCP tools to enable
    "storage_path": "/tmp/my_agent",       # Runtime data location
}
```
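A hypothetical loader could overlay an agent's `CONFIG` on framework-wide defaults, so each `config.py` only needs to state what differs. The `DEFAULTS` values here are illustrative; the framework's real defaults may differ:

```python
# Illustrative framework defaults (assumed, not taken from the framework source).
DEFAULTS = {
    "model": "claude-haiku-4-5-20251001",
    "max_tokens": 4096,
    "temperature": 0.7,
    "tools": [],
    "storage_path": "/tmp",
}

def resolve_config(agent_config: dict) -> dict:
    """Overlay per-agent settings on the defaults; agent values win."""
    return {**DEFAULTS, **agent_config}
```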

### Agent Graph Specification

Agent behavior is defined in `agent.json` (or constructed in `agent.py`):

```json
{
  "id": "my_agent",
  "name": "My Agent",
  "goal": {
    "success_criteria": [...],
    "constraints": [...]
  },
  "nodes": [...],
  "edges": [...]
}
```
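A minimal check that an `agent.json` document carries the top-level keys shown above can be sketched as follows (hypothetical helper, not framework code):

```python
import json

# Top-level keys every agent graph specification is expected to carry.
REQUIRED_KEYS = {"id", "name", "goal", "nodes", "edges"}

def load_agent_spec(raw: str) -> dict:
    """Parse an agent.json document and verify its top-level structure."""
    spec = json.loads(raw)
    missing = REQUIRED_KEYS - spec.keys()
    if missing:
        raise ValueError(f"agent.json is missing keys: {sorted(missing)}")
    return spec
```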

See the [Getting Started Guide](getting-started.md) for building agents.

## MCP Server Configuration


MCP (Model Context Protocol) servers are configured in `.mcp.json` at the project root:

```json
{
  "servers": {
    "tools": {
      "command": "python",
      "args": ["tools/mcp_server.py"],
      "env": {
        "BRAVE_SEARCH_API_KEY": "..."
      }
    }
  }
}
```

The tools MCP server exposes 19 tools including web search, PDF reading, CSV processing, and file system operations.
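Given the `.mcp.json` layout above, the command line used to launch a named server can be derived directly. A hypothetical helper, assuming only the fields shown in the example:

```python
import json

def server_launch_command(mcp_json: str, name: str) -> list[str]:
    """Extract the command line that starts a named server from .mcp.json."""
    config = json.loads(mcp_json)
    server = config["servers"][name]
    return [server["command"], *server.get("args", [])]
```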

## Storage

Aden Hive uses **file-based persistence** (no database required):

```
{storage_path}/
  runs/{run_id}.json            # Complete execution traces
  indexes/
    by_goal/{goal_id}.json      # Runs indexed by goal
    by_status/{status}.json     # Runs indexed by status
    by_node/{node_id}.json      # Runs indexed by node
  summaries/{run_id}.json       # Quick-load run summaries
```

Storage is managed by `framework.storage.FileStorage`. No external database setup is needed.
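Under the layout above, a run's files can be located by path arithmetic alone. An illustrative helper (not the `FileStorage` API):

```python
from pathlib import Path

def run_paths(storage_path: str, run_id: str) -> dict[str, Path]:
    """Map a run id to its trace and summary files under the documented layout."""
    root = Path(storage_path)
    return {
        "trace": root / "runs" / f"{run_id}.json",
        "summary": root / "summaries" / f"{run_id}.json",
    }
```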

## IDE Setup

### VS Code

Add to `.vscode/settings.json`:

```json
{
  "python.analysis.extraPaths": [
    "${workspaceFolder}/core",
    "${workspaceFolder}/exports"
  ]
}
```

### PyCharm

1. Open Project Settings > Project Structure
2. Mark `core` as Sources Root
3. Mark `exports` as Sources Root

## Security Best Practices

1. **Never commit API keys** - Use environment variables or `.env` files
2. **`.env` is git-ignored** - Copy from `.env.example` and fill in your values
3. **Mock mode for testing** - Set `MOCK_MODE=1` to avoid LLM calls during development
4. **Credential isolation** - Each tool validates its own credentials at runtime
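The usual tool for loading `.env` files in Python is python-dotenv; as a sketch of what such loading amounts to, here is a minimal parser that handles only plain `KEY=value` lines (no quoting or interpolation rules):

```python
def load_env(text: str) -> dict[str, str]:
    """Parse KEY=value lines, skipping blanks and # comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values
```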

## Troubleshooting

### "ModuleNotFoundError: No module named 'framework'"

Install the core package:

```bash
cd core && pip install -e .
```

### API key not found

Ensure the environment variable is set in your current shell session:

```bash
echo $ANTHROPIC_API_KEY # Should print your key
```

On Windows PowerShell:

```powershell
$env:ANTHROPIC_API_KEY = "sk-ant-..."
```

### Agent not found

Run from the project root with PYTHONPATH:

```bash
PYTHONPATH=core:exports python -m my_agent validate
```

See [Environment Setup](../ENVIRONMENT_SETUP.md) for detailed installation instructions.