# .env.example
PYTHONPATH=.
#============================
# LLM Provider Settings
#============================
# CoffeeAGNTCY uses litellm to manage LLM connections.
# Full list of supported providers: https://docs.litellm.ai/docs/providers
# Note: In CoffeeAGNTCY, the environment variable for specifying the model is always LLM_MODEL, regardless of the provider.
# Examples:
# OpenAI
# LLM_MODEL="openai/<model_of_choice>"
# OPENAI_API_KEY=<your_openai_api_key>
# Azure OpenAI
# LLM_MODEL="azure/<your_deployment_name>"
# AZURE_API_BASE=https://your-azure-resource.openai.azure.com/
# AZURE_API_KEY=<your_azure_api_key>
# AZURE_API_VERSION=<your_azure_api_version>
# Groq
# LLM_MODEL="groq/<model_of_choice>"
# GROQ_API_KEY=<your_groq_api_key>
# LiteLLM Proxy
# LLM_MODEL="azure/<your_deployment_name>"
# LITELLM_PROXY_BASE_URL=<your_litellm_proxy_base_url>
# LITELLM_PROXY_API_KEY=<your_litellm_proxy_api_key>
# NVIDIA NIM
# LLM_MODEL="nvidia_nim/<model_of_choice>"
# NVIDIA_NIM_API_KEY=<your_nvidia_api_key>
# NVIDIA_NIM_API_BASE=<your_nvidia_nim_endpoint_url>
# Recommended temperature setting for OpenAI models
OPENAI_TEMPERATURE=0.7
#============================
# Local Development Settings
#============================
# (not needed when running agents via Docker Compose)
# OTEL environment variables
# OTLP_HTTP_ENDPOINT="http://localhost:4318"
# === Transport Settings ===
# SLIM (Default):
# DEFAULT_MESSAGE_TRANSPORT=SLIM
# TRANSPORT_SERVER_ENDPOINT=http://localhost:46357
# Alternative: NATS transport (uncomment to use). The endpoint must be of the form nats://host:port.
# DEFAULT_MESSAGE_TRANSPORT=NATS
# TRANSPORT_SERVER_ENDPOINT=nats://localhost:4222
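
A minimal sketch of how agent-side code might consume the variables above. The variable names (`LLM_MODEL`, `OPENAI_TEMPERATURE`) come from this file; the default model string and the provider-splitting logic are illustrative assumptions, not the repo's actual startup code.

```python
import os

# Illustrative defaults; in practice these come from the .env file.
os.environ.setdefault("LLM_MODEL", "openai/gpt-4o")  # hypothetical model choice
os.environ.setdefault("OPENAI_TEMPERATURE", "0.7")

# LLM_MODEL is always "<provider>/<model>", regardless of provider,
# so the provider prefix can be split off if needed.
model = os.environ["LLM_MODEL"]
temperature = float(os.environ["OPENAI_TEMPERATURE"])
provider, _, model_name = model.partition("/")

print(f"provider={provider} model={model_name} temperature={temperature}")
```

The same `model` string would be passed straight to litellm (e.g. `litellm.completion(model=model, ...)`), which routes on the provider prefix itself.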