
Add OPENAI_LLM_TIMEOUT env var for custom OpenAI-compatible provider #1022

Closed
joseluisll wants to merge 2 commits into NousResearch:main from joseluisll:feat/custom-timeout

Conversation

@joseluisll commented Mar 12, 2026

Summary

Adds support for OPENAI_LLM_TIMEOUT environment variable to control LLM timeout when using custom OpenAI-compatible providers.

This PR addresses issue #1010.

Changes

run_agent.py

  • Modified _build_api_kwargs() method in AIAgent class (line 2624)
  • Timeout now dynamically uses OPENAI_LLM_TIMEOUT env var when provider is "custom"
  • Falls back to default 900 seconds for other providers or invalid env values
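The fallback logic described above could look roughly like this. This is a minimal sketch, not the PR's actual code: `resolve_llm_timeout` and `DEFAULT_TIMEOUT_SECONDS` are hypothetical names, and the rejection of non-positive values is an assumption beyond what the PR states.

```python
import os

DEFAULT_TIMEOUT_SECONDS = 900.0  # default used for all non-custom providers

def resolve_llm_timeout(provider: str) -> float:
    """Return the request timeout in seconds.

    Only the "custom" provider honors OPENAI_LLM_TIMEOUT; an unset,
    non-numeric, or non-positive value falls back to the default.
    """
    if provider == "custom":
        raw = os.environ.get("OPENAI_LLM_TIMEOUT")
        if raw is not None:
            try:
                value = float(raw)
            except ValueError:
                return DEFAULT_TIMEOUT_SECONDS  # invalid value: use default
            if value > 0:
                return value
    return DEFAULT_TIMEOUT_SECONDS
```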

tests/test_run_agent.py

Added comprehensive test coverage:

  • test_timeout_custom_provider_with_env: Verifies custom timeout from env var
  • test_timeout_custom_provider_without_env: Verifies default timeout when env not set
  • test_timeout_openrouter_ignores_env: Confirms other providers ignore the env var
  • test_timeout_invalid_env_value: Validates fallback to default on invalid values
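Env-var cases like these are typically isolated with `unittest.mock.patch.dict`, which restores the environment after each test. A sketch of the general shape; the `read_timeout` stand-in here is hypothetical, not the code under test:

```python
import os
from unittest import mock

def read_timeout(default: float = 900.0) -> float:
    # Hypothetical stand-in for the timeout resolution in run_agent.py.
    try:
        return float(os.environ["OPENAI_LLM_TIMEOUT"])
    except (KeyError, ValueError):
        return default

def test_timeout_with_env():
    # patch.dict reverts os.environ when the with-block exits
    with mock.patch.dict(os.environ, {"OPENAI_LLM_TIMEOUT": "120"}):
        assert read_timeout() == 120.0

def test_timeout_invalid_env_value():
    with mock.patch.dict(os.environ, {"OPENAI_LLM_TIMEOUT": "not-a-number"}):
        assert read_timeout() == 900.0
```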

hermes_cli/config.py

Added OPENAI_LLM_TIMEOUT to OPTIONAL_ENV_VARS (line 576)

  • Marked as advanced setting
  • Documents purpose and usage
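A plausible shape for that registry entry is sketched below. The dict layout is an assumption for illustration; the actual structure of `OPTIONAL_ENV_VARS` in `hermes_cli/config.py` may differ.

```python
# Hypothetical layout for the optional-variable registry in config.py.
OPTIONAL_ENV_VARS = {
    "OPENAI_LLM_TIMEOUT": {
        "description": (
            "Request timeout in seconds for custom OpenAI-compatible "
            "providers; falls back to 900 when unset or invalid."
        ),
        "advanced": True,  # flagged as an advanced setting
    },
}
```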

Behavior

Provider   OPENAI_LLM_TIMEOUT set?   Timeout used
custom     Yes                       Env value
custom     No / invalid              900 seconds
Other      Any                       900 seconds
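For a local-LLM setup, usage might look like the following. The base-URL variable is illustrative of common OpenAI-client conventions; only OPENAI_LLM_TIMEOUT is defined by this PR.

```shell
# Illustrative setup for a custom OpenAI-compatible endpoint.
# OPENAI_API_BASE is a common client convention, not defined by this PR.
export OPENAI_API_BASE="http://localhost:8000/v1"
export OPENAI_LLM_TIMEOUT=120   # seconds; only honored by the "custom" provider
```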

Testing

All tests passing:

  • ✅ test_basic_kwargs
  • ✅ test_timeout_custom_provider_with_env
  • ✅ test_timeout_custom_provider_without_env
  • ✅ test_timeout_openrouter_ignores_env
  • ✅ test_timeout_invalid_env_value

@joseluisll (Author)

@teknium1 Can you please review this PR? A hardcoded 900-second timeout is a blocker for developers running local LLMs.

@arilotter left a comment

this commit includes many unrelated formatting changes. please revert those.

Commit message of the reworked commit:

Adds support for OPENAI_LLM_TIMEOUT environment variable to control LLM timeout when using custom OpenAI-compatible providers.

Key changes:
- Added OPENAI_LLM_TIMEOUT to OPTIONAL_ENV_VARS in config.py
- Modified _build_api_kwargs() in run_agent.py to use custom timeout for custom providers
- Added comprehensive test coverage for all scenarios

Behavior:
- Custom provider + env set: Uses OPENAI_LLM_TIMEOUT value
- Custom provider + env not set/invalid: Falls back to 900 seconds
- Other providers: Always uses 900 seconds (ignores env var)

This addresses issue NousResearch#1010 without any unrelated formatting changes.
@joseluisll force-pushed the feat/custom-timeout branch from 5f24034 to bc10916 on March 12, 2026 17:37
@joseluisll (Author)

> this commit includes many unrelated formatting changes. please revert those.

I have reverted the formatting changes, so only the timeout change for the custom provider remains.

@joseluisll requested a review from arilotter on March 12, 2026 18:43
@joseluisll (Author)

This change was made in PR #1194 instead. Use the environment variable HERMES_API_TIMEOUT.

@joseluisll closed this Mar 14, 2026
@joseluisll deleted the feat/custom-timeout branch on March 14, 2026 05:26