Description
When using the AutoAgents class with process="hierarchical" and specifying a Gemini model (e.g., llm="gemini/gemini-2.5-flash-lite-preview-06-17"), the library still requires an OpenAI API key. This happens even though OpenAI is not being used, and results in a ValueError about the missing OPENAI_API_KEY.
Environment
- PraisonAI version: PraisonAI==2.2.53
- Operating System: macOS 15.4
Full Code
from praisonaiagents import AutoAgents

def get_stock_price(company_name: str) -> str:
    """
    Get the stock price of a company

    Args:
        company_name (str): The name of the company

    Returns:
        str: The stock price of the company
    """
    if company_name.lower() in ("apple", "aapl"):
        return f"The stock price of {company_name} is 100"
    elif company_name.lower() in ("google", "googl"):
        return f"The stock price of {company_name} is 200"
    else:
        return f"The stock price of {company_name} is 50"

# Create an AutoAgents instance
agents = AutoAgents(
    instructions="Write a poem on the stock price of apple",
    tools=[get_stock_price],
    process="hierarchical",
    llm="gemini/gemini-2.5-flash-lite-preview-06-17",
    self_reflect=True,
    verbose=True,
    max_agents=3  # Maximum number of agents to create
)

# Start the agents
result = agents.start()
print(result)
Steps to Reproduce
- Install the library
- Copy the code above
- Run the script
Expected Behavior
The code should run using the Gemini model and not require an OpenAI API key.
Actual Behavior
ValueError: OPENAI_API_KEY environment variable is required for the default OpenAI service. If you are targeting a local server (e.g., LM Studio), ensure OPENAI_API_BASE is set (e.g., 'http://localhost:1234/v1') and you can use a placeholder API key by setting OPENAI_API_KEY='not-needed'
Additional Context
This only happens when process="hierarchical" is used; process="sequential" works fine.
It appears the manager agent is hardcoded to use OpenAI, regardless of the llm parameter.
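Until the manager agent respects the llm parameter, the ValueError message itself points at a possible workaround: supply a placeholder OpenAI key before constructing AutoAgents. This is only a sketch based on the error text; whether the hierarchical manager then genuinely routes through Gemini (rather than silently targeting OpenAI) is unverified.

```python
import os

# Workaround suggested by the ValueError text: set a placeholder key so the
# default OpenAI client can initialize. Do this before creating AutoAgents.
# Note: if the manager agent really is hardcoded to OpenAI, its requests may
# still fail (or be misrouted) at call time; this only bypasses the startup
# check.
os.environ.setdefault("OPENAI_API_KEY", "not-needed")
```

This silences the missing-key check but does not fix the underlying routing bug described above.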