Description
Question
Simple question. The docs did not help me resolve this, so here I am.

I want to use Agent to generate a general-purpose string chat response. The system I built works as intended when output_type is a BaseModel. But what about when all I want is any generated string?

I noticed that the default is output_type=str, but, as I understand it, PydanticAI still requires the LLM to call a final_result(parameter: str) tool under the hood, which is additional overhead; and the LLM fails at this task (because we are testing with small GPU models). Passing None did not behave as direct chat, and coding agents with Context7 and e.g. Google/Bing give differing answers, none of which solve this particular pattern. (For instance, Gemini suggested output_type=Agent[None | str], but see above.) What is the correct solution?
# Relevant part of my agent factory (model_name, provider, output_type,
# instructions, and handler are passed in by the caller):
from pydantic_ai import Agent, InstrumentationSettings
from pydantic_ai.models.openai import OpenAIChatModel

llm = OpenAIChatModel(model_name=model_name, provider=provider)
return Agent(
    llm,
    output_type=output_type,
    instructions=instructions or "Be concise, reply with the correct answer.",
    event_stream_handler=handler,
    instrument=InstrumentationSettings(include_content=True),
)
Not a blocker, just inconvenient. Thanks.
Additional Context
No response