How to specify Agent without requiring structured output? #4724

@davidbernat

Description

Question

Simple question. The docs haven't helped me resolve this, so here I am.

I want to use Agent to generate a general-purpose string chat response. The system I wrote works as intended when output_type is a BaseModel. But what about when all I want is any generated string? I noticed that the default is output_type=str, but, as I understand it, PydanticAI still requires the LLM to call the final_result(parameter: str) tool under the hood, which adds overhead; and the LLM fails at this task (because we are testing with small GPU models). Passing None did not behave as direct chat, and the code agents backed by Context7 and e.g. Google/Bing give differing answers, none of which solves this particular pattern. (For instance, Gemini suggested output_type=Agent[None | str], but see above.) What is the correct solution?

    llm = OpenAIChatModel(model_name=model_name, provider=provider)
    return Agent(llm, output_type=output_type,
                 instructions=instructions or "Be concise, reply with the correct answer.",
                 event_stream_handler=handler,
                 instrument=InstrumentationSettings(include_content=True))

Not a blocker, just inconvenient. Thanks.

Additional Context

No response

Metadata
Labels: question (further information is requested)
