Move response format from general user-prompt to LLM class #27

@dguerri

Description

The response template is currently appended to the user_prompt before the latter is processed by each LLM class.
Some LLM APIs natively support a response format (e.g., ChatGPT and Gemini), and these models can get confused by the JSON dump appended to the user prompt.

One solution is to let each LLM class add the response format itself, for example by implementing user-prompt creation in the base class and allowing subclasses to specialize its behaviour.
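A minimal sketch of what this could look like; all names here (`BaseLLM`, `build_user_prompt`, `format_response_instructions`) are illustrative, not taken from the repo:

```python
# Hypothetical sketch of the proposed refactor: the base class builds the
# user prompt and appends the response-format instructions by default
# (current behaviour); subclasses whose API supports structured output
# override the hook to skip the inline JSON dump.
import json


class BaseLLM:
    """Base class: prompt construction lives here."""

    def __init__(self, response_schema: dict):
        self.response_schema = response_schema

    def build_user_prompt(self, user_prompt: str) -> str:
        # Template method: subclasses customize the instructions hook,
        # not the overall prompt assembly.
        return user_prompt + self.format_response_instructions()

    def format_response_instructions(self) -> str:
        # Default: dump the schema into the prompt text.
        return "\nRespond using this JSON schema:\n" + json.dumps(self.response_schema)


class NativeFormatLLM(BaseLLM):
    """Subclass for APIs with native response-format support."""

    def format_response_instructions(self) -> str:
        # The schema would instead be passed via the API's own
        # response-format parameter, so nothing is added to the prompt.
        return ""
```

With this split, the JSON dump only reaches models that have no native structured-output support, while the prompt-assembly logic stays in one place.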
