
feat: Helper method to return combined output response #840

@larskoole

Description


A common practice in prompting is to ask the LLM to return what you want between XML tags.
For example, if you ask it to return the Laravel version of a project in tags, you'd get <laravel_version>12.1</laravel_version>.

In this case it would be super easy to extract only the Laravel version from the response.
But an issue arises when a model starts to think in steps.
The moment a model (looking at you, Gemini) outputs <laravel_version>12.1</laravel_version> in any step other than the last one, we can't rely on the ->text attribute of the Response class.
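As a minimal sketch of that extraction step (the function name and tag are just illustrative, not part of any package API):

```php
<?php
// Minimal sketch: pull a value wrapped in XML-style tags out of an LLM response.
function extractTag(string $text, string $tag): ?string
{
    // /s lets the dot match newlines, in case the model wraps the value.
    $pattern = sprintf('/<%1$s>(.*?)<\/%1$s>/s', preg_quote($tag, '/'));

    return preg_match($pattern, $text, $m) ? $m[1] : null;
}

echo extractTag('Some reasoning... <laravel_version>12.1</laravel_version>', 'laravel_version'); // 12.1
```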

The openai-php/client package solves this by providing an ->outputText method, which combines all of the model's output.

(Screenshot of the openai-php/client implementation.)

Not sure what the best approach would be for Prism here, but any kind of "just give me all the output as a string so I can extract something specific myself" method would be nice.

Having to glue together all steps all the time is quite a chore.
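To make the ask concrete, here is a rough sketch of what such a helper could look like. The Step and Response shapes below are assumptions for illustration only (none of these class or property names are confirmed Prism API):

```php
<?php
// Hypothetical sketch of the requested helper. Assumes the response exposes
// a list of steps, each carrying the text that step produced.
class Step
{
    public function __construct(public string $text) {}
}

class Response
{
    /** @param Step[] $steps */
    public function __construct(public array $steps) {}

    // The proposed helper: concatenate every step's output into one string.
    public function outputText(): string
    {
        return implode('', array_map(fn (Step $s) => $s->text, $this->steps));
    }
}

$response = new Response([
    new Step("Thinking about the project...\n"),
    new Step('<laravel_version>12.1</laravel_version>'),
    new Step("\nDone."),
]);

// Tag extraction now works no matter which step produced the tag.
preg_match('/<laravel_version>(.*?)<\/laravel_version>/s', $response->outputText(), $m);
echo $m[1]; // 12.1
```

With a helper like this, the "glue all the steps together" step happens once inside the package instead of in every caller.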
