🚀 Feature: Add LLM_IS_STREAMING to span attr for langchain instrumentation #3411

@minimAluminiumalism

Description

Which component is this feature for?

Langchain Instrumentation

🔖 Feature description

The attribute LLM_IS_STREAMING is already used in the instrumentations for ollama, openai, and groq, but not in the langchain instrumentation. Since LangChain also has extensive streaming scenarios when calling LLM models, we should set this span attribute there as well.

🎤 Why is this feature needed?

As stated above.

✌️ How do you aim to achieve this?

Set this attribute in on_llm_new_token(), since that callback is only triggered for streaming requests.
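A minimal sketch of the idea, with simplified stand-ins for the span and callback handler (the attribute key `llm.is_streaming` and the class names here are assumptions, not the real instrumentation internals):

```python
# Hedged sketch: setting LLM_IS_STREAMING from on_llm_new_token().
# FakeSpan and StreamingAwareCallbackHandler are simplified stand-ins,
# not the actual opentelemetry-instrumentation-langchain code.

LLM_IS_STREAMING = "llm.is_streaming"  # assumed attribute key


class FakeSpan:
    """Minimal stand-in for an OpenTelemetry span."""

    def __init__(self):
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value


class StreamingAwareCallbackHandler:
    """Sketch of a LangChain callback handler that marks streaming runs."""

    def __init__(self, span):
        self.span = span

    def on_llm_start(self, serialized, prompts, **kwargs):
        # Default to non-streaming until a token callback proves otherwise.
        self.span.set_attribute(LLM_IS_STREAMING, False)

    def on_llm_new_token(self, token, **kwargs):
        # on_llm_new_token only fires for streaming requests, so flip the flag.
        self.span.set_attribute(LLM_IS_STREAMING, True)


span = FakeSpan()
handler = StreamingAwareCallbackHandler(span)
handler.on_llm_start({}, ["hello"])
handler.on_llm_new_token("Hi")
print(span.attributes[LLM_IS_STREAMING])  # → True
```

Because the flag starts as False in on_llm_start(), non-streaming calls never receive a token callback and keep the correct value.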

🔄️ Additional Information

No response

👀 Have you spent some time to check if this feature request has been raised before?

  • I checked and didn't find a similar issue

Are you willing to submit PR?

Yes I am willing to submit a PR!
