The attribute LLM_IS_STREAMING is already used in the instrumentations for ollama, openai, and groq, but is not set by the langchain instrumentation. Since LangChain also has extensive streaming scenarios when calling LLM models, we should add this span attribute there as well.
🎤 Why is this feature needed?
As stated above.
✌️ How do you aim to achieve this?
Set this attribute in on_llm_new_token(), since that callback is only triggered for streaming requests.
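A minimal sketch of the idea, with simplified stand-ins: the `Span` and handler classes below are illustrative placeholders, not the actual langchain instrumentation code, and `SpanAttributes.LLM_IS_STREAMING` mirrors the attribute name used by the other instrumentations.

```python
class SpanAttributes:
    # Mirrors the attribute already emitted by the ollama/openai/groq
    # instrumentations; the exact constant lives in the shared semconv package.
    LLM_IS_STREAMING = "llm.is_streaming"


class Span:
    """Simplified stand-in for an OpenTelemetry span."""

    def __init__(self):
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value


class StreamingAwareHandler:
    """Hypothetical LangChain-style callback handler holding the current span."""

    def __init__(self, span):
        self.span = span

    def on_llm_new_token(self, token, **kwargs):
        # on_llm_new_token fires only for streaming requests, so setting the
        # attribute here marks exactly the spans that streamed their response.
        self.span.set_attribute(SpanAttributes.LLM_IS_STREAMING, True)


span = Span()
handler = StreamingAwareHandler(span)
handler.on_llm_new_token("Hello")
print(span.attributes[SpanAttributes.LLM_IS_STREAMING])  # True
```

Non-streaming calls never invoke `on_llm_new_token()`, so their spans would simply lack the attribute (or could default it to `False` elsewhere in the instrumentation).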
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?