Which component is this feature for?
Langchain Instrumentation
🔖 Feature description
The attribute LLM_IS_STREAMING is already used in the instrumentations for ollama, openai, and groq, but is not set by the langchain instrumentation. Since LangChain also has extensive streaming scenarios when calling LLM models, we should add it as a span attribute there as well.
🎤 Why is this feature needed ?
As stated above.
✌️ How do you aim to achieve this?
Set this attribute in on_llm_new_token(), since that callback is only triggered for streaming requests.
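A minimal sketch of the idea, using a stand-in span object instead of the real OpenTelemetry span and a simplified handler shape (the actual instrumentation tracks spans per run differently; the attribute key string is assumed here):

```python
# Stand-in for an OpenTelemetry span, for illustration only.
class _Span:
    def __init__(self):
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value


# Attribute key as used by the ollama/openai/groq instrumentations (assumed).
LLM_IS_STREAMING = "llm.is_streaming"


class StreamingAwareHandler:
    """Sketch of a LangChain-style callback handler keeping one span per run."""

    def __init__(self):
        self.spans = {}  # run_id -> span

    def on_llm_start(self, run_id):
        self.spans[run_id] = _Span()

    def on_llm_new_token(self, token, run_id):
        # on_llm_new_token fires only for streaming requests, so marking
        # the span here is sufficient: non-streaming runs never reach it.
        self.spans[run_id].set_attribute(LLM_IS_STREAMING, True)


handler = StreamingAwareHandler()
handler.on_llm_start(run_id=1)
handler.on_llm_new_token("Hel", run_id=1)
print(handler.spans[1].attributes)  # {'llm.is_streaming': True}
```

A non-streaming run would only see on_llm_start/on_llm_end, so its span never gets the attribute, which matches the semantics of the other instrumentations.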
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
Are you willing to submit PR?
Yes I am willing to submit a PR!