Deprecation Notice: LangSmith Tracing Changes #34690
mdrxy announced in Announcements
Summary
Starting with langchain-core 1.2.4, we've made important changes to how runs are traced to LangSmith. If you're using an older version of LangChain, you may experience missing token counts, cost information, or input data in your LangSmith traces.

What's Changing
1. Token Counts and Cost Metrics
Issue: Token counts and cost information may not appear in LangSmith traces when using older LangChain versions.
What happened: We've updated where token usage metadata is stored in traces. The LangSmith platform now expects this information in a specific location that older LangChain versions don't populate.
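The announcement doesn't name the new location, but recent langchain-core releases surface token usage on AI messages as usage_metadata, a dict with input_tokens, output_tokens, and total_tokens keys. The toy dict below is an illustrative stand-in for that shape, not a real model response:

```python
# Illustrative only: a stand-in for the `usage_metadata` attached to an
# AI message in recent langchain-core releases (not a real model response).
usage_metadata = {
    "input_tokens": 12,
    "output_tokens": 34,
    "total_tokens": 46,
}

# LangSmith derives cost metrics from these counts, so the total should be
# consistent with the parts.
assert usage_metadata["total_tokens"] == (
    usage_metadata["input_tokens"] + usage_metadata["output_tokens"]
)
print(usage_metadata["total_tokens"])  # 46
```

Older langchain-core versions don't populate this location, which is why token counts and costs can vanish from traces until you upgrade.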
Action Required: Upgrade to langchain-core 1.2.4 or later to ensure token counts and cost metrics continue to appear in your traces. Information on backports below.

2. Streaming Runnable Inputs
Issue: When using the .transform() or .atransform() methods on any runnable, input data may not appear in LangSmith traces on older LangChain versions.

What happened: Previously, these methods would create a trace before inputs were fully materialized, then attempt to update the inputs later. LangSmith now requires all inputs to be provided when the trace is initially created, not as a later update.
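Why this matters can be seen with a toy tracer (hypothetical names, not the LangSmith or langchain-core API): the streamed input must be materialized before the run is created, so the run is born with complete inputs instead of being patched afterward. A minimal sketch:

```python
from typing import Iterator


class ToyTracer:
    """Hypothetical stand-in for a tracing client; not the real LangSmith API."""

    def __init__(self) -> None:
        self.runs = []

    def create_run(self, inputs: dict) -> dict:
        # New contract: inputs must be complete at creation time;
        # there is no later "update inputs" step.
        run = {"inputs": inputs, "outputs": None}
        self.runs.append(run)
        return run


def traced_transform(chunks: Iterator[str], tracer: ToyTracer) -> Iterator[str]:
    # Materialize the streamed input *before* creating the run, so the
    # trace starts with full inputs (what newer versions do; older
    # versions created the run first and patched inputs afterward).
    materialized = list(chunks)
    tracer.create_run({"input": "".join(materialized)})
    for chunk in materialized:
        yield chunk.upper()


tracer = ToyTracer()
out = list(traced_transform(iter(["hel", "lo"]), tracer))
print(out)                       # ['HEL', 'LO']
print(tracer.runs[0]["inputs"])  # {'input': 'hello'}
```

On older versions, the run would exist before "hello" was assembled, and the later input update is now rejected, leaving the trace's inputs empty.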
Action Required: Upgrade to langchain-core 1.2.4 or later to ensure inputs from .transform() and .atransform() are properly captured in traces. Information on backports below.

Recommended Action
Upgrade to the latest version of LangChain:
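For example, with pip (substitute your own package manager as appropriate):

```shell
# Upgrade to the latest 1.x release, which includes the tracing fix.
pip install -U langchain-core

# Or, if you're pinned to the 0.3.x line, take the backported fix:
pip install -U "langchain-core>=0.3.82,<1"
```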
Affected Versions
langchain-core (1.x): versions prior to 1.2.4
langchain-core (0.3.x): versions prior to 0.3.82 (backport)

Timeline
Questions?
If you have questions or need assistance upgrading, please:
Note: This change only affects tracing to LangSmith. Core LangChain functionality is not impacted. However, upgrading ensures you maintain full observability of your LLM applications.