export LANGCHAIN_CALLBACKS_BACKGROUND=true
If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:

export LANGCHAIN_CALLBACKS_BACKGROUND=false

See this LangChain.js guide for more information.
An example of a trace logged using the above code is made public and can be viewed here.
In Python, you can pass a `LangChainTracer` (reference docs) instance as a callback, or by using the `tracing_context` context manager (reference docs).
In JS/TS, you can pass a `LangChainTracer` (reference docs) instance as a callback.
You can set the `LANGSMITH_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.
The `LANGSMITH_PROJECT` flag is only supported in JS SDK versions >= 0.2.16; use `LANGCHAIN_PROJECT` instead if you are using an older version.
You can also set the project name for a specific trace by passing it to the `LangChainTracer` instance at construction, or as a parameter to the `tracing_context` context manager in Python.
You can annotate your traces with arbitrary metadata and tags by providing them in the `RunnableConfig`. This is useful for associating additional information with a trace, such as the environment in which it was executed, or the user who initiated it. For information on how to query traces and runs by metadata and tags, see this guide.
When tags and metadata are attached to a specific runnable (either through the `RunnableConfig` or at runtime with invocation params), they are inherited by all child runnables of that runnable.
You can customize the name of a given run by setting a `run_name` in the `RunnableConfig` object at construction, or by passing a `run_name` in the invocation parameters in JS/TS.
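As a plain-Python sketch of the config shape, using the keys named above (the actual invocation on a hypothetical `chain` is shown only in a comment):

```python
# A plain-dict sketch of the RunnableConfig keys described above.
# In real code you would pass this dict as `config=` to invoke/stream.
config = {
    "run_name": "MyCustomChain",                    # display name of the run
    "tags": ["production", "experiment-a"],         # searchable labels
    "metadata": {"environment": "staging", "user_id": "user-123"},
}
# chain.invoke({"question": "..."}, config=config)  # `chain` is assumed
print(config["run_name"])  # → MyCustomChain
```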
The `run_name` parameter only changes the name of the runnable you invoke (e.g., a chain or function). It does not rename the nested run automatically created when you invoke an LLM object like `ChatOpenAI` (`gpt-4o-mini`). In the example, the enclosing run will appear in LangSmith as `MyCustomChain`, while the nested LLM run still shows the model's default name.
To give the LLM run a more meaningful name, you can either:
- wrap the model call in its own runnable and assign a `run_name` to that step, or
- use a decorator (`@traceable` in Python, or `traceable` from `langsmith` in JS/TS) to create a custom run around the model call.
You can customize the ID of a given run by setting a `run_id` in the `RunnableConfig` object at construction, or by passing a `run_id` in the invocation parameters.
Note that for a top-level run, this custom ID is also used as the trace ID (`trace_id`).
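A minimal sketch of pre-generating the ID so you can correlate the trace with your own systems (the `chain` call is assumed and shown only in a comment):

```python
import uuid

# Pre-generate the run ID so it can be stored in your own records
# and used later to look up the trace in LangSmith.
my_run_id = uuid.uuid4()
config = {"run_id": my_run_id}
# chain.invoke(inputs, config=config)  # `chain` is assumed; the run uses my_run_id
print(str(my_run_id))
```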
Alternatively, you can pass a `RunCollectorCallbackHandler` instance as a callback to access the run ID after the invocation completes.
In serverless environments, you can instead set the `LANGCHAIN_CALLBACKS_BACKGROUND` environment variable to `"false"` so that callbacks are processed before your function exits.
For both languages, LangChain exposes methods to wait for traces to be submitted before exiting your application. Below is an example:
Tracing is configured with the following environment variables:
- `LANGSMITH_TRACING`: enables or disables tracing
- `LANGSMITH_API_KEY`: your LangSmith API key
- `LANGSMITH_ENDPOINT`: the LangSmith API endpoint
- `LANGSMITH_PROJECT`: the project to log traces to
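If you prefer to set these from code rather than the shell, a minimal sketch (placeholder values; set your real API key):

```python
import os

# Set LangSmith configuration programmatically, before any LangChain
# code runs. The values below are placeholders.
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = "<your-api-key>"  # placeholder, not a real key
os.environ["LANGSMITH_ENDPOINT"] = "https://api.smith.langchain.com"
os.environ["LANGSMITH_PROJECT"] = "my-project"
```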
LangChain code invoked inside a `traceable` function will inherit the tracing context of the `traceable` function and be bound as a child run of the `traceable` function.
Tracing LangChain objects inside `traceable` (JS only): starting with LangChain 0.2.x, LangChain objects are traced automatically when used inside `@traceable` functions, inheriting the client, tags, metadata and project name of the traceable function.
For older versions of LangChain below 0.2.x, you will need to manually pass an instance of `LangChainTracer` created from the tracing context found in `@traceable`.
You can also trace child runs of a LangChain run via `traceable` or the RunTree API (JS only). The following limitations are present when combining LangChain with `traceable`:
- Mutating the `RunTree` obtained from `getCurrentRunTree()` of the `RunnableLambda` context will result in a no-op.
- Avoid traversing the `RunTree` obtained from `getCurrentRunTree()`, as it may not contain all the `RunTree` nodes.
- Different child runs may have the same `execution_order` and `child_execution_order` value. Thus, in extreme circumstances, some runs may end up in a different order, depending on the `start_time`.
You can trace `traceable` functions as part of a `RunnableSequence`, or trace child runs of a LangChain run imperatively via the `RunTree` API. Starting with LangSmith 0.1.39 and @langchain/core 0.2.18, you can directly invoke `traceable`-wrapped functions within `RunnableLambda`.
Alternatively, you can convert a LangChain `RunnableConfig` to an equivalent `RunTree` object by using `RunTree.fromRunnableConfig`, or pass the `RunnableConfig` as the first argument of a `traceable`-wrapped function.