0e0de03d1e94-0
langchain.callbacks.mlflow_callback.MlflowLogger¶ class langchain.callbacks.mlflow_callback.MlflowLogger(**kwargs: Any)[source]¶ Callback handler that logs metrics and artifacts to an MLflow server. Parameters name (str) – Name of the run. experiment (str) – Name of the experiment. tags (dict) – Tags to attach to the run. tracking_uri (str) – MLflow tracking server URI. This handler implements helper functions to initialize the run and to log metrics and artifacts to the MLflow server. Methods __init__(**kwargs) artifact(path) Upload the file at the given path as an artifact. finish_run() Finish the run. html(html, filename) Log the input HTML string as an HTML file artifact. jsonf(data, filename) Log the input data as a JSON file artifact. langchain_artifact(chain) metric(key, value) Log a metric to the MLflow server. metrics(data[, step]) Log all metrics in the input dict. start_run(name, tags) Start a new run; auto-generates a random suffix for the name. table(name, dataframe) Log the input pandas DataFrame as an HTML table. text(text, filename) Log the input text as a text file artifact. __init__(**kwargs: Any)[source]¶ artifact(path: str) → None[source]¶ Upload the file at the given path as an artifact. finish_run() → None[source]¶ Finish the run. html(html: str, filename: str) → None[source]¶ Log the input HTML string as an HTML file artifact. jsonf(data: Dict[str, Any], filename: str) → None[source]¶ Log the input data as a JSON file artifact.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.mlflow_callback.MlflowLogger.html
0e0de03d1e94-1
langchain_artifact(chain: Any) → None[source]¶ metric(key: str, value: float) → None[source]¶ Log a metric to the MLflow server. metrics(data: Union[Dict[str, float], Dict[str, int]], step: Optional[int] = 0) → None[source]¶ Log all metrics in the input dict. start_run(name: str, tags: Dict[str, str]) → None[source]¶ Start a new run; auto-generates a random suffix for the name. table(name: str, dataframe) → None[source]¶ Log the input pandas DataFrame as an HTML table. text(text: str, filename: str) → None[source]¶ Log the input text as a text file artifact.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.mlflow_callback.MlflowLogger.html
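As a rough illustration of how this interface is typically driven, the sketch below mirrors the method surface with an in-memory stand-in. `FakeMlflowLogger` is hypothetical: it records calls in dicts instead of talking to an MLflow server, so only the method names and shapes come from the docs above.

```python
from typing import Dict, Optional


class FakeMlflowLogger:
    """In-memory stand-in mirroring MlflowLogger's method surface (not library code)."""

    def __init__(self) -> None:
        self.metrics_log: Dict[str, float] = {}
        self.artifacts: Dict[str, str] = {}
        self.active = False

    def start_run(self, name: str, tags: Dict[str, str]) -> None:
        # The real logger auto-generates a random suffix for the run name.
        self.run_name, self.run_tags, self.active = name, tags, True

    def metric(self, key: str, value: float) -> None:
        self.metrics_log[key] = value

    def metrics(self, data: Dict[str, float], step: Optional[int] = 0) -> None:
        # Log every metric in the input dict.
        for key, value in data.items():
            self.metric(key, value)

    def text(self, text: str, filename: str) -> None:
        # The real logger writes a text file artifact; we just store the string.
        self.artifacts[filename] = text

    def finish_run(self) -> None:
        self.active = False


logger = FakeMlflowLogger()
logger.start_run("demo", tags={"team": "nlp"})
logger.metrics({"prompt_tokens": 10, "completion_tokens": 32})
logger.text("hello", filename="greeting.txt")
logger.finish_run()
```

Against a real server you would instead construct `MlflowLogger` with `tracking_uri`, `experiment`, and so on, but the call sequence (start, log, finish) is the same.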
431bd7dd3a8b-0
langchain.callbacks.manager.CallbackManagerForToolRun¶ class langchain.callbacks.manager.CallbackManagerForToolRun(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶ Callback manager for tool run. Initialize the run manager. Parameters run_id (UUID) – The ID of the run. handlers (List[BaseCallbackHandler]) – The list of handlers. inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers. parent_run_id (UUID, optional) – The ID of the parent run. Defaults to None. tags (Optional[List[str]]) – The list of tags. inheritable_tags (Optional[List[str]]) – The list of inheritable tags. metadata (Optional[Dict[str, Any]]) – The metadata. inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata. Methods __init__(*, run_id, handlers, ...[, ...]) Initialize the run manager. get_child([tag]) Get a child callback manager. get_noop_manager() Return a manager that doesn't perform any operations. on_retry(retry_state, **kwargs) on_text(text, **kwargs) Run when text is received. on_tool_end(output, **kwargs) Run when tool ends running. on_tool_error(error, **kwargs) Run when tool errors.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.CallbackManagerForToolRun.html
431bd7dd3a8b-1
__init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶ Initialize the run manager. Parameters run_id (UUID) – The ID of the run. handlers (List[BaseCallbackHandler]) – The list of handlers. inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers. parent_run_id (UUID, optional) – The ID of the parent run. Defaults to None. tags (Optional[List[str]]) – The list of tags. inheritable_tags (Optional[List[str]]) – The list of inheritable tags. metadata (Optional[Dict[str, Any]]) – The metadata. inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata. get_child(tag: Optional[str] = None) → CallbackManager¶ Get a child callback manager. Parameters tag (str, optional) – The tag for the child callback manager. Defaults to None. Returns The child callback manager. Return type CallbackManager classmethod get_noop_manager() → BRM¶ Return a manager that doesn’t perform any operations. Returns The noop manager. Return type BaseRunManager on_retry(retry_state: RetryCallState, **kwargs: Any) → None¶ on_text(text: str, **kwargs: Any) → Any¶ Run when text is received. Parameters text (str) – The received text. Returns The result of the callback.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.CallbackManagerForToolRun.html
431bd7dd3a8b-2
Return type Any on_tool_end(output: str, **kwargs: Any) → None[source]¶ Run when tool ends running. Parameters output (str) – The output of the tool. on_tool_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None[source]¶ Run when tool errors. Parameters error (Exception or KeyboardInterrupt) – The error. Examples using CallbackManagerForToolRun¶ Defining Custom Tools
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.CallbackManagerForToolRun.html
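The fan-out behavior described above (each tool event forwarded to every registered handler, stamped with the manager's run ID) can be sketched with minimal hypothetical classes; `MiniToolRunManager` and `RecordingHandler` are illustrations, not the library's implementation.

```python
from typing import Any, List
from uuid import UUID, uuid4


class RecordingHandler:
    """Hypothetical handler that records every tool event it receives."""

    def __init__(self) -> None:
        self.events: List[tuple] = []

    def on_tool_end(self, output: str, *, run_id: UUID, **kwargs: Any) -> None:
        self.events.append(("end", output, run_id))

    def on_tool_error(self, error: BaseException, *, run_id: UUID, **kwargs: Any) -> None:
        self.events.append(("error", type(error).__name__, run_id))


class MiniToolRunManager:
    """Sketch of the fan-out: each event is forwarded to every handler."""

    def __init__(self, run_id: UUID, handlers: List[Any]) -> None:
        self.run_id = run_id
        self.handlers = handlers

    def on_tool_end(self, output: str, **kwargs: Any) -> None:
        for handler in self.handlers:
            handler.on_tool_end(output, run_id=self.run_id, **kwargs)

    def on_tool_error(self, error: BaseException, **kwargs: Any) -> None:
        for handler in self.handlers:
            handler.on_tool_error(error, run_id=self.run_id, **kwargs)


handler = RecordingHandler()
manager = MiniToolRunManager(run_id=uuid4(), handlers=[handler])
manager.on_tool_end("42")
manager.on_tool_error(ValueError("bad input"))
```

In a custom tool's `_run`, the `run_manager` argument plays exactly this role: the tool calls into it once, and the manager dispatches to all attached handlers.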
85f2a3fbed6e-0
langchain.callbacks.tracers.run_collector.RunCollectorCallbackHandler¶ class langchain.callbacks.tracers.run_collector.RunCollectorCallbackHandler(example_id: Optional[Union[UUID, str]] = None, **kwargs: Any)[source]¶ A tracer that collects all nested runs in a list. This tracer is useful for inspection and evaluation purposes. Parameters example_id (Optional[Union[UUID, str]], default=None) – The ID of the example being traced. It can be either a UUID or a string. Initialize the RunCollectorCallbackHandler. Parameters example_id (Optional[Union[UUID, str]], default=None) – The ID of the example being traced. It can be either a UUID or a string. Attributes ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. name raise_error run_inline Methods __init__([example_id]) Initialize the RunCollectorCallbackHandler. on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id, **kwargs) End a trace for a chain run. on_chain_error(error, *, run_id, **kwargs) Handle an error for a chain run. on_chain_start(serialized, inputs, *, run_id) Start a trace for a chain run. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.run_collector.RunCollectorCallbackHandler.html
85f2a3fbed6e-1
on_llm_end(response, *, run_id, **kwargs) End a trace for an LLM run. on_llm_error(error, *, run_id, **kwargs) Handle an error for an LLM run. on_llm_new_token(token, *, run_id[, ...]) Run on new LLM token. on_llm_start(serialized, prompts, *, run_id) Start a trace for an LLM run. on_retriever_end(documents, *, run_id, **kwargs) Run when Retriever ends running. on_retriever_error(error, *, run_id, **kwargs) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_retry(retry_state, *, run_id, **kwargs) on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. on_tool_end(output, *, run_id, **kwargs) End a trace for a tool run. on_tool_error(error, *, run_id, **kwargs) Handle an error for a tool run. on_tool_start(serialized, input_str, *, run_id) Start a trace for a tool run. __init__(example_id: Optional[Union[UUID, str]] = None, **kwargs: Any) → None[source]¶ Initialize the RunCollectorCallbackHandler. Parameters example_id (Optional[Union[UUID, str]], default=None) – The ID of the example being traced. It can be either a UUID or a string.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.run_collector.RunCollectorCallbackHandler.html
85f2a3fbed6e-2
on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent action. on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, **kwargs: Any) → None¶ End a trace for a chain run. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for a chain run. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, run_type: Optional[str] = None, **kwargs: Any) → None¶ Start a trace for a chain run. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. on_llm_end(response: LLMResult, *, run_id: UUID, **kwargs: Any) → None¶ End a trace for an LLM run. on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for an LLM run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.run_collector.RunCollectorCallbackHandler.html
85f2a3fbed6e-3
on_llm_new_token(token: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → None¶ Run on new LLM token. Only available when streaming is enabled. on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Start a trace for an LLM run. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, **kwargs: Any) → None¶ Run when Retriever ends running. on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Run when Retriever starts running. on_retry(retry_state: RetryCallState, *, run_id: UUID, **kwargs: Any) → None¶ on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on arbitrary text. on_tool_end(output: str, *, run_id: UUID, **kwargs: Any) → None¶ End a trace for a tool run. on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.run_collector.RunCollectorCallbackHandler.html
85f2a3fbed6e-4
Handle an error for a tool run. on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Start a trace for a tool run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.run_collector.RunCollectorCallbackHandler.html
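The collection pattern this tracer implements (every completed run appended to a list for later inspection or evaluation) can be sketched as follows; `MiniRunCollector` and `MiniRun` are simplified hypothetical stand-ins, not the library's classes.

```python
from dataclasses import dataclass, field
from typing import Dict, List
from uuid import UUID, uuid4


@dataclass
class MiniRun:
    """Hypothetical, trimmed-down stand-in for a traced run."""
    run_id: UUID
    run_type: str
    outputs: Dict = field(default_factory=dict)


class MiniRunCollector:
    """Sketch of the collector pattern: each completed run lands in a list."""

    def __init__(self) -> None:
        self.traced_runs: List[MiniRun] = []

    def on_chain_end(self, outputs: Dict, *, run_id: UUID, **kwargs) -> None:
        # The real handler builds a full Run object with timings and children;
        # here we only capture what this sketch needs.
        self.traced_runs.append(MiniRun(run_id=run_id, run_type="chain", outputs=outputs))


collector = MiniRunCollector()
collector.on_chain_end({"answer": "42"}, run_id=uuid4())
```

After a chain invocation with the real handler attached, `collector.traced_runs` holds the nested runs ready for inspection.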
019c7921a2ac-0
langchain.callbacks.tracers.schemas.LLMRun¶ class langchain.callbacks.tracers.schemas.LLMRun[source]¶ Bases: BaseRun Class for LLMRun. Create a new model by parsing and validating input data from keyword arguments. Raises ValidationError if the input data cannot be parsed to form a valid model. param child_execution_order: int [Required]¶ param end_time: datetime.datetime [Optional]¶ param error: Optional[str] = None¶ param execution_order: int [Required]¶ param extra: Optional[Dict[str, Any]] = None¶ param parent_uuid: Optional[str] = None¶ param prompts: List[str] [Required]¶ param response: Optional[langchain.schema.output.LLMResult] = None¶ param serialized: Dict[str, Any] [Required]¶ param session_id: int [Required]¶ param start_time: datetime.datetime [Optional]¶ param uuid: str [Required]¶ classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶ Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = ‘allow’ was set since it adds all passed values copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶ Duplicate a model, optionally choose which fields to include, exclude and change. Parameters include – fields to include in new model exclude – fields to exclude from new model, as with values this takes precedence over include
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.LLMRun.html
019c7921a2ac-1
update – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data deep – set to True to make a deep copy of the model Returns new model instance dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶ Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. classmethod from_orm(obj: Any) → Model¶ json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶ Generate a JSON representation of the model, include and exclude arguments as per dict(). encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps(). classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶ classmethod parse_obj(obj: Any) → Model¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.LLMRun.html
019c7921a2ac-2
classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶ classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶ classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶ classmethod update_forward_refs(**localns: Any) → None¶ Try to update ForwardRefs on fields based on this Model, globalns and localns. classmethod validate(value: Any) → Model¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.LLMRun.html
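The record-plus-serialization shape of this model (required identifiers, optional error/response fields, a `dict()` export) can be sketched with a plain dataclass; `MiniLLMRun` is a trimmed, hypothetical analogue, while the real class is a pydantic model with full validation.

```python
from dataclasses import asdict, dataclass, field
from datetime import datetime
from typing import Any, Dict, List, Optional


@dataclass
class MiniLLMRun:
    """Hypothetical, trimmed-down analogue of the LLMRun record."""
    uuid: str
    session_id: int
    execution_order: int
    child_execution_order: int
    serialized: Dict[str, Any]
    prompts: List[str]
    error: Optional[str] = None            # set when the run failed
    start_time: datetime = field(default_factory=datetime.now)

    def dict(self, *, exclude_none: bool = False) -> Dict[str, Any]:
        # Mimics the pydantic export: optionally drop fields that are None.
        data = asdict(self)
        if exclude_none:
            data = {k: v for k, v in data.items() if v is not None}
        return data


run = MiniLLMRun(
    uuid="run-1", session_id=1, execution_order=1,
    child_execution_order=1, serialized={"name": "llm"}, prompts=["Hi"],
)
payload = run.dict(exclude_none=True)
```

The real `dict()`/`json()` accept the richer `include`/`exclude`/`by_alias` options listed above; this sketch shows only the `exclude_none` idea.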
810539e12af2-0
langchain.callbacks.base.ToolManagerMixin¶ class langchain.callbacks.base.ToolManagerMixin[source]¶ Mixin for tool callbacks. Methods __init__() on_tool_end(output, *, run_id[, parent_run_id]) Run when tool ends running. on_tool_error(error, *, run_id[, parent_run_id]) Run when tool errors. __init__()¶ on_tool_end(output: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶ Run when tool ends running. on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶ Run when tool errors.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.base.ToolManagerMixin.html
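The mixin idea here (declare the tool hooks once, let concrete handlers override only the hooks they care about) can be sketched with hypothetical minimal classes:

```python
from typing import Any, List
from uuid import UUID, uuid4


class MiniToolMixin:
    """Hypothetical mixin: defines the tool hooks, leaves behavior to subclasses."""

    def on_tool_end(self, output: str, *, run_id: UUID, **kwargs: Any) -> Any:
        pass  # default no-op

    def on_tool_error(self, error: BaseException, *, run_id: UUID, **kwargs: Any) -> Any:
        pass  # default no-op


class OutputCollector(MiniToolMixin):
    """Overrides only the hook it cares about; the rest stay no-ops."""

    def __init__(self) -> None:
        self.outputs: List[str] = []

    def on_tool_end(self, output: str, *, run_id: UUID, **kwargs: Any) -> None:
        self.outputs.append(output)


h = OutputCollector()
h.on_tool_end("done", run_id=uuid4())
h.on_tool_error(RuntimeError("x"), run_id=uuid4())  # inherited no-op, nothing recorded
```

This is why a custom callback handler only needs to implement the events it wants to observe.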
797e8a9dc7e7-0
langchain.callbacks.streamlit.streamlit_callback_handler.LLMThoughtState¶ class langchain.callbacks.streamlit.streamlit_callback_handler.LLMThoughtState(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]¶ Enumerator of the LLMThought state. THINKING = 'THINKING'¶ RUNNING_TOOL = 'RUNNING_TOOL'¶ COMPLETE = 'COMPLETE'¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.streamlit.streamlit_callback_handler.LLMThoughtState.html
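A minimal sketch of how such a state enum might drive a thought widget's lifecycle; only the three state names come from the docs above, and the transition logic below is an assumption for illustration.

```python
from enum import Enum


class ThoughtState(Enum):
    """Mirror of the three LLMThoughtState values."""
    THINKING = "THINKING"
    RUNNING_TOOL = "RUNNING_TOOL"
    COMPLETE = "COMPLETE"


def next_state(state: ThoughtState, tool_started: bool) -> ThoughtState:
    # Hypothetical transition rule: a thinking thought either starts a tool
    # or completes; a running tool completes when it finishes.
    if state is ThoughtState.THINKING and tool_started:
        return ThoughtState.RUNNING_TOOL
    return ThoughtState.COMPLETE


state = ThoughtState.THINKING
state = next_state(state, tool_started=True)
```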
b5a5a7ae50d2-0
langchain.callbacks.tracers.langchain_v1.LangChainTracerV1¶ class langchain.callbacks.tracers.langchain_v1.LangChainTracerV1(**kwargs: Any)[source]¶ An implementation of the SharedTracer that POSTS to the langchain endpoint. Initialize the LangChain tracer. Attributes ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__(**kwargs) Initialize the LangChain tracer. load_default_session() Load the default tracing session and set it as the Tracer's session. load_session(session_name) Load a session with the given name from the tracer. on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id, **kwargs) End a trace for a chain run. on_chain_error(error, *, run_id, **kwargs) Handle an error for a chain run. on_chain_start(serialized, inputs, *, run_id) Start a trace for a chain run. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, *, run_id, **kwargs) End a trace for an LLM run. on_llm_error(error, *, run_id, **kwargs) Handle an error for an LLM run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain_v1.LangChainTracerV1.html
b5a5a7ae50d2-1
on_llm_new_token(token, *, run_id[, ...]) Run on new LLM token. on_llm_start(serialized, prompts, *, run_id) Start a trace for an LLM run. on_retriever_end(documents, *, run_id, **kwargs) Run when Retriever ends running. on_retriever_error(error, *, run_id, **kwargs) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_retry(retry_state, *, run_id, **kwargs) on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. on_tool_end(output, *, run_id, **kwargs) End a trace for a tool run. on_tool_error(error, *, run_id, **kwargs) Handle an error for a tool run. on_tool_start(serialized, input_str, *, run_id) Start a trace for a tool run. __init__(**kwargs: Any) → None[source]¶ Initialize the LangChain tracer. load_default_session() → Union[TracerSessionV1, TracerSession][source]¶ Load the default tracing session and set it as the Tracer’s session. load_session(session_name: str) → Union[TracerSessionV1, TracerSession][source]¶ Load a session with the given name from the tracer. on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent action.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain_v1.LangChainTracerV1.html
b5a5a7ae50d2-2
on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, **kwargs: Any) → None¶ End a trace for a chain run. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for a chain run. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, run_type: Optional[str] = None, **kwargs: Any) → None¶ Start a trace for a chain run. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. on_llm_end(response: LLMResult, *, run_id: UUID, **kwargs: Any) → None¶ End a trace for an LLM run. on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for an LLM run. on_llm_new_token(token: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → None¶ Run on new LLM token. Only available when streaming is enabled.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain_v1.LangChainTracerV1.html
b5a5a7ae50d2-3
on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Start a trace for an LLM run. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, **kwargs: Any) → None¶ Run when Retriever ends running. on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Run when Retriever starts running. on_retry(retry_state: RetryCallState, *, run_id: UUID, **kwargs: Any) → None¶ on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on arbitrary text. on_tool_end(output: str, *, run_id: UUID, **kwargs: Any) → None¶ End a trace for a tool run. on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for a tool run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain_v1.LangChainTracerV1.html
b5a5a7ae50d2-4
on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Start a trace for a tool run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain_v1.LangChainTracerV1.html
8b9a5cd63624-0
langchain.callbacks.flyte_callback.import_flytekit¶ langchain.callbacks.flyte_callback.import_flytekit() → Tuple[flytekit, renderer][source]¶ Import flytekit and flytekitplugins-deck-standard.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.flyte_callback.import_flytekit.html
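The guarded-import pattern behind helpers like `import_flytekit` (import an optional dependency, or fail with an actionable install hint) can be sketched generically; `import_optional` is a hypothetical helper, not the library's function.

```python
import importlib
from types import ModuleType


def import_optional(module_name: str, install_hint: str) -> ModuleType:
    """Sketch of the guarded-import pattern used by optional-dependency helpers."""
    try:
        return importlib.import_module(module_name)
    except ImportError as exc:
        # Re-raise with a message telling the user how to get the dependency.
        raise ImportError(
            f"Could not import {module_name}. Install it with: {install_hint}"
        ) from exc


math_mod = import_optional("math", "already in the stdlib")
try:
    import_optional("definitely_not_installed_pkg", "pip install definitely-not-installed-pkg")
except ImportError as err:
    message = str(err)
```

The real helper does this for both `flytekit` and the `flytekitplugins-deck-standard` renderer and returns the two modules as a tuple.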
7e36a5434fa5-0
langchain.callbacks.tracers.evaluation.wait_for_all_evaluators¶ langchain.callbacks.tracers.evaluation.wait_for_all_evaluators() → None[source]¶ Wait for all tracers to finish.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.evaluation.wait_for_all_evaluators.html
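The underlying pattern here, blocking until all background work submitted to an executor has resolved, can be sketched with the standard library; this illustrates the idea, not the function's actual implementation.

```python
from concurrent.futures import ThreadPoolExecutor, wait

# Submit some background work, then block until every future resolves --
# the same "drain before exiting" discipline wait_for_all_evaluators
# applies to evaluator tracers.
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(pow, n, 2) for n in range(4)]
    done, pending = wait(futures)  # blocks until all futures complete

results = sorted(f.result() for f in done)
```

Calling such a drain function before process exit ensures no queued evaluation results are silently dropped.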
0bae9eadfd48-0
langchain.callbacks.tracers.langchain.LangChainTracer¶ class langchain.callbacks.tracers.langchain.LangChainTracer(example_id: Optional[Union[str, UUID]] = None, project_name: Optional[str] = None, client: Optional[Client] = None, tags: Optional[List[str]] = None, use_threading: bool = True, **kwargs: Any)[source]¶ An implementation of the SharedTracer that POSTS to the langchain endpoint. Initialize the LangChain tracer. Attributes ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__([example_id, project_name, client, ...]) Initialize the LangChain tracer. on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id, **kwargs) End a trace for a chain run. on_chain_error(error, *, run_id, **kwargs) Handle an error for a chain run. on_chain_start(serialized, inputs, *, run_id) Start a trace for a chain run. on_chat_model_start(serialized, messages, *, ...) Start a trace for an LLM run. on_llm_end(response, *, run_id, **kwargs) End a trace for an LLM run. on_llm_error(error, *, run_id, **kwargs)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain.LangChainTracer.html
0bae9eadfd48-1
Handle an error for an LLM run. on_llm_new_token(token, *, run_id[, ...]) Run on new LLM token. on_llm_start(serialized, prompts, *, run_id) Start a trace for an LLM run. on_retriever_end(documents, *, run_id, **kwargs) Run when Retriever ends running. on_retriever_error(error, *, run_id, **kwargs) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_retry(retry_state, *, run_id, **kwargs) on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. on_tool_end(output, *, run_id, **kwargs) End a trace for a tool run. on_tool_error(error, *, run_id, **kwargs) Handle an error for a tool run. on_tool_start(serialized, input_str, *, run_id) Start a trace for a tool run. wait_for_futures() Wait for the given futures to complete. __init__(example_id: Optional[Union[str, UUID]] = None, project_name: Optional[str] = None, client: Optional[Client] = None, tags: Optional[List[str]] = None, use_threading: bool = True, **kwargs: Any) → None[source]¶ Initialize the LangChain tracer. on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent action.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain.LangChainTracer.html
0bae9eadfd48-2
on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, **kwargs: Any) → None¶ End a trace for a chain run. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for a chain run. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, run_type: Optional[str] = None, **kwargs: Any) → None¶ Start a trace for a chain run. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None[source]¶ Start a trace for an LLM run. on_llm_end(response: LLMResult, *, run_id: UUID, **kwargs: Any) → None¶ End a trace for an LLM run. on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for an LLM run. on_llm_new_token(token: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → None¶ Run on new LLM token. Only available when streaming is enabled.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain.LangChainTracer.html
0bae9eadfd48-3
on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Start a trace for an LLM run. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, **kwargs: Any) → None¶ Run when Retriever ends running. on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Run when Retriever starts running. on_retry(retry_state: RetryCallState, *, run_id: UUID, **kwargs: Any) → None¶ on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on arbitrary text. on_tool_end(output: str, *, run_id: UUID, **kwargs: Any) → None¶ End a trace for a tool run. on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for a tool run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain.LangChainTracer.html
0bae9eadfd48-4
on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Start a trace for a tool run. wait_for_futures() → None[source]¶ Wait for the given futures to complete. Examples using LangChainTracer¶ Async API
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.langchain.LangChainTracer.html
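The `use_threading` behavior plus `wait_for_futures` can be sketched with a stand-in that "posts" to a list instead of a remote endpoint; `MiniPostingTracer` is hypothetical and illustrates the pattern only.

```python
from concurrent.futures import Future, ThreadPoolExecutor
from typing import Any, Dict, List


class MiniPostingTracer:
    """Sketch of the use_threading pattern: hand each run payload to a worker
    thread, then block on wait_for_futures before shutdown. The 'posting'
    here just appends to a list; the real tracer POSTs to its endpoint."""

    def __init__(self, use_threading: bool = True) -> None:
        self.posted: List[Dict[str, Any]] = []
        self._executor = ThreadPoolExecutor(max_workers=1) if use_threading else None
        self._futures: List[Future] = []

    def _post(self, payload: Dict[str, Any]) -> None:
        self.posted.append(payload)

    def on_chain_end(self, outputs: Dict[str, Any], **kwargs: Any) -> None:
        if self._executor is not None:
            # Don't block the chain: submit the upload to the worker thread.
            self._futures.append(self._executor.submit(self._post, outputs))
        else:
            self._post(outputs)

    def wait_for_futures(self) -> None:
        # Drain all pending uploads before the process exits.
        for future in self._futures:
            future.result()
        self._futures.clear()


tracer = MiniPostingTracer(use_threading=True)
tracer.on_chain_end({"answer": "42"})
tracer.wait_for_futures()
```

This is why callers are advised to wait on the tracer (or keep it alive) before exiting: otherwise in-flight uploads may be lost.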
3ebd7094d273-0
langchain.callbacks.manager.CallbackManagerForChainRun¶ class langchain.callbacks.manager.CallbackManagerForChainRun(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶ Callback manager for chain run. Initialize the run manager. Parameters run_id (UUID) – The ID of the run. handlers (List[BaseCallbackHandler]) – The list of handlers. inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers. parent_run_id (UUID, optional) – The ID of the parent run. Defaults to None. tags (Optional[List[str]]) – The list of tags. inheritable_tags (Optional[List[str]]) – The list of inheritable tags. metadata (Optional[Dict[str, Any]]) – The metadata. inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata. Methods __init__(*, run_id, handlers, ...[, ...]) Initialize the run manager. get_child([tag]) Get a child callback manager. get_noop_manager() Return a manager that doesn't perform any operations. on_agent_action(action, **kwargs) Run when agent action is received. on_agent_finish(finish, **kwargs) Run when agent finish is received. on_chain_end(outputs, **kwargs) Run when chain ends running. on_chain_error(error, **kwargs) Run when chain errors. on_retry(retry_state, **kwargs) on_text(text, **kwargs)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.CallbackManagerForChainRun.html
Run when text is received. __init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶ Initialize the run manager. Parameters run_id (UUID) – The ID of the run. handlers (List[BaseCallbackHandler]) – The list of handlers. inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers. parent_run_id (UUID, optional) – The ID of the parent run. Defaults to None. tags (Optional[List[str]]) – The list of tags. inheritable_tags (Optional[List[str]]) – The list of inheritable tags. metadata (Optional[Dict[str, Any]]) – The metadata. inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata. get_child(tag: Optional[str] = None) → CallbackManager¶ Get a child callback manager. Parameters tag (str, optional) – The tag for the child callback manager. Defaults to None. Returns The child callback manager. Return type CallbackManager classmethod get_noop_manager() → BRM¶ Return a manager that doesn’t perform any operations. Returns The noop manager. Return type BaseRunManager on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶ Run when agent action is received. Parameters action (AgentAction) – The agent action. Returns The result of the callback.
Return type Any on_agent_finish(finish: AgentFinish, **kwargs: Any) → Any[source]¶ Run when agent finish is received. Parameters finish (AgentFinish) – The agent finish. Returns The result of the callback. Return type Any on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None[source]¶ Run when chain ends running. Parameters outputs (Dict[str, Any]) – The outputs of the chain. on_chain_error(error: BaseException, **kwargs: Any) → None[source]¶ Run when chain errors. Parameters error (Exception or KeyboardInterrupt) – The error. on_retry(retry_state: RetryCallState, **kwargs: Any) → None¶ on_text(text: str, **kwargs: Any) → Any¶ Run when text is received. Parameters text (str) – The received text. Returns The result of the callback. Return type Any Examples using CallbackManagerForChainRun¶ Custom chain
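The get_child() method is how nested components inherit context from their parent run: children start from the parent's *inheritable* tags and metadata, while non-inheritable tags stay local to the parent. A rough illustration of that inheritance rule (hypothetical minimal class, not the real manager):

```python
from dataclasses import dataclass, field
from typing import Optional
from uuid import UUID, uuid4

@dataclass
class MiniRunManager:
    """Sketch of a run manager with inheritable vs. local tags."""
    run_id: UUID
    tags: list = field(default_factory=list)
    inheritable_tags: list = field(default_factory=list)
    metadata: dict = field(default_factory=dict)

    def get_child(self, tag: Optional[str] = None) -> "MiniRunManager":
        # Children receive only the inheritable tags; a tag passed here
        # applies to the child run but is not inherited further down.
        child_tags = list(self.inheritable_tags)
        if tag is not None:
            child_tags.append(tag)
        return MiniRunManager(
            run_id=uuid4(),
            tags=child_tags,
            inheritable_tags=list(self.inheritable_tags),
            metadata=dict(self.metadata),
        )

parent = MiniRunManager(uuid4(), tags=["local"], inheritable_tags=["session"])
child = parent.get_child(tag="step-1")
```

Note that "local" does not reach the child, while "session" does.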
langchain.callbacks.openai_info.OpenAICallbackHandler¶ class langchain.callbacks.openai_info.OpenAICallbackHandler[source]¶ Callback Handler that tracks OpenAI info. Attributes always_verbose Whether to call verbose callbacks even if verbose is False. completion_tokens ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. prompt_tokens raise_error run_inline successful_requests total_cost total_tokens Methods __init__() on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id[, parent_run_id]) Run when chain ends running. on_chain_error(error, *, run_id[, parent_run_id]) Run when chain errors. on_chain_start(serialized, inputs, *, run_id) Run when chain starts running. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, **kwargs) Collect token usage. on_llm_error(error, *, run_id[, parent_run_id]) Run when LLM errors. on_llm_new_token(token, **kwargs) Print out the token. on_llm_start(serialized, prompts, **kwargs) Print out the prompts. on_retriever_end(documents, *, run_id[, ...]) Run when Retriever ends running.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.openai_info.OpenAICallbackHandler.html
87bc151a0952-1
on_retriever_error(error, *, run_id[, ...]) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. on_tool_end(output, *, run_id[, parent_run_id]) Run when tool ends running. on_tool_error(error, *, run_id[, parent_run_id]) Run when tool errors. on_tool_start(serialized, input_str, *, run_id) Run when tool starts running. __init__()¶ on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent action. on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when chain ends running. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when chain errors. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when chain starts running.
on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶ Collect token usage. on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when LLM errors. on_llm_new_token(token: str, **kwargs: Any) → None[source]¶ Print out the token. on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None[source]¶ Print out the prompts. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever ends running. on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when Retriever starts running.
on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on arbitrary text. on_tool_end(output: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when tool ends running. on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when tool errors. on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when tool starts running.
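on_llm_end() ("Collect token usage") is where this handler does its real work: it reads the token_usage entry out of each LLMResult's llm_output and keeps running totals across requests. A simplified, dictionary-based sketch of that accumulation (hypothetical class; the real handler also prices each model to fill total_cost):

```python
class MiniTokenCounter:
    """Accumulate token usage across multiple LLM responses."""

    def __init__(self) -> None:
        self.prompt_tokens = 0
        self.completion_tokens = 0
        self.total_tokens = 0
        self.successful_requests = 0

    def on_llm_end(self, llm_output: dict) -> None:
        # Missing keys default to 0, mirroring responses without usage info.
        usage = llm_output.get("token_usage", {})
        self.prompt_tokens += usage.get("prompt_tokens", 0)
        self.completion_tokens += usage.get("completion_tokens", 0)
        self.total_tokens += usage.get("total_tokens", 0)
        self.successful_requests += 1

counter = MiniTokenCounter()
counter.on_llm_end(
    {"token_usage": {"prompt_tokens": 12, "completion_tokens": 30, "total_tokens": 42}}
)
counter.on_llm_end(
    {"token_usage": {"prompt_tokens": 8, "completion_tokens": 10, "total_tokens": 18}}
)
```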
langchain.callbacks.aim_callback.import_aim¶ langchain.callbacks.aim_callback.import_aim() → Any[source]¶ Import the aim python package and raise an error if it is not installed.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.aim_callback.import_aim.html
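import_aim() follows the usual optional-dependency pattern in this package: attempt the import and, on failure, raise an ImportError that names the pip package to install. The general shape of that pattern (a sketch, not the exact function body; the helper name is hypothetical):

```python
import importlib
from typing import Any

def import_optional(module_name: str, pip_name: str) -> Any:
    """Import a module or raise an ImportError with install instructions."""
    try:
        return importlib.import_module(module_name)
    except ImportError as err:
        raise ImportError(
            f"Could not import the {module_name} python package. "
            f"Please install it with `pip install {pip_name}`."
        ) from err

json_mod = import_optional("json", "json")  # stdlib module: succeeds
try:
    import_optional("definitely_not_installed_pkg", "some-pkg")
    message = ""
except ImportError as err:
    message = str(err)  # carries the install hint
```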
4a6501a0b7b9-0
langchain.callbacks.base.RunManagerMixin¶ class langchain.callbacks.base.RunManagerMixin[source]¶ Mixin for run manager. Methods __init__() on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. __init__()¶ on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶ Run on arbitrary text.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.base.RunManagerMixin.html
06a7bfda7455-0
langchain.callbacks.tracers.stdout.FunctionCallbackHandler¶ class langchain.callbacks.tracers.stdout.FunctionCallbackHandler(function: Callable[[str], None], **kwargs: Any)[source]¶ Tracer that calls a function with a single str parameter. Attributes ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. name raise_error run_inline Methods __init__(function, **kwargs) get_breadcrumbs(run) get_parents(run) on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id, **kwargs) End a trace for a chain run. on_chain_error(error, *, run_id, **kwargs) Handle an error for a chain run. on_chain_start(serialized, inputs, *, run_id) Start a trace for a chain run. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, *, run_id, **kwargs) End a trace for an LLM run. on_llm_error(error, *, run_id, **kwargs) Handle an error for an LLM run. on_llm_new_token(token, *, run_id[, ...]) Run on new LLM token. on_llm_start(serialized, prompts, *, run_id) Start a trace for an LLM run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.stdout.FunctionCallbackHandler.html
06a7bfda7455-1
on_retriever_end(documents, *, run_id, **kwargs) Run when Retriever ends running. on_retriever_error(error, *, run_id, **kwargs) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_retry(retry_state, *, run_id, **kwargs) on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. on_tool_end(output, *, run_id, **kwargs) End a trace for a tool run. on_tool_error(error, *, run_id, **kwargs) Handle an error for a tool run. on_tool_start(serialized, input_str, *, run_id) Start a trace for a tool run. __init__(function: Callable[[str], None], **kwargs: Any) → None[source]¶ get_breadcrumbs(run: Run) → str[source]¶ get_parents(run: Run) → List[Run][source]¶ on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent action. on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, **kwargs: Any) → None¶ End a trace for a chain run. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶
Handle an error for a chain run. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, run_type: Optional[str] = None, **kwargs: Any) → None¶ Start a trace for a chain run. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. on_llm_end(response: LLMResult, *, run_id: UUID, **kwargs: Any) → None¶ End a trace for an LLM run. on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for an LLM run. on_llm_new_token(token: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → None¶ Run on new LLM token. Only available when streaming is enabled. on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Start a trace for an LLM run. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, **kwargs: Any) → None¶ Run when Retriever ends running.
on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Run when Retriever starts running. on_retry(retry_state: RetryCallState, *, run_id: UUID, **kwargs: Any) → None¶ on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on arbitrary text. on_tool_end(output: str, *, run_id: UUID, **kwargs: Any) → None¶ End a trace for a tool run. on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None¶ Handle an error for a tool run. on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Start a trace for a tool run.
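The class takes any Callable[[str], None] and forwards every trace line to it, which is how console printing, file logging, or list capture can share one tracer implementation. A cut-down illustration of that design (hypothetical class and event strings, not the real tracer's output format):

```python
from typing import Callable, List

class MiniFunctionHandler:
    """Forward every event line to a user-supplied function."""

    def __init__(self, function: Callable[[str], None]) -> None:
        self.function_callback = function

    def on_chain_start(self, name: str) -> None:
        self.function_callback(f"Entering new {name} chain...")

    def on_chain_end(self, name: str) -> None:
        self.function_callback(f"Finished {name} chain.")

lines: List[str] = []
# The callable could just as well be print, or a file handle's write method.
handler = MiniFunctionHandler(lines.append)
handler.on_chain_start("LLMChain")
handler.on_chain_end("LLMChain")
```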
langchain.callbacks.promptlayer_callback.PromptLayerCallbackHandler¶ class langchain.callbacks.promptlayer_callback.PromptLayerCallbackHandler(pl_id_callback: Optional[Callable[[...], Any]] = None, pl_tags: Optional[List[str]] = [])[source]¶ Callback handler for promptlayer. Initialize the PromptLayerCallbackHandler. Attributes ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__([pl_id_callback, pl_tags]) Initialize the PromptLayerCallbackHandler. on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id[, parent_run_id]) Run when chain ends running. on_chain_error(error, *, run_id[, parent_run_id]) Run when chain errors. on_chain_start(serialized, inputs, *, run_id) Run when chain starts running. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, *, run_id[, parent_run_id]) Run when LLM ends running. on_llm_error(error, *, run_id[, parent_run_id]) Run when LLM errors. on_llm_new_token(token, *, run_id[, ...]) Run on new LLM token. on_llm_start(serialized, prompts, *, run_id)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.promptlayer_callback.PromptLayerCallbackHandler.html
d11f3a0dcd00-1
Run when LLM starts running. on_retriever_end(documents, *, run_id[, ...]) Run when Retriever ends running. on_retriever_error(error, *, run_id[, ...]) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. on_tool_end(output, *, run_id[, parent_run_id]) Run when tool ends running. on_tool_error(error, *, run_id[, parent_run_id]) Run when tool errors. on_tool_start(serialized, input_str, *, run_id) Run when tool starts running. __init__(pl_id_callback: Optional[Callable[[...], Any]] = None, pl_tags: Optional[List[str]] = []) → None[source]¶ Initialize the PromptLayerCallbackHandler. on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent action. on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when chain ends running. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when chain errors. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when chain starts running. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → Any[source]¶ Run when a chat model starts running. on_llm_end(response: LLMResult, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → None[source]¶ Run when LLM ends running. on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when LLM errors. on_llm_new_token(token: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on new LLM token. Only available when streaming is enabled. on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → Any[source]¶ Run when LLM starts running. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever ends running.
on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when Retriever starts running. on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on arbitrary text. on_tool_end(output: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when tool ends running. on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when tool errors. on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when tool starts running. Examples using PromptLayerCallbackHandler¶ PromptLayer
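The pl_id_callback parameter is a hook that receives the PromptLayer request id after each tracked call, so callers can attach scores or metadata to that request. A library-free sketch of the registration-and-dispatch pattern (the parameter names follow the signature above; the surrounding driver and request ids are hypothetical):

```python
from typing import Any, Callable, List, Optional

class MiniPromptLayerHandler:
    """Sketch: invoke a user hook with each new request id, as pl_id_callback does."""

    def __init__(
        self,
        pl_id_callback: Optional[Callable[[Any], Any]] = None,
        pl_tags: Optional[List[str]] = None,
    ) -> None:
        self.pl_id_callback = pl_id_callback
        self.pl_tags = pl_tags or []

    def on_llm_end(self, request_id: int) -> None:
        # The real handler extracts the id from PromptLayer's API response;
        # here it is passed in directly for illustration.
        if self.pl_id_callback is not None:
            self.pl_id_callback(request_id)

seen_ids: List[int] = []
handler = MiniPromptLayerHandler(pl_id_callback=seen_ids.append, pl_tags=["example"])
handler.on_llm_end(101)
handler.on_llm_end(102)
```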
langchain.callbacks.mlflow_callback.MlflowCallbackHandler¶ class langchain.callbacks.mlflow_callback.MlflowCallbackHandler(name: Optional[str] = 'langchainrun-%', experiment: Optional[str] = 'langchain', tags: Optional[Dict] = {}, tracking_uri: Optional[str] = None)[source]¶ Callback Handler that logs metrics and artifacts to mlflow server. Parameters name (str) – Name of the run. experiment (str) – Name of the experiment. tags (dict) – Tags to be attached to the run. tracking_uri (str) – MLflow tracking server uri. For each callback method invoked, this handler formats the callback's input with metadata about the state of the LLM run, appends the result to the record lists (both {method}_records and action records), and then logs the records to the mlflow server. Initialize callback handler. Attributes always_verbose Whether to call verbose callbacks even if verbose is False. ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__([name, experiment, tags, tracking_uri]) Initialize callback handler. flush_tracker([langchain_asset, finish]) get_custom_callback_meta() on_agent_action(action, **kwargs) Run on agent action. on_agent_finish(finish, **kwargs) Run when agent ends running. on_chain_end(outputs, **kwargs) Run when chain ends running. on_chain_error(error, **kwargs) Run when chain errors.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.mlflow_callback.MlflowCallbackHandler.html
b7889f92298c-1
on_chain_start(serialized, inputs, **kwargs) Run when chain starts running. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, **kwargs) Run when LLM ends running. on_llm_error(error, **kwargs) Run when LLM errors. on_llm_new_token(token, **kwargs) Run when LLM generates a new token. on_llm_start(serialized, prompts, **kwargs) Run when LLM starts. on_retriever_end(documents, *, run_id[, ...]) Run when Retriever ends running. on_retriever_error(error, *, run_id[, ...]) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_text(text, **kwargs) Run when agent is ending. on_tool_end(output, **kwargs) Run when tool ends running. on_tool_error(error, **kwargs) Run when tool errors. on_tool_start(serialized, input_str, **kwargs) Run when tool starts running. reset_callback_meta() Reset the callback metadata. __init__(name: Optional[str] = 'langchainrun-%', experiment: Optional[str] = 'langchain', tags: Optional[Dict] = {}, tracking_uri: Optional[str] = None) → None[source]¶ Initialize callback handler. flush_tracker(langchain_asset: Any = None, finish: bool = False) → None[source]¶ get_custom_callback_meta() → Dict[str, Any]¶
on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶ Run on agent action. on_agent_finish(finish: AgentFinish, **kwargs: Any) → None[source]¶ Run when agent ends running. on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None[source]¶ Run when chain ends running. on_chain_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None[source]¶ Run when chain errors. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any) → None[source]¶ Run when chain starts running. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶ Run when LLM ends running. on_llm_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None[source]¶ Run when LLM errors. on_llm_new_token(token: str, **kwargs: Any) → None[source]¶ Run when LLM generates a new token. on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None[source]¶ Run when LLM starts. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever ends running. on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when Retriever starts running. on_text(text: str, **kwargs: Any) → None[source]¶ Run when agent is ending. on_tool_end(output: str, **kwargs: Any) → None[source]¶ Run when tool ends running. on_tool_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None[source]¶ Run when tool errors. on_tool_start(serialized: Dict[str, Any], input_str: str, **kwargs: Any) → None[source]¶ Run when tool starts running. reset_callback_meta() → None¶ Reset the callback metadata. Examples using MlflowCallbackHandler¶ MLflow
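Under the hood this handler keeps per-callback counters (llm_starts, chain_ends, and so on) and pushes them as a metrics dict on each event, the way MlflowLogger.metrics(data, step) logs every key in one call. A library-free sketch of that bookkeeping (hypothetical stand-in class; the real logger calls mlflow.log_metric for each key):

```python
from typing import Dict, List, Tuple

class MiniMetricsLogger:
    """Buffer metric dicts per step, as a stand-in for the mlflow backend."""

    def __init__(self) -> None:
        self.logged: List[Tuple[int, Dict[str, int]]] = []

    def metrics(self, data: Dict[str, int], step: int = 0) -> None:
        # The real MlflowLogger loops over data and logs each key at `step`;
        # here we just record what would have been sent.
        self.logged.append((step, dict(data)))

counters = {"llm_starts": 0, "llm_ends": 0, "step": 0}
logger = MiniMetricsLogger()

for _ in range(2):  # simulate two LLM calls
    counters["llm_starts"] += 1
    counters["step"] += 1
    logger.metrics(counters, step=counters["step"])
    counters["llm_ends"] += 1
    counters["step"] += 1
    logger.metrics(counters, step=counters["step"])
```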
langchain.callbacks.streaming_aiter_final_only.AsyncFinalIteratorCallbackHandler¶ class langchain.callbacks.streaming_aiter_final_only.AsyncFinalIteratorCallbackHandler(*, answer_prefix_tokens: Optional[List[str]] = None, strip_tokens: bool = True, stream_prefix: bool = False)[source]¶ Callback handler that returns an async iterator. Only the final output of the agent will be iterated. Instantiate AsyncFinalIteratorCallbackHandler. Parameters answer_prefix_tokens – Token sequence that prefixes the answer. Default is [“Final”, “Answer”, “:”] strip_tokens – Ignore white spaces and new lines when comparing answer_prefix_tokens to last tokens? (to determine if answer has been reached) stream_prefix – Should answer prefix itself also be streamed? Attributes always_verbose ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__(*[, answer_prefix_tokens, ...]) Instantiate AsyncFinalIteratorCallbackHandler. aiter() append_to_last_tokens(token) check_if_answer_reached() on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id[, ...]) Run when chain ends running. on_chain_error(error, *, run_id[, ...]) Run when chain errors. on_chain_start(serialized, inputs, *, run_id) Run when chain starts running.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.streaming_aiter_final_only.AsyncFinalIteratorCallbackHandler.html
3d04616c6244-1
on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, **kwargs) Run when LLM ends running. on_llm_error(error, **kwargs) Run when LLM errors. on_llm_new_token(token, **kwargs) Run on new LLM token. on_llm_start(serialized, prompts, **kwargs) Run when LLM starts running. on_retriever_end(documents, *, run_id[, ...]) Run on retriever end. on_retriever_error(error, *, run_id[, ...]) Run on retriever error. on_retriever_start(serialized, query, *, run_id) Run on retriever start. on_text(text, *, run_id[, parent_run_id, tags]) Run on arbitrary text. on_tool_end(output, *, run_id[, ...]) Run when tool ends running. on_tool_error(error, *, run_id[, ...]) Run when tool errors. on_tool_start(serialized, input_str, *, run_id) Run when tool starts running. __init__(*, answer_prefix_tokens: Optional[List[str]] = None, strip_tokens: bool = True, stream_prefix: bool = False) → None[source]¶ Instantiate AsyncFinalIteratorCallbackHandler. Parameters answer_prefix_tokens – Token sequence that prefixes the answer. Default is [“Final”, “Answer”, “:”] strip_tokens – Ignore white spaces and new lines when comparing answer_prefix_tokens to last tokens? (to determine if answer has been reached) stream_prefix – Should answer prefix itself also be streamed?
async aiter() → AsyncIterator[str]¶ append_to_last_tokens(token: str) → None[source]¶ check_if_answer_reached() → bool[source]¶ async on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run on agent action. async on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run on agent end. async on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run when chain ends running. async on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run when chain errors. async on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Run when chain starts running.
async on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. async on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶ Run when LLM ends running. async on_llm_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None¶ Run when LLM errors. async on_llm_new_token(token: str, **kwargs: Any) → None[source]¶ Run on new LLM token. Only available when streaming is enabled. async on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None[source]¶ Run when LLM starts running. async on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run on retriever end. async on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run on retriever error. async on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Run on retriever start.
Run on retriever start. async on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run on arbitrary text. async on_tool_end(output: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run when tool ends running. async on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶ Run when tool errors. async on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ Run when tool starts running.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.streaming_aiter_final_only.AsyncFinalIteratorCallbackHandler.html
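The append_to_last_tokens / check_if_answer_reached pair documented above amounts to a sliding-window match against the configured answer prefix. A stdlib-only sketch of that logic (a hypothetical simplification — the real handler additionally supports whitespace-stripped matching and streams the post-prefix tokens through an async iterator):

```python
# Sketch of the answer-prefix detection behind AsyncFinalIteratorCallbackHandler.
# Tokens are buffered in a window as long as the expected prefix; once the
# window equals the prefix, everything after it counts as the "final" answer.

class AnswerPrefixDetector:
    def __init__(self, answer_prefix_tokens):
        self.answer_prefix_tokens = answer_prefix_tokens
        self.last_tokens = [""] * len(answer_prefix_tokens)

    def append_to_last_tokens(self, token: str) -> None:
        # Maintain a sliding window of the most recent tokens.
        self.last_tokens.append(token)
        if len(self.last_tokens) > len(self.answer_prefix_tokens):
            self.last_tokens.pop(0)

    def check_if_answer_reached(self) -> bool:
        return self.last_tokens == self.answer_prefix_tokens


detector = AnswerPrefixDetector(["Final", " Answer", ":"])
seen_after_prefix = []
answer_reached = False
for token in ["I", " think", "Final", " Answer", ":", " 42"]:
    detector.append_to_last_tokens(token)
    if answer_reached:
        seen_after_prefix.append(token)  # only these would be streamed to the caller
    elif detector.check_if_answer_reached():
        answer_reached = True
```

With this prefix, only tokens arriving after "Final Answer:" are surfaced, which is exactly why the handler is useful for streaming just an agent's final answer.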
e3b4c2f0774a-0
langchain.callbacks.utils.BaseMetadataCallbackHandler¶ class langchain.callbacks.utils.BaseMetadataCallbackHandler[source]¶ This class handles the metadata and associated function states for callbacks. step¶ The current step. Type int starts¶ The number of times the start method has been called. Type int ends¶ The number of times the end method has been called. Type int errors¶ The number of times the error method has been called. Type int text_ctr¶ The number of times the text method has been called. Type int ignore_llm_¶ Whether to ignore llm callbacks. Type bool ignore_chain_¶ Whether to ignore chain callbacks. Type bool ignore_agent_¶ Whether to ignore agent callbacks. Type bool ignore_retriever_¶ Whether to ignore retriever callbacks. Type bool always_verbose_¶ Whether to always be verbose. Type bool chain_starts¶ The number of times the chain start method has been called. Type int chain_ends¶ The number of times the chain end method has been called. Type int llm_starts¶ The number of times the llm start method has been called. Type int llm_ends¶ The number of times the llm end method has been called. Type int llm_streams¶ The number of times the text method has been called. Type int tool_starts¶ The number of times the tool start method has been called. Type int tool_ends¶ The number of times the tool end method has been called. Type int agent_ends¶ The number of times the agent end method has been called. Type int on_llm_start_records¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.utils.BaseMetadataCallbackHandler.html
e3b4c2f0774a-1
Type int on_llm_start_records¶ A list of records of the on_llm_start method. Type list on_llm_token_records¶ A list of records of the on_llm_token method. Type list on_llm_end_records¶ A list of records of the on_llm_end method. Type list on_chain_start_records¶ A list of records of the on_chain_start method. Type list on_chain_end_records¶ A list of records of the on_chain_end method. Type list on_tool_start_records¶ A list of records of the on_tool_start method. Type list on_tool_end_records¶ A list of records of the on_tool_end method. Type list on_agent_finish_records¶ A list of records of the on_agent_end method. Type list Attributes always_verbose Whether to call verbose callbacks even if verbose is False. ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_llm Whether to ignore LLM callbacks. Methods __init__() get_custom_callback_meta() reset_callback_meta() Reset the callback metadata. __init__() → None[source]¶ get_custom_callback_meta() → Dict[str, Any][source]¶ reset_callback_meta() → None[source]¶ Reset the callback metadata.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.utils.BaseMetadataCallbackHandler.html
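The attribute list above is bookkeeping: per-event counters plus record lists, with a reset. A minimal sketch of that pattern, assuming a handful of representative fields (not the full set the real BaseMetadataCallbackHandler tracks):

```python
# Sketch of the counter/record bookkeeping BaseMetadataCallbackHandler provides.

class CallbackMetaSketch:
    def __init__(self):
        self.reset_callback_meta()

    def reset_callback_meta(self):
        # Reset the callback metadata.
        self.step = 0
        self.llm_starts = 0
        self.llm_ends = 0
        self.errors = 0
        self.on_llm_start_records = []

    def get_custom_callback_meta(self):
        return {
            "step": self.step,
            "llm_starts": self.llm_starts,
            "llm_ends": self.llm_ends,
            "errors": self.errors,
        }


meta = CallbackMetaSketch()
meta.step += 1
meta.llm_starts += 1
meta.on_llm_start_records.append({"prompt": "hi"})
snapshot = meta.get_custom_callback_meta()
meta.reset_callback_meta()  # counters and records go back to zero/empty
```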
87a46a8af752-0
langchain.callbacks.manager.CallbackManagerForLLMRun¶ class langchain.callbacks.manager.CallbackManagerForLLMRun(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶ Callback manager for LLM run. Initialize the run manager. Parameters run_id (UUID) – The ID of the run. handlers (List[BaseCallbackHandler]) – The list of handlers. inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers. parent_run_id (UUID, optional) – The ID of the parent run. Defaults to None. tags (Optional[List[str]]) – The list of tags. inheritable_tags (Optional[List[str]]) – The list of inheritable tags. metadata (Optional[Dict[str, Any]]) – The metadata. inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata. Methods __init__(*, run_id, handlers, ...[, ...]) Initialize the run manager. get_noop_manager() Return a manager that doesn't perform any operations. on_llm_end(response, **kwargs) Run when LLM ends running. on_llm_error(error, **kwargs) Run when LLM errors. on_llm_new_token(token, **kwargs) Run when LLM generates a new token. on_retry(retry_state, **kwargs) on_text(text, **kwargs) Run when text is received.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.CallbackManagerForLLMRun.html
87a46a8af752-1
on_text(text, **kwargs) Run when text is received. __init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶ Initialize the run manager. Parameters run_id (UUID) – The ID of the run. handlers (List[BaseCallbackHandler]) – The list of handlers. inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers. parent_run_id (UUID, optional) – The ID of the parent run. Defaults to None. tags (Optional[List[str]]) – The list of tags. inheritable_tags (Optional[List[str]]) – The list of inheritable tags. metadata (Optional[Dict[str, Any]]) – The metadata. inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata. classmethod get_noop_manager() → BRM¶ Return a manager that doesn’t perform any operations. Returns The noop manager. Return type BaseRunManager on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶ Run when LLM ends running. Parameters response (LLMResult) – The LLM result. on_llm_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None[source]¶ Run when LLM errors. Parameters error (Exception or KeyboardInterrupt) – The error. on_llm_new_token(token: str, **kwargs: Any) → None[source]¶ Run when LLM generates a new token. Parameters
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.CallbackManagerForLLMRun.html
87a46a8af752-2
Run when LLM generates a new token. Parameters token (str) – The new token. on_retry(retry_state: RetryCallState, **kwargs: Any) → None¶ on_text(text: str, **kwargs: Any) → Any¶ Run when text is received. Parameters text (str) – The received text. Returns The result of the callback. Return type Any Examples using CallbackManagerForLLMRun¶ Custom LLM
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.CallbackManagerForLLMRun.html
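Conceptually, a run manager fans each LLM event out to every registered handler, tagged with the run's ID. A hypothetical sketch of that dispatch (not the real CallbackManagerForLLMRun, which also handles inheritable handlers, tags, and metadata):

```python
# Sketch: a run manager dispatches each event to all handlers with its run_id.

import uuid


class RecordingHandler:
    def __init__(self):
        self.events = []

    def on_llm_new_token(self, token, run_id):
        self.events.append(("token", token, run_id))

    def on_llm_end(self, response, run_id):
        self.events.append(("end", response, run_id))


class RunManagerSketch:
    def __init__(self, handlers):
        self.run_id = uuid.uuid4()  # one ID identifies this whole run
        self.handlers = handlers

    def on_llm_new_token(self, token):
        for h in self.handlers:
            h.on_llm_new_token(token, run_id=self.run_id)

    def on_llm_end(self, response):
        for h in self.handlers:
            h.on_llm_end(response, run_id=self.run_id)


handler = RecordingHandler()
manager = RunManagerSketch([handler])
manager.on_llm_new_token("Hel")
manager.on_llm_new_token("lo")
manager.on_llm_end("Hello")
```

Every handler sees the same run_id, which is what lets downstream consumers correlate tokens, errors, and the final result of a single LLM call.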
6063bdb96d2a-0
langchain.callbacks.tracers.schemas.TracerSessionV1Base¶ class langchain.callbacks.tracers.schemas.TracerSessionV1Base[source]¶ Bases: BaseModel Base class for TracerSessionV1. Create a new model by parsing and validating input data from keyword arguments. Raises ValidationError if the input data cannot be parsed to form a valid model. param extra: Optional[Dict[str, Any]] = None¶ param name: Optional[str] = None¶ param start_time: datetime.datetime [Optional]¶ classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶ Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = ‘allow’ was set since it adds all passed values copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶ Duplicate a model, optionally choose which fields to include, exclude and change. Parameters include – fields to include in new model exclude – fields to exclude from new model, as with values this takes precedence over include update – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data deep – set to True to make a deep copy of the model Returns new model instance
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.TracerSessionV1Base.html
6063bdb96d2a-1
deep – set to True to make a deep copy of the model Returns new model instance dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶ Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. classmethod from_orm(obj: Any) → Model¶ json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶ Generate a JSON representation of the model, include and exclude arguments as per dict(). encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps(). classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶ classmethod parse_obj(obj: Any) → Model¶ classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.TracerSessionV1Base.html
6063bdb96d2a-2
classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶ classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶ classmethod update_forward_refs(**localns: Any) → None¶ Try to update ForwardRefs on fields based on this Model, globalns and localns. classmethod validate(value: Any) → Model¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.TracerSessionV1Base.html
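The model above declares just three fields. A rough stdlib-only sketch of the same shape, using a dataclass in place of the real pydantic BaseModel machinery (so none of the construct/copy/dict/json helpers apply):

```python
# Dataclass sketch of TracerSessionV1Base's field layout.

import datetime
from dataclasses import dataclass, field
from typing import Any, Dict, Optional


@dataclass
class TracerSessionV1BaseSketch:
    extra: Optional[Dict[str, Any]] = None
    name: Optional[str] = None
    # "[Optional]" in the docs means the field is auto-populated when omitted.
    start_time: datetime.datetime = field(default_factory=datetime.datetime.now)


session = TracerSessionV1BaseSketch(name="my-session")
```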
036663eeb2a6-0
langchain.callbacks.manager.wandb_tracing_enabled¶ langchain.callbacks.manager.wandb_tracing_enabled(session_name: str = 'default') → Generator[None, None, None][source]¶ Get the WandbTracer in a context manager. Parameters session_name (str, optional) – The name of the session. Defaults to “default”. Returns None Example >>> with wandb_tracing_enabled() as session: ... # Use the WandbTracer session Examples using wandb_tracing_enabled¶ WandB Tracing
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.wandb_tracing_enabled.html
3e26e71e3945-0
langchain.callbacks.manager.trace_as_chain_group¶ langchain.callbacks.manager.trace_as_chain_group(group_name: str, callback_manager: Optional[CallbackManager] = None, *, project_name: Optional[str] = None, example_id: Optional[Union[str, UUID]] = None, tags: Optional[List[str]] = None) → Generator[CallbackManager, None, None][source]¶ Get a callback manager for a chain group in a context manager. Useful for grouping different calls together as a single run even if they aren’t composed in a single chain. Parameters group_name (str) – The name of the chain group. project_name (str, optional) – The name of the project. Defaults to None. example_id (str or UUID, optional) – The ID of the example. Defaults to None. tags (List[str], optional) – The inheritable tags to apply to all runs. Defaults to None. Returns The callback manager for the chain group. Return type CallbackManager Example >>> with trace_as_chain_group("group_name") as manager: ... # Use the callback manager for the chain group ... llm.predict("Foo", callbacks=manager)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.trace_as_chain_group.html
003e5765d9c6-0
langchain.callbacks.manager.env_var_is_set¶ langchain.callbacks.manager.env_var_is_set(env_var: str) → bool[source]¶ Check if an environment variable is set. Parameters env_var (str) – The name of the environment variable. Returns True if the environment variable is set, False otherwise. Return type bool
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.env_var_is_set.html
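A stdlib sketch of the documented behavior. Assumption to note: this treats "set" as simple presence in os.environ; the real helper may additionally treat falsy values such as "" or "false" as unset, so check the source if that distinction matters:

```python
# Sketch: check whether an environment variable is set.

import os


def env_var_is_set_sketch(env_var: str) -> bool:
    return env_var in os.environ


os.environ["MY_DEMO_FLAG"] = "1"
flag_set = env_var_is_set_sketch("MY_DEMO_FLAG")
missing_set = env_var_is_set_sketch("MY_DEMO_FLAG_THAT_DOES_NOT_EXIST")
```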
d07c66b84735-0
langchain.callbacks.streamlit.mutable_expander.ChildType¶ class langchain.callbacks.streamlit.mutable_expander.ChildType(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]¶ The enumerator of the child type. MARKDOWN = 'MARKDOWN'¶ EXCEPTION = 'EXCEPTION'¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.streamlit.mutable_expander.ChildType.html
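ChildType is a plain string-valued Enum with two members (the long signature above is just the standard Enum functional API). An equivalent stdlib sketch:

```python
# Sketch of the ChildType enumerator.

from enum import Enum


class ChildTypeSketch(Enum):
    MARKDOWN = "MARKDOWN"
    EXCEPTION = "EXCEPTION"


# Lookup by value round-trips to the member itself.
member = ChildTypeSketch("EXCEPTION")
```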
917e03900aa7-0
langchain.callbacks.aim_callback.AimCallbackHandler¶ class langchain.callbacks.aim_callback.AimCallbackHandler(repo: Optional[str] = None, experiment_name: Optional[str] = None, system_tracking_interval: Optional[int] = 10, log_system_params: bool = True)[source]¶ Callback Handler that logs to Aim. Parameters repo (str, optional) – Aim repository path or Repo object to which Run object is bound. If skipped, default Repo is used. experiment_name (str, optional) – Sets Run’s experiment property. ‘default’ if not specified. Can be used later to query runs/sequences. system_tracking_interval (int, optional) – Sets the tracking interval in seconds for system usage metrics (CPU, Memory, etc.). Set to None to disable system metrics tracking. log_system_params (bool, optional) – Enable/Disable logging of system params such as installed packages, git info, environment variables, etc. This handler intercepts each supported callback method, formats the callback’s input together with metadata about the state of the LLM run, and then logs the result to Aim. Initialize callback handler. Attributes always_verbose Whether to call verbose callbacks even if verbose is False. ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__([repo, experiment_name, ...]) Initialize callback handler. flush_tracker([repo, experiment_name, ...]) Flush the tracker and reset the session. get_custom_callback_meta() on_agent_action(action, **kwargs)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.aim_callback.AimCallbackHandler.html
917e03900aa7-1
get_custom_callback_meta() on_agent_action(action, **kwargs) Run on agent action. on_agent_finish(finish, **kwargs) Run when agent ends running. on_chain_end(outputs, **kwargs) Run when chain ends running. on_chain_error(error, **kwargs) Run when chain errors. on_chain_start(serialized, inputs, **kwargs) Run when chain starts running. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, **kwargs) Run when LLM ends running. on_llm_error(error, **kwargs) Run when LLM errors. on_llm_new_token(token, **kwargs) Run when LLM generates a new token. on_llm_start(serialized, prompts, **kwargs) Run when LLM starts. on_retriever_end(documents, *, run_id[, ...]) Run when Retriever ends running. on_retriever_error(error, *, run_id[, ...]) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_text(text, **kwargs) Run when agent is ending. on_tool_end(output, **kwargs) Run when tool ends running. on_tool_error(error, **kwargs) Run when tool errors. on_tool_start(serialized, input_str, **kwargs) Run when tool starts running. reset_callback_meta() Reset the callback metadata. setup(**kwargs)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.aim_callback.AimCallbackHandler.html
917e03900aa7-2
reset_callback_meta() Reset the callback metadata. setup(**kwargs) __init__(repo: Optional[str] = None, experiment_name: Optional[str] = None, system_tracking_interval: Optional[int] = 10, log_system_params: bool = True) → None[source]¶ Initialize callback handler. flush_tracker(repo: Optional[str] = None, experiment_name: Optional[str] = None, system_tracking_interval: Optional[int] = 10, log_system_params: bool = True, langchain_asset: Any = None, reset: bool = True, finish: bool = False) → None[source]¶ Flush the tracker and reset the session. Parameters repo (str, optional) – Aim repository path or Repo object to which Run object is bound. If skipped, default Repo is used. experiment_name (str, optional) – Sets Run’s experiment property. ‘default’ if not specified. Can be used later to query runs/sequences. system_tracking_interval (int, optional) – Sets the tracking interval in seconds for system usage metrics (CPU, Memory, etc.). Set to None to disable system metrics tracking. log_system_params (bool, optional) – Enable/Disable logging of system params such as installed packages, git info, environment variables, etc. langchain_asset – The langchain asset to save. reset – Whether to reset the session. finish – Whether to finish the run. Returns – None get_custom_callback_meta() → Dict[str, Any]¶ on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶ Run on agent action. on_agent_finish(finish: AgentFinish, **kwargs: Any) → None[source]¶ Run when agent ends running. on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None[source]¶ Run when chain ends running.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.aim_callback.AimCallbackHandler.html
917e03900aa7-3
Run when chain ends running. on_chain_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None[source]¶ Run when chain errors. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any) → None[source]¶ Run when chain starts running. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶ Run when LLM ends running. on_llm_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None[source]¶ Run when LLM errors. on_llm_new_token(token: str, **kwargs: Any) → None[source]¶ Run when LLM generates a new token. on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None[source]¶ Run when LLM starts. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever ends running. on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever errors.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.aim_callback.AimCallbackHandler.html
917e03900aa7-4
Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when Retriever starts running. on_text(text: str, **kwargs: Any) → None[source]¶ Run when agent is ending. on_tool_end(output: str, **kwargs: Any) → None[source]¶ Run when tool ends running. on_tool_error(error: Union[Exception, KeyboardInterrupt], **kwargs: Any) → None[source]¶ Run when tool errors. on_tool_start(serialized: Dict[str, Any], input_str: str, **kwargs: Any) → None[source]¶ Run when tool starts running. reset_callback_meta() → None¶ Reset the callback metadata. setup(**kwargs: Any) → None[source]¶ Examples using AimCallbackHandler¶ Aim
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.aim_callback.AimCallbackHandler.html
a7343e52c006-0
langchain.callbacks.aim_callback.BaseMetadataCallbackHandler¶ class langchain.callbacks.aim_callback.BaseMetadataCallbackHandler[source]¶ This class handles the metadata and associated function states for callbacks. step¶ The current step. Type int starts¶ The number of times the start method has been called. Type int ends¶ The number of times the end method has been called. Type int errors¶ The number of times the error method has been called. Type int text_ctr¶ The number of times the text method has been called. Type int ignore_llm_¶ Whether to ignore llm callbacks. Type bool ignore_chain_¶ Whether to ignore chain callbacks. Type bool ignore_agent_¶ Whether to ignore agent callbacks. Type bool ignore_retriever_¶ Whether to ignore retriever callbacks. Type bool always_verbose_¶ Whether to always be verbose. Type bool chain_starts¶ The number of times the chain start method has been called. Type int chain_ends¶ The number of times the chain end method has been called. Type int llm_starts¶ The number of times the llm start method has been called. Type int llm_ends¶ The number of times the llm end method has been called. Type int llm_streams¶ The number of times the text method has been called. Type int tool_starts¶ The number of times the tool start method has been called. Type int tool_ends¶ The number of times the tool end method has been called. Type int agent_ends¶ The number of times the agent end method has been called. Type int Attributes always_verbose
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.aim_callback.BaseMetadataCallbackHandler.html
a7343e52c006-1
Type int Attributes always_verbose Whether to call verbose callbacks even if verbose is False. ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. Methods __init__() get_custom_callback_meta() reset_callback_meta() Reset the callback metadata. __init__() → None[source]¶ get_custom_callback_meta() → Dict[str, Any][source]¶ reset_callback_meta() → None[source]¶ Reset the callback metadata.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.aim_callback.BaseMetadataCallbackHandler.html
cbc08d97767c-0
langchain.callbacks.human.HumanApprovalCallbackHandler¶ class langchain.callbacks.human.HumanApprovalCallbackHandler(approve: ~typing.Callable[[~typing.Any], bool] = <function _default_approve>, should_check: ~typing.Callable[[~typing.Dict[str, ~typing.Any]], bool] = <function _default_true>)[source]¶ Callback for manually validating values. Attributes ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__([approve, should_check]) on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id[, parent_run_id]) Run when chain ends running. on_chain_error(error, *, run_id[, parent_run_id]) Run when chain errors. on_chain_start(serialized, inputs, *, run_id) Run when chain starts running. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, *, run_id[, parent_run_id]) Run when LLM ends running. on_llm_error(error, *, run_id[, parent_run_id]) Run when LLM errors. on_llm_new_token(token, *, run_id[, ...]) Run on new LLM token.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.human.HumanApprovalCallbackHandler.html
cbc08d97767c-1
Run on new LLM token. on_llm_start(serialized, prompts, *, run_id) Run when LLM starts running. on_retriever_end(documents, *, run_id[, ...]) Run when Retriever ends running. on_retriever_error(error, *, run_id[, ...]) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. on_tool_end(output, *, run_id[, parent_run_id]) Run when tool ends running. on_tool_error(error, *, run_id[, parent_run_id]) Run when tool errors. on_tool_start(serialized, input_str, *, run_id) Run when tool starts running. __init__(approve: ~typing.Callable[[~typing.Any], bool] = <function _default_approve>, should_check: ~typing.Callable[[~typing.Dict[str, ~typing.Any]], bool] = <function _default_true>)[source]¶ on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent action. on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when chain ends running.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.human.HumanApprovalCallbackHandler.html
cbc08d97767c-2
Run when chain ends running. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when chain errors. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when chain starts running. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. on_llm_end(response: LLMResult, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when LLM ends running. on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when LLM errors. on_llm_new_token(token: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on new LLM token. Only available when streaming is enabled.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.human.HumanApprovalCallbackHandler.html
cbc08d97767c-3
Run on new LLM token. Only available when streaming is enabled. on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when LLM starts running. on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever ends running. on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when Retriever errors. on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when Retriever starts running. on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on arbitrary text. on_tool_end(output: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when tool ends running. on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run when tool errors.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.human.HumanApprovalCallbackHandler.html
cbc08d97767c-4
Run when tool errors. on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶ Run when tool starts running. Examples using HumanApprovalCallbackHandler¶ Human-in-the-loop Tool Validation
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.human.HumanApprovalCallbackHandler.html
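The handler's two callables divide the work: should_check decides whether a tool input needs review at all, and approve decides whether it may proceed. A sketch of that gate (hypothetical names ApprovalGate/RejectedError; the real handler raises its own exception type from on_tool_start, and its defaults prompt a human on stdin):

```python
# Sketch of the approval gating pattern behind HumanApprovalCallbackHandler.

from typing import Any, Callable, Dict


class RejectedError(Exception):
    """Raised when an input fails approval (stand-in exception type)."""


class ApprovalGate:
    def __init__(
        self,
        approve: Callable[[Any], bool],
        should_check: Callable[[Dict[str, Any]], bool] = lambda serialized: True,
    ):
        self.approve = approve
        self.should_check = should_check

    def on_tool_start(self, serialized: Dict[str, Any], input_str: str) -> None:
        # Only gate inputs that should_check selects; reject if approve says no.
        if self.should_check(serialized) and not self.approve(input_str):
            raise RejectedError(f"Input {input_str!r} was rejected.")


gate = ApprovalGate(
    approve=lambda s: "rm -rf" not in s,
    should_check=lambda serialized: serialized.get("name") == "terminal",
)
gate.on_tool_start({"name": "terminal"}, "ls")          # checked, approved
gate.on_tool_start({"name": "calculator"}, "rm -rf /")  # not a checked tool
rejected = False
try:
    gate.on_tool_start({"name": "terminal"}, "rm -rf /")
except RejectedError:
    rejected = True
```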
3f4d2036308e-0
langchain.callbacks.streamlit.mutable_expander.ChildRecord¶ class langchain.callbacks.streamlit.mutable_expander.ChildRecord(type: ChildType, kwargs: Dict[str, Any], dg: DeltaGenerator)[source]¶ The child record as a NamedTuple. Create new instance of ChildRecord(type, kwargs, dg) Attributes dg Alias for field number 2 kwargs Alias for field number 1 type Alias for field number 0 Methods __init__() count(value, /) Return number of occurrences of value. index(value[, start, stop]) Return first index of value. __init__()¶ count(value, /)¶ Return number of occurrences of value. index(value, start=0, stop=9223372036854775807, /)¶ Return first index of value. Raises ValueError if the value is not present.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.streamlit.mutable_expander.ChildRecord.html
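ChildRecord is a three-field NamedTuple, which is why the docs list both attribute access (type, kwargs, dg) and tuple behavior (count, index). A stdlib sketch with Streamlit's DeltaGenerator replaced by a plain placeholder:

```python
# Sketch of the ChildRecord NamedTuple shape.

from typing import Any, Dict, NamedTuple


class ChildRecordSketch(NamedTuple):
    type: str            # e.g. "MARKDOWN" or "EXCEPTION"
    kwargs: Dict[str, Any]
    dg: Any              # stands in for the Streamlit DeltaGenerator handle


record = ChildRecordSketch("MARKDOWN", {"body": "**hi**"}, dg=object())
```

Because it is a NamedTuple, record.type and record[0] are the same value.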
9506188a31e9-0
langchain.callbacks.manager.tracing_v2_enabled¶ langchain.callbacks.manager.tracing_v2_enabled(project_name: Optional[str] = None, *, example_id: Optional[Union[str, UUID]] = None, tags: Optional[List[str]] = None, client: Optional[LangSmithClient] = None) → Generator[None, None, None][source]¶ Instruct LangChain to log all runs in context to LangSmith. Parameters project_name (str, optional) – The name of the project. Defaults to “default”. example_id (str or UUID, optional) – The ID of the example. Defaults to None. tags (List[str], optional) – The tags to add to the run. Defaults to None. Returns None Example >>> with tracing_v2_enabled(): ... # LangChain code will automatically be traced Examples using tracing_v2_enabled¶ LangSmith Walkthrough
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.tracing_v2_enabled.html
7aeea0c0336f-0
langchain.callbacks.tracers.schemas.Run¶ class langchain.callbacks.tracers.schemas.Run[source]¶ Bases: RunBase Run schema for the V2 API in the Tracer. Create a new model by parsing and validating input data from keyword arguments. Raises ValidationError if the input data cannot be parsed to form a valid model. param child_execution_order: int [Required]¶ param child_runs: List[langchain.callbacks.tracers.schemas.Run] [Optional]¶ param end_time: Optional[datetime.datetime] = None¶ param error: Optional[str] = None¶ param events: Optional[List[Dict]] = None¶ param execution_order: int [Required]¶ param extra: Optional[dict] = None¶ param id: uuid.UUID [Required]¶ param inputs: dict [Required]¶ param name: str [Required]¶ param outputs: Optional[dict] = None¶ param parent_run_id: Optional[uuid.UUID] = None¶ param reference_example_id: Optional[uuid.UUID] = None¶ param run_type: str [Required]¶ The type of run, such as tool, chain, llm, retriever, embedding, prompt, parser. param serialized: Optional[dict] = None¶ param start_time: datetime.datetime [Required]¶ param tags: Optional[List[str]] [Optional]¶ classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model¶ Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.Run.html
7aeea0c0336f-1
Default values are respected, but no other validation is performed. Behaves as if Config.extra = ‘allow’ was set since it adds all passed values copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model¶ Duplicate a model, optionally choose which fields to include, exclude and change. Parameters include – fields to include in new model exclude – fields to exclude from new model, as with values this takes precedence over include update – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data deep – set to True to make a deep copy of the model Returns new model instance dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny¶ Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. classmethod from_orm(obj: Any) → Model¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.Run.html
7aeea0c0336f-2
classmethod from_orm(obj: Any) → Model¶ json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode¶ Generate a JSON representation of the model, include and exclude arguments as per dict(). encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps(). classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶ classmethod parse_obj(obj: Any) → Model¶ classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model¶ classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny¶ classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode¶ classmethod update_forward_refs(**localns: Any) → None¶ Try to update ForwardRefs on fields based on this Model, globalns and localns. classmethod validate(value: Any) → Model¶
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.schemas.Run.html
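The fields above form a tree: each `Run` carries a `parent_run_id` and a list of `child_runs`. A dataclass sketch of that shape (the real class is a pydantic model with validation; `RunSketch` is an invented stand-in) makes the parent/child wiring concrete:

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Dataclass sketch of the Run tree shape; the real Run is a pydantic model.
@dataclass
class RunSketch:
    name: str
    run_type: str          # e.g. "chain", "llm", "tool", "retriever"
    inputs: dict
    id: uuid.UUID = field(default_factory=uuid.uuid4)
    start_time: datetime = field(default_factory=datetime.utcnow)
    end_time: Optional[datetime] = None
    parent_run_id: Optional[uuid.UUID] = None
    child_runs: List["RunSketch"] = field(default_factory=list)

# A chain run with a nested LLM run, as a tracer would record them.
parent = RunSketch(name="my_chain", run_type="chain", inputs={"question": "hi"})
child = RunSketch(name="my_llm", run_type="llm",
                  inputs={"prompt": "hi"}, parent_run_id=parent.id)
parent.child_runs.append(child)
```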
ffd82529891e-0
langchain.callbacks.wandb_callback.construct_html_from_prompt_and_generation¶ langchain.callbacks.wandb_callback.construct_html_from_prompt_and_generation(prompt: str, generation: str) → Any[source]¶ Construct an html element from a prompt and a generation. Parameters prompt (str) – The prompt. generation (str) – The generation. Returns The html element. Return type (wandb.Html)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.wandb_callback.construct_html_from_prompt_and_generation.html
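The core of such a helper is pairing the two strings as escaped HTML; the real function then wraps the markup in a `wandb.Html` object. A minimal sketch, with invented function name and styling, assuming only the stdlib:

```python
import html

# Sketch: render a prompt and its generation as two escaped HTML paragraphs.
# The colors and function name are illustrative, not the library's output.
def prompt_generation_html(prompt: str, generation: str) -> str:
    return (
        f'<p style="color:black;">{html.escape(prompt)}</p>'
        f'<p style="color:green;">{html.escape(generation)}</p>'
    )
```

Escaping matters here because model output frequently contains `<`, `>`, or `&`, which would otherwise be interpreted as markup.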
f8482485c9af-0
langchain.callbacks.base.ChainManagerMixin¶ class langchain.callbacks.base.ChainManagerMixin[source]¶ Mixin for chain callbacks. Methods __init__() on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id[, parent_run_id]) Run when chain ends running. on_chain_error(error, *, run_id[, parent_run_id]) Run when chain errors. __init__()¶ on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶ Run on agent action. on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶ Run when chain ends running. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶ Run when chain errors.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.base.ChainManagerMixin.html
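The mixin pattern above means every hook is an overridable no-op: a handler subclasses the base and implements only the events it cares about. A stand-alone analogue (not the real base class, which LangChain provides) with just `on_chain_end`:

```python
from typing import Any, Dict, List, Optional
from uuid import UUID, uuid4

# Stand-alone analogue of the mixin: hooks default to no-ops,
# subclasses override only what they need.
class ChainHooks:
    def on_chain_end(self, outputs: Dict[str, Any], *, run_id: UUID,
                     parent_run_id: Optional[UUID] = None, **kwargs: Any) -> Any:
        pass  # default: ignore the event

class CollectOutputs(ChainHooks):
    """Records every chain's final outputs."""
    def __init__(self) -> None:
        self.seen: List[Dict[str, Any]] = []

    def on_chain_end(self, outputs, *, run_id, parent_run_id=None, **kwargs):
        self.seen.append(outputs)

handler = CollectOutputs()
handler.on_chain_end({"answer": "42"}, run_id=uuid4())
```

Note the keyword-only `run_id`/`parent_run_id` after `*`, mirroring the signatures above: the framework always passes them by name.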
030bf0e3956a-0
langchain.callbacks.utils.load_json¶ langchain.callbacks.utils.load_json(json_path: Union[str, Path]) → str[source]¶ Load a json file into a string. Parameters json_path (str) – The path to the json file. Returns The string representation of the json file. Return type (str)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.utils.load_json.html
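Per the documented contract, the helper returns the file's contents as a string rather than a parsed object. A minimal sketch of that behavior (illustrative reimplementation, not the library source):

```python
from pathlib import Path
from typing import Union

# Sketch of the documented behavior: read a json file, return its raw text.
def load_json_sketch(json_path: Union[str, Path]) -> str:
    with open(json_path, "r") as f:
        return f.read()
```

Callers that need structured data would pass the result through `json.loads` themselves.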
43958d94c507-0
langchain.callbacks.streamlit.streamlit_callback_handler.ToolRecord¶ class langchain.callbacks.streamlit.streamlit_callback_handler.ToolRecord(name: str, input_str: str)[source]¶ The tool record as a NamedTuple. Create new instance of ToolRecord(name, input_str) Attributes input_str Alias for field number 1 name Alias for field number 0 Methods __init__() count(value, /) Return number of occurrences of value. index(value[, start, stop]) Return first index of value. __init__()¶ count(value, /)¶ Return number of occurrences of value. index(value, start=0, stop=9223372036854775807, /)¶ Return first index of value. Raises ValueError if the value is not present.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.streamlit.streamlit_callback_handler.ToolRecord.html
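The "Alias for field number N" attributes and the inherited `count`/`index` methods are standard `NamedTuple` behavior. An equivalent sketch (the class and values here are invented for illustration):

```python
from typing import NamedTuple

# Equivalent NamedTuple shape; positional order matches the field numbers above.
class ToolRecordSketch(NamedTuple):
    name: str       # field number 0
    input_str: str  # field number 1

record = ToolRecordSketch(name="Search", input_str="weather in SF")
```

Because it is a tuple, the record supports both attribute and index access, plus the tuple methods documented above.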
e117a4bf5b75-0
langchain.callbacks.manager.AsyncCallbackManagerForChainRun¶ class langchain.callbacks.manager.AsyncCallbackManagerForChainRun(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶ Async callback manager for chain run. Initialize the run manager. Parameters run_id (UUID) – The ID of the run. handlers (List[BaseCallbackHandler]) – The list of handlers. inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers. parent_run_id (UUID, optional) – The ID of the parent run. Defaults to None. tags (Optional[List[str]]) – The list of tags. inheritable_tags (Optional[List[str]]) – The list of inheritable tags. metadata (Optional[Dict[str, Any]]) – The metadata. inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata. Methods __init__(*, run_id, handlers, ...[, ...]) Initialize the run manager. get_child([tag]) Get a child callback manager. get_noop_manager() Return a manager that doesn't perform any operations. on_agent_action(action, **kwargs) Run when agent action is received. on_agent_finish(finish, **kwargs) Run when agent finish is received. on_chain_end(outputs, **kwargs) Run when chain ends running. on_chain_error(error, **kwargs) Run when chain errors. on_retry(retry_state, **kwargs)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.AsyncCallbackManagerForChainRun.html
e117a4bf5b75-1
Run when chain errors. on_retry(retry_state, **kwargs) on_text(text, **kwargs) Run when text is received. __init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶ Initialize the run manager. Parameters run_id (UUID) – The ID of the run. handlers (List[BaseCallbackHandler]) – The list of handlers. inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers. parent_run_id (UUID, optional) – The ID of the parent run. Defaults to None. tags (Optional[List[str]]) – The list of tags. inheritable_tags (Optional[List[str]]) – The list of inheritable tags. metadata (Optional[Dict[str, Any]]) – The metadata. inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata. get_child(tag: Optional[str] = None) → AsyncCallbackManager¶ Get a child callback manager. Parameters tag (str, optional) – The tag for the child callback manager. Defaults to None. Returns The child callback manager. Return type AsyncCallbackManager classmethod get_noop_manager() → BRM¶ Return a manager that doesn’t perform any operations. Returns The noop manager. Return type BaseRunManager async on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶ Run when agent action is received. Parameters action (AgentAction) – The agent action. Returns
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.AsyncCallbackManagerForChainRun.html
e117a4bf5b75-2
Parameters action (AgentAction) – The agent action. Returns The result of the callback. Return type Any async on_agent_finish(finish: AgentFinish, **kwargs: Any) → Any[source]¶ Run when agent finish is received. Parameters finish (AgentFinish) – The agent finish. Returns The result of the callback. Return type Any async on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None[source]¶ Run when chain ends running. Parameters outputs (Dict[str, Any]) – The outputs of the chain. async on_chain_error(error: BaseException, **kwargs: Any) → None[source]¶ Run when chain errors. Parameters error (Exception or KeyboardInterrupt) – The error. async on_retry(retry_state: RetryCallState, **kwargs: Any) → None¶ async on_text(text: str, **kwargs: Any) → Any¶ Run when text is received. Parameters text (str) – The received text. Returns The result of the callback. Return type Any Examples using AsyncCallbackManagerForChainRun¶ Custom chain
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.AsyncCallbackManagerForChainRun.html
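The manager's job, stripped to its essentials, is fan-out: one event awaited on every registered handler. A minimal asyncio sketch of that role (illustrative only; the real manager also threads `run_id`, tags, and metadata through each call):

```python
import asyncio
from typing import Any, Dict, List

# Sketch of the fan-out role: one event, awaited on every handler.
class EchoHandler:
    def __init__(self) -> None:
        self.events: List[Dict[str, Any]] = []

    async def on_chain_end(self, outputs: Dict[str, Any]) -> None:
        self.events.append(outputs)

async def fan_out_chain_end(handlers, outputs: Dict[str, Any]) -> None:
    # Dispatch the event to all handlers concurrently.
    await asyncio.gather(*(h.on_chain_end(outputs) for h in handlers))

handlers = [EchoHandler(), EchoHandler()]
asyncio.run(fan_out_chain_end(handlers, {"result": "done"}))
```

`get_child` extends this picture: a child manager starts from the parent's inheritable handlers and tags, so nested runs report to the same sinks.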
809d742322c5-0
langchain.callbacks.manager.tracing_enabled¶ langchain.callbacks.manager.tracing_enabled(session_name: str = 'default') → Generator[TracerSessionV1, None, None][source]¶ Get the deprecated LangChainTracer in a context manager. Parameters session_name (str, optional) – The name of the session. Defaults to “default”. Returns The LangChainTracer session. Return type TracerSessionV1 Example >>> with tracing_enabled() as session: ... # Use the LangChainTracer session Examples using tracing_enabled¶ Multiple callback handlers
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.manager.tracing_enabled.html
3f597fed3ed2-0
langchain.callbacks.utils.import_pandas¶ langchain.callbacks.utils.import_pandas() → Any[source]¶ Import the pandas python package and raise an error if it is not installed.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.utils.import_pandas.html
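Helpers like `import_pandas` follow a common optional-dependency pattern: import lazily and fail with a clear install hint. A generic sketch of that pattern (the function name is invented; demonstrated with a stdlib module so it always runs):

```python
import importlib
from typing import Any

# General shape of optional-import helpers: lazy import, clear error message.
def import_or_raise(name: str, install_hint: str) -> Any:
    try:
        return importlib.import_module(name)
    except ImportError as e:
        raise ImportError(f"Could not import {name}. {install_hint}") from e
```

Deferring the import to call time keeps the heavy dependency off the critical path for users who never touch the feature that needs it.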
8c1441fc1c26-0
langchain.callbacks.utils.flatten_dict¶ langchain.callbacks.utils.flatten_dict(nested_dict: Dict[str, Any], parent_key: str = '', sep: str = '_') → Dict[str, Any][source]¶ Flattens a nested dictionary into a flat dictionary. Parameters nested_dict (dict) – The nested dictionary to flatten. parent_key (str) – The prefix to prepend to the keys of the flattened dict. sep (str) – The separator to use between the parent key and the key of the flattened dictionary. Returns A flat dictionary. Return type (dict)
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.utils.flatten_dict.html
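The documented behavior can be sketched in a few lines: walk the nested dict, joining keys with `sep`, recursing into dict values. This is a minimal reimplementation for illustration, not the library source:

```python
from typing import Any, Dict

# Minimal reimplementation of the documented behavior, for illustration.
def flatten(nested: Dict[str, Any], parent_key: str = "", sep: str = "_") -> Dict[str, Any]:
    flat: Dict[str, Any] = {}
    for key, value in nested.items():
        # Prefix with the parent key, except at the top level.
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat
```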
80c45ed92df2-0
langchain.callbacks.base.AsyncCallbackHandler¶ class langchain.callbacks.base.AsyncCallbackHandler[source]¶ Async callback handler that can be used to handle callbacks from langchain. Attributes ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__() on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id[, ...]) Run when chain ends running. on_chain_error(error, *, run_id[, ...]) Run when chain errors. on_chain_start(serialized, inputs, *, run_id) Run when chain starts running. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, *, run_id[, ...]) Run when LLM ends running. on_llm_error(error, *, run_id[, ...]) Run when LLM errors. on_llm_new_token(token, *, run_id[, ...]) Run on new LLM token. on_llm_start(serialized, prompts, *, run_id) Run when LLM starts running. on_retriever_end(documents, *, run_id[, ...]) Run on retriever end. on_retriever_error(error, *, run_id[, ...]) Run on retriever error.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.base.AsyncCallbackHandler.html
80c45ed92df2-1
Run on retriever error. on_retriever_start(serialized, query, *, run_id) Run on retriever start. on_text(text, *, run_id[, parent_run_id, tags]) Run on arbitrary text. on_tool_end(output, *, run_id[, ...]) Run when tool ends running. on_tool_error(error, *, run_id[, ...]) Run when tool errors. on_tool_start(serialized, input_str, *, run_id) Run when tool starts running. __init__()¶ async on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run on agent action. async on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run on agent end. async on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run when chain ends running. async on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run when chain errors.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.base.AsyncCallbackHandler.html
80c45ed92df2-2
Run when chain errors. async on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None[source]¶ Run when chain starts running. async on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any[source]¶ Run when a chat model starts running. async on_llm_end(response: LLMResult, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run when LLM ends running. async on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run when LLM errors. async on_llm_new_token(token: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run on new LLM token. Only available when streaming is enabled.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.base.AsyncCallbackHandler.html
80c45ed92df2-3
Run on new LLM token. Only available when streaming is enabled. async on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None[source]¶ Run when LLM starts running. async on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run on retriever end. async on_retriever_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run on retriever error. async on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None[source]¶ Run on retriever start. async on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run on arbitrary text. async on_tool_end(output: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run when tool ends running.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.base.AsyncCallbackHandler.html
80c45ed92df2-4
Run when tool ends running. async on_tool_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None[source]¶ Run when tool errors. async on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None[source]¶ Run when tool starts running. Examples using AsyncCallbackHandler¶ Async callbacks
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.base.AsyncCallbackHandler.html
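Every hook above is a coroutine, so a subclass can do real async work (buffering, network I/O) per event. A stand-alone analogue of the most common use, collecting streamed tokens via `on_llm_new_token` (the real base class defines many more hooks and keyword-only `run_id`/`tags` parameters; the stream below is faked for the sketch):

```python
import asyncio
from typing import List

# Stand-alone analogue of an async token handler; not the real base class.
class TokenCollector:
    def __init__(self) -> None:
        self.tokens: List[str] = []

    async def on_llm_new_token(self, token: str, **kwargs) -> None:
        self.tokens.append(token)

async def fake_stream(handler: TokenCollector) -> str:
    # Stand-in for an LLM emitting tokens; each one is awaited on the handler.
    for token in ["Hello", ", ", "world"]:
        await handler.on_llm_new_token(token)
    return "".join(handler.tokens)

collector = TokenCollector()
result = asyncio.run(fake_stream(collector))
```

As the docstring notes, `on_llm_new_token` only fires when streaming is enabled on the model.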
2450641e731b-0
langchain.callbacks.tracers.base.BaseTracer¶ class langchain.callbacks.tracers.base.BaseTracer(**kwargs: Any)[source]¶ Base interface for tracers. Attributes ignore_agent Whether to ignore agent callbacks. ignore_chain Whether to ignore chain callbacks. ignore_chat_model Whether to ignore chat model callbacks. ignore_llm Whether to ignore LLM callbacks. ignore_retriever Whether to ignore retriever callbacks. ignore_retry Whether to ignore retry callbacks. raise_error run_inline Methods __init__(**kwargs) on_agent_action(action, *, run_id[, ...]) Run on agent action. on_agent_finish(finish, *, run_id[, ...]) Run on agent end. on_chain_end(outputs, *, run_id, **kwargs) End a trace for a chain run. on_chain_error(error, *, run_id, **kwargs) Handle an error for a chain run. on_chain_start(serialized, inputs, *, run_id) Start a trace for a chain run. on_chat_model_start(serialized, messages, *, ...) Run when a chat model starts running. on_llm_end(response, *, run_id, **kwargs) End a trace for an LLM run. on_llm_error(error, *, run_id, **kwargs) Handle an error for an LLM run. on_llm_new_token(token, *, run_id[, ...]) Run on new LLM token. on_llm_start(serialized, prompts, *, run_id) Start a trace for an LLM run. on_retriever_end(documents, *, run_id, **kwargs) Run when Retriever ends running.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.base.BaseTracer.html
2450641e731b-1
Run when Retriever ends running. on_retriever_error(error, *, run_id, **kwargs) Run when Retriever errors. on_retriever_start(serialized, query, *, run_id) Run when Retriever starts running. on_retry(retry_state, *, run_id, **kwargs) on_text(text, *, run_id[, parent_run_id]) Run on arbitrary text. on_tool_end(output, *, run_id, **kwargs) End a trace for a tool run. on_tool_error(error, *, run_id, **kwargs) Handle an error for a tool run. on_tool_start(serialized, input_str, *, run_id) Start a trace for a tool run. __init__(**kwargs: Any) → None[source]¶ on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent action. on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ Run on agent end. on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, **kwargs: Any) → None[source]¶ End a trace for a chain run. on_chain_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None[source]¶ Handle an error for a chain run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.base.BaseTracer.html
2450641e731b-2
Handle an error for a chain run. on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, run_type: Optional[str] = None, **kwargs: Any) → None[source]¶ Start a trace for a chain run. on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶ Run when a chat model starts running. on_llm_end(response: LLMResult, *, run_id: UUID, **kwargs: Any) → None[source]¶ End a trace for an LLM run. on_llm_error(error: Union[Exception, KeyboardInterrupt], *, run_id: UUID, **kwargs: Any) → None[source]¶ Handle an error for an LLM run. on_llm_new_token(token: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → None[source]¶ Run on new LLM token. Only available when streaming is enabled. on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, tags: Optional[List[str]] = None, parent_run_id: Optional[UUID] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None[source]¶ Start a trace for an LLM run.
https://api.python.langchain.com/en/latest/callbacks/langchain.callbacks.tracers.base.BaseTracer.html