get_usage_metadata_callback
langchain_core.callbacks.usage.get_usage_metadata_callback(name: str = 'usage_metadata_callback') → Generator[UsageMetadataCallbackHandler, None, None]
Beta
This feature is in beta. It is actively being worked on, so the API may change.
Get a context manager for tracking usage metadata across chat model calls using AIMessage.usage_metadata.
Parameters:
name (str) – The name of the context variable. Defaults to "usage_metadata_callback".
Return type:
Generator[UsageMetadataCallbackHandler, None, None]
Example
from langchain.chat_models import init_chat_model
from langchain_core.callbacks import get_usage_metadata_callback

llm_1 = init_chat_model(model="openai:gpt-4o-mini")
llm_2 = init_chat_model(model="anthropic:claude-3-5-haiku-latest")

with get_usage_metadata_callback() as cb:
    llm_1.invoke("Hello")
    llm_2.invoke("Hello")
    print(cb.usage_metadata)
{'gpt-4o-mini-2024-07-18': {'input_tokens': 8, 'output_tokens': 10, 'total_tokens': 18,
                            'input_token_details': {'audio': 0, 'cache_read': 0},
                            'output_token_details': {'audio': 0, 'reasoning': 0}},
 'claude-3-5-haiku-20241022': {'input_tokens': 8, 'output_tokens': 21, 'total_tokens': 29,
                               'input_token_details': {'cache_read': 0, 'cache_creation': 0}}}
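The handler yielded by the context manager keys usage by model name, so totals can be combined after the block exits. The following is a minimal sketch, not part of the reference output above, that passes a custom name for the context variable ("my_usage_tracker" is an arbitrary illustrative choice) and sums total_tokens across all models; model identifiers and token counts will vary by provider.

from langchain.chat_models import init_chat_model
from langchain_core.callbacks import get_usage_metadata_callback

llm = init_chat_model(model="openai:gpt-4o-mini")

# Track usage under a custom context-variable name ("my_usage_tracker" is illustrative).
with get_usage_metadata_callback(name="my_usage_tracker") as cb:
    llm.invoke("Hello")
    llm.invoke("How are you?")

# cb.usage_metadata maps model names to per-model usage dicts;
# sum the total_tokens field across all models seen inside the block.
total_tokens = sum(entry["total_tokens"] for entry in cb.usage_metadata.values())
print(total_tokens)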
Added in version 0.3.49.