ConversationTokenBufferMemory#
- class langchain.memory.token_buffer.ConversationTokenBufferMemory[source]#
Bases:
BaseChatMemory
Deprecated since version 0.3.1: Please see the migration guide at: https://python.langchain.com/docs/versions/migrating_memory/
Conversation chat memory with token limit.
Keeps only the most recent messages in the conversation, under the constraint that the total number of tokens in the buffer does not exceed max_token_limit (see the usage sketch after the parameter list below).
- param ai_prefix: str = 'AI'#
- param chat_memory: BaseChatMessageHistory [Optional]#
- param human_prefix: str = 'Human'#
- param input_key: str | None = None#
- param llm: BaseLanguageModel [Required]#
- param max_token_limit: int = 2000#
- param memory_key: str = 'history'#
- param output_key: str | None = None#
- param return_messages: bool = False#
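A minimal usage sketch of the default string-buffer behaviour. The class is deprecated (see the migration guide above), so this mirrors the legacy API. The ChatOpenAI model and the langchain-openai package are assumptions; any BaseLanguageModel that the class can use to count tokens should work in its place.

```python
# Hedged sketch: the model name and provider package are illustrative, not prescriptive.
from langchain_openai import ChatOpenAI  # assumes langchain-openai is installed
from langchain.memory import ConversationTokenBufferMemory

llm = ChatOpenAI(model="gpt-4o-mini")  # used for token counting as well as generation
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=100)

# Each call appends a Human/AI exchange to the underlying chat history.
memory.save_context({"input": "Hi, my name is Ada."}, {"output": "Nice to meet you, Ada!"})
memory.save_context({"input": "What can you help with?"}, {"output": "Questions, summaries, and more."})

# With return_messages=False (the default), the history is a single formatted string.
# Older exchanges are dropped once the buffer would exceed max_token_limit.
print(memory.load_memory_variables({})["history"])
```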
- async aclear() → None#
Clear memory contents.
- Return type:
None
- async aload_memory_variables(inputs: dict[str, Any]) → dict[str, Any]#
Asynchronously return key-value pairs given the text input to the chain.
- Parameters:
inputs (dict[str, Any]) – The inputs to the chain.
- Returns:
A dictionary of key-value pairs.
- Return type:
dict[str, Any]
- async asave_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None#
Save context from this conversation to the buffer.
- Parameters:
inputs (Dict[str, Any])
outputs (Dict[str, str])
- Return type:
None
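A sketch of the async counterparts documented above. asave_context, aload_memory_variables, and aclear mirror their synchronous versions; the model choice is again an assumption.

```python
import asyncio

from langchain_openai import ChatOpenAI  # assumed dependency, as in the earlier sketch
from langchain.memory import ConversationTokenBufferMemory


async def demo() -> None:
    llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model choice
    memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=100)

    # Async save and load behave like save_context / load_memory_variables.
    await memory.asave_context({"input": "Hello"}, {"output": "Hi! How can I help?"})
    print((await memory.aload_memory_variables({}))["history"])

    # Drop every stored message from the underlying chat history.
    await memory.aclear()


asyncio.run(demo())
```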
- clear() → None#
Clear memory contents.
- Return type:
None
- load_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any][source]#
Return history buffer.
- Parameters:
inputs (Dict[str, Any])
- Return type:
Dict[str, Any]
- save_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None[source]#
Save context from this conversation to the buffer, pruning the oldest messages once the total token count exceeds max_token_limit.
- Parameters:
inputs (Dict[str, Any])
outputs (Dict[str, str])
- Return type:
None
- property buffer: Any#
Buffer of memory: a list of messages if return_messages is True, otherwise a formatted string.
- property buffer_as_messages: List[BaseMessage]#
Exposes the buffer as a list of messages, used when return_messages is True.
- property buffer_as_str: str#
Exposes the buffer as a string, used when return_messages is False.
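A short sketch of the two buffer views, again with the model choice as an assumption. With return_messages=True, load_memory_variables and the buffer property yield message objects, while buffer_as_str always renders the Human/AI-prefixed string form.

```python
from langchain_openai import ChatOpenAI  # assumed dependency, as above
from langchain.memory import ConversationTokenBufferMemory

llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model choice
memory = ConversationTokenBufferMemory(
    llm=llm,
    max_token_limit=100,
    return_messages=True,  # history is returned as message objects
)
memory.save_context({"input": "Hi"}, {"output": "Hello!"})

messages = memory.buffer_as_messages  # e.g. [HumanMessage("Hi"), AIMessage("Hello!")]
text = memory.buffer_as_str           # e.g. "Human: Hi\nAI: Hello!"
history = memory.load_memory_variables({})["history"]  # message objects, like `messages`
```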