ZepMemory#

class langchain_community.memory.zep_memory.ZepMemory[source]#

Bases: ConversationBufferMemory

Persist your chain history to the Zep MemoryStore.

The number of messages returned by Zep and when the Zep server summarizes chat histories is configurable. See the Zep documentation for more details.

Documentation: https://docs.getzep.com

Example


memory = ZepMemory(
    session_id=session_id,      # Identifies your user or a user's session
    url=ZEP_API_URL,            # Your Zep server's URL
    api_key=<your_api_key>,     # Optional
    memory_key="history",       # Ensure this matches the key used in
                                # chain's prompt template
    return_messages=True,       # Does your prompt template expect a string
                                # or a list of Messages?
)

chain = LLMChain(memory=memory, ...)  # Configure your chain to use
                                      # the ZepMemory instance

Note

To persist metadata alongside your chat history, you will need to create a custom Chain class that overrides the prep_outputs method to include the metadata in the call to self.memory.save_context.
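A minimal sketch of such a subclass, assuming an LLMChain-based chain; the ZepMetadataChain name and its metadata field are illustrative and not part of langchain:

from typing import Any, Dict, Optional

from langchain.chains import LLMChain


class ZepMetadataChain(LLMChain):
    """Hypothetical LLMChain subclass that forwards metadata to ZepMemory."""

    # Set this before invoking the chain; not a standard langchain field.
    metadata: Optional[Dict[str, Any]] = None

    def prep_outputs(
        self,
        inputs: Dict[str, str],
        outputs: Dict[str, str],
        return_only_outputs: bool = False,
    ) -> Dict[str, str]:
        self._validate_outputs(outputs)
        if self.memory is not None:
            # ZepMemory.save_context accepts an extra ``metadata`` argument.
            self.memory.save_context(inputs, outputs, metadata=self.metadata)
        if return_only_outputs:
            return outputs
        return {**inputs, **outputs}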

Zep - Fast, scalable building blocks for LLM Apps#

Zep is an open source platform for productionizing LLM apps. Go from a prototype built in LangChain or LlamaIndex, or a custom app, to production in minutes without rewriting code.

For server installation instructions and more, see: https://docs.getzep.com/deployment/quickstart/

For more information on the zep-python package, see: getzep/zep-python

Initialize ZepMemory.

param session_id:

Identifies your user or a user’s session

type session_id:

str

param url:

Your Zep server’s URL. Defaults to “http://localhost:8000”.

type url:

str, optional

param api_key:

Your Zep API key. Defaults to None.

type api_key:

Optional[str], optional

param output_key:

The key to use for the output message. Defaults to None.

type output_key:

Optional[str], optional

param input_key:

The key to use for the input message. Defaults to None.

type input_key:

Optional[str], optional

param return_messages:

Does your prompt template expect a string or a list of Messages? Defaults to False, i.e. a string is returned.

type return_messages:

bool, optional

param human_prefix:

The prefix to use for human messages. Defaults to “Human”.

type human_prefix:

str, optional

param ai_prefix:

The prefix to use for AI messages. Defaults to “AI”.

type ai_prefix:

str, optional

param memory_key:

The key to use for the memory. Defaults to “history”. Ensure that this matches the key used in chain’s prompt template.

type memory_key:

str, optional
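For reference, a minimal initialization sketch; the session id, the local server URL, and the ZEP_API_KEY environment variable are assumptions for this example:

import os

from langchain_community.memory.zep_memory import ZepMemory

memory = ZepMemory(
    session_id="user-1234",                 # one Zep session per user or conversation
    url="http://localhost:8000",            # the default Zep server URL
    api_key=os.environ.get("ZEP_API_KEY"),  # optional; omit if auth is disabled
    memory_key="history",                   # must match the prompt template variable
    return_messages=True,                   # return Messages rather than a string
)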

param ai_prefix: str = 'AI'#
param chat_memory: ZepChatMessageHistory [Required]#
param human_prefix: str = 'Human'#
param input_key: str | None = None#
param output_key: str | None = None#
param return_messages: bool = False#
async abuffer() Any#

String buffer of memory.

Return type:

Any

async abuffer_as_messages() List[BaseMessage]#

Exposes the buffer as a list of messages in case return_messages is False.

Return type:

List[BaseMessage]

async abuffer_as_str() str#

Exposes the buffer as a string in case return_messages is True.

Return type:

str

async aclear() None#

Clear memory contents.

Return type:

None

async aload_memory_variables(inputs: Dict[str, Any]) Dict[str, Any]#

Return key-value pairs given the text input to the chain.

Parameters:

inputs (Dict[str, Any]) –

Return type:

Dict[str, Any]

async asave_context(inputs: Dict[str, Any], outputs: Dict[str, str]) None#

Save context from this conversation to buffer.

Parameters:
  • inputs (Dict[str, Any]) –

  • outputs (Dict[str, str]) –

Return type:

None
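A minimal async usage sketch, assuming a ZepMemory instance named memory (such as the one constructed above) and made-up conversation content:

import asyncio

from langchain_community.memory.zep_memory import ZepMemory


async def record_turn(memory: ZepMemory) -> None:
    # Persist one exchange to Zep, then read the history back.
    await memory.asave_context(
        {"input": "What is Zep?"},
        {"output": "Zep is a memory store for LLM apps."},
    )
    variables = await memory.aload_memory_variables({})
    print(variables["history"])  # the default memory_key


asyncio.run(record_turn(memory))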

clear() None#

Clear memory contents.

Return type:

None

load_memory_variables(inputs: Dict[str, Any]) Dict[str, Any]#

Return history buffer.

Parameters:

inputs (Dict[str, Any]) –

Return type:

Dict[str, Any]
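For example, assuming the memory instance constructed earlier and the default memory_key of "history":

variables = memory.load_memory_variables({})
history = variables["history"]  # a string, or a list of messages if return_messages=True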

save_context(inputs: Dict[str, Any], outputs: Dict[str, str], metadata: Dict[str, Any] | None = None) None[source]#

Save context from this conversation to buffer.

Parameters:
  • inputs (Dict[str, Any]) – The inputs to the chain.

  • outputs (Dict[str, str]) – The outputs from the chain.

  • metadata (Optional[Dict[str, Any]], optional) – Any metadata to save with the context. Defaults to None.

Returns:

None

Return type:

None
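A usage sketch; the input, output, and metadata values are purely illustrative:

memory.save_context(
    {"input": "Recommend a sci-fi novel."},
    {"output": "You might enjoy The Dispossessed by Ursula K. Le Guin."},
    metadata={"source": "recommendation_flow", "user_tier": "premium"},
)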

property buffer: Any#

String buffer of memory.

property buffer_as_messages: List[BaseMessage]#

Exposes the buffer as a list of messages in case return_messages is False.

property buffer_as_str: str#

Exposes the buffer as a string in case return_messages is True.
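For example, with the memory instance from above:

print(memory.buffer_as_str)                 # the history as one formatted string
for message in memory.buffer_as_messages:   # the history as BaseMessage objects
    print(type(message).__name__, message.content)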
