create_vectorstore_router_agent
- langchain.agents.agent_toolkits.vectorstore.base.create_vectorstore_router_agent(llm: BaseLanguageModel, toolkit: VectorStoreRouterToolkit, callback_manager: BaseCallbackManager | None = None, prefix: str = 'You are an agent designed to answer questions.\nYou have access to tools for interacting with different sources, and the inputs to the tools are questions.\nYour main task is to decide which of the tools is relevant for answering question at hand.\nFor complex questions, you can break the question down into sub questions and use tools to answers the sub questions.\n', verbose: bool = False, agent_executor_kwargs: Dict[str, Any] | None = None, **kwargs: Any) → AgentExecutor
Deprecated since version 0.2.13: This function will continue to be supported, but it is recommended that new use cases be built with LangGraph. LangGraph offers a more flexible and full-featured framework for building agents, including support for tool calling, persistence of state, and human-in-the-loop workflows. See the API reference for this function for a replacement implementation: https://api.python.langchain.com/en/latest/agents/langchain.agents.agent_toolkits.vectorstore.base.create_vectorstore_router_agent.html Read more on how to create agents that query vector stores here: https://python.langchain.com/docs/how_to/qa_chat_history_how_to/#agents This function will be removed in version 1.0.
Construct a VectorStore router agent from an LLM and tools.
Note: this function is deprecated. See below for a replacement that uses tool-calling methods and LangGraph. Install LangGraph with:
pip install -U langgraph
from langchain_core.tools import create_retriever_tool
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langgraph.prebuilt import create_react_agent

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

pet_vector_store = InMemoryVectorStore.from_texts(
    [
        "Dogs are great companions, known for their loyalty and friendliness.",
        "Cats are independent pets that often enjoy their own space.",
    ],
    OpenAIEmbeddings(),
)

food_vector_store = InMemoryVectorStore.from_texts(
    [
        "Carrots are orange and delicious.",
        "Apples are red and delicious.",
    ],
    OpenAIEmbeddings(),
)

tools = [
    create_retriever_tool(
        pet_vector_store.as_retriever(),
        "pet_information_retriever",
        "Fetches information about pets.",
    ),
    create_retriever_tool(
        food_vector_store.as_retriever(),
        "food_information_retriever",
        "Fetches information about food.",
    ),
]

agent = create_react_agent(llm, tools)

for step in agent.stream(
    {"messages": [("human", "Tell me about carrots.")]},
    stream_mode="values",
):
    step["messages"][-1].pretty_print()
- Parameters:
llm (BaseLanguageModel) – LLM that will be used by the agent
toolkit (VectorStoreRouterToolkit) – Set of tools for the agent, providing routing capability across multiple vector stores
callback_manager (Optional[BaseCallbackManager], optional) – Object to handle callbacks. Defaults to None.
prefix (str, optional) – The prefix prompt for the router agent. If not provided, the default ROUTER_PREFIX is used.
verbose (bool, optional) – Whether to show the content of the agent's scratchpad. Defaults to False.
agent_executor_kwargs (Optional[Dict[str, Any]], optional) – Additional keyword arguments to pass to the AgentExecutor. Defaults to None.
kwargs (Any) – Additional named parameters to pass to the ZeroShotAgent.
- Returns:
A callable AgentExecutor object. You can either call it directly or use its run method with the query to get a response.
- Return type:
AgentExecutor
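The following is a minimal sketch of how this deprecated function is typically wired up, assuming an OpenAI chat model and two in-memory vector stores; the store contents, names, and descriptions are illustrative, and the import paths reflect a recent langchain package layout rather than anything stated above.

from langchain.agents.agent_toolkits import (
    VectorStoreInfo,
    VectorStoreRouterToolkit,
    create_vectorstore_router_agent,
)
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
embeddings = OpenAIEmbeddings()

# Two illustrative vector stores for the router agent to choose between.
pet_store = InMemoryVectorStore.from_texts(
    ["Dogs are loyal companions.", "Cats enjoy their own space."],
    embeddings,
)
food_store = InMemoryVectorStore.from_texts(
    ["Carrots are orange and delicious.", "Apples are red and delicious."],
    embeddings,
)

# Wrap each store with a name and description so the router agent
# can decide which store is relevant to a given question.
toolkit = VectorStoreRouterToolkit(
    vectorstores=[
        VectorStoreInfo(
            name="pet_information",
            description="Information about pets.",
            vectorstore=pet_store,
        ),
        VectorStoreInfo(
            name="food_information",
            description="Information about food.",
            vectorstore=food_store,
        ),
    ],
    llm=llm,
)

agent_executor = create_vectorstore_router_agent(
    llm=llm,
    toolkit=toolkit,
    verbose=True,
)
agent_executor.run("Tell me about carrots.")

For new code, prefer the LangGraph example shown earlier; this sketch is only meant to clarify how the llm, toolkit, and verbose parameters documented above fit together.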