create_vectorstore_agent

langchain.agents.agent_toolkits.vectorstore.base.create_vectorstore_agent(llm: BaseLanguageModel, toolkit: VectorStoreToolkit, callback_manager: BaseCallbackManager | None = None, prefix: str = 'You are an agent designed to answer questions about sets of documents.\nYou have access to tools for interacting with the documents, and the inputs to the tools are questions.\nSometimes, you will be asked to provide sources for your questions, in which case you should use the appropriate tool to do so.\nIf the question does not seem relevant to any of the tools provided, just return "I don\'t know" as the answer.\n', verbose: bool = False, agent_executor_kwargs: Dict[str, Any] | None = None, **kwargs: Any) → AgentExecutor

Deprecated since version 0.2.13: See the API reference for this function for a replacement implementation: https://api.python.langchain.com/en/latest/agents/langchain.agents.agent_toolkits.vectorstore.base.create_vectorstore_agent.html. Read more on how to create agents that query vector stores: https://python.langchain.com/v0.2/docs/how_to/qa_chat_history_how_to/#agents

Construct a VectorStore agent from an LLM and tools.

Note: this function is deprecated. See below for a replacement that uses tool-calling methods and LangGraph. Install LangGraph with:

pip install -U langgraph

from langchain_core.tools import create_retriever_tool
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langgraph.prebuilt import create_react_agent

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

vector_store = InMemoryVectorStore.from_texts(
    [
        "Dogs are great companions, known for their loyalty and friendliness.",
        "Cats are independent pets that often enjoy their own space.",
    ],
    OpenAIEmbeddings(),
)

tool = create_retriever_tool(
    vector_store.as_retriever(),
    "pet_information_retriever",
    "Fetches information about pets.",
)

agent = create_react_agent(llm, [tool])

for step in agent.stream(
    {"messages": [("human", "What are dogs known for?")]},
    stream_mode="values",
):
    step["messages"][-1].pretty_print()

Parameters:
  • llm (BaseLanguageModel) – LLM that will be used by the agent.

  • toolkit (VectorStoreToolkit) – Set of tools for the agent.

  • callback_manager (Optional[BaseCallbackManager], optional) – Object to handle callbacks. Defaults to None.

  • prefix (str, optional) – The prefix prompt for the agent. If not provided, uses the default PREFIX.

  • verbose (bool, optional) – Whether to show the contents of the agent's scratchpad. Defaults to False.

  • agent_executor_kwargs (Optional[Dict[str, Any]], optional) – Any additional keyword arguments to pass to the AgentExecutor. Defaults to None.

  • kwargs (Any) – Additional named parameters to pass to the ZeroShotAgent.

Returns:

A callable AgentExecutor object. You can either call it directly or use its run method with a query to get a response.

Return type:

AgentExecutor
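
For reference, the snippet below is a minimal sketch of how this deprecated function is typically wired together: a vector store is wrapped in a VectorStoreInfo and a VectorStoreToolkit, which is then passed to create_vectorstore_agent. The import paths for VectorStoreInfo and VectorStoreToolkit, the model name, and the example texts are illustrative assumptions and may vary by LangChain version; prefer the LangGraph example above for new code.

from langchain.agents.agent_toolkits.vectorstore.base import create_vectorstore_agent
from langchain.agents.agent_toolkits.vectorstore.toolkit import (
    VectorStoreInfo,
    VectorStoreToolkit,
)
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# Small in-memory store standing in for a real document collection.
vector_store = InMemoryVectorStore.from_texts(
    [
        "Dogs are great companions, known for their loyalty and friendliness.",
        "Cats are independent pets that often enjoy their own space.",
    ],
    OpenAIEmbeddings(),
)

# Describe the vector store so the toolkit can build QA tools over it.
# The name and description here are illustrative.
vectorstore_info = VectorStoreInfo(
    name="pet_information",
    description="Information about pets",
    vectorstore=vector_store,
)
toolkit = VectorStoreToolkit(vectorstore_info=vectorstore_info, llm=llm)

# verbose=True prints the agent's scratchpad; extra executor options can be
# passed via agent_executor_kwargs.
agent_executor = create_vectorstore_agent(llm=llm, toolkit=toolkit, verbose=True)
print(agent_executor.run("What are dogs known for?"))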