load_chat_planner#
- langchain_experimental.plan_and_execute.planners.chat_planner.load_chat_planner(llm: BaseLanguageModel, system_prompt: str = "Let's first understand the problem and devise a plan to solve the problem. Please output the plan starting with the header 'Plan:' and then followed by a numbered list of steps. Please make the plan the minimum number of steps required to accurately complete the task. If the task is a question, the final step should almost always be 'Given the above steps taken, please respond to the users original question'. At the end of your plan, say '<END_OF_PLAN>'") → LLMPlanner [source]#
Load a chat planner.
- Parameters:
llm (BaseLanguageModel) – Language model.
system_prompt (str) – System prompt.
- Returns:
LLMPlanner
- Return type:
LLMPlanner
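
A minimal usage sketch. It assumes the langchain-openai package is installed and an OpenAI API key is configured; the model name is illustrative, and any BaseLanguageModel can be passed instead.

```python
from langchain_openai import ChatOpenAI
from langchain_experimental.plan_and_execute import load_chat_planner

# Any chat-capable language model works here; ChatOpenAI is just an example.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# Uses the default system prompt shown above; pass system_prompt=... to override it.
planner = load_chat_planner(llm)

# The planner expects an "input" key and returns a Plan with a list of steps.
plan = planner.plan({"input": "How many days are there between Christmas and New Year's Day?"})
for step in plan.steps:
    print(step.value)
```

The returned LLMPlanner is typically passed, together with an executor, to a PlanAndExecute chain rather than called directly.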