chat_with_retry#
langchain_community.chat_models.premai.chat_with_retry(llm: ChatPremAI, project_id: int, messages: List[dict], stream: bool = False, run_manager: CallbackManagerForLLMRun | None = None, **kwargs: Any) → Any
Uses tenacity to retry the completion call.
Parameters:
- llm (ChatPremAI)
- project_id (int)
- messages (List[dict])
- stream (bool)
- run_manager (CallbackManagerForLLMRun | None)
- kwargs (Any)
Return type:
- Any
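The retry behavior this helper provides can be sketched with a plain-Python decorator. The real implementation wraps the Prem completion call with tenacity (exponential backoff, stop after a fixed number of attempts); the minimal sketch below reproduces that pattern with only the standard library. The `flaky_completion` function and its failure schedule are hypothetical stand-ins for a transient API error, not part of the LangChain API.

```python
import time
from typing import Any, Callable


def with_retry(max_attempts: int = 3, base_wait: float = 0.01) -> Callable:
    """Retry decorator with exponential backoff: a stand-in for tenacity's
    @retry(stop=stop_after_attempt(...), wait=wait_exponential(...))."""
    def decorator(func: Callable[..., Any]) -> Callable[..., Any]:
        def wrapped(*args: Any, **kwargs: Any) -> Any:
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts:
                        raise  # out of attempts: surface the original error
                    time.sleep(base_wait * 2 ** (attempt - 1))
        return wrapped
    return decorator


calls = {"n": 0}


@with_retry(max_attempts=3)
def flaky_completion(messages: list) -> dict:
    # Hypothetical completion call that fails twice before succeeding,
    # simulating transient network or rate-limit errors.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient API error")
    return {"role": "assistant", "content": "ok"}


result = flaky_completion([{"role": "user", "content": "hi"}])
```

Because the first two calls raise, the decorator sleeps briefly and retries; the third attempt succeeds and its result is returned to the caller.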