chat_models

Chat Models are a variation on language models.

While Chat Models use language models under the hood, the interface they expose is a bit different. Rather than expose a “text in, text out” API, they expose an interface where “chat messages” are the inputs and outputs.
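For example, a chat model can be invoked with a list of messages and returns a single message. A minimal sketch using the FakeListChatModel listed further down this page (this assumes the module is importable as langchain_community.chat_models and that the message helpers live in langchain_core.messages):

from langchain_community.chat_models.fake import FakeListChatModel
from langchain_core.messages import HumanMessage

# A fake chat model that replays canned responses -- useful for seeing the
# "chat messages in, chat message out" interface without a real provider.
chat = FakeListChatModel(responses=["Hello! How can I help?"])

result = chat.invoke([HumanMessage(content="Hi there")])
print(type(result).__name__)  # AIMessage
print(result.content)         # Hello! How can I help?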

Class hierarchy:

BaseLanguageModel --> BaseChatModel --> <name>  # Examples: ChatOpenAI, ChatGooglePalm
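A new integration plugs into this hierarchy by subclassing BaseChatModel and implementing _generate plus the _llm_type property. The sketch below uses a hypothetical EchoChatModel and assumes the base classes are importable from langchain_core:

from typing import Any, List, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models.chat_models import BaseChatModel
from langchain_core.messages import AIMessage, BaseMessage
from langchain_core.outputs import ChatGeneration, ChatResult


class EchoChatModel(BaseChatModel):
    """Toy chat model that echoes the last message back."""

    @property
    def _llm_type(self) -> str:
        return "echo-chat"

    def _generate(
        self,
        messages: List[BaseMessage],
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> ChatResult:
        # Echo the content of the most recent message as the AI reply.
        reply = AIMessage(content=messages[-1].content)
        return ChatResult(generations=[ChatGeneration(message=reply)])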

Main helpers:

AIMessage, BaseMessage, HumanMessage
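Both concrete message types derive from BaseMessage; a short illustration (assuming they are importable from langchain_core.messages):

from langchain_core.messages import AIMessage, BaseMessage, HumanMessage

# Each message carries its role in .type and its text in .content.
conversation = [
    HumanMessage(content="What is the capital of France?"),
    AIMessage(content="The capital of France is Paris."),
]

assert all(isinstance(m, BaseMessage) for m in conversation)
print([m.type for m in conversation])  # ['human', 'ai']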

Classes

chat_models.anyscale.ChatAnyscale

Anyscale Chat large language models.

chat_models.azureml_endpoint.AzureMLChatOnlineEndpoint

Azure ML Online Endpoint chat models.

chat_models.azureml_endpoint.CustomOpenAIChatContentFormatter()

Chat Content formatter for models with OpenAI like API scheme.

chat_models.azureml_endpoint.LlamaChatContentFormatter()

Deprecated: kept for backwards compatibility.

chat_models.azureml_endpoint.LlamaContentFormatter()

Content formatter for LLaMA.

chat_models.azureml_endpoint.MistralChatContentFormatter()

Content formatter for Mistral.

chat_models.baichuan.ChatBaichuan

Baichuan chat model integration.

chat_models.baidu_qianfan_endpoint.QianfanChatEndpoint

Baidu Qianfan chat model integration.

chat_models.bedrock.ChatPromptAdapter()

Adapter class that prepares LangChain inputs into the prompt format the chat model expects.

chat_models.cloudflare_workersai.ChatCloudflareWorkersAI

Custom chat model for Cloudflare Workers AI.

chat_models.coze.ChatCoze

ChatCoze chat models API by coze.com

chat_models.dappier.ChatDappierAI

Dappier chat large language models.

chat_models.deepinfra.ChatDeepInfra

A chat model that uses the DeepInfra API.

chat_models.deepinfra.ChatDeepInfraException

Exception raised when the DeepInfra API returns an error.

chat_models.edenai.ChatEdenAI

EdenAI chat large language models.

chat_models.everlyai.ChatEverlyAI

EverlyAI Chat large language models.

chat_models.fake.FakeListChatModel

Fake ChatModel for testing purposes.

chat_models.fake.FakeMessagesListChatModel

Fake ChatModel for testing purposes.

chat_models.friendli.ChatFriendli

Friendli LLM for chat.

chat_models.gigachat.GigaChat

GigaChat large language models API.

chat_models.google_palm.ChatGooglePalm

Google PaLM Chat models API.

chat_models.google_palm.ChatGooglePalmError

Error with the Google PaLM API.

chat_models.gpt_router.GPTRouter

GPTRouter by Writesonic Inc.

chat_models.gpt_router.GPTRouterException

Error with the GPTRouter APIs.

chat_models.gpt_router.GPTRouterModel

GPTRouter model.

chat_models.human.HumanInputChatModel

ChatModel which returns user input as the response.

chat_models.hunyuan.ChatHunyuan

Tencent Hunyuan chat models API.

chat_models.javelin_ai_gateway.ChatJavelinAIGateway

Javelin AI Gateway chat models API.

chat_models.javelin_ai_gateway.ChatParams

Parameters for the Javelin AI Gateway LLM.

chat_models.jinachat.JinaChat

Jina AI Chat models API.

chat_models.kinetica.ChatKinetica

Kinetica LLM Chat Model API.

chat_models.kinetica.KineticaSqlOutputParser

Fetch and return data from the Kinetica LLM.

chat_models.kinetica.KineticaSqlResponse

Response containing SQL and the fetched data.

chat_models.kinetica.KineticaUtil()

Kinetica utility functions.

chat_models.konko.ChatKonko

ChatKonko Chat large language models API.

chat_models.litellm.ChatLiteLLM

Chat model that uses the LiteLLM API.

chat_models.litellm.ChatLiteLLMException

Error with the LiteLLM I/O library.

chat_models.litellm_router.ChatLiteLLMRouter

LiteLLM Router as LangChain Model.

chat_models.llama_edge.LlamaEdgeChatService

Chat with LLMs via llama-api-server.

chat_models.llamacpp.ChatLlamaCpp

llama.cpp chat model.

chat_models.maritalk.ChatMaritalk

MariTalk Chat models API.

chat_models.maritalk.MaritalkHTTPError(...)

HTTP error raised by the MariTalk API, carrying the request and response objects.

chat_models.minimax.MiniMaxChat

MiniMax chat model integration.

chat_models.mlflow.ChatMlflow

MLflow chat models API.

chat_models.mlflow_ai_gateway.ChatMLflowAIGateway

MLflow AI Gateway chat models API.

chat_models.mlflow_ai_gateway.ChatParams

Parameters for the MLflow AI Gateway LLM.

chat_models.mlx.ChatMLX

MLX chat models.

chat_models.moonshot.MoonshotChat

Moonshot large language models.

chat_models.naver.ChatClovaX

NCP ClovaStudio Chat Completion API.

chat_models.oci_data_science.ChatOCIModelDeployment

OCI Data Science Model Deployment chat model integration.

chat_models.oci_data_science.ChatOCIModelDeploymentTGI

OCI large language chat models deployed with Text Generation Inference.

chat_models.oci_data_science.ChatOCIModelDeploymentVLLM

OCI large language chat models deployed with vLLM.

chat_models.oci_generative_ai.ChatOCIGenAI

ChatOCIGenAI chat model integration.

chat_models.oci_generative_ai.CohereProvider()

chat_models.oci_generative_ai.MetaProvider()

chat_models.oci_generative_ai.Provider()

chat_models.octoai.ChatOctoAI

OctoAI Chat large language models.

chat_models.pai_eas_endpoint.PaiEasChatEndpoint

Alibaba Cloud PAI-EAS LLM Service chat model API.

chat_models.perplexity.ChatPerplexity

Perplexity AI Chat models API.

chat_models.premai.ChatPremAI

PremAI Chat models.

chat_models.premai.ChatPremAPIError

Error with the PremAI API.

chat_models.promptlayer_openai.PromptLayerChatOpenAI

PromptLayer and OpenAI Chat large language models API.

chat_models.sambanova.ChatSambaNovaCloud

SambaNova Cloud chat model.

chat_models.sambanova.ChatSambaStudio

SambaStudio chat model.

chat_models.snowflake.ChatSnowflakeCortex

Snowflake Cortex-based chat model.

chat_models.snowflake.ChatSnowflakeCortexError

Error with Snowpark client.

chat_models.sparkllm.ChatSparkLLM

IFlyTek Spark chat model integration.

chat_models.symblai_nebula.ChatNebula

Nebula chat large language model - https://docs.symbl.ai/docs/nebula-llm

chat_models.tongyi.ChatTongyi

Alibaba Tongyi Qwen chat model integration.

chat_models.volcengine_maas.VolcEngineMaasChat

Volc Engine MaaS chat model; the service hosts a wide range of models.

chat_models.writer.ChatWriter

Writer chat model.

chat_models.yandex.ChatYandexGPT

YandexGPT large language models.

chat_models.yi.ChatYi

Yi chat models API.

chat_models.yuan2.ChatYuan2

Yuan2.0 Chat models API.

chat_models.zhipuai.ChatZhipuAI

ZhipuAI chat model integration.

Functions

chat_models.anthropic.convert_messages_to_prompt_anthropic(...)

Format a list of messages into a full prompt for the Anthropic model.

chat_models.baichuan.aconnect_httpx_sse(...)

Async context manager for connecting to an SSE stream.

chat_models.baidu_qianfan_endpoint.convert_message_to_dict(message)

Convert a message to a dictionary that can be passed to the API.

chat_models.bedrock.convert_messages_to_prompt_mistral(...)

Convert a list of messages to a prompt for Mistral.

chat_models.cohere.get_cohere_chat_request(...)

Get the request for the Cohere chat API.

chat_models.cohere.get_role(message)

Get the role of the message.

chat_models.fireworks.acompletion_with_retry(...)

Use tenacity to retry the async completion call.

chat_models.fireworks.acompletion_with_retry_streaming(...)

Use tenacity to retry the completion call for streaming.

chat_models.fireworks.completion_with_retry(...)

Use tenacity to retry the completion call.

chat_models.fireworks.conditional_decorator(...)

Define conditional decorator.

chat_models.fireworks.convert_dict_to_message(_dict)

Convert a dict response to a message.

chat_models.friendli.get_chat_request(messages)

Get a request for the Friendli chat API.

chat_models.friendli.get_role(message)

Get role of the message.

chat_models.google_palm.achat_with_retry(...)

Use tenacity to retry the async completion call.

chat_models.google_palm.chat_with_retry(llm, ...)

Use tenacity to retry the completion call.

chat_models.gpt_router.acompletion_with_retry(...)

Use tenacity to retry the async completion call.

chat_models.gpt_router.completion_with_retry(...)

Use tenacity to retry the completion call.

chat_models.gpt_router.get_ordered_generation_requests(...)

Return the body for the model router input.

chat_models.jinachat.acompletion_with_retry(...)

Use tenacity to retry the async completion call.

chat_models.litellm.acompletion_with_retry(llm)

Use tenacity to retry the async completion call.

chat_models.litellm_router.get_llm_output(...)

Get LLM output from usage and params.

chat_models.meta.convert_messages_to_prompt_llama(...)

Convert a list of messages to a prompt for Llama.

chat_models.minimax.aconnect_httpx_sse(...)

Async context manager for connecting to an SSE stream.

chat_models.minimax.connect_httpx_sse(...)

Context manager for connecting to an SSE stream.

chat_models.openai.acompletion_with_retry(llm)

Use tenacity to retry the async completion call.

chat_models.premai.chat_with_retry(llm, ...)

Use tenacity to retry the completion call.

chat_models.premai.create_prem_retry_decorator(llm, *)

Create a retry decorator for PremAI API errors.

chat_models.sparkllm.convert_dict_to_message(_dict)

chat_models.sparkllm.convert_message_to_dict(message)

chat_models.tongyi.convert_dict_to_message(_dict)

Convert a dict to a message.

chat_models.tongyi.convert_message_chunk_to_message(...)

Convert a message chunk to a message.

chat_models.tongyi.convert_message_to_dict(message)

Convert a message to a dict.

chat_models.volcengine_maas.convert_dict_to_message(_dict)

Convert a dict to a message.

chat_models.yandex.acompletion_with_retry(...)

Use tenacity to retry the async completion call.

chat_models.yandex.completion_with_retry(...)

Use tenacity to retry the completion call.

chat_models.yi.aconnect_httpx_sse(client, ...)

chat_models.yuan2.acompletion_with_retry(...)

Use tenacity to retry the async completion call.

chat_models.zhipuai.aconnect_sse(client, ...)

Async context manager for connecting to an SSE stream.

chat_models.zhipuai.connect_sse(client, ...)

Context manager for connecting to an SSE stream.

Deprecated classes