LlamaChatContentFormatter
- class langchain_community.chat_models.azureml_endpoint.LlamaChatContentFormatter
Deprecated: Kept for backwards compatibility
Chat Content formatter for Llama.
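A minimal usage sketch, assuming a deployed Azure ML dedicated chat endpoint; the endpoint URL and key below are placeholders, and `AzureMLChatOnlineEndpoint` and `AzureMLEndpointApiType` come from the same module:

```python
# Sketch only: wire the (deprecated) formatter into an AzureMLChatOnlineEndpoint.
from langchain_community.chat_models.azureml_endpoint import (
    AzureMLChatOnlineEndpoint,
    AzureMLEndpointApiType,
    LlamaChatContentFormatter,
)
from langchain_core.messages import HumanMessage

chat = AzureMLChatOnlineEndpoint(
    endpoint_url="https://<endpoint-name>.<region>.inference.ml.azure.com/score",  # placeholder
    endpoint_api_type=AzureMLEndpointApiType.dedicated,
    endpoint_api_key="<api-key>",  # placeholder
    content_formatter=LlamaChatContentFormatter(),
)
response = chat.invoke([HumanMessage(content="Hello, Llama!")])
```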
Attributes
- SUPPORTED_ROLES
- accepts: The MIME type of the response data returned from the endpoint.
- content_type: The MIME type of the input data passed to the endpoint.
- format_error_msg
- supported_api_types: Supported APIs for the given formatter.
Methods
- __init__()
- escape_special_characters(prompt): Escapes any special characters in the prompt.
- format_messages_request_payload(messages, ...): Formats the request according to the chosen API.
- format_request_payload(prompt, model_kwargs): Formats the request body according to the input schema of the model.
- format_response_payload(output[, api_type]): Formats the response.
- static escape_special_characters(prompt: str) → str
Escapes any special characters in the prompt.
- Parameters:
prompt (str)
- Return type:
str
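A small, illustrative call to the static helper (the input string is made up; the exact set of escaped characters is determined by the formatter):

```python
from langchain_community.chat_models.azureml_endpoint import LlamaChatContentFormatter

# Escape a user-supplied prompt before it is embedded in a JSON request body.
raw_prompt = 'She said "hello"\nand left.'
safe_prompt = LlamaChatContentFormatter.escape_special_characters(raw_prompt)
```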
- format_messages_request_payload(messages: List[BaseMessage], model_kwargs: Dict, api_type: AzureMLEndpointApiType) → bytes
Formats the request according to the chosen API.
- Parameters:
messages (List[BaseMessage])
model_kwargs (Dict)
api_type (AzureMLEndpointApiType)
- Return type:
bytes
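A hedged sketch of building a request body directly; the `model_kwargs` keys shown are illustrative and depend on the deployed model:

```python
from langchain_community.chat_models.azureml_endpoint import (
    AzureMLEndpointApiType,
    LlamaChatContentFormatter,
)
from langchain_core.messages import HumanMessage, SystemMessage

formatter = LlamaChatContentFormatter()
body = formatter.format_messages_request_payload(
    messages=[
        SystemMessage(content="You are a terse assistant."),
        HumanMessage(content="Summarize LangChain in one sentence."),
    ],
    model_kwargs={"temperature": 0.2, "max_new_tokens": 128},  # illustrative kwargs
    api_type=AzureMLEndpointApiType.dedicated,
)
# `body` is a bytes payload matching the formatter's content_type.
```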
- format_request_payload(prompt: str, model_kwargs: Dict, api_type: AzureMLEndpointApiType = AzureMLEndpointApiType.dedicated) → Any
Formats the request body according to the input schema of the model. Returns bytes or a seekable file-like object in the format specified in the content_type request header.
- Parameters:
prompt (str)
model_kwargs (Dict)
api_type (AzureMLEndpointApiType)
- Return type:
Any
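This prompt-based variant mirrors the generic content-formatter contract. The sketch below illustrates that contract by overriding the method in a hypothetical subclass; `MyPromptFormatter` and the JSON body shape are assumptions, not part of the library:

```python
import json
from typing import Any, Dict

from langchain_community.chat_models.azureml_endpoint import (
    AzureMLEndpointApiType,
    LlamaChatContentFormatter,
)


class MyPromptFormatter(LlamaChatContentFormatter):
    """Hypothetical subclass: turn a prompt string plus model_kwargs into a
    bytes body matching the formatter's content_type."""

    def format_request_payload(
        self,
        prompt: str,
        model_kwargs: Dict,
        api_type: AzureMLEndpointApiType = AzureMLEndpointApiType.dedicated,
    ) -> Any:
        escaped = self.escape_special_characters(prompt)
        # Illustrative body shape; the real schema depends on the deployment.
        return json.dumps(
            {"input_data": {"input_string": [escaped], "parameters": model_kwargs}}
        ).encode("utf-8")
```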
- format_response_payload(output: bytes, api_type: AzureMLEndpointApiType = AzureMLEndpointApiType.dedicated) → ChatGeneration
Formats the response.
- Parameters:
output (bytes)
api_type (AzureMLEndpointApiType)
- Return type:
ChatGeneration
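A hedged sketch of decoding a raw endpoint response; the JSON shape of `raw` is illustrative, since the actual schema depends on the deployed endpoint and the `api_type`:

```python
from langchain_community.chat_models.azureml_endpoint import (
    AzureMLEndpointApiType,
    LlamaChatContentFormatter,
)

formatter = LlamaChatContentFormatter()
raw = b'{"output": "Hello! How can I help you today?"}'  # illustrative response body
generation = formatter.format_response_payload(
    raw, api_type=AzureMLEndpointApiType.dedicated
)
print(generation.message.content)  # the ChatGeneration carries the assistant message
```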