Provider#
- class langchain_community.chat_models.oci_generative_ai.Provider[source]#
Attributes
stop_sequence_key
Methods
__init__()
chat_generation_info(response)
chat_response_to_text(response)
chat_stream_generation_info(event_data)
chat_stream_to_text(event_data)
convert_to_oci_tool(tool)
get_role(message)
is_chat_stream_end(event_data)
messages_to_oci_params(messages, **kwargs)
- __init__()#
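Below is a minimal sketch of a hypothetical concrete subclass, assuming the abstract surface listed above is complete. The class name `ExampleProvider`, the role strings, the `stop_sequence_key` value, and the `"text"`/`"finish_reason"` payload fields are illustrative assumptions, not the structure used by the library's real providers.

```python
from __future__ import annotations

from typing import Any, Callable, Dict, Type

from langchain_core.messages import AIMessage, BaseMessage, HumanMessage
from langchain_core.tools import BaseTool
from langchain_core.utils.function_calling import convert_to_openai_function
from pydantic import BaseModel

from langchain_community.chat_models.oci_generative_ai import Provider


class ExampleProvider(Provider):
    """Toy provider that reads text out of dict-shaped payloads (assumed shape)."""

    stop_sequence_key = "stop_sequences"  # assumed value for the attribute

    def chat_generation_info(self, response: Any) -> Dict[str, Any]:
        # Surface whatever metadata the (assumed) response payload carries.
        return {"finish_reason": response.get("finish_reason")}

    def chat_response_to_text(self, response: Any) -> str:
        return response.get("text", "")

    def chat_stream_generation_info(self, event_data: Dict) -> Dict[str, Any]:
        return {"finish_reason": event_data.get("finish_reason")}

    def chat_stream_to_text(self, event_data: Dict) -> str:
        return event_data.get("text", "")

    def convert_to_oci_tool(
        self, tool: Dict[str, Any] | Type[BaseModel] | Callable | BaseTool
    ) -> Dict[str, Any]:
        # Stand-in: reuse LangChain's generic schema converter.
        return convert_to_openai_function(tool)

    def get_role(self, message: BaseMessage) -> str:
        if isinstance(message, HumanMessage):
            return "USER"
        if isinstance(message, AIMessage):
            return "CHATBOT"
        return "SYSTEM"

    def is_chat_stream_end(self, event_data: Dict) -> bool:
        return event_data.get("finish_reason") is not None

    def messages_to_oci_params(self, messages: Any, **kwargs: Any) -> Dict[str, Any]:
        # Toy behavior: flatten the conversation into a single prompt string.
        return {"message": "\n".join(str(m.content) for m in messages)}
```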
- abstract chat_generation_info(response: Any) → Dict[str, Any] [source]#
- Parameters:
response (Any) –
- Return type:
Dict[str, Any]
- abstract chat_response_to_text(response: Any) → str [source]#
- Parameters:
response (Any) –
- Return type:
str
- abstract chat_stream_generation_info(event_data: Dict) → Dict[str, Any] [source]#
- Parameters:
event_data (Dict) –
- Return type:
Dict[str, Any]
- abstract chat_stream_to_text(event_data: Dict) → str [source]#
- Parameters:
event_data (Dict) –
- Return type:
str
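As a sketch of how the streaming hooks fit together, the loop below consumes mock event dicts through the hypothetical `ExampleProvider` defined earlier; the `"text"` and `"finish_reason"` fields are illustrative, not the real OCI event schema.

```python
provider = ExampleProvider()
events = [{"text": "Hel"}, {"text": "lo"}, {"finish_reason": "COMPLETE"}]

chunks, info = [], {}
for event in events:
    if provider.is_chat_stream_end(event):
        info = provider.chat_stream_generation_info(event)
        break
    chunks.append(provider.chat_stream_to_text(event))

print("".join(chunks))  # Hello
print(info)             # {'finish_reason': 'COMPLETE'}
```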
- abstract convert_to_oci_tool(tool: Dict[str, Any] | Type[BaseModel] | Callable | BaseTool) → Dict[str, Any] [source]#
- Parameters:
tool (Dict[str, Any] | Type[BaseModel] | Callable | BaseTool) –
- Return type:
Dict[str, Any]
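A hypothetical usage of this hook, via the `ExampleProvider` sketch above (which delegates to LangChain's generic `convert_to_openai_function` helper); the `GetWeather` schema is purely illustrative.

```python
from pydantic import BaseModel, Field


class GetWeather(BaseModel):
    """Look up the current weather for a city."""

    city: str = Field(..., description="City name, e.g. 'Berlin'")


oci_tool = ExampleProvider().convert_to_oci_tool(GetWeather)
print(oci_tool["name"])  # GetWeather
```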
- abstract get_role(message: BaseMessage) → str [source]#
- Parameters:
message (BaseMessage) –
- Return type:
str
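For illustration, the role strings below come from the hypothetical `ExampleProvider` above; the library's real providers may map message types to different role names.

```python
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

provider = ExampleProvider()
print(provider.get_role(HumanMessage(content="Hi")))         # USER
print(provider.get_role(AIMessage(content="Hello!")))        # CHATBOT
print(provider.get_role(SystemMessage(content="Be brief")))  # SYSTEM
```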