SagemakerEndpointEmbeddings#

class langchain_community.embeddings.sagemaker_endpoint.SagemakerEndpointEmbeddings[source]#

Bases: BaseModel, Embeddings

Custom Sagemaker Inference Endpoints.

To use, you must supply the endpoint name of your deployed Sagemaker model and the AWS region in which it is deployed.

To authenticate, the AWS client automatically loads credentials using the methods described at https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

To use a specific credential profile, pass the name of the profile from the ~/.aws/credentials file.

Make sure the credentials / roles used have the required policies to access the Sagemaker endpoint. See: https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html

Create a new model by parsing and validating input data from keyword arguments.

Raises a pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.
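
A minimal usage sketch (the endpoint name, region, and credentials profile below are placeholders, and the content handler assumes a JSON endpoint whose response contains a "vectors" field; adapt it to your model's actual request and response schema):

    import json
    from typing import Dict, List

    from langchain_community.embeddings import SagemakerEndpointEmbeddings
    from langchain_community.embeddings.sagemaker_endpoint import EmbeddingsContentHandler


    class ContentHandler(EmbeddingsContentHandler):
        content_type = "application/json"
        accepts = "application/json"

        def transform_input(self, prompts: List[str], model_kwargs: Dict) -> bytes:
            # Serialize the input texts (plus any model kwargs) into the request body.
            return json.dumps({"inputs": prompts, **model_kwargs}).encode("utf-8")

        def transform_output(self, output: bytes) -> List[List[float]]:
            # The response body is a streaming object; the "vectors" key is an
            # assumption about this particular model's output schema.
            response_json = json.loads(output.read().decode("utf-8"))
            return response_json["vectors"]


    embeddings = SagemakerEndpointEmbeddings(
        endpoint_name="my-endpoint-name",        # placeholder
        region_name="us-west-2",                 # placeholder
        credentials_profile_name="default",      # placeholder
        content_handler=ContentHandler(),
    )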

param client: Any = None#

The boto3 client used to invoke the Sagemaker endpoint. If not provided externally, one is created from the other parameters.
param content_handler: EmbeddingsContentHandler [Required]#

The content handler class that provides input and output transform functions to convert between the texts passed to the embedder and the request/response format of the endpoint.

param credentials_profile_name: str | None = None#

The name of the profile in the ~/.aws/credentials or ~/.aws/config files, which has either access keys or role information specified. If not specified, the default credential profile or, if on an EC2 instance, credentials from IMDS will be used. See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

param endpoint_kwargs: Dict | None = None#

Optional attributes passed to the invoke_endpoint call. See the boto3 documentation for more info: https://boto3.amazonaws.com/v1/documentation/api/latest/index.html (a sketch of passing these is shown after the parameter list).

param endpoint_name: str = ''#

The name of the endpoint from the deployed Sagemaker model. Must be unique within an AWS Region.

param model_kwargs: Dict | None = None#

Keyword arguments to pass to the model.

param region_name: str = ''#

The AWS region where the Sagemaker model is deployed, e.g. us-west-2.
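
As a sketch of how model_kwargs and endpoint_kwargs might be supplied (the keys shown are illustrative, not required by the library; valid model_kwargs depend on your model, and valid endpoint_kwargs are those accepted by boto3's invoke_endpoint):

    embeddings = SagemakerEndpointEmbeddings(
        endpoint_name="my-endpoint-name",        # placeholder
        region_name="us-west-2",                 # placeholder
        content_handler=ContentHandler(),
        # Forwarded to the model via the content handler's transform_input.
        model_kwargs={"normalize": True},        # illustrative key
        # Forwarded to boto3's invoke_endpoint call.
        endpoint_kwargs={"CustomAttributes": "accept_eula=true"},
    )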

async aembed_documents(texts: list[str]) list[list[float]]#

Asynchronously embed search docs.

Parameters:

texts (list[str]) – List of text to embed.

Returns:

List of embeddings.

Return type:

list[list[float]]
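
A minimal async sketch, assuming the embeddings instance constructed above (unless overridden, the async methods run the synchronous implementation in an executor):

    import asyncio

    async def embed_corpus() -> list[list[float]]:
        return await embeddings.aembed_documents(["first document", "second document"])

    doc_vectors = asyncio.run(embed_corpus())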

async aembed_query(text: str) list[float]#

Asynchronously embed query text.

Parameters:

text (str) – Text to embed.

Returns:

Embedding.

Return type:

list[float]
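
Similarly for a single query string, assuming the same instance:

    async def embed_question() -> list[float]:
        return await embeddings.aembed_query("What is a SageMaker endpoint?")

    query_vector = asyncio.run(embed_question())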

embed_documents(texts: List[str], chunk_size: int = 64) List[List[float]][source]#

Compute doc embeddings using a SageMaker Inference Endpoint.

Parameters:
  • texts (List[str]) – The list of texts to embed.

  • chunk_size (int) – The chunk size defines how many input texts are grouped together in a single request. If None, the chunk size specified by the class will be used.

Returns:

List of embeddings, one for each text.

Return type:

List[List[float]]
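
For example, with the instance sketched above (the texts are placeholders):

    docs = ["SageMaker hosts the model.", "The endpoint serves the embeddings."]
    # Texts are sent to the endpoint in batches of at most chunk_size.
    doc_vectors = embeddings.embed_documents(docs, chunk_size=64)
    print(len(doc_vectors), len(doc_vectors[0]))  # number of texts, embedding dimension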

embed_query(text: str) List[float][source]#

Compute query embeddings using a SageMaker inference endpoint.

Parameters:

text (str) – The text to embed.

Returns:

Embeddings for the text.

Return type:

List[float]
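
For example, with the same instance:

    query_vector = embeddings.embed_query("What is a SageMaker endpoint?")
    print(len(query_vector))  # embedding dimension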

classmethod validate_environment(values: Dict) Dict[source]#

Don't do anything if the client is provided externally; otherwise, create the boto3 client from the configured credentials profile and region.

Parameters:

values (Dict)

Return type:

Dict

Examples using SagemakerEndpointEmbeddings