GoogleGenerativeAIEmbeddings#

class langchain_google_genai.embeddings.GoogleGenerativeAIEmbeddings[source]#

Bases: BaseModel, Embeddings

Google Generative AI Embeddings.

To use, you must either:

  1. Set the GOOGLE_API_KEY environment variable with your API key, or

  2. Pass your API key using the google_api_key kwarg to the GoogleGenerativeAIEmbeddings constructor.

Example

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
embeddings.embed_query("What's our Q1 revenue?")

Create a new model by parsing and validating input data from keyword arguments.

Raises ValidationError if the input data cannot be parsed to form a valid model.

param client_options: Dict | None = None#

A dictionary of client options to pass to the Google API client, such as api_endpoint.
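A minimal sketch of pointing the client at a specific endpoint via client_options (the endpoint value shown is illustrative):

from langchain_google_genai import GoogleGenerativeAIEmbeddings

# Endpoint shown for illustration; substitute the endpoint you actually need.
embeddings = GoogleGenerativeAIEmbeddings(
    model="models/embedding-001",
    client_options={"api_endpoint": "generativelanguage.googleapis.com"},
)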

param credentials: Any = None#

The default custom credentials (google.auth.credentials.Credentials) to use when making API calls. If not provided, credentials will be inferred from the GOOGLE_API_KEY environment variable.

param google_api_key: SecretStr | None [Optional]#

The Google API key to use. If not provided, the GOOGLE_API_KEY environment variable will be used.

Constraints:
  • type = string

  • writeOnly = True

  • format = password
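A minimal sketch of passing the key explicitly instead of relying on the environment variable (the key value is a placeholder):

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(
    model="models/embedding-001",
    google_api_key="YOUR_API_KEY",  # placeholder; prefer the GOOGLE_API_KEY env var
)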

param model: str [Required]#

The name of the embedding model to use. Example: models/embedding-001

param request_options: Dict | None = None#

A dictionary of request options to pass to the Google API client. Example: {'timeout': 10}

param task_type: str | None = None#

The task type. Valid options include: task_type_unspecified, retrieval_query, retrieval_document, semantic_similarity, classification, and clustering
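For example, a default task type can be set on the instance and used by every embedding call (a sketch; any of the values listed above may be substituted):

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(
    model="models/embedding-001",
    task_type="retrieval_document",
)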

param transport: str | None = None#

A string, one of: [rest, grpc, grpc_asyncio].

async aembed_documents(texts: List[str]) → List[List[float]]#

Asynchronously embed search docs.

Parameters:

texts (List[str]) – List of texts to embed.

Returns:

List of embeddings.

Return type:

List[List[float]]

async aembed_query(text: str) → List[float]#

Asynchronously embed query text.

Parameters:

text (str) – Text to embed.

Returns:

Embedding.

Return type:

List[float]
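Both async methods are awaited from an event loop; a minimal sketch:

import asyncio

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")

async def main() -> None:
    doc_vectors = await embeddings.aembed_documents(["doc one", "doc two"])
    query_vector = await embeddings.aembed_query("What's our Q1 revenue?")
    print(len(doc_vectors), len(query_vector))

asyncio.run(main())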

embed_documents(texts: List[str], *, batch_size: int = 100, task_type: str | None = None, titles: List[str] | None = None, output_dimensionality: int | None = None) → List[List[float]][source]#

Embed a list of strings. Google Generative AI currently sets a max batch size of 100 strings.

Parameters:
  • texts (List[str]) – The list of strings to embed.

  • batch_size (int) – The number of texts to send to the model in each request.

  • task_type (str | None) – The task type; see https://ai.google.dev/api/rest/v1/TaskType

  • titles (List[str] | None) – An optional list of titles for the texts provided. Only applicable when TaskType is RETRIEVAL_DOCUMENT.

  • output_dimensionality (int | None) – Optional reduced dimension for the output embedding. See https://ai.google.dev/api/rest/v1/models/batchEmbedContents#EmbedContentRequest

Returns:

List of embeddings, one for each text.

Return type:

List[List[float]]
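A sketch of embedding several documents with per-document titles and a reduced output dimension (the texts, titles, and dimension value are illustrative):

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")

# Titles only apply when the task type is retrieval_document.
vectors = embeddings.embed_documents(
    ["Q1 revenue grew 12%.", "Q2 guidance was raised."],
    task_type="retrieval_document",
    titles=["Q1 report", "Q2 outlook"],
    output_dimensionality=256,  # illustrative value
)
print(len(vectors), len(vectors[0]))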

embed_query(text: str, task_type: str | None = None, title: str | None = None, output_dimensionality: int | None = None) → List[float][source]#

Embed a single text.

Parameters:
  • text (str) – The text to embed.

  • task_type (str | None) – The task type; see https://ai.google.dev/api/rest/v1/TaskType

  • title (str | None) – An optional title for the text. Only applicable when TaskType is RETRIEVAL_DOCUMENT.

  • output_dimensionality (int | None) – Optional reduced dimension for the output embedding. See https://ai.google.dev/api/rest/v1/models/batchEmbedContents#EmbedContentRequest

Returns:

Embedding for the text.

Return type:

List[float]
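A sketch of embedding a query for retrieval with a truncated output dimension (values are illustrative):

from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")

vector = embeddings.embed_query(
    "What's our Q1 revenue?",
    task_type="retrieval_query",
    output_dimensionality=256,  # illustrative value
)
print(len(vector))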