GoogleGenerativeAIEmbeddings#
- class langchain_google_genai.embeddings.GoogleGenerativeAIEmbeddings[source]#
Bases: BaseModel, Embeddings
Google Generative AI Embeddings.
To use, you must either:
- have the GOOGLE_API_KEY environment variable set with your API key, or
- pass your API key using the google_api_key kwarg to the GoogleGenerativeAIEmbeddings constructor.
Example:

```python
from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
embeddings.embed_query("What's our Q1 revenue?")
```
Create a new model by parsing and validating input data from keyword arguments.
Raises a pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- param client_options: Dict | None = None#
A dictionary of client options to pass to the Google API client, such as api_endpoint.
- param credentials: Any = None#
The default custom credentials (google.auth.credentials.Credentials) to use when making API calls. If not provided, credentials will be ascertained from the GOOGLE_API_KEY environment variable.
- param google_api_key: SecretStr | None [Optional]#
The Google API key to use. If not provided, the GOOGLE_API_KEY environment variable will be used.
- param model: str [Required]#
The name of the embedding model to use. Example: models/embedding-001
- param request_options: Dict | None = None#
A dictionary of request options to pass to the Google API client. Example: {"timeout": 10}
- param task_type: str | None = None#
The task type. Valid options include: task_type_unspecified, retrieval_query, retrieval_document, semantic_similarity, classification, and clustering.
- param transport: str | None = None#
A string, one of: [rest, grpc, grpc_asyncio].
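A minimal construction sketch using the fields documented above; the API key string and option values are illustrative placeholders, not defaults:

```python
from langchain_google_genai import GoogleGenerativeAIEmbeddings

# All values other than the model name are illustrative placeholders.
embeddings = GoogleGenerativeAIEmbeddings(
    model="models/embedding-001",
    google_api_key="YOUR_API_KEY",      # or rely on the GOOGLE_API_KEY env var
    task_type="retrieval_document",     # one of the task types listed above
    transport="rest",                   # rest, grpc, or grpc_asyncio
    request_options={"timeout": 10},    # forwarded to the Google API client
)
```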
- async aembed_documents(texts: list[str]) → list[list[float]]#
Asynchronously embed search docs.
- Parameters:
texts (list[str]) – List of texts to embed.
- Returns:
List of embeddings.
- Return type:
list[list[float]]
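A brief async usage sketch, assuming GOOGLE_API_KEY is set in the environment; the document texts are placeholders:

```python
import asyncio

from langchain_google_genai import GoogleGenerativeAIEmbeddings


async def main() -> None:
    embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
    # Embed two placeholder documents without blocking the event loop.
    vectors = await embeddings.aembed_documents(["First document", "Second document"])
    print(len(vectors), len(vectors[0]))  # number of documents, embedding dimension


asyncio.run(main())
```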
- async aembed_query(text: str) → list[float]#
Asynchronously embed query text.
- Parameters:
text (str) – Text to embed.
- Returns:
Embedding.
- Return type:
list[float]
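A matching async sketch for a single query, under the same assumption that GOOGLE_API_KEY is set in the environment:

```python
import asyncio

from langchain_google_genai import GoogleGenerativeAIEmbeddings


async def main() -> None:
    embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
    vector = await embeddings.aembed_query("What's our Q1 revenue?")
    print(len(vector))  # embedding dimension


asyncio.run(main())
```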
- embed_documents(texts: List[str], *, batch_size: int = 100, task_type: str | None = None, titles: List[str] | None = None, output_dimensionality: int | None = None) → List[List[float]][source]#
Embed a list of strings. Google Generative AI currently sets a max batch size of 100 strings.
- Parameters:
texts (List[str]) – The list of strings to embed.
batch_size (int) – The batch size of embeddings to send to the model.
task_type (str | None) – The task type (see https://ai.google.dev/api/rest/v1/TaskType).
titles (List[str] | None) – An optional list of titles for the texts provided. Only applicable when task_type is RETRIEVAL_DOCUMENT.
output_dimensionality (int | None) – Optional reduced dimension for the output embedding (see https://ai.google.dev/api/rest/v1/models/batchEmbedContents#EmbedContentRequest).
- Returns:
List of embeddings, one for each text.
- Return type:
List[List[float]]
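A sketch of the keyword-only options, assuming GOOGLE_API_KEY is set; the texts, titles, and the 256-dimension value are illustrative choices, not defaults:

```python
from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")

texts = ["Q1 revenue summary", "Q2 forecast notes"]   # placeholder documents
vectors = embeddings.embed_documents(
    texts,
    batch_size=100,                      # the API caps a batch at 100 strings
    task_type="retrieval_document",
    titles=["Q1 report", "Q2 report"],   # only used when task_type is retrieval_document
    output_dimensionality=256,           # optional reduced embedding size
)
print(len(vectors), len(vectors[0]))     # 2 documents, 256 dimensions each
```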
- embed_query(text: str, task_type: str | None = None, title: str | None = None, output_dimensionality: int | None = None) → List[float][source]#
Embed a text.
- Parameters:
text (str) – The text to embed.
task_type (str | None) – The task type (see https://ai.google.dev/api/rest/v1/TaskType).
title (str | None) – An optional title for the text. Only applicable when task_type is RETRIEVAL_DOCUMENT.
output_dimensionality (int | None) – Optional reduced dimension for the output embedding (see https://ai.google.dev/api/rest/v1/models/batchEmbedContents#EmbedContentRequest).
- Returns:
Embedding for the text.
- Return type:
List[float]
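A corresponding sketch for a single query, with the same assumptions (the 256-dimension value is illustrative):

```python
from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
vector = embeddings.embed_query(
    "What's our Q1 revenue?",
    task_type="retrieval_query",   # queries are typically embedded as retrieval_query
    output_dimensionality=256,     # optional reduced embedding size
)
print(len(vector))                 # 256
```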