GradientEmbeddings#
- class langchain_community.embeddings.gradient_ai.GradientEmbeddings[source]#
Bases: BaseModel, Embeddings
Gradient.ai Embedding models.
GradientEmbeddings is a class to interact with embedding models on gradient.ai.
To use, set the environment variable GRADIENT_ACCESS_TOKEN with your API token and GRADIENT_WORKSPACE_ID with your gradient.ai workspace id, or alternatively provide them as keyword arguments to the constructor of this class.
Example
    from langchain_community.embeddings import GradientEmbeddings

    GradientEmbeddings(
        model="bge-large",
        gradient_workspace_id="12345614fc0_workspace",
        gradient_access_token="gradientai-access_token",
    )
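The same credentials can come from the environment instead of constructor keywords. A minimal sketch, assuming the placeholder values below stand in for a real token and workspace id:

    import os

    from langchain_community.embeddings import GradientEmbeddings

    # Assumption: placeholder credentials; in practice these environment
    # variables would usually be set outside the program.
    os.environ["GRADIENT_ACCESS_TOKEN"] = "gradientai-access_token"
    os.environ["GRADIENT_WORKSPACE_ID"] = "12345614fc0_workspace"

    embeddings = GradientEmbeddings(model="bge-large")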
Create a new model by parsing and validating input data from keyword arguments.
Raises a pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- param client: Any = None#
Gradient client.
- param gradient_access_token: str | None = None#
gradient.ai API Token, which can be generated by going to https://auth.gradient.ai/select-workspace and selecting “Access tokens” under the profile drop-down.
- param gradient_api_url: str = 'https://api.gradient.ai/api'#
Endpoint URL to use.
- param gradient_workspace_id: str | None = None#
Underlying gradient.ai workspace_id.
- param model: str [Required]#
Underlying gradient.ai model id.
- param query_prompt_for_retrieval: str | None = None#
Query pre-prompt.
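A configuration sketch pulling the parameters above together; every value is a placeholder, and the retrieval pre-prompt string is an assumption shown only to illustrate where the parameter is set:

    from langchain_community.embeddings import GradientEmbeddings

    embeddings = GradientEmbeddings(
        model="bge-large",
        gradient_access_token="gradientai-access_token",
        gradient_workspace_id="12345614fc0_workspace",
        gradient_api_url="https://api.gradient.ai/api",  # default endpoint
        # Assumed example pre-prompt for retrieval-style queries.
        query_prompt_for_retrieval="Represent this question for retrieving documents: ",
    )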
- async aembed_documents(texts: List[str]) → List[List[float]] [source]#
Async call out to Gradient’s embedding endpoint.
- Parameters:
texts (List[str]) – The list of texts to embed.
- Returns:
List of embeddings, one for each text.
- Return type:
List[List[float]]
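A minimal async usage sketch for this method, assuming the credentials are supplied through the environment variables described above:

    import asyncio

    from langchain_community.embeddings import GradientEmbeddings

    async def embed_docs() -> None:
        # Assumes GRADIENT_ACCESS_TOKEN and GRADIENT_WORKSPACE_ID are set.
        embeddings = GradientEmbeddings(model="bge-large")
        vectors = await embeddings.aembed_documents(
            ["Gradient.ai hosts embedding models.", "LangChain wraps them as Embeddings."]
        )
        print(len(vectors), len(vectors[0]))  # one embedding per input text

    asyncio.run(embed_docs())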
- async aembed_query(text: str) → List[float] [source]#
Async call out to Gradient’s embedding endpoint.
- Parameters:
text (str) – The text to embed.
- Returns:
Embeddings for the text.
- Return type:
List[float]
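And the single-query counterpart, again a sketch under the same environment-variable assumption:

    import asyncio

    from langchain_community.embeddings import GradientEmbeddings

    async def embed_question() -> None:
        embeddings = GradientEmbeddings(model="bge-large")
        vector = await embeddings.aembed_query("Which embedding models does gradient.ai host?")
        print(len(vector))  # dimensionality of the returned embedding

    asyncio.run(embed_question())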
Examples using GradientEmbeddings