InfinityEmbeddings
- class langchain_community.embeddings.infinity.InfinityEmbeddings[source]
Bases: BaseModel, Embeddings
Self-hosted embedding models served via the Infinity package.
See michaelfeil/infinity. This also works for text-embeddings-inference and other self-hosted, OpenAI-compatible servers.
Infinity is a package for interacting with embedding models; see michaelfeil/infinity.
Example

    from langchain_community.embeddings import InfinityEmbeddings

    InfinityEmbeddings(
        model="BAAI/bge-small",
        infinity_api_url="http://localhost:7997",
    )
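Because the class only needs an OpenAI-compatible embeddings endpoint, it can also be pointed at other servers. A minimal sketch, assuming a text-embeddings-inference deployment; the URL and model id below are placeholders for your own setup:

    from langchain_community.embeddings import InfinityEmbeddings

    # Point the client at a text-embeddings-inference (or other OpenAI-compatible) server.
    # Placeholder URL and model id; substitute your own deployment.
    tei_embeddings = InfinityEmbeddings(
        model="BAAI/bge-small-en-v1.5",
        infinity_api_url="http://localhost:8080",
    )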
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
- param client: Any = None
Infinity client.
- param infinity_api_url: str = 'http://localhost:7997'
Endpoint URL to use.
- param model: str [Required]
Underlying Infinity model id.
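A minimal end-to-end sketch using the synchronous Embeddings interface (embed_documents / embed_query), assuming an Infinity server is already running locally on port 7997 and serving the BAAI/bge-small model:

    from langchain_community.embeddings import InfinityEmbeddings

    embeddings = InfinityEmbeddings(
        model="BAAI/bge-small",
        infinity_api_url="http://localhost:7997",
    )

    # Embed a batch of documents and a single query (synchronous interface).
    doc_vectors = embeddings.embed_documents(["hello", "world"])
    query_vector = embeddings.embed_query("greeting")

    print(len(doc_vectors), len(doc_vectors[0]))  # number of texts, embedding dimension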
- async aembed_documents(texts: List[str]) → List[List[float]][source]
Async call out to Infinity’s embedding endpoint.
- Parameters:
texts (List[str]) – The list of texts to embed.
- Returns:
List of embeddings, one for each text.
- Return type:
List[List[float]]
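A short sketch of the async batch call, again assuming a local Infinity server on port 7997 serving BAAI/bge-small:

    import asyncio

    from langchain_community.embeddings import InfinityEmbeddings

    async def main() -> None:
        embeddings = InfinityEmbeddings(
            model="BAAI/bge-small",
            infinity_api_url="http://localhost:7997",
        )
        # One embedding (a list of floats) is returned per input text.
        vectors = await embeddings.aembed_documents(["first document", "second document"])
        print(len(vectors), len(vectors[0]))

    asyncio.run(main())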
- async aembed_query(text: str) → List[float][source]
Async call out to Infinity’s embedding endpoint.
- Parameters:
text (str) – The text to embed.
- Returns:
Embeddings for the text.
- Return type:
List[float]
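And the query-side counterpart, under the same assumption of a local Infinity server:

    import asyncio

    from langchain_community.embeddings import InfinityEmbeddings

    async def main() -> None:
        embeddings = InfinityEmbeddings(
            model="BAAI/bge-small",
            infinity_api_url="http://localhost:7997",
        )
        # Returns a single embedding vector for the query text.
        vector = await embeddings.aembed_query("What is Infinity?")
        print(len(vector))

    asyncio.run(main())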
Examples using InfinityEmbeddings