LocalAIEmbeddings#

class langchain_community.embeddings.localai.LocalAIEmbeddings[source]#

Bases: BaseModel, Embeddings

LocalAI embedding models.

Since LocalAI and OpenAI have 1:1 API compatibility, this class uses the openai Python package’s openai.Embedding as its client. Thus, you should have the openai Python package installed and set the environment variable OPENAI_API_KEY to a random string. You also need to set OPENAI_API_BASE to point to your LocalAI service endpoint.

Example

from langchain_community.embeddings import LocalAIEmbeddings
embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",
    openai_api_base="http://localhost:8080"
)

Create a new model by parsing and validating input data from keyword arguments.

Raises ValidationError if the input data cannot be parsed to form a valid model.

param allowed_special: Literal['all'] | Set[str] = {}#
param chunk_size: int = 1000#

Maximum number of texts to embed in each batch.

param deployment: str = 'text-embedding-ada-002'#
param disallowed_special: Literal['all'] | Set[str] | Sequence[str] = 'all'#
param embedding_ctx_length: int = 8191#

The maximum number of tokens to embed at once.

param headers: Any = None#
param max_retries: int = 6#

Maximum number of retries to make when generating.

param model: str = 'text-embedding-ada-002'#
param model_kwargs: Dict[str, Any] [Optional]#

Holds any model parameters valid for the create call that are not explicitly specified.

param openai_api_base: str | None = None#
param openai_api_key: str | None = None#
param openai_api_version: str | None = None#
param openai_organization: str | None = None#
param openai_proxy: str | None = None#
param request_timeout: float | Tuple[float, float] | None = None#

Timeout in seconds for the LocalAI request.

param show_progress_bar: bool = False#

Whether to show a progress bar when embedding.
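
The parameters above can be combined at construction time. A minimal configuration sketch, assuming a LocalAI server is reachable at http://localhost:8080 and serves a model named text-embedding-ada-002 (both values are placeholders for your own deployment):

from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",           # per the class docstring, any random string works
    openai_api_base="http://localhost:8080",  # your LocalAI service endpoint
    model="text-embedding-ada-002",           # model name served by your LocalAI instance
    chunk_size=500,                           # maximum texts embedded per batch
    request_timeout=30.0,                     # seconds before the request times out
)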

async aembed_documents(texts: List[str], chunk_size: int | None = 0) List[List[float]][source]#

Asynchronously call out to LocalAI’s embedding endpoint to embed search documents.

Parameters:
  • texts (List[str]) – The list of texts to embed.

  • chunk_size (int | None) – The number of texts to embed per batch. If None, the chunk size configured on the class is used.

Returns:

List of embeddings, one for each text.

Return type:

List[List[float]]
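
A brief usage sketch, assuming a running LocalAI server at http://localhost:8080 (endpoint and texts are placeholders):

import asyncio

from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",
    openai_api_base="http://localhost:8080",
)

async def main():
    docs = ["LocalAI serves models locally.", "Embeddings map text to vectors."]
    # chunk_size=None falls back to the chunk_size configured on the class
    vectors = await embeddings.aembed_documents(docs, chunk_size=None)
    print(len(vectors), len(vectors[0]))  # number of documents, embedding dimension

asyncio.run(main())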

async aembed_query(text: str) List[float][source]#

Asynchronously call out to LocalAI’s embedding endpoint to embed query text.

Parameters:

text (str) – The text to embed.

Returns:

Embedding for the text.

Return type:

List[float]
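
A brief usage sketch under the same assumptions as above (a running LocalAI server at a placeholder endpoint):

import asyncio

from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",
    openai_api_base="http://localhost:8080",
)

async def main():
    vector = await embeddings.aembed_query("What is LocalAI?")
    print(len(vector))  # embedding dimension

asyncio.run(main())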

embed_documents(texts: List[str], chunk_size: int | None = 0) List[List[float]][source]#

Call out to LocalAI’s embedding endpoint to embed search documents.

Parameters:
  • texts (List[str]) – The list of texts to embed.

  • chunk_size (int | None) – The number of texts to embed per batch. If None, the chunk size configured on the class is used.

Returns:

List of embeddings, one for each text.

Return type:

List[List[float]]
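
A brief usage sketch (endpoint and texts are placeholders; a LocalAI server is assumed to be running):

from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",
    openai_api_base="http://localhost:8080",
)

docs = ["LocalAI serves models locally.", "Embeddings map text to vectors."]
vectors = embeddings.embed_documents(docs)
print(len(vectors), len(vectors[0]))  # number of documents, embedding dimension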

embed_query(text: str) List[float][source]#

Call out to LocalAI’s embedding endpoint to embed query text.

Parameters:

text (str) – The text to embed.

Returns:

Embedding for the text.

Return type:

List[float]
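
A brief usage sketch (endpoint and query are placeholders; a LocalAI server is assumed to be running):

from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",
    openai_api_base="http://localhost:8080",
)

vector = embeddings.embed_query("What is LocalAI?")
print(len(vector))  # embedding dimension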