HuggingFaceBgeEmbeddings#

class langchain_community.embeddings.huggingface.HuggingFaceBgeEmbeddings[source]#

Bases: BaseModel, Embeddings

HuggingFace sentence_transformers embedding models.

To use, you should have the sentence_transformers Python package installed. To use Nomic models, make sure sentence_transformers >= 2.3.0 is installed.

BGE Example:
from langchain_community.embeddings import HuggingFaceBgeEmbeddings

model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cpu'}
encode_kwargs = {'normalize_embeddings': True}
hf = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs
)
Nomic Example:
from langchain_community.embeddings import HuggingFaceBgeEmbeddings

model_name = "nomic-ai/nomic-embed-text-v1"
model_kwargs = {
    'device': 'cpu',
    'trust_remote_code': True,
}
encode_kwargs = {'normalize_embeddings': True}
hf = HuggingFaceBgeEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
    query_instruction="search_query:",
    embed_instruction="search_document:"
)

Initialize the sentence_transformer.

param cache_folder: str | None = None#

Path to store models. Can also be set by the SENTENCE_TRANSFORMERS_HOME environment variable.

param embed_instruction: str = ''#

Instruction to use for embedding documents.

param encode_kwargs: Dict[str, Any] [Optional]#

Keyword arguments to pass when calling the encode method of the model.

param model_kwargs: Dict[str, Any] [Optional]#

Keyword arguments to pass to the model.

param model_name: str = 'BAAI/bge-large-en'#

Model name to use.

param query_instruction: str = 'Represent this question for searching relevant passages: '#

Instruction to use for embedding a query.

param show_progress: bool = False#

Whether to show a progress bar.
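
Constructor sketch (illustrative only; the model name and cache path below are assumptions, and the cache location can equally be set via SENTENCE_TRANSFORMERS_HOME):

from langchain_community.embeddings import HuggingFaceBgeEmbeddings

hf = HuggingFaceBgeEmbeddings(
    model_name="BAAI/bge-small-en-v1.5",
    cache_folder="/tmp/sentence_transformers",  # hypothetical cache path
    model_kwargs={"device": "cpu"},
    encode_kwargs={"normalize_embeddings": True},
    show_progress=True,
)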

async aembed_documents(texts: List[str]) List[List[float]]#

Asynchronously embed search docs.

Parameters:

texts (List[str]) – List of texts to embed.

Returns:

List of embeddings.

Return type:

List[List[float]]
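
A minimal async sketch, assuming an hf instance constructed as in the examples above; by default this delegates to the synchronous embed_documents:

import asyncio

async def main() -> None:
    vectors = await hf.aembed_documents(["First document.", "Second document."])
    print(len(vectors), len(vectors[0]))  # number of texts, embedding dimension

asyncio.run(main())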

async aembed_query(text: str) List[float]#

Asynchronously embed query text.

Parameters:

text (str) – Text to embed.

Returns:

Embedding.

Return type:

List[float]
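
Similarly, a sketch for asynchronously embedding a single query (same assumed hf instance):

import asyncio

async def main() -> None:
    vector = await hf.aembed_query("What is BGE?")
    print(len(vector))  # embedding dimension

asyncio.run(main())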

embed_documents(texts: List[str]) List[List[float]][source]#

Compute doc embeddings using a HuggingFace transformer model.

Parameters:

texts (List[str]) – The list of texts to embed.

Returns:

List of embeddings, one for each text.

Return type:

List[List[float]]
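
A short synchronous sketch, again assuming an hf instance from the constructor examples above:

docs = [
    "BGE models are trained by BAAI.",
    "LangChain wraps many embedding backends.",
]
doc_vectors = hf.embed_documents(docs)
print(len(doc_vectors))     # one embedding per input text
print(len(doc_vectors[0]))  # embedding dimension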

embed_query(text: str) List[float][source]#

Compute query embeddings using a HuggingFace transformer model.

Parameters:

text (str) – The text to embed.

Returns:

Embeddings for the text.

Return type:

List[float]
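
And a sketch for a single query, which is prefixed with query_instruction before encoding (assumed hf instance as above):

query_vector = hf.embed_query("How are document embeddings computed?")
print(len(query_vector))  # embedding dimension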
