AscendEmbeddings#

class langchain_community.embeddings.ascend.AscendEmbeddings[source]#

Bases: Embeddings, BaseModel

Ascend NPU-accelerated embedding model.

Please ensure that you have installed CANN and torch_npu.

Example:

    from langchain_community.embeddings import AscendEmbeddings

    model = AscendEmbeddings(
        model_path=<path_to_model>,
        device_id=0,
        query_instruction="Represent this sentence for searching relevant passages: ",
    )

param device_id: int = 0#

Ascend NPU device id.

param document_instruction: str = ''#

Instruction used for embedding documents.

param model: Any [Required]#

param model_path: str [Required]#

Path to the embedding model.

param pooling_method: str | None = 'cls'#

param query_instruction: str = ''#

Instruction used for embedding queries.

param tokenizer: Any [Required]#

param use_fp16: bool = True#
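
A hedged construction sketch exercising the parameters above (the model path, instruction strings, and values shown are placeholders, not defaults shipped with the class); the method examples below reuse this embeddings instance:

    from langchain_community.embeddings import AscendEmbeddings

    # Placeholder path: point it at a locally downloaded embedding model
    # that torch_npu can load on the Ascend device.
    embeddings = AscendEmbeddings(
        model_path="/path/to/local/embedding-model",
        device_id=0,
        query_instruction="Represent this sentence for searching relevant passages: ",
        document_instruction="",
        pooling_method="cls",
        use_fp16=True,
    )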
async aembed_documents(texts: list[str]) → list[list[float]]#

Asynchronously embed search docs.

Parameters:

texts (list[str]) – List of texts to embed.

Returns:

List of embeddings.

Return type:

list[list[float]]

async aembed_query(text: str) → list[float]#

Asynchronously embed query text.

Parameters:

text (str) – Text to embed.

Returns:

Embedding.

Return type:

list[float]
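
Both async variants can be awaited directly from async code. A minimal sketch, assuming the embeddings instance from the construction example above:

    import asyncio

    async def main() -> None:
        # Await the async variants of the query and document embedders.
        query_vector = await embeddings.aembed_query("What is an Ascend NPU?")
        doc_vectors = await embeddings.aembed_documents(
            ["torch_npu brings PyTorch operators to Ascend hardware."]
        )
        print(len(query_vector), len(doc_vectors))

    asyncio.run(main())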

embed_documents(texts: List[str]) → List[List[float]][source]#

Embed search docs.

Parameters:

texts (List[str]) – List of texts to embed.

Returns:

List of embeddings.

Return type:

List[List[float]]
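
A short usage sketch, reusing the embeddings instance constructed earlier (the input strings are illustrative):

    doc_vectors = embeddings.embed_documents(
        [
            "Ascend NPUs accelerate embedding inference.",
            "CANN and torch_npu must be installed first.",
        ]
    )
    # One embedding per input document; each has the model's hidden size.
    print(len(doc_vectors), len(doc_vectors[0]))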

embed_query(text: str) → List[float][source]#

Embed query text.

Parameters:

text (str) – Text to embed.

Returns:

Embedding.

Return type:

List[float]
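
The query-side counterpart, where the configured query_instruction is meant to steer retrieval-style embeddings (sketch, same embeddings instance):

    query_vector = embeddings.embed_query("How do I run embeddings on an Ascend NPU?")
    print(len(query_vector))  # embedding dimension of the underlying model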

encode(sentences: Any) → Any[source]#

Encode input sentences into embedding vectors using the loaded tokenizer and model.

Parameters:

sentences (Any)

Return type:

Any
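
The reference does not spell out encode's internals, so the following is only a rough sketch of the usual flow for a Hugging Face-style encoder (tokenize, forward pass, pool, L2-normalize); the class itself additionally handles placement on the NPU device:

    import torch
    import torch.nn.functional as F

    def encode_sketch(model, tokenizer, sentences):
        # Assumption: `model` and `tokenizer` are Hugging Face-style objects;
        # the class loads them from model_path and manages the NPU device.
        inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
        with torch.no_grad():
            last_hidden_state = model(**inputs).last_hidden_state
        pooled = last_hidden_state[:, 0]            # "cls" pooling (see pooling below)
        return F.normalize(pooled, dim=-1).numpy()  # unit-length sentence vectors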

pooling(last_hidden_state: Any, attention_mask: Any = None) → Any[source]#

Pool token-level hidden states into one embedding per input, according to pooling_method.

Parameters:
  • last_hidden_state (Any)

  • attention_mask (Any)

Return type:

Any
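
pooling() controls how per-token hidden states collapse into a single vector per sentence, selected by pooling_method ('cls' by default). A minimal sketch of the two common strategies over plain PyTorch tensors, as an illustration rather than the class's verbatim code:

    import torch

    def pooling_sketch(last_hidden_state, attention_mask=None, pooling_method="cls"):
        if pooling_method == "cls":
            # Sentence embedding = hidden state of the first ([CLS]) token.
            return last_hidden_state[:, 0]
        if pooling_method == "mean":
            # Mean over real tokens only: mask out padding before averaging.
            mask = attention_mask.unsqueeze(-1).float()
            summed = (last_hidden_state * mask).sum(dim=1)
            counts = mask.sum(dim=1).clamp(min=1e-9)
            return summed / counts
        raise NotImplementedError(f"Unsupported pooling method: {pooling_method}")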

Examples using AscendEmbeddings#