TinyAsyncOpenAIInfinityEmbeddingClient#
- class langchain_community.embeddings.infinity.TinyAsyncOpenAIInfinityEmbeddingClient(host: str = 'http://localhost:7797/v1', aiosession: ClientSession | None = None)[source]#
Helper tool for embedding texts with an Infinity server.
It is not part of LangChain's stable API; direct use is discouraged.
Example
mini_client = TinyAsyncOpenAIInfinityEmbeddingClient()
embeds = mini_client.embed(
    model="BAAI/bge-small",
    texts=["doc1", "doc2"],
)
# or
embeds = await mini_client.aembed(
    model="BAAI/bge-small",
    texts=["doc1", "doc2"],
)
Methods
__init__([host, aiosession])
aembed(model, texts) – call the embedding endpoint for the model, async method
embed(model, texts) – call the embedding endpoint for the model
- Parameters:
host (str) – URL of the Infinity server, defaults to 'http://localhost:7797/v1'
aiosession (ClientSession | None) – optional aiohttp ClientSession to reuse for async requests
- __init__(host: str = 'http://localhost:7797/v1', aiosession: ClientSession | None = None) None [source]#
- Parameters:
host (str) –
aiosession (ClientSession | None) –
- Return type:
None
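Below is a minimal async sketch (not part of the documented example) showing how a session can be supplied via the aiosession parameter. It assumes the ClientSession type here is aiohttp's, that an Infinity server is reachable at the default host, and that it serves the "BAAI/bge-small" model used in the example above.

import asyncio

import aiohttp

from langchain_community.embeddings.infinity import (
    TinyAsyncOpenAIInfinityEmbeddingClient,
)


async def main() -> None:
    # Reuse one aiohttp session for all embedding requests
    # (assumption: an Infinity server is running on the host below).
    async with aiohttp.ClientSession() as session:
        client = TinyAsyncOpenAIInfinityEmbeddingClient(
            host="http://localhost:7797/v1",
            aiosession=session,
        )
        embeds = await client.aembed(
            model="BAAI/bge-small",
            texts=["doc1", "doc2"],
        )
        # One embedding vector per input text is expected.
        print(len(embeds))


asyncio.run(main())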