GPT4All

GPT4All is a free-to-use, locally running, privacy-aware chatbot that requires no GPU or internet connection. It supports popular models as well as GPT4All's own models, such as GPT4All Falcon and Wizard.

This notebook explains how to use GPT4All embeddings with LangChain.

Install GPT4All's Python Bindings

%pip install --upgrade --quiet  gpt4all > /dev/null

Note: you may need to restart the kernel to use updated packages.

from langchain_community.embeddings import GPT4AllEmbeddings

API Reference: GPT4AllEmbeddings

gpt4all_embd = GPT4AllEmbeddings()
100%|████████████████████████| 45.5M/45.5M [00:02<00:00, 18.5MiB/s]

Model downloaded at: /Users/rlm/.cache/gpt4all/ggml-all-MiniLM-L6-v2-f16.bin

objc[45711]: Class GGMLMetalClass is implemented in both /Users/rlm/anaconda3/envs/lcn2/lib/python3.9/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libreplit-mainline-metal.dylib (0x29fe18208) and /Users/rlm/anaconda3/envs/lcn2/lib/python3.9/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllamamodel-mainline-metal.dylib (0x2a0244208). One of the two will be used. Which one is undefined.
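
By default, the constructor downloads GPT4All's bundled embedding model (all-MiniLM-L6-v2, as shown in the download output above). As a minimal sketch, assuming your version of langchain_community exposes the model_name and gpt4all_kwargs parameters, you can pin a specific embedding model; the file name below is an illustrative assumption, so check the GPT4All model list for the names available to you:

gpt4all_embd = GPT4AllEmbeddings(
    model_name="all-MiniLM-L6-v2.gguf2.f16.gguf",  # assumed model file name; verify against the GPT4All model list
    gpt4all_kwargs={"allow_download": "True"},  # assumed kwarg forwarded to the gpt4all client
)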
text = "This is a test document."

Embed the Textual Data

query_result = gpt4all_embd.embed_query(text)
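
embed_query returns a plain Python list of floats. As a quick sanity check (the dimensionality assumes the default all-MiniLM-L6-v2 model, which produces 384-dimensional vectors):

print(len(query_result))  # 384 for the default all-MiniLM-L6-v2 model
print(query_result[:5])  # first few embedding values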

With embed_documents you can embed multiple pieces of text. You can also map these embeddings with Nomic's Atlas to see a visual representation of your data.

doc_result = gpt4all_embd.embed_documents([text])
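
embed_documents takes a list of strings and returns one embedding per input, so the result lines up with your documents. A minimal sketch with made-up example texts:

docs = ["First test document.", "Second test document.", "Third test document."]
doc_results = gpt4all_embd.embed_documents(docs)
print(len(doc_results))  # 3, one embedding per document
print(len(doc_results[0]))  # embedding dimensionality (384 for the default model)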
