
Llama.cpp

This page covers how to use llama.cpp within LangChain. It is broken into two parts: installation and setup, and then references to the specific llama.cpp wrappers.

Installation and Setup

  • Install the Python package with pip install llama-cpp-python
  • Download one of the supported models and convert it to the llama.cpp format per the instructions

Wrappers

LLM

There exists a LlamaCpp LLM wrapper, which you can access with

from langchain_community.llms import LlamaCpp
API Reference: LlamaCpp
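
As a minimal sketch of how the wrapper is typically used (the model path and parameter values below are illustrative assumptions, not part of this page), you might load a local GGUF model and call it:

from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./models/llama-2-7b.Q4_K_M.gguf",  # placeholder path to a local GGUF model
    n_ctx=2048,        # context window size
    temperature=0.7,   # sampling temperature
)

print(llm.invoke("Q: What is llama.cpp? A:"))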

For a more detailed walkthrough, see this notebook.

Embeddings

There exists a LlamaCpp Embeddings wrapper, which you can access with

from langchain_community.embeddings import LlamaCppEmbeddings
API Reference: LlamaCppEmbeddings
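
As a minimal sketch (again, the model path below is a placeholder assumption), the embeddings wrapper points at a local GGUF model and exposes the standard embedding methods:

from langchain_community.embeddings import LlamaCppEmbeddings

embeddings = LlamaCppEmbeddings(model_path="./models/llama-2-7b.Q4_K_M.gguf")  # placeholder path

query_vector = embeddings.embed_query("What is LangChain?")
doc_vectors = embeddings.embed_documents(["llama.cpp runs LLMs locally on CPU."])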

For a more detailed walkthrough, see this notebook.

