C Transformers

The C Transformers library provides Python bindings for GGML models.

This example goes over how to use LangChain to interact with C Transformers models.


%pip install --upgrade --quiet  ctransformers

Load Model

from langchain_community.llms import CTransformers

llm = CTransformers(model="marella/gpt-2-ggml")
API Reference: CTransformers

Generate Text

print(llm.invoke("AI is going to"))


To stream tokens to stdout as they are generated, pass a streaming callback handler when constructing the model:

from langchain_core.callbacks import StreamingStdOutCallbackHandler

llm = CTransformers(
    model="marella/gpt-2-ggml", callbacks=[StreamingStdOutCallbackHandler()]
)
response = llm.invoke("AI is going to")


The model can also be used in a chain together with a prompt template:

from langchain.chains import LLMChain
from langchain_core.prompts import PromptTemplate

template = """Question: {question}


prompt = PromptTemplate.from_template(template)

llm_chain = LLMChain(prompt=prompt, llm=llm)

response = llm_chain.run("What is AI?")
API Reference: LLMChain | PromptTemplate
