OctoAI
OctoAI offers easy access to efficient compute and enables users to integrate their choice of AI models into applications. The OctoAI compute service helps you run, tune, and scale AI applications easily.

This example goes over how to use LangChain to interact with OctoAI LLM endpoints.
Setup
To run our example app, there are two simple steps to take:
- Get an API Token from your OctoAI account page.
- Paste your API key in the code cell below.
Note: If you want to use a different LLM, you can containerize the model and create a custom OctoAI endpoint yourself by following Build a Container from Python and Create a Custom Endpoint from a Container, then updating your OCTOAI_API_BASE environment variable.
import os

# Replace the placeholder value with the API token from your OctoAI account page
os.environ["OCTOAI_API_TOKEN"] = "OCTOAI_API_TOKEN"
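If you built a custom endpoint as described in the note above, you can also point the integration at it through the OCTOAI_API_BASE environment variable. This is a minimal sketch; the URL below is a placeholder, not a real endpoint:

# Only needed when using a custom endpoint; the URL is a placeholder
os.environ["OCTOAI_API_BASE"] = "https://your-custom-endpoint.octoai.run/v1"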
from langchain.chains import LLMChain
from langchain_community.llms.octoai_endpoint import OctoAIEndpoint
from langchain_core.prompts import PromptTemplate
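With the imports in place, a minimal end-to-end sketch looks like the following. The model name and generation parameters are illustrative assumptions; substitute whichever model your OctoAI account exposes (older versions of the integration used different constructor arguments).

# A simple instruction/response prompt
template = """Below is an instruction that describes a task. Write a response that appropriately completes the request.

Instruction:
{question}

Response: """
prompt = PromptTemplate.from_template(template)

# The model name and generation parameters below are assumptions for illustration
llm = OctoAIEndpoint(
    model_name="llama-2-13b-chat-fp16",
    max_tokens=200,
    temperature=0.1,
)

llm_chain = LLMChain(prompt=prompt, llm=llm)

question = "Who was Leonardo da Vinci?"
print(llm_chain.invoke({"question": question})["text"])

Running this sends the formatted prompt to the OctoAI endpoint and prints the generated answer.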