
Google AlloyDB for PostgreSQL

Google Cloud AlloyDB for PostgreSQL is a fully managed, PostgreSQL-compatible database service for your most demanding enterprise workloads. AlloyDB combines the best of Google Cloud with PostgreSQL for superior performance, scale, and availability. Extend your database application to build AI-powered experiences leveraging AlloyDB LangChain integrations.

This notebook goes over how to use Google Cloud AlloyDB for PostgreSQL to store chat message history with the AlloyDBChatMessageHistory class.

Learn more about the package on GitHub.


Before You Begin​

To run this notebook, you will need to do the following:

πŸ¦œπŸ”— Library Installation​

The integration lives in its own langchain-google-alloydb-pg package, so we need to install it.

%pip install --upgrade --quiet langchain-google-alloydb-pg langchain-google-vertexai

Colab only: Uncomment the following cell to restart the kernel or use the button to restart the kernel. For Vertex AI Workbench you can restart the terminal using the button on top.

# # Automatically restart kernel after installs so that your environment can access the new packages
# import IPython

# app = IPython.Application.instance()
# app.kernel.do_shutdown(True)

πŸ” Authentication​

Authenticate to Google Cloud as the IAM user logged into this notebook in order to access your Google Cloud Project.

  • If you are using Colab to run this notebook, use the cell below and continue.
  • If you are using Vertex AI Workbench, check out the setup instructions here.
from google.colab import auth

auth.authenticate_user()

☁ Set Your Google Cloud Project​

Set your Google Cloud project so that you can leverage Google Cloud resources within this notebook.

If you don't know your project ID, try the following:

  • Run gcloud config list.
  • Run gcloud projects list.

# @markdown Please fill in the value below with your Google Cloud project ID and then run the cell.

PROJECT_ID = "my-project-id" # @param {type:"string"}

# Set the project id
!gcloud config set project {PROJECT_ID}

πŸ’‘ API Enablement​

The langchain-google-alloydb-pg package requires that you enable the AlloyDB Admin API in your Google Cloud Project.

# enable AlloyDB API
!gcloud services enable alloydb.googleapis.com

Basic Usage​

Set AlloyDB database values​

Find your database values in the AlloyDB cluster page.

# @title Set Your Values Here { display-mode: "form" }
REGION = "us-central1" # @param {type: "string"}
CLUSTER = "my-alloydb-cluster" # @param {type: "string"}
INSTANCE = "my-alloydb-instance" # @param {type: "string"}
DATABASE = "my-database" # @param {type: "string"}
TABLE_NAME = "message_store" # @param {type: "string"}

AlloyDBEngine Connection Pool​

One of the requirements and arguments to establish AlloyDB as a ChatMessageHistory memory store is an AlloyDBEngine object. The AlloyDBEngine configures a connection pool to your AlloyDB database, enabling successful connections from your application and following industry best practices.

To create an AlloyDBEngine using AlloyDBEngine.from_instance(), you need to provide only 5 things:

  1. project_id : Project ID of the Google Cloud Project where the AlloyDB instance is located.
  2. region : Region where the AlloyDB instance is located.
  3. cluster: The name of the AlloyDB cluster.
  4. instance : The name of the AlloyDB instance.
  5. database : The name of the database to connect to on the AlloyDB instance.

By default, IAM database authentication will be used as the method of database authentication. This library uses the IAM principal belonging to the Application Default Credentials (ADC) sourced from the environment.

Optionally, built-in database authentication using a username and password to access the AlloyDB database can also be used. Just provide the optional user and password arguments to AlloyDBEngine.from_instance():

  • user : Database user to use for built-in database authentication and login.
  • password : Database password to use for built-in database authentication and login.
from langchain_google_alloydb_pg import AlloyDBEngine

engine = AlloyDBEngine.from_instance(
    project_id=PROJECT_ID,
    region=REGION,
    cluster=CLUSTER,
    instance=INSTANCE,
    database=DATABASE,
)
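If you opt into built-in database authentication instead, the call might look like the following sketch. The user and password values here are placeholders you must replace with your own database credentials; the other arguments are unchanged.

```python
from langchain_google_alloydb_pg import AlloyDBEngine

# Built-in authentication variant (placeholder credentials shown).
engine = AlloyDBEngine.from_instance(
    project_id=PROJECT_ID,
    region=REGION,
    cluster=CLUSTER,
    instance=INSTANCE,
    database=DATABASE,
    user="my-db-user",  # placeholder: your database user
    password="my-db-password",  # placeholder: your database password
)
```

This is a configuration fragment that requires a live AlloyDB instance to run; omit user and password to fall back to IAM database authentication.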

Initialize a table​

The AlloyDBChatMessageHistory class requires a database table with a specific schema in order to store the chat message history.

The AlloyDBEngine engine has a helper method init_chat_history_table() that can be used to create a table with the proper schema for you.

engine.init_chat_history_table(table_name=TABLE_NAME)

To initialize the AlloyDBChatMessageHistory class you need to provide only 3 things:

  1. engine - An instance of an AlloyDBEngine engine.
  2. session_id - A unique identifier string that specifies an id for the session.
  3. table_name - The name of the table within the AlloyDB database to store the chat message history.
from langchain_google_alloydb_pg import AlloyDBChatMessageHistory

history = AlloyDBChatMessageHistory.create_sync(
    engine, session_id="test_session", table_name=TABLE_NAME
)
history.add_user_message("hi!")
history.add_ai_message("whats up?")
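The session_id can be any unique string. A common pattern (an illustrative sketch, not a library requirement) is to mint one per conversation with Python's uuid module:

```python
import uuid

# Mint a unique session id per conversation; any unique string works.
session_id = f"session-{uuid.uuid4()}"
print(session_id)
```

Reusing the same session_id later retrieves that conversation's stored messages.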

Cleaning up​

When the history of a specific session is obsolete and can be deleted, it can be done the following way.

Note: Once deleted, the data is no longer stored in AlloyDB and is gone forever.

history.clear()

πŸ”— Chaining​

We can easily combine this message history class with LCEL Runnables.

To do this we will use one of Google's Vertex AI chat models which requires that you enable the Vertex AI API in your Google Cloud Project.

# enable Vertex AI API
!gcloud services enable aiplatform.googleapis.com
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_google_vertexai import ChatVertexAI

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant."),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{question}"),
    ]
)

chain = prompt | ChatVertexAI(project=PROJECT_ID)

chain_with_history = RunnableWithMessageHistory(
    chain,
    lambda session_id: AlloyDBChatMessageHistory.create_sync(
        engine,
        session_id=session_id,
        table_name=TABLE_NAME,
    ),
    input_messages_key="question",
    history_messages_key="history",
)

# This is where we configure the session id
config = {"configurable": {"session_id": "test_session"}}

chain_with_history.invoke({"question": "Hi! I'm bob"}, config=config)
chain_with_history.invoke({"question": "Whats my name"}, config=config)
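Conceptually, RunnableWithMessageHistory keeps one history object per session_id, looked up from the config at invoke time. The stdlib-only sketch below illustrates that routing without AlloyDB or LangChain; all names in it are hypothetical stand-ins, not library APIs.

```python
# One history per session, keyed by the session_id in the config.
histories = {}

def get_history(session_id):
    """Return the message list for a session, creating it on first use."""
    return histories.setdefault(session_id, [])

def invoke(question, config):
    session_id = config["configurable"]["session_id"]
    history = get_history(session_id)
    # A real chain would send `history` plus `question` to the model;
    # here we just echo the question and record both turns.
    answer = f"echo: {question}"
    history.append(("human", question))
    history.append(("ai", answer))
    return answer

config = {"configurable": {"session_id": "test_session"}}
invoke("Hi! I'm bob", config)
invoke("Whats my name", config)
print(len(histories["test_session"]))  # 4: two human/ai pairs
```

Invoking with a different session_id in the config would create and use a separate history, which is exactly how the chain above keeps conversations isolated.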
