TiDB

TiDB is an open-source, cloud-native, distributed, MySQL-compatible database for elastic scale and real-time analytics.

This notebook introduces how to use TiDB to store chat message history.

Setup

First, install the following dependencies (the examples below also import from langchain_community and connect over pymysql):

%pip install --upgrade --quiet langchain langchain_community langchain_openai pymysql

Next, configure your OpenAI API key:

import getpass
import os

os.environ["OPENAI_API_KEY"] = getpass.getpass("Input your OpenAI API key:")

Finally, we will configure the connection to a TiDB cluster. In this notebook, we follow the standard connection method provided by TiDB Cloud to establish a secure and efficient database connection.

# Copy the connection string template from the TiDB Cloud console
tidb_connection_string_template = "mysql+pymysql://<USER>:<PASSWORD>@<HOST>:4000/<DB>?ssl_ca=/etc/ssl/cert.pem&ssl_verify_cert=true&ssl_verify_identity=true"
tidb_password = getpass.getpass("Input your TiDB password:")
tidb_connection_string = tidb_connection_string_template.replace(
    "<PASSWORD>", tidb_password
)
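Note that a plain string replace only works if the password contains no characters that are reserved in URLs (such as `@`, `:`, or `/`). A minimal sketch of URL-encoding the password first, using the standard library (the credentials here are made up for illustration):

```python
from urllib.parse import quote_plus

# Hypothetical password for illustration only
raw_password = "p@ss:word/123"

template = "mysql+pymysql://<USER>:<PASSWORD>@<HOST>:4000/<DB>"
# quote_plus percent-encodes reserved characters so the URL stays parseable
connection_string = template.replace("<PASSWORD>", quote_plus(raw_password))

print(connection_string)
```

After encoding, the password portion contains no reserved URL characters, so drivers that parse the connection string as a URL will not misinterpret it.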

Generating historical data

Create a set of historical data to serve as the foundation for the upcoming demonstrations.

from datetime import datetime

from langchain_community.chat_message_histories import TiDBChatMessageHistory

history = TiDBChatMessageHistory(
    connection_string=tidb_connection_string,
    session_id="code_gen",
    earliest_time=datetime.utcnow(),  # Optional: only load messages created after this point in time
)
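Conceptually, `earliest_time` acts as a timestamp filter on the stored messages. A stdlib-only sketch of that filtering behavior (the dictionaries and field names below are illustrative stand-ins, not TiDB's actual schema):

```python
from datetime import datetime, timedelta

now = datetime.utcnow()

# Hypothetical stored messages with creation timestamps (illustration only)
stored = [
    {"content": "old message", "created_at": now - timedelta(days=1)},
    {"content": "new message", "created_at": now + timedelta(seconds=1)},
]

# Keep only messages created after the cutoff, as earliest_time does
earliest_time = now
loaded = [m for m in stored if m["created_at"] > earliest_time]
print([m["content"] for m in loaded])
```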

history.add_user_message("How's our feature going?")
history.add_ai_message(
    "It's going well. We are working on testing now. It will be released in Feb."
)
history.messages
[HumanMessage(content="How's our feature going?"),
 AIMessage(content="It's going well. We are working on testing now. It will be released in Feb.")]
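Under the hood, a chat message history of this kind stores each message as a row keyed by `session_id`, so multiple conversations can share one table. TiDB is MySQL-compatible, but as a rough illustration of the storage pattern only (not TiDB's or LangChain's actual schema), here is the same idea with an in-memory sqlite3 database:

```python
import json
import sqlite3

# In-memory database standing in for TiDB (illustrative schema, not the real one)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE message_store (session_id TEXT, message TEXT)")

def add_message(session_id, role, content):
    # Serialize each message to JSON, keyed by its session
    conn.execute(
        "INSERT INTO message_store VALUES (?, ?)",
        (session_id, json.dumps({"role": role, "content": content})),
    )

def get_messages(session_id):
    # Load back only this session's messages, in insertion order
    rows = conn.execute(
        "SELECT message FROM message_store WHERE session_id = ?", (session_id,)
    )
    return [json.loads(r[0]) for r in rows]

add_message("code_gen", "human", "How's our feature going?")
add_message("code_gen", "ai", "It's going well.")
print(get_messages("code_gen"))
```

Because rows are scoped by `session_id`, a different session id yields an independent, empty history.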

Chatting with historical data

Let’s build upon the historical data generated earlier to create a dynamic chat interaction.

First, create a chat chain with LangChain:

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You're an assistant who's good at coding. You're helping a startup build",
        ),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{question}"),
    ]
)
chain = prompt | ChatOpenAI()

Next, build a runnable on top of the message history:

from langchain_core.runnables.history import RunnableWithMessageHistory

chain_with_history = RunnableWithMessageHistory(
    chain,
    lambda session_id: TiDBChatMessageHistory(
        session_id=session_id, connection_string=tidb_connection_string
    ),
    input_messages_key="question",
    history_messages_key="history",
)
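Per invocation, `RunnableWithMessageHistory` looks up the history for the configured `session_id`, injects the stored messages at the `history` placeholder, runs the chain, and appends the new question and answer back to the store. A stripped-down, stdlib-only analogue of that control flow (the fake chain and in-memory store below are stand-ins, not LangChain internals):

```python
# Minimal sketch of the load-inject-append loop (not LangChain's implementation)
stores = {}  # session_id -> list of (role, content) tuples

def get_history(session_id):
    # Create the session's history on first use, like the lambda factory above
    return stores.setdefault(session_id, [])

def fake_chain(question, history):
    # Stand-in for `prompt | ChatOpenAI()`; reports how much context it saw
    return f"answer to {question!r} given {len(history)} prior messages"

def invoke_with_history(question, session_id):
    history = get_history(session_id)
    answer = fake_chain(question, history)
    # Persist the new exchange so the next invocation sees it
    history.append(("human", question))
    history.append(("ai", answer))
    return answer

invoke_with_history("How's our feature going?", "code_gen")
print(invoke_with_history("When does it ship?", "code_gen"))
```

The second call sees the two messages stored by the first, which is exactly why the real chain below can answer a follow-up question about "our feature" without restating it.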

Initiating the Chat:

response = chain_with_history.invoke(
    {"question": "Today is Jan 1st. How many days until our feature is released?"},
    config={"configurable": {"session_id": "code_gen"}},
)
response
AIMessage(content='There are 31 days in January, so there are 30 days until our feature is released in February.')

Checking the history data

history.reload_cache()
history.messages
[HumanMessage(content="How's our feature going?"),
 AIMessage(content="It's going well. We are working on testing now. It will be released in Feb."),
 HumanMessage(content='Today is Jan 1st. How many days until our feature is released?'),
 AIMessage(content='There are 31 days in January, so there are 30 days until our feature is released in February.')]