Bigtable
Bigtable is a scalable, fully managed key-value and wide-column store ideal for fast access to structured, semi-structured, or unstructured data. This page provides an overview of Bigtable's LangChain integrations.
Client Library Documentation: cloud.google.com/python/docs/reference/langchain-google-bigtable/latest
Product Documentation: cloud.google.com/bigtable
Quick Start
To use this library, you first need to:
- Select or create a Cloud Platform project.
- Enable billing for your project.
- Enable the Google Cloud Bigtable API.
- Set up Authentication.
Installation
The main package for this integration is langchain-google-bigtable.
pip install -U langchain-google-bigtable
Integrations
The langchain-google-bigtable package provides the following integrations:
Vector Store
With BigtableVectorStore, you can store documents and their vector embeddings to find the most similar or relevant information in your database.
- Full VectorStore Implementation: Supports all methods from the LangChain VectorStore abstract class.
- Async/Sync Support: All methods are available in both asynchronous and synchronous versions.
- Metadata Filtering: Powerful filtering on metadata fields, including logical AND/OR combinations.
- Multiple Distance Strategies: Supports both Cosine and Euclidean distance for similarity search.
- Customizable Storage: Full control over how content, embeddings, and metadata are stored in Bigtable columns.
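To build intuition for the two distance strategies listed above, here is a small, self-contained sketch (plain Python, independent of Bigtable) comparing cosine and Euclidean distance on toy vectors:

```python
import math

def cosine_distance(a, b):
    # 1 minus cosine similarity; near 0 means the vectors point the same way,
    # regardless of their magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

def euclidean_distance(a, b):
    # Straight-line distance; sensitive to vector magnitude.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two vectors pointing the same way but with different magnitudes:
a, b = [1.0, 2.0], [2.0, 4.0]
print(cosine_distance(a, b))     # ~0.0: same direction
print(euclidean_distance(a, b))  # ~2.236: magnitudes differ
```

Which strategy fits depends on your embeddings: cosine distance ignores magnitude and is the common default for text embeddings, while Euclidean distance treats magnitude as meaningful.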
from langchain_google_bigtable import BigtableEngine, BigtableVectorStore
# Your embedding service and other configurations
# embedding_service = ...
engine = await BigtableEngine.async_initialize(project_id="your-project-id")
vector_store = await BigtableVectorStore.create(
engine=engine,
instance_id="your-instance-id",
table_id="your-table-id",
embedding_service=embedding_service,
collection="your_collection_name",
)
await vector_store.aadd_documents([your_documents])
results = await vector_store.asimilarity_search("your query")
Learn more in the Vector Store how-to guide.
Key-value Store
Use BigtableByteStore as a persistent, scalable key-value store for caching, session management, or other storage needs. It supports both synchronous and asynchronous operations.
from langchain_google_bigtable import BigtableByteStore
# Initialize the store
store = await BigtableByteStore.create(
project_id="your-project-id",
instance_id="your-instance-id",
table_id="your-table-id",
)
# Set and get values
await store.amset([("key1", b"value1")])
retrieved = await store.amget(["key1"])
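Because a byte store works with raw bytes, structured values need to be serialized before storage and deserialized after retrieval. A minimal sketch of that round trip (plain Python using JSON; the helper names here are illustrative, not part of the library):

```python
import json

def encode_value(obj) -> bytes:
    # Serialize a JSON-compatible object to bytes for the store.
    return json.dumps(obj).encode("utf-8")

def decode_value(data):
    # Reverse of encode_value; a missing key comes back as None.
    return json.loads(data.decode("utf-8")) if data is not None else None

payload = encode_value({"user": "alice", "visits": 3})
# e.g. await store.amset([("session:alice", payload)])
restored = decode_value(payload)
print(restored)  # {'user': 'alice', 'visits': 3}
```

Any scheme works (JSON, pickle, protobuf) as long as encoding and decoding are consistent across all readers and writers of the table.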
Learn more in the Key-value Store how-to guide.
Document Loader
Use the BigtableLoader to load data from a Bigtable table and represent it as LangChain Document objects.
from langchain_google_bigtable import BigtableLoader
loader = BigtableLoader(
project_id="your-project-id",
instance_id="your-instance-id",
table_id="your-table-name"
)
docs = loader.load()
Learn more in the Document Loader how-to guide.
Chat Message History
Use BigtableChatMessageHistory to store conversation histories, enabling stateful chains and agents.
from langchain_google_bigtable import BigtableChatMessageHistory
history = BigtableChatMessageHistory(
project_id="your-project-id",
instance_id="your-instance-id",
table_id="your-message-store",
session_id="user-session-123"
)
history.add_user_message("Hello!")
history.add_ai_message("Hi there!")
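A stored history is typically replayed into the model's prompt on each turn. As a rough, library-independent sketch of that idea (the `(role, text)` representation and formatting below are assumptions for illustration, not the package's behavior):

```python
# Represent each turn as a (role, text) pair, in the order it was added.
history = [
    ("human", "Hello!"),
    ("ai", "Hi there!"),
    ("human", "What can you do?"),
]

def render_prompt(turns) -> str:
    # Flatten alternating turns into a simple text prompt for the model.
    labels = {"human": "User", "ai": "Assistant"}
    return "\n".join(f"{labels[role]}: {text}" for role, text in turns)

print(render_prompt(history))
# User: Hello!
# Assistant: Hi there!
# User: What can you do?
```

In practice, a chain passes the message objects straight to the model; the point is that keying the history by session_id lets each conversation resume where it left off.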
Learn more in the Chat Message History how-to guide.
Contributions
Contributions to this library are welcome. Please see the CONTRIBUTING guide in the package repo for more details.
License
This project is licensed under the Apache 2.0 License - see the LICENSE file in the package repo for details.
Disclaimer
This is not an officially supported Google product.