Firestore Chat Memory
For longer-term persistence across chat sessions, you can swap out the default in-memory chatHistory that backs chat memory classes like BufferMemory for a Firestore database.
Setup
First, install the Firebase admin package in your project:
- npm: npm install firebase-admin
- Yarn: yarn add firebase-admin
- pnpm: pnpm add firebase-admin
Next, install the LangChain packages used in the examples below:
- npm: npm install @langchain/openai @langchain/community @langchain/core
- Yarn: yarn add @langchain/openai @langchain/community @langchain/core
- pnpm: pnpm add @langchain/openai @langchain/community @langchain/core
Visit the Project Settings page of your Firebase project and select the Service accounts tab.
Inside the Service accounts tab, click the Generate new private key button in the Firebase Admin SDK section to download a JSON file containing your service account's credentials.
Using the downloaded JSON file, pass the projectId, privateKey, and clientEmail to the config object of the FirestoreChatMessageHistory class, as shown below:
import { FirestoreChatMessageHistory } from "@langchain/community/stores/message/firestore";
import admin from "firebase-admin";

const messageHistory = new FirestoreChatMessageHistory({
  collections: ["chats"],
  docs: ["user-id"],
  sessionId: "user-id",
  userId: "a@example.com",
  config: {
    projectId: "YOUR-PROJECT-ID",
    credential: admin.credential.cert({
      projectId: "YOUR-PROJECT-ID",
      privateKey:
        "-----BEGIN PRIVATE KEY-----\nCHANGE-ME\n-----END PRIVATE KEY-----\n",
      clientEmail: "CHANGE-ME@CHANGE-ME-TOO.iam.gserviceaccount.com",
    }),
  },
});
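Hard-coding credentials works for a quick test, but in practice you will likely want to load them from the downloaded JSON file or from environment variables instead. Below is a minimal sketch assuming the credentials have been copied into environment variables; the variable names (FIREBASE_PROJECT_ID, FIREBASE_PRIVATE_KEY, FIREBASE_CLIENT_EMAIL) are illustrative and are not read automatically by the library.

import { FirestoreChatMessageHistory } from "@langchain/community/stores/message/firestore";
import admin from "firebase-admin";

// Illustrative environment variable names, populated from the downloaded JSON file.
const projectId = process.env.FIREBASE_PROJECT_ID!;

const messageHistory = new FirestoreChatMessageHistory({
  collections: ["chats"],
  docs: ["user-id"],
  sessionId: "user-id",
  userId: "a@example.com",
  config: {
    projectId,
    credential: admin.credential.cert({
      projectId,
      // Private keys stored in env vars often have escaped newlines; restore real ones.
      privateKey: process.env.FIREBASE_PRIVATE_KEY!.replace(/\\n/g, "\n"),
      clientEmail: process.env.FIREBASE_CLIENT_EMAIL!,
    }),
  },
});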
Here, the collections field should match the names and ordering of the collections in your database, and the docs field should likewise match the names and ordering of the docs in your database.
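To make the mapping concrete, the collections and docs arrays interleave to form the Firestore document path, with the messages themselves stored under the final doc (see the nested-collections example later on this page). The paths below are what the two configurations in this guide should produce:

// collections: ["chats"], docs: ["user-id"]
//   -> /chats/user-id/messages/{message-id}
// collections: ["chats", "bots"], docs: ["chat-id", "bot-id"]
//   -> /chats/chat-id/bots/bot-id/messages/{message-id}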
Usage
import { BufferMemory } from "langchain/memory";
import { FirestoreChatMessageHistory } from "@langchain/community/stores/message/firestore";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import admin from "firebase-admin";

const memory = new BufferMemory({
  chatHistory: new FirestoreChatMessageHistory({
    collections: ["langchain"],
    docs: ["lc-example"],
    sessionId: "lc-example-id",
    userId: "a@example.com",
    config: {
      projectId: "YOUR-PROJECT-ID",
      credential: admin.credential.cert({
        projectId: "YOUR-PROJECT-ID",
        privateKey:
          "-----BEGIN PRIVATE KEY-----\nCHANGE-ME\n-----END PRIVATE KEY-----\n",
        clientEmail: "CHANGE-ME@CHANGE-ME-TOO.iam.gserviceaccount.com",
      }),
    },
  }),
});
const model = new ChatOpenAI();
const chain = new ConversationChain({ llm: model, memory });
const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{ res1: { text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?" } }
*/
const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });
/*
{ res2: { text: "You said your name was Jim." } }
*/
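Because the history is persisted in Firestore rather than in memory, you can also read it back or delete it directly through the chatHistory instance from the example above; a minimal sketch:

// Retrieve the messages stored for this session (works across process restarts).
const storedMessages = await memory.chatHistory.getMessages();
console.log(storedMessages.map((message) => message.content));

// Delete all stored messages for this session once they are no longer needed.
await memory.chatHistory.clear();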
API Reference:
- BufferMemory from langchain/memory
- FirestoreChatMessageHistory from @langchain/community/stores/message/firestore
- ChatOpenAI from @langchain/openai
- ConversationChain from langchain/chains
Nested Collections
The FirestoreChatMessageHistory class supports nested collections and dynamic collection/doc names.
The example below shows how to add and retrieve messages from a database with the following structure:
/chats/{chat-id}/bots/{bot-id}/messages/{message-id}
import { BufferMemory } from "langchain/memory";
import { FirestoreChatMessageHistory } from "@langchain/community/stores/message/firestore";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import admin from "firebase-admin";

const memory = new BufferMemory({
  chatHistory: new FirestoreChatMessageHistory({
    collections: ["chats", "bots"],
    docs: ["chat-id", "bot-id"],
    sessionId: "user-id",
    userId: "a@example.com",
    config: {
      projectId: "YOUR-PROJECT-ID",
      credential: admin.credential.cert({
        projectId: "YOUR-PROJECT-ID",
        privateKey:
          "-----BEGIN PRIVATE KEY-----\nCHANGE-ME\n-----END PRIVATE KEY-----\n",
        clientEmail: "CHANGE-ME@CHANGE-ME-TOO.iam.gserviceaccount.com",
      }),
    },
  }),
});
const model = new ChatOpenAI();
const chain = new ConversationChain({ llm: model, memory });
const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{ res1: { response: 'Hello Jim! How can I assist you today?' } }
*/
const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });
/*
{ res2: { response: 'You just said that your name is Jim.' } }
*/
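Because collections and docs are plain string arrays, the collection and doc names can also be computed at runtime rather than hard-coded. Below is a minimal sketch using a hypothetical helper; the function name and parameters are illustrative, not part of the library:

import { FirestoreChatMessageHistory } from "@langchain/community/stores/message/firestore";
import admin from "firebase-admin";

// Build a history scoped to a specific chat and bot, mirroring
// /chats/{chatId}/bots/{botId}/messages/{messageId}.
function historyFor(chatId: string, botId: string, userId: string) {
  return new FirestoreChatMessageHistory({
    collections: ["chats", "bots"],
    docs: [chatId, botId],
    sessionId: chatId,
    userId,
    config: {
      projectId: "YOUR-PROJECT-ID",
      credential: admin.credential.cert({
        projectId: "YOUR-PROJECT-ID",
        privateKey:
          "-----BEGIN PRIVATE KEY-----\nCHANGE-ME\n-----END PRIVATE KEY-----\n",
        clientEmail: "CHANGE-ME@CHANGE-ME-TOO.iam.gserviceaccount.com",
      }),
    },
  });
}

const history = historyFor("chat-123", "support-bot", "a@example.com");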
API Reference:
- BufferMemory from langchain/memory
- FirestoreChatMessageHistory from @langchain/community/stores/message/firestore
- ChatOpenAI from @langchain/openai
- ConversationChain from langchain/chains
Firestore Rules
If your collection name is "chathistory", you can configure Firestore rules as follows:
match /chathistory/{sessionId} {
  allow read: if request.auth.uid == resource.data.createdBy;
  allow write: if request.auth.uid == request.resource.data.createdBy;
}

match /chathistory/{sessionId}/messages/{messageId} {
  allow read: if request.auth.uid == resource.data.createdBy;
  allow write: if request.auth.uid == request.resource.data.createdBy;
}