Cloudflare D1-Backed Chat Memory
Info: This integration is only supported in Cloudflare Workers.
For longer-term persistence across chat sessions, you can swap out the default in-memory chatHistory that backs chat memory classes like BufferMemory for a Cloudflare D1 instance.
Setup
You'll need to install the LangChain Cloudflare integration package. The example below also uses Anthropic, but you can use any model you'd like:
- npm: npm install @langchain/cloudflare @langchain/anthropic @langchain/core
- Yarn: yarn add @langchain/cloudflare @langchain/anthropic @langchain/core
- pnpm: pnpm add @langchain/cloudflare @langchain/anthropic @langchain/core
Set up a D1 instance for your worker by following the official documentation. Your project's wrangler.toml file should look something like this:
name = "YOUR_PROJECT_NAME"
main = "src/index.ts"
compatibility_date = "2024-01-10"
[vars]
ANTHROPIC_API_KEY = "YOUR_ANTHROPIC_KEY"
[[d1_databases]]
binding = "DB" # available in your Worker as env.DB
database_name = "YOUR_D1_DB_NAME"
database_id = "YOUR_D1_DB_ID"
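If you haven't provisioned the database yet, the Wrangler CLI can create it and print the binding values to paste into wrangler.toml. As a rough sketch, assuming a recent Wrangler version and reusing the placeholder database name from above:

npx wrangler d1 create YOUR_D1_DB_NAME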
Usage
You can then use D1 to store your history as follows:
import type { D1Database } from "@cloudflare/workers-types";
import { BufferMemory } from "langchain/memory";
import { CloudflareD1MessageHistory } from "@langchain/cloudflare";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { RunnableSequence } from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { ChatAnthropic } from "@langchain/anthropic";

export interface Env {
  DB: D1Database;
  ANTHROPIC_API_KEY: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    try {
      const { searchParams } = new URL(request.url);
      const input = searchParams.get("input");
      if (!input) {
        throw new Error(`Missing "input" parameter`);
      }

      // Persist chat history in D1, keyed by session ID.
      const memory = new BufferMemory({
        returnMessages: true,
        chatHistory: new CloudflareD1MessageHistory({
          tableName: "stored_message",
          sessionId: "example",
          database: env.DB,
        }),
      });

      const prompt = ChatPromptTemplate.fromMessages([
        ["system", "You are a helpful chatbot"],
        new MessagesPlaceholder("history"),
        ["human", "{input}"],
      ]);

      const model = new ChatAnthropic({
        apiKey: env.ANTHROPIC_API_KEY,
      });

      // Load prior messages from D1 and pass them to the prompt alongside the new input.
      const chain = RunnableSequence.from([
        {
          input: (initialInput) => initialInput.input,
          memory: () => memory.loadMemoryVariables({}),
        },
        {
          input: (previousOutput) => previousOutput.input,
          history: (previousOutput) => previousOutput.memory.history,
        },
        prompt,
        model,
        new StringOutputParser(),
      ]);

      const chainInput = { input };

      const res = await chain.invoke(chainInput);

      // Save the new exchange back to D1 for subsequent requests.
      await memory.saveContext(chainInput, {
        output: res,
      });

      return new Response(JSON.stringify(res), {
        headers: { "content-type": "application/json" },
      });
    } catch (err: any) {
      console.log(err.message);
      return new Response(err.message, { status: 500 });
    }
  },
};
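To try it out locally, you can run the worker with Wrangler and pass your message as the input query parameter. The commands below are a sketch, assuming Wrangler's default local dev port (8787):

npx wrangler dev
curl "http://localhost:8787/?input=Hi!%20My%20name%20is%20Jim."
curl "http://localhost:8787/?input=What%20is%20my%20name%3F"

Because the history is stored in D1 under the same session ID, the model can recall the name from the first request when answering the second. You can also inspect the persisted rows directly (here against the local dev database, using the table name from the code above):

npx wrangler d1 execute YOUR_D1_DB_NAME --local --command "SELECT * FROM stored_message"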
API Reference:
- BufferMemory from langchain/memory
- CloudflareD1MessageHistory from @langchain/cloudflare
- ChatPromptTemplate from @langchain/core/prompts
- MessagesPlaceholder from @langchain/core/prompts
- RunnableSequence from @langchain/core/runnables
- StringOutputParser from @langchain/core/output_parsers
- ChatAnthropic from @langchain/anthropic