ChatCloudflareWorkersAI
Workers AI allows you to run machine learning models, on the Cloudflare network, from your own code.
This will help you get started with Cloudflare Workers AI chat
models. For detailed documentation of all
ChatCloudflareWorkersAI
features and configurations, head to the API
reference.
Overview
Integration details
Class | Package | Local | Serializable | PY support |
---|---|---|---|---|
ChatCloudflareWorkersAI | @langchain/cloudflare | ❌ | ✅ | ✅ |
Model features
See the links in the table headers below for guides on how to use specific features.
Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs |
---|---|---|---|---|---|---|---|---|
❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ | ❌ | ❌ |
Setup
To access Cloudflare Workers AI models you'll need to create a
Cloudflare account, get an API token, and install the
@langchain/cloudflare
integration package.
Credentials
Head to this page to
sign up for Cloudflare and generate an API token. Once you've done this,
note your CLOUDFLARE_ACCOUNT_ID
and CLOUDFLARE_API_TOKEN.
Passing a binding within a Cloudflare Worker is not yet supported.
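In a Node.js environment, the two credentials are typically read from environment variables before the model is instantiated. A minimal sketch of that pattern; the helper name getCloudflareCredentials is ours, not part of the package:

```typescript
// Hypothetical helper (not part of @langchain/cloudflare): read the two
// credentials from the environment and fail fast if either is missing.
function getCloudflareCredentials(): { accountId: string; apiToken: string } {
  const accountId = process.env.CLOUDFLARE_ACCOUNT_ID;
  const apiToken = process.env.CLOUDFLARE_API_TOKEN;
  if (!accountId || !apiToken) {
    throw new Error(
      "Set CLOUDFLARE_ACCOUNT_ID and CLOUDFLARE_API_TOKEN before creating the model."
    );
  }
  return { accountId, apiToken };
}
```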
Installation
The LangChain ChatCloudflareWorkersAI integration lives in the
@langchain/cloudflare
package:
- npm: npm i @langchain/cloudflare @langchain/core
- yarn: yarn add @langchain/cloudflare @langchain/core
- pnpm: pnpm add @langchain/cloudflare @langchain/core
Instantiation
Now we can instantiate our model object and generate chat completions:
import { ChatCloudflareWorkersAI } from "@langchain/cloudflare";
const llm = new ChatCloudflareWorkersAI({
model: "@cf/meta/llama-2-7b-chat-int8", // Default value
cloudflareAccountId: CLOUDFLARE_ACCOUNT_ID,
cloudflareApiToken: CLOUDFLARE_API_TOKEN,
// Pass a custom base URL to use Cloudflare AI Gateway
// baseUrl: `https://gateway.ai.cloudflare.com/v1/{YOUR_ACCOUNT_ID}/{GATEWAY_NAME}/workers-ai/`,
});
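The commented-out baseUrl option above follows a fixed URL pattern for Cloudflare AI Gateway. A small sketch of assembling it; the account id and gateway name below are hypothetical placeholders:

```typescript
// Sketch: build the optional AI Gateway base URL from its parts.
// "my-account-id" and "my-gateway" are hypothetical placeholder values.
const accountId = "my-account-id";
const gatewayName = "my-gateway";
const gatewayBaseUrl = `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayName}/workers-ai/`;
// Pass this string as `baseUrl` to the ChatCloudflareWorkersAI constructor.
```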
Invocation
const aiMsg = await llm.invoke([
[
"system",
"You are a helpful assistant that translates English to French. Translate the user sentence.",
],
["human", "I love programming."],
]);
aiMsg;
AIMessage {
lc_serializable: true,
lc_kwargs: {
content: 'I can help with that! The translation of "I love programming" in French is:\n' +
"\n" +
`"J'adore le programmati`... 4 more characters,
tool_calls: [],
invalid_tool_calls: [],
additional_kwargs: {},
response_metadata: {}
},
lc_namespace: [ "langchain_core", "messages" ],
content: 'I can help with that! The translation of "I love programming" in French is:\n' +
"\n" +
`"J'adore le programmati`... 4 more characters,
name: undefined,
additional_kwargs: {},
response_metadata: {},
tool_calls: [],
invalid_tool_calls: []
}
console.log(aiMsg.content);
I can help with that! The translation of "I love programming" in French is:
"J'adore le programmation."
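LangChain chat models also expose a stream() method that yields message chunks instead of one final message; concatenating each chunk's content reproduces the full reply. A local sketch of that consumption pattern, with the chunks mocked so no credentials or network access are needed (fakeStream is a stand-in for what llm.stream(...) returns):

```typescript
// Stand-in for `llm.stream("...")`: an async iterable of message chunks.
// Assumption: tokens are mocked, since a real call needs Cloudflare credentials.
async function* fakeStream(): AsyncGenerator<{ content: string }> {
  for (const token of ["J'adore", " la", " programmation."]) {
    yield { content: token };
  }
}

// Accumulate streamed chunks into the full reply, mirroring
// `for await (const chunk of await llm.stream(input))`.
async function collect(): Promise<string> {
  let out = "";
  for await (const chunk of fakeStream()) {
    out += chunk.content;
  }
  return out;
}
```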
Chaining
We can chain our model with a prompt template like so:
import { ChatPromptTemplate } from "@langchain/core/prompts";
const prompt = ChatPromptTemplate.fromMessages([
[
"system",
"You are a helpful assistant that translates {input_language} to {output_language}.",
],
["human", "{input}"],
]);
const chain = prompt.pipe(llm);
await chain.invoke({
input_language: "English",
output_language: "German",
input: "I love programming.",
});
AIMessage {
lc_serializable: true,
lc_kwargs: {
content: "Das Programmieren ist für mich sehr Valent sein!",
tool_calls: [],
invalid_tool_calls: [],
additional_kwargs: {},
response_metadata: {}
},
lc_namespace: [ "langchain_core", "messages" ],
content: "Das Programmieren ist für mich sehr Valent sein!",
name: undefined,
additional_kwargs: {},
response_metadata: {},
tool_calls: [],
invalid_tool_calls: []
}
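Conceptually, pipe() composes runnables left to right: invoking the chain formats the prompt first, then passes the result to the model. A dependency-free sketch of that composition using plain objects; mockPrompt, mockModel, and the Runnable shape here are illustrative stand-ins, not the real LangChain classes:

```typescript
// Minimal stand-in for the runnable interface: anything with `invoke`.
type Runnable<I, O> = { invoke: (input: I) => O };

// Compose two runnables left to right, like LangChain's `.pipe()`.
function pipe<A, B, C>(
  first: Runnable<A, B>,
  second: Runnable<B, C>
): Runnable<A, C> {
  return { invoke: (input) => second.invoke(first.invoke(input)) };
}

// Hypothetical stand-ins for the prompt template and chat model.
const mockPrompt: Runnable<{ input: string }, string> = {
  invoke: ({ input }) => `Translate: ${input}`,
};
const mockModel: Runnable<string, string> = {
  invoke: (formatted) => formatted.toUpperCase(),
};

const mockChain = pipe(mockPrompt, mockModel);
```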
API reference
For detailed documentation of all ChatCloudflareWorkersAI
features and
configurations, head to the API reference:
https://api.js.langchain.com/classes/langchain_cloudflare.ChatCloudflareWorkersAI.html
Related
- Chat model conceptual guide
- Chat model how-to guides