A runnable sequence that passes the given function to the model when run.

This feature is deprecated and will be removed in a future release; it is not recommended for use. Prefer the .withStructuredOutput method on chat model classes instead.
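As a rough guide, a minimal sketch of the recommended replacement might look like the following. It assumes a recent @langchain/openai release where chat models expose .withStructuredOutput, and uses a Zod schema equivalent to the JSON schema in the deprecated example below; the personSchema and chain names are illustrative.

import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { z } from "zod";

// Zod schema describing the structured output we want back.
const personSchema = z.object({
  name: z.string().describe("The person's name"),
  age: z.number().int().describe("The person's age"),
  fav_food: z.string().optional().describe("The person's favorite food"),
});

const model = new ChatOpenAI();
const prompt = ChatPromptTemplate.fromMessages([
  ["human", "Human description: {description}"],
]);

// withStructuredOutput binds the schema to the model and parses the response,
// so no separate output parser is needed.
const chain = prompt.pipe(model.withStructuredOutput(personSchema));

const response = await chain.invoke({
  description: "My name's John Doe and I'm 30 years old.",
});
console.log(response);
// Expected shape: { name: 'John Doe', age: 30 }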
The deprecated usage below creates a runnable that uses an OpenAI function to get a structured output:
import { createStructuredOutputRunnable } from "langchain/chains/openai_functions";
import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { JsonOutputFunctionsParser } from "langchain/output_parsers";

const jsonSchema = {
  title: "Person",
  description: "Identifying information about a person.",
  type: "object",
  properties: {
    name: { title: "Name", description: "The person's name", type: "string" },
    age: { title: "Age", description: "The person's age", type: "integer" },
    fav_food: {
      title: "Fav Food",
      description: "The person's favorite food",
      type: "string",
    },
  },
  required: ["name", "age"],
};

const model = new ChatOpenAI();
const prompt = ChatPromptTemplate.fromMessages([
  ["human", "Human description: {description}"],
]);
const outputParser = new JsonOutputFunctionsParser();

// Also works with Zod schema
const runnable = createStructuredOutputRunnable({
  outputSchema: jsonSchema,
  llm: model,
  prompt,
  outputParser,
});

const response = await runnable.invoke({
  description:
    "My name's John Doe and I'm 30 years old. My favorite kind of food are chocolate chip cookies.",
});
console.log(response);
// { name: 'John Doe', age: 30, fav_food: 'chocolate chip cookies' }
The params required to create the runnable (the output schema, the LLM, the prompt, and the output parser) are passed as a single config object, as shown in the deprecated example above.
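For orientation only, the config object's shape can be sketched roughly as follows, inferred from the fields used in the example above; the type name StructuredOutputRunnableConfig is hypothetical, and the library's actual exported type may differ (for instance, it is generic over the output type and accepts other chat models).

import type { ChatOpenAI } from "@langchain/openai";
import type { ChatPromptTemplate } from "@langchain/core/prompts";
import type { JsonOutputFunctionsParser } from "langchain/output_parsers";

// Hypothetical sketch of the config object's shape, inferred from the example
// above; not the library's actual exported type.
type StructuredOutputRunnableConfig = {
  outputSchema: Record<string, unknown>; // JSON schema (a Zod schema also works)
  llm: ChatOpenAI;
  prompt: ChatPromptTemplate;
  outputParser: JsonOutputFunctionsParser;
};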