A type of Large Language Model (LLM) that interacts with the Bedrock
service. It extends the base LLM class and implements the
BaseBedrockInput interface. The class authenticates against Bedrock,
which is part of Amazon Web Services (AWS), using AWS credentials, and
can be configured with various parameters such as the model to use, the
AWS region, and the maximum number of tokens to generate.
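As a minimal sketch of such a configuration (assuming an ES-module context with top-level await, and the same model ID and credential environment variables used in the streaming example further below), instantiating the model and making a single call might look like:

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { HumanMessage } from "@langchain/core/messages";

// Minimal configuration: model ID, AWS region, explicit credentials,
// and a cap on the number of tokens to generate.
const model = new BedrockChat({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  region: "us-east-1",
  credentials: {
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
  },
  maxTokens: 150,
});

// Invoke the model once and print the full response text.
const response = await model.invoke([new HumanMessage("Tell me a joke")]);
console.log(response.content);
```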
The BedrockChat class supports both synchronous and asynchronous interactions with the model,
including streamed responses and per-token callbacks. It can be configured with optional
parameters such as temperature, stop sequences, and guardrail settings for finer control
over the generated responses.
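The example below covers the stream() interface; for the per-token callbacks mentioned above, a minimal sketch using LangChain's standard handleLLMNewToken callback (which fires once per generated token when streaming is enabled) might look like:

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { HumanMessage } from "@langchain/core/messages";

const model = new BedrockChat({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  region: "us-east-1",
  streaming: true,
  // handleLLMNewToken is invoked for each new token as it is generated.
  callbacks: [
    {
      handleLLMNewToken(token: string) {
        process.stdout.write(token);
      },
    },
  ],
});

await model.invoke([new HumanMessage("Tell me a joke")]);
```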
Example

For streaming responses, use the following example:

```typescript
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { HumanMessage } from "@langchain/core/messages";

async function runStreaming() {
  // Instantiate the BedrockChat model with the desired configuration
  const model = new BedrockChat({
    model: "anthropic.claude-3-sonnet-20240229-v1:0",
    region: "us-east-1",
    credentials: {
      accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID!,
      secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY!,
    },
    maxTokens: 150,
    temperature: 0.7,
    stopSequences: ["\n", " Human:", " Assistant:"],
    streaming: true,
    trace: "ENABLED",
    guardrailIdentifier: "your-guardrail-id",
    guardrailVersion: "1.0",
    guardrailConfig: {
      tagSuffix: "example",
      streamProcessingMode: "SYNCHRONOUS",
    },
  });

  // Prepare the message to be sent to the model
  const message = new HumanMessage("Tell me a joke");

  // Stream the response from the model
  const stream = await model.stream([message]);
  for await (const chunk of stream) {
    // Output each chunk of the response
    console.log(chunk);
  }
}

// Run the example
runStreaming();
```
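Each chunk yielded by stream() is a message chunk whose content field carries the incremental text, so logging chunk.content instead of the whole chunk prints just the streamed tokens.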