feat: Add Amazon Bedrock provider #494
base: main
Conversation
Pull request overview
This PR adds support for Amazon Bedrock as a new AI provider, leveraging the AWS SDK's ConverseStream API for streaming responses. The implementation includes comprehensive model definitions, tool calling support, and image handling capabilities while maintaining consistency with existing provider patterns.
Key changes:
- New Amazon Bedrock provider implementation using `@aws-sdk/client-bedrock-runtime`
- Support for 60+ Bedrock models including Claude, Llama, DeepSeek, and Amazon Nova variants
- Integration of cross-region inference profiles for select Claude models
- Authentication via AWS credentials (environment variables or profile) instead of API keys
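To illustrate the message-conversion step the provider has to perform, here is a minimal sketch. The `{ role, content: [{ text }] }` message shape matches Bedrock's Converse API, but the type and function names below are illustrative, not the PR's actual code:

```typescript
// Hypothetical sketch (not the PR's actual code): Bedrock's Converse API
// expects messages shaped as { role, content: [{ text }] }, so a provider
// typically converts its internal chat format into that shape.
interface SimpleMessage {
  role: "user" | "assistant";
  text: string;
}

interface ConverseMessage {
  role: "user" | "assistant";
  content: { text: string }[];
}

function toConverseMessages(messages: SimpleMessage[]): ConverseMessage[] {
  return messages.map((m) => ({
    role: m.role,
    content: [{ text: m.text }],
  }));
}
```

The real implementation additionally has to map tool-call and image content blocks, which the file summary below mentions.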
Reviewed changes
Copilot reviewed 10 out of 11 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| packages/ai/src/providers/amazon-bedrock.ts | Core provider implementation handling streaming, message conversion, tool calling, and image processing for Bedrock's ConverseStream API |
| packages/ai/src/stream.ts | Integration of Bedrock provider into the unified streaming interface with AWS credential handling |
| packages/ai/src/types.ts | Type definitions for Bedrock API and options including region and profile configuration |
| packages/ai/src/models.generated.ts | Generated model definitions for 60+ Bedrock models with pricing, context windows, and capability metadata |
| packages/ai/scripts/generate-models.ts | Model generation logic including cross-region inference profile handling for specific Claude models |
| packages/coding-agent/src/core/model-resolver.ts | Default model configuration for amazon-bedrock provider |
| packages/ai/test/stream.test.ts | E2E test suite covering basic generation, tool calling, streaming, and image input |
| packages/ai/package.json | Added @aws-sdk/client-bedrock-runtime dependency and bedrock keyword |
| packages/ai/README.md | Documentation update listing Amazon Bedrock as a supported provider |
| packages/ai/src/providers/anthropic.ts | Fixed typo in error message (corrected "unkown" to "unknown" and "ocurred" to "occurred") |
Cool! Configuring thinking etc. should be part of this PR, please. We also need to add Bedrock to all the other ai package tests that exercise multiple providers. Is there any way you can disable Copilot for PRs sent to this repo? It's never really helpful.
Yup, I'll convert this PR into a draft until I finish adding the remaining bits. Sorry about Copilot; I was testing it on a personal project and I think I misconfigured it to review all the PRs I create :)
```typescript
id.includes("anthropic.claude-opus-4-5") ||
id.includes("anthropic.claude-haiku-4-5") ||
id.includes("anthropic.claude-sonnet-4")) {
  // TODO: Add other models. Can we get this information from models.dev or AWS SDK?
```
Created a PR on models.dev to add cross-region inference profiles for Claude 4.x family: anomalyco/models.dev#607
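The model-id checks quoted above could feed a small helper like the following. This is an illustrative sketch, not the PR's code: Bedrock cross-region inference profiles prefix the base model id with a geography code ("us.", "eu.", "apac."), and the region-to-geography mapping shown here is a simplified assumption:

```typescript
// Illustrative helper (not the PR's actual code). Cross-region inference
// profiles in Bedrock use ids like "us.anthropic.claude-sonnet-4-...".
const CROSS_REGION_MODELS = [
  "anthropic.claude-opus-4-5",
  "anthropic.claude-haiku-4-5",
  "anthropic.claude-sonnet-4",
];

function toInferenceProfileId(modelId: string, region: string): string {
  if (!CROSS_REGION_MODELS.some((m) => modelId.includes(m))) {
    return modelId; // not a cross-region model: use the plain model id
  }
  // Simplified region-to-geography mapping (assumption, not exhaustive).
  const geo = region.startsWith("eu-")
    ? "eu"
    : region.startsWith("ap-")
      ? "apac"
      : "us";
  return `${geo}.${modelId}`;
}
```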
@badlogic The provider-related changes should be ready for review now. Would you also prefer to have the coding-agent changes in this PR as well (I haven't checked what's needed and am not sure whether those changes would be small or big, tbh)? Also, I'm currently running the tests locally; do you also plan to set up an AWS account for CI?
CI currently doesn't run any LLM endpoint tests; I run them locally before a release, or when the ai package changes. Too costly otherwise. The coding-agent shouldn't need any changes and should work out of the box if the ai changes are complete. IIUC, Bedrock doesn't need a pi-side login; that's handled by the SDK, which looks for env vars/files on disk.
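The credential lookup described here can be sketched as a simplified decision function. This is an assumption-laden illustration of the AWS SDK's default provider chain, which in reality also checks shared config files, SSO, and container/instance metadata; the function name is hypothetical:

```typescript
// Simplified sketch of the credential sources the AWS SDK's default
// provider chain consults; the real chain is longer (config files, SSO,
// IMDS, etc.). Names here are illustrative, not from the PR.
type CredentialSource = "env" | "profile" | "default-chain";

function detectCredentialSource(
  env: Record<string, string | undefined>,
): CredentialSource {
  // Static credentials from environment variables win first.
  if (env.AWS_ACCESS_KEY_ID && env.AWS_SECRET_ACCESS_KEY) return "env";
  // Otherwise a named profile from ~/.aws/credentials can be selected.
  if (env.AWS_PROFILE) return "profile";
  // Fall back to the rest of the SDK's default chain.
  return "default-chain";
}
```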
Added it to the coding agent as well; it wasn't much effort, as you mentioned. I think it should be ready for review now. Due to cross-region inference, some model identifiers need to be tweaked. I've added a test to ensure all models are usable, plus a TODO to upstream the correct model identifiers to the models.dev repository if they accept my initial PR. But for now all models work. Currently, only
Rebased and also added a CHANGELOG entry. @badlogic I'd appreciate your review once you have some time!
Adding initial Amazon Bedrock provider using @aws-sdk/client-bedrock-runtime with the ConverseStream API.