Update Streaming Support Validator for AWS Bedrock Models (e.g. Meta, AI21, Mistral) #354

Open
martinwangjian opened this issue Feb 9, 2025 · 0 comments

Description

Our AWS Bedrock integration currently misidentifies the streaming support behaviour for some models. According to the AWS Bedrock Models Documentation, several providers, including Anthropic, Cohere, AI21 Labs, Meta, and Mistral, support streaming mode. For Amazon models, streaming is enabled only for specific model IDs that contain keywords such as "nova-lite", "nova-micro", "nova-pro", "titan-text-express", "titan-text-lite", or "titan-text-premier".
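
To make the documented rule concrete, here is a rough sketch of the check described above. This is purely illustrative and is not the actual langchain_aws validator; the helper name supports_streaming, the provider parsing, and the cross-region prefix handling ("us.", "eu.", "apac.") are assumptions.

STREAMING_PROVIDERS = {"anthropic", "cohere", "ai21", "meta", "mistral"}
AMAZON_STREAMING_KEYWORDS = (
    "nova-lite", "nova-micro", "nova-pro",
    "titan-text-express", "titan-text-lite", "titan-text-premier",
)

def supports_streaming(model_id: str) -> bool:
    # Hypothetical helper (not the actual langchain_aws code): derive the provider
    # from the model ID, first stripping a cross-region inference profile prefix
    # such as "us." or "eu.".
    parts = model_id.split(".")
    if parts[0] in {"us", "eu", "apac"}:
        parts = parts[1:]
    provider = parts[0]
    if provider == "amazon":
        # Amazon models: streaming only for the approved keyword families.
        return any(keyword in model_id for keyword in AMAZON_STREAMING_KEYWORDS)
    return provider in STREAMING_PROVIDERS

# Example: both forms of the Meta model ID should report streaming support.
assert supports_streaming("us.meta.llama3-3-70b-instruct-v1:0")
assert supports_streaming("meta.llama3-3-70b-instruct-v1:0")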

However, we have observed that a Meta model (e.g. "us.meta.llama3-3-70b-instruct-v1:0") is being flagged as having streaming disabled. This behaviour is inconsistent with the AWS documentation, where Meta models are expected to support streaming.


Steps to Reproduce

AWS Region: us-east-1

from langchain_aws import ChatBedrock
from langchain_core.messages import HumanMessage

# Instantiate a ChatBedrock model (use a model ID that should support streaming)
chat = ChatBedrock(model_id="us.meta.llama3-3-70b-instruct-v1:0")  # You can also test with "meta.llama3-3-70b-instruct-v1:0" without cross region inference profile

# Create a sample human message
message = HumanMessage(content="Hello")

# Initiate streaming for the message
stream = chat.stream([message])

# Retrieve and print the first response from the stream
response = next(stream)
print(response)

Expected Behaviour

Streaming Enabled: Models from Anthropic, Cohere, AI21 Labs, Meta, and Mistral should have streaming enabled (i.e. disable_streaming is False).
Amazon Models: For Amazon models, streaming should be enabled only if the model ID contains one of the approved keywords.

Actual Behaviour

Calling chat.stream([message]) fails with:

ERROR ValidationException: This model doesn't support tool use in streaming mode.

The current behaviour incorrectly disables streaming for models such as "us.meta.llama3-3-70b-instruct-v1:0", even though the AWS documentation confirms that Meta models support streaming.
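
As a possible interim workaround (untested, and only relevant if the validator does not overwrite an explicitly set value), one could try forcing streaming back on via the disable_streaming field that ChatBedrock inherits from langchain_core's BaseChatModel:

# Untested workaround sketch: explicitly pass disable_streaming=False. This only
# helps if the streaming-support validator respects an explicitly provided value.
from langchain_aws import ChatBedrock
from langchain_core.messages import HumanMessage

chat = ChatBedrock(
    model_id="us.meta.llama3-3-70b-instruct-v1:0",
    disable_streaming=False,
)
print(next(chat.stream([HumanMessage(content="Hello")])))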
