Our AWS Bedrock integration currently misidentifies the streaming support behavior for some models. According to the AWS Bedrock Models Documentation, several providers—such as Anthropic, Cohere, AI21 Labs, Meta, and Mistral—support streaming mode. For Amazon models, streaming is enabled only for specific model IDs that contain keywords like "nova-lite", "nova-micro", "nova-pro", "titan-text-express", "titan-text-lite", or "titan-text-premier".
However, we have observed that a Meta model (e.g. "us.meta.llama3-3-70b-instruct-v1:0") is being flagged as having streaming disabled. This behaviour is inconsistent with the AWS documentation, which states that Meta models support streaming.
Steps to Reproduce
AWS Region: us-east-1
```python
from langchain_aws import ChatBedrock
from langchain_core.messages import HumanMessage

# Instantiate a ChatBedrock model (use a model ID that should support streaming).
# You can also test with "meta.llama3-3-70b-instruct-v1:0" (without the cross-region inference profile).
chat = ChatBedrock(model_id="us.meta.llama3-3-70b-instruct-v1:0")

# Create a sample human message
message = HumanMessage(content="Hello")

# Initiate streaming for the message
stream = chat.stream([message])

# Retrieve and print the first response from the stream
response = next(stream)
print(response)
```
Expected Behaviour
Streaming Enabled: Models from Anthropic, Cohere, AI21 Labs, Meta, and Mistral should have streaming enabled (i.e. disable_streaming is False).
Amazon Models: For Amazon models, streaming should be enabled only if the model ID contains one of the approved keywords.
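The two rules above can be sketched as a small standalone helper. This is a hypothetical illustration of the expected logic, not the actual langchain-aws implementation; the function name `disable_streaming` and the prefix-stripping heuristic for cross-region inference profiles (e.g. a leading "us.") are assumptions for this sketch.

```python
# Keywords that mark Amazon model IDs as streaming-capable (per the issue description).
AMAZON_STREAMING_KEYWORDS = (
    "nova-lite", "nova-micro", "nova-pro",
    "titan-text-express", "titan-text-lite", "titan-text-premier",
)

# Providers whose models are expected to support streaming.
STREAMING_PROVIDERS = {"anthropic", "cohere", "ai21", "meta", "mistral"}

def disable_streaming(model_id: str) -> bool:
    """Hypothetical helper: return True when streaming should be disabled.

    Strips an optional cross-region inference-profile prefix such as "us."
    before extracting the provider segment of the model ID.
    """
    parts = model_id.split(".")
    # A short leading segment (e.g. "us", "eu") is treated as a region prefix.
    provider = parts[1] if len(parts) > 2 and len(parts[0]) <= 3 else parts[0]
    if provider in STREAMING_PROVIDERS:
        return False
    if provider == "amazon":
        # Amazon models stream only when the ID contains an approved keyword.
        return not any(kw in model_id for kw in AMAZON_STREAMING_KEYWORDS)
    return True
```

Under this rule, "us.meta.llama3-3-70b-instruct-v1:0" resolves to the provider "meta" and streaming stays enabled, which is the behaviour the bug report expects.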
Actual Behaviour
ERROR ValidationException: This model doesn't support tool use in streaming mode.
The current behaviour incorrectly disables streaming for models like "us.meta.llama3-3-70b-instruct-v1:0", even though the AWS documentation confirms that such models support streaming.