Cleanup BEDROCK_API_KEY as not required (#112)
AnirudhDagar authored Nov 12, 2024
1 parent c600b3d commit 2334a49
Showing 8 changed files with 10 additions and 10 deletions.
README.md (5 changes: 2 additions & 3 deletions)
@@ -25,16 +25,15 @@ cd autogluon-assistant && pip install -e ".[dev]" && cd ..
AG-A supports using both AWS Bedrock and OpenAI as LLM model providers. You will need to set up API keys for the respective provider you choose. By default, AG-A uses AWS Bedrock for its language models.

#### AWS Bedrock Setup
-AG-A integrates with AWS Bedrock by default. To use AWS Bedrock, you will need to configure your AWS credentials and region settings, along with the Bedrock-specific API key:
+AG-A integrates with AWS Bedrock by default. To use AWS Bedrock, you will need to configure your AWS credentials and region settings:

```bash
-export BEDROCK_API_KEY="<your-bedrock-api-key>"
export AWS_DEFAULT_REGION="<your-region>"
export AWS_ACCESS_KEY_ID="<your-access-key>"
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
```

-Ensure you have an active AWS account and appropriate permissions set up for Bedrock. You can manage your AWS credentials through the AWS Management Console.
+Ensure you have an active AWS account and appropriate permissions set up for using Bedrock models. You can manage your AWS credentials through the AWS Management Console. See [Bedrock supported AWS regions](https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-regions.html)
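
As a quick sanity check (an aside, not part of this commit), the exported credentials and region can be verified with boto3 before launching AG-A:

```python
# Sketch only, not part of this commit: confirm that boto3 picks up the
# exported credentials and region before running AG-A.
import boto3

session = boto3.session.Session()  # reads the AWS_* variables from the environment
print("Region:", session.region_name)

bedrock = session.client("bedrock")  # requires permission to list foundation models
models = bedrock.list_foundation_models()["modelSummaries"]
print(f"{len(models)} Bedrock foundation models visible in this region")
```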


#### OpenAI Setup
docs/tutorials/autogluon-assistant-quick-start.ipynb (2 changes: 0 additions & 2 deletions)
@@ -58,7 +58,6 @@
"outputs": [],
"source": [
"# Option A: AWS Bedrock (Recommended)\n",
"!export BEDROCK_API_KEY='4509...'\n",
"!export AWS_DEFAULT_REGION='<your-region>'\n",
"!export AWS_ACCESS_KEY_ID='<your-access-key>'\n",
"!export AWS_SECRET_ACCESS_KEY='<your-secret-key>'\n",
@@ -247,7 +246,6 @@
" 'autogluon': {'predictor_init_kwargs': {}, 'predictor_fit_kwargs': {'presets': 'medium_quality', 'time_limit': 600}},\n",
" 'llm': {\n",
" 'provider': 'bedrock',\n",
" 'api_key_location': 'BEDROCK_API_KEY',\n",
" 'model': 'anthropic.claude-3-5-sonnet-20241022-v2:0',\n",
" 'max_tokens': 512,\n",
" 'proxy_url': None,\n",
src/autogluon_assistant/configs/best_quality.yaml (2 changes: 1 addition & 1 deletion)
@@ -26,8 +26,8 @@ autogluon:
time_limit: 14400
llm:
# Note: bedrock is only supported in limited AWS regions
+# and requires AWS credentials
provider: bedrock
-api_key_location: BEDROCK_API_KEY
model: "anthropic.claude-3-5-sonnet-20241022-v2:0"
# provider: openai
# api_key_location: OPENAI_API_KEY
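
For orientation (an illustrative sketch, not part of this commit): a preset such as best_quality.yaml could be inspected with OmegaConf, the library behind the DictConfig type that appears in llm.py below. The load path here is an assumption about the repository layout, not the project's actual entry point.

```python
# Illustrative only: inspect the preset with OmegaConf; the path is assumed.
from omegaconf import OmegaConf

cfg = OmegaConf.load("src/autogluon_assistant/configs/best_quality.yaml")
print(cfg.llm.provider)  # bedrock
print(cfg.llm.model)     # anthropic.claude-3-5-sonnet-20241022-v2:0
```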
src/autogluon_assistant/configs/high_quality.yaml (2 changes: 1 addition & 1 deletion)
@@ -13,8 +13,8 @@ autogluon:
time_limit: 3600
llm:
# Note: bedrock is only supported in limited AWS regions
+# and requires AWS credentials
provider: bedrock
-api_key_location: BEDROCK_API_KEY
model: "anthropic.claude-3-5-sonnet-20241022-v2:0"
# provider: openai
# api_key_location: OPENAI_API_KEY
src/autogluon_assistant/configs/medium_quality.yaml (2 changes: 1 addition & 1 deletion)
@@ -13,8 +13,8 @@ autogluon:
time_limit: 600
llm:
# Note: bedrock is only supported in limited AWS regions
+# and requires AWS credentials
provider: bedrock
-api_key_location: BEDROCK_API_KEY
model: "anthropic.claude-3-5-sonnet-20241022-v2:0"
# provider: openai
# api_key_location: OPENAI_API_KEY
src/autogluon_assistant/llm/llm.py (2 changes: 2 additions & 0 deletions)
@@ -115,6 +115,7 @@ def get_openai_models() -> List[str]:
@staticmethod
def get_bedrock_models() -> List[str]:
try:
+# TODO: Remove hardcoding AWS region
bedrock = boto3.client("bedrock", region_name="us-west-2")
response = bedrock.list_foundation_models()
return [model["modelId"] for model in response["modelSummaries"]]
@@ -165,6 +166,7 @@ def _get_bedrock_chat_model(config: DictConfig) -> AssistantChatBedrock:
"temperature": config.temperature,
"max_tokens": config.max_tokens,
},
+# TODO: Remove hardcoding AWS region
region_name="us-west-2",
verbose=config.verbose,
)
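
A hypothetical way the two TODOs above might eventually be resolved (not something this commit does) is to read the region from the same AWS_DEFAULT_REGION variable documented in the README and fall back to us-west-2:

```python
# Hypothetical sketch, not part of this commit: prefer the documented
# AWS_DEFAULT_REGION environment variable and fall back to us-west-2.
import os

region_name = os.environ.get("AWS_DEFAULT_REGION", "us-west-2")
```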
@@ -24,6 +24,7 @@ def __init__(
num_iterations: int = 2,
optimization_metric: str = "roc",
eval_model: str = "lightgbm",
+# TODO: Remove hardcoding AWS region
region_name: str = "us-west-2",
**kwargs,
) -> None:
src/autogluon_assistant/ui/constants.py (4 changes: 2 additions & 2 deletions)
@@ -43,8 +43,8 @@
# Provider configuration
PROVIDER_MAPPING = {"Claude 3.5 with Amazon Bedrock": "bedrock", "GPT 4o": "openai"}


API_KEY_LOCATION = {"Claude 3.5 with Amazon Bedrock": "BEDROCK_API_KEY", "GPT 4o": "OPENAI_API_KEY"}
# TODO: Remove model specific mappings
API_KEY_LOCATION = {"GPT 4o": "OPENAI_API_KEY"}

INITIAL_STAGE = {
"Task Understanding": [],
Expand Down
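
For context (an illustrative sketch, not taken from the UI code): after this change, resolving the selected model yields a provider for both options but an API-key environment variable only for OpenAI.

```python
# Illustrative only; the dicts mirror the constants above, the usage is assumed.
PROVIDER_MAPPING = {"Claude 3.5 with Amazon Bedrock": "bedrock", "GPT 4o": "openai"}
API_KEY_LOCATION = {"GPT 4o": "OPENAI_API_KEY"}

selected = "Claude 3.5 with Amazon Bedrock"
provider = PROVIDER_MAPPING[selected]         # -> "bedrock"
api_key_env = API_KEY_LOCATION.get(selected)  # -> None; Bedrock relies on AWS credentials
```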
