Merge SDK and Lambda Releases #461
Status: Open

ezhang6811 wants to merge 11 commits into aws-observability:main from ezhang6811:zhaez/merge-releases
Changes from all commits (11 commits):
- 34d0d19 modify lambda workflow to update SDK draft release (ezhang6811)
- 0cc8497 remove previous logic uploading most recent lambda layer to new relea… (ezhang6811)
- e574fa5 create release notes skeleton for SDK release (ezhang6811)
- c51ca82 move lambda release into SDK release workflow (ezhang6811)
- 4298ed1 remove lambda release and automate release notes (ezhang6811)
- 0bc9c09 remove release environment from downstream job (ezhang6811)
- 7192017 Merge branch 'main' into zhaez/merge-releases (ezhang6811)
- 978ac26 refactor release job order (ezhang6811)
- ae1ab19 rename description for aws_region (ezhang6811)
- f1c94fd add all dependency versions to release notes (ezhang6811)
- d1a406f Merge branch 'main' into zhaez/merge-releases (thpierce)
```diff
@@ -5,6 +5,10 @@ on:
       version:
         description: The version to tag the release with, e.g., 1.2.0
         required: true
+      aws_region:
+        description: 'Deploy lambda layer to aws regions'
+        required: true
+        default: 'us-east-1, us-east-2, us-west-1, us-west-2, ap-south-1, ap-northeast-3, ap-northeast-2, ap-southeast-1, ap-southeast-2, ap-northeast-1, ca-central-1, eu-central-1, eu-west-1, eu-west-2, eu-west-3, eu-north-1, sa-east-1, af-south-1, ap-east-1, ap-south-2, ap-southeast-3, ap-southeast-4, eu-central-2, eu-south-1, eu-south-2, il-central-1, me-central-1, me-south-1, ap-southeast-5, ap-southeast-7, mx-central-1, ca-west-1, cn-north-1, cn-northwest-1'
 
 env:
   AWS_DEFAULT_REGION: us-east-1
```
```diff
@@ -15,13 +19,16 @@ env:
   RELEASE_PRIVATE_REGISTRY: 020628701572.dkr.ecr.us-west-2.amazonaws.com
   PACKAGE_NAME: aws-opentelemetry-distro
   ARTIFACT_NAME: aws_opentelemetry_distro-${{ github.event.inputs.version }}-py3-none-any.whl
+  # Legacy list of commercial regions to deploy to. New regions should NOT be added here, and instead should be added to the `aws_region` default input to the workflow.
+  LEGACY_COMMERCIAL_REGIONS: us-east-1, us-east-2, us-west-1, us-west-2, ap-south-1, ap-northeast-3, ap-northeast-2, ap-southeast-1, ap-southeast-2, ap-northeast-1, ca-central-1, eu-central-1, eu-west-1, eu-west-2, eu-west-3, eu-north-1, sa-east-1
+  LAYER_NAME: AWSOpenTelemetryDistroPython
 
 permissions:
   id-token: write
   contents: write
 
 jobs:
-  build:
+  build-sdk:
     environment: Release
     runs-on: ubuntu-latest
     steps:
```
```diff
@@ -60,6 +67,54 @@ jobs:
       # release the artifacts. adot java for reference:
       # https://github.com/aws-observability/aws-otel-java-instrumentation/tree/93870a550ac30988fbdd5d3bf1e8f9f1b37916f5/smoke-tests
 
+      - name: Upload SDK artifact
+        uses: actions/upload-artifact@v4
+        with:
+          name: ${{ env.ARTIFACT_NAME }}
+          path: dist/${{ env.ARTIFACT_NAME }}
+
+  build-layer:
+    needs: build-sdk
+    runs-on: ubuntu-latest
+    outputs:
+      aws_regions_json: ${{ steps.set-matrix.outputs.aws_regions_json }}
+    steps:
+      - name: Set up regions matrix
+        id: set-matrix
+        run: |
+          IFS=',' read -ra REGIONS <<< "${{ github.event.inputs.aws_region }}"
+          MATRIX="["
+          for region in "${REGIONS[@]}"; do
+            trimmed_region=$(echo "$region" | xargs)
+            MATRIX+="\"$trimmed_region\","
+          done
+          MATRIX="${MATRIX%,}]"
+          echo ${MATRIX}
+          echo "aws_regions_json=${MATRIX}" >> $GITHUB_OUTPUT
+      - name: Checkout Repo @ SHA - ${{ github.sha }}
+        uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
+        with:
+          python-version: '3.x'
+      - name: Build layers
+        working-directory: lambda-layer/src
+        run: |
+          ./build-lambda-layer.sh
+          pip install tox
+          tox
+      - name: upload layer
+        uses: actions/upload-artifact@v4
+        with:
+          name: layer.zip
+          path: lambda-layer/src/build/aws-opentelemetry-python-layer.zip
+
+  publish-sdk:
+    needs: [build-sdk, build-layer]
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout Repo @ SHA - ${{ github.sha }}
+        uses: actions/checkout@v4
+
       - name: Configure AWS credentials for PyPI secrets
         uses: aws-actions/configure-aws-credentials@v4
         with:
```
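The `Set up regions matrix` step in `build-layer` turns the comma-separated `aws_region` input into a JSON array that `fromJson()` can feed into the downstream job matrix. A minimal bash sketch of that loop, with a shortened, made-up region string standing in for `${{ github.event.inputs.aws_region }}`:

```shell
#!/usr/bin/env bash
# Stand-in for the workflow input ${{ github.event.inputs.aws_region }}.
aws_region="us-east-1, us-west-2, eu-central-1"

# Split on commas; each token may carry leading/trailing spaces.
IFS=',' read -ra REGIONS <<< "$aws_region"
MATRIX="["
for region in "${REGIONS[@]}"; do
  trimmed_region=$(echo "$region" | xargs)  # xargs trims surrounding whitespace
  MATRIX+="\"$trimmed_region\","
done
MATRIX="${MATRIX%,}]"  # drop the trailing comma, close the JSON array

echo "$MATRIX"  # prints ["us-east-1","us-west-2","eu-central-1"]
```

In the workflow the result is written to `$GITHUB_OUTPUT` as `aws_regions_json`, which `publish-layer-prod` consumes via `fromJson()` in its matrix strategy.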
```diff
@@ -102,20 +157,25 @@ jobs:
       - name: Install twine
         run: pip install twine
 
+      - name: Download SDK artifact
+        uses: actions/download-artifact@v4
+        with:
+          name: ${{ env.ARTIFACT_NAME }}
+
       - name: Publish to TestPyPI
         env:
           TWINE_USERNAME: '__token__'
           TWINE_PASSWORD: ${{ env.TEST_PYPI_TOKEN_API_TOKEN }}
         run: |
-          twine upload --repository testpypi --skip-existing --verbose dist/${{ env.ARTIFACT_NAME }}
+          twine upload --repository testpypi --skip-existing --verbose ${{ env.ARTIFACT_NAME }}
 
       # Publish to prod PyPI
       - name: Publish to PyPI
         env:
           TWINE_USERNAME: '__token__'
           TWINE_PASSWORD: ${{ env.PROD_PYPI_TOKEN_API_TOKEN }}
         run: |
-          twine upload --skip-existing --verbose dist/${{ env.ARTIFACT_NAME }}
+          twine upload --skip-existing --verbose ${{ env.ARTIFACT_NAME }}
 
       # Publish to public ECR
       - name: Build and push public ECR image
```
```diff
@@ -138,29 +198,230 @@ jobs:
           platforms: linux/amd64,linux/arm64
           tags: |
             ${{ env.RELEASE_PRIVATE_REPOSITORY }}:v${{ github.event.inputs.version }}
 
```
```diff
+  publish-layer-prod:
+    runs-on: ubuntu-latest
+    needs: [build-layer, publish-sdk]
+    strategy:
+      matrix:
+        aws_region: ${{ fromJson(needs.build-layer.outputs.aws_regions_json) }}
+    steps:
+      - name: role arn
+        env:
+          LEGACY_COMMERCIAL_REGIONS: ${{ env.LEGACY_COMMERCIAL_REGIONS }}
+        run: |
+          LEGACY_COMMERCIAL_REGIONS_ARRAY=(${LEGACY_COMMERCIAL_REGIONS//,/ })
+          FOUND=false
+          for REGION in "${LEGACY_COMMERCIAL_REGIONS_ARRAY[@]}"; do
+            if [[ "$REGION" == "${{ matrix.aws_region }}" ]]; then
+              FOUND=true
+              break
+            fi
+          done
+          if [ "$FOUND" = true ]; then
+            echo "Found ${{ matrix.aws_region }} in LEGACY_COMMERCIAL_REGIONS"
+            SECRET_KEY="LAMBDA_LAYER_RELEASE"
+          else
+            echo "Not found ${{ matrix.aws_region }} in LEGACY_COMMERCIAL_REGIONS"
+            SECRET_KEY="${{ matrix.aws_region }}_LAMBDA_LAYER_RELEASE"
+          fi
+          SECRET_KEY=${SECRET_KEY//-/_}
+          echo "SECRET_KEY=${SECRET_KEY}" >> $GITHUB_ENV
+      - uses: aws-actions/configure-aws-credentials@v4
+        with:
+          role-to-assume: ${{ secrets[env.SECRET_KEY] }}
+          role-duration-seconds: 1200
+          aws-region: ${{ matrix.aws_region }}
+      - name: Get s3 bucket name for release
+        run: |
+          echo BUCKET_NAME=python-lambda-layer-${{ github.run_id }}-${{ matrix.aws_region }} | tee --append $GITHUB_ENV
+      - name: download layer.zip
+        uses: actions/download-artifact@v4
+        with:
+          name: layer.zip
+      - name: publish
+        run: |
+          aws s3 mb s3://${{ env.BUCKET_NAME }}
+          aws s3 cp aws-opentelemetry-python-layer.zip s3://${{ env.BUCKET_NAME }}
+          layerARN=$(
+            aws lambda publish-layer-version \
+              --layer-name ${{ env.LAYER_NAME }} \
+              --content S3Bucket=${{ env.BUCKET_NAME }},S3Key=aws-opentelemetry-python-layer.zip \
+              --compatible-runtimes python3.10 python3.11 python3.12 python3.13 \
+              --compatible-architectures "arm64" "x86_64" \
+              --license-info "Apache-2.0" \
+              --description "AWS Distro of OpenTelemetry Lambda Layer for Python Runtime" \
+              --query 'LayerVersionArn' \
+              --output text
+          )
+          echo $layerARN
+          echo "LAYER_ARN=${layerARN}" >> $GITHUB_ENV
+          mkdir ${{ env.LAYER_NAME }}
+          echo $layerARN > ${{ env.LAYER_NAME }}/${{ matrix.aws_region }}
+          cat ${{ env.LAYER_NAME }}/${{ matrix.aws_region }}
+      - name: public layer
+        run: |
+          layerVersion=$(
+            aws lambda list-layer-versions \
+              --layer-name ${{ env.LAYER_NAME }} \
+              --query 'max_by(LayerVersions, &Version).Version'
+          )
+          aws lambda add-layer-version-permission \
+            --layer-name ${{ env.LAYER_NAME }} \
+            --version-number $layerVersion \
+            --principal "*" \
+            --statement-id publish \
+            --action lambda:GetLayerVersion
+      - name: upload layer arn artifact
+        if: ${{ success() }}
+        uses: actions/upload-artifact@v4
+        with:
+          name: ${{ env.LAYER_NAME }}-${{ matrix.aws_region }}
+          path: ${{ env.LAYER_NAME }}/${{ matrix.aws_region }}
+      - name: clean s3
+        if: always()
+        run: |
+          aws s3 rb --force s3://${{ env.BUCKET_NAME }}
```
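The `role arn` step decides which GitHub secret holds the publishing role for each region: legacy commercial regions share a single `LAMBDA_LAYER_RELEASE` secret, while newer regions use a region-prefixed one, with dashes mapped to underscores. A runnable bash sketch, with short made-up values standing in for `env.LEGACY_COMMERCIAL_REGIONS` and `matrix.aws_region`:

```shell
#!/usr/bin/env bash
# Stand-ins for env.LEGACY_COMMERCIAL_REGIONS and matrix.aws_region
# (the real legacy list is longer; these values are for illustration only).
LEGACY_COMMERCIAL_REGIONS="us-east-1, us-east-2, eu-west-1"
TARGET_REGION="ap-southeast-7"

# Replace commas with spaces and let word splitting build the array.
LEGACY_COMMERCIAL_REGIONS_ARRAY=(${LEGACY_COMMERCIAL_REGIONS//,/ })
FOUND=false
for REGION in "${LEGACY_COMMERCIAL_REGIONS_ARRAY[@]}"; do
  if [[ "$REGION" == "$TARGET_REGION" ]]; then
    FOUND=true
    break
  fi
done
if [ "$FOUND" = true ]; then
  SECRET_KEY="LAMBDA_LAYER_RELEASE"                    # legacy regions share one secret
else
  SECRET_KEY="${TARGET_REGION}_LAMBDA_LAYER_RELEASE"   # newer regions are region-scoped
fi
SECRET_KEY=${SECRET_KEY//-/_}  # GitHub secret names use underscores, not dashes

echo "$SECRET_KEY"  # prints ap_southeast_7_LAMBDA_LAYER_RELEASE
```

The derived name is then used to look up the role ARN via `secrets[env.SECRET_KEY]` in the credentials step.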
```diff
+  generate-lambda-release-note:
+    runs-on: ubuntu-latest
+    needs: publish-layer-prod
+    outputs:
+      layer-note: ${{ steps.layer-note.outputs.layer-note }}
+    steps:
+      - name: Checkout Repo @ SHA - ${{ github.sha }}
+        uses: actions/checkout@v4
+      - uses: hashicorp/setup-terraform@v2
+      - name: download layerARNs
+        uses: actions/download-artifact@v4
+        with:
+          pattern: ${{ env.LAYER_NAME }}-*
+          path: ${{ env.LAYER_NAME }}
+          merge-multiple: true
+      - name: show layerARNs
+        run: |
+          for file in ${{ env.LAYER_NAME }}/*
+          do
+            echo $file
+            cat $file
+          done
+      - name: generate layer-note
+        id: layer-note
+        working-directory: ${{ env.LAYER_NAME }}
+        run: |
+          echo "| Region | Layer ARN |" >> ../layer-note
+          echo "| ---- | ---- |" >> ../layer-note
+          for file in *
+          do
+            read arn < $file
+            echo "| " $file " | " $arn " |" >> ../layer-note
+          done
+          cd ..
+          {
+            echo "layer-note<<EOF"
+            cat layer-note
+            echo "EOF"
+          } >> $GITHUB_OUTPUT
+          cat layer-note
+      - name: generate tf layer
+        working-directory: ${{ env.LAYER_NAME }}
+        run: |
+          echo "locals {" >> ../layer_arns.tf
+          echo "  sdk_layer_arns = {" >> ../layer_arns.tf
+          for file in *
+          do
+            read arn < $file
+            echo "    \""$file"\" = \""$arn"\"" >> ../layer_arns.tf
+          done
+          cd ..
+          echo "  }" >> layer_arns.tf
+          echo "}" >> layer_arns.tf
+          terraform fmt layer_arns.tf
+          cat layer_arns.tf
+      - name: generate layer ARN constants for CDK
+        working-directory: ${{ env.LAYER_NAME }}
+        run: |
+          echo "{" > ../layer_cdk
+          for file in *; do
+            read arn < "$file"
+            echo "  \"$file\": \"$arn\"," >> ../layer_cdk
+          done
+          echo "}" >> ../layer_cdk
+          cat ../layer_cdk
```
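The `generate layer-note` step reads one file per region, each named after the region and containing the published layer ARN on a single line, and assembles the markdown table that later lands in the release notes. A self-contained bash sketch using fabricated ARN files in place of the downloaded artifacts:

```shell
#!/usr/bin/env bash
# Fabricated per-region ARN files; in the workflow these are the downloaded
# ${LAYER_NAME}-<region> artifacts, one ARN per file, named after the region.
mkdir -p AWSOpenTelemetryDistroPython
cd AWSOpenTelemetryDistroPython
echo "arn:aws:lambda:eu-west-1:123456789012:layer:AWSOpenTelemetryDistroPython:1" > eu-west-1
echo "arn:aws:lambda:us-east-1:123456789012:layer:AWSOpenTelemetryDistroPython:1" > us-east-1

# Same loop as the workflow step: one markdown table row per region file.
echo "| Region | Layer ARN |" >> ../layer-note
echo "| ---- | ---- |" >> ../layer-note
for file in *
do
  read arn < "$file"
  echo "| " $file " | " $arn " |" >> ../layer-note
done
cd ..
cat layer-note
```

The same per-region files also drive the `generate tf layer` and CDK-constants steps, which emit a Terraform `locals` block and a JSON map respectively.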
```diff
+  publish-github:
+    needs: generate-lambda-release-note
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout Repo @ SHA - ${{ github.sha }}
+        uses: actions/checkout@v4
+
+      - name: Download SDK artifact
+        uses: actions/download-artifact@v4
+        with:
+          name: ${{ env.ARTIFACT_NAME }}
+
+      - name: Download layer.zip artifact
+        uses: actions/download-artifact@v4
+        with:
+          name: layer.zip
+
-      - name: Get SHA256 checksum of wheel file
-        id: get_sha256
+      - name: Rename layer file
         run: |
-          shasum -a 256 dist/${{ env.ARTIFACT_NAME }} | sed "s|dist/||" > ${{ env.ARTIFACT_NAME }}.sha256
+          cp aws-opentelemetry-python-layer.zip layer.zip
 
       # Publish to GitHub releases
       - name: Create GH release
         id: create_release
         env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         run: |
-          # Download layer.zip from existing latest tagged SDK release note
-          LATEST_SDK_VERSION=$(gh release list --repo "aws-observability/aws-otel-python-instrumentation" --json tagName,isLatest -q 'map(select(.isLatest==true)) | .[0].tagName')
-          mkdir -p layer_artifact
-          gh release download "$LATEST_SDK_VERSION" --repo "aws-observability/aws-otel-python-instrumentation" --pattern "layer.zip" --dir layer_artifact
-          shasum -a 256 layer_artifact/layer.zip > layer_artifact/layer.zip.sha256
+          # Extract all dependencies from pyproject.toml
+          DEPS=$(python3 -c "
+          import re
+          with open('aws-opentelemetry-distro/pyproject.toml', 'r') as f:
+              content = f.read()
+          deps_match = re.search(r'dependencies\s*=\s*\[(.*?)\]', content, re.DOTALL)
+          if deps_match:
+              deps_content = deps_match.group(1)
+              dep_lines = re.findall(r'\"([^\"]+)\"', deps_content)
+              formatted_deps = []
+              for dep_line in dep_lines:
+                  if ' == ' in dep_line:
+                      package, version = dep_line.split(' == ', 1)
+                      formatted_deps.append(f'- \`{package}\` - {version}')
+                  else:
+                      formatted_deps.append(f'- \`{dep_line}\`')
+              print('\n'.join(formatted_deps))
+          ")
+
+          # Create release notes
+          cat > release_notes.md << EOF
+          This release contains the following upstream components:
+
+          $DEPS
+
+          This release also publishes to public ECR and PyPi.
+          * See ADOT Python auto-instrumentation Docker image v${{ github.event.inputs.version }} in our public ECR repository:
+          https://gallery.ecr.aws/aws-observability/adot-autoinstrumentation-python
+          * See version ${{ github.event.inputs.version }} in our PyPi repository:
+          https://pypi.org/project/aws-opentelemetry-distro/
+
+          This release also includes the AWS OpenTelemetry Lambda Layer for Python version ${{ github.event.inputs.version }}-$(echo $GITHUB_SHA | cut -c1-7).
+
+          Lambda Layer ARNs:
+          ${{ needs.generate-lambda-release-note.outputs.layer-note }}
+          EOF
+
+          shasum -a 256 ${{ env.ARTIFACT_NAME }} > ${{ env.ARTIFACT_NAME }}.sha256
+          shasum -a 256 layer.zip > layer.zip.sha256
+
           gh release create --target "$GITHUB_REF_NAME" \
             --title "Release v${{ github.event.inputs.version }}" \
             --notes-file release_notes.md \
             --draft \
             "v${{ github.event.inputs.version }}" \
-            dist/${{ env.ARTIFACT_NAME }} \
+            ${{ env.ARTIFACT_NAME }} \
             ${{ env.ARTIFACT_NAME }}.sha256 \
-            layer_artifact/layer.zip \
-            layer_artifact/layer.zip.sha256
+            layer.zip \
+            layer.zip.sha256
```
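The inline Python in `publish-github` pulls the `dependencies` array out of `pyproject.toml` with a regex and formats one markdown bullet per entry, splitting out the version when the pin uses ` == `. A self-contained sketch of that extraction against a made-up `pyproject.toml` (the real one lives at `aws-opentelemetry-distro/pyproject.toml`):

```shell
#!/usr/bin/env bash
# Made-up pyproject.toml standing in for aws-opentelemetry-distro/pyproject.toml.
cat > pyproject.toml << 'TOML'
[project]
dependencies = [
    "opentelemetry-sdk == 1.25.0",
    "typing-extensions",
]
TOML

# Same extraction as the workflow: grab the dependencies array with a
# regex and emit one markdown bullet per entry, appending pinned versions.
DEPS=$(python3 -c "
import re
with open('pyproject.toml', 'r') as f:
    content = f.read()
deps_match = re.search(r'dependencies\s*=\s*\[(.*?)\]', content, re.DOTALL)
if deps_match:
    dep_lines = re.findall(r'\"([^\"]+)\"', deps_match.group(1))
    formatted_deps = []
    for dep_line in dep_lines:
        if ' == ' in dep_line:
            package, version = dep_line.split(' == ', 1)
            formatted_deps.append(f'- \`{package}\` - {version}')
        else:
            formatted_deps.append(f'- \`{dep_line}\`')
    print('\n'.join(formatted_deps))
")
echo "$DEPS"
```

For the sample input this prints a pinned bullet for `opentelemetry-sdk` and an unpinned one for `typing-extensions`; the workflow interpolates the same `$DEPS` string into the release-notes heredoc.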
Review comments:
Not needed, right?
We use layer.zip as the name to identify the artifact, but it's still referred to by the path it was uploaded with. See the old lambda workflow.