S3 to Frame.io Lambda copy

Serverless workflow to copy an S3 bucket into a Frame.io project using AWS Lambda. See the accompanying Medium article for a walkthrough.

Features:

  • Quick copy from S3 to Frame.io
  • Handles files and buckets of any size
  • Preserves folder structure
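Folder structure is preserved by mirroring each S3 key's prefix path as nested folders in Frame.io. A minimal sketch of that mapping (the helper name is illustrative, not from the repo):

```python
# Split an S3 key into its folder path and filename so the bucket's
# prefix hierarchy can be recreated as nested folders in Frame.io.
# This helper is illustrative; it is not part of the repository's code.
def folder_parts(key):
    *folders, filename = key.split("/")
    return folders, filename
```

For example, `folder_parts("shoots/day1/clip.mov")` yields `(["shoots", "day1"], "clip.mov")`, so the `shoots/day1` prefix becomes a two-level folder chain.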

Architecture:

  • API Gateway triggers the workflow from a POST request
  • The main Lambda verifies the project info
  • The copy Lambda copies files from S3 to Frame.io
  • The copy Lambda re-invokes itself before it reaches the Lambda execution timeout, so buckets of any size can be copied
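The restart-before-timeout pattern can be sketched as follows. This is an illustrative reconstruction, not the repository's actual handler; the payload fields and safety margin are assumptions:

```python
import json

COPIED = []  # stands in for real copy work in this sketch

def copy_one(key):
    """Placeholder for copying a single object to Frame.io."""
    COPIED.append(key)

def copy_handler(event, context, lambda_client=None):
    """Copy keys until execution time runs low, then re-invoke this
    same function asynchronously with the keys that are still left."""
    keys = list(event["keys"])
    while keys:
        if context.get_remaining_time_in_millis() < 30_000:  # ~30 s safety margin
            if lambda_client is None:
                import boto3  # bundled with the Lambda runtime
                lambda_client = boto3.client("lambda")
            lambda_client.invoke(
                FunctionName=context.function_name,
                InvocationType="Event",  # async: fire and return immediately
                Payload=json.dumps({**event, "keys": keys}),
            )
            return {"restarted": True, "remaining": len(keys)}
        copy_one(keys.pop(0))
    return {"restarted": False, "remaining": 0}
```

Because the remaining keys travel in the re-invocation payload, each fresh invocation picks up exactly where the previous one left off.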

Copying is done by generating a presigned S3 URL for each object and sending it to Frame.io for ingest.

Getting started:

Make sure you have npm and docker installed and running.

git clone https://github.com/strombergdev/s3-to-frameio-lambda-copy.git
cd s3-to-frameio-lambda-copy 
npm install -g serverless
sls plugin install -n serverless-python-requirements

Set up AWS credentials: https://www.youtube.com/watch?v=KngM5bfpttA

Open handler.py and add your Frame.io developer token, then deploy:

sls deploy

Trigger the workflow by making a POST request to your new endpoint with the JSON body below.

{
  "bucket": "your_bucket",
  "project": "your_frameio_project",
  "token": "your_frameio_dev_token"
}


*****************************

STEP BY STEP USING CLOUD9

Install Python 3.8 - https://tecadmin.net/install-python-3-8-amazon-linux/

sudo yum install gcc openssl-devel bzip2-devel libffi-devel zlib-devel
cd /opt
sudo wget https://www.python.org/ftp/python/3.8.12/Python-3.8.12.tgz
sudo tar xzf Python-3.8.12.tgz
cd Python-3.8.12
sudo ./configure --enable-optimizations
sudo make altinstall
sudo rm -f /opt/Python-3.8.12.tgz
python3.8 -V

INSTALL REPOSITORY (run the commands below)

git clone https://github.com/strombergdev/s3-to-frameio-lambda-copy.git
cd s3-to-frameio-lambda-copy
npm install -g serverless
sls plugin install -n serverless-python-requirements

  • Open handler.py and add your Frame.io developer token, then run sls deploy

SETUP SERVERLESS (run the commands below)

serverless

  • (then follow the on-screen prompts to connect GitHub/AWS)

pip install pipenv
sudo cp /home/ec2-user/.local/lib/python3.7/site-packages/certifi/cacert.pem /usr/lib/python3.7/site-packages/pip/_vendor/certifi/
docker system prune --all --force
sls deploy

You will get results back like this:

Serverless: Stack update finished...
Service Information
service: s3-to-frameio-lambda-copy
stage: dev
region: us-east-1
stack: s3-to-frameio-lambda-copy-dev
resources: 21
api keys:
  None
endpoints:
  POST - https://xxxxxx.us-east-1.amazonaws.com/dev/
functions:
  main: s3-to-frameio-lambda-copy-dev-main
  copy: s3-to-frameio-lambda-copy-dev-copy
layers:
  None
Serverless: Publishing service to the Serverless Dashboard...

TEST VIDEO PROCESSING

Go to https://reqbin.com/req/v0crmky0/rest-api-post-example. For the URL, type in the POST endpoint URL you got from the previous step. It should look like this: https://xxxxxx.us-east-1.amazonaws.com/dev/

Select the Content tab and enter the JSON below, substituting the values with your own:

{
  "bucket": "recordings",
  "project": "S3Upload",
  "token": "XXXX"
}

AUTOMATE PROCESS WITH LAMBDA

You can write a simple Python script in Lambda and trigger it from S3:

  • Go to Lambda in the AWS Console
  • Click Create Function, then click Author from Scratch
  • Give your function a name (e.g. AutomatedS3toFrameIOUpload)
  • Choose Python 3.7 as your Runtime, then click Create Function

Here is the script:

import requests

def lambda_handler(event, context):
    # UPDATE THE URL
    URL = "https://XXXX.execute-api.us-east-1.amazonaws.com/dev/"

    # UPDATE THE BELOW PAYLOAD SETTINGS
    payload = {"bucket": "my-recordings", "project": "S3Upload", "token": "XXXX"}
    requests.post(URL, json=payload)

Paste the above, changing the parameters to your own. Note that the requests library is not bundled with the Lambda Python runtime, so you may need to package it with the function or add it as a layer.

  • Go to Add trigger, search for S3, and select your bucket name
  • Under Event type, select All object create events
  • Define a Suffix (e.g. .mp4 or .mov) if you want to restrict which file types trigger the function
  • Check the Acknowledge box and click Add
  • Hit Deploy in Lambda
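As an optional refinement (not in the original script), the bucket name can be read from the S3 event instead of being hardcoded, so one function can serve several trigger buckets; the event shape below is the standard S3 notification format:

```python
def bucket_from_event(event):
    """Extract the bucket name from a standard S3 put-notification event."""
    return event["Records"][0]["s3"]["bucket"]["name"]
```

Inside `lambda_handler`, you could then set `payload["bucket"] = bucket_from_event(event)` before posting.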

RUN A TEST

Copy a video file to the S3 bucket you defined above. It will trigger the Lambda function, which starts the copy to your Frame.io project.
