
Viewing posts tagged S3

Django Admin file field (widget) for AWS identity-based S3 uploads

When you are working with AWS serverless, you have probably run into the request body size limit of your Lambda function: it simply won't allow file uploads beyond the specified limit.
Here is a guide if your website runs Django deployed on AWS Lambda + API Gateway (Zappa) and you want to allow file uploads of any size in the Django admin using AWS S3 identity-based uploads.

Write one API endpoint that returns a JSON response like the following:
     `/api/awsIdentity/`  (any endpoint you like; make sure it is authenticated and GET only)
     {
          "IdentityId": "",
          "Token": "",
          "bucket_name": "",
          "bucket_region": "",
          "auth_role_arn": ""
     }
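
A minimal sketch of such a view, assuming developer-authenticated identities; the settings names (AWS_REGION, COGNITO_IDENTITY_POOL_ID, COGNITO_DEVELOPER_PROVIDER, AWS_UPLOAD_BUCKET, COGNITO_AUTH_ROLE_ARN) are placeholders for wherever you keep these values:

    import boto3
    from django.conf import settings
    from django.contrib.admin.views.decorators import staff_member_required
    from django.http import JsonResponse
    from django.views.decorators.http import require_GET

    @require_GET
    @staff_member_required
    def aws_identity(request):
        # Exchange the logged-in admin user for a Cognito identity id + OpenID
        # token (developer-authenticated identities), then hand the client
        # everything it needs to upload straight to S3.
        client = boto3.client("cognito-identity", region_name=settings.AWS_REGION)
        resp = client.get_open_id_token_for_developer_identity(
            IdentityPoolId=settings.COGNITO_IDENTITY_POOL_ID,
            Logins={settings.COGNITO_DEVELOPER_PROVIDER: str(request.user.pk)},
        )
        return JsonResponse({
            "IdentityId": resp["IdentityId"],
            "Token": resp["Token"],
            "bucket_name": settings.AWS_UPLOAD_BUCKET,
            "bucket_region": settings.AWS_REGION,
            "auth_role_arn": settings.COGNITO_AUTH_ROLE_ARN,
        })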

Your widget for the file field and your admin form will look something like this:

    from django import forms
    from django.contrib import admin
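
A rough sketch of the idea, where the `Document` model, the widget template, and the JavaScript file are placeholders; the actual direct-to-S3 upload happens in browser-side JS that calls the `/api/awsIdentity/` endpoint above and writes the resulting object key back into the form field:

    from django import forms
    from django.contrib import admin

    from myapp.models import Document  # hypothetical model; file_key is a CharField

    class S3IdentityUploadWidget(forms.TextInput):
        # The stored value is just the S3 object key. The widget template renders
        # a file <input> next to this text input; the JS uploads the chosen file
        # straight to S3 using the identity/token from /api/awsIdentity/ and then
        # fills the key in here before the form is submitted.
        template_name = "widgets/s3_identity_upload.html"  # hypothetical template

        class Media:
            js = ("js/s3-identity-upload.js",)  # hypothetical upload script

    class DocumentAdminForm(forms.ModelForm):
        class Meta:
            model = Document
            fields = "__all__"
            widgets = {"file_key": S3IdentityUploadWidget()}

    @admin.register(Document)
    class DocumentAdmin(admin.ModelAdmin):
        form = DocumentAdminForm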

Copy files from one AWS S3 bucket to another with public permissions

Here is what you may be looking for:
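
A minimal sketch using boto3 (bucket names are placeholders): it walks every key in the source bucket and copies it across, granting public read access via the `public-read` canned ACL (this assumes the destination bucket allows ACLs):

    import boto3

    s3 = boto3.client("s3")
    SRC_BUCKET = "my-source-bucket"       # placeholder
    DST_BUCKET = "my-destination-bucket"  # placeholder

    # Copy every object from the source bucket and make the copies publicly readable.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=SRC_BUCKET):
        for obj in page.get("Contents", []):
            s3.copy_object(
                Bucket=DST_BUCKET,
                Key=obj["Key"],
                CopySource={"Bucket": SRC_BUCKET, "Key": obj["Key"]},
                ACL="public-read",
            )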

AWS Cognito setup to work with AWS S3 identity-based uploads

1. In the AWS S3 console:
    Set the CORS configuration below on your bucket.
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedMethod>DELETE</AllowedMethod>
        <AllowedMethod>HEAD</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
2. In the Cognito console, set up a user pool:
Manage User Pools > Custom settings > name the pool "TestPool" > "Allow users to sign themselves up?" set to only administrators > no verification (neither email nor phone) > App clients > Add an app client > "TestPoolApp" > check "Generate client secret" > note down the pool ID and ARN, the app client ID, and the app client secret. (If you prefer scripting steps 2 and 3, see the boto3 sketch after this list.)
3. In the Cognito console, set up Federated Identities:
Click "Federated Identities" > name the identity pool > under "Authentication providers", in the "Cognito" tab, enter the details from the previous step > in the "Custom" tab, set a developer provider name > Create > edit your `Auth_Role` > attach the following policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "mobileanalytics:PutEvents",
                "cognito-sync:*",
                "cognito-identity:*"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::<yourbucketname>/${cognito-identity.amazonaws.com:sub}/*"
            ]
        }
    ]
}
Note: with this policy a user can upload files only under the key prefix named after their identity ID.

4. Get the ARN of the above auth role and also note down the identity pool ID.
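
A rough boto3 equivalent of steps 2 and 3, if you would rather script the Cognito setup than click through the console (the pool and app client names are the ones used above; the identity pool name, developer provider name and auth role ARN are placeholders, and the auth role carrying the policy above must already exist):

    import boto3

    idp = boto3.client("cognito-idp")
    ci = boto3.client("cognito-identity")

    # Step 2: user pool with admin-only sign-up, plus an app client with a secret.
    pool = idp.create_user_pool(
        PoolName="TestPool",
        AdminCreateUserConfig={"AllowAdminCreateUserOnly": True},
    )["UserPool"]
    app_client = idp.create_user_pool_client(
        UserPoolId=pool["Id"],
        ClientName="TestPoolApp",
        GenerateSecret=True,
    )["UserPoolClient"]

    # Step 3: federated identity pool tied to that user pool, with a developer
    # provider name for developer-authenticated identities.
    region = boto3.session.Session().region_name
    identity_pool = ci.create_identity_pool(
        IdentityPoolName="TestIdentityPool",              # placeholder
        AllowUnauthenticatedIdentities=False,
        DeveloperProviderName="your.developer.provider",  # placeholder
        CognitoIdentityProviders=[{
            "ProviderName": f"cognito-idp.{region}.amazonaws.com/{pool['Id']}",
            "ClientId": app_client["ClientId"],
        }],
    )

    # Attach the auth role carrying the policy above to the identity pool.
    ci.set_identity_pool_roles(
        IdentityPoolId=identity_pool["IdentityPoolId"],
        Roles={"authenticated": "arn:aws:iam::<account-id>:role/<your-auth-role>"},  # placeholder
    )

    # Values to note down for the API in the next step.
    print(identity_pool["IdentityPoolId"], pool["Id"], pool["Arn"],
          app_client["ClientId"], app_client["ClientSecret"])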

Now you can create an API that returns the identity ID and token so the client-side SDK can upload to S3 directly.
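
Once the client has the `IdentityId` and `Token` from that API, the upload itself is a matter of exchanging them for temporary credentials and putting the object under the identity-id prefix. In the browser this would typically be done with the AWS JavaScript SDK; here is a Python sketch of the same flow to show its shape (the bucket name and file arguments are whatever your API and app supply):

    import boto3

    def upload_with_identity(identity_id, token, bucket, local_path, filename):
        # Exchange the Cognito identity id + OpenID token for temporary AWS credentials.
        ci = boto3.client("cognito-identity")
        creds = ci.get_credentials_for_identity(
            IdentityId=identity_id,
            Logins={"cognito-identity.amazonaws.com": token},
        )["Credentials"]

        # The auth role policy only allows keys under "<identity-id>/...",
        # so the object key must start with the identity id.
        s3 = boto3.client(
            "s3",
            aws_access_key_id=creds["AccessKeyId"],
            aws_secret_access_key=creds["SecretKey"],
            aws_session_token=creds["SessionToken"],
        )
        key = f"{identity_id}/{filename}"
        s3.upload_file(local_path, bucket, key)
        return key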

Bash Shell Script to back up RDS/EC2 PostgreSQL DB and upload to S3 weekly

#!/bin/bash
# Run as sudo. Takes a backup of the DB and uploads it to an S3 bucket.
# Replace DBHOME, BUCKETNAME, the PG password and the DB name with your own values.
DBHOME="/home/priyank/crontabs/dbbackups/"
BUCKETNAME="yourAWSbucket"
SCRIPTNAME="$(basename "$BASH_SOURCE")"
SCRIPTFULLPATH="$(pwd)/$(basename "$BASH_SOURCE")"

# Prepare the backup directory (owned by postgres) and install a copy of this script there.
mkdir -p "$DBHOME"
chown -R postgres:postgres "$DBHOME"
cp "$SCRIPTFULLPATH" "$DBHOME"

# One file per weekday (0-6), so dumps rotate over a week.
SCHEMA_BACKUP="$DBHOME/$(date +%w).sql"
sudo -u postgres touch "$SCHEMA_BACKUP"
echo "" > "$SCHEMA_BACKUP"   # truncate last week's dump for this weekday
sudo -u postgres PGPASSWORD="yourPGpassword" pg_dump -h localhost -p 5432 -U postgres -F p -b -v --column-inserts --data-only -f "$SCHEMA_BACKUP" "yourDBname"

# Register this script in cron if it is not there yet (runs daily at 23:00; with the
# weekday-named file this keeps a rolling week of dumps). On the first run also
# install and configure s3cmd.
CRONPATH="$DBHOME$SCRIPTNAME"
chmod +x "$CRONPATH"
FLAGCHK=0
crontab -l | grep -q "$SCRIPTNAME" && FLAGCHK=1 || (crontab -l | { cat; echo "00 23 * * * $CRONPATH"; } | crontab -)
if [ $FLAGCHK -eq 0 ]
then
    apt-get install -y s3cmd
    s3cmd --configure
fi

# Upload the dump to S3.
s3cmd put "$SCHEMA_BACKUP" "s3://$BUCKETNAME/dbbackups/"

Bash Script to back up RDS/EC2 MySQL DB and upload to S3 weekly

You may come across the task of writing a cron job that takes a backup of a DB every day/week/month and uploads it to AWS S3.
Here is a shell script to do that job. Make sure to replace the bucket name and credentials with yours.

#!/bin/bash
# Run as sudo. Takes a backup of the DB and uploads it to an S3 bucket.
# Replace DBHOME, BUCKETNAME and the <...> placeholders with your own values.
DBHOME="/home/ubuntu/priyank/crontabs/dbbackups/"
BUCKETNAME="yourAWSbucket"
SCRIPTNAME="$(basename "$BASH_SOURCE")"
SCRIPTFULLPATH="$(pwd)/$(basename "$BASH_SOURCE")"

# Prepare the backup directory (owned by ubuntu) and install a copy of this script there.
mkdir -p "$DBHOME"
chown -R ubuntu:ubuntu "$DBHOME"
cp "$SCRIPTFULLPATH" "$DBHOME"

# One file per weekday (0-6), so dumps rotate over a week.
SCHEMA_BACKUP="$DBHOME/$(date +%w).gzip"
sudo -u ubuntu touch "$SCHEMA_BACKUP"
echo "" > "$SCHEMA_BACKUP"   # truncate last week's dump for this weekday
sudo -u ubuntu mysqldump -P <yourDBport> -h <yourDBHost> -u <yourDBUser> -p<yourDBpassword> --force --opt --databases <yourDBName> | gzip -c > "$SCHEMA_BACKUP"

# Register this script in cron if it is not there yet (runs daily at 23:00; with the
# weekday-named file this keeps a rolling week of dumps). On the first run also
# install and configure s3cmd.
CRONPATH="$DBHOME$SCRIPTNAME"
chmod +x "$CRONPATH"
FLAGCHK=0
crontab -l | grep -q "$SCRIPTNAME" && FLAGCHK=1 || (crontab -l | { cat; echo "00 23 * * * $CRONPATH"; } | crontab -)
if [ $FLAGCHK -eq 0 ]
then
    apt-get install -y s3cmd
    s3cmd --configure
fi

# Upload the compressed dump to S3.
s3cmd put "$SCHEMA_BACKUP" "s3://$BUCKETNAME/dbbackups/"