s3-credentials

A tool for creating credentials for accessing S3 buckets

For project background, see s3-credentials: a tool for creating credentials for S3 buckets on my blog.

⚠️ Warning

I am not an AWS security expert. You should review how this tool works carefully before using it with your own AWS account.

If you are an AWS security expert I would love to get your feedback!

Installation

Install this tool using pip:

$ pip install s3-credentials

Configuration

This tool uses boto3 under the hood, which supports a number of different ways of providing your AWS credentials.

If you have an existing ~/.aws/config or ~/.aws/credentials file the tool will use that.

You can set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables before calling this tool.

You can also use the --access-key= and --secret-key= options documented below.
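
For example, one way to provide credentials for a single session is to export the environment variables before running a command (the key values here are placeholders, not real keys):

export AWS_ACCESS_KEY_ID=your-access-key-id
export AWS_SECRET_ACCESS_KEY=your-secret-access-key
s3-credentials whoami

The equivalent using the command-line options looks like this:

s3-credentials whoami --access-key your-access-key-id --secret-key your-secret-access-key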

Usage

The s3-credentials create command is the core feature of this tool. Pass it one or more S3 bucket names and it will create a new user with permission to access just those specific buckets, then create access credentials for that user and output them to your console.

Make sure to record the SecretAccessKey because it will only be displayed once and cannot be recreated later on.

In this example I create credentials for reading and writing files in my static.niche-museums.com S3 bucket:

% s3-credentials create static.niche-museums.com

Created user: s3.read-write.static.niche-museums.com with permissions boundary: arn:aws:iam::aws:policy/AmazonS3FullAccess
Attached policy s3.read-write.static.niche-museums.com to user s3.read-write.static.niche-museums.com
Created access key for user: s3.read-write.static.niche-museums.com
{
    "UserName": "s3.read-write.static.niche-museums.com",
    "AccessKeyId": "AKIAWXFXAIOZOYLZAEW5",
    "Status": "Active",
    "SecretAccessKey": "...",
    "CreateDate": "2021-11-03 01:38:24+00:00"
}

The command has several additional options:

  • --username TEXT: The username to use for the user that is created by the command (or the username of an existing user if you do not want to create a new one). If omitted, a default such as s3.read-write.static.niche-museums.com will be used.
  • -c, --create-bucket: Create the buckets if they do not exist. Without this, any missing buckets will be treated as an error.
  • --read-only: The user should only be allowed to read files from the bucket.
  • --write-only: The user should only be allowed to write files to the bucket, but not read them. This is useful for logging use-cases.
  • --policy filepath-or-string: A custom policy document (as a file path, literal JSON string or - for standard input) - see below
  • --bucket-region: If creating buckets, the region in which they should be created.
  • --silent: Don't output details of what is happening, just output the JSON for the created access credentials at the end.
  • --user-permissions-boundary: Custom permissions boundary to use for users created by this tool. This will default to restricting those users to only interacting with S3, taking the --read-only option into account. Use none to create users without any permissions boundary at all.
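
For example, the following combines several of these options to create read-only credentials for a bucket that does not exist yet, creating it in a specific region (the bucket name here is just a placeholder):

% s3-credentials create my-new-bucket \
    --create-bucket \
    --bucket-region us-east-1 \
    --read-only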

Here's the full sequence of events that take place when you run this command:

  1. Confirm that each of the specified buckets exists. If they do not and --create-bucket was passed create them - otherwise exit with an error.
  2. If a username was not specified, determine a username using the s3.$permission.$buckets format.
  3. If a user with that username does not exist, create one with an S3 permissions boundary that respects the --read-only option - unless --user-permissions-boundary=none was passed (or a custom permissions boundary string).
  4. For each specified bucket, add an inline IAM policy to the user that gives them permission to either read-only, write-only or read-write against that bucket.
  5. Create a new access key for that user and output the key and its secret to the console.
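
Once the key has been created, one quick way to check it works is with the AWS CLI (if you have it installed), plugging in the AccessKeyId and SecretAccessKey from the JSON output - the values here are placeholders:

AWS_ACCESS_KEY_ID=AKIA... \
AWS_SECRET_ACCESS_KEY=... \
aws s3 ls s3://static.niche-museums.com/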

Using a custom policy

The policy documents applied by this tool can be seen in policies.py. If you want to use a custom policy document you can do so using the --policy option.

First, create your policy document as a JSON file that looks something like this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject*", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::$!BUCKET_NAME!$",
        "arn:aws:s3:::$!BUCKET_NAME!$/*"
      ]
    }
  ]
}

Note the $!BUCKET_NAME!$ strings - these will be replaced with the name of the relevant S3 bucket before the policy is applied.

Save that as custom-policy.json and apply it using the following command:

% s3-credentials create my-s3-bucket \
    --policy custom-policy.json
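
If you want to preview what the expanded policy will look like for a particular bucket, plain string substitution of the placeholder is enough - for example with sed (this is just an illustration of the expansion, not how the tool applies the policy internally):

sed 's/\$!BUCKET_NAME!\$/my-s3-bucket/g' custom-policy.json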

You can also pass - to read from standard input, or you can pass the literal JSON string directly to the --policy option:

% s3-credentials create my-s3-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject*", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::$!BUCKET_NAME!$",
        "arn:aws:s3:::$!BUCKET_NAME!$/*"
      ]
    }
  ]
}'

Other commands

whoami

To see which user you are authenticated as:

s3-credentials whoami

This will output JSON representing the currently authenticated user.
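
For a cross-check outside this tool, the AWS CLI command aws sts get-caller-identity reports the account ID and ARN for whatever credentials are currently active (its output format differs from this tool's):

aws sts get-caller-identity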

list-users

To see a list of all users that exist for your AWS account:

s3-credentials list-users

This will return pretty-printed JSON objects by default.

Add --nl to collapse these to single lines as valid newline-delimited JSON.

Add --array to output a valid JSON array of objects instead.
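
The newline-delimited form is handy for piping into other tools. For example, with jq installed you can pull out just the usernames (assuming each user object includes a UserName key, as in the create output earlier):

s3-credentials list-users --nl | jq -r .UserName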

list-buckets

Shows a list of all buckets in your AWS account.

s3-credentials list-buckets

Accepts the same --nl and --array options as list-users.

list-user-policies

To see a list of inline policies belonging to users:

% s3-credentials list-user-policies s3.read-write.static.niche-museums.com

User: s3.read-write.static.niche-museums.com
PolicyName: s3.read-write.static.niche-museums.com
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::static.niche-museums.com"
      ]
    },
    {
      "Effect": "Allow",
      "Action": "s3:*Object",
      "Resource": [
        "arn:aws:s3:::static.niche-museums.com/*"
      ]
    }
  ]
}

You can pass any number of usernames here. If you don't specify a username the tool will loop through every user belonging to your account:

s3-credentials list-user-policies

delete-user

In trying out this tool it's possible you will create several different user accounts that you later decide to clean up.

Deleting AWS users is a little fiddly: you first need to delete their access keys, then their inline policies and finally the user themselves.
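
Done by hand with the AWS CLI, that clean-up sequence looks roughly like this (the username and access key ID are placeholders):

aws iam delete-access-key --user-name s3.read-write.my-bucket --access-key-id AKIA...
aws iam delete-user-policy --user-name s3.read-write.my-bucket --policy-name s3.read-write.my-bucket
aws iam delete-user --user-name s3.read-write.my-bucket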

The s3-credentials delete-user command handles this for you:

% s3-credentials delete-user s3.read-write.simonw-test-bucket-10
User: s3.read-write.simonw-test-bucket-10
  Deleted policy: s3.read-write.simonw-test-bucket-10
  Deleted access key: AKIAWXFXAIOZK3GPEIWR
  Deleted user

You can pass it multiple usernames to delete multiple users at a time.

Common options

All of the s3-credentials commands also accept the following options for authenticating against AWS:

  • --access-key: AWS access key ID
  • --secret-key: AWS secret access key
  • --session-token: AWS session token
  • --endpoint-url: Custom endpoint URL

Development

To contribute to this tool, first check out the code. Then create a new virtual environment:

cd s3-credentials
python -m venv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
