Comments (22)

simonw commented on July 19, 2024

Should the command allow a single credential to be created for multiple buckets? That feels like a useful ability.

simonw commented on July 19, 2024

So the command is:

s3-credentials create my-bucket1 [...my-bucket2]

It defaults to creating a brand new user with the ability to read and write content to the specified bucket (or buckets).

Options will include:

  • --read-only - create the user such that they can only read
  • --write-only - create the user such that they can only write (useful for logging style use-cases)
  • --username - specify the username to use - without this a default username of s3:read-write:my-bucket will be used

simonw commented on July 19, 2024

Should it create the bucket if one does not exist? Or should it only do that if a --create-bucket option is passed?

I'm going to require --create-bucket - shortcut -c.

simonw commented on July 19, 2024

Also --bucket-region= for creating the bucket in a specific region.

simonw commented on July 19, 2024

And --user-permissions-boundary X for setting a different user permissions boundary, see #1 (comment) - if not specified the permissions boundary will default to one that only allows read or read-write access to S3.

Use --user-permissions-boundary none to disable that default and not set one at all.
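
A minimal sketch of how that default could work with boto3's iam.create_user; the helper name and the DEFAULT_BOUNDARIES mapping are illustrative assumptions, and the managed-policy ARNs are the ones that show up in the command output later in this thread:

import boto3

iam = boto3.client("iam")

# Assumed mapping from permission level to a default S3-limited boundary ARN
DEFAULT_BOUNDARIES = {
    "read-write": "arn:aws:iam::aws:policy/AmazonS3FullAccess",
    "read-only": "arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
}

def create_user_with_boundary(username, permission="read-write", boundary=None):
    kwargs = {"UserName": username}
    if boundary != "none":
        # Fall back to the S3-limited default unless --user-permissions-boundary none was passed
        kwargs["PermissionsBoundary"] = boundary or DEFAULT_BOUNDARIES[permission]
    return iam.create_user(**kwargs)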

simonw commented on July 19, 2024

Open question: should this support adding new bucket permissions to existing users?

I'd like to do that, but I'm not sure how to yet.

simonw commented on July 19, 2024

I'm going to use inline policies attached directly to the users for this tool.

https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_iam-quotas.html says:

You can add as many inline policies as you want to an IAM user, role, or group. But the total aggregate policy size (the sum size of all inline policies) per entity cannot exceed the following quotas:

  • User policy size cannot exceed 2,048 characters.
  • Role policy size cannot exceed 10,240 characters.
  • Group policy size cannot exceed 5,120 characters.

Looks like put_user_policy() is the boto3 method for this: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/iam.html#IAM.Client.put_user_policy

import boto3

client = boto3.client("iam")

# Signature from the boto3 docs:
response = client.put_user_policy(
    UserName='string',
    PolicyName='string',
    PolicyDocument='string'
)

# Example provided later in the docs:
response = client.put_user_policy(
    PolicyDocument='{"Version":"2012-10-17","Statement":{"Effect":"Allow","Action":"*","Resource":"*"}}',
    PolicyName='AllAccessPolicy',
    UserName='Bob',
)

The PolicyName is required. It's not clear to me what the uniqueness constraints around this are - my hunch is that it only has to be unique per-user, but I'll find that out for sure once I start testing.

simonw commented on July 19, 2024

In progress --help:

% s3-credentials create --help
Usage: s3-credentials create [OPTIONS] BUCKETS...

  Create and return new AWS credentials for specified S3 buckets

Options:
  --username TEXT                 Username to create or existing user to use
  -c, --create-bucket TEXT        Create buckets if they do not already exist
  --read-only                     Only allow reading from the bucket
  --write-only                    Only allow writing to the bucket
  --bucket-region TEXT            Region in which to create buckets
  --user-permissions-boundary TEXT
                                  Custom permissions boundary to use for
                                  created users, or 'none' to create without.
                                  Defaults to limiting to S3 based on --read-
                                  only and --write-only options.
  --help                          Show this message and exit.

simonw commented on July 19, 2024

https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.head_bucket

head_bucket(**kwargs)

This action is useful to determine if a bucket exists and you have permission to access it. The action returns a 200 OK if the bucket exists and you have permission to access it.

simonw commented on July 19, 2024

import boto3
import botocore

s3 = boto3.resource("s3")

try:
    s3.meta.client.head_bucket(Bucket="static.niche-museums.com2")
    print("Exists")
except botocore.exceptions.ClientError:
    print("Does not exist / not accessible")

simonw commented on July 19, 2024

If the user specifies more than one bucket then the default username of s3:read-write:my-bucket doesn't make sense any more - I'll use a comma separated list of buckets as the suffix instead.
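
A rough sketch of that default-username logic as described so far; note a later comment in this thread switches the ':' separator to '.' once IAM rejects it:

def default_username(buckets, permission="read-write"):
    # e.g. "s3:read-write:bucket-one,bucket-two" - the ":" characters are dropped later
    return "s3:{}:{}".format(permission, ",".join(buckets))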

simonw commented on July 19, 2024

I'm skipping --user-tag for the moment - it would be used like so:

response = iam.create_user(
    UserName=username,
    PermissionsBoundary='string',
    Tags=[
        {
            'Key': 'string',
            'Value': 'string'
        },
    ]
)

simonw commented on July 19, 2024

The specified value for userName is invalid. It must contain only alphanumeric characters and/or the following: +=,.@_-

So no : characters in usernames, then.

simonw commented on July 19, 2024

It's nearly working:

% s3-credentials create simonw-test-bucket-7
Error: Bucket does not exist: simonw-test-bucket-7 - try --create-bucket to create it
% s3-credentials create simonw-test-bucket-7 -c
['Created bucket: simonw-test-bucket-7', 'Created user: s3.read-write.simonw-test-bucket-7 with permissions boundary: arn:aws:iam::aws:policy/AmazonS3FullAccess']
% s3-credentials create simonw-test-bucket-7 -c --read-only
['Created user: s3.read-only.simonw-test-bucket-7 with permissions boundary: arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess']

simonw commented on July 19, 2024

I'm going to add one inline policy to the user per bucket they are allowed to access.

I'll generate policy names that can be used to de-dupe these inline policies later, similar to the usernames:

s3.read-only.simonw-test-bucket-7
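
A sketch of that naming scheme as a helper (an assumed helper, mirroring the username format); since each bucket gets exactly one inline policy, the name stays unique within a single user:

def policy_name_for(bucket, permission="read-write"):
    # e.g. "s3.read-only.simonw-test-bucket-7"
    return "s3.{}.{}".format(permission, bucket)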

simonw commented on July 19, 2024

Need to figure out what the JSON policy documents should look like.

The examples on https://docs.aws.amazon.com/AmazonS3/latest/userguide/example-bucket-policies.html are far more complicated than I want. I just want "read-only" or "write-only" or "read-write" for a specific S3 bucket.

Referring back to dogsheep/dogsheep-photos#4 (comment)

Read-only:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject*",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::dogsheep-photos-simon/*"
            ]
        }
    ]
}

Or these examples look more relevant to me: https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html

simonw commented on July 19, 2024

https://stackoverflow.com/questions/15076645/amazon-s3-write-only-access/50839107 suggests this for write-only:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKET_NAME/*"
            ]
        }
    ]
}

simonw commented on July 19, 2024

https://blog.antoine-augusti.fr/2018/08/aws-s3-read-only-policy-for-bucket/ suggests the following for read-only:

{
  "Version":"2012-10-17",
  "Statement":[
    {
      "Effect":"Allow",
      "Action":[
        "s3:ListBucket",
        "s3:ListAllMyBuckets"
      ],
      "Resource":"arn:aws:s3:::*"
    },
    {
      "Effect":"Deny",
      "Action":[
        "s3:ListBucket"
      ],
      "NotResource":[
        "arn:aws:s3:::bucketname",
        "arn:aws:s3:::bucketname/*"
      ]
    },
    {
      "Effect":"Allow",
      "Action":[
        "s3:ListBucket",
        "s3:GetObject"
      ],
      "Resource":[
        "arn:aws:s3:::bucketname",
        "arn:aws:s3:::bucketname/*"
      ]
    }
  ]
}

Not sure why this explicitly allows ListAllMyBuckets against anything but then denies ListBucket against anything that isn't the one specific bucket though.

simonw commented on July 19, 2024

Similar example here: https://coderwall.com/p/jrjwza/s3-group-policy-for-read-only-access-to-only-one-bucket

Simple AWS IAM Group policy to limit a client to read-only access to a single bucket. They'll be able to see the names of all other buckets on your account, but won't be able to get into them. They will be able to see all folders and files in the bucket you specify.

Replace "bucketname" below:

{
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:ListAllMyBuckets" ],
      "Resource": "arn:aws:s3:::*"
    },
   {
      "Effect": "Deny",
      "Action": ["s3:ListBucket"],
      "NotResource":["arn:aws:s3:::bucketname", "arn:aws:s3:::bucketname/*"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket","s3:GetObject"],
      "Resource": ["arn:aws:s3:::bucketname", "arn:aws:s3:::bucketname/*"],
      "Condition": {}
    }
  ]
}

simonw commented on July 19, 2024

Going with these policies for the moment:

def read_write(bucket):
    # https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListObjectsInBucket",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": ["arn:aws:s3:::{}".format(bucket)],
            },
            {
                "Sid": "AllObjectActions",
                "Effect": "Allow",
                "Action": "s3:*Object",
                "Resource": ["arn:aws:s3:::{}/*".format(bucket)],
            },
        ],
    }


def read_only(bucket):
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject*", "s3:ListBucket"],
                "Resource": [
                    "arn:aws:s3:::{}".format(bucket),
                    "arn:aws:s3:::{}/*".format(bucket),
                ],
            }
        ],
    }


def write_only(bucket):
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:PutObject"],
                "Resource": ["arn:aws:s3:::{}/*".format(bucket)],
            }
        ],
    }
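
Attaching one of those documents per bucket with put_user_policy could then look something like this; a sketch only, assuming the read_write / read_only / write_only functions above and the dotted policy-name format:

import json

import boto3

iam = boto3.client("iam")

def attach_bucket_policies(username, buckets, permission="read-write"):
    policy_for = {
        "read-write": read_write,
        "read-only": read_only,
        "write-only": write_only,
    }[permission]
    for bucket in buckets:
        iam.put_user_policy(
            UserName=username,
            # e.g. "s3.read-write.my-bucket" - one inline policy per bucket
            PolicyName="s3.{}.{}".format(permission, bucket),
            # PolicyDocument must be a JSON string, not a dict
            PolicyDocument=json.dumps(policy_for(bucket)),
        )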

simonw commented on July 19, 2024

Last step: create the access key: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/iam.html#IAM.Client.create_access_key
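
A minimal sketch of that final step, assuming boto3; the secret is only available in this one response, so it has to be captured here:

import boto3

iam = boto3.client("iam")

def create_credentials(username):
    response = iam.create_access_key(UserName=username)
    access_key = response["AccessKey"]
    # SecretAccessKey cannot be retrieved again after this call
    return {
        "AccessKeyId": access_key["AccessKeyId"],
        "SecretAccessKey": access_key["SecretAccessKey"],
    }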

simonw commented on July 19, 2024

I'm going to turn my work-in-progress into a PR.
