
amazon-s3-developer-guide's Introduction

This guide has been archived

This guide has been archived. Please see https://github.com/awsdocs/amazon-s3-userguide which combines information from the three retired Amazon S3 guides: Amazon S3 Developer Guide, Console User Guide, and Getting Started Guide.

Amazon S3 Developer Guide

The open source version of the Amazon S3 Developer Guide. You can submit feedback and requests for changes by submitting issues in this repo or by making proposed changes and submitting a pull request.

License Summary

The documentation is made available under the Creative Commons Attribution-ShareAlike 4.0 International License. See the LICENSE file.

The sample code within this documentation is made available under a modified MIT license. See the LICENSE-SAMPLECODE file.

amazon-s3-developer-guide's People

Contributors

aws-alan, bcagarwal, blange, dbkingsb, dulac, ej-acebedo, elyrixia, enumjorge, fletpatr, hyandell, ilikhan, joeholl, joshbean, jschwarzwalder, jzonthemtn, kemitix, kkode777, lincolahanbeck, littlepunks, lukaszjankowski, mkyieaws, mohanva, panjws, rabilahiri, randyocheltree, rudpot, sasake615, sftim, tomcart, unacceptable


amazon-s3-developer-guide's Issues

Build script to PDF/HTML plus Kindle format?

Hi all,

Thank you to the AWS team for providing this; I really appreciate your work on it. I didn't know where else to post this issue, but S3 is my favorite service, so I'm posting it here.

I am wondering whether you could also provide the build script used to output the docs in PDF/HTML format and, if possible, Kindle format. I am having trouble downloading the Kindle version of the docs from the Amazon Kindle store due to country restrictions.

Regards,

Alternative supported transitions waterfall diagram

There's a diagram of the supported lifecycle transitions on this page: https://docs.aws.amazon.com/AmazonS3/latest/dev/lifecycle-transition-general-considerations.html

[Diagram: SupportedTransitionsWaterfallModel]

I've never found this diagram especially easy to follow – apparently objects can move up, down, and sideways?

(Additionally, the white text with light background colours has too low a contrast ratio for AA or AAA accessibility guidelines; DEEP_ARCHIVE excluded of course.)

I recently redrew the diagram for my own purposes:

[Diagram: s3_transitions_reborder]

Objects can only move downwards, following the order of the tiers.

Would you be interested in using this diagram? (Or the general idea of it, I'm not picky.)
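For what it's worth, the one-way rule in my redrawn diagram can be expressed as a simple check. This is an illustrative sketch only; the tier list below is my reading of the waterfall order on that page, not an authoritative list of storage classes:

```python
# Illustrative sketch: the storage classes below follow the "waterfall"
# order from the lifecycle transitions page (an approximation, not an
# authoritative list).
TIER_ORDER = [
    "STANDARD",
    "STANDARD_IA",
    "INTELLIGENT_TIERING",
    "ONEZONE_IA",
    "GLACIER",
    "DEEP_ARCHIVE",
]

def transition_allowed(src: str, dst: str) -> bool:
    """Objects can only move down the waterfall, never up or sideways."""
    return TIER_ORDER.index(dst) > TIER_ORDER.index(src)
```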

Error attempting to use AmazonS3EncryptionV2 and Authenticated Encryption

Regardless of which version of Bouncy Castle I pull in, I can't get past this runtime error. Java 1.8 source level and JDK 14, using the maven-assembly-plugin to assemble a single JAR.

Thanks in advance for any help.

Exception in thread "main" java.lang.UnsupportedOperationException: A more recent version of Bouncy castle is required for authenticated encryption.
	at com.amazonaws.services.s3.model.CryptoConfigurationV2.checkBountyCastle(CryptoConfigurationV2.java:379)
	at com.amazonaws.services.s3.model.CryptoConfigurationV2.checkCryptoMode(CryptoConfigurationV2.java:366)
	at com.amazonaws.services.s3.model.CryptoConfigurationV2.<init>(CryptoConfigurationV2.java:68)
	at com.amazonaws.services.s3.model.CryptoConfigurationV2.<init>(CryptoConfigurationV2.java:47)
	at securities.CryptoUtil.Encrypt(CryptoUtil.java:41)
	at securities.App.main(App.java:19)
        KeyPairGenerator keyPairGenerator = KeyPairGenerator.getInstance("RSA");
        keyPairGenerator.initialize(2048);

        KeyPair keyPair = keyPairGenerator.generateKeyPair();

        String s3ObjectKey = "test.txt";
        String s3ObjectContent = "This should be encrypt";

        AmazonS3EncryptionV2 s3EncryptionClientV2 = AmazonS3EncryptionClientV2Builder.standard()
                .withRegion(Regions.DEFAULT_REGION)
                .withClientConfiguration(new ClientConfiguration())
                .withCryptoConfiguration(new CryptoConfigurationV2().withCryptoMode(CryptoMode.AuthenticatedEncryption))
                .withEncryptionMaterialsProvider(new StaticEncryptionMaterialsProvider(new EncryptionMaterials(keyPair)))
                .build();

        s3EncryptionClientV2.putObject(bucketName, s3ObjectKey, s3ObjectContent);
        s3EncryptionClientV2.shutdown();

        return s3EncryptionClientV2.getObjectAsString(bucketName, s3ObjectKey);
   <dependencies>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-core</artifactId>
            <version>1.11.877</version>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-kms</artifactId>
            <version>1.11.877</version>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-s3</artifactId>
            <version>1.11.877</version>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>org.bouncycastle</groupId>
            <artifactId>bcprov-ext-jdk15on</artifactId>
            <version>1.66</version>
        </dependency>
    </dependencies>

Static web site, adding bucket policy, and public access settings

On the HostingWebsiteOnS3Setup page there is the section "Step 2: Adding a Bucket Policy That Makes Your Bucket Content Publicly Available". This section explains how to add a Bucket Policy that grants public access.

The instructions do not work for a bucket created with the defaults. You must first UNcheck "Block new public bucket policies" under "Public access settings"; only then can you add the bucket policy.

This point needs to be clarified.
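To illustrate, this is roughly the policy that Step 2 asks you to attach (expressed as a Python dict; the bucket name is a placeholder). With the default public access settings, the call that attaches it is rejected until "Block new public bucket policies" is unchecked:

```python
import json

# Sketch of the public-read policy from the walkthrough;
# "example-bucket" is a placeholder bucket name.
public_read_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
}

# With the default "Public access settings", attaching this policy fails
# with AccessDenied; the block-public-policy setting must be disabled first.
policy_json = json.dumps(public_read_policy)
```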

Missing code in Doc - Upload an Object Using the AWS SDK for .NET

try
{
    // 1. Put object-specify only key name for the new object.
    var putRequest1 = new PutObjectRequest
    {
        BucketName = bucketName,
        Key = keyName1,
        ContentBody = "sample text"
    };

    PutObjectResponse response1 = await client.PutObjectAsync(putRequest1);

    // 2. Put the object-set ContentType and add metadata.
    var putRequest2 = new PutObjectRequest
    {
        BucketName = bucketName,
        Key = keyName2,
        FilePath = filePath,
        ContentType = "text/plain"
    };
    putRequest2.Metadata.Add("x-amz-meta-title", "someTitle");
}

The second put request, putRequest2, is never sent: the example never calls client.PutObjectAsync(putRequest2).

Restricting Access to Specific IP Addresses

The following statements are incorrect:

"However, the request must not originate from the range of IP addresses specified in the condition."

"The condition in this statement identifies the 54.240.143.* range of disallowed Internet Protocol version 4 (IPv4) IP addresses."

The bucket policy evaluates to denying all actions from all users except those that originate from the range of IP addresses specified in the condition.

The condition in this statement identifies the 54.240.143.* range of ALLOWED Internet Protocol version 4 (IPv4) IP addresses.

Impossible guidance given for granting access to root user

On the page https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html, under the section "Restricting Access to a Specific HTTP Referer", the second policy mentioned has a caution notice stating "Consider adding a third Sid that grants the root user s3:* actions.".  This is also located at https://github.com/awsdocs/amazon-s3-developer-guide/blob/master/doc_source/example-bucket-policies.md

This guidance is incorrect. First, I assume they mean Statement, not Sid, since you can't add additional Sids. More importantly, no statement could be added that would grant access to the root user, because the deny in the second statement would take precedence. You would instead need to replace "Principal": "*" with "NotPrincipal": { "AWS": "arn:aws:iam::123456789012:root" }. Further, you cannot reference the root user specifically without also exempting all IAM principals in the account from the deny.
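A sketch of how the corrected second statement would have to look, using NotPrincipal as described (expressed as a Python dict; the account ID is the documentation's placeholder, and the Referer value mirrors the example policy):

```python
# Sketch: NotPrincipal replaces Principal so the deny does not apply to
# the root user. 123456789012 and examplebucket are placeholders.
deny_statement = {
    "Sid": "DenyWithoutRefererExceptRoot",
    "Effect": "Deny",
    "NotPrincipal": {"AWS": "arn:aws:iam::123456789012:root"},
    "Action": "s3:*",
    "Resource": "arn:aws:s3:::examplebucket/*",
    "Condition": {
        "StringNotLike": {"aws:Referer": "http://www.example.com/*"}
    },
}
```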

There's almost no documentation for new object lock permissions

On this page I can see an s3:PutObjectRetention action, as well as an s3:object-lock-remaining-retention-days condition key.

However, the ARC page for S3 doesn't mention either of those, nor any of the associated IAM actions or condition keys.

Neither does the S3 actions mapping page, nor the condition keys page.

Given the importance of these object locks for compliance, it would be good to have these permissions spelled out in detail.

s3 not an option in policysim

While searching for ways to debug why requests to an S3 bucket were failing for a particular IAM user, I found a very neat tool at https://policysim.aws.amazon.com/, documented at https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_testing-policies.html?icmpid=docs_iam_console.

Unfortunately, when I tried to select "S3" as a service to test permissions against (as done in the video, even!), the options stop after the Route 53 services.

After scrolling to the very bottom of the service options, this is where the list ends.

[Screenshot: the service list ends after Route 53]

Does it support setting object lifetime in hours in object lifecycle rules?

Hi,
Regarding the object lifecycle rules, is it possible to set the lifetime of an object in hours? For example, can an Expiration action remove objects from a bucket every hour, and would it take effect?


Rule: "Expiration Rule" (prefix: tax/, status: Enabled, expiration: 1 day)

I read the page https://github.com/awsdocs/amazon-s3-developer-guide/blob/master/doc_source/intro-lifecycle-rules.md, and it mentions the two kinds of rules below. Do they only support lifetimes that resolve to midnight? It would be great to get your response, thanks.
1. Lifecycle rules: Based on an object's age
2. Lifecycle rules: Based on a specific date
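For reference, here is the example rule above expressed as a boto3-style dict (a sketch; field names follow the PutBucketLifecycleConfiguration API). Expiration only accepts whole Days or a Date, which is why hour-level lifetimes are not expressible:

```python
# Sketch of the expiration rule from the question; "Days" is the smallest
# granularity the lifecycle API offers (there is no hours field).
rule = {
    "ID": "Expiration Rule",
    "Filter": {"Prefix": "tax/"},
    "Status": "Enabled",
    "Expiration": {"Days": 1},
}
```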

Typo in link to CloudTrail Data Event

The doc says

How Do I Enable Object-Level Logging for an S3 Bucket with CloudWatch Data Events?

However it's linked to CloudTrail Data Event page. Should change the link text to

How Do I Enable Object-Level Logging for an S3 Bucket with CloudTrail Data Events?

Typo

There is a typo in the first line of "Troubleshooting Amazon S3 by Symptom" section:
The line goes as...
The following topics lists symptoms to help ...
instead of "The following topics list symptoms to help..."

Access to a deleted object in a versioned bucket in S3 returns 403 rather than 404

The documentation at https://docs.aws.amazon.com/AmazonS3/latest/dev/ObjectVersioning.html is incompatible with S3 GET behaviour for a deleted object (one with a delete marker) in a versioning-enabled bucket: S3 returns 403 Forbidden rather than 404.

The following screenshot shows that access to the current version (which is marked as deleted) returns 403 (not 404), while the previous version is accessible as expected.

[Screenshot: GET on the current (delete-marker) version returns 403; the previous version is accessible]

S3 CRR Replication Policy Invalid for new schema

In this doc: https://github.com/awsdocs/amazon-s3-developer-guide/blob/master/doc_source/crr-walkthrough-4.md

For replication.json, the schema is currently invalid as-is:

  • Priority should be an int, but it is a string as of now.
  • Setting (overlapping) priorities is supported in the new schema.
    -- This also means the DeleteMarkerReplication element will become invalid.

The document should either be updated for the new schema (and encourage users to upgrade to it) or call out that it is using the older schema.
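A sketch of what a V2-schema replication configuration could look like, expressed as a boto3-style dict (role ARN and bucket names are placeholders): Priority is an integer, and each filtered rule carries a DeleteMarkerReplication element:

```python
import json

# Hypothetical V2-schema replication configuration; ARNs are placeholders.
replication = {
    "Role": "arn:aws:iam::123456789012:role/replication-role",
    "Rules": [{
        "Status": "Enabled",
        "Priority": 1,  # integer in the new schema, not the string "1"
        "DeleteMarkerReplication": {"Status": "Disabled"},
        "Filter": {"Prefix": "Tax"},
        "Destination": {"Bucket": "arn:aws:s3:::destination-bucket"},
    }],
}
print(json.dumps(replication, indent=2))
```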

error CS0234: The type or namespace name 'Transfer' does not exist in the namespace 'Amazon.S3' (are you missing an assembly reference?)

Hi all,

I am working in Unity 2018.4.7. In my project I have to upload images from Unity, so I used this AWS code:

https://docs.aws.amazon.com/AmazonS3/latest/dev/HLuploadFileDotNet.html

using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;
using System;
using System.IO;
using System.Threading.Tasks;

namespace Amazon.DocSamples.S3
{
class UploadFileMPUHighLevelAPITest
{
private const string bucketName = "*** provide bucket name ***";
private const string keyName = "*** provide a name for the uploaded object ***";
private const string filePath = "*** provide the full path name of the file to upload ***";
// Specify your bucket region (an example region is shown).
private static readonly RegionEndpoint bucketRegion = RegionEndpoint.USWest2;
private static IAmazonS3 s3Client;

    public static void Main()
    {
        s3Client = new AmazonS3Client(bucketRegion);
        UploadFileAsync().Wait();
    }

    private static async Task UploadFileAsync()
    {
        try
        {
            var fileTransferUtility =
                new TransferUtility(s3Client);

            // Option 1. Upload a file. The file name is used as the object key name.
            await fileTransferUtility.UploadAsync(filePath, bucketName);
            Console.WriteLine("Upload 1 completed");

            // Option 2. Specify object key name explicitly.
            await fileTransferUtility.UploadAsync(filePath, bucketName, keyName);
            Console.WriteLine("Upload 2 completed");

            // Option 3. Upload data from a type of System.IO.Stream.
            using (var fileToUpload = 
                new FileStream(filePath, FileMode.Open, FileAccess.Read))
            {
                await fileTransferUtility.UploadAsync(fileToUpload,
                                           bucketName, keyName);
            }
            Console.WriteLine("Upload 3 completed");

            // Option 4. Specify advanced settings.
            var fileTransferUtilityRequest = new TransferUtilityUploadRequest
            {
                BucketName = bucketName,
                FilePath = filePath,
                StorageClass = S3StorageClass.StandardInfrequentAccess,
                PartSize = 6291456, // 6 MB.
                Key = keyName,
                CannedACL = S3CannedACL.PublicRead
            };
            fileTransferUtilityRequest.Metadata.Add("param1", "Value1");
            fileTransferUtilityRequest.Metadata.Add("param2", "Value2");

            await fileTransferUtility.UploadAsync(fileTransferUtilityRequest);
            Console.WriteLine("Upload 4 completed");
        }
        catch (AmazonS3Exception e)
        {
            Console.WriteLine("Error encountered on server. Message:'{0}' when writing an object", e.Message);
        }
        catch (Exception e)
        {
            Console.WriteLine("Unknown encountered on server. Message:'{0}' when writing an object", e.Message);
        }

    }
}

}

My project's Scripting Runtime Version is set to 4.x. When I run the above code in Unity, it says:

error CS0234: The type or namespace name 'Transfer' does not exist in the namespace 'Amazon.S3' (are you missing an assembly reference?)

I have installed AWSSDK.S3 3.3.113.2 and AWSSDK.SimpleEmail 3.3.101.193, but it still shows the above error.

I downgraded to the 3.5 framework, but then it says: error CS1644: Feature `asynchronous functions' cannot be used because it is not part of the C# 4.0 language specification.

I have also followed the AWS install guide at https://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/net-dg-install-assemblies.html, but that does not work either.

How can I solve this error?

Wildcards DO span resource ARN segments in policies

The documentation on "Specifying Resources in a Policy" states (also here):

You can use wildcards as part of the resource ARN. You can use wildcard characters (* and ?) within any ARN segment (the parts separated by colons). An asterisk (*) represents any combination of zero or more characters, and a question mark (?) represents any single character. You can use multiple * or ? characters in each segment, but a wildcard cannot span segments.

I checked in the policy simulator and the part on wildcards not spanning segments seems to be wrong.

Policy:

{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::my-bucket/foo/*/bar"
  }]
}

With the above policy, the user is granted access to arn:aws:s3:::my-bucket/foo/1/2/bar. So the wildcard DOES span ARN segments.

I believe this could lead to serious security issues in organizations relying on the documented behavior. An example of such a misconfiguration would be arn:aws:s3:::my-bucket-*/public/*. In this case, the policy also matches arn:aws:s3:::my-bucket-prod/private/public/*, which is unexpected.
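This is easy to reproduce with plain glob matching, which behaves the same way. To be clear, fnmatchcase here is only an illustration of "*" matching any characters, including "/"; it is not the official IAM matcher:

```python
from fnmatch import fnmatchcase

# "*" in a glob matches any run of characters, "/" included, so the
# pattern crosses what look like path segments.
pattern = "my-bucket/foo/*/bar"

assert fnmatchcase("my-bucket/foo/1/bar", pattern)    # one "segment"
assert fnmatchcase("my-bucket/foo/1/2/bar", pattern)  # "*" spans the "/"
```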

S3 permission for accessing objects

I'm uploading a video to S3 from Java, and I need the permissions shown in the image below. I have tried a few options:

request.setMetadata(metadata); // request is a PutObjectRequest

// Option 1:
request.setCannedAcl(CannedAccessControlList.PublicRead);

// Option 2:
AccessControlList acl = new AccessControlList();
acl.grantPermission(GroupGrantee.AllUsers, Permission.FullControl);
request.setAccessControlList(acl);

However, I'm not getting the expected result shown in the image. Let me know if there is a solution for this.

[Screenshot: aws_permission_error]

Incorrect policy in object lock example

This page says the policy sets a minimum retention period, when in fact it sets a maximum retention period. As I mentioned in #53, there is also almost no documentation on which permissions apply to object locks.
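To make the min/max distinction concrete, here is an illustrative sketch of the statement shape involved (bucket name and the 10-day figure are placeholders): denying s3:PutObjectRetention when the remaining retention is greater than N days caps retention at N, which is a maximum, not a minimum:

```python
# Illustrative statement: denying retention periods GREATER than 10 days
# enforces a MAXIMUM retention of 10 days. A minimum would instead deny
# periods LESS than the threshold (NumericLessThan).
max_retention_statement = {
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:PutObjectRetention",
    "Resource": "arn:aws:s3:::example-bucket/*",
    "Condition": {
        "NumericGreaterThan": {
            "s3:object-lock-remaining-retention-days": "10"
        }
    },
}
```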
