
react-s3-uploader's People

Contributors

absoludity, alex-tan, alexprice1, andrewgrewell, ani-u, bkneis, cellis, clarkie, coerick, dennisv, dobesv, dpretty, dylangriffith, hardfire, jakemmarsh, jakerichan, jakubrohleder, joelhooks, juusaw, mrjones2014, njj, rogerso, seanadkinson, sfrdmn, shalstvedt, stanboyet, techwes, tobymurray, tomitrescak, x2es


react-s3-uploader's Issues

Remove peer dependency on express

I'm using hapi instead of express, yet I now have both installed just to use this lib. With npm v2, peer dependencies are installed automatically.

I'm not sure if there is anything I can do to prevent having both express and hapi installed, except upgrade to npm v3.

Any thoughts?

Uploaded file URL

Hi! I have just started using this plugin with Ruby on Rails backend. Everything seems to work fine for me, but I'm stuck on one thing.

I'm trying to use the plugin for the following scenario:

  • Client requests signed url from Server (works)
  • Client uploads file to S3 (works)
  • Client sends form to Server with updated user.avatar_url

How to get the public url of the recently uploaded file?
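One way to do this, sketched here under the assumption that your signing response exposes the signed PUT URL as signResult.signedUrl (as in the README's examples), is to strip the query string in onFinish, since the signed URL is just the object URL plus auth parameters:

    // Sketch: derive the object URL from the signed PUT URL in onFinish.
    // Assumes the signing response contains `signedUrl`; adjust to your Rails response.
    function onUploadFinish(signResult) {
      var publicUrl = signResult.signedUrl.split('?')[0];
      // e.g. https://my-bucket.s3.amazonaws.com/uploads/avatar.png
      // Send this back with the form as user.avatar_url.
      console.log('Uploaded file URL:', publicUrl);
    }

(The URL is only directly readable if the object was uploaded with a public-read ACL or the bucket policy allows it.)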

Safari doesn't like normalize() (aka Safari is the new IE)

SAFARI Version 9.0.1 (11601.2.7.2), MacOSX El President 🎁

(I'm up to date)

You'll get errors such as:

TypeError: file.name.replace(/\s+/g, "_").normalize is not a function. (In 'file.name.replace(/\s+/g, "_").normalize()', 'file.name.replace(/\s+/g, "_").normalize' is undefined)
executeOnSignedUrl
uploadFile
send
onAttachmentAdd
dispatchEvent
triggerEvent
notify
compositionDidAddAttachment
refreshAttachments
setDocument
insertText
insertAttachment
insertFile
insertFile
onFileSelect

Happens here:

var normalizedFileName = file.name.replace(/\s+/g, "_").normalize();

Resources:
http://stackoverflow.com/questions/27235902/javascript-normalize-causing-error-in-safari

Proposed fix:
URL-encoding the filename could do it.
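A minimal guard, sketched here rather than taken from the library, is to only call normalize() where the browser supports it:

    // Sketch: skip String.prototype.normalize when it's unavailable (older Safari).
    var scrubbed = file.name.replace(/\s+/g, "_");
    var normalizedFileName = (typeof scrubbed.normalize === 'function')
        ? scrubbed.normalize()
        : scrubbed;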

The request signature we calculated does not match the signature you provided. Check your key and signing method.

Hi there.

I'm getting "The request signature we calculated does not match the signature you provided. Check your key and signing method." back from AWS when trying to submit files. I'm on version "react-s3-uploader": "^3.3.0".

Any ideas what may be happening?

Here's my setup:

express:

  app.use('/s3', require('react-s3-uploader/s3router')({
    bucket: "my-queue-name",
    region: 'us-east-1',
    signatureVersion: 'v4', 
    headers: {'Access-Control-Allow-Origin': '*'}, 
    ACL: 'private',
  }));

react: (I'm using DropzoneS3Uploader, which just wraps the S3Upload component.)


    const style = {
      className: "uploadNew"
    }

    const uploaderProps = {
      style,
      server: 'http://localhost:3000',
      s3Url: 'https://dotbc-queue.s3.amazonaws.com/',
      signingUrl: "/s3/sign",
      uploadRequestHeaders: { 'x-amz-acl': 'public-read' },
      contentDisposition: "auto",
    }

    return (
      <div className="filesDocs">
        <label>Files & Documents</label>
        <div className="uploaded">
          <ul className="unstyled">
          </ul>
        </div>
        <DropzoneS3Uploader {...uploaderProps} />
      </div>
    );

Here's the complete error I'm getting back. I notice the CanonicalRequest is labeled as UNSIGNED-PAYLOAD. Not sure if that's indicative of what's going on...

<Error>
<Code>SignatureDoesNotMatch</Code>
<Message>
The request signature we calculated does not match the signature you provided. Check your key and signing method.
</Message>
<AWSAccessKeyId>AKIAI7O2H3HADOMOS3MA</AWSAccessKeyId>
<StringToSign>
AWS4-HMAC-SHA256 20161114T033436Z 20161114/us-east-1/s3/aws4_request 41fe70bff8e3ba4abbcb046361064f9a8dab17cd2b4de89d640a23f0e4a9268c
</StringToSign>
<SignatureProvided>
5dc341cd1d9d240be19cc9284aee38fbe00c93c54743dbf7f2c98740bb82d396
</SignatureProvided>
<StringToSignBytes>
41 57 53 34 2d 48 4d 41 43 2d 53 48 41 32 35 36 0a 32 30 31 36 31 31 31 34 54 30 33 33 34 33 36 5a 0a 32 30 31 36 31 31 31 34 2f 75 73 2d 65 61 73 74 2d 31 2f 73 33 2f 61 77 73 34 5f 72 65 71 75 65 73 74 0a 34 31 66 65 37 30 62 66 66 38 65 33 62 61 34 61 62 62 63 62 30 34 36 33 36 31 30 36 34 66 39 61 38 64 61 62 31 37 63 64 32 62 34 64 65 38 39 64 36 34 30 61 32 33 66 30 65 34 61 39 32 36 38 63
</StringToSignBytes>
<CanonicalRequest>
GET /28fad916-9cee-4116-93aa-f80461d77e41_Getting_Started.rtf Content-Type=text%2Frtf&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAI7O2H3HADOMOS3MA%2F20161114%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20161114T033436Z&X-Amz-Expires=60&X-Amz-SignedHeaders=host%3Bx-amz-acl&x-amz-acl=private host:dotbc-queue.s3.amazonaws.com x-amz-acl:private host;x-amz-acl UNSIGNED-PAYLOAD
</CanonicalRequest>
<CanonicalRequestBytes>
47 45 54 0a 2f 32 38 66 61 64 39 31 36 2d 39 63 65 65 2d 34 31 31 36 2d 39 33 61 61 2d 66 38 30 34 36 31 64 37 37 65 34 31 5f 47 65 74 74 69 6e 67 5f 53 74 61 72 74 65 64 2e 72 74 66 0a 43 6f 6e 74 65 6e 74 2d 54 79 70 65 3d 74 65 78 74 25 32 46 72 74 66 26 58 2d 41 6d 7a 2d 41 6c 67 6f 72 69 74 68 6d 3d 41 57 53 34 2d 48 4d 41 43 2d 53 48 41 32 35 36 26 58 2d 41 6d 7a 2d 43 72 65 64 65 6e 74 69 61 6c 3d 41 4b 49 41 49 37 4f 32 48 33 48 41 44 4f 4d 4f 53 33 4d 41 25 32 46 32 30 31 36 31 31 31 34 25 32 46 75 73 2d 65 61 73 74 2d 31 25 32 46 73 33 25 32 46 61 77 73 34 5f 72 65 71 75 65 73 74 26 58 2d 41 6d 7a 2d 44 61 74 65 3d 32 30 31 36 31 31 31 34 54 30 33 33 34 33 36 5a 26 58 2d 41 6d 7a 2d 45 78 70 69 72 65 73 3d 36 30 26 58 2d 41 6d 7a 2d 53 69 67 6e 65 64 48 65 61 64 65 72 73 3d 68 6f 73 74 25 33 42 78 2d 61 6d 7a 2d 61 63 6c 26 78 2d 61 6d 7a 2d 61 63 6c 3d 70 72 69 76 61 74 65 0a 68 6f 73 74 3a 64 6f 74 62 63 2d 71 75 65 75 65 2e 73 33 2e 61 6d 61 7a 6f 6e 61 77 73 2e 63 6f 6d 0a 78 2d 61 6d 7a 2d 61 63 6c 3a 70 72 69 76 61 74 65 0a 0a 68 6f 73 74 3b 78 2d 61 6d 7a 2d 61 63 6c 0a 55 4e 53 49 47 4e 45 44 2d 50 41 59 4c 4f 41 44
</CanonicalRequestBytes>
<RequestId>B81DDBE145ECAA39</RequestId>
<HostId>
l+kVUQMe4jrC8EM88iML42PLRxiKhG1XmxdFkEt9k7C8lPbVmlsTQURN3ZFRe7FoEe+9IcK7g48=
</HostId>
</Error>

How can I resize the image?

I am trying to resize the image in the preprocess event using an HTML5 canvas, but after resizing I have a blob that doesn't have the file.name and file.type attributes, which causes the operation to fail. Any idea how I can resize the image in a way that lets me continue the process with the new, resized image?
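For illustration, a sketch (assuming a browser with canvas.toBlob and that your version's preprocess accepts whatever you pass to next) that resizes on a canvas and copies the original name onto the result so the rest of the pipeline can keep using it:

    // Sketch: resize in preprocess() and hand a named blob back to next().
    function preprocess(file, next) {
      var img = new Image();
      img.onload = function () {
        var canvas = document.createElement('canvas');
        canvas.width = 800;                                   // example target width
        canvas.height = Math.round(800 * img.height / img.width);
        canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
        canvas.toBlob(function (blob) {
          blob.name = file.name;                              // blobs have no name by default
          next(blob);                                         // blob.type is set by the toBlob argument
        }, file.type);
      };
      img.src = URL.createObjectURL(file);
    }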

Need help with issues setting up react-s3-uploader in my React (UI) -> Node.js (backend) app

I am getting the below error in my Node.js service. My UI app (built with React, using react-s3-uploader) is running on port 3000.
The error message in the service is below:
Origin:http://localhost:3000
{ [TimeoutError: Missing credentials in config]
message: 'Missing credentials in config',
code: 'CredentialsError',
time: Thu Apr 21 2016 21:49:50 GMT+0530 (India Standard Time),
originalError:
{ message: 'Could not load credentials from any providers',
code: 'CredentialsError',
time: Thu Apr 21 2016 21:49:50 GMT+0530 (India Standard Time),
originalError:
{ message: 'Connection timed out after 1000ms',
code: 'TimeoutError',
time: Thu Apr 21 2016 21:49:50 GMT+0530 (India Standard Time) } } }
Node.js service:

var AWS = require('aws-sdk');
AWS.config.loadFromPath('./config.json');
app.use('/s3', require('react-s3-uploader/s3router')({
  bucket: "pictureBucket",
  region: 'us-west-2', // optional
  headers: {'Access-Control-Allow-Origin': '*'}, // optional
  ACL: 'private' // this is default
}));

config.json:

{
"accessKeyId": "xxxxx",
"secretAccessKey":"yyyyy",
"region": "xxxxx"
}

ui app:

<ReactS3Uploader
  signingUrl="/s3/sign"
  accept="image/*"
  onProgress={this.onUploadProgress}
  onError={this.onUploadError}
  onFinish={this.onUploadFinish}
  signingUrlHeaders={{ "additional": "headers" }}
  signingUrlQueryParams={{ "additional": "query-params" }}
  uploadRequestHeaders={{ 'x-amz-acl': 'public-read' }}
  contentDisposition="auto"
  server="http://localhost:8000"/>

My service app is running on port 8000. Why doesn't Node recognize the credentials from my config.json?
Is there something needed to connect the AWS config to the S3 bucket?
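One thing worth double-checking (a guess, not a confirmed diagnosis): the SDK falls back to the EC2 metadata service (hence the 1000ms timeout) when it finds no credentials, so make sure the credentials are applied to the global config before the router is mounted, and that the config.json path resolves from the directory you start Node in. A sketch:

    // Sketch: set credentials explicitly before mounting the s3router,
    // instead of relying on a relative config.json path.
    var AWS = require('aws-sdk');
    AWS.config.update({
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      region: 'us-west-2'
    });

    app.use('/s3', require('react-s3-uploader/s3router')({
      bucket: 'pictureBucket',
      region: 'us-west-2'
    }));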

ACL:'private'

What is the purpose of setting ACL to private versus public-read? I can't seem to get my uploaded file to work correctly per the README. Perhaps allow passing a param for the ACL?

I'm trying to access the URL via my Express server, i.e. /s3/img/whatever.foo.jpg

Getting error with ES6

My code

import s3router from 'react-s3-uploader/s3router';
export default (app, auth) => {
  app.use('/s3', require('react-s3-uploader/s3router'))({
      bucket: 'jsan-assets',
      region: 'us-west-1', //optional
      // signatureVersion: 'v4', //optional (use for some amazon regions: frankfurt and others)
      // headers: {'Access-Control-Allow-Origin': '*'}, // optional
      // ACL: 'private' // this is default
    });
};

Where app is
const app = express();

/Users/aamirafridi/Sites/uroosi_web/node_modules/express/lib/router/index.js:130
  var search = 1 + req.url.indexOf('?');
                          ^

TypeError: Cannot read property 'indexOf' of undefined
    at Function.proto.handle (/Users/aamirafridi/Sites/uroosi_web/node_modules/express/lib/router/index.js:130:27)
    at EventEmitter.app.handle (/Users/aamirafridi/Sites/uroosi_web/node_modules/express/lib/application.js:170:10)
    at app (/Users/aamirafridi/Sites/uroosi_web/node_modules/express/lib/express.js:28:9)
    at exports.default.app.get (routes.js:16:3)
    at Object.<anonymous> (app.js:37:1)
    at Module._compile (module.js:413:34)
    at loader (/Users/aamirafridi/Sites/uroosi_web/node_modules/babel-register/lib/node.js:126:5)
    at Object.require.extensions.(anonymous function) [as .js] (/Users/aamirafridi/Sites/uroosi_web/node_modules/babel-register/lib/node.js:136:7)
    at Module.load (module.js:357:32)
    at Function.Module._load (module.js:314:12)

Question: Can I have the server specify the upload filename/folder?

My scenario is I have a number of users who could upload files. I don't want them all going into the same folder, nor do I want to use the user submitted filenames. I'd rather make a request to the server which will then provide the right path to upload the file to based on the user.

Is that possible with this component?

Thanks,
Punit
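For reference, a sketch of one way to do this, assuming your version of the bundled s3router supports the getFileKeyDir option that other issues here reference for choosing the upload folder server-side:

    // Sketch: let the server pick the folder per request, e.g. per authenticated user.
    app.use('/s3', require('react-s3-uploader/s3router')({
      bucket: 'my-bucket',
      getFileKeyDir: function (req) {
        return 'uploads/' + req.user.id;   // ignore the client-supplied folder entirely
      }
    }));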

onStart parameter

It would be useful if there were an onStart or onWaiting parameter where you can run a function before uploading starts, rather than checking the status inside of onProgress. What do you think?

Thanks!

Invoke 'clear' method

Hi, thanks for the awesome component.

I noticed there's a method called clear in ReactS3Uploader.js, but it isn't exposed. Say I want to reset the input value to null every time after uploading; how can I use that?

Many thanks!
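As a workaround (not a documented API, and it may not survive upgrades), you could keep a ref to the component instance and call clear() on it after the upload finishes, if your version exposes it on the instance:

    // Sketch: invoke the (undocumented) clear() method through a ref after onFinish.
    <ReactS3Uploader
      ref={(uploader) => { this.uploader = uploader; }}
      signingUrl="/s3/sign"
      onFinish={(signResult) => {
        if (this.uploader && typeof this.uploader.clear === 'function') {
          this.uploader.clear();           // resets the underlying file input
        }
      }}
    />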

Upgrade to ReactJS 0.14

Hi,
I upgraded my environment to React version 0.14.
I can't use the uploader anymore; I think this stems from this instruction:
fileElement: this.getDOMNode(), because the exception I get is:
Warning: Unknown.getDOMNode(...) is deprecated. Please use ReactDOM.findDOMNode(instance) instead.

Add test suite

This library is getting reasonably popular, so we really need to get our stuff together and add a test suite.

Return mime-type

Is there any way to return the mime-type, for optional saving in our DB?

Question : How can I manually set filename?

I am working on uploading a profile image. I want to manually set the filename to the user's userId so that when the same user uploads a new profile image, it replaces the old one. I found how to set the directory in #61, but I still can't figure out how to manually set the filename.

Thank you :)
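For illustration, a sketch of a custom signing endpoint (replacing the bundled s3router; route and bucket names here are hypothetical) that pins the object key to the user's id, so each new upload overwrites the previous avatar:

    // Sketch: sign PUT URLs yourself so you control the exact S3 key.
    var AWS = require('aws-sdk');
    var s3 = new AWS.S3();

    app.get('/s3/sign', function (req, res) {
      var params = {
        Bucket: 'my-bucket',
        Key: 'avatars/' + req.user.id,          // fixed per user -> replaces the old image
        ContentType: req.query.contentType,
        Expires: 60
      };
      s3.getSignedUrl('putObject', params, function (err, url) {
        if (err) return res.status(500).json({ error: 'Could not sign request' });
        res.json({ signedUrl: url });
      });
    });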

Consider making a standalone version

This component actually depends heavily on npm modules: latinize, unorm and object-assign.

Because I'm not using node in the current project, I had to edit the source to fit my needs. The thing is, I believe that what latinize and unorm achieve should be opt-in behavior instead of the default; it is part of the preprocess step that makes sure the filename is correct.

What object-assign achieves can also be done using $.extend, which is generally better supported.

Thoughts?

Option is not defined

See this line. As option is not defined, strict mode will throw an error; changing it to for (var option in options) should resolve it.

SyntaxError with accented filenames

Hi guys,

It seems that I have a problem with accented filenames (here, 3ème_étage.JPG; 🇫🇷 people ... 😁 )

I don't quite know where to start. Is this react-s3-uploader related? Or does it come from my Rails server?

Thanks for the awesome work. We're rolling it into production in a few days :)

The error :

Uncaught SyntaxError: Failed to execute 'setRequestHeader' on 'XMLHttpRequest': 'inline; filename=3ème_étage.JPG' is not a valid HTTP header field value.S3Upload.uploadToS3 @ bundle.js?body=1:52653(anonymous function) @ bundle.js?body=1:52670(anonymous function) @ bundle.js?body=1:52610onReadyStateChangeReplacement @ includes.js?v=9db1f8db18400644bd5c7449e5295620:561
bundle.js?body=1:51078 Object {productType: Object, itemId: 1446286749503, options: Object, price: Object}
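A common workaround, sketched here (not necessarily what the library ended up doing), is to encode the filename before it goes into the Content-Disposition header, since raw non-ASCII characters are not valid in an HTTP header value:

    // Sketch: `xhr` is the upload XMLHttpRequest; encode the accented filename first.
    var safeName = encodeURIComponent(file.name);   // "3%C3%A8me_%C3%A9tage.JPG"
    xhr.setRequestHeader('Content-Disposition', 'inline; filename="' + safeName + '"');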

manually start uploading

Hi,

I'd like to use a different button to start the upload. How can I achieve this? I could interrupt the upload in the preprocess hook, but I don't know how to trigger it after that.

Thanks!
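One way to sketch this (an illustration, not a built-in feature) is to hold on to preprocess's next callback and invoke it from your own button:

    // Sketch: pause the upload in preprocess and resume it from a separate button.
    var pending = null;

    function preprocess(file, next) {
      pending = { file: file, next: next };   // not calling next() yet keeps the upload waiting
    }

    function onUploadButtonClick() {
      if (pending) {
        pending.next(pending.file);           // now the upload actually starts
        pending = null;
      }
    }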

"peerinvalid" npm problems

I packed this into my pull request and almost immediately regretted it:

I can't seem to get this module to live happily inside my package.json file:

npm ERR! Darwin 15.0.0
npm ERR! argv "node" "/usr/local/bin/npm" "install"
npm ERR! node v0.12.7
npm ERR! npm  v2.11.3
npm ERR! code EPEERINVALID

npm ERR! peerinvalid The package react does not satisfy its siblings' peerDependencies requirements!
npm ERR! peerinvalid Peer [email protected] wants react@>=0.13
npm ERR! peerinvalid Peer [email protected] wants react@*

npm ERR! Please include the following file with any support request:
npm ERR!     /some/dir/nar/GrandCanyon/GrainSizeCalculator/dgs-web/npm-debug.l

I can remove the lines

  "peerDependencies": {
    "express": "4.x",
    "react": "*"
  },

and that seems to fix it but I'd love to know the "right" way to do it.

Cheers,

Needs to be updated.

Now that React has been separated into react and react-dom, those of us on the new versions really want to keep using this great library. Thanks!

Separate server and client components

Seems like it would be more versatile to separate the server and client components into separate (npm) modules. People should be able to install just the React bit without also pulling in the Express dependency.

preprocess not called again

I have an issue where the 'next' callback function is not called in the preprocess handler: preprocess won't be called again if I select a new image. It might simply be that I am using this library wrong, so any input on what I am doing wrong is greatly appreciated. The code I have now is:

<ReactS3Uploader
  signingUrl="/api/s3/sign"
  accept="image/*"
  preprocess={(file, next) => {
    console.log("S3 start upload", file);
    //next(file) //temp
  }}
  onProgress={p => console.log("S3 progress", p)}
  onError={e => console.log("S3 error", e)}
  onFinish={status => {
    console.log("S3 finished upload", status);
  }}
  contentDisposition="auto"
  uploadRequestHeaders={{}}
/>

Note that I have explicitly commented out the call to next. This is to simulate the situation where uploading the image is aborted for some reason. According to the docs this should be legal:

The preprocess(file, next) prop provides an opportunity to do something before the file upload begins, modify the file (scaling the image for example), or abort the upload by not calling next(file).

This successfully aborts the upload but then I am not allowed to retry the situation (i.e. upload a new image). I can select a new image, but preprocess is not called again for the (new) image.

I am using version 3.3.0.

getSignedUrl is not getting called...

I am using ReactS3Uploader with a custom getSignedUrl function, as below. The onUploadStart function is getting called, but there is no activity after that; getSignedUrl is never called. Any idea why this is happening? Also, what is className={uploaderClassName} in the official example?

      <ReactS3Uploader
        getSignedUrl={this.getS3SignedUrl.bind(this)}
        accept="image/*"
        preprocess={this.onUploadStart.bind(this)}
        onProgress={this.onUploadProgress.bind(this)}
        onError={this.onUploadError.bind(this)}
        onFinish={this.onUploadFinish.bind(this)}
        uploadRequestHeaders={{ 'x-amz-acl': 'public-read' }}
        contentDisposition="auto"/>
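Two things the flow depends on, sketched below as guesses about this setup rather than a confirmed diagnosis: preprocess must hand the file back via next(file) before anything else runs, and getSignedUrl receives (file, callback) and must invoke the callback with the signing result.

    // Sketch: preprocess has to call next(file), otherwise getSignedUrl is never reached.
    onUploadStart(file, next) {
      console.log('starting upload of', file.name);
      next(file);
    }

    // Sketch: getSignedUrl fetches the signature from your backend and passes it on.
    getS3SignedUrl(file, callback) {
      fetch('/s3/sign?objectName=' + encodeURIComponent(file.name) +
            '&contentType=' + encodeURIComponent(file.type))
        .then((res) => res.json())
        .then((data) => callback(data));      // e.g. { signedUrl: '...' }
    }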

Document support for using with react-dropzone

I've personally integrated this with react-dropzone, but the code was too specific to my use case to release.

We should add some code and documentation for the preferred way to integrate with react-dropzone.

For those wondering, I essentially bypassed the ReactS3Uploader react component and just included s3upload.js directly. Using the S3Upload class, I just wrapped each dropped file in a new instance, which kicks off the same upload process as using the component.
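For those wanting a concrete starting point, here is a sketch of the approach described above (option names follow the README, but treat them as assumptions for your version):

    // Sketch: skip the React component and feed react-dropzone's files to S3Upload directly.
    import S3Upload from 'react-s3-uploader/s3upload';

    function onDrop(acceptedFiles) {
      acceptedFiles.forEach((file) => {
        new S3Upload({
          files: [file],
          signingUrl: '/s3/sign',
          onProgress: (percent) => console.log('progress', percent),
          onFinishS3Put: (signResult) => console.log('done', signResult),
          onError: (message) => console.error(message)
        });
      });
    }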

403 Forbidden: The request signature we calculated does not match the signature you provided. Check your key and signing method.

While I have been able to successfully upload images via the library, when I try to upload audio files I get a 403 error.

<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><AWSAccessKeyId>AKIAJJ5CBW6HXR23P7CQ</AWSAccessKeyId><StringToSign>PUT

Any ideas?

Integrated Progress Bar and Improved Default Styling

Do you have any thoughts on improving the component stylistically and/or including a react-bootstrap progress bar? This might be outside of the project scope, though many other (similar) components like react-dropzone come prepackaged with styles.

I have an extremely rough version of what I'm talking about at assignmentexchangestaging.herokuapp.com (log in with test_tutor1:test_tutor and click the paperclip icon), or see the image below.

(Screenshot: screen shot 2015-07-25 at 9.58.59 am)

Where to get data to validate file?

I'm having issues running client-side and backend validation on the uploaded files, since I don't see anything in the component's API that gives access to the files. How would this be achieved?

Thanks!
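One place to do the client-side part, sketched under the assumption that your version supports the preprocess(file, next) prop described in the README; server-side checks still belong in your signing endpoint:

    // Sketch: validate in preprocess; not calling next() aborts the upload.
    function preprocess(file, next) {
      if (file.size > 5 * 1024 * 1024) {        // example limit: 5 MB
        alert('File is too large');
        return;
      }
      if (!/^image\//.test(file.type)) {
        alert('Only images are allowed');
        return;
      }
      next(file);
    }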

Clear file input field

From my preprocess function, I'd like to clear the file input field if my checks fail. Right now, I can abort the upload by not calling next(file), but I don't know of a way to clear the file input field so it's clear to the user that they need to select a different file.
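As a workaround sketch (the ref handling and findDOMNode usage are assumptions about your React version, not a documented API), you can reset the underlying file input when you abort:

    // Sketch: clear the file input when preprocess decides to abort.
    import ReactDOM from 'react-dom';

    preprocess = (file, next) => {
      if (!this.checksPass(file)) {            // checksPass is a hypothetical validator
        const input = ReactDOM.findDOMNode(this.uploaderRef);  // ref to <ReactS3Uploader />
        if (input) input.value = '';           // removes the displayed filename
        return;                                // not calling next() aborts the upload
      }
      next(file);
    };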

Re initialize the component when onFinish

Hi guys,

I've been successful in integrating your module (awesome work, thanks 👍).
With the onFinish(signResult) method I render another component, to show the uploaded file, as it's part of a form. What I'd like to know is if there is a way to "clean" the ReactS3Uploader component when upload is finished, to not display the filename anymore.

Thanks again,

Question: How to set Cache-Control on upload?

I've tried:

signingUrlQueryParams={{ 'x-amz-meta-Cache-Control': 'max-age=31536000'}}
signingUrlHeaders={{ 'x-amz-meta-Cache-Control': 'max-age=31536000'}}
uploadRequestHeaders={{ 'x-amz-meta-Cache-Control': 'max-age=31536000'}}

The first two allow the upload to go through but don't set the Cache-Control metadata. The last returns a 403 error. Any insight?

Much appreciated.

Question: How does it manage to upload a handful of files at a time?

S3Upload.prototype.handleFileSelect = function(files) {
    var result = [];
    for (var i=0; i < files.length; i++) {
        var file = files[i];
        this.preprocess(file, function(processedFile){
          this.onProgress(0, 'Waiting', processedFile);
          result.push(this.uploadFile(processedFile));
          return result;
        }.bind(this));
    }
};

From the code snippet above, when a lot of files are selected, how does it manage to upload a maximum of N files at a time?

Failed to load resource: Method PUT is not allowed by Access-Control-Allow-Methods.

I am able to generate what looks like a valid signing url from my back-end, but after selecting a file this is what I see in the console:

Upload progress: 0% Waiting
Failed to load resource: Method PUT is not allowed by Access-Control-Allow-Methods.
XMLHttpRequest cannot load https://amazon.s3.endpoint/myKey?AWSAccessKeyId=myValidAcccessKey&Expires=1471656383&Signature=xYZpoaKUlgMjPdarhSaqu4AwCE0%3D. Method PUT is not allowed by Access-Control-Allow-Methods.
error – "XHR error"

I have spent a lot of time making sure the CORS settings on S3 and my back-end are correct, but no joy. I have tried this with both Amazon and an S3-compatible alternative, both with the same result.

Anything I can try?
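For comparison, a sketch of a bucket CORS configuration that permits the PUT the uploader performs (shown via the aws-sdk here; the same rules can be entered in the S3 console, and the origin should be restricted in production):

    // Sketch: allow cross-origin PUT/GET on the bucket.
    var AWS = require('aws-sdk');
    var s3 = new AWS.S3();

    s3.putBucketCors({
      Bucket: 'my-bucket',
      CORSConfiguration: {
        CORSRules: [{
          AllowedOrigins: ['*'],
          AllowedMethods: ['PUT', 'GET'],
          AllowedHeaders: ['*']
        }]
      }
    }, function (err) {
      if (err) console.error('Could not set CORS rules:', err);
    });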

File type check not working

In the example code in the readme there is an accept="image/*" prop, but it is never actually used by the React component.

S3Router "region" Option Updates and Uses Global aws-sdk Region Option

As currently implemented, the region option in s3Router is passed to aws.config.update():

aws.config.update({region: options.region});

The aws.config.update function will update the global config object for the aws-sdk, meaning that any other service objects instantiated after s3Router will default to the region passed in the s3router options.

Also, because of the way that s3router instantiates an s3object on each request, any changes to the global aws.config.region later could result in a different region being used than the one passed to s3router initially.

The AWS docs reference the global nature of the aws.config object specifically just below "Setting The Region" in the "Locking API Versions" section: http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-configuring.html#Setting_the_Region

The better option is to pass an object with the region setting directly to the s3 objects created by s3router. PR submitted: #47
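For reference, a sketch of the alternative described above: scope the region to the service object instead of mutating the global config.

    // Sketch: pass the region to the S3 constructor rather than aws.config.update().
    var aws = require('aws-sdk');
    var s3 = new aws.S3({ region: options.region });   // options = the s3router options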

SignatureDoesNotMatch Error

Using the latest version, created a simple test based on the readme.

I have valid credentials and tested them using the following express route:

router.get('/test', function(req, res) {
    var params = {
        Bucket: S3_BUCKET,
        Key: getFileKeyDir(req) + '/' + req.params[0]
    };
    var s3 = new aws.S3();
    s3.listBuckets(function(error, data) {
        if (error) {
            console.log(error); // error is Response.error
        } else {
            console.log(data); // data is Response.data
            res.json(data);
        }
    });        
});

However, calls to .../get/img/t.png return the signature error.

I have manually uploaded a file to the bucket and can view it using other 3rd party tools using the same credentials.

Any ideas?

Btw, I get the same error when uploading.

unorm normalize call could be done server-side

I'm not sure that I've read the code properly, but it looks to me like we have a call to unorm.nfc to normalize the filename. I think that this code could be moved to the server-side? This would save the 140kb unorm library from being downloaded to the client, which I think would be a big improvement to this library.

Thanks!

Cannot find module 'domain'.

I have a problem with the react-s3-uploader s3router module.
I implement it this way (ES6 compiled with Babel):

import s3router from 'react-s3-uploader/s3router'
app.use('/s3', s3router({bucket:'bucket',region:'eu-central-1',ACL:'private'}))

If I try to run the compiled application I get this error:

webpack:///./~/aws-sdk/lib_^\.\/.*$?:152
    return map[req] || (function() { throw new Error("Cannot find module '" + req
                                           ^
Error: Cannot find module 'domain'.
    at eval (webpack:///./~/aws-sdk/lib_^\.\/.*$?:152:41)
    at webpackContextResolve (webpack:///./~/aws-sdk/lib_^\.\/.*$?:152:89)
    at webpackContext (webpack:///./~/aws-sdk/lib_^\.\/.*$?:149:29)
    at Object.nodeRequire (webpack:///./~/aws-sdk/lib/util.js?:40:55)
    at Object.eval (webpack:///./~/aws-sdk/lib/request.js?:4:23)
    at Object.<anonymous> (/Users/ffx-operator/RedBull/rbphotography/server/app-compiled.js:2195:2)
    at __webpack_require__ (/Users/ffx-operator/RedBull/rbphotography/server/app-compiled.js:20:30)
    at Object.eval (webpack:///./~/aws-sdk/lib/core.js?:86:1)
    at Object.<anonymous> (/Users/ffx-operator/RedBull/rbphotography/server/app-compiled.js:1919:2)
    at __webpack_require__ (/Users/ffx-operator/RedBull/rbphotography/server/app-compiled.js:20:30)

I use iojs-v1.6.3.
In the iojs CLI I can require domain with require('domain') without any problems. Could it be a problem with webpack?
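This usually points at bundling server code with webpack's default browser target; a sketch of the relevant settings (webpack-node-externals is one common choice, and the exact setup is an assumption about your build):

    // Sketch: target node and leave node_modules (including aws-sdk) as runtime requires.
    var nodeExternals = require('webpack-node-externals');

    module.exports = {
      target: 'node',              // keeps built-ins like 'domain' resolvable
      externals: [nodeExternals()]
      // ...rest of the server webpack config
    };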

Allow Options for S3 Service Object to be Passed To S3 Router

For local development, there are a handful of useful tools that mock some of S3's functionality like https://github.com/jubos/fake-s3.

Typically using such tools requires providing an "endpoint" config option when the aws-sdk's S3 service object is constructed (http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#constructor-property).

It would be useful to be able to pass an optional config object to the S3Router constructor that would then be passed to the aws-sdk when constructing the S3 object so that options such as endpoint could be provided.

I wanted to get any thoughts on the name of the option, or alternative ideas. I'd be happy to submit a PR.
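For context, a sketch of the kind of S3 constructor options such a config object would forward, here pointed at a local fake-s3 style server (the values are examples, and the router option itself is the proposal, not an existing API):

    // Sketch: aws-sdk S3 options that a local S3 mock typically needs.
    var aws = require('aws-sdk');
    var s3 = new aws.S3({
      endpoint: 'http://localhost:4567',
      s3ForcePathStyle: true,        // mock servers usually want path-style URLs
      accessKeyId: '123',
      secretAccessKey: 'abc'
    });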

The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.

I'm unable to retrieve the CORS permissions from my bucket because the signed URL that the router generates is of the form https://s3.amazonaws.com/example/blahblah..., but my bucket responds with an error message telling me to use an address like https://example.s3.amazonaws.com/blahblah... Is there some way to change the way the router generates the URL?

Feature: Set header for each file

Hey,

thanks for the nice uploader. Worked well for me!

I want to save metadata with the S3 object. This can be done via x-amz-meta-NAME headers.
As I allow uploading multiple files at once, the uploadRequestHeaders property should accept a function, like getSignedUrl(file, callback) does, so one could set individual header values for each file.

What do you think?

Cheers
Marcel

File size check

Is it possible to add a check for the minimum and maximum file size before uploading to S3?

TransferPropsTo - ReactS3Uploader.js

transferPropsTo doesn't work with the latest version of React.

I changed the render method to:

render: function() {
    return (
        React.DOM.input({type: 'file', onChange: this.uploadFile, onProgress: this.props.onProgress, onFinish: this.props.onFinish, onError: this.props.onError})
    );
}
