kukhariev / node-uploadx
Node.js middleware for handling resumable uploads
License: MIT License
Hi,
I am using node-uploadx in a TypeScript project. When I compile the source code (.ts) to JavaScript (.js), node-uploadx produces the following error:
Module 'node-uploadx/lib/core' has no exported member 'Range'.
Is this caused by the '@internal' annotation in the source code (core/interface.ts), and how can I fix this issue?
/**
 * @internal
 */
export interface Range {
  total?: number;
  end?: number;
  start?: number;
  id: string;
}
Thank you in advance.
Thanks for this interesting module. I'm interested in using it for PeerTube: we'd like to create a video entity upon file upload and then return its id etc. to the client. According to the source code and examples, it looks like it's not possible to modify the server response. A workaround could be to run an additional request after the upload finishes, but that may leave "orphaned" files (if the additional request fails) and is also less efficient.
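One possible approach, assuming onComplete's return value is serialized as the response body (createVideoEntity is a hypothetical application function, not part of the library):

```javascript
// Hypothetical stand-in for the application's entity creation:
const createVideoEntity = file => ({ id: 'video-1', name: file.originalName });

// Sketch: create the entity when the upload finishes and return its id to
// the client, assuming onComplete's return value becomes the response body.
const opts = {
  directory: 'upload',
  onComplete: file => {
    const video = createVideoEntity(file);
    return { id: video.id };
  }
};
```

This would avoid the extra round trip and the risk of orphaned files, since the entity is created in the same request that completes the upload.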
Hi, I have a project where I use ngx-uploadx in the client with Angular, combined with a Node server using node-uploadx.
I am migrating the client project to React. Is there any ngx-uploadx alternative that is compatible with React? I want to keep node-uploadx on the server side.
Thank you very much in advance.
Sometimes metadata updates are not saved to disk and are lost on server restart.
node-uploadx on Windows throws an error when uploading to an existing directory, because of the ensureFile function (utils.js). fsMkdir and fsClose/fsOpen are in the same try/catch, so an existing directory makes fsMkdir throw and the file-creation step is skipped. The server error then surfaces in the async create function (disk-storage.js) at the getFileSize step, because ensureFile never created the file.
When maxUploadSize is exceeded, the HTTP status should be set to 413 Payload Too Large.
When the file MIME type doesn't match allowMIME, the HTTP status should be set to 415 Unsupported Media Type.
In addition, it would be great if the error message could be customized. The API could look like:
const opts = {
  validation: {
    mime: {
      value: ['image/png'],
      message: 'Only PNG is supported.',
    },
    size: {
      value: '50GB',
      message: 'The file is way too big.',
    },
  },
};
app.use('/upload/files', uploadx(opts));
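As a sketch of the status-code part of this proposal (a hypothetical helper, not the library's current behavior), the two validation failures could map to responses like this:

```javascript
// Hypothetical mapping from validation failures to the proposed statuses.
function statusFor(failure) {
  switch (failure) {
    case 'size': // maxUploadSize exceeded
      return 413; // Payload Too Large
    case 'mime': // type not in allowMIME
      return 415; // Unsupported Media Type
    default:
      return 400; // any other validation error
  }
}
```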
I'm opening this issue because:
npm is crashing.
Error: EPERM: operation not permitted, open 'F:\server\universal\public\bbb_sunflower_2160p_30fps_normal.mp4'
supporting information:
npm -v prints: 6.1.0
node -v prints: 8.11.3
Windows, OS X/macOS, or Linux?: Windows 10
Network issues: no
@kukhariev thanks for your efforts!
This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- @typescript-eslint/eslint-plugin, @typescript-eslint/parser
- @types/jest, jest, ts-jest
These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below.
.github/workflows/test.yml
actions/checkout v3
actions/setup-node v3
actions/cache v3
package.json
@types/express 4.17.13
@types/jest 28.1.8
@types/node 14.18.26
@types/rimraf 3.0.2
@types/supertest 2.0.12
@typescript-eslint/eslint-plugin 5.35.1
@typescript-eslint/parser 5.35.1
eslint 8.23.0
eslint-config-prettier 8.5.0
eslint-plugin-jest 27.0.1
eslint-plugin-jest-formatting 3.1.0
eslint-plugin-prettier 4.2.1
express 4.18.1
husky 8.0.1
jest 28.1.3
lint-staged 13.0.3
memfs ^3.4.4
node-mocks-http 1.11.0
oao 2.0.2
prettier 2.7.1
rimraf 3.0.2
supertest 6.2.4
ts-jest 28.0.8
ts-node-dev 2.0.0
tsconfig-paths 4.1.0
typescript 4.8.2
node >=14.18.20
yarn >=1.22.15
packages/core/package.json
bytes ^3.1.0
multiparty ^4.2.2
parse-duration ^1.0.0
@types/bytes 3.1.1
@types/multiparty 0.0.33
node >=14.18.20
packages/gcs/package.json
abort-controller ^3.0.0
google-auth-library ^8.0.0
node-fetch ^2.6.7
@types/node-fetch ^2.5.12
@uploadx/core ^6.0.0
node >=14.18.20
packages/node-uploadx/package.json
@uploadx/core ^6.0.0
@uploadx/gcs ^6.0.0
@uploadx/s3 ^6.0.0
node >=14.18.20
packages/s3/package.json
@aws-sdk/client-s3 ^3.47.0
@aws-sdk/credential-providers ^3.47.0
aws-sdk-client-mock ^1.0.0
@aws-sdk/types ^3.47.0
@uploadx/core ^6.0.0
node >=14.18.20
Here is my example code:
const { uploadx } = require('@uploadx/core');
const { S3Storage } = require('@uploadx/s3');
const storage = new S3Storage({
  endpoint: process.env.ENDPOINT, // I want to add a DigitalOcean endpoint here, e.g. https://{region}.digitaloceanspaces.com
  bucket: process.env.BUCKET,
  region: process.env.REGION,
  apiVersion: '2006-03-01',
  credentials: {
    accessKeyId: process.env.ACCESS_KEY_ID,
    secretAccessKey: process.env.SECRET_ACCESS_KEY,
  },
  forcePathStyle: true,
  expiration: { maxAge: '1h', purgeInterval: '15min' },
  onComplete: file => {
    return 'completed';
  },
  filename: file => file.originalName
});
app.use('/uploads', uploadx({ storage }), function (req, res) {
  var file = req.body;
  res.send(file);
});
Error thrown:
Error: getaddrinfo ENOTFOUND testing.s3.nyc1.amazonaws.com
    at GetAddrInfoReqWrap.onlookup [as oncomplete] (node:dns:108:26) {
  errno: -3008,
  code: 'ENOTFOUND',
  syscall: 'getaddrinfo',
  hostname: 'testing.s3.nyc1.amazonaws.com',
  '$metadata': { attempts: 1, totalRetryDelay: 0 }
}
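The hostname in the error is an amazonaws.com host, which suggests the custom endpoint never reached the underlying S3 client. Assuming S3Storage forwards its options to the AWS SDK v3 S3Client, the `endpoint` value should be the bare regional origin, with the bucket name kept out of the host (hypothetical helper):

```javascript
// Hypothetical helper: build the Spaces origin for a region. The SDK derives
// the final host from this plus the bucket (or the path, with forcePathStyle),
// so the bucket name must not be baked into the endpoint itself.
const spacesEndpoint = region => `https://${region}.digitaloceanspaces.com`;

// e.g. set ENDPOINT to spacesEndpoint('nyc1') before constructing S3Storage
```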
"npm audit" generate a vulnerability report, if a project is using the latest version of node-uploadx.
Root cause is an outdated version of node-forge. Please update the "google-auth-library" dependency to the latest version to fix the bug.
=== npm audit security report ===
Manual Review
Some vulnerabilities require your attention to resolve
Visit https://go.npm.me/audit-guide for additional guidance
Severity: High — Prototype Pollution in node-forge
Package: node-forge
Patched in: >= 0.10.0
Dependency of: node-uploadx
Path: node-uploadx > google-auth-library > gtoken > google-p12-pem > node-forge
More info: https://npmjs.com/advisories/1561
Hi @kukhariev, when using the koa2 framework, the routing controller callback for uploading files is async, while the upload middleware is defined with a sync Express-style signature, and the ServerResponse callback inside it runs asynchronously.
So do you know how to make it fit into koa2, or could you provide an async upload version?
Thanks
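One way to bridge the gap, assuming uploadx returns an Express-style (req, res, next) middleware, is a small promise adapter for Koa 2 (a sketch, not an official API):

```javascript
// Sketch: adapt an Express-style middleware for Koa 2 by wrapping its
// callback in a promise and letting it write to the raw Node response.
const adapt = middleware => ctx =>
  new Promise((resolve, reject) => {
    ctx.respond = false; // the middleware writes directly to ctx.res
    middleware(ctx.req, ctx.res, err => (err ? reject(err) : resolve()));
  });

// assumed usage: app.use(adapt(uploadx({ directory: 'upload' })));
```

Setting `ctx.respond = false` tells Koa not to touch the response, which matters because the uploadx handler manages status codes and headers itself.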
Hello there,
I am facing a problem when working with the node-uploadx package.
I installed node-uploadx normally on my local machine and tested it by uploading a file; it works perfectly on localhost, and I found the file at this URL:
localhost:3000/upload/files
But when I deployed the project on CLOUDWAYS, the package didn't work properly: when I tried to upload a file, it was uploaded to this URL:
https://127.0.0.1:3000 (this URL is found and resolved from the .htaccess configuration file)
while the file should be uploaded to another URL (my server URL on Cloudways, api.thebravespirit.org).
Are there any suggestions about this problem? @kukhariev @onkeloki
Thank you
workaround: add at least one listener registered for the 'error' event:
uploads.on('error', err => {});
Would it be possible to provide an API to store file parts (ETags and part numbers) in a custom cache for S3 direct uploads? That would make it possible to keep the server stateless.
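No such extension point exists today as far as I can tell; one possible shape for a pluggable part cache (entirely hypothetical) is a small async get/add/delete interface that a Redis- or database-backed implementation could share:

```javascript
// Hypothetical pluggable cache for multipart state, keyed by upload id and
// holding { PartNumber, ETag } entries. An in-memory reference version:
class MemoryPartCache {
  constructor() {
    this.parts = new Map();
  }
  async get(uploadId) {
    return this.parts.get(uploadId) || [];
  }
  async add(uploadId, part) {
    const list = await this.get(uploadId);
    this.parts.set(uploadId, [...list, part]);
  }
  async delete(uploadId) {
    this.parts.delete(uploadId);
  }
}
```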
Please guide me in detail with a proper demo of how to use this in my project.
I am using ngx-uploadx on the Angular side and I installed node-uploadx, but I don't fully understand how it works and how it manages resumable uploads, even after I log out and log in again some time later.
There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved.
Error type: undefined. Note: this is a nested preset so please contact the preset author if you are unable to fix it yourself.
Options:
const uploads = uploadx({
  directory: 'upload',
  onComplete: file => {
    return 'completed';
  }
});
output:
"response": {
"0": "c",
"1": "o",
"2": "m",
"3": "p",
"4": "l",
"5": "e",
"6": "t",
"7": "e",
"8": "d"
},
"responseHeaders": {
"cache-control": "no-store",
"content-length": "73",
"content-type": "application/json"
},
"responseStatus": 200,
I'm using PeerTube and we've had a lot of problems with corrupt video files, and now I'm trying to investigate where the corruption happens. One possibility is that it happens during upload (which is done with node-uploadx). We're uploading pretty big files, 6-9 GB. Is there any checksum created on the client and sent to the server to verify that the file is in good condition when the upload is done?
Hello,
Using the uploadx protocol we can create a file with a specific size, but the disk storage does not check that the chunk size is lower than or equal to the declared file size, so an uploader can upload as much data as it wants.
I think we should add a check in the put handler:
node-uploadx/packages/core/src/handlers/uploadx.ts
Lines 51 to 54 in 02883fc
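The check could look something like this (a hypothetical guard with assumed names, not the handler's actual code; on failure the handler would respond with a 4xx status):

```javascript
// Hypothetical guard for the put handler: reject chunks that would push the
// upload past the size declared at creation time.
function exceedsDeclaredSize(start, chunkLength, declaredSize) {
  return start + chunkLength > declaredSize;
}

// e.g. exceedsDeclaredSize(0, 1024, 512) -> true: reject the request
```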