adieuadieu / retinal
🏙 Retinal is a Serverless AWS Lambda service for resizing images on-demand or event-triggered
License: MIT License
I have two questions.
I am relatively new to Lambda, and I was wondering why I couldn't get the sharp npm package to work in a deployed Lambda function. It only works locally when I test the endpoint on my computer with serverless-offline. I looked further into it, and it seems the sharp package has prerequisites (node-gyp, Python, etc.) that I assume are not installed in the deployed Lambda instance:
I don't believe Lambda has node-gyp and Python installed by itself when I use Node as the runtime.
So is this what this package is supposed to do - prepackage sharp's dependencies so it works in deployed Lambda?
How would I go about prepackaging these dependencies myself with the Serverless Framework if I want to use the sharp npm package by itself? I would like to learn this for current and future endeavors.
Prerequisites: (link here)
Add the ability to read image-processing configuration (the outputs/resize/sharp functions) from the source S3 bucket (or, maybe, any bucket?). For example, look for config shared by a prefix (incoming/something/image.jpg -> incoming/something/config.json), or look for config specific to the incoming image (incoming/something/image.jpg -> incoming/something/image.jpg.config), then look upwards through parent prefixes and merge the config? (Maybe too many S3 requests / too slow?)

I have been trying this today, and am getting the below error (and a few more) when I run yarn test:
TypeError: todo tests are not allowed to have an implementation. Use test.skip() for tests with an implementation.
Yet, when I check the bucket, I find a processed file, even though no Lambda function has been deployed yet.
I then modified metadata.test.js and removed the implementations from the todo tests. I am not sure whether this is OK to do or not?
After modifying metadata.test.js, the 'todo test' errors go away, but the other errors remain, as below:
$ yarn test
yarn test v0.23.4
$ nyc ava
Processing: originals/test image ütf テスト.jpg
Processing: false
Processing: originals/test image ütf テスト.jpg
===============================
metadata: { format: 'jpeg',
width: 800,
height: 665,
space: 'srgb',
channels: 3,
depth: 'uchar',
density: 72,
hasProfile: true,
hasAlpha: false,
orientation: 1,
exif: <Buffer 45 78 69 66 00 00 49 49 2a 00 08 00 00 00 0f 00 00 01 03 00 01 00 00 00 20 03 00 00 01 01 03 00 01 00 00 00 99 02 00 00 02 01 03 00 03 00 00 00 c2 00 ... >,
icc: <Buffer 00 00 02 30 41 44 42 45 02 10 00 00 6d 6e 74 72 52 47 42 20 58 59 5a 20 07 cf 00 06 00 03 00 00 00 00 00 00 61 63 73 70 41 50 50 4c 00 00 00 00 6e 6f ... >,
contentType: 'image/jpeg',
s3: { ContentLength: 1024, Key: 'obama-test.jpg' } }
11 passed
3 failed
4 todo
metadata › should have appropriate content-type
/Users/jai/work/guides/consulting/ssi/serverless-sharp-image/src/metadata.test.js:18
17: t.is(metadata.contentType, 'image/jpeg')
18: t.is(metadata.exif.image.ImageDescription, 'TEST3')
19: t.is(metadata.exif.image.Artist, 'TEST1')
Rejected promise returned by test. Reason:
TypeError {
message: 'Cannot read property \'ImageDescription\' of undefined',
}
metadata › should include data from Rekognition if configured
Test finished without running any assertions
image › processItem()
/Users/jai/work/guides/consulting/ssi/serverless-sharp-image/src/image.test.js:61
60: await remove(
61: result.reduce((list, objects) => [...list, ...objects.map(({ Key }) => Key)], []),
62: destinationBucket
Rejected promise returned by test. Reason:
TypeError {
message: 'Cannot destructure property `Key` of \'undefined\' or \'null\'.',
}
src/image.test.js:61:63
src/image.test.js:61:59
Test.<anonymous> (src/image.test.js:61:12)
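The destructuring error above suggests that result contains an undefined entry before the reduce flattens it. A defensive version of that reduce (a sketch of the idea, not the repo's actual fix) skips non-array entries:

```javascript
// Sketch: flatten an array of arrays of S3 objects into a list of keys,
// guarding against undefined/null entries that would otherwise throw
// "Cannot destructure property `Key` of 'undefined'".
const flattenKeys = (result) =>
  (result || [])
    .filter(Array.isArray) // drop undefined/null entries before destructuring
    .reduce((list, objects) => [...list, ...objects.map(({ Key }) => Key)], []);

console.log(flattenKeys([[{ Key: 'a.jpg' }], undefined, [{ Key: 'b.jpg' }]]));
// → [ 'a.jpg', 'b.jpg' ]
```

Guarding like this only hides the symptom, though; the undefined entry points at one of the processing promises resolving without a result.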
----------------------------|----------|----------|----------|----------|----------------|
File | % Stmts | % Branch | % Funcs | % Lines |Uncovered Lines |
----------------------------|----------|----------|----------|----------|----------------|
All files | 79.05 | 53.97 | 65 | 78 | |
serverless-sharp-image | 100 | 100 | 100 | 100 | |
config.js | 100 | 100 | 100 | 100 | |
serverless-sharp-image/src | 78.85 | 53.97 | 65 | 77.78 | |
config.js | 100 | 100 | 100 | 100 | |
image.js | 92 | 76.92 | 100 | 91.67 | 53,55 |
metadata.js | 60 | 45.45 | 0 | 60 |... 29,37,38,44 |
rekognition.js | 26.67 | 0 | 0 | 26.67 |... 65,77,81,91 |
s3.js | 100 | 62.5 | 100 | 100 | 4,14,22 |
sharp.js | 100 | 100 | 100 | 100 | |
utils.js | 92.31 | 66.67 | 66.67 | 92.31 | 31 |
----------------------------|----------|----------|----------|----------|----------------|
error Command failed with exit code 1.
Given that Serverless 0.16.0 added a feature which omits devDependencies from the deployment package, we should just remove the webpack dependency entirely. There is no real need to minify the code. We can use Babel directly in a deploy package script, or switch to Rollup or something else that isn't targeted primarily at browsers. It seems kind of silly to use webpack when deploying to Lambda.
A new version of sharp is available. Upgrade the dependency in package.json and rebuild the for-lambda zip.
Hey,
Thanks for the wonderful library.
In my use case I need to generate multiple images that can be used in the srcset and sizes attributes of the img tag. To do this, I had to change the code in a way that is not totally in sync with the way you have structured this repo.
The repo I created is here.
If you wish, I could send a pull request if this functionality is something you would like to bake in; to do that, you would have to let me know how you would like me to restructure it so it fits your mental model.
If I run yarn test when the destination bucket does not exist, the tests fail with:
[NoSuchBucket: The specified bucket does not exist]
However, when the bucket does exist, the tests pass, but yarn deploy fails with this error:
An error occurred while provisioning your stack: imageDestinationBucket - XXXXXXXXXXXX already exists.
This seems somewhat contradictory!
Add support for watermarking images
It's called sharp-image, but do you have any plans on supporting video too, using ffmpeg?
lib/
Add support for handling S3 DeleteObject events which, using the keys defined in the config's outputs, will try to delete any previously generated images.
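A sketch of how the generated keys could be derived from the configured output key templates (the helper name and template handling here are assumptions for illustration, not the repo's actual code):

```javascript
// Sketch: given a deleted source key and the %(filename)s-style output key
// templates from the config, derive the generated keys to delete.
const outputKeysFor = (sourceKey, templates) => {
  const fileName = sourceKey.split('/').pop();          // e.g. "photo.jpg"
  const filename = fileName.replace(/\.[^.]+$/, '');    // strip extension -> "photo"
  return templates.map((t) => t.replace(/%\(filename\)s/g, filename));
};

console.log(outputKeysFor('incoming/photo.jpg', ['%(filename)s_200x200.jpg']));
// → [ 'photo_200x200.jpg' ]
```

Each derived key would then be passed to S3 DeleteObject against the destination bucket.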
Starting with sharp v0.20.0, it provides pre-built libvips binaries and supports installing for a specific platform/arch via the npm_config_arch or npm_config_platform environment variables.
So we can install the required native dependencies at the packaging step with the following command: env npm_config_arch=x64 npm_config_platform=linux npm_config_target=8.10.0 npm install
(The target version should match the actual Node.js runtime version; in this case, it targets the Lambda Node.js 8.10 runtime.)
I'm using this method to deploy our Lambda functions without building native deps in a Lambda-like environment, and it works like a charm without any issues :)
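For example, that install command could be wired into a package script (a sketch; the script names here are assumptions):

```json
{
  "scripts": {
    "build:lambda-deps": "env npm_config_arch=x64 npm_config_platform=linux npm_config_target=8.10.0 npm install",
    "deploy": "npm run build:lambda-deps && serverless deploy"
  }
}
```

This keeps the cross-platform install reproducible instead of relying on developers remembering the environment variables.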
Seems like the script doesn't work if eu-central-1 is used as the region. It will always say the bucket was not found. It works fine if I switch the region to us-west-2.
While deploying the Lambda function, I was getting an error that the handler function could not be found. I then modified serverless.yml and prefixed the handler with src, like below:
functions:
  sharpImage:
    description: Resizes images
    memorySize: 1024
    timeout: 30
    handler: src/handler.processImage
The deploy went fine, but now while invoking it I get the following error:
$ yarn run invoke
yarn run v0.23.4
$ serverless invoke --function sharpImage --path ./event.json --log
{
"errorMessage": "Cannot find module '/var/task/src/handler'",
"errorType": "Error",
"stackTrace": []
}
--------------------------------------------------------------------
START RequestId: df6381c6-f0a5-11e7-baa4-a70b2f9693d9 Version: $LATEST
Unable to import module 'src/handler': Error
at require (internal/module.js:20:19)
END RequestId: df6381c6-f0a5-11e7-baa4-a70b2f9693d9
REPORT RequestId: df6381c6-f0a5-11e7-baa4-a70b2f9693d9 Duration: 0.43 ms Billed Duration: 100 ms Memory Size: 1024 MB Max Memory Used: 21 MB
Error --------------------------------------------------
Invoked function failed
For debugging logs, run again after setting the "SLS_DEBUG=*" environment variable.
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Forums: forum.serverless.com
Chat: gitter.im/serverless/serverless
Your Environment Information -----------------------------
OS: darwin
Node Version: 8.8.0
Serverless Version: 1.24.1
error Command failed with exit code 1.
The following error occurs when an image is uploaded to S3:
{
"errorMessage": "Cannot read property 'forEach' of undefined",
"errorType": "TypeError",
"stackTrace": [
"/var/task/handler.js:959:20",
"next (native)",
"step (/var/task/handler.js:279:31)",
"/var/task/handler.js:290:14",
"process._tickDomainCallback (internal/process/next_tick.js:135:7)"
]
}
I use an existing bucket for both the sourceBucket and the destinationBucket, with the following serverless.yml:
service: ${file(config.json):name}

provider:
  name: aws
  runtime: nodejs6.10
  profile: ${file(config.json):provider.profile}
  stage: ${file(config.json):provider.stage}
  region: ${file(config.json):provider.region}
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - "s3:ListBucket"
        - "s3:GetObject"
        - "s3:GetObjectAcl"
      Resource:
        Fn::Join:
          - ""
          - - "arn:aws:s3:::"
            - ${file(config.json):sourceBucket}
            - "/"
            - ${file(config.json):sourcePrefix}
            - "*"
    - Effect: "Allow"
      Action:
        - "s3:ListBucket"
        - "s3:PutObject"
        - "s3:DeleteObject"
        - "s3:GetObjectAcl"
        - "s3:ListBucketMultipartUploads"
        - "s3:ListMultipartUploadParts"
        - "s3:PutObject"
        - "s3:PutObjectAcl"
        - "s3:PutObjectTagging"
        - "s3:PutObjectVersionAcl"
        - "s3:PutObjectVersionTagging"
      Resource:
        Fn::Join:
          - ""
          - - "arn:aws:s3:::"
            - ${file(config.json):destinationBucket}
            - "/"
            - ${file(config.json):destinationPrefix}
            - "*"

custom:
  webpackIncludeModules: false # disable auto including modules

plugins:
  - serverless-webpack

package:
  exclude:
    - .serverless
    - .webpack
    - coverage
    - .babelrc
    - .eslintignore
    - .eslintrc
    - .gitignore
    - LICENSE
    - package.json
    - README.md
    - serverless.yml
    - webpack.config.js

functions:
  sharpImage:
    description: Resizes images
    memorySize: 1024
    timeout: 30
    handler: handler.processImage
My guess is that it has something to do with using an existing bucket for both the source and the destination.
I was trying to resize SVG files on trigger, but there's no way to pass the density parameter.
I checked the code in sharp.js and passed it manually, and it works perfectly. How can we add it to the config file?
Want me to create a PR tweaking some code in sharp.js and config.json?
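One possible shape for this, assuming a new (hypothetical) density field next to the existing output options in config.json:

```json
{
  "key": "%(filename)s_800.png",
  "density": 300,
  "operations": [
    ["resize", 800, 800]
  ]
}
```

sharp accepts density as an input option (it controls the DPI at which vector formats like SVG are rasterized), so the value would need to be passed where the sharp instance is constructed in sharp.js, not as an operation.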
Improve the readme.md documentation before the 1.0 release, including examples (~/examples) of how to set it up with a single bucket, a single existing bucket, etc.

Objects with spaces in their key names result in a "NoSuchKey: The specified key does not exist." error.
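S3 event notifications deliver object keys URL-encoded, with spaces as '+', so the raw key from the event won't match the actual object. A minimal decoding sketch (the helper name is an assumption, but this is the standard fix for this error):

```javascript
// Sketch: decode an S3 event record's object key before calling GetObject.
// S3 events URL-encode keys and encode spaces as '+', so the raw value
// triggers "NoSuchKey" for any key containing spaces.
const decodeS3Key = (rawKey) => decodeURIComponent(rawKey.replace(/\+/g, ' '));

console.log(decodeS3Key('originals/test+image+%C3%BCtf.jpg'));
// → 'originals/test image ütf.jpg'
```

The decode would happen once, where the handler reads event.Records[i].s3.object.key.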
In ~/serverless.yml:

# Enable X-Ray tracing on Lambda functions
SharpLambdaFunction:
  Properties:
    TracingConfig:
      Mode: Active

This also requires the AWSXrayWriteOnlyAccess IAM policy, plus wrapping the AWS SDK:

var AWSXRay = require('aws-xray-sdk-core');
var AWS = AWSXRay.captureAWS(require('aws-sdk'));
Hi,
I installed and configured the project as described in the docs.
When I run npm run deploy, the linter is not happy :(
/Users/Documents/workspace/serverless-sharp-image/src/image.js
17:20 error Unexpected newline after '(' function-paren-newline
19:5 error Unexpected newline before ')' function-paren-newline
32:21 error Unexpected newline after '(' function-paren-newline
57:3 error Unexpected newline before ')' function-paren-newline
/Users/Documents/workspace/serverless-sharp-image/src/image.test.js
66:1 warning test.todo() should not be used ava/no-todo-test
68:1 warning test.todo() should not be used ava/no-todo-test
/Users/Documents/workspace/serverless-sharp-image/src/metadata.js
28:38 error Unexpected newline after '(' function-paren-newline
34:7 error Unexpected newline before ')' function-paren-newline
35:5 error Unexpected newline before ')' function-paren-newline
/Users/Documents/workspace/serverless-sharp-image/src/metadata.test.js
4:8 error 'config' is defined but never used no-unused-vars
25:1 error test.todo() should not be passed an implementation function ava/no-todo-implementation
25:1 warning test.todo() should not be used ava/no-todo-test
25:58 error 't' is defined but never used no-unused-vars
27:1 error test.todo() should not be passed an implementation function ava/no-todo-implementation
27:1 warning test.todo() should not be used ava/no-todo-test
27:60 error 't' is defined but never used no-unused-vars
29:67 error 't' is defined but never used no-unused-vars
/Users/Documents/workspace/serverless-sharp-image/src/sharp.js
14:21 error Unexpected newline after '(' function-paren-newline
21:3 error Unexpected newline before ')' function-paren-newline
/Users/olivier/Documents/workspace/serverless-sharp-image/src/sharp.test.js
37:20 error Unexpected newline after '(' function-paren-newline
47:3 error Unexpected newline before ')' function-paren-newline
✖ 21 problems (17 errors, 4 warnings)
11 errors, 0 warnings potentially fixable with the --fix option.
Did I forget to configure something?
Fixed by updating Serverless to latest? (#10)
imageflow looks pretty cool. Perhaps we could support it alongside sharp/libvips?
... or use up with imageflow-server?
Use either lambda-packager or thaumaturgy to package the sharp module instead of the current custom hack.
Sharp lets you extract EXIF and other metadata from images. Add the ability to extract this data and save it either as JSON alongside the output image, or into a DB.
Progress tracked in PR #45
Serverless released 1.4. Update this package's dependency accordingly.
OK, LQIP can probably already be achieved with sharp, but SVG like the following cannot:
https://video.twimg.com/tweet_video/DMAiijTWkAEBuzb.mp4
Via https://twitter.com/Martin_Adams/status/918772434370748416 and server implementation https://github.com/Schniz/svgify
https://github.com/Schniz/svgify/blob/1d8460d2c3343a9df40ac831842ed5a6f5c658a5/src/svgify.js#L1-L40:
const SVGO = require("svgo");
const { exec } = require("child_process");
const potrace = require("potrace");

const svgo = new SVGO();

const optimize = svg =>
  new Promise((resolve, reject) => {
    svgo.optimize(svg, ({ data }) => {
      return resolve(data);
    });
  });

const resize = (fileName, maxSize = 200) =>
  new Promise((resolve, reject) => {
    exec(`convert ${fileName} -resize ${maxSize}x${maxSize} ${fileName}.resized`, (err, stdout, stderr) => {
      if (stderr) return reject(stderr);
      return resolve(`${fileName}.resized`);
    });
  });

const trace = (
  file,
  params = {
    color: "#ccc",
    flat: true,
    turnPolicy: "majority",
    turdSize: 100,
    optTolerance: 0.4
  }
) => {
  return new Promise((resolve, reject) => {
    potrace.trace(file, params, (err, svg) => {
      if (err) return reject(err);
      return resolve(svg);
    });
  }).then(svg => optimize(svg));
};

module.exports = file => resize(file).then(trace);
module.exports.resize = resize;
module.exports.trace = trace;
Q: Should this be done in a new Lambda function or as part of the same/single lambda function?
Q: Should this be a lambda function in a separate project?
Somewhat related to #21
Currently, images are processed when S3 events trigger the resizing function. This feature proposes to extend resizing functionality to API Gateway, exposing a simple REST API for resizing images on the fly (with caching).
For example:
http://domain.tld/resizer-api/{width}/{height}/the-file-name-in-s3.png?someOtherOptions=here
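A serverless.yml sketch of what exposing this via API Gateway might look like (function name, handler, and path parameters are assumptions, not an existing endpoint in this project):

```yaml
# Sketch: an HTTP-triggered resize endpoint alongside the S3-triggered one
functions:
  resizeApi:
    description: Resizes an S3 image on demand via API Gateway
    memorySize: 1024
    timeout: 30
    handler: src/handler.resizeOnDemand
    events:
      - http:
          path: resizer-api/{width}/{height}/{key}
          method: get
```

Caching would then sit in front of API Gateway (e.g. CloudFront), so each unique width/height/key combination is only processed once.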
Add IAM Role permissions for Lambda to access both the source/destination buckets.
I am pushing pictures taken by a webcam to an S3 bucket and want to process, or rather optimize, those images so that their file size is reduced without much loss of quality or resolution. Is it possible to accomplish this with this utility?
Make the necessary updates to Babel (use preset-env) and webpack (remove this dependency?) to make use of the Node 6.10 runtime.
I noticed that all my transformed images have Content-Type "application/octet-stream" (the default), and then found the mention of type in the readme. However, even after setting it, it looks like it's not being respected.
My use case is basically straight from the readme, with a few modifications:
{
"key": "%(filename)s_200x200.jpg",
"type": "%(type)s",
"params": {
"ACL": "public-read"
},
"operations": [
["resize", 200, 200],
["max"],
["crop", "entropy"],
["withoutEnlargement"]
]
},
Even when I hardcode the string "type": "image/jpeg", doing an HTTP GET on the transformed image still returns Content-Type: application/octet-stream.
This shouldn't matter, but the original file has Content-Type: image/jpeg. Let me know if there's anything else I can do to help with this. I'll try to take a look at the code to see if there's a straightforward fix here.
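For this to work, the resolved type has to reach the S3 upload parameters explicitly, since S3 falls back to application/octet-stream when ContentType is omitted. A sketch of the glue (hypothetical helper, not the repo's actual implementation):

```javascript
// Sketch: build S3 putObject params from one output config entry, making
// sure the configured type becomes the object's ContentType.
const putParams = (output, body) => ({
  ...output.params,          // e.g. { ACL: 'public-read' }
  Key: output.key,
  Body: body,
  ContentType: output.type,  // omitted => S3 defaults to application/octet-stream
});

const p = putParams(
  { key: 'out.jpg', type: 'image/jpeg', params: { ACL: 'public-read' } },
  Buffer.from('')
);
console.log(p.ContentType); // → 'image/jpeg'
```

If the deployed code drops type somewhere along this path, that would explain the symptom above.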
This is not an issue but a question. Let's say I pause the Lambda function while images keep arriving in the source S3 bucket. Later, when I resume the Lambda function, I want it to process not only newly added images but also the ones that were added while the function was not running. Is there any configuration that can accomplish this?
Hi, I'm trying this out, basing my test on v0.11.0 as indicated by the README, with serverless 1.28.0.
So, I created config.json, and yarn test completes successfully.
Now for serverless deploy -v.
First issue:
Serverless Error ---------------------------------------
No matching handler found for 'handler'. Check your service definition.
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Issues: forum.serverless.com
Your Environment Information -----------------------------
OS: darwin
Node Version: 8.11.2
Serverless Version: 1.28.0
Changed the handler in serverless.yml to: handler: src/handler.processImage
Serverless: Could not load webpack config '/Users/Shared/java/projects/react/projects/serverless-sharp-image/webpack.config.js'
Type Error ---------------------------------------------
webpack.optimize.OccurenceOrderPlugin is not a constructor
For debugging logs, run again after setting the "SLS_DEBUG=*" environment variable.
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Issues: forum.serverless.com
Your Environment Information -----------------------------
OS: darwin
Node Version: 8.11.2
Serverless Version: 1.28.0
Changed new webpack.optimize.OccurenceOrderPlugin() to new webpack.optimize.OccurrenceOrderPlugin().
Serverless: Bundling with Webpack...
Webpack Options Validation Error -----------------------
Invalid configuration object. Webpack has been initialised using a configuration object that does not match the API schema.
- configuration.output.path: The provided value ".webpack/service" is not an absolute path!
For debugging logs, run again after setting the "SLS_DEBUG=*" environment variable.
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Issues: forum.serverless.com
Your Environment Information -----------------------------
OS: darwin
Node Version: 8.11.2
Serverless Version: 1.28.0
Changed output.path to path: webpackDir, and module.loaders.loader to loader: 'babel-loader'.
WARNING in DedupePlugin: This plugin was removed from webpack. Remove it from your configuration.
ERROR in Entry module not found: Error: Can't resolve 'babel' in '/Users/Shared/java/projects/react/projects/serverless-sharp-image'
BREAKING CHANGE: It's no longer allowed to omit the '-loader' suffix when using loaders.
You need to specify 'babel-loader' instead of 'babel',
see https://webpack.js.org/guides/migrating/#automatic-loader-module-name-extension-removed
Error --------------------------------------------------
Webpack compilation error, see above
For debugging logs, run again after setting the "SLS_DEBUG=*" environment variable.
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Issues: forum.serverless.com
Your Environment Information -----------------------------
OS: darwin
Node Version: 8.11.2
Serverless Version: 1.28.0
Now it basically works! I have an existing bucket and that failed, but that was mentioned in the README. Personally, I would prefer it to be able to handle existing buckets without requiring changes.
Now it deploys successfully. When I look on AWS, I do see a popup saying "src/handler.js" not found.
Sharp released 0.17. Update this package's dependency accordingly.