Comments (13)
I am not sure about the terminology. My understanding is that Lambda does freeze the process once the event loop is empty, and that serverless-http does nothing to keep the loop populated.
But a "cold start" is not the same as a "frozen process" -- or is it?
From docs:
if AWS Lambda re-uses the frozen process, the function execution continues with its same global state (for example, events that remained in the event loop will begin to get processed). However, when you use callback, AWS Lambda continues the Lambda function execution until the event loop is empty.
So I see no reason why "freeze" means "re-run initializers"
from serverless-http.
And my understanding is that Lambda will still freeze the process even with the event loop populated; it just resumes pumping the loop when the process is unfrozen. The question is whether the callback "returns" while there is still stuff in the event loop, is it not?
- serverless-http: the event loop is emptied; callback() returns as soon as it is called
- aws-serverless-express: the event loop is kept full; callback() would need callbackWaitsForEmptyEventLoop set to false
And both solutions, because of the nature of Lambda, are subject to cold starts: a new container is spun up in response to scaling demand, and thus your "global code" (i.e. the initializers) has to run.
I wonder if serverless-http would benefit from callbackWaitsForEmptyEventLoop: false.
I was wondering: with aws-serverless-express I know there's always an http server running, so the event loop never gets empty, forbidding Lambda from freezing the process. How does serverless-http tackle this?
resource:
- callbackWaitsForEmptyEventLoop @ http://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-context.html
The reason I ask is that I have to run through several initializers before I end up with a callback function, so it would be unwise to run them for every request. My Lambda export looks like this:
'use strict';

const serverless = require('serverless-http');
const app = require('./src/app'); // exports a Promise that resolves to the app

let handler;

module.exports.server = (evt, ctx, cb) => {
  if (handler) {
    handler(evt, ctx, cb);
    return;
  }
  app.then((callback) => {
    handler = handler || serverless(callback);
    handler(evt, ctx, cb);
  });
};
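A small variant (my suggestion, not something from the library) is to memoize the initialization promise rather than the handler; then concurrent first requests share one setup instead of racing on the `if (handler)` guard. makeLambdaExport below is a hypothetical helper:

```javascript
'use strict';

// Hypothetical helper: cache the initialization *promise*, so every
// invocation -- including concurrent cold-start requests -- awaits the
// same one-time setup instead of guarding with `if (handler)`.
function makeLambdaExport(initialize) {
  const handlerPromise = initialize(); // runs exactly once, at module load
  return (evt, ctx, cb) => {
    handlerPromise.then((handler) => handler(evt, ctx, cb), cb);
  };
}

// In the export above this would read (assumed):
//   module.exports.server = makeLambdaExport(() => app.then(serverless));
```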
Now I can see from the logs I have in CloudWatch that this is not happening for every request. So does serverless-http populate the event loop, or is this a wild fluke and Lambda should just freeze the process right away?
So I did a little more digging into the topic.
With serverless-http, the results get sent to the gateway after the loop has been emptied (callbackWaitsForEmptyEventLoop=true). To verify: set a setTimeout of 5000ms in the last middleware and it will delay the sending of the response back to the gateway by 5000ms.
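To make that timing concrete without deploying anything, here is a small local simulation (entirely my own sketch, not AWS code) of the two freeze policies. With waitForEmptyLoop=true the result is only delivered once scheduled work drains, mirroring the 5000ms delay above:

```javascript
'use strict';

// Simulate Lambda's two callback policies locally. `handler(schedule, cb)`
// is a Lambda-style handler; `waitForEmptyLoop` mimics
// context.callbackWaitsForEmptyEventLoop. `schedule` wraps setTimeout so
// we can tell when "the event loop is empty".
function invoke(handler, waitForEmptyLoop) {
  return new Promise((resolve) => {
    let pendingTimers = 0;
    let result;

    const maybeFinish = () => {
      if (result !== undefined && (!waitForEmptyLoop || pendingTimers === 0)) {
        resolve({ result, finishedAt: Date.now() });
      }
    };
    const schedule = (fn, ms) => {
      pendingTimers += 1;
      setTimeout(() => { fn(); pendingTimers -= 1; maybeFinish(); }, ms);
    };

    handler(schedule, (res) => { result = res; maybeFinish(); });
  });
}
```

A handler that does `schedule(work, 200)` and then calls `cb('ok')` resolves immediately under waitForEmptyLoop=false, but only after ~200ms under true -- the same effect the 5000ms middleware timer has behind the gateway.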
aws-serverless-express still uses context.succeed, which is equivalent to using callback with callbackWaitsForEmptyEventLoop=false.
I think setting callbackWaitsForEmptyEventLoop to false is necessary, since whenever Lambda starts re-using the same process to handle requests it essentially fills the event loop, doesn't it? Meaning that the first request's callback will wait for the last concurrent one to finish and, in doing so, empty the queue before sending back the results.
Using aws-serverless-express I never had more than 2 log streams at a time, whereas playing around with serverless-http I saw 3-8 at a time for the same workload. I get that from the Lambda pricing perspective this doesn't really matter, but the number of cold starts should still be kept to a minimum, and if the above is right, processing requests may actually delay other requests' responses if callbackWaitsForEmptyEventLoop is true.
What do you think?
Either way, I'm setting callbackWaitsForEmptyEventLoop to false on the context before handing it over to serverless-http for now.
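Concretely, that workaround can be a tiny wrapper around the handler (dontWaitForEmptyLoop is my own hypothetical name for it):

```javascript
'use strict';

// Hypothetical wrapper: flip the flag on the context before delegating,
// so Lambda freezes the process as soon as the callback fires.
function dontWaitForEmptyLoop(handler) {
  return (event, context, callback) => {
    context.callbackWaitsForEmptyEventLoop = false;
    return handler(event, context, callback);
  };
}

// Usage (assumed): module.exports.server = dontWaitForEmptyLoop(serverless(app));
```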
You mean setting it to false?
yes, indeed.
Re-use continues emptying the loop until another callback is called.
For example:
- set callbackWaitsForEmptyEventLoop to false and do a setTimeout in 10 seconds that logs foo to the console
- trigger 5 fast requests under 10 seconds
- wait 30 seconds
- trigger another request
- 5 foos appear in the CloudWatch stream
If this is right we should work around it (e.g. set it to false), but I wonder if this is not a bug. I would like to capture the behavior in a test/scenario and then maybe involve AWS (we have Enterprise-level support) to confirm.
I think this is intended behavior, as described in the docs:
callbackWaitsForEmptyEventLoop
The default value is true. This property is useful only to modify the default behavior of the callback. By default, the callback will wait until the Node.js runtime event loop is empty before freezing the process and returning the results to the caller. You can set this property to false to request AWS Lambda to freeze the process soon after the callback is called, even if there are events in the event loop. AWS Lambda will freeze the process, any state data and the events in the Node.js event loop (any remaining events in the event loop processed when the Lambda function is called next and if AWS Lambda chooses to use the frozen process). For more information about callback, see Using the Callback Parameter.
I don't disagree but I think it is ambiguous whether it is intended that other incoming requests can cause the event loop to repopulate and thus prevent an initial request from completing.
I read this - perhaps naively - as: a Lambda invoke works something like this:

let handler = find_frozen(name, qualifier);
if (handler === null) {
  handler = cold_start_handler(name, qualifier);
} else {
  handler = unfreeze(handler);
}
handler(...);
freeze(handler);
(For brevity I've omitted all the bookkeeping that would have to be done to inspect the event loop, wait asynchronously, etc.)
So handlers are put into a buffer/queue of some kind. What your findings suggest, to me, is that an invoke can be routed to an active (that is, unfrozen) handler while it is still processing another request, which is to say subsequent calls can preempt initial calls -- and that feels wrong?
Played around a bit more, and I can say that the event loop does not delay other requests, simply because, regardless of the setting, a single handler does not process two requests at the same time anyway. You're totally at the mercy of Lambda to optimize the execution and keep as many frozen handlers ready as it needs to process a given throughput with the observed duration of each call.
What I observe with long-running requests (10-second ones) is that sometimes Lambda does not start another handler until the concurrency/throughput warrants it and simply goes one by one, making the first one take 10 seconds to respond to the browser and the second one 20, but only charging (and logging) 10 and 10.
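That arithmetic can be sketched as a single-container queue (my simplification of what Lambda appears to be doing):

```javascript
'use strict';

// One container handles requests serially: each request's observed latency
// includes queueing behind earlier requests, while the billed duration is
// only its own processing time.
function simulateSingleContainer(arrivalTimesMs, processingMs) {
  let freeAt = 0;
  return arrivalTimesMs.map((arrival) => {
    const start = Math.max(arrival, freeAt);
    freeAt = start + processingMs;
    return { observedMs: freeAt - arrival, billedMs: processingMs };
  });
}
```

Two simultaneous 10-second requests come back after 10s and 20s respectively, but each is billed (and logged) as 10s.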
Would be interested in your findings if you dig into it.
I did some performance testing, and the short answer is: this library adds maybe 1ms of request latency. It's super small.