
Comments (13)

wperron commented on May 18, 2024

or perhaps it's okay to publish the top half to dockerhub then users can publish their actual image to ecr...

That would be my instinct, yes; similar to what they provide for the default runtimes, which are simply based off the provided image with the necessary packages added for each runtime. I would assume the typical workflow with containers in Lambda is to pull the base image, bake in user code, and publish the image to an internal ECR repo.

I think publishing this base image to Docker Hub at least makes sense; as you say, publishing to every region would require some automation to work well.


hayd commented on May 18, 2024

This is now published at https://hub.docker.com/r/hayd/deno-lambda
(published from deno-docker)

Usage:

FROM hayd/deno-lambda:1.6.1

COPY hello.ts .
RUN deno cache hello.ts


CMD ["hello.handler"]

Yet to add documentation, but the instructions for aws-lambda-provided work:


docker build -t <image name> .

To run your image locally:

docker run -p 9000:8080 <image name>

In a separate terminal, you can then locally invoke the function using cURL:

curl -XPOST "http://localhost:9000/2015-03-31/functions/function/invocations" -d '{"payload":"hello world!"}'

will keep this open until documented :)

Edit: I think adding the above in example-docker/ seems a reasonable start.


shellscape commented on May 18, 2024

I don't think this is really the case, given that you're only paying a few ms every N minutes. But 🤷

For small systems, or lambdas that are called infrequently or not provisioned to a high degree, yes, the cost will be minimal if noticeable at all. If it's a non-critical lambda and not part of a larger distributed system, you may not even break out of the free tier. But for lambdas that are part of large distributed systems that need high availability and are mission critical, perhaps even provisioned to hundreds of instances, it can start to have a negative impact on cost. And that's not theoretical; I've actually seen this happen. Keep-warm is great until it's not 😄


hayd commented on May 18, 2024

ooh, that's really nice! I guess we should publish a base deno-lambda image on ecr (in every region??). Or alternatively have an example Dockerfile with the deno-lambda layer/bootstrap?

FROM public.ecr.aws/lambda/provided:al2

# perhaps remove this in a single run command?
RUN yum install -y unzip

ENV DENO_VERSION=1.6.0

RUN curl -fsSL https://deno.land/x/lambda@${DENO_VERSION}/bootstrap --output ${LAMBDA_RUNTIME_DIR}/bootstrap \
 && chmod 777 ${LAMBDA_RUNTIME_DIR}/bootstrap

RUN curl -fsSL https://github.com/denoland/deno/releases/download/v${DENO_VERSION}/deno-x86_64-unknown-linux-gnu.zip \
         --output deno.zip \
 && unzip -qq deno.zip \
 && rm deno.zip \
 && chmod 777 deno \
 && mv deno /bin/deno

# the above blocks could be packaged to dockerhub...

# Note WORKDIR=${LAMBDA_TASK_ROOT} so the ${LAMBDA_TASK_ROOT} is superfluous.
COPY hello.ts ${LAMBDA_TASK_ROOT}/hello.ts

CMD ["hello.handler"]

and it works!

or perhaps it's okay to publish the top half to dockerhub then users can publish their actual image to ecr...


it's nice to have something AWS-supported for local testing... (vs https://github.com/hayd/deno-lambda#testing-locally-with-docker-lambda)

I'm not in any rush to rewrite the tests (though it would probably be good to do)... 🤷


hayd commented on May 18, 2024

Will do. I think I need to set BOOTSTRAP_VERSION separately, so that it can use deno-docker publishing (and doesn't need to wait on deno-lambda tagging). This Dockerfile will probably live in two places (there too), but I think that's fine.

I know lambci does (or did) publish to every region... it seems like a lot of effort for little gain.

(This is a really nice thing to release from aws.)


hayd commented on May 18, 2024

cc @kyeotic, it will be interesting to see if this lowers cold/warm start time.


hayd commented on May 18, 2024

closed by #120


shellscape commented on May 18, 2024

@hayd did that end up lowering cold start?


hayd commented on May 18, 2024

The solution I've always gone with is warm starts: setting up an event that runs a fast endpoint for the lambda frequently...
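A sketch of that pattern (hypothetical handler; it assumes the ping comes from an EventBridge/CloudWatch scheduled rule, whose events carry source "aws.events"):

// keep_warm.ts: hypothetical handler with a keep-warm short-circuit (sketch).
// A scheduled rule invokes the function every few minutes; those events
// have source "aws.events", so we return immediately and keep the
// container warm without doing any real work.
export async function handler(event: { source?: string; payload?: string }) {
  if (event.source === "aws.events") {
    return { statusCode: 200, body: "keep-warm ping" };
  }
  return { statusCode: 200, body: `Handled: ${event.payload ?? ""}` };
}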


shellscape commented on May 18, 2024

Yeah, that's an old keep-warm strategy. It'll bump your costs on systems with lots of provisioned fns. But I'm still curious to know whether this work lowered cold starts.


hayd commented on May 18, 2024

It'll bump your costs on systems with lots of provisioned fns.

I don't think this is really the case, given that you're only paying a few ms every N minutes. But 🤷

Once you've baked a successful deno cache into the docker image, I'm not sure there are many better ways to improve cold start.


kyeotic commented on May 18, 2024

Sending a single request every couple of minutes will only keep one lambda instance warm. If you get more than one request at a time, you will still get cold starts. Remember, each concurrent request gets its own container, and if one is not ready it will be cold started. You can try to keep enough containers warm to handle your peak load, but that will be expensive; anything less and requests will hit cold containers. You might as well use Fargate at that point.

Getting cold start into usable territory is key to using lambda effectively.


shellscape commented on May 18, 2024

There are round-robin keep-warm strategies as well, but they're highly specialized. The most common strategy I've seen is self-invoke.

We're in agreement that cold start is the best focus. That's why I was curious about the effect #120 had.

