Burst Compute Framework
Serverless burst-compute implementation for AWS, using only native AWS services.
For embarrassingly parallel workloads, N items can be processed trivially in parallel. Given N items and a batchSize (the maximum number of items a single process handles serially), the framework divides the work into ceil(N/batchSize) batches and invokes that many copies of the user-provided worker Lambda. When all the worker functions are done, their results are combined by the user-provided combiner Lambda.
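The batching arithmetic above can be sketched as follows. This is an illustrative helper, not part of the framework's API; the function name and the half-open range representation are assumptions:

```javascript
// Split N items into batches of at most batchSize items each.
// Each batch is a half-open range [start, end).
function computeBatches(numItems, batchSize) {
  const numBatches = Math.ceil(numItems / batchSize);
  const batches = [];
  for (let i = 0; i < numBatches; i++) {
    const start = i * batchSize;
    const end = Math.min(start + batchSize, numItems);
    batches.push({ start, end });
  }
  return batches;
}

// e.g. 10 items with batchSize 4 yields 3 batches: [0,4), [4,8), [8,10)
```

The last batch may be smaller than batchSize, so the number of worker invocations is always ceil(N/batchSize).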
In the diagram below, the code you write is indicated by the blue lambda icons.
Here's how it works, step-by-step:
- You define a worker function and a combiner function
- You launch your burst compute job by calling the dispatch function with a range of items to process
- The dispatcher recursively invokes copies of itself to fan out, efficiently starting your worker Lambdas
- Each worker receives a range of inputs, computes results for those inputs, and writes the results to DynamoDB
- The Step Function monitors all the results and calls the combiner function when all workers are done
- The combiner function reads all worker output from DynamoDB and aggregates it into the final result
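The two functions you write might look like the following skeletons. This is a sketch only: the actual event shapes and the DynamoDB access pattern are defined by the Interfaces document, and the injected putResult/getAllResults helpers here are purely illustrative stand-ins for DynamoDB reads and writes:

```javascript
// Worker: process every item in the inclusive range [startIndex, endIndex]
// and persist each result (in the real framework, to DynamoDB).
async function workerHandler(event, { putResult }) {
  const { startIndex, endIndex } = event;
  for (let i = startIndex; i <= endIndex; i++) {
    const result = i * i; // placeholder computation for item i
    await putResult(i, result);
  }
}

// Combiner: read all worker output and aggregate it into one final value.
async function combinerHandler(event, { getAllResults }) {
  const results = await getAllResults();
  return results.reduce((sum, r) => sum + r, 0); // placeholder aggregation
}
```

The key contract is that workers only write their own range's results, and the combiner runs exactly once, after the Step Function has observed that every worker finished.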
Build
You need Node.js 12.x or later on your PATH, then run:
npm install
Deployment
Follow the build instructions above before attempting to deploy.
To deploy this framework to your AWS account, you must have the AWS CLI configured.
To deploy to the dev stage:
npm run sls -- deploy
To deploy to production:
npm run sls -- deploy -s prod
Usage
- Create worker and combiner functions that follow the input/output specification defined in the Interfaces document.
- Invoke the dispatch function to start a burst job.
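A minimal sketch of building a dispatch input follows. The field names here are assumptions for illustration; the authoritative input format is the one defined in the Interfaces document:

```javascript
// Build the JSON payload for starting a burst job. Field names are
// hypothetical — check the Interfaces document for the real format.
function buildDispatchInput({ workerFunctionName, combinerFunctionName,
                              startIndex, endIndex, batchSize }) {
  if (endIndex <= startIndex) {
    throw new Error('empty item range');
  }
  return { workerFunctionName, combinerFunctionName,
           startIndex, endIndex, batchSize };
}
```

You could then pass JSON.stringify of this object as the payload when invoking the dispatch Lambda, for example via `aws lambda invoke` or the AWS SDK.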