A pass-through proxy for the OpenWeatherMap API. It acts as a simple web proxy that forwards every request it receives to https://api.openweathermap.org. If the upstream API is protected, you may need to include your secret key with each request; check the official OpenWeatherMap documentation. The proxy runs on AWS Lambda.
For example, a request to `/data/2.5/onecall` on the proxy is forwarded to the same path on the OpenWeatherMap API.
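Conceptually, the pass-through can be sketched as a Lambda handler for an API Gateway proxy integration. This is an illustrative sketch, not the actual implementation; helper names such as `build_upstream_url` are assumptions.

```python
# Hypothetical sketch of the pass-through Lambda handler, assuming the
# API Gateway proxy integration event format.
import urllib.error
import urllib.parse
import urllib.request

UPSTREAM = "https://api.openweathermap.org"

def build_upstream_url(path, query_params):
    """Join the incoming path and query parameters onto the upstream host."""
    query = urllib.parse.urlencode(query_params or {})
    return f"{UPSTREAM}{path}" + (f"?{query}" if query else "")

def handler(event, context):
    """Forward an API Gateway proxy event to OpenWeatherMap and relay the reply."""
    url = build_upstream_url(event.get("path", "/"),
                             event.get("queryStringParameters"))
    try:
        with urllib.request.urlopen(url) as resp:
            body = resp.read().decode("utf-8")
            status = resp.status
    except urllib.error.HTTPError as err:
        body = err.read().decode("utf-8")
        status = err.code
    return {"statusCode": status,
            "headers": {"Content-Type": "application/json"},
            "body": body}
```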
Terraform is used for deployment on AWS.
LocalStack is used to provide local AWS services.
Local deployment (LocalStack)

Requirements

- Docker installed, to run the LocalStack container.
- Terraform CLI installed.

Steps
- Start LocalStack:

```shell
docker run --rm -it -p 4566:4566 -p 4571:4571 \
  -e "SERVICES=iam,lambda,apigateway,cloudwatch,logs,sts" \
  localstack/localstack
```
- Run the Terraform scripts:

```shell
cd terraform/local
terraform init
terraform apply -auto-approve
```
- Take note of the outputs when Terraform completes. Example:

```
Outputs:

rest_api_url = "http://localhost:4566/restapis/7rujauhl95/api/_user_request_"
```
- Test the deployed endpoint (quote the URL so the shell does not treat `&` as a command separator):

```shell
curl "http://localhost:4566/restapis/7rujauhl95/api/_user_request_/data/2.5/onecall?appid=aaabbbccc&lat=22.3193&lon=114.1694&exclude=minutely,alerts&units=metric&lang=en"
```
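The same request can be issued from Python, which sidesteps shell quoting of the `&` characters entirely. The base URL and app id below are the placeholder values from the example above.

```python
# Build the proxied /data/2.5/onecall request URL with a properly
# encoded query string. BASE and the appid are example placeholders.
import urllib.parse

BASE = "http://localhost:4566/restapis/7rujauhl95/api/_user_request_"

def onecall_url(base: str, appid: str, lat: float, lon: float) -> str:
    """Return the onecall URL for the given coordinates and API key."""
    params = {"appid": appid, "lat": lat, "lon": lon,
              "exclude": "minutely,alerts", "units": "metric", "lang": "en"}
    return f"{base}/data/2.5/onecall?{urllib.parse.urlencode(params)}"

url = onecall_url(BASE, "aaabbbccc", 22.3193, 114.1694)
# With LocalStack running and the API deployed, fetch it with e.g.:
#   import urllib.request, json
#   with urllib.request.urlopen(url) as resp:
#       print(json.loads(resp.read()))
```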
Remote deployment (AWS)

Requirements

- An AWS account. The Access Key ID and Secret Access Key are needed to run the Terraform scripts.
- A Terraform Cloud account, used to store the Terraform state.
Steps
- Run the Terraform scripts:

```shell
cd terraform/remote
terraform init
terraform apply -auto-approve
```
- Take note of the outputs when Terraform completes.
- Test the deployed endpoint with the rest_api_url output, as in the local steps.
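For scripting the test, the rest_api_url output can also be read programmatically from `terraform output -json`. This is a sketch: the sample JSON stands in for the real command's stdout, and the URL value shown is the one from the local example.

```python
# Sketch: extract rest_api_url from `terraform output -json`. In practice
# the JSON text would come from
#   subprocess.check_output(["terraform", "output", "-json"], text=True)
# run inside terraform/remote; a sample string stands in for it here.
import json

def rest_api_url(outputs_json: str) -> str:
    """Return the rest_api_url value from `terraform output -json` text."""
    return json.loads(outputs_json)["rest_api_url"]["value"]

sample = ('{"rest_api_url": {"sensitive": false, '
          '"value": "http://localhost:4566/restapis/7rujauhl95/api/_user_request_"}}')
endpoint = rest_api_url(sample)
```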