
easy_aws_instances

Save $$$ on AWS by streamlining instance creation

So you heard about these super duper AWS instances with multiple Tesla GPUs, s**tloads of RAM and more CPU cores than Donald Trump has towers, and you're dying to work on one, but you're short on $$$?

One way to keep costs at bay is to develop your code locally and only run it on one of these expensive AWS instances once you're confident it will actually work as expected. As you probably know, if you don't terminate your AWS instances, the clock keeps ticking and your credit card keeps bleeding, so you want to keep your instances' uptime to a minimum...

Developing locally and running on AWS might take a number of trial-and-error iterations though, so it would be nice to streamline this process of firing up new instances and killing them. That's why I created a little script that creates an AWS instance, uploads the files I need, and starts a Jupyter notebook server. So my workflow is basically:

  • Develop my code in Jupyter notebook
  • Start AWS instance, upload notebook and start notebook server (with the script)
  • Run the notebook until I realise I need some time to rethink and change the code
  • Download the updated notebook and kill the instance
  • Rinse and repeat

Requirements

In the script, I assume that you're running on a Mac (sorry), that you have an AWS account, and that you have created a Key Pair and stored the corresponding .pem file in the folder where you're running the script, so you can authenticate yourself with your AWS instances. If you have no clue what I'm talking about, see for example the following tutorials that will guide you through your first experience with AWS:

The AWS Command Line Interface

Starting AWS instances from the command line is done with ... *drum roll* ... the AWS Command Line Interface.

You install and configure this as follows:

# pip install awscli
yes | pip install awscli --upgrade --user

# Add the path to the awscli to your PATH variable and source your updated .bash_profile
echo 'export PATH=~/.local/bin:$PATH' >> ~/.bash_profile
source ~/.bash_profile

# Configure your AWS settings and credentials
aws configure

This last command will ask for your Access Key and Secret Access Key. You can get those from 'Users' in the Identity and Access Management (IAM) section of the AWS console.

Now you're all set to start using the AWS CLI. For detailed information, see the AWS CLI documentation.

Firing up an instance

The critical part of the script (see the next section) is the command that fires up an AWS instance.

Note that you need an image_id, which is the id of the Amazon Machine Image (AMI) that you want to load onto the instance. A commonly used Deep Learning image is for example this one, with Image ID ami-06ae1bbcb7042c6c6. It has TensorFlow and a lot of Python stuff pre-installed.

You also need to provide the type of instance you want to use. The beefier the instance, the more you pay of course. For example, I'm using a p3.8xlarge instance (costs about $12 per hour). For an instance like this you need to request an instance limit increase, because by default the limit is 0.

Lastly, you need to provide the name of your Key Pair, the name of the Security Group you want to use (this determines which ports are open, and to whom), and the region you want to work in (e.g. eu-central-1, which is in Frankfurt). The Security Group is created automatically by the script.
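Putting these pieces together, a minimal sketch of the instance-creation call looks like the following. The variable names and values are illustrative, not necessarily what start_aws_instance.sh actually uses:

```shell
#!/bin/bash
# Sketch of the instance-creation call. All variable names and values
# here are illustrative placeholders, not the script's exact invocation.
IMAGE_ID="ami-06ae1bbcb7042c6c6"   # the Deep Learning AMI mentioned above
INSTANCE_TYPE="p3.8xlarge"         # beefy GPU instance (~$12 per hour)
KEY_NAME="my-key-pair"             # your Key Pair name (placeholder)
SECURITY_GROUP="jupyter-sg"        # the group the script creates (placeholder)
REGION="eu-central-1"              # Frankfurt

# Built as a string and printed so you can inspect it before running it
CMD="aws ec2 run-instances --image-id $IMAGE_ID --instance-type $INSTANCE_TYPE --key-name $KEY_NAME --security-groups $SECURITY_GROUP --region $REGION --count 1"
echo "$CMD"
```

Running the echoed command returns a JSON blob from which the instance id and public DNS name can be read.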

Let's try it out!

To run the script yourself, clone this repository or download the start_aws_instance.sh script directly. If you want to run the script independently of the other scripts in this repository, make sure you also have a script for configuring the instance that you want to upload to it (I called it configure_aws_instance.sh), as well as a notebook you want to run (test_aws_instance.ipynb in my case).

At the top of the scripts you can find all the settings that you may want to change, depending on what you're trying to do.
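As an illustration, such a settings block might look something like this — the variable names here are guesses, so check the actual scripts for the real ones:

```shell
# Hypothetical settings block -- the variable names in the actual
# scripts may differ
PEM_FILE="my-key-pair.pem"          # Key Pair file in the current folder
KEY_NAME="my-key-pair"              # Key Pair name on AWS
INSTANCE_TYPE="p3.8xlarge"          # instance type to fire up
IMAGE_ID="ami-06ae1bbcb7042c6c6"    # AMI to load onto the instance
REGION="eu-central-1"               # region to work in
NOTEBOOK="test_aws_instance.ipynb"  # notebook to upload and run
LOCAL_PORT=8899                     # local port forwarded to the notebook server
```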

Let's run the script!

So the script created a Security Group opening port 22 (so I can SSH into the instance) and port 8888 (so I can work with the Jupyter notebook server on the instance from my local machine). It then started a new EC2 instance of the requested type, and copied my notebook and a configuration script to the instance. Finally, it SSH'd into the instance, forwarding port 8888 of the notebook server to port 8899 on my local machine.
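That last port-forwarding step can be sketched as follows — the 'ubuntu' user and the hostname are placeholders (the login user depends on the AMI):

```shell
# Sketch of the SSH port forward: local port 8899 -> remote port 8888.
# The hostname and user are placeholders for the actual instance.
PEM_FILE="my-key-pair.pem"
INSTANCE_HOST="ec2-XX-XX-XX-XX.eu-central-1.compute.amazonaws.com"
SSH_CMD="ssh -i $PEM_FILE -L 8899:localhost:8888 ubuntu@$INSTANCE_HOST"
echo "$SSH_CMD"   # printed for inspection rather than executed
```

The `-L 8899:localhost:8888` flag means that anything sent to port 8899 on my Mac is tunneled to port 8888 on the instance, where the notebook server listens.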

It then also automatically kicked off the uploaded configure_aws_instance.sh script. This script updates a couple of packages and then starts the Jupyter notebook server. This could also include downloading data from public sources or an S3 bucket for example.
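The configure_aws_instance.sh in the repository is the authoritative version; a hypothetical sketch of such a configuration script might look like this (written to a local file here so nothing actually executes):

```shell
# Write a hypothetical configure script to a local file for inspection.
# The real configure_aws_instance.sh in the repository may differ.
cat > configure_aws_instance_sketch.sh <<'EOF'
#!/bin/bash
sudo apt-get update -y                 # update a couple of packages
pip install --upgrade jupyter          # make sure Jupyter is current
# optionally fetch data, e.g.: aws s3 cp s3://my-bucket/data.csv .
nohup jupyter notebook --no-browser --port 8888 > jupyter.log 2>&1 &
EOF
```

Running the server under `nohup ... &` keeps it alive after the configuration script exits.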

Finally, the URL of the notebook server (including the security token) is retrieved and my browser automatically opens the familiar Jupyter interface.
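One way to do that retrieval (a sketch — the actual script may do it differently) is to grep the tokenized URL out of the server log; the log line below is a made-up stand-in for Jupyter's real startup output:

```shell
# Fake log line standing in for Jupyter's real startup output
echo 'The Jupyter Notebook is running at: http://localhost:8888/?token=abc123' > jupyter.log
# Extract the first URL that contains a token
URL=$(grep -o 'http://[^ ]*token=[^ ]*' jupyter.log | head -n 1)
echo "$URL"   # on a Mac you could then run: open "$URL"
```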

I can then run my notebook and see that I'm clearly on a way bigger machine than my measly MacBook :P.

When I'm done running the notebook, I can save my work and kill the instance with the terminate_instance.sh script on my local machine. This will download the updated notebook and terminate the expensive AWS instance. I can then continue developing on my local machine without wasting the AWS credit.
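In spirit, that teardown boils down to two commands, sketched here with placeholder names — see terminate_instance.sh for the real thing:

```shell
# Sketch of the teardown. Hostname, instance id and user are placeholders.
PEM_FILE="my-key-pair.pem"
INSTANCE_HOST="ec2-XX-XX-XX-XX.eu-central-1.compute.amazonaws.com"
INSTANCE_ID="i-0123456789abcdef0"

# Download the updated notebook, then terminate the expensive instance.
# Built as strings and printed so you can inspect them before running.
SCP_CMD="scp -i $PEM_FILE ubuntu@$INSTANCE_HOST:test_aws_instance.ipynb ."
TERMINATE_CMD="aws ec2 terminate-instances --instance-ids $INSTANCE_ID --region eu-central-1"
echo "$SCP_CMD"
echo "$TERMINATE_CMD"
```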
