
scheduled-snapshots's Introduction

A step-by-step guide for taking daily, rotating snapshots of EBS volumes. You will get one new snapshot every day, and any snapshot over 7 days old will be deleted automatically.

Set up an IAM User

First we need to set up a user that will be used specifically for backups and that has only the permissions we need. To do this:

  1. Go to the "Identity and Access Management" section in the AWS console.
  2. Click on "Users" on the left.
  3. Enter the user name "snapshot-backup" and click Create.
  4. Download the credentials and close the dialog.
  5. Click on the new user and then click on Inline Policies.
  6. Create a new policy and choose Custom Policy.
  7. Name the custom policy "snapshot-backup" and use the policy document below. Then click Apply Policy.

Policy to use:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": "ec2:CreateSnapshot",
            "Effect": "Allow",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "ec2:CreateTags",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "ec2:DeleteSnapshot",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "ec2:DescribeSnapshots",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": "ec2:DescribeTags",
            "Resource": "*"
        }
    ]
}
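
If you prefer the command line to the console, the same user and policy can be created with the AWS CLI. This is only an optional sketch of the console steps above: it assumes you have admin-level credentials already configured locally and that you have saved the policy document to a local file named snapshot-backup-policy.json (a name used here just for illustration).

aws iam create-user --user-name snapshot-backup
aws iam put-user-policy --user-name snapshot-backup --policy-name snapshot-backup --policy-document file://snapshot-backup-policy.json
aws iam create-access-key --user-name snapshot-backup

The output of the last command includes the access key ID and secret access key, which stand in for the credentials file you would otherwise download from the console.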

Gather EBS info

In the AWS console, go to EC2 and then Volumes. Copy down the Volume ID of each volume you want to back up. Also note the region you are in, shown in the upper-right of the screen, and write down its short name, referred to as the Code in the AWS documentation.
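
If you already have the AWS CLI set up with credentials that are allowed to describe volumes (note that the backup user's policy above deliberately does not include ec2:DescribeVolumes), you can also list volume IDs from the command line. The query below is just one possible way to format the output:

aws ec2 describe-volumes --query 'Volumes[*].[VolumeId,Size,AvailabilityZone,State]' --output table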

Set up the AWS CLI

On the instance where you want to run the snapshot tool, first make sure you have the AWS CLI installed. On Ubuntu, you can run:

sudo apt-get install awscli
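
Once that finishes, you can confirm the tool is on your path and see which version was installed:

aws --version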

Then we need to configure an "ssbackup" profile, which is the profile the backup script uses.

aws configure --profile=ssbackup
AWS Access Key ID [None]: (enter the id found in the credentials file you downloaded above)
AWS Secret Access Key [None]: (enter the key found in the credentials file you downloaded above)
Default region name [None]: (enter the Code for region that was written down above)
Default output format [None]: json
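
Before wiring the profile into the script, you can sanity-check it with a read-only call that the policy above allows. Listing your own snapshots is one such check; the result may simply be an empty list if you have no snapshots yet:

aws ec2 describe-snapshots --owner-ids self --profile ssbackup --output json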

Download and test the script

Now we need to download the script that will be used for backups:

sudo curl -o /usr/local/bin/ssbackup.py https://raw.githubusercontent.com/bdeitte/scheduled-snapshots/master/scripts/ssbackup.py
less /usr/local/bin/ssbackup.py

The second command is there because it's good practice to look through anything you've downloaded, and it also serves as a sanity check that the file is in place.

Now you can test out the script. Substitute in the volume IDs that you wrote down. You can enter multiple volume IDs separated by commas.

sudo python3 /usr/local/bin/ssbackup.py --volume-ids={vol},{vol} --expiry-days=7

If there is nothing but short processing output, then things are working. You should see the snapshot show up in the AWS UI shortly, in EC2 under Snapshots.
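
If you'd rather check from the command line than the AWS UI, something like the following should list the new snapshot, assuming you substitute your own volume ID for the placeholder:

aws ec2 describe-snapshots --profile ssbackup --filters Name=volume-id,Values={vol} --query 'Snapshots[*].[SnapshotId,StartTime,State]' --output table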

Setting up cron

Now we need to set up the script to run every day automatically. Use cron for this:

sudo crontab -e

Create a new line at the bottom of the file that appears. It should contain the same volume IDs that you tested with above, but the line itself is a bit different.

20 0 * * * python3 /usr/local/bin/ssbackup.py --volume-ids={vol} --expiry-days=7

You can change the first two numbers in that line if you would like it to run at a slightly different time. This is set up to run at 12:20 am. If you would like it to run at 2:00am, for instance, you would change "20 0" to "0 2".

Save the file, and you're all set! The script will now run every day, and you'll always have 7 days of snapshots to use in case of an emergency.
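
If you later want to double-check that the entry was saved, or that cron is actually running it, two quick checks on Ubuntu are listing the root crontab and grepping the syslog (the log location may differ on other distributions):

sudo crontab -l
grep CRON /var/log/syslog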

Things still to do in this guide (someday)

  • Add more troubleshooting info in here if things don't get set up right or snapshots aren't showing up.
  • Use an IAM instance role rather than the ID and key to set up the CLI
  • Alert on errors when running the snapshot script or when cron isn't running. I've used DataDog for this kind of thing in the past, but there are lots of other free or cheap tools that could be used too.

Credits

I've modified the script and steps from a blog post on the subject. Thank this person and not me for this project, as they are largely responsible for it!


scheduled-snapshots's Issues

Error when running script

I'm trying to set up your script on an Ubuntu 15 instance but I'm getting the following error when I run

python3 varnishbackup.py --volume-ids=vol-0a7de8c7c8b38fe5a --expiry-days=3

Creating snapshots...

('Connection aborted.', gaierror(-2, 'Name or service not known'))
Traceback (most recent call last):
File "varnishbackup.py", line 120, in
snapshots = createSnapshots(volumeIds)
File "varnishbackup.py", line 41, in createSnapshots
snapshots.append(createSnapshotForVolume(volumeId))
File "varnishbackup.py", line 78, in createSnapshotForVolume
"--profile", profile
File "/usr/lib/python3.4/json/init.py", line 318, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.4/json/decoder.py", line 343, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.4/json/decoder.py", line 361, in raw_decode
raise ValueError(errmsg("Expecting value", s, err.value)) from None
ValueError: Expecting value: line 1 column 1 (char 0)

script error

The OS is CentOS 6 64-bit and it has Python 3.3.3 installed.
Error below:

[root@ip-172 /]# python3 --version
Python 3.3.3
[root@ip-172 /]# python3 /usr/local/bin/ssbackup.py --volume-ids=vol-4gg4gg4 --expiry-days=7
Traceback (most recent call last):
File "/usr/local/lib/python3.3/json/decoder.py", line 368, in raw_decode
obj, end = self.scan_once(s, idx)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/local/bin/ssbackup.py", line 119, in
snapshots = createSnapshots(volumeIds)
File "/usr/local/bin/ssbackup.py", line 56, in createSnapshots
"--profile", profile
File "/usr/local/lib/python3.3/json/init.py", line 319, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.3/json/decoder.py", line 352, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.3/json/decoder.py", line 370, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded

error on deleting old snapshots

Hello. The first time the script had to delete old snapshots I got the following errors. Can you please check why that happens? Thanks!

[root@ip-172-]# python3 /usr/local/bin/ssbackup.py --volume-ids=vol-xxxxx1,vol-xxxxxc,vol-1xxxxxa --expiry-days=3
Creating snapshots...
Deleting old snapshots...
Traceback (most recent call last):
File "/usr/local/lib/python3.3/json/decoder.py", line 368, in raw_decode
obj, end = self.scan_once(s, idx)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/local/bin/ssbackup.py", line 128, in
deleteOldSnapshots(getOurSnapshots(), args.expiry_days)
File "/usr/local/bin/ssbackup.py", line 103, in deleteOldSnapshots
"--profile", profile
File "/usr/local/lib/python3.3/json/init.py", line 319, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.3/json/decoder.py", line 352, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.3/json/decoder.py", line 370, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[root@ip-172-]#

multiple --volume-ids

Hello. The script works as intended when a single volume ID argument is passed to it. If you pass multiple --volume-ids arguments, it only creates a snapshot for the last one, no matter how many you use.
The example below only creates a snapshot for --volume-ids=vol-xxxx2.

[root@ip-172 ~]# aws --version
aws-cli/1.10.6 Python/3.3.3 Linux/2.6.32-573.7.1.el6.x86_64 botocore/1.3.28

[root@ip-172 ~]# python3 /usr/local/bin/ssbackup.py --volume-ids=vol-xxxxx1 --volume-ids=vol-xxxx2 --expiry-days=7
Creating snapshots
Deleting old snapshots
Processing completed
[root@ip-172 ~]#
