
rclone_jobber's Introduction

About rclone_jobber backup script

rclone_jobber.sh is a shell script that performs backup jobs by calling rclone, a mature tool that many third-party tools build on.

The rclone jobber tutorial includes backup-job and restore-job examples for a home computer.

rclone_jobber.sh features:

  • Tested on Linux, and should also run on macOS and Windows 10 WSL
  • Options to archive old backup files in their original hierarchy
  • Can backup to multiple destinations for redundant backups (one backup job per destination)
  • Aborts if the backup job is already running (maybe the previous run didn’t finish)
  • Pop-up for error conditions
  • Option for a cron-monitoring service
  • Logging
  • POSIX compliant shell script so it is easy to customize
  • Free (open source Creative Commons Zero license)

Rclone features:

  • Back up data from local machine
  • Back up data from a cloud provider (Google Drive for example)
  • Back up to local storage
  • Back up to remote cloud storage (safe from local disaster)
  • Over 30 cloud-storage providers to choose from (so you’re never locked into a provider)
  • Optional encryption (Crypt)
  • MD5/SHA1 hashes checked at all times for file integrity
  • Timestamps preserved on files
  • Filter rules (to exclude or include files)
  • rsync-like algorithm and interface
  • Sync (one way) mode to make a directory identical
  • Partial syncs supported on a whole file basis
  • Free (open source MIT license)

Both rclone and rclone_jobber.sh are command-line tools. If you prefer a GUI, check out RcloneBrowser.

Example job scripts

rclone_jobber.sh is already developed, documented, and tested. Only the backup-job files need to be written.

Example minimal backup-job script for rclone_jobber:

#!/usr/bin/env sh

source="$HOME"
dest="$usb/rclone_jobber_backup"

$rclone_jobber/rclone_jobber.sh "$source" "$dest"

Example backup job with all the rclone_jobber.sh arguments defined:

#!/usr/bin/env sh

source="$HOME"
dest="$usb/rclone_jobber_backup"
move_old_files_to="dated_files"
options="--filter-from=$rclone_jobber/filter_rules --checksum --dry-run"
monitoring_URL="https://monitor.io/12345678-1234-1234-1234-1234567890ab"

$rclone_jobber/rclone_jobber.sh "$source" "$dest" "$move_old_files_to" "$options" "$(basename $0)" "$monitoring_URL"
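
A job script like these is typically run on a schedule. A minimal sketch, assuming the full example above is saved as an executable script named job_backup_to_usb.sh (a hypothetical name and path) and scheduled with cron:

# hypothetical crontab entry: run the backup job nightly at 01:30
30 1 * * * /home/me/rclone_jobber/job_backup_to_usb.sh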

Contact

Please tell me if something is wrong; you're helping me improve this project.

Submit issues to https://github.com/wolfv6/rclone_jobber/issues. Pull requests are welcome.

Licenses

The content of the tutorial, including the associated setup_test_data_directory.sh, is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 license.

The rclone_jobber.sh source code, including the associated job files, filter_rules file, and test_suite, is licensed under the Creative Commons Zero 1.0 license.

Rclone_jobber is not affiliated with rclone.

rclone_jobber's People

Contributors

brunoleon, nolooseends, wolfv6

rclone_jobber's Issues

Use atomic mkdir rather than pidof

mkdir -m 0700 folder is atomic (with -p it is not).

Thus one can use it to make singleton instances of processes, e.g.:

singleton_instance_lock()
{
	local temporary_folder_path=...
	local program_name=...
	single_instance_lock_folder_path=''

	local potential_single_instance_lock_folder_path="$temporary_folder_path"/"$program_name"
	local lock_holder_pid_file_path="$potential_single_instance_lock_folder_path"/lock_holder.pid
	local loop_count=0
	
	local exit_code
	while true
	do
		set +e
			mkdir -m 0700 "$potential_single_instance_lock_folder_path" 1>/dev/null 2>/dev/null
			exit_code=$?
		set -e
		
		if [ $exit_code -eq 0 ]; then
			printf '%s\n' $$ >"$lock_holder_pid_file_path"
			single_instance_lock_folder_path="$potential_single_instance_lock_folder_path"
			break
		fi
		
		local loop_division=$((loop_count % 5))
		if [ $loop_division -eq 0 ]; then
			local lock_holder_pid=''
			set +e
				# Uses cat rather than $(<X) as the latter echoes failures to stderr regardless.
				lock_holder_pid="$(cat "$lock_holder_pid_file_path" 2>/dev/null)"
			set -e
			
			if [ -z "$lock_holder_pid" ]; then
				lock_holder_pid='(unknown)'
			fi
			
			printf '%s:%s\n' "$program_name" "Still waiting for single instance lock $potential_single_instance_lock_folder_path held by process $lock_holder_pid"
		fi
		loop_count=$((loop_count + 1))
		
		# Non-portable fractional sleep
		set +e
			sleep 0.1 1>/dev/null 2>/dev/null
			exit_code=$?
		set -e
		
		if [ $exit_code -ne 0 ]; then
			sleep 1 1>/dev/null 2>/dev/null
		fi
		
	done
	
	remove_single_instance_lock_folder_path()
	{
		if [ -n "$single_instance_lock_folder_path" ]; then
			rm -rf "$single_instance_lock_folder_path"
			single_instance_lock_folder_path=''
		fi
	}
	
	trap remove_single_instance_lock_folder_path EXIT
}

question

Dear Developer,
I have a question. Does this script copy all the data every time, or only the files that have changed between the previous backup and the current one?

Cheers
Luigi

move_old_files to their own directory

I've followed your example in the thread at https://forum.rclone.org/t/ive-made-a-nice-backup-script-tutorial-for-rclone/4999/9

Here is the job under discussion:

source="/home/joti/test_rclone_data/music"
dest="${test_remote}:"
new="music"
move_old_files_to="dated_directory"
#options="--dry-run"
monitoring_URL=""

This creates the directory music under test_remote i.e. test_remote:/music/files...

However, any deletions/modifications of these files don't end up under test_remote:/archive/music/VERSION

instead they end up under test_remote:/archive/_VERSION_

This is problematic because then when I backup with the following script:

source="/home/joti/test_rclone_data/docs"
dest="${test_remote}:"
new="docs"
move_old_files_to="dated_directory"
#options="--dry-run"
monitoring_URL=""

All my versions of docs are also under test_remote:/archive/_VERSION_, and I have no way to tell whether a file came from /home/joti/test_rclone_data/music or /home/joti/test_rclone_data/docs.

How do I use this script to push to a remote that already contains the first version of the files but keep archived versions in their own directories?

A couple of suggestions

Hi, thanks for this. :)

I'm just testing out rclone and this script helps to make it more of a backup tool rather than just a sync tool.

A couple of suggestions that popped up just now:

  1. Add an option to write warnings to the systemd journal (journalctl). I, at least, run this on a VM and not on my desktop, so the notify pop-up part does not work.

This command should work:

printf "$message" | systemd-cat -t RCLONE_JOBBER

  2. Log rotation in the script? Not sure if that is a good idea, or whether to just set up log rotation manually. At least the initial backup will be huge and fill up the log file (a logrotate sketch follows at the end of this issue).

I also set this up to use the conf and filter file from the same directory as the script. That makes the most sense to me: everything is in one place if I, for example, move it to another VM.

To make it a bit easier for myself I made a symlink of the config file to the default rclone directory. An alternative method could be to set up an alias in .bashrc/.zshrc/etc.

rclone_jobber=<path-to-backup-script-directory> #path to rclone_jobber directory
remote=<name of remote> #added this

source="<path-to-source>"
dest="${remote}:"
move_old_files_to="dated_directory"
options="--config ${rclone_jobber}/rclone.conf --filter-from=${rclone_jobber}/filter_rules"
monitoring_URL=""

${rclone_jobber}/rclone_jobber.sh "$source" "$dest" "$move_old_files_to" "$options" "$(basename $0)" "$monitoring_URL"

Thanks!
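
For the log-rotation suggestion above, one option (a sketch, not something rclone_jobber ships with) is a plain logrotate rule; the log path below is a placeholder and should match the --log-file path the jobs actually use:

# hypothetical /etc/logrotate.d/rclone_jobber
/home/me/rclone_jobber/rclone_jobber.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}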

Add useragent string?

Hi, in light of the recent Google issue (forum) (github issue), I'm trying to add --user-agent "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36", but I can't get it to work because of the spaces and parentheses.

I've temporarily solved it by just using a random string, but I would prefer using an actual user-agent string from Chrome.

Any suggestion on how to add that?
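
One workaround worth considering (an assumption on my part, not something documented by rclone_jobber): rclone can read its global flags from RCLONE_-prefixed environment variables, so the user agent can be set outside the space-sensitive $options string, for example in the job script just before calling rclone_jobber.sh:

# hypothetical: set --user-agent through rclone's environment-variable form
# so the quoting inside $options no longer matters
export RCLONE_USER_AGENT="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36"

$rclone_jobber/rclone_jobber.sh "$source" "$dest" "$move_old_files_to" "$options" "$(basename $0)" "$monitoring_URL"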

Doesn't work with folders/files with spaces

No matter whether you pass double quoted or single quoted paths to rclone_jobber, you get an error of the following type:

Command lsf needs 1 arguments maximum: you provided 2 non flag arguments: ["'/mnt/user/Google" "Drive/'"]
ERROR: backup_Google aborted because source is empty.
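
For reference, the error looks like ordinary word splitting on an unquoted variable expansion inside the script; a minimal illustration, assuming the script runs something like rclone lsf $source without quotes (an assumption, not verified against the script):

source="/mnt/user/Google Drive/"

rclone lsf $source      # splits into two arguments: /mnt/user/Google and Drive/
rclone lsf "$source"    # passes a single argument, spaces included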

User error with setting up a remote host?

May I possibly get a nudge in the right direction? I'm having problems getting the script to run using a remote rclone connection.

I have the following elements in the script:

source="/home/x/storage"            #the directory to back up (without a trailing slash)
dest="${dropbox_crypt_WS1bkp}:"              #the directory to back up to (without a trailing slash or "last_snapshot") destination=$dest/last_snapshot

The objective is to back up /home/x/storage/* (everything from this point downwards) to the Dropbox location (configured in rclone) dropbox_crypt_WS1bkp. It produces the following command:

Back up in progress 2021-12-08_11:52:12 WS1bkp_storage
rclone sync /home/x/storage :/last_snapshot --backup-dir=:/archive/2021/2021-12-08_11:52:12 --log-file=/home/x/scripts/rclone_jobber/WS1backup.log --log-level=INFO
ERROR: WS1bkp_storage failed.  rclone exit_code=1

but I can't make sense of it. I've tried writing the source without the dollar sign, and without the curly brackets as well (different parts of the script/docs give the impression that they are left out, adding to my confusion).

Any guidance and pointers are welcome.
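
One hedged reading of the posted output (a guess, not a confirmed fix): the printed command shows an empty remote name (":/last_snapshot"), which suggests the shell variable dropbox_crypt_WS1bkp was never defined, so "${dropbox_crypt_WS1bkp}:" expanded to just ":". If dropbox_crypt_WS1bkp is the name of the rclone remote rather than a shell variable, it should be written literally:

# hypothetical fix, assuming dropbox_crypt_WS1bkp is an rclone remote name
dest="dropbox_crypt_WS1bkp:"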

prevent parallel runs using flock or similar?

I've realised that I sometimes have multiple instances of rclone running in parallel; the check for whether the script is already running does not work on my Open Media Vault server (Linux nas 6.1.21-v8+ #1642 SMP PREEMPT Mon Apr 3 17:24:16 BST 2023 aarch64 GNU/Linux).

Does anyone have a modified version of the script using flock or similar to prevent running more than once at a time?

Or, since the last update to this repo was 4 years ago, what alternative script have you moved on to?

Thanks!
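
A minimal sketch of a flock-based wrapper, assuming flock(1) from util-linux is available (the lock path and job-script path are placeholders):

#!/usr/bin/env sh
# hypothetical wrapper: flock -n exits non-zero immediately if another process
# holds the lock, so overlapping cron runs are skipped instead of queued
exec flock -n /var/lock/rclone_jobber_home.lock \
    /home/me/rclone_jobber/examples/job_backup_to_usb.sh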

job_restore_last_snapshot.sh does nothing

If I run the script and answer Y or y, nothing happens:

root@fs  🖿  ~ ▓▒░ rclone_jobber/examples/job_restore_last_snapshot.sh                                             ░▒▓ ✘  11.81  3%  20:30 🕣  
rclone copy remote:del/last_snapshot ~/test/last_snapshot
>>>>>>>>>>>>>>> Run the above rclone command? (y) <<<<<<<<<<<<<<<<< 
y

 root@fs  🖿  ~ ▓▒░ ll                                                                                         ░▒▓ ✔  1.5s  11.82  3%  20:30 🕣  
total 0
lrwxrwxrwx 1 root root  21 Apr 29 15:57 mdcmd -> /usr/local/sbin/mdcmd
drwxrwxrwx 5 root root 180 Apr 29 19:48 rclone_jobber
drwxrwxrwx 5 root root 120 Apr 29 19:58 rclone_test_data

Conversely, if I just run the rclone command it suggests, it works fine:

 root@fs  🖿  ~ ▓▒░ rclone copy remote:del/last_snapshot ~/test/last_snapshot                               ░▒▓ ✔  11.82  3%  20:30 🕣  

 root@fs  🖿  ~ ▓▒░ ll                                                                                         ░▒▓ ✔  8.9s  11.71  3%  20:31 🕣  
total 0
lrwxrwxrwx 1 root root  21 Apr 29 15:57 mdcmd -> /usr/local/sbin/mdcmd
drwxrwxrwx 5 root root 180 Apr 29 19:48 rclone_jobber
drwxrwxrwx 5 root root 120 Apr 29 19:58 rclone_test_data
drwxrwxrwx 3 root root  60 Apr 29 20:30 test

Make timestamp an option

It would be nice to make this an option so the user does not need to edit the master script with every update:

timestamp="$(date +%F_%H%M%S)" #time w/o colons if thumb drive is FAT format, which does not allow colons in file name

Filter out files from the archive/move function?

Hi, is it possible to filter out files from the "move/archive" function?
Or any ideas on how to do that?

I have a few files that change almost every day. I want the current version backed up, but I don't want the old version copied to the "archive", since most days those are the only files that have changed.

Script will not recognize space in directory

I'm having a hard time getting the job_backup_to_remote script to recognize a space in the path of my source directory. Here's the modified script:

rclone_jobber="/Users/Olive/Documents/GitHub/rclone_jobber" #path to rclone_jobber directory

source="/Volumes/Oliver/User Library" <----- issue here
dest="pcloud:Production/Ableton"
move_old_files_to="dated_directory"
options="--stats=10s"
monitoring_URL=""

/Users/Olive/Documents/GitHub/rclone_jobber/rclone_jobber.sh "$source" "$dest" "$move_old_files_to" "$options" "$(basename $0)" "$monitoring_URL"

Here's the error message I get:

~ '/Users/Olive/Documents/GitHub/rclone_jobber/olive_backup/ableton_remote.sh' 07:58:52
ls: /Volumes/Oliver/User: No such file or directory
/Users/Olive/Documents/GitHub/rclone_jobber/rclone_jobber.sh: line 86: pidof: command not found
Back up in progress 2018-08-02_08:02:31 ableton_remote.sh

(I renamed job_backup_to_remote to ableton_remote)

Thanks!

macOS & Homebrew & pidof: illegal option — o

Running something like:
pidof -o $PPID -x "$job_name"

causes:
pidof: illegal option -- o

rclone_jobber.sh version 1.5.1

pidof -v
pidof version 0.1.4
Copyright 2003 - 2004 Night Productions

brew -v
Homebrew 2.0.6
Homebrew/homebrew-core (git revision f321f; last commit 2019-03-26)

uname -a
Darwin localhost 18.2.0 Darwin Kernel Version 18.2.0: Thu Dec 20 20:46:53 PST 2018; root:xnu-4903.241.1~1/RELEASE_X86_64 x86_64

Are there any alternatives?
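
One portable alternative (a sketch in the spirit of the "Use atomic mkdir rather than pidof" issue above; the lock path is a placeholder) is to drop pidof entirely and use an atomic mkdir lock:

# hypothetical replacement for the pidof check inside rclone_jobber.sh
lock_dir="${TMPDIR:-/tmp}/rclone_jobber_${job_name}.lock"
if ! mkdir -m 0700 "$lock_dir" 2>/dev/null; then
    echo "$job_name appears to be running already" >&2
    exit 1
fi
trap 'rm -rf "$lock_dir"' EXIT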
