Strangely, on only some of my instances I get this error:
mysqldump: Couldn't execute 'SHOW FIELDS FROM host_summary': View 'sys.host_summary' references invalid table(s) or column(s) or function(s) or definer/invoker of view lack rights to use them (1356)
The MySQL servers being backed up run MySQL 5.7 on Ubuntu 18.04.
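A common workaround for error 1356 on views in the `sys` schema is to exclude the internal schemas from the dump altogether. Below is a minimal sketch of such a filter; `filter_databases` is an illustrative helper, not a function from the actual backup script, and the loop in the comment assumes the script enumerates databases via `SHOW DATABASES`.

```shell
# filter_databases reads database names on stdin and drops schemas whose
# views can break mysqldump (sys) plus the other internal schemas.
filter_databases() {
    grep -Ev '^(information_schema|performance_schema|sys)$'
}

# Possible usage in the backup loop (illustrative, not the script's code):
# for db in $(mysql -N -e 'SHOW DATABASES' | filter_databases); do
#     mysqldump "$db" | bzip2 > "$DBDUMPSDIR/mysqldump_$db.sql.bz2"
# done
```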
Is it possible to modify the script for incremental saves? Currently, every time the script runs it deletes the previous save. Could you add an option to save the file with the current date? Something like: $DBDUMPSDIR/mysqldump_$db_YYYY_MM_DD_HH_MM.sql.bz2
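The timestamped filename above could be built with a small helper like the following sketch; `dump_path` is an illustrative name, and `DBDUMPSDIR` is assumed to be set by the surrounding script.

```shell
# Build a timestamped dump path so each run keeps its own file instead
# of overwriting the previous one.
dump_path() {
    db="$1"
    # date +%Y_%m_%d_%H_%M yields e.g. 2023_01_15_04_30
    echo "$DBDUMPSDIR/mysqldump_${db}_$(date +%Y_%m_%d_%H_%M).sql.bz2"
}

# Possible usage: mysqldump "$db" | bzip2 > "$(dump_path "$db")"
```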
In one of my projects, I have very large tables (several GB in size). Locking while a table is being dumped would stop the whole system, since the dump takes 30 minutes or more. So as a workaround (instead of a "real" online backup) I decided to add MYSQLDUMP_ADD_OPTS=--skip-lock-tables. This has been working for months (years, by now).
Currently (since PR #8, I assume) the backup locks the project's tables again :-(
I was not yet able to dive into that deeply but I assume the following:
Before, --skip-lock-tables was added after --lock-tables on the command line, so table locking was disabled.
Now, --lock-tables is appended (unless the "mysql" database is being backed up), making it the last option on the command line, so table locking is active.
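To make the ordering problem concrete: like most MySQL client programs, mysqldump applies options left to right, so the later of a contradictory pair such as --lock-tables / --skip-lock-tables wins. A sketch of an ordering that keeps user overrides effective (build_dump_opts is an illustrative helper, not a function from the script):

```shell
# Assemble mysqldump options so that user-supplied extras come last
# and can therefore override the script's defaults.
build_dump_opts() {
    opts="--lock-tables"              # script default
    opts="$opts $MYSQLDUMP_ADD_OPTS"  # user additions appended last, so they win
    echo "$opts"
}
```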
As we have disabled tracing (removed set -x, for good reasons), I could not tell from the log files whether the above idea is correct. How about adding a DEBUG option that logs the effective mysqldump options?
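Such a DEBUG switch could look like the following sketch, which logs the effective command line without re-enabling full set -x tracing. DEBUG and run_dump are illustrative names, not variables or functions from the script:

```shell
# Log the effective mysqldump invocation when DEBUG=1, then run it.
run_dump() {
    if [ "${DEBUG:-0}" = "1" ]; then
        echo "DEBUG: mysqldump $*" >&2
    fi
    # mysqldump "$@"   # real invocation commented out for this sketch
}
```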