[HOWTO] Properly backup self-hosted /e/ cloud databases

Hi !

My goal was to properly back up all my databases as plain files, so they could be included in my hosting provider's daily plain-file backup.

Of course I installed the NextBackup app in NextCloud, but it is … limited to NextCloud :wink:
We also have a postfix database to back up (very important), and I have 2 more personal Docker apps whose databases are managed by the /e/ mariadb Docker container.

One note about database backups: backing up the files of a running transactional DB engine is never a good idea, as it cannot guarantee database integrity (transaction log files skipped because of locks, activity during the backup, uncommitted data still in memory, …).

So I decided to use the schnitzler/mysqldump Docker image to make these backups as SQL dumps.
Why this image? It is small, fast, ships with BusyBox, and has a GitHub page.

1st step: creating the files
This image needs a storage space for the backups, some backup scripts, and optional cron files.
So for /e/ Cloud I created:

/mnt/backup/mariadb/bin/backup_nc (NextCloud database dump):

#!/bin/sh
now=$(date +"%Y-%m-%d_%s")
/usr/bin/mysqldump --opt -h ${MYSQL_HOST} -u ${MYSQL_USER_NC} -p${MYSQL_PASSWORD_NC} ${MYSQL_DATABASE_NC} | gzip > "/backup/${now}_${MYSQL_DATABASE_NC}.gz"
find /backup -type f -iname "*_${MYSQL_DATABASE_NC}.gz" -mmin +2875 -delete

/mnt/backup/mariadb/bin/backup_pf (postfix database dump):

#!/bin/sh
now=$(date +"%Y-%m-%d_%s")
/usr/bin/mysqldump --opt -h ${MYSQL_HOST} -u ${MYSQL_USER_PF} -p${MYSQL_PASSWORD_PF} ${MYSQL_DATABASE_PF} | gzip > "/backup/${now}_${MYSQL_DATABASE_PF}.gz"
find /backup -type f -iname "*_${MYSQL_DATABASE_PF}.gz" -mmin +2875 -delete
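Since the dumps are plain gzip'd SQL, restoring one is just the reverse pipe. A minimal sketch (the dump file name is an example; credentials come from the same variables as above, and the gzip round trip can be checked locally without a database):

```shell
# Hypothetical restore of a NextCloud dump (file name is an example):
#   gunzip -c /backup/2020-07-31_1596160800_ncdb.gz | mysql -h "${MYSQL_HOST}" -u "${MYSQL_USER_NC}" -p"${MYSQL_PASSWORD_NC}" "${MYSQL_DATABASE_NC}"

# The gzip round trip itself, verified with dummy SQL:
printf 'CREATE TABLE demo (id INT);' | gzip > /tmp/demo_dump.sql.gz
restored=$(gunzip -c /tmp/demo_dump.sql.gz)
echo "$restored"
```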

Don’t forget to create the storage directory and chmod it appropriately: /mnt/backup/mariadb/backup
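For reference, the setup commands look like this (a sketch; the 755 mode is my assumption, adjust to your security needs; BASE is just a variable so you can try it elsewhere):

```shell
# BASE defaults to the path used in this guide; override it to test elsewhere
BASE="${BASE:-/mnt/backup/mariadb}"
mkdir -p "$BASE/bin" "$BASE/backup"
chmod 755 "$BASE/backup"
# remember to also chmod 755 the backup_nc / backup_pf scripts in $BASE/bin:
# BusyBox run-parts only executes files that are marked executable
```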

2nd step: add the container to Docker Compose (/mnt/repo-base/docker-compose.yml):

  mysqldump:
    image: schnitzler/mysqldump:latest
    container_name: mysqldump
    restart: always
    networks:
      - serverbase
    environment:
      - MYSQL_HOST=mariadb
      # the MYSQL_*_NC / MYSQL_*_PF variables used by the scripts come from /mnt/repo-base/.env
    volumes:
      # Example for user crontab
      #- /mnt/backup/mariadb/bin/crontab:/var/spool/cron/crontabs/root
      #- /mnt/backup/mariadb/bin/backup_nc:/usr/local/bin/backup_nc
      # Backup scripts, scheduled using system-provided cron
      - /mnt/backup/mariadb/bin/backup_nc:/etc/periodic/daily/backup_nc
      - /mnt/backup/mariadb/bin/backup_pf:/etc/periodic/daily/backup_pf
      - /mnt/backup/mariadb/backup:/backup
    command: ["-l", "8", "-d", "8"]
    depends_on:
      - mariadb

3rd step: fire it up with

docker-compose up -d

(on the first run, Docker will download the image)
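A couple of optional checks, in case you want to confirm everything is wired up before waiting a full day (a sketch; container name taken from the compose file above):

```shell
# list the scripts BusyBox cron will run daily inside the container
docker exec mysqldump ls -l /etc/periodic/daily
# follow the cron output (log level 8 was set via the "-l 8" command option)
docker logs -f mysqldump
```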

And it’s done :smiley: ! The day after, I could find the following files in /mnt/backup/mariadb/backup:

-rw-r--r-- 1 root root 1165149 Jul 31 04:00 2020-07-31_1596160800_ncdb_Bw6l.gz
-rw-r--r-- 1 root root    3912 Jul 31 04:00 2020-07-31_1596160801_postfix.gz

Notes :

  • files are kept for 2 days only, to save storage
  • all variables come from /mnt/repo-base/.env
  • once it runs fine on a daily basis, the postfix backup can be switched to weekly: just map the script to /etc/periodic/weekly/backup_pf and adjust the time window in the “find” command

EDIT 2020/08/08: changed the “find” criteria from “-mtime +1” to “-mmin +2875”, so files up to 5 minutes past the backup hour are kept (i.e. when today’s backup runs at 04:00, the 2-days-old file may be timestamped 04:02)
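The 2875 figure is simply two days minus that 5-minute safety margin, expressed in minutes; the same rule applied to the weekly schedule mentioned in the notes would give 10075:

```shell
# 2 days minus a 5-minute safety margin, in minutes
daily=$(( 2 * 24 * 60 - 5 ))
# same rule for a weekly schedule (7 days)
weekly=$(( 7 * 24 * 60 - 5 ))
echo "daily=$daily weekly=$weekly"   # daily=2875 weekly=10075
```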
Some cosmetics :wink:



With the latest /e/ Cloud (with NextCloud 20), the “serverbase” network has to be replaced with “default”.

Following an excellent suggestion from @arnauvp in this GitLab issue, I added the --single-transaction option to all my backup scripts.
Reference : MySQL :: MySQL 8.0 Reference Manual :: 4.5.4 mysqldump — A Database Backup Program
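For reference, here is what the updated dump line looks like with the option added (a sketch of the NextCloud script from step 1; variables as before):

```shell
/usr/bin/mysqldump --opt --single-transaction -h ${MYSQL_HOST} -u ${MYSQL_USER_NC} -p${MYSQL_PASSWORD_NC} ${MYSQL_DATABASE_NC} | gzip > "/backup/${now}_${MYSQL_DATABASE_NC}.gz"
```

--single-transaction makes mysqldump take a consistent snapshot of InnoDB tables without locking them, so the dump no longer blocks the running apps.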