Howdy! I have 10+ domains installed on WordOps and I'm looking for an automated off-site backup solution. Preferably something that creates daily backups on a rolling 7-day window and uploads them to, say, S3.

Anyone have a solution they really like?

Great! Thanks, I will review those

I took some inspiration from those and did some research and came up with a single script to accomplish the job, tailored to WordOps.

I'll share once I post it online.
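The script itself isn't reproduced in this thread, so the sketch below is just an illustration of the general approach: loop over the sites, dump each database, tar the site directory, push both to S3, and prune anything older than 7 days. The bucket name, paths and the use of wp-cli for the database dump are assumptions; adjust for your own setup.

    #!/usr/bin/env bash
    # Minimal sketch of a nightly WordOps site backup to S3 (values are placeholders).
    set -euo pipefail

    SITESBASE="/var/www"
    BACKUPPATH="/tmp/s3backups"
    S3DIR="s3://my-backup-bucket/wordops"              # hypothetical bucket/prefix
    DATEFORM="$(date +%Y-%m-%d)"
    DAYSKEPT="$(date --date='7 days ago' +%Y-%m-%d)"   # rolling 7-day retention

    mkdir -p "$BACKUPPATH"

    for SITEPATH in "$SITESBASE"/*/htdocs; do
        SITE="$(basename "$(dirname "$SITEPATH")")"
        mkdir -p "$BACKUPPATH/$SITE"

        # Dump the database with wp-cli (assumes wp-cli is installed on the server)
        wp db export "$BACKUPPATH/$SITE/$DATEFORM-$SITE.sql" --path="$SITEPATH" --allow-root
        gzip -f "$BACKUPPATH/$SITE/$DATEFORM-$SITE.sql"

        # Archive the site files
        tar -czf "$BACKUPPATH/$SITE/$DATEFORM-$SITE.tar.gz" -C "$SITESBASE/$SITE" .

        # Ship to S3, then remove the copies that have aged out of the window
        aws s3 cp "$BACKUPPATH/$SITE/" "$S3DIR/$SITE/" --recursive
        aws s3 rm "$S3DIR/$SITE/$DAYSKEPT-$SITE.tar.gz"
        aws s3 rm "$S3DIR/$SITE/$DAYSKEPT-$SITE.sql.gz"

        rm -rf "${BACKUPPATH:?}/$SITE"
    done

Run it nightly from cron (or a systemd timer) and the oldest copy rolls off S3 each day.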

6 months later

jond1

It's a nice script, but it only backs up websites and their DBs. What if you did something on your server during maintenance, made a typo in a script, or updated some configuration files and then accidentally deleted them?

That's why I feel much safer knowing that, besides my websites, the important files and folders on my server are also backed up, and I can always restore a particular file from a point in time if, for example, it was changed or deleted by accident.
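For anyone who wants to cover that with the same tooling, a rough sketch along the same tar-and-S3 lines; the directory list and bucket path are just examples:

    # Hypothetical extension: archive key config directories alongside the site backups
    DATEFORM="$(date +%Y-%m-%d)"
    tar -czf "/tmp/$DATEFORM-etc.tar.gz" -C / etc/nginx etc/php etc/mysql etc/letsencrypt
    aws s3 cp "/tmp/$DATEFORM-etc.tar.gz" "s3://my-backup-bucket/config/"

    # Later, a single file can be pulled back out of a given day's archive
    mkdir -p /tmp/restore
    tar -xzf "/tmp/$DATEFORM-etc.tar.gz" -C /tmp/restore etc/nginx/nginx.conf

Because each archive is date-stamped, that gives you a crude per-day, per-file restore point for config as well as site content.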

I run multiple servers in our cluster, and an AMI backup always runs before any changes are made, so there is always a redundant backup of each server.
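For reference, that pre-maintenance AMI step can be as simple as the AWS CLI call below; the instance ID and naming are placeholders:

    # Hypothetical example: snapshot the instance as an AMI before touching it
    aws ec2 create-image \
        --instance-id i-0123456789abcdef0 \
        --name "pre-maintenance-$(date +%Y-%m-%d)" \
        --description "Pre-maintenance AMI backup" \
        --no-reboot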

I recently upgraded from Ubuntu 18.04 LTS to Ubuntu 20.04 LTS without issue, and had full redundancy in place in case anything went wrong in production.

In addition, we use GitHub to manage all code, so there is a full backup of all code outside of the cluster.

We also leverage RDS with multi-region redundancy, so there is very little chance of losing any data.

Oh I see, in that case you're good. I just like the ability to restore to a particular point in time, and just a single file if required, not necessarily website code but server config files or any other files on the server besides the ones in the /var/www directory.

yup that makes sense, 100% agree w/ you

3 months later

jond1 Got around to trying this and it's working really well - thanks :-)

I made one quick tweak (for my purposes, but it may be useful to others) to exclude the WordPress core files etc.:
tar -czf "$BACKUPPATH/$SITE/$DATEFORM-$SITE.tar.gz" --exclude-from="/usr/local/bin/s3backupexclude.txt" -C "$SITESBASE/$SITE" .

And then in s3backupexclude.txt:-

./logs
./htdocs/wp-admin
./htdocs/wp-includes
./htdocs/wp-content/index.php
./htdocs/wp-content/upgrade
./htdocs/wp-content/cache
./htdocs/wp-content/updraft
./htdocs/wp-content/ai1wm-backups
./htdocs/wp-content/*.zip
./htdocs/wp-content/*.gz
./htdocs/wp-content/plugins/index.php
./htdocs/wp-content/themes/index.php
./htdocs/wp-*.php
./htdocs/index.php
./htdocs/license.txt
./htdocs/readme.html
./htdocs/xmlrpc.php

There's also --exclude={./logs,./htdocs/wp-admin,./htdocs/wp-includes} if you don't want an external file list (using the same ./ prefixes as the exclude file, since that's how tar records the paths when run with -C ... .).
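For reference, after shell brace expansion that one-liner amounts to something like:

    # Brace expansion turns the single --exclude into one option per pattern
    tar -czf "$BACKUPPATH/$SITE/$DATEFORM-$SITE.tar.gz" \
        --exclude=./logs --exclude=./htdocs/wp-admin --exclude=./htdocs/wp-includes \
        -C "$SITESBASE/$SITE" .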

Awesome! Feel free to post up a gist of it so others can benefit from it as well.

5 days later

Hello marty,
a small tip to speed up your backups: use pigz or zstd.
pigz is a multithreaded gzip, available with the command tar -I pigz -cf yourbackup.tar.gz <other-options>
And zstd (Zstandard) is one of the fastest compression tools available; it beats gzip in both speed and compression ratio. You can use zstd with tar this way: tar -I zstd -cf yourbackup.tar.zst <other-options>
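Applied to the tar command earlier in the thread, that would look something like this (assuming pigz and zstd are installed):

    # pigz: drop-in multi-threaded gzip replacement (output stays .tar.gz)
    tar -I pigz -cf "$BACKUPPATH/$SITE/$DATEFORM-$SITE.tar.gz" -C "$SITESBASE/$SITE" .

    # zstd: different format, so use a .tar.zst extension
    tar -I zstd -cf "$BACKUPPATH/$SITE/$DATEFORM-$SITE.tar.zst" -C "$SITESBASE/$SITE" .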

19 days later

Hi jond1
Been tweaking this to work with non-WP sites (more on that when it's fully tested).
But meanwhile, I think this:

    #delete old backups
    S3REM="$S3DIR/"
    aws s3 rm "$S3REM/$DAYSKEPT"

should be?

    S3REM="$S3DIR"
    aws s3 rm "$S3REM/$SITE/$DAYSKEPT-$SITE".tar.gz
    aws s3 rm "$S3REM/$SITE/$DAYSKEPT-$SITE".sql.gz
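(For what it's worth, $DAYSKEPT presumably just needs to carry the same date format as $DATEFORM, shifted back by the retention window; with GNU date that could look something like the snippet below, though the exact format string depends on how the script defines $DATEFORM.)

    # Hypothetical: the date stamp of the backup that has aged out of the 7-day window
    DATEFORM="$(date +%Y-%m-%d)"
    DAYSKEPT="$(date --date='7 days ago' +%Y-%m-%d)"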
3 years later

alexlii1971 If someone comes here struggling like I did: XCloner was great for backup
but not so great for restore.
I finally had to give up and do something else,
and came here for some CLI scripts.
Thanks
