
Mega.io Backup Script

A Simple Backup Script: Leveraging Mega.io (and Mailgun)

Protecting your data is crucial, but a full-fledged backup solution can be overkill when you’re only running a small server. I’ve developed a backup script that provides an effective, easy-to-use solution by combining Mega.io’s encrypted storage service with Mailgun’s email-sending API.

This tool is designed to back up both databases and server folders. In the following sections, we’ll explore the script’s components and discuss its setup on Unix-based systems.

The script, available on GitLab, is a verbose shell script written for Unix-based systems. It combines standard command-line tools, Mega’s MEGAcmd, and Mailgun’s API to create and securely store archives of your data and to notify you about the outcome of the process.

Table of contents

  1. How it works (code walkthrough)
  2. System requirements
  3. How to set it up

How it works (code walkthrough)

  1. Configuration: At the beginning of the script, besides some helper functions, variables are defined that store the paths of the data you want to back up and the credentials read from the .env file (see the setup section). A condensed sketch of the whole pipeline follows this list.
  2. Creating the database dumps: The script uses the mysqldump and pg_dump utilities to create dumps of the specified MySQL and PostgreSQL databases running in Docker containers. The dumps are stored in the temporary backup directory. Needless to say, you could add more databases if you want to play around with the script.
  3. Data archiving: An archive is created containing the database dumps from the previous step and whatever else is in the path you provided to the script, except for any folders named postgresql or mysql. You can change or remove that exclusion if needed (search for tar --exclude in the script and remove those arguments).
  4. Mega.io upload: The script uploads the archive to the configured Mega.io account using the mega-put command. By default, archives are uploaded to a /backup path in your cloud storage. Change that as needed (the BACKUP_MEGA_PATH variable).
  5. (Optional) Email notification: After completing the backup process, the script sends an email notification through Mailgun’s API. It constructs the API call with curl, passing the Mailgun API key, domain, sender, recipient, subject, and body. The body of the email is a summary of the backup operation, composed as the process takes place. Here’s an example of an email notification:
    18-04-2023-21:29:18: Creating data dumps for databases.
    18-04-2023-21:29:18: Dumping data for database mysql
    18-04-2023-21:29:19: Created data dump for mysql.
    18-04-2023-21:29:19: Dumping data for database pgsql
    18-04-2023-21:29:20: Created data dump for pgsql.
    18-04-2023-21:29:20: Archiving /path/to/data.
    18-04-2023-21:29:21: Created ./backup/18-04-2023-21:29:18-49b5f1bc30cb0f3b5608cb4ba2811e68-data-backup.tgz. File information:
    18-04-2023-21:29:21: -rw-r--r-- 1 user user 904K Apr 18 21:29 ./backup/18-04-2023-21:29:18-49b5f1bc30cb0f3b5608cb4ba2811e68-data-backup.tgz
    18-04-2023-21:29:21: Uploading ./backup/18-04-2023-21:29:18-49b5f1bc30cb0f3b5608cb4ba2811e68-data-backup.tgz to cloud.
    18-04-2023-21:29:25: Upload complete.
    18-04-2023-21:29:25: Removing ./backup/18-04-2023-21:29:18-49b5f1bc30cb0f3b5608cb4ba2811e68-data-backup.tgz and database dumps.
    18-04-2023-21:29:25: Finished backup for /path/to/data!
    18-04-2023-21:29:25: Sending email with operation logs.
    
  6. Clean up: Finally, the script removes the uploaded archive along with the database dumps, ensuring that no significant residual data is left on the local system. Logs stored in the temporary directory are not removed, as they can be used to trace and debug past runs and possible issues.
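
To make the flow above concrete, here is a condensed, illustrative sketch of the pipeline. It is not the actual script: variable names such as BACKUP_DIR, DATA_PATH, and STAMP, the exact dump commands, and the file naming are assumptions based on the walkthrough (the real script also adds a hash to the archive name and writes logs as it goes).

    #!/bin/sh
    # Condensed sketch of the backup pipeline described above -- not the real script.
    set -eu

    DATA_PATH="$1"                 # data path passed as the first argument
    BACKUP_DIR="./backup"          # local "temp" directory for dumps and archives
    BACKUP_MEGA_PATH="/backup"     # destination folder in the Mega.io cloud
    STAMP="$(date +%d-%m-%Y-%H:%M:%S)"
    ARCHIVE="$BACKUP_DIR/$STAMP-data-backup.tgz"

    # 1. Configuration: read credentials from the .env file next to the script
    . ./.env

    # 2. Database dumps, executed inside the Docker containers
    docker exec "$MYSQL_CONTAINER_NAME" \
      mysqldump -u root -p"$MYSQL_PASSWORD" --all-databases > "$BACKUP_DIR/mysql.sql"
    # pg_dump with no dbname falls back to the database named after the user
    docker exec -e PGPASSWORD="$POSTGRES_PASSWORD" "$PGSQL_CONTAINER_NAME" \
      pg_dump -U "$POSTGRES_USER" > "$BACKUP_DIR/pgsql.sql"

    # 3. Archive the dumps plus the data path, excluding live DB data folders
    tar --exclude='postgresql' --exclude='mysql' \
      -czf "$ARCHIVE" "$BACKUP_DIR"/*.sql "$DATA_PATH"

    # 4. Upload the archive to Mega.io
    mega-put "$ARCHIVE" "$BACKUP_MEGA_PATH"

    # 5. Clean up local artifacts (the logs would be kept for debugging)
    rm -f "$ARCHIVE" "$BACKUP_DIR"/*.sql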

System requirements

The script was implemented and used on systems that meet the following requirements:

  • OS: Ubuntu 18 or newer, with the following packages installed: mysqldump, pg_dump, tar, gzip, curl, and MEGAcmd (see how to get it here). A quick availability check is sketched after this list.
  • The databases run in Docker containers on the host machine. This could be adapted to other setups, but that’s what I needed.
  • You want to/can use Mega.io for cloud storage.
  • The system running the script has enough free disk space, at least equal to the size of the data you want to back up (needed while the database dumps and the archive are created).
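
If you want to sanity-check the prerequisites up front, a quick availability check like the one below can help. This is a minimal one-off sketch, not part of the script; mega-put stands in for the MEGAcmd installation as a whole.

    # One-off check that the required tools are available on the PATH
    for tool in mysqldump pg_dump tar gzip curl mega-put; do
      command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
    done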

Optional:

  • You want to/can use a Mailgun account to send an email with the backup operation logs (an illustrative API call is sketched below).
    • Please note that Mailgun no longer has a free tier, unfortunately.
    • The backup script works without Mailgun; it simply skips sending the email.
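
For reference, the kind of “send message” call the script makes to Mailgun looks roughly like the following. The variable names mirror the .env keys from the setup section below; the subject and LOG_BODY are placeholders for this example.

    # Illustrative Mailgun API call; MAILGUN_API_ENDPOINT would be the
    # https://api.mailgun.net/v3/<your-domain>/messages URL for your domain,
    # and LOG_BODY (hypothetical name) the summary composed during the run.
    curl -s --user "api:$MAILGUN_API_KEY" "$MAILGUN_API_ENDPOINT" \
      -F from="$MAILGUN_FROM_SENDER_NAME <$MAILGUN_FROM_SENDER_EMAIL>" \
      -F to="$MAILGUN_TO_EMAIL" \
      -F subject="Backup report" \
      -F text="$LOG_BODY"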

How to set it up

  1. Create and set up your (free) Mega.io account
    • Familiarize yourself with the MEGAcmd user guide.
    • You might need to log in: mega-login --auth-code=XXXXX your@email.com password.
    • Over time, if you exceed your cloud storage capacity, you can manually remove older archives.
  2. Optional: Create your Mailgun account and obtain your API key and domain
    • Please note that Mailgun no longer has a free tier, unfortunately.
    • The backup script works without Mailgun; it simply skips sending the email.
  3. Make sure you fulfill the system requirements.
  4. Download the script to your server:
    wget https://gitlab.com/ioiste/blog/-/raw/main/assets/2023-04-18-mega.io-backup-script/backup.sh
    
  5. Create a .env file in the same path where you downloaded the script and fill in the following environment variables:
    #MySQL
    MYSQL_PASSWORD=
    MYSQL_CONTAINER_NAME=
       
    #Postgres
    POSTGRES_USER=
    POSTGRES_PASSWORD=
    PGSQL_CONTAINER_NAME=
       
    #Mailgun - OPTIONAL. DO NOT ADD these to the .env file if you don't use Mailgun
    MAILGUN_API_KEY=
    MAILGUN_API_ENDPOINT=
    MAILGUN_FROM_SENDER_NAME=
    MAILGUN_FROM_SENDER_EMAIL=
    MAILGUN_TO_EMAIL=
    
  6. Create a backup directory in the same path where you downloaded the script. It will be used as a “temp” directory for logs and temporary data.
  7. Ensure the script is executable by running the command chmod +x /path/to/backup.sh.
  8. Test the script manually to ensure it works correctly and that you (optionally) receive the email notification:
    /path/to/backup.sh /path/to/data
    
    • Please note that it’s crucial that you provide the path to your data without a trailing /.
  9. Use a scheduler like crontab to run the script at regular intervals. For example, to run the script at 00:00 on every Monday, you could do something like this:
    0 0 * * 1 /path/to/backup.sh /path/to/data >> /path/to/backup/backups.log 2>&1
    
    • The output of the script (including errors, thanks to 2>&1) is appended to a log file in case things go wrong.
    • Tip: You can use a tool like CrontabGuru to easily come up with crontab schedule expressions.
  10. Monitor the email notifications (if you’re using the Mailgun integration) and your backup storage to ensure the process runs smoothly (a quick listing command is shown below), and adjust the script as your needs and server configuration evolve.
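
To spot-check the cloud side from the command line, you can list the configured backup folder with MEGAcmd; /backup below is the default BACKUP_MEGA_PATH mentioned earlier:

    # List the archives currently stored in the cloud backup folder
    mega-ls /backup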

ChatGPT somewhat helped me write this post based on the implementation of the script. I’ve been putting this off for months. Thanks, GPT!

Noticed something wrong or want to say hi? You can get in touch with me at [email protected].