
    Performing regular backups of a website or database (DB) on a Linux server is an important practice for ensuring data security and integrity. Here are some of the main reasons to perform backups using scripts:

    1. Crashes and disaster recovery: Websites and databases can run into various problems such as server crashes, malicious attacks, developer errors, or accidental deletion of data. Regular backups allow you to restore the data and the website to the state they were in before the problem occurred.

    2. Protection against data loss: Backups guarantee data safety in case of deletion, damage, or loss. If a website or database contains important information such as user data, orders, content, or settings, a backup helps minimize the loss and restore functionality.

    3. Testing and development: Data backups can be used for testing and development purposes. They allow you to create copies of the production environment to test new features, fix bugs, or experiment without the risk of losing real data.

    4. Migrating to a new host: When moving a website to a new host or changing the server infrastructure, backups make it possible to preserve the data and transfer it to the new platform.

    5. Compliance with security regulations: In some industries, such as finance or medicine, there are rules and regulations that require creating and storing backup copies to ensure data security. Performing backups with scripts makes it easy to automate this process and meet the corresponding requirements.

    6. Recovery from human error: Sometimes data is deleted or modified as the result of an unintentional human error. Backups allow you to return the data to a previous state and roll back to a stable version.

    Performing backups using scripts allows you to automate and simplify this process.


    Script.

    For example, suppose you need to perform periodic website backups to external FTP storage. You can use a bash script for this. To use the script and perform backups, you need to have wput installed. The script can be downloaded from GitHub.
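    On Debian or Ubuntu based systems wput can usually be installed from the standard repositories; on other distributions the package name may differ, so check your package manager. The commands below are an assumption for apt-based systems:

    # Install wput on an apt-based system (assumed package name "wput")
    apt-get update
    apt-get install wput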

    We perform the following actions. Go to the /backup/sql directory: a dedicated directory on the server where we keep our database backups. Using the mysqldump utility, we create a dump of the DATABASE database, connecting as root with the password PASS. The output of mysqldump is piped to gzip with the maximum compression level (-9), and the result is saved to a file whose name contains the current date and a .gz extension. We then run the wput command, which sends the database backup file (created via date +sql-dump.%d%m%y.gz) over FTP to our remote server, using an FTP connection with the username userXXXXXX, the password PASS, and the IP address of the remote server.

    We also have a separate directory where we store backup copies of the website files. Go to the site backup directory /backup/www. We use the tar command to create an archive of the website files in /var/www/html/, preserving file permissions with the p flag and compressing the archive with gzip via the z flag. The result is saved to a file whose name contains the current date and a .tar.gz extension.

    Finally, we run the wput command to send the archive of the website files (created via date +files.%d%m%y.tar.gz) over FTP to our remote server, again passing the FTP connection details: the username userXXXXXX, the password PASS, and the IP address of the remote server or external FTP storage.

    Thus, we create backup copies of the database and the website files and send them to a remote FTP server for safe keeping.

    #!/bin/bash
    # Back up the database
    cd /backup/sql
    # Dump the DATABASE database as root (password PASS), compress with gzip -9
    # and save it under a name containing the current date
    mysqldump -u root -pPASS DATABASE | gzip -c -9 > `date +sql-dump.%d%m%y.gz`
    # Upload the dump to the remote FTP server
    wput `date +sql-dump.%d%m%y.gz` ftp://userXXXXXX:PASS@IP/

    # Back up the website files
    cd /backup/www
    # Archive /var/www/html/, preserving permissions (p) and compressing with gzip (z)
    tar -cpzf /backup/www/`date +files.%d%m%y`.tar.gz /var/www/html/
    # Upload the archive to the remote FTP server
    wput `date +files.%d%m%y.tar.gz` ftp://userXXXXXX:PASS@IP/
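
    To actually recover from such backups, a minimal sketch might look like this. The date in the file names is only an example; DATABASE, root and PASS are the same placeholders as above, and the archive is assumed to have been created by the script above:

    # Restore the database dump for a given date (file name is an example)
    gunzip -c sql-dump.010124.gz | mysql -u root -pPASS DATABASE
    # Unpack the website files archive back into place; tar stored the paths
    # relative to /, so extracting with -C / restores /var/www/html/
    tar -xpzf files.010124.tar.gz -C /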

    Notes.

    You can create the script and put it in /etc/cron.daily/ for daily backups with the archiving date in the file name. Keep in mind that if the script runs automatically, there can be negative consequences: after some time, when the space on the server or on the external FTP storage fills up, the backups will stop being created! This simple script should only be used if you yourself periodically check and/or delete old backups, by their creation date or by the date in the archive name. Running the script from cron and more complex options will be discussed in another article.
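
    For instance, a minimal sketch of installing the script for daily execution and cleaning up old local backups could look like this. The script name backup-site and the 30-day retention period are assumptions, adjust them to your needs; old files on the external FTP storage still have to be removed separately:

    # Make the script in /etc/cron.daily/ executable (file name is an example)
    chmod +x /etc/cron.daily/backup-site
    # Remove local backups older than 30 days (assumed retention period)
    find /backup/sql -name 'sql-dump.*.gz' -mtime +30 -delete
    find /backup/www -name 'files.*.tar.gz' -mtime +30 -delete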