Creating a backup using cron from the command line.
To automatically back up site data on your VPS, connect to the VPS as root over SSH or SFTP. You will need to create several folders in the root directory (/) to house the backups. Create the following folders:
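The exact folder names are up to you; a minimal sketch, assuming a single /backup folder for the scripts and archives with a subfolder per backup type (these names are an assumption, adapt them to your server), could be:

```shell
# Hypothetical layout; /backup, /backup/www and /backup/mysql
# are assumed names -- adjust them to your own server.
mkdir -p /backup            # holds the backup scripts and archives
mkdir -p /backup/www        # site-file archives
mkdir -p /backup/mysql      # database dumps
```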
Now, in the /backup folder, create a script that cron will run to create the backups. Create a new file and name it, for example, www-backup.sh or mysql-backup.sh. You can use the backup scripts described earlier in this article. Note that you need to replace "yoursqlpassword" with the root password for accessing the databases, or with the password of the user for the desired database.
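As an illustration, www-backup.sh could be created like this. This is a sketch only: the web root /home/admin/web and the /backup destination are assumptions to adapt to your layout.

```shell
# Create a hypothetical www-backup.sh; paths are assumptions.
mkdir -p /backup
cat > /backup/www-backup.sh <<'EOF'
#!/bin/sh
# Archive the site files into a date-stamped tarball
tar -czf /backup/www-$(date +%F).tar.gz /home/admin/web
EOF
chmod +x /backup/www-backup.sh
```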
Creating two separate scripts for the site and the database lets you run them on different schedules. For example, backing up the site once a week is usually enough, while the database is best backed up daily.
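A matching mysql-backup.sh might look like the following sketch; the "yoursqlpassword" placeholder and the dump path are assumptions:

```shell
# Create a hypothetical mysql-backup.sh; replace yoursqlpassword
# with the real database password before use.
mkdir -p /backup
cat > /backup/mysql-backup.sh <<'EOF'
#!/bin/sh
# Dump all databases and compress the result with a date stamp
mysqldump -u root -p'yoursqlpassword' --all-databases | gzip > /backup/mysql-$(date +%F).sql.gz
EOF
chmod +x /backup/mysql-backup.sh
```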
Adding scripts to the cron scheduler.
After creating the scripts for backing up the database and the site files (for example, www-backup.sh and mysql-backup.sh), add tasks to the crontab file. Open /etc/crontab and add the following lines:
00 2 * * 1 root sh /backup/www-backup.sh
00 3 * * * root sh /backup/mysql-backup.sh
To run the jobs as a user other than root, replace root with that user's name.
In this example, the site files are archived every Monday at 2:00 AM (in cron, day-of-week 1 is Monday) and the database is backed up every day at 3:00 AM. You can change these intervals to suit your needs. After adding the jobs, restart the cron daemon for the changes to take effect. To do this, use the command:
service cron restart
Please note that these scripts keep only the most recent backups, according to the parameters set in the scripts themselves.
Creating a scheduled backup for LAMP/LEMP without a script.
In this example, a new backup is created as "backup.new.tar.gz" and the previous copy is renamed to "backup.old.tar.gz". Each time the job runs, the oldest copy is deleted, the previous backup becomes "backup.old.tar.gz", and a fresh "backup.new.tar.gz" is created.
Here are two crontab jobs that archive only the site files. They can be used for different sites, staggering the archiving at different times (adjust the paths for each site). Add them with crontab -e; entries in a user's crontab omit the user field.
00 02 * * 0 cd /backup/; rm -f /backup/backup.old.tar.gz; mv /backup/backup.new.tar.gz backup.old.tar.gz; tar -czvf backup.new.tar.gz /home/admin/web
00 03 * * 0 cd /backup/; rm -f /backup/backup.old.tar.gz; mv /backup/backup.new.tar.gz backup.old.tar.gz; tar -czvf backup.new.tar.gz /home/admin/web
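The rotation chain in these jobs can be tried by hand. The sketch below mirrors it with temporary paths: /tmp/rotation-demo stands in for /backup and a stub directory stands in for the site files, both assumptions for the demonstration.

```shell
#!/bin/sh
# Demonstrate the new/old rotation with throwaway paths (assumptions).
B=/tmp/rotation-demo
mkdir -p "$B/site"
echo "hello" > "$B/site/index.html"

rotate_and_archive() {
    cd "$B"
    rm -f backup.old.tar.gz                      # drop the oldest copy
    if [ -f backup.new.tar.gz ]; then
        mv backup.new.tar.gz backup.old.tar.gz   # previous run becomes "old"
    fi
    tar -czf backup.new.tar.gz site              # fresh archive becomes "new"
}

rotate_and_archive   # first run: only backup.new.tar.gz exists
rotate_and_archive   # second run: both new and old archives exist
ls "$B"
```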
Automated backup with cron and a script.
Another option is to back up the website and the database with cron running a single script. In this example, the task runs every Tuesday at 21:00 and executes the script.
Open the /etc/crontab file (for example, with nano /etc/crontab) and add the following line:
00 21 * * 2 root sh /home/admin/backup.sh
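The contents of /home/admin/backup.sh are not shown here; one plausible sketch, combining the site archive and the database dump from the earlier sections (the paths and the "yoursqlpassword" placeholder are assumptions), is:

```shell
# Write a hypothetical combined backup.sh; adapt paths and password.
mkdir -p /home/admin /backup
cat > /home/admin/backup.sh <<'EOF'
#!/bin/sh
# Archive the site files
tar -czf /backup/www-$(date +%F).tar.gz /home/admin/web
# Dump and compress all databases (replace yoursqlpassword)
mysqldump -u root -p'yoursqlpassword' --all-databases | gzip > /backup/db-$(date +%F).sql.gz
EOF
chmod +x /home/admin/backup.sh
```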