Data is an essential ingredient for any business, no matter how large or small. Losing data can be very costly, so it is important to take backups and keep the data secure. With that in mind, let us walk through how to create a database backup file and securely upload it to a cloud platform like AWS using Laravel.
Artisan command for creating the DB backup and uploading it to AWS
We will write an artisan command that generates the backup file and then uploads it to AWS, run on a schedule via cron. It uses the mysqldump tool that ships with the MySQL installation; this tool dumps the database into a .sql file. So let us start by creating the artisan command. In the console, run:
php artisan make:command DBFileUploadAws
This will create a class named after the command (here DBFileUploadAws.php) in App\Console\Commands.
namespace App\Console\Commands;
use Illuminate\Console\Command;
use Symfony\Component\Process\Exception\ProcessFailedException;
use Symfony\Component\Process\Process;
class DBFileUploadAws extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'file:dailyBackupOnS3';
/**
* The console command description.
* @var string
*/
protected $description = 'Copy DB backup file to Amazon S3 daily ';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function handle()
{
}
}
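Generating the command class alone does not make cron run it; the command also has to be registered with Laravel's scheduler. A minimal sketch (the daily 01:00 run time is our choice for illustration, not from the article; the signature matches the final class below):

```php
// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    // Run the backup command once a day at 01:00 server time
    $schedule->command('file:dailyBackupOnS3')->dailyAt('01:00');
}
```

A single crontab entry on the server then triggers the scheduler every minute, for example: * * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1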
Now we add all the code we want to execute when the cron job runs inside the handle function. To generate a database backup file and upload it to AWS, add code to the handle function of the command class (here DBFileUploadAws.php) by following the steps below:
Create a Database Backup File
- To dump the database and save it to an SQL file, use the mysqldump command. For example,

mysqldump -v -u[user] -h[host] -p[password] [dbName] > [file_path]

In PHP, this becomes:

$mysql_dump_command = "mysqldump -v -u{$dbUser} -h{$dbHost} -p{$dbPassword} {$dbName} > $dbBackupFile";

where the variables $dbUser, $dbHost, $dbPassword and $dbName are config variables (their values come from the .env file via the config files) and $dbBackupFile is the name of the (uncompressed) file to be created.
- Now, to compress the file, we use the command gzip -9 -f [filename]. Therefore,

$gzip_command = "gzip -9 -f $dbBackupFile";
- Next, to run these commands from the console, include the Process classes at the top of the file and add the following code to the handle function:

//include these at the top
use Symfony\Component\Process\Process;
use Symfony\Component\Process\Exception\ProcessFailedException;

//add this code in the handle function
$mysql_dump_process = Process::fromShellCommandline($mysql_dump_command);
$mysql_dump_process->mustRun();
$gzip_process = Process::fromShellCommandline($gzip_command);
$gzip_process->mustRun();
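Interpolating credentials directly into a shell string is fragile if, say, the password contains shell metacharacters. A hedged sketch of building the same commands with PHP's escapeshellarg(); the variable names mirror the article, but the sample values here are made up:

```php
<?php
// Sample values for illustration only; in the cron job these come from config.
$dbUser = 'backup_user';
$dbHost = '127.0.0.1';
$dbPassword = "p@ss'word"; // contains a quote that would break naive interpolation
$dbName = 'app_db';
$dbBackupFile = '/tmp/dbbackup.sql';

// Each argument is quoted safely before being placed into the shell command.
$mysql_dump_command = sprintf(
    'mysqldump -v -u%s -h%s -p%s %s > %s',
    escapeshellarg($dbUser),
    escapeshellarg($dbHost),
    escapeshellarg($dbPassword),
    escapeshellarg($dbName),
    escapeshellarg($dbBackupFile)
);
$gzip_command = 'gzip -9 -f ' . escapeshellarg($dbBackupFile);

echo $mysql_dump_command . "\n" . $gzip_command . "\n";
```

The resulting strings can be passed to Process::fromShellCommandline() exactly as in the snippet above.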
Upload The File to AWS S3
Amazon S3 setup
- To create a new AWS account, use the link: https://portal.aws.amazon.com/billing/signup or, if you already have an account, sign in to the AWS Management Console.
- After successfully logging in, create a bucket using the steps below:
- In the navigation bar select Services → S3 (or go directly to the link: https://s3.console.aws.amazon.com/s3/home)
- Click on 'Create Bucket', enter the bucket name and region, make the required settings and create the bucket.
- Now go to the bucket and, to generate a bucket policy, go to Permissions → Bucket Policy to find the policy editor.
- Click on the policy generator and add the details. For uploading files to the bucket, select the policy type 'S3 Bucket Policy' and the service 'Amazon S3', then click on 'Add Statement'.
- Click on 'Generate Policy'; on successful policy generation, a pop-up will display the policy details.
Steps for copying the file to S3 using Laravel
- Install the s3 package via composer, using command: composer require league/flysystem-aws-s3-v3
- Now add the access key ID and secret access key to the project's .env file. Their values can be found under 'My Security Credentials' in the navigation bar of the AWS Management Console.
AWS_ACCESS_KEY_ID=xxxx
AWS_SECRET_ACCESS_KEY=xxxx
AWS_DEFAULT_REGION=region
AWS_BUCKET=bucket name
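These .env values are read by the s3 disk defined in config/filesystems.php. Laravel ships with this disk by default; for reference, the entry looks like the following (matching Laravel's defaults, shown here as a sketch):

```php
// config/filesystems.php — the 's3' disk that Storage::disk('s3') resolves to
's3' => [
    'driver' => 's3',
    'key'    => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
],
```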
- To upload the file from local storage to the Amazon S3 bucket, add the following code to the handle function inside the cron job:

$copyFile = Storage::disk('s3')->put($s3Path.'/'.$fileName, file_get_contents($localFile));

where:
$s3Path = folder or path name in the bucket
$fileName = name to save the file as in the cloud
$localFile = path to the file in local storage
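Since a backup that silently fails to upload is worthless, it can be worth confirming the object actually landed in the bucket. A minimal sketch using the Storage facade; the path and file names here are hypothetical examples, not from the article:

```php
use Illuminate\Support\Facades\Storage;

// Hypothetical values for illustration; in the cron job these come from config.
$s3Path    = 'db-backups';
$fileName  = 'dbbackup.sql.gz';
$localFile = storage_path('app/backups/dbbackup.sql.gz');

// Upload, then confirm the object exists on the s3 disk.
Storage::disk('s3')->put($s3Path.'/'.$fileName, file_get_contents($localFile));

if (! Storage::disk('s3')->exists($s3Path.'/'.$fileName)) {
    // log or alert: the backup did not reach S3
}
```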
So, what does the code look like?
The code in the cron job class, DBFileUploadAws.php, will look like the following. We have the command name, the command description and the handle function containing the code to be executed by the cron job.
namespace App\Console\Commands;
use Illuminate\Console\Command;
use League\Flysystem\MountManager;
use Illuminate\Support\Facades\Storage;
use Illuminate\Http\File;
use Illuminate\Support\Facades\Config;
use Symfony\Component\Process\Exception\ProcessFailedException;
use Symfony\Component\Process\Process;
class DBFileUploadAws extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'file:dailyBackupOnS3';
/**
* The console command description.
*
* @var string
*/
protected $description = 'Copy DB backup file to Amazon S3 daily ';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function handle()
{
$dbUser = config('database.connections.mysql.username');
$dbPassword = config('database.connections.mysql.password');
$dbHost = config('database.connections.mysql.host');
$dbName = config('database.connections.mysql.database');
$dbBackupFolder = storage_path('app/'.config("filesystems.local_db_backup_folder")); //local folder for backup file
$dbBackupFile = $dbBackupFolder ."/dbbackup.sql"; //uncompressed file
$dbBackupFileCompressed = $dbBackupFolder .'/dbbackup.sql.gz'; //compressed file
$localFilePath= $dbBackupFile; //local backup file
$s3Folder= config("filesystems.s3_backup_folder"); //folder or path for s3 file
//system command to backup database
$mysql_command = "mysqldump -v -u{$dbUser} -h{$dbHost} -p{$dbPassword} {$dbName} > $dbBackupFile";
$gzip_command = "gzip -9 -f $dbBackupFile";
try {
$process_mysql = Process::fromShellCommandline($mysql_command);
$process_mysql->mustRun();
$process_gzip = Process::fromShellCommandline($gzip_command);
$process_gzip->mustRun();
} catch (ProcessFailedException $exception) {
$errorOcurred= $exception->getMessage();
return $this->error($errorOcurred);
}
//check if the compressed file exists locally (gzip -f replaces the .sql file with .sql.gz)
$fileFound = file_exists($dbBackupFileCompressed);
if($fileFound == true){
$copyTos3 = Storage::disk('s3')->put($s3Folder .'/dbbackup.sql.gz', file_get_contents($dbBackupFileCompressed));
}
}
}
Thus, Laravel makes it simple to create database backup files and wrap the logic in artisan commands, and its flysystem-aws-s3-v3 package makes uploading files to S3 effortless. Backing up the most important asset of a business, its data, and saving it securely to a cloud like AWS means the business owners can sit back and relax.