Backup files to Google Drive using PHP
I have a server and a domain name on GoDaddy.
I want to create a backup of my files and upload it to Google Drive, so that all of my site files and my database data end up on Google Drive.
I use PHP, and MySQL for my database.
After some research, I found the tutorial "Automatically backing up your web server files to GoogleDrive with PHP" and followed it.
I downloaded the google-api-php-client files from the backuptogoogledrive repository, and I have a client ID, a client secret, and an authCode.
I edited settings.inc.php and put in my own client ID, client secret, and authCode, as well as my MySQL username, password, and hostname.
According to the backuptogoogledrive page, the script should create a .tar.gz archive containing my website files, upload that archive to my Google Drive, and then do the same for my database.
<?php
set_time_limit(0);
ini_set('memory_limit', '1024M');
require_once("google-api-php-client/src/Google_Client.php");
require_once("google-api-php-client/src/contrib/Google_DriveService.php");
include("settings.inc.php");
if($authCode == "") die("You need to run getauthcode.php first!\n\n");
/* PREPARE FILES FOR UPLOAD */
// Use the current date/time as unique identifier
$uid = date("YmdHis");
// Create tar.gz file
shell_exec("cd ".$homedir." && tar cf - ".$sitedir." -C ".$homedir." | gzip -9 > ".$homedir.$fprefix.$uid.".tar.gz");
// Dump database
shell_exec("mysqldump -u".$dbuser." -p".$dbpass." ".$dbname." > ".$homedir.$dprefix.$uid.".sql");
shell_exec("gzip ".$homedir.$dprefix.$uid.".sql");
/* SEND FILES TO GOOGLEDRIVE */
$client = new Google_Client();
// Get your credentials from the APIs Console
$client->setClientId($clientId);
$client->setClientSecret($clientSecret);
$client->setRedirectUri($requestURI);
$client->setScopes(array("https://www.googleapis.com/auth/drive"));
$service = new Google_DriveService($client);
// Exchange authorisation code for access token
if(!file_exists("token.json")) {
// Save token for future use
$accessToken = $client->authenticate($authCode);
file_put_contents("token.json",$accessToken);
}
else $accessToken = file_get_contents("token.json");
$client->setAccessToken($accessToken);
// Upload file to Google Drive
$file = new Google_DriveFile();
$file->setTitle($fprefix.$uid.".tar.gz");
$file->setDescription("Server backup file");
$file->setMimeType("application/gzip");
$data = file_get_contents($homedir.$fprefix.$uid.".tar.gz");
$createdFile = $service->files->insert($file, array('data' => $data, 'mimeType' => "application/gzip",));
// Process response here....
print_r($createdFile);
// Upload database to Google Drive
$file = new Google_DriveFile();
$file->setTitle($dprefix.$uid.".sql.gz");
$file->setDescription("Database backup file");
$file->setMimeType("application/gzip");
$data = file_get_contents($homedir.$dprefix.$uid.".sql.gz");
$createdFile = $service->files->insert($file, array('data' => $data, 'mimeType' => "application/gzip",));
// Process response here....
print_r($createdFile);
/* CLEANUP */
// Delete created files
unlink($homedir.$fprefix.$uid.".tar.gz");
unlink($homedir.$dprefix.$uid.".sql.gz");
?>
The problem now is that two archives end up on my Google Drive: the one for the database is fine, but the one that should contain my site files is empty.
How can I solve this problem?
These are the relevant settings from my settings.inc.php:
// User home directory (absolute)
$homedir = "/home/mhmd2991/public_html/"; // If this doesn't work, you can provide the full path yourself
// Site directory (relative)
$sitedir = "public_html/";
Solution 1:[1]
It seems the backup script is unable to find the directory where the site files are stored.
Quoting from the tutorial you followed to do the backup:
// User home directory (absolute)
$homedir = trim(shell_exec("cd ~ && pwd"))."/"; // If this doesn't work, you can provide the full path yourself
// Site directory (relative)
$sitedir = "www/";
First, ensure that $sitedir is set to the relative path (from the home directory) of the directory that holds the site files. It could be something other than www/, for example public_html/ for a website hosted on GoDaddy.
If the above is correct, try setting the $homedir variable manually to the absolute path of the home directory.
Update
$homedir is the home directory and $sitedir is the website root relative to $homedir, so looking at the code you posted there is quite likely a mistake; the two variables should be:
// User home directory (absolute)
$homedir = "/home/mhmd2991/"; // <-- FIXED HERE
// Site directory (relative)
$sitedir = "public_html/";
This assumes your website root directory is public_html and that it is located inside your home directory mhmd2991.
Again, make sure your website root directory is actually public_html and not www or html or anything else. Check it from the terminal.
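As a quick way to confirm the paths before running the backup, here is a minimal sketch (not part of the tutorial; it assumes the $homedir and $sitedir names from settings.inc.php) that checks that both directories exist and prints roughly the tar command that would be run:
<?php
// Sketch: verify the backup paths before the tar command runs.
// Variable names and values are assumptions based on the question's settings.inc.php.
$homedir = "/home/mhmd2991/";   // absolute home directory (assumed)
$sitedir = "public_html/";      // site root, relative to $homedir (assumed)

if (!is_dir($homedir)) {
    die("Home directory not found: ".$homedir."\n");
}
if (!is_dir($homedir.$sitedir)) {
    die("Site directory not found: ".$homedir.$sitedir."\n");
}

// Roughly the command the backup script will build, so you can inspect it:
echo "cd ".$homedir." && tar cf - ".$sitedir." -C ".$homedir." | gzip -9 > ".$homedir."backup.tar.gz\n";
?>
If the second check fails, $sitedir is pointing at the wrong place, which matches the empty archive described in the question.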
Solution 2:[2]
Don't you miss a hyphen in front of cf in your tar command? Isn't it supposed to be tar -cf? You have tar cf. I don't think tar recognises it as a command parameter, so no file is created.
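If you want to try this suggestion, the change is limited to the shell_exec() line from the question (a sketch; it relies on the same $homedir, $sitedir, $fprefix and $uid variables as the original script, and note that most GNU tar builds accept both the old cf style and the hyphenated -cf style, so this may not change the outcome on its own):
// Original line from the question:
// shell_exec("cd ".$homedir." && tar cf - ".$sitedir." -C ".$homedir." | gzip -9 > ".$homedir.$fprefix.$uid.".tar.gz");
// The same call with the hyphenated option style suggested in this answer:
shell_exec("cd ".$homedir." && tar -cf - ".$sitedir." -C ".$homedir." | gzip -9 > ".$homedir.$fprefix.$uid.".tar.gz");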
Solution 3:[3]
I was able to run this script successfully with root access (available on VPS / dedicated servers). But on a shared server, where root access is not available, the script is aborted by the server. Shared servers restrict how long a script may run (usually less than one minute, though it depends on the hosting).
Try running the script manually over SSH and you will see that it is aborted by the server, which is why the created archive ends up with no files in it.
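To see where the host cuts the script off, one option (a sketch, not part of the original script; the log file name is arbitrary) is to write a timestamped marker after each step and inspect the log after a run:
<?php
// Sketch: timestamped progress markers so the log shows which step the host
// killed the script at. The log file name is arbitrary.
function backup_log($message) {
    file_put_contents(__DIR__."/backup.log", date("Y-m-d H:i:s")." ".$message."\n", FILE_APPEND);
}

backup_log("start, max_execution_time=".ini_get("max_execution_time"));
// ... tar step ...
backup_log("site archive created");
// ... mysqldump step ...
backup_log("database dump created");
// ... Google Drive uploads ...
backup_log("uploads finished");
?>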
Solution 4:[4]
On GoDaddy, as on other platforms, shared hosting generally does not provide shell access or does not allow SSH access. You will often run into problems executing commands in the shell, because shell-executable commands are banned by the hosting provider.
This helps them allocate resources to all shared hosting users more evenly without compromising the isolation between accounts.
You can still find options to turn on SSH and similar shell access on different hosting platforms. Refer to the link below to activate SSH on GoDaddy: https://in.godaddy.com/help/enable-ssh-for-my-linux-hosting-account-16102
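Before relying on shell_exec() at all, you could check whether the host has disabled it via disable_functions. A small sketch (not part of the original script):
<?php
// Sketch: check whether shell_exec() is usable on this host before the
// backup script depends on it.
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));

if (!function_exists('shell_exec') || in_array('shell_exec', $disabled, true)) {
    die("shell_exec() is not available on this host; the tar and mysqldump steps will not work.\n");
}
echo "shell_exec() is available.\n";
?>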
Solution 5:[5]
Most quality hosting sites allow users to archive their program files/directory/site. Similarly, if you are using databases, the database program usually has a link where you can archive and/or export tables or an entire database. You will have to look up the instructions, as the procedure varies between platforms, database systems, and hosting sites.
Once you have archived your files/directory/site and/or databases, you can use an FTP program to download them to your computer. You can then copy them to another computer or the cloud.
My preference is to archive whatever I want to export into zip or tar files. For security reasons I also prefer to save my database files to an external drive or put them on a recordable DVD.
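If the hosting panel does not offer an archive option and shell access is blocked, one alternative (a sketch only, not part of this answer) is to build the zip from PHP itself with the built-in ZipArchive class; the paths below are placeholders:
<?php
// Sketch: create a zip of the site directory with PHP's built-in ZipArchive,
// which works even when shell_exec() is disabled. Paths are placeholders.
$sitedir = "/home/mhmd2991/public_html";
$zipfile = "/home/mhmd2991/site-backup-".date("YmdHis").".zip";

$zip = new ZipArchive();
if ($zip->open($zipfile, ZipArchive::CREATE) !== true) {
    die("Cannot create ".$zipfile."\n");
}

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($sitedir, FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    // Store each file under a path relative to the site directory
    $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($sitedir) + 1));
}
$zip->close();
echo "Created ".$zipfile."\n";
?>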
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source
---|---
Solution 1 |
Solution 2 | Tamali
Solution 3 | user20152015
Solution 4 | Vikash Mishra
Solution 5 | Phil R