The free forums are no longer in use. They remain available as a read-only archive.
12:43 pm
Hi,
I have a WordPress MU installation with about 30 or more subsites. I would like to configure XCloner to automatically create "partial" backups of each subsite, so that if any one of them gets corrupted, I have the option of restoring only that subsite. How would I go about doing this?
If the above is not possible: is there a way to use a FULL multisite's backup and restore only one of the subsites from it?
Thanks!
J
XCloner can only create automated partial backups of the file system; for database tables, that option is only available when running the backup from the Generate Backup screen. So the short answer is no.
That said, if you do have a full backup in place, restoring a subsite is possible: I advise restoring the full backup to a separate location, then taking only the files and tables related to that specific site and restoring them to the original.
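A minimal sketch of that selective restore, using a toy archive so it is runnable. The names here (demo_full.tar, blog ID 2, the restore directories) are made up for illustration; the real detail is that in WordPress MU a subsite's uploads live under wp-content/blogs.dir/&lt;ID&gt; and its database tables share the wp_&lt;ID&gt;_ prefix in the SQL dump:

```shell
#!/bin/sh
set -e

BLOG_ID=2   # the subsite to recover (example value)

# --- build a stand-in "full backup" so this sketch is self-contained ---
mkdir -p site/wp-content/blogs.dir/1 "site/wp-content/blogs.dir/$BLOG_ID"
echo "subsite $BLOG_ID upload" > "site/wp-content/blogs.dir/$BLOG_ID/file.txt"
tar -cf demo_full.tar -C site .

# 1) Restore the full backup to a separate location, not over the live site.
mkdir -p restore_tmp
tar -xf demo_full.tar -C restore_tmp

# 2) Copy out only that subsite's files.
cp -R "restore_tmp/wp-content/blogs.dir/$BLOG_ID" recovered_files

# 3) For the database, filter the dump for tables with the wp_${BLOG_ID}_
#    prefix and import only those (not shown; dump filename varies by setup).
cat recovered_files/file.txt
```

Step 3 is left as a comment because the dump's location inside an XCloner archive depends on the backup settings; the point is only to restore the matching-prefix tables, nothing else.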
Hope it helps! Ovidiu
4:54 am
Hi Admin,
Thank you for responding to my last question. You were right that it's easy enough to selectively restore a subsite from a full backup, so that is what I worked towards.
I've got my backups set up the way I want, with cron working just fine.
The problem I am having is that none of my files are being transferred to Amazon S3. The access and secret keys are there, and XCloner was even able to create a new bucket in my S3 account. But the bucket is empty.
I have tried triggering the backups manually and having them triggered by cron. My backup files are about 1.2–1.5 gigabytes and are generated OK, but they do not get transferred to S3.
What could I be doing wrong?
Here's a link to the content of xcloner.log
NOTE: I have set up my email both in cPanel's cron job and in XCloner's option to email the cron log. I have received no email from either. Does that mean the script is still running? It's been more than 3 hours already…
Hope you can help, and thanks in advance!
EDIT:
Right after I hit post, the cron daemon came in. Hah! Here's what it has to say: http://pastebin.ca/2253657
Any advice?
11:03 am
2:31 pm
Actually, yeah, copying files over to Amazon for the small backup was indeed successful.
The first solution that comes to mind is to split the backups into smaller pieces, but I am hoping for one that does not require this. Ideas?
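For reference, the split-into-pieces workaround can be done with standard tools outside XCloner. A minimal sketch with made-up sizes and file names (a real multi-gigabyte archive would use something like -b 500m instead of the tiny demo values):

```shell
#!/bin/sh
set -e

# Demo input: a 1 MiB dummy file standing in for the large backup archive.
head -c 1048576 /dev/zero > backup.tar

# Cut the archive into pieces small enough to transfer individually.
split -b 300k backup.tar backup.tar.part.
ls backup.tar.part.*

# After fetching the pieces back, reassemble before restoring:
cat backup.tar.part.* > backup_rejoined.tar
cmp backup.tar backup_rejoined.tar && echo "archive intact"
```

The reassembled file is byte-identical to the original, so the restore procedure itself does not change; only the transfer is done in chunks.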
Backup Done
Backup file: /home/processw/public_html/administrator/backups/backup_2012-11-22_00-54_www.processworxportal.com.au-sql-nodrop.tar
Total backup size: 20.94 MB
AMAZON S3: Starting communication with the Amazon S3 server... ssl mode 0
AMAZON S3: File copied to {PWX_Portal_Backups}//backup_2012-11-22_00-54_www.processworxportal.com.au-sql-nodrop.tar
Deleting older backups than 6 days:
12:01 pm