Automate WP MU subsite backups | WordPress Support | Forum Archive

The free forums are no longer in use. They remain available as a read-only archive.

Automate WP MU subsite backups
September 5, 2012
12:43 pm
Guest

Hi,

I have a WordPress MU installation with about 30 or more subsites. I would like to configure XCloner to automatically create "partial" backups for each of the subsites, so that in case any one of the subsites gets corrupted, I have the option of restoring only that subsite. How would I go about doing this?

If the above is not possible, is there a way to take a full multisite backup and restore only one of the subsites from it?

Thanks!

J

September 5, 2012
4:08 pm
Avatar
Ovidiu Liuta
Admin
Forum Posts: 2484
Member Since:
September 26, 2010
sp_UserOfflineSmall Offline

XCloner can only create automated partial backups of the file system; for database tables, that option is only available when running the backup from the Generate Backup screen. So the short answer would be no.

 

Anyway, if you do have a full backup in place, restoring a subsite is possible: I advise restoring the full backup to a separate location, then taking only the files/tables related to that specific site and restoring them to the original installation.
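For illustration only, a rough sketch of that "pull out one subsite" step could look like the code below. It assumes a plain .tar archive, the usual multisite upload paths (wp-content/blogs.dir/<ID>/ or wp-content/uploads/sites/<ID>/), and the default wp_<ID>_ table prefix; the file names, paths, and prefix in your own backup may differ.

    # Illustrative only: pull one subsite's pieces out of a full multisite backup.
    # Assumes a .tar archive, subsite uploads under wp-content/blogs.dir/<ID>/
    # or wp-content/uploads/sites/<ID>/, and tables prefixed wp_<ID>_.
    import re
    import tarfile

    SITE_ID = 5                                # hypothetical subsite ID
    ARCHIVE = "full-multisite-backup.tar"      # hypothetical backup archive
    SQL_DUMP = "database-sql.sql"              # hypothetical full SQL dump

    # 1) Extract only that subsite's upload directories from the archive.
    prefixes = (
        f"wp-content/blogs.dir/{SITE_ID}/",
        f"wp-content/uploads/sites/{SITE_ID}/",
    )
    with tarfile.open(ARCHIVE) as tar:
        members = [m for m in tar.getmembers() if m.name.startswith(prefixes)]
        tar.extractall(path="subsite-restore", members=members)

    # 2) Keep only the statements touching wp_<ID>_* tables from the SQL dump.
    table_re = re.compile(rf"`?wp_{SITE_ID}_\w+`?")
    with open(SQL_DUMP) as src, open(f"subsite-{SITE_ID}.sql", "w") as dst:
        for statement in src.read().split(";\n"):
            if table_re.search(statement):
                dst.write(statement + ";\n")

The extracted files and the filtered SQL file can then be restored into the original installation.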

 

Hope it helps! Ovidiu

November 21, 2012
4:54 am
Guest

Hi Admin,

Thank you for responding to my last question. I agreed that it is easy enough to selectively restore a subsite from a full backup, so that is what I worked towards.

I've got my backups set up the way I want, with cron working just fine.

The problem I am having is that none of my files are being transferred to Amazon S3. The access and secret keys are there, and XCloner was even able to create a new bucket in my S3 account. But the bucket is empty.

I have tried triggering the backups manually and having them triggered by cron. My backup files are about 1.2 – 1.5 gigabytes, and they are getting generated OK. But they do not get transferred to S3.

What could I be doing wrong?

Here's a link to the content of xcloner.log 

NOTE: I have set up my email address both in cPanel's cron job and in XCloner's option to email the cron log. I have received no email from either; does that mean the script is still running? It's been more than 3 hours already…

Hope you can help, and thanks in advance!

EDIT:

Right after I hit post, the cron daemon comes in. Hah! Here's what it has to say: http://pastebin.ca/2253657

Any advice?

November 21, 2012
9:05 am
Avatar
Ovidiu Liuta
Admin
Forum Posts: 2484
Member Since:
September 26, 2010
sp_UserOfflineSmall Offline

Seems to be a timeout issue between your current server and Amazon S3. Try generating a smaller backup, like a database-only one, and see how it goes… Ovidiu
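(For context only: single long-running uploads are where such timeouts usually show up. Below is a minimal sketch, separate from XCloner's own transfer code, of how a large file can be sent to S3 in smaller multipart chunks using boto3; the bucket and file names are made up.)

    # Illustrative only: upload a large backup to S3 in multipart chunks
    # instead of one long request. Not XCloner code; names are hypothetical.
    import boto3
    from boto3.s3.transfer import TransferConfig

    s3 = boto3.client("s3")  # assumes access/secret keys are configured

    config = TransferConfig(
        multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MB
        multipart_chunksize=64 * 1024 * 1024,  # upload in 64 MB parts
    )

    s3.upload_file(
        Filename="backup_2012-11-22.tar",  # hypothetical local backup file
        Bucket="my-backup-bucket",         # hypothetical bucket name
        Key="backup_2012-11-22.tar",
        Config=config,
    )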

November 21, 2012
9:08 am
Guest

I actually tried that, and the error went away. However, the file still was not copied over to Amazon S3. I then edited my .htaccess file to increase the timeout limit to a ridiculously long value, and there was no error despite the size. But still no transfer to S3.

Any more ideas?

November 21, 2012
11:03 am
Avatar
Ovidiu Liuta
Admin
Forum Posts: 2484
Member Since:
September 26, 2010
sp_UserOfflineSmall Offline

Did you get a success message that the file was copied to Amazon when you tried the smaller backup? Can you post a screenshot?

November 21, 2012
2:31 pm
Guest

Actually, yeah, copying files over to Amazon for the small backup was indeed successful.

The first solution that comes to mind is to have the backups split into smaller pieces, but I am hoping for one that does not require this. Ideas?

 

    ### END REPORT
    Backup Done
    Backup file: /home/processw/public_html/administrator/backups/backup_2012-11-22_00-54_www.processworxportal.com.au-sql-nodrop.tar
    Total backup size: 20.94 MB
    AMAZON S3: Starting communication with the Amazon S3 server...ssl mode 0
    AMAZON S3: File copied to {PWX_Portal_Backups}//backup_2012-11-22_00-54_www.processworxportal.com.au-sql-nodrop.tar
    Deleting older backups than 6 days:

November 22, 2012
12:01 pm
Avatar
Ovidiu Liuta
Admin
Forum Posts: 2484
Member Since:
September 26, 2010
sp_UserOfflineSmall Offline

It might be best to check with your hosting support to see what kind of execution limits they have; it could be something related to a socket timeout... Unfortunately, this is not an XCloner issue, as those limits are set outside the software's configuration options...

 

Ovidiu

November 22, 2012
12:06 pm
Guest

Hi Admin,

Ok, I will contact my hosting provider.

However, one more issue: I have set the cron system to split the archive every 60 MB, but the backups XCloner generates are more than 600 MB. What could I possibly be doing wrong?

November 23, 2012
8:16 am
Ovidiu Liuta
Admin

The split option is only implemented in browser mode; cron backups are not split at the moment!
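As a possible workaround (not an XCloner feature), the finished cron backup could be split by a small script of your own before transfer, roughly as sketched below; the path and chunk size are hypothetical.

    # Illustrative workaround only: split a finished cron backup into ~60 MB
    # chunks after XCloner has written it. Path and chunk size are hypothetical.
    ARCHIVE = "/path/to/backups/backup.tar"
    CHUNK = 60 * 1024 * 1024  # 60 MB per piece

    with open(ARCHIVE, "rb") as src:
        part = 0
        while True:
            data = src.read(CHUNK)
            if not data:
                break
            with open(f"{ARCHIVE}.part{part:03d}", "wb") as dst:
                dst.write(data)
            part += 1

    # The pieces can then be uploaded individually and re-joined with
    # `cat backup.tar.part* > backup.tar` before restoring.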
