The free forums are no longer in use. They remain available as a read-only archive.
10:16 pm
September 30, 2010
I contacted them, and they have kill scripts that won't let anything run for more than 30 seconds of CPU time; this is to prevent the server from overloading. It seems like a pretty common restriction on most hosting accounts. Is there anything that can be done about that? I used to use another backup component that had to split the backup into smaller files to keep it from timing out.
10:36 pm
September 26, 2010
The timeout you receive is not caused by the actual backup process but by the Amazon S3 transfer period, which depends heavily on the internet connection between the two servers and on the backup size. The 30-second timeout usually applies to scripts run from a browser; if you ran the script through the PHP executable as a cron command, with a server-based cron manager, the allowed time would typically be longer, though that depends on the hosting account.
Unfortunately, there isn't much we can do in this case; you would need to get that timeout increased.
Anyway, could you post the cron command you are using here so we can double-check it?
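For illustration, a server-side cron entry along these lines would invoke the script through the PHP CLI binary rather than a browser; both paths below are placeholders, not XCloner's actual file names, so adjust them to your own installation:

# Run the backup script nightly at 2:00 through the PHP CLI binary
# instead of a browser. Both paths are placeholders for illustration.
0 2 * * * /usr/bin/php /path/to/xcloner/cron_script.php >/dev/null 2>&1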
Ovidiu
8:42 pm
September 30, 2010
I'm wondering if there might be some way to break up the uploads, like some of the other backup components do. Some of them are able to transfer huge backups, but they have to split them into parts to get the transfer to work without timeouts.
I am having terrible luck getting XCloner to upload to my Amazon S3 bucket. I love XCloner otherwise, though!
In automatic cron mode, splitting the backup upload into parts won't really help: the script still needs to keep running to upload the remaining parts, so it will still time out due to the limit you indicated above. I will, however, look into adding a manual option that splits the upload process and could be used from the View Backups screen.
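To sketch the idea, a resumable, part-by-part upload could look roughly like the PHP below: each invocation handles one fixed-size slice and saves its progress, so no single run needs more than a moment of CPU time. This is a minimal illustration, not XCloner's actual code; the file names, the 5 MB part size, and the local "upload" stand-in are all assumptions.

<?php
// Minimal sketch of a resumable, part-by-part upload. Each run handles
// one fixed-size slice and records its progress, so every invocation
// stays well under a 30-second CPU limit. File names, the 5 MB part
// size, and the local "upload" stand-in are illustrative assumptions.

$archive  = 'backup.tar.gz';    // backup produced by an earlier run
$partSize = 5 * 1024 * 1024;    // 5 MB per part (assumption)
$state    = 'upload.state';     // byte offset saved between runs

// Resume from where the previous invocation stopped, if anywhere.
$offset = is_file($state) ? (int) file_get_contents($state) : 0;
$size   = filesize($archive);

if ($offset < $size) {
    $in = fopen($archive, 'rb');
    fseek($in, $offset);
    $chunk = fread($in, $partSize);
    fclose($in);

    $partNo = (int) ($offset / $partSize) + 1;
    // Stand-in for the real S3 transfer of this single part.
    file_put_contents(sprintf('%s.part%03d', $archive, $partNo), $chunk);

    // Persist progress so the next invocation continues here.
    file_put_contents($state, (string) ($offset + strlen($chunk)));
    echo "Uploaded part $partNo of " . ceil($size / $partSize) . "\n";
} else {
    // All parts done; clean up the progress marker.
    if (is_file($state)) {
        unlink($state);
    }
    echo "All parts uploaded\n";
}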
Ovidiu