ACBobby Posted June 9, 2010 I'm sure there's a way to do this and I'm just out of it tonight. Basically, I'm using MySQL Administrator (the now-EOL version) to perform automatic full backups of my database. I gave Workbench a try but, sad to say, I messed that up really quickly when I deleted most of the tables in my database. Essentially, I have backups scheduled to run automatically every Friday at midnight. My question is: is there a way to have every new backup automatically FTP'd to a remote location, so that I keep a copy somewhere away from my home hard drive? I'd rather not have to manually step in after every backup to make sure the correct file gets uploaded. Thanks, Bobby
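For anyone landing on this thread later: one common way to do this on a Linux host is a single cron script that dumps, compresses, and then pushes the file with curl's FTP upload support. The sketch below is only a guess at one wiring, not anything MySQL Administrator does itself; the host `backup.example.com`, the credentials, and the `/backup` path are all placeholders.

```shell
#!/bin/sh
# Weekly dump-and-upload sketch (placeholders throughout -- adjust to taste).

DT=$(date +%Y%m%d)            # date stamp, e.g. 20100611
DUMP="/backup/$DT.sql.gz"

# Dump every database and compress it in one pass.
mysqldump --user=root --password=secret --all-databases --force \
    | gzip > "$DUMP"

# Push the finished file to the off-site FTP server.
curl -T "$DUMP" "ftp://user:pass@backup.example.com/mysql/"
```

A crontab entry like `0 0 * * 5 /usr/local/bin/db-backup.sh` runs it every Friday at midnight with no manual step. Worth noting that plain FTP sends the password in the clear, so SFTP or rsync over SSH may be a better fit if the remote end supports it.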
Infiltrator Posted June 9, 2010 http://www.debianhelp.co.uk/mysqlscript.htm http://www.noupe.com/how-tos/10-ways-to-au...l-database.html
3TeK Posted June 9, 2010 This is what we use on our MySQL server... (started using it AFTER we had to pay 5g to recover all the data):

# Build a YYYYMMDD date stamp for the dump filename.
my ($Second, $Minute, $Hour, $Day, $Month, $Year, $WeekDay, $DayOfYear, $IsDST) = localtime(time);
$Year += 1900;
$Month += 1;
my $dt = sprintf("%04d%02d%02d", $Year, $Month, $Day);

# Dump every database (-A), keep going on errors (-f), and gzip the result.
system("/usr/local/bin/mysqldump --user=root --password=hak5 -A -f | gzip > /backup/$dt.gz");

Note: the backup mount (/backup) is an NFS share mounted over the network from our FreeNAS box. Hope that helps.
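One thing a dated-dump scheme like the one above doesn't cover is cleanup: a new file lands on the backup mount every run, forever. A minimal pruning sketch, assuming GNU find, a `/backup` mount, and a 35-day retention window (the window is an arbitrary choice, not anything from the posts above):

```shell
#!/bin/sh
# Prune compressed dumps older than 35 days from the backup mount.
# /backup and the 35-day window are assumptions -- tune both.
find /backup -name '*.gz' -type f -mtime +35 -delete
```

Dropped into cron alongside the dump job, this keeps roughly five weekly backups on disk at any time.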