February 26, 2005

Cron and mySQL Database Backups

As anyone with much computer experience knows, file backups are a good thing, and too many beats none at all. This bit of good sense applies absolutely to database-driven websites, where nearly all of your site’s content lives in a database on your web server and nowhere else. If something happens (for instance, if the database gets corrupted), you’ll get to experience the thrilling -poof!- of lost data. Sure, there may be a backup on the server, but that’s a maybe, so it pays to plan ahead.

How you back up your database is your choice; many web hosts offer automated solutions. Me, I have some virtual private servers, so whatever gets automated is going to have to be automated by me. I’ve been playing with various options but, interestingly enough, just spotted a referrer in my server logs (as an Internet marketer, you *do* read your server logs, don’t you?) pointing to BloggingPro.com — specifically, a post that just happened to be about database backup scripts on *nix.

So. My preferred method of backing up databases and handling other server tasks is via SSH line commands. I prefer it because, while a nice GUI may spare you the pain of having to understand whatever you’re doing and then doing it manually, the interface somewhat masks what’s going on — and I like to know precisely what’s happening. So, line commands first, interfaces later. Maybe. The thing is that server syntax — the precise commands used to achieve an end — seems to vary somewhat from server to server; miss a character and the command bombs out. So, on my FreeBSD server, using mysqldump, this line command works via SSH to create a “dump” file of the entire database in ASCII format in whatever directory you’re in; no need to go into mysql or to specify *where* the database is:

mysqldump -u root -p DATABASENAME > BACKUP.sql

(Courtesy of ModWest Hosting.) Remembering that I’m not a server guru, here’s the translation:

mysqldump — start the dump sequence
-u — specifies a user
root — the user
-p — specifies that it should ask for a password
DATABASENAME — the name of the database to back up
> — redirects the output into the file that follows
BACKUP.sql — the new file name for the dump results

Hint: if you’ve ever changed the main server password, root’s password is likely to be the *original* one.

Now, since typing line commands into a server on a regular basis is not a good plan (I’m a big city kinda girl, doncha know), my next move was to write a shell script to run the above via the server’s cron function at intervals of my choosing, which is precisely where our unknown friend from BloggingPro.com came in. Nice, eh? Just remember that you may have to play around with his line commands before they work on your server.

Cron mySQL backup script on *nix

#!/bin/sh
/usr/local/bin/mysqldump -uroot -pROOTPASSWORD --opt DATABASENAME > BACKUPNAME.sql
/usr/bin/gzip BACKUPNAME.sql
/bin/mv BACKUPNAME.sql.gz MYBACKUPS/`/bin/date +BACKUPNAME.sql-%Y%m%d.gz`

You’ll have to ensure that the pathnames are correct (e.g., /usr/local/whatever).

What the above does is run mysqldump as root, supply root’s password, and specify the database and the name of the .sql output file.

Then (I told you I was a city girl), it gzips the backup and moves it into an existing folder (here, MYBACKUPS), appending the date to the file name.

Nice and tidy, eh?
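
To actually schedule it, you’d add a line along these lines to your crontab (just a sketch; the path and time are placeholders to adjust for your own server):

# run the backup script every night at 11:50 PM
50 23 * * * /server/path/to/backup.sh

The five fields are minute, hour, day of month, month and day of week; the asterisks mean “every”.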

(Added) Some references on restoring the mysqldump file:

This should work on a VPS, but you’ll have to allow for the server locations of the particular programs (/usr/local/whatever; see the paths in the script above):

gunzip BACKUPNAME.sql.gz
mysql < BACKUPNAME.sql

Update: You *may* run into a situation where the database doesn’t restore with the above commands. One solution (given graciously by an amiable Verio tech) is to add the following to the top of the mysql dump file:

use DATABASENAME;

That said, this is what works on some servers: log in to the server as root, but do not go into mysql:

mysql -u root -p -D DATABASENAME < DUMPNAME.sql

Hit Return/Enter, you’ll be prompted for your password, and life is good.
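
Putting that together with the backup script above, restoring one of the dated backups would look roughly like this (the date and names are placeholders; adjust the paths for your server):

gunzip MYBACKUPS/BACKUPNAME.sql-20050226.gz
mysql -u root -p -D DATABASENAME < MYBACKUPS/BACKUPNAME.sql-20050226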

34 Comments to "Cron and mySQL Database Backups"

  1. wolfgang.lonien.de » Blog Archive » Automated backups for Wordpress says:

    […] might want to back that up. Look at the tips from Diane Vigil, and/or from Blogging Pro (she points to the latter). Note that you […]

  2. Kevin says:

    Diane,

    This is a great article on database backups. What do you do when the amount of space required for a complete database dump approaches the amount of free disk space on the system?

I am starting to run into this problem with my system and I am trying to devise a new backup strategy.

    Thanks,

    Kevin

  3. DianeV says:

    Thanks; I thought it would be of help.

    What I do is manually delete old copies. You *could* set it not to add the date, which would mean it may overwrite older copies (not a good idea if a current copy is corrupted — it could happen) or that it would not overwrite, in which case you’d only have old copies.
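
    Come to think of it, adding something like this to the end of the backup script would probably handle the old copies automatically (untested here; it assumes the backups live in MYBACKUPS and that you want to keep 30 days’ worth):

    # remove gzipped backups older than 30 days (path and age are placeholders)
    find /server/path/to/MYBACKUPS -name '*.sql-*.gz' -mtime +30 -exec rm {} \;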

    I have been meaning to look into updating the script to email the backup.

  4. DianeV says:

    Here’s a search for shell script email attachment. I’m guessing that would be helpful — though, in your case, if the backup is *that* large, email may not be your answer.
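
    If the server has uuencode and a working mail command, something along these lines might do it (just a sketch; the file name and address are placeholders, and a large dump may well be too big to send this way):

    uuencode BACKUPNAME.sql.gz BACKUPNAME.sql.gz | mail -s "Database backup" you@yourdomain.com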

  5. James Asher says:

    For what it’s worth, here is my mysql backup script:


    #!/bin/bash

    # this is run by the root crontab

    datetime=`date '+%m-%d-%y-%H-%M-%S'`
    /opt/lampp/bin/mysqldump -uUSER -pPASSWORD xtec | gzip > /root/backups/backup-$datetime.gz

    find /root/backups/backup-$datetime.gz

    I think it’s a little more streamlined, as it pipes the output to gzip. Also, it uses the current datetime in the filename so that you can tell them all apart. And finally, the find command will output the location of the file if it exists. Since this is in cron, any output from the script will be emailed to whoever owns the cron job.

  6. Diane Vigil says:

    Yes, it *is* more streamlined; I hadn’t known how to pipe to gzip. What does the find command do?

  7. James Asher says:

    The find command simply looks for the filename that is passed to it; if it finds it, it echoes it. This output gets emailed via cron.

  8. Diane Vigil says:

    Ah, so find locates the file, and cron emails to you … the file, or a link to the file?

    If I could trouble you for it, what would be the command in cron to include the file in the email … or is that necessary?

  9. James Asher says:

    What do you mean by ‘include the file in the email’? Any text that gets echo’d during the script’s execution will be included in the cron email. The find command will echo the location of the file supplied to it as an argument, assuming it finds it; otherwise it will say that it cannot locate it.

  10. Diane Vigil says:

    My mistake. Thanks for the clarification.

  11. Mike aka Jy says:

    Using -pPASSWORD in a crontab is a HORRIBLE idea. Why, you ask? Well, say you have a large running DB and you do the backup nightly at 4am.

    Now say I have a webhosting account on a server that maybe doesn’t run PHP in safe mode, so I can look at running processes. If I catch it in time, I can see the admin user/pass combo right there in the process list.

    You should set the password in a ~/.my.cnf file; the mysql client and mysqldump will read it from there.

    Please, before writing these blogs to get on Digg, do some research.

  12. Diane Vigil says:

    Thanks, Mike. Of course, one should take server setups into account and do whatever is possible to keep things as secure as possible.

    By the way, I didn’t propose this post to Digg; in fact, I’ve never proposed anything to Digg.
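
    For anyone who wants to try Mike’s suggestion, this is roughly what the ~/.my.cnf approach looks like (a sketch; use your own credentials, and note that mysqldump and the mysql client read the [client] section):

    [client]
    user=root
    password=ROOTPASSWORD

    ... then restrict it so that only you can read it:

    chmod 600 ~/.my.cnf

    With that in place, the -uroot -pROOTPASSWORD part can come out of the cron script.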

  13. James Asher says:

    I’m not running on a shared web server.

  14. Diane Vigil says:

    Neither am I, James. Not to mention that I wrote the first post in February 2005.

    That said, many shared hosting accounts do not allow cron jobs, nor will they read a .conf file; they’re just not that fancy.

  15. J Williamson says:

    This article was very helpful! As a relative “noob” to a lot of the functionality and power of command-line scripting, it’s really helped me set up regular backups of my forum database, which is great!

    Found via Google search “mysql database cron backup”

  16. Diane Vigil says:

    Well, thank you! (And welcome to developedtraffic.com.) It’s good to hear that this blog helps people occasionally. :)

    Command line scripting is really useful; automating things is the way to go. Servers can be set up a little differently, though; I find that a little messing around with the commands can usually sort it out.

  17. Koesmanto Bong says:

    Thanks for this helpful article. A quick question, though: is there a way I can be sent an email with the status of the cron job after it runs, e.g. successfully executed, error messages, etc.?

  18. Diane Vigil says:

    Actually, yes. At the top of your cron file, add:

    MAILTO="someemail@domain.com"

    Mine report on what was run and when, and whether there might have been any problems.
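
    So the top of a crontab might look something like this (the address, path and time are placeholders):

    MAILTO="someemail@domain.com"
    # nightly backup at 11:50 PM
    50 23 * * * /server/path/to/backup.sh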

  19. James Asher says:

    An email will be sent containing anything echo’d by the script that cron is running. This will be sent to the email address of the person running the cron job.

    Also, you can specify at the top of the cron job a mailto and the contents will be sent there instead of the associated email address of the user running the job. Ex: MAILTO=root or MAILTO=johndoe@example.com

  20. James Asher says:

    Looks like more than one person is subscribed to comments on this article. :-)

  21. Diane Vigil says:

    True. :)

    By the way, your single.php page at your blog seems to be having a problem (I’m guessing that it’s the section calling comments.php).

  22. Koesmanto Bong says:

    Is that the MAILTO in the crontab file, or do I just add a MAILTO to the .sh file that I put inside the cron.daily folder?

    thanks a bunch!

  23. Diane Vigil says:

    Mine’s in the cronjobs file that references the individual .sh files that contain all the commands; I broke it up because it got rather long. I’d guess that you can do it either way, but you’d have to test it first.

  24. Koesmanto Bong says:

    I am new to this cron job/shell scripting world, so forgive my ignorance.

    What I have going now: I created a file called mysql_backup.daily.sh and put it inside the cron.daily folder. That file has rwxr-xr-x permissions. Inside that file is this:

    #!/bin/bash

    #this is run by the root crontab

    datetime=`date '+%m-%d-%y-%H-%M-%S'`

    mysqldump -uuser -ppass db > /root/backups/db-$datetime.sql

    mysqldump -uuser2 -ppass2 db2 > /root/backups/db2-$datetime.sql

    Does that look right? That script would technically be run daily, right? And where would I put the MAILTO command, after #!/bin/bash?

    thanks for your assistance so far, Diane.

  25. Diane Vigil says:

    It looks like you’re using James’ version, as I’m not sure what datetime does. I specify when the file runs like this:

    50 23 * * * /server/path/to/file.sh

    … which means 11:50 PM (23rd hour, 50th minute).

    The MAILTO line should go after the #!/bin/bash (mine is set up a bit differently).

    What you *could* do is set it to run in five minutes, upload it, reset it via the ~cronjobs command, and see if it works. If it doesn’t, then you tweak.
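
    On most setups, loading and checking the crontab looks something like this (your server may differ):

    crontab /server/path/to/cronjobs
    crontab -l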

    By the way, however inelegantly put by Mike aka Jy (I don’t do Digg) above, his suggestion to move the root password elsewhere is good, especially if you’re using a shared web host.

  26. Koesmanto Bong says:

    Where do you specify 50 23 * * * /server/path/to/file.sh, Diane?

    I have a dedicated server, but thanks for the advice; I am looking into the my.cnf file now.

  27. Diane Vigil says:

    Well, I’ve set mine up a little differently by moving the individual commands into .sh files, but it would go in cronjobs (or whatever your crontab file is called). In fact, you could put the whole thing in cronjobs:

    50 23 * * * /server/path/to/file.sh

  28. Koesmanto Bong says:

    I am having some problems with cron and .my.cnf. Maybe you, Diane, or James or Mike aka Jy can help.

    Two scenarios:

    1. I set up the ~/.my.cnf file with the username, password and host, then set up my cron job. I am 100% sure that the username and password are correct, because I manually ran the script and it did what it needed to do. I went to the backup folder today and the backup files had been created, but they were blank. So I proceeded to #2.

    2. I disabled the ~/.my.cnf file, put -u and -p back in my mysqldump shell script, and set the cron job up to run again in 5 minutes. Five minutes later, I found the backup files, with content, in the backup folder.

    My conclusion: the ~/.my.cnf file wasn’t being read. What did I do wrong? Any suggestions?

  29. Diane Vigil says:

    Koesmanto, I found some references at Google. Specifically, this discussion at the A Small Orange forums might help.

    Otherwise, you may have to tweak the code a bit, as servers differ from one to the next.
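
    One more thing that might be worth checking (just a guess): which option files and values mysqldump actually sees when cron runs it. Adding a line like this to the script, and reading the email cron sends, should show what defaults are being picked up:

    /usr/local/bin/mysqldump --print-defaults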

  30. Ctmax Xunil says:

    A good article and reference; although some comments complicate it, just following the tips will help create a good cron job.
    Many thanks!

  31. Diane Vigil says:

    Hi Fili. You could try it. The username and password should be those assigned for use of that database (rather than a customer login).

    Otherwise, what I outlined above should work for you.

  32. Fili says:

    It is also possible to specify the location of your .my.cnf file on the command line:

    /usr/bin/mysqldump --defaults-file=/path/to/.my.cnf -u root mydatabase

    Put this in the .my.cnf file:

    [client]
    password=mypassword

  33. WordPress 2.9.x upgrade causes weird characters on blog says:

    […] you have the ability to download your database (here's the harder version of getting a mysqldump, which is much easier if you have phpMyAdmin), I'd suggest downloading only […]

  34. I think i've been hacked - Page 2 - PhotoPost Community says:

    […] directory. To best way to backup your classifieds database is maybe with a cron task like this Cron and mySQL Database Backups Or for less technically minded users phpmyadmin is the best way and is usually installed in your […]

