Here's what I'm trying to achieve: for each of my various websites, back up the website files and the associated database into a single dated compressed archive file (one zip file per website) and download that to my local Windows Vista machine. My websites are hosted on unix machines accessible via ssh. I've decided on using Python Fabric, as it seems to be a good fit for the job.
Below is the code I have so far. It works, but it's kind of a mess because I just hacked it together. I want to make it cleaner and a little more general. For example, right now it hard-codes where the web apps are stored on the server, but some of my websites are actually stored in different places. It also feels a little wonky to save the database dump into the web file directory, but I'm not sure how else to easily get it into the zip archive.
import time
from fabric.api import *

env.hosts = ['example.com']
env.user = 'someuser'
env.password = 'somepassword'

def backup_database(app_name, db_name):
    db_passwords = {'somedatabasename': 'somedbpassword'}
    # Parentheses around the concatenated pieces so % formats the whole
    # string (without them, % binds only to the last fragment).
    mysqldump_command = ('MYSQL_PWD=%(password)s ionice -c2 -n6 mysqldump'
                         ' -u %(database_name)s %(database_name)s'
                         ' > ./webapps/%(app_name)s/%(database_name)s.sql' % {
                             'password': db_passwords[db_name],
                             'database_name': db_name,
                             'app_name': app_name
                         })
    run(mysqldump_command)

def backup_site(app_name):
    date = time.strftime('%Y-%m-%d')
    zip_filename = '%(app_name)s-%(date)s.zip' % {'date': date, 'app_name': app_name}
    zip_command = 'zip -r %(zip_filename)s ./webapps/%(app_name)s > /dev/null' % {
        'zip_filename': zip_filename, 'app_name': app_name}
    run(zip_command)
    get(zip_filename, zip_filename)
    run('rm ' + zip_filename)

def backup_webapp(app_name, db_name):
    backup_database(app_name, db_name)
    backup_site(app_name)
    rm_command = 'rm $HOME/webapps/%(app_name)s/%(database_name)s.sql' % {
        'database_name': db_name,
        'app_name': app_name
    }
    run(rm_command)

def backup():
    backup_webapp('somewebappfoldername', 'somedatabasename')
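One idea for the "wonky" part: dump the database to a staging directory instead of the web directory, then add the dump to the archive with `zip -j`, which stores the file at the archive root without its directory path. This is a sketch of the command-building only (the `dump_path` staging location and these helper names are my own, not from the script above); you would still pass the resulting strings to Fabric's run():

```python
import time

def db_dump_command(db_name, db_password, dump_path='/tmp'):
    """Build the mysqldump command, writing the dump outside the web tree.

    dump_path is a hypothetical staging directory, not from the original script.
    """
    sql_file = '%s/%s.sql' % (dump_path, db_name)
    cmd = ('MYSQL_PWD=%(pw)s ionice -c2 -n6 mysqldump'
           ' -u %(db)s %(db)s > %(out)s'
           % {'pw': db_password, 'db': db_name, 'out': sql_file})
    return cmd, sql_file

def zip_with_dump_command(app_name, app_root, sql_file):
    """Archive the site with zip -r, then add the dump with zip -j so it
    lands at the archive root without its staging-directory path."""
    zip_filename = '%s-%s.zip' % (app_name, time.strftime('%Y-%m-%d'))
    cmd = ('zip -r %(z)s %(root)s/%(app)s > /dev/null'
           ' && zip -j %(z)s %(sql)s'
           % {'z': zip_filename, 'root': app_root,
              'app': app_name, 'sql': sql_file})
    return cmd, zip_filename
```

With this, the .sql file never touches the web directory, and the final cleanup is just one rm of the staging file.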
Using the fabric tool I call the script like this:
c:\> fab backup
If you don't know Fabric: the script file is Python, the run commands are executed via ssh on the remote server, and the get command pulls a file from the remote server to the local machine.
What it does is:
- back up the database to a .sql file and put it in the web directory
- zip the web directory
- download the zip to my local machine
- delete the zip file on the server and delete the database backup from the web directory
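To drop the hard-coded webapps path, one option is a small per-site settings dict and a single loop, so each site can declare its own root directory. This is only a sketch under my own naming assumptions (SITES and app_root are illustrative, not from the script), showing the command-building half; the commented-out line marks where the existing Fabric run/get calls would go:

```python
import time

# Hypothetical per-site settings; each site can live in a different directory.
SITES = {
    'somewebappfoldername': {'app_root': '$HOME/webapps',
                             'db_name': 'somedatabasename'},
    'anotherapp':           {'app_root': '/var/www',
                             'db_name': 'anotherdb'},
}

def zip_command(app_name, app_root):
    """Build the zip command from the per-site root instead of a fixed path."""
    zip_filename = '%s-%s.zip' % (app_name, time.strftime('%Y-%m-%d'))
    cmd = 'zip -r %s %s/%s > /dev/null' % (zip_filename, app_root, app_name)
    return cmd, zip_filename

def backup():
    for app_name, cfg in sorted(SITES.items()):
        cmd, zip_filename = zip_command(app_name, cfg['app_root'])
        # run(cmd); get(zip_filename, zip_filename); run('rm ' + zip_filename)
        print(cmd)
```

The same pattern extends to the database step: pull db_name (and its password) from the per-site entry instead of passing both names into every function.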