Right now we're doing it pretty simply: using Fabric to update via a git pull (and the other necessary tasks).
Ultra simple compared to things like Ansible, but for our current needs it works out great and scales well.
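A minimal sketch of that kind of fabfile, assuming Fabric 1.x; the host name, project path, and reload trigger here are placeholders, not the poster's actual setup:

    # fabfile.py -- hypothetical update task; host and path are made up.
    from fabric.api import env, cd, run, task

    env.hosts = ['app1.example.com']

    @task
    def update():
        # Pull the latest code on the server, then nudge the app to reload.
        with cd('/srv/myapp'):
            run('git pull')
            run('touch app.wsgi')  # placeholder reload mechanism

Running "fab update" then executes the whole sequence over SSH.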
For a three-node deployment I'd suggest that full-on Chef or Puppet is a little heavyweight, and there's a bit of a learning curve.
Since you know the steps, it might well be the case that you could automate deployment with Fabric, which is a simple Python tool for running commands on remote servers via SSH. I've certainly used it in the past to just run "make compress" or similar to minify JS. There are a few "simple" tools in this area: Fabric, Capistrano, and so on. They're almost certainly easier to work with than Puppet, CFEngine, Chef, or Ansible, although those alternatives will scale further in the future and let you do a lot "more".
Really it depends on what you're looking for now and in the short term: simple deployment of one application, or the ability to script and control arbitrary things across a group of machines.
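For the "make compress" case above, the task can be tiny; a sketch assuming Fabric 1.x, with the host and checkout path invented for illustration:

    from fabric.api import env, cd, run, task

    env.hosts = ['web.example.com']  # placeholder host

    @task
    def compress():
        # Run the existing Makefile target on the remote checkout.
        with cd('/srv/site'):
            run('make compress')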
The code looks good enough, but you may be wasting your time: Fabric probably handles this for you. (At least, without knowing the context, I'm going to guess you probably should be using Fabric; it's certainly possible you're in a situation where it doesn't make sense.) In Fabric, all that code boils down to:
    from fabric.api import *

    env.hosts = ['my.host.example.com']

    @task
    def upload_my_file():
        # put() uploads a local file to the given path on the remote host.
        put('/path/to/local.file', '/dest/on/the/server')
And then "fab upload_my_file" kicks it off.
I learned Python when I was hired on as a DevOps engineer and the company needed a way to automate virtual appliance builds for clients. I ended up putting together an application that leverages Fabric, Vagrant, and VMware vSphere to build virtual machines, install our web application onto the VM, smoke test the VM and the application, ship it as an OVA to our distribution server, and then securely send the credentials and download information to the client.
I like to use Fabric together with rsync. You can do things like back up the database, download and import it locally, download all the images and plugins, refresh the cache, and restart the web server.
All with the beauty of Python.
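A hedged sketch of that workflow in Fabric 1.x; the host, paths, and database name are all assumptions:

    from fabric.api import env, local, run, sudo, task

    env.hosts = ['www.example.com']

    @task
    def backup_db():
        # Dump the remote database, then pull the dump down with rsync.
        run('mysqldump mydb > /tmp/mydb.sql')
        local('rsync -avz www.example.com:/tmp/mydb.sql backups/')

    @task
    def pull_media():
        # Mirror remote images and plugins into the local checkout.
        local('rsync -avz www.example.com:/srv/site/uploads/ uploads/')

    @task
    def refresh():
        # Clear the cache and restart the web server.
        run('rm -rf /srv/site/cache/*')
        sudo('service nginx restart')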
I used to do the same: every project had a top-level ./deploy script.
These days I use Fabric instead, which allows simple automation of remote commands and uploading of files.
Generally I'll run "fab deploy", which will bundle up a git/hg repository into a .tar file, upload that, unpack it to "~/releases/XXX", and point "~/current" at the most recent release. Then any appropriate services are reloaded, and so on.
(Everything works over SSH, providing you have suitable keys.)
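That flow might look roughly like the following; Fabric 1.x assumed, with a timestamped release name standing in for "XXX" and the paths and service reload as placeholders:

    import time

    from fabric.api import env, local, put, run, sudo, task

    env.hosts = ['app.example.com']

    @task
    def deploy():
        release = time.strftime('%Y%m%d%H%M%S')  # stands in for "XXX"
        # Bundle the repository into a tarball and upload it.
        local('git archive --format=tar.gz -o release.tar.gz HEAD')
        run('mkdir -p ~/releases/%s' % release)
        put('release.tar.gz', 'releases/%s/release.tar.gz' % release)
        # Unpack, repoint ~/current, and reload services.
        run('tar -xzf ~/releases/%s/release.tar.gz -C ~/releases/%s'
            % (release, release))
        run('ln -sfn ~/releases/%s ~/current' % release)
        sudo('service myapp reload')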
You've got to get comfortable with the command line; then you can dump your database using mysqldump.
In terms of automation, you've got cron, which can run tasks on a schedule; or you've got Fabric, which you could use to simplify and organize sets of commands. (For instance, instead of having to remember the full mysqldump command and its parameters, you could define a "dump" task that performs that command, then run it by executing "fab dump".)
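As a sketch (Fabric 1.x; the host, database name, and backup path are invented, and credentials are assumed to live in ~/.my.cnf on the server):

    from fabric.api import env, run, task

    env.hosts = ['db.example.com']

    @task
    def dump():
        # The one command you'd otherwise have to remember by hand.
        run('mysqldump mydb > ~/backups/mydb-$(date +%F).sql')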
Fabric is awesome; see fabfile.org. Set up a named group of servers and then you can do:
    fab -R MyServerGroup -P -- Some Command
Which will execute "Some Command" on all the servers in MyServerGroup in parallel. From there you can create scripts, calling sudo along the way. I've been using it to configure around 30 servers. Now I simply do:
    fab -R allServers -P sync_node
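For reference, the role setup behind those commands looks something like this; the host lists and the body of sync_node are assumptions:

    from fabric.api import env, run, sudo, task

    env.roledefs = {
        'MyServerGroup': ['node1.example.com', 'node2.example.com'],
        'allServers': ['node%02d.example.com' % i for i in range(1, 31)],
    }

    @task
    def sync_node():
        # Placeholder sync: pull the latest config and restart the service.
        run('git -C /srv/config pull')
        sudo('service myapp restart')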