In our organization, we produce an enterprise Rails application that can be installed on multiple platforms (Red Hat 5/6, Windows, Mac OS X) and run against multiple database backends (Postgres, MSSQL, MySQL, and Oracle at last count). This poses reasonably big challenges for developers, but truly gigantic ones for our QA and continuous delivery pipeline.
We have settled on a JRuby architecture delivered through Jenkins, Puppet, and Vagrant virtual machines to ease these challenges and let us produce disposable versions of our reference environments at every stage of the pipeline, from development through final pre-prod acceptance testing, and even out into the field for proofs of concept (POCs) and sales efforts.
As developers, we rely heavily on the database drivers and migrations that Ruby on Rails places between our web application and the particular database in use. In the course of our testing, we sanity-test our commits on Postgres first, then on the other supported databases. While a single development machine might be able to install some of these databases locally (Oracle and MSSQL on the Mac being notable exceptions), once you get into multiple versions it is nearly impossible to keep them all running reliably. Enter Vagrant!
This workflow requires Vagrant 1.2 or greater and VirtualBox 4.2.10 or greater (4.2.14 was sadly broken). It uses one of the smallest-footprint Ubuntu boxes available, Ubuntu Lucid 32, since the demands on this database server will be light. A list of available Vagrant boxes can be found at www.vagrantbox.es. Our goal was an older 8.4 edition of Postgres, and the slightly dated nature of the Lucid box meant the default apt-get package installation was pegged to Postgres 8.4, just as we hoped.
Building the Vagrant Box
At the end of this process, you will be able to package up a new Vagrant box with Postgres 8.4 running on an empty database, but to get there we need to build the box. My Puppet skills are still too rudimentary to accomplish all of this through a Puppet initialization file, unfortunately, but I hope to update this project soon with a pure Puppet workflow. Skip to the section “Deploying the Finished Postgres Server” if you just want to use the image linked to this post and get the server up and running.
Step 1: Create a project directory and initialize the vagrant box
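The commands for this step might look like the following sketch; the project directory name is arbitrary, and the lucid32 box URL is the one listed on vagrantbox.es:

```shell
# Create a working directory to hold the Vagrantfile for this box
mkdir postgres84-vagrant && cd postgres84-vagrant

# Add the Lucid 32 base box (URL from vagrantbox.es), then
# generate a Vagrantfile that references it
vagrant box add lucid32 http://files.vagrantup.com/lucid32.box
vagrant init lucid32
```

After `vagrant init`, the directory contains a Vagrantfile ready for the edits in the next step.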
Step 2. Edit the vagrant file to forward ports and reduce the memory footprint
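A minimal Vagrantfile for this step might look like the sketch below. The lucid32 box name and the 256 MB memory figure are assumptions; forwarding guest port 5432 to host port 5433 matches the connection settings used later in this post:

```ruby
# Vagrantfile
Vagrant.configure("2") do |config|
  config.vm.box = "lucid32"

  # Forward the guest's Postgres port (5432) to 5433 on the host,
  # so a Postgres installed locally on 5432 is left undisturbed
  config.vm.network :forwarded_port, guest: 5432, host: 5433

  # Trim the VM's memory; a lightly loaded database server needs little
  config.vm.provider :virtualbox do |vb|
    vb.customize ["modifyvm", :id, "--memory", "256"]
  end
end
```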
Step 3. Manually configure the Ubuntu server for localhost:5433 access by Rails applications or clients
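Run inside the VM (via `vagrant ssh`), the configuration might look like the following sketch. The config file paths are the Lucid defaults for the postgresql-8.4 package, 10.0.2.2 is the host as seen through the VirtualBox NAT, and the password is a placeholder:

```shell
# Install Postgres from the Lucid repositories -- apt-get pins it
# to 8.4 on this box, which is exactly the version we want
sudo apt-get update
sudo apt-get install -y postgresql-8.4

# Listen beyond localhost so the forwarded port reaches the server
sudo sed -i "s/#listen_addresses = 'localhost'/listen_addresses = '*'/" \
  /etc/postgresql/8.4/main/postgresql.conf

# Accept password logins arriving through the VirtualBox NAT gateway
echo "host all all 10.0.2.2/32 md5" | \
  sudo tee -a /etc/postgresql/8.4/main/pg_hba.conf

# Give the postgres superuser a password for client connections
sudo -u postgres psql -c "ALTER USER postgres PASSWORD 'postgres';"

sudo /etc/init.d/postgresql-8.4 restart
```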
At this point, you have a single command that will bring up your Postgres 8.4 server. You can destroy and recreate it as many times as you like. Upload the finished box and share it with your development team. A Rails database.yml file can connect to it on port 5433 once it is running.
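For reference, a development entry in config/database.yml pointed at the box might look like this; the database name, username, and password are placeholders for whatever you configured inside the VM:

```yaml
development:
  adapter: postgresql
  host: localhost
  port: 5433          # forwarded to 5432 inside the Vagrant box
  database: myapp_development
  username: postgres
  password: postgres
  encoding: unicode
```

Under JRuby, the `activerecord-jdbcpostgresql-adapter` gem handles this same `postgresql` adapter entry.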
Future steps for this workflow include an all-Puppet configuration of the firewall, Postgres, and locale settings. I have about 70% of that done, but am missing a few key Puppet concepts to complete the job, so get in touch if you want to collaborate. With a Puppet workflow, all the twitchy command-line steps would be replaced by a source-controlled configuration script, eliminating the need to package the configured box and allowing us to work directly from the lucid32 base.
As usual, a number of blogs and gists helped me get through this workflow, but no single post got me all the way to bingo; hence this write-up. Special thanks go to:
- Mario Zaizar, “How to install Postgresql 8.4 in a Vagrant box”, http://blog.crowdint.com/2011/08/11/postgresql-in-vagrant.html, visited 29 June 2013. Note: the later steps (2 and on) in this post are not valid for Postgres 8.4 because they undo the automatic cluster setup included in the package installation.
- jschoolcraft, “how-to-install-postgresql-on-an-ubuntu-1004-vagrant-box.markdown”, https://gist.github.com/jschoolcraft/1963369, visited 29 June 2013. As with Zaizar’s post, the later steps that manually run pg_createcluster were invalid for my lucid32 box, which completed that step as part of the package installation. Running it again caused problems, so I had to start over with a fresh box and omit those steps.