Entries tagged “process”

At CCNMTL most of our new Python projects are written in Django, but we still support a number of older projects that were written with TurboGears 1.0.4. They've continued to be stable, and we don't do a ton of new development on them, so it hasn't been worthwhile to upgrade them to newer versions of TurboGears.

But we do occasionally make changes to their code, and recently we've begun migrating them to newer servers. So I spent some time updating their deployment processes to match CCNMTL's current best practices:

  • Installation with pip instead of easy_install
  • Fully pinned local source distributions versioned alongside the code
  • No Internet access required anywhere in the deployment
  • Containment with virtualenv
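The fully-pinned-tarballs approach above can be sketched like this (the SQLObject and CherryPy versions are illustrative placeholders; only TurboGears 1.0.4 is from the post):

```shell
# Each dependency lives as a source tarball inside the repository;
# the requirements file just lists those local paths in install order.
# (SQLObject/CherryPy versions here are hypothetical examples.)
mkdir -p requirements/src
cat > requirements/apps.txt <<'EOF'
requirements/src/SQLObject-0.10.2.tar.gz
requirements/src/CherryPy-2.3.0.tar.gz
requirements/src/TurboGears-1.0.4.tar.gz
EOF
# At deploy time, --no-index keeps pip off the network entirely:
#   ve/bin/pip install --no-index -r requirements/apps.txt
cat requirements/apps.txt
```

Because every line is a local path, the install succeeds (or fails) identically on any machine with a checkout, with no PyPI involved.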
I ended up with a package that you can use to create an isolated TurboGears 1.0.4 environment to run legacy projects in, or (if for some reason you want to) to create new TurboGears 1.0.4 projects. You can get it on GitHub here: https://github.com/ccnmtl/turbogears_pip_bootstrapper

In this post I'll go into detail about what it does, and the hurdles I ran into along the way.

Earlier this week, I wrote about how to make virtualenv install pip and setuptools from local source distributions, instead of fetching unpinned copies of them from the Internet, which it does (somewhat silently) by default. The approach relied on a little-known feature of virtualenv: it looks for appropriate distributions in a virtualenv_support directory before downloading them.
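The virtualenv_support layout can be sketched as follows (the pip and setuptools versions are illustrative; in a real repository the sdists are checked in rather than created empty):

```shell
# virtualenv checks a virtualenv_support directory next to virtualenv.py
# before trying to download pip or setuptools.
# Here we only simulate the layout (tarball versions are placeholders).
mkdir -p virtualenv_support
touch virtualenv_support/pip-0.8.1.tar.gz
touch virtualenv_support/setuptools-0.6c11.tar.gz
ls virtualenv_support
# With the sdists in place, creating the environment stays offline:
#   python virtualenv.py ve
```

The point of the layout is that the bootstrap tools themselves are now pinned files under version control, just like the application's dependencies.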

In a future release of virtualenv, this will be easier, and also more apparent. I submitted patches for two new features, which were accepted by virtualenv's maintainers.

These new features are documented in the source here.  If you want to start using them now, you can fetch a copy of virtualenv.py from the "develop" branch: https://github.com/pypa/virtualenv/raw/develop/virtualenv.py

In my previous post I talked about how to ensure that none of your Python project's dependencies are being downloaded from the Internet when you create a fresh virtualenv and install them. This is good for deployments: each deployment is completely reproducible since every package's source is installed from a specific version of the codebase that's versioned alongside the code you're deploying, and deployments don't require external network access to succeed.

There's one piece that's still missing, though: isolating and pinning the installation of the installation/bootstrapping tools themselves -- virtualenv, pip, and setuptools.

Anders has written several times about our deployment strategy for Django apps at CCNMTL. Aside from containment of each project with virtualenv, we also try to make sure that deployments never depend on anything external, and can be done without access to the wider Internet. We do this through an aggressive form of version pinning: in each project's repository, we check in source tarballs of all the project's dependencies, including Django itself. We then have a pip requirements file that points to each of these local files in order. (Here's an example, and the bootstrap script that uses it.)
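A bootstrap script in this spirit might look like the following sketch (file and directory names are illustrative, not the actual CCNMTL script linked above):

```shell
# Write out a minimal bootstrap script of the kind described above.
# (Names like "ve" and "requirements.txt" are illustrative.)
cat > bootstrap.sh <<'EOF'
#!/bin/sh
cd "$(dirname "$0")"
rm -rf ve
# virtualenv.py is a pinned copy checked into the repository
python virtualenv.py ve
# every requirement resolves to a local tarball; nothing is downloaded
./ve/bin/pip install --no-index -r requirements.txt
EOF
chmod +x bootstrap.sh
cat bootstrap.sh
```

Rebuilding the environment from scratch on every deploy, rather than upgrading it in place, is what makes each deployment reproducible from the tagged checkout alone.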

There are two benefits to this approach. First, it removes our deployments' dependencies on external web services, like PyPI, being online. Second, it ensures that we know exactly what versions we're using of all the Python code in a project's deployment. That makes deployments trivially repeatable, and gives us the ability to roll back a deployment to any earlier version -- so if a new deployment doesn't work properly for some reason, we can re-deploy the last tagged deployment and know that (barring system-level changes) it'll work exactly as expected.

The other week, we made a new deployment to one of our Django projects, and the site stopped working. It turned out that the wrong version of Django was installed somehow: the project was built on Django 1.0, but this broken deployment ended up with Django 1.2 instead. And, oddly, rolling back to the previous deployment didn't fix the problem.

Everything is speeding up these days, even the authoring of books. Some information society researchers we know (including some of our friends from Eyebeam, Creative Commons and Shift Space) locked themselves up for a week in Berlin, and came out the other end with a print-ready book on the future of collaboration - Collaborative Futures.

Even though I didn't travel to Berlin, the book's authorship was radically distributed, and some of my writing made it into the final cut: a portion of an essay I wrote last fall for a sociology seminar on the future, covering a (brief) history of version control systems and the significance of distributed version control systems.

The book will be released under a Creative Commons license, but they are also doing a print run of hard copies, which will be available starting at the launch party on March 4th. Pre-order a hard copy here (a digital copy is available here).

Feed Subscription

If you use an RSS reader, you can subscribe to a feed of all future entries tagged “process”.