launchpad-dev team mailing list archive

Requirements and constraints

This is following up on a discussion with Kit Randel on our team call
last night, and one with Robert Collins on IRC; and, more generally, the
massively inconsistent mess that many of our projects are in.

The current state of things is that we have:

 * projects using buildout (Launchpad itself; lazr.* and friends)

   Tried and tested.  OTOH it's also a massive pain.  We're stuck on an
   old version at least for Launchpad itself because we're relying on
   buildout's partial isolation support which was removed in buildout 2;
   and buildout doesn't support setup_requires, which many of our
   dependencies now make use of directly or transitively, particularly
   via pbr.  It is massively unclear how to run tests at all on many
   lazr.* packages, never mind how to do so if you happen to be
   attempting a Python 3 port; figuring out the exact details of how to
   run ./bootstrap.py etc. is awkward at best and we have quite a lot of
   bitrot here.

 * projects using virtualenv and pip (turnip, txpkgupload, rutabaga)

   Relatively new to us, but mostly working well.  We have some
   problems with coherent requirements management, and different
   projects use requirements files and constraints files differently,
   which I'd like to fix before we extend this approach to more things.

Here are a couple of relevant pages:

  http://docs.openstack.org/developer/pbr/#requirements
  https://www.caremad.io/2013/07/setup-vs-requirement/

With the exception of Launchpad itself, everything we have is basically
a library in the sense of the latter post: we do deploy some things
separately, but we either already pull them in as dependencies elsewhere
(e.g. txpkgupload) or we might reasonably want to do so at least for the
purposes of integration testing (e.g. turnip).  So we should be treating
all our packages as reusable components, which I hope is fairly
uncontroversial.  That implies that all our packages should declare
their direct dependencies *somewhere*, and at least those that are
top-level deployment targets should declare a full set of pinned
versions as well.

It's useful to be able to test a library with the versions that are
pinned for a particular application deployment (e.g. "does this
lazr.restful branch work with the set of versions pinned by Launchpad
itself?").  Our current strategy is to upgrade versions.cfg (for
buildout) in library trunk branches fairly haphazardly but generally
aiming to be in sync with Launchpad itself; this isn't very satisfactory
in general and it particularly falls down when we have multiple
application deployments with good reasons to have different versions of
things pinned.

It's useful to be able to provide a requirements.txt which just builds
whatever's current; this makes things easier for developers, services
such as readthedocs, etc.  I certainly agree that we shouldn't have our
direct dependencies declared twice, and there are a couple of possible
approaches to that.  Aesthetically I think I prefer declaring them in
setup.py and using "-e ." in requirements.txt, since we generally seem
to need editable mode anyway; but the pbr approach would work too, so I
don't feel very strongly about this.
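
For concreteness, here is a minimal sketch of that approach; the
package name and dependencies are invented for illustration:

    # setup.py
    from setuptools import setup, find_packages

    setup(
        name='example-service',
        version='0.1.0',
        packages=find_packages(),
        # Direct dependencies only; pinned versions live elsewhere.
        install_requires=[
            'Twisted',
            'lazr.restful',
        ],
    )

    # requirements.txt
    -e .

Running "pip install -r requirements.txt" inside a virtualenv then
installs the project in editable mode and resolves its direct
dependencies from setup.py.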

I propose the following as a policy for all Launchpad-related projects:

 * All projects must use a virtual environment and pip to install their
   dependencies.  (This is relatively straightforward for most projects;
   Launchpad itself will take a while.)

 * All projects must declare their direct dependencies (and only
   those; the list must not be flattened or resolved) in setup.py and
   use "-e ." in requirements.txt, to be used with "pip install -r
   requirements.txt".  (Alternative: declare them in requirements.txt
   and use pbr's requirements parsing support.)

 * Projects that are the top level of a deployment must include a
   flattened list of pinned dependency versions in constraints.txt, as
   produced by "pip freeze", to be used with "pip install -c
   constraints.txt -e ." (see the sketch after this list).  All the
   necessary packages should be included in a suitable dependencies
   branch.  Unless the target system already has pip 7.1 or newer, they
   will typically also need a bootstrap-requirements.txt that upgrades
   to modern versions of pip and setuptools, since older versions lack
   constraints support.

 * Any project that requires a fork of a dependency (i.e. a version not
   on PyPI) must pin the forked version in constraints.txt.  If the
   forked version is also required for development builds, then the
   project should include a suitable VCS entry in requirements.txt (see
   https://pip.readthedocs.org/en/stable/reference/pip_install/#vcs-support;
   an example follows this list).  Note that this is safe because
   deployments use constraints.txt, not requirements.txt.

 * Testing libraries against application deployment versions should be
   achieved using "pip install -c /path/to/application/constraints.txt
   -e .".  Unless they are themselves the top level of a deployment,
   libraries should not include full pinned constraints; they will only
   get out of date.
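
To make the constraints workflow concrete, here is a rough sketch (the
exact commands and file names are an assumed shape rather than what any
of our projects does verbatim today):

    # Generate the flattened pin list from a known-good environment:
    pip freeze > constraints.txt

    # Deploy, or test the application itself, against those pins:
    pip install -c constraints.txt -e .

One possible shape for bootstrap-requirements.txt, applied first with
"pip install -U -r bootstrap-requirements.txt" on systems whose pip
predates constraints support:

    pip>=7.1
    setuptools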
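
And as an example of the VCS-entry form for a forked dependency (the
URL, branch, and package name are entirely hypothetical):

    # requirements.txt
    -e .
    -e git+https://git.example.com/forks/foo@our-branch#egg=foo

    # constraints.txt: pin the forked version for deployments
    foo==1.2.3.post1

Since deployments install with "-c constraints.txt" rather than "-r
requirements.txt", the VCS line only affects development installs.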

Does this all sound sensible, or have I missed something?

-- 
Colin Watson                                    [cjwatson@xxxxxxxxxxxxx]

