
launchpad-reviewers team mailing list archive

[Merge] lp:~jtv/launchpad/db-bug-752279 into lp:launchpad/db-devel

 

Jeroen T. Vermeulen has proposed merging lp:~jtv/launchpad/db-bug-752279 into lp:launchpad/db-devel.

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)
Related bugs:
  Bug #754453 in Launchpad itself: "publish-ftpmaster cleanup fails"
  https://bugs.launchpad.net/launchpad/+bug/754453

For more details, see:
https://code.launchpad.net/~jtv/launchpad/db-bug-752279/+merge/57277

= Summary =

The cron.publish-ftpmaster script, which we're replacing with a Python rewrite, handles "dists" directories in quite a complicated way.  Trying to mimic this shuffle in the Python script led to various bugs.  The "happy path" (the least complicated to analyze) shuffles them around like so: http://people.canonical.com/~jtv/publish-ftparchive-dists.png

What actually goes on is not all that difficult.  Two directories are perpetually maintained as copies of each other, using hard-linking rsync, with one holding the "currently published" state of a distribution and the other (the "backup" dists directory) holding a slightly outdated copy.  The latter is effectively used as a working directory to create a new version of the current one, but by maintaining it as a reasonably complete copy, the initial rsync that populates the working directory can be kept nice and fast.
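The backup scheme above boils down to one rsync invocation per archive.  A minimal sketch of composing it (the helper name and example paths are invented for illustration; the production paths differ):

```python
def compose_backup_rsync(dists_root, backup_dists):
    """Compose the rsync command that refreshes the backup dists.

    -a        preserve permissions, timestamps, and so on
    -H        preserve hard links, so unchanged files cost no extra space
    --delete  remove files that have vanished from the published dists
    The trailing slash on the source copies its contents, not the
    directory itself.
    """
    return ["rsync", "-aH", "--delete", dists_root + "/", backup_dists]


# Example (paths made up for illustration):
cmd = compose_backup_rsync(
    "/srv/archive/ubuntu/dists", "/srv/archive/ubuntu-distscopy/dists")
print(" ".join(cmd))
```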

With this clearer picture in hand, I simplified the directory dance to this one: http://people.canonical.com/~jtv/simplified-publish-ftparchive-dists.png

Actually, that's not the whole truth: at one point we need the backup dists directory to be in the archive root directory, where it doesn't normally live.  So we still need to move it temporarily, but with a few changes:

 * The initial rsync is done before moving anything.

 * Only very small, isolated stretches of code move dists directories to this temporary location.

 * The script recognizes and recovers dists directories in these temporary locations.

 * At most one directory is in a transient state at one time.

 * Re-using the same temporary locations for different needs reduces the recovery to one simple loop.
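Putting those bullet points together: the swap is three renames through one well-known temporary location, and recovery is a single existence check per archive.  A runnable sketch, with hypothetical helper names (the ".in-progress" suffix matches the one used in the branch):

```python
import os


def working_dists(dists):
    # The single, well-known temporary location for a dists directory.
    return dists + ".in-progress"


def recover_working_dists(dists, backup_dists):
    """If a previous run died mid-swap, move the stranded directory back."""
    working = working_dists(dists)
    if os.path.exists(working):
        os.rename(working, backup_dists)


def install_dists(dists, backup_dists):
    """Swap the current and backup dists via the temporary location."""
    working = working_dists(dists)
    os.rename(dists, working)         # at most one dir is in transit...
    os.rename(backup_dists, dists)    # ...the new dists goes live...
    os.rename(working, backup_dists)  # ...the old dists becomes the backup
```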

Lots more has changed.  I introduced a --post-rsync option that copies the current dists directory to the backup copy at the end, so that they'll be identical.  This will speed up the initial rsync on the next run, and so help reduce the turnaround time for security updates.  The cost is more work overall, but at least some of it will be out of the critical path.
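The two-pass flow with the new option might be condensed like this (a sketch only; the method names are invented, not the script's actual API):

```python
def run(script, security_only=False, post_rsync=False):
    """Condensed two-pass publishing flow with the --post-rsync option."""
    script.rsync_backup()               # cheap: backup is nearly current
    script.publish(security_only=True)  # critical path: security fixes
    if not security_only:
        script.rsync_backup()
        script.publish(security_only=False)
    if post_rsync:
        # Make the backup identical to the published dists now, outside
        # the critical path, so the next run's initial rsync has almost
        # nothing to do.
        script.rsync_backup()
```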


== Tests ==

This is a massively slow test, and some day we'll want to try to speed it up.  Right now, however, it's the only test we have for an absolutely vital piece of infrastructure, so I largely ignored performance considerations.

{{{
./bin/test -vvc lp.archivepublisher.tests.test_publish_ftparchive
}}}


== Demo and Q/A ==

With this in place, we can do a new round of full Q/A on the new publish-ftpmaster script.  This includes verifying that the run-parts plugin scripts all work properly (insofar as that can be verified on test servers).


= Launchpad lint =

Checking for conflicts and issues in changed files.

Linting changed files:
  cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt
  cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings
  lib/lp/archivepublisher/scripts/publish_ftpmaster.py
  cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases
  lib/lp/archivepublisher/tests/test_publish_ftpmaster.py


Jeroen
-- 
https://code.launchpad.net/~jtv/launchpad/db-bug-752279/+merge/57277
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~jtv/launchpad/db-bug-752279 into lp:launchpad/db-devel.
=== modified file 'cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases'
--- cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases	2011-04-08 18:29:54 +0000
+++ cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/10-sign-releases	2011-04-12 07:28:33 +0000
@@ -6,7 +6,6 @@
 GNUPGHOME=/srv/launchpad.net/ubuntu-archive/gnupg-home
 
 RELEASE_FILES=`find $DISTSROOT -maxdepth 2 -name Release`
-
 DIST_UPGRADER_TARBALLS=`
 	find $DISTSROOT/*/*/dist-upgrader* -name "*.tar.gz" || true`
 

=== modified file 'cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings'
--- cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings	2011-04-07 06:30:02 +0000
+++ cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/20-remove-uncompressed-listings	2011-04-12 07:28:33 +0000
@@ -8,4 +8,4 @@
 # It's safe to do this since the uncompressed MD5 hashes have already been
 # computed for inclusion in the Release files.
 
-find $DISTSROOT.new \( -name -o -name Sources \) -exec rm -f -- "{}" \;
+find $DISTSROOT \( -name Packages -o -name Sources \) -exec rm -f -- "{}" \;

=== modified file 'cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt'
--- cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt	2011-04-08 18:59:20 +0000
+++ cronscripts/publishing/distro-parts/ubuntu/publish-distro.d/README.txt	2011-04-12 07:28:33 +0000
@@ -15,8 +15,8 @@
 ARCHIVEROOT - the archive's root directory
 (e.g. /srv/launchpad.net/ubuntu-archive/ubuntu/ )
 
-DISTSROOT - the archive's dists root directory
-(e.g. /srv/launchpad.net/ubuntu-archive/ubuntu/dists )
+DISTSROOT - a working copy of the archive's dists root directory
+(e.g. /srv/launchpad.net/ubuntu-archive/ubuntu/dists.new )
 
 OVERRIDEROOT - the archive's overrides root directory (primary archive only)
 (e.g. /srv/launchpad.net/ubuntu-overrides, or the empty string for partner)

=== modified file 'lib/lp/archivepublisher/scripts/publish_ftpmaster.py'
--- lib/lp/archivepublisher/scripts/publish_ftpmaster.py	2011-04-08 18:59:20 +0000
+++ lib/lp/archivepublisher/scripts/publish_ftpmaster.py	2011-04-12 07:28:33 +0000
@@ -71,6 +71,29 @@
     return ' '.join(['='.join(pair) for pair in env.iteritems()])
 
 
+def get_backup_dists(archive_config):
+    """Return the path of an archive's backup dists directory."""
+    return os.path.join(archive_config.archiveroot + "-distscopy", "dists")
+
+
+def get_dists(archive_config):
+    """Return the path of an archive's dists directory.
+
+    :param archive_config: Configuration for the archive in question.
+    """
+    return archive_config.distsroot
+
+
+def get_working_dists(archive_config):
+    """Return the working path for an archive's dists directory.
+
+    In order for publish-distro to operate on an archive, its dists
+    directory must be in the archive root.  So we move the backup
+    dists directory to a working location below the archive root just
+    for publish-distro.  This method composes the temporary path.
+    """
+    return get_dists(archive_config) + ".in-progress"
+
 def extend_PATH():
     """Produce env dict for extending $PATH.
 
@@ -83,11 +106,6 @@
     return {"PATH": '"$PATH":%s' % shell_quote(scripts_dir)}
 
 
-def get_distscopyroot(archive_config):
-    """Return the distscopy root directory for `archive_config`."""
-    return archive_config.archiveroot + "-distscopy"
-
-
 class StoreArgument:
     """Helper class: receive argument and store it."""
 
@@ -116,21 +134,54 @@
 
 
 class PublishFTPMaster(LaunchpadCronScript):
-    """Publish a distro (update)."""
-
-    # Has the publication been done?  This indicates that the distsroots
-    # have been replaced with newly generated ones.  It has implications
-    # for cleanup.
-    done_pub = False
+    """Publish a distro (update).
+
+    The publishable files are kept in the filesystem.  Most of the work
+    is done in a working "dists" directory in each archive's dists copy
+    root, which then replaces the current "dists" in the archive root.
+
+    For performance reasons, the old "dists" is not discarded.  It is
+    kept as the dists-copy version for the next run.  Its contents
+    don't matter in detail; an rsync updates it based on the currently
+    published dists directory before we start working with it.
+
+    At the end of one pass of the script, the "dists" directory in the
+    archive root and its backup copy in the dists-copy root will have
+    traded places.
+
+    However the script normally does 2 passes: once just for security
+    updates, to expedite publication of critical fixes, and once for the
+    whole distribution.  At the end of this, the directories will be
+    back in their original places (though with updated contents).
+    """
 
     def add_my_options(self):
+        """See `LaunchpadScript`."""
         self.parser.add_option(
             '-d', '--distribution', dest='distribution', default=None,
             help="Distribution to publish.")
         self.parser.add_option(
+            '-p', '--post-rsync', dest='post_rsync', action='store_true',
+            default=False,
+            help="When done, rsync backup dists to speed up the next run.")
+        self.parser.add_option(
             '-s', '--security-only', dest='security_only',
             action='store_true', default=False, help="Security upload only.")
 
+    def processOptions(self):
+        """Handle command-line options.
+
+        Sets `self.distribution` to the `Distribution` to publish.
+        """
+        if self.options.distribution is None:
+            raise LaunchpadScriptFailure("Specify a distribution.")
+
+        self.distribution = getUtility(IDistributionSet).getByName(
+            self.options.distribution)
+        if self.distribution is None:
+            raise LaunchpadScriptFailure(
+                "Distribution %s not found." % self.options.distribution)
+
     def executeShell(self, command_line, failure=None):
         """Run `command_line` through a shell.
 
@@ -161,7 +212,7 @@
             for archive in self.distribution.all_distro_archives
                 if archive.purpose in ARCHIVES_TO_PUBLISH]
 
-    def makeConfigs(self):
+    def getConfigs(self):
         """Set up configuration objects for archives to be published.
 
         The configs dict maps the archive purposes that are relevant for
@@ -171,24 +222,6 @@
             (archive.purpose, getPubConfig(archive))
             for archive in self.archives)
 
-    def cleanUp(self):
-        """Post-publishing cleanup."""
-        self.logger.debug("Cleaning up.")
-        for purpose, archive_config in self.configs.iteritems():
-            self.logger.debug(
-                "Moving %s dists backup to safe keeping for next time.",
-                purpose.title)
-            distscopyroot = get_distscopyroot(archive_config)
-            dists = os.path.join(distscopyroot, "dists")
-            if self.done_pub:
-                replacement_dists = archive_config.distsroot + ".old"
-            else:
-                replacement_dists = archive_config.distsroot + ".new"
-            if file_exists(replacement_dists):
-                self.logger.debug(
-                    "Renaming %s to %s.", replacement_dists, dists)
-                os.rename(replacement_dists, dists)
-
     def processAccepted(self):
         """Run the process-accepted script."""
         self.logger.debug(
@@ -212,50 +245,54 @@
         suites = self.getDirtySuites()
         return [suite for suite in suites if suite.endswith('-security')]
 
-    def rsyncNewDists(self, archive_purpose):
-        """Populate dists.new with a copy of distsroot.
+    def rsyncBackupDists(self):
+        """Populate the backup dists with a copy of distsroot.
 
         Uses "rsync -aH --delete" so that any obsolete files that may
-        still be in dists.new are cleaned up (bug 58835).
+        still be in the backup dists are cleaned out (bug 58835).
 
         :param archive_purpose: The (purpose of the) archive to copy.
         """
-        archive_config = self.configs[archive_purpose]
-        self.executeShell(
-            "rsync -aH --delete '%s/' '%s/dists.new'"
-            % (archive_config.distsroot, archive_config.archiveroot),
-            failure=LaunchpadScriptFailure(
-                "Failed to rsync dists.new for %s." % archive_purpose.title))
+        for purpose, archive_config in self.configs.iteritems():
+            dists = get_dists(archive_config)
+            backup_dists = get_backup_dists(archive_config)
+            self.executeShell(
+                "rsync -aH --delete '%s/' '%s'" % (dists, backup_dists),
+                failure=LaunchpadScriptFailure(
+                    "Failed to rsync new dists for %s." % purpose.title))
+
+    def recoverWorkingDists(self):
+        """Look for and recover any dists left in transient working state.
+
+        An archive's dists directory is temporarily moved into the
+        archive root for running publish-distro.  If a previous script
+        run died while in this state, restore the directory to its
+        permanent location.
+        """
+        for archive_config in self.configs.itervalues():
+            working_location = get_working_dists(archive_config)
+            if file_exists(working_location):
+                self.logger.info(
+                    "Recovering working directory %s from failed run.",
+                    working_location)
+                os.rename(working_location, get_backup_dists(archive_config))
 
     def setUpDirs(self):
-        """Copy the dists tree ready for publishing into.
-
-        We do this so that we don't get an inconsistent dists tree at
-        any point during the publishing cycle (which would cause buildds
-        to explode).
-
-        This is now done through maintaining a persistent backup copy of
-        the dists directory, which we move into place and bring up to
-        date with rsync.  Should achieve the same effect as copying, but
-        faster.
-        """
-        for archive_config in self.configs.itervalues():
+        """Create archive roots and such if they did not yet exist."""
+        for archive_purpose, archive_config in self.configs.iteritems():
             archiveroot = archive_config.archiveroot
             if not file_exists(archiveroot):
                 self.logger.debug("Creating archive root %s.", archiveroot)
                 os.makedirs(archiveroot)
-            distsroot = archive_config.distsroot
-            if not file_exists(distsroot):
-                self.logger.debug("Creating dists root %s.", distsroot)
-                os.makedirs(distsroot)
-
-        for purpose, archive_config in self.configs.iteritems():
-            dists = os.path.join(get_distscopyroot(archive_config), "dists")
-            dists_new = os.path.join(archive_config.archiveroot, "dists.new")
+            dists = get_dists(archive_config)
             if not file_exists(dists):
+                self.logger.debug("Creating dists root %s.", dists)
                 os.makedirs(dists)
-            os.rename(dists, dists_new)
-            self.rsyncNewDists(purpose)
+            distscopy = get_backup_dists(archive_config)
+            if not file_exists(distscopy):
+                self.logger.debug(
+                    "Creating backup dists directory %s", distscopy)
+                os.makedirs(distscopy)
 
     def publishDistroArchive(self, archive, security_suites=None):
         """Publish the results for an archive.
@@ -265,13 +302,19 @@
             the publishing to.
         """
         purpose = archive.purpose
+        archive_config = self.configs[purpose]
         self.logger.debug(
             "Publishing the %s %s...", self.distribution.name, purpose.title)
-        archive_config = self.configs[purpose]
+
+        # For reasons unknown, publishdistro only seems to work with a
+        # directory that's inside the archive root.  So we move it there
+        # for the duration.
+        temporary_dists = get_working_dists(archive_config)
+
         arguments = [
             '-v', '-v',
             '-d', self.distribution.name,
-            '-R', archive_config.distsroot + '.new',
+            '-R', temporary_dists,
             ]
 
         if archive.purpose == ArchivePurpose.PARTNER:
@@ -282,8 +325,14 @@
 
         parser = OptionParser()
         publishdistro.add_options(parser)
-        options, args = parser.parse_args(arguments)
-        publishdistro.run_publisher(options, txn=self.txn, log=self.logger)
+
+        os.rename(get_backup_dists(archive_config), temporary_dists)
+        try:
+            options, args = parser.parse_args(arguments)
+            publishdistro.run_publisher(
+                options, txn=self.txn, log=self.logger)
+        finally:
+            os.rename(temporary_dists, get_backup_dists(archive_config))
 
         self.runPublishDistroParts(archive)
 
@@ -292,41 +341,32 @@
         archive_config = self.configs[archive.purpose]
         env = {
             'ARCHIVEROOT': shell_quote(archive_config.archiveroot),
-            'DISTSROOT': shell_quote(archive_config.distsroot + ".new"),
+            'DISTSROOT': shell_quote(get_backup_dists(archive_config)),
+            'OVERRIDEROOT': shell_quote(archive_config.overrideroot),
             }
         if archive_config.overrideroot is not None:
             env["OVERRIDEROOT"] = shell_quote(archive_config.overrideroot)
         self.runParts('publish-distro.d', env)
 
     def installDists(self):
-        """Put the new dists into place, as near-atomically as possible."""
-        # Before we start moving directories around, make as nearly
-        # sure as possible that we can do either none or all of them.
-        self.logger.debug("Looking for impediments to publication.")
-        for purpose, archive_config in self.configs.iteritems():
-            old_distsroot = archive_config.distsroot + '.old'
-            if file_exists(old_distsroot):
-                raise LaunchpadScriptFailure(
-                    "Old %s distsroot %s is in the way."
-                    % (purpose.title, old_distsroot))
+        """Put the new dists into place, as near-atomically as possible.
 
-        # Move the existing distsroots out of the way, and move the new
-        # ones in their place.
-        self.logger.debug("Placing the new dists into place...")
+        For each archive, this switches the dists directory and the
+        backup dists directory around.
+        """
+        self.logger.debug("Moving the new dists into place...")
         for archive_config in self.configs.itervalues():
-            distsroot = archive_config.distsroot
-            os.rename(distsroot, distsroot + ".old")
-            os.rename(distsroot + ".new", distsroot)
-
-        # Yay, we did it!  Mark the fact because it makes a difference
-        # to the cleanup procedure.
-        self.done_pub = True
-
-        for purpose, archive_config in self.configs.iteritems():
-            dists = os.path.join(get_distscopyroot(archive_config), "dists")
-            self.logger.debug(
-                "Renaming old %s distsroot to %s." % (purpose.title, dists))
-            os.rename(archive_config.distsroot + ".old", dists)
+            # Use the dists "working location" as a temporary place to
+            # move the current dists out of the way for the switch.  If
+            # we die in this state, the next run will know to move the
+            # temporary directory to the backup location.
+            dists = get_dists(archive_config)
+            temp_dists = get_working_dists(archive_config)
+            backup_dists = get_backup_dists(archive_config)
+
+            os.rename(dists, temp_dists)
+            os.rename(backup_dists, dists)
+            os.rename(temp_dists, backup_dists)
 
     def runCommercialCompat(self):
         """Generate the -commercial pocket.
@@ -371,20 +411,6 @@
                 "find '%s' -type d -empty | xargs -r rmdir"
                 % archive_config.archiveroot)
 
-    def processOptions(self):
-        """Handle command-line options.
-
-        Sets `self.distribution` to the `Distribution` to publish.
-        """
-        if self.options.distribution is None:
-            raise LaunchpadScriptFailure("Specify a distribution.")
-
-        self.distribution = getUtility(IDistributionSet).getByName(
-            self.options.distribution)
-        if self.distribution is None:
-            raise LaunchpadScriptFailure(
-                "Distribution %s not found." % self.options.distribution)
-
     def runParts(self, parts, env):
         """Execute run-parts.
 
@@ -422,14 +448,9 @@
         if len(security_suites) == 0:
             self.logger.debug("Nothing to do for security publisher.")
             return
-        partner_archive = self.distribution.getArchive("partner")
-        if partner_archive is not None:
-            self.publishDistroArchive(partner_archive)
+
         self.publishDistroArchive(
             self.distribution.main_archive, security_suites=security_suites)
-        self.installDists()
-        self.runCommercialCompat()
-        self.runFinalizeParts(security_only=True)
 
     def publishAllUploads(self):
         """Publish the distro's complete uploads."""
@@ -439,26 +460,61 @@
             # most of its time.
             self.publishDistroArchive(archive)
 
-        self.installDists()
-        self.runCommercialCompat()
-        self.generateListings()
-        self.clearEmptyDirs()
-        self.runFinalizeParts()
+    def publish(self, security_only=False):
+        """Do the main publishing work.
+
+        :param security_only: If True, limit publication to security
+            updates on the main archive.  This is much faster, so it
+            makes sense to do a security-only run before the main
+            event to expedite critical fixes.
+        """
+        try:
+            if security_only:
+                self.publishSecurityUploads()
+            else:
+                self.publishAllUploads()
+
+            # Swizzle the now-updated backup dists and the current dists
+            # around.
+            self.installDists()
+        except:
+            # If we failed here, there's a chance that we left a
+            # working dists directory in its temporary location.  If so,
+            # recover it.  The next script run would do that anyway, but
+            # doing it now is easier on admins trying to recover from
+            # system problems.
+            self.recoverWorkingDists()
+            raise
 
     def setUp(self):
         """Process options, and set up internal state."""
         self.processOptions()
         self.archives = self.getArchives()
-        self.configs = self.makeConfigs()
+        self.configs = self.getConfigs()
 
     def main(self):
         """See `LaunchpadScript`."""
         self.setUp()
-        try:
-            self.processAccepted()
-            self.setUpDirs()
-            self.publishSecurityUploads()
-            if not self.options.security_only:
-                self.publishAllUploads()
-        finally:
-            self.cleanUp()
+        self.recoverWorkingDists()
+        self.processAccepted()
+        self.setUpDirs()
+
+        self.rsyncBackupDists()
+        self.publish(security_only=True)
+        self.runCommercialCompat()
+        self.runFinalizeParts(security_only=True)
+
+        if not self.options.security_only:
+            self.rsyncBackupDists()
+            self.publish(security_only=False)
+            self.runCommercialCompat()
+            self.generateListings()
+            self.clearEmptyDirs()
+            self.runFinalizeParts(security_only=False)
+
+        if self.options.post_rsync:
+            #  Update the backup dists with the published changes.  The
+            #  initial rsync on the next run will not need to make any
+            #  changes, and so it'll take the next run a little less
+            #  time to publish its security updates.
+            self.rsyncBackupDists()

=== modified file 'lib/lp/archivepublisher/tests/test_publish_ftpmaster.py'
--- lib/lp/archivepublisher/tests/test_publish_ftpmaster.py	2011-04-08 18:29:54 +0000
+++ lib/lp/archivepublisher/tests/test_publish_ftpmaster.py	2011-04-12 07:28:33 +0000
@@ -6,6 +6,7 @@
 __metaclass__ = type
 
 from apt_pkg import TagFile
+import logging
 import os
 from textwrap import dedent
 from zope.component import getUtility
@@ -22,7 +23,10 @@
     PackagePublishingPocket,
     pocketsuffix,
     )
-from lp.services.log.logger import DevNullLogger
+from lp.services.log.logger import (
+    BufferLogger,
+    DevNullLogger,
+    )
 from lp.services.scripts.base import LaunchpadScriptFailure
 from lp.services.utils import file_exists
 from lp.soyuz.enums import (
@@ -33,6 +37,7 @@
     compose_env_string,
     compose_shell_boolean,
     find_run_parts_dir,
+    get_working_dists,
     PublishFTPMaster,
     shell_quote,
     )
@@ -80,6 +85,26 @@
     return os.path.join("cronscripts", "publishing", "distro-parts")
 
 
+def write_marker_file(path, contents):
+    """Write a marker file for checking direction movements.
+
+    :param path: A list of path components.
+    :param contents: Text to write into the file.
+    """
+    marker = file(os.path.join(*path), "w")
+    marker.write(contents)
+    marker.flush()
+    marker.close()
+
+
+def read_marker_file(path):
+    """Read the contents of a marker file.
+
+    :param return: Contents of the marker file.
+    """
+    return file(os.path.join(*path)).read()
+
+
 class HelpersMixin:
     """Helpers for the PublishFTPMaster tests."""
 
@@ -205,11 +230,11 @@
         self.setUpForScriptRun(ubuntu)
         return ubuntu
 
-    def makeScript(self, distro=None):
+    def makeScript(self, distro=None, extra_args=[]):
         """Produce instance of the `PublishFTPMaster` script."""
         if distro is None:
             distro = self.makeDistro()
-        script = PublishFTPMaster(test_args=["-d", distro.name])
+        script = PublishFTPMaster(test_args=["-d", distro.name] + extra_args)
         script.txn = self.layer.txn
         script.logger = DevNullLogger()
         return script
@@ -220,24 +245,6 @@
         self.assertEqual(1, len(sections))
         return dict(sections[0])
 
-    def writeMarkerFile(self, path, contents):
-        """Write a marker file for checking direction movements.
-
-        :param path: A list of path components.
-        :param contents: Text to write into the file.
-        """
-        marker = file(os.path.join(*path), "w")
-        marker.write(contents)
-        marker.flush()
-        marker.close()
-
-    def readMarkerFile(self, path):
-        """Read the contents of a marker file.
-
-        :param return: Contents of the marker file.
-        """
-        return file(os.path.join(*path)).read()
-
     def enableCommercialCompat(self):
         """Enable commercial-compat.sh runs for the duration of the test."""
         config.push("commercial-compat", dedent("""\
@@ -349,41 +356,6 @@
         self.assertEqual(distro.displayname, main_release["Label"])
         self.assertEqual("source", main_release["Architecture"])
 
-    def test_cleanup_moves_dists_to_new_if_not_published(self):
-        distro = self.makeDistro()
-        pub_config = get_pub_config(distro)
-        dists_root = get_dists_root(pub_config)
-        dists_copy_root = get_distscopy_root(pub_config)
-        new_distsroot = dists_root + ".new"
-        os.makedirs(new_distsroot)
-        self.writeMarkerFile([new_distsroot, "marker"], "dists.new")
-        os.makedirs(dists_copy_root)
-
-        script = self.makeScript(distro)
-        script.setUp()
-        script.cleanUp()
-        self.assertEqual(
-            "dists.new",
-            self.readMarkerFile([dists_copy_root, "dists", "marker"]))
-
-    def test_cleanup_moves_dists_to_old_if_published(self):
-        distro = self.makeDistro()
-        pub_config = get_pub_config(distro)
-        dists_root = get_dists_root(pub_config)
-        old_distsroot = dists_root + ".old"
-        dists_copy_root = get_distscopy_root(pub_config)
-        os.makedirs(old_distsroot)
-        self.writeMarkerFile([old_distsroot, "marker"], "dists.old")
-        os.makedirs(dists_copy_root)
-
-        script = self.makeScript(distro)
-        script.setUp()
-        script.done_pub = True
-        script.cleanUp()
-        self.assertEqual(
-            "dists.old",
-            self.readMarkerFile([dists_copy_root, "dists", "marker"]))
-
     def test_getDirtySuites_returns_suite_with_pending_publication(self):
         spph = self.factory.makeSourcePackagePublishingHistory()
         script = self.makeScript(spph.distroseries.distribution)
@@ -446,24 +418,26 @@
         script = self.makeScript(distro)
         script.setUp()
         dists_root = get_dists_root(get_pub_config(distro))
+        dists_backup = os.path.join(
+            get_distscopy_root(get_pub_config(distro)), "dists")
+        os.makedirs(dists_backup)
         os.makedirs(dists_root)
-        os.makedirs(dists_root + ".new")
-        self.writeMarkerFile([dists_root, "new-file"], "New file")
-        script.rsyncNewDists(ArchivePurpose.PRIMARY)
+        write_marker_file([dists_root, "new-file"], "New file")
+        script.rsyncBackupDists()
         self.assertEqual(
-            "New file",
-            self.readMarkerFile([dists_root + ".new", "new-file"]))
+            "New file", read_marker_file([dists_backup, "new-file"]))
 
     def test_rsync_cleans_up_obsolete_files(self):
         distro = self.makeDistro()
         script = self.makeScript(distro)
         script.setUp()
-        dists_root = get_dists_root(get_pub_config(distro))
-        os.makedirs(dists_root)
-        os.makedirs(dists_root + ".new")
-        old_file = [dists_root + ".new", "old-file"]
-        self.writeMarkerFile(old_file, "old-file")
-        script.rsyncNewDists(ArchivePurpose.PRIMARY)
+        dists_backup = os.path.join(
+            get_distscopy_root(get_pub_config(distro)), "dists")
+        os.makedirs(dists_backup)
+        old_file = [dists_backup, "old-file"]
+        write_marker_file(old_file, "old-file")
+        os.makedirs(get_dists_root(get_pub_config(distro)))
+        script.rsyncBackupDists()
         self.assertFalse(path_exists(*old_file))
 
     def test_setUpDirs_creates_directory_structure(self):
@@ -480,9 +454,9 @@
 
         self.assertTrue(file_exists(archive_root))
         self.assertTrue(file_exists(dists_root))
-        self.assertTrue(file_exists(dists_root + ".new"))
+        self.assertTrue(file_exists(get_distscopy_root(pub_config)))
 
-    def test_setUpDirs_does_not_mind_if_directories_already_exist(self):
+    def test_setUpDirs_does_not_mind_if_dist_directories_already_exist(self):
         distro = self.makeDistro()
         script = self.makeScript(distro)
         script.setUp()
@@ -490,17 +464,6 @@
         script.setUpDirs()
         self.assertTrue(file_exists(get_archive_root(get_pub_config(distro))))
 
-    def test_setUpDirs_moves_dists_to_dists_new(self):
-        distro = self.makeDistro()
-        dists_root = get_dists_root(get_pub_config(distro))
-        script = self.makeScript(distro)
-        script.setUp()
-        script.setUpDirs()
-        self.writeMarkerFile([dists_root, "marker"], "X")
-        script.setUpDirs()
-        self.assertEqual(
-            "X", self.readMarkerFile([dists_root + ".new", "marker"]))
-
     def test_publishDistroArchive_runs_parts(self):
         distro = self.makeDistro()
         script = self.makeScript(distro)
@@ -527,31 +490,6 @@
         missing_parameters = required_parameters.difference(set(env.keys()))
         self.assertEqual(set(), missing_parameters)
 
-    def test_installDists_sets_done_pub(self):
-        script = self.makeScript()
-        script.setUp()
-        script.setUpDirs()
-        self.assertFalse(script.done_pub)
-        script.installDists()
-        self.assertTrue(script.done_pub)
-
-    def test_installDists_replaces_distsroot(self):
-        distro = self.makeDistro()
-        script = self.makeScript(distro)
-        script.setUp()
-        script.setUpDirs()
-        pub_config = get_pub_config(distro)
-        dists_root = get_dists_root(pub_config)
-
-        self.writeMarkerFile([dists_root, "marker"], "old")
-        self.writeMarkerFile([dists_root + ".new", "marker"], "new")
-
-        script.installDists()
-
-        self.assertEqual("new", self.readMarkerFile([dists_root, "marker"]))
-        self.assertEqual("old", self.readMarkerFile(
-            [get_distscopy_root(pub_config), "dists", "marker"]))
-
     def test_runCommercialCompat_runs_commercial_compat_script(self):
         # XXX JeroenVermeulen 2011-03-29 bug=741683: Retire
         # runCommercialCompat as soon as Dapper support ends.
@@ -611,7 +549,7 @@
         nonempty_dir = os.path.join(
             get_dists_root(get_pub_config(distro)), 'nonempty-dir')
         os.makedirs(nonempty_dir)
-        self.writeMarkerFile([nonempty_dir, "placeholder"], "Data here!")
+        write_marker_file([nonempty_dir, "placeholder"], "Data here!")
         script.clearEmptyDirs()
         self.assertTrue(file_exists(nonempty_dir))
 
@@ -690,27 +628,13 @@
         self.assertEqual(set(), missing_parameters)
 
     def test_publishSecurityUploads_skips_pub_if_no_security_updates(self):
-        script = self.makeScript(self.makeDistro())
+        script = self.makeScript()
         script.setUp()
         script.setUpDirs()
         script.installDists = FakeMethod()
         script.publishSecurityUploads()
         self.assertEqual(0, script.installDists.call_count)
 
-    def test_publishSecurityUploads_runs_finalize_parts(self):
-        distro = self.makeDistro()
-        self.factory.makeSourcePackagePublishingHistory(
-            distroseries=self.factory.makeDistroSeries(distribution=distro),
-            pocket=PackagePublishingPocket.SECURITY)
-        script = self.makeScript(distro)
-        script.setUp()
-        script.setUpDirs()
-        script.runFinalizeParts = FakeMethod()
-        script.publishSecurityUploads()
-        self.assertEqual(1, script.runFinalizeParts.call_count)
-        args, kwargs = script.runFinalizeParts.calls[0]
-        self.assertTrue(kwargs["security_only"])
-
     def test_publishAllUploads_publishes_all_distro_archives(self):
         distro = self.makeDistro()
         distroseries = self.factory.makeDistroSeries(distribution=distro)
@@ -733,14 +657,62 @@
         self.assertIn(distro.main_archive, published_archives)
         self.assertIn(partner_archive, published_archives)
 
-    def test_publishAllUploads_runs_finalize_parts(self):
+    def test_recoverWorkingDists_is_quiet_normally(self):
+        script = self.makeScript()
+        script.setUp()
+        script.logger = BufferLogger()
+        script.logger.setLevel(logging.INFO)
+        script.recoverWorkingDists()
+        self.assertEqual('', script.logger.getLogBuffer())
+
+    def test_recoverWorkingDists_recovers_working_directory(self):
         distro = self.makeDistro()
         script = self.makeScript(distro)
         script.setUp()
+        script.logger = BufferLogger()
+        script.logger.setLevel(logging.INFO)
         script.setUpDirs()
-        script.runFinalizeParts = FakeMethod()
+        archive_config = script.configs[ArchivePurpose.PRIMARY]
+        backup_dists = os.path.join(
+            archive_config.archiveroot + "-distscopy", "dists")
+        working_dists = get_working_dists(archive_config)
+        os.rename(backup_dists, working_dists)
+        write_marker_file([working_dists, "marker"], "Recovered")
+        script.recoverWorkingDists()
+        self.assertEqual(
+            "Recovered", read_marker_file([backup_dists, "marker"]))
+        self.assertNotEqual('', script.logger.getLogBuffer())
+
+    def test_publishes_first_security_updates_then_all_updates(self):
+        script = self.makeScript()
+        script.publish = FakeMethod()
+        script.main()
+        self.assertEqual(2, script.publish.call_count)
+        args, kwargs = script.publish.calls[0]
+        self.assertEqual({'security_only': True}, kwargs)
+        args, kwargs = script.publish.calls[1]
+        self.assertEqual(False, kwargs.get('security_only', False))
+
+    def test_security_run_publishes_only_security_updates(self):
+        script = self.makeScript(extra_args=['--security-only'])
+        script.publish = FakeMethod()
+        script.main()
+        self.assertEqual(1, script.publish.call_count)
+        args, kwargs = script.publish.calls[0]
+        self.assertEqual({'security_only': True}, kwargs)
+
+    def test_publishAllUploads_processes_all_archives(self):
+        distro = self.makeDistro()
+        partner_archive = self.factory.makeArchive(
+            distribution=distro, purpose=ArchivePurpose.PARTNER)
+        script = self.makeScript(distro)
+        script.publishDistroArchive = FakeMethod()
+        script.setUp()
         script.publishAllUploads()
-        self.assertEqual(1, script.runFinalizeParts.call_count)
+        published_archives = [
+            args[0] for args, kwargs in script.publishDistroArchive.calls]
+        self.assertContentEqual(
+            [distro.main_archive, partner_archive], published_archives)
 
     def test_runFinalizeParts_quotes_archiveroots(self):
         # Passing ARCHIVEROOTS to the finalize.d scripts is a bit
