
launchpad-reviewers team mailing list archive

[Merge] lp:~jtv/launchpad/more-arbitrary-lint into lp:launchpad


Jeroen T. Vermeulen has proposed merging lp:~jtv/launchpad/more-arbitrary-lint into lp:launchpad.

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)

For more details, see:
https://code.launchpad.net/~jtv/launchpad/more-arbitrary-lint/+merge/73165

= Summary =

While trying to resolve a conflict with devel, I ran the usual "make lint" as a quick check for obvious mistakes.  But that didn't work very well: people had left so much lint in the files they had touched over the past few days that the output was too noisy to be useful.

So I fixed some.  (I have another branch up for review that fixes another batch.)  I also ran our automated tools for correcting bad import formatting and updating copyright notices.  In some cases I ran the automated tool for reformatting doctests; our doctests are obsolescent and not worth much attention, but that is all the more reason for them to be consistent, easy to read, and to cause minimal trouble while we work on more worthwhile things.
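
For illustration, the style the import formatter enforces looks
roughly like this, judging from the hunks below (a sketch using stdlib
modules, not code from this branch):

    # Illustrative only.  A single-name import goes on one line:
    from unittest import TestCase

    # Multi-name imports are parenthesised, one name per line, sorted
    # case-insensitively, with an indented closing parenthesis:
    from textwrap import (
        dedent,
        fill,
        wrap,
        )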

Finally, since I was in there anyway, I cleaned up a number of database flushes and commits that had become unnecessary for various reasons.
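
To give a purely hypothetical example of the pattern (the names appear
elsewhere in this diff, but whether any given call is redundant
depends on its surroundings): an explicit flush immediately before a
commit achieves nothing, because committing the transaction flushes
pending changes anyway.

    import transaction
    from canonical.database.sqlbase import flush_database_updates

    # Hypothetical "before": flush, then commit.
    flush_database_updates()
    transaction.commit()

    # "After": the commit alone suffices, since committing flushes
    # pending object changes to the database first.
    transaction.commit()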


= Launchpad lint =

Checking for conflicts and issues in changed files.

Linting changed files:
  lib/lp/soyuz/doc/package-cache.txt
  lib/lp/soyuz/stories/soyuz/xx-sourcepackage-changelog.txt
  lib/lp/registry/interfaces/pillar.py
  lib/lp/soyuz/doc/distroarchseriesbinarypackage.txt
  lib/lp/codehosting/codeimport/tests/test_workermonitor.py
  lib/lp/soyuz/stories/webservice/xx-archive.txt
  lib/lp/code/stories/codeimport/xx-codeimport-results.txt
  lib/lp/codehosting/codeimport/worker.py
  lib/lp/code/model/tests/test_branchmergeproposal.py
  lib/lp/soyuz/interfaces/archive.py
  lib/lp/code/model/tests/test_codeimport.py
  lib/lp/codehosting/tests/test_safe_open.py
  lib/lp/registry/browser/tests/product-portlet-packages-view.txt
  lib/lp/codehosting/safe_open.py
  lib/lp/scripts/tests/test_garbo.py
  lib/lp/registry/browser/productseries.py
  lib/lp/codehosting/codeimport/tests/servers.py
  lib/lp/codehosting/codeimport/tests/test_foreigntree.py
  lib/lp/registry/interfaces/person.py
  lib/lp/codehosting/codeimport/workermonitor.py
  lib/lp/codehosting/codeimport/tests/test_worker.py
  lib/lp/codehosting/puller/worker.py

./lib/lp/soyuz/stories/soyuz/xx-sourcepackage-changelog.txt
       9: want exceeds 78 characters.
      15: want exceeds 78 characters.
      77: want exceeds 78 characters.
      80: want exceeds 78 characters.
      97: source exceeds 78 characters.
./lib/lp/soyuz/stories/webservice/xx-archive.txt
      41: want exceeds 78 characters.
      45: want exceeds 78 characters.
     166: want exceeds 78 characters.
     182: want exceeds 78 characters.
     198: want exceeds 78 characters.
     214: want exceeds 78 characters.
     358: want exceeds 78 characters.
     419: want exceeds 78 characters.
     550: want exceeds 78 characters.
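
The "want exceeds 78 characters" complaints refer to expected-output
lines in doctests that overrun the 78-column limit ("source" refers to
the ">>>" lines).  Source lines can be wrapped with "..." continuation
prompts, as several hunks below do; an over-long output line usually
needs the doctest ELLIPSIS option instead.  A small self-contained
sketch, not taken from this branch:

    import doctest

    # A made-up example; the URL is illustrative only.
    def sample():
        """
        >>> url = 'http://launchpad.dev/~cprov/+archive/ppa/' + 'x' * 60
        >>> print(url)  # doctest: +ELLIPSIS
        http://launchpad.dev/~cprov/+archive/ppa/...
        """

    # With ELLIPSIS enabled, "..." matches the tail of the long URL,
    # keeping the expected-output line well under 78 columns.
    doctest.run_docstring_examples(
        sample, {}, optionflags=doctest.ELLIPSIS)
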
-- 
https://code.launchpad.net/~jtv/launchpad/more-arbitrary-lint/+merge/73165
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~jtv/launchpad/more-arbitrary-lint into lp:launchpad.
=== modified file 'lib/lp/code/model/tests/test_branchmergeproposal.py'
--- lib/lp/code/model/tests/test_branchmergeproposal.py	2011-08-23 15:36:01 +0000
+++ lib/lp/code/model/tests/test_branchmergeproposal.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009-2010 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 # pylint: disable-msg=F0401
@@ -12,9 +12,7 @@
     timedelta,
     )
 from difflib import unified_diff
-from unittest import (
-    TestCase,
-    )
+from unittest import TestCase
 
 from lazr.lifecycle.event import ObjectModifiedEvent
 from lazr.restfulclient.errors import BadRequest
@@ -28,11 +26,9 @@
 from canonical.database.constants import UTC_NOW
 from canonical.launchpad.ftests import import_secret_test_key
 from canonical.launchpad.interfaces.launchpad import IPrivacy
-from lp.services.messages.interfaces.message import IMessageJob
 from canonical.launchpad.webapp import canonical_url
 from canonical.launchpad.webapp.testing import verifyObject
 from canonical.testing.layers import (
-    AppServerLayer,
     DatabaseFunctionalLayer,
     LaunchpadFunctionalLayer,
     LaunchpadZopelessLayer,
@@ -79,6 +75,7 @@
     )
 from lp.registry.interfaces.person import IPersonSet
 from lp.registry.interfaces.product import IProductSet
+from lp.services.messages.interfaces.message import IMessageJob
 from lp.testing import (
     ExpectedException,
     launchpadlib_for,
@@ -86,8 +83,8 @@
     login_person,
     person_logged_in,
     TestCaseWithFactory,
+    WebServiceTestCase,
     ws_object,
-    WebServiceTestCase,
     )
 from lp.testing.factory import (
     GPGSigningContext,

=== modified file 'lib/lp/code/model/tests/test_codeimport.py'
--- lib/lp/code/model/tests/test_codeimport.py	2011-08-24 21:17:35 +0000
+++ lib/lp/code/model/tests/test_codeimport.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Unit tests for methods of CodeImport and CodeImportSet."""
@@ -348,7 +348,7 @@
         # Suspending a new import has no impact on jobs.
         code_import = self.factory.makeCodeImport()
         code_import.updateFromData(
-            {'review_status':CodeImportReviewStatus.SUSPENDED},
+            {'review_status': CodeImportReviewStatus.SUSPENDED},
             self.import_operator)
         self.assertIs(None, code_import.import_job)
         self.assertEqual(
@@ -378,7 +378,7 @@
         # Invalidating a new import has no impact on jobs.
         code_import = self.factory.makeCodeImport()
         code_import.updateFromData(
-            {'review_status':CodeImportReviewStatus.INVALID},
+            {'review_status': CodeImportReviewStatus.INVALID},
             self.import_operator)
         self.assertIs(None, code_import.import_job)
         self.assertEqual(
@@ -408,7 +408,7 @@
         # Marking a new import as failing has no impact on jobs.
         code_import = self.factory.makeCodeImport()
         code_import.updateFromData(
-            {'review_status':CodeImportReviewStatus.FAILING},
+            {'review_status': CodeImportReviewStatus.FAILING},
             self.import_operator)
         self.assertIs(None, code_import.import_job)
         self.assertEqual(
@@ -418,7 +418,7 @@
         # Marking an import with a pending job as failing, removes job.
         code_import = self.makeApprovedImportWithPendingJob()
         code_import.updateFromData(
-            {'review_status':CodeImportReviewStatus.FAILING},
+            {'review_status': CodeImportReviewStatus.FAILING},
             self.import_operator)
         self.assertIs(None, code_import.import_job)
         self.assertEqual(
@@ -428,7 +428,7 @@
         # Marking an import with a running job as failing leaves job.
         code_import = self.makeApprovedImportWithRunningJob()
         code_import.updateFromData(
-            {'review_status':CodeImportReviewStatus.FAILING},
+            {'review_status': CodeImportReviewStatus.FAILING},
             self.import_operator)
         self.assertIsNot(None, code_import.import_job)
         self.assertEqual(
@@ -720,7 +720,6 @@
             git_repo_url=self.factory.getUniqueURL())
         requester = self.factory.makePerson()
         code_import.requestImport(requester)
-        old_date = code_import.import_job.date_due
         e = self.assertRaises(
             CodeImportAlreadyRequested, code_import.requestImport, requester,
             error_if_already_requested=True)

=== modified file 'lib/lp/code/stories/codeimport/xx-codeimport-results.txt'
--- lib/lp/code/stories/codeimport/xx-codeimport-results.txt	2011-08-27 16:28:52 +0000
+++ lib/lp/code/stories/codeimport/xx-codeimport-results.txt	2011-08-28 08:46:25 +0000
@@ -1,4 +1,5 @@
-= Code import results =
+Code import results
+===================
 
 Information about the last ten import runs for a code import are shown
 on the branch index page with the other code import details.
@@ -15,9 +16,11 @@
 how each result type is rendered.
 
     >>> odin = factory.makeCodeImportMachine(hostname='odin')
-    >>> make_all_result_types(code_import_1, factory, machine=odin, start=0, count=7)
+    >>> make_all_result_types(
+    ...     code_import_1, factory, machine=odin, start=0, count=7)
     >>> branch_url_1 = canonical_url(code_import_1.branch)
-    >>> make_all_result_types(code_import_2, factory, machine=odin, start=7, count=7)
+    >>> make_all_result_types(
+    ...     code_import_2, factory, machine=odin, start=7, count=7)
     >>> branch_url_2 = canonical_url(code_import_2.branch)
     >>> logout()
 

=== modified file 'lib/lp/codehosting/codeimport/tests/servers.py'
--- lib/lp/codehosting/codeimport/tests/servers.py	2011-08-24 21:17:35 +0000
+++ lib/lp/codehosting/codeimport/tests/servers.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Server classes that know how to create various kinds of foreign archive."""
@@ -21,17 +21,17 @@
 import stat
 import subprocess
 import tempfile
+import threading
 import time
-import threading
 
 from bzrlib.branch import Branch
 from bzrlib.branchbuilder import BranchBuilder
 from bzrlib.bzrdir import BzrDir
-from bzrlib.tests.treeshape import build_tree_contents
 from bzrlib.tests.test_server import (
     ReadonlySmartTCPServer_for_testing,
     TestServer,
     )
+from bzrlib.tests.treeshape import build_tree_contents
 from bzrlib.transport import Server
 from bzrlib.urlutils import (
     escape,
@@ -45,14 +45,12 @@
     DictBackend,
     TCPGitServer,
     )
-from mercurial.ui import (
-    ui as hg_ui,
-    )
 from mercurial.hgweb import (
     hgweb,
     server as hgweb_server,
     )
 from mercurial.localrepo import localrepository
+from mercurial.ui import ui as hg_ui
 import subvertpy.ra
 import subvertpy.repos
 
@@ -120,7 +118,7 @@
         if self._use_svn_serve:
             conf_path = os.path.join(
                 self.repository_path, 'conf/svnserve.conf')
-            with open(conf_path , 'w') as conf_file:
+            with open(conf_path, 'w') as conf_file:
                 conf_file.write('[general]\nanon-access = write\n')
             self._svnserve = subprocess.Popen(
                 ['svnserve', '--daemon', '--foreground', '--threads',
@@ -355,7 +353,8 @@
             finally:
                 f.close()
             repo[None].add([filename])
-        repo.commit(text='<The commit message>', user='jane Foo <joe@xxxxxxx>')
+        repo.commit(
+            text='<The commit message>', user='jane Foo <joe@xxxxxxx>')
 
 
 class BzrServer(Server):
@@ -373,10 +372,11 @@
         branch.get_config().set_user_option("create_signatures", "never")
         builder = BranchBuilder(branch=branch)
         actions = [('add', ('', 'tree-root', 'directory', None))]
-        actions += [('add', (path, path+'-id', 'file', content)) for (path,
-            content) in tree_contents]
-        builder.build_snapshot(None, None,
-                actions, committer='Joe Foo <joe@xxxxxxx>',
+        actions += [
+            ('add', (path, path + '-id', 'file', content))
+            for (path, content) in tree_contents]
+        builder.build_snapshot(
+            None, None, actions, committer='Joe Foo <joe@xxxxxxx>',
                 message=u'<The commit message>')
 
     def get_url(self):
@@ -388,12 +388,17 @@
     def start_server(self):
         super(BzrServer, self).start_server()
         self.createRepository(self.repository_path)
+
         class LocalURLServer(TestServer):
             def __init__(self, repository_path):
                 self.repository_path = repository_path
-            def start_server(self): pass
+
+            def start_server(self):
+                pass
+
             def get_url(self):
                 return local_path_to_url(self.repository_path)
+
         if self._use_server:
             self._bzrserver = ReadonlySmartTCPServer_for_testing()
             self._bzrserver.start_server(

=== modified file 'lib/lp/codehosting/codeimport/tests/test_foreigntree.py'
--- lib/lp/codehosting/codeimport/tests/test_foreigntree.py	2011-08-24 14:45:35 +0000
+++ lib/lp/codehosting/codeimport/tests/test_foreigntree.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Tests for foreign branch support."""
@@ -10,7 +10,6 @@
 
 from bzrlib.tests import TestCaseWithTransport
 import CVS
-
 import subvertpy.client
 import subvertpy.ra
 import subvertpy.wc
@@ -180,7 +179,6 @@
 
         self.assertFileEqual(new_content, 'working_tree2/README')
 
-
     def test_update(self):
         # update() fetches any changes to the branch from the remote branch.
         # We test this by checking out the same branch twice, making

=== modified file 'lib/lp/codehosting/codeimport/tests/test_worker.py'
--- lib/lp/codehosting/codeimport/tests/test_worker.py	2011-08-25 10:32:14 +0000
+++ lib/lp/codehosting/codeimport/tests/test_worker.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Tests for the code import worker."""
@@ -12,23 +12,19 @@
 import tempfile
 import time
 
+from bzrlib import trace
 from bzrlib.branch import (
     Branch,
     BranchReferenceFormat,
     )
-from bzrlib.branchbuilder import (
-    BranchBuilder,
-    )
+from bzrlib.branchbuilder import BranchBuilder
 from bzrlib.bzrdir import (
     BzrDir,
     BzrDirFormat,
     format_registry,
     )
-from bzrlib.errors import (
-    NoSuchFile,
-    )
+from bzrlib.errors import NoSuchFile
 from bzrlib.tests import TestCaseWithTransport
-from bzrlib import trace
 from bzrlib.transport import get_transport
 from bzrlib.urlutils import (
     join as urljoin,
@@ -46,12 +42,6 @@
 from canonical.config import config
 from canonical.testing.layers import BaseLayer
 from lp.codehosting import load_optional_plugin
-from lp.codehosting.safe_open import (
-    AcceptAnythingPolicy,
-    BlacklistPolicy,
-    BadUrl,
-    SafeBranchOpener,
-    )
 from lp.codehosting.codeimport.tarball import (
     create_tarball,
     extract_tarball,
@@ -77,6 +67,12 @@
     ImportDataStore,
     ImportWorker,
     )
+from lp.codehosting.safe_open import (
+    AcceptAnythingPolicy,
+    BadUrl,
+    BlacklistPolicy,
+    SafeBranchOpener,
+    )
 from lp.codehosting.tests.helpers import create_branch_with_one_revision
 from lp.services.log.logger import BufferLogger
 from lp.testing import TestCase
@@ -223,7 +219,8 @@
         target_url = store._getMirrorURL(self.arbitrary_branch_id)
         knit_format = format_registry.get('knit')()
         tree = create_branch_with_one_revision(target_url, format=knit_format)
-        self.assertNotEquals(tree.bzrdir._format.repository_format.network_name(),
+        self.assertNotEquals(
+            tree.bzrdir._format.repository_format.network_name(),
             default_format.repository_format.network_name())
 
         # The fetched branch is in the default format.
@@ -494,7 +491,6 @@
         self.assertEquals(content, transport.get_bytes(remote_name))
 
 
-
 class MockForeignWorkingTree:
     """Working tree that records calls to checkout and update."""
 
@@ -669,8 +665,10 @@
         # getForeignTree returns an object that represents the 'foreign'
         # branch (i.e. a CVS or Subversion branch).
         worker = self.makeImportWorker()
+
         def _getForeignTree(target_path):
             return MockForeignWorkingTree(target_path)
+
         worker.foreign_tree_store._getForeignTree = _getForeignTree
         working_tree = worker.getForeignTree()
         self.assertIsSameRealPath(
@@ -873,7 +871,7 @@
         self.assertPositive(output.tell())
 
         self.addCleanup(
-            lambda : clean_up_default_stores_for_import(source_details))
+            lambda: clean_up_default_stores_for_import(source_details))
 
         tree_path = tempfile.mkdtemp()
         self.addCleanup(lambda: shutil.rmtree(tree_path))
@@ -977,7 +975,8 @@
         client.add('working_tree/newfile')
         client.log_msg_func = lambda c: 'Add a file'
         (revnum, date, author) = client.commit(['working_tree'], recurse=True)
-        # CSCVS breaks on commits without an author, so make sure there is one.
+        # CSCVS breaks on commits without an author, so make sure there
+        # is one.
         self.assertIsNot(None, author)
         self.foreign_commit_count += 1
         shutil.rmtree('working_tree')
@@ -1037,8 +1036,10 @@
 
     def test_invalid(self):
         # If there is no branch in the target URL, exit with FAILURE_INVALID
-        worker = self.makeImportWorker(self.factory.makeCodeImportSourceDetails(
-            rcstype=self.rcstype, url="http://localhost/path/non/existant"),
+        worker = self.makeImportWorker(
+            self.factory.makeCodeImportSourceDetails(
+                rcstype=self.rcstype,
+                url="http://localhost/path/non/existant"),
             opener_policy=AcceptAnythingPolicy())
         self.assertEqual(
             CodeImportWorkerExitCode.FAILURE_INVALID, worker.run())
@@ -1046,8 +1047,9 @@
     def test_forbidden(self):
         # If the branch specified is using an invalid scheme, exit with
         # FAILURE_FORBIDDEN
-        worker = self.makeImportWorker(self.factory.makeCodeImportSourceDetails(
-            rcstype=self.rcstype, url="file:///local/path"),
+        worker = self.makeImportWorker(
+            self.factory.makeCodeImportSourceDetails(
+                rcstype=self.rcstype, url="file:///local/path"),
             opener_policy=CodeImportBranchOpenPolicy())
         self.assertEqual(
             CodeImportWorkerExitCode.FAILURE_FORBIDDEN, worker.run())
@@ -1058,22 +1060,23 @@
             'trunk', [('bzr\\doesnt\\support\\this', 'Original contents')]),
             opener_policy=AcceptAnythingPolicy())
         self.assertEqual(
-            CodeImportWorkerExitCode.FAILURE_UNSUPPORTED_FEATURE, worker.run())
+            CodeImportWorkerExitCode.FAILURE_UNSUPPORTED_FEATURE,
+            worker.run())
 
     def test_partial(self):
-        # Only config.codeimport.revisions_import_limit will be imported in a
-        # given run.
+        # Only config.codeimport.revisions_import_limit will be imported
+        # in a given run.
         worker = self.makeImportWorker(self.makeSourceDetails(
             'trunk', [('README', 'Original contents')]),
             opener_policy=AcceptAnythingPolicy())
         self.makeForeignCommit(worker.source_details)
         self.assertTrue(self.foreign_commit_count > 1)
+        import_limit = self.foreign_commit_count - 1
         self.pushConfig(
             'codeimport',
-            git_revisions_import_limit=self.foreign_commit_count-1,
-            svn_revisions_import_limit=self.foreign_commit_count-1,
-            hg_revisions_import_limit=self.foreign_commit_count-1,
-            )
+            git_revisions_import_limit=import_limit,
+            svn_revisions_import_limit=import_limit,
+            hg_revisions_import_limit=import_limit)
         self.assertEqual(
             CodeImportWorkerExitCode.SUCCESS_PARTIAL, worker.run())
         self.assertEqual(

=== modified file 'lib/lp/codehosting/codeimport/tests/test_workermonitor.py'
--- lib/lp/codehosting/codeimport/tests/test_workermonitor.py	2011-08-27 16:28:52 +0000
+++ lib/lp/codehosting/codeimport/tests/test_workermonitor.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009-2010 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Tests for the CodeImportWorkerMonitor and related classes."""
@@ -16,12 +16,12 @@
 
 from bzrlib.branch import Branch
 from bzrlib.tests import TestCase as BzrTestCase
-import transaction
 from testtools.deferredruntest import (
     assert_fails_with,
     AsynchronousDeferredRunTest,
     flush_logged_errors,
     )
+import transaction
 from twisted.internet import (
     defer,
     error,
@@ -68,7 +68,6 @@
     CodeImportWorkerMonitorProtocol,
     ExitQuietly,
     )
-from lp.codehosting.safe_open import AcceptAnythingPolicy
 from lp.services.log.logger import BufferLogger
 from lp.services.twistedsupport import suppress_stderr
 from lp.services.twistedsupport.tests.test_processmonitor import (
@@ -81,6 +80,7 @@
     TestCase,
     )
 from lp.testing.factory import LaunchpadObjectFactory
+from lp.testing.fakemethod import FakeMethod
 
 
 class TestWorkerMonitorProtocol(ProcessTestsMixin, TestCase):
@@ -124,7 +124,7 @@
             self.protocol.resetTimeout()
             self.assertEqual(
                 self.worker_monitor.calls,
-                [('updateHeartbeat', '')]*i)
+                [('updateHeartbeat', '')] * i)
             self.clock.advance(
                 config.codeimportworker.heartbeat_update_interval)
 
@@ -181,9 +181,11 @@
     def callRemote(self, method_name, *args):
         method = getattr(self, '_remote_%s' % method_name, self._default)
         deferred = defer.maybeDeferred(method, *args)
+
         def append_to_log(pass_through):
             self.calls.append((method_name,) + tuple(args))
             return pass_through
+
         deferred.addCallback(append_to_log)
         return deferred
 
@@ -250,12 +252,14 @@
         log_file_name = self.factory.getUniqueString()
         worker_monitor = self.makeWorkerMonitorWithJob(
             1, (['a'], branch_url, log_file_name))
+
         def check_branch_log(ignored):
             # Looking at the _ attributes here is in slightly poor taste, but
             # much much easier than them by logging and parsing an oops, etc.
             self.assertEqual(
                 (branch_url, log_file_name),
                 (worker_monitor._branch_url, worker_monitor._log_file_name))
+
         return worker_monitor.getWorkerArguments().addCallback(
             check_branch_log)
 
@@ -288,10 +292,12 @@
         log_tail = self.factory.getUniqueString()
         job_id = self.factory.getUniqueInteger()
         worker_monitor = self.makeWorkerMonitorWithJob(job_id)
+
         def check_updated_details(result):
             self.assertEqual(
                 [('updateHeartbeat', job_id, log_tail)],
                 worker_monitor.codeimport_endpoint.calls)
+
         return worker_monitor.updateHeartbeat(log_tail).addCallback(
             check_updated_details)
 
@@ -302,10 +308,12 @@
         job_id = self.factory.getUniqueInteger()
         worker_monitor = self.makeWorkerMonitorWithJob(job_id)
         self.assertEqual(worker_monitor._log_file.tell(), 0)
+
         def check_finishJob_called(result):
             self.assertEqual(
                 [('finishJobID', job_id, 'SUCCESS', '')],
                 worker_monitor.codeimport_endpoint.calls)
+
         return worker_monitor.finishJob(
             CodeImportResultStatus.SUCCESS).addCallback(
             check_finishJob_called)
@@ -317,11 +325,13 @@
         log_text = self.factory.getUniqueString()
         worker_monitor = self.makeWorkerMonitorWithJob()
         worker_monitor._log_file.write(log_text)
+
         def check_file_uploaded(result):
             transaction.abort()
             url = worker_monitor.codeimport_endpoint.calls[0][3]
             text = urllib.urlopen(url).read()
             self.assertEqual(log_text, text)
+
         return worker_monitor.finishJob(
             CodeImportResultStatus.SUCCESS).addCallback(
             check_file_uploaded)
@@ -331,6 +341,8 @@
         # If the upload to the librarian fails for any reason, the worker
         # monitor still calls the finishJobID XML-RPC method, but logs an
         # error to indicate there was a problem.
+        class Fail(Exception):
+            """Some arbitrary failure."""
 
         # Write some text so that we try to upload the log.
         job_id = self.factory.getUniqueInteger()
@@ -338,22 +350,32 @@
         worker_monitor._log_file.write('some text')
 
         # Make _createLibrarianFileAlias fail in a distinctive way.
-        worker_monitor._createLibrarianFileAlias = lambda *args: 1/0
+        worker_monitor._createLibrarianFileAlias = FakeMethod(failure=Fail())
+
         def check_finishJob_called(result):
             self.assertEqual(
                 [('finishJobID', job_id, 'SUCCESS', '')],
                 worker_monitor.codeimport_endpoint.calls)
-            errors = flush_logged_errors(ZeroDivisionError)
+            errors = flush_logged_errors(Fail)
             self.assertEqual(1, len(errors))
+
         return worker_monitor.finishJob(
             CodeImportResultStatus.SUCCESS).addCallback(
             check_finishJob_called)
 
     def patchOutFinishJob(self, worker_monitor):
+        """Replace `worker_monitor.finishJob` with a `FakeMethod`-alike stub.
+
+        :param worker_monitor: CodeImportWorkerMonitor to patch up.
+        :return: A list of statuses that `finishJob` has been called with.
+            Future calls will be appended to this list.
+        """
         calls = []
+
         def finishJob(status):
             calls.append(status)
             return defer.succeed(None)
+
         worker_monitor.finishJob = finishJob
         return calls
 
@@ -451,10 +473,9 @@
         # calls finishJob with a status of FAILURE_UNSUPPORTED_FEATURE.
         worker_monitor = self.makeWorkerMonitorWithJob()
         calls = self.patchOutFinishJob(worker_monitor)
-        ret = worker_monitor.callFinishJob(
-            makeFailure(
-                error.ProcessTerminated,
-                exitCode=CodeImportWorkerExitCode.FAILURE_UNSUPPORTED_FEATURE))
+        ret = worker_monitor.callFinishJob(makeFailure(
+            error.ProcessTerminated,
+            exitCode=CodeImportWorkerExitCode.FAILURE_UNSUPPORTED_FEATURE))
         self.assertEqual(
             calls, [CodeImportResultStatus.FAILURE_UNSUPPORTED_FEATURE])
         self.assertOopsesLogged([])
@@ -486,6 +507,7 @@
         self.layer.force_dirty_database()
         worker_monitor = self.makeWorkerMonitorWithJob()
         ret = worker_monitor.callFinishJob(makeFailure(RuntimeError))
+
         def check_log_file(ignored):
             failures = flush_logged_errors(RuntimeError)
             self.assertEqual(1, len(failures))
@@ -543,7 +565,8 @@
             self.result_status = status
             return defer.succeed(None)
 
-    def assertFinishJobCalledWithStatus(self, ignored, worker_monitor, status):
+    def assertFinishJobCalledWithStatus(self, ignored, worker_monitor,
+                                        status):
         """Assert that finishJob was called with the given status."""
         self.assertEqual(worker_monitor.result_status, status)
 
@@ -580,6 +603,7 @@
         # If finishJob fails with ExitQuietly, the call to run() still
         # succeeds.
         worker_monitor = self.WorkerMonitor(defer.succeed(None))
+
         def finishJob(reason):
             raise ExitQuietly
         worker_monitor.finishJob = finishJob
@@ -772,6 +796,7 @@
             xmlrpc.Proxy(config.codeimportdispatcher.codeimportscheduler_url),
             "anything")
         deferred = monitor.run()
+
         def save_protocol_object(result):
             """Save the process protocol object.
 
@@ -781,6 +806,7 @@
             """
             self._protocol = monitor._protocol
             return result
+
         return deferred.addBoth(save_protocol_object)
 
     def test_import_cvs(self):
@@ -872,8 +898,11 @@
         # listening too.
         interpreter = '%s/bin/py' % config.root
         reactor.spawnProcess(
-            DeferredOnExit(process_end_deferred), interpreter,
-            [interpreter, script_path, '--access-policy=anything', str(job_id),
-                '-q'],
-            childFDs={0:0, 1:1, 2:2}, env=os.environ)
+            DeferredOnExit(process_end_deferred), interpreter, [
+                interpreter,
+                script_path,
+                '--access-policy=anything',
+                str(job_id),
+                '-q',
+                ], childFDs={0: 0, 1: 1, 2: 2}, env=os.environ)
         return process_end_deferred

=== modified file 'lib/lp/codehosting/codeimport/worker.py'
--- lib/lp/codehosting/codeimport/worker.py	2011-08-27 16:28:52 +0000
+++ lib/lp/codehosting/codeimport/worker.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """The code import worker. This imports code from foreign repositories."""
@@ -48,31 +48,29 @@
 import cscvs
 from cscvs.cmds import totla
 import CVS
-import SCM
-
-from canonical.config import config
-
 from lazr.uri import (
     InvalidURIError,
     URI,
     )
+import SCM
 
+from canonical.config import config
 from lp.code.enums import RevisionControlSystems
 from lp.code.interfaces.branch import get_blacklisted_hostnames
 from lp.codehosting.codeimport.foreigntree import (
     CVSWorkingTree,
     SubversionWorkingTree,
     )
+from lp.codehosting.codeimport.tarball import (
+    create_tarball,
+    extract_tarball,
+    )
+from lp.codehosting.codeimport.uifactory import LoggingUIFactory
 from lp.codehosting.safe_open import (
     BadUrl,
     BranchOpenPolicy,
     SafeBranchOpener,
     )
-from lp.codehosting.codeimport.tarball import (
-    create_tarball,
-    extract_tarball,
-    )
-from lp.codehosting.codeimport.uifactory import LoggingUIFactory
 from lp.services.propertycache import cachedproperty
 
 
@@ -944,7 +942,8 @@
     def getRevisionLimit(self):
         """See `PullingImportWorker.getRevisionLimit`."""
         # For now, just grab the whole branch at once.
-        # bzr does support fetch(limit=) but it isn't very efficient at the moment.
+        # bzr does support fetch(limit=) but it isn't very efficient at
+        # the moment.
         return None
 
     @property

=== modified file 'lib/lp/codehosting/codeimport/workermonitor.py'
--- lib/lp/codehosting/codeimport/workermonitor.py	2011-08-25 23:32:36 +0000
+++ lib/lp/codehosting/codeimport/workermonitor.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 # pylint: disable-msg=W0702
@@ -163,6 +163,7 @@
         """
         deferred = self.codeimport_endpoint.callRemote(
             'getImportDataForJobID', self._job_id)
+
         def _processResult(result):
             code_import_arguments, branch_url, log_file_name = result
             self._branch_url = branch_url
@@ -170,7 +171,8 @@
             self._logger.info(
                 'Found source details: %s', code_import_arguments)
             return code_import_arguments
-        return deferred.addCallbacks(_processResult, self._trap_nosuchcodeimportjob)
+        return deferred.addCallbacks(
+            _processResult, self._trap_nosuchcodeimportjob)
 
     def updateHeartbeat(self, tail):
         """Call the updateHeartbeat method for the job we are working on."""

=== modified file 'lib/lp/codehosting/puller/worker.py'
--- lib/lp/codehosting/puller/worker.py	2011-08-25 22:34:19 +0000
+++ lib/lp/codehosting/puller/worker.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 __metaclass__ = type
@@ -17,14 +17,14 @@
     BzrBranchFormat4,
     )
 from bzrlib.bzrdir import BzrDir
+from bzrlib.plugins.loom.branch import LoomSupport
 from bzrlib.repofmt.weaverepo import (
     RepositoryFormat4,
     RepositoryFormat5,
     RepositoryFormat6,
     )
+from bzrlib.transport import get_transport
 import bzrlib.ui
-from bzrlib.plugins.loom.branch import LoomSupport
-from bzrlib.transport import get_transport
 from bzrlib.ui import SilentUIFactory
 from lazr.uri import (
     InvalidURIError,
@@ -78,8 +78,6 @@
         self.scheme = scheme
 
 
-
-
 def get_canonical_url_for_branch_name(unique_name):
     """Custom implementation of canonical_url(branch) for error reporting.
 

=== modified file 'lib/lp/codehosting/safe_open.py'
--- lib/lp/codehosting/safe_open.py	2011-08-25 10:19:25 +0000
+++ lib/lp/codehosting/safe_open.py	2011-08-28 08:46:25 +0000
@@ -5,13 +5,13 @@
 
 __metaclass__ = type
 
+import threading
+
 from bzrlib import urlutils
 from bzrlib.branch import Branch
 from bzrlib.bzrdir import BzrDir
-
 from lazr.uri import URI
 
-import threading
 
 __all__ = [
     'AcceptAnythingPolicy',
@@ -173,8 +173,8 @@
 class SafeBranchOpener(object):
     """Safe branch opener.
 
-    All locations that are opened (stacked-on branches, references) are checked
-    against a policy object.
+    All locations that are opened (stacked-on branches, references) are
+    checked against a policy object.
 
     The policy object is expected to have the following methods:
     * checkOneURL
@@ -193,12 +193,12 @@
         """Install the ``transformFallbackLocation`` hook.
 
         This is done at module import time, but transformFallbackLocationHook
-        doesn't do anything unless the `_active_openers` threading.Local object
-        has a 'opener' attribute in this thread.
+        doesn't do anything unless the `_active_openers` threading.Local
+        object has a 'opener' attribute in this thread.
 
-        This is in a module-level function rather than performed at module level
-        so that it can be called in setUp for testing `SafeBranchOpener` as
-        bzrlib.tests.TestCase.setUp clears hooks.
+        This is in a module-level function rather than performed at module
+        level so that it can be called in setUp for testing `SafeBranchOpener`
+        as bzrlib.tests.TestCase.setUp clears hooks.
         """
         Branch.hooks.install_named_hook(
             'transform_fallback_location',
@@ -288,6 +288,7 @@
         url = self.checkAndFollowBranchReference(url, open_dir=open_dir)
         if open_dir is None:
             open_dir = BzrDir.open
+
         def open_branch(url):
             dir = open_dir(url)
             return dir.open_branch()

=== modified file 'lib/lp/codehosting/tests/test_safe_open.py'
--- lib/lp/codehosting/tests/test_safe_open.py	2011-08-24 23:01:46 +0000
+++ lib/lp/codehosting/tests/test_safe_open.py	2011-08-28 08:46:25 +0000
@@ -6,6 +6,18 @@
 
 __metaclass__ = type
 
+from bzrlib.branch import (
+    Branch,
+    BranchReferenceFormat,
+    BzrBranchFormat7,
+    )
+from bzrlib.bzrdir import (
+    BzrDir,
+    BzrDirMetaFormat1,
+    )
+from bzrlib.repofmt.pack_repo import RepositoryFormatKnitPack1
+from bzrlib.tests import TestCaseWithTransport
+from bzrlib.transport import chroot
 from lazr.uri import URI
 
 from lp.codehosting.safe_open import (
@@ -13,28 +25,12 @@
     BlacklistPolicy,
     BranchLoopError,
     BranchReferenceForbidden,
+    safe_open,
     SafeBranchOpener,
     WhitelistPolicy,
-    safe_open,
     )
-
 from lp.testing import TestCase
 
-from bzrlib.branch import (
-    Branch,
-    BzrBranchFormat7,
-    BranchReferenceFormat,
-    )
-from bzrlib.bzrdir import (
-    BzrDir,
-    BzrDirMetaFormat1,
-    )
-from bzrlib.repofmt.pack_repo import RepositoryFormatKnitPack1
-from bzrlib.tests import (
-    TestCaseWithTransport,
-    )
-from bzrlib.transport import chroot
-
 
 class TestSafeBranchOpenerCheckAndFollowBranchReference(TestCase):
     """Unit tests for `SafeBranchOpener.checkAndFollowBranchReference`."""
@@ -56,7 +52,7 @@
             super(parent_cls.StubbedSafeBranchOpener, self).__init__(policy)
             self._reference_values = {}
             for i in range(len(references) - 1):
-                self._reference_values[references[i]] = references[i+1]
+                self._reference_values[references[i]] = references[i + 1]
             self.follow_reference_calls = []
 
         def followReference(self, url, open_dir=None):
@@ -240,9 +236,11 @@
         b = self.make_branch('b', format='2a')
         b.set_stacked_on_url(a.base)
         seen_urls = set()
+
         def open_dir(url):
             seen_urls.add(url)
             return BzrDir.open(url)
+
         opener = self.makeBranchOpener([a.base, b.base])
         opener.open(b.base, open_dir=open_dir)
         self.assertEquals(seen_urls, set([b.base, a.base]))
@@ -253,9 +251,11 @@
         b_dir = self.make_bzrdir('b')
         b = BranchReferenceFormat().initialize(b_dir, target_branch=a)
         seen_urls = set()
+
         def open_dir(url):
             seen_urls.add(url)
             return BzrDir.open(url)
+
         opener = self.makeBranchOpener([a.base, b.base])
         opener.open(b.base, open_dir=open_dir)
         self.assertEquals(seen_urls, set([b.base, a.base]))
@@ -286,8 +286,10 @@
         chroot_server = chroot.ChrootServer(transport)
         chroot_server.start_server()
         self.addCleanup(chroot_server.stop_server)
+
         def get_url(relpath):
             return chroot_server.get_url() + relpath
+
         return URI(chroot_server.get_url()).scheme, get_url
 
     def test_stacked_within_scheme(self):

=== modified file 'lib/lp/registry/browser/productseries.py'
--- lib/lp/registry/browser/productseries.py	2011-08-24 21:17:35 +0000
+++ lib/lp/registry/browser/productseries.py	2011-08-28 08:46:25 +0000
@@ -33,7 +33,6 @@
 from operator import attrgetter
 
 from bzrlib.revision import NULL_REVISION
-from lazr.enum import DBItem
 from lazr.restful.interface import (
     copy_field,
     use_template,

=== modified file 'lib/lp/registry/browser/tests/product-portlet-packages-view.txt'
--- lib/lp/registry/browser/tests/product-portlet-packages-view.txt	2011-08-24 05:54:07 +0000
+++ lib/lp/registry/browser/tests/product-portlet-packages-view.txt	2011-08-28 08:46:25 +0000
@@ -30,7 +30,7 @@
     ...     transaction.commit()
     ...     reconnect_stores(config.statistician.dbuser)
     ...     ubuntu = getUtility(ILaunchpadCelebrities).ubuntu
-    ...     ignore = DistributionSourcePackageCache.updateAll(
+    ...     DistributionSourcePackageCache.updateAll(
     ...         ubuntu, archive=ubuntu.main_archive, log=logger,
     ...         ztm=transaction)
     ...     transaction.commit()

=== modified file 'lib/lp/registry/interfaces/person.py'
--- lib/lp/registry/interfaces/person.py	2011-08-24 00:10:44 +0000
+++ lib/lp/registry/interfaces/person.py	2011-08-28 08:46:25 +0000
@@ -2119,7 +2119,7 @@
 
     def getByEmails(emails, include_hidden=True):
         """Search for people with the given email addresses.
-        
+
         :param emails: A list of email addresses.
         :param include_hidden: Include people who have opted to hide their
             email. Defaults to True.

=== modified file 'lib/lp/registry/interfaces/pillar.py'
--- lib/lp/registry/interfaces/pillar.py	2011-08-24 04:07:31 +0000
+++ lib/lp/registry/interfaces/pillar.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 # pylint: disable-msg=E0211,E0213
@@ -50,6 +50,7 @@
              description=_("Whether or not this item is active.")))
     pillar_category = Attribute('The category title applicable to the pillar')
 
+
 class IHasAliases(Interface):
 
     aliases = List(
@@ -120,7 +121,6 @@
     def count_search_matches(text):
         """Return the total number of Pillars matching :text:"""
 
-
     @operation_parameters(text=TextLine(title=u"Search text"),
                           limit=Int(title=u"Maximum number of items to "
                                     "return. This is a hard limit: any "

=== modified file 'lib/lp/scripts/tests/test_garbo.py'
--- lib/lp/scripts/tests/test_garbo.py	2011-08-22 15:08:03 +0000
+++ lib/lp/scripts/tests/test_garbo.py	2011-08-28 08:46:25 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009-2010 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Test the database garbage collector."""
@@ -39,7 +39,6 @@
     UTC_NOW,
     )
 from canonical.launchpad.database.librarian import TimeLimitedToken
-from lp.services.messages.model.message import Message
 from canonical.launchpad.database.oauth import (
     OAuthAccessToken,
     OAuthNonce,
@@ -61,7 +60,6 @@
     ZopelessDatabaseLayer,
     )
 from lp.answers.model.answercontact import AnswerContact
-from lp.bugs.model.bugmessage import BugMessage
 from lp.bugs.model.bugnotification import (
     BugNotification,
     BugNotificationRecipient,
@@ -91,8 +89,9 @@
     OpenIDConsumerAssociationPruner,
     UnusedSessionPruner,
     )
+from lp.services.job.model.job import Job
 from lp.services.log.logger import NullHandler
-from lp.services.job.model.job import Job
+from lp.services.messages.model.message import Message
 from lp.services.session.model import (
     SessionData,
     SessionPkgData,

=== modified file 'lib/lp/soyuz/doc/distroarchseriesbinarypackage.txt'
--- lib/lp/soyuz/doc/distroarchseriesbinarypackage.txt	2011-08-24 05:54:07 +0000
+++ lib/lp/soyuz/doc/distroarchseriesbinarypackage.txt	2011-08-28 08:46:25 +0000
@@ -2,182 +2,186 @@
 Distro Arch Series Binary Package
 =================================
 
-  >>> from lp.soyuz.model.binarypackagename import (
-  ...     BinaryPackageName)
-  >>> from lp.soyuz.model.distroarchseries import (
-  ...     DistroArchSeries)
-  >>> from lp.soyuz.model.distroarchseriesbinarypackage import (
-  ...     DistroArchSeriesBinaryPackage)
-  >>> hoary_i386 = DistroArchSeries.get(6)
-  >>> pmount_name = BinaryPackageName.selectOneBy(name="pmount")
-  >>> firefox_name = BinaryPackageName.selectOneBy(name="mozilla-firefox")
-  >>> pmount_hoary_i386 = DistroArchSeriesBinaryPackage(hoary_i386,
-  ...                                                    pmount_name)
-  >>> firefox_hoary_i386 = DistroArchSeriesBinaryPackage(hoary_i386,
-  ...                                                    firefox_name)
+    >>> from lp.soyuz.model.binarypackagename import BinaryPackageName
+    >>> from lp.soyuz.model.distroarchseries import DistroArchSeries
+    >>> from lp.soyuz.model.distroarchseriesbinarypackage import (
+    ...        DistroArchSeriesBinaryPackage)
+    >>> hoary_i386 = DistroArchSeries.get(6)
+    >>> pmount_name = BinaryPackageName.selectOneBy(name="pmount")
+    >>> firefox_name = BinaryPackageName.selectOneBy(name="mozilla-firefox")
+    >>> pmount_hoary_i386 = DistroArchSeriesBinaryPackage(
+    ...     hoary_i386, pmount_name)
+    >>> firefox_hoary_i386 = DistroArchSeriesBinaryPackage(
+    ...     hoary_i386, firefox_name)
 
 `DistroArchSeriesBinaryPackage`s have a title property:
 
-  >>> print pmount_hoary_i386.title
-  "pmount" binary package in Ubuntu Hoary i386
-
-
-First, we create a new version of pmount, and a version of
-mozilla-firefox that coincides with pmount's. We're hitch-hiking on two
-existing builds that are in sampledata!
-
-  >>> from lp.soyuz.model.publishing import (
-  ...     BinaryPackagePublishingHistory)
-  >>> from canonical.database.constants import UTC_NOW
-  >>> from lp.soyuz.model.binarypackagebuild import BinaryPackageBuild
-  >>> from lp.soyuz.model.component import Component
-  >>> from lp.soyuz.model.section import Section
-
-  >>> main_component = Component.selectOneBy(name="main")
-  >>> misc_section = Section.selectOneBy(name="base")
-  >>> from lp.soyuz.enums import BinaryPackageFormat
-  >>> binpackageformat = BinaryPackageFormat.DEB
-  >>> from lp.soyuz.enums import (
-  ...     PackagePublishingPriority, PackagePublishingStatus)
-  >>> from lp.registry.interfaces.distribution import IDistributionSet
-  >>> from lp.registry.interfaces.pocket import PackagePublishingPocket
-  >>> priority = PackagePublishingPriority.STANDARD
+    >>> print pmount_hoary_i386.title
+    "pmount" binary package in Ubuntu Hoary i386
+
+First, we create a new version of pmount, and a version of mozilla-
+firefox that coincides with pmount's. We're hitch-hiking on two existing
+builds that are in sampledata!
+
+    >>> from lp.soyuz.model.publishing import (
+    ...        BinaryPackagePublishingHistory)
+    >>> from canonical.database.constants import UTC_NOW
+    >>> from lp.soyuz.model.binarypackagebuild import BinaryPackageBuild
+    >>> from lp.soyuz.model.component import Component
+    >>> from lp.soyuz.model.section import Section
+
+    >>> main_component = Component.selectOneBy(name="main")
+    >>> misc_section = Section.selectOneBy(name="base")
+    >>> from lp.soyuz.enums import BinaryPackageFormat
+    >>> binpackageformat = BinaryPackageFormat.DEB
+    >>> from lp.soyuz.enums import (
+    ...        PackagePublishingPriority, PackagePublishingStatus)
+    >>> from lp.registry.interfaces.distribution import IDistributionSet
+    >>> from lp.registry.interfaces.pocket import PackagePublishingPocket
+    >>> priority = PackagePublishingPriority.STANDARD
 
 XXX: noodles 2008-11-05 bug=294585: The dependency on a database id
 needs to be removed.
-  >>> bpr = BinaryPackageBuild.get(8).createBinaryPackageRelease(
-  ...   binarypackagename=firefox_name.id,
-  ...   version="120.6-0",
-  ...   summary="Firefox loves lollies",
-  ...   description="Lolly-pop loving application",
-  ...   binpackageformat=binpackageformat,
-  ...   component=main_component.id,
-  ...   section=misc_section.id,
-  ...   priority=priority,
-  ...   shlibdeps=None,
-  ...   depends=None,
-  ...   recommends=None,
-  ...   suggests=None,
-  ...   conflicts=None,
-  ...   replaces=None,
-  ...   provides=None,
-  ...   pre_depends=None,
-  ...   enhances=None,
-  ...   breaks=None,
-  ...   essential=False,
-  ...   installedsize=0,
-  ...   architecturespecific=False,
-  ...   debug_package=None)
-
-  >>> pe = BinaryPackagePublishingHistory(
-  ...   binarypackagerelease=bpr.id,
-  ...   component=main_component.id,
-  ...   section=misc_section.id,
-  ...   priority=priority,
-  ...   distroarchseries=hoary_i386.id,
-  ...   status=PackagePublishingStatus.PUBLISHED,
-  ...   datecreated=UTC_NOW,
-  ...   datepublished=UTC_NOW,
-  ...   pocket=PackagePublishingPocket.RELEASE,
-  ...   datesuperseded=None,
-  ...   supersededby=None,
-  ...   datemadepending=None,
-  ...   dateremoved=None,
-  ...   archive=hoary_i386.main_archive)
+
+    >>> bpr = BinaryPackageBuild.get(8).createBinaryPackageRelease(
+    ...      binarypackagename=firefox_name.id,
+    ...      version="120.6-0",
+    ...      summary="Firefox loves lollies",
+    ...      description="Lolly-pop loving application",
+    ...      binpackageformat=binpackageformat,
+    ...      component=main_component.id,
+    ...      section=misc_section.id,
+    ...      priority=priority,
+    ...      shlibdeps=None,
+    ...      depends=None,
+    ...      recommends=None,
+    ...      suggests=None,
+    ...      conflicts=None,
+    ...      replaces=None,
+    ...      provides=None,
+    ...      pre_depends=None,
+    ...      enhances=None,
+    ...      breaks=None,
+    ...      essential=False,
+    ...      installedsize=0,
+    ...      architecturespecific=False,
+    ...      debug_package=None)
+
+    >>> pe = BinaryPackagePublishingHistory(
+    ...      binarypackagerelease=bpr.id,
+    ...      component=main_component.id,
+    ...      section=misc_section.id,
+    ...      priority=priority,
+    ...      distroarchseries=hoary_i386.id,
+    ...      status=PackagePublishingStatus.PUBLISHED,
+    ...      datecreated=UTC_NOW,
+    ...      datepublished=UTC_NOW,
+    ...      pocket=PackagePublishingPocket.RELEASE,
+    ...      datesuperseded=None,
+    ...      supersededby=None,
+    ...      datemadepending=None,
+    ...      dateremoved=None,
+    ...      archive=hoary_i386.main_archive)
 
 XXX: noodles 2008-11-06 bug=294585: The dependency on a database id
 needs to be removed.
-  >>> bpr = BinaryPackageBuild.get(9).createBinaryPackageRelease(
-  ...   binarypackagename=pmount_name.id,
-  ...   version="cr98.34",
-  ...   summary="Pmount bakes cakes",
-  ...   description="Phat cake-baker application",
-  ...   binpackageformat=binpackageformat,
-  ...   component=main_component.id,
-  ...   section=misc_section.id,
-  ...   priority=priority,
-  ...   shlibdeps=None,
-  ...   depends=None,
-  ...   recommends=None,
-  ...   suggests=None,
-  ...   conflicts=None,
-  ...   replaces=None,
-  ...   provides=None,
-  ...   pre_depends=None,
-  ...   enhances=None,
-  ...   breaks=None,
-  ...   essential=False,
-  ...   installedsize=0,
-  ...   architecturespecific=False,
-  ...   debug_package=None)
-
-  >>> pe = BinaryPackagePublishingHistory(
-  ...   binarypackagerelease=bpr.id,
-  ...   component=main_component.id,
-  ...   section=misc_section.id,
-  ...   priority=priority,
-  ...   distroarchseries=hoary_i386.id,
-  ...   status=PackagePublishingStatus.PUBLISHED,
-  ...   datecreated=UTC_NOW,
-  ...   datepublished=UTC_NOW,
-  ...   pocket=PackagePublishingPocket.RELEASE,
-  ...   datesuperseded=None,
-  ...   supersededby=None,
-  ...   datemadepending=None,
-  ...   dateremoved=None,
-  ...   archive=hoary_i386.main_archive)
-
-Then, we ensure that grabbing the current release of pmount and the old release
-both are sane.
-
-  >>> current_release = pmount_hoary_i386.currentrelease
-  >>> current_release.version
-  u'cr98.34'
-  >>> current_release.name
-  u'pmount'
-
-  >>> old_release = pmount_hoary_i386['0.1-1']
-  >>> old_release.version
-  u'0.1-1'
-  >>> old_release.name
-  u'pmount'
+
+    >>> bpr = BinaryPackageBuild.get(9).createBinaryPackageRelease(
+    ...      binarypackagename=pmount_name.id,
+    ...      version="cr98.34",
+    ...      summary="Pmount bakes cakes",
+    ...      description="Phat cake-baker application",
+    ...      binpackageformat=binpackageformat,
+    ...      component=main_component.id,
+    ...      section=misc_section.id,
+    ...      priority=priority,
+    ...      shlibdeps=None,
+    ...      depends=None,
+    ...      recommends=None,
+    ...      suggests=None,
+    ...      conflicts=None,
+    ...      replaces=None,
+    ...      provides=None,
+    ...      pre_depends=None,
+    ...      enhances=None,
+    ...      breaks=None,
+    ...      essential=False,
+    ...      installedsize=0,
+    ...      architecturespecific=False,
+    ...      debug_package=None)
+
+    >>> pe = BinaryPackagePublishingHistory(
+    ...      binarypackagerelease=bpr.id,
+    ...      component=main_component.id,
+    ...      section=misc_section.id,
+    ...      priority=priority,
+    ...      distroarchseries=hoary_i386.id,
+    ...      status=PackagePublishingStatus.PUBLISHED,
+    ...      datecreated=UTC_NOW,
+    ...      datepublished=UTC_NOW,
+    ...      pocket=PackagePublishingPocket.RELEASE,
+    ...      datesuperseded=None,
+    ...      supersededby=None,
+    ...      datemadepending=None,
+    ...      dateremoved=None,
+    ...      archive=hoary_i386.main_archive)
+
+Then, we ensure that grabbing the current release of pmount and the old
+release both are sane.
+
+    >>> current_release = pmount_hoary_i386.currentrelease
+    >>> current_release.version
+    u'cr98.34'
+
+    >>> current_release.name
+    u'pmount'
+
+    >>> old_release = pmount_hoary_i386['0.1-1']
+    >>> old_release.version
+    u'0.1-1'
+
+    >>> old_release.name
+    u'pmount'
 
 The source package that was used to build the current release is
 available in the binary package's distro_source_package attribute.
 
-  >>> distro_source_package = firefox_hoary_i386.distro_source_package
-  >>> distro_source_package.displayname
-  u'mozilla-firefox in Ubuntu'
+    >>> distro_source_package = firefox_hoary_i386.distro_source_package
+    >>> distro_source_package.displayname
+    u'mozilla-firefox in Ubuntu'
 
 If a given binary package doesn't have a current release, then the
 distro_source_package attribute should return None.
 
-  >>> from zope.security.proxy import removeSecurityProxy
-  >>> deb_wdy_i386 = removeSecurityProxy(
-  ...     getUtility(IDistributionSet)['debian']['woody']['i386'])
-  >>> pmount_woody_i386 = DistroArchSeriesBinaryPackage(
-  ...     deb_wdy_i386, pmount_name)
-  >>> print pmount_woody_i386.distro_source_package
-  None
+    >>> from zope.security.proxy import removeSecurityProxy
+    >>> deb_wdy_i386 = removeSecurityProxy(
+    ...        getUtility(IDistributionSet)['debian']['woody']['i386'])
+    >>> pmount_woody_i386 = DistroArchSeriesBinaryPackage(
+    ...        deb_wdy_i386, pmount_name)
+    >>> print pmount_woody_i386.distro_source_package
+    None
 
 Check the publishing record of packages returned by 'currentrelease' and
 '__getitem__', which are different and in 'Published' state.
 
-  >>> pe.id == current_release.current_publishing_record.id
-  True
-  >>> (pe.status.title,
-  ...  pe.distroarchseries.architecturetag)
-  ('Published', u'i386')
-
-  >>> old_pubrec = old_release.current_publishing_record
-  >>> (old_pubrec.id , old_pubrec.status.title,
-  ...  old_pubrec.distroarchseries.architecturetag)
-  (12, 'Published', u'i386')
+    >>> pe.id == current_release.current_publishing_record.id
+    True
+
+    >>> (pe.status.title,
+    ...     pe.distroarchseries.architecturetag)
+    ('Published', u'i386')
+
+    >>> old_pubrec = old_release.current_publishing_record
+    >>> (old_pubrec.id , old_pubrec.status.title,
+    ...     old_pubrec.distroarchseries.architecturetag)
+    (12, 'Published', u'i386')
 
 Note that it is only really possible to have two packages in the
 "Published" status if domination hasn't run yet.
 
-== Package caches and DARBP summaries ==
+
+Package caches and DARBP summaries
+----------------------------------
 
 Bug 208233 teaches us that DistroArchSeriesBinaryPackage summaries use
 package caches to generate their output, and unfortunately that means
@@ -187,73 +191,74 @@
 
 XXX: this is really too complicated, and the code in
 DistroArchSeriesBinaryPackage.summary should be simplified.
+
     -- kiko, 2008-03-28
 
-  >>> from lp.registry.interfaces.distribution import IDistributionSet
-  >>> from lp.registry.interfaces.person import IPersonSet
-  >>> ubuntu = getUtility(IDistributionSet)['ubuntu']
-  >>> cprov = getUtility(IPersonSet).getByName('cprov')
-  >>> warty = ubuntu['warty']
+    >>> from lp.registry.interfaces.distribution import IDistributionSet
+    >>> from lp.registry.interfaces.person import IPersonSet
+    >>> ubuntu = getUtility(IDistributionSet)['ubuntu']
+    >>> cprov = getUtility(IPersonSet).getByName('cprov')
+    >>> warty = ubuntu['warty']
 
 First, update the cache tables for Celso's PPA:
 
-  >>> from canonical.config import config
-  >>> from canonical.testing.layers import LaunchpadZopelessLayer
-  >>> LaunchpadZopelessLayer.switchDbUser(config.statistician.dbuser)
-
-  >>> from lp.services.log.logger import FakeLogger
-  >>> from lp.soyuz.model.distributionsourcepackagecache import (
-  ...     DistributionSourcePackageCache)
-  >>> DistributionSourcePackageCache.updateAll(
-  ...    ubuntu, archive=cprov.archive, ztm=LaunchpadZopelessLayer.txn,
-  ...    log=FakeLogger())
-  DEBUG ...
-  DEBUG Considering source 'pmount'
-  ...
-
-  >>> from lp.soyuz.model.distroseriespackagecache import (
-  ...     DistroSeriesPackageCache)
-  >>> DistroSeriesPackageCache.updateAll(
-  ...    warty, archive=cprov.archive, ztm=LaunchpadZopelessLayer.txn,
-  ...    log=FakeLogger())
-  DEBUG Considering binary 'mozilla-firefox'
-  ...
-
-  >>> cprov.archive.updateArchiveCache()
-  >>> from canonical.database.sqlbase import commit
-  >>> commit()
-  >>> flush_database_updates()
+    >>> from canonical.config import config
+    >>> from canonical.testing.layers import LaunchpadZopelessLayer
+    >>> LaunchpadZopelessLayer.switchDbUser(config.statistician.dbuser)
+
+    >>> from lp.services.log.logger import FakeLogger
+    >>> from lp.soyuz.model.distributionsourcepackagecache import (
+    ...        DistributionSourcePackageCache)
+    >>> DistributionSourcePackageCache.updateAll(
+    ...       ubuntu, archive=cprov.archive, ztm=LaunchpadZopelessLayer.txn,
+    ...       log=FakeLogger())
+    DEBUG ...
+    DEBUG Considering source 'pmount'
+    ...
+
+    >>> from lp.soyuz.model.distroseriespackagecache import (
+    ...        DistroSeriesPackageCache)
+    >>> DistroSeriesPackageCache.updateAll(
+    ...       warty, archive=cprov.archive, ztm=LaunchpadZopelessLayer.txn,
+    ...       log=FakeLogger())
+    DEBUG Considering binary 'mozilla-firefox'
+    ...
+
+    >>> cprov.archive.updateArchiveCache()
+    >>> from canonical.database.sqlbase import commit
+    >>> commit()
+    >>> flush_database_updates()
 
 Then, supersede all pmount publications in warty (this sets us up to
 demonstrate bug 208233).
 
-  >>> LaunchpadZopelessLayer.switchDbUser('archivepublisher')
-  >>> from lp.soyuz.model.binarypackagename import BinaryPackageName
-  >>> from lp.soyuz.model.distroarchseries import DistroArchSeries
-  >>> from lp.soyuz.model.distroarchseriesbinarypackage import (
-  ...     DistroArchSeriesBinaryPackage)
-  >>> from lp.soyuz.model.publishing import BinaryPackagePublishingHistory
-  >>> warty_i386 = DistroArchSeries.get(1)
-  >>> pmount_name = BinaryPackageName.selectOneBy(name="pmount")
-  >>> pmount_warty_i386 = DistroArchSeriesBinaryPackage(warty_i386,
-  ...                                                   pmount_name)
-  >>> pubs = BinaryPackagePublishingHistory.selectBy(
-  ...       archive=1,
-  ...       distroarchseries=warty_i386,
-  ...       status=PackagePublishingStatus.PUBLISHED)
-  >>> for p in pubs:
-  ...   if p.binarypackagerelease.binarypackagename == pmount_name:
-  ...       s = p.supersede()
-  >>> commit()
-  >>> flush_database_updates()
-  >>> LaunchpadZopelessLayer.switchDbUser(config.statistician.dbuser)
+    >>> LaunchpadZopelessLayer.switchDbUser('archivepublisher')
+    >>> from lp.soyuz.model.binarypackagename import BinaryPackageName
+    >>> from lp.soyuz.model.distroarchseries import DistroArchSeries
+    >>> from lp.soyuz.model.distroarchseriesbinarypackage import (
+    ...        DistroArchSeriesBinaryPackage)
+    >>> from lp.soyuz.model.publishing import BinaryPackagePublishingHistory
+    >>> warty_i386 = DistroArchSeries.get(1)
+    >>> pmount_name = BinaryPackageName.selectOneBy(name="pmount")
+    >>> pmount_warty_i386 = DistroArchSeriesBinaryPackage(warty_i386,
+    ...                                                      pmount_name)
+    >>> pubs = BinaryPackagePublishingHistory.selectBy(
+    ...          archive=1,
+    ...          distroarchseries=warty_i386,
+    ...          status=PackagePublishingStatus.PUBLISHED)
+    >>> for p in pubs:
+    ...      if p.binarypackagerelease.binarypackagename == pmount_name:
+    ...          s = p.supersede()
+    >>> commit()
+    >>> flush_database_updates()
+    >>> LaunchpadZopelessLayer.switchDbUser(config.statistician.dbuser)
 
 Now, if that bug is actually fixed, this works:
 
-  >>> pmount_warty_i386.summary
-  u'pmount shortdesc'
+    >>> pmount_warty_i386.summary
+    u'pmount shortdesc'
 
-  >>> pmount_warty_i386.description
-  u'pmount description'
+    >>> pmount_warty_i386.description
+    u'pmount description'
 
 Yay!

=== modified file 'lib/lp/soyuz/doc/package-cache.txt'
--- lib/lp/soyuz/doc/package-cache.txt	2011-08-24 06:07:34 +0000
+++ lib/lp/soyuz/doc/package-cache.txt	2011-08-28 08:46:25 +0000
@@ -21,30 +21,29 @@
  * fti, a tsvector generated on insert or update with the text indexes
    for name, binpkgnames, binpkgsummaries and binpkgdescriptions.
 
-  >>> from lp.registry.interfaces.distribution import IDistributionSet
-  >>> from lp.soyuz.model.distributionsourcepackagecache import (
-  ...     DistributionSourcePackageCache)
-
-  >>> ubuntu = getUtility(IDistributionSet)['ubuntu']
-
-  >>> ubuntu_caches = DistributionSourcePackageCache._find(ubuntu)
-
-  >>> ubuntu_caches.count()
-  10
-
-  >>> for name in sorted([cache.name for cache in ubuntu_caches]):
-  ...     print name
-  alsa-utils
-  cnews
-  commercialpackage
-  evolution
-  foobar
-  libstdc++
-  linux-source-2.6.15
-  mozilla-firefox
-  netapplet
-  pmount
-
+    >>> from lp.registry.interfaces.distribution import IDistributionSet
+    >>> from lp.soyuz.model.distributionsourcepackagecache import (
+    ...              DistributionSourcePackageCache)
+
+    >>> ubuntu = getUtility(IDistributionSet)['ubuntu']
+
+    >>> ubuntu_caches = DistributionSourcePackageCache._find(ubuntu)
+
+    >>> ubuntu_caches.count()
+    10
+
+    >>> for name in sorted([cache.name for cache in ubuntu_caches]):
+    ...              print name
+    alsa-utils
+    cnews
+    commercialpackage
+    evolution
+    foobar
+    libstdc++
+    linux-source-2.6.15
+    mozilla-firefox
+    netapplet
+    pmount
 
 Cached information for binaries is stored in the
 DistroSeriesPackageCache table, including:
@@ -60,41 +59,42 @@
  * fti, a tsvector generated on insert or update with the text indexes
    for name, summary, description, summaries and descriptions.
 
-  >>> from lp.soyuz.model.distroseriespackagecache import (
-  ...     DistroSeriesPackageCache)
-  >>> warty = ubuntu['warty']
-  >>> warty_caches = DistroSeriesPackageCache._find(warty)
-  >>> warty_caches.count()
-  5
-  >>> for name in sorted([cache.name for cache in warty_caches]):
-  ...     print name
-  at
-  foobar
-  linux-2.6.12
-  mozilla-firefox
-  pmount
+    >>> from lp.soyuz.model.distroseriespackagecache import (
+    ...              DistroSeriesPackageCache)
+    >>> warty = ubuntu['warty']
+    >>> warty_caches = DistroSeriesPackageCache._find(warty)
+    >>> warty_caches.count()
+    5
+
+    >>> for name in sorted([cache.name for cache in warty_caches]):
+    ...              print name
+    at
+    foobar
+    linux-2.6.12
+    mozilla-firefox
+    pmount
 
 Unlike sources, where multiple generated binaries are very common,
 multiple binaries with the same name are only possible when versions
 are not the same across architectures.
 
-Building these caches we can reach good performance on full and
-partial term searching.
-
-  >>> ubuntu.searchSourcePackages('mozilla').count()
-  1
-
-  >>> ubuntu.searchSourcePackages('moz').count()
-  1
-
-  >>> ubuntu.searchSourcePackages('biscoito').count()
-  0
+Building these caches we can reach good performance on full and partial
+term searching.
+
+    >>> ubuntu.searchSourcePackages('mozilla').count()
+    1
+
+    >>> ubuntu.searchSourcePackages('moz').count()
+    1
+
+    >>> ubuntu.searchSourcePackages('biscoito').count()
+    0
 
 The cache update procedure is done by cronscripts/update-pkgcache.py,
-which removes obsolete records, update existing ones and add new
-records during the scanning of all the publishing records. This
-scripts runs periodically in our infrastructure. See usage at
-package-cache-script.txt.
+which removes obsolete records, updates existing ones and adds new
+records while scanning all the publishing records. This script runs
+periodically in our infrastructure. See usage at
+package-cache-script.txt.
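
In outline, the source-cache half of that procedure comes down to the
following sketch (the helper name is ours; removeOld and updateAll are
the real cache methods, exercised throughout this document):

    from lp.soyuz.model.distributionsourcepackagecache import (
        DistributionSourcePackageCache)

    def refresh_source_caches(distro, archive, ztm, log):
        # Drop cache rows whose source is no longer published...
        DistributionSourcePackageCache.removeOld(
            distro, archive=archive, log=log)
        # ...then create or update rows for everything still published.
        return DistributionSourcePackageCache.updateAll(
            distro, archive=archive, ztm=ztm, log=log)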
 
 
 Dealing with Source Caches
@@ -103,101 +103,87 @@
 A SourcePackage that has the status DELETED will be deleted from the
 cache the next time the cache is updated.
 
-  >>> foobar_in_ubuntu = ubuntu.getSourcePackage('foobar')
-  >>> foobar_rel = foobar_in_ubuntu.releases[0]
-  >>> foobar_pub = foobar_rel.publishing_history[0]
-  >>> foobar_pub.status.name
-  'DELETED'
-
-  >>> ubuntu.searchSourcePackages('foobar').count()
-  1
-
-  >>> foobar_cache = DistributionSourcePackageCache.selectOneBy(
-  ...     archive=ubuntu.main_archive, distribution=ubuntu, name='foobar')
-
-  >>> foobar_cache is not None
-  True
-
-Source cache updates are driven by distribution, IDistribution
-instance offers a method for removing obsolete records in cache:
+    >>> foobar_in_ubuntu = ubuntu.getSourcePackage('foobar')
+    >>> foobar_rel = foobar_in_ubuntu.releases[0]
+    >>> foobar_pub = foobar_rel.publishing_history[0]
+    >>> foobar_pub.status.name
+    'DELETED'
+
+    >>> ubuntu.searchSourcePackages('foobar').count()
+    1
+
+    >>> foobar_cache = DistributionSourcePackageCache.selectOneBy(
+    ...      archive=ubuntu.main_archive, distribution=ubuntu, name='foobar')
+
+    >>> foobar_cache is not None
+    True
+
+Source cache updates are driven by distribution; the IDistribution
+instance offers a method for removing obsolete records from the cache:
 
 Let's use a fake logger object:
 
-  >>> from lp.services.log.logger import FakeLogger
-  >>> DistributionSourcePackageCache.removeOld(
-  ...     ubuntu, archive=ubuntu.main_archive, log=FakeLogger())
-  DEBUG Removing source cache for 'foobar' (10)
-
-  >>> import transaction
-  >>> transaction.commit()
-
-  >>> ubuntu.searchSourcePackages('foobar').count()
-  0
-
-  >>> foobar_cache = DistributionSourcePackageCache.selectOneBy(
-  ...     archive=ubuntu.main_archive, distribution=ubuntu, name='foobar')
-
-  >>> foobar_cache is None
-  True
+    >>> from lp.services.log.logger import FakeLogger
+    >>> DistributionSourcePackageCache.removeOld(
+    ...      ubuntu, archive=ubuntu.main_archive, log=FakeLogger())
+    DEBUG Removing source cache for 'foobar' (10)
+
+    >>> import transaction
+    >>> transaction.commit()
+
+    >>> ubuntu.searchSourcePackages('foobar').count()
+    0
+
+    >>> foobar_cache = DistributionSourcePackageCache.selectOneBy(
+    ...      archive=ubuntu.main_archive, distribution=ubuntu, name='foobar')
+
+    >>> foobar_cache is None
+    True
 
 A source package that has the status PUBLISHED will be added to the
 cache the next time the cache is updated.
 
-  >>> cdrkit_in_ubuntu = ubuntu.getSourcePackage('cdrkit')
-  >>> cdrkit_rel = cdrkit_in_ubuntu.releases[0]
-  >>> cdrkit_pub = cdrkit_rel.publishing_history[0]
-  >>> cdrkit_pub.status.name
-  'PUBLISHED'
-
-  >>> ubuntu.searchSourcePackages('cdrkit').count()
-  0
-
-  >>> cdrkit_cache = DistributionSourcePackageCache.selectOneBy(
-  ...     archive=ubuntu.main_archive, distribution=ubuntu, name='cdrkit')
-
-  >>> cdrkit_cache is None
-  True
+    >>> cdrkit_in_ubuntu = ubuntu.getSourcePackage('cdrkit')
+    >>> cdrkit_rel = cdrkit_in_ubuntu.releases[0]
+    >>> cdrkit_pub = cdrkit_rel.publishing_history[0]
+    >>> cdrkit_pub.status.name
+    'PUBLISHED'
+
+    >>> ubuntu.searchSourcePackages('cdrkit').count()
+    0
+
+    >>> cdrkit_cache = DistributionSourcePackageCache.selectOneBy(
+    ...      archive=ubuntu.main_archive, distribution=ubuntu, name='cdrkit')
+
+    >>> cdrkit_cache is None
+    True
 
 We can invoke the cache updater directly on IDistroSeries:
 
-  >>> updates = DistributionSourcePackageCache.updateAll(
-  ...    ubuntu, archive=ubuntu.main_archive, ztm=transaction,
-  ...    log=FakeLogger())
-  DEBUG ...
-  DEBUG Considering source 'cdrkit'
-  DEBUG Creating new source cache entry.
-  ...
-  DEBUG Considering source 'mozilla-firefox'
-  ...
-
-  >>> print updates
-  10
-
-The current transaction should be committed since the script only
-commits full batches of 50 elements:
-
-  >>> transaction.commit()
-
-Also updates should be flushed since the 'name' (among other texts)
-are set after the row creation:
-
-XXX cprov 20061201: flush_database_updates should not be required
-after issuing a commit(), bug #3989
-
-  >>> from canonical.database.sqlbase import flush_database_updates
-  >>> flush_database_updates()
+    >>> updates = DistributionSourcePackageCache.updateAll(
+    ...     ubuntu, archive=ubuntu.main_archive, ztm=transaction,
+    ...     log=FakeLogger())
+    DEBUG ...
+    DEBUG Considering source 'cdrkit'
+    DEBUG Creating new source cache entry.
+    ...
+    DEBUG Considering source 'mozilla-firefox'
+    ...
+
+    >>> print updates
+    10
 
 Now we see that the 'cdrkit' source is part of the caches and can be
 reached via searches:
 
-  >>> ubuntu.searchSourcePackages('cdrkit').count()
-  1
-
-  >>> cdrkit_cache = DistributionSourcePackageCache.selectOneBy(
-  ...     archive=ubuntu.main_archive, distribution=ubuntu, name='cdrkit')
-
-  >>> cdrkit_cache is not None
-  True
+    >>> ubuntu.searchSourcePackages('cdrkit').count()
+    1
+
+    >>> cdrkit_cache = DistributionSourcePackageCache.selectOneBy(
+    ...      archive=ubuntu.main_archive, distribution=ubuntu, name='cdrkit')
+
+    >>> cdrkit_cache is not None
+    True
 
 
 Dealing with Binary Caches
@@ -206,463 +192,457 @@
 A BinaryPackage that has the status DELETED will be deleted from the
 cache the next time the cache is updated.
 
-  >>> foobar_bin_in_warty = warty.getBinaryPackage('foobar')
-  >>> foobar_bin_rel = foobar_in_ubuntu.releases[0]
-  >>> foobar_bin_pub = foobar_rel.publishing_history[0]
-  >>> foobar_bin_pub.status.name
-  'DELETED'
-
-  >>> warty.searchPackages('foobar').count()
-  1
-
-  >>> foobar_bin_cache = DistroSeriesPackageCache.selectOneBy(
-  ...     archive=ubuntu.main_archive, distroseries=warty, name='foobar')
-
-  >>> foobar_bin_cache is not None
-  True
-
-Binary cache updates are driven by distroseries, IDistroSeries
-instance offers a method for removing obsolete records in cache:
-
-  >>> DistroSeriesPackageCache.removeOld(
-  ...     warty, archive=ubuntu.main_archive, log=FakeLogger())
-  DEBUG Removing binary cache for 'foobar' (8)
-
-  >>> transaction.commit()
-
-  >>> warty.searchPackages('foobar').count()
-  0
-
-  >>> foobar_bin_cache = DistroSeriesPackageCache.selectOneBy(
-  ...     archive=ubuntu.main_archive, distroseries=warty, name='foobar')
-
-  >>> foobar_bin_cache is None
-  True
+    >>> foobar_bin_in_warty = warty.getBinaryPackage('foobar')
+    >>> foobar_bin_rel = foobar_in_ubuntu.releases[0]
+    >>> foobar_bin_pub = foobar_rel.publishing_history[0]
+    >>> foobar_bin_pub.status.name
+    'DELETED'
+
+    >>> warty.searchPackages('foobar').count()
+    1
+
+    >>> foobar_bin_cache = DistroSeriesPackageCache.selectOneBy(
+    ...      archive=ubuntu.main_archive, distroseries=warty, name='foobar')
+
+    >>> foobar_bin_cache is not None
+    True
+
+Binary cache updates are driven by distroseries; the IDistroSeries
+instance offers a method for removing obsolete records from the cache:
+
+    >>> DistroSeriesPackageCache.removeOld(
+    ...      warty, archive=ubuntu.main_archive, log=FakeLogger())
+    DEBUG Removing binary cache for 'foobar' (8)
+
+    >>> transaction.commit()
+
+    >>> warty.searchPackages('foobar').count()
+    0
+
+    >>> foobar_bin_cache = DistroSeriesPackageCache.selectOneBy(
+    ...      archive=ubuntu.main_archive, distroseries=warty, name='foobar')
+
+    >>> foobar_bin_cache is None
+    True
 
 A binary package that has been published since the last update of the
 cache will be added to it.
 
-  >>> cdrkit_bin_in_warty = warty.getBinaryPackage('cdrkit')
-  >>> cdrkit_bin_pub = cdrkit_bin_in_warty.current_publishings[0]
-  >>> cdrkit_bin_pub.status.name
-  'PUBLISHED'
-
-  >>> warty.searchPackages('cdrkit').count()
-  0
-
-  >>> cdrkit_bin_cache = DistroSeriesPackageCache.selectOneBy(
-  ...     archive=ubuntu.main_archive, distroseries=warty, name='cdrkit')
-
-  >>> cdrkit_bin_cache is None
-  True
+    >>> cdrkit_bin_in_warty = warty.getBinaryPackage('cdrkit')
+    >>> cdrkit_bin_pub = cdrkit_bin_in_warty.current_publishings[0]
+    >>> cdrkit_bin_pub.status.name
+    'PUBLISHED'
+
+    >>> warty.searchPackages('cdrkit').count()
+    0
+
+    >>> cdrkit_bin_cache = DistroSeriesPackageCache.selectOneBy(
+    ...      archive=ubuntu.main_archive, distroseries=warty, name='cdrkit')
+
+    >>> cdrkit_bin_cache is None
+    True
 
 We can invoke the cache updater directly on IDistroSeries:
 
-  >>> updates = DistroSeriesPackageCache.updateAll(
-  ...    warty, archive=ubuntu.main_archive, ztm=transaction,
-  ...    log=FakeLogger())
-  DEBUG Considering binary 'at'
-  ...
-  DEBUG Considering binary 'cdrkit'
-  DEBUG Creating new binary cache entry.
-  ...
-  DEBUG Considering binary 'mozilla-firefox'
-  ...
+    >>> updates = DistroSeriesPackageCache.updateAll(
+    ...     warty, archive=ubuntu.main_archive, ztm=transaction,
+    ...     log=FakeLogger())
+    DEBUG Considering binary 'at'
+    ...
+    DEBUG Considering binary 'cdrkit'
+    DEBUG Creating new binary cache entry.
+    ...
+    DEBUG Considering binary 'mozilla-firefox'
+    ...
 
-  >>> print updates
-  6
+    >>> print updates
+    6
 
 The transaction behaves exactly the same as for Source Caches, except
 that it commits full batches of 100 elements.
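
Schematically, that batching amounts to the following (a hypothetical
helper for illustration, not the actual Launchpad code):

    def update_in_batches(entries, update_one, ztm, batch_size=100):
        # Commit after each full batch so a long run does not hold one
        # huge transaction; a trailing partial batch is left for the
        # caller to commit, as done below.
        for count, entry in enumerate(entries, 1):
            update_one(entry)
            if count % batch_size == 0:
                ztm.commit()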
 
-  >>> transaction.commit()
-
-XXX cprov 20061201: see above, same than Source cache interface), bug #3989
-
-  >>> from canonical.database.sqlbase import flush_database_updates
-  >>> flush_database_updates()
+    >>> transaction.commit()
 
 Now we see that the 'cdrkit' binary is part of the caches and can be
 reached via searches:
 
-  >>> warty.searchPackages('cdrkit').count()
-  1
-
-  >>> cdrkit_bin_cache = DistroSeriesPackageCache.selectOneBy(
-  ...     archive=ubuntu.main_archive, distroseries=warty, name='cdrkit')
-
-  >>> cdrkit_bin_cache is not None
-  True
+    >>> warty.searchPackages('cdrkit').count()
+    1
+
+    >>> cdrkit_bin_cache = DistroSeriesPackageCache.selectOneBy(
+    ...      archive=ubuntu.main_archive, distroseries=warty, name='cdrkit')
+
+    >>> cdrkit_bin_cache is not None
+    True
 
 
 PPA package caches
 ==================
 
-Package caches are also populated for PPAs, allowing users to search
-for them considering the packages currently published in their context.
+Package caches are also populated for PPAs, allowing users to search
+for them based on the packages currently published in their context.
 
 We will use Celso's PPA.
 
-  >>> from lp.registry.interfaces.person import IPersonSet
-  >>> cprov = getUtility(IPersonSet).getByName('cprov')
+    >>> from lp.registry.interfaces.person import IPersonSet
+    >>> cprov = getUtility(IPersonSet).getByName('cprov')
 
 With empty cache contents in the Archive table we can't even find a PPA
 by owner name.
 
-  >>> print ubuntu.searchPPAs(text='cprov').count()
-  0
+    >>> print ubuntu.searchPPAs(text='cprov').count()
+    0
 
 Sampledata contains stub counters.
 
-  >>> print cprov.archive.sources_cached
-  3
+    >>> print cprov.archive.sources_cached
+    3
 
-  >>> print cprov.archive.binaries_cached
-  3
+    >>> print cprov.archive.binaries_cached
+    3
 
 We have to issue 'updateArchiveCache' to include the owner 'name' and
 'displayname' field in the archive caches.
 
-  >>> cprov.archive.updateArchiveCache()
-  >>> transaction.commit()
-  >>> flush_database_updates()
+    >>> cprov.archive.updateArchiveCache()
 
 Now Celso's PPA can be found via searches, and the package counters are
 reset, reflecting that nothing is cached in the database yet.
 
-  >>> print ubuntu.searchPPAs(text='cprov')[0].displayname
-  PPA for Celso Providelo
-
-  >>> print cprov.archive.sources_cached
-  0
-
-  >>> print cprov.archive.binaries_cached
-  0
-
-The sampledata contains no package caches, so attempts to find
-'pmount' (a source), 'firefox' (a binary name term) or 'shortdesc' (a
-term used in the pmount binary summary) fail.
-
-  >>> ubuntu.searchPPAs(text='pmount').count()
-  0
-  >>> ubuntu.searchPPAs(text='firefox').count()
-  0
-  >>> ubuntu.searchPPAs(text='warty').count()
-  0
-  >>> ubuntu.searchPPAs(text='shortdesc').count()
-  0
+    >>> print ubuntu.searchPPAs(text='cprov')[0].displayname
+    PPA for Celso Providelo
+
+    >>> print cprov.archive.sources_cached
+    0
+
+    >>> print cprov.archive.binaries_cached
+    0
+
+The sampledata contains no package caches, so attempts to find 'pmount'
+(a source), 'firefox' (a binary name term) or 'shortdesc' (a term used
+in the pmount binary summary) fail.
+
+    >>> ubuntu.searchPPAs(text='pmount').count()
+    0
+
+    >>> ubuntu.searchPPAs(text='firefox').count()
+    0
+
+    >>> ubuntu.searchPPAs(text='warty').count()
+    0
+
+    >>> ubuntu.searchPPAs(text='shortdesc').count()
+    0
 
 If we populate the package caches and update the archive caches, the
 same queries work, pointing to Celso's PPA.
 
-  >>> source_updates = DistributionSourcePackageCache.updateAll(
-  ...    ubuntu, archive=cprov.archive, ztm=transaction, log=FakeLogger())
-  DEBUG ...
-  DEBUG Considering source 'pmount'
-  ...
-
-  >>> binary_updates = DistroSeriesPackageCache.updateAll(
-  ...    warty, archive=cprov.archive, ztm=transaction,
-  ...    log=FakeLogger())
-  DEBUG Considering binary 'mozilla-firefox'
-  ...
-
-  >>> cprov.archive.updateArchiveCache()
-  >>> transaction.commit()
-  >>> flush_database_updates()
-
-  >>> cprov.archive.sources_cached == source_updates
-  True
-
-  >>> print cprov.archive.sources_cached
-  3
-
-  >>> cprov.archive.binaries_cached == binary_updates
-  True
-
-  >>> print cprov.archive.binaries_cached
-  2
-
-  >>> print ubuntu.searchPPAs(text='cprov')[0].displayname
-  PPA for Celso Providelo
-  >>> print ubuntu.searchPPAs(text='pmount')[0].displayname
-  PPA for Celso Providelo
-  >>> print ubuntu.searchPPAs(text='firefox')[0].displayname
-  PPA for Celso Providelo
-  >>> print ubuntu.searchPPAs(text='warty')[0].displayname
-  PPA for Celso Providelo
-  >>> print ubuntu.searchPPAs(text='shortdesc')[0].displayname
-  PPA for Celso Providelo
+    >>> source_updates = DistributionSourcePackageCache.updateAll(
+    ...     ubuntu, archive=cprov.archive, ztm=transaction, log=FakeLogger())
+    DEBUG ...
+    DEBUG Considering source 'pmount'
+    ...
+
+    >>> binary_updates = DistroSeriesPackageCache.updateAll(
+    ...     warty, archive=cprov.archive, ztm=transaction,
+    ...     log=FakeLogger())
+    DEBUG Considering binary 'mozilla-firefox'
+    ...
+
+    >>> cprov.archive.updateArchiveCache()
+
+    >>> cprov.archive.sources_cached == source_updates
+    True
+
+    >>> print cprov.archive.sources_cached
+    3
+
+    >>> cprov.archive.binaries_cached == binary_updates
+    True
+
+    >>> print cprov.archive.binaries_cached
+    2
+
+    >>> print ubuntu.searchPPAs(text='cprov')[0].displayname
+    PPA for Celso Providelo
+
+    >>> print ubuntu.searchPPAs(text='pmount')[0].displayname
+    PPA for Celso Providelo
+
+    >>> print ubuntu.searchPPAs(text='firefox')[0].displayname
+    PPA for Celso Providelo
+
+    >>> print ubuntu.searchPPAs(text='warty')[0].displayname
+    PPA for Celso Providelo
+
+    >>> print ubuntu.searchPPAs(text='shortdesc')[0].displayname
+    PPA for Celso Providelo
 
 The method which populates the archive caches also cleans the texts up
-to work around the current FTI limitation (see bug #207969). It
-performs the following tasks:
+to work around the current FTI limitation (see bug #207969). It performs
+the following tasks:
 
  * Exclude punctuation ([,.;:!?])
  * Store only unique lower case words
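
A sketch of those two steps (ours, for illustration only; the real
behaviour is demonstrated on the 'Ding! Dong?' sample data below):

    >>> import re
    >>> def clean_description(text):
    ...     # Strip the punctuation listed above, fold case, and keep
    ...     # each word only once.
    ...     return set(re.sub(r'[,.;:!?]', ' ', text).lower().split())
    >>> words = clean_description(
    ...     'Ding! Dong? Ding,Dong. Ding; DONG: ding dong')
    >>> print ' '.join(sorted(words))
    ding dong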
 
 We remove all caches related to Celso's PPA.
 
-  >>> celso_source_caches = DistributionSourcePackageCache.selectBy(
-  ...    archive=cprov.archive)
-
-  >>> celso_binary_caches = DistroSeriesPackageCache.selectBy(
-  ...    archive=cprov.archive)
-
-  >>> from zope.security.proxy import removeSecurityProxy
-  >>> def purge_caches(caches):
-  ...     for cache in caches:
-  ...         naked_cache = removeSecurityProxy(cache)
-  ...         naked_cache.destroySelf()
-
-
-  >>> purge_caches(celso_source_caches)
-  >>> purge_caches(celso_binary_caches)
-  >>> transaction.commit()
+    >>> celso_source_caches = DistributionSourcePackageCache.selectBy(
+    ...             archive=cprov.archive)
+
+    >>> celso_binary_caches = DistroSeriesPackageCache.selectBy(
+    ...             archive=cprov.archive)
+
+    >>> from zope.security.proxy import removeSecurityProxy
+    >>> def purge_caches(caches):
+    ...              for cache in caches:
+    ...                  naked_cache = removeSecurityProxy(cache)
+    ...                  naked_cache.destroySelf()
+
+    >>> purge_caches(celso_source_caches)
+    >>> purge_caches(celso_binary_caches)
 
 Now, when we update the caches for Celso's PPA, only the owner
 information will be available; no package information will be cached.
 
-  >>> cprov.archive.updateArchiveCache()
-  >>> transaction.commit()
-
-  >>> print cprov.archive.sources_cached
-  0
-
-  >>> print cprov.archive.binaries_cached
-  0
-
-  >>> print cprov.archive.package_description_cache
-  celso providelo cprov
+    >>> cprov.archive.updateArchiveCache()
+
+    >>> print cprov.archive.sources_cached
+    0
+
+    >>> print cprov.archive.binaries_cached
+    0
+
+    >>> print cprov.archive.package_description_cache
+    celso providelo cprov
 
 We insert a new source cache, pointing to Celso's PPA, whose texts
 contain punctuation and duplicated words.
 
-  >>> from lp.registry.interfaces.sourcepackagename import ISourcePackageNameSet
-  >>> cdrkit_name = getUtility(ISourcePackageNameSet).queryByName('cdrkit')
+    >>> from lp.registry.interfaces.sourcepackagename import (
+    ...     ISourcePackageNameSet)
+    >>> cdrkit_name = getUtility(ISourcePackageNameSet).queryByName('cdrkit')
 
-  >>> unclean_cache = DistributionSourcePackageCache(
-  ...    archive=cprov.archive,
-  ...    distribution=ubuntu,
-  ...    sourcepackagename=cdrkit_name,
-  ...    name=cdrkit_name.name,
-  ...    binpkgnames='cdrkit-bin cdrkit-extra',
-  ...    binpkgsummaries='Ding! Dong? Ding,Dong. Ding; DONG: ding dong')
+    >>> unclean_cache = DistributionSourcePackageCache(
+    ...     archive=cprov.archive,
+    ...     distribution=ubuntu,
+    ...     sourcepackagename=cdrkit_name,
+    ...     name=cdrkit_name.name,
+    ...     binpkgnames='cdrkit-bin cdrkit-extra',
+    ...     binpkgsummaries='Ding! Dong? Ding,Dong. Ding; DONG: ding dong')
 
 Note that 'binpkgdescription' and 'changelog' are not considered yet,
 and we have no binary cache.
 
 Let's update the archive cache and see how it goes.
 
-  >>> transaction.commit()
-  >>> cprov.archive.updateArchiveCache()
-  >>> transaction.commit()
-
-Only one source cached and the 'package_description_cache' only
-contains unique and lowercase words free of any punctuation.
-
-  >>> print cprov.archive.sources_cached
-  1
-
-  >>> print cprov.archive.binaries_cached
-  0
-
-  >>> print cprov.archive.package_description_cache
-  ding providelo celso cdrkit cdrkit-bin dong ubuntu cdrkit-extra cprov
+    >>> cprov.archive.updateArchiveCache()
+
+Only one source is cached, and the 'package_description_cache' contains
+only unique, lowercase words free of any punctuation.
+
+    >>> print cprov.archive.sources_cached
+    1
+
+    >>> print cprov.archive.binaries_cached
+    0
+
+    >>> print cprov.archive.package_description_cache
+    ding providelo celso cdrkit cdrkit-bin dong ubuntu cdrkit-extra cprov
 
 Let's remove the unclean cache and update Celso's PPA cache, so
 everything will be back to normal.
 
-  >>> removeSecurityProxy(unclean_cache).destroySelf()
-  >>> transaction.commit()
-  >>> cprov.archive.updateArchiveCache()
-  >>> transaction.commit()
+    >>> removeSecurityProxy(unclean_cache).destroySelf()
+    >>> cprov.archive.updateArchiveCache()
 
 
 Package Counters
 ================
 
-We also store counters for the number of Sources and Binaries
-published in a DistroSeries pocket RELEASE:
-
-  >>> warty.sourcecount
-  3
-  >>> warty.binarycount
-  4
+We also store counters for the number of Sources and Binaries published
+in a DistroSeries pocket RELEASE:
+
+    >>> warty.sourcecount
+    3
+
+    >>> warty.binarycount
+    4
 
 Since we have modified the publication list for warty in order to test
-the caching system, we expect similar changes in the
-counters. IDistroSeries provides a method to update its own cache:
+the caching system, we expect similar changes in the counters.
+IDistroSeries provides a method to update its own cache:
 
-  >>> warty.updatePackageCount()
+    >>> warty.updatePackageCount()
 
 New values were stored:
 
-  >>> warty.sourcecount
-  6
-  >>> warty.binarycount
-  6
+    >>> warty.sourcecount
+    6
+
+    >>> warty.binarycount
+    6
 
 Only PENDING and PUBLISHED publications are considered.
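
Schematically, the counters reduce to something like this (a
hypothetical helper, not the real query):

    from lp.soyuz.enums import PackagePublishingStatus

    def count_active(publications):
        # Count only PENDING and PUBLISHED publications; DELETED and
        # other statuses are ignored, as demonstrated below.
        active = (PackagePublishingStatus.PENDING,
                  PackagePublishingStatus.PUBLISHED)
        return len([pub for pub in publications if pub.status in active])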
 
 We will use `SoyuzTestPublisher` for creating convenient publications.
 
-  >>> from canonical.database.sqlbase import commit
-  >>> from lp.soyuz.enums import (
-  ...     PackagePublishingStatus)
-  >>> from lp.soyuz.tests.test_publishing import (
-  ...     SoyuzTestPublisher)
-  >>> from canonical.testing.layers import LaunchpadZopelessLayer
-
-  >>> test_publisher = SoyuzTestPublisher()
-
-  >>> LaunchpadZopelessLayer.switchDbUser('launchpad')
-  >>> commit()
-
-  >>> unused = test_publisher.setUpDefaultDistroSeries(warty)
-  >>> test_publisher.addFakeChroots()
+    >>> from canonical.database.sqlbase import commit
+    >>> from lp.soyuz.enums import (
+    ...              PackagePublishingStatus)
+    >>> from lp.soyuz.tests.test_publishing import (
+    ...              SoyuzTestPublisher)
+    >>> from canonical.testing.layers import LaunchpadZopelessLayer
+
+    >>> test_publisher = SoyuzTestPublisher()
+
+    >>> LaunchpadZopelessLayer.switchDbUser('launchpad')
+
+    >>> unused = test_publisher.setUpDefaultDistroSeries(warty)
+    >>> test_publisher.addFakeChroots()
 
 Let's create one source with a single binary in PENDING status.
 
-  >>> pending_source = test_publisher.getPubSource(
-  ...     sourcename = 'pending-source',
-  ...     status=PackagePublishingStatus.PENDING)
-
-  >>> pending_binaries = test_publisher.getPubBinaries(
-  ...     binaryname="pending-binary", pub_source=pending_source,
-  ...     status=PackagePublishingStatus.PENDING)
-
-  >>> print len(
-  ...     set(pub.binarypackagerelease.name for pub in pending_binaries))
-  1
+    >>> pending_source = test_publisher.getPubSource(
+    ...      sourcename='pending-source',
+    ...      status=PackagePublishingStatus.PENDING)
+
+    >>> pending_binaries = test_publisher.getPubBinaries(
+    ...      binaryname="pending-binary", pub_source=pending_source,
+    ...      status=PackagePublishingStatus.PENDING)
+
+    >>> print len(
+    ...      set(pub.binarypackagerelease.name for pub in pending_binaries))
+    1
 
 And one source with a single binary in PUBLISHED status.
 
-  >>> published_source = test_publisher.getPubSource(
-  ...     sourcename = 'published-source',
-  ...     status=PackagePublishingStatus.PUBLISHED)
-
-  >>> published_binaries = test_publisher.getPubBinaries(
-  ...     binaryname="published-binary", pub_source=published_source,
-  ...     status=PackagePublishingStatus.PUBLISHED)
-
-  >>> print len(
-  ...     set(pub.binarypackagerelease.name for pub in published_binaries))
-  1
-
-  >>> commit()
-  >>> LaunchpadZopelessLayer.switchDbUser(test_dbuser)
+    >>> published_source = test_publisher.getPubSource(
+    ...      sourcename='published-source',
+    ...      status=PackagePublishingStatus.PUBLISHED)
+
+    >>> published_binaries = test_publisher.getPubBinaries(
+    ...      binaryname="published-binary", pub_source=published_source,
+    ...      status=PackagePublishingStatus.PUBLISHED)
+
+    >>> print len(
+    ...      set(pub.binarypackagerelease.name for pub in published_binaries))
+    1
+
+    >>> commit()
+    >>> LaunchpadZopelessLayer.switchDbUser(test_dbuser)
 
 Exactly 2 new sources and 2 new binaries will be accounted for.
 
-  >>> warty.updatePackageCount()
-  >>> warty.sourcecount
-  8
-  >>> warty.binarycount
-  8
+    >>> warty.updatePackageCount()
+    >>> warty.sourcecount
+    8
+
+    >>> warty.binarycount
+    8
 
 Let's create one source with a single binary in DELETED status.
 
-  >>> LaunchpadZopelessLayer.switchDbUser('launchpad')
-  >>> commit()
-
-  >>> deleted_source = test_publisher.getPubSource(
-  ...     sourcename = 'pending-source',
-  ...     status=PackagePublishingStatus.DELETED)
-
-  >>> deleted_binaries = test_publisher.getPubBinaries(
-  ...     binaryname="pending-binary", pub_source=deleted_source,
-  ...     status=PackagePublishingStatus.DELETED)
-
-  >>> print len(
-  ...     set(pub.binarypackagerelease.name for pub in deleted_binaries))
-  1
-
-  >>> commit()
-  >>> LaunchpadZopelessLayer.switchDbUser(test_dbuser)
+    >>> LaunchpadZopelessLayer.switchDbUser('launchpad')
+
+    >>> deleted_source = test_publisher.getPubSource(
+    ...      sourcename='pending-source',
+    ...      status=PackagePublishingStatus.DELETED)
+
+    >>> deleted_binaries = test_publisher.getPubBinaries(
+    ...      binaryname="pending-binary", pub_source=deleted_source,
+    ...      status=PackagePublishingStatus.DELETED)
+
+    >>> print len(
+    ...      set(pub.binarypackagerelease.name for pub in deleted_binaries))
+    1
+
+    >>> LaunchpadZopelessLayer.switchDbUser(test_dbuser)
 
 Distroseries package counters will not account for DELETED publications.
 
-  >>> warty.updatePackageCount()
-  >>> warty.sourcecount
-  8
-  >>> warty.binarycount
-  8
+    >>> warty.updatePackageCount()
+    >>> warty.sourcecount
+    8
+
+    >>> warty.binarycount
+    8
 
 A similar mechanism is offered by IDistroArchSeries, but only for
 binaries (of course):
 
-  >>> warty_i386 = warty['i386']
+    >>> warty_i386 = warty['i386']
 
-  >>> warty_i386.package_count
-  5
+    >>> warty_i386.package_count
+    5
 
 Invoke the counter updater on this architecture:
 
-  >>> warty_i386.updatePackageCount()
+    >>> warty_i386.updatePackageCount()
 
 New values were stored:
 
-  >>> warty_i386.package_count
-  9
+    >>> warty_i386.package_count
+    9
 
 
 DistroSeriesBinaryPackage cache lookups
 =======================================
 
-The DistroSeriesBinaryPackage and DistroArchSeriesBinaryPackage
-objects uses a DistroSeriesPackageCache record to present summary and
+The DistroSeriesBinaryPackage and DistroArchSeriesBinaryPackage objects
+use a DistroSeriesPackageCache record to present the summary and
 description for the context binary package.
 
-  >>> from lp.soyuz.interfaces.binarypackagename import IBinaryPackageNameSet
-  >>> foobar_name = getUtility(IBinaryPackageNameSet).queryByName('foobar')
+    >>> from lp.soyuz.interfaces.binarypackagename import (
+    ...     IBinaryPackageNameSet)
+    >>> foobar_name = getUtility(IBinaryPackageNameSet).queryByName('foobar')
 
-  >>> primary_cache = DistroSeriesPackageCache(
-  ...     archive=ubuntu.main_archive, distroseries=warty,
-  ...     binarypackagename=foobar_name, summary='main foobar',
-  ...     description='main foobar description')
+    >>> primary_cache = DistroSeriesPackageCache(
+    ...      archive=ubuntu.main_archive, distroseries=warty,
+    ...      binarypackagename=foobar_name, summary='main foobar',
+    ...      description='main foobar description')
 
 The DistroSeriesBinaryPackage.
 
-  >>> foobar_binary = warty.getBinaryPackage('foobar')
-
-  >>> foobar_binary.cache == primary_cache
-  True
-
-  >>> print foobar_binary.summary
-  main foobar
-
-  >>> print foobar_binary.description
-  main foobar description
+    >>> foobar_binary = warty.getBinaryPackage('foobar')
+
+    >>> foobar_binary.cache == primary_cache
+    True
+
+    >>> print foobar_binary.summary
+    main foobar
+
+    >>> print foobar_binary.description
+    main foobar description
 
 The DistroArchSeriesBinaryPackage.
 
-  >>> warty_i386 = warty['i386']
-  >>> foobar_arch_binary = warty_i386.getBinaryPackage('foobar')
-
-  >>> foobar_arch_binary.cache == primary_cache
-  True
-
-  >>> print foobar_arch_binary.summary
-  main foobar
-
-  >>> print foobar_arch_binary.description
-  main foobar description
+    >>> warty_i386 = warty['i386']
+    >>> foobar_arch_binary = warty_i386.getBinaryPackage('foobar')
+
+    >>> foobar_arch_binary.cache == primary_cache
+    True
+
+    >>> print foobar_arch_binary.summary
+    main foobar
+
+    >>> print foobar_arch_binary.description
+    main foobar description
 
 This lookup mechanism will continue to work even after we have added a
 cache entry for a PPA package with the same name.
 
-  >>> ppa_cache = DistroSeriesPackageCache(
-  ...     archive=cprov.archive, distroseries=warty,
-  ...     binarypackagename=foobar_name, summary='ppa foobar')
-
-  >>> foobar_binary = warty.getBinaryPackage('foobar')
-  >>> foobar_binary.cache != ppa_cache
-  True
-
-  >>> foobar_arch_binary = warty_i386.getBinaryPackage('foobar')
-  >>> foobar_arch_binary.cache != ppa_cache
-  True
+    >>> ppa_cache = DistroSeriesPackageCache(
+    ...              archive=cprov.archive, distroseries=warty,
+    ...              binarypackagename=foobar_name, summary='ppa foobar')
+
+    >>> foobar_binary = warty.getBinaryPackage('foobar')
+    >>> foobar_binary.cache != ppa_cache
+    True
+
+    >>> foobar_arch_binary = warty_i386.getBinaryPackage('foobar')
+    >>> foobar_arch_binary.cache != ppa_cache
+    True
 
 
 Content class declarations for FTI
@@ -674,12 +654,11 @@
 
     >>> transaction.commit()
     >>> binary_name = getUtility(
-    ...     IBinaryPackageNameSet).queryByName('commercialpackage')
+    ...      IBinaryPackageNameSet).queryByName('commercialpackage')
     >>> cache_with_unicode = DistroSeriesPackageCache(
-    ...     archive=ubuntu.main_archive, distroseries=warty,
-    ...     binarypackagename=binary_name, summaries=u"“unicode“")
+    ...      archive=ubuntu.main_archive, distroseries=warty,
+    ...      binarypackagename=binary_name, summaries=u"“unicode“")
     >>> transaction.commit()
-    >>> flush_database_updates()
 
     >>> cache = DistroSeriesPackageCache.get(id=cache_with_unicode.id)
 
@@ -692,53 +671,54 @@
 Disabled archives caches
 ========================
 
-Once recognized as disabled, archives have their caches purged, so
-they won't be listed in package searches anymore.
+Once recognized as disabled, archives have their caches purged, so they
+won't be listed in package searches anymore.
 
 First, we rebuild and examine the caches for Celso's PPA.
 
     # Helper functions for completely rebuilding and dumping the cache
     # contents of a given Archive.
+
     >>> from lp.services.log.logger import BufferLogger
     >>> logger = BufferLogger()
     >>> def rebuild_caches(archive):
-    ...     DistributionSourcePackageCache.removeOld(
-    ...         ubuntu, archive=archive, log=logger)
-    ...     DistributionSourcePackageCache.updateAll(
-    ...         ubuntu, archive=archive, ztm=transaction, log=logger)
-    ...     for series in ubuntu.series:
-    ...         DistroSeriesPackageCache.removeOld(
-    ...             series, archive=archive, log=logger)
-    ...         DistroSeriesPackageCache.updateAll(
-    ...             series, archive=archive, ztm=transaction, log=logger)
-    ...     archive.updateArchiveCache()
+    ...      DistributionSourcePackageCache.removeOld(
+    ...          ubuntu, archive=archive, log=logger)
+    ...      DistributionSourcePackageCache.updateAll(
+    ...          ubuntu, archive=archive, ztm=transaction, log=logger)
+    ...      for series in ubuntu.series:
+    ...          DistroSeriesPackageCache.removeOld(
+    ...              series, archive=archive, log=logger)
+    ...          DistroSeriesPackageCache.updateAll(
+    ...              series, archive=archive, ztm=transaction, log=logger)
+    ...      archive.updateArchiveCache()
     >>> def print_caches(archive):
-    ...     source_caches = DistributionSourcePackageCache.selectBy(
-    ...         archive=archive)
-    ...     binary_caches = DistroSeriesPackageCache.selectBy(
-    ...         archive=archive)
-    ...     print '%d sources cached [%d]' % (
-    ...         archive.sources_cached, source_caches.count())
-    ...     print '%d binaries cached [%d]' % (
-    ...         archive.binaries_cached, binary_caches.count())
+    ...      source_caches = DistributionSourcePackageCache.selectBy(
+    ...          archive=archive)
+    ...      binary_caches = DistroSeriesPackageCache.selectBy(
+    ...          archive=archive)
+    ...      print '%d sources cached [%d]' % (
+    ...          archive.sources_cached, source_caches.count())
+    ...      print '%d binaries cached [%d]' % (
+    ...          archive.binaries_cached, binary_caches.count())
     >>> def print_search_results(text, user=None):
-    ...     commit()
-    ...     LaunchpadZopelessLayer.switchDbUser('launchpad')
-    ...     for ppa in ubuntu.searchPPAs(text, user=user):
-    ...         print ppa.displayname
-    ...     LaunchpadZopelessLayer.switchDbUser(test_dbuser)
+    ...      commit()
+    ...      LaunchpadZopelessLayer.switchDbUser('launchpad')
+    ...      for ppa in ubuntu.searchPPAs(text, user=user):
+    ...          print ppa.displayname
+    ...      LaunchpadZopelessLayer.switchDbUser(test_dbuser)
     >>> def enable_archive(archive):
-    ...     commit()
-    ...     LaunchpadZopelessLayer.switchDbUser('launchpad')
-    ...     archive.enable()
-    ...     commit()
-    ...     LaunchpadZopelessLayer.switchDbUser(test_dbuser)
+    ...      commit()
+    ...      LaunchpadZopelessLayer.switchDbUser('launchpad')
+    ...      archive.enable()
+    ...      commit()
+    ...      LaunchpadZopelessLayer.switchDbUser(test_dbuser)
     >>> def disable_archive(archive):
-    ...     commit()
-    ...     LaunchpadZopelessLayer.switchDbUser('launchpad')
-    ...     archive.disable()
-    ...     commit()
-    ...     LaunchpadZopelessLayer.switchDbUser(test_dbuser)
+    ...      commit()
+    ...      LaunchpadZopelessLayer.switchDbUser('launchpad')
+    ...      archive.disable()
+    ...      commit()
+    ...      LaunchpadZopelessLayer.switchDbUser(test_dbuser)
 
     >>> rebuild_caches(cprov.archive)
 
@@ -757,9 +737,8 @@
     3 sources cached [3]
     2 binaries cached [2]
 
-However the disabled PPA is not included in search results for
-anonymous requests or requests from users with no view permission to
-Celso's PPA.
+However, the disabled PPA is not included in search results for
+anonymous requests or from users with no view permission on Celso's PPA.
 
     >>> print_search_results('pmount')
 
@@ -797,3 +776,5 @@
 
     >>> print_search_results('cprov')
     PPA for Celso Providelo
+
+

=== modified file 'lib/lp/soyuz/interfaces/archive.py'
--- lib/lp/soyuz/interfaces/archive.py	2011-08-26 14:57:39 +0000
+++ lib/lp/soyuz/interfaces/archive.py	2011-08-28 08:46:25 +0000
@@ -1243,7 +1243,7 @@
     @operation_for_version('devel')
     def copyPackages(source_names, from_archive, to_pocket, person,
                      to_series=None, include_binaries=False):
-        """Atomically copy multiple named sources into this archive from another.
+        """Atomically copy named sources into this archive from another.
 
         Asynchronously copy the most recent PUBLISHED versions of the named
         sources to the destination archive if necessary.  Calls to this

=== modified file 'lib/lp/soyuz/stories/soyuz/xx-sourcepackage-changelog.txt'
--- lib/lp/soyuz/stories/soyuz/xx-sourcepackage-changelog.txt	2011-08-22 13:59:47 +0000
+++ lib/lp/soyuz/stories/soyuz/xx-sourcepackage-changelog.txt	2011-08-28 08:46:25 +0000
@@ -1,117 +1,118 @@
-== Source package changelog ==
+Source package changelog
+------------------------
 
 Browse the changelog of a sourcepackage..
 
-  >>> user_browser.open(
-  ...     "http://launchpad.dev/ubuntu/hoary/+source/pmount/+changelog";)
-  >>> print_location(user_browser.contents)
-  Hierarchy: Ubuntu > Hoary (5.04) > ...pmount... source package > Change log
-  Tabs:
-  * Overview (selected) - http://launchpad.dev/ubuntu/hoary/+source/pmount
-  * Code - http://code.launchpad.dev/ubuntu/hoary/+source/pmount
-  * Bugs - http://bugs.launchpad.dev/ubuntu/hoary/+source/pmount
-  * Blueprints - not linked
-  * Translations - http://translations.launchpad.dev/ubuntu/hoary/+source/pmount
-  * Answers - not linked
-  Main heading: Change logs for ...pmount... in Hoary
-
-
-  >>> print extract_text(
-  ...     find_tag_by_id(user_browser.contents, 'changelogs'))
-  This is a placeholder changelog for pmount 0.1-2
-  pmount (0.1-1) hoary; urgency=low
-  * Fix description (Malone #1)
-  * Fix debian (Debian #2000)
-  * Fix warty (Warty Ubuntu #1)
-  -- Sample Person &lt;test@xxxxxxxxxxxxx&gt; Tue, 7 Feb 2006 12:10:08 +0300
+    >>> user_browser.open(
+    ...     "http://launchpad.dev/ubuntu/hoary/+source/pmount/+changelog";)
+    >>> print_location(user_browser.contents)
+    Hierarchy: Ubuntu > Hoary (5.04) > ...pmount... source package > Change log
+    Tabs:
+    * Overview (selected) - http://launchpad.dev/ubuntu/hoary/+source/pmount
+    * Code - http://code.launchpad.dev/ubuntu/hoary/+source/pmount
+    * Bugs - http://bugs.launchpad.dev/ubuntu/hoary/+source/pmount
+    * Blueprints - not linked
+    * Translations - http://translations.launchpad.dev/ubuntu/hoary/+source/pmount
+    * Answers - not linked
+    Main heading: Change logs for ...pmount... in Hoary
+
+    >>> print extract_text(
+    ...           find_tag_by_id(user_browser.contents, 'changelogs'))
+    This is a placeholder changelog for pmount 0.1-2
+    pmount (0.1-1) hoary; urgency=low
+    * Fix description (Malone #1)
+    * Fix debian (Debian #2000)
+    * Fix warty (Warty Ubuntu #1)
+    -- Sample Person &lt;test@xxxxxxxxxxxxx&gt; Tue, 7 Feb 2006 12:10:08 +0300
 
 .. and another one:
 
-  >>> user_browser.open(
-  ...     "http://launchpad.dev/ubuntu/hoary/+source/alsa-utils/+changelog";)
-  >>> print extract_text(
-  ...     find_tag_by_id(user_browser.contents, 'changelogs'))
-  alsa-utils (1.0.9a-4ubuntu1) hoary; urgency=low
-  * Placeholder
-  LP: #10
-  LP: #999
-  LP: #badid
-  LP: #7, #8,
-  #11
-  -- Sample Person &lt;test@xxxxxxxxxxxxx&gt; Tue, 7 Feb 2006 12:10:08 +0300
-  alsa-utils (1.0.9a-4) warty; urgency=low
-  * Placeholder
-  -- Sample Person &lt;test@xxxxxxxxxxxxx&gt; Tue, 7 Feb 2006 12:10:08 +0300
+    >>> user_browser.open(
+    ...     "http://launchpad.dev/ubuntu/hoary/+source/alsa-utils/+changelog";)
+    >>> print extract_text(
+    ...           find_tag_by_id(user_browser.contents, 'changelogs'))
+    alsa-utils (1.0.9a-4ubuntu1) hoary; urgency=low
+    * Placeholder
+    LP: #10
+    LP: #999
+    LP: #badid
+    LP: #7, #8,
+    #11
+    -- Sample Person &lt;test@xxxxxxxxxxxxx&gt; Tue, 7 Feb 2006 12:10:08 +0300
+    alsa-utils (1.0.9a-4) warty; urgency=low
+    * Placeholder
+    -- Sample Person &lt;test@xxxxxxxxxxxxx&gt; Tue, 7 Feb 2006 12:10:08 +0300
 
 The LP: #<number> entries are also linkified:
 
-  >>> user_browser.getLink('#10').url
-  'http://launchpad.dev/bugs/10'
-
-  >>> user_browser.getLink('#999').url
-  'http://launchpad.dev/bugs/999'
-
-  >>> user_browser.getLink('#badid').url
-  Traceback (most recent call last):
-  ...
-  LinkNotFoundError
-
-  >>> user_browser.getLink('#7').url
-  'http://launchpad.dev/bugs/7'
-
-  >>> user_browser.getLink('#8').url
-  'http://launchpad.dev/bugs/8'
-
-  >>> user_browser.getLink('#11').url
-  'http://launchpad.dev/bugs/11'
+    >>> user_browser.getLink('#10').url
+    'http://launchpad.dev/bugs/10'
+
+    >>> user_browser.getLink('#999').url
+    'http://launchpad.dev/bugs/999'
+
+    >>> user_browser.getLink('#badid').url
+    Traceback (most recent call last):
+    ...
+    LinkNotFoundError
+
+    >>> user_browser.getLink('#7').url
+    'http://launchpad.dev/bugs/7'
+
+    >>> user_browser.getLink('#8').url
+    'http://launchpad.dev/bugs/8'
+
+    >>> user_browser.getLink('#11').url
+    'http://launchpad.dev/bugs/11'
 
 The output above shows email addresses; however, any email addresses in
 the changelog are obfuscated when the user is not logged in (this stops
 bots from picking them up):
 
-  >>> anon_browser.open(
-  ...     "http://launchpad.dev/ubuntu/hoary/+source/alsa-utils/+changelog";)
-  >>> print extract_text(find_main_content(anon_browser.contents))
-  Change logs for ...alsa-utils... in Hoary
-  ...
-  -- Sample Person &lt;email address hidden&gt; Tue, 7 Feb 2006 12:10:08 +0300
-  alsa-utils (1.0.9a-4) warty; urgency=low
-  * Placeholder
-  -- Sample Person &lt;email address hidden&gt; Tue, 7 Feb 2006 12:10:08 +0300
+    >>> anon_browser.open(
+    ...     "http://launchpad.dev/ubuntu/hoary/+source/alsa-utils/+changelog";)
+    >>> print extract_text(find_main_content(anon_browser.contents))
+    Change logs for ...alsa-utils... in Hoary
+    ...
+    -- Sample Person &lt;email address hidden&gt; Tue, 7 Feb 2006 12:10:08 +0300
+    alsa-utils (1.0.9a-4) warty; urgency=low
+    * Placeholder
+    -- Sample Person &lt;email address hidden&gt; Tue, 7 Feb 2006 12:10:08 +0300
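
The obfuscation itself is a simple substitution; roughly (a sketch, not
the code Launchpad actually uses):

    >>> import re
    >>> def hide_emails(text):
    ...     # Replace anything shaped like an email address with the
    ...     # placeholder seen in the output above.
    ...     return re.sub(r'\b\S+@\S+\b', 'email address hidden', text)
    >>> print hide_emails('-- Sample Person <test@example.com>')
    -- Sample Person <email address hidden>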
 
 If the email address is recognised as one registered in Launchpad, the
 address is linkified to point to the person's profile page.  Here,
 'commercialpackage' has a known email address in its changelog:
 
-  >>> user_browser.open(
-  ...     "http://launchpad.dev/ubuntu/breezy-autotest/+source/";
-  ...     "commercialpackage/+changelog")
-  >>> changelog = find_tag_by_id(
-  ...     user_browser.contents, 'commercialpackage_1.0-1')
-  >>> print extract_text(changelog.find('a'))
-  foo.bar@xxxxxxxxxxxxx
+    >>> user_browser.open(
+    ...           "http://launchpad.dev/ubuntu/breezy-autotest/+source/";
+    ...           "commercialpackage/+changelog")
+    >>> changelog = find_tag_by_id(
+    ...           user_browser.contents, 'commercialpackage_1.0-1')
+    >>> print extract_text(changelog.find('a'))
+    foo.bar@xxxxxxxxxxxxx
 
 Browsing the individual sourcepackage changelog.
 
-  >>> user_browser.open(
-  ...  "http://launchpad.dev/ubuntu/hoary/+source/alsa-utils/1.0.9a-4ubuntu1";)
+    >>> user_browser.open(
+    ...    "http://launchpad.dev/ubuntu/hoary/+source/alsa-utils/1.0.9a-4ubuntu1";)
 
 The changelog is still linkified here so that the bug links work,
 although the version link will point to the same page we're already on.
 
-  >>> print find_tag_by_id(
-  ...     user_browser.contents, 'alsa-utils_1.0.9a-4ubuntu1')
-  <pre ... id="alsa-utils_1.0.9a-4ubuntu1">alsa-utils (1.0.9a-4ubuntu1) ...
-  <BLANKLINE>
-  ...
-   LP: <a href="/bugs/10" class="bug-link">#10</a>
-   LP: <a href="/bugs/999" class="bug-link">#999</a>
-   LP: #badid
-  ...
+    >>> print find_tag_by_id(
+    ...           user_browser.contents, 'alsa-utils_1.0.9a-4ubuntu1')
+    <pre ... id="alsa-utils_1.0.9a-4ubuntu1">alsa-utils (1.0.9a-4ubuntu1) ...
+    <BLANKLINE>
+    ...
+     LP: <a href="/bugs/10" class="bug-link">#10</a>
+     LP: <a href="/bugs/999" class="bug-link">#999</a>
+     LP: #badid
+    ...
 
 We should see some changelog information on the main package page.
 
-  >>> user_browser.open("http://launchpad.dev/ubuntu/+source/pmount/";)
-  >>> user_browser.title
-  '\xe2\x80\x9cpmount\xe2\x80\x9d package : Ubuntu'
+    >>> user_browser.open("http://launchpad.dev/ubuntu/+source/pmount/";)
+    >>> user_browser.title
+    '\xe2\x80\x9cpmount\xe2\x80\x9d package : Ubuntu'
+
 

=== modified file 'lib/lp/soyuz/stories/webservice/xx-archive.txt'
--- lib/lp/soyuz/stories/webservice/xx-archive.txt	2011-08-24 10:54:30 +0000
+++ lib/lp/soyuz/stories/webservice/xx-archive.txt	2011-08-28 08:46:25 +0000
@@ -34,7 +34,8 @@
 
 For "devel" additional attributes are available.
 
-    >>> cprov_archive_devel = webservice.get("/~cprov/+archive/ppa", api_version='devel').jsonBody()
+    >>> cprov_archive_devel = webservice.get(
+    ...     "/~cprov/+archive/ppa", api_version='devel').jsonBody()
     >>> pprint_entry(cprov_archive_devel)
     commercial: False
     dependencies_collection_link: u'http://.../~cprov/+archive/ppa/dependencies'