[Merge] lp:~jtv/launchpad/megalint-1 into lp:launchpad

 

Jeroen T. Vermeulen has proposed merging lp:~jtv/launchpad/megalint-1 into lp:launchpad.

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)

For more details, see:
https://code.launchpad.net/~jtv/launchpad/megalint-1/+merge/75965

= Summary =

One of the things you want to do after resolving a conflict with devel, but before committing, is to run “make lint” to check that the changes make basic sense and do not degrade the readability of the code.

But it's no cakewalk.

A few days' worth of changed files in devel will produce lots and lots and lots of lint.  Not cleaning up a file you touch has a cost that is not very visible, but it's there.  Plus, of course, we want our codebase to be maintainable!


== Proposed fix ==

Fix lint.  In the vocabulary test I went a little further and introduced a helper to make the tests a bit punchier.  Oh, and I deleted an obsolete Translations script in database/schema/pending.  It still referenced POSubmission, which makes me think it wasn't really written in 2009 as the copyright notice said.  More likely 2007 or earlier.


== Tests ==

I ran the tests that I touched individually; the rest will have to come out in the EC2 run I'm currently also trying to launch.


= Launchpad lint =

The _pythonpath warnings are, alas, inevitable.  The harness ones are about variables set up for use in your harness session, although I doubt we still need things like “ds” for the DistroSeries with id 1.  I'll email the list about it.

The “redefinition of unused 'etree'” warning comes from an alternative import for a module that may not be available; there's not much to be done about it without fully understanding the ramifications.
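
For reference, the pattern that trips pyflakes there looks roughly like this (module names are illustrative; the actual imports in hwdbsubmissions.py may differ):

    # Only one of these imports can succeed at runtime, but pyflakes sees
    # the second binding of "etree" as a redefinition of an unused name.
    try:
        import xml.etree.cElementTree as etree
    except ImportError:
        import cElementTree as etree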

I regretfully left one “trailing whitespace” in the bug-notification doctest because the whitespace is part of a whitespace-sensitive output check.  Ironically, I do think that the whitespace is probably a bug.  But that's for another time.
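
To illustrate (with made-up output, not the actual bug-notification text): doctest compares the expected “want” block against what the code prints character for character, so a trailing space that the output genuinely contains has to stay in the doctest:

    import doctest

    checker = doctest.OutputChecker()
    # The expectation ends in a space because the real output does.
    print checker.check_output('To: foo@example.com, \n',
                               'To: foo@example.com, \n', 0)  # True
    # Strip the space from the expectation and the comparison fails
    # (unless NORMALIZE_WHITESPACE were enabled for that example).
    print checker.check_output('To: foo@example.com,\n',
                               'To: foo@example.com, \n', 0)  # False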

Oh, the “multiple spaces before operator” warning in fti.py?  No idea what that is.  I'm just not seeing it in the code, but it's weird enough that I may easily have missed something.
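
For what it's worth, pep8's E221 usually comes from column-aligned assignments, something along these lines (purely illustrative; I could not match it to line 468):

    # The extra padding before "=" to line the values up triggers E221.
    A_WEIGHT  = 1.0
    B_WEIGHT  = 0.4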


Checking for conflicts and issues in changed files.

Linting changed files:
  lib/lp/blueprints/browser/sprintspecification.py
  lib/canonical/lp/ftests/test_zopeless.py
  lib/lp/registry/model/distributionsourcepackage.py
  lib/canonical/librarian/client.py
  database/replication/report.py
  lib/lp/codehosting/puller/tests/test_scheduler.py
  lib/lp/codehosting/codeimport/worker.py
  lib/canonical/launchpad/webapp/publisher.py
  lib/canonical/launchpad/scripts/tests/test_scriptmonitor.py
  lib/lp/bugs/doc/initial-bug-contacts.txt
  lib/lp/archivepublisher/model/ftparchive.py
  lib/canonical/launchpad/doc/librarian.txt
  lib/lp/code/browser/branchsubscription.py
  lib/lp/blueprints/browser/sprint.py
  database/replication/slon_ctl.py
  lib/lp/registry/browser/tests/test_distroseries.py
  lib/canonical/launchpad/utilities/looptuner.py
  lib/canonical/database/ftests/test_sqlbaseconnect.txt
  lib/lp/registry/browser/productrelease.py
  database/replication/new-slave.py
  lib/lp/hardwaredb/scripts/hwdbsubmissions.py
  lib/canonical/database/harness.py
  lib/lp/bugs/doc/bugnotification-email.txt
  lib/canonical/launchpad/webapp/errorlog.py
  lib/canonical/librarian/ftests/test_gc.py
  database/schema/fti.py
  lib/lp/bugs/doc/bug-heat.txt
  database/schema/pending/prune-nonce.py
  lib/lp/codehosting/codeimport/tests/test_worker.py
  lib/lp/app/browser/tests/test_vocabulary.py
  lib/canonical/launchpad/doc/helpers.txt
  database/replication/preamble.py
  lib/canonical/lazr/doc/utils.txt

./database/replication/report.py
      24: '_pythonpath' imported but unused
./database/replication/slon_ctl.py
      17: '_pythonpath' imported but unused
./database/replication/new-slave.py
      17: '_pythonpath' imported but unused
./lib/lp/hardwaredb/scripts/hwdbsubmissions.py
      26: redefinition of unused 'etree' from line 24
./lib/canonical/database/harness.py
      25: 'from storm.expr import *' used; unable to detect undefined names
      27: 'from storm.locals import *' used; unable to detect undefined names
      81: local variable 'd' is assigned to but never used
      92: local variable 'factory' is assigned to but never used
      89: local variable 'q' is assigned to but never used
      82: local variable 'p' is assigned to but never used
      88: local variable 's' is assigned to but never used
      85: local variable 'proj' is assigned to but never used
      86: local variable 'b2' is assigned to but never used
      84: local variable 'prod' is assigned to but never used
      83: local variable 'ds' is assigned to but never used
      76: local variable 'store' is assigned to but never used
      87: local variable 'b1' is assigned to but never used
      34: 'canonical_url' imported but unused
      35: 'DEFAULT_FLAVOR' imported but unused
      35: 'SLAVE_FLAVOR' imported but unused
      31: 'removeSecurityProxy' imported but unused
      24: 'utc' imported but unused
      28: 'transaction' imported but unused
      30: 'verifyObject' imported but unused
      21: 'rlcompleter' imported but unused
./lib/lp/bugs/doc/bugnotification-email.txt
     214: want has trailing whitespace.
./database/schema/fti.py
      23: '_pythonpath' imported but unused
     468: E221 multiple spaces before operator
./database/schema/pending/prune-nonce.py
      13: '_pythonpath' imported but unused
./database/replication/preamble.py
      15: '_pythonpath' imported but unused
-- 
https://code.launchpad.net/~jtv/launchpad/megalint-1/+merge/75965
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~jtv/launchpad/megalint-1 into lp:launchpad.
=== modified file 'database/replication/new-slave.py'
--- database/replication/new-slave.py	2011-09-06 04:36:09 +0000
+++ database/replication/new-slave.py	2011-09-19 07:28:22 +0000
@@ -1,6 +1,6 @@
 #!/usr/bin/python -S
 #
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Bring a new slave online."""
@@ -8,23 +8,28 @@
 __metaclass__ = type
 __all__ = []
 
-import _pythonpath
-
 from optparse import OptionParser
 import subprocess
 import sys
+from textwrap import dedent
 import time
-from textwrap import dedent
 
+import _pythonpath
 import psycopg2
+import replication.helpers
+from replication.helpers import LPMAIN_SET_ID
 
 from canonical.database.postgresql import ConnectionString
 from canonical.database.sqlbase import (
-    connect_string, ISOLATION_LEVEL_AUTOCOMMIT)
-from canonical.launchpad.scripts import db_options, logger_options, logger
+    connect_string,
+    ISOLATION_LEVEL_AUTOCOMMIT,
+    )
+from canonical.launchpad.scripts import (
+    db_options,
+    logger,
+    logger_options,
+    )
 
-import replication.helpers
-from replication.helpers import LPMAIN_SET_ID
 
 def main():
     parser = OptionParser(
@@ -194,8 +199,7 @@
         """)
 
     full_sync = []
-    sync_nicknames = [node.nickname for node in existing_nodes]
-    sync_nicknames.append('new_node');
+    sync_nicknames = [node.nickname for node in existing_nodes] + ['new_node']
     for nickname in sync_nicknames:
         full_sync.append(dedent("""\
             echo 'Waiting for %(nickname)s sync.';
@@ -248,7 +252,8 @@
 
     # Confirm we can connect from here.
     try:
-        test_con = psycopg2.connect(str(connection_string))
+        # Test connection only.  We're not going to use it.
+        psycopg2.connect(str(connection_string))
     except psycopg2.Error, exception:
         parser.error("Failed to connect to using '%s' (%s)" % (
             connection_string, str(exception).strip()))

=== modified file 'database/replication/preamble.py'
--- database/replication/preamble.py	2011-09-06 05:00:26 +0000
+++ database/replication/preamble.py	2011-09-19 07:28:22 +0000
@@ -1,6 +1,6 @@
 #!/usr/bin/python -S
 #
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Generate a preamble for slonik(1) scripts based on the current LPCONFIG.
@@ -9,16 +9,16 @@
 __metaclass__ = type
 __all__ = []
 
+from optparse import OptionParser
+import time
+
 import _pythonpath
-
-import time
-from optparse import OptionParser
+import replication.helpers
 
 from canonical.config import config
 from canonical.database.sqlbase import connect
 from canonical.launchpad import scripts
 
-import replication.helpers
 
 if __name__ == '__main__':
     parser = OptionParser()
@@ -33,4 +33,3 @@
     print '# LPCONFIG=%s' % config.instance_name
     print
     print replication.helpers.preamble(con)
-

=== modified file 'database/replication/report.py'
--- database/replication/report.py	2011-09-06 05:00:26 +0000
+++ database/replication/report.py	2011-09-19 07:28:22 +0000
@@ -1,6 +1,6 @@
 #!/usr/bin/python -S
 #
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Generate a report on the replication setup.
@@ -16,22 +16,28 @@
 __metaclass__ = type
 __all__ = []
 
-import _pythonpath
-
 from cgi import escape as html_escape
 from cStringIO import StringIO
 from optparse import OptionParser
 import sys
 
-from canonical.database.sqlbase import connect, quote_identifier, sqlvalues
+import _pythonpath
+import replication.helpers
+
+from canonical.database.sqlbase import (
+    connect,
+    quote_identifier,
+    sqlvalues,
+    )
 from canonical.launchpad.scripts import db_options
-import replication.helpers
 
 
 class Table:
-    labels = None # List of labels to render as the first row of the table.
-    rows = None # List of rows, each row being a list of strings.
+    """Representation of a table.
 
+    :ivar labels: List of labels to render as the table's first row.
+    :ivar rows: List of rows, each being a list of strings.
+    """
     def __init__(self, labels=None):
         if labels is None:
             self.labels = []
@@ -76,11 +82,9 @@
         for label in table.labels:
             max_col_widths.append(len(label))
         for row in table.rows:
-            row = list(row) # We need len()
-            for col_idx in range(0,len(row)):
-                col = row[col_idx]
+            for col_idx, col in enumerate(row):
                 max_col_widths[col_idx] = max(
-                    len(str(row[col_idx])), max_col_widths[col_idx])
+                    len(str(col)), max_col_widths[col_idx])
 
         out = StringIO()
         for label_idx in range(0, len(table.labels)):
@@ -88,12 +92,13 @@
                 max_col_widths[label_idx]),
         print >> out
         for width in max_col_widths:
-            print >> out, '='*width,
+            print >> out, '=' * width,
         print >> out
         for row in table.rows:
             row = list(row)
             for col_idx in range(0, len(row)):
-                print >> out, str(row[col_idx]).ljust(max_col_widths[col_idx]),
+                print >> out, str(
+                    row[col_idx]).ljust(max_col_widths[col_idx]),
             print >> out
         print >> out
 
@@ -266,7 +271,6 @@
                 % replication.helpers.CLUSTERNAME)
         return 1
 
-
     # Set our search path to the schema of the cluster we care about.
     cur.execute(
             "SET search_path TO %s, public"

=== modified file 'database/replication/slon_ctl.py'
--- database/replication/slon_ctl.py	2011-09-12 14:26:53 +0000
+++ database/replication/slon_ctl.py	2011-09-19 07:28:22 +0000
@@ -1,6 +1,6 @@
 #!/usr/bin/python -S
 #
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Startup and shutdown slon processes.
@@ -9,17 +9,21 @@
 /etc/init.d/slony1 script instead of this tool.
 """
 
-import _pythonpath
-
+from optparse import OptionParser
 import os.path
 import subprocess
 import sys
-from optparse import OptionParser
+
+import _pythonpath
+import replication.helpers
 
 from canonical.config import config
 from canonical.database.sqlbase import connect
-from canonical.launchpad.scripts import logger, logger_options
-import replication.helpers
+from canonical.launchpad.scripts import (
+    logger,
+    logger_options,
+    )
+
 
 __metaclass__ = type
 __all__ = []
@@ -131,7 +135,6 @@
 def stop(log, nodes):
     for node in nodes:
         pidfile = get_pidfile(node.nickname)
-        logfile = get_logfile(node.nickname)
 
         if not os.path.exists(pidfile):
             log.info(

=== modified file 'database/schema/fti.py'
--- database/schema/fti.py	2011-09-15 00:30:26 +0000
+++ database/schema/fti.py	2011-09-19 07:28:22 +0000
@@ -1,6 +1,6 @@
 #!/usr/bin/python -S
 #
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 #
 # This modules uses relative imports.
@@ -11,34 +11,41 @@
 """
 __metaclass__ = type
 
-import _pythonpath
-
 from distutils.version import LooseVersion
+from optparse import OptionParser
 import os.path
-from optparse import OptionParser
 import subprocess
 import sys
 from tempfile import NamedTemporaryFile
 from textwrap import dedent
 import time
 
+import _pythonpath
 import psycopg2.extensions
+import replication.helpers
 
 from canonical.config import config
 from canonical.database.postgresql import ConnectionString
 from canonical.database.sqlbase import (
-    connect, ISOLATION_LEVEL_AUTOCOMMIT, ISOLATION_LEVEL_READ_COMMITTED,
-    quote, quote_identifier)
-from canonical.launchpad.scripts import logger, logger_options, db_options
-
-import replication.helpers
+    connect,
+    ISOLATION_LEVEL_AUTOCOMMIT,
+    ISOLATION_LEVEL_READ_COMMITTED,
+    quote,
+    quote_identifier,
+    )
+from canonical.launchpad.scripts import (
+    db_options,
+    logger,
+    logger_options,
+    )
 
 # Defines parser and locale to use.
 DEFAULT_CONFIG = 'default'
 
 PGSQL_BASE = '/usr/share/postgresql'
 
-A, B, C, D = 'ABCD' # tsearch2 ranking constants
+# tsearch2 ranking constants:
+A, B, C, D = 'ABCD'
 
 # This data structure defines all of our full text indexes.  Each tuple in the
 # top level list creates a 'fti' column in the specified table.
@@ -228,7 +235,7 @@
     # Create the trigger
     columns_and_weights = []
     for column, weight in qcolumns:
-        columns_and_weights.extend( (column, weight) )
+        columns_and_weights.extend((column, weight))
 
     sql = """
         CREATE TRIGGER tsvectorupdate BEFORE UPDATE OR INSERT ON %s
@@ -260,7 +267,9 @@
 def liverebuild(con):
     """Rebuild the data in all the fti columns against possibly live database.
     """
-    batch_size = 50 # Update maximum of this many rows per commit
+    # Update number of rows per transaction.
+    batch_size = 50
+
     cur = con.cursor()
     for table, ignored in ALL_FTI:
         table = quote_identifier(table)
@@ -320,13 +329,14 @@
         p = subprocess.Popen(
             cmd.split(' '), stdin=subprocess.PIPE,
             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+        tsearch2_sql = open(tsearch2_sql_path).read()
         out, err = p.communicate(
-            "SET client_min_messages=ERROR; CREATE SCHEMA ts2;"
-            + open(tsearch2_sql_path).read().replace('public;','ts2, public;'))
+            "SET client_min_messages=ERROR; CREATE SCHEMA ts2;" +
+            tsearch2_sql.replace('public;', 'ts2, public;'))
         if p.returncode != 0:
-            log.fatal('Error executing %s:', cmd)
+            log.fatal("Error executing %s:", cmd)
             log.debug(out)
-            sys.exit(rv)
+            sys.exit(p.returncode)
 
     # Create ftq helper and its sibling _ftq.
     # ftq(text) returns a tsquery, suitable for use querying the full text
@@ -367,7 +377,8 @@
         # Strip ! characters inside and at the end of a word
         query = re.sub(r"(?u)(?<=\w)[\!]+", " ", query)
 
-        # Now that we have handle case sensitive booleans, convert to lowercase
+        # Now that we have handled case-sensitive booleans, convert to
+        # lowercase.
         query = query.lower()
 
         # Convert foo-bar to ((foo&bar)|foobar) and foo-bar-baz to
@@ -573,7 +584,7 @@
     We know this by looking in our cache to see what the previous
     definitions were, and the --force command line argument
     '''
-    current_columns = repr(sorted(columns)) # Convert to a string
+    current_columns = repr(sorted(columns))
 
     existing = execute(
         con, "SELECT columns FROM FtiCache WHERE tablename=%(table)s",
@@ -625,6 +636,7 @@
 # Files for output generated for slonik(1). None if not a Slony-I install.
 slonik_sql = None
 
+
 def main():
     parser = OptionParser()
     parser.add_option(
@@ -710,5 +722,3 @@
 
 if __name__ == '__main__':
     sys.exit(main())
-
-

=== modified file 'database/schema/pending/prune-nonce.py'
--- database/schema/pending/prune-nonce.py	2011-09-06 05:00:26 +0000
+++ database/schema/pending/prune-nonce.py	2011-09-19 07:28:22 +0000
@@ -1,6 +1,6 @@
 #!/usr/bin/python -S
 #
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Prune old nonces."""
@@ -8,13 +8,20 @@
 __metaclass__ = type
 __all__ = []
 
+from optparse import OptionParser
+
 import _pythonpath
 
-from optparse import OptionParser
-
-from canonical.database.sqlbase import connect, ISOLATION_LEVEL_AUTOCOMMIT
+from canonical.database.sqlbase import (
+    connect,
+    ISOLATION_LEVEL_AUTOCOMMIT,
+    )
 from canonical.launchpad.scripts import db_options
-from canonical.launchpad.scripts.logger import log, logger_options
+from canonical.launchpad.scripts.logger import (
+    log,
+    logger_options,
+    )
+
 
 def update_until_done(con, table, query, vacuum_every=100):
     log.info("Running %s" % query)

=== removed file 'database/schema/pending/update-translation-credits.py'
--- database/schema/pending/update-translation-credits.py	2011-09-06 05:00:26 +0000
+++ database/schema/pending/update-translation-credits.py	1970-01-01 00:00:00 +0000
@@ -1,111 +0,0 @@
-#!/usr/bin/python -S
-#
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
-# GNU Affero General Public License version 3 (see the file LICENSE).
-
-"""Populate some new columns on the Person table."""
-
-__metaclass__ = type
-__all__ = []
-
-import _pythonpath
-
-from optparse import OptionParser
-import sys
-
-from canonical.database.sqlbase import connect, ISOLATION_LEVEL_AUTOCOMMIT
-from canonical.launchpad.scripts import db_options
-from canonical.launchpad.scripts.logger import log, logger_options
-
-def update_until_done(con, table, query, vacuum_every=100):
-    log.info("Running %s" % query)
-    loops = 0
-    total_rows = 0
-    cur = con.cursor()
-    while True:
-        loops += 1
-        cur.execute(query)
-        rowcount = cur.rowcount
-        total_rows += rowcount
-        log.debug("Updated %d" % total_rows)
-        if loops % vacuum_every == 0:
-            log.debug("Vacuuming %s" % table)
-            cur.execute("VACUUM %s" % table)
-        if rowcount <= 0:
-            log.info("Done")
-            return
-
-parser = OptionParser()
-logger_options(parser)
-db_options(parser)
-options, args = parser.parse_args()
-
-con = connect(isolation=ISOLATION_LEVEL_AUTOCOMMIT)
-
-#  People have so far updated translation credits, often mis-crediting people,
-#  or removing credits to upstream translators: we want to disable all of these
-#  translation, so automatic handling will pick up from now on.
-update_until_done(con, 'posubmission', """
-    UPDATE posubmission SET active=FALSE WHERE id IN (
-        SELECT posubmission.id
-        FROM posubmission,
-             pomsgset,
-             potmsgset,
-             pomsgid
-        WHERE
-            posubmission.active IS TRUE AND
-            posubmission.pomsgset=pomsgset.id AND
-            potmsgset=potmsgset.id AND
-            primemsgid=pomsgid.id AND
-            published IS NOT TRUE AND
-            (msgid='translation-credits' OR
-             msgid='translator-credits' OR
-             msgid='translator_credits' OR
-             msgid=E'_: EMAIL OF TRANSLATORS\nYour emails' OR
-             msgid=E'_: NAME OF TRANSLATORS\nYour names')
-        LIMIT 200
-        )
-    """)
-
-# Set any existing inactive published translations as active
-update_until_done(con, 'posubmission', """
-    UPDATE posubmission SET active=TRUE WHERE id IN (
-        SELECT posubmission.id
-        FROM posubmission,
-             pomsgset,
-             potmsgset,
-             pomsgid
-        WHERE
-            posubmission.active IS FALSE AND
-            posubmission.pomsgset=pomsgset.id AND
-            pomsgset.potmsgset=potmsgset.id AND
-            potmsgset.primemsgid=pomsgid.id AND
-            posubmission.published IS TRUE AND
-            (msgid='translation-credits' OR
-             msgid='translator-credits' OR
-             msgid='translator_credits' OR
-             msgid=E'_: EMAIL OF TRANSLATORS\nYour emails' OR
-             msgid=E'_: NAME OF TRANSLATORS\nYour names')
-        LIMIT 200
-        )
-    """)
-
-# Remove reviewer, date_reviewed from all translation credit POMsgSets
-update_until_done(con, 'pomsgset', """
-    UPDATE POMsgSet SET reviewer=NULL, date_reviewed=NULL
-    WHERE id IN (
-        SELECT POMsgSet.id
-        FROM POMsgSet, POTMsgSet, POMsgId
-        WHERE
-            POMsgSet.reviewer IS NOT NULL AND
-            POMsgSet.potmsgset=POTMsgSet.id AND
-            POTMsgSet.primemsgid=POMsgId.id AND
-            (msgid='translation-credits' OR
-            msgid='translator-credits' OR
-            msgid='translator_credits' OR
-            msgid=E'_: EMAIL OF TRANSLATORS\nYour emails' OR
-            msgid=E'_: NAME OF TRANSLATORS\nYour names')
-        LIMIT 200
-        )
-    """)
-

=== modified file 'lib/canonical/database/ftests/test_sqlbaseconnect.txt'
--- lib/canonical/database/ftests/test_sqlbaseconnect.txt	2011-09-06 05:00:26 +0000
+++ lib/canonical/database/ftests/test_sqlbaseconnect.txt	2011-09-19 07:28:22 +0000
@@ -2,8 +2,7 @@
 
     >>> from canonical.config import config
     >>> from canonical.database.sqlbase import (
-    ...     connect, ISOLATION_LEVEL_AUTOCOMMIT, ISOLATION_LEVEL_DEFAULT,
-    ...     ISOLATION_LEVEL_READ_COMMITTED, ISOLATION_LEVEL_SERIALIZABLE)
+    ...     connect, ISOLATION_LEVEL_DEFAULT, ISOLATION_LEVEL_SERIALIZABLE)
 
     >>> def do_connect(user, dbname=None, isolation=ISOLATION_LEVEL_DEFAULT):
     ...     con = connect(user=user, dbname=dbname, isolation=isolation)
@@ -24,7 +23,8 @@
 Specifying the database name connects to that database.
 
     >>> do_connect(user=config.launchpad.dbuser, dbname='launchpad_empty')
-    Connected as launchpad_main to launchpad_empty in read committed isolation.
+    Connected as launchpad_main to launchpad_empty in read committed
+    isolation.
 
 Specifying the isolation level works too.
 
@@ -32,4 +32,3 @@
     ...     user=config.launchpad.dbuser,
     ...     isolation=ISOLATION_LEVEL_SERIALIZABLE)
     Connected as launchpad_main to ... in serializable isolation.
-

=== modified file 'lib/canonical/database/harness.py'
--- lib/canonical/database/harness.py	2011-09-18 06:59:33 +0000
+++ lib/canonical/database/harness.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009-2010 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Scripts for starting a Python prompt with Launchpad initialized.
@@ -17,17 +17,28 @@
 
 #
 import os
+import readline
+import rlcompleter
 import sys
 
 from pytz import utc
+from storm.expr import *
+# Bring in useful bits of Storm.
+from storm.locals import *
 import transaction
-
 from zope.component import getUtility
+from zope.interface.verify import verifyObject
 from zope.security.proxy import removeSecurityProxy
 
 from canonical.launchpad.scripts import execute_zcml_for_scripts
 from canonical.launchpad.webapp import canonical_url
-
+from canonical.launchpad.webapp.interfaces import (
+    DEFAULT_FLAVOR,
+    IStoreSelector,
+    MAIN_STORE,
+    MASTER_FLAVOR,
+    SLAVE_FLAVOR,
+    )
 from lp.answers.model.question import Question
 from lp.blueprints.model.specification import Specification
 from lp.bugs.model.bug import Bug
@@ -38,38 +49,25 @@
 from lp.registry.model.projectgroup import ProjectGroup
 from lp.testing.factory import LaunchpadObjectFactory
 
-from zope.interface.verify import verifyObject
-
-import readline
-import rlcompleter
-
-# Bring in useful bits of Storm.
-from storm.locals import *
-from storm.expr import *
-from canonical.launchpad.webapp.interfaces import (
-    IStoreSelector, MAIN_STORE, MASTER_FLAVOR, SLAVE_FLAVOR, DEFAULT_FLAVOR)
-
 
 def _get_locals():
     if len(sys.argv) > 1:
         dbuser = sys.argv[1]
     else:
         dbuser = None
-    print 'execute_zcml_for_scripts()...'
     execute_zcml_for_scripts()
     readline.parse_and_bind('tab: complete')
-    # Mimic the real interactive interpreter's loading of any $PYTHONSTARTUP file.
-    print 'Reading $PYTHONSTARTUP...'
+    # Mimic the real interactive interpreter's loading of any
+    # $PYTHONSTARTUP file.
     startup = os.environ.get('PYTHONSTARTUP')
     if startup:
         execfile(startup)
-    print 'Initializing storm...'
     store_selector = getUtility(IStoreSelector)
     store = store_selector.get(MAIN_STORE, MASTER_FLAVOR)
 
-    # Let's get a few handy objects going.
     if dbuser == 'launchpad':
-        print 'Creating a few handy objects...'
+        # Create a few variables "in case they come in handy."
+        # Do we really use these?  Are they worth carrying around?
         d = Distribution.get(1)
         p = Person.get(1)
         ds = DistroSeries.get(1)
@@ -81,7 +79,6 @@
         q = Question.get(1)
 
     # Having a factory instance is handy.
-    print 'Creating the factory...'
     factory = LaunchpadObjectFactory()
     res = {}
     res.update(locals())

=== modified file 'lib/canonical/launchpad/doc/helpers.txt'
--- lib/canonical/launchpad/doc/helpers.txt	2011-09-13 13:04:10 +0000
+++ lib/canonical/launchpad/doc/helpers.txt	2011-09-19 07:28:22 +0000
@@ -1,16 +1,18 @@
-= Helper Functions and Classes =
+Helper Functions and Classes
+============================
 
 There are some helper functions and classes located in
 canonical.launchpad.helpers which can be used in order to avoid
 duplicating code in different parts of Launchpad.
 
 
-== Concatenating lists English-style ==
+Concatenating lists English-style
+---------------------------------
 
-The english_list function takes a list of strings and concatenates
-them in a form suitable for inclusion in an English sentence. For
-lists of 3 or more elements it follows the advice given in The
-Elements of Style, chapter I, section 2.
+The english_list function takes a list of strings and concatenates them
+in a form suitable for inclusion in an English sentence. For lists of 3
+or more elements it follows the advice given in The Elements of Style,
+chapter I, section 2.
 
     >>> from canonical.launchpad.helpers import english_list
 
@@ -20,10 +22,13 @@
 
     >>> english_list([])
     ''
+
     >>> english_list(['Fred'])
     'Fred'
+
     >>> english_list(['Fred', 'Bob'])
     'Fred and Bob'
+
     >>> english_list(['Fred', 'Bob', 'Harold'])
     'Fred, Bob, and Harold'
 
@@ -31,6 +36,7 @@
 
     >>> english_list('12345')
     '1, 2, 3, 4, and 5'
+
     >>> english_list(str(i) for i in xrange(5))
     '0, 1, 2, 3, and 4'
 
@@ -47,16 +53,17 @@
     '1, 2, or 3'
 
 
-== get_contact_email_addresses ==
+get_contact_email_addresses
+---------------------------
 
-The get_contact_email_addresses function is used to get a person's preferred
-address as a set for function that require the a set of email addresses.
+The get_contact_email_addresses function is used to get a person's
+preferred address as a set for function that require the a set of email
+addresses.
 
     >>> from canonical.launchpad.helpers import get_contact_email_addresses
 
     >>> user = factory.makePerson(
-    ...     email='user@xxxxxxxxxxxxx',
-    ...     name='user',)
+    ...        email='user@xxxxxxxxxxxxx',
+    ...        name='user',)
     >>> get_contact_email_addresses(user)
     set(['user@xxxxxxxxxxxxx'])
-

=== modified file 'lib/canonical/launchpad/doc/librarian.txt'
--- lib/canonical/launchpad/doc/librarian.txt	2011-09-13 05:23:16 +0000
+++ lib/canonical/launchpad/doc/librarian.txt	2011-09-19 07:28:22 +0000
@@ -1,33 +1,42 @@
-= Librarian Access =
-
-The librarian is a file storage service for launchpad. Conceptually similar to
-other file storage API's like S3, it is used to store binary or large
-content - bug attachments, package builds, images and so on.
-
-Content in the librarian can be exposed at different urls. To expose some
-content use a LibraryFileAlias. Private content is supported as well - for
-that tokens are added to permit access for a limited time by a client - each
-time a client attempts to dereference a private LibraryFileAlias a token is
-emitted.
-
-= Deployment notes =
-
-(These may seem a bit out of place - they are, but they need to be written
-down somewhere, and the deployment choices inform the implementation choices).
-
-The basics are simple: The librarian talks to clients. However restricted file
-access makes things a little more complex. As the librarian itself doesn't do
-SSL processing, and we want restricted files to be kept confidential the
-librarian will need a hint from the SSL front end that SSL was in fact used.
-The semi standard header Front-End-Https can be used for this if we filter it
-in incoming requests from clients.
-
-== setUp ==
+Librarian Access
+================
+
+The librarian is a file storage service for launchpad. Conceptually
+similar to other file storage API's like S3, it is used to store binary
+or large content - bug attachments, package builds, images and so on.
+
+Content in the librarian can be exposed at different urls. To expose
+some content use a LibraryFileAlias. Private content is supported as
+well - for that tokens are added to permit access for a limited time by
+a client - each time a client attempts to dereference a private
+LibraryFileAlias a token is emitted.
+
+
+Deployment notes
+================
+
+(These may seem a bit out of place - they are, but they need to be
+written down somewhere, and the deployment choices inform the
+implementation choices).
+
+The basics are simple: The librarian talks to clients. However
+restricted file access makes things a little more complex. As the
+librarian itself doesn't do SSL processing, and we want restricted files
+to be kept confidential the librarian will need a hint from the SSL
+front end that SSL was in fact used. The semi standard header Front-End-
+Https can be used for this if we filter it in incoming requests from
+clients.
+
+
+setUp
+-----
 
     >>> from canonical.database.sqlbase import session_store
     >>> from canonical.launchpad.database.librarian import TimeLimitedToken
 
-== High Level ==
+
+High Level
+----------
 
     >>> from StringIO import StringIO
     >>> from canonical.launchpad.interfaces.librarian import (
@@ -44,6 +53,7 @@
     ...     )
     >>> alias.mimetype
     u'text/plain'
+
     >>> alias.filename
     u'text.txt'
 
@@ -53,8 +63,9 @@
 
     >>> alias.expires == NEVER_EXPIRES
     True
+
     >>> alias = lfas.create(
-    ...     'text.txt', len(data), StringIO(data), 'text/plain')
+    ...        'text.txt', len(data), StringIO(data), 'text/plain')
 
 The default expiry of None means the file will expire a few days after
 it is no longer referenced in the database.
@@ -68,7 +79,8 @@
     >>> alias.date_created
     datetime.datetime(...)
 
-We can retrieve the LibraryFileAlias we just created using its ID or sha1.
+We can retrieve the LibraryFileAlias we just created using its ID or
+sha1.
 
     >>> org_alias_id = alias.id
     >>> alias = lfas[org_alias_id]
@@ -83,38 +95,39 @@
     >>> from canonical.config import config
     >>> import re
     >>> re.search(
-    ...     r'^%s\d+/text.txt$' % config.librarian.download_url,
-    ...     alias.http_url
-    ...     ) is not None
+    ...        r'^%s\d+/text.txt$' % config.librarian.download_url,
+    ...        alias.http_url
+    ...        ) is not None
     True
 
-Librarian also serves the same file through https, we use this for branding
-and similar inline-presented objects which would trigger security warnings on
-https pages otherwise.
+Librarian also serves the same file through https, we use this for
+branding and similar inline-presented objects which would trigger
+security warnings on https pages otherwise.
 
     >>> re.search(r'^https://.+/\d+/text.txt$', alias.https_url) is not None
     True
 
-And we even have a convenient method which returns either the http URL or the
-https one, depending on a config value.
+And we even have a convenient method which returns either the http URL
+or the https one, depending on a config value.
 
     >>> config.vhosts.use_https
     False
+
     >>> re.search(
-    ...     r'^%s\d+/text.txt$' % config.librarian.download_url,
-    ...     alias.getURL()
-    ...     ) is not None
+    ...        r'^%s\d+/text.txt$' % config.librarian.download_url,
+    ...        alias.getURL()
+    ...        ) is not None
     True
 
     >>> from textwrap import dedent
     >>> test_data = dedent("""
-    ...     [librarian]
-    ...     use_https: true
-    ...     """)
+    ...        [librarian]
+    ...        use_https: true
+    ...        """)
     >>> config.push('test', test_data)
     >>> re.search(
-    ...     r'^https://.+/\d+/text.txt$', alias.https_url
-    ...     ) is not None
+    ...        r'^https://.+/\d+/text.txt$', alias.https_url
+    ...        ) is not None
     True
 
 However, we can force the use of HTTP by setting the 'HTTP_X_SCHEME'
@@ -124,8 +137,8 @@
     >>> from canonical.launchpad.webapp.servers import LaunchpadTestRequest
     >>> from urlparse import urlparse
     >>> request = LaunchpadTestRequest(
-    ...     environ={'REQUEST_METHOD': 'GET',
-    ...              'HTTP_X_SCHEME' : 'http' })
+    ...        environ={'REQUEST_METHOD': 'GET',
+    ...                 'HTTP_X_SCHEME' : 'http' })
     >>> view = getMultiAdapter((alias,request), name='+index')
     >>> view.initialize()
     >>> print urlparse(request.response.getHeader('Location'))[0]
@@ -135,8 +148,8 @@
 unaffected.
 
     >>> request = LaunchpadTestRequest(
-    ...     environ={'REQUEST_METHOD': 'GET',
-    ...              'HTTP_X_SCHEME' : 'https' })
+    ...        environ={'REQUEST_METHOD': 'GET',
+    ...                 'HTTP_X_SCHEME' : 'https' })
     >>> view = getMultiAdapter((alias,request), name='+index')
     >>> view.initialize()
     >>> print urlparse(request.response.getHeader('Location'))[0]
@@ -162,6 +175,7 @@
     >>> alias.open()
     >>> alias.read()
     'This is some data'
+
     >>> alias.close()
 
 We can also read it in chunks.
@@ -169,10 +183,13 @@
     >>> alias.open()
     >>> alias.read(2)
     'Th'
+
     >>> alias.read(6)
     'is is '
+
     >>> alias.read()
     'some data'
+
     >>> alias.close()
 
 If you don't want to read the file in chunks you can neglect to call
@@ -181,9 +198,9 @@
     >>> alias.read()
     'This is some data'
 
-Each alias also has an expiry date associated with it, the default of None
-meaning the file will expire a few days after nothing references it any
-more:
+Each alias also has an expiry date associated with it, the default of
+None meaning the file will expire a few days after nothing references it
+any more:
 
     >>> alias.expires is None
     True
@@ -195,10 +212,11 @@
     >>> alias.close()
 
 
-== Low Level ==
+Low Level
+---------
 
-We can also use the ILibrarianClient Utility directly to store and access
-files in the Librarian.
+We can also use the ILibrarianClient Utility directly to store and
+access files in the Librarian.
 
     >>> from canonical.librarian.interfaces import ILibrarianClient
     >>> client = getUtility(ILibrarianClient)
@@ -209,21 +227,21 @@
     >>> f = client.getFileByAlias(aid)
     >>> f.read()
     'This is some data'
+
     >>> url = client.getURLForAlias(aid)
     >>> re.search(
     ...     r'^%s\d+/text.txt$' % config.librarian.download_url, url
     ...     ) is not None
     True
 
-When secure=True, the returned url has the id as part of the domain name and
-the protocol is https:
+When secure=True, the returned url has the id as part of the domain name
+and the protocol is https:
 
     >>> expected = r'^https://i%d\..+:\d+/%d/text.txt$' % (aid, aid)
     >>> found = client.getURLForAlias(aid, secure=True)
     >>> re.search(expected, found) is not None
     True
 
-
 Librarian reads are logged in the request timeline.
 
     >>> from lazr.restful.utils import get_current_browser_request
@@ -234,27 +252,31 @@
     >>> action = timeline.actions[-1]
     >>> action.category
     'librarian-connection'
+
     >>> action.detail.endswith('/text.txt')
     True
+
     >>> _unused = f.read()
     >>> action = timeline.actions[-1]
     >>> action.category
     'librarian-read'
+
     >>> action.detail.endswith('/text.txt')
     True
 
-At this level we can also reverse the transactional semantics by using the
-remoteAddFile instead of the addFile method. In this case, the database
-rows are added by the Librarian, which means that the file is downloadable
-immediately and will exist even if the client transaction rolls back.
-However, the records in the database will not be visible to the client
-until it begins a new transaction.
+At this level we can also reverse the transactional semantics by using
+the remoteAddFile instead of the addFile method. In this case, the
+database rows are added by the Librarian, which means that the file is
+downloadable immediately and will exist even if the client transaction
+rolls back. However, the records in the database will not be visible to
+the client until it begins a new transaction.
 
     >>> url = client.remoteAddFile(
-    ...     'text.txt', len(data), StringIO(data), 'text/plain'
-    ...     )
+    ...        'text.txt', len(data), StringIO(data), 'text/plain'
+    ...        )
     >>> print url
     http://.../text.txt
+
     >>> from urllib2 import urlopen
     >>> urlopen(url).read()
     'This is some data'
@@ -270,15 +292,15 @@
     >>> from datetime import date, datetime
     >>> from pytz import utc
     >>> url = client.remoteAddFile(
-    ...     'text.txt', len(data), StringIO(data), 'text/plain',
-    ...     expires=datetime(2005,9,1,12,0,0, tzinfo=utc)
-    ...     )
+    ...        'text.txt', len(data), StringIO(data), 'text/plain',
+    ...        expires=datetime(2005,9,1,12,0,0, tzinfo=utc)
+    ...        )
     >>> transaction.abort()
 
-To check the expiry is set, we need to extract the alias id from the URL.
-remoteAddFile deliberatly returns the URL instead of the alias id because,
-except for test cases, the URL is the only thing useful (because the
-client can't see the database records yet).
+To check the expiry is set, we need to extract the alias id from the
+URL. remoteAddFile deliberatly returns the URL instead of the alias id
+because, except for test cases, the URL is the only thing useful
+(because the client can't see the database records yet).
 
     >>> import re
     >>> match = re.search('/(\d+)/', url)
@@ -288,7 +310,8 @@
     2005-09-01T12:00:00+00:00
 
 
-== Restricted Librarian ==
+Restricted Librarian
+--------------------
 
 Some files should not be generally available publicly. If you know the
 URL, any file can be retrieved directly from the librarian. For this
@@ -318,35 +341,39 @@
     >>> file_alias.open()
     >>> print file_alias.read()
     This is private data.
+
     >>> file_alias.close()
 
 Restricted files are accessible with HTTP on a private domain.
 
     >>> print file_alias.http_url
     http://.../private.txt
+
     >>> file_alias.http_url.startswith(
-    ...     config.librarian.restricted_download_url)
+    ...        config.librarian.restricted_download_url)
     True
 
 They can also be accessed externally using a time-limited token appended
-to their private_url. Possession of a token is sufficient to grant access
-to a file, regardless of who is logged in. getURL can be asked to provide
-such a token.
+to their private_url. Possession of a token is sufficient to grant
+access to a file, regardless of who is logged in. getURL can be asked to
+provide such a token.
 
     >>> token_url = file_alias.getURL(include_token=True)
     >>> print token_url
     https://i...restricted.../private.txt?token=...
+
     >>> token_url.startswith('https://i%d.restricted.' % file_alias.id)
     True
+
     >>> private_path = TimeLimitedToken.url_to_token_path(
-    ...     file_alias.private_url)
+    ...        file_alias.private_url)
     >>> token_url.endswith(session_store().find(TimeLimitedToken,
-    ...     path=private_path).any().token)
+    ...        path=private_path).any().token)
     True
 
-LibraryFileAliasView doesn't work on restricted files. This is a temporary
-measure until we're sure no restricted files leak into the traversal
-hierarchy.
+LibraryFileAliasView doesn't work on restricted files. This is a
+temporary measure until we're sure no restricted files leak into the
+traversal hierarchy.
 
     >>> view = getMultiAdapter((file_alias, request), name='+index')
     >>> view.initialize()
@@ -380,8 +407,8 @@
 (by switching the port)
 
     >>> sneaky_url = file_url.replace(
-    ...     config.librarian.restricted_download_url,
-    ...     config.librarian.download_url)
+    ...        config.librarian.restricted_download_url,
+    ...        config.librarian.download_url)
     >>> urlopen(sneaky_url).read()
     Traceback (most recent call last):
       ...
@@ -397,11 +424,12 @@
 
     >>> public_content = 'This is public data.'
     >>> public_file_id = getUtility(ILibrarianClient).addFile(
-    ...     'public.txt', len(public_content), StringIO(public_content),
-    ...     'text/plain')
+    ...        'public.txt', len(public_content), StringIO(public_content),
+    ...        'text/plain')
     >>> file_alias = getUtility(ILibraryFileAliasSet)[public_file_id]
     >>> file_alias.restricted
     False
+
     >>> transaction.commit()
 
     >>> restricted_client.getURLForAlias(public_file_id)
@@ -418,10 +446,11 @@
 file:
 
     >>> url = restricted_client.remoteAddFile(
-    ...     'another-private.txt', len(private_content),
-    ...     StringIO(private_content), 'text/plain')
+    ...        'another-private.txt', len(private_content),
+    ...        StringIO(private_content), 'text/plain')
     >>> print url
     http://.../another-private.txt
+
     >>> url.startswith(config.librarian.restricted_download_url)
     True
 
@@ -434,8 +463,8 @@
 parameter to ILibraryFileAliasSet:
 
     >>> restricted_file = getUtility(ILibraryFileAliasSet).create(
-    ...     'yet-another-private.txt', len(private_content),
-    ...     StringIO(private_content), 'text/plain', restricted=True)
+    ...        'yet-another-private.txt', len(private_content),
+    ...        StringIO(private_content), 'text/plain', restricted=True)
     >>> restricted_file.restricted
     True
 
@@ -456,15 +485,16 @@
     >>> result = result.splitlines()
     >>> print result[0]
     3
+
     >>> sorted(file_path.split('/')[1] for file_path in result[1:])
-    ['another-private.txt',
-     'private.txt',
-     'yet-another-private.txt']
-
-
-== Odds and Sods ==
-
-An UploadFailed will be raised if you try to create a file with no content
+    ['another-private.txt', 'private.txt', 'yet-another-private.txt']
+
+
+Odds and Sods
+-------------
+
+An UploadFailed will be raised if you try to create a file with no
+content
 
     >>> client.addFile('test.txt', 0, StringIO('hello'), 'text/plain')
     Traceback (most recent call last):
@@ -474,14 +504,14 @@
 If you really want a zero length file you can do it:
 
     >>> aid = client.addFile('test.txt', 0, StringIO(''), 'text/plain',
-    ...     allow_zero_length=True)
+    ...        allow_zero_length=True)
     >>> transaction.commit()
     >>> f = client.getFileByAlias(aid)
     >>> f.read()
     ''
 
-An AssertionError will be raised if the number of bytes that could be read
-from the file don't match the declared size.
+An AssertionError will be raised if the number of bytes that could be
+read from the file don't match the declared size.
 
     >>> client.addFile('test.txt', 42, StringIO(''), 'text/plain')
     Traceback (most recent call last):
@@ -491,55 +521,63 @@
 Filenames with spaces in them work.
 
     >>> aid = client.addFile(
-    ...     'hot dog', len(data), StringIO(data), 'text/plain')
+    ...        'hot dog', len(data), StringIO(data), 'text/plain')
     >>> transaction.commit()
     >>> f = client.getFileByAlias(aid)
     >>> f.read()
     'This is some data'
+
     >>> url = client.getURLForAlias(aid)
     >>> re.search(r'/\d+/hot%20dog$', url) is not None
     True
 
-Unicode file names work.  Note that the filename in the resulting URL
-is encoded as UTF-8.
+Unicode file names work.  Note that the filename in the resulting URL is
+encoded as UTF-8.
 
     >>> aid = client.addFile(u'Yow\N{INTERROBANG}', len(data), StringIO(data),
-    ...                      'text/plain')
+    ...                         'text/plain')
     >>> transaction.commit()
     >>> f = client.getFileByAlias(aid)
     >>> f.read()
     'This is some data'
+
     >>> url = client.getURLForAlias(aid)
     >>> re.search(r'/\d+/Yow%E2%80%BD$', url) is not None
     True
 
 Files will get garbage collected on production systems as per
-LibrarianGarbageCollection. If you request the URL of a deleted file, you
-will be given None
+LibrarianGarbageCollection. If you request the URL of a deleted file,
+you will be given None
 
     >>> alias = lfas[36]
     >>> alias.deleted
     True
+
     >>> alias.http_url is None
     True
+
     >>> alias.https_url is None
     True
+
     >>> alias.getURL() is None
     True
+
     >>> client.getURLForAlias(alias.id) is None
     True
 
 
-== Default View ==
+Default View
+------------
 
-A librarian file has a default view that should redirect to the download URL.
+A librarian file has a default view that should redirect to the download
+URL.
 
     >>> from zope.component import getMultiAdapter
     >>> from canonical.launchpad.webapp.servers import LaunchpadTestRequest
     >>> req = LaunchpadTestRequest()
     >>> alias = lfas.create(
-    ...     'text2.txt', len(data), StringIO(data), 'text/plain',
-    ...     NEVER_EXPIRES)
+    ...        'text2.txt', len(data), StringIO(data), 'text/plain',
+    ...        NEVER_EXPIRES)
     >>> transaction.commit()
     >>> lfa_view = getMultiAdapter((alias, req), name='+index')
     >>> lfa_view.initialize()
@@ -547,32 +585,34 @@
     True
 
 
-== File views setup ==
+File views setup
+----------------
 
 We need some files to test different ways of accessing them.
 
     >>> filename = 'public.txt'
     >>> content = 'PUBLIC'
     >>> public_file = getUtility(ILibraryFileAliasSet).create(
-    ...     filename, len(content), StringIO(content), 'text/plain',
-    ...     NEVER_EXPIRES, restricted=False)
+    ...        filename, len(content), StringIO(content), 'text/plain',
+    ...        NEVER_EXPIRES, restricted=False)
 
     >>> filename = 'restricted.txt'
     >>> content = 'RESTRICTED'
     >>> restricted_file = getUtility(ILibraryFileAliasSet).create(
-    ...     filename, len(content), StringIO(content), 'text/plain',
-    ...     NEVER_EXPIRES, restricted=True)
+    ...        filename, len(content), StringIO(content), 'text/plain',
+    ...        NEVER_EXPIRES, restricted=True)
 
     # Create a new LibraryFileAlias not referencing any LibraryFileContent
     # record. Such records are considered as being deleted.
+
     >>> from canonical.launchpad.database.librarian import LibraryFileAlias
     >>> from canonical.launchpad.webapp.interfaces import (
-    ...     IStoreSelector, MAIN_STORE, MASTER_FLAVOR)
+    ...        IStoreSelector, MAIN_STORE, MASTER_FLAVOR)
 
     >>> store = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR)
     >>> deleted_file = LibraryFileAlias(
-    ...     content=None, filename='deleted.txt',
-    ...     mimetype='text/plain')
+    ...        content=None, filename='deleted.txt',
+    ...        mimetype='text/plain')
     >>> ignore = store.add(deleted_file)
 
 Commit the just-created files.
@@ -589,7 +629,8 @@
     >>> _ = session_store().find(TimeLimitedToken).remove()
 
 
-== LibraryFileAliasMD5View ==
+LibraryFileAliasMD5View
+-----------------------
 
 The MD5 summary for a file can be downloaded. The text file contains the
 hash and file name.
@@ -602,24 +643,27 @@
     text/plain
 
 
-== Download counts ==
+Download counts
+---------------
 
 The download counts for librarian files are stored in the
-LibraryFileDownloadCount table, broken down by day and country, but there's
-also a 'hits' attribute on ILibraryFileAlias, which holds the total number of
-times that file has been downloaded.
+LibraryFileDownloadCount table, broken down by day and country, but
+there's also a 'hits' attribute on ILibraryFileAlias, which holds the
+total number of times that file has been downloaded.
 
 The count starts at 0, and cannot be changed directly.
 
     >>> public_file.hits
     0
+
     >>> public_file.hits = 10
     Traceback (most recent call last):
     ...
     ForbiddenAttribute: ...
 
-To change that, we have to use the updateDownloadCount() method, which takes
-care of creating/updating the necessary LibraryFileDownloadCount entries.
+To change that, we have to use the updateDownloadCount() method, which
+takes care of creating/updating the necessary LibraryFileDownloadCount
+entries.
 
     >>> from lp.services.worlddata.interfaces.country import ICountrySet
     >>> country_set = getUtility(ICountrySet)
@@ -629,21 +673,21 @@
     >>> public_file.hits
     1
 
-This was the first hit for that file from Brazil on 2006 November first, so a
-new LibraryFileDownloadCount was created.
+This was the first hit for that file from Brazil on 2006 November first,
+so a new LibraryFileDownloadCount was created.
 
     >>> from canonical.launchpad.database.librarian import (
-    ...     LibraryFileDownloadCount)
+    ...        LibraryFileDownloadCount)
     >>> from storm.locals import Store
     >>> store = Store.of(public_file)
     >>> brazil_entry = store.find(
-    ...     LibraryFileDownloadCount, libraryfilealias=public_file,
-    ...     country=brazil, day=november_1st_2006).one()
+    ...        LibraryFileDownloadCount, libraryfilealias=public_file,
+    ...        country=brazil, day=november_1st_2006).one()
     >>> brazil_entry.count
     1
 
-Below we simulate a hit from Japan on that same day, which will also create a
-new LibraryFileDownloadCount.
+Below we simulate a hit from Japan on that same day, which will also
+create a new LibraryFileDownloadCount.
 
     >>> japan = country_set['JP']
     >>> public_file.updateDownloadCount(november_1st_2006, japan, count=3)
@@ -651,38 +695,44 @@
     4
 
     >>> japan_entry = store.find(
-    ...     LibraryFileDownloadCount, libraryfilealias=public_file,
-    ...     country=japan, day=november_1st_2006).one()
+    ...        LibraryFileDownloadCount, libraryfilealias=public_file,
+    ...        country=japan, day=november_1st_2006).one()
     >>> japan_entry.count
     3
 
-If there's another hit from Brazil on the same day, the existing entry will be
-updated.
+If there's another hit from Brazil on the same day, the existing entry
+will be updated.
 
     >>> public_file.updateDownloadCount(november_1st_2006, brazil, count=2)
     >>> public_file.hits
     6
+
     >>> brazil_entry.count
     3
 
-If the hit happened on a different day, a separate entry would be created.
+If the hit happened on a different day, a separate entry would be
+created.
 
     >>> november_2nd_2006 = date(2006, 11, 2)
     >>> public_file.updateDownloadCount(november_2nd_2006, brazil, count=10)
     >>> public_file.hits
     16
+
     >>> brazil_entry2 = store.find(
-    ...     LibraryFileDownloadCount, libraryfilealias=public_file,
-    ...     country=brazil, day=november_2nd_2006).one()
+    ...        LibraryFileDownloadCount, libraryfilealias=public_file,
+    ...        country=brazil, day=november_2nd_2006).one()
     >>> brazil_entry2.count
     10
+
     >>> last_downloaded_date = november_2nd_2006
 
 
-== Time to last download ==
+Time to last download
+---------------------
 
-The .last_downloaded property gives us the time delta from today to the day
-that file was last downloaded, or None if it's never been downloaded.
+The .last_downloaded property gives us the time delta from today to the
+day that file was last downloaded, or None if it's never been
+downloaded.
 
     >>> today = datetime.now(utc).date()
     >>> public_file.last_downloaded == today - last_downloaded_date
@@ -690,7 +740,7 @@
 
     >>> content = 'something'
     >>> brand_new_file = getUtility(ILibraryFileAliasSet).create(
-    ...     'new.txt', len(content), StringIO(content), 'text/plain',
-    ...     NEVER_EXPIRES, restricted=False)
+    ...        'new.txt', len(content), StringIO(content), 'text/plain',
+    ...        NEVER_EXPIRES, restricted=False)
     >>> print brand_new_file.last_downloaded
     None

=== modified file 'lib/canonical/launchpad/scripts/tests/test_scriptmonitor.py'
--- lib/canonical/launchpad/scripts/tests/test_scriptmonitor.py	2011-09-06 05:05:42 +0000
+++ lib/canonical/launchpad/scripts/tests/test_scriptmonitor.py	2011-09-19 07:28:22 +0000
@@ -1,11 +1,10 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Test scriptmonitor.py."""
 
 __metaclass__ = type
 
-import logging
 from unittest import TestCase
 
 from canonical.database.sqlbase import connect

=== modified file 'lib/canonical/launchpad/utilities/looptuner.py'
--- lib/canonical/launchpad/utilities/looptuner.py	2011-09-13 05:23:16 +0000
+++ lib/canonical/launchpad/utilities/looptuner.py	2011-09-19 07:28:22 +0000
@@ -155,15 +155,15 @@
                 # a reasonable minimum for time_taken, just in case we
                 # get weird values for whatever reason and destabilize
                 # the algorithm.
-                time_taken = max(self.goal_seconds/10, time_taken)
-                chunk_size *= (1 + self.goal_seconds/time_taken)/2
+                time_taken = max(self.goal_seconds / 10, time_taken)
+                chunk_size *= (1 + self.goal_seconds / time_taken) / 2
                 chunk_size = max(chunk_size, self.minimum_chunk_size)
                 chunk_size = min(chunk_size, self.maximum_chunk_size)
                 iteration += 1
 
             total_time = last_clock - self.start_time
-            average_size = total_size/max(1, iteration)
-            average_speed = total_size/max(1, total_time)
+            average_size = total_size / max(1, iteration)
+            average_speed = total_size / max(1, total_time)
             self.log.debug2(
                 "Done. %d items in %d iterations, %3f seconds, "
                 "average size %f (%s/s)",
@@ -234,10 +234,10 @@
     """
 
     # We block until replication lag is under this threshold.
-    acceptable_replication_lag = timedelta(seconds=30) # In seconds.
+    acceptable_replication_lag = timedelta(seconds=30)  # In seconds.
 
     # We block if there are transactions running longer than this threshold.
-    long_running_transaction = 30*60 # In seconds
+    long_running_transaction = 30 * 60  # In seconds.
 
     def _blockWhenLagged(self):
         """When database replication lag is high, block until it drops."""
@@ -256,7 +256,8 @@
                     "Database replication lagged %s. "
                     "Sleeping up to 10 minutes.", lag)
 
-            transaction.abort() # Don't become a long running transaction!
+            # Don't become a long running transaction!
+            transaction.abort()
             self._sleep(10)
 
     def _blockForLongRunningTransactions(self):
@@ -291,7 +292,8 @@
                         "Blocked on %s old xact %s@%s/%d - %s.",
                         runtime, usename, datname, procpid, query)
                 self.log.info("Sleeping for up to 10 minutes.")
-            transaction.abort() # Don't become a long running transaction!
+            # Don't become a long running transaction!
+            transaction.abort()
             self._sleep(10)
 
     def _coolDown(self, bedtime):
@@ -313,7 +315,7 @@
 
     goal_seconds = 2
     minimum_chunk_size = 1
-    maximum_chunk_size = None # Override
+    maximum_chunk_size = None  # Override.
     cooldown_time = 0
 
     def __init__(self, log, abort_time=None):

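The operator-spacing fixes above leave the tuning maths untouched: each iteration moves the chunk size halfway towards the size that would have taken exactly goal_seconds at the observed rate, then clamps it to the configured bounds. A self-contained sketch of that feedback step, with an invented 10ms-per-item cost purely for illustration (none of these names are Launchpad's):

    def next_chunk_size(chunk_size, time_taken, goal_seconds=2.0,
                        minimum_chunk_size=1.0, maximum_chunk_size=1e6):
        # Guard against freak timings destabilising the feedback loop.
        time_taken = max(goal_seconds / 10, time_taken)
        # Average the current size with the size that would hit the goal.
        chunk_size *= (1 + goal_seconds / time_taken) / 2
        chunk_size = max(chunk_size, minimum_chunk_size)
        return min(chunk_size, maximum_chunk_size)

    chunk = 1.0
    cost_per_item = 0.01  # invented: pretend each item takes 10ms
    for iteration in range(6):
        chunk = next_chunk_size(chunk, chunk * cost_per_item)
        print("iteration %d: chunk size %.1f" % (iteration, chunk))
    # The sizes climb towards goal_seconds / cost_per_item = 200 items.
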
=== modified file 'lib/canonical/launchpad/webapp/errorlog.py'
--- lib/canonical/launchpad/webapp/errorlog.py	2011-09-13 05:23:16 +0000
+++ lib/canonical/launchpad/webapp/errorlog.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 # pylint: disable-msg=W0702
@@ -14,9 +14,6 @@
 import operator
 import os
 import re
-import stat
-import types
-import urllib
 import urlparse
 
 from lazr.restful.utils import (
@@ -109,7 +106,8 @@
         self.username = username
         self.url = url
         self.duration = duration
-        # informational is ignored - will be going from the oops module soon too.
+        # informational is ignored - will be going from the oops module
+        # soon too.
         self.req_vars = req_vars
         self.db_statements = db_statements
         self.branch_nick = branch_nick or versioninfo.branch_nick
@@ -144,7 +142,7 @@
     This reads the 'exc_info' key from the context and sets the:
     * type
     * value
-    * tb_text 
+    * tb_text
     keys in the report.
     """
     info = context.get('exc_info')
@@ -162,9 +160,10 @@
 
 _ignored_exceptions_for_unauthenticated_users = set(['Unauthorized'])
 
+
 def attach_http_request(report, context):
     """Add request metadata into the error report.
-    
+
     This reads the exc_info and http_request keys from the context and will
     write to:
     * url
@@ -308,8 +307,8 @@
         # threadsafe - so only scripts) - a todo item is to only add this
         # for scripts (or to make it threadsafe)
         self._oops_config.on_create.append(self._attach_messages)
-        # In the zope environment we track how long a script / http request has
-        # been running for - this is useful data!
+        # In the zope environment we track how long a script / http
+        # request has been running for - this is useful data!
         self._oops_config.on_create.append(attach_adapter_duration)
         # We want to publish reports to disk for gathering to the central
         # analysis server.
@@ -327,7 +326,7 @@
                 operator.methodcaller('get', 'ignore'))
         #  - have a type listed in self._ignored_exceptions.
         self._oops_config.filters.append(
-                lambda report:report['type'] in self._ignored_exceptions)
+                lambda report: report['type'] in self._ignored_exceptions)
         #  - have a missing or offset REFERER header with a type listed in
         #    self._ignored_exceptions_for_offsite_referer
         self._oops_config.filters.append(self._filter_bad_urls_by_referer)
@@ -347,12 +346,12 @@
         """Return the contents of the OOPS report logged at 'time'."""
         # How this works - get a serial that was logging in the dir
         # that logs for time are logged in.
-        serial_from_time = self._oops_datedir_repo.log_namer._findHighestSerial(
-            self._oops_datedir_repo.log_namer.output_dir(time))
+        log_namer = self._oops_datedir_repo.log_namer
+        serial_from_time = log_namer._findHighestSerial(
+            log_namer.output_dir(time))
         # Calculate a filename which combines this most recent serial,
         # the current log_namer naming rules and the exact timestamp.
-        oops_filename = self._oops_datedir_repo.log_namer.getFilename(
-                serial_from_time, time)
+        oops_filename = log_namer.getFilename(serial_from_time, time)
         # Note that if there were no logs written, or if there were two
         # oops that matched the time window of directory on disk, this
         # call can raise an IOError.

=== modified file 'lib/canonical/launchpad/webapp/publisher.py'
--- lib/canonical/launchpad/webapp/publisher.py	2011-09-13 05:23:16 +0000
+++ lib/canonical/launchpad/webapp/publisher.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Publisher of objects as web pages.
@@ -26,7 +26,6 @@
     ]
 
 import httplib
-import simplejson
 
 from lazr.restful import (
     EntryResource,
@@ -36,6 +35,7 @@
 from lazr.restful.interfaces import IJSONRequestCache
 from lazr.restful.tales import WebLayerAPI
 from lazr.restful.utils import get_current_browser_request
+import simplejson
 from zope.app import zapi
 from zope.app.publisher.interfaces.xmlrpc import IXMLRPCView
 from zope.app.publisher.xmlrpc import IMethodPublisher
@@ -80,7 +80,6 @@
 from lp.app.errors import NotFoundError
 from lp.services.encoding import is_ascii_only
 
-
 # Monkeypatch NotFound to always avoid generating OOPS
 # from NotFound in web service calls.
 error_status(httplib.NOT_FOUND)(NotFound)
@@ -360,8 +359,8 @@
 
     def publishTraverse(self, request, name):
         """See IBrowserPublisher."""
-        # By default, any LaunchpadView cannot be traversed through. Those that
-        # can override this method.
+        # By default, a LaunchpadView cannot be traversed through.
+        # Those that can be must override this method.
         raise NotFound(self, name, request=request)
 
 

=== modified file 'lib/canonical/lazr/doc/utils.txt'
--- lib/canonical/lazr/doc/utils.txt	2011-09-13 05:23:16 +0000
+++ lib/canonical/lazr/doc/utils.txt	2011-09-19 07:28:22 +0000
@@ -1,7 +1,9 @@
-= Various utility functions =
-
-
-== camelcase_to_underscore_separated ==
+Various utility functions
+=========================
+
+
+camelcase_to_underscore_separated
+---------------------------------
 
 LAZR provides a way of converting TextThatIsWordSeparatedWithInterCaps
 to text_that_is_word_separated_with_underscores.
@@ -9,19 +11,25 @@
     >>> from lazr.restful.utils import camelcase_to_underscore_separated
     >>> camelcase_to_underscore_separated('lowercase')
     'lowercase'
+
     >>> camelcase_to_underscore_separated('TwoWords')
     'two_words'
+
     >>> camelcase_to_underscore_separated('twoWords')
     'two_words'
+
     >>> camelcase_to_underscore_separated('ThreeLittleWords')
     'three_little_words'
+
     >>> camelcase_to_underscore_separated('UNCLE')
     'u_n_c_l_e'
+
     >>> camelcase_to_underscore_separated('_StartsWithUnderscore')
     '__starts_with_underscore'
 
 
-== safe_hasattr() ==
+safe_hasattr()
+--------------
 
 LAZR provides a safe_hasattr() that doesn't hide exception from the
 caller. This behaviour of the builtin hasattr() is annoying because it
@@ -30,12 +38,13 @@
     >>> from lazr.restful.utils import safe_hasattr
 
     >>> class Oracle(object):
-    ...     @property
-    ...     def is_full_moon(self):
-    ...         return full_moon
+    ...        @property
+    ...        def is_full_moon(self):
+    ...            return full_moon
     >>> oracle = Oracle()
     >>> hasattr(oracle, 'is_full_moon')
     False
+
     >>> safe_hasattr(oracle, 'is_full_moon')
     Traceback (most recent call last):
       ...
@@ -44,40 +53,51 @@
     >>> full_moon = True
     >>> hasattr(oracle, 'is_full_moon')
     True
+
     >>> safe_hasattr(oracle, 'is_full_moon')
     True
 
     >>> hasattr(oracle, 'weather')
     False
+
     >>> safe_hasattr(oracle, 'weather')
     False
 
 
-== smartquote() ==
+smartquote()
+------------
 
-smartquote() converts pairs of inch marks (") in a string to typographical
-quotation marks.
+smartquote() converts pairs of inch marks (") in a string to
+typographical quotation marks.
 
     >>> from lazr.restful.utils import smartquote
     >>> smartquote('')
     u''
+
     >>> smartquote('foo "bar" baz')
     u'foo \u201cbar\u201d baz'
+
     >>> smartquote('foo "bar baz')
     u'foo \u201cbar baz'
+
     >>> smartquote('foo bar" baz')
     u'foo bar\u201d baz'
+
     >>> smartquote('""foo " bar "" baz""')
     u'""foo " bar "" baz""'
+
     >>> smartquote('" foo "')
     u'" foo "'
+
     >>> smartquote('"foo".')
     u'\u201cfoo\u201d.'
+
     >>> smartquote('a lot of "foo"?')
     u'a lot of \u201cfoo\u201d?'
 
 
-== safe_js_escape() ==
+safe_js_escape()
+----------------
 
 This will escape the given text so that it can be used in Javascript
 code.
@@ -85,7 +105,10 @@
     >>> from lazr.restful.utils import safe_js_escape
     >>> print safe_js_escape('John "nasty" O\'Brien')
     "John &quot;nasty&quot; O'Brien"
+
     >>> print safe_js_escape("John O\'Brien")
     "John O'Brien"
+
     >>> print safe_js_escape("John <strong>O\'Brien</strong>")
     "John &lt;strong&gt;O'Brien&lt;/strong&gt;"
+

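The conversion rule exercised in the doctest above is simple to state: every capital letter is lowercased, and every capital except a leading one also gains an underscore prefix. A throwaway re-implementation that reproduces the documented behaviour (illustrative only, not the lazr.restful code):

    def camelcase_to_underscore(name):
        out = []
        for index, char in enumerate(name):
            if char.isupper():
                # Every capital is lowercased; all but a leading capital
                # also gain an underscore prefix.
                if index > 0:
                    out.append('_')
                out.append(char.lower())
            else:
                out.append(char)
        return ''.join(out)

    print(camelcase_to_underscore('ThreeLittleWords'))  # three_little_words
    print(camelcase_to_underscore('UNCLE'))             # u_n_c_l_e
    print(camelcase_to_underscore('_Leading'))          # __leading
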
=== modified file 'lib/canonical/librarian/client.py'
--- lib/canonical/librarian/client.py	2011-09-13 05:23:16 +0000
+++ lib/canonical/librarian/client.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009-2010 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 __metaclass__ = type
@@ -14,18 +14,19 @@
 
 
 import hashlib
-import re
+from select import select
 import socket
+from socket import (
+    SOCK_STREAM,
+    AF_INET,
+    )
+import threading
 import time
-import threading
 import urllib
 import urllib2
-
-from select import select
-from socket import SOCK_STREAM, AF_INET
 from urlparse import (
+    urljoin,
     urlparse,
-    urljoin,
     urlunparse,
     )
 
@@ -34,13 +35,24 @@
 from zope.component import getUtility
 from zope.interface import implements
 
-from canonical.config import config, dbconfig
+from canonical.config import (
+    config,
+    dbconfig,
+    )
 from canonical.database.postgresql import ConnectionString
 from canonical.launchpad.webapp.interfaces import (
-        IStoreSelector, MAIN_STORE, MASTER_FLAVOR)
+    IStoreSelector,
+    MAIN_STORE,
+    MASTER_FLAVOR,
+    )
 from canonical.librarian.interfaces import (
-    DownloadFailed, ILibrarianClient, IRestrictedLibrarianClient,
-    LIBRARIAN_SERVER_DEFAULT_TIMEOUT, LibrarianServerError, UploadFailed)
+    DownloadFailed,
+    ILibrarianClient,
+    IRestrictedLibrarianClient,
+    LIBRARIAN_SERVER_DEFAULT_TIMEOUT,
+    LibrarianServerError,
+    UploadFailed,
+    )
 from lp.services.timeline.requesttimeline import get_request_timeline
 
 
@@ -516,9 +528,12 @@
         return config.librarian.download_url
 
     @property
-    def _internal_download_url(self): # used by _getURLForDownload
-        return 'http://%s:%s/' % (config.librarian.download_host,
-                                  config.librarian.download_port)
+    def _internal_download_url(self):
+        """Used by `_getURLForDownload`."""
+        return 'http://%s:%s/' % (
+            config.librarian.download_host,
+            config.librarian.download_port,
+            )
 
 
 class RestrictedLibrarianClient(LibrarianClient):
@@ -540,6 +555,9 @@
         return config.librarian.restricted_download_url
 
     @property
-    def _internal_download_url(self): # used by _getURLForDownload
-        return 'http://%s:%s/' % (config.librarian.restricted_download_host,
-                                  config.librarian.restricted_download_port)
+    def _internal_download_url(self):
+        """Used by `_getURLForDownload`."""
+        return 'http://%s:%s/' % (
+            config.librarian.restricted_download_host,
+            config.librarian.restricted_download_port,
+            )

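The import churn in this file (and in most of the branch) is purely mechanical: the layout these fixes standardise on is one name per line inside parentheses, sorted, with a trailing comma so that adding an import later touches only one line. Using a standard-library module as a stand-in example:

    # Before: everything on one line; adding a name rewrites the line.
    # from os.path import join, exists

    # After: one name per line, sorted, trailing comma.
    from os.path import (
        exists,
        join,
        )

    print(exists(join('.', 'setup.py')))
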
=== modified file 'lib/canonical/librarian/ftests/test_gc.py'
--- lib/canonical/librarian/ftests/test_gc.py	2011-09-06 05:18:33 +0000
+++ lib/canonical/librarian/ftests/test_gc.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Librarian garbage collection tests"""
@@ -9,7 +9,11 @@
 from datetime import timedelta
 import os
 import shutil
-from subprocess import Popen, PIPE, STDOUT
+from subprocess import (
+    PIPE,
+    Popen,
+    STDOUT,
+    )
 import sys
 import tempfile
 
@@ -166,8 +170,8 @@
         self.ztm.begin()
 
         # Confirm that the LibaryFileContents are still there.
-        c1 = LibraryFileContent.get(c1_id)
-        c2 = LibraryFileContent.get(c2_id)
+        LibraryFileContent.get(c1_id)
+        LibraryFileContent.get(c2_id)
 
         # But the LibraryFileAliases should be gone
         self.assertRaises(SQLObjectNotFound, LibraryFileAlias.get, self.f1_id)
@@ -283,7 +287,7 @@
         # recent past.
         self.ztm.begin()
         f1 = LibraryFileAlias.get(self.f1_id)
-        f1.expires = self.recent_past # Within stay of execution.
+        f1.expires = self.recent_past  # Within stay of execution.
         del f1
         self.ztm.commit()
 
@@ -504,8 +508,10 @@
             # Pretend it is tomorrow to ensure the files don't count as
             # recently created, and run the delete_unwanted_files process.
             org_time = librariangc.time
+
             def tomorrow_time():
                 return org_time() + 24 * 60 * 60 + 1
+
             try:
                 librariangc.time = tomorrow_time
                 librariangc.delete_unwanted_files(self.con)

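The blank lines added around tomorrow_time() are just lint (nested functions want surrounding blank lines), but the trick they sit inside is worth spelling out: swap the module's time() for one that reports a day later, run the code under test, and put the real clock back afterwards. A standalone sketch of that pattern, with invented names rather than librariangc's:

    import time
    import types

    # Stand-in for a module (librariangc in the real test) whose time()
    # the code under test calls.
    fake_module = types.ModuleType('fake_module')
    fake_module.time = time.time

    def with_clock_advanced(seconds, func):
        original_time = fake_module.time
        def shifted_time():
            return original_time() + seconds
        try:
            fake_module.time = shifted_time
            return func()
        finally:
            # Always restore the real clock, even if func() raises.
            fake_module.time = original_time

    offset = with_clock_advanced(
        24 * 60 * 60 + 1, lambda: fake_module.time() - time.time())
    print("clock advanced by about %d seconds" % offset)
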
=== added directory 'lib/canonical/lp'
=== added directory 'lib/canonical/lp/ftests'
=== added file 'lib/canonical/lp/ftests/test_zopeless.py.OTHER'
--- lib/canonical/lp/ftests/test_zopeless.py.OTHER	1970-01-01 00:00:00 +0000
+++ lib/canonical/lp/ftests/test_zopeless.py.OTHER	2011-09-19 07:28:22 +0000
@@ -0,0 +1,37 @@
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+"""
+Tests to make sure that initZopeless works as expected.
+"""
+
+from doctest import DocTestSuite
+
+from canonical.testing.layers import ZopelessDatabaseLayer
+
+
+def test_isZopeless():
+    """
+    >>> from canonical.lp import (
+    ...     initZopeless,
+    ...     isZopeless,
+    ...     )
+
+    >>> isZopeless()
+    False
+
+    >>> tm = initZopeless(dbuser='launchpad')
+    >>> isZopeless()
+    True
+
+    >>> tm.uninstall()
+    >>> isZopeless()
+    False
+
+    """
+
+
+def test_suite():
+    doctests = DocTestSuite()
+    doctests.layer = ZopelessDatabaseLayer
+    return doctests

=== modified file 'lib/lp/app/browser/tests/test_vocabulary.py'
--- lib/lp/app/browser/tests/test_vocabulary.py	2011-09-16 20:26:51 +0000
+++ lib/lp/app/browser/tests/test_vocabulary.py	2011-09-19 07:28:22 +0000
@@ -10,7 +10,6 @@
 
 import pytz
 import simplejson
-
 from zope.app.form.interfaces import MissingInputError
 from zope.component import (
     getSiteManager,
@@ -21,7 +20,6 @@
 from zope.schema.vocabulary import SimpleTerm
 from zope.security.proxy import removeSecurityProxy
 
-
 from canonical.launchpad.interfaces.launchpad import ILaunchpadRoot
 from canonical.launchpad.webapp.vocabulary import (
     CountableIterator,
@@ -44,6 +42,13 @@
 from lp.testing.views import create_view
 
 
+def get_picker_entry(item_subject, context_object, **kwargs):
+    """Adapt `item_subject` to `IPickerEntrySource` and return its item."""
+    [entry] = IPickerEntrySource(item_subject).getPickerEntries(
+        [item_subject], context_object, **kwargs)
+    return entry
+
+
 class PersonPickerEntrySourceAdapterTestCase(TestCaseWithFactory):
 
     layer = DatabaseFunctionalLayer
@@ -57,16 +62,17 @@
     def test_PersonPickerEntrySourceAdapter_email_anonymous(self):
         # Anonymous users cannot see entry email addresses.
         person = self.factory.makePerson(email='snarf@xxxxxx')
-        [entry] = IPickerEntrySource(person).getPickerEntries([person], None)
-        self.assertEqual('<email address hidden>', entry.description)
+        self.assertEqual(
+            "<email address hidden>",
+            get_picker_entry(person, None).description)
 
     def test_PersonPickerEntrySourceAdapter_visible_email_logged_in(self):
         # Logged in users can see visible email addresses.
         observer = self.factory.makePerson()
         login_person(observer)
         person = self.factory.makePerson(email='snarf@xxxxxx')
-        [entry] = IPickerEntrySource(person).getPickerEntries([person], None)
-        self.assertEqual('snarf@xxxxxx', entry.description)
+        self.assertEqual(
+            'snarf@xxxxxx', get_picker_entry(person, None).description)
 
     def test_PersonPickerEntrySourceAdapter_hidden_email_logged_in(self):
         # Logged in users cannot see hidden email addresses.
@@ -75,16 +81,16 @@
         person.hide_email_addresses = True
         observer = self.factory.makePerson()
         login_person(observer)
-        [entry] = IPickerEntrySource(person).getPickerEntries([person], None)
-        self.assertEqual('<email address hidden>', entry.description)
+        self.assertEqual(
+            "<email address hidden>",
+            get_picker_entry(person, None).description)
 
     def test_PersonPickerEntrySourceAdapter_no_email_logged_in(self):
         # Teams without email address have no desriptions.
         team = self.factory.makeTeam()
         observer = self.factory.makePerson()
         login_person(observer)
-        [entry] = IPickerEntrySource(team).getPickerEntries([team], None)
-        self.assertEqual(None, entry.description)
+        self.assertEqual(None, get_picker_entry(team, None).description)
 
     def test_PersonPickerEntrySourceAdapter_logged_in(self):
         # Logged in users can see visible email addresses.
@@ -92,7 +98,7 @@
         login_person(observer)
         person = self.factory.makePerson(
             email='snarf@xxxxxx', name='snarf')
-        [entry] = IPickerEntrySource(person).getPickerEntries([person], None)
+        entry = get_picker_entry(person, None)
         self.assertEqual('sprite person', entry.css)
         self.assertEqual('sprite new-window', entry.link_css)
 
@@ -104,8 +110,8 @@
         removeSecurityProxy(person).datecreated = creation_date
         getUtility(IIrcIDSet).new(person, 'eg.dom', 'snarf')
         getUtility(IIrcIDSet).new(person, 'ex.dom', 'pting')
-        [entry] = IPickerEntrySource(person).getPickerEntries(
-            [person], None, enhanced_picker_enabled=True,
+        entry = get_picker_entry(
+            person, None, enhanced_picker_enabled=True,
             picker_expander_enabled=True)
         self.assertEqual('http://launchpad.dev/~snarf', entry.alt_title_link)
         self.assertEqual(
@@ -115,8 +121,8 @@
     def test_PersonPickerEntrySourceAdapter_enhanced_picker_team(self):
         # The enhanced person picker provides more information for teams.
         team = self.factory.makeTeam(email='fnord@xxxxxx', name='fnord')
-        [entry] = IPickerEntrySource(team).getPickerEntries(
-            [team], None, enhanced_picker_enabled=True,
+        entry = get_picker_entry(
+            team, None, enhanced_picker_enabled=True,
             picker_expander_enabled=True)
         self.assertEqual('http://launchpad.dev/~fnord', entry.alt_title_link)
         self.assertEqual(['Team members: 1'], entry.details)
@@ -127,8 +133,8 @@
         project = self.factory.makeProduct(
             name='fnord', owner=person, bug_supervisor=person)
         bugtask = self.factory.makeBugTask(target=project)
-        [entry] = IPickerEntrySource(person).getPickerEntries(
-            [person], bugtask, enhanced_picker_enabled=True,
+        entry = get_picker_entry(
+            person, bugtask, enhanced_picker_enabled=True,
             picker_expander_enabled=True,
             personpicker_affiliation_enabled=True)
         self.assertEqual(3, len(entry.badges))
@@ -147,11 +153,11 @@
         # IHasAffilliation.
         person = self.factory.makePerson(email='snarf@xxxxxx', name='snarf')
         thing = object()
-        [entry] = IPickerEntrySource(person).getPickerEntries(
-            [person], thing, enhanced_picker_enabled=True,
+        entry = get_picker_entry(
+            person, thing, enhanced_picker_enabled=True,
             picker_expander_enabled=True,
             personpicker_affiliation_enabled=True)
-        self.assertEqual(None, None)
+        self.assertIsNot(None, entry)
 
 
 class TestDistributionSourcePackagePickerEntrySourceAdapter(
@@ -162,9 +168,12 @@
     def setUp(self):
         super(TestDistributionSourcePackagePickerEntrySourceAdapter,
               self).setUp()
-        flag = {'disclosure.target_picker_enhancements.enabled':'on'}
+        flag = {'disclosure.target_picker_enhancements.enabled': 'on'}
         self.useFixture(FeatureFixture(flag))
 
+    def getPickerEntry(self, dsp):
+        return get_picker_entry(dsp, object())
+
     def test_dsp_to_picker_entry(self):
         dsp = self.factory.makeDistributionSourcePackage()
         adapter = IPickerEntrySource(dsp)
@@ -179,8 +188,7 @@
         self.factory.makeSourcePackagePublishingHistory(
             distroseries=series,
             sourcepackagerelease=release)
-        [entry] = IPickerEntrySource(dsp).getPickerEntries([dsp], object())
-        self.assertEqual('package', entry.target_type)
+        self.assertEqual('package', self.getPickerEntry(dsp).target_type)
 
     def test_dsp_provides_details(self):
         dsp = self.factory.makeDistributionSourcePackage()
@@ -191,9 +199,9 @@
         self.factory.makeSourcePackagePublishingHistory(
             distroseries=series,
             sourcepackagerelease=release)
-        [entry] = IPickerEntrySource(dsp).getPickerEntries([dsp], object())
-        expected = "Maintainer: %s" % dsp.currentrelease.maintainer.displayname
-        self.assertEqual([expected], entry.details)
+        self.assertEqual(
+            ["Maintainer: %s" % dsp.currentrelease.maintainer.displayname],
+            self.getPickerEntry(dsp).details)
 
     def test_dsp_provides_summary(self):
         dsp = self.factory.makeDistributionSourcePackage()
@@ -204,8 +212,8 @@
         self.factory.makeSourcePackagePublishingHistory(
             distroseries=series,
             sourcepackagerelease=release)
-        [entry] = IPickerEntrySource(dsp).getPickerEntries([dsp], object())
-        self.assertEqual(entry.description, 'Not yet built.')
+        self.assertEqual(
+            "Not yet built.", self.getPickerEntry(dsp).description)
 
         archseries = self.factory.makeDistroArchSeries(distroseries=series)
         bpn = self.factory.makeBinaryPackageName(name='fnord')
@@ -214,8 +222,7 @@
             source_package_release=release,
             sourcepackagename=dsp.sourcepackagename,
             distroarchseries=archseries)
-        [entry] = IPickerEntrySource(dsp).getPickerEntries([dsp], object())
-        self.assertEqual(entry.description, 'fnord')
+        self.assertEqual("fnord", self.getPickerEntry(dsp).description)
 
 
 class TestProductPickerEntrySourceAdapter(TestCaseWithFactory):
@@ -224,41 +231,37 @@
 
     def setUp(self):
         super(TestProductPickerEntrySourceAdapter, self).setUp()
-        flag = {'disclosure.target_picker_enhancements.enabled':'on'}
+        flag = {'disclosure.target_picker_enhancements.enabled': 'on'}
         self.useFixture(FeatureFixture(flag))
 
+    def getPickerEntry(self, product):
+        return get_picker_entry(product, object())
+
     def test_product_to_picker_entry(self):
         product = self.factory.makeProduct()
         adapter = IPickerEntrySource(product)
         self.assertTrue(IPickerEntrySource.providedBy(adapter))
 
     def test_product_provides_alt_title(self):
-        with FeatureFixture({
-            'disclosure.target_picker_enhancements.enabled':'on'}):
-            product = self.factory.makeProduct()
-            [entry] = IPickerEntrySource(product).getPickerEntries(
-                    [product], object())
-            self.assertEqual(entry.alt_title, product.name)
+        product = self.factory.makeProduct()
+        self.assertEqual(product.name, self.getPickerEntry(product).alt_title)
 
     def test_product_target_type(self):
         product = self.factory.makeProduct()
-        [entry] = IPickerEntrySource(product).getPickerEntries(
-                [product], object())
-        # We check for project, not product, because users don't see products.
-        self.assertEqual('project', entry.target_type)
+        # We check for project, not product, because users don't see
+        # products.
+        self.assertEqual('project', self.getPickerEntry(product).target_type)
 
     def test_product_provides_details(self):
         product = self.factory.makeProduct()
-        [entry] = IPickerEntrySource(product).getPickerEntries(
-                [product], object())
-        expected = "Maintainer: %s" % product.owner.displayname
-        self.assertEqual([expected], entry.details)
+        self.assertEqual(
+            ["Maintainer: %s" % product.owner.displayname],
+            self.getPickerEntry(product).details)
 
     def test_product_provides_summary(self):
         product = self.factory.makeProduct()
-        [entry] = IPickerEntrySource(product).getPickerEntries(
-                [product], object())
-        self.assertEqual(entry.description, product.summary)
+        self.assertEqual(
+            product.summary, self.getPickerEntry(product).description)
 
 
 class TestProjectGroupPickerEntrySourceAdapter(TestCaseWithFactory):
@@ -267,40 +270,38 @@
 
     def setUp(self):
         super(TestProjectGroupPickerEntrySourceAdapter, self).setUp()
-        flag = {'disclosure.target_picker_enhancements.enabled':'on'}
+        flag = {'disclosure.target_picker_enhancements.enabled': 'on'}
         self.useFixture(FeatureFixture(flag))
 
+    def getPickerEntry(self, projectgroup):
+        return get_picker_entry(projectgroup, object())
+
     def test_projectgroup_to_picker_entry(self):
         projectgroup = self.factory.makeProject()
         adapter = IPickerEntrySource(projectgroup)
         self.assertTrue(IPickerEntrySource.providedBy(adapter))
 
     def test_projectgroup_provides_alt_title(self):
-        with FeatureFixture({
-            'disclosure.target_picker_enhancements.enabled':'on'}):
-            projectgroup = self.factory.makeProject()
-            [entry] = IPickerEntrySource(projectgroup).getPickerEntries(
-                    [projectgroup], object())
-            self.assertEqual(entry.alt_title, projectgroup.name)
+        projectgroup = self.factory.makeProject()
+        self.assertEqual(
+            projectgroup.name, self.getPickerEntry(projectgroup).alt_title)
 
     def test_projectgroup_target_type(self):
         projectgroup = self.factory.makeProject()
-        [entry] = IPickerEntrySource(projectgroup).getPickerEntries(
-                [projectgroup], object())
-        self.assertEqual('project group', entry.target_type)
+        self.assertEqual(
+            'project group', self.getPickerEntry(projectgroup).target_type)
 
     def test_projectgroup_provides_details(self):
         projectgroup = self.factory.makeProject()
-        [entry] = IPickerEntrySource(projectgroup).getPickerEntries(
-                [projectgroup], object())
-        expected = "Maintainer: %s" % projectgroup.owner.displayname
-        self.assertEqual([expected], entry.details)
+        self.assertEqual(
+            ["Maintainer: %s" % projectgroup.owner.displayname],
+            self.getPickerEntry(projectgroup).details)
 
     def test_projectgroup_provides_summary(self):
         projectgroup = self.factory.makeProject()
-        [entry] = IPickerEntrySource(projectgroup).getPickerEntries(
-                [projectgroup], object())
-        self.assertEqual(entry.description, projectgroup.summary)
+        self.assertEqual(
+            projectgroup.summary,
+            self.getPickerEntry(projectgroup).description)
 
 
 class TestDistributionPickerEntrySourceAdapter(TestCaseWithFactory):
@@ -309,44 +310,41 @@
 
     def setUp(self):
         super(TestDistributionPickerEntrySourceAdapter, self).setUp()
-        flag = {'disclosure.target_picker_enhancements.enabled':'on'}
+        flag = {'disclosure.target_picker_enhancements.enabled': 'on'}
         self.useFixture(FeatureFixture(flag))
 
+    def getPickerEntry(self, distribution):
+        return get_picker_entry(distribution, object())
+
     def test_distribution_to_picker_entry(self):
         distribution = self.factory.makeDistribution()
         adapter = IPickerEntrySource(distribution)
         self.assertTrue(IPickerEntrySource.providedBy(adapter))
 
     def test_distribution_provides_alt_title(self):
-        with FeatureFixture({
-            'disclosure.target_picker_enhancements.enabled':'on'}):
-            distribution = self.factory.makeDistribution()
-            [entry] = IPickerEntrySource(distribution).getPickerEntries(
-                    [distribution], object())
-            self.assertEqual(entry.alt_title, distribution.name)
+        distribution = self.factory.makeDistribution()
+        self.assertEqual(
+            distribution.name, self.getPickerEntry(distribution).alt_title)
 
     def test_distribution_provides_details(self):
         distribution = self.factory.makeDistribution()
-        series = self.factory.makeDistroSeries(
-            distribution=distribution,
-            status=SeriesStatus.CURRENT)
-        [entry] = IPickerEntrySource(distribution).getPickerEntries(
-                [distribution], object())
-        maintain_name = distribution.currentseries.owner.displayname
-        expected = "Maintainer: %s" % maintain_name
-        self.assertEqual([expected], entry.details)
+        self.factory.makeDistroSeries(
+            distribution=distribution, status=SeriesStatus.CURRENT)
+        self.assertEqual(
+            ["Maintainer: %s" % distribution.currentseries.owner.displayname],
+            self.getPickerEntry(distribution).details)
 
     def test_distribution_provides_summary(self):
         distribution = self.factory.makeDistribution()
-        [entry] = IPickerEntrySource(distribution).getPickerEntries(
-                [distribution], object())
-        self.assertEqual(entry.description, distribution.summary)
+        self.assertEqual(
+            distribution.summary,
+            self.getPickerEntry(distribution).description)
 
     def test_distribution_target_type(self):
         distribution = self.factory.makeDistribution()
-        [entry] = IPickerEntrySource(distribution).getPickerEntries(
-                [distribution], object())
-        self.assertEqual('distribution', entry.target_type)
+        self.assertEqual(
+            'distribution', self.getPickerEntry(distribution).target_type)
+
 
 class TestPersonVocabulary:
     implements(IHugeVocabulary)

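The get_picker_entry() helper above leans on list unpacking as an implicit assertion: assigning into a one-element list only succeeds when the adapter returns exactly one entry, and raises ValueError otherwise. In isolation the idiom looks like this:

    entries = ['only entry']
    [entry] = entries  # raises ValueError unless len(entries) == 1
    print(entry)
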
=== modified file 'lib/lp/archivepublisher/model/ftparchive.py'
--- lib/lp/archivepublisher/model/ftparchive.py	2011-09-15 11:41:39 +0000
+++ lib/lp/archivepublisher/model/ftparchive.py	2011-09-19 07:28:22 +0000
@@ -841,4 +841,4 @@
                     if not distroseries.include_long_descriptions:
                         safe_mkdir(os.path.join(base_path, "i18n"))
                 for arch in archs:
-                    safe_mkdir(os.path.join(base_path, "binary-"+arch))
+                    safe_mkdir(os.path.join(base_path, "binary-" + arch))

=== modified file 'lib/lp/blueprints/browser/sprint.py'
--- lib/lp/blueprints/browser/sprint.py	2011-09-13 05:23:16 +0000
+++ lib/lp/blueprints/browser/sprint.py	2011-09-19 07:28:22 +0000
@@ -225,6 +225,7 @@
         return dt.strftime('%Y-%m-%d')
 
     _local_timeformat = '%H:%M %Z on %A, %Y-%m-%d'
+
     @property
     def local_start(self):
         """The sprint start time, in the local time zone, as text."""
@@ -376,7 +377,6 @@
         self.attendee_ids = set(
             attendance.attendeeID for attendance in self.context.attendances)
 
-
     @cachedproperty
     def spec_filter(self):
         """Return the specification links with PROPOSED status for this
@@ -400,7 +400,7 @@
         if 'SUBMIT_CANCEL' in form:
             self.status_message = 'Cancelled'
             self.request.response.redirect(
-                canonical_url(self.context)+'/+specs')
+                canonical_url(self.context) + '/+specs')
             return
 
         if 'SUBMIT_ACCEPT' not in form and 'SUBMIT_DECLINE' not in form:
@@ -439,7 +439,7 @@
         if leftover == 0:
             # they are all done, so redirect back to the spec listing page
             self.request.response.redirect(
-                canonical_url(self.context)+'/+specs')
+                canonical_url(self.context) + '/+specs')
 
 
 class SprintMeetingExportView(LaunchpadView):
@@ -479,11 +479,12 @@
                 continue
             people[subscription.specificationID][subscription.personID] = \
                 subscription.essential
-        # Spec specials - drafter/assignee. Don't need approver for performance
-        # as specifications() above eager loaded the people, and approvers
-        # don't count as a 'required person'.
+
+        # Spec specials - drafter/assignee.  Don't need approver for
+        # performance, as specifications() above eager-loaded the
+        # people, and approvers don't count as "required persons."
         for spec in model_specs:
-            # get the list of attendees that will attend the sprint
+            # Get the list of attendees that will attend the sprint.
             spec_people = people[spec.id]
             if spec.assigneeID is not None:
                 spec_people[spec.assigneeID] = True
@@ -583,10 +584,10 @@
                  # We used to store phone, organization, city and
                  # country, but this was a lie because users could not
                  # update these fields.
-                 '', # attendance.attendee.phone
-                 '', # attendance.attendee.organization
-                 '', # attendance.attendee.city
-                 '', # country
+                 '',  # attendance.attendee.phone
+                 '',  # attendance.attendee.organization
+                 '',  # attendance.attendee.city
+                 '',  # country
                  time_zone,
                  attendance.time_starts.strftime('%Y-%m-%dT%H:%M:%SZ'),
                  attendance.time_ends.strftime('%Y-%m-%dT%H:%M:%SZ'),

=== modified file 'lib/lp/blueprints/browser/sprintspecification.py'
--- lib/lp/blueprints/browser/sprintspecification.py	2011-09-13 05:23:16 +0000
+++ lib/lp/blueprints/browser/sprintspecification.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Views for SprintSpecification."""
@@ -38,4 +38,3 @@
         if decided or cancel is not None:
             self.request.response.redirect(
                 canonical_url(self.context.specification))
-

=== modified file 'lib/lp/bugs/doc/bug-heat.txt'
--- lib/lp/bugs/doc/bug-heat.txt	2011-09-14 00:24:05 +0000
+++ lib/lp/bugs/doc/bug-heat.txt	2011-09-19 07:28:22 +0000
@@ -293,8 +293,8 @@
 Caculating the maximum heat for a target
 ----------------------------------------
 
-When we update the heat value for a bug, the maximum heat value for the targets
-for all of its tasks is calculated and cached.
+When we update the heat value for a bug, the maximum heat value for the
+targets for all of its tasks is calculated and cached.
 
     >>> product = factory.makeProduct()
     >>> bug = factory.makeBug(product=product)

=== modified file 'lib/lp/bugs/doc/bugnotification-email.txt'
--- lib/lp/bugs/doc/bugnotification-email.txt	2011-09-11 12:50:33 +0000
+++ lib/lp/bugs/doc/bugnotification-email.txt	2011-09-19 07:28:22 +0000
@@ -1,4 +1,5 @@
-== Bug Notification Email ==
+Bug Notification Email
+----------------------
 
 This document describes the internal workings of how bug notification
 emails are generated and how said emails are formatted. It does not
@@ -22,21 +23,24 @@
     >>> from lp.bugs.adapters.bugchange import get_bug_changes
     >>> from lp.bugs.mail.newbug import generate_bug_add_email
 
-Let's demonstrate what the bugmails will look like, by going through
-the various events that can happen that would cause a notification to
-be sent. We'll start by importing some things we'll need for the
-examples that follow:
+Let's demonstrate what the bugmails will look like, by going through the
+various events that can happen that would cause a notification to be
+sent. We'll start by importing some things we'll need for the examples
+that follow:
 
     >>> from zope.component import getUtility
-    >>> from canonical.launchpad.interfaces.emailaddress import IEmailAddressSet
+    >>> from canonical.launchpad.interfaces.emailaddress import (
+    ...     IEmailAddressSet)
     >>> from lp.bugs.adapters.bugdelta import BugDelta
     >>> from lp.bugs.interfaces.bug import (
-    ...     IBugDelta,
-    ...     IBugSet,
-    ...     )
+    ...        IBugDelta,
+    ...        IBugSet,
+    ...        )
     >>> from lp.registry.interfaces.person import IPersonSet
 
-= Filing a bug =
+
+Filing a bug
+============
 
 generate_bug_add_email accepts one argument: the IBug that was just
 added. With that, it generates an appropriately-formatted notification
@@ -52,6 +56,7 @@
     >>> subject, body = generate_bug_add_email(bug_four)
     >>> subject
     u'[Bug 4] [NEW] Reflow problems with complex page layouts'
+
     >>> print body
     Public bug reported:
     <BLANKLINE>
@@ -73,6 +78,7 @@
     >>> subject, body = generate_bug_add_email(bug_four)
     >>> subject
     u'[Bug 4] [NEW] Reflow problems with complex page layouts'
+
     >>> print body
     Public bug reported:
     <BLANKLINE>
@@ -87,11 +93,13 @@
 
 New security related bugs are sent with a prominent warning:
 
-    >>> changed = bug_four.setSecurityRelated(True, getUtility(ILaunchBag).user)
+    >>> changed = bug_four.setSecurityRelated(
+    ...     True, getUtility(ILaunchBag).user)
 
     >>> subject, body = generate_bug_add_email(bug_four)
     >>> subject
     u'[Bug 4] [NEW] Reflow problems with complex page layouts'
+
     >>> print body
     *** This bug is a security vulnerability ***
     <BLANKLINE>
@@ -113,7 +121,8 @@
     ...
 
 
-= Editing a bug =
+Editing a bug
+=============
 
 get_bug_changes() accepts an object that provides IBugDelta, and
 generates IBugChange objects that describe the changes to the bug.
@@ -125,33 +134,33 @@
     >>> edited_bug.title = "the new title"
     >>> old_description = edited_bug.description
     >>> edited_bug.description = (
-    ...     "The Trash folder seems to have significant problems! At the"
-    ...     " moment, dragging an item to the Trash results in immediate"
-    ...     " deletion. The item does not appear in the Trash, it is just"
-    ...     " deleted from my hard disk. There is no undo or ability to"
-    ...     " recover the deleted file. Help!")
+    ...        "The Trash folder seems to have significant problems! At the"
+    ...        " moment, dragging an item to the Trash results in immediate"
+    ...        " deletion. The item does not appear in the Trash, it is just"
+    ...        " deleted from my hard disk. There is no undo or ability to"
+    ...        " recover the deleted file. Help!")
 
     >>> bug_delta = BugDelta(
-    ...     bug=edited_bug,
-    ...     bugurl="http://www.example.com/bugs/2";,
-    ...     user=sample_person,
-    ...     title={'new': edited_bug.title, 'old': old_title},
-    ...     description={'new': edited_bug.description,
-    ...                  'old': old_description})
+    ...        bug=edited_bug,
+    ...        bugurl="http://www.example.com/bugs/2";,
+    ...        user=sample_person,
+    ...        title={'new': edited_bug.title, 'old': old_title},
+    ...        description={'new': edited_bug.description,
+    ...                     'old': old_description})
     >>> IBugDelta.providedBy(bug_delta)
     True
 
     >>> from lp.bugs.interfaces.bugchange import IBugChange
     >>> changes = get_bug_changes(bug_delta)
     >>> for change in changes:
-    ...     IBugChange.providedBy(change)
+    ...        IBugChange.providedBy(change)
     True
     True
 
     >>> for change in get_bug_changes(bug_delta):
-    ...     notification = change.getBugNotification()
-    ...     print notification['text'] #doctest: -NORMALIZE_WHITESPACE
-    ...     print "-----------------------------"
+    ...        notification = change.getBugNotification()
+    ...        print notification['text'] #doctest: -NORMALIZE_WHITESPACE
+    ...        print "-----------------------------"
     ** Summary changed:
     <BLANKLINE>
     - Blackhole Trash folder
@@ -170,15 +179,24 @@
 is wrapped properly:
 
     >>> old_description = edited_bug.description
-    >>> edited_bug.description = """\
-    ... a new description that is quite long. but the nice thing is that the edit notification email generator knows how to indent and wrap descriptions, so this will appear quite nice in the actual email that gets sent.\n\nit's also smart enough to preserve whitespace, finally!"""
+    >>> edited_bug.description = ''.join([
+    ...     "A new description that is quite long. ",
+    ...     "But the nice thing is that the edit notification email ",
+    ...     "generator knows how to indent and wrap descriptions, so this ",
+    ...     "will appear quite nice in the actual email that gets sent.",
+    ...     "\n",
+    ...     "\n",
+    ...     "It's also smart enough to preserve whitespace, finally!",
+    ...     ])
 
     >>> bug_delta = BugDelta(
     ...     bug=edited_bug,
     ...     bugurl="http://www.example.com/bugs/2";,
     ...     user=sample_person,
-    ...     description={'new': edited_bug.description,
-    ...                  'old': old_description})
+    ...     description={
+    ...         'new': edited_bug.description,
+    ...         'old': old_description,
+    ...     })
     >>> for change in get_bug_changes(bug_delta):
     ...     notification = change.getBugNotification()
     ...     print notification['text'] #doctest: -NORMALIZE_WHITESPACE
@@ -189,12 +207,12 @@
     - dragging an item to the Trash results in immediate deletion. The item
     - does not appear in the Trash, it is just deleted from my hard disk.
     - There is no undo or ability to recover the deleted file. Help!
-    + a new description that is quite long. but the nice thing is that the
+    + A new description that is quite long. But the nice thing is that the
     + edit notification email generator knows how to indent and wrap
     + descriptions, so this will appear quite nice in the actual email that
     + gets sent.
     + 
-    + it's also smart enough to preserve whitespace, finally!
+    + It's also smart enough to preserve whitespace, finally!
     -----------------------------
 
 Let's make the bug security-related, and private (we need to switch
@@ -204,13 +222,18 @@
 
     >>> edited_bug.setPrivate(True, getUtility(ILaunchBag).user)
     True
-    >>> changed = edited_bug.setSecurityRelated(True, getUtility(ILaunchBag).user)
+
+    >>> changed = edited_bug.setSecurityRelated(
+    ...     True, getUtility(ILaunchBag).user)
     >>> bug_delta = BugDelta(
     ...     bug=edited_bug,
     ...     bugurl="http://www.example.com/bugs/2";,
     ...     user=sample_person,
     ...     private={'old': False, 'new': edited_bug.private},
-    ...     security_related={'old': False, 'new': edited_bug.security_related})
+    ...     security_related={
+    ...         'old': False,
+    ...         'new': edited_bug.security_related,
+    ...     })
 
     >>> for change in get_bug_changes(bug_delta):
     ...     notification = change.getBugNotification()
@@ -227,13 +250,18 @@
 
     >>> edited_bug.setPrivate(False, getUtility(ILaunchBag).user)
     True
-    >>> changed = edited_bug.setSecurityRelated(False, getUtility(ILaunchBag).user)
+
+    >>> changed = edited_bug.setSecurityRelated(
+    ...     False, getUtility(ILaunchBag).user)
     >>> bug_delta = BugDelta(
     ...     bug=edited_bug,
     ...     bugurl="http://www.example.com/bugs/2";,
     ...     user=sample_person,
     ...     private={'old': True, 'new': edited_bug.private},
-    ...     security_related={'old': True, 'new': edited_bug.security_related})
+    ...     security_related={
+    ...         'old': True,
+    ...         'new': edited_bug.security_related,
+    ...         })
     >>> for change in get_bug_changes(bug_delta):
     ...     notification = change.getBugNotification()
     ...     print notification['text'] #doctest: -NORMALIZE_WHITESPACE
@@ -248,14 +276,14 @@
     >>> old_tags = []
     >>> edited_bug.tags = [u'foo', u'bar']
     >>> bug_delta = BugDelta(
-    ...     bug=edited_bug,
-    ...     bugurl="http://www.example.com/bugs/2";,
-    ...     user=sample_person,
-    ...     tags={'old': old_tags, 'new': edited_bug.tags})
+    ...        bug=edited_bug,
+    ...        bugurl="http://www.example.com/bugs/2";,
+    ...        user=sample_person,
+    ...        tags={'old': old_tags, 'new': edited_bug.tags})
     >>> for change in get_bug_changes(bug_delta):
-    ...     notification = change.getBugNotification()
-    ...     print notification['text'] #doctest: -NORMALIZE_WHITESPACE
-    ...     print "-----------------------------"
+    ...        notification = change.getBugNotification()
+    ...        print notification['text'] #doctest: -NORMALIZE_WHITESPACE
+    ...        print "-----------------------------"
     ** Tags added: bar foo
     -----------------------------
 
@@ -264,32 +292,33 @@
     >>> old_tags = edited_bug.tags
     >>> edited_bug.tags = [u'foo', u'baz']
     >>> bug_delta = BugDelta(
-    ...     bug=edited_bug,
-    ...     bugurl="http://www.example.com/bugs/2";,
-    ...     user=sample_person,
-    ...     tags={'old': old_tags, 'new': edited_bug.tags})
+    ...        bug=edited_bug,
+    ...        bugurl="http://www.example.com/bugs/2";,
+    ...        user=sample_person,
+    ...        tags={'old': old_tags, 'new': edited_bug.tags})
     >>> for change in get_bug_changes(bug_delta):
-    ...     notification = change.getBugNotification()
-    ...     print notification['text'] #doctest: -NORMALIZE_WHITESPACE
-    ...     print "-----------------------------"
+    ...        notification = change.getBugNotification()
+    ...        print notification['text'] #doctest: -NORMALIZE_WHITESPACE
+    ...        print "-----------------------------"
     ** Tags removed: bar
     ** Tags added: baz
     -----------------------------
 
 
-= Editing a bug task =
+Editing a bug task
+==================
 
-As you might expect, get_bug_changes handles generating the
-text representations of the changes when a bug task is edited.
+As you might expect, get_bug_changes handles generating the text
+representations of the changes when a bug task is edited.
 
 We use a BugTaskDelta to represent changes to a BugTask.
 
     >>> from canonical.launchpad.webapp.testing import verifyObject
     >>> from lp.bugs.interfaces.bugtask import (
-    ...     BugTaskStatus,
-    ...     IBugTaskDelta,
-    ...     IBugTaskSet,
-    ...     )
+    ...        BugTaskStatus,
+    ...        IBugTaskDelta,
+    ...        IBugTaskSet,
+    ...        )
     >>> from lp.bugs.model.bugtask import BugTaskDelta
     >>> example_bug_task = factory.makeBugTask()
     >>> example_delta = BugTaskDelta(example_bug_task)
@@ -298,7 +327,7 @@
 
     >>> edited_bugtask = getUtility(IBugTaskSet).get(3)
     >>> edited_bugtask.transitionToStatus(
-    ...     BugTaskStatus.CONFIRMED, getUtility(ILaunchBag).user)
+    ...        BugTaskStatus.CONFIRMED, getUtility(ILaunchBag).user)
     >>> edited_bugtask.transitionToAssignee(sample_person)
     >>> bugtask_delta = BugTaskDelta(
     ...     bugtask=edited_bugtask,
@@ -326,6 +355,7 @@
     >>> debian_bugtask = getUtility(IBugTaskSet).get(5)
     >>> print debian_bugtask.bugtargetname
     mozilla-firefox (Debian)
+
     >>> debian_bugtask.transitionToAssignee(None)
     >>> bugtask_delta = BugTaskDelta(
     ...     bugtask=debian_bugtask,
@@ -344,7 +374,8 @@
     -----------------------------
 
 
-= Adding attachments =
+Adding attachments
+==================
 
 Adding an attachment will generate a notification that looks as follows:
 
@@ -414,7 +445,8 @@
     -----------------------------
 
 
-= Generation of From: and Reply-To: addresses =
+Generation of From: and Reply-To: addresses
+===========================================
 
 The Reply-To: and From: addresses used to send email are generated in a
 pair of handy functions defined in mailnotification.py:
@@ -432,7 +464,7 @@
 
     >>> stub = getUtility(IPersonSet).getByName("stub")
     >>> [(email.email, email.status.name) for email
-    ...  in getUtility(IEmailAddressSet).getByPerson(stub)]
+    ...     in getUtility(IEmailAddressSet).getByPerson(stub)]
     [(u'stuart.bishop@xxxxxxxxxxxxx', 'PREFERRED'),
      (u'stuart@xxxxxxxxxxxxxxxx', 'VALIDATED'),
      (u'stub@xxxxxxxxxxx', 'NEW'),
@@ -479,15 +511,16 @@
     'Ford Prefect <4@xxxxxxxxxxxxxxxxxx>'
 
 
-== Construction of bug notification emails ==
+Construction of bug notification emails
+---------------------------------------
 
 mailnotification.py contains a class, BugNotificationBuilder, which is
 used to construct bug notification emails.
 
     >>> from lp.bugs.mail.bugnotificationbuilder import BugNotificationBuilder
 
-When instantiatiated it derives a list of common unchanging headers
-from the bug so that they are not calculated for every recipient.
+When instantiated it derives a list of common unchanging headers from
+the bug so that they are not calculated for every recipient.
 
     >>> bug_four_notification_builder = BugNotificationBuilder(bug_four,
     ...     private_person)
@@ -504,9 +537,9 @@
     X-Launchpad-Bug-Modifier: Ford Prefect (person-name...)
 
 The build() method of a builder accepts a number of parameters and
-returns an instance of email.MIMEText. The most basic invocation of
-this method requires a from address, a to address, a body, a subject
-and a sending date for the mail.
+returns an instance of email.MIMEText. The most basic invocation of this
+method requires a from address, a to address, a body, a subject and a
+sending date for the mail.
 
     >>> from datetime import datetime
     >>> import pytz
@@ -519,9 +552,9 @@
     ...     from_address, 'foo.bar@xxxxxxxxxxxxx',
     ...     "A test body.", "A test subject.", sending_date)
 
-The fields of the generated notification email will be set according
-to the parameters that were used to instantiate BugNotificationBuilder
-and passed to <builder>.build().
+The fields of the generated notification email will be set according to
+the parameters that were used to instantiate BugNotificationBuilder and
+passed to <builder>.build().
 
     >>> print notification_email['From']
     Launchpad Bug Tracker <4@xxxxxxxxxxxxxxxxxx>

=== modified file 'lib/lp/bugs/doc/initial-bug-contacts.txt'
--- lib/lp/bugs/doc/initial-bug-contacts.txt	2011-09-14 00:24:05 +0000
+++ lib/lp/bugs/doc/initial-bug-contacts.txt	2011-09-19 07:28:22 +0000
@@ -1,7 +1,8 @@
-= Bug Subscriptions =
+Bug Subscriptions
+=================
 
-Package bug subscriptions allow zero, one, or more people or teams that get
-explicitly Cc'd to all public bugs filed on a package.
+Package bug subscriptions allow zero, one, or more people or teams that
+get explicitly Cc'd to all public bugs filed on a package.
 
 The package bug subscriptions are obtained from looking at the
 StructuralSubscription table.
@@ -20,8 +21,8 @@
     []
 
 Adding a package subscription is done with the
-IDistributionSourcePackage.addBugSubscription method. You have to
-be logged in to call this method:
+IDistributionSourcePackage.addBugSubscription method. You have to be
+logged in to call this method:
 
     >>> from lp.registry.interfaces.person import IPersonSet
     >>> personset = getUtility(IPersonSet)
@@ -39,6 +40,7 @@
 
     >>> debian_firefox.addBugSubscription(sample_person, sample_person)
     <...StructuralSubscription object at ...>
+
     >>> [pbc.subscriber.name for pbc in debian_firefox.bug_subscriptions]
     [u'name12']
 
@@ -54,18 +56,20 @@
     >>> debian_firefox.addBugSubscription(ubuntu_team, ubuntu_team)
     <...StructuralSubscription object at ...>
 
-    >>> sorted([sub.subscriber.name for sub in debian_firefox.bug_subscriptions])
+    >>> sorted([
+    ...     sub.subscriber.name for sub in debian_firefox.bug_subscriptions])
     [u'name12', u'ubuntu-team']
 
 To remove a subscription, use
 IStructuralSubscriptionTarget.removeBugSubscription:
 
     >>> debian_firefox.removeBugSubscription(sample_person, sample_person)
-    >>> sorted([sub.subscriber.id for sub in debian_firefox.bug_subscriptions])
+    >>> sorted([
+    ...     sub.subscriber.id for sub in debian_firefox.bug_subscriptions])
     [17]
 
-Trying to remove a subscription that doesn't exist on a source package raises a
-DeleteSubscriptionError.
+Trying to remove a subscription that doesn't exist on a source package
+raises a DeleteSubscriptionError.
 
     >>> foobar = personset.getByName("name16")
     >>> debian_firefox.removeBugSubscription(foobar, foobar)
@@ -74,17 +78,17 @@
     DeleteSubscriptionError: ...
 
 
-== Package Subscriptions and Bug Tasks ==
+Package Subscriptions and Bug Tasks
+-----------------------------------
 
 Often a bug gets reported on package foo, when it should have been
 reported on bar. When a user, likely a bug triager or developer, changes
-the source package, the subscribers for the new package get
-subscribed. The subscribers of the previous package also remain
-subscribed.
+the source package, the subscribers for the new package get subscribed.
+The subscribers of the previous package also remain subscribed.
 
-To demonstrate, let's change the source package for bug #1 in
-mozilla-firefox in Ubuntu to be pmount in Ubuntu, and see how the
-subscribers list changes.
+To demonstrate, let's change the source package for bug #1 in
+mozilla-firefox in Ubuntu to be pmount in Ubuntu, and see how the
+subscribers list changes.
 
     >>> from lp.bugs.interfaces.bugtask import IBugTaskSet
 
@@ -103,17 +107,23 @@
     ...     subscribers = chain(
     ...         bug.getDirectSubscribers(),
     ...         bug.getIndirectSubscribers())
-    ...     return sorted(subscriber.displayname for subscriber in subscribers)
+    ...     return sorted(
+    ...         subscriber.displayname for subscriber in subscribers)
 
-    >>> subscriber_names(bug_one_in_ubuntu_firefox.bug)
-    [u'Foo Bar', u'Mark Shuttleworth', u'Sample Person', u'Steve Alexander',
-     u'Ubuntu Team']
+    >>> names = subscriber_names(bug_one_in_ubuntu_firefox.bug)
+    >>> for name in names:
+    ...     print name
+    Foo Bar
+    Mark Shuttleworth
+    Sample Person
+    Steve Alexander
+    Ubuntu Team
 
 Changing the package for bug_one_in_ubuntu_firefox to pmount will
 implicitly subscribe the new package's subscribers to the bug. In
-demonstrating this, we'll also make Sample Person a subscriber to
-ubuntu pmount, to show that the subscription changes behave correctly
-when a subscriber to the new package is already subscribed to the bug:
+demonstrating this, we'll also make Sample Person a subscriber to ubuntu
+pmount, to show that the subscription changes behave correctly when a
+subscriber to the new package is already subscribed to the bug:
 
     >>> from zope.event import notify
 
@@ -126,17 +136,18 @@
     >>> daf = personset.getByName("daf")
     >>> ubuntu_pmount.addBugSubscription(daf, daf)
     <...StructuralSubscription object at ...>
+
     >>> ubuntu_pmount.addBugSubscription(sample_person, sample_person)
     <...StructuralSubscription object at ...>
 
     >>> old_state = Snapshot(
-    ...     bug_one_in_ubuntu_firefox, providing=IBugTask)
+    ...        bug_one_in_ubuntu_firefox, providing=IBugTask)
 
     >>> bug_one_in_ubuntu_firefox.transitionToTarget(ubuntu_pmount)
 
     >>> source_package_changed = ObjectModifiedEvent(
-    ...     bug_one_in_ubuntu_firefox, old_state,
-    ...     ["id", "title", "sourcepackagename"])
+    ...        bug_one_in_ubuntu_firefox, old_state,
+    ...        ["id", "title", "sourcepackagename"])
 
     >>> notify(source_package_changed)
     >>> transaction.commit()
@@ -149,36 +160,41 @@
 
 daf is sent an email giving him complete information about the bug that
 has just been retargeted, including the title, description, status,
-importance, etc. The References header of the email contains the
-msgid of the initial bug report (as if daf was a original recipient of
-the bug notification). The email has the X-Launchpad-Message-Rationale
-header to track why daf received the email. The rational is repeated
-in the footer of the email with the bug title and URL.
+importance, etc. The References header of the email contains the msgid
+of the initial bug report (as if daf were an original recipient of the
+bug notification). The email has the X-Launchpad-Message-Rationale
+header to track why daf received the email. The rationale is repeated
+in the footer of the email with the bug title and URL.
 
     >>> import email
 
     >>> def by_to_addrs(a, b):
-    ...     return cmp(a[1], b[1])
+    ...        return cmp(a[1], b[1])
 
     >>> test_emails = list(stub.test_emails)
     >>> test_emails.sort(by_to_addrs)
 
     >>> len(test_emails)
     1
+
     >>> from_addr, to_addr, raw_message = test_emails.pop()
     >>> print from_addr
     bounces@xxxxxxxxxxxxx
+
     >>> print to_addr
     ['daf@xxxxxxxxxxxxx']
 
     >>> msg = email.message_from_string(raw_message)
     >>> msg['References'] == (
-    ...     bug_one_in_ubuntu_firefox.bug.initial_message.rfc822msgid)
+    ...        bug_one_in_ubuntu_firefox.bug.initial_message.rfc822msgid)
     True
+
     >>> msg['X-Launchpad-Message-Rationale']
     'Subscriber (pmount in Ubuntu)'
+
     >>> msg['Subject']
     '[Bug 1] [NEW] Firefox does not support SVG'
+
     >>> print msg.get_payload(decode=True)
     You have been subscribed to a public bug:
     <BLANKLINE>
@@ -218,16 +234,16 @@
 
     >>> stub.test_emails = []
 
-Let's see that nothing unexpected happens when we set the source
-package to None.
+Let's see that nothing unexpected happens when we set the source package
+to None.
 
     >>> old_state = Snapshot(
-    ...     bug_one_in_ubuntu_firefox, providing=IBugTask)
+    ...        bug_one_in_ubuntu_firefox, providing=IBugTask)
 
     >>> bug_one_in_ubuntu_firefox.transitionToTarget(ubuntu)
 
     >>> source_package_changed = ObjectModifiedEvent(
-    ...     bug_one_in_ubuntu_firefox, old_state, ["sourcepackagename"])
+    ...        bug_one_in_ubuntu_firefox, old_state, ["sourcepackagename"])
 
     >>> notify(source_package_changed)
     >>> transaction.commit()
@@ -235,26 +251,32 @@
 
 The package subscribers, Daf and Foo Bar, are implicitly unsubscribed:
 
-    >>> subscriber_names(bug_one_in_ubuntu_firefox.bug)
-    [u'Mark Shuttleworth', u'Sample Person', u'Steve Alexander', u'Ubuntu Team']
+    >>> names = subscriber_names(bug_one_in_ubuntu_firefox.bug)
+    >>> for name in names:
+    ...     print name
+    Mark Shuttleworth
+    Sample Person
+    Steve Alexander
+    Ubuntu Team
 
-Subscriptions are not limited to persons; teams are also allowed to subscribe.
-Teams are a bit different, since they might not have a contact address. Let's
-add such a team as a subscriber.
+Subscriptions are not limited to persons; teams are also allowed to
+subscribe. Teams are a bit different, since they might not have a
+contact address. Let's add such a team as a subscriber.
 
     >>> ubuntu_gnome = personset.getByName("name18")
     >>> ubuntu_gnome.preferredemail is None
     True
+
     >>> ubuntu_pmount.addBugSubscription(ubuntu_gnome, ubuntu_gnome)
     <...StructuralSubscription object at ...>
 
     >>> old_state = Snapshot(
-    ...     bug_one_in_ubuntu_firefox, providing=IBugTask)
+    ...        bug_one_in_ubuntu_firefox, providing=IBugTask)
 
     >>> bug_one_in_ubuntu_firefox.transitionToTarget(ubuntu_pmount)
 
     >>> source_package_changed = ObjectModifiedEvent(
-    ...     bug_one_in_ubuntu_firefox, old_state, ["sourcepackagename"])
+    ...        bug_one_in_ubuntu_firefox, old_state, ["sourcepackagename"])
 
     >>> notify(source_package_changed)
     >>> transaction.commit()
@@ -270,7 +292,7 @@
 bug is reassigned to a different package.
 
     >>> bug_one_in_ubuntu_firefox.bug.setPrivate(
-    ...     True, getUtility(ILaunchBag).user)
+    ...        True, getUtility(ILaunchBag).user)
     True
 
 So, if Martin Pitt subscribes to ubuntu mozilla-firefox:
@@ -282,13 +304,13 @@
 and then the bug gets reassigned to mozilla firefox:
 
     >>> old_state = Snapshot(
-    ...     bug_one_in_ubuntu_firefox, providing=IBugTask)
+    ...        bug_one_in_ubuntu_firefox, providing=IBugTask)
 
     >>> bug_one_in_ubuntu_firefox.transitionToTarget(ubuntu_firefox)
 
     >>> source_package_changed = ObjectModifiedEvent(
-    ...     bug_one_in_ubuntu_firefox, old_state,
-    ...     ["id", "title", "sourcepackagename"])
+    ...        bug_one_in_ubuntu_firefox, old_state,
+    ...        ["id", "title", "sourcepackagename"])
 
     >>> notify(source_package_changed)
     >>> transaction.commit()
@@ -300,7 +322,8 @@
     [u'Foo Bar', u'Mark Shuttleworth', u'Sample Person', u'Ubuntu Team']
 
 
-== Product Bug Supervisors and Bug Tasks ==
+Product Bug Supervisors and Bug Tasks
+-------------------------------------
 
 Like reassigning a bug task to another package, reassigning a bug task
 to another product will subscribe any new product bug supervisors to the
@@ -320,12 +343,13 @@
     >>> bug_two_in_ubuntu = getUtility(IBugTaskSet).get(3)
     >>> print bug_two_in_ubuntu.bug.id
     2
+
     >>> print bug_two_in_ubuntu.product.name
     tomcat
 
     >>> sorted(
-    ...     [subscription.person.displayname for subscription in
-    ...      bug_two_in_ubuntu.bug.subscriptions])
+    ...        [subscription.person.displayname for subscription in
+    ...         bug_two_in_ubuntu.bug.subscriptions])
     [u'Steve Alexander']
 
     >>> old_state = Snapshot(bug_two_in_ubuntu, providing=IBugTask)
@@ -333,7 +357,7 @@
     >>> bug_two_in_ubuntu.transitionToTarget(mozilla_firefox)
 
     >>> product_changed = ObjectModifiedEvent(
-    ...     bug_two_in_ubuntu, old_state, ["id", "title", "product"])
+    ...        bug_two_in_ubuntu, old_state, ["id", "title", "product"])
 
     >>> notify(product_changed)
     >>> transaction.commit()
@@ -355,15 +379,18 @@
 
     >>> len(test_emails)
     1
+
     >>> from_addr, to_addr, raw_message = test_emails.pop()
     >>> print from_addr
     bounces@xxxxxxxxxxxxx
+
     >>> print to_addr
     ['foo.bar@xxxxxxxxxxxxx']
 
     >>> msg = email.message_from_string(raw_message)
     >>> msg['Subject']
     '[Bug 2] [NEW] Blackhole Trash folder'
+
     >>> print msg.get_payload(decode=True)
     You have been subscribed to a public bug:
     ...
@@ -376,10 +403,11 @@
     are subscribed to Mozilla Firefox.
 
 
-== Teams as bug supervisors ==
+Teams as bug supervisors
+------------------------
 
-The list of teams that a user may add to a package as a bug supervisor will
-only contain those teams of which the user is an administrator.
+The list of teams that a user may add to a package as a bug supervisor
+will only contain those teams of which the user is an administrator.
 
     >>> from zope.component import getMultiAdapter
     >>> from canonical.launchpad.webapp.servers import LaunchpadTestRequest
@@ -396,16 +424,18 @@
 
     >>> sample_person = view.user
     >>> ["%s: %s" % (membership.team.displayname, membership.status.name)
-    ...  for membership in sample_person.team_memberships]
+    ...     for membership in sample_person.team_memberships]
     [u'HWDB Team: APPROVED',
      u'Landscape Developers: ADMIN',
      u'Launchpad Users: ADMIN',
      u'Warty Security Team: APPROVED']
 
-But is only an administrator of Landscape Developers, so that is the only
-team that will be listed when the user is changing a package bug supervisor:
+But Sample Person is only an administrator of Landscape Developers, so
+that is the only team that will be listed when the user is changing a
+package bug supervisor:
 
     >>> for team in view.user.getAdministratedTeams():
-    ...     print team.displayname
+    ...        print team.displayname
     Landscape Developers
     Launchpad Users
+

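An aside for reviewers skimming the doctest above: the retargeting rule
it exercises is purely additive.  Here is a minimal standalone sketch of
that rule in plain Python (illustrative only, not Launchpad code; the
helper name and the sample subscribers are made up):

    # Toy model of the behaviour described above: when a bug task is
    # retargeted, the new package's structural subscribers are added and
    # everyone already subscribed (including the old package's
    # subscribers) stays subscribed.
    def retarget_subscribers(current_subscribers, new_package_subscribers):
        """Return the bug's subscriber set after changing the package."""
        return set(current_subscribers) | set(new_package_subscribers)

    subscribers = set(['Sample Person', 'Ubuntu Team'])
    subscribers = retarget_subscribers(
        subscribers, set(['daf', 'Sample Person']))
    print sorted(subscribers)
    # ['Sample Person', 'Ubuntu Team', 'daf']
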
=== modified file 'lib/lp/code/browser/branchsubscription.py'
--- lib/lp/code/browser/branchsubscription.py	2011-09-13 05:23:16 +0000
+++ lib/lp/code/browser/branchsubscription.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 __metaclass__ = type
@@ -270,4 +270,3 @@
         return canonical_url(self.branch)
 
     cancel_url = next_url
-

=== modified file 'lib/lp/codehosting/codeimport/tests/test_worker.py'
--- lib/lp/codehosting/codeimport/tests/test_worker.py	2011-09-15 16:38:16 +0000
+++ lib/lp/codehosting/codeimport/tests/test_worker.py	2011-09-19 07:28:22 +0000
@@ -28,8 +28,8 @@
     )
 from bzrlib.errors import NoSuchFile
 from bzrlib.tests import (
+    http_utils,
     TestCaseWithTransport,
-    http_utils,
     )
 from bzrlib.transport import (
     get_transport,
@@ -1154,10 +1154,10 @@
         self.makeForeignCommit(source_details, ref="refs/heads/master",
             message="Message for master")
         source_details.url = urlutils.join_segment_parameters(
-                source_details.url, { "branch": "other" })
+                source_details.url, {"branch": "other"})
         source_transport = get_transport_from_url(source_details.url)
         self.assertEquals(
-            { "branch": "other" },
+            {"branch": "other"},
             source_transport.get_segment_parameters())
         worker = self.makeImportWorker(source_details,
             opener_policy=AcceptAnythingPolicy())
@@ -1205,11 +1205,11 @@
         repo = localrepository(ui(), local_path_from_url(source_details.url))
         extra = {}
         if branch is not None:
-            extra = { "branch": branch }
+            extra = {"branch": branch}
         if message is None:
             message = self.factory.getUniqueString()
-        repo.commit(text=message, user="Jane Random Hacker", force=1,
-            extra=extra)
+        repo.commit(
+            text=message, user="Jane Random Hacker", force=1, extra=extra)
         self.foreign_commit_count += 1
 
     def makeSourceDetails(self, branch_name, files):
@@ -1235,10 +1235,10 @@
         self.makeForeignCommit(source_details, branch="default",
             message="Message for default")
         source_details.url = urlutils.join_segment_parameters(
-                source_details.url, { "branch": "other" })
+                source_details.url, {"branch": "other"})
         source_transport = get_transport_from_url(source_details.url)
         self.assertEquals(
-            { "branch": "other" },
+            {"branch": "other"},
             source_transport.get_segment_parameters())
         worker = self.makeImportWorker(source_details,
             opener_policy=AcceptAnythingPolicy())

=== modified file 'lib/lp/codehosting/codeimport/worker.py'
--- lib/lp/codehosting/codeimport/worker.py	2011-09-15 14:15:38 +0000
+++ lib/lp/codehosting/codeimport/worker.py	2011-09-19 07:28:22 +0000
@@ -43,8 +43,8 @@
     do_catching_redirections,
     get_transport,
     )
+import bzrlib.ui
 from bzrlib.upgrade import upgrade
-import bzrlib.ui
 from bzrlib.urlutils import (
     join as urljoin,
     local_path_from_url,
@@ -681,12 +681,14 @@
         """
         def redirected(transport, e, redirection_notice):
             self._opener_policy.checkOneURL(e.target)
-            redirected_transport = transport._redirected_to(e.source, e.target)
+            redirected_transport = transport._redirected_to(
+                e.source, e.target)
             if redirected_transport is None:
                 raise NotBranchError(e.source)
             self._logger.info('%s is%s redirected to %s',
                  transport.base, e.permanently, redirected_transport.base)
             return redirected_transport
+
         def find_format(transport):
             last_error = None
             for prober_kls in self.probers:

=== modified file 'lib/lp/codehosting/puller/tests/test_scheduler.py'
--- lib/lp/codehosting/puller/tests/test_scheduler.py	2011-09-14 00:41:36 +0000
+++ lib/lp/codehosting/puller/tests/test_scheduler.py	2011-09-19 07:28:22 +0000
@@ -1,11 +1,10 @@
-# Copyright 2009-2010 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 # pylint: disable-msg=W0222,W0231
 
 __metaclass__ = type
 
-from datetime import datetime
 import logging
 import os
 import textwrap
@@ -16,7 +15,6 @@
     format_registry,
     )
 from bzrlib.urlutils import join as urljoin
-import pytz
 from testtools.deferredruntest import (
     assert_fails_with,
     AsynchronousDeferredRunTest,
@@ -62,9 +60,11 @@
     def callRemote(self, method_name, *args):
         method = getattr(self, '_remote_%s' % method_name, self._default)
         deferred = method(*args)
+
         def append_to_log(pass_through):
             self.calls.append((method_name,) + tuple(args))
             return pass_through
+
         deferred.addCallback(append_to_log)
         return deferred
 
@@ -140,7 +140,7 @@
             self.calls.append(('method',) + args)
 
         def do_raise(self):
-            return 1/0
+            return 1 / 0
 
         def unexpectedError(self, failure):
             self.failure = failure
@@ -406,9 +406,11 @@
         # attempt to call mirrorFailed().
 
         runtime_error_failure = makeFailure(RuntimeError)
+
         class FailingMirrorFailedStubPullerListener(self.StubPullerListener):
             def mirrorFailed(self, message, oops):
                 return runtime_error_failure
+
         self.protocol.listener = FailingMirrorFailedStubPullerListener()
         self.listener = self.protocol.listener
         self.protocol.errReceived('traceback')
@@ -436,7 +438,6 @@
         """The puller master logs an OOPS when it receives an unexpected
         error.
         """
-        now = datetime.now(pytz.timezone('UTC'))
         fail = makeFailure(RuntimeError, 'error message')
         self.eventHandler.unexpectedError(fail)
         oops = self.oopses[-1]
@@ -565,8 +566,10 @@
         deferred = self.eventHandler.run()
         # Fake a successful run.
         deferred.callback(None)
+
         def check_available_prefixes(ignored):
             self.assertEqual(self.available_oops_prefixes, set(['foo']))
+
         return deferred.addCallback(check_available_prefixes)
 
     def test_restoresOopsPrefixToSetOnFailure(self):
@@ -579,8 +582,10 @@
         except RuntimeError:
             fail = failure.Failure()
         deferred.errback(fail)
+
         def check_available_prefixes(ignored):
             self.assertEqual(self.available_oops_prefixes, set(['foo']))
+
         return deferred.addErrback(check_available_prefixes)
 
     def test_logOopsWhenNoAvailablePrefix(self):
@@ -591,8 +596,10 @@
         self.available_oops_prefixes.clear()
 
         unexpected_errors = []
+
         def unexpectedError(failure):
             unexpected_errors.append(failure)
+
         self.eventHandler.unexpectedError = unexpectedError
         self.assertRaises(KeyError, self.eventHandler.run)
         self.assertEqual(unexpected_errors[0].type, KeyError)
@@ -717,8 +724,10 @@
 
         old_oops_raising = errorlog.globalErrorUtility.raising
         errorlog.globalErrorUtility.raising = new_oops_raising
+
         def restore_oops():
             errorlog.globalErrorUtility.raising = old_oops_raising
+
         self.addCleanup(restore_oops)
 
         expected_output = 'foo\nbar'
@@ -756,7 +765,6 @@
                 """Record the lock id on the listener."""
                 self.listener.lock_ids.append(id)
 
-
         class PullerMasterWithLockID(scheduler.PullerMaster):
             """A subclass of PullerMaster that allows recording of lock ids.
             """
@@ -887,7 +895,8 @@
 
         # We need to create a branch at the destination_url, so that the
         # subprocess can actually create a lock.
-        BzrDir.create_branch_convenience(locking_puller_master.destination_url)
+        BzrDir.create_branch_convenience(
+            locking_puller_master.destination_url)
 
         # Because when the deferred returned by 'func' is done we kill the
         # locking subprocess, we know that when the subprocess is done, the
@@ -944,6 +953,7 @@
             puller_master = self.makePullerMaster(
                 script_text=lower_timeout_script)
             deferred = puller_master.mirror()
+
             def check_mirror_failed(ignored):
                 self.assertEqual(len(self.client.calls), 1)
                 mirror_failed_call = self.client.calls[0]
@@ -953,6 +963,7 @@
                 self.assertTrue(
                     "Could not acquire lock" in mirror_failed_call[2])
                 return ignored
+
             deferred.addCallback(check_mirror_failed)
             return deferred
 

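For context on the recurring deferred.addCallback(...) hunks above:
these tests lean on Twisted's Deferred chaining, and the lint fixes only
add blank lines around the nested callback functions.  A self-contained
sketch of the pattern (illustrative only, not code from this branch; no
reactor needed):

    from twisted.internet.defer import Deferred

    calls = []

    def append_to_log(pass_through):
        # Record that the call completed, then hand the result on
        # unchanged -- the same shape as the fake client in these tests.
        calls.append(pass_through)
        return pass_through

    deferred = Deferred()
    deferred.addCallback(append_to_log)
    deferred.callback('mirror-complete')  # fake a successful run
    assert calls == ['mirror-complete']
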
=== modified file 'lib/lp/hardwaredb/scripts/hwdbsubmissions.py'
--- lib/lp/hardwaredb/scripts/hwdbsubmissions.py	2011-09-13 14:47:12 +0000
+++ lib/lp/hardwaredb/scripts/hwdbsubmissions.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Parse Hardware Database submissions.
@@ -1693,7 +1693,7 @@
         0: HWBus.SCSI,
         1: HWBus.IDE,
         2: HWBus.FLOPPY,
-        3: HWBus.IPI, # Intelligent Peripheral Interface
+        3: HWBus.IPI,  # Intelligent Peripheral Interface.
         5: HWBus.ATA,
         6: HWBus.SATA,
         7: HWBus.SAS,
@@ -3048,7 +3048,7 @@
             except (KeyboardInterrupt, SystemExit):
                 # We should never catch these exceptions.
                 raise
-            except LibrarianServerError, error:
+            except LibrarianServerError:
                 # LibrarianServerError is raised when the server could
                 # not be reaches for 30 minutes.
                 #
@@ -3067,7 +3067,7 @@
                     'Could not reach the Librarian while processing HWDB '
                     'submission %s' % submission.submission_key)
                 raise
-            except Exception, error:
+            except Exception:
                 self.transaction.abort()
                 self.reportOops(
                     'Exception while processing HWDB submission %s'

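A note on the "except LibrarianServerError, error:" changes above: the
comma form binds a name these handlers never use, which is what the
linter flags.  A small illustration (not from this branch) of both
spellings; "except ... as ...", available since Python 2.6, is the one
to use when the exception object is actually needed:

    try:
        raise ValueError('boom')
    except ValueError:
        # No binding, so nothing is left unused.
        handled = True

    try:
        raise ValueError('boom')
    except ValueError as error:
        # Bind the exception only when its value is used.
        handled = 'failed: %s' % error
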
=== modified file 'lib/lp/registry/browser/productrelease.py'
--- lib/lp/registry/browser/productrelease.py	2011-09-13 05:23:16 +0000
+++ lib/lp/registry/browser/productrelease.py	2011-09-19 07:28:22 +0000
@@ -15,7 +15,6 @@
     'ProductReleaseView',
     ]
 
-import cgi
 import mimetypes
 
 from lazr.restful.interface import copy_field
@@ -135,8 +134,6 @@
         # should not be targeted to a milestone in the past.
         if data.get('keep_milestone_active') is False:
             milestone.active = False
-            milestone_link = '<a href="%s">%s milestone</a>' % (
-                canonical_url(milestone), cgi.escape(milestone.name))
         self.next_url = canonical_url(newrelease.milestone)
         notify(ObjectCreatedEvent(newrelease))
 

=== modified file 'lib/lp/registry/browser/tests/test_distroseries.py'
--- lib/lp/registry/browser/tests/test_distroseries.py	2011-09-13 08:46:35 +0000
+++ lib/lp/registry/browser/tests/test_distroseries.py	2011-09-19 07:28:22 +0000
@@ -100,7 +100,6 @@
     )
 from lp.soyuz.model import distroseriesdifferencejob
 from lp.soyuz.model.archivepermission import ArchivePermission
-from lp.soyuz.model.initializedistroseriesjob import InitializeDistroSeriesJob
 from lp.soyuz.model.packagecopyjob import PlainPackageCopyJob
 from lp.soyuz.scripts.initialize_distroseries import InitializationError
 from lp.testing import (

=== modified file 'lib/lp/registry/model/distributionsourcepackage.py'
--- lib/lp/registry/model/distributionsourcepackage.py	2011-09-16 19:02:49 +0000
+++ lib/lp/registry/model/distributionsourcepackage.py	2011-09-19 07:28:22 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 # pylint: disable-msg=E0611,W0212
@@ -572,7 +572,7 @@
     po_message_count = Int()
     is_upstream_link_allowed = Bool()
     enable_bugfiling_duplicate_search = Bool()
-    
+
     # XXX kiko 2006-08-16: Bad method name, no need to be a property.
     @property
     def currentrelease(self):