launchpad-reviewers team mailing list archive
Message #05077
[Merge] lp:~jtv/launchpad/megalint-2 into lp:launchpad
Jeroen T. Vermeulen has proposed merging lp:~jtv/launchpad/megalint-2 into lp:launchpad.
Requested reviews:
Launchpad code reviewers (launchpad-reviewers)
For more details, see:
https://code.launchpad.net/~jtv/launchpad/megalint-2/+merge/76937
= Summary =
Every Launchpad engineer's responsibilities include
(1) checking their branches for lint;
(2) fixing any lint they added, with a few exceptions; and
(3) cleaning up at least some of other people's lint that they find in files they touch.
However, not everybody does this, and so a “bzr merge devel” will typically introduce a lot of lint into your branch. This effectively sabotages your first line of defense against mistakes in resolving merge conflicts, and in some cases the noise also hides long-standing bugs.
== Proposed fix ==
Yet another humongous lint cleanup.
== Pre-implementation notes ==
I filed one bug and sent out a notice about dubious JavaScript that needs fixing, but which I'm loath to address myself for fear of getting the intention of the code wrong.
== Implementation details ==
I had to get creative in a few places:
* When evaluating a property twice to establish performance characteristics, don't assign to an unused variable or assert its result for no good reason; introduce a do-nothing function whose name documents the intent.
* Replace repetitive percentage calculations with a simple helper class.
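The two tricks above can be sketched roughly like this (the names here are hypothetical illustrations, not the ones used in the branch):

```python
def evaluate_for_side_effect(value):
    """Do nothing.

    The name documents that `value` is computed purely for its side
    effects (e.g. warming a cache before timing a second access),
    avoiding an unused-variable or pointless-assert lint warning.
    """


class Percentages:
    """Hypothetical helper replacing repeated percentage arithmetic."""

    def __init__(self, total):
        self.total = total

    def percentage(self, part):
        # Guard against division by zero for an empty total.
        if self.total == 0:
            return 0.0
        return 100.0 * part / self.total


# Evaluate the property twice without assigning to an unused variable:
stats = Percentages(200)
evaluate_for_side_effect(stats.percentage(50))
evaluate_for_side_effect(stats.percentage(50))
```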
= Launchpad lint =
In most of the remaining lint, our linter is simply wrong (particularly “'_pythonpath' imported but unused”) or I see nothing that can be easily done. I filed a bug for the JavaScript warnings. The close-account warnings are about a horrible construct where the script passes its local variables dict for interpolation. I haven't figured out where the quoting of SQL strings is supposed to be done; I'll just leave the warnings in there for another time.
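To illustrate why the locals()-interpolation construct is worrying (this is a hypothetical sketch, not the actual close-account.py code), compare interpolating values into the SQL string yourself with handing parameters to the database driver, which then owns the quoting:

```python
# Hypothetical illustration; these names do not come from close-account.py.
new_name = "removed-account"
account_id = 42

# Fragile: quoting is left to string interpolation, and the linter
# cannot tell that the "unused" local variables are consumed here.
fragile = (
    "UPDATE Person SET name = '%(new_name)s' "
    "WHERE id = %(account_id)s" % locals())

# Safer: pass an explicit parameter dict and let the driver
# (e.g. psycopg2) do the quoting via cursor.execute(query, params).
query = "UPDATE Person SET name = %(new_name)s WHERE id = %(account_id)s"
params = {'new_name': new_name, 'account_id': account_id}
```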
The remaining warnings about “'lp' imported but unused” are hard to avoid because the import in question has the side effect of initializing bzr. We may want to find a more explicit solution for this.
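A rough sketch of the pattern, with a fake module standing in for the real side-effect import; both the `# noqa` suppression and the explicit initialize() function are suggestions for discussion, not what the branch currently does:

```python
import sys
import types

# Simulate a module whose import has a side effect, the way
# "import lp" initializes bzr: importing it sets a flag.
_fake = types.ModuleType("fake_lp")
_fake.INITIALIZED = True
sys.modules["fake_lp"] = _fake

import fake_lp  # noqa: F401 -- imported for its side effect only.


def initialize():
    """Hypothetical explicit replacement for the side-effect import."""
    return sys.modules["fake_lp"].INITIALIZED
```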
Checking for conflicts and issues in changed files.
Linting changed files:
lib/lp/app/stories/basics/notfound-error.txt
lib/lp/app/browser/launchpadform.py
lib/lp/codehosting/puller/tests/test_worker_formats.py
lib/lp/registry/stories/announcements/xx-announcements.txt
lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt
lib/lp/testing/pgsql.py
lib/canonical/launchpad/tests/test_login.py
lib/lp/soyuz/scripts/packagecopier.py
lib/lp/registry/browser/announcement.py
lib/canonical/launchpad/doc/canonical-config.txt
lib/lp/soyuz/stories/ppa/xx-ppa-workflow.txt
lib/canonical/launchpad/pagetitles.py
lib/lp/services/mail/incoming.py
lib/lp/code/interfaces/branchmergeproposal.py
lib/lp/soyuz/scripts/tests/test_processaccepted.py
lib/lp/soyuz/doc/archive-override-check.txt
lib/canonical/database/postgresql.py
lib/lp/bugs/stories/bug-privacy/xx-bug-privacy.txt
cronscripts/librarian-gc.py
lib/canonical/launchpad/browser/oauth.py
scripts/close-account.py
lib/lp/soyuz/browser/sourcepackagebuilds.py
lib/lp/bugs/doc/bugtask-expiration.txt
lib/lp/registry/browser/tests/test_distribution_views.py
lib/lp/bugs/browser/tests/special/bugtarget-recently-touched-bugs.txt
standard_test_template.js
lib/lp/soyuz/model/publishing.py
database/schema/Makefile
lib/canonical/launchpad/scripts/tests/test_hwdb_submission_processing.py
scripts/script-monitor-nagios.py
lib/lp/bugs/model/tests/test_bug.py
lib/lp/code/browser/codeimport.py
lib/lp/app/javascript/privacy.js
lib/lp/soyuz/doc/gina.txt
lib/lp/app/templates/launchpad-notfound.pt
lib/canonical/launchpad/scripts/tests/test_hwdb_submission_parser.py
lib/lp/translations/stories/translationgroups/xx-translationgroups.txt
scripts/script-monitor.py
lib/lp/bugs/javascript/tests/test_async_comment_loading.js
lib/lp/code/model/diff.py
lib/lp/code/model/tests/test_sourcepackagerecipe.py
lib/lp/bugs/doc/checkwatches-cli-switches.txt
scripts/ftpmaster-tools/sync-source.py
lib/lp/soyuz/tests/test_sourcepackagerelease.py
lib/lp/translations/browser/pofile.py
lib/lp/testing/tests/test_pgsql.py
lib/lp/translations/tests/pofiletranslator.txt
lib/canonical/launchpad/webapp/ftests/test_adapter.txt
lib/lp/bugs/model/bug.py
lib/lp/bugs/doc/bug.txt
lib/lp/archiveuploader/tests/nascentupload-epoch-handling.txt
lib/lp/bugs/javascript/bugtask_index.js
lib/lp/services/mail/tests/test_dkim.py
lib/lp/bugs/doc/bugnotification-email.txt
database/replication/initialize.py
lib/lp/registry/stories/team-polls/create-polls.txt
lib/lp/hardwaredb/tests/test_hwdb_submission_validation.py
lib/lp/soyuz/doc/buildd-mass-retry.txt
lib/canonical/database/ftests/test_zopelesstransactionmanager.txt
lib/canonical/buildd/debian/changelog
lib/lp/bugs/browser/tests/test_bugtask.py
lib/lp/bugs/browser/bugtask.py
lib/lp/app/javascript/choiceedit/choiceedit.js
lib/lp/soyuz/scripts/tests/test_copypackage.py
lib/lp/services/mail/sendmail.py
lib/canonical/launchpad/doc/scripts-and-zcml.txt
scripts/process-one-mail.py
lib/lp/code/model/tests/test_branchcollection.py
lib/lp/code/interfaces/tests/test_branch.py
lib/canonical/launchpad/webapp/dbpolicy.py
utilities/report-database-stats.py
lib/lp/app/javascript/picker/picker.js
lib/canonical/database/ftests/script_isolation.py
database/schema/unautovacuumable.py
lib/lp/bugs/doc/bug-export.txt
lib/lp/app/browser/tests/test_vocabulary.py
lib/lp/app/browser/vocabulary.py
lib/canonical/librarian/libraryprotocol.py
lib/lp/code/interfaces/codeimport.py
lib/canonical/buildd/umount-chroot
lib/lp/archiveuploader/tests/nascentupload-closing-bugs.txt
utilities/check-templates.sh
lib/lp/codehosting/puller/worker.py
./lib/lp/codehosting/puller/tests/test_worker_formats.py
8: 'lp' imported but unused
./lib/lp/registry/stories/announcements/xx-announcements.txt
646: want exceeds 78 characters.
719: want exceeds 78 characters.
767: want exceeds 78 characters.
./lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt
145: want exceeds 78 characters.
389: want exceeds 78 characters.
391: want exceeds 78 characters.
401: want exceeds 78 characters.
403: want exceeds 78 characters.
./lib/lp/registry/browser/announcement.py
162: E251 no spaces around keyword / parameter equals
./cronscripts/librarian-gc.py
19: '_pythonpath' imported but unused
./scripts/close-account.py
52: local variable 'new_name' is assigned to but never used
64: local variable 'unknown_rationale' is assigned to but never used
107: local variable 'closed_question_status' is assigned to but never used
11: '_pythonpath' imported but unused
./database/schema/Makefile
56: Line exceeds 78 characters.
78: Line exceeds 78 characters.
95: Line exceeds 78 characters.
160: Line exceeds 78 characters.
164: Line exceeds 78 characters.
169: Line exceeds 78 characters.
182: Line exceeds 78 characters.
199: Line exceeds 78 characters.
./scripts/script-monitor-nagios.py
24: '_pythonpath' imported but unused
./lib/lp/soyuz/doc/gina.txt
162: want exceeds 78 characters.
179: want exceeds 78 characters.
223: want exceeds 78 characters.
236: want exceeds 78 characters.
242: want exceeds 78 characters.
./lib/canonical/launchpad/scripts/tests/test_hwdb_submission_parser.py
12: redefinition of unused 'etree' from line 10
./lib/lp/translations/stories/translationgroups/xx-translationgroups.txt
1119: source exceeds 78 characters.
1146: want exceeds 78 characters.
1159: want exceeds 78 characters.
./scripts/script-monitor.py
11: '_pythonpath' imported but unused
./scripts/ftpmaster-tools/sync-source.py
18: '_pythonpath' imported but unused
./lib/lp/bugs/doc/bugnotification-email.txt
214: want has trailing whitespace.
./database/replication/initialize.py
16: '_pythonpath' imported but unused
./lib/lp/registry/stories/team-polls/create-polls.txt
113: want exceeds 78 characters.
./lib/canonical/buildd/debian/changelog
226: Line exceeds 78 characters.
292: Line exceeds 78 characters.
482: Line exceeds 78 characters.
489: Line exceeds 78 characters.
510: Line exceeds 78 characters.
516: Line exceeds 78 characters.
522: Line exceeds 78 characters.
528: Line exceeds 78 characters.
535: Line exceeds 78 characters.
543: Line exceeds 78 characters.
550: Line exceeds 78 characters.
564: Line exceeds 78 characters.
570: Line exceeds 78 characters.
576: Line exceeds 78 characters.
584: Line exceeds 78 characters.
590: Line exceeds 78 characters.
597: Line exceeds 78 characters.
608: Line exceeds 78 characters.
617: Line exceeds 78 characters.
624: Line exceeds 78 characters.
630: Line exceeds 78 characters.
637: Line exceeds 78 characters.
643: Line exceeds 78 characters.
649: Line exceeds 78 characters.
656: Line exceeds 78 characters.
663: Line exceeds 78 characters.
670: Line exceeds 78 characters.
677: Line exceeds 78 characters.
683: Line exceeds 78 characters.
./lib/lp/app/javascript/choiceedit/choiceedit.js
26: Use the object literal notation {}.
38: 'ChoiceSource' has not been fully defined yet.
213: Expected '===' and instead saw '=='.
246: Expected '!==' and instead saw '!='.
356: 'ChoiceList' has not been fully defined yet.
459: Expected '===' and instead saw '=='.
499: Expected '===' and instead saw '=='.
602: 'NullChoiceSource' has not been fully defined yet.
650: Expected '===' and instead saw '=='.
./lib/lp/soyuz/scripts/tests/test_copypackage.py
1444: Line exceeds 78 characters.
1444: E501 line too long (82 characters)
./scripts/process-one-mail.py
8: '_pythonpath' imported but unused
./lib/lp/code/interfaces/tests/test_branch.py
8: 'lp' imported but unused
./utilities/report-database-stats.py
9: '_pythonpath' imported but unused
./database/schema/unautovacuumable.py
25: '_pythonpath' imported but unused
./lib/lp/bugs/doc/bug-export.txt
26: redefinition of unused 'ET' from line 24
./lib/lp/archiveuploader/tests/nascentupload-closing-bugs.txt
138: want exceeds 78 characters.
./lib/lp/codehosting/puller/worker.py
11: 'lp' imported but unused
--
https://code.launchpad.net/~jtv/launchpad/megalint-2/+merge/76937
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~jtv/launchpad/megalint-2 into lp:launchpad.
=== modified file 'cronscripts/librarian-gc.py'
--- cronscripts/librarian-gc.py 2011-09-24 03:56:19 +0000
+++ cronscripts/librarian-gc.py 2011-09-26 07:13:14 +0000
@@ -1,6 +1,6 @@
#!/usr/bin/python -S
#
-# Copyright 2009 Canonical Ltd. This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).
# pylint: disable-msg=C0103,W0403
@@ -14,8 +14,9 @@
__metaclass__ = type
+import logging
+
import _pythonpath
-import logging
from canonical.config import config
from canonical.launchpad.database.librarian import LibraryFileAlias
@@ -78,7 +79,8 @@
if not self.options.skip_expiry:
librariangc.expire_aliases(conn)
if not self.options.skip_content:
- librariangc.delete_unreferenced_content(conn) # first sweep
+ # First sweep.
+ librariangc.delete_unreferenced_content(conn)
if not self.options.skip_blobs:
librariangc.delete_expired_blobs(conn)
if not self.options.skip_duplicates:
@@ -86,7 +88,8 @@
if not self.options.skip_aliases:
librariangc.delete_unreferenced_aliases(conn)
if not self.options.skip_content:
- librariangc.delete_unreferenced_content(conn) # second sweep
+ # Second sweep.
+ librariangc.delete_unreferenced_content(conn)
if not self.options.skip_files:
librariangc.delete_unwanted_files(conn)
@@ -94,5 +97,9 @@
if __name__ == '__main__':
script = LibrarianGC('librarian-gc',
dbuser=config.librarian_gc.dbuser)
+<<<<<<< TREE
script.lock_and_run(isolation='autocommit')
+=======
+ script.lock_and_run(isolation=ISOLATION_LEVEL_AUTOCOMMIT)
+>>>>>>> MERGE-SOURCE
=== modified file 'database/replication/initialize.py'
--- database/replication/initialize.py 2011-09-06 05:00:26 +0000
+++ database/replication/initialize.py 2011-09-26 07:13:14 +0000
@@ -1,6 +1,6 @@
#!/usr/bin/python -S
#
-# Copyright 2009 Canonical Ltd. This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).
"""Initialize the cluster.
@@ -9,32 +9,42 @@
a replicated setup.
"""
-import _pythonpath
-
from optparse import OptionParser
import subprocess
import sys
+import _pythonpath
import helpers
from canonical.config import config
-from canonical.database.sqlbase import connect, ISOLATION_LEVEL_AUTOCOMMIT
from canonical.database.postgresql import (
- all_sequences_in_schema, all_tables_in_schema, ConnectionString
- )
+ all_sequences_in_schema,
+ all_tables_in_schema,
+ ConnectionString,
+ )
+from canonical.database.sqlbase import (
+ connect,
+ ISOLATION_LEVEL_AUTOCOMMIT,
+ )
from canonical.launchpad.scripts import (
- logger, logger_options, db_options
- )
+ db_options,
+ logger,
+ logger_options,
+ )
+
__metaclass__ = type
__all__ = []
-log = None # Global logger, initialized in main()
-
-options = None # Parsed command line options, initialized in main()
-
-cur = None # Shared database cursor to the master, initialized in main()
+# Global logger, initialized in main().
+log = None
+
+# Parsed command line options, initialized in main().
+options = None
+
+# Shared database cursor to the master, initialized in main().
+cur = None
def duplicate_schema():
@@ -83,7 +93,8 @@
def ensure_live():
log.info('Ensuring slon daemons are live and propagating events.')
- helpers.sync(120) # Will exit on failure.
+ # This will exit on failure.
+ helpers.sync(120)
def create_replication_sets(lpmain_tables, lpmain_sequences):
@@ -134,7 +145,8 @@
""")
helpers.execute_slonik('\n'.join(script), sync=600)
- helpers.validate_replication(cur) # Explode now if we have messed up.
+ # Explode now if we have messed up.
+ helpers.validate_replication(cur)
def main():
=== modified file 'database/schema/Makefile'
--- database/schema/Makefile 2011-07-21 09:53:54 +0000
+++ database/schema/Makefile 2011-09-26 07:13:14 +0000
@@ -1,4 +1,4 @@
-# Copyright 2010 Canonical Ltd. This software is licensed under the
+# Copyright 2010-2011 Canonical Ltd. This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).
#
# Quick hack makefile to (re)create the Launchpad database.
@@ -63,7 +63,7 @@
# to ensure that the development database remains in sync with reality
# on production. It is generated using newbaseline.py in
# bzr+ssh://devpad.canonical.com/code/stub/dbascripts
-#
+#
REV=2208
BASELINE=launchpad-${REV}-00-0.sql
MD5SUM=12a258f8651ae3bba0c96ec8e62be1dd launchpad-2208-00-0.sql
@@ -71,7 +71,7 @@
default: all
# Create a launchpad_ftest_template DB and load the test sample data into it.
-test: create
+test: create
@ echo "* Creating database \"$(TEMPLATE_WITH_TEST_SAMPLEDATA)\"."
@ ${CREATEDB} ${EMPTY_DBNAME} ${TEMPLATE_WITH_TEST_SAMPLEDATA}
@ echo "* Loading sample data"
@@ -110,7 +110,7 @@
# database schema, full text indexes and grants into it.
# It will also create session DBs for the test and dev environments.
# No sample data is added at this point.
-create:
+create:
@ echo "* If this fails you need to run as the postgresql superuser"
@ echo "* eg. sudo -u postgres make create"
@ echo
=== modified file 'database/schema/unautovacuumable.py'
--- database/schema/unautovacuumable.py 2011-09-06 05:00:26 +0000
+++ database/schema/unautovacuumable.py 2011-09-26 07:13:14 +0000
@@ -1,6 +1,6 @@
#!/usr/bin/python -S
#
-# Copyright 2009 Canonical Ltd. This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).
"""Disable autovacuum on all tables in the database and kill off
@@ -16,16 +16,23 @@
__metaclass__ = type
__all__ = []
-# pylint: disable-msg=W0403
-import _pythonpath
-
from distutils.version import LooseVersion
from optparse import OptionParser
import sys
import time
-from canonical.database.sqlbase import connect
-from canonical.launchpad.scripts import logger_options, db_options, logger
+# pylint: disable-msg=W0403
+import _pythonpath
+
+from canonical.database.sqlbase import (
+ connect,
+ ISOLATION_LEVEL_AUTOCOMMIT,
+ )
+from canonical.launchpad.scripts import (
+ db_options,
+ logger,
+ logger_options,
+ )
def main():
@@ -42,7 +49,7 @@
log.debug("Connecting")
con = connect()
- con.set_isolation_level(0) # Autocommit
+ con.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
cur = con.cursor()
cur.execute('show server_version')
=== modified file 'lib/canonical/buildd/debian/changelog'
--- lib/canonical/buildd/debian/changelog 2011-09-19 10:58:27 +0000
+++ lib/canonical/buildd/debian/changelog 2011-09-26 07:13:14 +0000
@@ -143,7 +143,7 @@
launchpad-buildd (62) hardy-cat; urgency=low
- * Make the buildds cope with not having a sourcepackagename LP#587109
+ * Make the buildds cope with not having a sourcepackagename LP#587109
-- LaMont Jones <lamont@xxxxxxxxxxxxx> Tue, 08 Jun 2010 13:02:31 -0600
@@ -325,7 +325,7 @@
launchpad-buildd (43) dapper-cat; urgency=low
* unpack-chroot: Move the ntpdate calls below the bunzip/exec bit,
- so we don't run ntpdate twice when unzipping tarballs, which
+ so we don't run ntpdate twice when unzipping tarballs, which
happens on every single build on Xen hosts (like the PPA hosts).
* debian/control: We use adduser in postinst, depending on it helps.
* debian/control: Set myself as the Maintainer, since I'm in here.
@@ -604,7 +604,7 @@
* If upgrading from < v12, will remove -A from sbuildargs and add in
a default ogrepath to any buildd configs found in /etc/launchpad-buildd
* Prevent launchpad-buildd init from starting ~ files
-
+
-- Daniel Silverstone <daniel.silverstone@xxxxxxxxxxxxx> Sun, 2 Oct 2005 23:20:08 +0100
launchpad-buildd (11) unstable; urgency=low
=== modified file 'lib/canonical/buildd/umount-chroot'
--- lib/canonical/buildd/umount-chroot 2011-09-19 10:58:27 +0000
+++ lib/canonical/buildd/umount-chroot 2011-09-26 07:13:14 +0000
@@ -1,6 +1,6 @@
#!/bin/sh
#
-# Copyright 2009 Canonical Ltd. This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd. This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).
# Buildd Slave tool to mount a chroot
@@ -26,7 +26,7 @@
# pass at umounting fails unless we reverse the list. Leave the while
# loop in just to handle pathological cases, too.
COUNT=0
-while $GREP "$HOME/build-$BUILDID/chroot-autobuild" /proc/mounts; do
+while $GREP "$HOME/build-$BUILDID/chroot-autobuild" /proc/mounts; do
COUNT=$(($COUNT+1))
if [ $COUNT -ge 20 ]; then
echo "failed to umount $HOME/build-$BUILDID/chroot-autobuild"
=== modified file 'lib/canonical/database/ftests/script_isolation.py'
=== added file 'lib/canonical/database/ftests/test_zopelesstransactionmanager.txt.OTHER'
--- lib/canonical/database/ftests/test_zopelesstransactionmanager.txt.OTHER 1970-01-01 00:00:00 +0000
+++ lib/canonical/database/ftests/test_zopelesstransactionmanager.txt.OTHER 2011-09-26 07:13:14 +0000
@@ -0,0 +1,60 @@
+ZopelessTransactionManager
+==========================
+
+The ZopelessTransactionManager used to be an alternative to the Zope
+transaction manager and SQLOS database adapter.
+
+With the move to Storm, it was converted to a small compatibility shim
+used to configure the database adapter.
+
+
+Initialization
+--------------
+
+The ZopelessTransactionManager is initialized with its initZopeless()
+class method.
+
+ >>> from canonical.database.sqlbase import ZopelessTransactionManager
+ >>> ZopelessTransactionManager.initZopeless(dbuser='launchpad_main')
+
+After initZopeless() has been called, the '_installed' attribute of
+ZopelessTransactionManager will be set to the transaction manager:
+
+ >>> ZopelessTransactionManager._installed is ZopelessTransactionManager
+ True
+
+The initZopeless() call defaults to read committed isolation:
+
+ >>> from canonical.database.sqlbase import cursor
+ >>> c = cursor()
+ >>> c.execute("SHOW transaction_isolation")
+ >>> print c.fetchone()[0]
+ read committed
+
+The uninstall() method can be used to uninstall the transaction manager:
+
+ >>> ZopelessTransactionManager.uninstall()
+ >>> print ZopelessTransactionManager._installed
+ None
+
+We can log in as alternative users with initZopeless():
+
+ >>> ZopelessTransactionManager.initZopeless(dbuser='testadmin')
+ >>> c = cursor()
+ >>> c.execute("SELECT current_user")
+ >>> print c.fetchone()[0]
+ testadmin
+
+ >>> ZopelessTransactionManager.uninstall()
+
+Or we can specify other transaction isolation modes:
+
+ >>> from canonical.database.sqlbase import ISOLATION_LEVEL_SERIALIZABLE
+ >>> ZopelessTransactionManager.initZopeless(
+ ... dbuser='librarian', isolation=ISOLATION_LEVEL_SERIALIZABLE)
+ >>> c = cursor()
+ >>> c.execute("SHOW transaction_isolation")
+ >>> print c.fetchone()[0]
+ serializable
+
+ >>> ZopelessTransactionManager.uninstall()
=== modified file 'lib/canonical/database/postgresql.py'
--- lib/canonical/database/postgresql.py 2011-09-07 00:24:46 +0000
+++ lib/canonical/database/postgresql.py 2011-09-26 07:13:14 +0000
@@ -85,7 +85,8 @@
for t in cur.fetchall():
# t == (src_table, src_column, dest_table, dest_column, upd, del)
- if t not in _state: # Avoid loops
+ # Avoid loops:
+ if t not in _state:
_state.append(t)
# Recurse, Locating references to the reference we just found.
listReferences(cur, t[0], t[1], _state)
@@ -93,6 +94,7 @@
# from the original (table, column), making it easier to change keys
return _state
+
def listUniques(cur, table, column):
'''Return a list of unique indexes on `table` that include the `column`
@@ -175,6 +177,7 @@
rv.append(tuple(keys))
return rv
+
def listSequences(cur):
"""Return a list of (schema, sequence, table, column) tuples.
@@ -206,7 +209,7 @@
for schema, sequence in list(cur.fetchall()):
match = re.search('^(\w+)_(\w+)_seq$', sequence)
if match is None:
- rv.append( (schema, sequence, None, None) )
+ rv.append((schema, sequence, None, None))
else:
table = match.group(1)
column = match.group(2)
@@ -225,11 +228,12 @@
cur.execute(sql, dict(schema=schema, table=table, column=column))
num = cur.fetchone()[0]
if num == 1:
- rv.append( (schema, sequence, table, column) )
+ rv.append((schema, sequence, table, column))
else:
- rv.append( (schema, sequence, None, None) )
+ rv.append((schema, sequence, None, None))
return rv
+
def generateResetSequencesSQL(cur):
"""Return SQL that will reset table sequences to match the data in them.
"""
@@ -257,6 +261,7 @@
else:
return ''
+
def resetSequences(cur):
"""Reset table sequences to match the data in them.
@@ -281,6 +286,7 @@
# Regular expression used to parse row count estimate from EXPLAIN output
_rows_re = re.compile("rows=(\d+)\swidth=")
+
def estimateRowCount(cur, query):
"""Ask the PostgreSQL query optimizer for an estimated rowcount.
@@ -422,9 +428,10 @@
cur.execute("SET enable_seqscan=%s" % permission_value)
+
def all_tables_in_schema(cur, schema):
"""Return a set of all tables in the given schema.
-
+
:returns: A set of quoted, fully qualified table names.
"""
cur.execute("""
@@ -574,4 +581,3 @@
for table, column in listReferences(cur, 'person', 'id'):
print '%32s %32s' % (table, column)
-
=== modified file 'lib/canonical/launchpad/browser/oauth.py'
--- lib/canonical/launchpad/browser/oauth.py 2011-09-18 18:00:21 +0000
+++ lib/canonical/launchpad/browser/oauth.py 2011-09-26 07:13:14 +0000
@@ -58,12 +58,13 @@
if include_secret:
structure['oauth_token_secret'] = token.secret
access_levels = [{
- 'value' : permission.name,
- 'title' : permission.title
+ 'value': permission.name,
+ 'title': permission.title,
}
for permission in permissions]
structure['access_levels'] = access_levels
- self.request.response.setHeader('Content-Type', HTTPResource.JSON_TYPE)
+ self.request.response.setHeader(
+ 'Content-Type', HTTPResource.JSON_TYPE)
return simplejson.dumps(structure)
@@ -105,6 +106,7 @@
return u'oauth_token=%s&oauth_token_secret=%s' % (
token.key, token.secret)
+
def token_exists_and_is_not_reviewed(form, action):
return form.token is not None and not form.token.is_reviewed
@@ -122,14 +124,14 @@
WEEK = "Week"
DURATION = {
- HOUR : 60 * 60,
- DAY : 60 * 60 * 24,
- WEEK : 60 * 60 * 24 * 7
+ HOUR: 60 * 60,
+ DAY: 60 * 60 * 24,
+ WEEK: 60 * 60 * 24 * 7,
}
def create_oauth_permission_actions():
- """Return two `Actions` objects containing each possible `OAuthPermission`.
+ """Make two `Actions` objects containing each possible `OAuthPermission`.
The first `Actions` object contains every action supported by the
OAuthAuthorizeTokenView. The second list contains a good default
@@ -422,7 +424,8 @@
if not token.is_reviewed:
self.request.unauthorized(OAUTH_CHALLENGE)
- return u'Request token has not yet been reviewed. Try again later.'
+ return (
+ u"Request token has not yet been reviewed. Try again later.")
if token.permission == OAuthPermission.UNAUTHORIZED:
# The end-user explicitly refused to authorize this
=== modified file 'lib/canonical/launchpad/doc/canonical-config.txt'
--- lib/canonical/launchpad/doc/canonical-config.txt 2011-09-05 15:42:27 +0000
+++ lib/canonical/launchpad/doc/canonical-config.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= Canonical Config =
+Canonical Config
+================
`canonical.config` provides singleton access to the Launchpad
configuration, accessed via the `config` module global. It is
@@ -6,12 +7,13 @@
correct config.
-== CanonicalConfig AKA config ==
+CanonicalConfig AKA config
+--------------------------
-CanonicalConfig is a singleton that manages access to the config.
-Cached copies are kept in thread locals ensuring the configuration
-is thread safe (not that this will be a problem if we stick with
-simple configuration).
+CanonicalConfig is a singleton that manages access to the config. Cached
+copies are kept in thread locals ensuring the configuration is thread
+safe (not that this will be a problem if we stick with simple
+configuration).
>>> from canonical.config import config
>>> from canonical.testing.layers import DatabaseLayer
@@ -19,10 +21,13 @@
... 'dbname=%s host=localhost' % DatabaseLayer._db_fixture.dbname)
>>> expected == config.database.rw_main_master
True
+
>>> config.database.db_statement_timeout is None
True
+
>>> config.launchpad.dbuser
'launchpad_main'
+
>>> config.librarian.dbuser
'librarian'
@@ -40,17 +45,19 @@
otherwise it loads launchpad-lazr.conf. The general rule is
configs/<instance_name>/<process_name>.conf. The testrunner sets the
instance_name to 'testrunner' and additionally uses unique temporary
-configs to permit changing the config during tests (but not if we
-are using persistent helpers - see canonical.testing.layers).
+configs to permit changing the config during tests (but not if we are
+using persistent helpers - see canonical.testing.layers).
>>> (config.instance_name == 'testrunner' or
- ... config.instance_name == os.environ['LPCONFIG'])
+ ... config.instance_name == os.environ['LPCONFIG'])
True
+
>>> config.process_name
'test'
>>> config.filename
'.../launchpad-lazr.conf'
+
>>> config.extends.filename
'.../launchpad-lazr.conf'
@@ -69,6 +76,7 @@
>>> example_path = canonical.config.__file__
>>> config.root in example_path
True
+
>>> example_path[len(config.root):]
'/lib/canonical/...'
@@ -77,15 +85,19 @@
>>> config.devmode
True
+
>>> len(config.servers)
4
+
>>> config.threads
4
-== Working with test configurations ==
-
-Tests can update the config with test data. For example, the domain
-can be changed for a feature.
+
+Working with test configurations
+--------------------------------
+
+Tests can update the config with test data. For example, the domain can
+be changed for a feature.
>>> test_data = ("""
... [answertracker]
@@ -98,18 +110,19 @@
>>> config.pop('test_data')
(<lazr.config...ConfigData ...>,)
+
>>> config.answertracker.email_domain
'answers.launchpad.net'
-== Selecting the conf file with instance and process names ==
+Selecting the conf file with instance and process names
+-------------------------------------------------------
-The name of the conf file, and the directory from which is resides,
-is controlled by the config's process_name and instance_name. These
-may be set by their corresponding methods, *before* accessing the
-config, to set where the config values are loaded from. After the
-config is loaded, changing the instance and process names will have
-no affect.
+The name of the conf file, and the directory from which is resides, is
+controlled by the config's process_name and instance_name. These may be
+set by their corresponding methods, *before* accessing the config, to
+set where the config values are loaded from. After the config is loaded,
+changing the instance and process names will have no affect.
Setting just the instance_name will change the directory from which the
conf file is loaded.
@@ -122,6 +135,7 @@
>>> test_config.filename
'.../configs/development/launchpad-lazr.conf'
+
>>> test_config.extends.filename
'.../config/schema-lazr.conf'
@@ -144,14 +158,15 @@
>>> test_config.filename
'.../configs/testrunner/test-process-lazr.conf'
+
>>> test_config.extends.filename
'.../configs/testrunner/launchpad-lazr.conf'
>>> test_config.answertracker.days_before_expiration
300
-The default 'launchpad-lazr.conf' is loaded if no conf files match
-the process's name.
+The default 'launchpad-lazr.conf' is loaded if no conf files match the
+process's name.
>>> test_config.setInstance('testrunner')
>>> test_config.instance_name
@@ -163,20 +178,19 @@
>>> test_config.filename
'.../configs/testrunner/launchpad-lazr.conf'
+
>>> test_config.extends.filename
'.../configs/development/launchpad-lazr.conf'
>>> test_config.answertracker.days_before_expiration
15
- >>> # setInstance sets the LPCONFIG environment variable, so set it
- >>> # back to the real value.
>>> config.setInstance(config.instance_name)
The initial instance_name is set via the LPCONFIG environment variable.
Because Config is designed to failover to the default development
-environment, and the testrunner overrides the environment and config,
-we need to reconfigure the environment and reload the canonical.config
+environment, and the testrunner overrides the environment and config, we
+need to reconfigure the environment and reload the canonical.config
module to test CanonicalConfig's behaviour.
Alternatively, the instance name and process name can be specified as
@@ -185,6 +199,7 @@
>>> dev_config = CanonicalConfig('development', 'authserver')
>>> dev_config.instance_name
'development'
+
>>> dev_config.process_name
'authserver'
@@ -193,7 +208,6 @@
# XXX sinzui 2008-03-25 bug=78545: This cannot be tested until the
# config can be restored when this test is torn down.
-
# >>> true_config = config
# >>> import os
# >>> from canonical.config import LPCONFIG, DEFAULT_SECTION
@@ -203,7 +217,6 @@
# # reload the CanonicalConfig class object.
# >>> config_module = reload(canonical.config)
# >>> from canonical.config import config
-
# >>> config.filename
# '.../configs/mailman-itests/launchpad-lazr.conf'
# >>> config.extends.filename
@@ -219,7 +232,7 @@
# <SectionValue for canonical 'default'>
# >>> config._cache.testrunner
# Traceback (most recent call last):
-# ...
+# ...
# AttributeError: 'zope.thread.local' object has no attribute 'testrunner'
#We need to reset the config for the testrunner.
=== modified file 'lib/canonical/launchpad/doc/scripts-and-zcml.txt'
--- lib/canonical/launchpad/doc/scripts-and-zcml.txt 2011-09-18 07:01:40 +0000
+++ lib/canonical/launchpad/doc/scripts-and-zcml.txt 2011-09-26 07:13:14 +0000
@@ -1,39 +1,42 @@
Scripts and ZCML
----------------
-The full Zope component architecture is available from scripts so long as they
-call `execute_zcml_for_scripts()`.
-
-Let's make a simple example script that uses getUtility and the database to
-demonstrate this:
-
- >>> import tempfile, subprocess, os, sys
- >>> script_file = tempfile.NamedTemporaryFile()
- >>> script_file.write("""
- ... from canonical.launchpad.scripts import execute_zcml_for_scripts
- ... from lp.registry.interfaces.person import IPersonSet
- ... from zope.component import getUtility
- ...
- ... execute_zcml_for_scripts()
- ... print getUtility(IPersonSet).get(1).displayname
- ... """)
- >>> script_file.flush()
+The full Zope component architecture is available from scripts so long
+as they call `execute_zcml_for_scripts()`.
+
+Let's make a simple example script that uses getUtility and the database
+to demonstrate this:
+
+ >>> import os
+ >>> import subprocess
+ >>> import tempfile
+ >>> from textwrap import dedent
+ >>> script_file = tempfile.NamedTemporaryFile()
+ >>> script_file.write(dedent("""\
+ ... from canonical.launchpad.scripts import execute_zcml_for_scripts
+ ... from lp.registry.interfaces.person import IPersonSet
+ ... from zope.component import getUtility
+ ...
+ ... execute_zcml_for_scripts()
+ ... print getUtility(IPersonSet).get(1).displayname
+ ... """))
+ >>> script_file.flush()
Run the script (making sure it uses the testrunner configuration).
- >>> from canonical.config import config
- >>> bin_py = os.path.join(config.root, 'bin', 'py')
- >>> proc = subprocess.Popen([bin_py, script_file.name],
- ... stdout=subprocess.PIPE, stderr=None)
+ >>> from canonical.config import config
+ >>> bin_py = os.path.join(config.root, 'bin', 'py')
+ >>> proc = subprocess.Popen(
+ ... [bin_py, script_file.name], stdout=subprocess.PIPE, stderr=None)
Check that we get the expected output.
- >>> print proc.stdout.read()
- Mark Shuttleworth
- >>> print proc.wait()
- 0
+ >>> print proc.stdout.read()
+ Mark Shuttleworth
+
+ >>> print proc.wait()
+ 0
Remove the temporary file.
- >>> script_file.close()
-
+ >>> script_file.close()
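The doctest rewrite above switches to `textwrap.dedent` so the embedded script source stays readable without leaking indentation into the temporary file. The same pattern, sketched outside the diff in modern Python rather than the branch's Python 2 (the printed message is an arbitrary stand-in, not Launchpad output):

```python
import subprocess
import sys
import tempfile
from textwrap import dedent

# Write a small throwaway script to a temporary file; dedent() strips
# the common leading whitespace from the triple-quoted source.
script = tempfile.NamedTemporaryFile(mode='w', suffix='.py')
script.write(dedent("""\
    print("hello from the generated script")
    """))
script.flush()  # make sure the interpreter sees the full contents

# Run it with the current interpreter and capture stdout.
proc = subprocess.Popen(
    [sys.executable, script.name], stdout=subprocess.PIPE)
output = proc.stdout.read().decode()
exit_code = proc.wait()
script.close()  # the NamedTemporaryFile is deleted on close

print(output.strip())  # hello from the generated script
print(exit_code)       # 0
```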
=== modified file 'lib/canonical/launchpad/pagetitles.py'
--- lib/canonical/launchpad/pagetitles.py 2011-09-18 19:25:11 +0000
+++ lib/canonical/launchpad/pagetitles.py 2011-09-26 07:13:14 +0000
@@ -118,6 +118,7 @@
branch_index = ContextDisplayName(smartquote('"%s" branch in Launchpad'))
+
def branchmergeproposal_index(context, view):
return 'Proposal to merge %s' % context.source_branch.bzr_identity
@@ -144,10 +145,12 @@
buglinktarget_unlinkbugs = 'Remove links to bug reports'
+
def buglisting_embedded_advanced_search(context, view):
"""Return the view's page heading."""
return view.getSearchPageHeading()
+
def bugnomination_edit(context, view):
"""Return the title for the page to manage bug nominations."""
return 'Manage nomination for bug #%d in %s' % (
@@ -178,6 +181,7 @@
codeimport_machines = ViewLabel()
+
def codeimport_machine_index(context, view):
return smartquote('Code Import machine "%s"' % context.hostname)
=== modified file 'lib/canonical/launchpad/scripts/tests/test_hwdb_submission_parser.py'
--- lib/canonical/launchpad/scripts/tests/test_hwdb_submission_parser.py 2011-09-01 13:43:32 +0000
+++ lib/canonical/launchpad/scripts/tests/test_hwdb_submission_parser.py 2011-09-26 07:13:14 +0000
@@ -493,6 +493,7 @@
def testDevice(self):
"""A device node is converted into a dictionary."""
test = self
+
def _parseProperties(self, node):
test.assertTrue(isinstance(self, SubmissionParser))
test.assertEqual(node.tag, 'device')
@@ -533,6 +534,7 @@
def testHal(self):
"""The <hal> node is converted into a Python dict."""
test = self
+
def _parseDevice(self, node):
test.assertTrue(isinstance(self, SubmissionParser))
test.assertEqual(node.tag, 'device')
@@ -559,6 +561,7 @@
The list elements represent the <processor> nodes.
"""
test = self
+
def _parseProperties(self, node):
test.assertTrue(isinstance(self, SubmissionParser))
test.assertEqual(node.tag, 'processor')
@@ -2256,7 +2259,6 @@
"set(['model']) "
"'/devices/pci0000:00/0000:00:1f.1/host4/target4:0:0/4:0:0:0'")
-
class UdevTestSubmissionParser(SubmissionParser):
"""A variant of SubmissionParser that shortcuts udev related tests.
=== modified file 'lib/canonical/launchpad/scripts/tests/test_hwdb_submission_processing.py'
--- lib/canonical/launchpad/scripts/tests/test_hwdb_submission_processing.py 2011-09-01 14:55:57 +0000
+++ lib/canonical/launchpad/scripts/tests/test_hwdb_submission_processing.py 2011-09-26 07:13:14 +0000
@@ -54,6 +54,16 @@
)
+def evaluate_property(value):
+ """Evaluate a property.
+
+ This function does nothing in itself; passing it a property evaluates the
+ property. But it lets the code express that the evaluation is all that's
+ needed, without assigning to an unused variable etc.
+ """
+ return value
+
+
class TestCaseHWDB(TestCase):
"""Common base class for HWDB processing tests."""
@@ -482,7 +492,7 @@
# The warning appears only once per submission, even if the
# property kernel_package_name is accessed more than once.
num_warnings = len(self.handler.records)
- test = parser.kernel_package_name
+ evaluate_property(parser.kernel_package_name)
self.assertEqual(
num_warnings, len(self.handler.records),
'Warning for missing HAL property system.kernel.version '
@@ -524,7 +534,7 @@
# The warning appears only once per submission, even if the
# property kernel_package_name is accessed more than once.
num_warnings = len(self.handler.records)
- test = parser.kernel_package_name
+ evaluate_property(parser.kernel_package_name)
self.assertEqual(
num_warnings, len(self.handler.records),
'Warning for missing HAL property system.kernel.version '
@@ -637,7 +647,7 @@
# The warning appears only once per submission, even if the
# property kernel_package_name is accessed more than once.
num_warnings = len(self.handler.records)
- test = parser.kernel_package_name
+ evaluate_property(parser.kernel_package_name)
self.assertEqual(
num_warnings, len(self.handler.records),
'Warning for missing HAL property system.kernel.version '
@@ -675,7 +685,7 @@
# The warning appears only once per submission, even if the
# property kernel_package_name is accessed more than once.
num_warnings = len(self.handler.records)
- test = parser.kernel_package_name
+ evaluate_property(parser.kernel_package_name)
self.assertEqual(
num_warnings, len(self.handler.records),
'Warning for missing HAL property system.kernel.version '
@@ -925,7 +935,6 @@
'Unexpected value of HALDevice.raw_bus for '
'HAL property info.bus.')
-
def test_HALDevice_scsi_controller_usb_storage_device(self):
"""test of HALDevice.scsi_controller.
@@ -1309,7 +1318,7 @@
(1, HWBus.IDE),
(2, HWBus.FLOPPY),
(3, HWBus.IPI),
- (4, None), # subclass RAID is ignored.
+ (4, None), # subclass RAID is ignored.
(5, HWBus.ATA),
(6, HWBus.SATA),
(7, HWBus.SAS),
@@ -1694,7 +1703,6 @@
'scsi_generic', 'scsi_host', 'sound', 'ssb', 'tty', 'usb'
or 'video4linux'.
"""
- UDI_SSB = '/org/freedesktop/Hal/devices/ssb__null__0'
devices = [
{
'id': 1,
@@ -1762,12 +1770,12 @@
}
pci_subclass_bus = (
- (0, True), # a real SCSI controller
- (1, True), # an IDE device
- (4, False), # subclass RAID is ignored.
- (5, True), # an ATA device
- (6, True), # a SATA device
- (7, True), # a SAS device
+ (0, True), # a real SCSI controller
+ (1, True), # an IDE device
+ (4, False), # subclass RAID is ignored.
+ (5, True), # an ATA device
+ (6, True), # a SATA device
+ (7, True), # a SAS device
)
parser = SubmissionParser(self.log)
@@ -2095,11 +2103,11 @@
},
}
parser = SubmissionParser(self.log)
- properties = devices[0]['properties']
parser.buildHalDeviceList(parsed_data)
device = parser.devices[self.UDI_COMPUTER]
- self.failUnless(device.has_reliable_data,
- 'Root device not treated as having reliable data.')
+ self.failUnless(
+ device.has_reliable_data,
+ "Root device not treated as having reliable data.")
def testHasReliableDataForInsuffientData(self):
"""Test of HALDevice.has_reliable_data, insufficent device data.
@@ -3164,7 +3172,7 @@
parser.submission_key,
'USB device found with vendor ID==0, product ID==0, where the '
'parent device does not look like a USB host controller: '
- + self.UDI_USB_CONTROLLER_USB_SIDE)
+ + self.UDI_USB_CONTROLLER_USB_SIDE)
self.renameInfoBusToInfoSubsystem()
parser.buildHalDeviceList(self.parsed_data)
@@ -3177,7 +3185,7 @@
parser.submission_key,
'USB device found with vendor ID==0, product ID==0, where the '
'parent device does not look like a USB host controller: '
- + self.UDI_USB_CONTROLLER_USB_SIDE)
+ + self.UDI_USB_CONTROLLER_USB_SIDE)
def testUSBHostControllerUnexpectedParentBus(self):
"""Test of HALDevice.is_real_device: info.bus == 'usb_device'.
@@ -4094,7 +4102,6 @@
for kwargs in device_data:
device = UdevDevice(parser, **kwargs)
devices[device.device_id] = device
- parent = device
# Build the parent-child relations so that the parent device
# is that device which has the longest path matching the
@@ -4111,7 +4118,7 @@
device_paths = sorted(devices, key=len, reverse=True)
for path_index, path in enumerate(device_paths):
- for parent_path in device_paths[path_index+1:]:
+ for parent_path in device_paths[path_index + 1:]:
if path.startswith(parent_path):
devices[parent_path].addChild(devices[path])
break
@@ -4366,7 +4373,6 @@
"<DBItem HWBus.SYSTEM, (0) System> None 'LIFEBOOK E8210' "
"'LIFEBOOK E8210' /devices/LNXSYSTM:00")
-
def test_has_reliable_data_system_no_product_name(self):
"""Test of UdevDevice.has_reliable_data for a system.
@@ -5103,7 +5109,6 @@
def assertSampleDeviceCreated(
self, bus, vendor_id, product_id, driver_name, submission):
"""Assert that data for the device exists in HWDB tables."""
- device_set = getUtility(IHWDeviceSet)
device = getUtility(IHWDeviceSet).getByDeviceID(
bus, vendor_id, product_id)
self.assertNotEqual(
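The `evaluate_property` helper added above replaces assignments like `test = parser.kernel_package_name`, where the point was never the value but the side effect of evaluating the property. A minimal sketch of why that matters in a warn-only-once test (the `Parser` class here is a hypothetical stand-in, not the real SubmissionParser):

```python
class Parser(object):
    """Hypothetical stand-in: logs a warning on first property access."""
    def __init__(self):
        self.warnings = []
        self._warned = False

    @property
    def kernel_package_name(self):
        if not self._warned:
            self.warnings.append('missing kernel version')
            self._warned = True
        return None


def evaluate_property(value):
    """Do nothing; passing a property access here evaluates it.

    The function's name documents that the evaluation is all that's
    needed, without assigning to an unused variable.
    """
    return value


parser = Parser()
evaluate_property(parser.kernel_package_name)  # first access warns
num_warnings = len(parser.warnings)
evaluate_property(parser.kernel_package_name)  # second access must not
assert len(parser.warnings) == num_warnings
print(len(parser.warnings))  # 1
```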
=== modified file 'lib/canonical/launchpad/tests/test_login.py'
--- lib/canonical/launchpad/tests/test_login.py 2011-08-31 09:57:35 +0000
+++ lib/canonical/launchpad/tests/test_login.py 2011-09-26 07:13:14 +0000
@@ -44,7 +44,7 @@
# person and account created later don't end up with the same IDs,
# which could happen since they're both sequential.
# We need them to be different for one of our tests here.
- dummy_account = getUtility(IAccountSet).new(
+ getUtility(IAccountSet).new(
AccountCreationRationale.UNKNOWN, 'Dummy name')
person = self.factory.makePerson('foo.bar@xxxxxxxxxxx')
self.failIfEqual(person.id, person.account.id)
=== modified file 'lib/canonical/launchpad/webapp/dbpolicy.py'
--- lib/canonical/launchpad/webapp/dbpolicy.py 2011-09-19 13:23:38 +0000
+++ lib/canonical/launchpad/webapp/dbpolicy.py 2011-09-26 07:13:14 +0000
@@ -335,7 +335,7 @@
logging.error(
"No data in DatabaseReplicationLag for node %d"
% slave_node_id)
- return timedelta(days=999) # A long, long time.
+ return timedelta(days=999)
return lag[0]
=== modified file 'lib/canonical/launchpad/webapp/ftests/test_adapter.txt'
--- lib/canonical/launchpad/webapp/ftests/test_adapter.txt 2011-09-20 07:47:13 +0000
+++ lib/canonical/launchpad/webapp/ftests/test_adapter.txt 2011-09-26 07:13:14 +0000
@@ -414,7 +414,8 @@
>>> clear_request_started()
-== Switching Database Users ==
+Switching Database Users
+========================
The Launchpad store uses canonical.config to configure its
connection. This can be adjusted by choosing a different database
=== modified file 'lib/canonical/librarian/libraryprotocol.py'
--- lib/canonical/librarian/libraryprotocol.py 2011-09-18 06:53:21 +0000
+++ lib/canonical/librarian/libraryprotocol.py 2011-09-26 07:13:14 +0000
@@ -5,7 +5,6 @@
from datetime import datetime
from pytz import utc
-import sys
from twisted.internet import protocol
from twisted.internet.threads import deferToThread
@@ -40,10 +39,10 @@
Recognised headers are:
:Content-Type: a mime-type to associate with the file
- :File-Content-ID: if specified, the integer file id for this file. If not
- specified, the server will generate one.
- :File-Alias-ID: if specified, the integer file alias id for this file. If
- not specified, the server will generate one.
+ :File-Content-ID: if specified, the integer file id for this file.
+ If not specified, the server will generate one.
+ :File-Alias-ID: if specified, the integer file alias id for this file.
+ If not specified, the server will generate one.
:File-Expires: if specified, the expiry time of this alias in ISO 8601
format. As per LibrarianGarbageCollection.
:Database-Name: if specified, the name of the database the client is
@@ -55,9 +54,9 @@
Unrecognised headers will be ignored.
- If something goes wrong, the server will reply with a 400 (bad request, i.e.
- client error) or 500 (internal server error) response codes instead, and an
- appropriate message.
+ If something goes wrong, the server will reply with a 400 (bad request,
+ i.e. client error) or 500 (internal server error) response codes instead,
+ and an appropriate message.
Once the server has replied, the client may re-use the connection as if it
were just established to start a new upload.
@@ -209,12 +208,13 @@
self.newFile.append(realdata)
if self.bytesLeft == 0:
- # Store file
+ # Store file.
deferred = self._storeFile()
+
def _sendID((fileID, aliasID)):
- # Send ID to client
+ # Send ID to client.
if self.newFile.contentID is None:
- # Respond with deprecated server-generated IDs
+ # Respond with deprecated server-generated IDs.
self.sendLine('200 %s/%s' % (fileID, aliasID))
else:
self.sendLine('200')
@@ -224,7 +224,7 @@
deferred.addErrback(self.protocolErrors)
deferred.addErrback(self.unknownError)
- # Treat remaining bytes (if any) as a new command
+ # Treat remaining bytes (if any) as a new command.
self.state = 'command'
self.setLineMode(rest)
@@ -240,5 +240,6 @@
class FileUploadFactory(protocol.Factory):
protocol = FileUploadProtocol
+
def __init__(self, fileLibrary):
self.fileLibrary = fileLibrary
=== modified file 'lib/lp/app/browser/launchpadform.py'
--- lib/lp/app/browser/launchpadform.py 2011-08-30 11:07:14 +0000
+++ lib/lp/app/browser/launchpadform.py 2011-09-26 07:13:14 +0000
@@ -116,7 +116,8 @@
self.setUpWidgets()
data = {}
- errors, action = form.handleSubmit(self.actions, data, self._validate)
+ errors, form_action = form.handleSubmit(
+ self.actions, data, self._validate)
# no action selected, so return
- if action is None:
+ if form_action is None:
@@ -240,8 +241,8 @@
If False is returned, the view or template probably needs to explain
why no actions can be performed and offer a cancel link.
"""
- for action in self.actions:
- if action.available():
+ for form_action in self.actions:
+ if form_action.available():
return True
return False
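The launchpadform.py renames swap `action` for `form_action` so the local variable no longer shadows a name already bound at module level (the module works with form-action helpers of that name). A toy illustration of the hazard, with made-up names:

```python
# A module-level `action` exists, as with the form-action helpers the
# real module uses; this value is a hypothetical stand-in.
action = "module-level action helper"


class FakeAction(object):
    """Hypothetical stand-in for a form action."""
    def __init__(self, available):
        self._available = available

    def available(self):
        return self._available


def has_available_actions(actions):
    # Naming the loop variable `form_action` keeps it distinct from the
    # module-level `action`, so later references to `action` cannot
    # silently pick up a leftover loop value.
    for form_action in actions:
        if form_action.available():
            return True
    return False


print(has_available_actions([FakeAction(False), FakeAction(True)]))  # True
print(action)  # still the module-level value
```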
=== modified file 'lib/lp/app/browser/tests/test_vocabulary.py'
--- lib/lp/app/browser/tests/test_vocabulary.py 2011-09-20 20:33:47 +0000
+++ lib/lp/app/browser/tests/test_vocabulary.py 2011-09-26 07:13:14 +0000
@@ -224,6 +224,7 @@
distroarchseries=archseries)
self.assertEqual("fnord", self.getPickerEntry(dsp).description)
+
class TestProductPickerEntrySourceAdapter(TestCaseWithFactory):
layer = DatabaseFunctionalLayer
@@ -328,6 +329,7 @@
self.assertEqual(
expected_details, entry.details[0])
+
class TestDistributionPickerEntrySourceAdapter(TestCaseWithFactory):
layer = DatabaseFunctionalLayer
@@ -370,9 +372,10 @@
'distribution', self.getPickerEntry(distribution).target_type)
def test_distribution_truncates_summary(self):
- summary = ("This is a deliberately, overly long summary. It goes on"
- "and on and on so as to break things up a good bit.")
- distribution= self.factory.makeDistribution(summary=summary)
+ summary = (
+ "This is a deliberately, overly long summary. It goes on "
+ "and on and on so as to break things up a good bit.")
+ distribution = self.factory.makeDistribution(summary=summary)
index = summary.rfind(' ', 0, 45)
expected_summary = summary[:index + 1]
expected_details = summary[index:]
@@ -382,6 +385,7 @@
self.assertEqual(
expected_details, entry.details[0])
+
class TestPersonVocabulary:
implements(IHugeVocabulary)
test_persons = []
=== modified file 'lib/lp/app/browser/vocabulary.py'
--- lib/lp/app/browser/vocabulary.py 2011-09-20 20:34:30 +0000
+++ lib/lp/app/browser/vocabulary.py 2011-09-26 07:13:14 +0000
@@ -260,7 +260,7 @@
picker_entry.details = []
summary = picker_entry.description
if len(summary) > 45:
- index = summary.rfind(' ', 0, 45)
+ index = summary.rfind(' ', 0, 45)
first_line = summary[0:index + 1]
second_line = summary[index:]
else:
@@ -268,15 +268,15 @@
second_line = ''
if len(second_line) > 90:
- index = second_line.rfind(' ', 0, 90)
- second_line = second_line[0:index+1]
+ index = second_line.rfind(' ', 0, 90)
+ second_line = second_line[:index + 1]
picker_entry.description = first_line
picker_entry.details.append(second_line)
picker_entry.alt_title = target.name
picker_entry.target_type = self.target_type
maintainer = self.getMaintainer(target)
if maintainer is not None:
- picker_entry.details.append(
+ picker_entry.details.append(
'Maintainer: %s' % self.getMaintainer(target))
return entries
=== modified file 'lib/lp/app/javascript/choiceedit/choiceedit.js'
--- lib/lp/app/javascript/choiceedit/choiceedit.js 2011-08-31 21:34:40 +0000
+++ lib/lp/app/javascript/choiceedit/choiceedit.js 2011-09-26 07:13:14 +0000
@@ -43,8 +43,10 @@
ChoiceSource.NAME = CHOICESOURCE;
/**
- * Dictionary of selectors to define subparts of the widget that we care about.
- * YUI calls ATTRS.set(foo) for each foo defined here
+ * Dictionary of selectors to define subparts of the widget that we care
+ * about.
+ *
+ * YUI calls ATTRS.set(foo) for each foo defined here.
*
* @property InlineEditor.HTML_PARSER
* @type Object
@@ -106,7 +108,9 @@
/**
* Y.Node (img) displaying the editicon, which is exchanged for a spinner
- * while saving happens. Should be automatically calculated by HTML_PARSER.
+ * while saving happens. Should be automatically calculated by
+ * HTML_PARSER.
+ *
* Setter function returns Y.one(parameter) so that you can pass
* either a Node (as expected) or a selector.
*
@@ -176,10 +180,11 @@
*/
_bindUIChoiceSource: function() {
var that = this;
+ var clickable_element;
if (this.get('clickable_content')) {
- var clickable_element = this.get('contentBox');
+ clickable_element = this.get('contentBox');
} else {
- var clickable_element = this.get('editicon');
+ clickable_element = this.get('editicon');
}
clickable_element.on("click", this.onClick, this);
@@ -203,7 +208,8 @@
var items = this.get("items");
var value = this.get("value");
var node = this.get("value_location");
- for (var i=0; i<items.length; i++) {
+ var i;
+ for (i = 0; i < items.length; i++) {
if (items[i].value == value) {
node.set("innerHTML", items[i].source_name || items[i].name);
}
@@ -445,7 +451,8 @@
var items = this.get("items");
var value = this.get("value");
var li;
- for (var i=0; i<items.length; i++) {
+ var i;
+ for (i = 0; i < items.length; i++) {
if (items[i].disabled) {
li = Y.Node.create('<li><span class="disabled">' +
items[i].name + '</span></li>');
@@ -487,7 +494,8 @@
var target = e.currentTarget;
var value = target._value;
var items = that.get("items");
- for (var i=0; i<items.length; i++) {
+ var i;
+ for (i = 0; i < items.length; i++) {
if (items[i].value == value) {
that.fire("valueChosen", items[i].value);
that.destroy();
@@ -524,6 +532,8 @@
*/
_positionCorrectly: function(e) {
var boundingBox = this.get('boundingBox');
+ var client_width = document.body.clientWidth;
+ var offset_width = boundingBox.get('offsetWidth');
var selectedListItem = boundingBox.one('span.current');
valueX = this._mouseX - (boundingBox.get('offsetWidth') / 2);
var valueY;
@@ -535,14 +545,12 @@
} else {
valueY = this._mouseY - (boundingBox.get('offsetHeight') / 2);
}
+ if (valueX > client_width - offset_width) {
+ valueX = client_width - offset_width;
+ }
if (valueX < 0) {
valueX = 0;
}
- if ((valueX >
- document.body.clientWidth - boundingBox.get('offsetWidth')) &&
- (document.body.clientWidth > boundingBox.get('offsetWidth'))) {
- valueX = document.body.clientWidth - boundingBox.get('offsetWidth');
- }
if (valueY < 0) {
valueY = 0;
}
@@ -557,7 +565,7 @@
},
/**
- * Return the absolute position of any node
+ * Return the absolute position of any node.
*
* @private
* @method _findPosition
@@ -569,7 +577,8 @@
do {
curleft += obj.get("offsetLeft");
curtop += obj.get("offsetTop");
- } while ((obj = obj.get("offsetParent")));
+ obj = obj.get("offsetParent");
+ } while (Y.Lang.isValue(obj));
}
return [curleft,curtop];
}
@@ -635,7 +644,8 @@
return (Y.Lang.isValue(item.value));
});
}
- for (var i = 0; i < v.length; i++) {
+ var i;
+ for (i = 0; i < v.length; i++) {
if (!Y.Lang.isValue(v[i].value) &&
v[i].name.indexOf('<img') == -1) {
// Only append the icon if the value for this item is
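The `_positionCorrectly` rewrite above replaces a compound condition with two sequential clamps: pull `valueX` down to the right-hand limit first, then up to zero. Ordering the clamps this way means a widget wider than the viewport ends up pinned at zero rather than at a negative offset, with no explicit width comparison needed. Sketched in Python, with names mirroring the JavaScript:

```python
def clamp_x(value_x, client_width, offset_width):
    """Clamp a popup's x position into the viewport.

    The upper bound is applied first and the lower bound second, so
    when the widget is wider than the viewport (and the upper bound is
    therefore negative), the final position is 0, never negative.
    """
    if value_x > client_width - offset_width:
        value_x = client_width - offset_width
    if value_x < 0:
        value_x = 0
    return value_x


print(clamp_x(500, 800, 200))  # 500: already in range
print(clamp_x(700, 800, 200))  # 600: pulled back from the right edge
print(clamp_x(100, 300, 400))  # 0: widget wider than the viewport
```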
=== modified file 'lib/lp/app/javascript/picker/picker.js'
--- lib/lp/app/javascript/picker/picker.js 2011-09-17 02:55:39 +0000
+++ lib/lp/app/javascript/picker/picker.js 2011-09-26 07:13:14 +0000
@@ -465,7 +465,7 @@
.set('text', data.target_type)
.addClass('badge');
}
- return badges
+ return badges;
},
/**
=== modified file 'lib/lp/app/javascript/privacy.js'
--- lib/lp/app/javascript/privacy.js 2011-09-05 06:47:09 +0000
+++ lib/lp/app/javascript/privacy.js 2011-09-26 07:13:14 +0000
@@ -10,8 +10,9 @@
*/
function setup_privacy_notification(config) {
- if (notification_node !== null)
+ if (notification_node !== null) {
return;
+ }
var notification_text = 'The information on this page is private';
var hidden = true;
var target_id = "maincontent";
=== modified file 'lib/lp/app/stories/basics/notfound-error.txt'
--- lib/lp/app/stories/basics/notfound-error.txt 2011-09-12 21:23:06 +0000
+++ lib/lp/app/stories/basics/notfound-error.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= 404 pages =
+404 pages
+=========
If you follow a link from another page to a Launchpad page that doesn't
exist, the 404 page includes a link back to the referring page.
=== modified file 'lib/lp/app/templates/launchpad-notfound.pt'
--- lib/lp/app/templates/launchpad-notfound.pt 2011-08-31 16:00:31 +0000
+++ lib/lp/app/templates/launchpad-notfound.pt 2011-09-26 07:13:14 +0000
@@ -10,8 +10,8 @@
<div class="top-portlet" metal:fill-slot="main">
<h1>Lost something?</h1>
<p>
- This page does not exist, or you may not have permission to see it.
- </p>
+ This page does not exist, or you may not have permission to see it.
+ </p>
<p>
If you have been to this page before, it is possible it has been removed.
@@ -22,7 +22,7 @@
sorry about that.
We’ve recorded the problem,
and we’ll fix it as soon as we can.
- </p>
+ </p>
<p>
Otherwise, complain to the maintainers of the page that linked here.
</p>
=== modified file 'lib/lp/archiveuploader/tests/nascentupload-closing-bugs.txt'
--- lib/lp/archiveuploader/tests/nascentupload-closing-bugs.txt 2011-09-09 13:52:21 +0000
+++ lib/lp/archiveuploader/tests/nascentupload-closing-bugs.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= Closing Bugs when Publishing Accepted Source =
+Closing Bugs when Publishing Accepted Source
+============================================
We have implemented 'premature publication of accepted sources' in
NascentUpload to increase the publishing throughput (see
@@ -11,10 +12,11 @@
of 'bar' source package in ubuntu/hoary.
-== Publishing package and creating a bugtask ==
+Publishing package and creating a bugtask
+-----------------------------------------
>>> bar_src = getUploadForSource(
- ... 'suite/bar_1.0-1/bar_1.0-1_source.changes')
+ ... 'suite/bar_1.0-1/bar_1.0-1_source.changes')
>>> bar_src.process()
>>> result = bar_src.do_accept()
>>> bar_src.queue_root.status.name
@@ -49,8 +51,8 @@
Inspect the current bugtasks for bug #6:
>>> for bugtask in the_bug.bugtasks:
- ... print bugtask.title
- ... print bugtask.status.name
+ ... print bugtask.title
+ ... print bugtask.status.name
Bug #6 in Mozilla Firefox: "Firefox crashes when ...
NEW
Bug #6 in bar (Ubuntu): "Firefox crashes when ...
@@ -64,7 +66,8 @@
>>> login('foo.bar@xxxxxxxxxxxxx')
-== Testing bug closing ==
+Testing bug closing
+-------------------
Once the base source is published every posterior version will be
automatically published in upload time as described in
@@ -104,8 +107,8 @@
>>> the_bug = getUtility(IBugSet).get(6)
>>> for bugtask in the_bug.bugtasks:
- ... print bugtask.title
- ... print bugtask.status.name
+ ... print bugtask.title
+ ... print bugtask.status.name
Bug #6 in Mozilla Firefox: "Firefox crashes when ...
NEW
Bug #6 in bar (Ubuntu): "Firefox crashes when ...
=== modified file 'lib/lp/archiveuploader/tests/nascentupload-epoch-handling.txt'
--- lib/lp/archiveuploader/tests/nascentupload-epoch-handling.txt 2011-09-18 08:43:52 +0000
+++ lib/lp/archiveuploader/tests/nascentupload-epoch-handling.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= NascentUpload Epoch Handling =
+NascentUpload Epoch Handling
+============================
NascentUpload class supports 'epoch' on versions as specified by:
@@ -20,7 +21,8 @@
DEVELOPMENT
-== Epoch Version Conflicts ==
+Epoch Version Conflicts
+-----------------------
Debian-like archives don't include epochs in the version specified in
the filename of pool files.
@@ -38,7 +40,8 @@
archive against the proposed source files before 'accepting' it.
-=== Collision in Upload-Time ===
+Collision in Upload-Time
+........................
When the 'non-epoched' version is already published before uploading
the candidate, we need to detect collisions at upload-time, returning
@@ -82,7 +85,8 @@
>>> transaction.abort()
-=== Collision in queue DONE ===
+Collision in queue DONE
+.......................
When the 'non-epoched' version is still in queue, we need to identify
collisions when publishing the new candidate:
@@ -149,7 +153,8 @@
>>> bar_src_queue_epoch.realiseUpload()
Traceback (most recent call last):
...
- QueueInconsistentStateError: bar_1.0-1.diff.gz is already published in archive for hoary
+ QueueInconsistentStateError: bar_1.0-1.diff.gz is already published in
+ archive for hoary
>>> bar_src_queue_epoch.status.name
'ACCEPTED'
@@ -167,7 +172,10 @@
>>> bar_src_queue_epoch2.realiseUpload()
Traceback (most recent call last):
...
- QueueInconsistentStateError: bar_1.0.orig.tar.gz is already published in archive for hoary with a different SHA1 hash (e918d6f5ec2184ae1d795a130da36c9a82c4beaf != 73a04163fee97fd2257ab266bd48f1d3d528e012)
+ QueueInconsistentStateError: bar_1.0.orig.tar.gz is already published in
+ archive for hoary with a different SHA1 hash
+ (e918d6f5ec2184ae1d795a130da36c9a82c4beaf !=
+ 73a04163fee97fd2257ab266bd48f1d3d528e012)
>>> bar_src_queue_epoch2.status.name
'ACCEPTED'
@@ -185,7 +193,8 @@
>>> transaction.abort()
-== Dealing with Epochs and diverging binary versions ==
+Dealing with Epochs and diverging binary versions
+-------------------------------------------------
Let's process an source upload and ensure that the resulting
SourcePackageRelease record store a proper 'version':
=== modified file 'lib/lp/bugs/browser/bugtask.py'
--- lib/lp/bugs/browser/bugtask.py 2011-09-23 13:25:35 +0000
+++ lib/lp/bugs/browser/bugtask.py 2011-09-26 07:13:14 +0000
@@ -784,7 +784,7 @@
if offset is None:
offset = self.visible_initial_comments
comments = self._getComments([
- slice(offset, offset+batch_size)])
+ slice(offset, offset + batch_size)])
else:
# the comment function takes 0-offset counts where comment 0 is
# the initial description, so we need to add one to the limits
=== modified file 'lib/lp/bugs/browser/tests/special/bugtarget-recently-touched-bugs.txt'
--- lib/lp/bugs/browser/tests/special/bugtarget-recently-touched-bugs.txt 2011-09-22 03:28:31 +0000
+++ lib/lp/bugs/browser/tests/special/bugtarget-recently-touched-bugs.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= Recently Touched Bugs for a IBugTarget =
+Recently Touched Bugs for a IBugTarget
+======================================
Every IBugTarget has a portlet for showing the most recently touched
bugs (i.e bugs that have been recently modified/created).
@@ -44,7 +45,8 @@
Bug C
Bug A
-== Private bugs ==
+Private bugs
+------------
Only bugs that the user is allowed to see will be present in the list.
@@ -72,7 +74,8 @@
Bug A
Bug C
-== Duplicate bugs ==
+Duplicate bugs
+--------------
Bugs that are duplicates of other bugs will be omitted from the list as
well.
@@ -84,7 +87,8 @@
Bug A
Bug C
-== Limit ==
+Limit
+-----
By default only five bugs will be returned, but it's possible to specify
a custom limit.
=== modified file 'lib/lp/bugs/browser/tests/test_bugtask.py'
--- lib/lp/bugs/browser/tests/test_bugtask.py 2011-09-23 13:25:35 +0000
+++ lib/lp/bugs/browser/tests/test_bugtask.py 2011-09-26 07:13:14 +0000
@@ -149,7 +149,7 @@
f.makeSourcePackage(distroseries=ds, publish=True)
for i in range(5)]
for sp in sourcepackages:
- bugtask = f.makeBugTask(bug=bug, owner=owner, target=sp)
+ f.makeBugTask(bug=bug, owner=owner, target=sp)
url = canonical_url(bug.default_bugtask)
recorder = QueryCollector()
recorder.register()
@@ -1056,7 +1056,7 @@
bug.default_bugtask.product.owner, 'status',
BugTaskStatus.NEW, BugTaskStatus.TRIAGED)
bug.addChange(change)
- for i in range (number_of_comments):
+ for i in range(number_of_comments):
msg = self.factory.makeMessage(
owner=bug.owner, content="Message %i." % i)
bug.linkMessage(msg, user=bug.owner)
@@ -1084,7 +1084,7 @@
# already shown on the page won't appear twice).
bug_task = self.factory.makeBugTask()
view = create_initialized_view(bug_task, '+batched-comments')
- self.assertEqual(view.visible_initial_comments+1, view.offset)
+ self.assertEqual(view.visible_initial_comments + 1, view.offset)
view = create_initialized_view(
bug_task, '+batched-comments', form={'offset': 100})
self.assertEqual(100, view.offset)
=== modified file 'lib/lp/bugs/doc/bug-export.txt'
--- lib/lp/bugs/doc/bug-export.txt 2011-09-19 02:23:59 +0000
+++ lib/lp/bugs/doc/bug-export.txt 2011-09-26 07:13:14 +0000
@@ -2,12 +2,12 @@
==========
Some projects require regular exports of their bugs as a condition of
-using Launchpad. This provides them an exit strategy in the event
-that they can't use Launchpad anymore.
+using Launchpad. This provides them an exit strategy in the event that
+they can't use Launchpad anymore.
Since the aim is to provide an export of a product's bugs, we don't
-export information about all bug tasks -- only those for the product
-in question.
+export information about all bug tasks -- only those for the product in
+question.
The export is also limited to bug information -- it does not include
links to other Launchpad features such as specifications or questions.
@@ -16,100 +16,112 @@
Exporting one bug
-----------------
-We will export bug #1 in the context of Firefox. First some initial setup:
-
- >>> import sys
- >>> try:
- ... import xml.etree.cElementTree as ET
- ... except ImportError:
- ... import cElementTree as ET
- >>> from zope.component import getUtility
- >>> from cStringIO import StringIO
- >>> from lp.bugs.interfaces.bug import IBugSet
- >>> from lp.registry.interfaces.person import IPersonSet
- >>> from lp.registry.interfaces.product import IProductSet
- >>> from lp.bugs.scripts.bugexport import (
- ... serialise_bugtask, export_bugtasks)
-
+We will export bug #1 in the context of Firefox. First some initial
+setup:
+
+ >>> import sys
+ >>> try:
+ ... import xml.etree.cElementTree as ET
+ ... except ImportError:
+ ... import cElementTree as ET
+ >>> from zope.component import getUtility
+ >>> from cStringIO import StringIO
+ >>> from lp.bugs.interfaces.bug import IBugSet
+ >>> from lp.registry.interfaces.person import IPersonSet
+ >>> from lp.registry.interfaces.product import IProductSet
+ >>> from lp.bugs.scripts.bugexport import (
+ ... serialise_bugtask, export_bugtasks)
First get the bug task:
- >>> firefox = getUtility(IProductSet).getByName('firefox')
- >>> bug1 = getUtility(IBugSet).get(1)
- >>> bugtask = bug1.bugtasks[0]
- >>> bugtask.target == firefox
- True
+ >>> firefox = getUtility(IProductSet).getByName('firefox')
+ >>> bug1 = getUtility(IBugSet).get(1)
+ >>> bugtask = bug1.bugtasks[0]
+ >>> bugtask.target == firefox
+ True
Now we serialise it as XML, and print it:
- >>> node = serialise_bugtask(bugtask)
- >>> tree = ET.ElementTree(node)
- >>> tree.write(sys.stdout)
- <bug id="1">
- <private>False</private>
- <security_related>False</security_related>
- <datecreated>2004-01-01T20:58:04Z</datecreated>
- <title>Firefox does not support SVG</title>
- <description>Firefox needs to support embedded SVG images, now that the standard has been finalised.
- <BLANKLINE>
- The SVG standard 1.0 is complete, and draft implementations for Firefox exist. One of these implementations needs to be integrated with the base install of Firefox. Ideally, the implementation needs to include support for the manipulation of SVG objects from JavaScript to enable interactive and dynamic SVG drawings.</description>
- <reporter name="name12">Sample Person</reporter>
- <status>NEW</status>
- <importance>LOW</importance>
- <assignee name="mark">Mark Shuttleworth</assignee>
- <subscriptions>
- <subscriber name="name12">Sample Person</subscriber>
- <subscriber name="stevea">Steve Alexander</subscriber>
- </subscriptions>
- <comment>
- <sender name="name12">Sample Person</sender>
- <date>2004-09-24T21:17:17Z</date>
- <text>We've seen something very similar on AIX with Gnome 2.6 when it is compiled with XFT support. It might be that the anti-aliasing is causing loopback devices to degrade, resulting in a loss of transparency at the system cache level and decoherence in the undelete function. This is only known to be a problem when the moon is gibbous.</text>
- </comment>
- <comment>
- <sender name="name12">Sample Person</sender>
- <date>2004-09-24T21:24:03Z</date>
- <text>Sorry, it was SCO unix which appears to have the same bug. For a brief moment I was confused there, since so much code is known to have been copied from SCO into AIX.</text>
- </comment>
- </bug>
+ >>> node = serialise_bugtask(bugtask)
+ >>> tree = ET.ElementTree(node)
+ >>> tree.write(sys.stdout)
+ <bug id="1">
+ <private>False</private>
+ <security_related>False</security_related>
+ <datecreated>2004-01-01T20:58:04Z</datecreated>
+ <title>Firefox does not support SVG</title>
+ <description>Firefox needs to support embedded SVG images, now that the
+ standard has been finalised.
+ <BLANKLINE>
+ The SVG standard 1.0 is complete, and draft implementations for Firefox
+ exist. One of these implementations needs to be integrated with the base
+ install of Firefox. Ideally, the implementation needs to include support
+ for the manipulation of SVG objects from JavaScript to enable interactive
+ and dynamic SVG drawings.</description>
+ <reporter name="name12">Sample Person</reporter>
+ <status>NEW</status>
+ <importance>LOW</importance>
+ <assignee name="mark">Mark Shuttleworth</assignee>
+ <subscriptions>
+ <subscriber name="name12">Sample Person</subscriber>
+ <subscriber name="stevea">Steve Alexander</subscriber>
+ </subscriptions>
+ <comment>
+ <sender name="name12">Sample Person</sender>
+ <date>2004-09-24T21:17:17Z</date>
+ <text>We've seen something very similar on AIX with Gnome 2.6 when it is
+ compiled with XFT support. It might be that the anti-aliasing is causing
+ loopback devices to degrade, resulting in a loss of transparency at the
+ system cache level and decoherence in the undelete function. This is only
+ known to be a problem when the moon is gibbous.</text>
+ </comment>
+ <comment>
+ <sender name="name12">Sample Person</sender>
+ <date>2004-09-24T21:24:03Z</date>
+ <text>Sorry, it was SCO unix which appears to have the same bug. For a
+ brief moment I was confused there, since so much code is known to have
+ been copied from SCO into AIX.</text>
+ </comment>
+ </bug>
Exporting a Product's Bugs
--------------------------
-Rather than exporting a single bug, we'll usually want to export all
-the bugs for a product. The export_bugtasks() function does this by
+Rather than exporting a single bug, we'll usually want to export all the
+bugs for a product. The export_bugtasks() function does this by
successively serialising each of the tasks for that product.
- >>> import transaction
- >>> export_bugtasks(transaction, firefox, sys.stdout)
- <launchpad-bugs xmlns="https://launchpad.net/xmlns/2006/bugs">
- <bug id="1">
- ...
- </bug>
- <bug id="4">
- ...
- <title>Reflow problems with complex page layouts</title>
- ...
- <tags>
- <tag>layout-test</tag>
- </tags>
- ...
- </bug>
- <bug id="5">
- ...
- <title>Firefox install instructions should be complete</title>
- ...
- </bug>
- <bug id="6">
- ...
- <duplicateof>5</duplicateof>
- ...
- <title>Firefox crashes when Save As dialog for a nonexistent window is closed</title>
- ...
- ...
- </bug>
- </launchpad-bugs>
+ >>> import transaction
+ >>> export_bugtasks(transaction, firefox, sys.stdout)
+ <launchpad-bugs xmlns="https://launchpad.net/xmlns/2006/bugs">
+ <bug id="1">
+ ...
+ </bug>
+ <bug id="4">
+ ...
+ <title>Reflow problems with complex page layouts</title>
+ ...
+ <tags>
+ <tag>layout-test</tag>
+ </tags>
+ ...
+ </bug>
+ <bug id="5">
+ ...
+ <title>Firefox install instructions should be complete</title>
+ ...
+ </bug>
+ <bug id="6">
+ ...
+ <duplicateof>5</duplicateof>
+ ...
+ <title>Firefox crashes when Save As dialog for a nonexistent window is
+ closed</title>
+ ...
+ ...
+ </bug>
+ </launchpad-bugs>
Attachments
@@ -119,62 +131,62 @@
bug #1. We need to commit here so that the librarian can serve
the file when we later serialise the bug:
- >>> login('test@xxxxxxxxxxxxx')
- >>> bug1 = getUtility(IBugSet).get(1)
- >>> sampleperson = getUtility(IPersonSet).getByEmail('test@xxxxxxxxxxxxx')
- >>> bug1.addAttachment(sampleperson, StringIO('Hello World'),
- ... 'Added attachment', 'hello.txt',
- ... description='"Hello World" attachment')
- <BugAttachment ...>
- >>> transaction.commit()
-
-A reference to the attachment is included with the new comment with
-the attachment contents encoded using base-64:
-
- >>> node = serialise_bugtask(bug1.bugtasks[0])
- >>> tree = ET.ElementTree(node)
- >>> tree.write(sys.stdout)
- <bug id="1">
- ...
- <comment>
- <sender name="name12">Sample Person</sender>
- <date>...</date>
- <text>Added attachment</text>
- <attachment href="http://bugs.launchpad.dev/bugs/1/.../+files/hello.txt">
- <type>UNSPECIFIED</type>
- <filename>hello.txt</filename>
- <title>"Hello World" attachment</title>
- <mimetype>text/plain</mimetype>
- <contents>SGVsbG8gV29ybGQ=
- </contents>
- </attachment>
- </comment>
- ...
+ >>> login('test@xxxxxxxxxxxxx')
+ >>> bug1 = getUtility(IBugSet).get(1)
+ >>> sampleperson = getUtility(IPersonSet).getByEmail('test@xxxxxxxxxxxxx')
+ >>> bug1.addAttachment(
+ ... sampleperson, StringIO('Hello World'), 'Added attachment',
+ ... 'hello.txt', description='"Hello World" attachment')
+ <BugAttachment ...>
+
+ >>> transaction.commit()
+
+A reference to the attachment is included with the new comment with the
+attachment contents encoded using base-64:
+
+ >>> node = serialise_bugtask(bug1.bugtasks[0])
+ >>> tree = ET.ElementTree(node)
+ >>> tree.write(sys.stdout)
+ <bug id="1">
+ ...
+ <comment>
+ <sender name="name12">Sample Person</sender>
+ <date>...</date>
+ <text>Added attachment</text>
+ <attachment href="http://bugs.launchpad.dev/bugs/1/.../+files/hello.txt">
+ <type>UNSPECIFIED</type>
+ <filename>hello.txt</filename>
+ <title>"Hello World" attachment</title>
+ <mimetype>text/plain</mimetype>
+ <contents>SGVsbG8gV29ybGQ=
+ </contents>
+ </attachment>
+ </comment>
+ ...
Private Bugs
------------
By default a bug export will not include any private bugs. However,
-they can be included by passing the --include-private flag to the
-import script. To test this, we'll make a bug private:
-
- >>> bug1.setPrivate(True, getUtility(ILaunchBag).user)
- True
- >>> transaction.commit()
-
+they can be included by passing the --include-private flag to the import
+script. To test this, we'll make a bug private:
+
+ >>> bug1.setPrivate(True, getUtility(ILaunchBag).user)
+ True
+
+ >>> transaction.commit()
Now we'll do a dump not including private bugs:
- >>> output = StringIO()
- >>> export_bugtasks(transaction, firefox, output)
- >>> '<bug id="1">' in output.getvalue()
- False
-
+ >>> output = StringIO()
+ >>> export_bugtasks(transaction, firefox, output)
+ >>> '<bug id="1">' in output.getvalue()
+ False
However, bug #1 will appear in the export if we include private bugs:
- >>> output = StringIO()
- >>> export_bugtasks(transaction, firefox, output, include_private=True)
- >>> '<bug id="1">' in output.getvalue()
- True
+ >>> output = StringIO()
+ >>> export_bugtasks(transaction, firefox, output, include_private=True)
+ >>> '<bug id="1">' in output.getvalue()
+ True
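(Reviewer aside, not part of the patch: the `<bug id="...">` layout the doctest above asserts can be sketched with plain ElementTree. A minimal sketch under assumptions — `serialise_bug` and its field list are hypothetical names for illustration, not Launchpad's actual `serialise_bugtask`.)

```python
import xml.etree.ElementTree as ET

def serialise_bug(bug):
    """Build a <bug id="..."> node mirroring the doctest's layout.

    Hypothetical helper; the field names are illustrative only.
    """
    node = ET.Element('bug', id=str(bug['id']))
    for field in ('private', 'title', 'status', 'importance'):
        # Each simple field becomes a child element with text content.
        ET.SubElement(node, field).text = str(bug[field])
    return node

bug = {'id': 1, 'private': False, 'status': 'NEW', 'importance': 'LOW',
       'title': 'Firefox does not support SVG'}
xml = ET.tostring(serialise_bug(bug), encoding='unicode')
print(xml)
```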
=== modified file 'lib/lp/bugs/doc/bug.txt'
--- lib/lp/bugs/doc/bug.txt 2011-09-21 13:27:17 +0000
+++ lib/lp/bugs/doc/bug.txt 2011-09-26 07:13:14 +0000
@@ -448,28 +448,29 @@
>>> params.setBugTarget(product=firefox)
>>> added_bug = getUtility(IBugSet).createBug(params)
Traceback (most recent call last):
- ...
+ ...
AssertionError: Expected either a comment or a msg, but got both.
So, let's continue:
- >>> [subscription.person.name for subscription in public_bug.subscriptions]
- [u'name16']
+ >>> for subscription in public_bug.subscriptions:
+ ... print subscription.person.name
+ name16
The first comment made (this is submitted in the bug report) is set to
the description of the bug:
- >>> public_bug.description
- u'blah blah blah'
+ >>> print public_bug.description
+ blah blah blah
The bug description can also be accessed through the task:
- >>> public_bug.bugtasks[0].bug.description
- u'blah blah blah'
+ >>> print public_bug.bugtasks[0].bug.description
+ blah blah blah
>>> public_bug.description = 'a new description'
- >>> public_bug.bugtasks[0].bug.description
- u'a new description'
+ >>> print public_bug.bugtasks[0].bug.description
+ a new description
When a private bug is filed:
@@ -1130,9 +1131,10 @@
It's done in a way that we only issue two queries to fetch all this
information, too:
-XXX: bug=https://bugs.launchpad.net/storm/+bug/619017 means that this sometimes
-does 3 queries, depending on the precise state of the storm cache. To avoid
-spurious failures it has been changed to tolerate this additional query.
+XXX RobertCollins <unknown date> bug=619017: Storm bug 619017 means that this
+sometimes does 3 queries, depending on the precise state of the storm cache.
+To avoid spurious failures it has been changed to tolerate this additional
+query.
>>> len(CursorWrapper.last_executed_sql) - queries <= 3
True
@@ -1165,9 +1167,12 @@
... print '%s\t%s\t%s' % (
... indexed_message.index, indexed_message.subject,
... indexed_message.inside.title)
- 0 PEBCAK Bug #2 in Tomcat: "Blackhole Trash folder"
- 1 Fantastic idea, I'd really like to see this Bug #2 in Tomcat: "Blackhole Trash folder"
- 2 Strange bug with duplicate messages. Bug #2 in Tomcat: "Blackhole Trash folder"
+ 0 PEBCAK
+ Bug #2 in Tomcat: "Blackhole Trash folder"
+ 1 Fantastic idea, I'd really like to see this
+ Bug #2 in Tomcat: "Blackhole Trash folder"
+ 2 Strange bug with duplicate messages.
+ Bug #2 in Tomcat: "Blackhole Trash folder"
Affected users
=== modified file 'lib/lp/bugs/doc/bugnotification-email.txt'
--- lib/lp/bugs/doc/bugnotification-email.txt 2011-09-22 05:21:44 +0000
+++ lib/lp/bugs/doc/bugnotification-email.txt 2011-09-26 07:13:14 +0000
@@ -93,7 +93,8 @@
New security related bugs are sent with a prominent warning:
- >>> changed = bug_four.setSecurityRelated(True, getUtility(ILaunchBag).user)
+ >>> changed = bug_four.setSecurityRelated(
+ ... True, getUtility(ILaunchBag).user)
>>> subject, body = generate_bug_add_email(bug_four)
>>> subject
@@ -214,6 +215,9 @@
+ It's also smart enough to preserve whitespace, finally!
-----------------------------
+(Note that there's a blank line in the email that contains whitespace. You
+may see a lint warning for that.)
+
Let's make the bug security-related, and private (we need to switch
logins to a user that is explicitly subscribed to this bug):
@@ -221,7 +225,8 @@
>>> edited_bug.setPrivate(True, getUtility(ILaunchBag).user)
True
- >>> changed = edited_bug.setSecurityRelated(True, getUtility(ILaunchBag).user)
+ >>> changed = edited_bug.setSecurityRelated(
+ ... True, getUtility(ILaunchBag).user)
>>> bug_delta = BugDelta(
... bug=edited_bug,
... bugurl="http://www.example.com/bugs/2",
@@ -247,7 +252,8 @@
>>> edited_bug.setPrivate(False, getUtility(ILaunchBag).user)
True
- >>> changed = edited_bug.setSecurityRelated(False, getUtility(ILaunchBag).user)
+ >>> changed = edited_bug.setSecurityRelated(
+ ... False, getUtility(ILaunchBag).user)
>>> bug_delta = BugDelta(
... bug=edited_bug,
... bugurl="http://www.example.com/bugs/2",
=== modified file 'lib/lp/bugs/doc/bugtask-expiration.txt'
--- lib/lp/bugs/doc/bugtask-expiration.txt 2011-09-23 11:30:13 +0000
+++ lib/lp/bugs/doc/bugtask-expiration.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= Bugtask Expiration =
+Bugtask Expiration
+==================
Old unattended Incomplete bugtasks clutter the search results of
Launchpad Bugs, making the bug staff's job difficult. A script is run
@@ -21,7 +22,8 @@
all the rules stated above.
-== findExpirableBugTasks() Part 1 ==
+findExpirableBugTasks() Part 1
+------------------------------
BugTaskSet provides findExpirableBugTasks() to find bugtasks that
qualify for expiration. The bugtasks must meet all the
@@ -62,7 +64,9 @@
... date_modified = datetime.now(UTC) - timedelta(days=days_ago)
... bug.date_last_updated = date_modified
-== Setup ==
+
+Setup
+-----
Let's make some bugtasks that qualify for expiration. A Jokosher
bugtask and a conjoined pair of ubuntu_hoary and ubuntu bugtasks
@@ -240,7 +244,8 @@
recent False 31 Incomplete False False False False
no_expire False 61 Incomplete False False False False
-== isExpirable() ==
+isExpirable()
+-------------
In addition to can_expire, bugs have an isExpirable method to which a custom
number of days, days_old, can be passed. days_old is then used with
@@ -256,8 +261,8 @@
>>> very_old_bugtask.transitionToStatus(
... BugTaskStatus.INVALID, sample_person)
- # Pass isExpirable() a days_old parameter, then set the bug to Invalid so it
- # doesn't affect the rest of the doctest
+ # Pass isExpirable() a days_old parameter, then set the bug to Invalid so
+ # it doesn't affect the rest of the doctest.
>>> from lp.bugs.tests.bug import create_old_bug
>>> not_so_old_bugtask = create_old_bug('expirable_distro', 31, ubuntu)
>>> not_so_old_bugtask.bug.isExpirable(days_old=14)
@@ -266,7 +271,8 @@
... BugTaskStatus.INVALID, sample_person)
-== findExpirableBugTasks() Part 2 ==
+findExpirableBugTasks() Part 2
+------------------------------
The value of the min_days_old controls the bugtasks that are
returned. The oldest bug in this test is 351 days old, the youngest is
@@ -375,7 +381,8 @@
ROLE EXPIRE AGE STATUS ASSIGNED DUP MILE REPLIES
-== Privacy ==
+Privacy
+-------
The user parameter indicates which user is performing the search. Only
bugs that the user has permission to view are returned. A value of None
@@ -443,7 +450,8 @@
True
>>> reset_bug_modified_date(private_bug, 351)
-== The default expiration age ==
+The default expiration age
+--------------------------
The expiration age is set using the
config.malone.days_before_expiration configuration variable. It
@@ -457,7 +465,8 @@
60
-== Running the script ==
+Running the script
+------------------
There are no Expired Bugtasks in sampledata, from the tests above.
@@ -502,7 +511,8 @@
>>> bugtasks = [BugTask.get(bugtask.id) for bugtask in bugtasks]
-== After the script has run ==
+After the script has run
+------------------------
There are three Expired bugtasks. Jokosher, hoary and ubuntu were
expired by the expiration process. Although ubuntu was never returned
@@ -549,7 +559,8 @@
Launchpad Janitor Ubuntu Hoary: status Incomplete Expired
-== enable_bug_expiration ==
+enable_bug_expiration
+---------------------
The bugtask no_expiration_bugtask has not been expired because it does
not participate in bug expiration. When uses_bug_expiration is set to
=== modified file 'lib/lp/bugs/doc/checkwatches-cli-switches.txt'
--- lib/lp/bugs/doc/checkwatches-cli-switches.txt 2011-09-19 02:23:59 +0000
+++ lib/lp/bugs/doc/checkwatches-cli-switches.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= Updating selected bug trackers =
+Updating selected bug trackers
+==============================
The CheckwatchesMaster class can be instructed to update only a subset of
bugtrackers. This is achieved by passing a list of bug tracker names to
@@ -105,7 +106,8 @@
DEBUG Enabled by DEFAULT section
INFO Resetting 5 bug watches for bug tracker 'savannah'
INFO Updating 5 watches on bug tracker 'savannah'
- INFO 'Unsupported Bugtracker' error updating http://savannah.gnu.org/: SAVANE
+ INFO 'Unsupported Bugtracker' error updating http://savannah.gnu.org/:
+ SAVANE
INFO 0 watches left to check on bug tracker 'savannah'
INFO Time for this run...
=== modified file 'lib/lp/bugs/javascript/bugtask_index.js'
--- lib/lp/bugs/javascript/bugtask_index.js 2011-09-16 00:11:16 +0000
+++ lib/lp/bugs/javascript/bugtask_index.js 2011-09-26 07:13:14 +0000
@@ -375,7 +375,7 @@
on: {
start: function () {
subscribers_list.subscribers_list.startActivity(
- 'Updating subscribers...')
+ 'Updating subscribers...');
},
end: function () {
subscribers_list.subscribers_list.stopActivity();
=== modified file 'lib/lp/bugs/javascript/tests/test_async_comment_loading.js'
--- lib/lp/bugs/javascript/tests/test_async_comment_loading.js 2011-09-09 11:35:34 +0000
+++ lib/lp/bugs/javascript/tests/test_async_comment_loading.js 2011-09-26 07:13:14 +0000
@@ -113,7 +113,7 @@
'<div>' + comments_markup + '</div>';
Assert.areEqual(
expected_markup, this.comments_container.get('innerHTML'));
- },
+ }
}));
=== modified file 'lib/lp/bugs/model/bug.py'
--- lib/lp/bugs/model/bug.py 2011-09-26 03:26:54 +0000
+++ lib/lp/bugs/model/bug.py 2011-09-26 07:13:14 +0000
@@ -2239,6 +2239,7 @@
BugActivity.datechanged <= end_date)
return activity_in_range
+
@ProxyFactory
def get_also_notified_subscribers(
bug_or_bugtask, recipients=None, level=None):
=== modified file 'lib/lp/bugs/model/tests/test_bug.py'
--- lib/lp/bugs/model/tests/test_bug.py 2011-09-23 22:35:35 +0000
+++ lib/lp/bugs/model/tests/test_bug.py 2011-09-26 07:13:14 +0000
@@ -845,7 +845,7 @@
# that bug that falls within a given date range.
bug = self.factory.makeBug(
date_created=self.now - timedelta(days=365))
- self._makeActivityForBug(bug, activity_ages=[200,100])
+ self._makeActivityForBug(bug, activity_ages=[200, 100])
start_date = self.now - timedelta(days=250)
end_date = self.now - timedelta(days=150)
activity = bug.getActivityForDateRange(
@@ -858,7 +858,7 @@
# falls on the start_ and end_ dates.
bug = self.factory.makeBug(
date_created=self.now - timedelta(days=365))
- self._makeActivityForBug(bug, activity_ages=[300,200,100])
+ self._makeActivityForBug(bug, activity_ages=[300, 200, 100])
start_date = self.now - timedelta(days=300)
end_date = self.now - timedelta(days=100)
activity = bug.getActivityForDateRange(
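(Reviewer aside, not part of the patch: the boundary behaviour the second test above exercises — activity landing exactly on `start_date` and `end_date` still matching — comes down to an inclusive range filter. A minimal sketch, assuming `getActivityForDateRange` is inclusive on both ends; the names below are illustrative only.)

```python
from datetime import datetime, timedelta, timezone

# Re-create the test's setup: activity entries 300, 200 and 100 days
# old, with start/end dates landing exactly on the oldest and newest.
now = datetime.now(timezone.utc)
activity = [now - timedelta(days=age) for age in (300, 200, 100)]
start_date = now - timedelta(days=300)
end_date = now - timedelta(days=100)

# An inclusive filter keeps the entries that fall *on* either
# boundary, so all three entries survive.
in_range = [a for a in activity if start_date <= a <= end_date]
print(len(in_range))
```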
=== modified file 'lib/lp/bugs/stories/bug-privacy/xx-bug-privacy.txt'
--- lib/lp/bugs/stories/bug-privacy/xx-bug-privacy.txt 2011-09-22 01:45:12 +0000
+++ lib/lp/bugs/stories/bug-privacy/xx-bug-privacy.txt 2011-09-26 07:13:14 +0000
@@ -24,35 +24,36 @@
Foo Bar is not Cc'd on this bug, but is able to set the bug private
anyway, because he is an admin.
- >>> browser.open(
- ... "http://bugs.launchpad.dev/debian/+source/mozilla-firefox/"
- ... "+bug/2/+secrecy")
- >>> browser.getControl("This bug report should be private").selected = True
- >>> browser.getControl("Change").click()
- >>> print browser.url
- http://bugs.launchpad.dev/debian/+source/mozilla-firefox/+bug/2
+ >>> browser.open(
+ ... "http://bugs.launchpad.dev/debian/+source/mozilla-firefox/"
+ ... "+bug/2/+secrecy")
+ >>> browser.getControl("This bug report should be private").selected = (
+ ... True)
+ >>> browser.getControl("Change").click()
+ >>> print browser.url
+ http://bugs.launchpad.dev/debian/+source/mozilla-firefox/+bug/2
Subscribers have been updated according to the privacy rules.
- >>> browser.open(
- ... "http://launchpad.dev/bugs/2/+bug-portlet-subscribers-details")
- >>> print_direct_subscribers(browser.contents)
- Mark Shuttleworth (Unsubscribe)
- Sample Person (Unsubscribe)
- Steve Alexander (Unsubscribe)
- Ubuntu Team (Unsubscribe)
+ >>> browser.open(
+ ... "http://launchpad.dev/bugs/2/+bug-portlet-subscribers-details")
+ >>> print_direct_subscribers(browser.contents)
+ Mark Shuttleworth (Unsubscribe)
+ Sample Person (Unsubscribe)
+ Steve Alexander (Unsubscribe)
+ Ubuntu Team (Unsubscribe)
- >>> print_also_notified(browser.contents)
- Also notified:
+ >>> print_also_notified(browser.contents)
+ Also notified:
When we go back to the secrecy form, the previously set value is pre-selected.
- >>> browser.open(
- ... "http://bugs.launchpad.dev/debian/+source/mozilla-firefox/"
- ... "+bug/2/+secrecy")
- >>> browser.getControl("This bug report should be private").selected
- True
+ >>> browser.open(
+ ... "http://bugs.launchpad.dev/debian/+source/mozilla-firefox/"
+ ... "+bug/2/+secrecy")
+ >>> browser.getControl("This bug report should be private").selected
+ True
Foo Bar files a security (private) bug on Ubuntu Linux. He gets
redirected to the bug page.
@@ -63,7 +64,8 @@
Ubuntu has no security contact, so the Ubuntu maintainer, Ubuntu Team,
will be subscribed instead.
- >>> browser.getControl(name="field.title", index=0).value = "a private bug"
+ >>> browser.getControl(name="field.title", index=0).value = (
+ ... "a private bug")
>>> browser.getControl('Continue').click()
>>> print browser.contents
@@ -119,7 +121,8 @@
>>> browser.open("http://launchpad.dev/ubuntu/+filebug")
- >>> browser.getControl(name="field.title", index=0).value = "a private bug"
+ >>> browser.getControl(name="field.title", index=0).value = (
+ ... "a private bug")
>>> browser.getControl('Continue').click()
>>> print browser.contents
=== modified file 'lib/lp/code/browser/codeimport.py'
--- lib/lp/code/browser/codeimport.py 2011-08-30 10:11:37 +0000
+++ lib/lp/code/browser/codeimport.py 2011-09-26 07:13:14 +0000
@@ -225,7 +225,6 @@
code_import.branch.unique_name))
-
class NewCodeImportForm(Interface):
"""The fields presented on the form for editing a code import."""
@@ -253,7 +252,7 @@
"The URL of the git repository. The HEAD branch will be "
"imported."),
allowed_schemes=["git", "http", "https"],
- allow_userinfo=False, # Only anonymous access is supported.
+ allow_userinfo=False, # Only anonymous access is supported.
allow_port=True,
allow_query=False,
allow_fragment=False,
@@ -265,11 +264,11 @@
"The URL of the Mercurial repository. The tip branch will be "
"imported."),
allowed_schemes=["http", "https"],
- allow_userinfo=False, # Only anonymous access is supported.
+ allow_userinfo=False, # Only anonymous access is supported.
allow_port=True,
- allow_query=False, # Query makes no sense in Mercurial
- allow_fragment=False, # Fragment makes no sense in Mercurial
- trailing_slash=False) # See http://launchpad.net/bugs/56357.
+ allow_query=False, # Query makes no sense in Mercurial.
+ allow_fragment=False, # Fragment makes no sense in Mercurial.
+ trailing_slash=False) # See http://launchpad.net/bugs/56357.
branch_name = copy_field(
IBranch['name'],
@@ -330,8 +329,9 @@
owner_field = self.schema['owner']
any_owner_choice = Choice(
__name__='owner', title=owner_field.title,
- description = _("As an administrator you are able to reassign"
- " this branch to any person or team."),
+ description=_(
+ "As an administrator you are able to reassign this "
+ "branch to any person or team."),
required=True, vocabulary='ValidPersonOrTeam')
any_owner_field = form.Fields(
any_owner_choice, render_context=self.render_context)
@@ -492,6 +492,7 @@
return self._showButtonForStatus(status)
else:
condition = None
+
def success(self, action, data):
"""Make the requested status change."""
if status is not None:
=== modified file 'lib/lp/code/interfaces/branchmergeproposal.py'
--- lib/lp/code/interfaces/branchmergeproposal.py 2011-08-26 00:43:31 +0000
+++ lib/lp/code/interfaces/branchmergeproposal.py 2011-09-26 07:13:14 +0000
@@ -268,7 +268,8 @@
all_comments = exported(
CollectionField(
title=_("All messages discussing this merge proposal"),
- value_type=Reference(schema=Interface), # ICodeReviewComment
+ # Really ICodeReviewComment.
+ value_type=Reference(schema=Interface),
readonly=True))
address = exported(
@@ -281,13 +282,15 @@
@operation_parameters(
id=Int(
title=_("A CodeReviewComment ID.")))
- @operation_returns_entry(Interface) # ICodeReviewComment
+ # Really ICodeReviewComment.
+ @operation_returns_entry(Interface)
@export_read_operation()
def getComment(id):
"""Return the CodeReviewComment with the specified ID."""
@call_with(user=REQUEST_USER)
- @operation_returns_collection_of(Interface) # IBugTask
+ # Really IBugTask.
+ @operation_returns_collection_of(Interface)
@export_read_operation()
@operation_for_version('devel')
def getRelatedBugTasks(user):
@@ -313,12 +316,12 @@
notified.
"""
-
# Cannot specify value type without creating a circular dependency
votes = exported(
CollectionField(
title=_('The votes cast or expected for this proposal'),
- value_type=Reference(schema=Interface), #ICodeReviewVoteReference
+ # Really ICodeReviewVoteReference.
+ value_type=Reference(schema=Interface),
readonly=True,
)
)
@@ -473,7 +476,8 @@
title=_("A reviewer."), schema=IPerson),
review_type=Text())
@call_with(registrant=REQUEST_USER)
- @operation_returns_entry(Interface) # Really ICodeReviewVoteReference
+ # Really ICodeReviewVoteReference.
+ @operation_returns_entry(Interface)
@export_write_operation()
def nominateReviewer(reviewer, registrant, review_type=None):
"""Set the specified person as a reviewer.
=== modified file 'lib/lp/code/interfaces/codeimport.py'
--- lib/lp/code/interfaces/codeimport.py 2011-08-30 16:54:31 +0000
+++ lib/lp/code/interfaces/codeimport.py 2011-09-26 07:13:14 +0000
@@ -124,9 +124,9 @@
allowed_schemes=["http", "https", "svn", "git"],
allow_userinfo=True,
allow_port=True,
- allow_query=False, # Query makes no sense in Subversion.
- allow_fragment=False, # Fragment makes no sense in Subversion.
- trailing_slash=False)) # See http://launchpad.net/bugs/56357.
+ allow_query=False, # Query makes no sense in Subversion.
+ allow_fragment=False, # Fragment makes no sense in Subversion.
+ trailing_slash=False)) # See http://launchpad.net/bugs/56357.
cvs_root = exported(
TextLine(title=_("Repository"), required=False, readonly=True,
=== modified file 'lib/lp/code/interfaces/tests/test_branch.py'
--- lib/lp/code/interfaces/tests/test_branch.py 2011-08-16 00:43:35 +0000
+++ lib/lp/code/interfaces/tests/test_branch.py 2011-09-26 07:13:14 +0000
@@ -5,7 +5,7 @@
__metaclass__ = type
-import lp.codehosting # For plugins
+import lp.codehosting # For plugins.
from bzrlib.branch import (
format_registry as branch_format_registry,
=== modified file 'lib/lp/code/model/diff.py'
--- lib/lp/code/model/diff.py 2011-08-26 00:43:31 +0000
+++ lib/lp/code/model/diff.py 2011-09-26 07:13:14 +0000
@@ -325,7 +325,6 @@
delegates(IDiff, context='diff')
__storm_table__ = 'PreviewDiff'
-
id = Int(primary=True)
diff_id = Int(name='diff')
=== modified file 'lib/lp/code/model/tests/test_branchcollection.py'
--- lib/lp/code/model/tests/test_branchcollection.py 2011-08-30 19:25:53 +0000
+++ lib/lp/code/model/tests/test_branchcollection.py 2011-09-26 07:13:14 +0000
@@ -749,7 +749,7 @@
# way, for_branches=None (the default) has a very different behavior
# than for_branches=[]: the first is no restriction, while the second
# excludes everything.
- mp = self.factory.makeBranchMergeProposal()
+ self.factory.makeBranchMergeProposal()
proposals = self.all_branches.getMergeProposals(for_branches=[])
self.assertEqual([], list(proposals))
self.assertIsInstance(proposals, EmptyResultSet)
@@ -760,7 +760,7 @@
# way, merged_revnos=None (the default) has a very different behavior
# than merged_revnos=[]: the first is no restriction, while the second
# excludes everything.
- mp = self.factory.makeBranchMergeProposal()
+ self.factory.makeBranchMergeProposal()
proposals = self.all_branches.getMergeProposals(merged_revnos=[])
self.assertEqual([], list(proposals))
self.assertIsInstance(proposals, EmptyResultSet)
=== modified file 'lib/lp/code/model/tests/test_sourcepackagerecipe.py'
--- lib/lp/code/model/tests/test_sourcepackagerecipe.py 2011-08-31 20:27:28 +0000
+++ lib/lp/code/model/tests/test_sourcepackagerecipe.py 2011-09-26 07:13:14 +0000
@@ -101,12 +101,12 @@
"""
registrant = self.factory.makePerson()
return dict(
- registrant = registrant,
- owner = self.factory.makeTeam(owner=registrant),
- distroseries = [self.factory.makeDistroSeries()],
- name = self.factory.getUniqueString(u'recipe-name'),
- description = self.factory.getUniqueString(u'recipe-description'),
- recipe = self.factory.makeRecipeText(*branches))
+ registrant=registrant,
+ owner=self.factory.makeTeam(owner=registrant),
+ distroseries=[self.factory.makeDistroSeries()],
+ name=self.factory.getUniqueString(u'recipe-name'),
+ description=self.factory.getUniqueString(u'recipe-description'),
+ recipe=self.factory.makeRecipeText(*branches))
def test_creation(self):
# The metadata supplied when a SourcePackageRecipe is created is
=== modified file 'lib/lp/codehosting/puller/tests/test_worker_formats.py'
--- lib/lp/codehosting/puller/tests/test_worker_formats.py 2011-09-12 11:25:28 +0000
+++ lib/lp/codehosting/puller/tests/test_worker_formats.py 2011-09-26 07:13:14 +0000
@@ -5,7 +5,7 @@
__metaclass__ = type
-import lp.codehosting # for bzr plugins
+import lp.codehosting # For bzr plugins.
from bzrlib.branch import Branch
from bzrlib.bzrdir import BzrDirMetaFormat1
=== modified file 'lib/lp/codehosting/puller/worker.py'
--- lib/lp/codehosting/puller/worker.py 2011-09-02 08:07:39 +0000
+++ lib/lp/codehosting/puller/worker.py 2011-09-26 07:13:14 +0000
@@ -8,7 +8,7 @@
import sys
import urllib2
-import lp.codehosting # to load bzr plugins
+import lp.codehosting # Needed to load bzr plugins.
from bzrlib import (
errors,
=== modified file 'lib/lp/hardwaredb/tests/test_hwdb_submission_validation.py'
--- lib/lp/hardwaredb/tests/test_hwdb_submission_validation.py 2011-09-02 07:13:54 +0000
+++ lib/lp/hardwaredb/tests/test_hwdb_submission_validation.py 2011-09-26 07:13:14 +0000
@@ -140,7 +140,7 @@
'/sys/class/dmi/id/bios_vendor:Dell Inc.'
'/sys/class/dmi/id/bios_version:A12'
'</dmi>'),
- where = '<hardware>',
+ where='<hardware>',
after=True)
# Add the OopsHandler to the log, because we want to make sure this
# doesn't create an Oops report.
@@ -345,7 +345,7 @@
self.fail('assertErrorMessage did not fail for a non-existing '
'error message.')
- # If the parameter result is not None, assertErrorMessage
+ # If the parameter result is not None, assertErrorMessage
# raises failureException.
try:
self.assertErrorMessage(
@@ -379,7 +379,7 @@
sample_data = self.insertSampledata(
data=self.sample_data,
insert_text='<nonsense/>',
- where = '</system>')
+ where='</system>')
result, submission_id = self.runValidator(sample_data)
self.assertErrorMessage(
submission_id, result,
@@ -468,7 +468,6 @@
# is set for all three tags. In all three tags, the value may also
# be 'True'.
for tag in ('live_cd', 'private', 'contactable'):
- replace_text = '<%s value="False"/>' % tag
sample_data = self.sample_data.replace(
'<%s value="False"/>' % tag,
'<%s value="True"/>' % tag)
@@ -660,7 +659,7 @@
# The omission of either of the required attributes is detected
# by the Relax NG validation.
for only_attribute in ('name', 'version'):
- tag = '<plugin %s="some_value"/>' % only_attribute
+ tag = '<plugin %s="some_value"/>' % only_attribute
sample_data = self.sample_data.replace(
'<plugin name="architecture_info" version="1.1"/>', tag)
result, submission_id = self.runValidator(sample_data)
@@ -843,7 +842,6 @@
# Any other tag than <device> within <hal> is not allowed.
sample_data = self.sample_data
- insert_position = sample_data.find('<device')
sample_data = self.insertSampledata(
data=self.sample_data,
insert_text='<nonsense/>',
@@ -1095,25 +1093,25 @@
# invalid.
if min_value is not None:
self._testMinMaxIntegerValue(
- property_type, relax_ng_type, min_value, min_value-1)
+ property_type, relax_ng_type, min_value, min_value - 1)
# A value larger than the maximum allowed value is detected as
# invalid.
if max_value is not None:
self._testMinMaxIntegerValue(
- property_type, relax_ng_type, max_value, max_value+1)
+ property_type, relax_ng_type, max_value, max_value + 1)
def testIntegerProperties(self):
"""Validation of integer properties."""
type_info = (('dbus.Byte', 'unsignedByte', 0, 255),
- ('dbus.Int16', 'short', -2**15, 2**15-1),
- ('dbus.Int32', 'int', -2**31, 2**31-1),
- ('dbus.Int64', 'long', -2**63, 2**63-1),
- ('dbus.UInt16', 'unsignedShort', 0, 2**16-1),
- ('dbus.UInt32', 'unsignedInt', 0, 2**32-1),
- ('dbus.UInt64', 'unsignedLong', 0, 2**64-1),
+ ('dbus.Int16', 'short', -2 ** 15, 2 ** 15 - 1),
+ ('dbus.Int32', 'int', -2 ** 31, 2 ** 31 - 1),
+ ('dbus.Int64', 'long', -2 ** 63, 2 ** 63 - 1),
+ ('dbus.UInt16', 'unsignedShort', 0, 2 ** 16 - 1),
+ ('dbus.UInt32', 'unsignedInt', 0, 2 ** 32 - 1),
+ ('dbus.UInt64', 'unsignedLong', 0, 2 ** 64 - 1),
('long', 'integer', None, None),
- ('int', 'long', -2**63, 2**63-1))
+ ('int', 'long', -2 ** 63, 2 ** 63 - 1))
for property_type, relax_ng_type, min_value, max_value in type_info:
self._testIntegerProperty(
property_type, relax_ng_type, min_value, max_value)
@@ -1440,12 +1438,12 @@
int_types = (
('dbus.Byte', 'unsignedByte', 0, 255),
('dbus.Int16', 'short', -32768, 32767),
- ('dbus.Int32', 'int', -2**31, 2**31-1),
- ('dbus.Int64', 'long', -2**63, 2**63-1),
- ('dbus.UInt16', 'unsignedShort', 0, 2**16-1),
- ('dbus.UInt32', 'unsignedInt', 0, 2**32-1),
- ('dbus.UInt64', 'unsignedLong', 0, 2**64-1),
- ('int', 'long', -2**63, 2**63-1),
+ ('dbus.Int32', 'int', -2 ** 31, 2 ** 31 - 1),
+ ('dbus.Int64', 'long', -2 ** 63, 2 ** 63 - 1),
+ ('dbus.UInt16', 'unsignedShort', 0, 2 ** 16 - 1),
+ ('dbus.UInt32', 'unsignedInt', 0, 2 ** 32 - 1),
+ ('dbus.UInt64', 'unsignedLong', 0, 2 ** 64 - 1),
+ ('int', 'long', -2 ** 63, 2 ** 63 - 1),
('long', 'integer', None, None))
for value_type, relax_ng_type, min_allowed, max_allowed in int_types:
self._testIntegerValueTag(property_type, value_type,
@@ -1487,8 +1485,8 @@
tag = template % 'nonsense'
sample_data = self.insertSampledata(
data=self.sample_data,
- insert_text = tag,
- where = '</device>')
+ insert_text=tag,
+ where='</device>')
result, submission_id = self.runValidator(sample_data)
self.assertErrorMessage(
submission_id, result,
@@ -1499,8 +1497,8 @@
tag = template % ''
sample_data = self.insertSampledata(
data=self.sample_data,
- insert_text = tag,
- where = '</device>')
+ insert_text=tag,
+ where='</device>')
result, submission_id = self.runValidator(sample_data)
self.assertNotEqual(
result, None, 'empty tag <value type="%s">' % value_type)
@@ -1509,8 +1507,8 @@
tag = template % '<nonsense/>'
sample_data = self.insertSampledata(
data=self.sample_data,
- insert_text = tag,
- where = '</device>')
+ insert_text=tag,
+ where='</device>')
result, submission_id = self.runValidator(sample_data)
self.assertErrorMessage(
submission_id, result,
@@ -1523,8 +1521,8 @@
tag = template % '<value type="int" name="baz">1</value>'
sample_data = self.insertSampledata(
data=self.sample_data,
- insert_text = tag,
- where = '</device>')
+ insert_text=tag,
+ where='</device>')
result, submission_id = self.runValidator(sample_data)
self.assertNotEqual(
result, None,
@@ -1533,8 +1531,8 @@
tag = template % '<value type="int">1</value>'
sample_data = self.insertSampledata(
data=self.sample_data,
- insert_text = tag,
- where = '</device>')
+ insert_text=tag,
+ where='</device>')
result, submission_id = self.runValidator(sample_data)
self.assertErrorMessage(
submission_id, result,
@@ -1546,8 +1544,8 @@
tag = template % '<value type="int">1</value>'
sample_data = self.insertSampledata(
data=self.sample_data,
- insert_text = tag,
- where = '</device>')
+ insert_text=tag,
+ where='</device>')
result, submission_id = self.runValidator(sample_data)
self.assertNotEqual(
result, None,
@@ -1556,8 +1554,8 @@
tag = template % '<value type="int" nam="baz">1</value>'
sample_data = self.insertSampledata(
data=self.sample_data,
- insert_text = tag,
- where = '</device>')
+ insert_text=tag,
+ where='</device>')
result, submission_id = self.runValidator(sample_data)
self.assertErrorMessage(
submission_id, result,
@@ -1607,7 +1605,7 @@
sample_data = self.insertSampledata(
data=self.sample_data,
insert_text='<nonsense/>',
- where = '</processors>')
+ where='</processors>')
result, submission_id = self.runValidator(sample_data)
self.assertErrorMessage(
submission_id, result,
@@ -1677,7 +1675,7 @@
sample_data = self.insertSampledata(
data=self.sample_data,
insert_text='<nonsense/>',
- where = '</processor>')
+ where='</processor>')
result, submission_id = self.runValidator(sample_data)
self.assertErrorMessage(
submission_id, result,
@@ -1909,7 +1907,7 @@
# rejected.
sample_data = self.insertSampledata(
data=self.sample_data,
- insert_text = '<nonsense/>',
+ insert_text='<nonsense/>',
where='</software>')
result, submission_id = self.runValidator(sample_data)
self.assertErrorMessage(
@@ -2132,7 +2130,7 @@
submission_id, result,
'Extra element xorg in interleave',
'detection of invalid attribute of <xorg>')
-
+
def testXorgTagSubTags(self):
"""Test the validation of <xorg> sub-tags."""
# the only allowed sub-tag is <driver>.
@@ -2145,7 +2143,7 @@
submission_id, result,
'Extra element xorg in interleave',
'detection of invalid sub-tag of <xorg>')
-
+
# <xorg> may be empty
sample_data = self.replaceSampledata(
data=self.sample_data,
@@ -2176,7 +2174,6 @@
'%s="%s"' % attribute for attribute in attributes]
attributes = ' '.join(attributes)
return '<driver %s/>' % attributes
-
def testXorgDriverTagRequiredAttributes(self):
"""Test the validation of attributes of <driver> within <xorg>.
@@ -2232,7 +2229,7 @@
submission_id, result,
'omitted optional attribute %s of <driver> in <xorg> '
'treated as invalid' % omit)
-
+
def testXorgDriverTagInvalidAttributes(self):
"""Test the validation of attributes of <driver> within <xorg>.
@@ -2294,9 +2291,9 @@
def testQuestionsTagAttributesSubTags(self):
"""Test the validation of the <questions> sub-tags."""
- # The only allowed sub-tag is <question>
+ # The only allowed sub-tag is <question>.
sample_data = self.insertSampledata(
- data = self.sample_data,
+ data=self.sample_data,
insert_text='<nonsense/>',
where='</questions>')
result, submission_id = self.runValidator(sample_data)
@@ -2305,9 +2302,9 @@
'Expecting element question, got nonsense',
'invalid sub-tag of <questions>')
- # <questions> may be empty
+ # The <questions> tag may be empty.
sample_data = self.replaceSampledata(
- data = self.sample_data,
+ data=self.sample_data,
replace_text='<questions/>',
from_text='<questions>',
to_text='</questions>')
@@ -2319,7 +2316,7 @@
"""Test the validation of CDATA in <questions> tag."""
# CDATA content is not allowed.
sample_data = self.insertSampledata(
- data = self.sample_data,
+ data=self.sample_data,
insert_text='nonsense',
where='</questions>')
result, submission_id = self.runValidator(sample_data)
@@ -2350,7 +2347,7 @@
to_text='>')
result, submission_id = self.runValidator(sample_data)
self.assertNotEqual(
- result, None,
+ result, None,
'<question> tag without attribute "plugin" was treated as '
'invalid')
@@ -2378,7 +2375,7 @@
to_text='>')
result, submission_id = self.runValidator(sample_data)
self.assertNotEqual(
- result, None,
+ result, None,
'Omitting sub-tag <command> of <question> was treated as invalid')
# The sub-tag <answer> is required; <answer_choices>, which follows
@@ -2608,7 +2605,6 @@
submission_id, result,
'Expecting element value, got nonsense',
'detection of invalid sub-tag of <answer_choices>')
-
def testTargetTagAttributes(self):
"""Test the validation of <target> tag attributes."""
@@ -2678,7 +2674,7 @@
self.assertNotEqual(
result, None,
'Valid <driver> sub-tag of <target> treated as invalid')
-
+
def testTargetTagInvalidSubtag(self):
"""Test the validation of an invalid <target> sub-tag."""
sample_data = self.replaceSampledata(
@@ -2697,8 +2693,8 @@
# This tag has no attributes.
sample_data = self.replaceSampledata(
data=self.sample_data,
- replace_text=
- '<target id="42"><driver bar="baz">foo</driver></target>',
+ replace_text=(
+ '<target id="42"><driver bar="baz">foo</driver></target>'),
from_text='<target',
to_text='</target>')
result, submission_id = self.runValidator(sample_data)
@@ -2710,8 +2706,8 @@
# Sub-tags are not allowed.
sample_data = self.replaceSampledata(
data=self.sample_data,
- replace_text=
- '<target id="42"><driver>foo<nonsense/></driver></target>',
+ replace_text=(
+ '<target id="42"><driver>foo<nonsense/></driver></target>'),
from_text='<target',
to_text='</target>')
result, submission_id = self.runValidator(sample_data)
=== modified file 'lib/lp/registry/browser/announcement.py'
--- lib/lp/registry/browser/announcement.py 2011-09-18 18:42:30 +0000
+++ lib/lp/registry/browser/announcement.py 2011-09-26 07:13:14 +0000
@@ -155,10 +155,10 @@
def announce_action(self, action, data):
"""Registers a new announcement."""
self.context.announce(
- user = self.user,
- title = data.get('title'),
- summary = data.get('summary'),
- url = data.get('url'),
+ user=self.user,
+ title=data.get('title'),
+ summary=data.get('summary'),
+ url=data.get('url'),
- publication_date = data.get('publication_date')
+ publication_date=data.get('publication_date')
)
self.next_url = canonical_url(self.context)
@@ -193,7 +193,7 @@
self.context.modify(title=data.get('title'),
summary=data.get('summary'),
url=data.get('url'))
- self.next_url = canonical_url(self.context.target)+'/+announcements'
+ self.next_url = canonical_url(self.context.target) + '/+announcements'
class AnnouncementRetargetForm(Interface):
@@ -237,7 +237,7 @@
def retarget_action(self, action, data):
target = data.get('target')
self.context.retarget(target)
- self.next_url = canonical_url(self.context.target)+'/+announcements'
+ self.next_url = canonical_url(self.context.target) + '/+announcements'
class AnnouncementPublishView(AnnouncementFormMixin, LaunchpadFormView):
@@ -253,7 +253,7 @@
def publish_action(self, action, data):
publication_date = data['publication_date']
self.context.setPublicationDate(publication_date)
- self.next_url = canonical_url(self.context.target)+'/+announcements'
+ self.next_url = canonical_url(self.context.target) + '/+announcements'
class AnnouncementRetractView(AnnouncementFormMixin, LaunchpadFormView):
@@ -265,7 +265,7 @@
@action(_('Retract'), name='retract')
def retract_action(self, action, data):
self.context.retract()
- self.next_url = canonical_url(self.context.target)+'/+announcements'
+ self.next_url = canonical_url(self.context.target) + '/+announcements'
class AnnouncementDeleteView(AnnouncementFormMixin, LaunchpadFormView):
@@ -277,7 +277,7 @@
@action(_("Delete"), name="delete", validator='validate_cancel')
def action_delete(self, action, data):
self.context.destroySelf()
- self.next_url = canonical_url(self.context.target)+'/+announcements'
+ self.next_url = canonical_url(self.context.target) + '/+announcements'
class HasAnnouncementsView(LaunchpadView, FeedsMixin):
@@ -294,7 +294,7 @@
elif RootAnnouncementsFeedLink.usedfor.providedBy(self.context):
return RootAnnouncementsFeedLink(self.context).href
else:
- raise AssertionError, 'Unknown feed source'
+ raise AssertionError("Unknown feed source")
@cachedproperty
def announcements(self):
=== modified file 'lib/lp/registry/browser/tests/test_distribution_views.py'
--- lib/lp/registry/browser/tests/test_distribution_views.py 2011-09-08 02:46:07 +0000
+++ lib/lp/registry/browser/tests/test_distribution_views.py 2011-09-26 07:13:14 +0000
@@ -15,7 +15,6 @@
from lp.soyuz.interfaces.processor import IProcessorFamilySet
from lp.testing import (
login_celebrity,
- person_logged_in,
TestCaseWithFactory,
)
from lp.testing.sampledata import LAUNCHPAD_ADMIN
=== modified file 'lib/lp/registry/stories/announcements/xx-announcements.txt'
--- lib/lp/registry/stories/announcements/xx-announcements.txt 2011-09-20 01:42:30 +0000
+++ lib/lp/registry/stories/announcements/xx-announcements.txt 2011-09-26 07:13:14 +0000
@@ -18,7 +18,7 @@
... """The contents of the latest news portlet."""
... return extract_text(find_portlet(content, 'Announcements'))
>>> def count_show_links(content):
- ... """Determine whether the "Read more announcements" link is shown."""
+ ... """Count the "Read more announcements" links shown."""
... return len(find_tags_by_class(
... content, 'menu-link-announcements'))
>>> def no_announcements(content):
@@ -612,7 +612,8 @@
Move announcement : Kubuntu announcement headline : GuadaLinex
>>> kamion_browser.getControl('For').value = 'kubuntu'
>>> kamion_browser.getControl('Retarget').click()
- >>> "don't have permission" in extract_text(find_main_content(kamion_browser.contents))
+ >>> "don't have permission" in extract_text(
+ ... find_main_content(kamion_browser.contents))
True
>>> print kamion_browser.title
Move announcement : Kubuntu announcement headline : GuadaLinex
@@ -635,7 +636,8 @@
>>> 'NetApplet Announcements' in nopriv_browser.contents
True
-The "self" link should point to the original URL, in the feeds.launchpad.dev domain.
+The "self" link should point to the original URL, in the feeds.launchpad.dev
+domain.
>>> strainer = SoupStrainer('link', rel='self')
>>> links = parse_links(nopriv_browser.contents, rel='self')
@@ -675,7 +677,8 @@
tag:launchpad.net,...:/+announcement/28
>>> priv_browser.open('http://launchpad.dev/guadalinex/+announcements')
- >>> 'Kubuntu announcement headline' in announcements(priv_browser.contents)
+ >>> "Kubuntu announcement headline" in (
+ ... announcements(priv_browser.contents))
True
>>> priv_browser.getLink('Kubuntu announcement headline').click()
>>> priv_browser.getLink('Delete announcement').click()
@@ -733,9 +736,10 @@
It excludes retracted and future announcements too:
- >>> '[guadalinex] Kubuntu announcement headline' in nopriv_browser.contents
+ >>> "[guadalinex] Kubuntu announcement headline" in (
+ ... nopriv_browser.contents)
False
- >>> '[jokosher] Jokosher announcement headline' in nopriv_browser.contents
+ >>> "[jokosher] Jokosher announcement headline" in nopriv_browser.contents
False
The announcements are stored as plain text, but the text-to-html formatter
=== modified file 'lib/lp/registry/stories/team-polls/create-polls.txt'
--- lib/lp/registry/stories/team-polls/create-polls.txt 2011-09-18 16:29:43 +0000
+++ lib/lp/registry/stories/team-polls/create-polls.txt 2011-09-26 07:13:14 +0000
@@ -90,14 +90,15 @@
>>> team_admin_browser.open(
... 'http://launchpad.dev/~ubuntu-team/+newpoll')
>>> team_admin_browser.getControl(
- ... 'The unique name of this poll').value = 'dpl-2080'
- >>> team_admin_browser.getControl(
- ... 'The title of this poll').value = 'Debian Project Leader Election 2080'
- >>> proposition = 'The next debian project leader'
- >>> team_admin_browser.getControl(
- ... 'The proposition that is going to be voted').value = proposition
- >>> team_admin_browser.getControl(
- ... 'Users can spoil their votes?').selected = True
+ ... "The unique name of this poll").value = 'dpl-2080'
+ >>> title_control = team_admin_browser.getControl(
+ ... "The title of this poll")
+ >>> title_control.value = "Debian Project Leader Election 2080"
+ >>> proposition = "The next debian project leader"
+ >>> team_admin_browser.getControl(
+ ... "The proposition that is going to be voted").value = proposition
+ >>> team_admin_browser.getControl(
+ ... "Users can spoil their votes?").selected = True
>>> team_admin_browser.getControl(
... name='field.dateopens').value = '2025-06-04 02:00:00+00:00'
>>> team_admin_browser.getControl(
@@ -131,14 +132,15 @@
>>> team_admin_browser.open(
... 'http://launchpad.dev/~ubuntu-team/+newpoll')
>>> team_admin_browser.getControl(
- ... 'The unique name of this poll').value = 'dpl-2080'
- >>> team_admin_browser.getControl(
- ... 'The title of this poll').value = 'Debian Project Leader Election 2080'
- >>> proposition = 'The next debian project leader'
- >>> team_admin_browser.getControl(
- ... 'The proposition that is going to be voted').value = proposition
- >>> team_admin_browser.getControl(
- ... 'Users can spoil their votes?').selected = True
+ ... "The unique name of this poll").value = 'dpl-2080'
+ >>> title_control = team_admin_browser.getControl(
+ ... "The title of this poll")
+ >>> title_control.value = "Debian Project Leader Election 2080"
+ >>> proposition = "The next debian project leader"
+ >>> team_admin_browser.getControl(
+ ... "The proposition that is going to be voted").value = proposition
+ >>> team_admin_browser.getControl(
+ ... "Users can spoil their votes?").selected = True
>>> team_admin_browser.getControl(
... name='field.dateopens').value = '2025-06-04 02:00:00+00:00'
>>> team_admin_browser.getControl(
=== modified file 'lib/lp/services/mail/incoming.py'
--- lib/lp/services/mail/incoming.py 2011-09-15 04:49:44 +0000
+++ lib/lp/services/mail/incoming.py 2011-09-26 07:13:14 +0000
@@ -161,16 +161,19 @@
% (signing_domain,))
return None
for origin in ['From', 'Sender']:
- if signed_message[origin] is None: continue
+ if signed_message[origin] is None:
+ continue
name, addr = parseaddr(signed_message[origin])
try:
origin_domain = addr.split('@')[1]
except IndexError:
- log.warning("couldn't extract domain from address %r" % signed_message[origin])
+ log.warning(
+ "couldn't extract domain from address %r",
+ signed_message[origin])
if signing_domain == origin_domain:
log.info(
- "DKIM signing domain %s matches %s address %r"
- % (signing_domain, origin, addr))
+ "DKIM signing domain %s matches %s address %r",
+ signing_domain, origin, addr)
return addr
else:
log.info("DKIM signing domain %s doesn't match message origin; "
@@ -203,10 +206,9 @@
# authenticator for this mail.
log.debug('trusted DKIM mail from %s' % dkim_trusted_addr)
email_addr = dkim_trusted_addr
- else:
+ else:
email_addr = parseaddr(mail['From'])[1]
- signature = mail.signature
authutil = getUtility(IPlacelessAuthUtility)
principal = authutil.getPrincipalByLogin(email_addr)
@@ -236,11 +238,12 @@
signature_timestamp_checker)
-def _gpgAuthenticateEmail(mail, principal, person, signature_timestamp_checker):
+def _gpgAuthenticateEmail(mail, principal, person,
+ signature_timestamp_checker):
"""Check GPG signature.
- :param principal: Claimed sender of the mail; to be checked against the actual
- signature.
+ :param principal: Claimed sender of the mail; to be checked against the
+ actual signature.
:returns: principal, either strongly or weakly authenticated.
"""
log = logging.getLogger('process-mail')
=== modified file 'lib/lp/services/mail/sendmail.py'
--- lib/lp/services/mail/sendmail.py 2011-09-24 03:53:35 +0000
+++ lib/lp/services/mail/sendmail.py 2011-09-26 07:13:14 +0000
@@ -345,13 +345,15 @@
formataddr((name, address))
for name, address in getaddresses([email_header])]
+
def validate_message(message):
"""Validate that the supplied message is suitable for sending."""
- assert isinstance(message, Message), 'Not an email.Message.Message'
- assert 'to' in message and bool(message['to']), 'No To: header'
- assert 'from' in message and bool(message['from']), 'No From: header'
- assert 'subject' in message and bool(message['subject']), \
- 'No Subject: header'
+ assert isinstance(message, Message), "Not an email.Message.Message"
+ assert 'to' in message and bool(message['to']), "No To: header"
+ assert 'from' in message and bool(message['from']), "No From: header"
+ assert 'subject' in message and bool(message['subject']), (
+ "No Subject: header")
+
def sendmail(message, to_addrs=None, bulk=True):
"""Send an email.Message.Message
=== modified file 'lib/lp/services/mail/tests/test_dkim.py'
--- lib/lp/services/mail/tests/test_dkim.py 2011-09-06 02:06:13 +0000
+++ lib/lp/services/mail/tests/test_dkim.py 2011-09-26 07:13:14 +0000
@@ -275,13 +275,12 @@
self.factory.makeEmail(
person=person,
address='dkimtest@xxxxxxxxxxx')
- self._dns_responses['example._domainkey.canonical.com.'] = \
- sample_dns
- tweaked_message = 'Sender: dkimtest@xxxxxxxxxxxxx\n' + \
- plain_content.replace(
- 'From: Foo Bar <foo.bar@xxxxxxxxxxxxx>',
- 'From: DKIM Test <dkimtest@xxxxxxxxxxx>')
+ self._dns_responses['example._domainkey.canonical.com.'] = sample_dns
+ tweaked_message = (
+ "Sender: dkimtest@xxxxxxxxxxxxx\n" + plain_content.replace(
+ "From: Foo Bar <foo.bar@xxxxxxxxxxxxx>",
+ "From: DKIM Test <dkimtest@xxxxxxxxxxx>"))
signed_message = self.fake_signing(tweaked_message)
principal = authenticateEmail(
signed_message_from_string(signed_message))
- self.assertStronglyAuthenticated(principal, signed_message)
\ No newline at end of file
+ self.assertStronglyAuthenticated(principal, signed_message)
=== modified file 'lib/lp/soyuz/browser/sourcepackagebuilds.py'
--- lib/lp/soyuz/browser/sourcepackagebuilds.py 2011-09-13 05:23:16 +0000
+++ lib/lp/soyuz/browser/sourcepackagebuilds.py 2011-09-26 07:13:14 +0000
@@ -9,7 +9,6 @@
'SourcePackageBuildsView',
]
-from lazr.restful.utils import smartquote
from lp.soyuz.browser.build import BuildRecordsView
@@ -32,4 +31,3 @@
# this page is because it's unlikely that there will be so
# many builds that the listing will be overwhelming.
return None
-
=== modified file 'lib/lp/soyuz/doc/archive-override-check.txt'
--- lib/lp/soyuz/doc/archive-override-check.txt 2011-09-18 10:22:51 +0000
+++ lib/lp/soyuz/doc/archive-override-check.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= Check for discrepancies in overrides between architectures =
+Check for discrepancies in overrides between architectures
+==========================================================
This script looks for out-of-sync overrides, checking for
discrepancies in overrides between architectures.
@@ -14,31 +15,30 @@
XXX cprov 20060714: we need better populate archive/"publishing
history" in the sampledata to test those tools properly.
- >>> import os
- >>> import subprocess
- >>> import sys
- >>> from canonical.config import config
-
- >>> script = os.path.join(config.root, "scripts", "ftpmaster-tools",
- ... "archive-override-check.py")
-
- >>> process = subprocess.Popen([sys.executable, script, "-v",
- ... "-d", "ubuntu",
- ... "-s", "warty"],
- ... stdout=subprocess.PIPE,
- ... stderr=subprocess.PIPE,)
- >>> stdout, stderr = process.communicate()
- >>> process.returncode
- 0
- >>> print stderr
- INFO Creating lockfile: ...
- DEBUG Considering: ubuntu/warty/RELEASE/CURRENT.
- DEBUG ... published sources
- DEBUG Rolling back any remaining transactions.
- DEBUG Removing lock file: ...
- <BLANKLINE>
+ >>> import os
+ >>> import subprocess
+ >>> import sys
+ >>> from canonical.config import config
+
+ >>> script = os.path.join(
+ ... config.root, "scripts", "ftpmaster-tools",
+ ... "archive-override-check.py")
+
+ >>> process = subprocess.Popen(
+ ... [sys.executable, script, "-v", "-d", "ubuntu", "-s", "warty"],
+ ... stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ >>> stdout, stderr = process.communicate()
+ >>> process.returncode
+ 0
+ >>> print stderr
+ INFO Creating lockfile: ...
+ DEBUG Considering: ubuntu/warty/RELEASE/CURRENT.
+ DEBUG ... published sources
+ DEBUG Rolling back any remaining transactions.
+ DEBUG Removing lock file: ...
+ <BLANKLINE>
Since its data is sane, empty STDOUT is okay.
- >>> print stdout
+ >>> print stdout
=== modified file 'lib/lp/soyuz/doc/buildd-mass-retry.txt'
--- lib/lp/soyuz/doc/buildd-mass-retry.txt 2011-09-18 10:22:51 +0000
+++ lib/lp/soyuz/doc/buildd-mass-retry.txt 2011-09-26 07:13:14 +0000
@@ -1,15 +1,16 @@
-= Testing buildd-mass-retry behaviour =
+Buildd-mass-retry behaviour
+===========================
- >>> import subprocess
- >>> import os
- >>> import sys
- >>> from canonical.config import config
+ >>> import subprocess
+ >>> import os
+ >>> import sys
+ >>> from canonical.config import config
'buildd-mass-retry' is a tool designed to retry a group of 'failed'
build records in a distroseries and/or architecture.
- >>> script = os.path.join(config.root, "scripts", "ftpmaster-tools",
- ... "buildd-mass-retry.py")
+ >>> script = os.path.join(
+ ... config.root, "scripts", "ftpmaster-tools", "buildd-mass-retry.py")
- The user can specify a distribution, a suite name (-s) and/or and
+ The user can specify a distribution, a suite name (-s) and/or an
architecture (-a) as a build record provider, it will restrict the
@@ -29,102 +30,106 @@
Passing only suite, request retry on all failed states:
- >>> process = subprocess.Popen([sys.executable, script, "-v", "-NFDC",
- ... "-s", "hoary"],
- ... stdout=subprocess.PIPE,
- ... stderr=subprocess.PIPE,)
- >>> stdout, stderr = process.communicate()
- >>> process.returncode
- 0
- >>> print stderr
- INFO Creating lockfile: ...
- INFO Initializing Build Mass-Retry for 'The Hoary Hedgehog Release/RELEASE'
- INFO Processing builds in 'Failed to build'
- INFO Processing builds in 'Dependency wait'
- INFO Retrying i386 build of libstdc++ b8p in ubuntu hoary RELEASE (12)
- INFO Processing builds in 'Chroot problem'
- INFO Success.
- INFO Dry-run.
- DEBUG Removing lock file: ...
- <BLANKLINE>
+ >>> process = subprocess.Popen(
+ ... [sys.executable, script, "-v", "-NFDC", "-s", "hoary"],
+ ... stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ >>> stdout, stderr = process.communicate()
+ >>> process.returncode
+ 0
+ >>> print stderr
+ INFO Creating lockfile: ...
+ INFO Initializing Build Mass-Retry for
+ 'The Hoary Hedgehog Release/RELEASE'
+ INFO Processing builds in 'Failed to build'
+ INFO Processing builds in 'Dependency wait'
+ INFO Retrying i386 build of libstdc++ b8p in ubuntu hoary RELEASE (12)
+ INFO Processing builds in 'Chroot problem'
+ INFO Success.
+ INFO Dry-run.
+ DEBUG Removing lock file: ...
+ <BLANKLINE>
Superseded builds won't be retried; buildd-manager will just skip the build
and set it to SUPERSEDED.
- >>> from zope.security.proxy import removeSecurityProxy
- >>> from lp.soyuz.interfaces.binarypackagebuild import (
- ... IBinaryPackageBuildSet)
- >>> from lp.soyuz.enums import PackagePublishingStatus
- >>> build = getUtility(IBinaryPackageBuildSet).getByID(12)
- >>> pub = removeSecurityProxy(build.current_source_publication)
+ >>> from zope.security.proxy import removeSecurityProxy
+ >>> from lp.soyuz.interfaces.binarypackagebuild import (
+ ... IBinaryPackageBuildSet)
+ >>> from lp.soyuz.enums import PackagePublishingStatus
+ >>> build = getUtility(IBinaryPackageBuildSet).getByID(12)
+ >>> pub = removeSecurityProxy(build.current_source_publication)
Let's mark the build from the previous run superseded.
- >>> pub.status = PackagePublishingStatus.SUPERSEDED
- >>> print build.current_source_publication
- None
- >>> transaction.commit()
+ >>> pub.status = PackagePublishingStatus.SUPERSEDED
+ >>> print build.current_source_publication
+ None
+ >>> transaction.commit()
A new run doesn't pick it up.
- >>> process = subprocess.Popen([sys.executable, script, "-v", "-NFDC",
- ... "-s", "hoary"],
- ... stdout=subprocess.PIPE,
- ... stderr=subprocess.PIPE,)
- >>> stdout, stderr = process.communicate()
- >>> process.returncode
- 0
- >>> print stderr
- INFO Creating lockfile: ...
- INFO Initializing Build Mass-Retry for 'The Hoary Hedgehog Release/RELEASE'
- INFO Processing builds in 'Failed to build'
- INFO Processing builds in 'Dependency wait'
- DEBUG Skipping superseded i386 build of libstdc++ b8p in ubuntu hoary RELEASE (12)
- INFO Processing builds in 'Chroot problem'
- INFO Success.
- INFO Dry-run.
- DEBUG Removing lock file: ...
- <BLANKLINE>
+ >>> process = subprocess.Popen(
+ ... [sys.executable, script, "-v", "-NFDC", "-s", "hoary"],
+ ... stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ >>> stdout, stderr = process.communicate()
+ >>> process.returncode
+ 0
+ >>> print stderr
+ INFO Creating lockfile: ...
+ INFO Initializing Build Mass-Retry for
+ 'The Hoary Hedgehog Release/RELEASE'
+ INFO Processing builds in 'Failed to build'
+ INFO Processing builds in 'Dependency wait'
+ DEBUG Skipping superseded i386 build of libstdc++ b8p in
+ ubuntu hoary RELEASE (12)
+ INFO Processing builds in 'Chroot problem'
+ INFO Success.
+ INFO Dry-run.
+ DEBUG Removing lock file: ...
+ <BLANKLINE>
- >>> pub.status = PackagePublishingStatus.PUBLISHED
- >>> transaction.commit()
+ >>> pub.status = PackagePublishingStatus.PUBLISHED
+ >>> transaction.commit()
Passing an architecture, which contains no failed build records,
nothing is done:
- >>> process = subprocess.Popen([sys.executable, script, "-v", "-NFDC",
- ... "-s", "hoary", "-a", "hppa"],
- ... stdout=subprocess.PIPE,
- ... stderr=subprocess.PIPE,)
- >>> stdout, stderr = process.communicate()
- >>> process.returncode
- 0
- >>> print stderr
- INFO Creating lockfile: ...
- INFO Initializing Build Mass-Retry for 'The Hoary Hedgehog Release for hppa (hppa)/RELEASE'
- INFO Processing builds in 'Failed to build'
- INFO Processing builds in 'Dependency wait'
- INFO Processing builds in 'Chroot problem'
- INFO Success.
- INFO Dry-run.
- DEBUG Removing lock file: ...
- <BLANKLINE>
+ >>> process = subprocess.Popen(
+ ... [
+ ... sys.executable, script,
+ ... "-v", "-NFDC", "-s", "hoary", "-a", "hppa",
+ ... ],
+ ... stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ >>> stdout, stderr = process.communicate()
+ >>> process.returncode
+ 0
+ >>> print stderr
+ INFO Creating lockfile: ...
+ INFO Initializing Build Mass-Retry for
+ 'The Hoary Hedgehog Release for hppa (hppa)/RELEASE'
+ INFO Processing builds in 'Failed to build'
+ INFO Processing builds in 'Dependency wait'
+ INFO Processing builds in 'Chroot problem'
+ INFO Success.
+ INFO Dry-run.
+ DEBUG Removing lock file: ...
+ <BLANKLINE>
Selecting only a specific failed state:
- >>> process = subprocess.Popen([sys.executable, script, "-v", "-NF",
- ... "-s", "hoary"],
- ... stdout=subprocess.PIPE,
- ... stderr=subprocess.PIPE,)
- >>> stdout, stderr = process.communicate()
- >>> process.returncode
- 0
- >>> print stderr
- INFO Creating lockfile: ...
- INFO Initializing Build Mass-Retry for 'The Hoary Hedgehog Release/RELEASE'
- INFO Processing builds in 'Failed to build'
- INFO Success.
- INFO Dry-run.
- DEBUG Removing lock file: ...
- <BLANKLINE>
+ >>> process = subprocess.Popen(
+ ... [sys.executable, script, "-v", "-NF", "-s", "hoary"],
+ ... stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ >>> stdout, stderr = process.communicate()
+ >>> process.returncode
+ 0
+ >>> print stderr
+ INFO Creating lockfile: ...
+ INFO Initializing Build Mass-Retry for
+ 'The Hoary Hedgehog Release/RELEASE'
+ INFO Processing builds in 'Failed to build'
+ INFO Success.
+ INFO Dry-run.
+ DEBUG Removing lock file: ...
+ <BLANKLINE>
=== modified file 'lib/lp/soyuz/doc/gina.txt'
--- lib/lp/soyuz/doc/gina.txt 2011-09-09 04:21:15 +0000
+++ lib/lp/soyuz/doc/gina.txt 2011-09-26 07:13:14 +0000
@@ -110,8 +110,8 @@
And two completely broken packages:
* util-linux, a source package that is missing from the pool. It
- generates 4 binary packages, all missing. It's correctly listed in Sources
- and Packages, though.
+ generates 4 binary packages, all missing. It's correctly listed in
+ Sources and Packages, though.
* clearlooks, a source package with no binaries listed, and which has
- a DSC file that refers to an inexistant tar.gz.
+ a DSC file that refers to a nonexistent tar.gz.
@@ -186,7 +186,9 @@
>>> transaction.commit()
>>> fixture.cleanUp()
-=== Testing Source Package Results ===
+
+Testing Source Package Results
+..............................
We should have more source packages in the database:
@@ -292,8 +294,8 @@
were calculated directly on the files):
>>> from lp.soyuz.model.files import SourcePackageReleaseFile
- >>> files = SourcePackageReleaseFile.selectBy(sourcepackagereleaseID=cap.id,
- ... orderBy="libraryfile")
+ >>> files = SourcePackageReleaseFile.selectBy(
+ ... sourcepackagereleaseID=cap.id, orderBy="libraryfile")
>>> for f in files:
... print f.libraryfile.content.sha1
107d5478e72385f714523bad5359efedb5dcc8b2
@@ -321,7 +323,9 @@
>>> print db1.section.name
libs
-=== Testing Source Package Publishing ===
+
+Testing Source Package Publishing
+.................................
We check that the source package publishing override facility works:
@@ -337,28 +341,31 @@
successfully processed.
- We had 2 errors (out of 10 Sources stanzas) in hoary: mkvmlinuz and
- util-linux
+ util-linux.
- - We had 2 errors (out of 10 Sources stanzas) in breezy: python-sqllite and
- util-linux (again, poor thing)
+ - We had 2 errors (out of 10 Sources stanzas) in breezy: python-sqllite
+ and util-linux (again, poor thing).
>>> print SSPPH.select().count() - orig_sspph_count
21
- >>> print SSPPH.selectBy(
+ >>> new_count = SSPPH.selectBy(
... componentID=1,
- ... pocket=PackagePublishingPocket.RELEASE).count() - \
- ... orig_sspph_main_count
+ ... pocket=PackagePublishingPocket.RELEASE).count()
+ >>> print new_count - orig_sspph_main_count
21
-=== Testing Binary Package Results ===
+
+Testing Binary Package Results
+..............................
We have 26 binary packages in hoary. The 4 packages for util-linux fail, and 1
package fails for each of python-sqlite and python-pam. We should publish one
entry for each package listed in Releases.
-We have 23 binary packages in breezy. db1-compat, ed, the 3 libcap packages and
-python-pam is unchanged. python-sqlite fails. The 5 ubuntu-meta packages work.
+We have 23 binary packages in breezy. db1-compat, ed, the 3 libcap packages
+and python-pam are unchanged. python-sqlite fails. The 5 ubuntu-meta packages
+work.
>>> BinaryPackageRelease.select().count() - orig_bpr_count
40
@@ -456,11 +463,13 @@
XXX: test package with invalid source version
XXX: test package with maintainer with non-ascii name
-=== Testing People Created ===
-
-Ensure only one Kamion was created (he's an uploader on multiple packages), and
-that we imported exactly 9 people (13 packages with 3 being uploaded by Kamion,
-2 being uploaded by mdz and 2 by doko).
+
+Testing People Created
+......................
+
+Ensure only one Kamion was created (he's an uploader on multiple packages),
+and that we imported exactly 9 people (13 packages with 3 being uploaded by
+Kamion, 2 being uploaded by mdz and 2 by doko).
>>> from sqlobject import LIKE
>>> p = Person.selectOne(LIKE(Person.q.name, u"cjwatson%"))
@@ -474,7 +483,8 @@
13
-=== Re-run Gina ===
+Re-run Gina
+...........
The second run of gina uses a test archive that is a copy of the first
one, but with updated Packages and Sources files for breezy that do
@@ -560,9 +570,9 @@
>>> print SSPPH.select().count() - orig_sspph_count
23
-Check that the overrides we did were correctly issued. We can't use selectOneBy
-because, of course, there may be multiple rows published for that package --
-that's what overrides actually do.
+Check that the overrides we did were correctly issued. We can't use
+selectOneBy because, of course, there may be multiple rows published for that
+package -- that's what overrides actually do.
>>> from canonical.database.sqlbase import sqlvalues
>>> x11_pub = SSPPH.select("""
@@ -597,7 +607,8 @@
universe
-=== Partner archive import ===
+Partner archive import
+......................
Importing the partner archive requires overriding the component to
"partner", which also makes the archive on any publishing records the
@@ -654,7 +665,8 @@
set(['PARTNER'])
-=== Source-only imports ===
+Source-only imports
+...................
Gina has a 'source-only' configuration option which allows it to
import only sources from the configured archive.
@@ -743,7 +755,8 @@
True
-=== Processing multiple suites in the same batch ===
+Processing multiple suites in the same batch
+............................................
Both, 'lenny' and 'hoary' (as partner) will be processed in the same
batch.
@@ -764,7 +777,8 @@
0
-=== Other tests ===
+Other tests
+...........
For kicks, finally, run gina on a configured but incomplete archive:
@@ -777,7 +791,9 @@
>>> proc.wait()
1
-=== Wrap up ===
+
+Wrap up
+.......
Remove the tmp link to the gina_test_archive
>>> os.remove('/tmp/gina_test_archive')
=== modified file 'lib/lp/soyuz/model/publishing.py'
--- lib/lp/soyuz/model/publishing.py 2011-09-23 07:47:04 +0000
+++ lib/lp/soyuz/model/publishing.py 2011-09-26 07:13:14 +0000
@@ -688,7 +688,7 @@
return self.distroseries.distribution.getSourcePackageRelease(
self.supersededby)
- # XXX: StevenK 2011-09-13 bug=848563: This can die when
+ # XXX: StevenK 2011-09-13 bug=848563: This can die when
# self.sourcepackagename is populated.
@property
def source_package_name(self):
@@ -961,7 +961,7 @@
"""See `IBinaryPackagePublishingHistory`"""
return self.distroarchseries.distroseries
- # XXX: StevenK 2011-09-13 bug=848563: This can die when
+ # XXX: StevenK 2011-09-13 bug=848563: This can die when
# self.binarypackagename is populated.
@property
def binary_package_name(self):
=== modified file 'lib/lp/soyuz/scripts/packagecopier.py'
--- lib/lp/soyuz/scripts/packagecopier.py 2011-09-23 07:44:44 +0000
+++ lib/lp/soyuz/scripts/packagecopier.py 2011-09-26 07:13:14 +0000
@@ -53,8 +53,7 @@
)
from lp.soyuz.scripts.processaccepted import close_bugs_for_sourcepublication
-# XXX cprov 2009-06-12: This function could be incorporated in ILFA,
-# I just don't see a clear benefit in doing that right now.
+
def re_upload_file(libraryfile, restricted=False):
"""Re-upload a librarian file to the public server.
@@ -63,6 +62,9 @@
:return: A new `LibraryFileAlias`.
"""
+ # XXX cprov 2009-06-12: This function could be incorporated in ILFA.
+ # I just don't see a clear benefit in doing that right now.
+
# Open the the libraryfile for reading.
libraryfile.open()
=== modified file 'lib/lp/soyuz/scripts/tests/test_copypackage.py'
--- lib/lp/soyuz/scripts/tests/test_copypackage.py 2011-09-23 07:44:44 +0000
+++ lib/lp/soyuz/scripts/tests/test_copypackage.py 2011-09-26 07:13:14 +0000
@@ -263,7 +263,7 @@
# Create a brand new PPA.
archive = self.factory.makeArchive(
distribution=self.test_publisher.ubuntutest,
- purpose = ArchivePurpose.PPA)
+ purpose=ArchivePurpose.PPA)
# Make it private if necessary.
if private:
@@ -307,7 +307,7 @@
# files related to it will remain private.
public_archive = self.factory.makeArchive(
distribution=self.test_publisher.ubuntutest,
- purpose = ArchivePurpose.PPA)
+ purpose=ArchivePurpose.PPA)
public_source = private_source.copyTo(
private_source.distroseries, private_source.pocket,
public_archive)
@@ -382,7 +382,7 @@
# files related to it will remain private.
public_archive = self.factory.makeArchive(
distribution=self.test_publisher.ubuntutest,
- purpose = ArchivePurpose.PPA)
+ purpose=ArchivePurpose.PPA)
public_binary = private_binary.copyTo(
private_source.distroseries, private_source.pocket,
public_archive)[0]
@@ -419,7 +419,7 @@
# Copy The original source and binaries to a private PPA.
private_archive = self.factory.makeArchive(
distribution=self.test_publisher.ubuntutest,
- purpose = ArchivePurpose.PPA)
+ purpose=ArchivePurpose.PPA)
private_archive.buildd_secret = 'x'
private_archive.private = True
@@ -594,8 +594,8 @@
sources = []
for i in xrange(nb_of_sources):
source = self.test_publisher.getPubSource(
- version = u'%d' % self.factory.getUniqueInteger(),
- sourcename = u'name-%d' % self.factory.getUniqueInteger())
+ version=u'%d' % self.factory.getUniqueInteger(),
+ sourcename=u'name-%d' % self.factory.getUniqueInteger())
sources.append(source)
return sources
=== modified file 'lib/lp/soyuz/scripts/tests/test_processaccepted.py'
--- lib/lp/soyuz/scripts/tests/test_processaccepted.py 2011-09-05 16:47:08 +0000
+++ lib/lp/soyuz/scripts/tests/test_processaccepted.py 2011-09-26 07:13:14 +0000
@@ -126,5 +126,3 @@
for bug, bugtask in bugs:
self.assertEqual(BugTaskStatus.FIXRELEASED, bugtask.status)
-
-
=== modified file 'lib/lp/soyuz/stories/ppa/xx-ppa-workflow.txt'
--- lib/lp/soyuz/stories/ppa/xx-ppa-workflow.txt 2011-09-12 13:54:09 +0000
+++ lib/lp/soyuz/stories/ppa/xx-ppa-workflow.txt 2011-09-26 07:13:14 +0000
@@ -1,7 +1,8 @@
-= Personal Package Archive pages and work-flow =
-
-
-== Activating Personal Package Archives for Users ==
+Personal Package Archive pages and work-flow
+============================================
+
+Activating Personal Package Archives for Users
+----------------------------------------------
Personal Package Archives have to be activated before they can be
accessed, in this section we will cover the activation procedure for
@@ -10,11 +11,11 @@
A section named 'Personal Package Archives' is presented in the
user/team page.
- >>> anon_browser.open("http://launchpad.dev/~cprov")
+ >>> anon_browser.open("http://launchpad.dev/~cprov")
- >>> print_tag_with_id(anon_browser.contents, 'ppas')
- Personal package archives
- PPA for Celso Providelo
+ >>> print_tag_with_id(anon_browser.contents, 'ppas')
+ Personal package archives
+ PPA for Celso Providelo
There is a link in the body page pointing to Celso's PPA.
@@ -152,7 +153,8 @@
Required input is missing.
-== Activating Personal Package Archives for Teams ==
+Activating Personal Package Archives for Teams
+----------------------------------------------
Similarly to the user PPAs activation, team PPAs can be activated by
anyone with 'launchpad.Edit' permission in the team in question:
@@ -267,7 +269,8 @@
...
-== Activating someone else's Personal Package Archives ==
+Activating someone else's Personal Package Archives
+---------------------------------------------------
We also allow LP-admins to create Personal Package Archives in the
name of other users or teams:
@@ -466,7 +469,8 @@
Hack PPA : James Blackwell
-== Double submission ==
+Double submission
+-----------------
If two browser windows are open at the same time on the activation page
then when the second activation is clicked after already
@@ -521,7 +525,8 @@
boomppa
-== Activating an additional PPA ==
+Activating an additional PPA
+----------------------------
Users who already have a PPA may activate a second one. That's the case for
Celso.
@@ -682,7 +687,8 @@
Unauthorized: ...
-== Enabling or disabling of PPAs by the owner ==
+Enabling or disabling of PPAs by the owner
+------------------------------------------
Users with 'launchpad.Edit' permission for a PPA may disable or enable it.
They may also change whether the PPA is published to disk or not.
@@ -727,7 +733,8 @@
True
-== Deleting a PPA ==
+Deleting a PPA
+--------------
Users with launchpad.Edit permission see a "Delete PPA" link in the
navigation menu.
@@ -791,7 +798,7 @@
Traceback (most recent call last):
...
LinkNotFoundError
-
+
>>> print no_priv_browser.getLink("Edit PPA dependencies")
Traceback (most recent call last):
...
=== modified file 'lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt'
--- lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt 2011-09-23 07:44:44 +0000
+++ lib/lp/soyuz/stories/webservice/xx-source-package-publishing.txt 2011-09-26 07:13:14 +0000
@@ -45,7 +45,8 @@
>>> pubs = webservice.named_get(
... cprov_archive['self_link'], 'getPublishedSources',
- ... source_name="iceweasel", version="1.0", exact_match=True).jsonBody()
+ ... source_name="iceweasel", version="1.0",
+ ... exact_match=True).jsonBody()
>>> print_publications(pubs)
iceweasel 1.0 in warty
@@ -238,7 +239,6 @@
Create a private PPA for Celso with some binaries.
>>> from zope.component import getUtility
- >>> from lp.registry.interfaces.distribution import IDistributionSet
>>> from lp.registry.interfaces.person import IPersonSet
>>> from lp.soyuz.tests.test_publishing import (
... SoyuzTestPublisher)
=== modified file 'lib/lp/soyuz/tests/test_sourcepackagerelease.py'
--- lib/lp/soyuz/tests/test_sourcepackagerelease.py 2011-08-30 12:11:41 +0000
+++ lib/lp/soyuz/tests/test_sourcepackagerelease.py 2011-09-26 07:13:14 +0000
@@ -52,7 +52,6 @@
def test_uploader_recipe(self):
recipe_build = self.factory.makeSourcePackageRecipeBuild()
- recipe = recipe_build.recipe
spr = self.factory.makeSourcePackageRelease(
source_package_recipe_build=recipe_build)
self.assertEqual(recipe_build.requester, spr.uploader)
@@ -133,7 +132,7 @@
das, PackagePublishingPocket.RELEASE, parent_archive,
status=BuildStatus.FULLYBUILT)
bpr = self.factory.makeBinaryPackageRelease(build=orig_build)
- parent_binary_pub = self.factory.makeBinaryPackagePublishingHistory(
+ self.factory.makeBinaryPackagePublishingHistory(
binarypackagerelease=bpr, distroarchseries=das,
archive=parent_archive)
=== modified file 'lib/lp/testing/pgsql.py'
--- lib/lp/testing/pgsql.py 2011-09-06 04:09:46 +0000
+++ lib/lp/testing/pgsql.py 2011-09-26 07:13:14 +0000
@@ -44,7 +44,8 @@
try:
self.real_connection.close()
except psycopg2.InterfaceError:
- pass # Already closed, killed etc. Ignore.
+ # Already closed, killed etc. Ignore.
+ pass
def rollback(self, InterfaceError=psycopg2.InterfaceError):
# In our test suites, rollback ends up being called twice in some
@@ -87,9 +88,9 @@
class CursorWrapper:
"""A wrapper around cursor objects.
- Acts like a normal cursor object, except if CursorWrapper.record_sql is set,
- then queries that pass through CursorWrapper.execute will be appended to
- CursorWrapper.last_executed_sql. This is useful for tests that want to
+ Acts like a normal cursor object, except if CursorWrapper.record_sql is
+ set, then queries that pass through CursorWrapper.execute will be appended
+ to CursorWrapper.last_executed_sql. This is useful for tests that want to
ensure that certain SQL is generated.
"""
real_cursor = None
@@ -130,15 +131,19 @@
_org_connect = None
+
+
def fake_connect(*args, **kw):
return ConnectionWrapper(_org_connect(*args, **kw))
+
def installFakeConnect():
global _org_connect
assert _org_connect is None
_org_connect = psycopg2.connect
psycopg2.connect = fake_connect
+
def uninstallFakeConnect():
global _org_connect
assert _org_connect is not None
@@ -148,7 +153,8 @@
class PgTestSetup:
- connections = [] # Shared
+ # Shared:
+ connections = []
# Use a dynamically generated dbname:
dynamic = object()
@@ -267,9 +273,9 @@
return
self.dropDb()
- # Take out an external lock on the template to avoid causing contention
- # and impeding other processes (pg performs poorly when performing
- # concurrent create db from a single template).
+ # Take out an external lock on the template to avoid causing
+ # contention and impeding other processes (pg performs poorly
+ # when performing concurrent create db from a single template).
pid = os.getpid()
start = time.time()
# try for up to 10 seconds:
@@ -278,14 +284,16 @@
sys.stderr.write('%0.2f starting %s\n' % (start, pid,))
l = None
lockname = '/tmp/lp.createdb.%s' % (self.template,)
- # Wait for the external lock. Most LP tests use the DatabaseLayer which
- # does a double-indirect: it clones the launchpad_ftest_template into a
- # per-test runner template, so we don't have much template contention.
- # However there are a few tests in LP which do use template1 and will
- # contend a lot. Cloning template1 takes 0.2s on a modern machine, so
- # even a modest 8-way server will trivially backlog on db cloning.
- # The 30 second time is enough to deal with the backlog on the known
- # template1 using tests.
+ # Wait for the external lock. Most LP tests use the
+ # DatabaseLayer which does a double-indirect: it clones the
+ # launchpad_ftest_template into a per-test runner template, so
+ # we don't have much template contention.
+ # However there are a few tests in LP which do use template1 and
+ # will contend a lot. Cloning template1 takes 0.2s on a modern
+ # machine, so even a modest 8-way server will trivially backlog
+ # on db cloning.
+ # The 30 second time is enough to deal with the backlog on the
+ # known template1 using tests.
while time.time() - start < 30.0:
try:
if debug:
@@ -307,8 +315,9 @@
attempts = 10
for counter in range(0, attempts):
if debug:
- sys.stderr.write('%0.2f connecting %s %s\n' % (
- time.time(), pid, self.template))
+ sys.stderr.write(
+ "%0.2f connecting %s %s\n"
+ % (time.time(), pid, self.template))
con = self.superuser_connection(self.template)
try:
con.set_isolation_level(0)
@@ -325,8 +334,9 @@
# aborting badly.
atexit.register(self.dropDb)
if debug:
- sys.stderr.write('create db in %0.2fs\n' % (
- time.time()-_start,))
+ sys.stderr.write(
+ "create db in %0.2fs\n" % (
+ time.time() - _start))
break
except psycopg2.DatabaseError, x:
if counter == attempts - 1:
@@ -338,17 +348,19 @@
cur.close()
finally:
con.close()
- duration = (2**counter)*random.random()
+ duration = (2 ** counter) * random.random()
if debug:
sys.stderr.write(
'%0.2f busy:sleeping (%d retries) %s %s %s\n' % (
time.time(), counter, pid, self.template, duration))
- # Let the server wrap up whatever was blocking the copy of the template.
+ # Let the server wrap up whatever was blocking the copy
+ # of the template.
time.sleep(duration)
end = time.time()
if debug:
- sys.stderr.write('%0.2f (%0.2f) completed (%d retries) %s %s\n' % (
- end, end-start, counter, pid, self.template))
+ sys.stderr.write(
+ '%0.2f (%0.2f) completed (%d retries) %s %s\n'
+ % (end, end - start, counter, pid, self.template))
finally:
l.unlock()
if debug:
@@ -362,7 +374,8 @@
'''Close all outstanding connections and drop the database'''
for con in self.connections[:]:
if con.auto_close:
- con.close() # Removes itself from self.connections
+ # Removes itself from self.connections:
+ con.close()
if (ConnectionWrapper.committed and ConnectionWrapper.dirty):
PgTestSetup._reset_db = True
ConnectionWrapper.committed = False
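
As an aside on the retry loop above: the sleep of `(2 ** counter) * random.random()` between connection attempts is exponential backoff with full jitter. A minimal standalone sketch of that formula — illustrative only, not part of the diff; the function name `backoff_duration` and the injectable `rng` argument are invented here to make the behaviour testable:

```python
import random


def backoff_duration(counter, rng=random.random):
    """Sleep time before retry number `counter`: a random fraction of
    2 ** counter seconds.  The upper bound doubles with each attempt,
    while the jitter spreads contending processes apart so they do not
    all hammer the template database at the same moment."""
    return (2 ** counter) * rng()


# With a fixed rng the doubling upper bound is easy to see:
upper_bounds = [backoff_duration(c, rng=lambda: 1.0) for c in range(4)]
# 1.0, 2.0, 4.0, 8.0 seconds
```

With ten attempts, as in `PgTestSetup`, the final wait is bounded by about 2 ** 9 = 512 seconds, though in practice the random fraction keeps the expected total far lower.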
=== modified file 'lib/lp/testing/tests/test_pgsql.py'
--- lib/lp/testing/tests/test_pgsql.py 2011-09-05 15:42:27 +0000
+++ lib/lp/testing/tests/test_pgsql.py 2011-09-26 07:13:14 +0000
@@ -102,7 +102,8 @@
cur = con.cursor()
cur.execute('CREATE TABLE foo (x int)')
con.commit()
- ConnectionWrapper.committed = False # Leave the table
+ # Leave the table.
+ ConnectionWrapper.committed = False
finally:
fixture.tearDown()
@@ -177,7 +178,7 @@
sequence_values = []
# Insert a row into it and roll back the changes. Each time, we
# should end up with the same sequence value
- for i in range(1,3):
+ for i in range(1, 3):
fixture.setUp()
try:
con = fixture.connect()
=== modified file 'lib/lp/translations/browser/pofile.py'
--- lib/lp/translations/browser/pofile.py 2011-09-13 05:23:16 +0000
+++ lib/lp/translations/browser/pofile.py 2011-09-26 07:13:14 +0000
@@ -343,10 +343,11 @@
else:
groups.append(_(u"%s assigned by %s") % (
translator.translator.displayname, group.title))
+
# There are at most two translation groups, so just using 'and'
# is fine here.
- statement = (_(u"This translation is managed by ") +
- _(u" and ").join(groups))+"."
+ statement = _(u"This translation is managed by %s.") % (
+ u" and ".join(groups))
else:
statement = _(u"No translation group has been assigned.")
return statement
@@ -395,7 +396,8 @@
show_option_changed = (
old_show_option is not None and old_show_option != self.show)
if show_option_changed:
- force_start = True # start will be 0, by default
+ # Start will be 0 by default.
+ force_start = True
else:
force_start = False
return POFileBatchNavigator(self._getSelectedPOTMsgSets(),
@@ -793,7 +795,8 @@
show_option_changed = (
old_show_option is not None and old_show_option != self.show)
if show_option_changed:
- force_start = True # start will be 0, by default
+ # Start will be 0 by default.
+ force_start = True
else:
force_start = False
return POFileBatchNavigator(self._getSelectedPOTMsgSets(),
=== modified file 'lib/lp/translations/stories/translationgroups/xx-translationgroups.txt'
--- lib/lp/translations/stories/translationgroups/xx-translationgroups.txt 2011-09-20 01:33:04 +0000
+++ lib/lp/translations/stories/translationgroups/xx-translationgroups.txt 2011-09-26 07:13:14 +0000
@@ -1,4 +1,5 @@
-= Translation Groups =
+Translation Groups
+==================
Make sure we can actually display the Translation Groups page.
@@ -503,7 +504,8 @@
-= Appointing translators in a translation group =
+Appointing translators in a translation group
+---------------------------------------------
No translators have been appointed in the polyglot group so far.
@@ -619,7 +621,8 @@
LinkNotFoundError
-= Change a translator in a translation group =
+Change a translator in a translation group
+------------------------------------------
The system allows us to change the translator for a concrete language
=== modified file 'lib/lp/translations/tests/pofiletranslator.txt'
--- lib/lp/translations/tests/pofiletranslator.txt 2011-09-06 05:00:26 +0000
+++ lib/lp/translations/tests/pofiletranslator.txt 2011-09-26 07:13:14 +0000
@@ -86,7 +86,8 @@
... """, vars())
>>> mark_posubmission_id, date_touched = pofiletranslator(
... mark_id, pofile_id)
- >>> stub_posubmission_id, date_touched = pofiletranslator(stub_id, pofile_id)
+ >>> stub_posubmission_id, date_touched = pofiletranslator(
+ ... stub_id, pofile_id)
>>> mark_posubmission_id == stub_posubmission_id
False
=== modified file 'scripts/close-account.py'
--- scripts/close-account.py 2011-09-13 09:15:40 +0000
+++ scripts/close-account.py 2011-09-26 07:13:14 +0000
@@ -27,9 +27,9 @@
cur = con.cursor()
cur.execute("""
SELECT Person.id, Person.account, name, teamowner
- FROM Person LEFT OUTER JOIN EmailAddress
- ON Person.id = EmailAddress.person
- WHERE name=%(username)s or lower(email)=lower(%(username)s)
+ FROM Person
+ LEFT OUTER JOIN EmailAddress ON Person.id = EmailAddress.person
+ WHERE name = %(username)s OR lower(email) = lower(%(username)s)
""", vars())
try:
person_id, account_id, username, teamowner = cur.fetchone()
@@ -64,12 +64,20 @@
unknown_rationale = PersonCreationRationale.UNKNOWN.value
cur.execute("""
UPDATE Person
- SET displayname='Removed by request',
- name=%(new_name)s, language=NULL, account=NULL,
- homepage_content=NULL, icon=NULL, mugshot=NULL,
- hide_email_addresses=TRUE, registrant=NULL, logo=NULL,
- creation_rationale=%(unknown_rationale)s, creation_comment=NULL
- WHERE id=%(person_id)s
+ SET
+ displayname = 'Removed by request',
+ name=%(new_name)s,
+ language = NULL,
+ account = NULL,
+ homepage_content = NULL,
+ icon = NULL,
+ mugshot = NULL,
+ hide_email_addresses = TRUE,
+ registrant = NULL,
+ logo = NULL,
+ creation_rationale = %(unknown_rationale)s,
+ creation_comment = NULL
+ WHERE id = %(person_id)s
""", vars())
# Remove the Account. We don't set the status to deactivated,
@@ -87,7 +95,7 @@
# Reassign their bugs
table_notification('BugTask')
cur.execute("""
- UPDATE BugTask SET assignee=NULL WHERE assignee=%(person_id)s
+ UPDATE BugTask SET assignee = NULL WHERE assignee = %(person_id)s
""", vars())
# Reassign questions assigned to the user, and close all their questions
@@ -165,6 +173,7 @@
return True
+
def main():
parser = OptionParser(
'%prog [options] (username|email) [...]'
@@ -198,5 +207,6 @@
con.rollback()
return 1
+
if __name__ == '__main__':
sys.exit(main())
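
The `%(name)s` placeholders in the statements above are database-driver query parameters filled from `vars()`. The same named-parameter pattern, sketched with the stdlib `sqlite3` module (which spells placeholders `:name`) and an explicit dict instead of the `vars()` trick — a standalone illustration with an invented table, not part of the diff:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE person (id INTEGER, name TEXT)")
cur.execute("INSERT INTO person VALUES (1, 'some-user')")

# An explicit parameter dict replaces passing vars(); the driver
# substitutes and escapes the values, so the script never builds SQL
# strings by hand.
params = {"person_id": 1, "new_name": "removed123"}
cur.execute(
    "UPDATE person SET name = :new_name WHERE id = :person_id", params)
con.commit()

cur.execute("SELECT name FROM person WHERE id = 1")
row = cur.fetchone()  # ('removed123',)
```

Passing an explicit dict also makes it obvious which locals the query actually consumes, which is the readability problem with the `vars()` construct noted in the cover letter.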
=== modified file 'scripts/ftpmaster-tools/sync-source.py'
--- scripts/ftpmaster-tools/sync-source.py 2011-09-18 06:34:16 +0000
+++ scripts/ftpmaster-tools/sync-source.py 2011-09-26 07:13:14 +0000
@@ -558,6 +558,17 @@
current_sources, current_binaries)
+class Percentages:
+ """Helper to compute percentage ratios compared to a fixed total."""
+
+ def __init__(self, total):
+ self.total = total
+
+ def get_ratio(self, number):
+ """Return the ratio of `number` to `self.total`, as a percentage."""
+ return (float(number) / self.total) * 100
+
+
def do_diff(Sources, Suite, origin, arguments, current_binaries):
stat_us = 0
stat_cant_update = 0
@@ -622,22 +633,23 @@
% (pkg, dest_version, source_version))
if Options.all:
+ percentages = Percentages(stat_count)
print
print ("Out-of-date BUT modified: %3d (%.2f%%)"
- % (stat_cant_update, (float(stat_cant_update)/stat_count)*100))
+ % (stat_cant_update, percentages.get_ratio(stat_cant_update)))
print ("Updated: %3d (%.2f%%)"
- % (stat_updated, (float(stat_updated)/stat_count)*100))
+ % (stat_updated, percentages.get_ratio(stat_updated)))
print ("Ubuntu Specific: %3d (%.2f%%)"
- % (stat_us, (float(stat_us)/stat_count)*100))
+ % (stat_us, percentages.get_ratio(stat_us)))
print ("Up-to-date [Modified]: %3d (%.2f%%)"
- % (stat_uptodate_modified,
- (float(stat_uptodate_modified)/stat_count)*100))
+ % (stat_uptodate_modified, percentages.get_ratio(
+ stat_uptodate_modified)))
print ("Up-to-date: %3d (%.2f%%)"
- % (stat_uptodate, (float(stat_uptodate)/stat_count)*100))
+ % (stat_uptodate, percentages.get_ratio(stat_uptodate)))
print ("Blacklisted: %3d (%.2f%%)"
- % (stat_blacklisted, (float(stat_blacklisted)/stat_count)*100))
+ % (stat_blacklisted, percentages.get_ratio(stat_blacklisted)))
print ("Broken: %3d (%.2f%%)"
- % (stat_broken, (float(stat_broken)/stat_count)*100))
+ % (stat_broken, percentages.get_ratio(stat_broken)))
print " -----------"
print "Total: %s" % (stat_count)
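
The `Percentages` helper introduced in this file replaces each inline `(float(number) / stat_count) * 100` expression. Restated standalone for illustration — the class body mirrors the diff above, while the example totals are invented:

```python
class Percentages:
    """Helper to compute percentage ratios compared to a fixed total."""

    def __init__(self, total):
        self.total = total

    def get_ratio(self, number):
        """Return the ratio of `number` to `self.total`, as a percentage."""
        return (float(number) / self.total) * 100


# Every statistic in the do_diff report shares one divisor, stat_count,
# so a fixed-total helper fits; e.g. with a hypothetical 200 packages:
percentages = Percentages(200)
updated_pct = percentages.get_ratio(50)  # 25.0
broken_pct = percentages.get_ratio(3)    # 1.5
```

Centralising the divisor also means the `float()` coercion lives in one place, instead of being repeated (and potentially forgotten) in each of the seven print statements.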
=== modified file 'scripts/process-one-mail.py'
--- scripts/process-one-mail.py 2011-09-15 06:39:13 +0000
+++ scripts/process-one-mail.py 2011-09-26 07:13:14 +0000
@@ -9,18 +9,15 @@
import sys
-from zope.component.interfaces import ComponentLookupError
-
from canonical.config import config
-from lp.services.scripts.base import (
- LaunchpadScript, LaunchpadScriptFailure)
+from lp.services.scripts.base import LaunchpadScript
from lp.services.mail.incoming import (
handle_one_mail)
-from canonical.launchpad.interfaces.mailbox import IMailBox
from lp.services.mail.helpers import (
save_mail_to_librarian,
)
-from lp.services.mail.signedmessage import signed_message_from_string
+from lp.services.mail.signedmessage import signed_message_from_string
+
class ProcessMail(LaunchpadScript):
usage = """%prog [options] [MAIL_FILE]
=== modified file 'scripts/script-monitor-nagios.py'
--- scripts/script-monitor-nagios.py 2011-09-06 05:00:26 +0000
+++ scripts/script-monitor-nagios.py 2011-09-26 07:13:14 +0000
@@ -58,7 +58,8 @@
start_date = datetime.now() - timedelta(minutes=minutes_ago)
completed_from = strftime("%Y-%m-%d %H:%M:%S", start_date.timetuple())
- completed_to = strftime("%Y-%m-%d %H:%M:%S", datetime.now().timetuple())
+ completed_to = strftime(
+ "%Y-%m-%d %H:%M:%S", datetime.now().timetuple())
hosts_scripts = []
for arg in args:
@@ -95,8 +96,8 @@
print "All scripts ran as expected"
return 0
except Exception, e:
- # Squeeze the exception type and stringification of the exception value
- # on to one line.
+ # Squeeze the exception type and stringification of the exception
+ # value on to one line.
print "Unhandled exception: %s %r" % (e.__class__.__name__, str(e))
return 3
=== modified file 'scripts/script-monitor.py'
--- scripts/script-monitor.py 2011-09-06 05:00:26 +0000
+++ scripts/script-monitor.py 2011-09-26 07:13:14 +0000
@@ -46,7 +46,8 @@
start_date = datetime.now() - timedelta(minutes=minutes_ago)
completed_from = strftime("%Y-%m-%d %H:%M:%S", start_date.timetuple())
- completed_to = strftime("%Y-%m-%d %H:%M:%S", datetime.now().timetuple())
+ completed_to = strftime(
+ "%Y-%m-%d %H:%M:%S", datetime.now().timetuple())
hosts_scripts = []
for arg in args:
@@ -75,17 +76,19 @@
subj.append("%s:%s" % (hostname, scriptname))
error_found = 2
if error_found:
- # Construct our email
+ # Construct our email.
msg = MIMEText('\n'.join(msg))
msg['Subject'] = "Scripts failed to run: %s" % ", ".join(subj)
msg['From'] = 'script-failures@xxxxxxxxxxxxx'
msg['Reply-To'] = 'launchpad@xxxxxxxxxxxxxxxxxxx'
msg['To'] = 'launchpad@xxxxxxxxxxxxxxxxxxx'
- # Send out the email
+ # Send out the email.
smtp = smtplib.SMTP()
smtp.connect()
- smtp.sendmail('script-failures@xxxxxxxxxxxxx', ['launchpad@xxxxxxxxxxxxxxxxxxx'], msg.as_string())
+ smtp.sendmail(
+ 'script-failures@xxxxxxxxxxxxx',
+ ['launchpad@xxxxxxxxxxxxxxxxxxx'], msg.as_string())
smtp.close()
return 2
except:
=== modified file 'standard_test_template.js'
--- standard_test_template.js 2011-09-09 14:16:28 +0000
+++ standard_test_template.js 2011-09-26 07:13:14 +0000
@@ -22,7 +22,8 @@
_should: {
error: {
- test_config_undefined: true,
+ test_config_undefined: true
+ // Careful: no comma after last item or IE chokes.
}
},
@@ -52,7 +53,8 @@
test_config_undefined: function() {
// Verify an error is thrown if there is no config.
mynamespace.setup();
- },
+ }
+ // Careful: no comma after last item or IE chokes.
}));
=== modified file 'utilities/check-templates.sh'
--- utilities/check-templates.sh 2011-09-18 07:22:47 +0000
+++ utilities/check-templates.sh 2011-09-26 07:13:14 +0000
@@ -7,7 +7,7 @@
LPDIR=lib/canonical/launchpad
REGISTRY="lib/canonical/launchpad/zcml/*.zcml lib/canonical/*.zcml
- lib/canonical/launchpad/*.zcml *.zcml
+ lib/canonical/launchpad/*.zcml *.zcml
lib/zope/app/exception/browser/configure.zcml
lib/zope/app/debugskin/configure.zcml
lib/canonical/launchpad/webapp/*.zcml
@@ -15,7 +15,7 @@
MASTER_MACRO='metal:use-macro="context/@@main_template/master"'
-for f in $LPDIR/templates/*.pt; do
+for f in $LPDIR/templates/*.pt; do
base=`basename $f`
clean=`echo $base | cut -d. -f1 | tr - _`
if echo $base | grep -qa ^template-; then
=== modified file 'utilities/report-database-stats.py'
--- utilities/report-database-stats.py 2011-09-06 05:00:26 +0000
+++ utilities/report-database-stats.py 2011-09-26 07:13:14 +0000
@@ -229,10 +229,10 @@
parser.add_option(
"-i", "--interval", dest="interval", type=str,
default=None, metavar="INTERVAL",
- help=
+ help=(
"Use statistics collected over the last INTERVAL period. "
"INTERVAL is a string parsable by PostgreSQL "
- "such as '5 minutes'.")
+ "such as '5 minutes'."))
parser.add_option(
"-n", "--limit", dest="limit", type=int,
default=15, metavar="NUM",