yellow team mailing list archive
Message #00491
[Merge] lp:~frankban/launchpad/rocketfuel-setuplxc into lp:launchpad
Francesco Banconi has proposed merging lp:~frankban/launchpad/rocketfuel-setuplxc into lp:launchpad.
Requested reviews:
Launchpad Yellow Squad (yellow)
For more details, see:
https://code.launchpad.net/~frankban/launchpad/rocketfuel-setuplxc/+merge/94763
= Summary =
This is the first step of the *lpsetup* implementation.
It is basically a Python script created to eliminate code duplication between
*setuplxc* and *rocketfuel-setup/get/branch*.
PS: It's a brand new huge file, sorry about that.
== Proposed fix ==
Currently the script can be used to set up a Launchpad development or testing
environment on a real machine or inside an LXC container, with the ability to
update the codebase once installed.
The next steps will be, in no particular order:
- implement the lxc-update subcommand: update the codebase inside a running or
stopped LXC container
- implement the branch and lxc-branch subcommands: see rocketfuel-branch
- ability to specify a custom ssh key name for connecting to the LXC container
- create a real *lpsetup* project, splitting the code into separate files, etc.
- deb package?
- ec2 tests, tests, tests
== Implementation details ==
While *setuplxc* sets up the LXC container by connecting to it via ssh,
*lpsetup* uses a different approach: the script calls itself over an ssh
connection to the container. This is possible thanks to the script's ability
to be run with a specified set of *actions*, i.e. a subset of its internal
functions to be called. This way the LXC container can initialize itself and the code is reused.
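The self-invocation mechanism described above can be sketched roughly as follows (the container address, flag name, and action names are all hypothetical illustrations, not the actual lpsetup interface):

```python
import subprocess

# Registry of "actions": named internal functions that can be invoked
# selectively from the command line.
def initialize_host():
    return 'host initialized'

def setup_apt():
    return 'apt configured'

ACTIONS = {'initialize_host': initialize_host, 'setup_apt': setup_apt}

def run_actions(names):
    """Run the requested subset of internal functions, in order."""
    return [ACTIONS[name]() for name in names]

def build_ssh_command(container_ip, script_path, actions):
    """Build the command the script uses to call itself inside the LXC."""
    return (['ssh', 'root@' + container_ip, 'python', script_path, '-a'] +
            list(actions))

def run_in_container(container_ip, script_path, actions):
    # The script re-invokes itself over ssh, restricted to the given
    # actions, so the container initializes itself with the same code.
    subprocess.check_call(
        build_ssh_command(container_ip, script_path, actions))
```

The key point is that the same file runs on both sides of the ssh connection; only the selected actions differ.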
*lpsetup* improves on some of the functionality present in *setuplxc*:
- errors in executing external commands are caught and reported, and they
cause the script to stop running
- apt repositories are added in a gentler way, using *add-apt-repository*
(this lets us get rid of `apt-get install --allow-unauthenticated`
and needs to be back-ported to *setuplxc*)
- some helper objects were fixed and some others were added
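A minimal sketch of the error handling described in the first bullet, assuming a subprocess-based helper (the function name is illustrative, not the actual lpsetup API):

```python
import subprocess

def run(*args):
    """Run an external command, returning its output.

    If the command exits with a non-zero status, report the error and
    stop the script instead of silently continuing.
    """
    process = subprocess.Popen(
        args, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
        universal_newlines=True)
    stdout, stderr = process.communicate()
    if process.returncode:
        raise SystemExit(
            'Error running %r (exit status %d):\n%s' % (
                ' '.join(args), process.returncode, stderr.strip()))
    return stdout
```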
A thin subcommand management layer (`BaseSubCommand`) was implemented to
allow extensibility.
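A rough sketch of what such a thin subcommand layer might look like, built on argparse subparsers (everything except the `BaseSubCommand` name is hypothetical; note also that argparse only entered the stdlib in Python 2.7, so on lucid's 2.6 it would be an external dependency):

```python
import argparse

class BaseSubCommand(object):
    """Base class: subclasses declare a name, add options, and run."""

    name = None
    help = ''

    def add_arguments(self, parser):
        pass  # Subclasses add their own options here.

    def run(self, namespace):
        raise NotImplementedError

    def register(self, subparsers):
        # Hook this subcommand into the top-level parser.
        parser = subparsers.add_parser(self.name, help=self.help)
        self.add_arguments(parser)
        parser.set_defaults(func=self.run)

class InstallSubCommand(BaseSubCommand):
    name = 'install'
    help = 'Set up a Launchpad development environment.'

    def add_arguments(self, parser):
        parser.add_argument('-u', '--user', default='ubuntu')

    def run(self, namespace):
        return 'installing for %s' % namespace.user

def main(argv):
    parser = argparse.ArgumentParser(prog='lpsetup')
    subparsers = parser.add_subparsers()
    InstallSubCommand().register(subparsers)
    namespace = parser.parse_args(argv)
    return namespace.func(namespace)
```

Adding a new subcommand then only requires a new subclass and one `register` call.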
Because the script can be executed in lucid containers,
it is now compatible with Python 2.6.
== Tests ==
python -m doctest -v lpsetup.py
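For illustration, a helper inside lpsetup.py might carry its test inline as a doctest like this (the function is hypothetical; the command above runs all such docstring examples verbosely):

```python
def get_lxc_name(project, user):
    """Build an LXC container name from a project and a user name.

    >>> get_lxc_name('lpsetup', 'frankban')
    'lpsetup-frankban'
    """
    return '%s-%s' % (project, user)

if __name__ == '__main__':
    # Equivalent to `python -m doctest lpsetup.py` for this module.
    import doctest
    doctest.testmod()
```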
--
https://code.launchpad.net/~frankban/launchpad/rocketfuel-setuplxc/+merge/94763
Your team Launchpad Yellow Squad is requested to review the proposed merge of lp:~frankban/launchpad/rocketfuel-setuplxc into lp:launchpad.
=== modified file 'Makefile'
=== modified file 'buildout-templates/bin/combo-rootdir.in'
--- buildout-templates/bin/combo-rootdir.in 2012-02-23 12:15:01 +0000
+++ buildout-templates/bin/combo-rootdir.in 2012-02-27 13:14:12 +0000
@@ -30,7 +30,23 @@
cp -a $jsdir $BUILD_DIR/lp/$app
done
# ... and delete tests.
-find $BUILD_DIR/lp -name 'tests' -type d | xargs rm -rf
-
+<<<<<<< TREE
+find $BUILD_DIR/lp -name 'tests' -type d | xargs rm -rf
+
+=======
+find $BUILD_DIR/lp -name 'tests' -type d | xargs rm -rf
+
+# We have to move some modules around.
+cp lib/lp/contrib/javascript/lp.mustache.js $BUILD_DIR/lp
+rm $BUILD_DIR/lp/contrib/mustache.js
+rm $BUILD_DIR/lp/app/worlddata
+cp lib/lp/services/worlddata/javascript/languages.js $BUILD_DIR/lp
+cp lib/lp/app/longpoll/javascript/longpoll.js $BUILD_DIR/lp
+
+>>>>>>> MERGE-SOURCE
# Minify our JS.
+<<<<<<< TREE
bin/lpjsmin -p $BUILD_DIR/lp
+=======
+bin/py -m lp.scripts.utilities.js.jsmin_all $BUILD_DIR/lp
+>>>>>>> MERGE-SOURCE
=== modified file 'buildout.cfg'
=== modified file 'database/sampledata/current-dev.sql'
=== modified file 'database/sampledata/current.sql'
=== modified file 'database/schema/comments.sql'
=== modified file 'database/schema/security.cfg'
=== modified file 'lib/lp/answers/browser/questiontarget.py'
=== modified file 'lib/lp/app/browser/tales.py'
=== modified file 'lib/lp/app/templates/launchpad-notfound.pt'
=== modified file 'lib/lp/blueprints/browser/specificationtarget.py'
=== modified file 'lib/lp/bugs/browser/bugtarget.py'
=== modified file 'lib/lp/bugs/browser/tests/test_bugtask.py'
=== modified file 'lib/lp/bugs/javascript/tests/test_bugtarget_portlet_bugtags.html'
--- lib/lp/bugs/javascript/tests/test_bugtarget_portlet_bugtags.html 2012-02-13 15:18:05 +0000
+++ lib/lp/bugs/javascript/tests/test_bugtarget_portlet_bugtags.html 2012-02-27 13:14:12 +0000
@@ -7,6 +7,7 @@
<html>
<head>
+<<<<<<< TREE
<title>Test bugtarget_portlet_bugtags</title>
<!-- YUI and test setup -->
@@ -59,4 +60,37 @@
<a href="" id="show-fewer-tags-link" class="js-action hidden">Show fewer tags…</a>
</div>
</body>
+=======
+ <title>Launchpad bugtarget_portlet_bugtags</title>
+ <!-- YUI and test setup -->
+ <script type="text/javascript"
+ src="../../../../canonical/launchpad/icing/yui/yui/yui.js">
+ </script>
+ <link rel="stylesheet" href="../../../app/javascript/testing/test.css" />
+ <script type="text/javascript"
+ src="../../../app/javascript/testing/testrunner.js"></script>
+
+ <!-- Other dependencies -->
+ <script type="text/javascript" src="../../../app/javascript/lp.js"></script>
+ <script type="text/javascript" src="../../../app/javascript/client.js"></script>
+ <script type="text/javascript" src="../../../app/javascript/testing/mockio.js"></script>
+
+ <!-- The module under test -->
+ <script type="text/javascript" src="../bugtarget_portlet_bugtags.js"></script>
+
+ <!-- The test suite -->
+ <script type="text/javascript" src="test_bugtarget_portlet_bugtags.js"></script>
+ </head>
+ <body class="yui3-skin-sam">
+ <div class="portlet" id="portlet-tags">
+ <div id="tags-portlet-spinner"
+ class="hidden centered">
+ <img src="/@@/spinner" />
+ </div>
+ <a id="tags-content-link" href="/launchpad/+bugtarget-portlet-tags-content"></a>
+ <a href="" id="show-more-tags-link" class="js-action hidden">Show more tags…</a>
+ <a href="" id="show-fewer-tags-link" class="js-action hidden">Show fewer tags…</a>
+ </div>
+ </body>
+>>>>>>> MERGE-SOURCE
</html>
=== modified file 'lib/lp/bugs/model/bugtask.py'
=== modified file 'lib/lp/bugs/model/bugtasksearch.py'
=== modified file 'lib/lp/bugs/tests/test_bugtask_search.py'
=== modified file 'lib/lp/code/browser/branchlisting.py'
=== modified file 'lib/lp/code/doc/revision.txt'
--- lib/lp/code/doc/revision.txt 2012-02-15 17:29:54 +0000
+++ lib/lp/code/doc/revision.txt 2012-02-27 13:14:12 +0000
@@ -130,33 +130,19 @@
In particular, IBranch.getScannerData efficiently retrieves the BranchRevision
data needed by the branch-scanner script.
- >>> ancestry, history = branch.getScannerData()
-
-The first return value is a set of revision_id strings for the full ancestry
-of the branch.
-
- >>> for revision_id in sorted(ancestry):
- ... print revision_id
- foo@localhost-20051031165758-48acedf2b6a2e898
- foo@localhost-20051031170008-098959758bf79803
- foo@localhost-20051031170239-5fce7d6bd3f01efc
- foo@localhost-20051031170357-1301ad6d387feb23
- test@xxxxxxxxxxxxx-20051031165248-6f1bb97973c2b4f4
- test@xxxxxxxxxxxxx-20051031165338-5f2f3d6b10bb3bf0
- test@xxxxxxxxxxxxx-20051031165532-3113df343e494daa
- test@xxxxxxxxxxxxx-20051031165901-43b9644ec2eacc4e
-
-The second return value is a sequence of revision_id strings for the revision
-history of the branch.
-
- >>> for revision_id in history:
- ... print revision_id
- test@xxxxxxxxxxxxx-20051031165248-6f1bb97973c2b4f4
- test@xxxxxxxxxxxxx-20051031165338-5f2f3d6b10bb3bf0
- foo@localhost-20051031165758-48acedf2b6a2e898
- foo@localhost-20051031170008-098959758bf79803
- foo@localhost-20051031170239-5fce7d6bd3f01efc
- foo@localhost-20051031170357-1301ad6d387feb23
+ >>> history = branch.getScannerData()
+
+The return value is a sequence of tuples with revision numbers and
+revision_ids.
+
+ >>> for revno, revision_id in history:
+ ... print "%d, %s" % (revno, revision_id)
+ 6, foo@localhost-20051031170357-1301ad6d387feb23
+ 5, foo@localhost-20051031170239-5fce7d6bd3f01efc
+ 4, foo@localhost-20051031170008-098959758bf79803
+ 3, foo@localhost-20051031165758-48acedf2b6a2e898
+ 2, test@xxxxxxxxxxxxx-20051031165338-5f2f3d6b10bb3bf0
+ 1, test@xxxxxxxxxxxxx-20051031165248-6f1bb97973c2b4f4
=== Deleting BranchRevisions ===
=== modified file 'lib/lp/code/interfaces/branch.py'
--- lib/lp/code/interfaces/branch.py 2012-02-20 02:07:55 +0000
+++ lib/lp/code/interfaces/branch.py 2012-02-27 13:14:12 +0000
@@ -887,19 +887,14 @@
"""
def getScannerData():
- """Retrieve the full ancestry of a branch for the branch scanner.
+ """Retrieve the full history of a branch for the branch scanner.
The branch scanner script is the only place where we need to retrieve
- all the BranchRevision rows for a branch. Since the ancestry of some
+ all the BranchRevision rows for a branch. Since the history of some
branches is into the tens of thousands we don't want to materialise
BranchRevision instances for each of these.
- :return: tuple of three items.
- 1. Ancestry set of bzr revision-ids.
- 2. History list of bzr revision-ids. Similar to the result of
- bzrlib.Branch.revision_history().
- 3. Dictionnary mapping bzr bzr revision-ids to the database ids of
- the corresponding BranchRevision rows for this branch.
+ :return: Iterator over bzr revision-ids in history, newest first.
"""
def getInternalBzrUrl():
=== modified file 'lib/lp/code/model/branch.py'
--- lib/lp/code/model/branch.py 2012-02-27 04:29:39 +0000
+++ lib/lp/code/model/branch.py 2012-02-27 13:14:12 +0000
@@ -970,19 +970,12 @@
def getScannerData(self):
"""See `IBranch`."""
- columns = (BranchRevision.sequence, Revision.revision_id)
rows = Store.of(self).using(Revision, BranchRevision).find(
- columns,
+ (BranchRevision.sequence, Revision.revision_id),
Revision.id == BranchRevision.revision_id,
- BranchRevision.branch_id == self.id)
- rows = rows.order_by(BranchRevision.sequence)
- ancestry = set()
- history = []
- for sequence, revision_id in rows:
- ancestry.add(revision_id)
- if sequence is not None:
- history.append(revision_id)
- return ancestry, history
+ BranchRevision.branch_id == self.id,
+ BranchRevision.sequence != None)
+ return rows.order_by(Desc(BranchRevision.sequence))
def getPullURL(self):
"""See `IBranch`."""
=== modified file 'lib/lp/code/model/branchjob.py'
--- lib/lp/code/model/branchjob.py 2012-02-21 19:13:45 +0000
+++ lib/lp/code/model/branchjob.py 2012-02-27 13:14:12 +0000
@@ -464,6 +464,7 @@
super(RevisionsAddedJob, self).__init__(context)
self._bzr_branch = None
self._tree_cache = {}
+ self._author_display_cache = {}
@property
def bzr_branch(self):
@@ -485,16 +486,20 @@
def iterAddedMainline(self):
"""Iterate through revisions added to the mainline."""
- repository = self.bzr_branch.repository
- added_revisions = repository.get_graph().find_unique_ancestors(
- self.last_revision_id, [self.last_scanned_id])
+ graph = self.bzr_branch.repository.get_graph()
+ branch_last_revinfo = self.bzr_branch.last_revision_info()
+ # Find the revision number matching self.last_revision_id.
+ last_revno = graph.find_distance_to_null(
+ self.last_revision_id,
+ [(branch_last_revinfo[1], branch_last_revinfo[0])])
+ added_revisions = graph.find_unique_ancestors(
+ self.last_revision_id, [self.last_scanned_id, NULL_REVISION])
# Avoid hitting the database since bzrlib makes it easy to check.
- # There are possibly more efficient ways to get the mainline
- # revisions, but this is simple and it works.
- history = self.bzr_branch.revision_history()
- for num, revid in enumerate(history):
+ history = graph.iter_lefthand_ancestry(
+ self.last_revision_id, [NULL_REVISION, self.last_scanned_id, None])
+ for distance, revid in enumerate(history):
if revid in added_revisions:
- yield repository.get_revision(revid), num + 1
+ yield revid, last_revno - distance
def generateDiffs(self):
"""Determine whether to generate diffs."""
@@ -515,8 +520,11 @@
self.bzr_branch.lock_read()
try:
- for revision, revno in self.iterAddedMainline():
+ for revision_id, revno in reversed(
+ list(self.iterAddedMainline())):
assert revno is not None
+ revision = self.bzr_branch.repository.get_revision(
+ revision_id)
mailer = self.getMailerForRevision(
revision, revno, self.generateDiffs())
mailer.sendAll()
@@ -644,6 +652,27 @@
[proposal for proposal, date_created in proposals.itervalues()],
key=operator.attrgetter('date_created'), reverse=True)
+ def getAuthorDisplayName(self, author):
+ """Get the display name for an author.
+
+ This caches the result.
+
+ :param author: Author text as found in the bzr revision
+ :return: Prettified author name
+ """
+ try:
+ return self._author_display_cache[author]
+ except KeyError:
+ pass
+ revision_set = RevisionSet()
+ rev_author = revision_set.acquireRevisionAuthor(author)
+ if rev_author.person is None:
+ displayname = rev_author.name
+ else:
+ displayname = rev_author.person.unique_displayname
+ self._author_display_cache[author] = displayname
+ return displayname
+
def getRevisionMessage(self, revision_id, revno):
"""Return the log message for a revision.
@@ -655,7 +684,9 @@
try:
graph = self.bzr_branch.repository.get_graph()
merged_revisions = self.getMergedRevisionIDs(revision_id, graph)
+ outf = StringIO()
authors = self.getAuthors(merged_revisions, graph)
+<<<<<<< TREE
revision_set = RevisionSet()
rev_authors = revision_set.acquireRevisionAuthors(authors)
outf = StringIO()
@@ -666,6 +697,11 @@
else:
displayname = rev_author.person.unique_displayname
pretty_authors.append(' %s' % displayname)
+=======
+ pretty_authors = list(set([
+ ' %s' % self.getAuthorDisplayName(author) for author in
+ authors]))
+>>>>>>> MERGE-SOURCE
if len(pretty_authors) > 0:
outf.write('Merge authors:\n')
=== modified file 'lib/lp/code/model/tests/test_branchjob.py'
--- lib/lp/code/model/tests/test_branchjob.py 2012-02-21 19:13:45 +0000
+++ lib/lp/code/model/tests/test_branchjob.py 2012-02-27 13:14:12 +0000
@@ -24,6 +24,7 @@
import pytz
from sqlobject import SQLObjectNotFound
from storm.locals import Store
+from testtools.matchers import Equals
import transaction
from zope.component import getUtility
from zope.security.proxy import removeSecurityProxy
@@ -72,7 +73,10 @@
from lp.services.job.runner import JobRunner
from lp.services.osutils import override_environ
from lp.services.webapp import canonical_url
-from lp.testing import TestCaseWithFactory
+from lp.testing import (
+ StormStatementRecorder,
+ TestCaseWithFactory,
+ )
from lp.testing.dbuser import (
dbuser,
switch_dbuser,
@@ -83,6 +87,7 @@
)
from lp.testing.librarianhelpers import get_newest_librarian_file
from lp.testing.mail_helpers import pop_notifications
+from lp.testing.matchers import HasQueryCount
from lp.translations.enums import RosettaImportStatus
from lp.translations.interfaces.translationimportqueue import (
ITranslationImportQueue,
@@ -378,16 +383,13 @@
job = RevisionsAddedJob.create(branch, 'rev1', 'rev2', '')
self.assertEqual([job], list(RevisionsAddedJob.iterReady()))
- def updateDBRevisions(self, branch, bzr_branch, revision_ids=None):
+ def updateDBRevisions(self, branch, bzr_branch, revision_ids):
"""Update the database for the revisions.
:param branch: The database branch associated with the revisions.
:param bzr_branch: The Bazaar branch associated with the revisions.
- :param revision_ids: The ids of the revisions to update. If not
- supplied, the branch revision history is used.
+ :param revision_ids: The ids of the revisions to update.
"""
- if revision_ids is None:
- revision_ids = bzr_branch.revision_history()
for bzr_revision in bzr_branch.repository.get_revisions(revision_ids):
existing = branch.getBranchRevision(
revision_id=bzr_revision.revision_id)
@@ -425,6 +427,21 @@
tree.unlock()
return branch, tree
+ def test_getAuthorDisplayName(self):
+ """getAuthorDisplayName lookups the display name for an author."""
+ self.useBzrBranches(direct_database=True)
+ branch, tree = self.create3CommitsBranch()
+ job = RevisionsAddedJob.create(branch, 'rev1', 'rev2', '')
+ job.bzr_branch.lock_read()
+ self.addCleanup(job.bzr_branch.unlock)
+ self.assertEquals("Somebody <foo@xxxxxxxxxxx>",
+ job.getAuthorDisplayName("Somebody <foo@xxxxxxxxxxx>"))
+ # The next call should be cached
+ with StormStatementRecorder() as recorder:
+ self.assertEquals("Somebody <foo@xxxxxxxxxxx>",
+ job.getAuthorDisplayName("Somebody <foo@xxxxxxxxxxx>"))
+ self.assertThat(recorder, HasQueryCount(Equals(0)))
+
def test_iterAddedMainline(self):
"""iterAddedMainline iterates through mainline revisions."""
self.useBzrBranches(direct_database=True)
@@ -432,7 +449,7 @@
job = RevisionsAddedJob.create(branch, 'rev1', 'rev2', '')
job.bzr_branch.lock_read()
self.addCleanup(job.bzr_branch.unlock)
- [(revision, revno)] = list(job.iterAddedMainline())
+ [(revision_id, revno)] = list(job.iterAddedMainline())
self.assertEqual(2, revno)
def test_iterAddedNonMainline(self):
@@ -446,24 +463,24 @@
with override_environ(BZR_EMAIL='me@xxxxxxxxxxx'):
tree.commit('rev3a', rev_id='rev3a')
self.updateDBRevisions(branch, tree.branch, ['rev3', 'rev3a'])
- job = RevisionsAddedJob.create(branch, 'rev1', 'rev3', '')
+ job = RevisionsAddedJob.create(branch, 'rev1', 'rev2', '')
job.bzr_branch.lock_read()
self.addCleanup(job.bzr_branch.unlock)
- out = [x.revision_id for x, y in job.iterAddedMainline()]
+ out = [x for x, y in job.iterAddedMainline()]
self.assertEqual(['rev2'], out)
def test_iterAddedMainline_order(self):
- """iterAddedMainline iterates in commit order."""
+ """iterAddedMainline iterates in reverse commit order."""
self.useBzrBranches(direct_database=True)
branch, tree = self.create3CommitsBranch()
job = RevisionsAddedJob.create(branch, 'rev1', 'rev3', '')
job.bzr_branch.lock_read()
self.addCleanup(job.bzr_branch.unlock)
# Since we've gone from rev1 to rev3, we've added rev2 and rev3.
- [(rev2, revno2), (rev3, revno3)] = list(job.iterAddedMainline())
- self.assertEqual('rev2', rev2.revision_id)
+ [(rev3, revno3), (rev2, revno2)] = list(job.iterAddedMainline())
+ self.assertEqual('rev2', rev2)
self.assertEqual(2, revno2)
- self.assertEqual('rev3', rev3.revision_id)
+ self.assertEqual('rev3', rev3)
self.assertEqual(3, revno3)
def makeBranchWithCommit(self):
@@ -775,7 +792,8 @@
committer="Joe Bloggs <joe@xxxxxxxxxxx>",
timestamp=1000100000.0, timezone=0)
switch_dbuser('branchscanner')
- self.updateDBRevisions(db_branch, tree.branch)
+ self.updateDBRevisions(
+ db_branch, tree.branch, [first_revision, second_revision])
expected = (
u"-" * 60 + '\n'
"revno: 1" '\n'
@@ -818,7 +836,8 @@
committer=u"Non ASCII: \xed", timestamp=1000000000.0,
timezone=0)
switch_dbuser('branchscanner')
- self.updateDBRevisions(db_branch, tree.branch)
+ self.updateDBRevisions(
+ db_branch, tree.branch, [rev_id])
job = RevisionsAddedJob.create(db_branch, '', '', '')
message = job.getRevisionMessage(rev_id, 1)
# The revision message must be a unicode object.
=== modified file 'lib/lp/codehosting/codeimport/tests/test_worker.py'
--- lib/lp/codehosting/codeimport/tests/test_worker.py 2012-02-15 17:29:54 +0000
+++ lib/lp/codehosting/codeimport/tests/test_worker.py 2012-02-27 13:14:12 +0000
@@ -185,7 +185,7 @@
store = self.makeBranchStore()
bzr_branch = store.pull(
self.arbitrary_branch_id, self.temp_dir, default_format)
- self.assertEqual([], bzr_branch.revision_history())
+ self.assertEqual((0, 'null:'), bzr_branch.last_revision_info())
def test_getNewBranch_without_tree(self):
# If pull() with needs_tree=False creates a new branch, it doesn't
@@ -657,7 +657,7 @@
# import.
worker = self.makeImportWorker()
bzr_branch = worker.getBazaarBranch()
- self.assertEqual([], bzr_branch.revision_history())
+ self.assertEqual((0, 'null:'), bzr_branch.last_revision_info())
def test_bazaarBranchLocation(self):
# getBazaarBranch makes the working tree under the current working
@@ -844,7 +844,7 @@
worker.run()
branch = self.getStoredBazaarBranch(worker)
self.assertEqual(
- self.foreign_commit_count, len(branch.revision_history()))
+ self.foreign_commit_count, branch.revno())
def test_sync(self):
# Do an import.
@@ -854,7 +854,7 @@
worker.run()
branch = self.getStoredBazaarBranch(worker)
self.assertEqual(
- self.foreign_commit_count, len(branch.revision_history()))
+ self.foreign_commit_count, branch.revno())
# Change the remote branch.
self.makeForeignCommit(worker.source_details)
@@ -865,7 +865,7 @@
# Check that the new revisions are in the Bazaar branch.
branch = self.getStoredBazaarBranch(worker)
self.assertEqual(
- self.foreign_commit_count, len(branch.revision_history()))
+ self.foreign_commit_count, branch.revno())
def test_import_script(self):
# Like test_import, but using the code-import-worker.py script
@@ -902,7 +902,7 @@
branch = Branch.open(branch_url)
self.assertEqual(
- self.foreign_commit_count, len(branch.revision_history()))
+ self.foreign_commit_count, branch.revno())
def test_script_exit_codes(self):
# After a successful import that imports revisions, the worker exits
@@ -1381,8 +1381,7 @@
self.assertEqual(
CodeImportWorkerExitCode.SUCCESS, worker.run())
branch = self.getStoredBazaarBranch(worker)
- self.assertEqual(
- 1, len(branch.revision_history()))
+ self.assertEqual(1, branch.revno())
self.assertEqual(
"Some Random Hacker <jane@xxxxxxxxxxx>",
branch.repository.get_revision(branch.last_revision()).committer)
=== modified file 'lib/lp/codehosting/codeimport/tests/test_workermonitor.py'
--- lib/lp/codehosting/codeimport/tests/test_workermonitor.py 2012-02-15 17:29:54 +0000
+++ lib/lp/codehosting/codeimport/tests/test_workermonitor.py 2012-02-27 13:14:12 +0000
@@ -803,8 +803,7 @@
url = get_default_bazaar_branch_store()._getMirrorURL(
code_import.branch.id)
branch = Branch.open(url)
- self.assertEqual(
- self.foreign_commit_count, len(branch.revision_history()))
+ self.assertEqual(self.foreign_commit_count, branch.revno())
def assertImported(self, ignored, code_import_id):
"""Assert that the `CodeImport` of the given id was imported."""
=== modified file 'lib/lp/codehosting/scanner/bzrsync.py'
--- lib/lp/codehosting/scanner/bzrsync.py 2012-02-21 19:13:45 +0000
+++ lib/lp/codehosting/scanner/bzrsync.py 2012-02-27 13:14:12 +0000
@@ -16,6 +16,7 @@
import logging
+from bzrlib.errors import NoSuchRevision
from bzrlib.graph import DictParentsProvider
from bzrlib.revision import NULL_REVISION
import pytz
@@ -85,16 +86,16 @@
# Get the history and ancestry from the branch first, to fail early
# if something is wrong with the branch.
self.logger.info("Retrieving history from bzrlib.")
- bzr_history = bzr_branch.revision_history()
+ last_revision_info = bzr_branch.last_revision_info()
# The BranchRevision, Revision and RevisionParent tables are only
# written to by the branch-scanner, so they are not subject to
# write-lock contention. Update them all in a single transaction to
# improve the performance and allow garbage collection in the future.
- db_ancestry, db_history = self.retrieveDatabaseAncestry()
+ db_history = self.retrieveDatabaseAncestry()
(new_ancestry, branchrevisions_to_delete,
revids_to_insert) = self.planDatabaseChanges(
- bzr_branch, bzr_history, db_ancestry, db_history)
+ bzr_branch, last_revision_info, db_history)
new_db_revs = (
new_ancestry - getUtility(IRevisionSet).onlyPresent(new_ancestry))
self.logger.info("Adding %s new revisions.", len(new_db_revs))
@@ -110,7 +111,7 @@
# Notify any listeners that the tip of the branch has changed, but
# before we've actually updated the database branch.
- initial_scan = (len(db_history) == 0)
+ initial_scan = (self.db_branch.last_scanned_id is None)
notify(events.TipChanged(self.db_branch, bzr_branch, initial_scan))
# The Branch table is modified by other systems, including the web UI,
@@ -120,7 +121,7 @@
# not been updated. Since this has no ill-effect, and can only err on
# the pessimistic side (tell the user the data has not yet been
# updated although it has), the race is acceptable.
- self.updateBranchStatus(bzr_history)
+ self.updateBranchStatus(last_revision_info)
notify(
events.ScanCompleted(
self.db_branch, bzr_branch, self.logger, new_ancestry))
@@ -129,8 +130,7 @@
def retrieveDatabaseAncestry(self):
"""Efficiently retrieve ancestry from the database."""
self.logger.info("Retrieving ancestry from database.")
- db_ancestry, db_history = self.db_branch.getScannerData()
- return db_ancestry, db_history
+ return self.db_branch.getScannerData()
def _getRevisionGraph(self, bzr_branch, db_last):
if bzr_branch.repository.has_revision(db_last):
@@ -150,42 +150,32 @@
return bzr_branch.repository.get_graph(PPSource)
- def getAncestryDelta(self, bzr_branch):
- bzr_last = bzr_branch.last_revision()
- db_last = self.db_branch.last_scanned_id
- if db_last is None:
- added_ancestry = set(bzr_branch.repository.get_ancestry(bzr_last))
- added_ancestry.discard(None)
- removed_ancestry = set()
- else:
- graph = self._getRevisionGraph(bzr_branch, db_last)
- added_ancestry, removed_ancestry = (
- graph.find_difference(bzr_last, db_last))
- added_ancestry.discard(NULL_REVISION)
+ def getAncestryDelta(self, bzr_branch, bzr_last_revinfo, graph, db_last):
+ added_ancestry, removed_ancestry = graph.find_difference(
+ bzr_last_revinfo[1], db_last)
+ added_ancestry.discard(NULL_REVISION)
return added_ancestry, removed_ancestry
- def getHistoryDelta(self, bzr_history, db_history):
+ def getHistoryDelta(self, bzr_branch, bzr_last_revinfo, graph, db_history):
self.logger.info("Calculating history delta.")
- common_len = min(len(bzr_history), len(db_history))
- while common_len > 0:
- # The outer conditional improves efficiency. Without it, the
- # algorithm is O(history-size * change-size), which can be
- # excessive if a long branch is replaced by another long branch
- # with a distant (or no) common mainline parent. The inner
- # conditional is needed for correctness with branches where the
- # history does not follow the line of leftmost parents.
- if db_history[common_len - 1] == bzr_history[common_len - 1]:
- if db_history[:common_len] == bzr_history[:common_len]:
+ removed_history = []
+ common_revid = NULL_REVISION
+ for (revno, revid) in db_history:
+ try:
+ if bzr_branch.get_rev_id(revno) == revid:
+ common_revid = revid
+ common_len = revno
break
- common_len -= 1
+ except NoSuchRevision:
+ pass
+ removed_history.append(revid)
# Revision added or removed from the branch's history. These lists may
# include revisions whose history position has merely changed.
- removed_history = db_history[common_len:]
- added_history = bzr_history[common_len:]
+ added_history = list(graph.iter_lefthand_ancestry(bzr_last_revinfo[1],
+ (common_revid, )))
return added_history, removed_history
- def planDatabaseChanges(self, bzr_branch, bzr_history, db_ancestry,
- db_history):
+ def planDatabaseChanges(self, bzr_branch, bzr_last_revinfo, db_history):
"""Plan database changes to synchronize with bzrlib data.
Use the data retrieved by `retrieveDatabaseAncestry` and
@@ -193,9 +183,18 @@
"""
self.logger.info("Planning changes.")
# Find the length of the common history.
- added_history, removed_history = self.getHistoryDelta(
- bzr_history, db_history)
- added_ancestry, removed_ancestry = self.getAncestryDelta(bzr_branch)
+ db_last = self.db_branch.last_scanned_id
+ if db_last is None:
+ db_last = NULL_REVISION
+ graph = self._getRevisionGraph(bzr_branch, db_last)
+ bzr_branch.lock_read()
+ try:
+ added_history, removed_history = self.getHistoryDelta(
+ bzr_branch, bzr_last_revinfo, graph, db_history)
+ added_ancestry, removed_ancestry = self.getAncestryDelta(
+ bzr_branch, bzr_last_revinfo, graph, db_last)
+ finally:
+ bzr_branch.unlock()
notify(
events.RevisionsRemoved(
@@ -210,10 +209,9 @@
# We must insert BranchRevision rows for all revisions which were
# added to the ancestry or whose sequence value has changed.
- last_revno = len(bzr_history)
revids_to_insert = dict(
self.revisionsToInsert(
- added_history, last_revno, added_ancestry))
+ added_history, bzr_last_revinfo[0], added_ancestry))
# We must remove any stray BranchRevisions that happen to already be
# present.
existing_branchrevisions = Store.of(self.db_branch).find(
@@ -231,8 +229,8 @@
:param revisions: the set of Bazaar revision IDs to return bzrlib
Revision objects for.
"""
- revisions = bzr_branch.repository.get_parent_map(revisions)
- return bzr_branch.repository.get_revisions(revisions.keys())
+ revisions = list(bzr_branch.repository.has_revisions(revisions))
+ return bzr_branch.repository.get_revisions(revisions)
def syncRevisions(self, bzr_branch, bzr_revisions, revids_to_insert):
"""Import the supplied revisions.
@@ -257,13 +255,13 @@
"""Calculate the revisions to insert and their revnos.
:param added_history: A list of revision ids added to the revision
- history in parent-to-child order.
+ history in child-to-parent order.
:param last_revno: The revno of the last revision.
:param added_ancestry: A set of revisions that have been added to the
ancestry of the branch. May overlap with added_history.
"""
start_revno = last_revno - len(added_history) + 1
- for (revno, revision_id) in enumerate(added_history, start_revno):
+ for (revno, revision_id) in enumerate(reversed(added_history), start_revno):
yield revision_id, revno
for revision_id in added_ancestry.difference(added_history):
yield revision_id, None
@@ -290,12 +288,10 @@
for revid_seq_pair_chunk in iter_list_chunks(revid_seq_pairs, 10000):
self.db_branch.createBranchRevisionFromIDs(revid_seq_pair_chunk)
- def updateBranchStatus(self, bzr_history):
+ def updateBranchStatus(self, (revision_count, last_revision)):
"""Update the branch-scanner status in the database Branch table."""
# Record that the branch has been updated.
- revision_count = len(bzr_history)
if revision_count > 0:
- last_revision = bzr_history[-1]
revision = getUtility(IRevisionSet).getByRevisionId(last_revision)
else:
revision = None
=== modified file 'lib/lp/codehosting/scanner/tests/test_bzrsync.py'
--- lib/lp/codehosting/scanner/tests/test_bzrsync.py 2012-02-22 13:32:09 +0000
+++ lib/lp/codehosting/scanner/tests/test_bzrsync.py 2012-02-27 13:14:12 +0000
@@ -17,6 +17,7 @@
from bzrlib.tests import TestCaseWithTransport
from bzrlib.uncommit import uncommit
import pytz
+from storm.expr import Desc
from storm.locals import Store
from twisted.python.util import mergeFunctionMetadata
from zope.component import getUtility
@@ -428,9 +429,10 @@
else:
bzr_branch.set_last_revision_info(revno, bzr_rev)
delta_branch = bzr_branch
- return sync.getAncestryDelta(delta_branch)
+ return sync.getAncestryDelta(
+ delta_branch, delta_branch.last_revision_info(), graph, db_rev)
- added_ancestry, removed_ancestry = get_delta('merge', None)
+ added_ancestry, removed_ancestry = get_delta('merge', NULL_REVISION)
# All revisions are new for an unscanned branch
self.assertEqual(
set(['base', 'trunk', 'branch', 'merge']), added_ancestry)
@@ -471,8 +473,11 @@
# yield each revision along with a sequence number, starting at 1.
self.commitRevision(rev_id='rev-1')
bzrsync = self.makeBzrSync(self.db_branch)
- bzr_history = self.bzr_branch.revision_history()
- added_ancestry = bzrsync.getAncestryDelta(self.bzr_branch)[0]
+ bzr_history = ['rev-1']
+ graph = bzrsync._getRevisionGraph(self.bzr_branch, 'rev-1')
+ added_ancestry = bzrsync.getAncestryDelta(
+ self.bzr_branch, self.bzr_branch.last_revision_info(),
+ graph, 'rev-1')[0]
result = bzrsync.revisionsToInsert(
bzr_history, self.bzr_branch.revno(), added_ancestry)
self.assertEqual({'rev-1': 1}, dict(result))
@@ -483,8 +488,12 @@
(db_branch, bzr_tree), ignored = self.makeBranchWithMerge(
'base', 'trunk', 'branch', 'merge')
bzrsync = self.makeBzrSync(db_branch)
- bzr_history = bzr_tree.branch.revision_history()
- added_ancestry = bzrsync.getAncestryDelta(bzr_tree.branch)[0]
+ bzr_history = ['merge', 'trunk', 'base']
+ graph = bzrsync._getRevisionGraph(bzr_tree.branch, 'merge')
+ self.addCleanup(bzr_tree.branch.lock_read().unlock)
+ added_ancestry = bzrsync.getAncestryDelta(
+ bzr_tree.branch, bzr_tree.branch.last_revision_info(),
+ graph, NULL_REVISION)[0]
expected = {'base': 1, 'trunk': 2, 'merge': 3, 'branch': None}
self.assertEqual(
expected, dict(bzrsync.revisionsToInsert(bzr_history,
@@ -528,7 +537,7 @@
self.assertEqual(self.getBranchRevisions(db_trunk), expected)
def test_retrieveDatabaseAncestry(self):
- # retrieveDatabaseAncestry should set db_ancestry and db_history to
+ # retrieveDatabaseAncestry should set db_history to
# Launchpad's current understanding of the branch state.
# db_branch_revision_map should map Bazaar revision_ids to
# BranchRevision.ids.
@@ -541,19 +550,16 @@
'~name12/+junk/junk.contrib')
branch_revisions = IStore(BranchRevision).find(
BranchRevision, BranchRevision.branch == branch)
- sampledata = list(branch_revisions.order_by(BranchRevision.sequence))
- expected_ancestry = set(branch_revision.revision.revision_id
- for branch_revision in sampledata)
- expected_history = [branch_revision.revision.revision_id
+ sampledata = list(branch_revisions.order_by(Desc(BranchRevision.sequence)))
+ expected_history = [
+ (branch_revision.sequence, branch_revision.revision.revision_id)
for branch_revision in sampledata
if branch_revision.sequence is not None]
self.create_branch_and_tree(db_branch=branch)
bzrsync = self.makeBzrSync(branch)
- db_ancestry, db_history = (
- bzrsync.retrieveDatabaseAncestry())
- self.assertEqual(expected_ancestry, set(db_ancestry))
+ db_history = bzrsync.retrieveDatabaseAncestry()
self.assertEqual(expected_history, list(db_history))
@@ -576,9 +582,9 @@
syncer.syncBranchAndClose(self.bzr_tree.branch)
self.assertEqual(rev2_id, self.db_branch.last_scanned_id)
self.db_branch.last_scanned_id = rev1_id
- db_ancestry, db_history = self.db_branch.getScannerData()
+ db_history = self.db_branch.getScannerData()
branchrevisions_to_delete = syncer.planDatabaseChanges(
- self.bzr_branch, [rev1_id, rev2_id], db_ancestry, db_history)[1]
+ self.bzr_branch, (2, rev2_id), db_history)[1]
self.assertIn(merge_id, branchrevisions_to_delete)
=== modified file 'lib/lp/codehosting/tests/test_branchdistro.py'
--- lib/lp/codehosting/tests/test_branchdistro.py 2012-02-15 17:29:54 +0000
+++ lib/lp/codehosting/tests/test_branchdistro.py 2012-02-27 13:14:12 +0000
@@ -267,12 +267,9 @@
self.assertEqual(tip_revision_id, new_branch.last_mirrored_id)
self.assertEqual(tip_revision_id, new_branch.last_scanned_id)
# Make sure that the branch revisions have been copied.
- old_ancestry, old_history = removeSecurityProxy(
- db_branch).getScannerData()
- new_ancestry, new_history = removeSecurityProxy(
- new_branch).getScannerData()
- self.assertEqual(old_ancestry, new_ancestry)
- self.assertEqual(old_history, new_history)
+ old_history = removeSecurityProxy(db_branch).getScannerData()
+ new_history = removeSecurityProxy(new_branch).getScannerData()
+ self.assertEqual(list(old_history), list(new_history))
self.assertFalse(new_branch.pending_writes)
self.assertIs(None, new_branch.stacked_on)
self.assertEqual(new_branch, db_branch.stacked_on)
=== modified file 'lib/lp/registry/browser/person.py'
--- lib/lp/registry/browser/person.py 2012-02-21 22:46:28 +0000
+++ lib/lp/registry/browser/person.py 2012-02-27 13:14:12 +0000
@@ -198,7 +198,15 @@
)
from lp.registry.interfaces.wikiname import IWikiNameSet
from lp.registry.mail.notification import send_direct_contact_email
-from lp.registry.model.person import get_recipients
+<<<<<<< TREE
+from lp.registry.model.person import get_recipients
+=======
+from lp.registry.model.milestone import (
+ Milestone,
+ milestone_sort_key,
+ )
+from lp.registry.model.person import get_recipients
+>>>>>>> MERGE-SOURCE
from lp.services.config import config
from lp.services.database.sqlbase import flush_database_updates
from lp.services.feeds.browser import FeedsMixin
=== modified file 'lib/lp/registry/browser/product.py'
--- lib/lp/registry/browser/product.py 2012-02-24 06:33:10 +0000
+++ lib/lp/registry/browser/product.py 2012-02-27 13:14:12 +0000
@@ -1540,6 +1540,7 @@
usage_fieldname = 'answers_usage'
+<<<<<<< TREE
class ProductPrivateBugsMixin():
"""A mixin for setting the product private_bugs field."""
def setUpFields(self):
@@ -1571,6 +1572,39 @@
return parent.updateContextFromData(data, context, notify_modified)
+=======
+class ProductPrivateBugsMixin():
+ """A mixin for setting the product private_bugs field."""
+ def setUpFields(self):
+ # private_bugs is readonly since we are using a mutator but we need
+ # to edit it on the form.
+ super(ProductPrivateBugsMixin, self).setUpFields()
+ self.form_fields = self.form_fields.omit('private_bugs')
+ private_bugs = copy_field(IProduct['private_bugs'], readonly=False)
+ self.form_fields += form.Fields(private_bugs)
+
+ def validate(self, data):
+ super(ProductPrivateBugsMixin, self).validate(data)
+ private_bugs = data.get('private_bugs')
+ if private_bugs is None:
+ return
+ try:
+ self.context.checkPrivateBugsTransitionAllowed(
+ private_bugs, self.user)
+ except Exception as e:
+ self.setFieldError('private_bugs', e.message)
+
+ def updateContextFromData(self, data, context=None, notify_modified=True):
+ # private_bugs uses a mutator to check permissions, so it needs to
+ # be handled separately.
+ if data.has_key('private_bugs'):
+ self.context.setPrivateBugs(data['private_bugs'], self.user)
+ del data['private_bugs']
+ parent = super(ProductPrivateBugsMixin, self)
+ return parent.updateContextFromData(data, context, notify_modified)
+
+
+>>>>>>> MERGE-SOURCE
class ProductEditView(ProductLicenseMixin, LaunchpadEditFormView):
"""View class that lets you edit a Product object."""
=== modified file 'lib/lp/registry/browser/team.py'
--- lib/lp/registry/browser/team.py 2012-02-19 14:44:47 +0000
+++ lib/lp/registry/browser/team.py 2012-02-27 13:14:12 +0000
@@ -268,9 +268,15 @@
'Private teams must have a Restricted subscription '
'policy.')
+<<<<<<< TREE
def setUpVisibilityField(self, render_context=False):
"""Set the visibility field to read-write, or remove it."""
+=======
+ def setUpVisibilityField(self):
+ """Set the visibility field to read-write, or remove it."""
+>>>>>>> MERGE-SOURCE
self.form_fields = self.form_fields.omit('visibility')
+<<<<<<< TREE
if self.user and self.user.checkAllowVisibility():
visibility = copy_field(ITeam['visibility'], readonly=False)
self.form_fields += Fields(
@@ -280,6 +286,11 @@
field = field_names.pop()
field_names.insert(2, field)
self.form_fields = self.form_fields.select(*field_names)
+=======
+ if self.user and self.user.checkAllowVisibility():
+ visibility = copy_field(ITeam['visibility'], readonly=False)
+ self.form_fields += Fields(visibility)
+>>>>>>> MERGE-SOURCE
class TeamEditView(TeamFormMixin, PersonRenameFormMixin,
@@ -312,7 +323,11 @@
self.field_names.remove('contactemail')
self.field_names.remove('teamowner')
super(TeamEditView, self).setUpFields()
+<<<<<<< TREE
self.setUpVisibilityField(render_context=True)
+=======
+ self.setUpVisibilityField()
+>>>>>>> MERGE-SOURCE
def setUpWidgets(self):
super(TeamEditView, self).setUpWidgets()
=== modified file 'lib/lp/registry/browser/tests/team-views.txt'
=== modified file 'lib/lp/registry/browser/tests/test_team.py'
--- lib/lp/registry/browser/tests/test_team.py 2012-02-21 22:46:28 +0000
+++ lib/lp/registry/browser/tests/test_team.py 2012-02-27 13:14:12 +0000
@@ -519,6 +519,7 @@
'visibility',
[field.__name__ for field in view.form_fields])
+<<<<<<< TREE
def test_person_with_cs_can_create_private_team(self):
personset = getUtility(IPersonSet)
team = self.factory.makeTeam(
@@ -541,6 +542,30 @@
team = personset.getByName(team_name)
self.assertIsNotNone(team)
self.assertEqual(PersonVisibility.PRIVATE, team.visibility)
+=======
+ def test_person_with_cs_can_create_private_team(self):
+ personset = getUtility(IPersonSet)
+ team = self.factory.makeTeam(
+ subscription_policy=TeamSubscriptionPolicy.MODERATED)
+ product = self.factory.makeProduct(owner=team)
+ self.factory.makeCommercialSubscription(product)
+ team_name = self.factory.getUniqueString()
+ form = {
+ 'field.name': team_name,
+ 'field.displayname': 'New Team',
+ 'field.subscriptionpolicy': 'RESTRICTED',
+ 'field.visibility': 'PRIVATE',
+ 'field.actions.create': 'Create',
+ }
+ with person_logged_in(team.teamowner):
+ with FeatureFixture(self.feature_flag):
+ view = create_initialized_view(
+ personset, name=self.view_name, principal=team.teamowner,
+ form=form)
+ team = personset.getByName(team_name)
+ self.assertIsNotNone(team)
+ self.assertEqual(PersonVisibility.PRIVATE, team.visibility)
+>>>>>>> MERGE-SOURCE
def test_person_with_expired_cs_does_not_see_visibility(self):
personset = getUtility(IPersonSet)
=== modified file 'lib/lp/registry/configure.zcml'
=== modified file 'lib/lp/registry/interfaces/person.py'
--- lib/lp/registry/interfaces/person.py 2012-02-19 14:44:47 +0000
+++ lib/lp/registry/interfaces/person.py 2012-02-27 13:14:12 +0000
@@ -704,28 +704,53 @@
def anyone_can_join():
"""Quick check as to whether a team allows anyone to join."""
- def checkAllowVisibility():
- """Is the user allowed to see the visibility field.
-
- :param: The user.
- :return: True if they can, otherwise False.
- """
-
- @mutator_for(visibility)
- @call_with(user=REQUEST_USER)
- @operation_parameters(visibility=copy_field(visibility))
- @export_write_operation()
- @operation_for_version("beta")
- def transitionVisibility(visibility, user):
- """Set visibility of IPerson.
-
- :param visibility: The PersonVisibility to change to.
- :param user: The user requesting the change.
- :raises: `ImmutableVisibilityError` when the visibility can not
- be changed.
- :return: None.
- """
-
+<<<<<<< TREE
+ def checkAllowVisibility():
+ """Is the user allowed to see the visibility field.
+
+ :param: The user.
+ :return: True if they can, otherwise False.
+ """
+
+ @mutator_for(visibility)
+ @call_with(user=REQUEST_USER)
+ @operation_parameters(visibility=copy_field(visibility))
+ @export_write_operation()
+ @operation_for_version("beta")
+ def transitionVisibility(visibility, user):
+ """Set visibility of IPerson.
+
+ :param visibility: The PersonVisibility to change to.
+ :param user: The user requesting the change.
+ :raises: `ImmutableVisibilityError` when the visibility can not
+ be changed.
+ :return: None.
+ """
+
+=======
+ def checkAllowVisibility():
+ """Is the user allowed to see the visibility field.
+
+ :param: The user.
+ :return: True if they can, otherwise False.
+ """
+
+ @mutator_for(visibility)
+ @call_with(user=REQUEST_USER)
+ @operation_parameters(visibility=copy_field(visibility))
+ @export_write_operation()
+ @operation_for_version("beta")
+ def transitionVisibility(visibility, user):
+ """Set visibility of IPerson.
+
+ :param visibility: The PersonVisibility to change to.
+ :param user: The user requesting the change.
+ :raises: `ImmutableVisibilityError` when the visibility can not
+ be changed.
+ :return: None.
+ """
+
+>>>>>>> MERGE-SOURCE
class IPersonLimitedView(IHasIcon, IHasLogo):
"""IPerson attributes that require launchpad.LimitedView permission."""
=== modified file 'lib/lp/registry/model/person.py'
--- lib/lp/registry/model/person.py 2012-02-24 04:29:13 +0000
+++ lib/lp/registry/model/person.py 2012-02-27 13:14:12 +0000
@@ -295,6 +295,7 @@
from lp.services.verification.interfaces.authtoken import LoginTokenType
from lp.services.verification.interfaces.logintoken import ILoginTokenSet
from lp.services.verification.model.logintoken import LoginToken
+from lp.services.webapp.authorization import check_permission
from lp.services.webapp.dbpolicy import MasterDatabasePolicy
from lp.services.webapp.interfaces import ILaunchBag
from lp.services.worlddata.model.language import Language
@@ -2974,6 +2975,7 @@
"""See `IPerson.`"""
return self.subscriptionpolicy in CLOSED_TEAM_POLICY
+<<<<<<< TREE
def checkAllowVisibility(self):
role = IPersonRoles(self)
if role.in_commercial_admin or role.in_admin:
@@ -2992,6 +2994,27 @@
raise ImmutableVisibilityError()
self.visibility = visibility
+=======
+ def checkAllowVisibility(self):
+ feature_flag = getFeatureFlag(
+ 'disclosure.show_visibility_for_team_add.enabled')
+ if feature_flag:
+ if self.hasCurrentCommercialSubscription():
+ return True
+ else:
+ if check_permission('launchpad.Commercial', self):
+ return True
+ return False
+
+ def transitionVisibility(self, visibility, user):
+ if self.visibility == visibility:
+ return
+ validate_person_visibility(self, 'visibility', visibility)
+ if not user.checkAllowVisibility():
+ raise ImmutableVisibilityError()
+ self.visibility = visibility
+
+>>>>>>> MERGE-SOURCE
class PersonSet:
"""The set of persons."""
=== modified file 'lib/lp/registry/stories/product/xx-product-with-private-defaults.txt'
--- lib/lp/registry/stories/product/xx-product-with-private-defaults.txt 2012-02-15 13:24:01 +0000
+++ lib/lp/registry/stories/product/xx-product-with-private-defaults.txt 2012-02-27 13:14:12 +0000
@@ -18,9 +18,13 @@
>>> admin_browser.open("http://launchpad.dev/redfish/+admin")
>>> admin_browser.getControl(name="field.private_bugs").value = True
>>> admin_browser.getControl("Change").click()
+<<<<<<< TREE
>>> admin_browser.open("http://launchpad.dev/redfish/+admin")
>>> admin_browser.getControl(name="field.private_bugs").value
True
+=======
+
+>>>>>>> MERGE-SOURCE
Filing a new bug
----------------
=== modified file 'lib/lp/registry/tests/test_commercialprojects_vocabularies.py'
--- lib/lp/registry/tests/test_commercialprojects_vocabularies.py 2012-02-21 22:46:28 +0000
+++ lib/lp/registry/tests/test_commercialprojects_vocabularies.py 2012-02-27 13:14:12 +0000
@@ -64,8 +64,13 @@
"Expected %d results but got %d." % (self.num_commercial,
len(results)))
+<<<<<<< TREE
def test_searchForTerms_success(self):
# Search for active maintained projects success.
+=======
+ def test_searchForTerms_success(self):
+ # Search for for active maintained projects success.
+>>>>>>> MERGE-SOURCE
results = self.vocab.searchForTerms('widget')
self.assertEqual(
self.num_commercial, len(results),
=== modified file 'lib/lp/registry/tests/test_person.py'
--- lib/lp/registry/tests/test_person.py 2012-02-22 15:59:10 +0000
+++ lib/lp/registry/tests/test_person.py 2012-02-27 13:14:12 +0000
@@ -562,6 +562,7 @@
person = self.factory.makePerson()
self.assertFalse(person.hasCurrentCommercialSubscription())
+<<<<<<< TREE
def test_commercial_admin_with_ff_checkAllowVisibility(self):
admin = getUtility(IPersonSet).getByEmail(ADMIN_EMAIL)
feature_flag = {
@@ -575,6 +576,14 @@
ImmutableVisibilityError, person.transitionVisibility,
PersonVisibility.PRIVATE, person)
+=======
+ def test_can_not_set_visibility(self):
+ person = self.factory.makePerson()
+ self.assertRaises(
+ ImmutableVisibilityError, person.transitionVisibility,
+ PersonVisibility.PRIVATE, person)
+
+>>>>>>> MERGE-SOURCE
class TestPersonStates(TestCaseWithFactory):
=== modified file 'lib/lp/registry/tests/test_personset.py'
=== modified file 'lib/lp/registry/xmlrpc/canonicalsso.py'
--- lib/lp/registry/xmlrpc/canonicalsso.py 2012-02-16 20:37:55 +0000
+++ lib/lp/registry/xmlrpc/canonicalsso.py 2012-02-27 13:14:12 +0000
@@ -1,3 +1,4 @@
+<<<<<<< TREE
# Copyright 2012 Canonical Ltd. This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).
@@ -54,3 +55,61 @@
implements(ICanonicalSSOApplication)
title = "Canonical SSO API"
+=======
+# Copyright 2012 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+"""XMLRPC APIs for Canonical SSO to retrieve person details."""
+
+__metaclass__ = type
+__all__ = [
+ 'CanonicalSSOAPI',
+ 'CanonicalSSOApplication',
+ ]
+
+
+from zope.component import getUtility
+from zope.interface import implements
+from zope.security.proxy import removeSecurityProxy
+
+from lp.registry.interfaces.person import (
+ ICanonicalSSOAPI,
+ ICanonicalSSOApplication,
+ IPerson,
+ )
+from lp.services.identity.interfaces.account import IAccountSet
+from lp.services.webapp import LaunchpadXMLRPCView
+
+
+class CanonicalSSOAPI(LaunchpadXMLRPCView):
+ """See `ICanonicalSSOAPI`."""
+
+ implements(ICanonicalSSOAPI)
+
+ def getPersonDetailsByOpenIDIdentifier(self, openid_identifier):
+ try:
+ account = getUtility(IAccountSet).getByOpenIDIdentifier(
+ openid_identifier.decode('ascii'))
+ except LookupError:
+ return None
+ person = IPerson(account, None)
+ if person is None:
+ return
+
+ time_zone = person.location and person.location.time_zone
+ team_names = dict(
+ (removeSecurityProxy(t).name, t.private)
+ for t in person.teams_participated_in)
+ return {
+ 'name': person.name,
+ 'time_zone': time_zone,
+ 'teams': team_names,
+ }
+
+
+class CanonicalSSOApplication:
+ """Canonical SSO end-point."""
+ implements(ICanonicalSSOApplication)
+
+ title = "Canonical SSO API"
+>>>>>>> MERGE-SOURCE
=== modified file 'lib/lp/services/features/flags.py'
=== modified file 'lib/lp/testing/factory.py'
=== modified file 'lib/lp/translations/browser/potemplate.py'
=== added file 'utilities/lpsetup.py'
--- utilities/lpsetup.py 1970-01-01 00:00:00 +0000
+++ utilities/lpsetup.py 2012-02-27 13:14:12 +0000
@@ -0,0 +1,1671 @@
+#!/usr/bin/env python
+# Copyright 2012 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+"""Create and update Launchpad development and testing environments."""
+
+__metaclass__ = type
+__all__ = [
+ 'ActionsBasedSubCommand',
+ 'apt_get_install',
+ 'ArgumentParser',
+ 'BaseSubCommand',
+ 'bzr_whois',
+ 'call',
+ 'cd',
+ 'check_output',
+ 'CommandError',
+ 'create_lxc',
+ 'environ',
+ 'file_append',
+ 'file_prepend',
+ 'generate_ssh_keys',
+ 'get_container_path',
+ 'get_lxc_gateway',
+ 'get_su_command',
+ 'get_user_home',
+ 'get_user_ids',
+ 'initialize',
+ 'InstallSubCommand',
+ 'join_command',
+ 'LaunchpadError',
+ 'link_sourcecode_in_branches',
+ 'LXCInstallSubCommand',
+ 'lxc_in_state',
+ 'lxc_running',
+ 'lxc_stopped',
+ 'make_launchpad',
+ 'mkdirs',
+ 'setup_apt',
+ 'setup_codebase',
+ 'setup_external_sourcecode',
+ 'setup_launchpad',
+ 'setup_launchpad_lxc',
+ 'ssh',
+ 'SSHError',
+ 'start_lxc',
+ 'stop_lxc',
+ 'su',
+ 'this_command',
+ 'update_launchpad',
+ 'update_launchpad_lxc',
+ 'user_exists',
+ 'ValidationError',
+ 'wait_for_lxc',
+ ]
+
+# To run doctests: python -m doctest -v lpsetup.py
+
+from collections import namedtuple
+from contextlib import contextmanager
+from email.Utils import parseaddr, formataddr
+from functools import partial
+import argparse
+import errno
+import os
+import pipes
+import platform
+import pwd
+import shutil
+import subprocess
+import sys
+import time
+
+
+APT_REPOSITORIES = (
+ 'deb http://archive.ubuntu.com/ubuntu {distro} multiverse',
+ 'deb http://archive.ubuntu.com/ubuntu {distro}-updates multiverse',
+ 'deb http://archive.ubuntu.com/ubuntu {distro}-security multiverse',
+ 'ppa:launchpad/ppa',
+ 'ppa:bzr/ppa',
+ )
+BASE_PACKAGES = ['ssh', 'bzr', 'language-pack-en']
+CHECKOUT_DIR = '~/launchpad/branches'
+DEPENDENCIES_DIR = '~/launchpad/dependencies'
+DHCP_FILE = '/etc/dhcp/dhclient.conf'
+HOSTS_CONTENT = (
+ ('127.0.0.88',
+ 'launchpad.dev answers.launchpad.dev archive.launchpad.dev '
+ 'api.launchpad.dev bazaar-internal.launchpad.dev beta.launchpad.dev '
+ 'blueprints.launchpad.dev bugs.launchpad.dev code.launchpad.dev '
+ 'feeds.launchpad.dev id.launchpad.dev keyserver.launchpad.dev '
+ 'lists.launchpad.dev openid.launchpad.dev '
+ 'ubuntu-openid.launchpad.dev ppa.launchpad.dev '
+ 'private-ppa.launchpad.dev testopenid.dev translations.launchpad.dev '
+ 'xmlrpc-private.launchpad.dev xmlrpc.launchpad.dev'),
+ ('127.0.0.99', 'bazaar.launchpad.dev'),
+ )
+HOSTS_FILE = '/etc/hosts'
+LP_APACHE_MODULES = 'proxy proxy_http rewrite ssl deflate headers'
+LP_APACHE_ROOTS = (
+ '/var/tmp/bazaar.launchpad.dev/static',
+ '/var/tmp/bazaar.launchpad.dev/mirrors',
+ '/var/tmp/archive',
+ '/var/tmp/ppa',
+ )
+LP_BZR_LOCATIONS = (
+ ('submit_branch', '{checkout_dir}'),
+ ('public_branch', 'bzr+ssh://bazaar.launchpad.net/~{lpuser}/launchpad'),
+ ('public_branch:policy', 'appendpath'),
+ ('push_location', 'lp:~{lpuser}/launchpad'),
+ ('push_location:policy', 'appendpath'),
+ ('merge_target', '{checkout_dir}'),
+ ('submit_to', 'merge@xxxxxxxxxxxxxxxxxx'),
+ )
+LP_CHECKOUT = 'devel'
+LP_PACKAGES = [
+ 'bzr', 'launchpad-developer-dependencies', 'apache2',
+ 'apache2-mpm-worker', 'libapache2-mod-wsgi'
+ ]
+LP_REPOS = (
+ 'http://bazaar.launchpad.net/~launchpad-pqm/launchpad/devel',
+ 'lp:launchpad',
+ )
+LP_SOURCE_DEPS = (
+ 'http://bazaar.launchpad.net/~launchpad/lp-source-dependencies/trunk')
+LXC_CONFIG_TEMPLATE = '/etc/lxc/local.conf'
+LXC_GUEST_ARCH = 'i386'
+LXC_GUEST_CHOICES = ('lucid', 'oneiric', 'precise')
+LXC_GUEST_OS = LXC_GUEST_CHOICES[0]
+LXC_NAME = 'lptests'
+LXC_OPTIONS = """
+lxc.network.type = veth
+lxc.network.link = {interface}
+lxc.network.flags = up
+"""
+LXC_PACKAGES = ['lxc', 'libvirt-bin']
+LXC_PATH = '/var/lib/lxc/'
+RESOLV_FILE = '/etc/resolv.conf'
+
+
+Env = namedtuple('Env', 'uid gid home')
+
+
+class LaunchpadError(Exception):
+ """Base exception for lpsetup."""
+
+
+class CommandError(LaunchpadError):
+    """Errors occurring while running shell commands."""
+
+
+class SSHError(LaunchpadError):
+    """Errors occurring during an SSH connection."""
+
+
+class ValidationError(LaunchpadError):
+    """Invalid arguments passed to argparse."""
+
+
+def apt_get_install(*args):
+ """Install given packages using apt."""
+ with environ(DEBIAN_FRONTEND='noninteractive'):
+ cmd = ('apt-get', '-y', 'install') + args
+ return call(*cmd)
+
+
+def join_command(args):
+ """Return a valid Unix command line from `args`.
+
+ >>> join_command(['ls', '-l'])
+ 'ls -l'
+
+ Arguments containing spaces and empty args are correctly quoted::
+
+    >>> join_command(['command', 'arg1', 'arg containing space', ''])
+    "command arg1 'arg containing space' ''"
+ """
+ return ' '.join(pipes.quote(arg) for arg in args)
+
+
+def bzr_whois(user):
+ """Return full name and email of bzr `user`.
+
+ Return None if the given `user` does not have a bzr user id.
+ """
+ with su(user):
+ try:
+ whoami = check_output('bzr', 'whoami')
+ except (CommandError, OSError):
+ return None
+ return parseaddr(whoami)
+
+
+@contextmanager
+def cd(directory):
+    """A context manager to temporarily change the current working dir, e.g.::
+
+ >>> import os
+ >>> os.chdir('/tmp')
+ >>> with cd('/bin'): print os.getcwd()
+ /bin
+ >>> print os.getcwd()
+ /tmp
+ """
+ cwd = os.getcwd()
+ os.chdir(directory)
+ yield
+ os.chdir(cwd)
+
+
+def check_output(*args, **kwargs):
+ r"""Run the command with the given arguments.
+
+ The first argument is the path to the command to run, subsequent arguments
+ are command-line arguments to be passed.
+
+ Usually the output is captured by a pipe and returned::
+
+ >>> check_output('echo', 'output')
+ 'output\n'
+
+ A `CommandError` exception is raised if the return code is not zero::
+
+ >>> check_output('ls', '--not a valid switch', stderr=subprocess.PIPE)
+ Traceback (most recent call last):
+ CommandError: Error running: ls '--not a valid switch'
+
+ None arguments are ignored::
+
+ >>> check_output(None, 'echo', None, 'output')
+ 'output\n'
+ """
+ args = [i for i in args if i is not None]
+ process = subprocess.Popen(
+ args, stdout=kwargs.pop('stdout', subprocess.PIPE),
+ close_fds=True, **kwargs)
+ stdout, stderr = process.communicate()
+ retcode = process.poll()
+ if retcode:
+ raise CommandError('Error running: ' + join_command(args))
+ return stdout
+
+
+call = partial(check_output, stdout=None)
+
+
+@contextmanager
+def environ(**kwargs):
+    """A context manager to temporarily change environment variables.
+
+ If an existing environment variable is changed, it is restored during
+ context cleanup::
+
+ >>> import os
+ >>> os.environ['MY_VARIABLE'] = 'foo'
+ >>> with environ(MY_VARIABLE='bar'): print os.getenv('MY_VARIABLE')
+ bar
+ >>> print os.getenv('MY_VARIABLE')
+ foo
+ >>> del os.environ['MY_VARIABLE']
+
+ If we are adding environment variables, they are removed during context
+ cleanup::
+
+ >>> import os
+ >>> with environ(MY_VAR1='foo', MY_VAR2='bar'):
+ ... print os.getenv('MY_VAR1'), os.getenv('MY_VAR2')
+ foo bar
+ >>> os.getenv('MY_VAR1') == os.getenv('MY_VAR2') == None
+ True
+ """
+ backup = {}
+ for key, value in kwargs.items():
+ backup[key] = os.getenv(key)
+ os.environ[key] = value
+ yield
+ for key, value in backup.items():
+ if value is None:
+ del os.environ[key]
+ else:
+ os.environ[key] = value
+
+
+def file_append(filename, line):
+ r"""Append given `line`, if not present, at the end of `filename`.
+
+ Usage example::
+
+ >>> import tempfile
+ >>> f = tempfile.NamedTemporaryFile('w', delete=False)
+ >>> f.write('line1\n')
+ >>> f.close()
+ >>> file_append(f.name, 'new line\n')
+ >>> open(f.name).read()
+ 'line1\nnew line\n'
+
+ Nothing happens if the file already contains the given `line`::
+
+ >>> file_append(f.name, 'new line\n')
+ >>> open(f.name).read()
+ 'line1\nnew line\n'
+
+    A new line is automatically added before the given `line` if it is not
+    present at the end of the current file content::
+
+ >>> import tempfile
+ >>> f = tempfile.NamedTemporaryFile('w', delete=False)
+ >>> f.write('line1')
+ >>> f.close()
+ >>> file_append(f.name, 'new line\n')
+ >>> open(f.name).read()
+ 'line1\nnew line\n'
+
+ The file is created if it does not exist::
+
+ >>> import tempfile
+ >>> filename = tempfile.mktemp()
+ >>> file_append(filename, 'line1\n')
+ >>> open(filename).read()
+ 'line1\n'
+ """
+ with open(filename, 'a+') as f:
+ content = f.read()
+ if line not in content:
+ if content.endswith('\n') or not content:
+ f.write(line)
+ else:
+ f.write('\n' + line)
+
+
+def file_prepend(filename, line):
+ r"""Insert given `line`, if not present, at the beginning of `filename`.
+
+ Usage example::
+
+ >>> import tempfile
+ >>> f = tempfile.NamedTemporaryFile('w', delete=False)
+ >>> f.write('line1\n')
+ >>> f.close()
+ >>> file_prepend(f.name, 'line0\n')
+ >>> open(f.name).read()
+ 'line0\nline1\n'
+
+ If the file starts with the given `line`, nothing happens::
+
+ >>> file_prepend(f.name, 'line0\n')
+ >>> open(f.name).read()
+ 'line0\nline1\n'
+
+ If the file contains the given `line`, but not at the beginning,
+ the line is moved on top::
+
+ >>> file_prepend(f.name, 'line1\n')
+ >>> open(f.name).read()
+ 'line1\nline0\n'
+ """
+ with open(filename, 'r+') as f:
+ lines = f.readlines()
+ if lines[0] != line:
+ if line in lines:
+ lines.remove(line)
+ lines.insert(0, line)
+ f.seek(0)
+ f.writelines(lines)
+
+
+def generate_ssh_keys(directory, filename='id_rsa'):
+    """Generate an ssh key pair, saving it inside the given `directory`."""
+ path = os.path.join(directory, filename)
+ return call('ssh-keygen', '-q', '-t', 'rsa', '-N', '', '-f', path)
+
+
+def get_container_path(lxc_name, path='', base_path=LXC_PATH):
+    """Return the path of the LXC container called `lxc_name`.
+
+ If a `path` is given, return that path inside the container, e.g.::
+
+ >>> get_container_path('mycontainer')
+ '/var/lib/lxc/mycontainer/rootfs/'
+ >>> get_container_path('mycontainer', '/etc/apt/')
+ '/var/lib/lxc/mycontainer/rootfs/etc/apt/'
+ >>> get_container_path('mycontainer', 'home')
+ '/var/lib/lxc/mycontainer/rootfs/home'
+ """
+ return os.path.join(base_path, lxc_name, 'rootfs', path.lstrip('/'))
+
+
+def get_lxc_gateway():
+ """Return a tuple of gateway name and address.
+
+ The gateway name and address will change depending on which version
+ of Ubuntu the script is running on.
+ """
+ release_name = platform.linux_distribution()[2]
+ if release_name == 'oneiric':
+ return 'virbr0', '192.168.122.1'
+ else:
+ return 'lxcbr0', '10.0.3.1'
+
+
+def get_su_command(user, args):
+ """Return a command line as a sequence, prepending "su" if necessary.
+
+ This can be used together with `call` or `check_output` when the `su`
+    context manager is not enough (e.g. an external program uses uid rather
+ than euid).
+
+ >>> import getpass
+ >>> current_user = getpass.getuser()
+
+ If the su is requested as current user, the arguments are returned as
+ given::
+
+ >>> get_su_command(current_user, ('ls', '-l'))
+ ('ls', '-l')
+
+ Otherwise, "su" is prepended::
+
+ >>> get_su_command('nobody', ('ls', '-l', 'my file'))
+ ('su', 'nobody', '-c', "ls -l 'my file'")
+ """
+ if get_user_ids(user)[0] != os.getuid():
+ args = [i for i in args if i is not None]
+ return ('su', user, '-c', join_command(args))
+ return args
+
+
+def get_user_home(user):
+ """Return the home directory of the given `user`.
+
+ >>> get_user_home('root')
+ '/root'
+
+ If the user does not exist, return a default /home/[username] home::
+
+ >>> get_user_home('_this_user_does_not_exist_')
+ '/home/_this_user_does_not_exist_'
+ """
+ try:
+ return pwd.getpwnam(user).pw_dir
+ except KeyError:
+ return os.path.join(os.path.sep, 'home', user)
+
+
+def get_user_ids(user):
+ """Return the uid and gid of given `user`, e.g.::
+
+ >>> get_user_ids('root')
+ (0, 0)
+ """
+ userdata = pwd.getpwnam(user)
+ return userdata.pw_uid, userdata.pw_gid
+
+
+def lxc_in_state(state, lxc_name, timeout=30):
+ """Return True if the LXC named `lxc_name` is in state `state`.
+
+ Return False otherwise.
+ """
+ while timeout:
+ try:
+ output = check_output(
+ 'lxc-info', '-n', lxc_name, stderr=subprocess.STDOUT)
+ except CommandError:
+ pass
+ else:
+ if state in output:
+ return True
+ timeout -= 1
+ time.sleep(1)
+ return False
+
+
+lxc_running = partial(lxc_in_state, 'RUNNING')
+lxc_stopped = partial(lxc_in_state, 'STOPPED')
+
+
+def mkdirs(*args):
+ """Create leaf directories (given as `args`) and all intermediate ones.
+
+ >>> import tempfile
+ >>> base_dir = tempfile.mktemp(suffix='/')
+ >>> dir1 = tempfile.mktemp(prefix=base_dir)
+ >>> dir2 = tempfile.mktemp(prefix=base_dir)
+ >>> mkdirs(dir1, dir2)
+ >>> os.path.isdir(dir1)
+ True
+ >>> os.path.isdir(dir2)
+ True
+
+ If the leaf directory already exists the function returns without errors::
+
+ >>> mkdirs(dir1)
+
+ An `OSError` is raised if the leaf path exists and it is a file::
+
+ >>> f = tempfile.NamedTemporaryFile(
+ ... 'w', delete=False, prefix=base_dir)
+ >>> f.close()
+ >>> mkdirs(f.name) # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ OSError: ...
+ """
+ for directory in args:
+ try:
+ os.makedirs(directory)
+ except OSError as err:
+ if err.errno != errno.EEXIST or os.path.isfile(directory):
+ raise
+
+
+def ssh(location, user=None, caller=subprocess.call):
+ """Return a callable that can be used to run ssh shell commands.
+
+ The ssh `location` and, optionally, `user` must be given.
+ If the user is None then the current user is used for the connection.
+
+ The callable internally uses the given `caller`::
+
+ >>> def caller(cmd):
+ ... print cmd
+ >>> sshcall = ssh('example.com', 'myuser', caller=caller)
+ >>> root_sshcall = ssh('example.com', caller=caller)
+ >>> sshcall('ls -l') # doctest: +ELLIPSIS
+ ('ssh', '-t', ..., 'myuser@xxxxxxxxxxx', '--', 'ls -l')
+ >>> root_sshcall('ls -l') # doctest: +ELLIPSIS
+ ('ssh', '-t', ..., 'example.com', '--', 'ls -l')
+
+ If the ssh command exits with an error code, an `SSHError` is raised::
+
+ >>> ssh('loc', caller=lambda cmd: 1)('ls -l') # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ SSHError: ...
+
+ If ignore_errors is set to True when executing the command, no error
+ will be raised, even if the command itself returns an error code.
+
+ >>> sshcall = ssh('loc', caller=lambda cmd: 1)
+ >>> sshcall('ls -l', ignore_errors=True)
+ """
+ if user is not None:
+ location = '{user}@{location}'.format(user=user, location=location)
+
+ def _sshcall(cmd, ignore_errors=False):
+ sshcmd = (
+ 'ssh',
+ '-t',
+ '-t', # Yes, this second -t is deliberate. See `man ssh`.
+ '-o', 'StrictHostKeyChecking=no',
+ '-o', 'UserKnownHostsFile=/dev/null',
+ location,
+ '--', cmd,
+ )
+ if caller(sshcmd) and not ignore_errors:
+ raise SSHError('Error running command: ' + ' '.join(sshcmd))
+
+ return _sshcall
+
+
+@contextmanager
+def su(user):
+    """A context manager to temporarily run the script as a different user."""
+ uid, gid = get_user_ids(user)
+ os.setegid(gid)
+ os.seteuid(uid)
+ home = get_user_home(user)
+ with environ(HOME=home):
+ try:
+ yield Env(uid, gid, home)
+ finally:
+ os.setegid(os.getgid())
+ os.seteuid(os.getuid())
+
+
+def this_command(directory, args):
+ """Return a command line to re-launch this script using given `args`.
+
+ The command returned will execute this script from `directory`::
+
+ >>> import os
+ >>> script_name = os.path.basename(__file__)
+
+ >>> cmd = this_command('/home/user/lp/branches', ['install'])
+ >>> cmd == ('cd /home/user/lp/branches && devel/utilities/' +
+ ... script_name + ' install')
+ True
+
+ Arguments are correctly quoted::
+
+ >>> cmd = this_command('/path', ['-arg', 'quote me'])
+ >>> cmd == ('cd /path && devel/utilities/' +
+ ... script_name + " -arg 'quote me'")
+ True
+ """
+ script = join_command([
+ os.path.join(LP_CHECKOUT, 'utilities', os.path.basename(__file__)),
+ ] + list(args))
+ return join_command(['cd', directory]) + ' && ' + script
+
+
+def user_exists(username):
+ """Return True if the given `username` exists, e.g.::
+
+ >>> user_exists('root')
+ True
+ >>> user_exists('_this_user_does_not_exist_')
+ False
+ """
+ try:
+ pwd.getpwnam(username)
+ except KeyError:
+ return False
+ return True
+
+
+def setup_codebase(user, valid_ssh_keys, checkout_dir, dependencies_dir):
+ """Set up Launchpad repository and source dependencies.
+
+ Return True if new changes are pulled from bzr repository.
+ """
+ # Using real su because bzr uses uid.
+ if os.path.exists(checkout_dir):
+ # Pull the repository.
+ revno_args = ('bzr', 'revno', checkout_dir)
+ revno = check_output(*revno_args)
+ call(*get_su_command(user, ['bzr', 'pull', '-d', checkout_dir]))
+ changed = revno != check_output(*revno_args)
+ else:
+ # Branch the repository.
+ cmd = ('bzr', 'branch',
+ LP_REPOS[1] if valid_ssh_keys else LP_REPOS[0], checkout_dir)
+ call(*get_su_command(user, cmd))
+ changed = True
+ # Check repository integrity.
+ if subprocess.call(['bzr', 'status', '-q', checkout_dir]):
+ raise LaunchpadError(
+ 'Repository {0} is corrupted.'.format(checkout_dir))
+ # Set up source dependencies.
+ with su(user):
+ for subdir in ('eggs', 'yui', 'sourcecode'):
+ mkdirs(os.path.join(dependencies_dir, subdir))
+ download_cache = os.path.join(dependencies_dir, 'download-cache')
+ if os.path.exists(download_cache):
+ call('bzr', 'up', download_cache)
+ else:
+ call('bzr', 'co', '--lightweight', LP_SOURCE_DEPS, download_cache)
+ return changed
+
+
+def setup_external_sourcecode(
+ user, valid_ssh_keys, checkout_dir, dependencies_dir):
+ """Update and link external sourcecode."""
+ cmd = (
+ 'utilities/update-sourcecode',
+ None if valid_ssh_keys else '--use-http',
+ os.path.join(dependencies_dir, 'sourcecode'),
+ )
+ with cd(checkout_dir):
+ # Using real su because update-sourcecode uses uid.
+ call(*get_su_command(user, cmd))
+ with su(user):
+ call('utilities/link-external-sourcecode', dependencies_dir)
+
+
+def make_launchpad(user, checkout_dir, install=False):
+ """Make and optionally install Launchpad."""
+ # Using real su because mailman make script uses uid.
+ call(*get_su_command(user, ['make', '-C', checkout_dir]))
+ if install:
+ call('make', '-C', checkout_dir, 'install')
+
+
+def initialize(
+ user, full_name, email, lpuser, private_key, public_key, valid_ssh_keys,
+ dependencies_dir, directory):
+ """Initialize host machine."""
+ # Install necessary deb packages. This requires Oneiric or later.
+ call('apt-get', 'update')
+ apt_get_install(*BASE_PACKAGES)
+ # Create the user (if it does not exist).
+ if not user_exists(user):
+ call('useradd', '-m', '-s', '/bin/bash', '-U', user)
+ # Generate root ssh keys if they do not exist.
+ if not os.path.exists('/root/.ssh/id_rsa.pub'):
+ generate_ssh_keys('/root/.ssh/')
+ with su(user) as env:
+ # Set up the user's ssh directory. The ssh key must be associated
+ # with the lpuser's Launchpad account.
+ ssh_dir = os.path.join(env.home, '.ssh')
+ mkdirs(ssh_dir)
+ priv_file = os.path.join(ssh_dir, 'id_rsa')
+ pub_file = os.path.join(ssh_dir, 'id_rsa.pub')
+ # Generate user ssh keys if none are supplied.
+ if not valid_ssh_keys:
+ generate_ssh_keys(ssh_dir)
+ private_key = open(priv_file).read()
+ public_key = open(pub_file).read()
+ auth_file = os.path.join(ssh_dir, 'authorized_keys')
+ known_hosts = os.path.join(ssh_dir, 'known_hosts')
+ known_host_content = check_output(
+ 'ssh-keyscan', '-t', 'rsa', 'bazaar.launchpad.net')
+ for filename, contents, mode in [
+ (priv_file, private_key, 'w'),
+ (pub_file, public_key, 'w'),
+ (auth_file, public_key, 'a'),
+ (known_hosts, known_host_content, 'a'),
+ ]:
+ with open(filename, mode) as f:
+ f.write(contents + '\n')
+ os.chmod(filename, 0644)
+ os.chmod(priv_file, 0600)
+ # Set up bzr and Launchpad authentication.
+ call('bzr', 'whoami', formataddr([full_name, email]))
+ if valid_ssh_keys:
+ subprocess.call(['bzr', 'lp-login', lpuser])
+ # Set up the repository.
+ mkdirs(directory)
+ call('bzr', 'init-repo', '--2a', directory)
+ # Set up the codebase.
+ checkout_dir = os.path.join(directory, LP_CHECKOUT)
+ setup_codebase(user, valid_ssh_keys, checkout_dir, dependencies_dir)
+ # Set up bzr locations.
+ tmpl = ''.join('{0} = {1}\n'.format(k, v) for k, v in LP_BZR_LOCATIONS)
+ locations = tmpl.format(checkout_dir=checkout_dir, lpuser=lpuser)
+ contents = '[{0}]\n'.format(directory) + locations
+ with su(user) as env:
+ bzr_locations = os.path.join(env.home, '.bazaar', 'locations.conf')
+ file_append(bzr_locations, contents)
+
+
+def setup_apt(no_repositories=True):
+ """Set up, update and upgrade deb packages."""
+ if not no_repositories:
+ distro = check_output('lsb_release', '-cs').strip()
+ # APT repository update.
+ for repository in APT_REPOSITORIES:
+ assume_yes = None if distro == 'lucid' else '-y'
+ call('add-apt-repository', assume_yes,
+ repository.format(distro=distro))
+ # XXX frankban 2012-01-13 - Bug 892892: upgrading mountall in LXC
+ # containers currently does not work.
+ call("echo 'mountall hold' | dpkg --set-selections", shell=True)
+ call('apt-get', 'update')
+ # Install base and Launchpad deb packages.
+ apt_get_install(*LP_PACKAGES)
+
+
+def setup_launchpad(user, dependencies_dir, directory, valid_ssh_keys):
+ """Set up the Launchpad environment."""
+ # User configuration.
+ subprocess.call(['adduser', user, 'sudo'])
+ gid = pwd.getpwnam(user).pw_gid
+ subprocess.call(['addgroup', '--gid', str(gid), user])
+ # Set up Launchpad dependencies.
+ checkout_dir = os.path.join(directory, LP_CHECKOUT)
+ setup_external_sourcecode(
+ user, valid_ssh_keys, checkout_dir, dependencies_dir)
+ with su(user):
+ # Create Apache document roots, to avoid warnings.
+ mkdirs(*LP_APACHE_ROOTS)
+ # Set up Apache modules.
+ for module in LP_APACHE_MODULES.split():
+ call('a2enmod', module)
+ with cd(checkout_dir):
+ # Launchpad database setup.
+ call('utilities/launchpad-database-setup', user)
+ # Make and install launchpad.
+ make_launchpad(user, checkout_dir, install=True)
+ # Set up container hosts file.
+ lines = ['{0}\t{1}'.format(ip, names) for ip, names in HOSTS_CONTENT]
+ file_append(HOSTS_FILE, '\n'.join(lines))
+
+
+def update_launchpad(user, valid_ssh_keys, dependencies_dir, directory, apt):
+ """Update the Launchpad environment."""
+ if apt:
+ setup_apt(no_repositories=True)
+ checkout_dir = os.path.join(directory, LP_CHECKOUT)
+ # Update the Launchpad codebase.
+ changed = setup_codebase(
+ user, valid_ssh_keys, checkout_dir, dependencies_dir)
+ setup_external_sourcecode(
+ user, valid_ssh_keys, checkout_dir, dependencies_dir)
+ if changed:
+ make_launchpad(user, checkout_dir, install=False)
+
+
+def link_sourcecode_in_branches(user, dependencies_dir, directory):
+ """Link external sourcecode for all branches in the project."""
+ checkout_dir = os.path.join(directory, LP_CHECKOUT)
+ cmd = os.path.join(checkout_dir, 'utilities', 'link-external-sourcecode')
+ with su(user):
+ for dirname in os.listdir(directory):
+ branch = os.path.join(directory, dirname)
+ sourcecode_dir = os.path.join(branch, 'sourcecode')
+ if branch != checkout_dir and os.path.isdir(sourcecode_dir):
+ call(cmd, '--parent', dependencies_dir, '--target', branch)
+
+
+def create_lxc(user, lxc_name, lxc_arch, lxc_os):
+ """Create the LXC named `lxc_name` sharing `user` home directory.
+
+ The container will be used as a development environment or as a base
+ template for parallel testing using ephemeral instances.
+ """
+ # Install necessary deb packages.
+ apt_get_install(*LXC_PACKAGES)
+ # XXX 2012-02-02 gmb bug=925024:
+ # These calls need to be removed once the lxc vs. apparmor bug
+ # is resolved, since having apparmor enabled for lxc is very
+ # much a Good Thing.
+ # Disable the apparmor profiles for lxc so that we don't have
+ # problems installing postgres.
+ call('ln', '-s',
+ '/etc/apparmor.d/usr.bin.lxc-start', '/etc/apparmor.d/disable/')
+ call('apparmor_parser', '-R', '/etc/apparmor.d/usr.bin.lxc-start')
+ # Update the resolv file so that the LXC container can be reached
+ # via ssh using its name.
+ lxc_gateway_name, lxc_gateway_address = get_lxc_gateway()
+ file_prepend(RESOLV_FILE, 'nameserver {0}\n'.format(lxc_gateway_address))
+ file_append(
+ DHCP_FILE,
+ 'prepend domain-name-servers {0};\n'.format(lxc_gateway_address))
+ # Container configuration template.
+ content = LXC_OPTIONS.format(interface=lxc_gateway_name)
+ with open(LXC_CONFIG_TEMPLATE, 'w') as f:
+ f.write(content)
+ # Create the container.
+ call(
+ 'lxc-create',
+ '-t', 'ubuntu',
+ '-n', lxc_name,
+ '-f', LXC_CONFIG_TEMPLATE,
+ '--',
+ '-r {os} -a {arch} -b {user}'.format(
+ os=lxc_os, arch=lxc_arch, user=user),
+ )
+ # Set up root ssh key.
+ user_authorized_keys = os.path.expanduser(
+ '~' + user + '/.ssh/authorized_keys')
+ with open(user_authorized_keys, 'a') as f:
+ f.write(open('/root/.ssh/id_rsa.pub').read())
+ dst = get_container_path(lxc_name, '/root/.ssh/')
+ mkdirs(dst)
+ shutil.copy(user_authorized_keys, dst)
+
+
+def initialize_lxc(lxc_name, lxc_os):
+ """Initialize LXC container."""
+ base_packages = list(BASE_PACKAGES)
+ if lxc_os == 'lucid':
+ # Install argparse to be able to run this script inside a lucid lxc.
+ base_packages.append('python-argparse')
+ ssh(lxc_name)(
+ 'DEBIAN_FRONTEND=noninteractive '
+ 'apt-get install -y ' + ' '.join(base_packages))
+
+
+def setup_launchpad_lxc(
+ user, dependencies_dir, directory, valid_ssh_keys, lxc_name):
+ """Set up the Launchpad environment inside an LXC."""
+ # Use ssh to call this script from inside the container.
+ args = [
+ 'install', '-u', user, '-a', 'setup_apt', 'setup_launchpad',
+ '-d', dependencies_dir, '-c', directory
+ ]
+ cmd = this_command(directory, args)
+ ssh(lxc_name)(cmd)
+
+
+def update_launchpad_lxc(user, dependencies_dir, directory, lxc_name):
+ """Update the Launchpad environment inside an LXC."""
+ # Use ssh to call this script from inside the container.
+ # XXX frankban: update branch in lxc
+ pass
+
+
+def start_lxc(lxc_name):
+ """Start the lxc instance named `lxc_name`."""
+ call('lxc-start', '-n', lxc_name, '-d')
+
+
+def stop_lxc(lxc_name):
+ """Stop the lxc instance named `lxc_name`."""
+ ssh(lxc_name)('poweroff')
+ if not lxc_stopped(lxc_name):
+ subprocess.call(['lxc-stop', '-n', lxc_name])
+
+
+def wait_for_lxc(lxc_name, trials=60, sleep_seconds=1):
+ """Try to ssh into the LXC container named `lxc_name`.
+
+ Retry `trials` times, sleeping `sleep_seconds` between attempts.
+ """
+ sshcall = ssh(lxc_name)
+ while True:
+ trials -= 1
+ try:
+ sshcall('true')
+ except SSHError:
+ if not trials:
+ raise
+ time.sleep(sleep_seconds)
+ else:
+ break
+
+
+def handle_user(namespace):
+ """Handle user argument.
+
+ This validator populates the namespace with the `home_dir` attribute::
+
+ >>> import getpass
+ >>> username = getpass.getuser()
+
+ >>> namespace = argparse.Namespace(user=username)
+
+ >>> handle_user(namespace)
+ >>> namespace.home_dir == '/home/' + username
+ True
+
+ The validation fails if the current user is root and no user is provided::
+
+ >>> namespace = argparse.Namespace(user=None, euid=0)
+ >>> handle_user(namespace) # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ ValidationError: argument user ...
+ """
+ if namespace.user is None:
+ if not namespace.euid:
+ raise ValidationError('argument user can not be omitted if '
+ 'the script is run as root.')
+ namespace.user = pwd.getpwuid(namespace.euid).pw_name
+ namespace.home_dir = get_user_home(namespace.user)
+
+
+def handle_lpuser(namespace):
+ """Handle lpuser argument.
+
+ If lpuser is not provided by namespace, the user name is used::
+
+ >>> import getpass
+ >>> username = getpass.getuser()
+
+ >>> namespace = argparse.Namespace(user=username, lpuser=None)
+ >>> handle_lpuser(namespace)
+ >>> namespace.lpuser == username
+ True
+ """
+ if namespace.lpuser is None:
+ namespace.lpuser = namespace.user
+
+
+def handle_userdata(namespace, whois=bzr_whois):
+ """Handle full_name and email arguments.
+
+ If they are not provided, this function tries to obtain them using
+ the given `whois` callable::
+
+ >>> namespace = argparse.Namespace(
+ ... full_name=None, email=None, user='root')
+ >>> email = 'email@example.com'
+ >>> handle_userdata(namespace, lambda user: (user, email))
+ >>> namespace.full_name == namespace.user
+ True
+ >>> namespace.email == email
+ True
+
+ The validation fails if full_name or email are not provided and
+ they can not be obtained using the `whois` callable::
+
+ >>> namespace = argparse.Namespace(
+ ... full_name=None, email=None, user='root')
+ >>> handle_userdata(namespace, lambda user: None) # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ ValidationError: arguments full-name ...
+
+ >>> namespace = argparse.Namespace(
+ ... full_name=None, email=None, user='this_user_does_not_exist')
+ >>> handle_userdata(namespace) # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ ValidationError: arguments full-name ...
+
+ It does not make sense to provide only one argument::
+
+ >>> namespace = argparse.Namespace(full_name='Foo Bar', email=None)
+ >>> handle_userdata(namespace) # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ ValidationError: arguments full-name ...
+ """
+ args = (namespace.full_name, namespace.email)
+ if not all(args):
+ if any(args):
+ raise ValidationError(
+ 'arguments full-name and email: '
+ 'either none or both must be provided.')
+ if user_exists(namespace.user):
+ userdata = whois(namespace.user)
+ if userdata is None:
+ raise ValidationError(
+ 'arguments full-name and email are required: '
+ 'bzr user id not found.')
+ namespace.full_name, namespace.email = userdata
+ else:
+ raise ValidationError(
+ 'arguments full-name and email are required: '
+ 'system user not found.')
+
+
+def handle_ssh_keys(namespace):
+ r"""Handle private and public ssh keys.
+
+ Keys contained in the namespace are escaped::
+
+ >>> private = r'PRIVATE\nKEY'
+ >>> public = r'PUBLIC\nKEY'
+ >>> namespace = argparse.Namespace(
+ ... private_key=private, public_key=public)
+ >>> handle_ssh_keys(namespace)
+ >>> namespace.private_key == private.decode('string-escape')
+ True
+ >>> namespace.public_key == public.decode('string-escape')
+ True
+ >>> namespace.valid_ssh_keys
+ True
+
+ Keys are None if they are not provided and can not be found in the
+ current home directory::
+
+ >>> namespace = argparse.Namespace(
+ ... private_key=None, home_dir='/tmp/__does_not_exists__')
+ >>> handle_ssh_keys(namespace)
+ >>> print namespace.private_key
+ None
+ >>> print namespace.public_key
+ None
+ >>> namespace.valid_ssh_keys
+ False
+
+ If only one of private_key and public_key is provided, a
+ ValidationError is raised::
+
+ >>> namespace = argparse.Namespace(
+ ... private_key=private, public_key=None,
+ ... home_dir='/tmp/__does_not_exists__')
+ >>> handle_ssh_keys(namespace) # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ ValidationError: arguments private-key...
+ """
+ namespace.valid_ssh_keys = True
+ for attr, filename in (
+ ('private_key', 'id_rsa'),
+ ('public_key', 'id_rsa.pub')):
+ value = getattr(namespace, attr, None)
+ if value:
+ setattr(namespace, attr, value.decode('string-escape'))
+ else:
+ path = os.path.join(namespace.home_dir, '.ssh', filename)
+ try:
+ value = open(path).read()
+ except IOError:
+ value = None
+ namespace.valid_ssh_keys = False
+ setattr(namespace, attr, value)
+ if bool(namespace.private_key) != bool(namespace.public_key):
+ raise ValidationError(
+ "arguments private-key and public-key: "
+ "both must be provided or neither must be provided.")
+
+
+def handle_directories(namespace):
+ """Handle checkout and dependencies directories.
+
+ The ~ shorthand is automatically expanded::
+
+ >>> namespace = argparse.Namespace(
+ ... directory='~/launchpad', dependencies_dir='~/launchpad/deps',
+ ... home_dir='/home/foo')
+ >>> handle_directories(namespace)
+ >>> namespace.directory
+ '/home/foo/launchpad'
+ >>> namespace.dependencies_dir
+ '/home/foo/launchpad/deps'
+
+ The validation fails for directories not residing inside the home::
+
+ >>> namespace = argparse.Namespace(
+ ... directory='/tmp/launchpad',
+ ... dependencies_dir='~/launchpad/deps',
+ ... home_dir='/home/foo')
+ >>> handle_directories(namespace) # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ ValidationError: argument directory ...
+
+ The validation fails if the directory contains spaces::
+
+ >>> namespace = argparse.Namespace(directory='my directory')
+ >>> handle_directories(namespace) # doctest: +ELLIPSIS
+ Traceback (most recent call last):
+ ValidationError: argument directory ...
+ """
+ if ' ' in namespace.directory:
+ raise ValidationError('argument directory can not contain spaces.')
+ for attr in ('directory', 'dependencies_dir'):
+ directory = getattr(
+ namespace, attr).replace('~', namespace.home_dir)
+ if not directory.startswith(namespace.home_dir + os.path.sep):
+ raise ValidationError(
+ 'argument {0} does not reside under the home '
+ 'directory of the system user.'.format(attr))
+ setattr(namespace, attr, directory)
+
+
+class ArgumentParser(argparse.ArgumentParser):
+ """A customized argument parser for `argparse`."""
+
+ def __init__(self, *args, **kwargs):
+ self.validators = kwargs.pop('validators', ())
+ self.actions = []
+ self.subparsers = None
+ super(ArgumentParser, self).__init__(*args, **kwargs)
+
+ def add_argument(self, *args, **kwargs):
+ """Override to store actions in a "public" instance attribute.
+
+ >>> parser = ArgumentParser()
+ >>> parser.add_argument('arg1')
+ >>> parser.add_argument('arg2')
+ >>> [action.dest for action in parser.actions]
+ ['help', 'arg1', 'arg2']
+ """
+ action = super(ArgumentParser, self).add_argument(*args, **kwargs)
+ self.actions.append(action)
+
+ def register_subcommand(self, name, subcommand_class, handler=None):
+ """Add a subcommand to this parser.
+
+ A subcommand is registered by giving the subcommand `name` and class::
+
+ >>> class SubCommand(BaseSubCommand):
+ ... def handle(self, namespace):
+ ... return self.name
+
+ >>> parser = ArgumentParser()
+ >>> _ = parser.register_subcommand('foo', SubCommand)
+
+ The `main` method of the subcommand is added to the namespace, and
+ can be used to actually handle the subcommand execution::
+
+ >>> namespace = parser.parse_args(['foo'])
+ >>> namespace.main(namespace)
+ 'foo'
+
+ A `handler` callable can also be provided to handle the subcommand
+ execution::
+
+ >>> handler = lambda namespace: 'custom handler'
+
+ >>> parser = ArgumentParser()
+ >>> _ = parser.register_subcommand(
+ ... 'bar', SubCommand, handler=handler)
+
+ >>> namespace = parser.parse_args(['bar'])
+ >>> namespace.main(namespace)
+ 'custom handler'
+ """
+ if self.subparsers is None:
+ self.subparsers = self.add_subparsers(
+ title='subcommands', description='valid subcommands',
+ help='-h, --help show subcommand help and exit')
+ subcommand = subcommand_class(name, handler=handler)
+ parser = self.subparsers.add_parser(
+ subcommand.name, help=subcommand.help)
+ subcommand.add_arguments(parser)
+ parser.set_defaults(main=subcommand.main, get_parser=lambda: parser)
+ return subcommand
+
+ def get_args_from_namespace(self, namespace):
+ """Return a list of arguments taking values from `namespace`.
+
+ Having a parser defined as usual::
+
+ >>> parser = ArgumentParser()
+ >>> _ = parser.add_argument('--foo')
+ >>> _ = parser.add_argument('bar')
+ >>> namespace = parser.parse_args('--foo eggs spam'.split())
+
+ It is possible to recreate the argument list taking values from
+ a different namespace::
+
+ >>> namespace.foo = 'changed'
+ >>> parser.get_args_from_namespace(namespace)
+ ['--foo', 'changed', 'spam']
+ """
+ args = []
+ for action in self.actions:
+ dest = action.dest
+ option_strings = action.option_strings
+ value = getattr(namespace, dest, None)
+ if value:
+ if option_strings:
+ args.append(option_strings[0])
+ if isinstance(value, list):
+ args.extend(value)
+ elif not isinstance(value, bool):
+ args.append(value)
+ return args
+
+
+class BaseSubCommand(object):
+ """Base class defining a sub command.
+
+ Objects of this class can be used to add subcommands to this script
+ as plugins: they provide arguments, validate them, and restart the
+ script as root if needed.
+
+ Override `add_arguments()` to add arguments, `validators` to add
+ namespace validators, and `handle()` to manage sub command execution::
+
+ >>> def validator(namespace):
+ ... namespace.bar = True
+
+ >>> class SubCommand(BaseSubCommand):
+ ... help = 'Sub command example.'
+ ... validators = (validator,)
+ ...
+ ... def add_arguments(self, parser):
+ ... super(SubCommand, self).add_arguments(parser)
+ ... parser.add_argument('--foo')
+ ...
+ ... def handle(self, namespace):
+ ... return namespace
+
+ Register the sub command using `ArgumentParser.register_subcommand`::
+
+ >>> parser = ArgumentParser()
+ >>> sub_command = parser.register_subcommand('spam', SubCommand)
+
+ Now the subcommand has a name::
+
+ >>> sub_command.name
+ 'spam'
+
+ The sub command handler can be called using `namespace.main()`::
+
+ >>> namespace = parser.parse_args('spam --foo eggs'.split())
+ >>> namespace = namespace.main(namespace)
+ >>> namespace.foo
+ 'eggs'
+ >>> namespace.bar
+ True
+
+ The help attribute of sub command instances is used to generate
+ the command usage message::
+
+ >>> 'spam Sub command example.' in parser.format_help()
+ True
+ """
+
+ help = ''
+ needs_root = False
+ validators = ()
+
+ def __init__(self, name, handler=None):
+ self.name = name
+ self.handler = handler or self.handle
+
+ def __repr__(self):
+ return '<{klass}: {name}>'.format(
+ klass=self.__class__.__name__, name=self.name)
+
+ def init_namespace(self, namespace):
+ """Add `run_as_root` and `euid` names to the given `namespace`."""
+ euid = os.geteuid()
+ namespace.euid, namespace.run_as_root = euid, not euid
+
+ def get_needs_root(self, namespace):
+ """Return True if root is needed to run this subcommand.
+
+ Subclasses can override this to dynamically change this value
+ based on the given `namespace`.
+ """
+ return self.needs_root
+
+ def get_validators(self, namespace):
+ """Return an iterable of namespace validators for this sub command.
+
+ Subclasses can override this to dynamically change validators
+ based on the given `namespace`.
+ """
+ return self.validators
+
+ def validate(self, parser, namespace):
+ """Validate the current namespace.
+
+ The method `self.get_validators` can return an iterable of objects
+ that are called once the arguments namespace is fully populated.
+ This allows cleaning and validating arguments that depend on
+ each other, or on the current environment.
+
+ Each validator is a callable that takes the current namespace
+ and may raise ValidationError if the arguments are not valid::
+
+ >>> import sys
+ >>> stderr, sys.stderr = sys.stderr, sys.stdout
+ >>> def validator(namespace):
+ ... raise ValidationError('nothing is going on')
+ >>> sub_command = BaseSubCommand('foo')
+ >>> sub_command.validators = [validator]
+ >>> sub_command.validate(ArgumentParser(), argparse.Namespace())
+ Traceback (most recent call last):
+ SystemExit: 2
+ >>> sys.stderr = stderr
+ """
+ for validator in self.get_validators(namespace):
+ try:
+ validator(namespace)
+ except ValidationError as err:
+ parser.error(err)
+
+ def restart_as_root(self, parser, namespace):
+ """Restart this script using *sudo*.
+
+ The arguments are recreated using the given `namespace`.
+ """
+ args = parser.get_args_from_namespace(namespace)
+ return subprocess.call(['sudo', sys.argv[0], self.name] + args)
+
+ def main(self, namespace):
+ """Entry point for subcommand execution.
+
+ This method takes care of:
+
+ - current argparse subparser retrieval
+ - namespace initialization
+ - namespace validation
+ - script restart as root (if this sub command needs to be run as root)
+
+ If everything is ok the sub command handler is called passing
+ the validated namespace.
+ """
+ parser = namespace.get_parser()
+ self.init_namespace(namespace)
+ self.validate(parser, namespace)
+ if self.get_needs_root(namespace) and not namespace.run_as_root:
+ return self.restart_as_root(parser, namespace)
+ return self.handler(namespace)
+
+ def handle(self, namespace):
+ """Default sub command handler.
+
+ Subclasses must either implement this method or provide another
+ callable handler during sub command registration.
+ """
+ raise NotImplementedError
+
+ def add_arguments(self, parser):
+ """Here subclasses can add arguments to the subparser."""
+ pass
+
+
+class ActionsBasedSubCommand(BaseSubCommand):
+ """A sub command that uses "actions" to handle its execution.
+
+ Actions are callables stored in the `actions` attribute, together
+ with the arguments they expect. Those arguments are strings
+ representing attributes of the argparse namespace::
+
+ >>> trace = []
+
+ >>> def action1(foo):
+ ... trace.append('action1 received ' + foo)
+
+ >>> def action2(foo, bar):
+ ... trace.append('action2 received {0} and {1}'.format(foo, bar))
+
+ >>> class SubCommand(ActionsBasedSubCommand):
+ ... actions = (
+ ... (action1, 'foo'),
+ ... (action2, 'foo', 'bar'),
+ ... )
+ ...
+ ... def add_arguments(self, parser):
+ ... super(SubCommand, self).add_arguments(parser)
+ ... parser.add_argument('--foo')
+ ... parser.add_argument('--bar')
+
+ This class implements a handler method that executes actions in the
+ order they are provided::
+
+ >>> parser = ArgumentParser()
+ >>> _ = parser.register_subcommand('sub', SubCommand)
+ >>> namespace = parser.parse_args('sub --foo eggs --bar spam'.split())
+ >>> namespace.main(namespace)
+ >>> trace
+ ['action1 received eggs', 'action2 received eggs and spam']
+
+ A special argument `-a` or `--actions` is automatically added to the
+ parser. It can be used to execute only one or a subset of actions::
+
+ >>> trace = []
+
+ >>> namespace = parser.parse_args('sub --foo eggs -a action1'.split())
+ >>> namespace.main(namespace)
+ >>> trace
+ ['action1 received eggs']
+
+ A special argument `--skip-actions` is automatically added to the
+ parser. It can be used to skip one or more actions::
+
+ >>> trace = []
+
+ >>> namespace = parser.parse_args(
+ ... 'sub --foo eggs --skip-actions action1'.split())
+ >>> namespace.main(namespace)
+ >>> trace
+ ['action2 received eggs and None']
+
+ Action execution stops if an action raises `LaunchpadError`.
+ In that case, the handler returns the error::
+
+ >>> trace = []
+
+ >>> def erroneous_action(foo):
+ ... raise LaunchpadError('error')
+
+ >>> class SubCommandWithErrors(SubCommand):
+ ... actions = (
+ ... (action1, 'foo'),
+ ... (erroneous_action, 'foo'),
+ ... (action2, 'foo', 'bar'),
+ ... )
+
+ >>> parser = ArgumentParser()
+ >>> _ = parser.register_subcommand('sub', SubCommandWithErrors)
+ >>> namespace = parser.parse_args('sub --foo eggs'.split())
+ >>> error = namespace.main(namespace)
+ >>> error.message
+ 'error'
+
+ The action `action2` is not executed::
+
+ >>> trace
+ ['action1 received eggs']
+ """
+
+ actions = ()
+
+ def __init__(self, *args, **kwargs):
+ super(ActionsBasedSubCommand, self).__init__(*args, **kwargs)
+ self._action_names = []
+ self._actions = {}
+ for action_args in self.actions:
+ action, args = action_args[0], action_args[1:]
+ action_name = self._get_action_name(action)
+ self._action_names.append(action_name)
+ self._actions[action_name] = (action, args)
+
+ def _get_action_name(self, action):
+ """Return the string representation of an action callable.
+
+ The name is retrieved using attribute lookup for `action_name`
+ and then `__name__`::
+
+ >>> def action1():
+ ... pass
+ >>> action1.action_name = 'myaction'
+
+ >>> def action2():
+ ... pass
+
+ >>> sub_command = ActionsBasedSubCommand('foo')
+ >>> sub_command._get_action_name(action1)
+ 'myaction'
+ >>> sub_command._get_action_name(action2)
+ 'action2'
+ """
+ try:
+ return action.action_name
+ except AttributeError:
+ return action.__name__
+
+ def add_arguments(self, parser):
+ super(ActionsBasedSubCommand, self).add_arguments(parser)
+ parser.add_argument(
+ '-a', '--actions', nargs='+', choices=self._action_names,
+ help='Call one or more internal functions.')
+ parser.add_argument(
+ '--skip-actions', nargs='+', choices=self._action_names,
+ help='Skip one or more internal functions.')
+
+ def handle(self, namespace):
+ skip_actions = namespace.skip_actions or []
+ action_names = filter(
+ lambda action_name: action_name not in skip_actions,
+ namespace.actions or self._action_names)
+ for action_name in action_names:
+ action, arg_names = self._actions[action_name]
+ args = [getattr(namespace, i) for i in arg_names]
+ try:
+ action(*args)
+ except LaunchpadError as err:
+ return err
+
+
+class InstallSubCommand(ActionsBasedSubCommand):
+ """Install the Launchpad environment."""
+
+ actions = (
+ (initialize,
+ 'user', 'full_name', 'email', 'lpuser', 'private_key',
+ 'public_key', 'valid_ssh_keys', 'dependencies_dir', 'directory'),
+ (setup_apt, 'no_repositories'),
+ (setup_launchpad,
+ 'user', 'dependencies_dir', 'directory', 'valid_ssh_keys'),
+ )
+ help = __doc__
+ needs_root = True
+ validators = (
+ handle_user,
+ handle_lpuser,
+ handle_userdata,
+ handle_ssh_keys,
+ handle_directories,
+ )
+
+ def add_arguments(self, parser):
+ super(InstallSubCommand, self).add_arguments(parser)
+ parser.add_argument(
+ '-u', '--user',
+ help='The name of the system user to be created or updated. '
+ 'The current user is used if this script is not run as '
+ 'root and this argument is omitted.')
+ parser.add_argument(
+ '-e', '--email',
+ help='The email of the user, used for bzr whoami. This argument '
+ 'can be omitted if a bzr id exists for the current user.')
+ parser.add_argument(
+ '-f', '--full-name',
+ help='The full name of the user, used for bzr whoami. '
+ 'This argument can be omitted if a bzr id exists for '
+ 'the current user.')
+ parser.add_argument(
+ '-l', '--lpuser',
+ help='The name of the Launchpad user that will be used to '
+ 'check out dependencies. If not provided, the system '
+ 'user name is used.')
+ parser.add_argument(
+ '-v', '--private-key',
+ help='The SSH private key for the Launchpad user (without '
+ 'passphrase). If this argument is omitted and a keypair is '
+ 'not found in the home directory of the system user, a new '
+ 'SSH keypair will be generated and the checkout of the '
+ 'Launchpad code will use HTTP rather than bzr+ssh.')
+ parser.add_argument(
+ '-b', '--public-key',
+ help='The SSH public key for the Launchpad user. '
+ 'If this argument is omitted and a keypair is not found '
+ 'in the home directory of the system user, a new SSH '
+ 'keypair will be generated and the checkout of the '
+ 'Launchpad code will use HTTP rather than bzr+ssh.')
+ parser.add_argument(
+ '-d', '--dependencies-dir', default=DEPENDENCIES_DIR,
+ help='The directory of the Launchpad dependencies to be created. '
+ 'The directory must reside under the home directory of the '
+ 'given user (see -u argument). '
+ '[DEFAULT={0}]'.format(DEPENDENCIES_DIR))
+ parser.add_argument(
+ '-c', '--directory', default=CHECKOUT_DIR,
+ help='The directory of the Launchpad repository to be created. '
+ 'The directory must reside under the home directory of the '
+ 'given user (see -u argument). '
+ '[DEFAULT={0}]'.format(CHECKOUT_DIR))
+ parser.add_argument(
+ '-N', '--no-repositories', action='store_true',
+ help='Do not add APT repositories.')
+
+
+class UpdateSubCommand(ActionsBasedSubCommand):
+ """Update the Launchpad environment to the latest version."""
+
+ actions = (
+ (update_launchpad,
+ 'user', 'valid_ssh_keys', 'dependencies_dir', 'directory', 'apt'),
+ (link_sourcecode_in_branches,
+ 'user', 'dependencies_dir', 'directory'),
+ )
+ help = __doc__
+ validators = (
+ handle_user,
+ handle_ssh_keys,
+ handle_directories,
+ )
+
+ def get_needs_root(self, namespace):
+ # Root is needed only if an apt update/upgrade is requested.
+ return namespace.apt
+
+ def add_arguments(self, parser):
+ super(UpdateSubCommand, self).add_arguments(parser)
+ parser.add_argument(
+ '-u', '--user',
+ help='The name of the system user used to update Launchpad. '
+ 'The current user is used if this script is not run as '
+ 'root and this argument is omitted.')
+ parser.add_argument(
+ '-d', '--dependencies-dir', default=DEPENDENCIES_DIR,
+ help='The directory of the Launchpad dependencies to be updated. '
+ 'The directory must reside under the home directory of the '
+ 'given user (see -u argument). '
+ '[DEFAULT={0}]'.format(DEPENDENCIES_DIR))
+ parser.add_argument(
+ '-c', '--directory', default=CHECKOUT_DIR,
+ help='The directory of the Launchpad repository to be updated. '
+ 'The directory must reside under the home directory of the '
+ 'given user (see -u argument). '
+ '[DEFAULT={0}]'.format(CHECKOUT_DIR))
+ parser.add_argument(
+ '-D', '--apt', action='store_true',
+ help='Also update deb packages.')
+
+
+class LXCInstallSubCommand(InstallSubCommand):
+ """Install the Launchpad environment inside an LXC."""
+
+ actions = (
+ (initialize,
+ 'user', 'full_name', 'email', 'lpuser', 'private_key',
+ 'public_key', 'valid_ssh_keys', 'dependencies_dir', 'directory'),
+ (create_lxc,
+ 'user', 'lxc_name', 'lxc_arch', 'lxc_os'),
+ (start_lxc, 'lxc_name'),
+ (wait_for_lxc, 'lxc_name'),
+ (initialize_lxc,
+ 'lxc_name', 'lxc_os'),
+ (setup_launchpad_lxc,
+ 'user', 'dependencies_dir', 'directory', 'valid_ssh_keys',
+ 'lxc_name'),
+ (stop_lxc, 'lxc_name'),
+ )
+ help = __doc__
+
+ def add_arguments(self, parser):
+ super(LXCInstallSubCommand, self).add_arguments(parser)
+ parser.add_argument(
+ '-n', '--lxc-name', default=LXC_NAME,
+ help='The LXC container name to set up. '
+ '[DEFAULT={0}]'.format(LXC_NAME))
+ parser.add_argument(
+ '-A', '--lxc-arch', default=LXC_GUEST_ARCH,
+ help='The LXC container architecture. '
+ '[DEFAULT={0}]'.format(LXC_GUEST_ARCH))
+ parser.add_argument(
+ '-r', '--lxc-os', default=LXC_GUEST_OS,
+ choices=LXC_GUEST_CHOICES,
+ help='The LXC container distro codename. '
+ '[DEFAULT={0}]'.format(LXC_GUEST_OS))
+
+
+def main():
+ parser = ArgumentParser(description=__doc__)
+ parser.register_subcommand('install', InstallSubCommand)
+ parser.register_subcommand('update', UpdateSubCommand)
+ parser.register_subcommand('lxc-install', LXCInstallSubCommand)
+ args = parser.parse_args()
+ return args.main(args)
+
+
+if __name__ == '__main__':
+ sys.exit(main())
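For reviewers: the `ArgumentParser.register_subcommand` and `ActionsBasedSubCommand` machinery used above is defined earlier in the script, outside this hunk. Below is a minimal, self-contained sketch of how such a pattern can be built on stdlib argparse. The `greet`/`GreetSubCommand` names are invented for illustration; this is not the branch's actual implementation.

```python
import argparse

collected = []


def greet(user):
    # Stand-in for an action function such as update_launchpad.
    collected.append('hello %s' % user)


class ArgumentParser(argparse.ArgumentParser):
    """argparse.ArgumentParser plus a register_subcommand helper."""

    def __init__(self, *args, **kwargs):
        super(ArgumentParser, self).__init__(*args, **kwargs)
        self.subparsers = self.add_subparsers(title='subcommands')

    def register_subcommand(self, name, subcommand_class):
        subcommand = subcommand_class()
        parser = self.subparsers.add_parser(name, help=subcommand.help)
        subcommand.add_arguments(parser)
        # Dispatch hook: after parsing, `args.main(args)` runs the handler.
        parser.set_defaults(main=subcommand.main)


class ActionsBasedSubCommand(object):
    """Run a sequence of (function, arg_name, ...) actions, pulling
    each named argument off the parsed namespace."""

    actions = ()
    help = ''

    def add_arguments(self, parser):
        pass

    def main(self, namespace):
        for action in self.actions:
            func, arg_names = action[0], action[1:]
            func(*[getattr(namespace, arg) for arg in arg_names])


class GreetSubCommand(ActionsBasedSubCommand):
    """Demo subcommand wiring `greet` to the --user option."""

    actions = ((greet, 'user'),)
    help = __doc__

    def add_arguments(self, parser):
        parser.add_argument('-u', '--user', default='nobody')


parser = ArgumentParser(description='demo')
parser.register_subcommand('greet', GreetSubCommand)
args = parser.parse_args(['greet', '-u', 'frank'])
args.main(args)
```

Read this way, the `actions` tuples in the subcommand classes above mean: call each function in order, passing the listed namespace attributes as positional arguments.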
=== modified file 'utilities/setuplxc.py'
=== modified file 'utilities/sourcedeps.cache'
--- utilities/sourcedeps.cache 2012-02-22 23:17:22 +0000
+++ utilities/sourcedeps.cache 2012-02-27 13:14:12 +0000
@@ -32,8 +32,8 @@
"launchpad@xxxxxxxxxxxxxxxxx-20111215014116-j7lw563fh7kuqp0n"
],
"loggerhead": [
470,
"william.grant@xxxxxxxxxxxxx-20120222230358-x1twbbwbo5o7fp3c"
],
"lpreview": [
23,
=== modified file 'utilities/sourcedeps.conf'
--- utilities/sourcedeps.conf 2012-02-22 23:17:22 +0000
+++ utilities/sourcedeps.conf 2012-02-27 13:14:12 +0000
@@ -10,7 +10,7 @@
cscvs lp:~launchpad-pqm/launchpad-cscvs/devel;revno=432
dulwich lp:~launchpad-pqm/dulwich/devel;revno=435
difftacular lp:~launchpad/difftacular/trunk;revno=6
loggerhead lp:~loggerhead-team/loggerhead/trunk-rich;revno=470
lpreview lp:~launchpad-pqm/bzr-lpreview/devel;revno=23
mailman lp:~launchpad-pqm/mailman/2.1;revno=976
mustache.js lp:~abentley/mustache.js/master;revno=166
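As context for the hunk above: each sourcedeps.conf line pins a dependency to a branch and revision, in the form `name lp:branch;revno=N`. A hypothetical parser for that format (not part of this branch, shown only to document the layout):

```python
def parse_sourcedep(line):
    # "loggerhead lp:~loggerhead-team/loggerhead/trunk-rich;revno=470"
    # -> ('loggerhead', 'lp:~loggerhead-team/loggerhead/trunk-rich', 470)
    name, spec = line.split()
    url, _, revno = spec.partition(';revno=')
    return name, url, int(revno)


dep = parse_sourcedep(
    'loggerhead lp:~loggerhead-team/loggerhead/trunk-rich;revno=470')
```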
=== modified file 'utilities/yui-deps.py'
--- utilities/yui-deps.py 2012-02-21 16:11:38 +0000
+++ utilities/yui-deps.py 2012-02-27 13:14:12 +0000
@@ -7,6 +7,6 @@
from sys import argv
yui_roots = {
3: 'build/js/yui-3.3.0',
2: 'build/js/yui2',
@@ -116,6 +116,6 @@
'calendar/calendar.js',
]
}
if __name__ == '__main__':
=== modified file 'versions.cfg'