
[Merge] lp:~jtv/launchpad/bug-780319 into lp:launchpad

 

Jeroen T. Vermeulen has proposed merging lp:~jtv/launchpad/bug-780319 into lp:launchpad.

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)
Related bugs:
  Bug #780319 in Launchpad itself: "Sync large numbers of packages asynchronously"
  https://bugs.launchpad.net/launchpad/+bug/780319

For more details, see:
https://code.launchpad.net/~jtv/launchpad/bug-780319/+merge/60571

= Summary =

As part of the derived-distro work, users may request copying (synchronization) of packages from a parent distroseries into a derived distroseries.  This may take too long for a single web request, in which case we need to do it asynchronously; but if at all possible it would still be nice to get the job done immediately when the request is made.


== Proposed fix ==

Make the choice in the browser code.  For API calls there is no particular reason to make this dynamic trade-off; it would only confuse matters by making requests less predictable.

Then, delegate to one of two very similar functions.  One performs the copy immediately, as happens today; the other creates PackageCopyJobs.

A single PackageCopyJob may span any number of source packages, but only one source archive.  So the asynchronous copying code breaks the request down by the source archive each package comes from, and issues one job per archive.
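In outline (this mirrors partition_pubs_by_archive and copy_asynchronously in the diff below):

{{{
# Group the publications by the archive they come from, then create one
# PackageCopyJob per source archive.
def partition_pubs_by_archive(source_pubs):
    by_source_archive = {}
    for spph in source_pubs:
        by_source_archive.setdefault(spph.archive, []).append(spph)
    return by_source_archive

job_source = getUtility(IPackageCopyJobSource)
for source_archive, spphs in partition_pubs_by_archive(source_pubs).items():
    job_source.create(
        name_pubs_with_versions(spphs), source_archive, dest_archive,
        dest_series, dest_pocket, include_binaries=include_binaries)
}}}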


== Pre-implementation notes ==

Gavin has been working on the PackageCopyJob implementation, which (thanks to our inflexible job system) involves moving the interface.  This may interfere with my landing; I expect to have to resolve that, but that change shouldn't need review.

Robert is concerned about multiple code paths; Julian was thinking along the lines of a solution that creates PackageCopyJobs either way and then executes them inline in the synchronous case.  However, I went with two separate code paths that are very similar and keep cyclomatic complexity to a minimum.


== Implementation details ==

A fair amount of code had to be refactored, either to support the new code path or to allow efficient testing.  The view now chooses between calling two functions: copy_synchronously and copy_asynchronously.  If I had wanted to over-engineer this I could have used polymorphism, and to be honest that would have made fine-grained testing a bit easier.
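Condensed, the dispatch in PackageCopyingMixin.do_copy looks like this (the full version is in the diff below):

{{{
try:
    if self.canCopySynchronously(source_pubs):
        notification = copy_synchronously(
            source_pubs, dest_archive, dest_series, dest_pocket,
            include_binaries, dest_url=dest_url,
            dest_display_name=dest_display_name, person=person,
            check_permissions=check_permissions)
    else:
        notification = copy_asynchronously(
            source_pubs, dest_archive, dest_series, dest_pocket,
            include_binaries, dest_url=dest_url,
            dest_display_name=dest_display_name, person=person,
            check_permissions=check_permissions)
except CannotCopy, error:
    # Either path reports copy failures the same way.
    self.setFieldError(
        sources_field_name, render_cannotcopy_as_html(error))
    return False
self.request.response.addNotification(notification)
return True
}}}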

The first new function you'll actually see in the diff is preload_binary_package_names.  This is a case where even a very short stretch of code is worth extracting: first, it gives the code a name, rather than just an informal comment.  Second, it preserves the abstraction level and legibility of the calling code.
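For reference, the extracted helper is just:

{{{
def preload_binary_package_names(copies):
    """Preload `BinaryPackageName`s to speed up display-name construction."""
    bpn_ids = [
        copy.binarypackagerelease.binarypackagenameID for copy in copies
        if isinstance(copy, BinaryPackagePublishingHistory)]
    load(BinaryPackageName, bpn_ids)
}}}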

Right after that come examples of the refactorings that support the new code path.  compose_synchronous_copy_feedback is an example of one that was needed to meet the 2-second goal for test times.

Extracting this kind of code also allowed me to unit-test escaping of various names.  I would have liked to use the parameter escaping logic inside "structured" instead of doing manual escaping before interpolation, but the handling of dest_url made me unsure.  It's not essential to the branch so I left that alone for now.

The decision of when to copy synchronously is a policy choice.  I kept it separate from the mechanism, in a method called canCopySynchronously.  This also lets tests force the view's decision.
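The policy lives in a single small method:

{{{
def canCopySynchronously(self, source_pubs):
    """Can we afford to copy `source_pubs` synchronously?"""
    # Fixed estimate: up to 100 packages can be copied in acceptable
    # time.  Anything more than that and we go async.
    return len(source_pubs) <= 100
}}}

A test that wants to force the asynchronous path can simply stub it out, e.g. view.canCopySynchronously = FakeMethod(result=False).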

I also extracted the permissions-checking code into check_copy_permissions.  This allows copy_asynchronously to use the same permissions-checking code as the synchronous copy.  It might have been nicer to pull this up into the browser code, so that it's called in only one place, but it's not clear to me what else I might have upset by doing that.
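The asynchronous path calls it once per source package before creating any jobs:

{{{
if check_permissions:
    for spph in source_pubs:
        spn = spph.sourcepackagerelease.sourcepackagename
        check_copy_permissions(
            person, dest_archive, dest_series, dest_pocket, spn)
}}}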

The check itself is fundamentally unchanged.  It would be nice to unit-test this; it's probably tested in too much detail at the higher levels right now.  Cleaning up the testing would make a nice follow-up job, and could perhaps shave some time off the test suite.


== Tests ==

The view mixin is tested here:
{{{
./bin/test -vvc lp.soyuz.browser.tests.test_package_copying_mixin
}}}

The module contains two test cases.  The original test was too slow, so I pared it down: one part runs entirely without the database, mostly taking a millisecond or so per test.  It uses fakes to avoid costly object creation in the factory.  The other goes through more layers, from the view all the way down to the database.  Here too I added some helpers: significant amounts of test time were being wasted creating Persons for all the publishing records, archives, distroseries, etc. that the test needs.
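The main trick in the integration test is a single reusable Person:

{{{
@cachedproperty
def person(self):
    """Create a single person who gets blamed for everything."""
    return self.factory.makePerson()

def makeArchive(self, distribution=None, displayname=None):
    """Create an `Archive`, but quickly by reusing a single Person."""
    if distribution is None:
        distribution = self.makeDistribution()
    return self.factory.makeArchive(
        owner=self.person, distribution=distribution,
        displayname=displayname)
}}}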


== Demo and Q/A ==

On the +localpackagediffs page for a derived distroseries, try synchronizing 100 packages or fewer: it should work as before.

Now try the same with more than 100 packages: this time it will report that it created jobs for n packages, to be processed in due course.


= Launchpad lint =

Checking for conflicts and issues in changed files.

Linting changed files:
  lib/lp/soyuz/browser/tests/test_archive_packages.py
  lib/lp/soyuz/scripts/packagecopier.py
  lib/lp/soyuz/interfaces/distributionjob.py
  lib/lp/soyuz/browser/tests/test_package_copying_mixin.py
  lib/lp/soyuz/browser/archive.py
-- 
https://code.launchpad.net/~jtv/launchpad/bug-780319/+merge/60571
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~jtv/launchpad/bug-780319 into lp:launchpad.
=== modified file 'lib/lp/soyuz/browser/archive.py'
--- lib/lp/soyuz/browser/archive.py	2011-05-05 13:01:50 +0000
+++ lib/lp/soyuz/browser/archive.py	2011-05-11 03:15:55 +0000
@@ -144,6 +144,7 @@
 from lp.soyuz.interfaces.binarypackagebuild import BuildSetStatus
 from lp.soyuz.interfaces.binarypackagename import IBinaryPackageNameSet
 from lp.soyuz.interfaces.component import IComponentSet
+from lp.soyuz.interfaces.distributionjob import IPackageCopyJobSource
 from lp.soyuz.interfaces.packagecopyrequest import IPackageCopyRequestSet
 from lp.soyuz.interfaces.packageset import IPackagesetSet
 from lp.soyuz.interfaces.processor import IProcessorFamilySet
@@ -158,7 +159,10 @@
     BinaryPackagePublishingHistory,
     SourcePackagePublishingHistory,
     )
-from lp.soyuz.scripts.packagecopier import do_copy
+from lp.soyuz.scripts.packagecopier import (
+    check_copy_permissions,
+    do_copy,
+    )
 
 
 class ArchiveBadges(HasBadgeBase):
@@ -1214,15 +1218,156 @@
     _messageNoValue = _("vocabulary-copy-to-same-series", "The same series")
 
 
+def preload_binary_package_names(copies):
+    """Preload `BinaryPackageName`s to speed up display-name construction."""
+    bpn_ids = [
+        copy.binarypackagerelease.binarypackagenameID for copy in copies
+        if isinstance(copy, BinaryPackagePublishingHistory)]
+    load(BinaryPackageName, bpn_ids)
+
+
+def compose_synchronous_copy_feedback(copies, dest_archive, dest_url=None,
+                                      dest_display_name=None):
+    """Compose human-readable feedback after a synchronous copy."""
+    if dest_url is None:
+        dest_url = escape(
+            canonical_url(dest_archive) + '/+packages', quote=True)
+
+    if dest_display_name is None:
+        dest_display_name = escape(dest_archive.displayname)
+
+    if len(copies) == 0:
+        return structured(
+            '<p>All packages already copied to <a href="%s">%s</a>.</p>'
+            % (dest_url, dest_display_name))
+    else:
+        messages = []
+        messages.append(
+            '<p>Packages copied to <a href="%s">%s</a>:</p>'
+            % (dest_url, dest_display_name))
+        messages.append('<ul>')
+        messages.append("\n".join([
+            '<li>%s</li>' % escape(copy) for copy in copies]))
+        messages.append('</ul>')
+        return structured("\n".join(messages))
+
+
+def copy_synchronously(source_pubs, dest_archive, dest_series, dest_pocket,
+                       include_binaries, dest_url=None,
+                       dest_display_name=None, person=None,
+                       check_permissions=True):
+    """Copy packages right now.
+
+    :return: A `structured` with human-readable feedback about the
+        operation.
+    :raises CannotCopy: If `check_permissions` is True and the copy is
+        not permitted.
+    """
+    copies = do_copy(
+        source_pubs, dest_archive, dest_series, dest_pocket, include_binaries,
+        allow_delayed_copies=True, person=person,
+        check_permissions=check_permissions)
+
+    preload_binary_package_names(copies)
+
+    return compose_synchronous_copy_feedback(
+        [copy.displayname for copy in copies], dest_archive, dest_url,
+        dest_display_name)
+
+
+def partition_pubs_by_archive(source_pubs):
+    """Group `source_pubs` by archive.
+
+    :param source_pubs: A sequence of `SourcePackagePublishingHistory`.
+    :return: A dict mapping `Archive`s to the list of entries from
+        `source_pubs` that are in that archive.
+    """
+    by_source_archive = {}
+    for spph in source_pubs:
+        by_source_archive.setdefault(spph.archive, []).append(spph)
+    return by_source_archive
+
+
+def name_pubs_with_versions(source_pubs):
+    """Annotate each entry from `source_pubs` with its version.
+
+    :param source_pubs: A sequence of `SourcePackagePublishingHistory`.
+    :return: A list of tuples (name, version), one for each respective
+        entry in `source_pubs`.
+    """
+    sprs = [spph.sourcepackagerelease for spph in source_pubs]
+    return [(spr.sourcepackagename.name, spr.version) for spr in sprs]
+
+
+def copy_asynchronously(source_pubs, dest_archive, dest_series, dest_pocket,
+                        include_binaries, dest_url=None,
+                        dest_display_name=None, person=None,
+                        check_permissions=True):
+    """Schedule jobs to copy packages later.
+
+    :return: A `structured` with human-readable feedback about the
+        operation.
+    :raises CannotCopy: If `check_permissions` is True and the copy is
+        not permitted.
+    """
+    if check_permissions:
+        for spph in source_pubs:
+            spn = spph.sourcepackagerelease.sourcepackagename
+            check_copy_permissions(
+                person, dest_archive, dest_series, dest_pocket, spn)
+
+    job_source = getUtility(IPackageCopyJobSource)
+    archive_pubs = partition_pubs_by_archive(source_pubs)
+    for source_archive, spphs in archive_pubs.iteritems():
+        job_source.create(
+            name_pubs_with_versions(spphs), source_archive, dest_archive,
+            dest_series, dest_pocket, include_binaries=include_binaries)
+    return structured("""
+        <p>Requested sync of %s packages.</p>
+        <p>Please allow some time for these to be processed.</p>
+        """, len(source_pubs))
+
+
+def render_cannotcopy_as_html(cannotcopy_exception):
+    """Render `CannotCopy` exception as HTML for display in the page."""
+    error_lines = str(cannotcopy_exception).splitlines()
+
+    if len(error_lines) == 1:
+        intro = "The following source cannot be copied:"
+    else:
+        intro = "The following sources cannot be copied:"
+
+    # Produce structured HTML.  Include <li>%s</li> placeholders for
+    # each error line, but have "structured" interpolate the actual
+    # package names.  It will escape them as needed.
+    html_text = """
+        <p>%s</p>
+        <ul>
+        %s
+        </ul>
+        """ % (intro, "<li>%s</li>" * len(error_lines))
+    return structured(html_text, *error_lines)
+
+
 class PackageCopyingMixin:
     """A mixin class that adds helpers for package copying."""
 
+    def canCopySynchronously(self, source_pubs):
+        """Can we afford to copy `source_pubs` synchronously?"""
+        # Fixed estimate: up to 100 packages can be copied in acceptable
+        # time.  Anything more than that and we go async.
+        return len(source_pubs) <= 100
+
     def do_copy(self, sources_field_name, source_pubs, dest_archive,
                 dest_series, dest_pocket, include_binaries,
                 dest_url=None, dest_display_name=None, person=None,
                 check_permissions=True):
         """Copy packages and add appropriate feedback to the browser page.
 
+        This may either copy synchronously, if there are few enough
+        requests to process right now; or asynchronously in which case
+        it will schedule jobs that will be processed by a script.
+
         :param sources_field_name: The name of the form field to set errors
             on when the copy fails
         :param source_pubs: A list of SourcePackagePublishingHistory to copy
@@ -1244,61 +1389,24 @@
         :return: True if the copying worked, False otherwise.
         """
         try:
-            copies = do_copy(
-                source_pubs, dest_archive, dest_series,
-                dest_pocket, include_binaries, allow_delayed_copies=True,
-                person=person, check_permissions=check_permissions)
+            if self.canCopySynchronously(source_pubs):
+                notification = copy_synchronously(
+                    source_pubs, dest_archive, dest_series, dest_pocket,
+                    include_binaries, dest_url=dest_url,
+                    dest_display_name=dest_display_name, person=person,
+                    check_permissions=check_permissions)
+            else:
+                notification = copy_asynchronously(
+                    source_pubs, dest_archive, dest_series, dest_pocket,
+                    include_binaries, dest_url=dest_url,
+                    dest_display_name=dest_display_name, person=person,
+                    check_permissions=check_permissions)
         except CannotCopy, error:
-            messages = []
-            error_lines = str(error).splitlines()
-            if len(error_lines) == 1:
-                messages.append(
-                    "<p>The following source cannot be copied:</p>")
-            else:
-                messages.append(
-                    "<p>The following sources cannot be copied:</p>")
-            messages.append('<ul>')
-            messages.append(
-                "\n".join('<li>%s</li>' % escape(line)
-                    for line in error_lines))
-            messages.append('</ul>')
-
             self.setFieldError(
-                sources_field_name, structured('\n'.join(messages)))
+                sources_field_name, render_cannotcopy_as_html(error))
             return False
 
-        # Preload BPNs to save queries when calculating display names.
-        load(BinaryPackageName, (
-            copy.binarypackagerelease.binarypackagenameID for copy in copies
-            if isinstance(copy, BinaryPackagePublishingHistory)))
-
-        # Present a page notification describing the action.
-        messages = []
-        if dest_url is None:
-            dest_url = escape(
-                canonical_url(dest_archive) + '/+packages',
-                quote=True)
-        if dest_display_name is None:
-            dest_display_name = escape(dest_archive.displayname)
-        if len(copies) == 0:
-            messages.append(
-                '<p>All packages already copied to '
-                '<a href="%s">%s</a>.</p>' % (
-                    dest_url,
-                    dest_display_name))
-        else:
-            messages.append(
-                '<p>Packages copied to <a href="%s">%s</a>:</p>' % (
-                    dest_url,
-                    dest_display_name))
-            messages.append('<ul>')
-            messages.append(
-                "\n".join(['<li>%s</li>' % escape(copy.displayname)
-                           for copy in copies]))
-            messages.append('</ul>')
-
-        notification = "\n".join(messages)
-        self.request.response.addNotification(structured(notification))
+        self.request.response.addNotification(notification)
         return True
 
 

=== modified file 'lib/lp/soyuz/browser/tests/test_archive_packages.py'
--- lib/lp/soyuz/browser/tests/test_archive_packages.py	2011-03-29 00:11:57 +0000
+++ lib/lp/soyuz/browser/tests/test_archive_packages.py	2011-05-11 03:15:55 +0000
@@ -1,4 +1,4 @@
-# Copyright 2010 Canonical Ltd.  This software is licensed under the
+# Copyright 2010-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 # pylint: disable-msg=F0401
@@ -16,7 +16,6 @@
 from testtools.matchers import (
     Equals,
     LessThan,
-    MatchesAny,
     )
 from zope.security.interfaces import Unauthorized
 from zope.security.proxy import removeSecurityProxy

=== added file 'lib/lp/soyuz/browser/tests/test_package_copying_mixin.py'
--- lib/lp/soyuz/browser/tests/test_package_copying_mixin.py	1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/browser/tests/test_package_copying_mixin.py	2011-05-11 03:15:55 +0000
@@ -0,0 +1,310 @@
+# Copyright 2011 Canonical Ltd.  This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+"""Tests for `PackageCopyingMixin`."""
+
+__metaclass__ = type
+
+from zope.component import getUtility
+
+from canonical.testing.layers import ZopelessDatabaseLayer
+from lp.registry.interfaces.pocket import PackagePublishingPocket
+from lp.services.propertycache import cachedproperty
+from lp.soyuz.browser.archive import (
+    compose_synchronous_copy_feedback,
+    copy_asynchronously,
+    copy_synchronously,
+    name_pubs_with_versions,
+    PackageCopyingMixin,
+    partition_pubs_by_archive,
+    render_cannotcopy_as_html,
+    )
+from lp.soyuz.interfaces.archive import CannotCopy
+from lp.soyuz.interfaces.distributionjob import IPackageCopyJobSource
+from lp.soyuz.enums import SourcePackageFormat
+from lp.soyuz.interfaces.sourcepackageformat import (
+    ISourcePackageFormatSelectionSet,
+    )
+from lp.testing import (
+    TestCase,
+    TestCaseWithFactory,
+    )
+from lp.testing.fakemethod import FakeMethod
+from lp.testing.views import create_initialized_view
+
+
+def find_spph_copy(archive, spph):
+    """Find copy of `spph`'s package as copied into `archive`"""
+    spr = spph.sourcepackagerelease
+    return archive.getPublishedSources(
+        name=spr.sourcepackagename.name, version=spr.version).one()
+
+
+class FakeDistribution:
+    def __init__(self):
+        self.archive = FakeArchive()
+
+
+class FakeDistroSeries:
+    def __init__(self):
+        self.distribution = FakeDistribution()
+
+
+class FakeSPN:
+    name = "spn-name"
+
+
+class FakeSPR:
+    def __init__(self):
+        self.sourcepackagename = FakeSPN()
+        self.version = "1.0"
+
+
+class FakeArchive:
+    def __init__(self, displayname="archive-name"):
+        self.displayname = displayname
+
+
+class FakeSPPH:
+    def __init__(self, archive=None):
+        if archive is None:
+            archive = FakeArchive()
+        self.sourcepackagerelease = FakeSPR()
+        self.displayname = "spph-displayname"
+        self.archive = archive
+
+
+class TestPackageCopyingMixinLight(TestCase):
+    """Test lightweight functions and methods.
+
+    This test does not run in a layer and does not access the database.
+    """
+
+    unique_number = 1
+
+    def getPocket(self):
+        """Return an arbitrary `PackagePublishingPocket`."""
+        return PackagePublishingPocket.SECURITY
+
+    def getUniqueString(self):
+        """Return an arbitrary string."""
+        self.unique_number += 1
+        return "string_%d_" % self.unique_number
+
+    def test_canCopySynchronously_allows_small_synchronous_copies(self):
+        # Small numbers of packages can be copied synchronously.
+        packages = [self.getUniqueString() for counter in range(3)]
+        self.assertTrue(PackageCopyingMixin().canCopySynchronously(packages))
+
+    def test_canCopySynchronously_disallows_large_synchronous_copies(self):
+        # Large numbers of packages must be copied asynchronously.
+        packages = [self.getUniqueString() for counter in range(300)]
+        self.assertFalse(PackageCopyingMixin().canCopySynchronously(packages))
+
+    def test_partition_pubs_by_archive_maps_archives_to_pubs(self):
+        # partition_pubs_by_archive returns a dict mapping each archive
+        # to a list of SPPHs on that archive.
+        spph = FakeSPPH()
+        self.assertEqual(
+            {spph.archive: [spph]}, partition_pubs_by_archive([spph]))
+
+    def test_partition_pubs_by_archive_splits_by_archive(self):
+        # partition_pubs_by_archive keeps SPPHs for different archives
+        # separate.
+        spphs = [FakeSPPH() for counter in xrange(2)]
+        mapping = partition_pubs_by_archive(spphs)
+        self.assertEqual(
+            dict((spph.archive, [spph]) for spph in spphs), mapping)
+
+    def test_partition_pubs_by_archive_clusters_by_archive(self):
+        # partition_pubs_by_archive bundles SPPHs for the same archive
+        # into a single dict entry.
+        archive = FakeArchive()
+        spphs = [FakeSPPH(archive=archive) for counter in xrange(2)]
+        mapping = partition_pubs_by_archive(spphs)
+        self.assertEqual([archive], mapping.keys())
+        self.assertContentEqual(spphs, mapping[archive])
+
+    def test_name_pubs_with_versions_lists_packages_and_versions(self):
+        # name_pubs_with_versions returns a list of tuples of source
+        # package name and source package version, one per SPPH.
+        spph = FakeSPPH()
+        spr = spph.sourcepackagerelease
+        self.assertEqual(
+            [(spr.sourcepackagename.name, spr.version)],
+            name_pubs_with_versions([spph]))
+
+    def test_render_cannotcopy_as_html_lists_errors(self):
+        # render_cannotcopy_as_html includes a CannotCopy error message
+        # into its HTML notice.
+        message = self.getUniqueString()
+        html_text = render_cannotcopy_as_html(CannotCopy(message)).escapedtext
+        self.assertIn(message, html_text)
+
+    def test_render_cannotcopy_as_html_escapes_error(self):
+        # render_cannotcopy_as_html escapes error messages.
+        message = "x<>y"
+        html_text = render_cannotcopy_as_html(CannotCopy(message)).escapedtext
+        self.assertNotIn(message, html_text)
+        self.assertIn("x&lt;&gt;y", html_text)
+
+    def test_compose_synchronous_copy_feedback_escapes_archive_name(self):
+        # compose_synchronous_copy_feedback escapes archive displaynames.
+        archive = FakeArchive(displayname="a&b")
+        notice = compose_synchronous_copy_feedback(
+            ["hi"], archive, dest_url="/")
+        html_text = notice.escapedtext
+        self.assertNotIn("a&b", html_text)
+        self.assertIn("a&amp;b", html_text)
+
+    def test_compose_synchronous_copy_feedback_escapes_package_names(self):
+        # compose_synchronous_copy_feedback escapes package names.
+        archive = FakeArchive()
+        notice = compose_synchronous_copy_feedback(
+            ["x<y"], archive, dest_url="/")
+        html_text = notice.escapedtext
+        self.assertNotIn("x<y", html_text)
+        self.assertIn("x&lt;y", html_text)
+
+    def test_copy_synchronously_checks_permissions(self):
+        # Unless told not to, copy_synchronously does a permissions
+        # check.
+        pocket = self.getPocket()
+        self.assertRaises(
+            CannotCopy,
+            copy_synchronously,
+            [FakeSPPH()], FakeArchive(), FakeDistroSeries(), pocket, False)
+
+    def test_copy_asynchronously_checks_permissions(self):
+        # Unless told not to, copy_asynchronously does a permissions
+        # check.
+        pocket = self.getPocket()
+        self.assertRaises(
+            CannotCopy,
+            copy_asynchronously,
+            [FakeSPPH()], FakeArchive(), FakeDistroSeries(), pocket, False)
+
+
+class TestPackageCopyingMixinIntegration(TestCaseWithFactory):
+    """Integration tests for `PackageCopyingMixin`."""
+
+    layer = ZopelessDatabaseLayer
+
+    @cachedproperty
+    def person(self):
+        """Create a single person who gets blamed for everything.
+
+        Creating SPPHs, Archives etc. in the factory creates lots of
+        `Person`s, which turns out to be really slow.  Tests that don't
+        care who's who can use this single person for all uninteresting
+        Person fields.
+        """
+        return self.factory.makePerson()
+
+    def makeDistribution(self):
+        """Create a `Distribution`, but quickly by reusing a single Person."""
+        return self.factory.makeDistribution(
+            owner=self.person, registrant=self.person)
+
+    def makeDistroSeries(self, distribution=None, parent_series=None):
+        """Create a `DistroSeries`, but quickly by reusing a single Person."""
+        if distribution is None:
+            distribution = self.makeDistribution()
+        return self.factory.makeDistroSeries(
+            distribution=distribution, parent_series=parent_series,
+            registrant=self.person)
+
+    def makeArchive(self, distribution=None, displayname=None):
+        """Create an `Archive`, but quickly by reusing a single Person."""
+        if distribution is None:
+            distribution = self.makeDistribution()
+        return self.factory.makeArchive(
+            owner=self.person, distribution=distribution,
+            displayname=displayname)
+
+    def makeSPPH(self, archive=None):
+        """Create a `SourcePackagePublishingHistory` quickly."""
+        if archive is None:
+            archive = self.makeArchive()
+        return self.factory.makeSourcePackagePublishingHistory(
+            maintainer=self.person, creator=self.person, archive=archive)
+
+    def makeDerivedSeries(self):
+        """Create a derived `DistroSeries`, quickly."""
+        series = self.makeDistroSeries(parent_series=self.makeDistroSeries())
+        getUtility(ISourcePackageFormatSelectionSet).add(
+            series, SourcePackageFormat.FORMAT_1_0)
+        return series
+
+    def makeView(self):
+        """Create a `PackageCopyingMixin`-based view."""
+        return create_initialized_view(
+            self.makeDerivedSeries(), "+localpackagediffs")
+
+    def test_copy_synchronously_copies_packages(self):
+        # copy_synchronously copies packages into the destination
+        # archive.
+        spph = self.makeSPPH()
+        dest_series = self.makeDerivedSeries()
+        archive = dest_series.distribution.main_archive
+        pocket = self.factory.getAnyPocket()
+        copy_synchronously(
+            [spph], archive, dest_series, pocket, include_binaries=False,
+            check_permissions=False)
+        self.assertNotEqual(None, find_spph_copy(archive, spph))
+
+    def test_copy_asynchronously_does_not_copy_packages(self):
+        # copy_asynchronously does not copy packages into the destination
+        # archive; that happens later, asynchronously.
+        spph = self.makeSPPH()
+        dest_series = self.makeDerivedSeries()
+        archive = dest_series.distribution.main_archive
+        pocket = self.factory.getAnyPocket()
+        copy_asynchronously(
+            [spph], archive, dest_series, pocket, include_binaries=False,
+            check_permissions=False)
+        self.assertEqual(None, find_spph_copy(archive, spph))
+
+    def test_copy_synchronously_lists_packages(self):
+        # copy_synchronously returns feedback that includes the names of
+        # packages it copied.
+        spph = self.makeSPPH()
+        dest_series = self.makeDerivedSeries()
+        pocket = self.factory.getAnyPocket()
+        notice = copy_synchronously(
+            [spph], dest_series.distribution.main_archive, dest_series,
+            pocket, include_binaries=False,
+            check_permissions=False).escapedtext
+        self.assertIn(
+            spph.sourcepackagerelease.sourcepackagename.name, notice)
+
+    def test_copy_asynchronously_creates_copy_jobs(self):
+        # copy_asynchronously creates PackageCopyJobs.
+        spph = self.makeSPPH()
+        dest_series = self.makeDerivedSeries()
+        pocket = self.factory.getAnyPocket()
+        archive = dest_series.distribution.main_archive
+        copy_asynchronously(
+            [spph], archive, dest_series, pocket, include_binaries=False,
+            check_permissions=False)
+        jobs = list(getUtility(IPackageCopyJobSource).getActiveJobs(archive))
+        self.assertEqual(1, len(jobs))
+        spr = spph.sourcepackagerelease
+        self.assertEqual(
+            [[spr.sourcepackagename.name, spr.version]],
+            jobs[0].metadata['source_packages'])
+
+    def test_do_copy_goes_async_if_canCopySynchronously_says_so(self):
+        # The view opts for asynchronous copying if canCopySynchronously
+        # returns False.  This creates PackageCopyJobs.
+        spph = self.makeSPPH()
+        pocket = self.factory.getAnyPocket()
+        view = self.makeView()
+        dest_series = view.context
+        archive = dest_series.distribution.main_archive
+        view.canCopySynchronously = FakeMethod(result=False)
+        view.do_copy(
+            'selected_differences', [spph], archive, dest_series, pocket,
+            False, check_permissions=False)
+        jobs = list(getUtility(IPackageCopyJobSource).getActiveJobs(archive))
+        self.assertNotEqual([], jobs)

=== modified file 'lib/lp/soyuz/interfaces/distributionjob.py'
--- lib/lp/soyuz/interfaces/distributionjob.py	2011-05-05 17:59:36 +0000
+++ lib/lp/soyuz/interfaces/distributionjob.py	2011-05-11 03:15:55 +0000
@@ -103,16 +103,16 @@
 class IPackageCopyJobSource(IJobSource):
     """An interface for acquiring IIPackageCopyJobs."""
 
-    def create(cls, source_archive, source_packages,
+    def create(cls, source_packages, source_archive,
                target_archive, target_distroseries, target_pocket,
                include_binaries=False):
         """Create a new sync package job.
 
-        :param source_archive: The `IArchive` in which `source_packages` are
-            found.
         :param source_packages: This is an iterable of `(source_package_name,
             version)` tuples, where both `source_package_name` and `version`
             are strings.
+        :param source_archive: The `IArchive` in which `source_packages` are
+            found.
         :param target_archive: The `IArchive` to which to copy the packages.
         :param target_distroseries: The `IDistroSeries` to which to copy the
             packages.

=== modified file 'lib/lp/soyuz/scripts/packagecopier.py'
--- lib/lp/soyuz/scripts/packagecopier.py	2011-05-10 19:02:47 +0000
+++ lib/lp/soyuz/scripts/packagecopier.py	2011-05-11 03:15:55 +0000
@@ -1,4 +1,4 @@
-# Copyright 2009 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """PackageCopier utilities."""
@@ -9,6 +9,7 @@
     'PackageCopier',
     'UnembargoSecurityPackage',
     'CopyChecker',
+    'check_copy_permissions',
     'do_copy',
     '_do_delayed_copy',
     '_do_direct_copy',
@@ -51,6 +52,7 @@
     )
 from lp.soyuz.scripts.processaccepted import close_bugs_for_sourcepublication
 
+
 # XXX cprov 2009-06-12: This function could be incorporated in ILFA,
 # I just don't see a clear benefit in doing that right now.
 def re_upload_file(libraryfile, restricted=False):
@@ -194,6 +196,46 @@
             return {'status': BuildSetStatus.NEEDSBUILD}
 
 
+def check_copy_permissions(person, archive, series, pocket,
+                           sourcepackagename):
+    """Check that `person` has permission to copy a package.
+
+    :param person: User attempting the upload.
+    :param archive: Destination `Archive`.
+    :param series: Destination `DistroSeries`.
+    :param pocket: Destination `Pocket`.
+    :param sourcepackagename: `SourcePackageName` of the package to be
+        copied.
+    :raises CannotCopy: If the copy is not allowed.
+    """
+    if person is None:
+        raise CannotCopy("Cannot check copy permissions (no requester).")
+
+    # If there is a requester, check that he has upload permission into
+    # the destination (archive, component, pocket). This check is done
+    # here rather than in the security adapter because it requires more
+    # info than is available in the security adapter.
+    package = series.getSourcePackage(sourcepackagename)
+    destination_component = package.latest_published_component
+
+    # If destination_component is not None, make sure the person
+    # has upload permission for this component.  Otherwise, any
+    # upload permission on this archive will do.
+    strict_component = destination_component is not None
+    reason = archive.checkUpload(
+        person, series, sourcepackagename, destination_component, pocket,
+        strict_component=strict_component)
+
+    if reason is not None:
+        # launchpad.Append on the main archive is sufficient
+        # to copy arbitrary packages. This allows for
+        # ubuntu's security team to sync sources into the
+        # primary archive (bypassing the queue and
+        # annoncements).
+        if not check_permission('launchpad.Append', series.main_archive):
+            raise CannotCopy(reason)
+
+
 class CopyChecker:
     """Check copy candiates.
 
@@ -399,11 +441,8 @@
         :raise CannotCopy when a copy is not allowed to be performed
             containing the reason of the error.
         """
-        # If there is a requester, check that he has upload permission
-        # into the destination (archive, component, pocket). This check
-        # is done here rather than in the security adapter because it
-        # requires more info than is available in the security adapter.
         if check_permissions:
+<<<<<<< TREE
             if person is None:
                 raise CannotCopy(
                     'Cannot check copy permissions (no requester).')
@@ -421,6 +460,11 @@
                     pocket, strict_component=strict_component)
                 if reason is not None:
                     raise CannotCopy(reason)
+=======
+            check_copy_permissions(
+                person, self.archive, series, pocket,
+                source.sourcepackagerelease.sourcepackagename)
+>>>>>>> MERGE-SOURCE
 
         if series not in self.archive.distribution.series:
             raise CannotCopy(