launchpad-reviewers team mailing list archive

[Merge] lp:~wgrant/launchpad/bug-629835-copier-architectures into lp:launchpad

 

William Grant has proposed merging lp:~wgrant/launchpad/bug-629835-copier-architectures into lp:launchpad with lp:~wgrant/launchpad/bug-701383-ppa-component-override as a prerequisite.

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)
Related bugs:
  #629835 Package copier doesn't restrict builds to the target distroseries' architectures
  https://bugs.launchpad.net/bugs/629835
  #649859 Architecture-independent population logic is duplicated
  https://bugs.launchpad.net/bugs/649859
  #701357 Should not copy binaries into disabled architectures
  https://bugs.launchpad.net/bugs/701357

For more details, see:
https://code.launchpad.net/~wgrant/launchpad/bug-629835-copier-architectures/+merge/46233

This branch fixes three architecture-related copy bugs:

 - Bug #629835: Delayed copies didn't take the target series'
   architectures into account, so they could attempt to publish builds
   into a non-existent DistroArchSeries, crashing process-accepted.py.
   I fixed _do_delayed_copy to skip any build with no corresponding DAS
   in the target series.

 - Bug #649859: PackageUploadBuild.publish (build publishing) and
   PublishingSet.copyBinariesTo (binary copying) had separate buggy
   implementations of the same architecture-independent copying logic.
   I factored both out into PublishingSet.publishBinary, which wraps
   PublishingSet.newBinaryPublication to automatically create all
   relevant publications (a sketch of the resulting flow follows this
   list).
  
 - Bug #701357: We recently added the DistroArchSeries.enabled flag,
   which allows us to prevent builds and publications for a particular
   architecture, effectively removing it. However, copies did not
   respect this, and would happily create new publications in a
   disabled DAS. _do_delayed_copy and copyBinariesTo have been fixed to
   skip builds targeting disabled DASes, and newBinaryPublication now
   refuses to create publications in them.
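
The sketch below illustrates the intended flow in plain Python. It is only
an approximation under simplified assumptions: the DistroSeries,
DistroArchSeries and Binary classes are hypothetical stand-ins rather than
Launchpad's real model objects, and the real code also checks for existing
publications in the destination before copying.

    # Minimal sketch of the consolidated copy/publish flow; all classes and
    # helpers are simplified stand-ins, not Launchpad's real model objects.
    from dataclasses import dataclass, field
    from typing import Dict, List, Optional


    @dataclass
    class DistroArchSeries:
        arch_tag: str
        enabled: bool = True


    @dataclass
    class DistroSeries:
        architectures: Dict[str, DistroArchSeries] = field(default_factory=dict)

        def get(self, arch_tag: str) -> Optional[DistroArchSeries]:
            return self.architectures.get(arch_tag)

        @property
        def enabled_architectures(self) -> List[DistroArchSeries]:
            return [das for das in self.architectures.values() if das.enabled]


    @dataclass
    class Binary:
        name: str
        build_arch_tag: str
        architecture_specific: bool


    def publish_binary(binary, das, series):
        """Fan one binary out to its target architectures (bug #649859)."""
        if binary.architecture_specific:
            targets = [das]
        else:
            # Arch-indep binaries go to every *enabled* architecture.
            targets = series.enabled_architectures
        return ["%s in %s" % (binary.name, t.arch_tag)
                for t in targets if t.enabled]


    def copy_binaries_to(binaries, series):
        """Copy binaries, skipping missing or disabled target arches."""
        copies = []
        for binary in binaries:
            target = series.get(binary.build_arch_tag)
            if target is None or not target.enabled:
                # Bugs #629835/#701357: no corresponding enabled DAS in the
                # destination series, so skip this build entirely.
                continue
            copies.extend(publish_binary(binary, target, series))
        return copies


    if __name__ == "__main__":
        nobby = DistroSeries({
            "i386": DistroArchSeries("i386"),
            "hppa": DistroArchSeries("hppa", enabled=False),
        })
        bins = [Binary("foo-bin", "i386", architecture_specific=True),
                Binary("foo-data", "i386", architecture_specific=False)]
        # hppa is disabled, so nothing is published there.
        print(copy_binaries_to(bins, nobby))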

-- 
https://code.launchpad.net/~wgrant/launchpad/bug-629835-copier-architectures/+merge/46233
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~wgrant/launchpad/bug-629835-copier-architectures into lp:launchpad.
=== modified file 'lib/lp/soyuz/interfaces/publishing.py'
--- lib/lp/soyuz/interfaces/publishing.py	2010-12-09 16:22:27 +0000
+++ lib/lp/soyuz/interfaces/publishing.py	2011-01-14 05:48:21 +0000
@@ -878,6 +878,29 @@
             publishing histories.
         """
 
+    def publishBinary(archive, binarypackagerelease, distroarchseries,
+                      component, section, priority, pocket):
+        """Publish a `BinaryPackageRelease` in an archive.
+
+        Creates one or more `IBinaryPackagePublishingHistory` records,
+        handling architecture-independent and DDEB publications transparently.
+
+        Note that binaries will only be copied if they don't already exist in
+        the target; this method cannot be used to change overrides.
+
+        :param archive: The target `IArchive`.
+        :param binarypackagerelease: The `IBinaryPackageRelease` to copy.
+        :param distroarchseries: An `IDistroArchSeries`. If the binary is
+            architecture-independent, it will be published to all enabled
+            architectures in this series.
+        :param component: The target `IComponent`.
+        :param section: The target `ISection`.
+        :param priority: The target `PackagePublishingPriority`.
+        :param pocket: The target `PackagePublishingPocket`.
+
+        :return: A list of new `IBinaryPackagePublishingHistory` records.
+        """
+
     def newBinaryPublication(archive, binarypackagerelease, distroarchseries,
                              component, section, priority, pocket):
         """Create a new `BinaryPackagePublishingHistory`.

=== modified file 'lib/lp/soyuz/model/publishing.py'
--- lib/lp/soyuz/model/publishing.py	2011-01-14 05:48:19 +0000
+++ lib/lp/soyuz/model/publishing.py	2011-01-14 05:48:21 +0000
@@ -92,6 +92,7 @@
     ISourcePackagePublishingHistory,
     PoolFileOverwriteError,
     )
+from lp.soyuz.interfaces.queue import QueueInconsistentStateError
 from lp.soyuz.model.binarypackagename import BinaryPackageName
 from lp.soyuz.model.binarypackagerelease import (
     BinaryPackageRelease,
@@ -1259,58 +1260,67 @@
     def copyBinariesTo(self, binaries, distroseries, pocket, archive):
         """See `IPublishingSet`."""
         secure_copies = []
-
         for binary in binaries:
-            binarypackagerelease = binary.binarypackagerelease
-
-            # XXX 2010-09-28 Julian bug=649859
-            # This piece of code duplicates the logic in
-            # PackageUploadBuild.publish(), it needs to be refactored.
-
-            if binarypackagerelease.architecturespecific:
-                # If the binary is architecture specific and the target
-                # distroseries does not include the architecture then we
-                # skip the binary and continue.
-                try:
-                    # For safety, we use the architecture the binary was
-                    # built, and not the one it is published, coping with
-                    # single arch-indep publications for architectures that
-                    # do not exist in the destination series.
-                    # See #387589 for more information.
-                    target_architecture = distroseries[
-                        binarypackagerelease.build.arch_tag]
-                except NotFoundError:
-                    continue
-                destination_architectures = [target_architecture]
-            else:
-                destination_architectures = [
-                    arch for arch in distroseries.architectures
-                    if arch.enabled]
-
-            for distroarchseries in destination_architectures:
-
-                # We only copy the binary if it doesn't already exist
-                # in the destination.
-                binary_in_destination = archive.getAllPublishedBinaries(
-                    name=binarypackagerelease.name, exact_match=True,
-                    version=binarypackagerelease.version,
-                    status=active_publishing_status, pocket=pocket,
-                    distroarchseries=distroarchseries)
-
-                if binary_in_destination.count() == 0:
-                    pub = self.newBinaryPublication(
-                        archive, binarypackagerelease, distroarchseries,
-                        binary.component, binary.section, binary.priority,
-                        pocket)
-                    secure_copies.append(pub)
-
+            # XXX: wgrant 2011-01-10:
+            # This will go wrong if nominatedarchindep gets deleted in a
+            # future series.
+            try:
+                build = binary.binarypackagerelease.build
+                target_architecture = distroseries[
+                    build.distro_arch_series.architecturetag]
+            except NotFoundError:
+                continue
+            if not target_architecture.enabled:
+                continue
+            secure_copies.extend(
+                getUtility(IPublishingSet).publishBinary(
+                    archive, binary.binarypackagerelease, target_architecture,
+                    binary.component, binary.section, binary.priority,
+                    pocket))
         return secure_copies
 
+    def publishBinary(self, archive, binarypackagerelease, distroarchseries,
+                      component, section, priority, pocket):
+        """See `IPublishingSet`."""
+        if not binarypackagerelease.architecturespecific:
+            target_archs = distroarchseries.distroseries.enabled_architectures
+        else:
+            target_archs = [distroarchseries]
+
+        # DDEBs targeted to the PRIMARY archive are published in the
+        # corresponding DEBUG archive.
+        if binarypackagerelease.binpackageformat == BinaryPackageFormat.DDEB:
+            debug_archive = archive.debug_archive
+            if debug_archive is None:
+                raise QueueInconsistentStateError(
+                    "Could not find the corresponding DEBUG archive "
+                    "for %s" % (archive.displayname))
+            archive = debug_archive
+
+        published_binaries = []
+        for arch in target_archs:
+            # We only publish the binary if it doesn't already exist in
+            # the destination. Note that this means we don't support
+            # override changes on their own.
+            binaries_in_destination = archive.getAllPublishedBinaries(
+                name=binarypackagerelease.name, exact_match=True,
+                version=binarypackagerelease.version,
+                status=active_publishing_status, pocket=pocket,
+                distroarchseries=distroarchseries)
+            if binaries_in_destination.count() == 0:
+                published_binaries.append(
+                    getUtility(IPublishingSet).newBinaryPublication(
+                        archive, binarypackagerelease, arch, component,
+                        section, priority, pocket))
+        return published_binaries
+
     def newBinaryPublication(self, archive, binarypackagerelease,
                              distroarchseries, component, section, priority,
                              pocket):
         """See `IPublishingSet`."""
-        pub = BinaryPackagePublishingHistory(
+        assert distroarchseries.enabled, (
+            "Will not create new publications in a disabled architecture.")
+        return BinaryPackagePublishingHistory(
             archive=archive,
             binarypackagerelease=binarypackagerelease,
             distroarchseries=distroarchseries,
@@ -1322,8 +1332,6 @@
             datecreated=UTC_NOW,
             pocket=pocket)
 
-        return pub
-
     def newSourcePublication(self, archive, sourcepackagerelease,
                              distroseries, component, section, pocket,
                              ancestor=None):

=== modified file 'lib/lp/soyuz/model/queue.py'
--- lib/lp/soyuz/model/queue.py	2010-11-06 06:09:45 +0000
+++ lib/lp/soyuz/model/queue.py	2011-01-14 05:48:21 +0000
@@ -1448,54 +1448,24 @@
             target_das.distroseries.name,
             build_archtag))
 
-        # Get the other enabled distroarchseries for this
-        # distroseries.  If the binary is architecture independent then
-        # we need to publish it in all of those too.
-
-        # XXX Julian 2010-09-28 bug=649859
-        # This logic is duplicated in
-        # PackagePublishingSet.copyBinariesTo() and should be
-        # refactored.
-        other_das = set(
-            arch for arch in self.packageupload.distroseries.architectures
-            if arch.enabled)
-        other_das = other_das - set([target_das])
         # First up, publish everything in this build into that dar.
         published_binaries = []
         for binary in self.build.binarypackages:
-            target_dars = set([target_das])
-            if not binary.architecturespecific:
-                target_dars = target_dars.union(other_das)
-                debug(logger, "... %s/%s (Arch Independent)" % (
-                    binary.binarypackagename.name,
-                    binary.version))
-            else:
-                debug(logger, "... %s/%s (Arch Specific)" % (
-                    binary.binarypackagename.name,
-                    binary.version))
-
-
-            archive = self.packageupload.archive
-            # DDEBs targeted to the PRIMARY archive are published in the
-            # corresponding DEBUG archive.
-            if binary.binpackageformat == BinaryPackageFormat.DDEB:
-                debug_archive = archive.debug_archive
-                if debug_archive is None:
-                    raise QueueInconsistentStateError(
-                        "Could not find the corresponding DEBUG archive "
-                        "for %s" % (archive.displayname))
-                archive = debug_archive
-
-            for each_target_dar in target_dars:
-                bpph = getUtility(IPublishingSet).newBinaryPublication(
-                    archive=archive,
+            debug(
+                logger, "... %s/%s (Arch %s)" % (
+                binary.binarypackagename.name,
+                binary.version,
+                'Specific' if binary.architecturespecific else 'Independent',
+                ))
+            published_binaries.extend(
+                getUtility(IPublishingSet).publishBinary(
+                    archive=self.packageupload.archive,
                     binarypackagerelease=binary,
-                    distroarchseries=each_target_dar,
+                    distroarchseries=target_das,
                     component=binary.component,
                     section=binary.section,
                     priority=binary.priority,
-                    pocket=self.packageupload.pocket)
-                published_binaries.append(bpph)
+                    pocket=self.packageupload.pocket))
         return published_binaries
 
 

=== modified file 'lib/lp/soyuz/scripts/packagecopier.py'
--- lib/lp/soyuz/scripts/packagecopier.py	2010-08-27 11:19:54 +0000
+++ lib/lp/soyuz/scripts/packagecopier.py	2011-01-14 05:48:21 +0000
@@ -25,6 +25,7 @@
 
 from canonical.launchpad.interfaces.librarian import ILibraryFileAliasSet
 from canonical.librarian.utils import copy_and_close
+from lp.app.errors import NotFoundError
 from lp.buildmaster.enums import BuildStatus
 from lp.soyuz.adapters.packagelocation import build_package_location
 from lp.soyuz.enums import ArchivePurpose
@@ -635,7 +636,14 @@
     # If binaries are included in the copy we include binary custom files.
     if include_binaries:
         for build in source.getBuilds():
-            if build.status != BuildStatus.FULLYBUILT:
+            # Don't copy builds that aren't yet done, or those without a
+            # corresponding enabled architecture in the new series.
+            try:
+                target_arch = series[build.arch_tag]
+            except NotFoundError:
+                continue
+            if (not target_arch.enabled or
+                build.status != BuildStatus.FULLYBUILT):
                 continue
             delayed_copy.addBuild(build)
             original_build_upload = build.package_upload

=== modified file 'lib/lp/soyuz/scripts/tests/test_copypackage.py'
--- lib/lp/soyuz/scripts/tests/test_copypackage.py	2010-12-23 01:02:00 +0000
+++ lib/lp/soyuz/scripts/tests/test_copypackage.py	2011-01-14 05:48:21 +0000
@@ -929,15 +929,100 @@
         self.assertFalse(checked_copy.delayed)
 
 
-class DoDirectCopyTestCase(TestCaseWithFactory):
+class BaseDoCopyTests:
 
     layer = LaunchpadZopelessLayer
 
+    def createNobby(self, archs):
+        """Create a new 'nobby' series with the given architecture tags.
+
+        The first is used as nominatedarchindep.
+        """
+        nobby = self.factory.makeDistroSeries(
+            distribution=self.test_publisher.ubuntutest, name='nobby')
+        for arch in archs:
+            pf = self.factory.makeProcessorFamily(name='my_%s' % arch)
+            self.factory.makeDistroArchSeries(
+                distroseries=nobby, architecturetag=arch,
+                processorfamily=pf)
+        nobby.nominatedarchindep = nobby[archs[0]]
+        self.test_publisher.addFakeChroots(nobby)
+        return nobby
+
+    def assertCopied(self, copies, series, arch_tags):
+        raise NotImplementedError
+
+    def doCopy(self, source, archive, series, pocket, include_binaries):
+        raise NotImplementedError
+
+    def test_does_not_copy_disabled_arches(self):
+        # When copying binaries to a new series, we must not copy any
+        # into disabled architectures.
+
+        # Make a new architecture-specific source and binary.
+        archive = self.factory.makeArchive(
+            distribution=self.test_publisher.ubuntutest, virtualized=False)
+        source = self.test_publisher.getPubSource(
+            archive=archive, architecturehintlist='any')
+        [bin_i386, bin_hppa] = self.test_publisher.getPubBinaries(
+            pub_source=source)
+
+        # Now make a new distroseries with two architectures, one of
+        # which is disabled.
+        nobby = self.createNobby(('i386', 'hppa'))
+        nobby['hppa'].enabled = False
+
+        # Now we can copy the package with binaries.
+        target_archive = self.factory.makeArchive(
+            distribution=self.test_publisher.ubuntutest, virtualized=False)
+        copies = self.doCopy(
+            source, target_archive, nobby, source.pocket, True)
+
+        # The binary should not be published for hppa.
+        self.assertCopied(copies, nobby, ('i386',))
+
+    def test_does_not_copy_removed_arches(self):
+        # When copying binaries to a new series, we must not try to copy
+        # any into architectures that no longer exist.
+
+        # Make a new architecture-specific source and binary.
+        archive = self.factory.makeArchive(
+            distribution=self.test_publisher.ubuntutest, virtualized=False)
+        source = self.test_publisher.getPubSource(
+            archive=archive, architecturehintlist='any')
+        [bin_i386, bin_hppa] = self.test_publisher.getPubBinaries(
+            pub_source=source)
+
+        # Now make a new distroseries with only i386.
+        nobby = self.createNobby(('i386',))
+
+        # Now we can copy the package with binaries.
+        target_archive = self.factory.makeArchive(
+            distribution=self.test_publisher.ubuntutest, virtualized=False)
+        copies = self.doCopy(
+            source, target_archive, nobby, source.pocket, True)
+
+        # The copy succeeds, and no hppa publication is present.
+        self.assertCopied(copies, nobby, ('i386',))
+
+
+class TestDoDirectCopy(TestCaseWithFactory, BaseDoCopyTests):
+
     def setUp(self):
-        super(DoDirectCopyTestCase, self).setUp()
+        super(TestDoDirectCopy, self).setUp()
         self.test_publisher = SoyuzTestPublisher()
         self.test_publisher.prepareBreezyAutotest()
 
+    def assertCopied(self, copies, series, arch_tags):
+        self.assertEquals(
+            [u'foo 666 in %s' % series.name] +
+            [u'foo-bin 666 in %s %s' % (series.name, arch_tag)
+             for arch_tag in arch_tags],
+            [copy.displayname for copy in copies])
+
+    def doCopy(self, source, archive, series, pocket, include_binaries):
+        return _do_direct_copy(source, archive, series, pocket, True)
+
     def testCanCopyArchIndependentBinariesBuiltInAnUnsupportedArch(self):
         # _do_direct_copy() uses the binary candidate build architecture,
         # instead of the publish one, in other to check if it's
@@ -968,14 +1053,11 @@
         self.layer.txn.commit()
 
         # Copy succeeds.
-        copies = _do_direct_copy(
-            source, source.archive, hoary_test, source.pocket, True)
-        self.assertEquals(
-            ['foo 666 in hoary-test',
-             'foo-bin 666 in hoary-test amd64',
-             'foo-bin 666 in hoary-test i386',
-             ],
-            [copy.displayname for copy in copies])
+        target_archive = self.factory.makeArchive(
+            distribution=self.test_publisher.ubuntutest, virtualized=False)
+        copies = self.doCopy(
+            source, target_archive, hoary_test, source.pocket, True)
+        self.assertCopied(copies, hoary_test, ('amd64', 'i386'))
 
     def test_copying_arch_indep_binaries_with_disabled_arches(self):
         # When copying an arch-indep binary to a new series, we must not
@@ -991,42 +1073,31 @@
 
         # Now make a new distroseries with two architectures, one of
         # which is disabled.
-        nobby = self.factory.makeDistroSeries(
-            distribution=self.test_publisher.ubuntutest, name='nobby')
-        i386_pf = self.factory.makeProcessorFamily(name='my_i386')
-        nobby_i386 = self.factory.makeDistroArchSeries(
-            distroseries=nobby, architecturetag='i386',
-            processorfamily=i386_pf)
-        hppa_pf = self.factory.makeProcessorFamily(name='my_hppa')
-        nobby_hppa = self.factory.makeDistroArchSeries(
-            distroseries=nobby, architecturetag='hppa',
-            processorfamily=hppa_pf)
-        nobby_hppa.enabled = False
-        nobby.nominatedarchindep = nobby_i386
-        self.test_publisher.addFakeChroots(nobby)
+        nobby = self.createNobby(('i386', 'hppa'))
+        nobby['hppa'].enabled = False
 
         # Now we can copy the package with binaries.
-        copies = _do_direct_copy(
-            source, source.archive, nobby, source.pocket, True)
+        target_archive = self.factory.makeArchive(
+            distribution=self.test_publisher.ubuntutest, virtualized=False)
+        copies = self.doCopy(
+            source, target_archive, nobby, source.pocket, True)
 
         # The binary should not be published for hppa.
-        self.assertEquals(
-            [u'foo 666 in nobby',
-             u'foo-bin 666 in nobby i386',],
-            [copy.displayname for copy in copies])
-
-
-class DoDelayedCopyTestCase(TestCaseWithFactory):
+        self.assertCopied(copies, nobby, ('i386',))
+
+
+class TestDoDelayedCopy(TestCaseWithFactory, BaseDoCopyTests):
 
     layer = LaunchpadZopelessLayer
     dbuser = config.archivepublisher.dbuser
 
     def setUp(self):
-        super(DoDelayedCopyTestCase, self).setUp()
+        super(TestDoDelayedCopy, self).setUp()
+
         self.test_publisher = SoyuzTestPublisher()
+        self.test_publisher.prepareBreezyAutotest()
 
         # Setup to copy into the main archive security pocket
-        self.test_publisher.prepareBreezyAutotest()
         self.copy_archive = self.test_publisher.ubuntutest.main_archive
         self.copy_series = self.test_publisher.distroseries
         self.copy_pocket = PackagePublishingPocket.SECURITY
@@ -1036,6 +1107,17 @@
         self.test_publisher.breezy_autotest.status = (
             SeriesStatus.CURRENT)
 
+    def assertCopied(self, copy, series, arch_tags):
+        self.assertEquals(
+            copy.sources[0].sourcepackagerelease.title,
+            'foo - 666')
+        self.assertEquals(
+            sorted(arch_tags),
+            sorted([pub.build.arch_tag for pub in copy.builds]))
+
+    def doCopy(self, source, archive, series, pocket, include_binaries):
+        return _do_delayed_copy(source, archive, series, pocket, True)
+
     def createDelayedCopyContext(self):
         """Create a context to allow delayed-copies test.