launchpad-reviewers team mailing list archive

[Merge] lp:~jtv/launchpad/pre-832661 into lp:launchpad

 

Jeroen T. Vermeulen has proposed merging lp:~jtv/launchpad/pre-832661 into lp:launchpad with lp:~jtv/launchpad/soyuz-lint-pre-cleanup as a prerequisite.

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)
Related bugs:
  Bug #832661 in Launchpad itself: "Create DSDJs from PublishingSet.requestDeletion"
  https://bugs.launchpad.net/launchpad/+bug/832661

For more details, see:
https://code.launchpad.net/~jtv/launchpad/pre-832661/+merge/72847

= Summary =

I need to extend PublishingSet.requestDeletion to create DistroSeriesDifferenceJobs (DSDJs).  Before I can reasonably do that, I need to streamline it a bit (together with its brethren in a few other classes).

Rough overview of the changes:

BinaryPackagePublishingHistory (BPPH) and SourcePackagePublishingHistory (SPPH) share a common base class.  That base class has its own requestDeletion, which updates some fields on self and then creates DSDJs *if self is an SPPH*.  I extracted the invariant part of the method into setDeleted, and gave BPPH and SPPH each their own requestDeletion: the BPPH one just calls setDeleted whereas the SPPH one calls setDeleted and creates the DSDJs.
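
To make the shape concrete, here is a much-simplified sketch of the resulting hierarchy (the authoritative code is in the publishing.py hunk below; PublishingBase is just an illustrative stand-in for the real shared base class, and the usual imports are elided):

{{{
class PublishingBase:
    # Illustrative stand-in for the real shared base class of SPPH and BPPH.

    def setDeleted(self, removed_by, removal_comment=None):
        # The invariant part: flag this one record as deleted.
        self.status = PackagePublishingStatus.DELETED
        self.datesuperseded = UTC_NOW
        self.removed_by = removed_by
        self.removal_comment = removal_comment


class BinaryPackagePublishingHistory(PublishingBase):

    def requestDeletion(self, removed_by, removal_comment=None):
        # Binaries never need DSDJs.
        self.setDeleted(removed_by, removal_comment)


class SourcePackagePublishingHistory(PublishingBase):

    def requestDeletion(self, removed_by, removal_comment=None):
        self.setDeleted(removed_by, removal_comment)
        if self.archive == self.distroseries.main_archive:
            # Only publications in a distribution's main archive get DSDJs.
            dsd_job_source = getUtility(IDistroSeriesDifferenceJobSource)
            dsd_job_source.createForPackagePublication(
                self.distroseries,
                self.sourcepackagerelease.sourcepackagename, self.pocket)
}}}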

PublishingSet.requestDeletion used to manipulate query text to exploit some commonality between the source and binary cases.  I extracted that commonality into a separate method, setMultipleDeleted, which is a mass-deletion equivalent of setDeleted.

Previously, this method would run a separate query to look up the BPPHs for each SPPH it marked as deleted.  The method that does this lookup forwards to another method back on PublishingSet that is perfectly capable of doing all the required BPPH lookups in one go.  That lookup is now a single query, which hopefully will repay some of the prospective cost of creating DSDJs.
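
Condensed, the new PublishingSet.requestDeletion looks roughly like this (the authoritative version, including setMultipleDeleted's single UPDATE statement, is in the publishing.py hunk below):

{{{
def requestDeletion(self, sources, removed_by, removal_comment=None):
    """See `IPublishingSet`."""
    sources = list(sources)
    if len(sources) == 0:
        return

    # One UPDATE marks all the source publications as deleted...
    self.setMultipleDeleted(
        SourcePackagePublishingHistory, [spph.id for spph in sources],
        removed_by, removal_comment=removal_comment)

    # ...one query finds every affected binary publication...
    bpph_ids = [
        bpph.id
        for source, bpph, bin, bin_name, arch
            in self.getBinaryPublicationsForSources(sources)]

    # ...and one more UPDATE marks those as deleted too.
    if len(bpph_ids) > 0:
        self.setMultipleDeleted(
            BinaryPackagePublishingHistory, bpph_ids, removed_by,
            removal_comment=removal_comment)
}}}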

Also, PublishingSet.requestDeletion used to require its "sources" list to be a list, and returned a list as well.  Neither appears necessary anywhere, so I cut both warts away.

You may also spot the disappearance of one or two redundant checks against empty lists.

As I sometimes do, I created a "Lite" test case, TestPublishingSetLite, which avoids both the heavyweight layer and the costly setUp of the existing test case.  In my next branch this test case will sprout a new test to verify the creation of DSDJs.
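
The skeleton of that test case, lifted from the test_publishing.py hunk below (ZopelessDatabaseLayer replaces the heavier layer that the existing publishing tests use):

{{{
class TestPublishingSetLite(TestCaseWithFactory):

    layer = ZopelessDatabaseLayer

    def test_requestDeletion_marks_SPPHs_deleted(self):
        spph = self.factory.makeSourcePackagePublishingHistory()
        getUtility(IPublishingSet).requestDeletion(
            [spph], self.factory.makePerson())
        transaction.commit()
        self.assertEqual(PackagePublishingStatus.DELETED, spph.status)
}}}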


== Pre-implementation notes ==

Based on review discussion with wgrant, pertaining to my work on bug 830890.


== Tests ==

Based largely on a search for direct requestDeletion calls:
{{{
./bin/test -vvc lp.soyuz -t doc.archive.txt -t doc.sourcepackagerelease.txt -t doc.distribution.txt -t webservice.xx-source-package-publishing.txt -t xx-packagepublishinghistory.txt -t test_copypackage -t test_add_missing_builds -t test_publishdistro -t test_binarypackagebuild -t test_distroseriesdifferencejob -t test_publishing
}}}


== Demo and Q/A ==

Nothing should change.  Deleting an archive should still mark the archive's publications as deleted.
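
For anyone who wants a quick manual check anyway, something along these lines should do on a test instance (a hypothetical snippet; "archive" and "someone" are placeholders, and I'm assuming the usual IArchive.delete and getPublishedSources calls):

{{{
# After deleting the archive, its formerly active source publications
# should all show up as DELETED.
archive.delete(deleted_by=someone)
for spph in archive.getPublishedSources():
    print spph.displayname, spph.status
}}}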


= Launchpad lint =

All lint was already there; I fixed what I could, in a separate branch that this one depends on.


Checking for conflicts and issues in changed files.

Linting changed files:
  lib/lp/soyuz/doc/publishing.txt
  lib/lp/soyuz/model/archive.py
  lib/lp/soyuz/model/publishing.py
  lib/lp/soyuz/tests/test_publishing.py

./lib/lp/soyuz/doc/publishing.txt
     119: want exceeds 78 characters.
     367: want exceeds 78 characters.
     925: want exceeds 78 characters.
     927: want exceeds 78 characters.
     929: want exceeds 78 characters.
     952: want exceeds 78 characters.
./lib/lp/soyuz/model/archive.py
     354: redefinition of function 'private' from line 350
-- 
https://code.launchpad.net/~jtv/launchpad/pre-832661/+merge/72847
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~jtv/launchpad/pre-832661 into lp:launchpad.
=== modified file 'lib/lp/soyuz/doc/publishing.txt'
--- lib/lp/soyuz/doc/publishing.txt	2011-08-25 09:57:26 +0000
+++ lib/lp/soyuz/doc/publishing.txt	2011-08-25 09:57:27 +0000
@@ -1473,15 +1473,6 @@
     >>> cprov_binaries.count()
     8
 
-Please note also that the source publishing records to be deleted must
-be passed as a list. Otherwise an exception will be raised.
-
-    >>> deleted = publishing_set.requestDeletion(
-    ...     tuple(cprov_sources[:2]), cprov, 'OOPS-934EC47')
-    Traceback (most recent call last):
-    ...
-    AssertionError: The 'sources' parameter must be a list.
-
 
 ArchiveSourcePublications
 =========================

=== modified file 'lib/lp/soyuz/model/archive.py'
--- lib/lp/soyuz/model/archive.py	2011-08-25 09:57:26 +0000
+++ lib/lp/soyuz/model/archive.py	2011-08-25 09:57:27 +0000
@@ -1881,10 +1881,7 @@
             "This archive is already deleted.")
 
         # Set all the publications to DELETED.
-        statuses = (
-            PackagePublishingStatus.PENDING,
-            PackagePublishingStatus.PUBLISHED)
-        sources = list(self.getPublishedSources(status=statuses))
+        sources = self.getPublishedSources(status=active_publishing_status)
         getUtility(IPublishingSet).requestDeletion(
             sources, removed_by=deleted_by,
             removal_comment="Removed when deleting archive")

=== modified file 'lib/lp/soyuz/model/publishing.py'
--- lib/lp/soyuz/model/publishing.py	2011-07-28 18:37:47 +0000
+++ lib/lp/soyuz/model/publishing.py	2011-08-25 09:57:27 +0000
@@ -332,12 +332,16 @@
         self.status = PackagePublishingStatus.SUPERSEDED
         self.datesuperseded = UTC_NOW
 
-    def requestDeletion(self, removed_by, removal_comment=None):
-        """See `IPublishing`."""
+    def setDeleted(self, removed_by, removal_comment=None):
+        """Set to DELETED status."""
         self.status = PackagePublishingStatus.DELETED
         self.datesuperseded = UTC_NOW
         self.removed_by = removed_by
         self.removal_comment = removal_comment
+
+    def requestDeletion(self, removed_by, removal_comment=None):
+        """See `IPublishing`."""
+        self.setDeleted(removed_by, removal_comment)
         if ISourcePackagePublishingHistory.providedBy(self):
             if self.archive == self.distroseries.main_archive:
                 dsd_job_source = getUtility(IDistroSeriesDifferenceJobSource)
@@ -894,6 +898,15 @@
                     diff.diff_content, self.archive).http_url
         return None
 
+    def requestDeletion(self, removed_by, removal_comment=None):
+        """See `IPublishing`."""
+        self.setDeleted(removed_by, removal_comment)
+        if self.archive == self.distroseries.main_archive:
+            dsd_job_source = getUtility(IDistroSeriesDifferenceJobSource)
+            dsd_job_source.createForPackagePublication(
+                self.distroseries,
+                self.sourcepackagerelease.sourcepackagename, self.pocket)
+
     def api_requestDeletion(self, removed_by, removal_comment=None):
         """See `IPublishingEdit`."""
         # Special deletion method for the api that makes sure binaries
@@ -1296,6 +1309,10 @@
         # different here (yet).
         self.requestDeletion(removed_by, removal_comment)
 
+    def requestDeletion(self, removed_by, removal_comment=None):
+        """See `IPublishing`."""
+        self.setDeleted(removed_by, removal_comment)
+
 
 def expand_binary_requests(distroseries, binaries):
     """Architecture-expand a dict of binary publication requests.
@@ -1761,14 +1778,11 @@
 
         return result_set
 
-    def getBinaryPublicationsForSources(
-        self, one_or_more_source_publications):
+    def getBinaryPublicationsForSources(self,
+                                        one_or_more_source_publications):
         """See `IPublishingSet`."""
-        # Import Buildand DistroArchSeries locally to avoid circular imports,
-        # since Build uses SourcePackagePublishingHistory and DistroArchSeries
-        # uses BinaryPackagePublishingHistory.
-        from lp.soyuz.model.distroarchseries import (
-            DistroArchSeries)
+        # Avoid circular imports.
+        from lp.soyuz.model.distroarchseries import DistroArchSeries
 
         source_publication_ids = self._extractIDs(
             one_or_more_source_publications)
@@ -1955,66 +1969,56 @@
         return self.getBuildStatusSummariesForSourceIdsAndArchive([source_id],
             source_publication.archive)[source_id]
 
+    def setMultipleDeleted(self, publication_class, ids, removed_by,
+                           removal_comment=None):
+        """Mark multiple publication records as deleted."""
+        ids = list(ids)
+        if len(ids) == 0:
+            return
+
+        table = publication_class.__name__
+        permitted_tables = [
+            'BinaryPackagePublishingHistory',
+            'SourcePackagePublishingHistory',
+            ]
+        assert table in permitted_tables, "Deleting wrong type."
+
+        params = sqlvalues(
+            deleted=PackagePublishingStatus.DELETED, now=UTC_NOW,
+            removal_comment=removal_comment, removed_by=removed_by)
+
+        IMasterStore(publication_class).execute("\n".join([
+            "UPDATE %s" % table,
+            """
+            SET
+                status = %(deleted)s,
+                datesuperseded = %(now)s,
+                removed_by = %(removed_by)s,
+                removal_comment = %(removal_comment)s
+            """ % params,
+            "WHERE id IN %s" % sqlvalues(ids),
+            ]))
+
     def requestDeletion(self, sources, removed_by, removal_comment=None):
         """See `IPublishingSet`."""
-
-        # The 'sources' parameter could actually be any kind of sequence
-        # (e.g. even a ResultSet) and the method would still work correctly.
-        # This is problematic when it comes to the type of the return value
-        # however.
-        # Apparently the caller anticipates that we return the sequence of
-        # instances "deleted" adhering to the original type of the 'sources'
-        # parameter.
-        # Since this is too messy we prescribe that the type of 'sources'
-        # must be a list and we return the instances manipulated as a list.
-        # This may not be an ideal solution but this way we at least achieve
-        # consistency.
-        assert isinstance(sources, list), (
-            "The 'sources' parameter must be a list.")
-
+        sources = list(sources)
         if len(sources) == 0:
-            return []
-
-        # The following piece of query "boiler plate" will be used for
-        # both the source and the binary package publishing history table.
-        query_boilerplate = '''
-            SET status = %s,
-                datesuperseded = %s,
-                removed_by = %s,
-                removal_comment = %s
-            WHERE id IN
-            ''' % sqlvalues(PackagePublishingStatus.DELETED, UTC_NOW,
-                            removed_by, removal_comment)
-
-        store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR)
-
-        # First update the source package publishing history table.
-        source_ids = [source.id for source in sources]
-        if len(source_ids) > 0:
-            query = 'UPDATE SourcePackagePublishingHistory '
-            query += query_boilerplate
-            query += ' %s' % sqlvalues(source_ids)
-            store.execute(query)
-
-        # Prepare the list of associated *binary* packages publishing
-        # history records.
-        binary_packages = []
-        for source in sources:
-            binary_packages.extend(source.getPublishedBinaries())
-
-        if len(binary_packages) == 0:
-            return sources
-
-        # Now run the query that marks the binary packages as deleted
-        # as well.
-        if len(binary_packages) > 0:
-            query = 'UPDATE BinaryPackagePublishingHistory '
-            query += query_boilerplate
-            query += ' %s' % sqlvalues(
-                [binary.id for binary in binary_packages])
-            store.execute(query)
-
-        return sources + binary_packages
+            return
+
+        spph_ids = [spph.id for spph in sources]
+        self.setMultipleDeleted(
+            SourcePackagePublishingHistory, spph_ids, removed_by,
+            removal_comment=removal_comment)
+
+        # Mark binary publications deleted.
+        bpph_ids = [
+            bpph.id
+            for source, bpph, bin, bin_name, arch
+                in self.getBinaryPublicationsForSources(sources)]
+        if len(bpph_ids) > 0:
+            self.setMultipleDeleted(
+                BinaryPackagePublishingHistory, bpph_ids, removed_by,
+                removal_comment=removal_comment)
 
     def getNearestAncestor(
         self, package_name, archive, distroseries, pocket=None,

=== modified file 'lib/lp/soyuz/tests/test_publishing.py'
--- lib/lp/soyuz/tests/test_publishing.py	2011-08-25 06:18:33 +0000
+++ lib/lp/soyuz/tests/test_publishing.py	2011-08-25 09:57:27 +0000
@@ -25,6 +25,7 @@
     DatabaseFunctionalLayer,
     LaunchpadZopelessLayer,
     reconnect_stores,
+    ZopelessDatabaseLayer,
     )
 from lp.app.errors import NotFoundError
 from lp.archivepublisher.config import getPubConfig
@@ -1162,6 +1163,70 @@
         self.assertEqual(None, record)
 
 
+class TestPublishingSetLite(TestCaseWithFactory):
+
+    layer = ZopelessDatabaseLayer
+
+    def makeMatchingSourceAndBinaryPPH(self):
+        """Produce a matching pair of SPPH and BPPH.
+
+        Returns a tuple of one SourcePackagePublishingHistory "spph" and
+        one BinaryPackagePublishingHistory "bpph" stemming from the same
+        source package, published into the same distroseries, pocket, and
+        archive.
+        """
+        bpph = self.factory.makeBinaryPackagePublishingHistory()
+        bpr = bpph.binarypackagerelease
+        self.assertNotEqual(None, bpr)
+        spph = self.factory.makeSourcePackagePublishingHistory(
+            distroseries=bpph.distroarchseries.distroseries,
+            sourcepackagerelease=bpr.build.source_package_release,
+            pocket=bpph.pocket, archive=bpph.archive)
+        return spph, bpph
+
+    def test_makeMatchingSourceAndBinaryPPH(self):
+        # makeMatchingSourceAndBinaryPPH really produces a matching pair
+        # of spph and bpph.
+        spph, bpph = self.makeMatchingSourceAndBinaryPPH()
+        self.assertContentEqual([bpph], spph.getPublishedBinaries())
+
+    def test_requestDeletion_marks_SPPHs_deleted(self):
+        spph = self.factory.makeSourcePackagePublishingHistory()
+        getUtility(IPublishingSet).requestDeletion(
+            [spph], self.factory.makePerson())
+        transaction.commit()
+        self.assertEqual(PackagePublishingStatus.DELETED, spph.status)
+
+    def test_requestDeletion_leaves_other_SPPHs_alone(self):
+        spph = self.factory.makeSourcePackagePublishingHistory()
+        other_spph = self.factory.makeSourcePackagePublishingHistory()
+        getUtility(IPublishingSet).requestDeletion(
+            [other_spph], self.factory.makePerson())
+        transaction.commit()
+        self.assertEqual(PackagePublishingStatus.PENDING, spph.status)
+
+    def test_requestDeletion_marks_attached_BPPHs_deleted(self):
+        spph, bpph = self.makeMatchingSourceAndBinaryPPH()
+        getUtility(IPublishingSet).requestDeletion(
+            [spph], self.factory.makePerson())
+        transaction.commit()
+        self.assertEqual(PackagePublishingStatus.DELETED, bpph.status)
+
+    def test_requestDeletion_leaves_other_BPPHs_alone(self):
+        bpph = self.factory.makeBinaryPackagePublishingHistory()
+        unrelated_spph = self.factory.makeSourcePackagePublishingHistory()
+        getUtility(IPublishingSet).requestDeletion(
+            [unrelated_spph], self.factory.makePerson())
+        transaction.commit()
+        self.assertEqual(PackagePublishingStatus.PENDING, bpph.status)
+
+    def test_requestDeletion_accepts_empty_sources_list(self):
+        person = self.factory.makePerson()
+        getUtility(IPublishingSet).requestDeletion([], person)
+        # The test is that this does not fail.
+        Store.of(person).flush()
+
+
 class TestSourceDomination(TestNativePublishingBase):
     """Test SourcePackagePublishingHistory.supersede() operates correctly."""