[Merge] lp:~jtv/launchpad/bug-832661 into lp:launchpad

 

Jeroen T. Vermeulen has proposed merging lp:~jtv/launchpad/bug-832661 into lp:launchpad.

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)
Related bugs:
  Bug #832661 in Launchpad itself: "Create DSDJs from PublishingSet.requestDeletion"
  https://bugs.launchpad.net/launchpad/+bug/832661

For more details, see:
https://code.launchpad.net/~jtv/launchpad/bug-832661/+merge/73009

= Summary =

When we mark a SourcePackagePublishingHistory as deleted in the primary archive of a distroseries that is involved in derivation, we also need to generate a DistroSeriesDifferenceJob for it.  SPPH.requestDeletion was already doing this, but PublishingSet.requestDeletion was not.
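
For orientation, this is roughly the shape PublishingSet.requestDeletion takes in this branch, condensed from the diff below (the binary-publication handling and the full setMultipleDeleted helper appear there in full):

{{{
    def requestDeletion(self, sources, removed_by, removal_comment=None):
        """See `IPublishingSet`."""
        sources = list(sources)
        if len(sources) == 0:
            return

        # Mark the source publications DELETED in a single UPDATE.
        self.setMultipleDeleted(
            SourcePackagePublishingHistory, [spph.id for spph in sources],
            removed_by, removal_comment=removal_comment)

        # The new bit: create DistroSeriesDifferenceJobs for these SPPHs.
        getUtility(IDistroSeriesDifferenceJobSource).createForSPPHs(sources)
}}}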


== Proposed fix ==

Move some code around and then leave the real problem — making it scale — for someone else.


== Pre-implementation notes ==

Julian pointed out Archive.is_main; it's likely to be cheaper than comparing against DistroSeries.main_archive, which costs at least one extra query.
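
Concretely, the per-publication check changes as follows (old condition from the code being removed, new condition as used in the diff below):

{{{
    # Old check: needs the distroseries' main archive, which can cost an
    # extra query.
    creates_dsd_job = spph.archive == spph.distroseries.main_archive

    # New check: a flag on the archive itself.
    creates_dsd_job = spph.archive.is_main
}}}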


== Implementation details ==

The core of it is IDistroSeriesDifferenceJobSource.createForSPPHs.  You'll find a naïve implementation there that was enough to get me through the tests.  Since we'll be executing this from Gina (which has a huge backlog of "active" publications that need deleting) and when deleting archives, it's probably important to make it scale better, but given time constraints I made that a separate work item.
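
For reference, here is the naïve implementation exactly as it appears in the diff below; it simply loops over the publications and reuses createForPackagePublication for each SPPH, which is what the follow-up work item will need to optimize:

{{{
    @classmethod
    def createForSPPHs(cls, spphs):
        """See `IDistroSeriesDifferenceJobSource`."""
        # XXX JeroenVermeulen 2011-08-25, bug=834499: This won't do for
        # some of the mass deletions we're planning to support.
        # Optimize.
        for spph in spphs:
            if spph.archive.is_main:
                cls.createForPackagePublication(
                    spph.distroseries,
                    spph.sourcepackagerelease.sourcepackagename, spph.pocket)
}}}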


== Tests ==

{{{
./bin/test -vvc lp.soyuz.tests.test_publishing -t requestDeletion
./bin/test -vvc lp.soyuz.tests.test_distroseriesdifferencejob -t createForSPPHs
}}}


== Demo and Q/A ==

Deleting an archive should still mark all its source publication records as deleted.
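
Archive deletion now feeds its active source publications straight into PublishingSet.requestDeletion, which accepts the result set directly because it listifies its 'sources' argument (this is the hunk from archive.py in the diff below), so Q/A is just a matter of confirming those publications still end up DELETED:

{{{
        # Set all the publications to DELETED.
        sources = self.getPublishedSources(status=active_publishing_status)
        getUtility(IPublishingSet).requestDeletion(
            sources, removed_by=deleted_by,
            removal_comment="Removed when deleting archive")
}}}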


= Launchpad lint =

The lint reported here is pre-existing lint in files I touched only in a previous branch (and fixed in yet another branch), neither of which I've been able to land because of bug 833743 and various other buildbot failures.  It's been a wild few days.  I'll have to re-submit this branch to register its dependencies on those other branches, after which this lint will no longer apply.


Checking for conflicts and issues in changed files.

Linting changed files:
  lib/lp/soyuz/model/distroseriesdifferencejob.py
  lib/lp/soyuz/tests/test_publishing.py
  lib/lp/soyuz/model/publishing.py
  lib/lp/soyuz/doc/publishing.txt
  lib/lp/testing/factory.py
  lib/lp/soyuz/interfaces/distributionjob.py
  lib/lp/soyuz/tests/test_distroseriesdifferencejob.py
  lib/lp/soyuz/model/archive.py
  lib/lp/testing/tests/test_factory.py

./lib/lp/soyuz/doc/publishing.txt
     119: want exceeds 78 characters.
     367: want exceeds 78 characters.
     925: want exceeds 78 characters.
     927: want exceeds 78 characters.
     929: want exceeds 78 characters.
     952: want exceeds 78 characters.
./lib/lp/soyuz/model/archive.py
     350: redefinition of function 'private' from line 346
=== modified file 'lib/lp/soyuz/doc/publishing.txt'
--- lib/lp/soyuz/doc/publishing.txt	2011-07-27 23:06:20 +0000
+++ lib/lp/soyuz/doc/publishing.txt	2011-08-26 09:41:33 +0000
@@ -30,12 +30,14 @@
     >>> from canonical.launchpad.webapp.testing import verifyObject
     >>> from lp.registry.interfaces.distroseries import IDistroSeries
     >>> from lp.registry.interfaces.sourcepackage import ISourcePackage
-    >>> from lp.soyuz.interfaces.distributionsourcepackagerelease import IDistributionSourcePackageRelease
+    >>> from lp.soyuz.interfaces.distributionsourcepackagerelease import (
+    ...     IDistributionSourcePackageRelease)
     >>> from lp.soyuz.interfaces.publishing import (
     ...     IBinaryPackagePublishingHistory,
     ...     ISourcePackagePublishingHistory,
     ...     )
-    >>> from lp.soyuz.interfaces.sourcepackagerelease import ISourcePackageRelease
+    >>> from lp.soyuz.interfaces.sourcepackagerelease import (
+    ...     ISourcePackageRelease)
 
     >>> verifyObject(ISourcePackagePublishingHistory, spph)
     True
@@ -769,7 +771,7 @@
     ...     ppa_copied_source.id)
 
 createMissingBuilds will not create any builds because this is an
-intra-archive copy: 
+intra-archive copy:
 
     >>> ppa_source.createMissingBuilds()
     []
@@ -901,7 +903,8 @@
 Symmetric behaviour is offered for BinaryPackagePublishing,
 BinaryPackageFile and IBinaryPackagePublishingHistory
 
-    >>> from lp.soyuz.interfaces.publishing import IBinaryPackageFilePublishing
+    >>> from lp.soyuz.interfaces.publishing import (
+    ...     IBinaryPackageFilePublishing)
 
     >>> bpph = BinaryPackagePublishingHistory.get(15)
     >>> print bpph.displayname
@@ -1470,15 +1473,6 @@
     >>> cprov_binaries.count()
     8
 
-Please note also that the source publishing records to be deleted must
-be passed as a list. Otherwise an exception will be raised.
-
-    >>> deleted = publishing_set.requestDeletion(
-    ...     tuple(cprov_sources[:2]), cprov, 'OOPS-934EC47')
-    Traceback (most recent call last):
-    ...
-    AssertionError: The 'sources' parameter must be a list.
-
 
 ArchiveSourcePublications
 =========================
@@ -1491,8 +1485,8 @@
 fetch in a fixed number of queries (3) instead of varying according
 the size of the set (3 * N).
 
-     >>> from lp.soyuz.adapters.archivesourcepublication import (
-     ...     ArchiveSourcePublications)
+    >>> from lp.soyuz.adapters.archivesourcepublication import (
+    ...     ArchiveSourcePublications)
 
 We will use all published sources in Celso's PPA as our initial set.
 

=== modified file 'lib/lp/soyuz/interfaces/distributionjob.py'
--- lib/lp/soyuz/interfaces/distributionjob.py	2011-07-27 10:34:53 +0000
+++ lib/lp/soyuz/interfaces/distributionjob.py	2011-08-26 09:41:33 +0000
@@ -1,4 +1,4 @@
-# Copyright 2010 Canonical Ltd.  This software is licensed under the
+# Copyright 2010-2011 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 __metaclass__ = type
@@ -122,7 +122,7 @@
     """An `IJob` for creating `DistroSeriesDifference`s."""
 
     def createForPackagePublication(derivedseries, sourcepackagename, pocket):
-        """Create jobs as appropriate for a given status publication.
+        """Create jobs as appropriate for a given package publication.
 
         :param derived_series: A `DistroSeries` that is assumed to be
             derived from `parent_series`.
@@ -132,6 +132,9 @@
         :return: An iterable of `DistroSeriesDifferenceJob`.
         """
 
+    def createForSPPHs(spphs):
+        """Create jobs for given `SourcePackagePublishingHistory`s."""
+
     def massCreateForSeries(derived_series):
         """Create jobs for all the publications inside the given distroseries
             with reference to the given parent series.

=== modified file 'lib/lp/soyuz/model/archive.py'
--- lib/lp/soyuz/model/archive.py	2011-08-24 06:09:05 +0000
+++ lib/lp/soyuz/model/archive.py	2011-08-26 09:41:33 +0000
@@ -58,8 +58,8 @@
     DecoratedResultSet,
     )
 from canonical.launchpad.components.tokens import (
+    create_token,
     create_unique_token_for_table,
-    create_token,
     )
 from canonical.launchpad.database.librarian import (
     LibraryFileAlias,
@@ -150,8 +150,8 @@
     NoSuchPPA,
     NoTokensForTeams,
     PocketNotFound,
+    validate_external_dependencies,
     VersionRequiresName,
-    validate_external_dependencies,
     )
 from lp.soyuz.interfaces.archivearch import IArchiveArchSet
 from lp.soyuz.interfaces.archiveauthtoken import IArchiveAuthTokenSet
@@ -1881,10 +1881,7 @@
             "This archive is already deleted.")
 
         # Set all the publications to DELETED.
-        statuses = (
-            PackagePublishingStatus.PENDING,
-            PackagePublishingStatus.PUBLISHED)
-        sources = list(self.getPublishedSources(status=statuses))
+        sources = self.getPublishedSources(status=active_publishing_status)
         getUtility(IPublishingSet).requestDeletion(
             sources, removed_by=deleted_by,
             removal_comment="Removed when deleting archive")

=== modified file 'lib/lp/soyuz/model/distroseriesdifferencejob.py'
--- lib/lp/soyuz/model/distroseriesdifferencejob.py	2011-08-09 10:01:02 +0000
+++ lib/lp/soyuz/model/distroseriesdifferencejob.py	2011-08-26 09:41:33 +0000
@@ -195,13 +195,16 @@
         """See `IDistroSeriesDifferenceJobSource`."""
         if not getFeatureFlag(FEATURE_FLAG_ENABLE_MODULE):
             return
+
         # -backports and -proposed are not really part of a standard
         # distribution's packages so we're ignoring them here.  They can
         # always be manually synced by the users if necessary, in the
         # rare occasions that they require them.
-        if pocket in (
+        ignored_pockets = [
             PackagePublishingPocket.BACKPORTS,
-            PackagePublishingPocket.PROPOSED):
+            PackagePublishingPocket.PROPOSED,
+            ]
+        if pocket in ignored_pockets:
             return
 
         # Create jobs for DSDs between the derived_series' parents and
@@ -221,6 +224,18 @@
         return parent_series_jobs + derived_series_jobs
 
     @classmethod
+    def createForSPPHs(cls, spphs):
+        """See `IDistroSeriesDifferenceJobSource`."""
+        # XXX JeroenVermeulen 2011-08-25, bug=834499: This won't do for
+        # some of the mass deletions we're planning to support.
+        # Optimize.
+        for spph in spphs:
+            if spph.archive.is_main:
+                cls.createForPackagePublication(
+                    spph.distroseries,
+                    spph.sourcepackagerelease.sourcepackagename, spph.pocket)
+
+    @classmethod
     def massCreateForSeries(cls, derived_series):
         """See `IDistroSeriesDifferenceJobSource`."""
         if not getFeatureFlag(FEATURE_FLAG_ENABLE_MODULE):

=== modified file 'lib/lp/soyuz/model/publishing.py'
--- lib/lp/soyuz/model/publishing.py	2011-07-28 18:37:47 +0000
+++ lib/lp/soyuz/model/publishing.py	2011-08-26 09:41:33 +0000
@@ -332,18 +332,12 @@
         self.status = PackagePublishingStatus.SUPERSEDED
         self.datesuperseded = UTC_NOW
 
-    def requestDeletion(self, removed_by, removal_comment=None):
-        """See `IPublishing`."""
+    def setDeleted(self, removed_by, removal_comment=None):
+        """Set to DELETED status."""
         self.status = PackagePublishingStatus.DELETED
         self.datesuperseded = UTC_NOW
         self.removed_by = removed_by
         self.removal_comment = removal_comment
-        if ISourcePackagePublishingHistory.providedBy(self):
-            if self.archive == self.distroseries.main_archive:
-                dsd_job_source = getUtility(IDistroSeriesDifferenceJobSource)
-                dsd_job_source.createForPackagePublication(
-                    self.distroseries,
-                    self.sourcepackagerelease.sourcepackagename, self.pocket)
 
     def requestObsolescence(self):
         """See `IArchivePublisher`."""
@@ -894,6 +888,15 @@
                     diff.diff_content, self.archive).http_url
         return None
 
+    def requestDeletion(self, removed_by, removal_comment=None):
+        """See `IPublishing`."""
+        self.setDeleted(removed_by, removal_comment)
+        if self.archive.is_main:
+            dsd_job_source = getUtility(IDistroSeriesDifferenceJobSource)
+            dsd_job_source.createForPackagePublication(
+                self.distroseries,
+                self.sourcepackagerelease.sourcepackagename, self.pocket)
+
     def api_requestDeletion(self, removed_by, removal_comment=None):
         """See `IPublishingEdit`."""
         # Special deletion method for the api that makes sure binaries
@@ -1296,6 +1299,10 @@
         # different here (yet).
         self.requestDeletion(removed_by, removal_comment)
 
+    def requestDeletion(self, removed_by, removal_comment=None):
+        """See `IPublishing`."""
+        self.setDeleted(removed_by, removal_comment)
+
 
 def expand_binary_requests(distroseries, binaries):
     """Architecture-expand a dict of binary publication requests.
@@ -1761,14 +1768,11 @@
 
         return result_set
 
-    def getBinaryPublicationsForSources(
-        self, one_or_more_source_publications):
+    def getBinaryPublicationsForSources(self,
+                                        one_or_more_source_publications):
         """See `IPublishingSet`."""
-        # Import Buildand DistroArchSeries locally to avoid circular imports,
-        # since Build uses SourcePackagePublishingHistory and DistroArchSeries
-        # uses BinaryPackagePublishingHistory.
-        from lp.soyuz.model.distroarchseries import (
-            DistroArchSeries)
+        # Avoid circular imports.
+        from lp.soyuz.model.distroarchseries import DistroArchSeries
 
         source_publication_ids = self._extractIDs(
             one_or_more_source_publications)
@@ -1955,66 +1959,58 @@
         return self.getBuildStatusSummariesForSourceIdsAndArchive([source_id],
             source_publication.archive)[source_id]
 
+    def setMultipleDeleted(self, publication_class, ids, removed_by,
+                           removal_comment=None):
+        """Mark multiple publication records as deleted."""
+        ids = list(ids)
+        if len(ids) == 0:
+            return
+
+        table = publication_class.__name__
+        permitted_tables = [
+            'BinaryPackagePublishingHistory',
+            'SourcePackagePublishingHistory',
+            ]
+        assert table in permitted_tables, "Deleting wrong type."
+
+        params = sqlvalues(
+            deleted=PackagePublishingStatus.DELETED, now=UTC_NOW,
+            removal_comment=removal_comment, removed_by=removed_by)
+
+        IMasterStore(publication_class).execute("\n".join([
+            "UPDATE %s" % table,
+            """
+            SET
+                status = %(deleted)s,
+                datesuperseded = %(now)s,
+                removed_by = %(removed_by)s,
+                removal_comment = %(removal_comment)s
+            """ % params,
+            "WHERE id IN %s" % sqlvalues(ids),
+            ]))
+
     def requestDeletion(self, sources, removed_by, removal_comment=None):
         """See `IPublishingSet`."""
-
-        # The 'sources' parameter could actually be any kind of sequence
-        # (e.g. even a ResultSet) and the method would still work correctly.
-        # This is problematic when it comes to the type of the return value
-        # however.
-        # Apparently the caller anticipates that we return the sequence of
-        # instances "deleted" adhering to the original type of the 'sources'
-        # parameter.
-        # Since this is too messy we prescribe that the type of 'sources'
-        # must be a list and we return the instances manipulated as a list.
-        # This may not be an ideal solution but this way we at least achieve
-        # consistency.
-        assert isinstance(sources, list), (
-            "The 'sources' parameter must be a list.")
-
+        sources = list(sources)
         if len(sources) == 0:
-            return []
-
-        # The following piece of query "boiler plate" will be used for
-        # both the source and the binary package publishing history table.
-        query_boilerplate = '''
-            SET status = %s,
-                datesuperseded = %s,
-                removed_by = %s,
-                removal_comment = %s
-            WHERE id IN
-            ''' % sqlvalues(PackagePublishingStatus.DELETED, UTC_NOW,
-                            removed_by, removal_comment)
-
-        store = getUtility(IStoreSelector).get(MAIN_STORE, DEFAULT_FLAVOR)
-
-        # First update the source package publishing history table.
-        source_ids = [source.id for source in sources]
-        if len(source_ids) > 0:
-            query = 'UPDATE SourcePackagePublishingHistory '
-            query += query_boilerplate
-            query += ' %s' % sqlvalues(source_ids)
-            store.execute(query)
-
-        # Prepare the list of associated *binary* packages publishing
-        # history records.
-        binary_packages = []
-        for source in sources:
-            binary_packages.extend(source.getPublishedBinaries())
-
-        if len(binary_packages) == 0:
-            return sources
-
-        # Now run the query that marks the binary packages as deleted
-        # as well.
-        if len(binary_packages) > 0:
-            query = 'UPDATE BinaryPackagePublishingHistory '
-            query += query_boilerplate
-            query += ' %s' % sqlvalues(
-                [binary.id for binary in binary_packages])
-            store.execute(query)
-
-        return sources + binary_packages
+            return
+
+        spph_ids = [spph.id for spph in sources]
+        self.setMultipleDeleted(
+            SourcePackagePublishingHistory, spph_ids, removed_by,
+            removal_comment=removal_comment)
+
+        getUtility(IDistroSeriesDifferenceJobSource).createForSPPHs(sources)
+
+        # Mark binary publications deleted.
+        bpph_ids = [
+            bpph.id
+            for source, bpph, bin, bin_name, arch
+                in self.getBinaryPublicationsForSources(sources)]
+        if len(bpph_ids) > 0:
+            self.setMultipleDeleted(
+                BinaryPackagePublishingHistory, bpph_ids, removed_by,
+                removal_comment=removal_comment)
 
     def getNearestAncestor(
         self, package_name, archive, distroseries, pocket=None,

=== modified file 'lib/lp/soyuz/tests/test_distroseriesdifferencejob.py'
--- lib/lp/soyuz/tests/test_distroseriesdifferencejob.py	2011-08-16 10:11:32 +0000
+++ lib/lp/soyuz/tests/test_distroseriesdifferencejob.py	2011-08-26 09:41:33 +0000
@@ -27,7 +27,10 @@
 from lp.services.database import bulk
 from lp.services.features.testing import FeatureFixture
 from lp.services.job.interfaces.job import JobStatus
-from lp.soyuz.enums import PackagePublishingStatus
+from lp.soyuz.enums import (
+    ArchivePurpose,
+    PackagePublishingStatus,
+    )
 from lp.soyuz.interfaces.distributionjob import (
     DistributionJobType,
     IDistroSeriesDifferenceJobSource,
@@ -322,6 +325,121 @@
             [],
             find_waiting_jobs(dsp.derived_series, package, dsp.parent_series))
 
+    def test_createForSPPHs_creates_job_for_derived_series(self):
+        dsp = self.factory.makeDistroSeriesParent()
+        spph = self.factory.makeSourcePackagePublishingHistory(
+            dsp.parent_series, pocket=PackagePublishingPocket.RELEASE)
+        spn = spph.sourcepackagerelease.sourcepackagename
+
+        self.getJobSource().createForSPPHs([spph])
+
+        self.assertEqual(
+            1, len(find_waiting_jobs(
+                dsp.derived_series, spn, dsp.parent_series)))
+
+    def test_createForSPPHs_creates_job_for_parent_series(self):
+        dsp = self.factory.makeDistroSeriesParent()
+        spph = self.factory.makeSourcePackagePublishingHistory(
+            dsp.derived_series, pocket=PackagePublishingPocket.RELEASE)
+        spn = spph.sourcepackagerelease.sourcepackagename
+
+        self.getJobSource().createForSPPHs([spph])
+
+        self.assertEqual(
+            1, len(find_waiting_jobs(
+                dsp.derived_series, spn, dsp.parent_series)))
+
+    def test_createForSPPHs_obeys_feature_flag(self):
+        self.useFixture(FeatureFixture({FEATURE_FLAG_ENABLE_MODULE: ''}))
+        dsp = self.factory.makeDistroSeriesParent()
+        spph = self.factory.makeSourcePackagePublishingHistory(
+            dsp.parent_series, pocket=PackagePublishingPocket.RELEASE)
+        spn = spph.sourcepackagerelease.sourcepackagename
+        self.getJobSource().createForSPPHs([spph])
+        self.assertContentEqual(
+            [], find_waiting_jobs(dsp.derived_series, spn, dsp.parent_series))
+
+    def test_createForSPPHs_ignores_backports_and_proposed(self):
+        dsp = self.factory.makeDistroSeriesParent()
+        spr = self.factory.makeSourcePackageRelease()
+        spn = spr.sourcepackagename
+        ignored_pockets = [
+            PackagePublishingPocket.BACKPORTS,
+            PackagePublishingPocket.PROPOSED,
+            ]
+        spphs = [
+            self.factory.makeSourcePackagePublishingHistory(
+                distroseries=dsp.parent_series, sourcepackagerelease=spr,
+                pocket=pocket)
+            for pocket in ignored_pockets]
+        self.getJobSource().createForSPPHs(spphs)
+        self.assertContentEqual(
+            [], find_waiting_jobs(dsp.derived_series, spn, dsp.parent_series))
+
+    def test_createForSPPHs_creates_no_jobs_for_unrelated_series(self):
+        dsp = self.factory.makeDistroSeriesParent()
+        other_series = self.factory.makeDistroSeries(
+            distribution=dsp.derived_series.distribution)
+        spph = self.factory.makeSourcePackagePublishingHistory(
+            dsp.parent_series, pocket=PackagePublishingPocket.RELEASE)
+        spn = spph.sourcepackagerelease.sourcepackagename
+        self.getJobSource().createForSPPHs([spph])
+        self.assertContentEqual(
+            [], find_waiting_jobs(dsp.derived_series, spn, other_series))
+
+    def test_createForSPPHs_accepts_SPPHs_for_multiple_distroseries(self):
+        derived_distro = self.factory.makeDistribution()
+        spn = self.factory.makeSourcePackageName()
+        series = [
+            self.factory.makeDistroSeries(derived_distro)
+            for counter in xrange(2)]
+        dsps = [
+            self.factory.makeDistroSeriesParent(derived_series=distroseries)
+            for distroseries in series]
+
+        for distroseries in series:
+            self.factory.makeSourcePackagePublishingHistory(
+                distroseries, pocket=PackagePublishingPocket.RELEASE,
+                sourcepackagerelease=self.factory.makeSourcePackageRelease(
+                    sourcepackagename=spn))
+
+        job_counts = dict(
+            (dsp.derived_series, len(find_waiting_jobs(
+                dsp.derived_series, spn, dsp.parent_series)))
+            for dsp in dsps)
+        self.assertEqual(
+            dict((distroseries, 1) for distroseries in series),
+            job_counts)
+
+    def test_createForSPPHs_behaves_sensibly_if_job_already_exists(self):
+        # If a job already existed, createForSPPHs may create a
+        # redundant one but it certainly won't do anything weird like
+        # delete what was there or create too many.
+        dsp = self.factory.makeDistroSeriesParent()
+        spph = self.factory.makeSourcePackagePublishingHistory(
+            dsp.parent_series, pocket=PackagePublishingPocket.RELEASE)
+        spn = spph.sourcepackagerelease.sourcepackagename
+
+        create_jobs = range(1, 3)
+        for counter in create_jobs:
+            self.getJobSource().createForSPPHs([spph])
+
+        job_count = len(find_waiting_jobs(
+            dsp.derived_series, spn, dsp.parent_series))
+        self.assertIn(job_count, create_jobs)
+
+    def test_createForSPPHs_creates_no_jobs_for_ppas(self):
+        dsp = self.factory.makeDistroSeriesParent()
+        series = dsp.parent_series
+        spph = self.factory.makeSourcePackagePublishingHistory(
+            series, pocket=PackagePublishingPocket.RELEASE,
+            archive=self.factory.makeArchive(
+                distribution=series.distribution, purpose=ArchivePurpose.PPA))
+        spn = spph.sourcepackagerelease.sourcepackagename
+        self.getJobSource().createForSPPHs([spph])
+        self.assertContentEqual(
+            [], find_waiting_jobs(dsp.derived_series, spn, dsp.parent_series))
+
     def test_massCreateForSeries_obeys_feature_flag(self):
         self.useFixture(FeatureFixture({FEATURE_FLAG_ENABLE_MODULE: ''}))
         dsp = self.factory.makeDistroSeriesParent()

=== modified file 'lib/lp/soyuz/tests/test_publishing.py'
--- lib/lp/soyuz/tests/test_publishing.py	2011-08-25 06:18:33 +0000
+++ lib/lp/soyuz/tests/test_publishing.py	2011-08-26 09:41:33 +0000
@@ -25,6 +25,7 @@
     DatabaseFunctionalLayer,
     LaunchpadZopelessLayer,
     reconnect_stores,
+    ZopelessDatabaseLayer,
     )
 from lp.app.errors import NotFoundError
 from lp.archivepublisher.config import getPubConfig
@@ -36,6 +37,7 @@
 from lp.registry.interfaces.pocket import PackagePublishingPocket
 from lp.registry.interfaces.sourcepackage import SourcePackageUrgency
 from lp.registry.interfaces.sourcepackagename import ISourcePackageNameSet
+from lp.services.features.testing import FeatureFixture
 from lp.services.log.logger import DevNullLogger
 from lp.soyuz.adapters.overrides import UnknownOverridePolicy
 from lp.soyuz.enums import (
@@ -54,6 +56,10 @@
     )
 from lp.soyuz.interfaces.queue import QueueInconsistentStateError
 from lp.soyuz.interfaces.section import ISectionSet
+from lp.soyuz.model.distroseriesdifferencejob import (
+    FEATURE_FLAG_ENABLE_MODULE,
+    find_waiting_jobs,
+    )
 from lp.soyuz.model.distroseriespackagecache import DistroSeriesPackageCache
 from lp.soyuz.model.processor import ProcessorFamily
 from lp.soyuz.model.publishing import (
@@ -1162,6 +1168,70 @@
         self.assertEqual(None, record)
 
 
+class TestPublishingSetLite(TestCaseWithFactory):
+
+    layer = ZopelessDatabaseLayer
+
+    def enableDistroDerivation(self):
+        self.useFixture(FeatureFixture({FEATURE_FLAG_ENABLE_MODULE: u'on'}))
+
+    def test_requestDeletion_marks_SPPHs_deleted(self):
+        spph = self.factory.makeSourcePackagePublishingHistory()
+        getUtility(IPublishingSet).requestDeletion(
+            [spph], self.factory.makePerson())
+        # XXX JeroenVermeulen 2011-08-25, bug=834388: obviate commit.
+        transaction.commit()
+        self.assertEqual(PackagePublishingStatus.DELETED, spph.status)
+
+    def test_requestDeletion_leaves_other_SPPHs_alone(self):
+        spph = self.factory.makeSourcePackagePublishingHistory()
+        other_spph = self.factory.makeSourcePackagePublishingHistory()
+        getUtility(IPublishingSet).requestDeletion(
+            [other_spph], self.factory.makePerson())
+        # XXX JeroenVermeulen 2011-08-25, bug=834388: obviate commit.
+        transaction.commit()
+        self.assertEqual(PackagePublishingStatus.PENDING, spph.status)
+
+    def test_requestDeletion_marks_attached_BPPHs_deleted(self):
+        bpph = self.factory.makeBinaryPackagePublishingHistory()
+        spph = self.factory.makeSPPHForBPPH(bpph)
+        getUtility(IPublishingSet).requestDeletion(
+            [spph], self.factory.makePerson())
+        # XXX JeroenVermeulen 2011-08-25, bug=834388: obviate commit.
+        transaction.commit()
+        self.assertEqual(PackagePublishingStatus.DELETED, spph.status)
+
+    def test_requestDeletion_leaves_other_BPPHs_alone(self):
+        bpph = self.factory.makeBinaryPackagePublishingHistory()
+        unrelated_spph = self.factory.makeSourcePackagePublishingHistory()
+        getUtility(IPublishingSet).requestDeletion(
+            [unrelated_spph], self.factory.makePerson())
+        # XXX JeroenVermeulen 2011-08-25, bug=834388: obviate commit.
+        transaction.commit()
+        self.assertEqual(PackagePublishingStatus.PENDING, bpph.status)
+
+    def test_requestDeletion_accepts_empty_sources_list(self):
+        person = self.factory.makePerson()
+        getUtility(IPublishingSet).requestDeletion([], person)
+        # The test is that this does not fail.
+        Store.of(person).flush()
+
+    def test_requestDeletion_creates_DistroSeriesDifferenceJobs(self):
+        dsp = self.factory.makeDistroSeriesParent()
+        series = dsp.derived_series
+        spph = self.factory.makeSourcePackagePublishingHistory(
+            series, pocket=PackagePublishingPocket.RELEASE)
+        spn = spph.sourcepackagerelease.sourcepackagename
+
+        self.enableDistroDerivation()
+        getUtility(IPublishingSet).requestDeletion(
+            [spph], self.factory.makePerson())
+
+        self.assertEqual(
+            1, len(find_waiting_jobs(
+                dsp.derived_series, spn, dsp.parent_series)))
+
+
 class TestSourceDomination(TestNativePublishingBase):
     """Test SourcePackagePublishingHistory.supersede() operates correctly."""
 

=== modified file 'lib/lp/testing/factory.py'
--- lib/lp/testing/factory.py	2011-08-16 09:14:07 +0000
+++ lib/lp/testing/factory.py	2011-08-26 09:41:33 +0000
@@ -3799,6 +3799,24 @@
             naked_bpph.datepublished = UTC_NOW
         return bpph
 
+    def makeSPPHForBPPH(self, bpph):
+        """Produce a `SourcePackagePublishingHistory` to match `bpph`.
+
+        :param bpph: A `BinaryPackagePublishingHistory`.
+        :return: A `SourcePackagePublishingHistory` stemming from the same
+            source package as `bpph`, published into the same distroseries,
+            pocket, and archive.
+        """
+        # JeroenVermeulen 2011-08-25, bug=834370: Julian says this isn't
+        # very complete, and ignores architectures.  Improve so we can
+        # remove more of our reliance on the SoyuzTestPublisher.
+        bpr = bpph.binarypackagerelease
+        spph = self.makeSourcePackagePublishingHistory(
+            distroseries=bpph.distroarchseries.distroseries,
+            sourcepackagerelease=bpr.build.source_package_release,
+            pocket=bpph.pocket, archive=bpph.archive)
+        return spph
+
     def makeBinaryPackageName(self, name=None):
         """Make an `IBinaryPackageName`."""
         if name is None:

=== modified file 'lib/lp/testing/tests/test_factory.py'
--- lib/lp/testing/tests/test_factory.py	2011-08-03 11:00:11 +0000
+++ lib/lp/testing/tests/test_factory.py	2011-08-26 09:41:33 +0000
@@ -519,6 +519,20 @@
             dsc_maintainer_rfc822=maintainer)
         self.assertEqual(maintainer, spr.dsc_maintainer_rfc822)
 
+    # makeSPPHForBPPH
+    def test_makeSPPHForBPPH_returns_ISPPH(self):
+        bpph = self.factory.makeBinaryPackagePublishingHistory()
+        spph = self.factory.makeSPPHForBPPH(bpph)
+        self.assertThat(spph, IsProxied())
+        self.assertThat(
+            removeSecurityProxy(spph),
+            Provides(ISourcePackagePublishingHistory))
+
+    def test_makeSPPHForBPPH_returns_SPPH_for_BPPH(self):
+        bpph = self.factory.makeBinaryPackagePublishingHistory()
+        spph = self.factory.makeSPPHForBPPH(bpph)
+        self.assertContentEqual([bpph], spph.getPublishedBinaries())
+
     # makeSuiteSourcePackage
     def test_makeSuiteSourcePackage_returns_ISuiteSourcePackage(self):
         ssp = self.factory.makeSuiteSourcePackage()