[Merge] lp:~cjwatson/launchpad/copy-auto-approve into lp:launchpad
Colin Watson has proposed merging lp:~cjwatson/launchpad/copy-auto-approve into lp:launchpad with lp:~cjwatson/launchpad/copy-allows-queue-admin as a prerequisite.
Requested reviews:
Launchpad code reviewers (launchpad-reviewers)
Related bugs:
Bug #1006871 in Launchpad itself: "Copying packages to -updates always goes through unapproved queue, even when copying user is privileged"
https://bugs.launchpad.net/launchpad/+bug/1006871
For more details, see:
https://code.launchpad.net/~cjwatson/launchpad/copy-auto-approve/+merge/117299
== Summary ==
Bug 1006871: in some cases it is cumbersome to have to approve package copies as a separate step, particularly given the delay in processing PackageCopyJobs. There should be a way for queue admins to copy a package and approve the copy in a single step. The bug has some concrete use cases for this.
== Proposed fix ==
Add an auto_approve parameter to Archive.copyPackage and Archive.copyPackages, defaulting to False. When True, copies are automatically accepted even if they would normally have required approval, provided that the requester has the appropriate queue admin permissions on the target.
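For illustration, a webservice call using the new parameter might look roughly like this (a sketch only: the application name, package name, version, and series are placeholders, and the operation is exported on the 'devel' API version):

    from launchpadlib.launchpad import Launchpad

    # Sketch only: the application name, package, version, and series below
    # are placeholders; the new parameter is exported on the 'devel' API.
    lp = Launchpad.login_with('copy-tool', 'production', version='devel')
    archive = lp.distributions['ubuntu'].main_archive

    # With queue admin permissions on the target, the copy is accepted
    # immediately; without them, auto_approve=True is silently ignored and
    # the copy lands in the unapproved queue as before.
    archive.copyPackage(
        source_name='hello', version='2.8-2ubuntu1', from_archive=archive,
        to_pocket='Updates', to_series='precise', include_binaries=True,
        auto_approve=True)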
== Pre-implementation notes ==
Iain Lane suggested that this should be explicit rather than (my initial proposal) implicit. On reflection, this is obviously correct. If I run syncpackage to copy a package from Debian and Ubuntu happens to be frozen at the time, then the copy should go through UNAPPROVED even though I'm an Ubuntu archive administrator and thus have queue admin permissions. On the other hand, if (acting as an archive admin) I run sru-release to copy a verified stable release update from -proposed to -updates, I'm always going to accept it immediately afterwards, so it makes sense to let me skip that extra step.
== Implementation details ==
There's a little duplication in PlainPackageCopyJob._checkPolicies (the calls to canAdministerQueue in both if branches), but that's necessary: the permission check has to come after the source override handling in each branch, since the component to check against is only known once the override has been recorded.
I arranged for auto_approve=True to be silently ignored for non-queue-admins, because it's harmless and it makes it easier to write tools that work for people both with and without queue admin permissions. (For instance, the kernel team sometimes copy packages from their PPA to the -proposed pocket in the primary archive and ask us to accept them; but since the copy-proposed-kernel tool is often run by archive admins, it would make sense for it to use auto_approve=True.)
== LOC Rationale ==
+129. I have 1833 lines of credit at the moment, and, as mentioned in https://code.launchpad.net/~cjwatson/launchpad/copy-allows-queue-admin/+merge/116731, this is the last known piece needed to remove the copy-package.py script.
== Tests ==
bin/test -vvct test_packagecopyjob
== Demo and Q/A ==
Add auto_approve=True to the Archive.copyPackage calls in sru-release in lp:ubuntu-archive-tools, and use this to copy a package from -proposed to -updates on dogfood (with queue admin permissions). It should be automatically accepted. Without that parameter, the copy should land in the unapproved queue.
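Roughly, that QA run might look like this over the API (again just a sketch: the 'dogfood' service root alias, package name, versions, and series are placeholders and may differ):

    from launchpadlib.launchpad import Launchpad

    # Sketch only: assumes launchpadlib knows a 'dogfood' service root alias;
    # otherwise pass the dogfood API root URL directly.  Package name,
    # versions, and series are placeholders.
    lp = Launchpad.login_with('sru-release-qa', 'dogfood', version='devel')
    archive = lp.distributions['ubuntu'].main_archive

    # Without auto_approve, the copy should land in the unapproved queue.
    archive.copyPackage(
        source_name='hello', version='2.8-2ubuntu0.1', from_archive=archive,
        to_pocket='Updates', to_series='precise', include_binaries=True)

    # With auto_approve and queue admin permissions, the copy should be
    # accepted immediately, with no manual queue processing needed.
    archive.copyPackage(
        source_name='hello', version='2.8-2ubuntu0.2', from_archive=archive,
        to_pocket='Updates', to_series='precise', include_binaries=True,
        auto_approve=True)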
== Lint ==
Just a pre-existing false positive due to pocketlint not understanding property setters:
./lib/lp/soyuz/model/archive.py
346: redefinition of function 'private' from line 342
--
https://code.launchpad.net/~cjwatson/launchpad/copy-auto-approve/+merge/117299
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~cjwatson/launchpad/copy-auto-approve into lp:launchpad.
=== modified file 'lib/lp/soyuz/interfaces/archive.py'
--- lib/lp/soyuz/interfaces/archive.py 2012-07-30 17:05:29 +0000
+++ lib/lp/soyuz/interfaces/archive.py 2012-07-30 17:05:29 +0000
@@ -1292,12 +1292,17 @@
title=_("Sponsored Person"),
description=_("The person who is being sponsored for this copy.")),
unembargo=Bool(title=_("Unembargo restricted files")),
+ auto_approve=Bool(
+ title=_("Automatic approval"),
+ description=_("Automatically approve this copy (queue admins "
+ "only)."),
+ required=False),
)
@export_write_operation()
@operation_for_version('devel')
def copyPackage(source_name, version, from_archive, to_pocket,
person, to_series=None, include_binaries=False,
- sponsored=None, unembargo=False):
+ sponsored=None, unembargo=False, auto_approve=False):
"""Copy a single named source into this archive.
Asynchronously copy a specific version of a named source to the
@@ -1321,6 +1326,9 @@
:param unembargo: if True, allow copying restricted files from a
private archive to a public archive, and re-upload them to the
public librarian when doing so.
+ :param auto_approve: if True and the `IPerson` requesting the sync
+ has queue admin permissions on the target, accept the copy
+ immediately rather than setting it to unapproved.
:raises NoSuchSourcePackageName: if the source name is invalid
:raises PocketNotFound: if the pocket name is invalid
@@ -1354,12 +1362,17 @@
title=_("Sponsored Person"),
description=_("The person who is being sponsored for this copy.")),
unembargo=Bool(title=_("Unembargo restricted files")),
+ auto_approve=Bool(
+ title=_("Automatic approval"),
+ description=_("Automatically approve this copy (queue admins "
+ "only)."),
+ required=False),
)
@export_write_operation()
@operation_for_version('devel')
def copyPackages(source_names, from_archive, to_pocket, person,
to_series=None, from_series=None, include_binaries=False,
- sponsored=None, unembargo=False):
+ sponsored=None, unembargo=False, auto_approve=False):
"""Copy multiple named sources into this archive from another.
Asynchronously copy the most recent PUBLISHED versions of the named
@@ -1386,6 +1399,9 @@
:param unembargo: if True, allow copying restricted files from a
private archive to a public archive, and re-upload them to the
public librarian when doing so.
+ :param auto_approve: if True and the `IPerson` requesting the sync
+ has queue admin permissions on the target, accept the copies
+ immediately rather than setting them to unapproved.
:raises NoSuchSourcePackageName: if the source name is invalid
:raises PocketNotFound: if the pocket name is invalid
=== modified file 'lib/lp/soyuz/interfaces/packagecopyjob.py'
--- lib/lp/soyuz/interfaces/packagecopyjob.py 2012-07-12 08:32:15 +0000
+++ lib/lp/soyuz/interfaces/packagecopyjob.py 2012-07-30 17:05:29 +0000
@@ -128,7 +128,7 @@
target_archive, target_distroseries, target_pocket,
include_binaries=False, package_version=None,
copy_policy=PackageCopyPolicy.INSECURE, requester=None,
- sponsored=None, unembargo=False):
+ sponsored=None, unembargo=False, auto_approve=False):
"""Create a new `IPlainPackageCopyJob`.
:param package_name: The name of the source package to copy.
@@ -147,11 +147,15 @@
:param sponsored: The user who is being sponsored to make the copy.
The person who is making this request then becomes the sponsor.
:param unembargo: See `do_copy`.
+ :param auto_approve: if True and the user requesting the sync has
+ queue admin permissions on the target, accept the copy
+ immediately rather than setting it to unapproved.
"""
def createMultiple(target_distroseries, copy_tasks, requester,
copy_policy=PackageCopyPolicy.INSECURE,
- include_binaries=False, unembargo=False):
+ include_binaries=False, unembargo=False,
+ auto_approve=False):
"""Create multiple new `IPlainPackageCopyJob`s at once.
:param target_distroseries: The `IDistroSeries` to which to copy the
@@ -164,6 +168,9 @@
:param include_binaries: As in `do_copy`.
:param unembargo: As in `do_copy`.
:return: An iterable of `PackageCopyJob` ids.
+ :param auto_approve: if True and the user requesting the sync has
+ queue admin permissions on the target, accept the copy
+ immediately rather than setting it to unapproved.
"""
def getActiveJobs(target_archive):
@@ -216,6 +223,10 @@
title=_("Unembargo restricted files"),
required=False, readonly=True)
+ auto_approve = Bool(
+ title=_("Automatic approval"),
+ required=False, readonly=True)
+
def addSourceOverride(override):
"""Add an `ISourceOverride` to the metadata."""
=== modified file 'lib/lp/soyuz/model/archive.py'
--- lib/lp/soyuz/model/archive.py 2012-07-30 17:05:29 +0000
+++ lib/lp/soyuz/model/archive.py 2012-07-30 17:05:29 +0000
@@ -1638,7 +1638,7 @@
def copyPackage(self, source_name, version, from_archive, to_pocket,
person, to_series=None, include_binaries=False,
- sponsored=None, unembargo=False):
+ sponsored=None, unembargo=False, auto_approve=False):
"""See `IArchive`."""
self._checkCopyPackageFeatureFlags()
@@ -1660,11 +1660,13 @@
target_pocket=pocket,
package_version=version, include_binaries=include_binaries,
copy_policy=PackageCopyPolicy.INSECURE, requester=person,
- sponsored=sponsored, unembargo=unembargo)
+ sponsored=sponsored, unembargo=unembargo,
+ auto_approve=auto_approve)
def copyPackages(self, source_names, from_archive, to_pocket,
person, to_series=None, from_series=None,
- include_binaries=None, sponsored=None, unembargo=False):
+ include_binaries=None, sponsored=None, unembargo=False,
+ auto_approve=False):
"""See `IArchive`."""
self._checkCopyPackageFeatureFlags()
@@ -1696,7 +1698,7 @@
job_source.createMultiple(
copy_tasks, person, copy_policy=PackageCopyPolicy.MASS_SYNC,
include_binaries=include_binaries, sponsored=sponsored,
- unembargo=unembargo)
+ unembargo=unembargo, auto_approve=auto_approve)
def _collectLatestPublishedSources(self, from_archive, from_series,
source_names):
=== modified file 'lib/lp/soyuz/model/packagecopyjob.py'
--- lib/lp/soyuz/model/packagecopyjob.py 2012-07-12 08:32:15 +0000
+++ lib/lp/soyuz/model/packagecopyjob.py 2012-07-30 17:05:29 +0000
@@ -256,7 +256,8 @@
@classmethod
def _makeMetadata(cls, target_pocket, package_version,
- include_binaries, sponsored=None, unembargo=False):
+ include_binaries, sponsored=None, unembargo=False,
+ auto_approve=False):
"""Produce a metadata dict for this job."""
if sponsored:
sponsored_name = sponsored.name
@@ -268,6 +269,7 @@
'include_binaries': bool(include_binaries),
'sponsored': sponsored_name,
'unembargo': unembargo,
+ 'auto_approve': auto_approve,
}
@classmethod
@@ -275,13 +277,13 @@
target_archive, target_distroseries, target_pocket,
include_binaries=False, package_version=None,
copy_policy=PackageCopyPolicy.INSECURE, requester=None,
- sponsored=None, unembargo=False):
+ sponsored=None, unembargo=False, auto_approve=False):
"""See `IPlainPackageCopyJobSource`."""
assert package_version is not None, "No package version specified."
assert requester is not None, "No requester specified."
metadata = cls._makeMetadata(
target_pocket, package_version, include_binaries, sponsored,
- unembargo)
+ unembargo, auto_approve)
job = PackageCopyJob(
job_type=cls.class_job_type,
source_archive=source_archive,
@@ -298,7 +300,8 @@
@classmethod
def _composeJobInsertionTuple(cls, copy_policy, include_binaries, job_id,
- copy_task, sponsored, unembargo):
+ copy_task, sponsored, unembargo,
+ auto_approve):
"""Create an SQL fragment for inserting a job into the database.
:return: A string representing an SQL tuple containing initializers
@@ -315,7 +318,7 @@
) = copy_task
metadata = cls._makeMetadata(
target_pocket, package_version, include_binaries, sponsored,
- unembargo)
+ unembargo, auto_approve)
data = (
cls.class_job_type, target_distroseries, copy_policy,
source_archive, target_archive, package_name, job_id,
@@ -326,14 +329,14 @@
def createMultiple(cls, copy_tasks, requester,
copy_policy=PackageCopyPolicy.INSECURE,
include_binaries=False, sponsored=None,
- unembargo=False):
+ unembargo=False, auto_approve=False):
"""See `IPlainPackageCopyJobSource`."""
store = IMasterStore(Job)
job_ids = Job.createMultiple(store, len(copy_tasks), requester)
job_contents = [
cls._composeJobInsertionTuple(
copy_policy, include_binaries, job_id, task, sponsored,
- unembargo)
+ unembargo, auto_approve)
for job_id, task in zip(job_ids, copy_tasks)]
return bulk.create(
(PackageCopyJob.job_type, PackageCopyJob.target_distroseries,
@@ -415,6 +418,10 @@
def unembargo(self):
return self.metadata.get('unembargo', False)
+ @property
+ def auto_approve(self):
+ return self.metadata.get('auto_approve', False)
+
def _createPackageUpload(self, unapproved=False):
pu = self.target_distroseries.createQueueEntry(
pocket=self.target_pocket, archive=self.target_archive,
@@ -452,7 +459,8 @@
return SourceOverride(source_package_name, component, section)
- def _checkPolicies(self, source_name, source_component=None):
+ def _checkPolicies(self, source_name, source_component=None,
+ auto_approve=False):
# This helper will only return if it's safe to carry on with the
# copy, otherwise it raises SuspendJobException to tell the job
# runner to suspend the job.
@@ -470,8 +478,11 @@
self.target_archive, self.target_distroseries,
self.target_pocket, [source_name], source_component)
self.addSourceOverride(defaults[0])
+ if auto_approve:
+ auto_approve = self.target_archive.canAdministerQueue(
+ self.requester, self.getSourceOverride().component)
- approve_new = copy_policy.autoApproveNew(
+ approve_new = auto_approve or copy_policy.autoApproveNew(
self.target_archive, self.target_distroseries,
self.target_pocket)
@@ -483,10 +494,13 @@
else:
# Put the existing override in the metadata.
self.addSourceOverride(ancestry[0])
+ if auto_approve:
+ auto_approve = self.target_archive.canAdministerQueue(
+ self.requester, self.getSourceOverride().component)
# The package is not new (it has ancestry) so check the copy
# policy for existing packages.
- approve_existing = copy_policy.autoApprove(
+ approve_existing = auto_approve or copy_policy.autoApprove(
self.target_archive, self.target_distroseries, self.target_pocket)
if not approve_existing:
self._createPackageUpload(unapproved=True)
@@ -565,7 +579,8 @@
[self.context.id]).any()
if pu is None:
self._checkPolicies(
- source_name, source_package.sourcepackagerelease.component)
+ source_name, source_package.sourcepackagerelease.component,
+ self.auto_approve)
# The package is free to go right in, so just copy it now.
ancestry = self.target_archive.getPublishedSources(
=== modified file 'lib/lp/soyuz/tests/test_packagecopyjob.py'
--- lib/lp/soyuz/tests/test_packagecopyjob.py 2012-07-25 18:56:53 +0000
+++ lib/lp/soyuz/tests/test_packagecopyjob.py 2012-07-30 17:05:29 +0000
@@ -925,6 +925,91 @@
# the target archive.
self.assertEqual('main', pcj.metadata['component_override'])
+ def createAutoApproveEnvironment(self, create_ancestry, component_names):
+ """Create an environment for testing the auto_approve flag."""
+ if create_ancestry:
+ self.distroseries.status = SeriesStatus.FROZEN
+ target_archive = self.factory.makeArchive(
+ self.distroseries.distribution, purpose=ArchivePurpose.PRIMARY)
+ source_archive = self.factory.makeArchive()
+ requester = self.factory.makePerson()
+ with person_logged_in(target_archive.owner):
+ for component_name in component_names:
+ target_archive.newQueueAdmin(requester, component_name)
+ spph = self.publisher.getPubSource(
+ distroseries=self.distroseries,
+ status=PackagePublishingStatus.PUBLISHED, archive=source_archive)
+ spr = spph.sourcepackagerelease
+ if create_ancestry:
+ self.publisher.getPubSource(
+ distroseries=self.distroseries, sourcename=spr.name,
+ version="%s~" % spr.version,
+ status=PackagePublishingStatus.PUBLISHED,
+ archive=target_archive)
+ return target_archive, source_archive, requester, spph, spr
+
+ def test_auto_approve(self):
+ # The auto_approve flag causes the job to be processed immediately,
+ # even if it would normally have required manual approval.
+ (target_archive, source_archive, requester,
+ spph, spr) = self.createAutoApproveEnvironment(True, ["main"])
+ job = self.createCopyJobForSPPH(
+ spph, source_archive, target_archive, requester=requester,
+ auto_approve=True)
+ self.runJob(job)
+ self.assertEqual(JobStatus.COMPLETED, job.status)
+ new_publication = target_archive.getPublishedSources(
+ name=spr.name, version=spr.version, exact_match=True).one()
+ self.assertEqual(target_archive, new_publication.archive)
+
+ def test_auto_approve_non_queue_admin(self):
+ # The auto_approve flag is ignored for people without queue admin
+ # permissions.
+ (target_archive, source_archive, requester,
+ spph, spr) = self.createAutoApproveEnvironment(True, [])
+ job = self.createCopyJobForSPPH(
+ spph, source_archive, target_archive, requester=requester,
+ auto_approve=True)
+ self.runJob(job)
+ self.assertEqual(JobStatus.SUSPENDED, job.status)
+
+ def test_auto_approve_new(self):
+ # The auto_approve flag causes copies to bypass the NEW queue.
+ (target_archive, source_archive, requester,
+ spph, spr) = self.createAutoApproveEnvironment(False, ["universe"])
+
+ # Without auto_approve, this job would be suspended and the upload
+ # moved to the NEW queue.
+ job = self.createCopyJobForSPPH(
+ spph, source_archive, target_archive, requester=requester)
+ self.runJob(job)
+ self.assertEqual(JobStatus.SUSPENDED, job.status)
+ switch_dbuser("launchpad_main")
+ pu = getUtility(IPackageUploadSet).getByPackageCopyJobIDs(
+ [removeSecurityProxy(job).context.id]).one()
+ self.assertEqual(PackageUploadStatus.NEW, pu.status)
+
+ # With auto_approve, the job completes immediately.
+ job = self.createCopyJobForSPPH(
+ spph, source_archive, target_archive, requester=requester,
+ auto_approve=True)
+ self.runJob(job)
+ self.assertEqual(JobStatus.COMPLETED, job.status)
+ new_publication = target_archive.getPublishedSources(
+ name=spr.name, version=spr.version, exact_match=True).one()
+ self.assertEqual(target_archive, new_publication.archive)
+
+ def test_auto_approve_new_non_queue_admin(self):
+ # For NEW packages, the auto_approve flag is ignored for people
+ # without queue admin permissions.
+ (target_archive, source_archive, requester,
+ spph, spr) = self.createAutoApproveEnvironment(False, [])
+ job = self.createCopyJobForSPPH(
+ spph, source_archive, target_archive, requester=requester,
+ auto_approve=True)
+ self.runJob(job)
+ self.assertEqual(JobStatus.SUSPENDED, job.status)
+
def test_copying_after_job_released(self):
# The first pass of the job may have created a PackageUpload and
# suspended the job. Here we test the second run to make sure