launchpad-reviewers team mailing list archive
Message #04568
[Merge] lp:~jtv/launchpad/bug-659769 into lp:launchpad
Jeroen T. Vermeulen has proposed merging lp:~jtv/launchpad/bug-659769 into lp:launchpad.
Requested reviews:
Launchpad code reviewers (launchpad-reviewers)
Related bugs:
Bug #659769 in Launchpad itself: "should copy custom uploads to newly-initialised release"
https://bugs.launchpad.net/launchpad/+bug/659769
For more details, see:
https://code.launchpad.net/~jtv/launchpad/bug-659769/+merge/71173
= Summary =
Every time the Ubuntu people (particularly Colin) set up a new release, they need to copy some of the archive's custom-upload files to the new release. Custom uploads are almost entirely unmanaged, so they would probably do this by copying directly in the filesystem. They have asked for an easier, more integrated way to get it done.
== Proposed fix ==
As far as I can make out, the only types of custom upload that need to be copied are for the Debian installer and the dist upgrader. The Rosetta translations are shared within the database, and presumably both kinds of translations for a package will soon be re-uploaded anyway. So I focused on the installers and upgraders.
Because custom uploads are so lightly managed (little or no metadata is kept, and we'd have to parse tarballs from the Librarian just to see what files came out of which upload), it's hard to figure out exactly what should or should not be copied. Some of the files may be obsolete, and if we copy them along, they'll be with us forever.
The uploads contain tarballs with names in just one of two formats: <package>_<version>_<architecture>.tar.gz and <package>_<version>.tar.gz. A tarball for a given installer version, say, should contain all files that that version needs; there's no finicky merging of individual files that may each be current or obsolete. We can just identify the upload with the current tarball, and copy that upload into the new series.
Rather than get into the full hairy detail of version parsing (some of the versions look a little ad-hoc), I'm just copying the latest custom uploads for each (upload type, <package>, <architecture>). The <architecture> defaults to "all" because that's what seems to be meant when it is omitted.
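The naming scheme and the latest-per-key selection described above can be sketched as follows. This is a standalone sketch with illustrative helper names, not the branch's actual code (which lives in CustomUploadsCopier.extractNameFields and getLatestUploads):

```python
import re

# Filenames follow <package>_<version>[_<architecture>].tar[.<suffix>];
# versions may contain dots and dashes but no underscores.
FILENAME_RE = re.compile(r"([^_]+)_[^_]+(?:_([^._]+))?\.tar")


def extract_key(filename):
    """Return (package, architecture) for a custom-upload tarball name,
    or None if the name does not match the expected pattern.
    Architecture defaults to 'all' when the filename omits it."""
    match = FILENAME_RE.match(filename)
    if match is None:
        return None
    package, arch = match.groups()
    return (package, arch or "all")


def latest_uploads(filenames):
    """Index filenames by (package, architecture), keeping only the
    first hit per key; assumes input is ordered newest to oldest."""
    latest = {}
    for name in filenames:
        key = extract_key(name)
        if key is not None:
            latest.setdefault(key, name)
    return latest
```

Note that a name with extra underscore-separated fields, such as `one_two_three_four_5.tar.gz`, fails to parse and is simply skipped rather than miscategorized.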
As far as I've seen, the scheme I implemented here would actually work just fine for the other upload types. It will also work for derived distributions, with two limitations:
1. The custom uploads are copied only between consecutive releases of the same distribution, not across inheritance lines.
2. Uploads have to follow this naming scheme in order to be copied. This is something the package defines.
Having this supported for derived distributions could help save manual maintenance and system access needs for derived distros. We may want to add cross-distro inheritance of custom uploads later, but then we'll have to solve the conflict-resolution problem: should we copy the previous series' latest installer, or the one from the parent distro in the series that we're deriving from? What if there are multiple parent distros or releases?
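Within a single distribution, by contrast, the copy is conflict-free and idempotent: an upload is skipped when the target series already has an equal-or-newer upload under the same key. A minimal sketch, assuming uploads are dicts carrying a monotonically increasing "id" (the branch's isObsolete check compares ids the same way):

```python
def copy_new_uploads(source_latest, target_latest, copy):
    """Copy each latest source upload into the target unless the target
    already has an equal-or-newer upload under the same key."""
    for key, upload in source_latest.items():
        existing = target_latest.get(key)
        # Skip uploads the target has already superseded (or copied).
        if existing is None or existing["id"] < upload["id"]:
            copy(key, upload)
```

Running this twice, with target_latest refreshed in between, copies nothing the second time; that is what makes repeated publisher runs safe.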
== Pre-implementation notes ==
Discussed with various people, but alas not with Colin at the time of writing. I'm going to hold off on landing until he confirms that this will give him what he needs. Watch this space for updates.
== Implementation details ==
== Tests ==
{{{
./bin/test -vvc lp.soyuz -t test_publishing -t test_custom_uploads_copier
./bin/test -vvc lp.archivepublisher.tests.test_publish_ftpmaster
}}}
== Demo and Q/A ==
We'll have to do this hand in hand with the distro gurus, not only to validate the result but also to help set up the test scenario!
= Launchpad lint =
Checking for conflicts and issues in changed files.
Linting changed files:
lib/lp/soyuz/tests/test_publishing.py
lib/lp/archivepublisher/scripts/publish_ftpmaster.py
lib/lp/registry/interfaces/distroseries.py
lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py
lib/lp/soyuz/scripts/custom_uploads_copier.py
lib/lp/archivepublisher/tests/test_publish_ftpmaster.py
lib/lp/registry/model/distroseries.py
--
https://code.launchpad.net/~jtv/launchpad/bug-659769/+merge/71173
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~jtv/launchpad/bug-659769 into lp:launchpad.
=== modified file 'lib/lp/archivepublisher/scripts/publish_ftpmaster.py'
--- lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-08-09 10:30:43 +0000
+++ lib/lp/archivepublisher/scripts/publish_ftpmaster.py 2011-08-11 10:24:38 +0000
@@ -27,6 +27,10 @@
)
from lp.services.utils import file_exists
from lp.soyuz.enums import ArchivePurpose
+from lp.soyuz.scripts.custom_uploads_copier import CustomUploadsCopier
from lp.soyuz.scripts.ftpmaster import LpQueryDistro
from lp.soyuz.scripts.processaccepted import ProcessAccepted
from lp.soyuz.scripts.publishdistro import PublishDistro
@@ -539,6 +543,25 @@
self.recoverWorkingDists()
raise
+ def prepareFreshSeries(self):
+ """If there are any new distroseries, prepare them for publishing.
+
+ :return: True if a series did indeed still need some preparation,
+ or False for the normal case.
+ """
+ have_fresh_series = False
+ for series in self.distribution.series:
+ suites_needing_indexes = self.listSuitesNeedingIndexes(series)
+ if len(suites_needing_indexes) != 0:
+ # This is a fresh series.
+ have_fresh_series = True
+ if series.previous_series is not None:
+ CustomUploadsCopier(series).copy(series.previous_series)
+ for suite in suites_needing_indexes:
+ self.createIndexes(suite)
+
+ return have_fresh_series
+
def setUp(self):
"""Process options, and set up internal state."""
self.processOptions()
@@ -550,14 +573,9 @@
self.setUp()
self.recoverWorkingDists()
- for series in self.distribution.series:
- suites_needing_indexes = self.listSuitesNeedingIndexes(series)
- if len(suites_needing_indexes) > 0:
- for suite in suites_needing_indexes:
- self.createIndexes(suite)
- # Don't try to do too much in one run. Leave the rest
- # of the work for next time.
- return
+ if self.prepareFreshSeries():
+ # We've done enough. Leave some room for others.
+ return
self.processAccepted()
self.setUpDirs()
=== modified file 'lib/lp/archivepublisher/tests/test_publish_ftpmaster.py'
--- lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-08-03 06:24:53 +0000
+++ lib/lp/archivepublisher/tests/test_publish_ftpmaster.py 2011-08-11 10:24:38 +0000
@@ -44,6 +44,7 @@
from lp.soyuz.enums import (
ArchivePurpose,
PackagePublishingStatus,
+ PackageUploadCustomFormat,
)
from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
from lp.testing import (
@@ -980,6 +981,29 @@
self.assertEqual([suite], kwargs['suites'])
self.assertThat(kwargs['suites'][0], StartsWith(series.name))
+ def test_prepareFreshSeries_copies_custom_uploads(self):
+ distro = self.makeDistroWithPublishDirectory()
+ old_series = self.factory.makeDistroSeries(
+ distribution=distro, status=SeriesStatus.CURRENT)
+ new_series = self.factory.makeDistroSeries(
+ distribution=distro, previous_series=old_series,
+ status=SeriesStatus.FROZEN)
+ custom_upload = self.factory.makeCustomPackageUpload(
+ distroseries=old_series,
+ custom_type=PackageUploadCustomFormat.DEBIAN_INSTALLER,
+ filename='debian-installer-images_1.0-20110805_i386.tar.gz')
+ script = self.makeScript(distro)
+ script.createIndexes = FakeMethod()
+ script.setUp()
+ have_fresh_series = script.prepareFreshSeries()
+ self.assertTrue(have_fresh_series)
+ [copied_upload] = new_series.getPackageUploads(
+ name=u'debian-installer-images', exact_match=False)
+ [copied_custom] = copied_upload.customfiles
+ self.assertEqual(
+ custom_upload.customfiles[0].libraryfilealias.filename,
+ copied_custom.libraryfilealias.filename)
+
def test_script_creates_indexes(self):
# End-to-end test: the script creates indexes for distroseries
# that need them.
=== modified file 'lib/lp/registry/interfaces/distroseries.py'
--- lib/lp/registry/interfaces/distroseries.py 2011-08-05 03:58:16 +0000
+++ lib/lp/registry/interfaces/distroseries.py 2011-08-11 10:24:38 +0000
@@ -786,20 +786,34 @@
DistroSeriesBinaryPackage objects that match the given text.
"""
- def createQueueEntry(pocket, archive, changesfilename, changesfilecontent,
+ def createQueueEntry(pocket, archive, changesfilename=None,
+ changesfilecontent=None, changes_file_alias=None,
signingkey=None, package_copy_job=None):
"""Create a queue item attached to this distroseries.
- Create a new records respecting the given pocket and archive.
-
- The default state is NEW, sorted sqlobject declaration, any
- modification should be performed via Queue state-machine.
-
- The changesfile argument should be the text of the .changes for this
- upload. The contents of this may be used later.
-
- 'signingkey' is the IGPGKey used to sign the changesfile or None if
- the changesfile is unsigned.
+ Create a new `PackageUpload` to the given pocket and archive.
+
+ The default state is NEW. Any further state changes go through
+ the Queue state-machine.
+
+ :param pocket: The `PackagePublishingPocket` to upload to.
+ :param archive: The `Archive` to upload to. Must be for the same
+ `Distribution` as this series.
+ :param changesfilename: Name for the upload's .changes file. You may
+ specify a changes file by passing both `changesfilename` and
+ `changesfilecontent`, or by passing `changes_file_alias`.
+ :param changesfilecontent: Text for the changes file. It will be
+ signed and stored in the Librarian. Must be passed together with
+ `changesfilename`; alternatively, you may provide a
+ `changes_file_alias` to replace both of these.
+ :param changes_file_alias: A `LibraryFileAlias` containing the
+ .changes file. Security warning: unless the file has already
+ been checked, this may open us up to replay attacks as per bugs
+ 159304 and 451396. Use `changes_file_alias` only if you know
+ this can't happen.
+ :param signingkey: `IGPGKey` used to sign the changes file, or None if
+ it is unsigned.
+ :return: A new `PackageUpload`.
"""
def newArch(architecturetag, processorfamily, official, owner,
=== modified file 'lib/lp/registry/model/distroseries.py'
--- lib/lp/registry/model/distroseries.py 2011-08-05 03:58:16 +0000
+++ lib/lp/registry/model/distroseries.py 2011-08-11 10:24:38 +0000
@@ -86,9 +86,7 @@
ISeriesBugTarget,
)
from lp.bugs.interfaces.bugtaskfilter import OrderedBugTask
-from lp.bugs.model.bug import (
- get_bug_tags,
- )
+from lp.bugs.model.bug import get_bug_tags
from lp.bugs.model.bugtarget import (
BugTargetBase,
HasBugHeatMixin,
@@ -1587,25 +1585,34 @@
get_property_cache(spph).newer_distroseries_version = version
def createQueueEntry(self, pocket, archive, changesfilename=None,
- changesfilecontent=None, signing_key=None,
- package_copy_job=None):
+ changesfilecontent=None, changes_file_alias=None,
+ signing_key=None, package_copy_job=None):
"""See `IDistroSeries`."""
- # We store the changes file in the librarian to avoid having to
- # deal with broken encodings in these files; this will allow us
- # to regenerate these files as necessary.
- #
- # The use of StringIO here should be safe: we do not encoding of
- # the content in the changes file (as doing so would be guessing
- # at best, causing unpredictable corruption), and simply pass it
- # off to the librarian.
-
- if package_copy_job is None and (
- changesfilename is None or changesfilecontent is None):
+ if (changesfilename is None) != (changesfilecontent is None):
+ raise AssertionError(
+ "Inconsistent changesfilename and changesfilecontent. "
+ "Pass either both, or neither.")
+ if changes_file_alias is not None and changesfilename is not None:
+ raise AssertionError(
+ "Conflicting options: "
+ "Both changesfilename and changes_file_alias were given.")
+ have_changes_file = not (
+ changesfilename is None and changes_file_alias is None)
+ if package_copy_job is None and not have_changes_file:
raise AssertionError(
"changesfilename and changesfilecontent must be supplied "
"if there is no package_copy_job")
- if package_copy_job is None:
+ if changesfilename is not None:
+ # We store the changes file in the librarian to avoid having to
+ # deal with broken encodings in these files; this will allow us
+ # to regenerate these files as necessary.
+ #
+ # The use of StringIO here should be safe: we do no encoding of
+ # the content in the changes file (as doing so would be guessing
+ # at best, causing unpredictable corruption), and simply pass it
+ # off to the librarian.
+
# The PGP signature is stripped from all changesfiles
# to avoid replay attacks (see bugs 159304 and 451396).
signed_message = signed_message_from_string(changesfilecontent)
@@ -1616,17 +1623,15 @@
if new_content is not None:
changesfilecontent = signed_message.signedContent
- changes_file = getUtility(ILibraryFileAliasSet).create(
+ changes_file_alias = getUtility(ILibraryFileAliasSet).create(
changesfilename, len(changesfilecontent),
StringIO(changesfilecontent), 'text/plain',
restricted=archive.private)
- else:
- changes_file = None
return PackageUpload(
distroseries=self, status=PackageUploadStatus.NEW,
pocket=pocket, archive=archive,
- changesfile=changes_file, signing_key=signing_key,
+ changesfile=changes_file_alias, signing_key=signing_key,
package_copy_job=package_copy_job)
def getPackageUploadQueue(self, state):
=== added file 'lib/lp/soyuz/scripts/custom_uploads_copier.py'
--- lib/lp/soyuz/scripts/custom_uploads_copier.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/scripts/custom_uploads_copier.py 2011-08-11 10:24:38 +0000
@@ -0,0 +1,149 @@
+# Copyright 2011 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+"""Copy latest custom uploads into a distribution release series.
+
+Use this when initializing the installer and dist upgrader for a new release
+series based on the latest uploads from its preceding series.
+"""
+
+__metaclass__ = type
+__all__ = [
+ 'CustomUploadsCopier',
+ ]
+
+from operator import attrgetter
+import re
+
+from zope.component import getUtility
+
+from lp.services.database.bulk import load_referencing
+from lp.soyuz.enums import PackageUploadCustomFormat
+from lp.soyuz.interfaces.archive import (
+ IArchiveSet,
+ MAIN_ARCHIVE_PURPOSES,
+ )
+from lp.soyuz.model.queue import PackageUploadCustom
+
+
+class CustomUploadsCopier:
+ """Copy `PackageUploadCustom` objects into a new `DistroSeries`."""
+
+ copyable_types = [
+ PackageUploadCustomFormat.DEBIAN_INSTALLER,
+ PackageUploadCustomFormat.DIST_UPGRADER,
+ ]
+
+ def __init__(self, target_series):
+ self.target_series = target_series
+
+ def isCopyable(self, upload):
+ """Is `upload` the kind of `PackageUploadCustom` that we can copy?"""
+ return upload.customformat in self.copyable_types
+
+ def getCandidateUploads(self, source_series):
+ """Find custom uploads that may need copying."""
+ uploads = source_series.getPackageUploads(
+ custom_type=self.copyable_types)
+ load_referencing(PackageUploadCustom, uploads, ['packageuploadID'])
+ customs = sum([list(upload.customfiles) for upload in uploads], [])
+ customs = filter(self.isCopyable, customs)
+ customs.sort(key=attrgetter('id'), reverse=True)
+ return customs
+
+ def extractNameFields(self, filename):
+ """Get the relevant fields out of `filename`.
+
+ Scans filenames of any of these forms:
+
+ <package>_<version>_<architecture>.tar.<compression_suffix>
+ <package>_<version>.tar[.<compression_suffix>]
+
+ Versions may contain dots, dashes etc. but no underscores.
+
+ :return: A tuple of (<package>, <architecture>), or None if the
+ filename does not match the expected pattern. If no
+ architecture is found in the filename, it defaults to 'all'.
+ """
+ regex_parts = {
+ 'package': "[^_]+",
+ 'version': "[^_]+",
+ 'arch': "[^._]+",
+ }
+ filename_regex = (
"(%(package)s)_%(version)s(?:_(%(arch)s))?\.tar" % regex_parts)
+ match = re.match(filename_regex, filename)
+ if match is None:
+ return None
+ default_arch = 'all'
+ fields = match.groups(default_arch)
+ if len(fields) != 2:
+ return None
+ return fields
+
+ def getKey(self, upload):
+ """Get an indexing key for `upload`."""
+ custom_format = (upload.customformat, )
+ name_fields = self.extractNameFields(upload.libraryfilealias.filename)
+ if name_fields is None:
+ return None
+ else:
+ return custom_format + name_fields
+
+ def getLatestUploads(self, source_series):
+ """Find the latest uploads.
+
+ :param source_series: The `DistroSeries` whose uploads to get.
+ :return: A dict containing the latest uploads, indexed by keys as
+ returned by `getKey`.
+ """
+ latest_uploads = {}
+ for upload in self.getCandidateUploads(source_series):
+ key = self.getKey(upload)
+ if key is not None:
+ latest_uploads.setdefault(key, upload)
+ return latest_uploads
+
+ def getTargetArchive(self, original_archive):
+ """Find counterpart of `original_archive` in `self.target_series`.
+
+ :param original_archive: The `Archive` that the original upload went
+ into. If this is not a primary, partner, or debug archive,
+ None is returned.
+ :return: The `Archive` of the same purpose for `self.target_series`.
+ """
+ if original_archive.purpose not in MAIN_ARCHIVE_PURPOSES:
+ return None
+ return getUtility(IArchiveSet).getByDistroPurpose(
+ self.target_series.distribution, original_archive.purpose)
+
+ def isObsolete(self, upload, target_uploads):
+ """Is `upload` superseded by one that the target series already has?
+
+ :param upload: A `PackageUploadCustom` from the source series.
+ :param target_uploads:
+ """
+ existing_upload = target_uploads.get(self.getKey(upload))
+ return existing_upload is not None and existing_upload.id >= upload.id
+
+ def copyUpload(self, original_upload):
+ """Copy `original_upload` into `self.target_series`."""
+ target_archive = self.getTargetArchive(
+ original_upload.packageupload.archive)
+ if target_archive is None:
+ return None
+ package_upload = self.target_series.createQueueEntry(
+ original_upload.packageupload.pocket, target_archive,
+ changes_file_alias=original_upload.packageupload.changesfile)
+ custom = package_upload.addCustom(
+ original_upload.libraryfilealias, original_upload.customformat)
+ package_upload.setAccepted()
+ return custom
+
+ def copy(self, source_series):
+ """Copy uploads from `source_series`."""
+ target_uploads = self.getLatestUploads(self.target_series)
+ for upload in self.getLatestUploads(source_series).itervalues():
+ if not self.isObsolete(upload, target_uploads):
+ self.copyUpload(upload)
=== added file 'lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py'
--- lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/scripts/tests/test_custom_uploads_copier.py 2011-08-11 10:24:38 +0000
@@ -0,0 +1,420 @@
+# Copyright 2011 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+"""Test copying of custom package uploads for a new `DistroSeries`."""
+
+__metaclass__ = type
+
+from canonical.testing.layers import (
+ LaunchpadZopelessLayer,
+ ZopelessLayer,
+ )
+from lp.soyuz.enums import (
+ ArchivePurpose,
+ PackageUploadCustomFormat,
+ PackageUploadStatus,
+ )
+from lp.soyuz.interfaces.archive import MAIN_ARCHIVE_PURPOSES
+from lp.soyuz.scripts.custom_uploads_copier import CustomUploadsCopier
+from lp.testing import TestCaseWithFactory
+from lp.testing.fakemethod import FakeMethod
+
+
+def list_custom_uploads(distroseries):
+ """Return a list of all `PackageUploadCustom`s for `distroseries`."""
+ return sum(
+ [
+ list(upload.customfiles)
+ for upload in distroseries.getPackageUploads()],
+ [])
+
+
+class FakeDistroSeries:
+ """Fake `DistroSeries` for test copiers that don't really need one."""
+
+
+class FakeLibraryFileAlias:
+ def __init__(self, filename):
+ self.filename = filename
+
+
+class FakeUpload:
+ def __init__(self, customformat, filename):
+ self.customformat = customformat
+ self.libraryfilealias = FakeLibraryFileAlias(filename)
+
+
+class CommonTestHelpers:
+ """Helper(s) for these tests."""
+ def makeVersion(self):
+ """Create a fake version string."""
+ return "%d.%d-%s" % (
+ self.factory.getUniqueInteger(),
+ self.factory.getUniqueInteger(),
+ self.factory.getUniqueString())
+
+
+class TestCustomUploadsCopierLite(TestCaseWithFactory, CommonTestHelpers):
+ """Light-weight low-level tests for `CustomUploadsCopier`."""
+
+ layer = ZopelessLayer
+
+ def test_isCopyable_matches_copyable_types(self):
+ # isCopyable checks a custom upload's customformat field to
+ # determine whether the upload is a candidate for copying. It
+ # approves only those whose customformats are in copyable_types.
+ class FakePackageUploadCustom:
+ def __init__(self, customformat):
+ self.customformat = customformat
+
+ uploads = [
+ FakePackageUploadCustom(custom_type)
+ for custom_type in PackageUploadCustomFormat.items]
+
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ copied_uploads = filter(copier.isCopyable, uploads)
+ self.assertContentEqual(
+ CustomUploadsCopier.copyable_types,
+ [upload.customformat for upload in copied_uploads])
+
+ def test_extractNameFields_extracts_package_name_and_architecture(self):
+ # extractNameFields picks up the package name and architecture
+ # out of an upload's filename field.
+ package_name = self.factory.getUniqueString('package')
+ version = self.makeVersion()
+ architecture = self.factory.getUniqueString('arch')
+ filename = '%s_%s_%s.tar.gz' % (package_name, version, architecture)
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ self.assertEqual(
+ (package_name, architecture), copier.extractNameFields(filename))
+
+ def test_extractNameFields_does_not_require_architecture(self):
+ # When extractNameFields does not see an architecture, it
+ # defaults to 'all'.
+ package_name = self.factory.getUniqueString('package')
+ filename = '%s_%s.tar.gz' % (package_name, self.makeVersion())
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ self.assertEqual(
+ (package_name, 'all'), copier.extractNameFields(filename))
+
+ def test_extractNameFields_returns_None_on_mismatch(self):
+ # If the filename does not match the expected pattern,
+ # extractNameFields returns None.
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ self.assertIs(None, copier.extractNameFields('argh_1.0.jpg'))
+
+ def test_extractNameFields_ignores_names_with_too_many_fields(self):
+ # As one particularly nasty case that might break
+ # extractNameFields, a name with more underscore-separated fields
+ # than the search pattern allows for is sensibly rejected.
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ self.assertIs(
+ None, copier.extractNameFields('one_two_three_four_5.tar.gz'))
+
+ def test_getKey_returns_None_on_name_mismatch(self):
+ # If extractNameFields returns None, getKey also returns None.
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ copier.extractNameFields = FakeMethod()
+ self.assertIs(
+ None,
+ copier.getKey(FakeUpload(
+ PackageUploadCustomFormat.DEBIAN_INSTALLER,
+ "bad-filename.tar")))
+
+
+class TestCustomUploadsCopier(TestCaseWithFactory, CommonTestHelpers):
+ """Heavyweight `CustomUploadsCopier` tests."""
+
+ # Alas, PackageUploadCustom relies on the Librarian.
+ layer = LaunchpadZopelessLayer
+
+ def makeUpload(self, distroseries=None,
+ custom_type=PackageUploadCustomFormat.DEBIAN_INSTALLER,
+ package_name=None, version=None, arch=None):
+ """Create a `PackageUploadCustom`."""
+ if distroseries is None:
+ distroseries = self.factory.makeDistroSeries()
+ if package_name is None:
+ package_name = self.factory.getUniqueString("package")
+ if version is None:
+ version = self.makeVersion()
+ filename = "%s.tar.gz" % '_'.join(
+ filter(None, [package_name, version, arch]))
+ package_upload = self.factory.makeCustomPackageUpload(
+ distroseries=distroseries, custom_type=custom_type,
+ filename=filename)
+ return package_upload.customfiles[0]
+
+ def test_copies_custom_upload(self):
+ # CustomUploadsCopier copies custom uploads from one series to
+ # another.
+ current_series = self.factory.makeDistroSeries()
+ original_upload = self.makeUpload(current_series)
+ new_series = self.factory.makeDistroSeries(
+ distribution=current_series.distribution,
+ previous_series=current_series)
+
+ CustomUploadsCopier(new_series).copy(current_series)
+
+ [copied_upload] = list_custom_uploads(new_series)
+ self.assertEqual(
+ original_upload.libraryfilealias, copied_upload.libraryfilealias)
+
+ def test_is_idempotent(self):
+ # It's safe to perform the same copy more than once; the uploads
+ # get copied only once.
+ current_series = self.factory.makeDistroSeries()
+ self.makeUpload(current_series)
+ new_series = self.factory.makeDistroSeries(
+ distribution=current_series.distribution,
+ previous_series=current_series)
+
+ copier = CustomUploadsCopier(new_series)
+ copier.copy(current_series)
+ uploads_after_first_copy = list_custom_uploads(new_series)
+ copier.copy(current_series)
+ uploads_after_redundant_copy = list_custom_uploads(new_series)
+
+ self.assertEqual(
+ uploads_after_first_copy, uploads_after_redundant_copy)
+
+ def test_getCandidateUploads_filters_by_distroseries(self):
+ # getCandidateUploads ignores uploads for other distroseries.
+ source_series = self.factory.makeDistroSeries()
+ matching_upload = self.makeUpload(source_series)
+ nonmatching_upload = self.makeUpload()
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ candidate_uploads = copier.getCandidateUploads(source_series)
+ self.assertContentEqual([matching_upload], candidate_uploads)
+ self.assertNotIn(nonmatching_upload, candidate_uploads)
+
+ def test_getCandidateUploads_filters_upload_types(self):
+ # getCandidateUploads returns only uploads of the types listed
+ # in copyable_types; other types of upload are ignored.
+ source_series = self.factory.makeDistroSeries()
+ for custom_format in PackageUploadCustomFormat.items:
+ self.makeUpload(source_series, custom_type=custom_format)
+
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ candidate_uploads = copier.getCandidateUploads(source_series)
+ copied_types = [upload.customformat for upload in candidate_uploads]
+ self.assertContentEqual(
+ CustomUploadsCopier.copyable_types, copied_types)
+
+ def test_getCandidateUploads_ignores_other_attachments(self):
+ # A PackageUpload can have multiple PackageUploadCustoms
+ # attached, potentially of different types. getCandidateUploads
+ # ignores PackageUploadCustoms of types that aren't supposed to
+ # be copied, even if they are attached to PackageUploads that
+ # also have PackageUploadCustoms that do need to be copied.
+ source_series = self.factory.makeDistroSeries()
+ package_upload = self.factory.makePackageUpload(
+ distroseries=source_series, archive=source_series.main_archive)
+ library_file = self.factory.makeLibraryFileAlias()
+ matching_upload = package_upload.addCustom(
+ library_file, PackageUploadCustomFormat.DEBIAN_INSTALLER)
+ nonmatching_upload = package_upload.addCustom(
+ library_file, PackageUploadCustomFormat.ROSETTA_TRANSLATIONS)
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ candidates = copier.getCandidateUploads(source_series)
+ self.assertContentEqual([matching_upload], candidates)
+ self.assertNotIn(nonmatching_upload, candidates)
+
+ def test_getCandidateUploads_orders_newest_to_oldest(self):
+ # getCandidateUploads returns its PackageUploadCustoms ordered
+ # from newest to oldest.
+ source_series = self.factory.makeDistroSeries()
+ for counter in xrange(5):
+ self.makeUpload(source_series)
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ candidate_ids = [
+ upload.id for upload in copier.getCandidateUploads(source_series)]
+ self.assertEqual(sorted(candidate_ids, reverse=True), candidate_ids)
+
+ def test_getKey_includes_format_package_and_architecture(self):
+ # The key returned by getKey consists of custom upload type,
+ # package name, and architecture.
+ source_series = self.factory.makeDistroSeries()
+ upload = self.makeUpload(
+ source_series, PackageUploadCustomFormat.DIST_UPGRADER,
+ package_name='upgrader', arch='mips')
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ expected_key = (
+ PackageUploadCustomFormat.DIST_UPGRADER,
+ 'upgrader',
+ 'mips',
+ )
+ self.assertEqual(expected_key, copier.getKey(upload))
+
+ def test_getLatestUploads_indexes_uploads_by_key(self):
+ # getLatestUploads returns a dict of uploads, indexed by keys
+ # returned by getKey.
+ source_series = self.factory.makeDistroSeries()
+ upload = self.makeUpload(source_series)
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ self.assertEqual(
+ {copier.getKey(upload): upload},
+ copier.getLatestUploads(source_series))
+
+ def test_getLatestUploads_filters_superseded_uploads(self):
+ # getLatestUploads returns only the latest upload for a given
+ # distroseries, type, package, and architecture. Any older
+ # uploads with the same distroseries, type, package name, and
+ # architecture are ignored.
+ source_series = self.factory.makeDistroSeries()
+ uploads = [
+ self.makeUpload(
+ source_series, package_name='installer', version='1.0.0',
+ arch='ppc')
+ for counter in xrange(3)]
+
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ self.assertContentEqual(
+ uploads[-1:], copier.getLatestUploads(source_series).values())
+
+ def test_getLatestUploads_bundles_versions(self):
+ # getLatestUploads sees an upload as superseding an older one
+ # for the same distroseries, type, package name, and
+ # architecture even if they have different versions.
+ source_series = self.factory.makeDistroSeries()
+ uploads = [
+ self.makeUpload(source_series, package_name='foo', arch='i386')
+ for counter in xrange(2)]
+ copier = CustomUploadsCopier(FakeDistroSeries())
+ self.assertContentEqual(
+ uploads[-1:], copier.getLatestUploads(source_series).values())
+
+ def test_getTargetArchive_on_same_distro_is_same_archive(self):
+ # When copying within the same distribution, getTargetArchive
+ # always returns the same archive you feed it.
+ distro = self.factory.makeDistribution()
+ archives = [
+ self.factory.makeArchive(distribution=distro, purpose=purpose)
+ for purpose in MAIN_ARCHIVE_PURPOSES]
+ copier = CustomUploadsCopier(self.factory.makeDistroSeries(distro))
+ self.assertEqual(
+ archives,
+ [copier.getTargetArchive(archive) for archive in archives])
+
+ def test_getTargetArchive_returns_None_if_not_distribution_archive(self):
+ # getTargetArchive returns None for any archive that is not a
+ # distribution archive, regardless of whether the target series
+ # has an equivalent.
+ distro = self.factory.makeDistribution()
+ archives = [
+ self.factory.makeArchive(distribution=distro, purpose=purpose)
+ for purpose in ArchivePurpose.items
+ if purpose not in MAIN_ARCHIVE_PURPOSES]
+ copier = CustomUploadsCopier(self.factory.makeDistroSeries(distro))
+ self.assertEqual(
+ [None] * len(archives),
+ [copier.getTargetArchive(archive) for archive in archives])
+
+ def test_getTargetArchive_finds_matching_archive(self):
+ # When copying across archives, getTargetArchive looks for an
+ # archive for the target series with the same purpose as the
+ # original archive.
+ source_series = self.factory.makeDistroSeries()
+ source_archive = self.factory.makeArchive(
+ distribution=source_series.distribution,
+ purpose=ArchivePurpose.PARTNER)
+ target_series = self.factory.makeDistroSeries()
+ target_archive = self.factory.makeArchive(
+ distribution=target_series.distribution,
+ purpose=ArchivePurpose.PARTNER)
+
+ copier = CustomUploadsCopier(target_series)
+ self.assertEqual(
+ target_archive, copier.getTargetArchive(source_archive))
+
+ def test_getTargetArchive_returns_None_if_no_archive_matches(self):
+ # If the target series has no archive to match the archive that
+ # the original upload was for, it returns None.
+ source_series = self.factory.makeDistroSeries()
+ source_archive = self.factory.makeArchive(
+ distribution=source_series.distribution,
+ purpose=ArchivePurpose.PARTNER)
+ target_series = self.factory.makeDistroSeries()
+ copier = CustomUploadsCopier(target_series)
+ self.assertIs(None, copier.getTargetArchive(source_archive))
+
+ def test_isObsolete_returns_False_if_no_equivalent_in_target(self):
+ # isObsolete returns False if the upload in question has no
+ # equivalent in the target series.
+ source_series = self.factory.makeDistroSeries()
+ upload = self.makeUpload(source_series)
+ target_series = self.factory.makeDistroSeries()
+ copier = CustomUploadsCopier(target_series)
+ self.assertFalse(
+ copier.isObsolete(upload, copier.getLatestUploads(target_series)))
+
+ def test_isObsolete_returns_False_if_target_has_older_equivalent(self):
+ # isObsolete returns False if the target has an equivalent of
+ # the upload in question, but it's older than the version the
+ # source series has.
+ source_series = self.factory.makeDistroSeries()
+ target_series = self.factory.makeDistroSeries()
+ self.makeUpload(
+ target_series, package_name='installer', arch='ppc64')
+ source_upload = self.makeUpload(
+ source_series, package_name='installer', arch='ppc64')
+ copier = CustomUploadsCopier(target_series)
+ self.assertFalse(
+ copier.isObsolete(
+ source_upload, copier.getLatestUploads(target_series)))
+
+ def test_isObsolete_returns_True_if_target_has_newer_equivalent(self):
+ # isObsolete returns True if the target series already has a
+ # newer equivalent of the upload in question (as would be the
+ # case, for instance, if the upload had already been copied).
+ source_series = self.factory.makeDistroSeries()
+ source_upload = self.makeUpload(
+ source_series, package_name='installer', arch='alpha')
+ target_series = self.factory.makeDistroSeries()
+ self.makeUpload(
+ target_series, package_name='installer', arch='alpha')
+ copier = CustomUploadsCopier(target_series)
+ self.assertTrue(
+ copier.isObsolete(
+ source_upload, copier.getLatestUploads(target_series)))
+
+ def test_copyUpload_creates_upload(self):
+ # copyUpload creates a new upload that's very similar to the
+ # original, but for the target series.
+ original_upload = self.makeUpload()
+ target_series = self.factory.makeDistroSeries()
+ copier = CustomUploadsCopier(target_series)
+ copied_upload = copier.copyUpload(original_upload)
+ self.assertEqual([copied_upload], list_custom_uploads(target_series))
+ self.assertNotEqual(
+ original_upload.packageupload, copied_upload.packageupload)
+ self.assertEqual(
+ original_upload.customformat, copied_upload.customformat)
+ self.assertEqual(
+ original_upload.libraryfilealias, copied_upload.libraryfilealias)
+ self.assertEqual(
+ original_upload.packageupload.changesfile,
+ copied_upload.packageupload.changesfile)
+ self.assertEqual(
+ original_upload.packageupload.pocket,
+ copied_upload.packageupload.pocket)
+
+ def test_copyUpload_accepts_upload(self):
+ # Uploads created by copyUpload are automatically accepted.
+ original_upload = self.makeUpload()
+ target_series = self.factory.makeDistroSeries()
+ copier = CustomUploadsCopier(target_series)
+ copied_upload = copier.copyUpload(original_upload)
+ self.assertEqual(
+ PackageUploadStatus.ACCEPTED, copied_upload.packageupload.status)
+
+ def test_copyUpload_does_not_copy_if_no_archive_matches(self):
+ # If getTargetArchive does not find an appropriate target
+ # archive, copyUpload does nothing.
+ source_series = self.factory.makeDistroSeries()
+ upload = self.makeUpload(distroseries=source_series)
+ target_series = self.factory.makeDistroSeries()
+ copier = CustomUploadsCopier(target_series)
+ copier.getTargetArchive = FakeMethod(result=None)
+ self.assertIs(None, copier.copyUpload(upload))
+ self.assertEqual([], list_custom_uploads(target_series))
=== modified file 'lib/lp/soyuz/tests/test_publishing.py'
--- lib/lp/soyuz/tests/test_publishing.py 2011-08-03 11:00:11 +0000
+++ lib/lp/soyuz/tests/test_publishing.py 2011-08-11 10:24:38 +0000
@@ -159,7 +159,7 @@
signing_key = self.person.gpg_keys[0]
package_upload = distroseries.createQueueEntry(
pocket, archive, changes_file_name, changes_file_content,
- signing_key)
+ signing_key=signing_key)
status_to_method = {
PackageUploadStatus.DONE: 'setDone',
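The isObsolete tests above pin down a simple rule: an upload is skipped only when the target series already holds a strictly newer equivalent; no equivalent, or an older one, means the upload should be copied. A minimal sketch of that rule, using illustrative names (`Upload`, `key`, integer versions) that are assumptions rather than the actual `CustomUploadsCopier` API, and plain integer comparison in place of real Debian version ordering:

```python
from collections import namedtuple

# Illustrative stand-in for a custom upload; not the Launchpad model class.
Upload = namedtuple('Upload', ['package_name', 'arch', 'version'])


def key(upload):
    # Uploads are "equivalent" when they share package name and
    # architecture, mirroring the tarball naming scheme
    # <package>_<version>_<architecture>.tar.gz.
    return (upload.package_name, upload.arch)


def get_latest_uploads(uploads):
    # Map each equivalence key to the newest upload seen in a series.
    latest = {}
    for upload in uploads:
        existing = latest.get(key(upload))
        if existing is None or upload.version > existing.version:
            latest[key(upload)] = upload
    return latest


def is_obsolete(upload, latest_uploads):
    # Obsolete only if the target already has a strictly newer
    # equivalent; anything else is a candidate for copying.
    equivalent = latest_uploads.get(key(upload))
    return equivalent is not None and equivalent.version > upload.version
```

Under these assumptions, an older source upload is obsolete, while a newer one, or one with no equivalent in the target, is not.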