[Merge] lp:~stevenk/launchpad/bpb-currentcomponent-assertion-part-2 into lp:launchpad
Steve Kowalik has proposed merging lp:~stevenk/launchpad/bpb-currentcomponent-assertion-part-2 into lp:launchpad with lp:~stevenk/launchpad/bpb-currentcomponent-assertion as a prerequisite.
Requested reviews:
Launchpad code reviewers (launchpad-reviewers)
For more details, see:
https://code.launchpad.net/~stevenk/launchpad/bpb-currentcomponent-assertion-part-2/+merge/45693
This branch does more of the preparation work that was started in https://code.launchpad.net/~stevenk/launchpad/bpb-currentcomponent-assertion/+merge/45210. It kills build-estimated-dispatch-time.txt and xx-build-redirect.txt, converting them both to unit tests. I also did a few drive-by code and comment cleanups where I noticed them.
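For reviewers who haven't seen the pattern, the conversion looks roughly like the condensed sketch below. It is illustrative only, not an excerpt from the diff: the class and test names are hypothetical, and it reuses only helpers the new tests already depend on (TestCaseWithFactory, SoyuzTestPublisher, person_logged_in, LaunchpadFunctionalLayer, ADMIN_EMAIL).

    # Hypothetical, condensed example; the real conversions live in
    # lib/lp/soyuz/tests/test_build.py and test_build_start_estimation.py.
    from zope.component import getUtility

    from canonical.testing.layers import LaunchpadFunctionalLayer
    from lp.registry.interfaces.person import IPersonSet
    from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
    from lp.testing import person_logged_in, TestCaseWithFactory
    from lp.testing.sampledata import ADMIN_EMAIL


    class TestBuildTitleExample(TestCaseWithFactory):
        """Replaces a '>>> print firefox_build.title' style doctest check."""

        layer = LaunchpadFunctionalLayer

        def setUp(self):
            super(TestBuildTitleExample, self).setUp()
            admin = getUtility(IPersonSet).getByEmail(ADMIN_EMAIL)
            with person_logged_in(admin):
                # The same sampledata setup the doctest performed inline.
                self.publisher = SoyuzTestPublisher()
                self.publisher.prepareBreezyAutotest()

        def test_title_mentions_source_package(self):
            spph = self.publisher.getPubSource(sourcename='mozilla-firefox')
            [build] = spph.createMissingBuilds()
            # The doctest printed the title and matched it by eye; here we
            # assert on the part we care about.
            self.assertTrue('build of mozilla-firefox' in build.title)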
Also allow me to apologise in advance for how large the diff for this branch is.
--
https://code.launchpad.net/~stevenk/launchpad/bpb-currentcomponent-assertion-part-2/+merge/45693
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~stevenk/launchpad/bpb-currentcomponent-assertion-part-2 into lp:launchpad.
=== modified file 'lib/lp/soyuz/browser/tests/test_build_views.py'
--- lib/lp/soyuz/browser/tests/test_build_views.py 2011-01-10 13:27:45 +0000
+++ lib/lp/soyuz/browser/tests/test_build_views.py 2011-01-10 13:27:52 +0000
@@ -281,3 +281,14 @@
job.suspend()
self.assertEquals(job.status, JobStatus.SUSPENDED)
self.assertFalse(view.dispatch_time_estimate_available)
+
+ def test_old_url_redirection(self):
+ # When users go to the old build URLs, they are redirected to the
+ # equivalent new URLs.
+ build = self.factory.makeBinaryPackageBuild()
+ build.queueBuild()
+ url = "http://launchpad.dev/+builds/+build/%s" % build.id
+ expected_url = canonical_url(build)
+ browser = self.getUserBrowser(url)
+ self.assertEquals(browser.url, expected_url)
+
=== modified file 'lib/lp/soyuz/doc/binarypackagebuild.txt'
--- lib/lp/soyuz/doc/binarypackagebuild.txt 2010-10-18 22:24:59 +0000
+++ lib/lp/soyuz/doc/binarypackagebuild.txt 2011-01-10 13:27:52 +0000
@@ -1,400 +1,3 @@
-= The Build table =
-
-The build table contains the information pertaining to a given build
-of a sourcepackagerelease on a distroarchseries.
-
-The build record may have many BinaryPackageRelease records pointing
-at it and it may reference a build log if the build was done on a
-launchpad build daemon.
-
- # Create a 'mozilla-firefox' build in ubuntutest/breezy-autotest/i386.
- >>> from zope.security.proxy import removeSecurityProxy
- >>> from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
- >>> login('foo.bar@xxxxxxxxxxxxx')
- >>> test_publisher = SoyuzTestPublisher()
- >>> test_publisher.prepareBreezyAutotest()
- >>> source = test_publisher.getPubSource(
- ... sourcename='mozilla-firefox', version='0.9')
- >>> binaries = test_publisher.getPubBinaries(
- ... binaryname='firefox', pub_source=source)
- >>> [firefox_build] = source.getBuilds()
- >>> job = firefox_build.buildqueue_record.job
- >>> removeSecurityProxy(job).start()
- >>> removeSecurityProxy(job).complete()
- >>> login(ANONYMOUS)
-
-A build has a title which describes the context source version and in
-which series and architecture it is targeted for.
-
- >>> print firefox_build.title
- i386 build of mozilla-firefox 0.9 in ubuntutest breezy-autotest RELEASE
-
-A build directly links to the archive, distribution, distroseries,
-distroarchseries, pocket in its context and also the source version
-that generated it.
-
- >>> print firefox_build.archive.displayname
- Primary Archive for Ubuntu Test
-
- >>> print firefox_build.distribution.displayname
- ubuntutest
-
- >>> print firefox_build.distro_series.displayname
- Breezy Badger Autotest
-
- >>> print firefox_build.distro_arch_series.displayname
- ubuntutest Breezy Badger Autotest i386
-
- >>> firefox_build.pocket
- <DBItem PackagePublishingPocket.RELEASE, (0) Release>
-
- >>> print firefox_build.arch_tag
- i386
-
- >>> firefox_build.is_virtualized
- False
-
- >>> print firefox_build.source_package_release.title
- mozilla-firefox - 0.9
-
-A build has an state that represents in which stage it is in a
-life-cycle that goes from PENDING to BUILDING until FULLYBUILT or one
-of the intermediate failed states (FAILEDTOBUILD, MANUALDEPWAIT,
-CHROOTWAIT, SUPERSEDED and FAILEDTOUPLOAD).
-
- >>> firefox_build.status
- <DBItem BuildStatus.FULLYBUILT, (1) Successfully built>
-
-Builds which were already processed also offer additional information
-about its process such as the time it was started and finished and its
-'log' and 'upload_changesfile' as librarian files.
-
- >>> firefox_build.was_built
- True
-
- >>> firefox_build.date_started
- datetime.datetime(...)
-
- >>> firefox_build.date_finished
- datetime.datetime(...)
-
- >>> firefox_build.duration
- datetime.timedelta(...)
-
- >>> print firefox_build.log.filename
- buildlog_ubuntutest-breezy-autotest-i386.mozilla-firefox_0.9_FULLYBUILT.txt.gz
-
- >>> print firefox_build.log_url
- http://launchpad.dev/ubuntutest/+source/mozilla-firefox/0.9/+build/.../+files/buildlog_ubuntutest-breezy-autotest-i386.mozilla-firefox_0.9_FULLYBUILT.txt.gz
-
- >>> print firefox_build.upload_changesfile.filename
- firefox_0.9_i386.changes
-
- >>> print firefox_build.changesfile_url
- http://launchpad.dev/ubuntutest/+source/mozilla-firefox/0.9/+build/.../+files/firefox_0.9_i386.changes
-
-The 'firefox_build' is already finished and requesting the estimated build
-start time makes no sense. Hence an exception is raised.
-
- >>> firefox_build.buildqueue_record.getEstimatedJobStartTime()
- Traceback (most recent call last):
- ...
- AssertionError: The start time is only estimated for pending jobs.
-
-On a build job in state `NEEDSBUILD` we can ask for its estimated
-build start time.
-
- # Create a brand new pending build.
- >>> login('foo.bar@xxxxxxxxxxxxx')
- >>> source = test_publisher.getPubSource(sourcename='pending-source')
- >>> [pending_build] = source.createMissingBuilds()
- >>> login(ANONYMOUS)
-
- >>> pending_build.buildqueue_record.getEstimatedJobStartTime()
- datetime.datetime(...)
-
-The currently published component is provided via the 'current_component'
-property. It looks over the publishing records and finds the current
-publication of the source in question.
-
- >>> print firefox_build.current_component.name
- main
-
-It is not necessarily the same as:
-
- >>> print firefox_build.source_package_release.component.name
- main
-
-which is the component the source was originally uploaded to, before
-any overriding action.
-
-The build can report any corresponding uploads using the package_upload
-property:
-
- >>> firefox_build.package_upload
- <PackageUpload ...>
-
- >>> firefox_build.package_upload.status
- <DBItem PackageUploadStatus.DONE, (3) Done>
-
-If the build does not have any uploads, None is returned:
-
- >>> from lp.buildmaster.enums import BuildStatus
- >>> from lp.soyuz.interfaces.binarypackagebuild import (
- ... IBinaryPackageBuildSet)
- >>> at_build = getUtility(IBinaryPackageBuildSet).getByBuildID(15)
- >>> print at_build.package_upload
- None
-
-Test "retry" functionality:
-
- >>> firefox_build.can_be_retried
- False
-
- >>> frozen_build = getUtility(IBinaryPackageBuildSet).getByBuildID(9)
- >>> frozen_build.title
- u'i386 build of pmount 0.1-1 in ubuntu warty RELEASE'
- >>> frozen_build.status.title
- 'Failed to build'
- >>> frozen_build.can_be_retried
- False
-
-See section 'AssertionErrors in IBinaryPackageBuild' for further
-documentation about consequences of an denied 'retry' action.
-
-Let's retrieve a build record that can be retried.
-
- >>> active_build = getUtility(IBinaryPackageBuildSet).getByBuildID(9)
-
- >>> print active_build.title
- i386 build of pmount 0.1-1 in ubuntu warty RELEASE
-
- >>> print active_build.status.name
- FAILEDTOBUILD
-
- >>> print active_build.builder.name
- bob
-
- >>> print active_build.log.filename
- netapplet-1.0.0.tar.gz
-
-At this point, it's also convenient to test if any content can be
-stored as 'upload_log' using storeUploadLog().
-
-This method will upload a file to librarian with the given content and
-update the context `upload_log` reference.
-
-We store such information persistently to allow users to revisit it,
-and potentially fix any issue, after the build has been processed.
-
-We continue to send the upload information with the
-build-failure-notification for FAILEDTOUPLOAD builds, see
-build-failedtoupload-workflow.txt for further information.
-
- >>> print active_build.upload_log
- None
-
- >>> print active_build.upload_log_url
- None
-
- >>> active_build.storeUploadLog('sample upload log.')
- >>> print active_build.upload_log.filename
- upload_9_log.txt
-
- >>> print active_build.upload_log_url
- http://launchpad.dev/ubuntu/+source/pmount/0.1-1/+build/9/+files/upload_9_log.txt
-
-Once the transaction is committed, the file is available in the
-librarian, and we can retrieve its contents.
-
- >>> transaction.commit()
- >>> active_build.upload_log.open()
- >>> print active_build.upload_log.read()
- sample upload log.
-
-The 'upload_log' library file privacy is set according to the build
-target archive. For an example, see lp.buildmaster.tests.test_packagebuild
-
-Since ubuntu/warty is already released the failed build can't be
-retried.
-
- >>> active_build.can_be_retried
- False
-
-We will reactivate ubuntu/warty allowing the pmount build to be
-retried.
-
- >>> from lp.registry.interfaces.distribution import IDistributionSet
- >>> from lp.registry.interfaces.series import SeriesStatus
- >>> login('foo.bar@xxxxxxxxxxxxx')
- >>> ubuntu = getUtility(IDistributionSet)['ubuntu']
- >>> warty = ubuntu.getSeries('warty')
- >>> warty.status = SeriesStatus.DEVELOPMENT
- >>> flush_database_updates()
- >>> login(ANONYMOUS)
-
- >>> active_build.can_be_retried
- True
-
-Before we actually retry the build on hand let's set its start time.
-This will allow us to observe the fact that a build retry does not
-change the start time if it was set already.
-
- >>> from datetime import datetime, timedelta
- >>> import pytz
- >>> UTC = pytz.timezone('UTC')
- >>> time_now = datetime.now(UTC)
- >>> unsecured_build = removeSecurityProxy(active_build)
- >>> unsecured_build.date_first_dispatched = time_now
- >>> active_build.date_first_dispatched == time_now
- True
-
-Re-trying builds requires the user to be logged in as an admin (including
-buildd admin) to gain launchpad.Edit on the build record. As an anonymous
-user, retrying will fail:
-
- >>> active_build.retry()
- Traceback (most recent call last):
- ...
- Unauthorized:...
-
-Login as an admin and retry the Build record in question:
-
- >>> login('foo.bar@xxxxxxxxxxxxx')
- >>> active_build.retry()
-
-The build was retried but its start time remains the same.
-
- >>> active_build.date_first_dispatched == time_now
- True
-
-Build record has no history and is NEEDSBUILD and a corresponding
-BuildQueue record was created.
-
- >>> print active_build.builder
- None
-
- >>> print active_build.status.name
- NEEDSBUILD
-
- >>> print active_build.buildqueue_record
- <...BuildQueue...>
-
-'log' and 'upload_log' librarian references were removed when the
-build was retried. They will be garbage-collected later by
-'librariangc' and replaced by new ones when the build re-attempt
-finishes.
-
- >>> print active_build.log
- None
-
- >>> print active_build.upload_log
- None
-
-We will restore ubuntu/warty previously changes status, SUPPORTED, so
-it won't interfere in the next tests.
-
- >>> login('foo.bar@xxxxxxxxxxxxx')
- >>> warty.status = SeriesStatus.SUPPORTED
- >>> flush_database_updates()
- >>> login(ANONYMOUS)
-
-Initialize all the required arguments to create a binary package for a
-given build record entry.
-
- >>> from lp.soyuz.interfaces.binarypackagename import IBinaryPackageNameSet
- >>> binarypackagename = getUtility(IBinaryPackageNameSet).ensure('demo').id
- >>> version = '0.0.1-demo'
- >>> summary = 'Summmmmmmmary'
- >>> description = 'Descripppppppption'
- >>> from lp.soyuz.enums import BinaryPackageFormat
- >>> binpackageformat = BinaryPackageFormat.DEB
- >>> component = firefox_build.source_package_release.component.id
- >>> section = firefox_build.source_package_release.section.id
- >>> from lp.soyuz.enums import PackagePublishingPriority
- >>> priority = PackagePublishingPriority.STANDARD
- >>> installedsize = 0
- >>> architecturespecific = False
-
-Invoke createBinaryPackageRelease with all required arguments.
-
- # Load a build from the samepledata for creating binaries.
- >>> pmount_build = getUtility(IBinaryPackageBuildSet).getByBuildID(19)
-
- >>> bin = pmount_build.createBinaryPackageRelease(
- ... binarypackagename=binarypackagename, version=version,
- ... summary=summary, description=description,
- ... binpackageformat=binpackageformat, component=component,
- ... section=section, priority=priority, installedsize=installedsize,
- ... architecturespecific=architecturespecific)
-
- >>> from canonical.launchpad.webapp.testing import verifyObject
- >>> from lp.soyuz.interfaces.binarypackagerelease import IBinaryPackageRelease
- >>> verifyObject(IBinaryPackageRelease, bin)
- True
-
-Commit previous transaction, data we want to preserve:
-
-XXX: flush_database_updates() shouldn't be needed. This seems to be
-Bug 3989 -- StuarBishop 20060713
-
- >>> flush_database_updates()
- >>> transaction.commit()
-
-Check binarypackages property:
-
- >>> for b in pmount_build.binarypackages:
- ... b.version
- u'0.0.1-demo'
- u'0.1-1'
-
-Emulate a huge list of binaries for 'pmount':
-
- >>> bpnameset = getUtility(IBinaryPackageNameSet)
- >>> for i in range(15):
- ... version = "%d" % i
- ... binarypackagename = bpnameset.ensure("test-%d" % i).id
- ... b = pmount_build.createBinaryPackageRelease(
- ... binarypackagename=binarypackagename, version=version,
- ... summary=summary, description=description,
- ... binpackageformat=binpackageformat, component=component,
- ... section=section, priority=priority,
- ... installedsize=installedsize,
- ... architecturespecific=architecturespecific)
-
-
-Check if the property is still working:
-
- >>> pmount_build.binarypackages.count()
- 17
-
-Ensure the list is ordered by 'name'
-
- >>> for b in pmount_build.binarypackages:
- ... b.name, b.version
- (u'demo', u'0.0.1-demo')
- (u'pmount', u'0.1-1')
- (u'test-0', u'0')
- (u'test-1', u'1')
- (u'test-10', u'10')
- (u'test-11', u'11')
- (u'test-12', u'12')
- (u'test-13', u'13')
- (u'test-14', u'14')
- (u'test-2', u'2')
- (u'test-3', u'3')
- (u'test-4', u'4')
- (u'test-5', u'5')
- (u'test-6', u'6')
- (u'test-7', u'7')
- (u'test-8', u'8')
- (u'test-9', u'9')
-
-Rollback transaction to no disturb the other tests:
-
- >>> transaction.abort()
-
-
== The BuildSet Class ==
The BuildSet class gives us some useful ways to consider the
=== removed file 'lib/lp/soyuz/doc/build-estimated-dispatch-time.txt'
--- lib/lp/soyuz/doc/build-estimated-dispatch-time.txt 2010-10-09 16:36:22 +0000
+++ lib/lp/soyuz/doc/build-estimated-dispatch-time.txt 1970-01-01 00:00:00 +0000
@@ -1,178 +0,0 @@
-In order to exercise the estimation of build job start times a setup
-with one job building and another job pending/waiting is to be created.
-
-Activate the builders present in sampledata; we need to be logged in
-as a member of launchpad-buildd-admin:
-
- >>> from canonical.launchpad.ftests import login
- >>> login('celso.providelo@xxxxxxxxxxxxx')
- >>> from lp.buildmaster.interfaces.builder import IBuilderSet
- >>> builder_set = getUtility(IBuilderSet)
-
-Do we have two builders?
-
- >>> builder_set.count()
- 2
-
-These are the builders available.
-
- >>> from canonical.launchpad.ftests import syncUpdate
- >>> for b in builder_set:
- ... b.builderok = True
- ... print "builder: name='%s', id=%d" % (b.name, b.id)
- ... syncUpdate(b)
- builder: name='bob', id=1
- builder: name='frog', id=2
-
-The 'alsa-utils' package is the one to be built (in the ubuntu/hoary
-distroseries).
-
- >>> from lp.registry.interfaces.distribution import IDistributionSet
- >>> ubuntu = getUtility(IDistributionSet)['ubuntu']
- >>> hoary = ubuntu['hoary']
- >>> hoary.main_archive.require_virtualized
- False
-
- >>> from lp.registry.interfaces.pocket import (
- ... PackagePublishingPocket)
- >>> alsa_hoary = hoary.getSourcePackage('alsa-utils')
- >>> alsa_spr = alsa_hoary['1.0.9a-4'].sourcepackagerelease
- >>> print alsa_spr.title
- alsa-utils - 1.0.9a-4
-
-Create new Build and BuildQueue instances (in ubuntu/hoary/i386) for
-the pending job.
-
- >>> from datetime import timedelta
- >>> from lp.buildmaster.enums import BuildStatus
- >>> alsa_build = alsa_spr.createBuild(
- ... hoary['i386'], PackagePublishingPocket.RELEASE,
- ... hoary.main_archive)
- >>> alsa_bqueue = alsa_build.queueBuild()
- >>> alsa_bqueue.lastscore = 500
- >>> alsa_build.status = BuildStatus.NEEDSBUILD
-
-Access the currently building job via the builder.
-
- >>> from datetime import datetime
- >>> import pytz
- >>> bob_the_builder = builder_set.get(1)
- >>> cur_bqueue = bob_the_builder.currentjob
- >>> from lp.soyuz.interfaces.binarypackagebuild import (
- ... IBinaryPackageBuildSet)
- >>> cur_build = getUtility(IBinaryPackageBuildSet).getByQueueEntry(cur_bqueue)
-
-Make sure the job at hand is currently being built.
-
- >>> cur_build.status == BuildStatus.BUILDING
- True
-
-The start time estimation mechanism for a pending job N depends on
-proper "build start time" and "estimated build duration" values for
-other jobs that are either currently building or pending but ahead
-of job N in the build queue. These values will now be set for the job
-that is currently building.
-
- >>> from zope.security.proxy import removeSecurityProxy
- >>> cur_bqueue.lastscore = 1111
- >>> cur_bqueue.setDateStarted(
- ... datetime(2008, 4, 1, 10, 45, 39, tzinfo=pytz.UTC))
- >>> print cur_bqueue.date_started
- 2008-04-01 10:45:39+00:00
-
-Please note that the "estimated build duration" is an internal property
-and not meant to be viewed or modified by an end user.
-
- >>> removeSecurityProxy(cur_bqueue).estimated_duration = (
- ... timedelta(minutes=56))
-
-The estimated start time for the pending job is either now or lies
-in the future.
-
- >>> now = datetime.now(pytz.UTC)
- >>> def job_start_estimate(build):
- ... return build.buildqueue_record.getEstimatedJobStartTime()
- >>> estimate = job_start_estimate(alsa_build)
- >>> estimate > now
- True
-
-The estimated build start time may only be requested for jobs that are
-pending.
-
- >>> job_start_estimate(cur_build)
- Traceback (most recent call last):
- ...
- AssertionError: The start time is only estimated for pending jobs.
-
-Now let's add two PPA packages to the mix in order to show how builds
-associated with disabled archives get ignored when it comes to the calculation
-of estimated dispatch times.
-
-We first add a build for the 'pmount' source package to cprov's PPA.
-
- >>> from lp.registry.interfaces.person import IPersonSet
- >>> cprov = getUtility(IPersonSet).getByName('cprov')
- >>> [pmount_source] = cprov.archive.getPublishedSources(
- ... name='pmount', version='0.1-1')
- >>> pmount_spr = pmount_source.sourcepackagerelease
- >>> print pmount_spr.title
- pmount - 0.1-1
-
- >>> pmount_build = pmount_spr.createBuild(
- ... hoary['i386'], PackagePublishingPocket.RELEASE, cprov.archive)
- >>> pmount_bqueue = pmount_build.queueBuild()
- >>> pmount_bqueue.lastscore = 66
- >>> removeSecurityProxy(pmount_bqueue).estimated_duration = (
- ... timedelta(minutes=12))
- >>> pmount_build.status = BuildStatus.NEEDSBUILD
-
-Followed by another build for the 'iceweasel' source package that is added
-to mark's PPA.
-
- >>> mark = getUtility(IPersonSet).getByName('mark')
- >>> [iceweasel_source] = cprov.archive.getPublishedSources(
- ... name='iceweasel', version='1.0')
- >>> iceweasel_spr = iceweasel_source.sourcepackagerelease
- >>> print iceweasel_spr.title
- iceweasel - 1.0
-
- >>> iceweasel_build = iceweasel_spr.createBuild(
- ... hoary['i386'], PackagePublishingPocket.RELEASE, mark.archive)
- >>> iceweasel_bqueue = iceweasel_build.queueBuild()
- >>> removeSecurityProxy(iceweasel_bqueue).estimated_duration = (
- ... timedelta(minutes=48))
- >>> iceweasel_bqueue.lastscore = 666
- >>> iceweasel_build.status = BuildStatus.NEEDSBUILD
-
-Since the 'iceweasel' build has a higher score (666) than the 'pmount'
-build (66) its estimated dispatch time is essentially "now".
-
- >>> now = datetime.now(pytz.UTC)
- >>> estimate = job_start_estimate(iceweasel_build)
- >>> estimate > now
- True
- >>> estimate - now
- datetime.timedelta(0, 5, ...)
-
-The 'pmount' build comes next in the queue and its estimated dispatch
-time is the estimated build time of the 'iceweasel' package i.e. 2880
-seconds (48 minutes * 60).
-
- >>> estimate = job_start_estimate(pmount_build)
- >>> estimate > now
- True
- >>> estimate - now
- datetime.timedelta(0, 2880, ...)
-
-Now mark's PPA will be disabled. This has the effect that all builds
-associated with it (i.e. the 'iceweasel' build) are ignored while
-calculating the estimated dispatch time and the latter becomes effectively
-"now" for the 'pmount' build.
-
- >>> mark.archive.disable()
- >>> syncUpdate(mark.archive)
- >>> estimate = job_start_estimate(pmount_build)
- >>> estimate > now
- True
- >>> estimate - now
- datetime.timedelta(0, 5, ...)
=== removed file 'lib/lp/soyuz/stories/soyuz/xx-build-redirect.txt'
--- lib/lp/soyuz/stories/soyuz/xx-build-redirect.txt 2007-06-12 12:23:15 +0000
+++ lib/lp/soyuz/stories/soyuz/xx-build-redirect.txt 1970-01-01 00:00:00 +0000
@@ -1,10 +0,0 @@
-= URL redirection for build pages =
-
-When users go to the old build URLs, they are redirected to the equivalent
-new URLs.
-
- >>> anon_browser.open("http://launchpad.dev/+builds/+build/18")
- >>> anon_browser.url
- 'http://launchpad.dev/ubuntu/+source/mozilla-firefox/0.9/+build/18'
-
-
=== added file 'lib/lp/soyuz/tests/test_build.py'
--- lib/lp/soyuz/tests/test_build.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/tests/test_build.py 2011-01-10 13:27:52 +0000
@@ -0,0 +1,254 @@
+# Copyright 2011 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+__metaclass__ = type
+
+from datetime import (
+ datetime,
+ timedelta,
+ )
+import pytz
+from zope.component import getUtility
+from zope.security.proxy import removeSecurityProxy
+
+from canonical.testing.layers import LaunchpadFunctionalLayer
+from lp.buildmaster.enums import BuildStatus
+from lp.registry.interfaces.person import IPersonSet
+from lp.registry.interfaces.pocket import PackagePublishingPocket
+from lp.registry.interfaces.series import SeriesStatus
+from lp.soyuz.enums import (
+ BinaryPackageFormat,
+ PackagePublishingPriority,
+ PackageUploadStatus,
+ )
+from lp.soyuz.interfaces.publishing import PackagePublishingStatus
+from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
+from lp.testing import (
+ person_logged_in,
+ TestCaseWithFactory,
+ )
+from lp.testing.sampledata import ADMIN_EMAIL
+
+
+class TestBuild(TestCaseWithFactory):
+
+ layer = LaunchpadFunctionalLayer
+
+ def setUp(self):
+ super(TestBuild, self).setUp()
+ self.admin = getUtility(IPersonSet).getByEmail(ADMIN_EMAIL)
+ self.pf = self.factory.makeProcessorFamily()
+ pf_proc = self.pf.addProcessor(self.factory.getUniqueString(), '', '')
+ self.distroseries = self.factory.makeDistroSeries()
+ self.das = self.factory.makeDistroArchSeries(
+ distroseries=self.distroseries, processorfamily=self.pf,
+ supports_virtualized=True)
+ with person_logged_in(self.admin):
+ self.publisher = SoyuzTestPublisher()
+ self.publisher.prepareBreezyAutotest()
+ self.distroseries.nominatedarchindep = self.das
+ self.publisher.addFakeChroots(distroseries=self.distroseries)
+ self.builder = self.factory.makeBuilder(processor=pf_proc)
+
+ def test_title(self):
+        # A build has a title which describes the source version and the
+        # series and architecture it is targeted for.
+ spph = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ version="%s.1" % self.factory.getUniqueInteger(),
+ distroseries=self.distroseries)
+ [build] = spph.createMissingBuilds()
+ expected_title = '%s build of %s %s in %s %s RELEASE' % (
+ self.das.architecturetag, spph.source_package_name,
+ spph.source_package_version, self.distroseries.distribution.name,
+ self.distroseries.name)
+ self.assertEquals(build.title, expected_title)
+
+ def test_linking(self):
+        # A build links directly to the archive, distribution, distroseries,
+        # distroarchseries and pocket in its context, as well as to the
+        # source version that generated it.
+ spph = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ version="%s.1" % self.factory.getUniqueInteger(),
+ distroseries=self.distroseries)
+ [build] = spph.createMissingBuilds()
+ self.assertEquals(build.archive, self.distroseries.main_archive)
+ self.assertEquals(build.distribution, self.distroseries.distribution)
+ self.assertEquals(build.distro_series, self.distroseries)
+ self.assertEquals(build.distro_arch_series, self.das)
+ self.assertEquals(build.pocket, PackagePublishingPocket.RELEASE)
+ self.assertEquals(build.arch_tag, self.das.architecturetag)
+ self.assertTrue(build.is_virtualized)
+ self.assertEquals(
+ build.source_package_release.title, '%s - %s' % (
+ spph.source_package_name, spph.source_package_version))
+
+ def test_processed_builds(self):
+        # Builds that were already processed also offer additional
+        # information about the process, such as when they started and
+        # finished, and their 'log' and 'upload_changesfile' librarian files.
+        spn = self.factory.getUniqueString()
+        version = "%s.1" % self.factory.getUniqueInteger()
+ spph = self.publisher.getPubSource(
+ sourcename=spn, version=version,
+ distroseries=self.distroseries,
+ status=PackagePublishingStatus.PUBLISHED)
+ with person_logged_in(self.admin):
+ binary = self.publisher.getPubBinaries(binaryname=spn,
+ distroseries=self.distroseries, pub_source=spph,
+ version=version, builder=self.builder)
+ build = binary[0].binarypackagerelease.build
+ self.assertTrue(build.was_built)
+ self.assertEquals(
+ build.package_upload.status, PackageUploadStatus.DONE)
+ self.assertEquals(
+ build.date_started, datetime(
+ 2008, 01, 01, 0, 0, 0, tzinfo=pytz.UTC))
+ self.assertEquals(
+ build.date_finished, datetime(
+ 2008, 01, 01, 0, 5, 0, tzinfo=pytz.UTC))
+ self.assertEquals(build.duration, timedelta(minutes=5))
+ expected_buildlog = 'buildlog_%s-%s-%s.%s_%s_FULLYBUILT.txt.gz' % (
+ self.distroseries.distribution.name, self.distroseries.name,
+ self.das.architecturetag, spn, version)
+ self.assertEquals(build.log.filename, expected_buildlog)
+ url_start = (
+ 'http://launchpad.dev/%s/+source/%s/%s/+build/%s/+files' % (
+ self.distroseries.distribution.name, spn, version, build.id))
+ expected_buildlog_url = '%s/%s' % (url_start, expected_buildlog)
+ self.assertEquals(build.log_url, expected_buildlog_url)
+ expected_changesfile = '%s_%s_%s.changes' % (
+ spn, version, self.das.architecturetag)
+ self.assertEquals(
+ build.upload_changesfile.filename, expected_changesfile)
+ expected_changesfile_url = '%s/%s' % (url_start, expected_changesfile)
+ self.assertEquals(build.changesfile_url, expected_changesfile_url)
+        # Since this build was successful, it cannot be retried.
+ self.assertFalse(build.can_be_retried)
+
+ def test_current_component(self):
+ # The currently published component is provided via the
+ # 'current_component' property. It looks over the publishing records
+ # and finds the current publication of the source in question.
+ spph = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ version="%s.1" % self.factory.getUniqueInteger(),
+ distroseries=self.distroseries)
+ [build] = spph.createMissingBuilds()
+ self.assertEquals(build.current_component.name, 'main')
+        # This is not necessarily the same as the original upload component.
+ self.assertEquals(build.source_package_release.component.name, 'main')
+        # If the build has no uploads, its package_upload is None.
+ self.assertEquals(build.package_upload, None)
+
+ def test_retry_for_released_series(self):
+        # Builds cannot be retried for released distroseries.
+ distroseries = self.factory.makeDistroSeries()
+ das = self.factory.makeDistroArchSeries(
+ distroseries=distroseries, processorfamily=self.pf,
+ supports_virtualized=True)
+ with person_logged_in(self.admin):
+ distroseries.nominatedarchindep = das
+ distroseries.status = SeriesStatus.OBSOLETE
+ self.publisher.addFakeChroots(distroseries=distroseries)
+ spph = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ version="%s.1" % self.factory.getUniqueInteger(),
+ distroseries=distroseries)
+ [build] = spph.createMissingBuilds()
+ self.assertFalse(build.can_be_retried)
+
+ def test_retry(self):
+        # A build that failed can be retried.
+ spph = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ version="%s.1" % self.factory.getUniqueInteger(),
+ distroseries=self.distroseries)
+ [build] = spph.createMissingBuilds()
+ with person_logged_in(self.admin):
+ build.status = BuildStatus.FAILEDTOBUILD
+ self.assertTrue(build.can_be_retried)
+
+ def test_uploadlog(self):
+ # Test if the upload log can be attached to a build.
+ spph = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ version="%s.1" % self.factory.getUniqueInteger(),
+ distroseries=self.distroseries)
+ [build] = spph.createMissingBuilds()
+ self.assertEquals(build.upload_log, None)
+ self.assertEquals(build.upload_log_url, None)
+ build.storeUploadLog('sample upload log')
+ expected_filename = 'upload_%s_log.txt' % build.id
+ self.assertEquals(build.upload_log.filename, expected_filename)
+ url_start = (
+ 'http://launchpad.dev/%s/+source/%s/%s/+build/%s/+files' % (
+ self.distroseries.distribution.name, spph.source_package_name,
+ spph.source_package_version, build.id))
+ expected_url = '%s/%s' % (url_start, expected_filename)
+ self.assertEquals(build.upload_log_url, expected_url)
+
+ def test_retry_does_not_modify_first_dispatch(self):
+ # Retrying a build does not modify the first dispatch time of the
+ # build
+ spph = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ version="%s.1" % self.factory.getUniqueInteger(),
+ distroseries=self.distroseries)
+ [build] = spph.createMissingBuilds()
+ now = datetime.now(pytz.UTC)
+ with person_logged_in(self.admin):
+ build.status = BuildStatus.FAILEDTOBUILD
+            # The build must not still be queued if we're going to retry it.
+ build.buildqueue_record.destroySelf()
+ removeSecurityProxy(build).date_first_dispatched = now
+ with person_logged_in(self.admin):
+ build.retry()
+ self.assertEquals(build.status, BuildStatus.NEEDSBUILD)
+ self.assertEquals(build.date_first_dispatched, now)
+ self.assertEquals(build.log, None)
+ self.assertEquals(build.upload_log, None)
+
+ def test_create_bpr(self):
+ # Test that we can create a BPR from a given build.
+ spn = self.factory.getUniqueString()
+ version = "%s.1" % self.factory.getUniqueInteger()
+ bpn = self.factory.makeBinaryPackageName(name=spn)
+ spph = self.publisher.getPubSource(
+ sourcename=spn, version=version, distroseries=self.distroseries)
+ [build] = spph.createMissingBuilds()
+ binary = build.createBinaryPackageRelease(
+ binarypackagename=bpn, version=version, summary='',
+ description='', binpackageformat=BinaryPackageFormat.DEB,
+ component=spph.sourcepackagerelease.component.id,
+ section=spph.sourcepackagerelease.section.id,
+ priority=PackagePublishingPriority.STANDARD, installedsize=0,
+ architecturespecific=False)
+ self.assertEquals(build.binarypackages.count(), 1)
+ self.assertEquals(list(build.binarypackages), [binary])
+
+ def test_multiple_create_bpr(self):
+ # Test that we can create multiple BPRs from a given build
+ spn = self.factory.getUniqueString()
+ version = "%s.1" % self.factory.getUniqueInteger()
+ spph = self.publisher.getPubSource(
+ sourcename=spn, version=version, distroseries=self.distroseries)
+ [build] = spph.createMissingBuilds()
+ expected_names = []
+ for i in range(15):
+ bpn_name = '%s-%s' % (spn, i)
+ bpn = self.factory.makeBinaryPackageName(bpn_name)
+ expected_names.append(bpn_name)
+ binary = build.createBinaryPackageRelease(
+ binarypackagename=bpn, version=str(i), summary='',
+ description='', binpackageformat=BinaryPackageFormat.DEB,
+ component=spph.sourcepackagerelease.component.id,
+ section=spph.sourcepackagerelease.section.id,
+ priority=PackagePublishingPriority.STANDARD, installedsize=0,
+ architecturespecific=False)
+ self.assertEquals(build.binarypackages.count(), 15)
+ bin_names = [b.name for b in build.binarypackages]
+        # Verify that .binarypackages returns the packages sorted by name.
+ expected_names.sort()
+ self.assertEquals(bin_names, expected_names)
=== added file 'lib/lp/soyuz/tests/test_build_start_estimation.py'
--- lib/lp/soyuz/tests/test_build_start_estimation.py 1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/tests/test_build_start_estimation.py 2011-01-10 13:27:52 +0000
@@ -0,0 +1,88 @@
+# Copyright 2011 Canonical Ltd. This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+__metaclass__ = type
+
+from datetime import (
+ datetime,
+ timedelta,
+ )
+import pytz
+from zope.component import getUtility
+from zope.security.proxy import removeSecurityProxy
+
+from canonical.testing.layers import LaunchpadFunctionalLayer
+from lp.buildmaster.interfaces.builder import IBuilderSet
+from lp.registry.interfaces.person import IPersonSet
+from lp.soyuz.tests.test_publishing import SoyuzTestPublisher
+from lp.testing import (
+ person_logged_in,
+ TestCaseWithFactory,
+ )
+from lp.testing.sampledata import (
+ ADMIN_EMAIL,
+ BOB_THE_BUILDER_NAME,
+ )
+
+
+class TestBuildStartEstimation(TestCaseWithFactory):
+
+ layer = LaunchpadFunctionalLayer
+
+ def setUp(self):
+ super(TestBuildStartEstimation, self).setUp()
+ self.admin = getUtility(IPersonSet).getByEmail(ADMIN_EMAIL)
+ with person_logged_in(self.admin):
+ self.publisher = SoyuzTestPublisher()
+ self.publisher.prepareBreezyAutotest()
+ for buildd in getUtility(IBuilderSet):
+ buildd.builderok = True
+ self.distroseries = self.factory.makeDistroSeries()
+ self.bob = getUtility(IBuilderSet).getByName(BOB_THE_BUILDER_NAME)
+ das = self.factory.makeDistroArchSeries(
+ distroseries=self.distroseries,
+ processorfamily=self.bob.processor.id,
+ architecturetag='i386', supports_virtualized=True)
+ with person_logged_in(self.admin):
+ self.distroseries.nominatedarchindep = das
+ self.publisher.addFakeChroots(distroseries=self.distroseries)
+
+ def job_start_estimate(self, build):
+ return build.buildqueue_record.getEstimatedJobStartTime()
+
+ def test_estimation(self):
+ pkg = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ distroseries=self.distroseries)
+ build = pkg.createMissingBuilds()[0]
+ now = datetime.now(pytz.UTC)
+ estimate = self.job_start_estimate(build)
+ self.assertTrue(estimate > now)
+
+ def test_disabled_archives(self):
+ pkg1 = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ distroseries=self.distroseries)
+ build1 = pkg1.createMissingBuilds()[0]
+ build1.buildqueue_record.lastscore = 1000
+ # No user-serviceable parts inside
+ removeSecurityProxy(build1.buildqueue_record).estimated_duration = (
+ timedelta(minutes=10))
+ pkg2 = self.publisher.getPubSource(
+ sourcename=self.factory.getUniqueString(),
+ distroseries=self.distroseries)
+ build2 = pkg2.createMissingBuilds()[0]
+ build2.buildqueue_record.lastscore = 100
+ now = datetime.now(pytz.UTC)
+        # Since build1 is higher priority, its estimated dispatch time is now.
+ estimate = self.job_start_estimate(build1)
+ self.assertEquals((estimate - now).seconds, 5)
+        # And build2 is next, so it must take build1's duration into account.
+ estimate = self.job_start_estimate(build2)
+ self.assertEquals((estimate - now).seconds, 600)
+ # If we disable build1's archive, build2 is next
+ with person_logged_in(self.admin):
+ build1.archive.disable()
+ estimate = self.job_start_estimate(build2)
+ self.assertEquals((estimate - now).seconds, 5)
+
=== modified file 'lib/lp/soyuz/tests/test_doc.py'
--- lib/lp/soyuz/tests/test_doc.py 2010-11-06 12:50:22 +0000
+++ lib/lp/soyuz/tests/test_doc.py 2011-01-10 13:27:52 +0000
@@ -196,11 +196,6 @@
setUp=manageChrootSetup,
layer=LaunchpadZopelessLayer,
),
- 'build-estimated-dispatch-time.txt': LayeredDocFileSuite(
- '../doc/build-estimated-dispatch-time.txt',
- setUp=builddmasterSetUp,
- layer=LaunchpadZopelessLayer,
- ),
'package-arch-specific.txt': LayeredDocFileSuite(
'../doc/package-arch-specific.txt',
setUp=builddmasterSetUp,
=== modified file 'lib/lp/testing/factory.py'
--- lib/lp/testing/factory.py 2011-01-06 16:42:50 +0000
+++ lib/lp/testing/factory.py 2011-01-10 13:27:52 +0000
@@ -2127,7 +2127,7 @@
processorfamily = self.makeProcessorFamily()
if owner is None:
owner = self.makePerson()
- # XXX: architecturetag & processerfamily are tightly coupled. It's
+ # XXX: architecturetag & processorfamily are tightly coupled. It's
# wrong to just make a fresh architecture tag without also making a
# processor family to go with it (ideally with processors!)
if architecturetag is None: