Re: Python rebuilds (was: Archive deletion strategy)
On Aug 03, 2010, at 11:12 AM, Julian Edwards wrote:
>On Tuesday 03 August 2010 01:21:36 James Westby wrote:
>> On Mon, 2 Aug 2010 18:00:40 -0400, Barry Warsaw
>> <barry@xxxxxxxxxxxxx> wrote:
>> > Yep. I think PPAs are *usually* used to provide packages to
>> > users. In my case, I'm using them primarily as a kind of test
>> > build farm of a subset of the archive. This has other problems
>> > though (e.g. overwhelming the build machines) so while it's
>> > convenient, it might not be exactly the right (or typical) use of
>> > the resources, even though you and Julian keep insisting it's
>> > okay :).
>>
>> Julian was saying the other day that he doesn't think it is the
>> correct strategy, and that you should be using a COPY archive to do
>> these tests. Perhaps you should discuss it with him again and
>> consider switching if that would make more sense.
>
>I spoke to doko and he said that they weren't using COPY archives
>because it's harder to select which packages they want to build.
>
>This isn't a problem for me provided we know _in advance_ of doing
>this so that we can score the PPA's builds down globally. Otherwise,
>the rebuild ends up DOSing the build farm.
For which I'm truly sorry.  When I started all this, I didn't really
understand the impact on the system of requesting all those builds.  I do
now, and while I don't anticipate a huge number of package requests in the
future (mostly occasional resyncs, now that both PPAs are almost caught up),
I will definitely give the Soyuzeers fair warning beforehand.
A few other observations:
* I suspected, and Julian confirmed on IRC, that queue jumping is possible.
  I noticed that a gwibber resync requested late last week still hasn't been
  rebuilt, even though it's been tantalizingly close a few times.  This is
  because the PPA got scored down and other package rebuilds have been
  jumping ahead.
* For my particular purposes this cycle, I'm with Doko in that I needed only
  a subset of packages copied from main and universe, specifically all
  packages that build-depend on python-all*.  I have one PPA for main
  packages and one for universe packages with that build-dependency, plus a
  set of scripts[1] to manage them (i.e. get status, request resyncs for
  out-of-date packages, etc.); the first sketch after this list shows the
  version-comparison idea.  The initial sync requests were fairly large
  (~170 packages in main and maybe 500+ in universe, IIRC).  We won't talk
  about the accidental 2500+ request. ;)
* I wrote the scripts because I was getting far too many timeouts using the
  web interface, and the batching there is inconvenient.  Fortunately, the
  API is rich enough (once I understood what was going on) to do what I
  wanted; the second sketch after this list shows how a resync request looks
  through it.
* The most important thing for me in this experiment is the ability to set
  up the archive dependencies correctly.  Doko has a toolchain PPA with
  Python 2.7 built, and I have a dependent PPA with just the three packages
  that enable Python 2.7 support.  The main PPA mentioned above depends on
  both of these, and the universe PPA depends on all of them (the third
  sketch below shows how to read that chain back through the API).  While a
  bit complicated to set up, this ensures that packages get built against
  Python 2.7.  The catch is that when newer python-defaults, python-support,
  or python-central packages are uploaded, my PPA versions get ignored and
  the packages in the downstream PPAs don't get built correctly.
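
To make the second bullet above a bit more concrete, here is a minimal
sketch (not the actual pydeps scripts, and the PPA name is a placeholder) of
how launchpadlib can compare what's published in the primary archive against
a PPA to find packages that need a resync:

    from launchpadlib.launchpad import Launchpad

    lp = Launchpad.login_with('pydeps-sketch', 'production')
    ubuntu = lp.distributions['ubuntu']
    series = ubuntu.current_series
    primary = ubuntu.main_archive
    # 'python2.7-main' is a made-up PPA name, for illustration only.
    ppa = lp.people['barry'].getPPAByName(name='python2.7-main')

    def latest_version(archive, source_name):
        """Return the currently published version of a source, or None."""
        pubs = archive.getPublishedSources(
            source_name=source_name, distro_series=series,
            status='Published', exact_match=True)
        if pubs.total_size == 0:
            return None
        return pubs[0].source_package_version

    # The real package list would come from scanning build-deps on python-all*.
    for name in ['gwibber', 'python-central']:
        in_primary = latest_version(primary, name)
        in_ppa = latest_version(ppa, name)
        if in_primary != in_ppa:
            print('%s needs a resync: %s -> %s' % (name, in_ppa, in_primary))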
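
Requesting the resync itself through the API is what avoids the web UI
timeouts mentioned in the third bullet.  A hedged sketch using the current
API's Archive.copyPackage call (the 2010-era scripts may well have used
different calls; copying only the source forces a fresh build in the PPA):

    from launchpadlib.launchpad import Launchpad

    lp = Launchpad.login_with('pydeps-sketch', 'production')
    ubuntu = lp.distributions['ubuntu']
    primary = ubuntu.main_archive
    # Same made-up PPA name as in the previous sketch.
    ppa = lp.people['barry'].getPPAByName(name='python2.7-main')

    # Look up the current publication in the primary archive...
    pub = primary.getPublishedSources(
        source_name='gwibber', distro_series=ubuntu.current_series,
        status='Published', exact_match=True)[0]

    # ...and ask Launchpad to copy that exact version into the PPA.
    ppa.copyPackage(
        source_name=pub.source_package_name,
        version=pub.source_package_version,
        from_archive=primary,
        to_pocket='Release',
        to_series=ubuntu.current_series.name,
        include_binaries=False)  # sources only, so the PPA rebuilds it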
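
Finally, for the dependency chain in the last bullet: archive dependencies
are normally managed from a PPA's "Edit dependencies" page, but (assuming
the API still exposes the dependencies collection on archives, which is my
reading of it) they can at least be read back and sanity-checked like this:

    from launchpadlib.launchpad import Launchpad

    lp = Launchpad.login_with('pydeps-sketch', 'production')

    def show_dependencies(owner, ppa_name):
        """Print the archive dependencies recorded for one PPA."""
        ppa = lp.people[owner].getPPAByName(name=ppa_name)
        print(ppa.displayname)
        for dep in ppa.dependencies:
            print('  -> %s (%s pocket)'
                  % (dep.dependency.displayname, dep.pocket))

    # Made-up owner/PPA names; in my case the chain is roughly
    # universe PPA -> main PPA -> python2.7 packaging PPA -> doko's toolchain PPA.
    show_dependencies('barry', 'python2.7-universe')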
I don't know how often toolchain rebuild experiments like this will happen.
I've certainly learned a lot, and I appreciate Julian, JamesW, Aaron, et al's
patience, good cheer, forgiveness and helpfulness in this process.
-Barry
[1] lp:~barry/+junk/pydeps