ubuntu-phone team mailing list archive

Re: TRAINCON-0 (v2) - IMPORTANT landing guidelines inside

 

On 10/04/2014 10:22, Selene Scriven wrote:
* Alexander Sack <asac@xxxxxxxxxxxxx> wrote:
With the 14.04 release pending and a number of promotion blockers still
plaguing us, the engineering leadership team has agreed to put our tree
and landing engine into high-alert mode again (i.e. TRAINCON-0). Since
we don't want to provoke a rush to deliver unfinished work, the
TRAINCON-0 (v2) landing rules will be in effect starting yesterday :).

Note that we looked at the feedback received during our last TRAINCON-0
(v1) alert a couple of weeks back and decided to do something slightly
different this time. To that end, we have come up with slightly relaxed
operational/landing guidelines for TRAINCON-0 (v2). These are:

  1. isolated regression and blocker fixes: can land easily; will get
extra peer review by the LT; otherwise normal landing practices apply,
with some extra care from the experienced lander.

  2. small features and non-isolated bugfixes: can land, but special
care applies; this includes having LT/QA revisit the test plan and
potentially expanding it to include more tests as well as more
dogfooding/exploratory elements. We also want QA to peer-review test
results to ensure we don't make fatal mistakes on our way to green.
Note: please mark your landing entries in the self-service spreadsheet
prominently as FEATURE if your case falls into this category.

  3. large features: can land only as case-by-case exceptions - these
need a case-by-case test and landing plan; they should be the rare
exception and resourcing can be done ad hoc. Note: please mark your
landing entries in the self-service spreadsheet prominently as
BIGFEATURE if your case falls into this category.

Depending on how well this goes, we might change to an even more
restrictive approach in a couple of days. The key is that every
engineering team really focuses on the regression/blocker bugs at hand
with all the manpower they have available. So let's stay focused and
we will succeed.

I'd like to make sure I understand our approach/process.  If I
understand correctly...


First, fixes for landing blockers can land quickly and easily.
These issues are identified in Didier's daily email.

To make it easier to keep track of blockers, I've started tagging
them with qa-touch-blocker.  A simple Launchpad query can list
them, and I'm also hoping to include a similar report (and
others) in a bug wrangling tool we've been working on.

Open blockers:

   http://goo.gl/P2sOGh

I hope this will help us track and coordinate priorities better.
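
In case it's useful for scripting, here is a rough launchpadlib sketch
of that query (purely illustrative; it assumes the blockers are filed
as Ubuntu bug tasks, and the consumer name is an arbitrary choice):

   # List open bugs tagged qa-touch-blocker (illustrative sketch only).
   from launchpadlib.launchpad import Launchpad

   # Anonymous, read-only access to production Launchpad.
   lp = Launchpad.login_anonymously('qa-blocker-report', 'production',
                                    version='devel')

   ubuntu = lp.distributions['ubuntu']
   open_statuses = ['New', 'Confirmed', 'Triaged', 'In Progress']

   for task in ubuntu.searchTasks(tags=['qa-touch-blocker'],
                                  status=open_statuses):
       print(task.bug.id, task.status, task.bug.title)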


Second, other changes can land only through a new process we've
just created.  These require QA sign-off before landing, and I
think the process to get that is roughly:

   - Developers:
     - Make some changes to an image component.
     - Write tests for the new changes, or otherwise make sure they are
       covered by the project's test plan.
     - Build and publish the new packages in a silo.
     - Add the change to the "Self service CI" spreadsheet, along
       with a summary of the changes and how to test them.
     - Run the tests and ensure they pass.
     - Mark the change as needing QA sign-off.  (also, possibly
       change the status field to orange)

Note: the orange colour is applied automatically by the spreadsheet logic once all the statuses are set.


   - QA:
     - Check the spreadsheet for changes needing sign-off.
       (status == orange, needs sign-off == Yes)  (is there some
       way to get notifications?  A rough scripted check is
       sketched right after this list.)
     - Ensure all the previous steps have actually been done, and
       that the test plans are sufficient and sensible.  If not,
       mark the change as incomplete.
     - Run the tests and ensure the results match what is
       expected.  If not, mark the change as failed.
     - If everything went well, approve the change and clear it
       for landing.
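
For what it's worth, here is a rough sketch of how that first check
could be scripted (purely illustrative: the spreadsheet key, the
column names and the use of a gspread service-account credential are
all assumptions, not the real setup):

   # Illustrative only: list spreadsheet rows still waiting for QA sign-off.
   import gspread

   gc = gspread.service_account()              # assumes a service-account JSON is configured
   sheet = gc.open_by_key('SPREADSHEET_KEY')   # key of the self-service CI spreadsheet
   rows = sheet.sheet1.get_all_records()       # list of dicts keyed by the header row

   for row in rows:
       # Hypothetical column names -- adjust to whatever the sheet really uses.
       if row.get('QA sign off required') == 'Yes' and not row.get('QA sign off'):
           print(row.get('Silo'), row.get('Lander'), row.get('Description'))
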
You have the "QA sign off" field in the sheet, as in https://docs.google.com/a/canonical.com/spreadsheet/ccc?key=0AuDk72Lpx8U5dFlCc1VzeVZzWmdBZS11WERjdVc3dmc&usp=drive_web#gid=1. Just set it to Yes if everything went well. If things go wrong, poke the lander directly (the name is in the same sheet) to get things worked on ASAP.


So, every sign-off request gets a pass, fail, or incomplete.

I looked over the spreadsheet for items needing QA sign-off.  If
I understand the process correctly, there were four changes
requesting sign-off, and all four are incomplete.  Or, at least,
they were when I checked.

QA Signoff requested:

   - silo landing-017:
     - Testing done: No
     - Owner: sergiusens/jhodapp
     - Desc: BIG FEATURE: media-hub(NEW), qtvideo-node,
       qtubuntu-media, qtubuntu-media-signals
     - Test plans:
       https://wiki.ubuntu.com/Process/Merges/TestPlan/media-hub
     - Comments:
       Testing not done, information omitted about what changed
       and how to test it.

   - silo landing-019:
     - Testing done: No
     - Owner: sergiusens
     - Desc: BIG FEATURE: mms support..., lxc-android-config
       network manager doesn't online the modem any more...
     - Test plans: (URLs truncated by spreadsheet awkwardness)
     - Comments:
       Testing not done, can't build: merge conflict.

   - silo (None):
     - Owner: mhr3
     - Desc: Add i18n support to click scope, display apps on
       unity8 desktop
     - Test plans:
       https://wiki.ubuntu.com/Process/Merges/TestPlan/unity-scope-click
     - Comments:
       No silo/packages, can't test.

   - silo landing-004:
     - Owner: Saviq
     - Desc: Unity8 fixes: dead areas in dash (blocker),
       predictive text in password entry, integration test type
       error (blocker), test improvements.
     - Test plans:
       https://wiki.ubuntu.com/Process/Merges/TestPlans/Unity8
     - Comments:
       Can't build, merge conflict.  Therefore, can't test.  Also,
       shouldn't need the extended sign-off process since it fixes
       two blockers.


Have I got this process anywhere close to what was intended?

Exactly (as long as the status isn't orange, the request isn't ready for QA to do the additional testing).

Browsing the spreadsheet requests just to check the colours takes less than 10 seconds in my experience (we have been doing that for quite some months to spot things that need publication).

Didier

