
[QA] Aligning "smoke" / "acceptance" / "promotion" test efforts


Hi all,

I'd like to get some alignment on the following:

1) The definition of what is a "smoke" test
2) How we envision the Tempest project's role in running such tests

First, a discussion on #1.

The semantics used to describe this kind of testing are very loose. Some groups use the term "smoke test". Others use "promotion test". And still others use "acceptance test".

I will only use the term "smoke test" here and I'm hoping we can standardize on this term to reduce confusion.

Now, what exactly *is* a smoke test? According to Wikipedia [1], a smoke test "is preliminary to further testing, intended to reveal simple failures severe enough to reject a prospective software release". Further, it states that in smoke testing, a "subset of test cases that cover the most important functionality of a component or system are selected and run, to ascertain if the most crucial functions of a program work correctly".

From the above, I would surmise that smoke tests should have all of the following characteristics:

* Test basic operations of an API, usually in a specific order that makes sense as a bare-bones use case of the API
* Test only the correct action paths -- in other words, smoke tests do not throw random or invalid input at an API
* Use the default client tools to complete the actions
* Take a short amount of time to complete
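To make that concrete, a smoke test for Nova might look something like the sketch below. This is illustrative only -- the class name, credentials, and config handling are placeholders, not real Tempest code:

    import time
    import unittest

    from novaclient.v1_1 import client

    class TestServerBasicOps(unittest.TestCase):
        """Boot a server, wait for it to go ACTIVE, then delete it.

        Happy path only, using the default client tools, with method
        names numbered so they run in the intended order.
        """

        @classmethod
        def setUpClass(cls):
            # Real code would pull these values from a config object.
            cls.nova = client.Client('demo', 'secret', 'demo',
                                     'http://localhost:5000/v2.0/')

        def test_001_boot_server(self):
            image = self.nova.images.list()[0]
            flavor = self.nova.flavors.list()[0]
            self.__class__.server = self.nova.servers.create(
                'smoke-server', image, flavor)

        def test_002_server_goes_active(self):
            server = self.nova.servers.get(self.server.id)
            while server.status == 'BUILD':
                time.sleep(2)
                server = self.nova.servers.get(self.server.id)
            self.assertEqual('ACTIVE', server.status)

        def test_003_delete_server(self):
            self.nova.servers.delete(self.server.id)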

Currently in the OpenStack community, there are a number of things that act as "smoke tests", or something like them:

* The devstack "exercises" [2]
* The Torpedo tests that SmokeStack runs against proposed commits to core projects [3]
* The next incarnation of the Kong tests -- now called "exerstack" [4]
* Tests within Tempest that are annotated (using the nose @attr decorator) with "type=smoke"
* Various companies' "promotion tests" that are essentially the same as the devstack exercises -- for instance, at HPCS we have a set of promotion tests that do virtually the same thing as the devstack exercises, only they are a mixture of Python and bash code and include additional checks like validating the HPCS billing setup
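(For reference, that annotation is just nose's attrib plugin setting an attribute on the test, which the nosetests command line can then filter on:)

    from nose.plugins.attrib import attr

    @attr(type='smoke')
    def test_list_flavors():
        pass  # would exercise the happy path of the flavors API here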

It would be super-awesome if Tempest could be used to remove some of this duplication of efforts.

However, before this can happen, a number of improvements need to be made to Tempest. The issue with the "smoke tests" in Tempest is that they aren't really smoke tests: they do not use the default client tools (novaclient, keystoneclient, etc.) and are not annotated consistently.
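To illustrate the difference, the current tests do something roughly like the first form below over raw HTTP, where a smoke test should look like the second (both simplified, with placeholder values):

    # Roughly what Tempest does today -- drive the API over raw HTTP:
    import httplib2

    token = 'a-keystone-token'  # placeholder
    resp, body = httplib2.Http().request(
        'http://nova-api:8774/v2/servers', 'GET',
        headers={'X-Auth-Token': token})

    # What a smoke test should do -- use the default client tools:
    from novaclient.v1_1 import client

    nova = client.Client('demo', 'secret', 'demo',
                         'http://localhost:5000/v2.0/')
    servers = nova.servers.list()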

I would like Tempest to have a full set of real smoke tests that can be used as gates for the upstream core projects.

Steps I think we need to take:

* Create a base test class that uses the default CLI tools instead of raw HTTP calls
* Create a set of Manager classes that would provide a config object and one or more client objects to test cases -- this is similar to the tempest.openstack.Manager class currently in the source code, but would allow using the default project client libraries instead of the raw HTTP client currently in Tempest
* Allow test cases that derive from the base SmokeTest class to use an ordered test method system -- e.g. test_001_foo, test_002_bar
* Have the base smoke test case class automatically annotate all of its test methods with type=smoke so that we can run all smoke tests with:

    nosetests --attr=type=smoke tempest
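To sketch the last two points (every name here is illustrative, not necessarily what the patchset uses):

    import unittest

    class Manager(object):
        """Hands a config object and one or more client objects
        to a test case."""

        def __init__(self, config):
            self.config = config
            # The real class would construct novaclient, keystoneclient,
            # etc. from the config instead of a raw HTTP client.

    class SmokeTestMeta(type):
        """Tags every test_* method with type='smoke' at class creation
        time, so derived classes need no per-method @attr decorators
        but nose's attrib plugin can still select their tests."""

        def __new__(mcs, name, bases, attrs):
            for attr_name, value in attrs.items():
                if attr_name.startswith('test_') and callable(value):
                    value.type = 'smoke'
            return type.__new__(mcs, name, bases, attrs)

    class SmokeTest(unittest.TestCase):
        """Base class for smoke tests. Deriving classes name their
        methods test_001_foo, test_002_bar, etc.; the default unittest
        loader runs methods in alphabetical order, so the numbering
        gives the ordering."""

        __metaclass__ = SmokeTestMeta

With something like that in place, any test method on a SmokeTest subclass gets picked up by the command above without per-method annotation.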

I have put together some code that shows the general idea as a draft patchset [5].

I've asked David to include this in the QA meeting agenda for this week. Hopefully interested folks can take a look at the example code in the patchset above, give it some thought, and we can chat about this on Thursday.

Best,
-jay

[1] http://en.wikipedia.org/wiki/Smoke_testing#Software_development
[2] https://github.com/openstack-dev/devstack/tree/master/exercises
[3] https://github.com/dprince/torpedo
[4] https://github.com/rcbops/exerstack/
[5] https://review.openstack.org/#/c/7069/
