openstack team mailing list archive - Message #11386
Re: [Nova] EC2 api testing
Sure, borrowing might be a good start.
I think you are right that we need different levels of test compliance; the more tests the better, as long as we don't start testing client libraries. Although I guess that wouldn't be a bad thing either, I would rather get the EC2 API 100% right and report client bugs on a different schedule.
On 5/8/12 11:39 AM, "John Garbutt" <John.Garbutt@xxxxxxxxxx> wrote:
I was really just thinking of "borrowing" the test framework around tempest, rather than push the tests back to tempest. I agree we don't want to be too tied to OpenStack. I certainly would like to have this running against CloudStack too.
I hoped we could look at doing testing from a user perspective, maybe with tests like test_instance_s3_style_success, test_instance_s3_style_failure_bad_image, etc. Each can check request, functional and schema compliance.
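For example, a minimal sketch of what a pair of such user-perspective tests might look like on top of boto; the environment variables (EC2_HOST, EC2_ACCESS_KEY, EC2_SECRET_KEY, EC2_IMAGE_ID) and the Nova-style port/path are placeholder assumptions, not an agreed interface:

    import os
    import unittest

    import boto
    from boto.ec2.regioninfo import RegionInfo
    from boto.exception import EC2ResponseError


    class TestRunInstances(unittest.TestCase):

        def setUp(self):
            # Point boto at the cloud under test rather than at Amazon.
            self.conn = boto.connect_ec2(
                aws_access_key_id=os.environ['EC2_ACCESS_KEY'],
                aws_secret_access_key=os.environ['EC2_SECRET_KEY'],
                is_secure=False,
                region=RegionInfo(name='nova',
                                  endpoint=os.environ['EC2_HOST']),
                port=8773,
                path='/services/Cloud')

        def test_instance_s3_style_success(self):
            # Happy path: a known-good image gives a reservation with one
            # instance in a sane initial state.
            reservation = self.conn.run_instances(os.environ['EC2_IMAGE_ID'])
            instance = reservation.instances[0]
            self.assertIn(instance.state, ('pending', 'running'))
            instance.terminate()

        def test_instance_s3_style_failure_bad_image(self):
            # Failure path: an unknown image id should raise an EC2 error,
            # not hand back a reservation.
            self.assertRaises(EC2ResponseError,
                              self.conn.run_instances, 'ami-00000000')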
Could the documentation start as the list of passed and failed tests? Possibly a failure should be marked as either "non-compliant" or "not implemented", and maybe "non-compliant" could specify which versions it is not compliant with.
Cheers,
John
From: Joshua Harlow [mailto:harlowja@xxxxxxxxxxxxx]
Sent: 08 May 2012 18:49
To: John Garbutt; Doug Hellmann; Martin Packman; Joe Gordon; jaypipes@xxxxxxxxx
Cc: openstack
Subject: Re: [Openstack] [Nova] EC2 api testing
So the Tempest approach, I think, starts to pull this EC2 validation suite into OpenStack too much. A goal I think is useful is to keep this loosely coupled to OpenStack, so that CloudStack and others can use this kind of suite as well. That way everyone benefits and everyone can contribute (not just the OpenStack people).
I'd personally like to see the following kinds of validations/compliance checks:
Request compliance:
1. Send in request X and expect response Y (or at least sub-fields Z of response Y) to match a known "valid" response Y2.
* This means we need a whole bunch of requests, a whole bunch of "valid" responses, and a way to determine the differences (XMLUnit in Java provides some of this; I don't know of anything equivalent in Python, but see the sketch below). I have created a framework like this that was used for the Yahoo->Bing ad provider XML transition project here, which we might be able to use.
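A rough sketch of what the Python side of that comparison could look like using lxml; the namespace, XPaths and file handling are illustrative only:

    from lxml import etree

    EC2_NS = {'ec2': 'http://ec2.amazonaws.com/doc/2010-08-31/'}


    def compare_fields(expected_xml, actual_xml, xpaths):
        """Return (xpath, expected, actual) tuples for every mismatch."""
        expected = etree.fromstring(expected_xml)
        actual = etree.fromstring(actual_xml)
        mismatches = []
        for path in xpaths:
            want = [e.text for e in expected.xpath(path, namespaces=EC2_NS)]
            got = [e.text for e in actual.xpath(path, namespaces=EC2_NS)]
            if want != got:
                mismatches.append((path, want, got))
        return mismatches

    # Usage: only compare the sub-fields considered part of the contract.
    # fields = ['//ec2:instancesSet/ec2:item/ec2:imageId',
    #           '//ec2:instancesSet/ec2:item/ec2:instanceState/ec2:name']
    # problems = compare_fields(canned_response, live_response, fields)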
Schema compliance:
1. Something along the lines of validating against the published XSDs? Amazon has about ten versions of their API; which ones are you compliant with? (That is the kind of question to be answered here.)
* Questions about how strict to be will need to be tackled here (all the Python XSD validators seem to blow up on the first error, which is not so useful when you are trying to see multiple errors; see the sketch below).
* https://github.com/yahoo/Openstack-EC2/tree/master/data/xsds
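For what it is worth, lxml's schema validator keeps an error log of everything it found rather than stopping at the first violation; a minimal sketch, with the schema filename assumed:

    from lxml import etree


    def validate_against_xsd(xsd_path, response_body):
        """Return a (line, message) pair for every schema violation."""
        schema = etree.XMLSchema(etree.parse(xsd_path))
        doc = etree.fromstring(response_body)
        if schema.validate(doc):
            return []
        # error_log retains every violation, so all can be reported at once.
        return [(e.line, e.message) for e in schema.error_log]

    # errors = validate_against_xsd('data/xsds/2010-08-31.ec2.xsd', body)
    # for line, message in errors:
    #     print '%d: %s' % (line, message)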
Functional compliance:
1. This is where we start to use boto (which itself uses a very tolerant SAX parser to create Python objects) and test anything the above two kinds of tests cannot accomplish (say you need an if statement to check some condition; the above two approaches probably won't get that, see the small example below). Euca2ools runs on boto, so I would consider that the same case. Using a library might not be the best approach, though, because then you run into the issue that the library is not compliant either (i.e. you end up testing the library instead of the APIs).
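As a small example of the kind of conditional check the first two approaches cannot express, a rough sketch of a polling helper on top of boto; the helper name, timeout and the conn argument (a boto EC2Connection like the one in the earlier sketch) are assumptions:

    import time


    def assert_reaches_state(conn, instance_id, wanted, timeout=120):
        """Poll DescribeInstances until the instance reports the wanted state."""
        deadline = time.time() + timeout
        instance = None
        while time.time() < deadline:
            reservations = conn.get_all_instances(instance_ids=[instance_id])
            instance = reservations[0].instances[0]
            if instance.state == wanted:
                return
            time.sleep(5)
        raise AssertionError('%s never reached %s; last state was %s'
                             % (instance_id, wanted,
                                instance.state if instance else 'unknown'))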
Documentation compliance:
1. This is, I guess, where we or others document how compliant a given suite "target" is and what its known issues are.
Those were my thoughts anyway :-)
- Josh
On 5/8/12 8:32 AM, "John Garbutt" <John.Garbutt@xxxxxxxxxx> wrote:
I am certainly up for helping with this effort.
I wondered about this approach:
* Starting with Tempest (mostly for its reporting, and configuration)
* Creating a new category "EC2_Compat" or something like that
* Trying to add in all the tests from those two repos into the new thing
* Looking at what the gaps are
I am not sure it makes sense for these tests to live in tempest, but it does seem silly to create our own configuration and test runner code if we don't have to. It seems too early to push this shared code into something like openstack-common-tests, but maybe that is what we need long term?
A totally different approach that sounds attractive is testing with some of the tools people actually want to use: boto, euca2ools. However, that means you end up writing N times as many tests, and you have to somehow pick which tools to test. Maybe the tests are easier to write, so the fact that there are more doesn't matter. While those tests don't really test whether we comply with EC2, maybe they do test what people care about.
Cheers,
John
From: openstack-bounces+john.garbutt=eu.citrix.com@xxxxxxxxxxxxxxxxxxx [mailto:openstack-bounces+john.garbutt=eu.citrix.com@xxxxxxxxxxxxxxxxxxx] On Behalf Of Joshua Harlow
Sent: 07 May 2012 18:17
To: Doug Hellmann; Martin Packman
Cc: openstack
Subject: Re: [Openstack] [Nova] EC2 api testing
TBD afaik.
I think it would be nice if we could have one tool to rule them all, but I need to get my hands on this Enstratus thingy to see what is there :-)
I've started documenting some EC2 stuff that I see @ https://github.com/yahoo/Openstack-EC2/issues
If others want to put stuff on there (or in a better location), that's cool with me.
On 5/4/12 3:04 PM, "Doug Hellmann" <doug.hellmann@xxxxxxxxxxxxx> wrote:
On Fri, May 4, 2012 at 1:09 PM, Martin Packman <martin.packman@xxxxxxxxxxxxx> wrote:
At the Folsom Design Summit we discussed[1] trying to collaborate on a
test suite for EC2 api support. Currently nova supports the common
stuff pretty well but has differing behaviour in a lot of edge cases.
Having a separate suite, along the lines of tempest, that could be run
against other existing clouds as well as OpenStack would let us test
the tests as well, and would be useful for other projects.
Various parties have done work in this direction in the past; the
trick is going to be combining it into something we can all use. The
existing code I know about includes aws-compat[2], Openstack-EC2[3],
the tests in nova itself, some experimental code in awsome, and an
Enstratus test suite. I'm hoping to find out more about the Enstratus
code; James Urquhart suggested opening the remaining parts would be a
reasonable step. Is there anything else out there we should look at as
well?
Are there any strong opinions over the right way of getting started on this?
Are you going to try to get all of the code for those projects into one package, or build a meta-tool that downloads the others and uses them? I don't have an opinion one way or the other; I'm just curious.
Doug
Martin
[1] Nova EC2 compatibility session etherpad
<http://etherpad.openstack.org/FolsomEC2Compatibility>
[2] <https://github.com/cloudscaling/aws-compat>
[3] <https://github.com/yahoo/Openstack-EC2>