openstack team mailing list archive
Message #09892
Re: EC2 compat.
Lurking on the thread, but love what I'm seeing :-)
Nice work, guys!
d
On Tue, Apr 10, 2012 at 10:43 PM, Joshua Harlow <harlowja@xxxxxxxxxxxxx> wrote:
> Very cool, glad to see that this is being worked on; it looks pretty similar
> to what I was thinking of. I'm all for open dialogue. In fact, I was already
> thinking about what is needed to make this work better.
>
> Open questions/thoughts/brainstorm (at least that I was thinking of):
>
> How strict do we want to be with the XSD? (there aren't a lot of tolerant
> XSD validators out there, which is a pain)
>
> Should we use something like JAXB, but for Python? That should be more
> tolerant (unsure what the best solution is here)
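A "tolerant" check of the kind being asked about here can be approximated without a full XSD validator: fail only on *missing* required elements, and merely report unexpected ones. A minimal sketch in Python (the element names are invented for illustration, not taken from the EC2 schema):

```python
# Illustrative sketch: a tolerant structural check that, unlike a strict
# XSD validation, hard-fails only on missing required elements and treats
# unknown extras as warnings.
import xml.etree.ElementTree as ET

REQUIRED = {"instanceId", "imageId", "instanceState"}   # must be present
KNOWN = REQUIRED | {"reason", "dnsName"}                # tolerated extras

def tolerant_check(xml_text):
    root = ET.fromstring(xml_text)
    seen = {child.tag for child in root}
    missing = REQUIRED - seen          # hard failures
    unknown = seen - KNOWN             # soft warnings only
    return sorted(missing), sorted(unknown)

doc = ("<item><instanceId>i-1</instanceId><imageId>ami-1</imageId>"
       "<instanceState>running</instanceState><color>blue</color></item>")
missing, unknown = tolerant_check(doc)
# missing == [] and unknown == ["color"]
```

A real toolkit would of course drive this from the XSD rather than a hand-written whitelist, but the split between "missing" (error) and "unknown" (warning) is the tolerance knob in question.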
>
> How do we continuously measure the compatibility level?
>
> # of test cases passing, # of XML differences, # of XSD issues
>
> Should we use boto as an intermediate layer? (it is very tolerant)
>
> From what I understand, their XML code basically selects certain
> attributes out of the XML using SAX, then adds any unknown attributes
> dynamically onto an object.
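The boto-style pattern described above (pull out a handful of known fields with SAX, attach anything unrecognised to the result object dynamically) can be sketched with the stdlib SAX parser. This is a rough illustration, not boto's actual code; the field names are hypothetical:

```python
# Sketch of tolerant SAX parsing: every element with text becomes an
# attribute on a bare result object, so unknown fields survive parsing
# instead of causing an error.
import xml.sax

class TolerantHandler(xml.sax.ContentHandler):
    KNOWN = {"instanceId", "imageId"}   # fields we expect; extras tolerated

    def __init__(self):
        super().__init__()
        self.obj = type("Result", (), {})()   # bare object for dynamic attrs
        self._buf = []

    def startElement(self, name, attrs):
        self._buf = []

    def characters(self, content):
        self._buf.append(content)

    def endElement(self, name):
        value = "".join(self._buf).strip()
        self._buf = []
        if value:
            # known or not, just setattr -- unknown fields survive
            setattr(self.obj, name, value)

def parse_response(xml_text):
    handler = TolerantHandler()
    xml.sax.parseString(xml_text.encode("utf-8"), handler)
    return handler.obj

obj = parse_response(
    "<i><instanceId>i-1</instanceId><shinyNewField>x</shinyNewField></i>")
# obj.instanceId == "i-1"; obj.shinyNewField == "x"
```

The tolerance comes from `setattr` being unconditional: a new field added by the server shows up on the object instead of breaking the parse.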
>
> How do we make it repeatable?
>
> For a given test X, if there is a problem with test X and its response Y,
> how do we easily recreate that test X and response Y (so that devs can fix
> it)?
> Do we have a "golden set" of responses, so that when test X is called it
> should match golden response Z (otherwise there is an issue)?
>
> This is where the mock server may be useful, in that we can point test X at
> the mock server and get the expected responses Z,
> then point test X at the real OpenStack server and get responses Y that
> should match Z (exactly, minus the request id?).
> EC2 seems to already have some type of mocking, but I haven't used
> it... (http://bit.ly/HJkdh7)
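The golden-response comparison could be phrased as: canonicalise both documents, blank out per-call fields such as the request id, and then require the rest to match exactly. A small sketch under that assumption (the tag names are made up, not EC2's real schema):

```python
# Sketch of a golden-set comparison: volatile fields (request ids, etc.)
# are replaced with a fixed placeholder before comparing, so only real
# differences fail the test.
import xml.etree.ElementTree as ET

VOLATILE = {"requestId"}   # fields allowed to differ between runs

def canonical(xml_text):
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        if elem.tag in VOLATILE:
            elem.text = "IGNORED"
    return ET.tostring(root)

golden = "<resp><requestId>abc</requestId><status>ok</status></resp>"
actual = "<resp><requestId>xyz</requestId><status>ok</status></resp>"
# canonical(golden) == canonical(actual), despite differing request ids
```

Storing the golden files alongside the tests would then make any failure trivially reproducible: rerun test X, canonicalise Y, and diff it against Z.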
>
>
> I like how there is a tests folder that you guys have; that seems like it
> could be a good location for the "content checking tests" which actually
> require code/logic to dig into the XML response. It might make sense to use
> another tool to verify the XSDs (how tolerant we want to be is an open
> question) and another tool that will show you the XML differences (some of
> which might be OK, some not). In Java I have used xmlunit to do those kinds
> of XML difference comparisons; it provides some nifty ways of ignoring
> certain differences and such. If say we had 3 levels of tests I think that
> would make sense (starting from XSD validation, to difference comparisons,
> to content comparisons), and that would make a hell of a cool EC2
> validation toolkit.
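An xmlunit-style difference report (the middle of the three proposed levels) can be approximated in a few lines of Python: walk both trees in parallel and collect the paths that differ, so individual differences can be whitelisted afterwards. A rough sketch, not a drop-in xmlunit replacement:

```python
# Sketch of a structural XML diff: returns the paths where two documents
# disagree, which a test harness could then filter against an ignore list.
import xml.etree.ElementTree as ET

def xml_diff(a, b, path=""):
    diffs = []
    here = path + "/" + a.tag
    if a.tag != b.tag:
        diffs.append(here + " (tag)")
    elif (a.text or "").strip() != (b.text or "").strip():
        diffs.append(here + " (text)")
    kids_a, kids_b = list(a), list(b)
    if len(kids_a) != len(kids_b):
        diffs.append(here + " (child count)")
    for ca, cb in zip(kids_a, kids_b):
        diffs.extend(xml_diff(ca, cb, here))
    return diffs

left = ET.fromstring("<r><id>1</id><state>running</state></r>")
right = ET.fromstring("<r><id>2</id><state>running</state></r>")
# xml_diff(left, right) == ["/r/id (text)"]
```

The returned paths give exactly the hook xmlunit provides in Java: a place to say "differences under `/r/requestId` are OK, everything else is a failure".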
>
> The other use of the site I was making was to list all the known error
> conditions, and any other incompatibilities that I am noticing with EC2
> (error conditions, features, parameters...). That seems really needed to
> allow anyone to actually use the EC2 APIs and handle all the cases which
> could be thrown at them.
>
> -Josh
>
>
>
>
> On 4/10/12 7:02 PM, "Eric Windisch" <eric@xxxxxxxxxxxxxxxx> wrote:
>
>
> Josh, as a follow-up, it would be good to keep an open dialogue on this.
> When/if you get a chance to review the aws-compat branch, I'd like to get
> your feedback as well.
>
> PS I meant to write "assess", not "access". I only noticed when I read back
> my email. I'm too pedantic to not correct myself.
>
>
>
> _______________________________________________
> Mailing list: https://launchpad.net/~openstack
> Post to: openstack@xxxxxxxxxxxxxxxxxxx
> Unsubscribe: https://launchpad.net/~openstack
> More help: https://help.launchpad.net/ListHelp
>