ebxml-iic message

Subject: [ebxml-iic] Call for participation on Conformance Testing Task Force



All:

Anyone interested in participating in some phase of MS conformance testing?
The payoff is first-hand exposure to - and impact on - the most advanced testing
technology yet (XML-powered, B2B simulation-based, etc.), something
your company will need either to control or to comply with some day!
 
There is room for every level of commitment...
The various tasks involve:
(a)- definition of test cases/scenarios.
(b)- review of test cases/scenarios.
(c)- design of the test case format (XML docs).
(d)- design of the validation process (processing of test outputs).
(e)- testbed architecture design / implementation.

You may for now announce your interest in any of these.
Future upgrades in commitment level are OK.
Please see also the attached document, which gives some hints on the possible
process we could follow, and some assumptions we may (or may not) want to make.
You are also encouraged to dig up from your mailbox the conformance test draft
I sent out some time ago and read it as a starter.
(Comments welcome.)

Note that every contribution falls under the OASIS IPR policy.
http://www.oasis-open.org/who/intellectualproperty.shtml

For this conformance test task force (CTTF?),
we already have two heavyweight volunteers in the domain:
- XMLGlobal (Matthew MacKenzie / David R. Weber)
- NIST (Michael Kass, Standards and Conformance Testing Group,
has been involved in ebXML RegRep testing)

I'd like to have a first feedback by Wednesday Feb 20th.

Regards,

Jacques Durand
Fujitsu Software Corp.
(408) 456 7917
jdurand@fs.fujitsu.com

 




 

ebXML MS Conformance Testing:

------------------------ the process ------------------

A possible way to proceed:

1. Start by defining "informally" a few test cases (scenarios), each of which addresses
some feature of the spec, and see whether and how they map to the architecture draft
proposed earlier, in terms of overall message exchange and the nature of input/output
(what input goes to the test driver, what messages are sent from / received by the candidate MSH,
and the expected output on the testing party side: MSH-level trace and/or driver-application trace).

2. Then try to define an XML format for the test case documents (message data, 
sequencing info if any, expected output on the other side, configuration/CPA data).
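To make step 2 concrete, here is a minimal sketch of what such a test case document could look like and how it might be loaded. The element and attribute names (TestCase, MessageData, ExpectedOutput, etc.) are purely illustrative assumptions, not an agreed format:

```python
import xml.etree.ElementTree as ET

# Hypothetical test case document: message data, sequencing info,
# and the expected output on the testing party side.
SAMPLE_TEST_CASE = """
<TestCase id="tc-01" feature="basic-message-exchange">
  <MessageData>
    <Header service="urn:test:service" action="Ping"/>
    <Payload>hello</Payload>
  </MessageData>
  <Sequencing order="1"/>
  <ExpectedOutput side="testing-party">
    <Header action="Pong"/>
  </ExpectedOutput>
</TestCase>
"""

def load_test_case(xml_text):
    """Parse a test case document into a simple dict for the driver."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.get("id"),
        "feature": root.get("feature"),
        "payload": root.findtext("MessageData/Payload"),
        "expected_action": root.find("ExpectedOutput/Header").get("action"),
    }

tc = load_test_case(SAMPLE_TEST_CASE)
print(tc["id"], tc["payload"], tc["expected_action"])  # tc-01 hello Pong
```

The point of a declarative XML format like this is that adding a test case would require no new driver code, only a new document.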

3. Investigate the automated "Validation" phase: what kind of document comparison it
will have to perform, and which supporting XML technologies are available (XPath, XSLT...).
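A minimal sketch of what such XPath-based comparison could look like, assuming both expected and actual outputs are XML traces (the element names here are illustrative assumptions):

```python
import xml.etree.ElementTree as ET

def validate(expected_xml, actual_xml, xpaths):
    """Compare expected vs. actual output at a list of XPath expressions;
    return the expressions whose selected values differ."""
    expected = ET.fromstring(expected_xml)
    actual = ET.fromstring(actual_xml)
    failures = []
    for xp in xpaths:
        exp_values = [e.text for e in expected.findall(xp)]
        act_values = [e.text for e in actual.findall(xp)]
        if exp_values != act_values:
            failures.append(xp)
    return failures

expected = "<Trace><Status>OK</Status><Count>2</Count></Trace>"
actual = "<Trace><Status>OK</Status><Count>3</Count></Trace>"
print(validate(expected, actual, ["Status", "Count"]))  # ['Count']
```

Comparing only at selected paths, rather than byte-for-byte, lets the validator ignore fields that legitimately vary between runs (timestamps, message IDs).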

4. Then, in a second phase, the details of the underlying architecture
(API(s), etc.) should be easier to design - and implement.

NOTE: each test case scenario will require its own investigation of <1,2,3> above.
If there are enough of us, each subteam can work on a particular test case and
come up with its own solution for <1,2,3>. This will in fact define the
requirements for the testbed architecture. We could then merge / "unify"
across use cases to come up with common test flows, doc formats, and validation processes.

----------------------- the assumptions ------------------

Some design assumptions we have to agree on (or discuss):
(again, these refer to the initial architecture draft, which may
itself be re-discussed.)

1- In the initial draft, conformance testing has been conceived on the basis of
the candidate MSH interacting with a "testing party" through messages.
This design assumes that the testing party has MSH capability itself.
It may or may not need to be a fully-loaded MSH (we do not know yet).
In case a full MSH is used on the testing party side (as a kind
of "plug-in"), it will have to be itself "certified" and accepted as a reference
implementation by a certification authority.

2- The Test Driver (simulating the application layer) should be general enough
to actually not be ebXML-dependent.
It will send and receive "message data" using a standard "test interface".
This test interface will map to the candidate MSH through a specific adapter/wrapper.
That means that a candidate for conformance is required to write a (thin) adapter
for its MSH, to make it drivable by the test driver.

3- The inputs (test case definitions) and outputs (test outputs) of the Test Driver 
will be in XML format. "message data" would contain header data and payload data,
but does not have to be in ebXML format (preferably not?). If it is in ebXML header
format, again the Test Driver code should not depend on this - but the adapter code may.

4- The Validation phase would ideally be totally separate from the Testing phase.
It will also (code-wise) not depend at all on the ebXML format, but it may be
configured to have ebXML intelligence.




