Subject: [ebxml-iic] minutes


All:
 
Here are the minutes of the last meeting (Sept 3),
and also, with a little delay, those from the F-2-F...
(Follow-up mails give most of the content for the F-2-F, except for the test framework discussion; please check them out.)
 
Regards,
 
jacques
 
Minutes of IIC September 3, 2002
-------------------------------
 
Call info:
---------

Host: Fujitsu 
Time: Tuesday Sept 3rd, 10am Pacific 
Toll free number: 1-877-801-2058 
International Number:1-712-257-6652 
Minute taker: Jacques Durand

Present:
--------

Mike Kass (NIST)
Jacques Durand (Fujitsu)
Monica Martin (Drake Certivo)(call-in)
Pete Wenzel (SeeBeyond)(call-in)
Hatem El-Sebaaly (IPNet)
Steve Yung (Sun)
Jeff Turpin, (Cyclone Commerce)
Aaron Gomez (DGI)
Mike Dillon (DGI)
Thomas Bikeev (EAN)

Agenda:
-------

1. MS Conformance Suite: (Mike, Matt, Jeff T.) 
- Test Case material, CPA, message templates. 
- Profiles? 

2. MS Interoperability Suite: (Hatem, Steve, Jacques) 
- Interop test cases: profiles and comparison to DGI 
- status of our relationship with ECOM, CEN-ISSS, our role. 

3. Deployment template / guidelines (Pete Wenzel, Thomas B.) 
- status, template content 
- EAN material 

Minutes:
--------

0. General:

- We should make sure we CC one of our IIC lists when
corresponding by mail (even if it is a small group of 2 or 3...)

1. MS Conformance Suite: 

- The latest version of the Test Case material was sent out by Mike.
To be reviewed for the next call.
- Monica and Mike raised the issue of error conditions: this has to be reviewed
and improved.
- The CPA subset needs to be finalized. Likely the same CPA subset as for the Interop tests.
- Jeff T. will define the MIME components to be used to generate message material,
via an adequate assembly tool (e.g. a Java MIME generator). But there is no need to define
the tool; we should not impose a particular tool either.
- Conformance profiles? Discussion is ongoing by mail. Jeff will submit some
candidate profiles. We will review the profiles voted on last year. Jacques:
a small number of profiles is better.
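As a hedged illustration of the kind of assembly tool mentioned above (the TC imposes no particular tool; the SOAP content, content IDs, and function name here are hypothetical), Python's standard email package can build an ebXML-style multipart/related MIME envelope from a header part plus payload parts:

```python
# Sketch only: assemble a multipart/related MIME envelope for an ebXML message.
# The SOAP envelope string and Content-IDs are illustrative placeholders.
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.application import MIMEApplication

def build_envelope(soap_xml: str, payloads: dict) -> MIMEMultipart:
    """Wrap a SOAP header container and named payloads as multipart/related."""
    env = MIMEMultipart("related", type="text/xml")
    header = MIMEText(soap_xml, "xml")
    header.add_header("Content-ID", "<ebxmlheader>")
    env.attach(header)
    for cid, data in payloads.items():
        part = MIMEApplication(data)              # application/octet-stream payload
        part.add_header("Content-ID", "<%s>" % cid)
        env.attach(part)
    return env

env = build_envelope("<SOAP-ENV:Envelope/>", {"payload1": b"<order/>"})
print(env.get_content_type())   # multipart/related
```

A template-driven generator could produce `soap_xml` from the message templates discussed in the Test Case material.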

2. MS Interoperability Suite:

- Steve will flesh out the basic set of Interop test cases he drafted at the F-2-F:
we need to define how these would execute in our test framework (test steps?
test service Actions invoked?)
- We need to identify one or two "basic interop profiles" that represent the
baseline of interoperability, agreeable to any user community,
starting from the set of test cases Steve drafted, and possibly covering the UCC/DGI
tests (except for those relevant to Conformance, as Conformance tests are supposed
to be passed before doing Interoperability tests).
- Hatem and Jacques have started to compare the IIC interop tests to the UCC/DGI tests.
To be sent out soon.
- A recent version of the DGI tests is out (V3.2); the main difference from the previous
version is that SyncReply is not required. That fits our tests better.
- We have initiated formal contacts with ECOM (the letter sent by Jacques is available
to anyone interested, showing the areas of cooperation we could pursue).
- A conference call with EAN, OASIS Europe, and CEN-ISSS will take place either the week of
Sept 9 or the week after.

3. MS Deployment templates / EAN guidelines.

- Pete just sent out his first complete draft of the template, which covers
all MS spec options that can be narrowed to further comply with specific
business requirements.
- We probably also need to identify:
. header fields that can be subject to additional conventions (e.g. PartyID)
. specific practices that go beyond spec options
(e.g. use of 3rd-party security package software, etc.)
- Thomas B. may try to see how the EAN requirements fit into this template.
- Ultimately, the template definition doc may be a separate document, and a
particular template instance (e.g. EAN) may be defined in another doc,
as it addresses a particular business community.
- The requirements listed in a template instance can later be used to define
business-level interoperability (and conformance) tests.


Reminders:
---------

- The next teleconference is planned for Monday, September 9th, 10am Pacific time.
Calls will now be weekly until we wrap up our specs.
 

Jacques Durand
ebXML IIC chair







Minutes of IIC F2F August 22-23, 2002
---------------------------------------------------
 
Location:
---------

Host: 
Fujitsu Software Corporation
3055, Orchard Drive, San Jose, CA 95134-2022

Present:
--------

Mike Kass (NIST)
Jacques Durand (Fujitsu)
Jeff Eck (GXS)
Monica Martin (Drake Certivo)(call-in)
Pete Wenzel (SeeBeyond)(call-in)
Eric Van Lydegraf (Kinzan)
Matt McKenzie (XMLG)
Hatem El-Sebaaly (IPNet)
Steve Yung (Sun)
Jeff Turpin, (Cyclone Commerce)


Agenda:
=======

Thursday:
---------

Morning: (9am - 12)

1. Review of the Test Framework document (ebXMLTestFramework.doc V0.4), in particular:
- Test Service: review and finalize the definition of Actions (Section 3.3).
- Architecture configurations assumed for MS Conformance and MS Interoperability.
- Overall positioning of the Test Framework (just an automation option?
indispensable for describing test semantics?)

2. Common issues for MS conformance and interoperability Test Suites:
- user perspective of testing procedures: who and how, overall usage of test suites. 
- review of general Test Case protocol (e.g. initial config & CPA aspects) 
- common approach for defining sequences of test steps, for specifying 
the verification condition 
- finalize format for message material:  CPA samples, message templates, parameters.
 
Afternoon: (1:30pm-3:30pm and 4pm-6pm)

3. Continuation of (2) if needed.

4a. MS conformance Test Suite:
- Test Cases definitions for the current set of Test Requirements ("Master list"):
(can we identify some "test case patterns" that could be reused across Test Cases?)
- XML mark-up?
- Conformance profiles/levels.

4b. MS interoperability Test Suite:
- finalize a basic set of interoperability profile(s)
- Test Cases definitions for our profile(s)
- operational issues (initialization, sharing CPA data)

4c. MS Deployment Template (contingent to sub-team presence):
- the template elements (MSH options, header content, CPA references).
- the EAN instance.

(NOTE: 4a, 4b and 4c could be done in parallel)


Friday:
-------

Morning: (9am - 12)

5. Joint review of common issues from previous day
(both conformance and interop, test framework)

6a. MS conformance Test Suite: (continuation)
- Test Cases for the current set of Test Requirements.

6b. MS interoperability Test Suite: (continuation)
- set of Test Cases for interoperability profile(s)

6c. MS Deployment Template (contingent to sub-team presence):
- the template elements (MSH options, header content, CPA references).
- the EAN instance.

(NOTE: 6a 6b 6c could be done in parallel)


Afternoon: (1:30pm-3:30pm and 4pm-5pm)

7. Continuation of (6)

8. IIC work in the context of other initiatives, users perspective:
- 3rd party specific interop initiatives (DGI, ECOM, ...)
and role of IIC here, how that impacts our test definitions.


Minutes:
========

1- Test Framework review (Thursday morning):

- Test Driver: has built-in "transport" capability (the notion of an HTTP adapter
as described in the doc is an abstract concept: it should not need to be a separate
piece of software.)
- The Test Driver will behave as an MSH, but with only very limited MSH capability
(sending/receiving messages) plus additional capabilities (creating "bad"
messages, notifying of Ack receptions...). So it cannot be compared
in any way with a reference MSH.
- Discussion focused on the Test Service actions: they need review, towards
a possibly smaller set of actions.
- Normally, users should not have to add new actions: the predefined
action set is enough to carry out conformance tests and interop tests,
at least for MS. New actions may be added later, but they are NOT
specific to implementations.
- Users do not have to write code other than interfacing the pre-defined
actions of the Test Service with their own implementation (e.g. MSH callback).
- The Reflector action can't be merged with the PayloadVerify action: the latter
checks that the received payload is the same as a local version of it
(needed in Interop tests).
- Dummy action, Reflector action, Initiator action: all play a specific role,
and have different abilities in responses. But the "Mute" action may not be
needed, especially if "Dummy" can be mute when running on the Driver side.
- "Dummy" needs to be renamed.
- Error messages: captured by two actions. We cannot always require the MSH
to invoke these, as "notify application" may be done in various ways,
including logging the error. So we may not be able to fully automate
testing of error cases.

Action Item: [Matt] will review the standard actions of Test Services,
in the light of conf and interop test case requirements.

- Interop testing: can it be done without "sniffing" on the wire?
Sniffing allows looking at such things as the MIME envelope, and
manipulating message material: all things needed in Conformance tests,
but normally not in Interop tests? Desired answer: no sniffing needed.
(We need to confirm. For example, it could be hard to test "Ack" signal message
interoperability if we look at the MSH only from the application perspective.)

Action Item: [Steve, Hatem, Jacques]: define test cases (test steps)
for Interop tests, see if they can be done without monitoring the wire.

- Issue on the Interop test harness: how can we drive the test cases
from the Test Driver? The interop harness will involve two communicating MSHs, and these
will report to their application layer, which will be two Test Service instances.
- A Test Service instance can initiate a test case, via its "Initiator" action,
which will send out any message we want. But how do we locally trigger this
Initiator action?
- Answer: a bridge from the Test Driver to the local Test Service
could allow the Test Driver to invoke the "Initiator" directly, in order to
execute "SetMessage" test steps. In addition, invocations of the local Test Service
from remote messages will be notified (callback) to the Test Driver, via the bridge.
- That would avoid implementing the Test Driver as a 3rd-party URL entity on the wire
(as in conformance), as this set-up may be more constraining on a deployed
installation (e.g. the MSH needs to deal with a 3rd-party [fake] MSH).
- So when several Test Service instances are used (e.g. 2 in the Interop harness),
one of them is always directly controlled by (local to) the Test Driver.
- The behavior of a Test Service instance bridged to a local Test Driver
is a little different from that of another (remote) instance, in that it is supposed
to notify the Test Driver of its invocations (received messages), so the Test Driver
can monitor the test case execution. (Such a test service is in "driver mode" in the doc.)
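The "driver mode" notification described above can be sketched as a simple callback bridge (all class and method names here are hypothetical illustrations, not the framework's actual API):

```python
# Sketch: a Test Service in "driver mode" notifies its local Test Driver of
# every invocation (received message), so the driver can monitor execution.
# All names are illustrative; this is not the framework's defined interface.
class TestDriver:
    def __init__(self):
        self.log = []                       # observed (action, message) events

    def notify(self, action, message):      # callback over the "bridge"
        self.log.append((action, message))

class TestService:
    def __init__(self, driver=None):
        self.driver = driver                # non-None => "driver mode"

    def invoke(self, action, message):
        if self.driver is not None:         # bridged instance reports back
            self.driver.notify(action, message)
        return "ok"

    def initiator(self, message):
        # The local Test Driver can trigger this directly to start a test case
        # (the "SetMessage" step), instead of sitting on the wire as a fake MSH.
        return self.invoke("Initiator", message)

driver = TestDriver()
local = TestService(driver)                 # bridged, "driver mode"
local.initiator("msg-001")
print(driver.log)   # [('Initiator', 'msg-001')]
```

A remote Test Service instance would simply be constructed without a driver and report nothing.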

Action Item: [Jeff T., Jacques, Matt]: finalize the test harness for MS Interoperability.


- Issue on test case dependencies: should we add such dependencies across test cases
of a test suite? Idea: to avoid complexity, we will assume:
. test cases of a test suite are ordered by execution order, inside test modules.
. there is a show-stopper attribute for each test case, which may cause
interruption of testing for the rest of the test module in case of failure.
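The two assumptions above can be sketched as a minimal module runner (field and function names are hypothetical, for illustration only):

```python
# Sketch: run a test module's cases in execution order; a failing case marked
# show-stopper interrupts the rest of the module. Names are illustrative.
from dataclasses import dataclass
from typing import Callable

@dataclass
class TestCase:
    case_id: str
    run: Callable[[], bool]      # returns True on success
    show_stopper: bool = False   # failure aborts the remaining cases

def run_module(cases):
    results = {}
    for case in cases:           # assumption: list order is execution order
        ok = case.run()
        results[case.case_id] = "pass" if ok else "fail"
        if not ok and case.show_stopper:
            break                # skip the rest of the module
    return results

module = [
    TestCase("urn:semreq:id:1", lambda: True),
    TestCase("urn:semreq:id:2", lambda: False, show_stopper=True),
    TestCase("urn:semreq:id:3", lambda: True),
]
print(run_module(module))   # id:3 never runs
```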

- We need to define a set of CPAs for a test suite.

 
2. Common issues for MS conformance and interoperability Test Suites (Thu PM):

- We mostly reviewed the test case material as illustrated by Mike's draft
of test cases (~70 cases drafted).

- On the Conformance test suite, no final decision on conformance levels / profiles.
Jeff T. will make a proposal. Jacques suggested we use a minimal set of MS conformance
levels, and pointed to the IIC vote last November on only 2 levels.

- The Test Case material still needs some work in following areas:

A. [Matt]: specify the CPA subset used. Which format should we pick?
So far we have two candidates: tpaSample.xml and minicpa.xml (from Hatem).
(Quick comments on tpaSample.xml:
- SyncReplyMode options missing
- what is the distinction between Ack "expected" and "requested"?)

B. [Jeff]: we need to finalize the message template data, in particular
- the way we parameterize these templates (XPath?)
- the way we build complete MIME envelopes and their content (either
using a template approach again, restrictive but simple, or some other
document-building approach).

C. [Mike, Monica?] mapping of Test Cases to Test Assertions.
Can we really assume that there is always 1 test case for each test assertion?
I am sure that is the case for 98% of them, but it would be prudent not to preclude
the possibility of more than 1 test case for an assertion. A test case is always
more concrete than an assertion; could there be situations where it makes sense to
have two or more tests for the same assertion that we would not split?
My question is in fact: do we really have to decide on this, or can we adopt a
Test Case ID scheme that allows for this if we need it later?
It can be the same as the current assertion ID (e.g. urn:semreq:id:3), and in case we have 1-to-n,
we can use additional letters (e.g. urn:semreq:id:3a, urn:semreq:id:3b, ...)
or dot numbering (urn:semreq:id:3.1, urn:semreq:id:3.2...).
Would that be an issue?
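The letter-suffix convention suggested above can be sketched in a few lines (the function name is hypothetical; only the urn:semreq:id scheme comes from the minutes):

```python
# Sketch: derive 1-to-n test case IDs from an assertion ID, using the
# letter-suffix convention (urn:semreq:id:3a, urn:semreq:id:3b, ...).
import string

def case_ids(assertion_id: str, n: int):
    """Return the assertion ID itself for n == 1, else lettered variants."""
    if n == 1:
        return [assertion_id]
    return [assertion_id + string.ascii_lowercase[i] for i in range(n)]

print(case_ids("urn:semreq:id:3", 1))  # ['urn:semreq:id:3']
print(case_ids("urn:semreq:id:3", 2))  # ['urn:semreq:id:3a', 'urn:semreq:id:3b']
```

Because the 1-to-1 case keeps the plain assertion ID, existing references stay valid if the scheme is adopted later.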

D. Test Case table formatting [Mike,...]:
- Test Case ID field: see the above remarks on numbering. (By the way, why "semreq"?)
- "Action Element" field: we could use more intuitive "step names",
e.g. for the sending of a message: "SendMessage" instead of "SetMessage".
- Also, I strongly suggest that we make the "verification" of the test a separate
and final step (it could be called "Verification").
- "Party" field: probably not needed, as it is always the TestDriver, per our
definition of what a "step" is: an event that is always observable in the TestDriver.
- "ErrorStatus" field needs revision. See "Test failures" below.
- ErrorMessage: one for each step is fine.
- "XPath" field: let us use a better name... it should be more general,
like "message expression" or something like that.

E. XPath / message expressions [Mike, Matt]:
- Some XPath expressions are for building message material ("SetMessage" action),
some are for expressing "filters" to select the right message (GetMessage).
It would be good to distinguish them in syntax, e.g. the assignment operator "="
could be distinguished from the equality operator, as in programming languages (e.g. "==").
- GetMessage steps should not be aggregated with the final Verification
condition: GetMessage only contains filters to select the right message.
- The final step (or Verification) will contain the boolean expression
that defines success (currently merged with the "filter" expression of the GetMessage step
in the current draft).
- Use of parameters ($MessageId, etc.): it seems that these parameters sometimes need
to be set from current (e.g. received) material. It is not clear how that is done (see Case id:3).
We face two issues:
(a) how to "remember" message material from past test steps?
We could use XPath-based assignment, e.g. a GetMessage could contain filter
expressions as well as assignment expressions, e.g. $MessageId = <xpath expr>
(b) across several steps, as several messages are involved, and we may want to
refer to material from more than 1 step, we can use the step # to identify the parameter:
$1MessageId, $2MessageId...
- Advanced verification conditions: sometimes verification conditions need more
than just constraints on message material, e.g. checking that step N completed
within 10 sec of step M. In any case, it seems we need to set a timeout for step completion.
What else? How can we improve the script language for this? When it comes to checking
that we got, say, 3 messages of a kind (e.g. for retries in reliability),
could that be an enhancement of the GetMessage step (where we would specify how
many messages of this kind need to be received for the step to complete)?
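The "=" vs "==" distinction and the step-indexed parameters suggested above can be sketched as a toy expression classifier (the script syntax here is entirely hypothetical, since the language is still under discussion):

```python
# Sketch: classify script expressions by operator, per the suggestion above:
# "=" assigns message material to a parameter, "==" filters/compares.
# Step-indexed parameters like $2MessageId are also parsed. Syntax is
# hypothetical, not the test framework's defined script language.
import re

PARAM = re.compile(r"\$(\d+)?([A-Za-z]\w*)")

def classify(expr: str) -> str:
    # Check "==" first so a comparison is not mistaken for an assignment.
    return "filter" if "==" in expr else "assignment" if "=" in expr else "other"

def params(expr: str):
    """Return (step, name) pairs; step is None when no step prefix is given."""
    return [(int(s) if s else None, name) for s, name in PARAM.findall(expr)]

print(classify("$MessageId = //MessageId/text()"))   # assignment
print(classify("//RefToMessageId == $1MessageId"))   # filter
print(params("//RefToMessageId == $1MessageId"))     # [(1, 'MessageId')]
```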


F. Test Verification and Test failures [Mike, Matt]:
- A separate step for this would be good, as mentioned above.
- Sometimes a successful test needs to verify that no error message was received,
in addition to completing all its steps. How do we do that? Should we define
"Exception step(s)" for a test case, that would capture messages that should NOT occur...
and then, when completed, generate a test failure?
- It is important to distinguish two types of failure for a Test Case:
(a) "operation" failure, resulting from the impossibility of carrying out the test properly,
e.g. some test step could not complete, for some reason unrelated to the spec requirements
that we are trying to test.
Typically, this happens when the Test Requirement "pre-condition" cannot be realized.
In such a case, the conformance report should NOT conclude that the MSH implementation
is not conforming, just that the test could not be performed.
(b) "conformance" failure, clearly showing that the spec requirement is not satisfied
by the MSH implementation.
Generally, failures of type (a) correspond to some step that could not be completed.
So we could associate either type of error with each step: (1) failure causing
an "operation" failure, (2) failure causing a "conformance" failure.
- Should we also make room for a "failure" expression in the verification step?
In other words, in case the "success" expression is not satisfied, we may
still need to distinguish the kind of test failure. A specific error message
could be associated with each kind.
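The per-step failure classification proposed above can be sketched as follows (the step/verdict model is illustrative only, not the TC's reporting format):

```python
# Sketch: distinguish "operation" failures (test could not be carried out)
# from "conformance" failures (spec requirement not satisfied). Each step
# carries the kind of failure its non-completion would cause, as proposed.
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    failure_kind: str   # "operation" or "conformance", if this step fails

def verdict(steps, failed_step=None):
    """Return the test case verdict given the first failed step (or None)."""
    if failed_step is None:
        return "pass"
    kind = next(s.failure_kind for s in steps if s.name == failed_step)
    # An operation failure means "test not performed", not "non-conforming".
    return "not performed" if kind == "operation" else "non-conforming"

steps = [Step("SetMessage", "operation"), Step("Verification", "conformance")]
print(verdict(steps))                  # pass
print(verdict(steps, "SetMessage"))    # not performed
print(verdict(steps, "Verification"))  # non-conforming
```

A per-kind error message, as suggested above, could be attached to each step alongside failure_kind.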


3. Interoperability (Friday morning and early PM)

- We decided to focus on interoperability tests and the notion of profiles.
Steve has drafted a set of basic Interop test cases. We reviewed them,
and ended up with 7 tests for basic interoperability.
- Additional tests on top of these deal with Ack testing. These still need work
to see if we can do them without "sniffing" on the wire, or without specific
MSH logging capability (not required by the spec).
- Signed message testing needs to be broken into two cases:
(1) message-embedded public key, (2) key in a certificate.
- The single-payload simple exchange needs to be done with and without SyncReply.
- We ended up with a preliminary set of test cases:

1.1  BasicExchange [2.1.4, MSIP1]: Basic message exchange, no payload. Success: receive response.
1.2  BasicSinglePayload [2.1.4, MSIP1]: Basic message exchange, single payload. Success: payload comes back, payload integrity.
1.3  BasicMultiPayload [2.1.4, MSIP1]: Basic message exchange, multiple payloads, including binary payload. Same as before, except 3 files: XML, EDI, binary.
1.4  BasicErrorDelivery [MSIP1]: Ensure that error messages are properly delivered.
1.5  SignedMessageKeyInfo [4.1.1, MSIP1]: Message exchange with digital signature. Expect signed response? Or just use dummy.
1.6  SignedMessageWOKeyInfo [4.1.1, MSIP1]: Digital signature without key info.
1.7  SyncReply [MSIP1]: Basic message exchange with sync reply. Same as 1.2, except with sync reply.
2.1  UnsignedDataUnsignedAck [6.5.3, MSIP2]
2.2  UnsignedDataSignedAck [6.5.3, MSIP2]
2.3  SignedDataUnsignedAck [6.5.3, MSIP2]
2.4  SignedDataSignedAck [6.5.3, MSIP2]
2.5  UnsignedDataUnsignedAckSyncReply [6.5.3, MSIP2]: Same as 2.1, except with sync reply.
2.6  OnceAndOnlyOnce [6.5.6, MSIP2]: Positive test of RM and Ack. Send to dummy, expect response.
2.8  DuplicationDetection [6.5.6, MSIP2]: Duplicate message send. Test of at-most-once messaging.
3.1  MultipleMessageOrder [9.1, MSIP3]: Has to be async mode.
3.2  MessageOutOfSequence [9.1, MSIP3]
4.1  PingPong [8.1, MSIP4]
4.2  MessageStatus [7.1, MSIP4]
 -   LargeFileTransfer [MSIPE]
 -   EncryptedFile [MSIPE]
- Jacques suggests we reduce the number of "basic interop profiles", avoiding layering
the tests too much.

Action Item: [Steve] specify the CPA related to these, and describe the test cases
and material.

Action Item: [Hatem, Jacques] compare with DGI tests, see if we can package an 
interop test suite that is close to UCC/DGI tests.

The meeting was adjourned at 3pm.



Jacques Durand
ebXML IIC chair

