Subject: Re: 1_1 comments
Monica,
Thanks for your comments re: the Test Framework errata sheet. Here is my belated response (I had the flu this week).
Thanks,
Mike
mm1: Comments on Errata document:
Scripting Parameters: Several scripting parameters seem to indicate that something can be customized, which is in line with our discussions today about distinguishing the implementation from the specification of the framework. This raises the question of whether an entity may allow specialization. For example, an industry may not allow a test suite to change the message structure or context, because it defines an enumeration set or has a fixed set of minimum required elements, and it may wish to enforce that rigor on any testing of a constrained structure. Do we need an overall flag that allows or disallows this specialization, or should we just assume maximum flexibility? Is this part of the configuration group?

[MIKE] - If we want a truly "flexible" framework, then constraints would be defined by the industry (i.e. they define their own "guidelines" for defining their configuration). We have done that with ebXML Messaging (defining configuration parameters, message declaration syntax). RosettaNet could do the same.
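If an overall flag were adopted, the gating logic could be very small. The Python sketch below is purely illustrative: the `allow_specialization` flag, the configuration-group shape, and the element names are all hypothetical, not taken from the specification.

```python
# Hypothetical industry configuration group; the "allow_specialization"
# flag and the fixed element set are illustrative, not from the spec.
INDUSTRY_CONFIG = {
    "allow_specialization": False,
    "required_elements": {"MessageId", "Timestamp", "PartyId"},
}

def accept_customization(config, removed_elements):
    """Reject a test-suite customization that drops required elements,
    unless the industry configuration explicitly allows specialization."""
    if config["allow_specialization"]:
        return True
    return not (set(removed_elements) & config["required_elements"])
```

With that flag off, a test suite could still trim optional elements but not the industry's fixed minimum set.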
#12: Can we get more details on the Block and Release test operations specified by Woo? This looks like constraints on the use of synch/asynch (?).

[MIKE] - I have not heard from Woo on this, but have proceeded along the lines that this is a synch/asynch issue, and have modified the ebTest.xsd schema to reflect a possible solution (also using Jacques' input regarding exception threads).
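Since Block/Release is still undefined, here is one possible reading of it as a synch/asynch mechanism, sketched in Python: a Block suspends the test thread until an expected asynchronous message (or an exception thread) releases it. The class and method names are hypothetical, not the spec's.

```python
import threading

class TestStep:
    """One possible interpretation of Block/Release (not the spec's):
    block the test thread until an asynchronous event releases it."""

    def __init__(self):
        self._released = threading.Event()
        self.payload = None

    def block(self, timeout_s=5.0):
        """Suspend until release() is called; return the async payload."""
        if not self._released.wait(timeout_s):
            raise TimeoutError("no Release before timeout")
        return self.payload

    def release(self, payload):
        """Called from the receiving (or exception) thread."""
        self.payload = payload
        self._released.set()
```

An exception thread, per Jacques' input, would simply call `release()` with an error payload instead of the expected message.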
General: Separating the communication between the Test Driver <--> Test Service may help isolate what happens in the test (exercising of requirements) vs. what happens to support the test (communication between the Test Driver and Test Service). However, I believe the lines will get quite gray as we move into business process territory.

[MIKE] - I interpreted this issue as one of separating "the implementation being tested" from the testing infrastructure (i.e. don't use ebXML Messaging, which is what you are testing, to communicate between the Test Driver and Test Service). I agree with this.
#13 Branching: Should support the minimum seen in BPSS:
1. either/or
2. and
3. concurrency (back to a particular join)
4. parallel (exceptions and message response, for example), where both are valid to start
This brings up two interesting points about nesting, which we could see in BPSS, where we may not be able to avoid the compositional aspects of the testing we discussed on the 12/22 call:
a. with compound binary collaborations
b. with the possibility of composition based on other than dependencies

[MIKE] - I included a proposed XML syntax to define this logic in my latest release of the modified Test Framework and errata sheet.

b-1: Differentiate outer-inner from an outer handoff to an inner, and an inner that is then independent [1].

[MIKE] - I am not familiar with these BPSS scenarios, and need input as to whether the modified schema will support this.
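Whatever XML syntax ends up expressing it, cases 3 and 4 above reduce to familiar control flow: start branches concurrently and join before continuing. A rough Python illustration (the branch functions and names are hypothetical, not the proposed schema):

```python
import threading

def run_parallel(branches):
    """Start all branches concurrently and join them before continuing;
    roughly the "concurrency back to a particular join" case in BPSS."""
    results = {}

    def runner(name, fn):
        results[name] = fn()   # each branch records its own outcome

    threads = [threading.Thread(target=runner, args=(name, fn))
               for name, fn in branches.items()]
    for t in threads:
        t.start()
    for t in threads:
        t.join()               # the join point the branching logic needs
    return results
```

The either/or and "and" cases are then ordinary conditionals and sequencing around such a join.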
[1] May equate to a pre-condition that must be true before another activity starts. This is beginning to look like another ebXML set of specifications. :>)

#16: Does this nesting of test requirements within test requirements also speak of implementation or test suite functionality, not of our specification? Where do we draw the line on 'packaging'? We can provide maximum flexibility but still need some rigor in our framework. I see value in allowing the capability of associating a given set of functional and test requirements [2].

[MIKE] - Nesting of requirements provides a more "structured" way to define Test Requirement and Functional Requirement dependency (rather than just "referencing" the ID of another requirement as a "prerequisite"). It would require the Test Driver to work "bottom-up" regarding which Test Requirements (and Test Cases) to execute first, since those "child" Test Requirements (and Functional Requirements) must first be satisfied before the "parent" Test Requirements can be declared satisfied... so yes, this is an implementation issue for a Test Driver that is not currently defined in the specification.
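The "bottom-up" behavior described above amounts to a post-order walk of the requirement tree: children are executed and evaluated first, and a parent is declared satisfied only if all its children are and its own test passes. A minimal Python sketch (the `Requirement` shape is hypothetical, not the ebTest.xsd structure):

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """Hypothetical nested Test/Functional Requirement; a parent
    cannot be satisfied unless all of its children are."""
    rid: str
    passes: bool                      # result of this requirement's own test case(s)
    children: list = field(default_factory=list)

def satisfied(req, order):
    """Post-order ("bottom-up") evaluation: visit children first,
    recording execution order as a Test Driver would schedule it."""
    child_results = [satisfied(c, order) for c in req.children]
    order.append(req.rid)             # parent is reached only after its children
    return all(child_results) and req.passes
```

A failing child thus blocks the parent from being declared satisfied even when the parent's own test case passes.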
[2] Note, this could be M-M, which gets back to my packaging question.

#23 Plug-ins: I would encourage you to allow an 'any' contentType and allow user-defined specification. I am still concerned about any reference to a vendor-specific tool. If we allow the user-defined abstraction, Schematron could be a best practice defined and held in a separate guideline outside of the specifications.

[MIKE] - I agree. Schematron is not yet a standard... and XUpdate is also not a standard.
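A user-defined 'any' contentType could reduce to a simple dispatch table, with Schematron (or XUpdate) as just one registered entry described in a separate best-practice guideline. The registry below is a hypothetical sketch, not anything mandated by the Test Framework.

```python
# Hypothetical plug-in registry keyed by a user-defined contentType;
# nothing here is mandated by the Test Framework specification.
VALIDATORS = {}

def register(content_type):
    """Decorator: associate a validator plug-in with a contentType."""
    def wrap(fn):
        VALIDATORS[content_type] = fn
        return fn
    return wrap

def validate(content_type, payload):
    """Dispatch to a user-defined plug-in, falling back to 'any'."""
    fn = VALIDATORS.get(content_type, VALIDATORS["any"])
    return fn(payload)

@register("any")
def accept_all(payload):
    """Default plug-in: the permissive 'any' contentType."""
    return True
```

A vendor tool then becomes one more registered contentType, outside the spec itself.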
#24 and #25 Payload Integrity: Do we lose any community if we don't allow another 'any' selection for other than XMLDSIG? See suggestion in #23.

[MIKE] - I do not know the answer to this question. Comments?
#26 Remote Test Driver: I can understand the concern that you want to separate pre-test configuration from test execution. How this is accomplished is an implementation detail IMHO, although we should allow its specification. Same comment on #29.

[MIKE] - I agree that it should be specified, but we should not explicitly define HOW it is to be implemented. Specifying the format/schema of the configuration information for a Test Driver, without specifying how that driver loads the information, would seem to be sufficient.
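In other words, the spec would pin down the shape of the configuration, not the loading mechanism. In Python terms (the field names and JSON format are illustrative assumptions, not from the spec):

```python
import json
from dataclasses import dataclass

@dataclass
class DriverConfig:
    """Specified format: WHAT a Test Driver must be told.
    The field names here are illustrative, not from the spec."""
    endpoint_url: str
    timeout_s: float

def load_config(text):
    """One of many possible loaders (file, registry, remote push, ...);
    HOW the driver obtains `text` is left to the implementation."""
    raw = json.loads(text)
    return DriverConfig(endpoint_url=raw["endpoint_url"],
                        timeout_s=float(raw["timeout_s"]))
```

Two conformant drivers could load the identical configuration document through entirely different mechanisms.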
#27 Errors: This might be difficult, because we get closer and closer to binding to applications as we define these errors. Also, when we do BPSS, you have exceptions that may result in errors (in messaging). See suggestion in #23. I would indicate only what errors may occur between the Test Driver and Test Service (not as a by-product of the testing itself, but within the test component infrastructure).

[MIKE] - Agreed.
#30 API Definitions: Need more information on the #30 API request. I can see the need for #31, as it is an important part of the test framework functionality, although I am uncertain about #30. Should there be a minimum set of guidelines to provide some structure around an API without implementing it? This borders on our implementation/design discussion of 12/22.

[MIKE] - It sounds like what is needed is a "formalized" expression of the calls and data to be passed between Test Driver <--> Test Service <--> MSH, rather than the current "narrative" description of this interface in the Test Framework specification. What we are talking about here is "interoperability mode", and what an implementer needs to do to bind their Test Driver to their MSH implementation.