Subject: Best Practice


Greetings

I've started drawing together material from the Wiki with potential
as best practice tips, but I couldn't see how to add a new wiki page.

There is much still to do to edit this, but for now would someone
mind putting this up on the wiki as a 'Best Practice' page?

Below is what I have so far. It includes some questions, some of
which I think we still need to answer, so they are included as
placeholders.

Regards and thanks

-- 
Stephen Green

Partner
SystML, http://www.systml.co.uk
Tel: +44 (0) 117 9541606

http://www.biblegateway.com/passage/?search=matthew+22:37 .. and voice


----------------------------------------------------------------------


BEST PRACTICE
-------------


GENERAL BEST PRACTICE

TAs should balance having the right level of abstraction with being  
precise enough ... that their implementation in test cases is  
unambiguous


TEST ENVIRONMENT CONSIDERATIONS

Question:
how much of the assumed underlying test environment (identify  
implementations under test, their operations, etc.) must be described?
Answers:
the TA will likely be more specific than the specification and refer  
to an IUT (impl under test) behavior... This focus on the IUT really  
makes the difference in wording the TA

focus more on how to test rather than on trying to figure out what to test

I don't believe that the assertions should make any reference to a  
test harness. An assertion should describe (in precise, normative  
language) what should be tested, and not how.

in some cases, the behavior under test depends on some specific  
context of operation

an abstract, high-level test harness: this harness could include all
required definitions and the relationships between them, but no
behavioral elements. Definitions of concrete TAs, IUTs, etc., written
using this "test environment model", could then be easily converted
into any (or at least most) concrete test harnesses.
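
Purely to illustrate the idea of such a "test environment model" (a
rough Python sketch; all names here are invented for the example and
are not taken from any TAG material), the model would hold only
definitions and the relationships between them, with no behavioral
elements:

  # Illustrative sketch only: a purely declarative test environment model.
  from dataclasses import dataclass, field
  from typing import List, Tuple

  @dataclass
  class IUTDefinition:              # an implementation under test, declaratively
      name: str
      operations: List[str] = field(default_factory=list)

  @dataclass
  class TestEnvironmentModel:
      iuts: List[IUTDefinition] = field(default_factory=list)
      # relationships between definitions, e.g. ("client", "sends-to", "server")
      relationships: List[Tuple[str, str, str]] = field(default_factory=list)

  model = TestEnvironmentModel(
      iuts=[IUTDefinition("client", ["sendMessage"]),
            IUTDefinition("server", ["receiveMessage"])],
      relationships=[("client", "sends-to", "server")],
  )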

Related Question:
Should a TA include procedure data for related Test Cases?

Some assumptions can/must be made on the test environment, yet should  
not preempt decisions made by test suite writers



COVERAGE

guaranteeing correctness or coverage of TAs?

coverage measurement: make it easier ... to quantify the level of
coverage provided by ... test suites

a TA does NOT have to be fully covered by test cases
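
As a minimal illustration of coverage measurement (a Python sketch
under the assumption, made only for this example, that each test case
records the identifiers of the TAs it exercises), coverage could be
quantified as the fraction of TAs exercised by at least one test
case:

  # Sketch: TA coverage as the fraction of assertions exercised by tests.
  def ta_coverage(ta_ids, test_cases):
      """test_cases: iterable of (test_name, ids_of_TAs_the_test_exercises)."""
      covered = set()
      for _name, covered_tas in test_cases:
          covered |= set(covered_tas)
      ta_ids = set(ta_ids)
      return len(covered & ta_ids) / len(ta_ids) if ta_ids else 0.0

  print(ta_coverage(["TA-1", "TA-2", "TA-3"],
                    [("test_send", {"TA-1"}), ("test_ack", {"TA-1", "TA-3"})]))
  # -> 0.666..., i.e. TA-2 is not yet exercised by any test case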



VALUE

writing TAs related to a specification helps uncover defects and
gaps, especially around corner cases and exception conditions

the earlier TAs are written in the specification process, the better

unlike test cases, test assertions can be written quite early on - by  
the spec team itself - as none of the implementation details found in  
a test case are needed

a set of TAs is the best starting point for a testing team to write a  
test suite for a specification

writing TAs as above improves quality and time-to-deployment of specifications

benefits for spec writers to also write the TAs: better quality of
the specification, and it later helps the team writing the test
cases, as these may not be expert enough in the specification domain
and in the importance of features to be tested



SO WHO WRITES TAs?

Question:
Who will write TAs: test script developer, Tech Committee, software
vendor, or other?

help specification writers write the Test Assertions themselves,
rather than the test case writers

standards bodies can enforce the presence of TAs in specifications
if they want (e.g. OASIS has just mandated the introduction of a
conformance clause in all specs)



RELATIONSHIP TO CONFORMANCE

OASIS recently mandated the presence of a conformance clause in its
future standards. TAs are the most precise way to define a
conformance profile or level, while abstracting away from
implementation details.

What is the difference between a test assertion and a conformance
rule? Not much.

conformance rules need to be precise in referencing the XML Schemas,
etc., which have to be complied with

TAs are not restricted to conformance testing



SIMPLER TAs

Approach aimed at human-readable documentation of TAs

  * Prose TA either copied exactly from a specification (literal),  
reworded (derived) or referenced

the "prose" component (called ... "assertion prose") is optional but
sufficient when chosen, so that no further TA component is needed.
It can be the same as the spec requirement when there is no ambiguity
about the way this is tested, or when there is already a generally
agreed test process for this kind of spec statement.

In many cases, there is no need to use a sophisticated TA structure

The simplest possible test assertion is a normative statement that is  
either contained within or derived from the specification, together  
with an identifier.

it should be possible to identify, to a "useful" level of
granularity, the location within the spec where the statement may be
found or from which it has been derived.

group such associations in a separate formal document

markup both for specs (to enable the identification of normative  
statements from which assertions can be derived) and also for  
"standalone assertion lists". Of course, given the former, the latter  
can be automatically generated.

We know there are different objectives for TAs: (a) improve quality of  
specifications, (b) help test case development. Some users are only  
interested in (a)

mandatory components: TA id, spec reference, item under test
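
To make the mandatory components concrete, here is a rough sketch
(Python, purely illustrative; the field names and the example
statement are invented, not drawn from any spec) of the simplest kind
of TA - an identifier, a spec reference, the item under test, and the
assertion prose:

  # Illustrative only: the simplest TA as a small record holding just the
  # mandatory components plus the (literal or derived) assertion prose.
  from dataclasses import dataclass

  @dataclass
  class SimpleTestAssertion:
      ta_id: str           # TA id, e.g. "TA-0001"
      spec_ref: str        # where in the spec the statement comes from
      target: str          # item under test (the IUT or artifact concerned)
      prose: str           # normative statement, literal or derived

  ta = SimpleTestAssertion(
      ta_id="TA-0001",
      spec_ref="ExampleSpec v1.0, section 3.2",   # hypothetical reference
      target="message envelope",
      prose="The envelope contains exactly one header element.",
  )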



KEYWORDS MAY, SHOULD, MUST, ETC

Opinions differ on whether the RFC keywords typical in specifications
should be excluded from or allowed in TAs. However, where the TA
prose is derived, it should be worded in a way which evaluates to a
boolean expression {add example(s)}, and there may need to be an
accompanying statement relating the true or false result to the pass
or fail outcome.

the interpretation of this result will depend on MUST, SHOULD, or
MAY. Even for an optional requirement, the test outcome may be of
interest for interoperability assessment. Also, TAs may be associated
with conformance levels/profiles that may interpret a test result for
a SHOULD differently.
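
As one possible example for the placeholder above (a Python sketch;
the outcome names and the mapping are chosen only for illustration
and are not prescribed anywhere), the true/false result of the TA
expression can be related to a test outcome according to the RFC
keyword of the underlying requirement:

  # Sketch: relate the boolean result of a TA's expression to a test outcome,
  # depending on the RFC 2119 keyword of the requirement. The outcome names
  # ("pass", "fail", "warning", "info") are illustrative only.
  def outcome(predicate_result: bool, keyword: str) -> str:
      if keyword == "MUST":
          return "pass" if predicate_result else "fail"
      if keyword == "SHOULD":
          # a failed SHOULD might be reported as a warning rather than a failure
          return "pass" if predicate_result else "warning"
      if keyword == "MAY":
          # optional behavior: record the result for interoperability assessment
          return "info"
      raise ValueError(f"unexpected keyword: {keyword}")

  # Example: prose "The message MUST contain a timestamp header" becomes the
  # boolean expression "the message contains a timestamp header".
  print(outcome(False, "SHOULD"))   # -> "warning"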



REFERENCING

specification architects and ... implementation guide architects ...  
link test assertions to precisely identified parts of a specification

allow specifications (such as profiles that aid adoption) to
reference other specifications and to be referenced themselves

the simplest case is where the text of the assertion can be found
directly within the spec; there are also more complex cases where the
text of the assertion is derived from the spec.

Specs often reference other specs. It should not be assumed that all  
"specification references" are to the same specification: within a  
single "assertion list" there may be references to multiple  
specifications.

when there are dependencies between specifications (layering,
bindings), a TA should restrict its scope, yet make clear its
assumptions about how much of the referenced spec is supposed to have
been tested beforehand.
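
A small sketch of the referencing idea (Python, illustrative only;
the specification names, versions and section numbers are made up),
showing one assertion list referring to more than one specification,
with a TA noting which referenced material it assumes has already
been tested:

  # Sketch: a single assertion list may reference several specifications.
  from dataclasses import dataclass

  @dataclass
  class SpecRef:
      spec_id: str       # which specification the reference points into
      version: str
      section: str

  referenced_specs = {
      "base": SpecRef("ExampleBaseSpec", "1.0", "section 4.1"),
      "binding": SpecRef("ExampleSOAPBinding", "1.1", "section 2.3"),
  }

  # a TA names the reference it depends on, and can note which parts of the
  # referenced spec are assumed to have been tested already
  ta = {
      "id": "TA-0007",
      "spec_ref": referenced_specs["binding"],
      "assumes_tested": [referenced_specs["base"]],
      "prose": "The binding transmits the envelope defined in the base spec.",
  }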


Question:
How to treat dependencies? Can we always "delegate" the testing of
artifacts for conformance to a referenced specification to another TA?

Question:
what, in a typical specification, ought to be accompanied by a formal  
test assertion?

Question:
which kinds of aspects are not appropriate for test assertions?



ADVANCED, COMPLEX TAs


Approach aimed at TAs which are either formally processed according
to a particular notation or methodology, or which are defined (as
predicates or sets of predicates, using, say, OCL or XMI) so as to be
processable into tests automatically with software tools

  * A predicate - a more formal expression than free-text prose and  
which evaluates to true or false
  * A structure to allow separation of prose and/or predicate  
expressions which describe preconditions for tests, post conditions  
for tests, triggers and outcomes

alternative to assertion prose
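
A rough sketch of such a structure (Python, illustrative only; the
field names, the predicate dialect and the example values are all
invented), separating the prerequisite, the predicate and the
optional prose:

  # Sketch: a structured TA separating prose from more formal components.
  from dataclasses import dataclass
  from typing import Optional

  @dataclass
  class StructuredTestAssertion:
      ta_id: str
      spec_ref: str
      target: str                    # item under test
      prerequisite: Optional[str]    # precondition that qualifies the target
      predicate: str                 # expression evaluating to true/false
      prose: Optional[str] = None    # optional free-text complement

  ta = StructuredTestAssertion(
      ta_id="TA-0102",
      spec_ref="ExampleSpec v1.0, section 5.4",
      target="response message",
      prerequisite="a request message has been received",
      predicate="count(response/header) = 1",   # dialect left unspecified here
      prose="A response to a received request carries exactly one header.",
  )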

Question:
We know there are different objectives for TAs: (a) improve the
quality of specifications, (b) help test case development. Some users
are only interested in (a). In many cases there is no need to use a
sophisticated TA structure - how do we reconcile this with the other
usage, (b), where test case writers need enough guidance on how to
interpret the TA, e.g. more structure? How do we bring flexibility
into the recommended design of a TA (and into subsequent mark-up,
e.g. to avoid redundancies)? How much should be optional?


FORMAL EXPRESSIONS

Question:
Is the purpose of writing TAs to describe specification requirements
more formally?

ADL language used to formally describe a specification and its assertions

In a predicate, specify which dialect is being used (e.g.
@dialect="XQuery", @dialect="OCL", etc.) using metadata

a boolean expression that describes the state of the system  
immediately after the function under test executes
use operators to describe the behavioral relationship within an assertion

one could view TAG assertions as codified logical expressions

Question:
what is considered good style for the logical expression of a TA?

an assertion's logical behavior is expressed as a (potential)
relationship between pre- and post-conditional states (rather than
describing pre- and post-conditions separately, outside of a
specified test behavior)
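
For illustration (a Python sketch; the state representation is
invented for the example), a predicate of this kind relates the
pre-state and the post-state of the system rather than stating two
unconnected conditions:

  # Sketch: a predicate expressed as a relationship between the pre-state and
  # the post-state of the system under test.
  def assertion_holds(pre_state: dict, post_state: dict) -> bool:
      # illustrative behavioral relationship: every request present before the
      # operation has a matching response afterwards
      return all(req in post_state.get("responses", {})
                 for req in pre_state.get("requests", []))

  print(assertion_holds({"requests": ["r1"]}, {"responses": {"r1": "ok"}}))  # True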



AUTOMATION

automatically generating tests from assertions ... and automatically  
generating assertions from "well structured" specifications.

tools such as those which include test assertions themselves in
code, like compiler instructions, etc., would actually be able to
pull in and reuse such conformance tests/rules in some way

any formalism for Specification Analysis will have downstream
effects on tools that may be designed to automate the process of
analysis - as well as on the metrics applied to an analysis

specify this formalism in a pragmatic, unambiguous way, such that
analysis tools may be unambiguous and useful
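
As a toy illustration of this direction (Python; the TA fields and
the generated skeleton format are assumptions made only for the
sketch), a structured TA could be turned mechanically into a test
case skeleton:

  # Sketch: mechanically turn a structured TA into a test case skeleton.
  def test_skeleton(ta: dict) -> str:
      name = ta["id"].lower().replace("-", "_")
      return (
          f"def test_{name}():\n"
          f"    # covers {ta['id']} ({ta['spec_ref']})\n"
          f"    # prerequisite: {ta.get('prerequisite', 'none')}\n"
          f"    # TODO: arrange the prerequisite, then check: {ta['predicate']}\n"
          f"    raise NotImplementedError\n"
      )

  print(test_skeleton({
      "id": "TA-0102",
      "spec_ref": "ExampleSpec v1.0, section 5.4",
      "prerequisite": "a request message has been received",
      "predicate": "count(response/header) = 1",
  }))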


DERIVING

important: identify derivations and keep a trail/log of them


GROUPING

Question:
Is grouping useful, for example for shared pre-conditions, artifacts,
properties, etc., or should each TA have all of its necessary
information locally defined? What is most usable? And, on the other
hand, what is most workable for the specification writer?

grouping by prerequisites

grouping by IUT

grouping by specification modules and categories, e.g. to simplify
assertions of conformance requirements

provide interesting classifications of assertion characterizations by  
considering a larger granularity (and grouping of assertions)
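
A small sketch of grouping by prerequisites (Python, illustrative
only; the TA ids and prerequisite texts are invented), where the
shared prerequisite is stated once per group rather than repeated in
every TA:

  # Sketch: group TAs that share a prerequisite so the shared setup is stated
  # once for the group instead of being repeated in every TA.
  from collections import defaultdict

  tas = [
      {"id": "TA-0201", "prerequisite": "a session has been established"},
      {"id": "TA-0202", "prerequisite": "a session has been established"},
      {"id": "TA-0301", "prerequisite": "no session exists"},
  ]

  groups = defaultdict(list)
  for ta in tas:
      groups[ta["prerequisite"]].append(ta["id"])

  for prereq, ids in groups.items():
      print(f"{prereq}: {', '.join(ids)}")
  # -> a session has been established: TA-0201, TA-0202
  #    no session exists: TA-0301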



SEQUENCING

There may be inherent logic of conditions specified in an assertion,  
based on the order and specification of previous assertions.

sequential grouping of assertions



OTHER

Question:
whether the requirement that a conformant instance be valid against
a given XML schema should itself be stated formally and precisely


Question:
Can we come up with common semantics for pre-requisites in TAs? Some
seem to affect the "qualification" phase (pre-condition), but some
will be needed as part of the evaluation of the test effect (e.g. the
received message must comply with SOAP in addition to complying with
WS-RX, which is SOAP-based). Should a pre-requisite play a "dynamic"
role in the outcome of a TA, or be just informative, so that test
case writers know that this should have been tested prior to
executing this test suite?








