oiic-formation-discuss message



Subject: Re: [oiic-formation-discuss] What are we validating and why?



"Dave Pawson" <dave.pawson@gmail.com> wrote on 06/06/2008 03:00:19 PM:

> 2. If required it should be in the spec.
>
> 3. I object to this group talking about scripting 'glue'; that's very clearly
> a 'how'.
> At most we might spec that:
> Tests shall seek visual comparison wrt all paras requiring it in 1.2
> and specify how comparison is to be made objectively.
>
> If we start to state 'how' tests are linked, then we're doomed.
> You use xyz glue, I use Java, someone else uses ant.
> I think we can identify requirements (perhaps numbered)
> against which a test specification can be mapped.
> So if Rob implements it, I can go from this group's output (3.4) to
> a list of test requirements (3.4-1 to 3.4-59) through to absolute
> test numbers 567-776. That would complete the test matrix.
>
> We just need to imagine the mapping.
> What shall be tested
> What may be tested
> What output is expected (hopefully mostly objective!)
> Any subjective tests might lie in the nice to have domain.

In ISO lingo we have an "interoperability assessment methodology", which is defined as:

"An assessment methodology for interoperability may include the
specification of some or all of the following: terminology, basic
concepts, requirements and guidance concerning test methods, the
appropriate depth of testing, test specification and means of testing,
and requirements and guidance concerning the operation of assessment
services and the presentation of results. In technical areas where
there is a conformity assessment methodology and an interoperability
assessment methodology, the relationship between them must be
specified."


I hope this aligns with what you called "meta requirements".

I agree that, as an OASIS TC deliverable, our formal deliverables need to be documents.  However, like any other OASIS deliverable, member implementations are highly desired, and in the case of OASIS Standards, successful uses of the standard by members are required.  So although the OASIS label would be on the "interoperability assessment methodology" document itself, I would not be surprised if some TC members cooperated on an open source implementation of that document as well.

We should also step back and reflect on the difference between conformance testing and interoperability testing.  For conformance, we are limited to what is in the ODF standard.  We cannot add to, remove from, or change its definition of conformance.  If it says nothing about how a particular combination of elements is rendered, then we cannot require a specific rendering in our conformance assessment.  I think we will find that most rendering and runtime behaviors are outside the scope of what ODF defines.

However, from the interoperability standpoint -- and I'd like the TC's charter to be broad enough to encompass this -- there is nothing wrong with a set of test cases that exercise all of the rendering primitives of ODF.  These could be used by a vendor to self-test, to see which primitives have actually been implemented and which are lacking.  Note that I'm using the word "test" rather loosely here.  If the drawing primitive "means" to draw a quarter circle in the upper left quadrant with green fill and a red dotted edge, then one thing we can do is have a test case called circle_1.odt containing that drawing, with a caption that says "quarter circle in the upper left quadrant with green fill and a red dotted edge".  A vendor could self-test with this test case.  There is no standard-mandated meaning for that test file.  But a vendor can surely judge for themselves whether the rendering is what they intend.
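A vendor-side self-test harness along those lines could be as simple as a manifest pairing each test document with the prose caption describing the intended rendering.  A hedged sketch in Python; the file name comes from the circle_1.odt example above, and the reporting helper is purely hypothetical:

```python
# Illustrative self-test manifest: each entry pairs an ODF test document with a
# human-readable caption describing the intended rendering.  The visual
# comparison is left to the vendor's own judgment, as described above --
# nothing here is mandated by the ODF standard.

rendering_tests = [
    ("circle_1.odt",
     "quarter circle in the upper left quadrant with green fill and a red dotted edge"),
    # ... further drawing-primitive cases would follow the same pattern
]

def self_test_report(results):
    """Given {filename: passed_visual_check}, list the primitives still lacking."""
    lacking = [name for name, caption in rendering_tests
               if not results.get(name, False)]
    return {"total": len(rendering_tests), "lacking": lacking}

print(self_test_report({"circle_1.odt": True}))  # -> {'total': 1, 'lacking': []}
```

Because the pass/fail judgment is the vendor's, this stays on the interoperability side of the line: it measures practical coverage, not conformance.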

So conformance tests check conformance with the standard.  They are of interest to purchasing agents as well as vendors.  Interoperability tests are used by a group of vendors to improve practical interoperability.  The value proposition of ODF increases to the extent practical interoperability increases.

-Rob
