Subject: Re: [oiic-formation-discuss] What are we validating and why?


2008/6/6  <robert_weir@us.ibm.com>:
>
> "Dave Pawson" <dave.pawson@gmail.com> wrote on 06/06/2008 03:00:19 PM:

>> We just need to imagine the mapping.
>> What shall be tested
>> What may be tested
>> What output is expected (hopefully mostly objective!)
>> Any subjective tests might lie in the nice to have domain.
>
> In ISO lingo we have an "interoperability assessment methodology" which is
> defined as:
>
> "An assessment methodology for interoperability may include the
> specification of some or all of the following: terminology, basic
> concepts, requirements and guidance concerning test methods, the
> appropriate depth of testing, test specification and means of testing,
> and requirements and guidance concerning the operation of assessment
> services and the presentation of results. In technical areas where
> there is a conformity assessment methodology and an interoperability
> assessment methodology, the relationship between them must be
> specified."

I prefer the non-ISO lingo. In particular, 'means of testing'? Surely
that's telling an implementer 'how' to do it?

The relationship between conformity and interoperability assessment has
me confused; I don't yet understand how the two are meant to relate.



>
> I hope this aligns with what you called "meta requirements".

Rather broader than mine, perhaps?


>
> I agree that, as an OASIS TC deliverable, our formal deliverables need to be
> documents.  However, like any other OASIS deliverable, member
> implementations are highly desired, and in the case of OASIS Standards,
> successful uses of the standard by members are required.  So although the
> OASIS label would be on the "interoperability assessment methodology"
> document itself, I would not be surprised if some TC members cooperated on
> an open source implementation of that document as well.

I'd hope so; that's possibly the best way to go. The document would be a
waste of time if no one took it up for implementation.


>
> We should also step back and reflect on the difference between conformance
> testing and interoperability testing.  For conformance, we are limited to
> what is in the ODF standard.  We cannot add or remove or change its
> definition of conformance.  If it says nothing about how particular
> combinations of elements are rendered, then we cannot require a specific
> rendering in our conformance assessment.  I think we will find that most
> rendering and runtime behaviors are outside the scope of what ODF defines.

Sorry, that doesn't help me understand interop, Rob. My gut feeling is
that two implementations that 'are conformant' will by default be
interoperable, but that may be too loose.

Rendering, I agree, is hard and unlikely to be included. On behaviours
I'd need to consult the spec. Could user interaction tend towards
conformance?


>
> However, from the interoperability standpoint -- and I'd like the TC's
> charter to be broad enough to encompass this -- there is nothing wrong with
> a set of test cases that exercise all of the rendering primitives of ODF.

Only if the tests can be objective, for some definition of testable?
I'd like to see an example of this from the spec, or something that could
be written as a test against ODF rather than a user scenario, such as the
NIH one (I'm sure that's not 'not invented here') about min font size etc.
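
For instance, the min-font-size scenario could be recast as an objective
check on the markup itself rather than on rendering. A rough sketch,
where the 8pt threshold and the choice of streams are my assumptions,
not anything taken from the spec:

    import zipfile
    from lxml import etree

    # The fo: namespace as declared by OpenDocument.
    FO_NS = "urn:oasis:names:tc:opendocument:xmlns:xsl-fo-compatible:1.0"

    def small_font_sizes(odf_path, min_pt=8.0):
        # List every declared fo:font-size (given in pt) below the threshold.
        offenders = []
        with zipfile.ZipFile(odf_path) as pkg:
            for stream in ("content.xml", "styles.xml"):
                root = etree.parse(pkg.open(stream)).getroot()
                for el in root.iter():
                    size = el.get("{%s}font-size" % FO_NS)
                    if size and size.endswith("pt") and float(size[:-2]) < min_pt:
                        offenders.append((stream, el.tag, size))
        return offenders

A test like that runs unattended and gives an unambiguous answer, which
is what I mean by objective.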



>  These could be used by a vendor to self-test to see which ones have
> actually been implemented and which ones are lacking.  Note that I'm using
> the word "test" rather loosely here.  If the drawing primitive "means" to
> draw a quarter circle in the upper left quadrant with green fill and a red
> dotted edge, then one thing we can do is have a test case called
> circle_1.odt containing that drawing, and a caption that says "quarter
> circle in the upper left quadrant with green fill and a red dotted edge".
> A vendor could self-test with this test case.  There is no
> standard-mandated meaning
> for that test file.  But a vendor can surely judge for themselves whether
> the rendering is what they intend.

Shades of gray, Rob? Perhaps when printed out on size X paper,
do we measure line thickness against some colour standard, etc.?
I agree we could have a suite of subjective (user-assessed) tests
such as this one, which would be very useful as you note, though
I'd prefer these were grouped into the non-automated section(s).

Point: are we expecting a suite of fully automated tests which
will run and give a black-and-white pass|fail?
Should individual tests be numbered or otherwise identified?
Should all tests be traceable back to the requirements spec
and to the ODF spec?
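
If the answer to the last two questions is yes, each test might carry a
record along these lines. A sketch only; the field names and the
"REQ-..."/clause identifiers are invented for illustration:

    # Illustrative only: none of these identifiers has been agreed by the TC.
    TEST_REGISTRY = [
        {
            "id": "ODF-TC-0042",             # stable, citable test number
            "kind": "automated",             # or "user-assessed" for rendering
            "odf_clause": "ODF 1.1, 9.2.1",  # traceability to the ODF spec
            "requirement": "REQ-REND-007",   # traceability to our requirements doc
            "input": "circle_1.odt",
            "verdict": "pass|fail from the test harness",
        },
    ]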



>
> So conformance tests check conformance with the standard.  They are of
> interest to purchasing agents as well as vendors.  Interoperability tests
> are used by a group of vendors to improve practical interoperability.

Is that the basis of our output, the interop requirements? To assist
vendors in assessing (how do we say this without using the word
interop :-) how an ODF instance is used and presented in their product,
one with respect to another?



> The value proposition of ODF increases to the extent practical
> interoperability increases.


+1

regards

-- 
Dave Pawson
XSLT XSL-FO FAQ.
http://www.dpawson.co.uk

