OASIS Mailing List Archives

 



xliff message



Subject: Re: [xliff] Where we stand on conformance, and how to best resolve(RE: XLIFF TC Meeting 16 Nov 2010)


----- "Christian Lieske" <christian.lieske@sap.com> wrote:
> I think the excerpts from the guidelines match pretty well the
> suggestions I made. Thus, I see them as support for the points I was
> trying to make.
> 
> In addition, I am under the impression that we need to clearly
> distinguish between the following facets related to conformance:
> 
> 1. defining conformance
> 2. checking/testing conformance
> 3. claiming conformance
> 
> OASIS requires us to define conformance.
> 
> Having tools (e.g. test suites) that help to check/test conformance
> is usually very valuable. I consider the interoperability tests which
> are required in some contexts to also be valuable types of
> checks/tests.
> 
> Conformance claims which are not justified are risky.

Revisiting your original proposal, I agree more and more, and I see a striking similarity in our thinking. I am not sure we need to define processor conformance at such a granular level (read/write, etc.), but I'm confident we'll get more clarity on that as the specification moves forward. I would also like to point out the following paragraphs on conformance testing from http://xml.coverpages.org/conform20000112.html (NIST):

"Ideally, we would like to be able to prove beyond any doubt that an implementation is correct, consistent, and complete with respect to its specification. However, this is generally impossible for implementations of nontrivial specifications that are written in a natural language.

The alternative is falsification testing, which subjects an implementation to various combinations of legal and illegal inputs, and compares the resulting output to a set of corresponding "expected results." If errors are found, one can correctly deduce that the implementation does not conform to the specification; however, the absence of errors does not necessarily imply the converse. Falsification testing can only demonstrate non-conformance. Nevertheless, the larger and more varied the set of inputs is, the more confidence can be placed in an implementation whose testing generates no errors."
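The falsification approach described above is easy to picture as a tiny test harness. This is only an illustrative sketch, not part of any XLIFF test suite: `normalize_lang` is a hypothetical implementation under test, and the case list stands in for a real set of "expected results".

```python
# Minimal sketch of falsification testing: run an implementation against
# legal and illegal inputs with known expected results. Any mismatch
# proves non-conformance; an empty failure list proves nothing.

def normalize_lang(tag):
    # Hypothetical implementation under test: lowercases a language tag
    # and rejects empty input (a stand-in for a real processor).
    if not tag:
        raise ValueError("empty language tag")
    return tag.lower()

# Each case pairs an input with its expected result -- either a value,
# or an exception type for illegal inputs that must be rejected.
CASES = [
    ("EN-US", "en-us"),   # legal input
    ("fr", "fr"),         # legal input
    ("", ValueError),     # illegal input: must raise
]

def falsify(impl, cases):
    failures = []
    for inp, expected in cases:
        try:
            actual = impl(inp)
        except Exception as exc:
            actual = type(exc)
        if actual != expected:
            failures.append((inp, expected, actual))
    return failures  # non-empty => the implementation does not conform

print(falsify(normalize_lang, CASES))
```

Note that `falsify` returning an empty list here only means these three cases found no error; as the NIST text says, that never demonstrates conformance, only a larger and more varied case set increases confidence.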

I think that for a rather complex specification like XLIFF - although we hope to make it simpler - having test and validation tools is a good thing, but vendors will not be able to tick a "conformant" box just because they pass a test suite.

(My vague memory recalls some activity in this direction w.r.t. TMX some years ago, which I doubt was a success...)

cheers,
asgeir

