OASIS Mailing List Archives



oiic-formation-discuss message


Subject: Interoperability versus Conformity

"Dave Pawson" <dave.pawson@gmail.com> wrote on 06/07/2008 01:56:37 AM:

> 2008/6/6  <robert_weir@us.ibm.com>:
> >
> > "Dave Pawson" <dave.pawson@gmail.com> wrote on 06/06/2008 03:00:19 PM:
> >> We just need to imagine the mapping.
> >> What shall be tested
> >> What may be tested
> >> What output is expected (hopefully mostly objective!)
> >> Any subjective tests might lie in the nice to have domain.
> >
> > In ISO lingo we have an "interoperability assessment methodology" which is
> > defined as:
> >
> > "An assessment methodology for interoperability may include the
> > specification of some or all of the following: terminology, basic
> > concepts, requirements and guidance concerning test methods, the
> > appropriate depth of testing, test specification and means of testing,
> > and requirements and guidance concerning the operation of assessment
> > services and the presentation of results. In technical areas where
> > there is a conformity assessment methodology and an interoperability
> > assessment methodology, the relationship between them must be
> > specified."
> I prefer the non ISO lingo - in particular 'means of testing'? Surely that's
> telling an implementer 'how' to do it?
> The relationship between conformity and interop has me confused.
> I don't understand that.

Excellent point.  Let me state my interpretation, and please let me know if you think it contradicts how you understand the terms.

Conformity is the relationship between a technological artifact and the standard or standards that define that technology.  So in our present case, conformity is the relationship between an ODF document, or an application that produces or consumes an ODF document, and the ODF standard.  The artifact is conformant with the standard when it implements all required provisions of the standard, and implements none of the prohibited provisions of the standard.

Conformity can be stated as black or white:  "Application X conforms, Document Y does not conform" or as partial conformance: "Application Z conforms to Parts 1, 2 and 3, except for Part 3, clauses 26 and 29."

Interoperability, on the other hand, is the relationship between two or more technological artifacts that implement the same protocol or protocols.  I can't give you a crisp black and white definition here.  But I can suggest some analogies.

First, consider the C/C++ programming languages.  Both define formal provisions, and a compiler implementation, or a program file, could be tested as to whether it conforms to the underlying programming language standards.  However, this does not guarantee that two conformant C++ compilers will create programs that yield the same runtime results.  This is because the C/C++ standards have items that are undefined and implementation-defined, like the size of an integer, or the sign of a character, etc.  This is well-known to practitioners -- they know where the bodies are buried -- and there are a variety of practices which they know to institute if they want to create interoperable C/C++ code (or portable code, as it is more often termed in this space).

Further, a more mature expression of these interoperability constraints (and that is what they really are -- additional constraints beyond the C/C++ standards) can be written up in detail and agreed to by a set of vendors, becoming a standard that defines conformance of "portable C/C++" within a particular domain.  For example, Embedded C++ took that route, as a proper subset of ISO C++.  PDF/A did that as well, a constrained subset of PDF to increase interoperability in a particular domain.

So "interoperability" in the large is not something we'd just want to go out and start testing.  But we could define, for example, a proper subset of ODF for browser-based implementations, which contained only rendering primitives that could be losslessly mapped to the HTML/CSS2 model.  I could also imagine a profile of ODF for desktop use.

But I don't think we need to go that far initially.  We would make progress even with a checklist of ODF features, set out in a table, and some simple test cases that allow an implementation to verify whether or not the features are even implemented.  Even doing this much would raise awareness of missing functionality, and when that is addressed, interoperability increases.

Note also something unintuitive -- a high degree of interoperability is possible even without conformity.  I know this may sound like sacrilege, but take a look at the web itself, where only a small fraction of web pages are valid HTML or XHTML.

Ideally, of course, you aim for both.

