oiic-formation-discuss message



Subject: Re: [oiic-formation-discuss] Interoperability versus Conformity


On Mon, Jun 9, 2008 at 12:29 AM, Dave Pawson <dave.pawson@gmail.com> wrote:
> 2008/6/8  <robert_weir@us.ibm.com>:
>>
>> "Dave Pawson" <dave.pawson@gmail.com>
>
>>> The relationship between conformity and interop has me confused.
>>> I don't understand that.

Under ISO/IEC/JTC 1 Directives, "interoperability is understood to be
the ability of two or more IT systems to exchange information at one
or more standardised interfaces and to make mutual use of the
information that has been exchanged."
<http://www.jtc1sc34.org/repository/0856rev.pdf>, pg. 145.

A definition of "interoperability" with no relevant differences was
interpreted last September in the case of European Commission v.
Microsoft to require 2-way -- as opposed to 1-way -- interoperability
with disclosure of sufficient specificity to allow competitors'
software to interoperate with Microsoft's software on an "equal
footing" with other Microsoft software. See e.g., this quotation from
the Court of First Instance opinion.
<http://www.universal-interop-council.org/node/4#N_6_>.

The relationship between conformity and interop (and interoperability
assessment) is spelled out in the following paragraph from the same
page of JTC 1 Directives, supra:

*"Standards designed to facilitate interoperability need to specify
clearly and unambiguously the conformity requirements that are
essential to achieve the interoperability.* Complexity and the number
of options should be kept to a minimum and the implementability of the
standards should be demonstrable. Verification of conformity to those
standards should then give a high degree of confidence in the
interoperability of IT systems using those standards. *However, the
confidence in interoperability given by conformity to one or more
standards is not always sufficient and there may be need to use an
interoperability assessment methodology in demonstrating
interoperability between two or more IT systems in practice."*

The emphasized first sentence in that paragraph explains the
relationship between conformity and interop, mandating that the
conformity requirements essential to achieve interop must be "clearly
and unambiguously" specified in the standard itself. The emphasized
last sentence identifies an interoperability assessment methodology as
something that may be needed in addition to conformity assessment
procedures, for the edge cases where conformity assessment alone
cannot give high confidence of interoperability "in practice"; that
is, interoperability must be demonstrated, whatever it takes to reach
that goal. (Rob and I are
not agreed that interoperability must be *demonstrated,* as opposed to
*demonstrable.* See our previous conversation here.
<http://lists.opendocumentfellowship.com/pipermail/odf-discuss/2008-April/thread.html#7276>.)

That paragraph precedes the one quoted by Rob earlier, repeated here
for convenience:

"An assessment methodology for interoperability may include the
specification of some or all of the following: terminology, basic
concepts, requirements and guidance concerning test methods, the
appropriate depth of testing, test specification and means of testing,
and requirements and guidance concerning the operation of assessment
services and the presentation of results. In technical areas where
there is a conformity assessment methodology and an interoperability
assessment methodology, the relationship between them must be
specified."

The principal barrier for this group, I think, is that ODF has almost
no interop conformity requirements whatsoever beyond validation
against the schema after all foreign elements and attributes are
removed.  See e.g., section 1.5 (Conformance) in ODF v. 1.1,
<http://docs.oasis-open.org/office/v1.1/OS/OpenDocument-v1.1.pdf>.
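To make that conformance clause concrete, here is a rough sketch in Python of what "validation after all foreign elements and attributes are removed" amounts to. This is my own illustration, not the TC's procedure: the namespace list is an abbreviated assumption, the document snippet is invented, and a real check would go on to validate the stripped tree against the ODF RELAX NG schema rather than just print it.

```python
# Sketch (not the official procedure): strip foreign elements and
# attributes; a real conformance check would then validate the remaining
# tree against the ODF RELAX NG schema.
import xml.etree.ElementTree as ET

# Illustrative subset of the ODF namespaces; the real list is much longer.
ODF_NS = {
    "urn:oasis:names:tc:opendocument:xmlns:office:1.0",
    "urn:oasis:names:tc:opendocument:xmlns:text:1.0",
}

def ns_of(name):
    """Namespace URI of a Clark-notation name like '{uri}local'."""
    return name[1:].split("}", 1)[0] if name.startswith("{") else ""

def strip_foreign(elem):
    """Recursively drop elements and attributes outside the ODF namespaces."""
    for child in list(elem):
        if ns_of(child.tag) not in ODF_NS:
            elem.remove(child)      # foreign element: discard whole subtree
        else:
            strip_foreign(child)
    for attr in list(elem.attrib):
        if attr.startswith("{") and ns_of(attr) not in ODF_NS:
            del elem.attrib[attr]   # foreign (namespaced) attribute

# Tiny made-up document mixing ODF text with a vendor extension element.
doc = ET.fromstring(
    '<office:document-content '
    'xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0" '
    'xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0" '
    'xmlns:v="urn:example:vendor">'
    '<office:body><text:p>Hello<v:smart-tag>x</v:smart-tag></text:p>'
    '</office:body></office:document-content>'
)
strip_foreign(doc)
print(ET.tostring(doc, encoding="unicode"))
```

The point of the sketch is how weak the requirement is: any vendor extension survives the conformance test simply because it is deleted before validation ever happens.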

But development of conformity requirements is unmistakably the task of
the ODF TC. This proposed TC could only make recommendations in that
regard. Moreover, I see huge difficulties in identifying what
interoperability assessment methodologies are necessary before ODF
gains "conformity requirements essential to achieve the
interoperability." As the language quoted above makes plain,
interoperability assessment procedures are only for the edge cases
where conformity assessment procedures cannot provide the high
confidence in interoperability that the Directives mandate.

Here is my short list of the principal interop checkpoints that ODF fails:

1.  Full-featured editors available that can operate without
generating application-specific extensions to the formats?

2.  Interoperability of implementations mandatory?

3.  Interoperability between different IT systems either demonstrable
or demonstrated?

4.  Profiles developed and required for interoperability?

5.  Methodology specified for interoperability between less and more
featureful applications?

6.  Specifies conformity requirements essential to achieve interoperability?

7.  Interoperability conformity assessment procedure(s) formally
established and validated?

8.  Document validation procedures valid?

9.  Specifies an interoperability framework?

10. Vendor-specific extensions classified as non-conformant?

11. Preservation of metadata necessary to achieve interoperability mandatory?

12. XML namespaces for incorporated standards properly implemented?
(ODF-only failure because Microsoft didn't incorporate any relevant
standards.)

13. Optional feature interop breakpoints eliminated?

14. Scripting language fully specified for embedded scripts?

15. Hooks fully specified for use by embedded scripts?

16. Standard is vendor- and application-neutral?

17. Capable of converging desktop, server, Web, and mobile device
editors and viewers?

While the above interop breakpoints might seem like a dismaying mess
to repair, I suggest that a good starting point would be to focus this
proposed TC's work on addressing them within the ambit of the W3C's
Compound Document by Reference Framework ("CDRF"),
<http://www.w3.org/TR/2007/CR-CDR-20070718/>. That interoperability
framework has numerous advantages, e.g.:

a. CDRF is a well-developed and long-implemented XML interop
framework, introduced in 1995.

b. CDRF is truly vendor-neutral.

c. CDRF is fully compatible with both ODF and OOXML if those two
standards are profiled.

d.  CDRF includes exacting requirements for the development and
processing of profiles.

e. CDRF provides a framework for the interoperability of less and more
featureful applications. E.g., from the conformance section: "A
conformant user agent of a superset profile specification must process
subset profile content as if it were the superset profile content."

f. CDRF and CDF have an undeserved poor reputation in some quarters
due to a widely-publicized but highly erroneous article. I've
addressed those errors here.
<http://www.universal-interop-council.org/node/4>.

g. Were ODF and OOXML profiles developed that correspond to the
feature sets of the W3C Web Integration Compound Document ("WICD")
profiles with attention to lossless transformability, the WICD
profiles could provide the meta-language needed for conversions
between ODF and OOXML as well as convergence of desktop, server,
mobile devices, and Web editors and viewers. See WICD profiles at
<http://www.w3.org/2004/CDF/>. (But note the permissiveness for
application-specific extensions in the incorporated SVG Tiny profile.)
On the need for an intermediate meta-language, see the discussion
between Patrick Durusau and Rick Jelliffe in the comments here.
<http://www.oreillynet.com/xml/blog/2007/07/can_a_file_be_odf_and_open_xml.html>.

h. Were the preceding point accepted as a goal, then the WICD Test
Suite 1.0 could be adapted as a model for the ODF and OOXML test
suites, with vendor-neutral renderings of features and short sequences
of features already developed.
<http://www.w3.org/2004/CDF/TestSuite/WICD_CDR_WP1/>; see especially
<http://www.w3.org/2004/CDF/TestSuite/WICD_CDR_WP1/wicdmatrix.xhtml>
and <http://www.w3.org/2004/CDF/TestSuite/WICD_CDR_WP1/wicdmatrix.xhtml?implemented2>.
Such an approach would also promote the convergence of desktop,
server, mobile device, and Web editors and viewers in a vendor-neutral
way.

i. In summary, CDRF provides an already-developed cookbook for the
major tasks this proposed TC would need to perform.

The major barrier I foresee for such an effort would be in persuading
all of the major players to work from the same script, i.e., Sun, IBM,
Microsoft, W3C, and the major web browser developers for both the
desktop and mobile devices.  For example, the major web browser
developers are not all that keen on supporting XHTML2, preferring
HTML5 (WHATWG). (IMHO, HTML5 may be the cat's meow for browser
developers, but is severely crippled as a standard for the
interoperable exchange of documents among web app editors, lacking
such basic defined elements as table of contents, footnotes, and
footnote calls. Likewise, such definition is also lacking in CSS,
which is in any event commonly implemented in site-wide template files
rather than at the document level.)

Pressure from regulatory authorities and government IT departments
might be necessary to encourage the needed collaboration.

My 2 cents,

Paul Merrell (Marbux)

