Subject: Re: [oiic-formation-discuss] Interoperability versus Conformity
- From: firstname.lastname@example.org
- To: email@example.com
- Date: Mon, 9 Jun 2008 17:53:10 -0400
"Dave Pawson" <firstname.lastname@example.org>
wrote on 06/09/2008 03:29:16 AM:
> 2008/6/8 <email@example.com>:
> > "Dave Pawson" <firstname.lastname@example.org>
> >> The relationship between conformity and interop has me confused.
> >> I don't understand that.
> > Excellent point. Let me state my interpretation. And let me know
> > if anyone thinks this contradicts how they think of the terms.
> > Conformity is the relationship between a technological artifact and
> > the standard or standards that define that technology. So in our
> > present case, conformity is the relationship between an ODF document,
> > or an application that produces or consumes an ODF document, and the
> > ODF standard. An artifact is conformant with the standard when it
> > implements all of the mandatory provisions of the standard, and
> > implements none of the prohibited provisions of the standard.
> A nuance there I think we need to clarify.
> an ODF document, or an application
> that produces or consumes an ODF document,
> Which? IMHO ODF specifies what a document instance should be/do.
> It says nothing about producers or consumers?
> Which are we testing?
Actually, ODF defines conformance classes for both documents and
applications, though it is certainly far more detailed on the document
side. See, for example, ODF 1.0, section 1.5. It isn't deep, but it is
testable, or at least a claim of ODF application conformance is
verifiable.
There are also hundreds of items that are simply stated,
but not with the operative "shall" or "should" language.
For example, page headers can display the current page number. Very
basic functionality. ODF 1.0 defines this in section 6.2.3 as: "Page
number fields display the current page number. These fields are particularly
useful in headers and footers. E.g., if a page number field is inserted
into a footer, the current page number is displayed on every page on which
the footer appears."
Now maybe we should have phrased it as, "Conformant
ODF consumers which implement the page number field defined in this section
shall render <text:page-number>
with the current page number", or something like that. The way
we have it stated in ODF 1.0 is strictly speaking a definition, and not
an implementation requirement. But even without the control language,
the meaning is clear, unambiguous, and it is reasonable for a test suite
(interoperability, not conformity) to give a warning if an implementation
renders the field as (page-number-3)/2 rather than as the current page number.
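To make that concrete, here is a rough sketch (untested, and the
helper name is mine) of how a harness could at least confirm that a
producer emitted the field, by peeking inside the package:

    import zipfile
    import xml.etree.ElementTree as ET

    TEXT_NS = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

    def count_page_number_fields(odf_path):
        # Page number fields can appear in body content (content.xml)
        # or in header/footer definitions (styles.xml), so check both.
        hits = 0
        with zipfile.ZipFile(odf_path) as pkg:
            for part in ("content.xml", "styles.xml"):
                if part in pkg.namelist():
                    root = ET.fromstring(pkg.read(part))
                    hits += len(root.findall(".//{%s}page-number" % TEXT_NS))
        return hits

Checking what a consumer actually renders is the harder half, of
course; that means comparing exported or extracted output, probably
with per-application drivers.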
> Another refinement Rob. "A" Document? I guess we could produce one
> that minimally is valid to one part of the standard.
Yes. I think these "atomic" tests that exercise features in isolation
can help pinpoint problems. Of course, we still need features used in
combination, since there are unique defects that can be exhibited only
under those circumstances.
> How do we say that (a suite of) documents from vendor X comply
> with all the aspects of the standard that it uses/should be compliant with?
> See the point? Vendor X may only implement a word processor.
> Vendor Y implements all 3. We need to address this 'scope' issue?
There is some guidance in ODF 1.0, Appendix D, "Core Feature Sets",
which defines which features are applicable to text, spreadsheets,
drawings, presentations, charts, and images. But it is a fair point.
And ODF is not limited to those conventional uses either.
One thing ODF does that makes our life a little easier
is that it reuses the same markup across application types for the basic
building blocks. So font styles, tables, etc., are represented in
XML the same way in word processors as in spreadsheets. So I would
imagine that a test suite for text styles would be defined once in the
abstract and then "inserted" into shell ODT/ODS/ODP documents
via a script, to make versions appropriate to ODF word processors,
spreadsheets, and presentations.
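For instance (a sketch only -- the shell file names and the marker
convention here are invented), a script could copy each shell package
and splice the abstract test markup into its content.xml:

    import zipfile

    def make_variant(shell_path, out_path, fragment,
                     marker="<!--TEST-HERE-->"):
        # Zip members can't be edited in place, so copy every entry,
        # patching content.xml on the way through. Reusing the original
        # ZipInfo keeps the 'mimetype' entry stored uncompressed, as the
        # ODF packaging rules require.
        with zipfile.ZipFile(shell_path) as src, \
             zipfile.ZipFile(out_path, "w") as dst:
            for info in src.infolist():
                data = src.read(info.filename)
                if info.filename == "content.xml":
                    data = data.replace(marker.encode("utf-8"),
                                        fragment.encode("utf-8"))
                dst.writestr(info, data)

    # for shell in ("shell.odt", "shell.ods", "shell.odp"):
    #     make_variant(shell, "text-styles-" + shell, TEST_FRAGMENT)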
> > Conformity can be stated as black or white: "Application X conforms" or
> > "Document Y does not conform"; or as partial conformance: "Application Z
> > conforms to parts 1, 2 and 3, except for Part 3, clause 26", etc.
> And, possibly, Test groups 123-148 not run, hence untested.
> (E.g. where a vendor doesn't implement a presentation app)
> > Interoperability, on the other hand, is the relationship between
> > two or more technological artifacts that implement the same protocol
> > or protocols. I can't give you a crisp black and white definition
> > here. But I can suggest some analogies.
> <grin/> Just what has Rob swallowed!
> "technological artifacts that implement the same protocol"
> Agreed we need the terms for a spec, I'd prefer plain English for discussion.
The nuance I was trying to suggest is that interoperability
is between two or more implementations. Maybe applications, services,
command line tools, etc. This isn't an exclusive definition, but it is
the one that comes to mind. You can also talk about interoperability
between two markup standards that stand in a relationship with each other,
but that is a more abstract concept and we don't need to go there.
> For our case, I'm interpreting this as 'Two word processor instances
> created by different vendors'. Is that right?
That would be the main case, yes. But I believe that two word
processors created by the same vendor, say version N and version N+1,
could be tested with regard to interoperability without abusing the term.
> > First, consider the C/C++ programming languages. Both define
> > conformance provisions, and a compiler implementation, or a program
> > file, can be tested as to whether it conforms to the underlying
> > programming language standards. However, this does not guarantee
> > that two conformant compilers will create programs that yield the
> > same runtime results. This is because the C/C++ standards have items
> > that are undefined or implementation-defined, like the size of an
> > integer, or the sign of a character, etc. This is well-known to
> > practitioners -- they know where the bodies are buried -- and there
> > are a variety of practices which they know to institute if they want
> > to create interoperable C/C++ code (or portable code, as it is more
> > often termed in this space).
> Scary but understood. Thanks.
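Scary is right. You can even watch one of those bodies from Python,
since the ctypes module mirrors the platform's C ABI:

    import ctypes

    # Both sizes are implementation-defined in C/C++; 'long' in
    # particular is commonly 4 bytes on 64-bit Windows but 8 bytes on
    # 64-bit Linux and Mac OS.
    print("sizeof(int)  =", ctypes.sizeof(ctypes.c_int))
    print("sizeof(long) =", ctypes.sizeof(ctypes.c_long))

Same standard, conformant compilers on both ends, different answers.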
> > Further, a more mature expression of these interoperability
> > constraints (that is what they really are -- additional constraints
> > beyond the standards) can be written up in detail and agreed to by a
> > set of vendors, becoming a standard that defines conformance of
> > "portable C/C++" within a particular domain. For example, Embedded
> > C++ took that route, as a proper subset of ISO C++. PDF/A did that
> > as well, a constrained subset of PDF to increase interoperability in
> > a particular domain.
> No, you've lost me. Is this the two C++ compilers *without* the bodies?
> I.e. the subsets that produce identical results, or the 'list of bodies'?
I am saying this: the C++ standard permitted implementation-specific
behavior. This was allowed variability in implementations conformant to
the same standard. In order to increase interoperability among a
specialized set of implementations (embedded devices), a number of
vendors in this area defined a set of additional constraints on C++
that reduced the variability allowed in implementations of Embedded
C++. So at that point you have 2 standards and 2 sets of
implementations, and interoperability looks like this:
a) Among C++ implementations, consistency of runtime results of the
same C++ code is not guaranteed, due to the allowed variability in
runtime behaviors.
b) Among Embedded C++ implementations, consistency of runtime results
is more prevalent, since they agreed to reduce variability by adopting
additional constraints.
c) All conformant Embedded C++ programs are also conformant C++
programs, since Embedded C++ added only constraints.
d) Not all conformant C++ programs are also conformant Embedded C++
programs, for that same reason.
I'm not saying that profiles like this solve all our
problems. There is likely a bunch of good we can do without touching
a profile. But it is an arrow in our quiver, and we should keep it
in mind for places where it might be appropriate.
> > So "interoperability" in the large is not something
we'd just wantto go out
> > and start testing. But we could define, for example, a
proper subset of ODF
> > for browser-based implementations, which contained only rendering
> > that could be losslessly mapped to the HTML/CSS2 model. I
> > imagine a profile of ODF for desktop use.
> Wow. This is miles different from my own assumptions.
> This sounds like
> "The subset of the Word processor parts of the standard
> that should be common between any two implementations"
> Is that right? Especially the 'should be', i.e. we go hunting
> the mandatory bits, then move out towards the 'hard to spec'
> bits, determining what is a reasonable stop point?
> E.g. Omit pixel perfect visual output of an example line of text.
Whether it is "should" or "shall"
is a decision we would need to make, if we define a profile. Remember,
none of this changes conformance at the level of any of the ODF standards.
But a profile is a standard, and can define its own conformance.
So we could have a profile called "A Profile for web-based rendering
of ODF documents" that defines conformance to that profile in terms
of additional constraints on ODF application behavior. This might
include subsetting the elements and attributes allowed in a document
conformant to that profile. This then opens the door to having a
conformity assessment definition that tests these additional constraints.
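As a sketch (the allowlist is hypothetical; a real profile would
enumerate it normatively), a document-level conformity check for such a
profile could be as simple as walking content.xml and flagging anything
outside the permitted subset:

    import zipfile
    import xml.etree.ElementTree as ET

    def profile_violations(odf_path, allowed_tags):
        # Return the element types used in the document that the
        # (hypothetical) profile does not permit.
        with zipfile.ZipFile(odf_path) as pkg:
            root = ET.fromstring(pkg.read("content.xml"))
        return sorted({el.tag for el in root.iter()
                       if el.tag not in allowed_tags})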
It really depends on the interest level. If
we had 2 or 3 vendors or projects who were dealing with web rendering of
ODF documents, and wanted to negotiate a higher level of interoperability
among their implementations, then I think it would be reasonable to accomplish
this in the ODF IIC TC.
> I do like the idea of profiles though.
> The fluffy definition might be "The parts of the standard that
> a user would reasonably expect to produce identical output
> in profile X for application Y."
I'm not sure we're using the word "profile" the same way. XHTML Basic
is what I'd call a "profile", for example
(http://www.w3.org/TR/xhtml-basic/): a documented set of constraints on
an existing standard, intended to increase interoperability in a
particular domain.
You can also have profiles that add functionality.
I don't want to deny that they exist, but they can cause as many
interop problems as they solve, so I am not talking about them.
> > But I don't think we need to go that far initially. We would make
> > progress even with a check list of ODF features, set out in a table,
> > and test cases that allowed an implementation to verify whether or
> > not the features are even implemented. Even doing this much would
> > reveal areas of missing functionality, and when that is addressed
> > interoperability increases.
> A long slog, but doable. It would highlight weaknesses in the standard
> The other key aspect would be an inter-dependence table?
> This feature depends on that feature|para|clause whatever.
> I see this as the pre-conditions for a test etc.
> Equally I can't test this feature unless I have passed this list
> of tests.
This all makes good sense.
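A dependency table like that could even drive the harness directly. A
minimal sketch, with invented feature names:

    def runnable(dependencies, passed):
        # dependencies: test name -> set of prerequisite test names.
        # A test is only worth running once everything it builds on has
        # passed; otherwise we just report cascading failures.
        return [t for t, prereqs in dependencies.items()
                if t not in passed and prereqs <= passed]

    # deps = {"paragraph-basic": set(),
    #         "page-number-field": {"paragraph-basic"},
    #         "footer-page-number": {"page-number-field"}}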
> > Note also something unintuitive -- a high degree of interoperability
> > is possible even without conformity. I know this may sound like
> > sacrilege, but take a look at the web itself, where only a small
> > fraction of web pages are conformant HTML or XHTML.
> Yes, I can see that :-)
> Thanks Rob. Enlightening.
> Dave Pawson
> XSLT XSL-FO FAQ.