Subject: Re: [oiic-formation-discuss] The importance to users of documents looking the same


On Thu, Jun 19, 2008 at 11:35 AM, Radoslav Dejanović
<radoslav.dejanovic@opsus.hr> wrote:
> robert_weir@us.ibm.com wrote:
>
>> Putting aside the question of achievability (or not) of "pixel
>> perfection", is there anyone here that doubts the proposition that the
>> proposed TC can and should improve the interoperability of rendered
>> output of ODF applications?
>>
>> Put differently, is there anyone who would refuse improvement if it is
>> not perfection?

Too vague to respond to. My study of the fundamental requirements for
competition in a market defined by a standard is that no
implementation should have its round-trip interoperability impaired by
a lack of specificity in the standard, without need to reverse
engineer other implementations, negotiate profiles between individual
developers, file lawsuits for disclosure of specs, etc. Standards are
supposed to be agreements to end feature wars and to refocus R&D
expense on lowering the costs of production, providing better customer
service, etc., in a stable market. This does not mean that folks can't
engage in feature wars, but it does mean that conformant status cannot
be bestowed on the embraced-and-extended versions.

In the context of document format standards, my reading of the law is
that vendors are free to embrace and extend standards (so long as
other law doesn't get in the way, such as a monopolist extinguishing
standards through their embrace and extension). But if they wish to
claim conformance, their editor implementations must remain capable of
reading and writing the unextended version of the standard. ODF
required this in OASIS ODF 1.0. But every interop requirement in the
spec got toggled off by a careless JTC 1 substitution of ISO/IEC
Guidelines definitions for the RFC 2119 requirement keyword
definitions.

So I can agree that perfection is neither possible nor required by
law, but I cannot agree that there is no threshold requirement in the
degree of interop improvement. A standard must place all competitors
on an equal footing and the formats they create must be substitutable
in the market. A highly useful synthesis of the law governing the
leveling of the competitive playing field in a market defined by a
standard, and of developer and user requirements, is found in the
ISO/IEC JTC 1 Directives:

"For the purpose of this policy statement, interoperability is
understood to be the ability of two or more IT systems to exchange
information at one or more standardised interfaces and to make mutual
use of the information that has been exchanged. An IT system is a set
of IT resources providing services at one or more interfaces.

...

"Standards designed to facilitate interoperability need to specify
clearly and unambiguously the conformity requirements that are
essential to achieve the interoperability. Complexity and the number
of options should be kept to a minimum and the implementability of the
standards should be demonstrable."

<http://isotc.iso.org/livelink/livelink.exe/fetch/2000/2489/186491/186605/AnnexI.html>.

That is a fair summary of the applicable law, developer requirements,
and user requirements.

With only slight modification, to make clear to more people what is
obvious only to a lawyer or other grammarian, I've asked Rob whether
he can agree to something very similar being added to the charter as
applicable to all deliverables.

>>
>> If not, let's include a goal of improving that area in the charter.  The
>> fact that perfection is not possible should not prevent us from doing
>> what is doable.

Again, "improving" is not specific enough for me.

> I might be wrong here, but I suppose that interoperability is about
> having information sent from point A to point B in such a way that it
> does not degrade in the process?
>

No. You describe intraoperability as opposed to interoperability, a
one-way trip. Interoperability imports the notion of round-tripping.
See, e.g., the JTC 1 definition quoted above ("mutual use"). A
definition of interoperability with no relevant difference from the
one in the JTC 1 Directives above was held in the case of Commission
v. Microsoft to exclude one-way interoperability and to require
two-way interoperability. The Court also held that the required degree
of specificity in the disclosures of Microsoft's interop specs was
that necessary to put competitors on an "equal footing" with
Microsoft's own software in regard to interoperability.

Even two-way interoperability does not fully capture the range of
interoperability that must be enabled by a standard's specification of
the conformity requirements essential to achieve interoperability. A
conformant implementation of a lawful standard must be able to
interoperate with any and all other conformant implementations.

So we are not only concerned with the one-way trip A > B. We are also
concerned with A <> B <> C <> D, etc. I am aware of no common term for
the interoperability that involves more than the round trip between
two applications. I have a personal preference for "business process
interop" as the shorthand term. E.g., in automated business processes,
the next app to process the data can be unpredictable; the sequence
might be any variation of the A <> B <> C <> D ordering. The notion
here is that any app that conforms to a standard must be able to
correctly process documents generated by any other conformant app. If
this cannot be done, it is the standard that is defective, not the
apps. And with ODF (and with OOXML, for that matter), we face the
problem of repairing a defective standard.
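
To make the distinction concrete, here is a minimal sketch, in Python,
of what checking that kind of interoperability might look like.
Everything in it is hypothetical: the Implementation class and its
read/write stubs stand in for real ODF apps, and semantic_content()
stands in for whatever comparison a profile would actually define.

from itertools import permutations

class Implementation:
    def __init__(self, name):
        self.name = name

    def read(self, doc):
        # Parse a document into this app's internal model (stub).
        return doc

    def write(self, model):
        # Serialize the internal model back to the format (stub).
        return model

def semantic_content(doc):
    # Extract the profile-essential content for comparison (stub).
    return doc

def round_trip_ok(a, b, doc):
    # B reads A's document and re-saves it; A must then be able to
    # read it back with the essential content intact.
    doc_from_b = b.write(b.read(doc))
    recovered = a.write(a.read(doc_from_b))
    return semantic_content(recovered) == semantic_content(doc)

apps = [Implementation(n) for n in "ABCD"]
original = b"...a conformant test document..."

# A one-way (intraoperability) test would check only A -> B. The
# interop discussed here checks every ordered pair, because in a
# business process the next app to touch the document is unpredictable.
for a, b in permutations(apps, 2):
    assert round_trip_ok(a, b, original), a.name + " <-> " + b.name

The point of the permutation loop is the one made above: conformance
has to guarantee the whole matrix, not one favored pair of apps.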

> In that case, and this is about something called Open Document Format,
> I'd say that while retaining perfect rendering is crucial for things
> like images, I do not think that it matters for, say, a spreadsheet
> or a letter to retain perfect alignment or flawless table boundaries,
> as long as the important data - text, numbers, formulas, etc. - are
> intact. If I get a letter from my bank, I do want to be sure that the
> numbers are correct; I don't really care whether my office program
> would render it slightly more to the left than the original document
> written in other software.
>
> Rendering documents should be of primary concern to software
> developers; let them handle that, it's their job.

I disagree to the extent that there are market requirements for
documents that do not require any particular rendering, market
requirements for documents that require pixel-perfect replication, and
a continuum of market requirements in between. This is only my
personal opinion, but I see pixel-perfect replication as being
adequately handled by the existing PDF international standards. I
suggest that we focus on two parallel branches of profiles: one for
the folks more concerned with automated parsing, extraction,
transformation, and document assembly, and with serialization of the
assembled document for rendering as desired. Let's call this the
"business process branch" of profiles for now. The second branch of
profiles would focus on improved uniformity in rendering. Let's call
that the "better rendering" branch of profiles.

If we take that approach, we can serve what I believe to be a happy
medium across a range of market requirements, particularly if, at each
level, the business process profile is defined before the
corresponding profile in the better rendering branch. If we begin from
a core profile of features that must be supported by all conformant
implementations, we can then branch from there into the two parallel
branches of successively more featureful profiles. And if we do the
business process profile first at each layer, we can also define the
rules as we go for round-tripping documents between the two branches.
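
A rough sketch of that structure, again in Python, may help. The
feature names are invented purely for illustration; a real profile
would enumerate conformity requirements drawn from the ODF spec. Each
profile is modeled as a feature set that must superset the core:

# Invented feature names; only the set relationships matter here.
CORE = {"plain_text", "tables", "basic_formulas"}

# Each step in a branch adds features to the step below it, so every
# profile in both branches is a superset of the core.
BUSINESS_PROCESS = [
    CORE | {"metadata_extraction"},
    CORE | {"metadata_extraction", "document_assembly"},
]
BETTER_RENDERING = [
    CORE | {"page_geometry"},
    CORE | {"page_geometry", "font_metrics"},
]

for branch in (BUSINESS_PROCESS, BETTER_RENDERING):
    assert all(CORE <= profile for profile in branch)

def survives_branch_round_trip(doc_features, bp_profile, br_profile):
    # A document round-trips between the branches only if it uses
    # nothing outside what both profiles implement; at the first
    # layer that intersection is exactly the core.
    return doc_features <= (bp_profile & br_profile)

print(survives_branch_round_trip({"plain_text", "tables"},
                                 BUSINESS_PROCESS[0],
                                 BETTER_RENDERING[0]))  # True
print(survives_branch_round_trip({"page_geometry"},
                                 BUSINESS_PROCESS[0],
                                 BETTER_RENDERING[0]))  # False

The round-tripping rules mentioned above would, in effect, pin down
what falls inside that intersection at each layer.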

I have been unable to identify any other method of cleaning up the ODF
interop mess than first defining a core profile and working our way
out to progressively supersetting profiles. We've got to get the
fundamentals right and implemented in the apps before we can even make
rational decisions about any more complex profile. Taking this
incremental approach also presents the developers with manageable
steps in implementing the profiles. E.g., if we instead profiled the
entire spec up front and worked our way inward, all we would do is
document what is already non-interoperable, and both the standard and
the implementations would have moved on long before we completed the
profile work.

But we really do need first to achieve consensus on interop being the
goal, and then to decide whether compatibility with the existing
standard is a requirement. If compatibility is required, then the ODF
standard must be rewritten, and only the ODF TC can do that. To
paraphrase, I ask whether the goal is ODEF or ODF. Which is the tail
and which is the dog?

Best regards,

Paul E. Merrell, J.D. (Marbux)

-- 
Universal Interoperability Council
<http://www.universal-interop-council.org>

