OASIS Mailing List Archives
oic-comment message



Subject: ODF-shots proposal


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Dear Bart et al,

As promised, here is the proposal for an online ODF implementation check
tool. We could have the API and a first version up and running by the
second week of January, as it now stands. It is up to the individual
vendors and/or communities to then set up their part of the service.

Kind regards,
Michiel Leenaars
Director of Strategy
NLnet Foundation
Vice-chair OpenDoc Society

************ ODF implementation check tool ************

Summary: A proposal to set up an online service where end users can
compare the output of different ODF implementations for their own
documents.

Detailed proposal:

With ODF implementations and new types of applications appearing on the
market at significant speed - varying in maturity and thus quality -
users face the undesirable task of predicting how their documents
will look ('work') under unpredictable conditions. How does my document,
which renders perfectly in OpenOffice.org 3.0 and in Symphony 1.2, look
in Microsoft Office 2000 with the ODF add-in from Microsoft or the one
from Sun Microsystems? And how does it look on Mac OS X in version Y of
software package Z?

This can be very challenging, especially when people are preparing
heavily styled organisational templates. In some cases - for instance
the new Netherlands governmental templates, which are to be used by
1600 different government organisations for all their day-to-day
documents - such templates will quickly lead to millions of identically
styled documents, so a check like this is no luxury. Because designers
are still relatively inexperienced with ODF, the quality of the ODF
code itself may also vary, despite the high degree of control the
designer has. It is imperative that people can check different ODF
implementations (and versions of those implementations), so that they
can isolate errors in either the document or individual implementations.

Here we can learn from the web standards world. After a site link is
submitted to a service called browsershots.org, the service delivers
renderings of that page in a great variety of browsers and browser
versions. It would be an invaluable service to the community - and a
great benchmarking tool - if we could provide the same for ODF. In
addition, it would be a great help in converging ODF implementations
and in handling legacy document formats as well.

* How should it work?

- A user uploads a document to a website connected to the lead server
  and selects which available implementations he wants to check
  against and how he wants the results presented.
- The document is checked for known viruses (and maybe rough
  conformance to the ODF spec).
- A script picks up the document and distributes it across a number
  of (virtual) servers running different versions of different
  applications. These run either somewhere in the cloud or on
  independently managed machines operated by vendors and/or the
  community.
- Locally the documents are placed in a stack and processed one
  by one. The documents are exported either in a standardised way
  to Postscript or PDF, or in a non-standardised way (for those
  applications that have their own print engines for this).
- The results are fed back to the lead server, which may perform
  postprocessing (like combining PDFs into a side-by-side
  presentation).
- The user is notified, e.g. by email or an Atom/RSS feed.
- The user is provided with a web view of the documents in order
  to spot differences quickly.
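The steps above could be sketched roughly as follows. This is purely an
illustrative assumption: the proposal defines no concrete API, so every
name here (scan_for_viruses, distribute, check_document, the backend
names) is a hypothetical stand-in, and the backends are fakes standing
in for real rendering servers.

```python
# Hypothetical lead-server pipeline: scan, distribute, collect, bundle.
# Nothing here is an existing API; it only mirrors the bullet list above.

def scan_for_viruses(document: bytes) -> bool:
    """Stub virus check; a real lead server would call an actual scanner."""
    return b"EICAR" not in document  # placeholder heuristic, not real AV

def distribute(document: bytes, backends: dict) -> dict:
    """Send the document to each selected backend and collect renderings."""
    return {name: render(document) for name, render in backends.items()}

def check_document(document: bytes, backends: dict) -> dict:
    """Run the full pipeline for one uploaded document."""
    if not scan_for_viruses(document):
        raise ValueError("document rejected by virus scan")
    renderings = distribute(document, backends)
    # Postprocessing could combine PDFs side by side; here we just bundle.
    return {"renderings": renderings, "status": "ready"}

# Two fake backends standing in for (virtual) servers running different
# ODF applications; each returns fake PDF bytes for its "rendering".
backends = {
    "openoffice-3.0": lambda doc: b"%PDF oo-render of " + doc,
    "symphony-1.2": lambda doc: b"%PDF sym-render of " + doc,
}

report = check_document(b"<odf document bytes>", backends)
```

In a real deployment each backend entry would be a network call to an
independently operated server rather than an in-process function.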

Providing an API for automated use inside applications may be
interesting.

* What is necessary?

On the lead server side, the common infrastructure needs to be set up,
but this is probably very lightweight (upload, distribute, receive,
present). The slave servers are set up by individual vendors and/or
communities, conforming to a simple API that allows them to receive
documents, process them through the application and send the results
back. This probably works out of the box for most applications;
however, some may need scripting to produce the necessary
PDF/Postscript output from the application.
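A slave-server processing loop could look something like the sketch
below, under the assumption that the "simple API" amounts to: receive a
document, convert it, send the result back. The convert_to_pdf stub is
hypothetical and stands in for scripting the application's real
PDF/Postscript export.

```python
# Hypothetical slave-server loop; names and the callback shape are
# assumptions, since the proposal does not specify the API.
import queue

def convert_to_pdf(document: bytes) -> bytes:
    # Stand-in for the application's export/print engine.
    return b"%PDF-1.4 " + document

def slave_loop(inbox: queue.Queue, send_back) -> None:
    """Process queued documents one by one, as the proposal describes."""
    while not inbox.empty():
        job_id, document = inbox.get()
        send_back(job_id, convert_to_pdf(document))

# Usage: queue one job and collect its result via a callback that, in a
# real deployment, would send the PDF back to the lead server.
inbox = queue.Queue()
inbox.put(("job-1", b"<odf bytes>"))
results = {}
slave_loop(inbox, lambda job_id, pdf: results.update({job_id: pdf}))
```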

The service should (at least in this phase) be set up in a
decentralised fashion: every vendor maintains their own server and
obeys a (very simple) common API. This is preferable to central
management (unless there is adequate funding for that, and every vendor
provides its software in a ready-to-run and fully licensed virtual
machine). The advantage of a decentralised set-up is that bootstrapping
the service can be done very easily and at low cost - which is why this
model is used by the 'original' browsershots.org. The cost of running
the service will at any rate be much lower than in the situation where
maintaining a number of (virtual) servers centrally is necessary.

Q & A

* Isn't this covered by a test suite or some tool already available?

No, it is not. This is for individual end users who have a source
document that they want to check for consistency across
implementations. Once fidelity is 100%, the need for it may vanish.
Until then, however, it will be of great service for setting up test
material.

* Should we limit this to ODF?

That is a good question. Why not allow RTF, XBIFF (old binary Microsoft
formats) and other legacy formats? That way each and every user can find
out that these (by some labelled as 'proven') formats may in fact not be
all that stable. Perhaps even the new ISO archiving format OOXML should
be included, so that the current state of the art of OOXML applications
and potential interoperability problems across platforms and
applications can be made visible.

* Why should vendors care about this now?

ODF as a file format is no longer tied to an individual implementation,
and this is why we only now see the need for such a service. If any big
vendor makes mistakes in its ODF implementation, it will be blamed for
them as long as it cannot prove otherwise. People are more likely to
consider your software when they know it works with their documents -
and it helps them to identify design errors in their ODF documents or
to single out erroneous implementations.

* How about mobile applications?

That is a good one. These probably render only on screen and in a
highly simplified way; however, it might be interesting for people to
see what the output is. After all, mobile (small screen) use is
predictably a significant potential market. Perhaps this can be done
the 'browsershots.org' way - i.e. by providing stitched-together
screenshots.

-----BEGIN PGP SIGNATURE-----
Version: GnuPG v2.0.9 (FreeBSD)

iEYEARECAAYFAkk2oGYACgkQPPKB2FVlk18aAACfdtD+6xas/bst9sGZ0PdbxBGJ
SfAAnRKWui0mVttNwAx5p5eiJY6CPjbG
=EqOO
-----END PGP SIGNATURE-----
