

Subject: Re: [oiic-formation-discuss] (1)(d) A list of deliverables, with projected completion dates.


robert_weir@us.ibm.com wrote:
> 
> Another piece I suggest we start working on:  "(1)(d) A list of 
> deliverables, with projected completion dates".
> 
> However, I'd suggest we discuss this as if it said "a prioritized list 
> of deliverables".  From a practical standpoint, it is 
> impossible to project completion dates until we have a good idea who 
> will be joining the proposed TC.  Those who do line up to join the TC 
> can huddle before we submit the charter and turn the prioritization into 
> projected dates.
> 
> So far I've heard the following items (in no particular order)
> 
> 1) A conformance test of ODF documents and implementations, i.e., test 
> the formal shall's and should's, etc.

Dave has already asked this: what exactly is this? A piece of software?
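
To make this a bit more concrete, one possible reading is a small piece
of software that unpacks an ODF package and validates its XML streams
against the ODF RelaxNG schema, i.e. checks the formal requirements
that the schema expresses. A rough sketch of what I mean (the schema
file name and the list of streams are assumptions of mine, not a
proposal for the actual deliverable):

# Sketch only: validate the XML streams of an ODF package against the
# ODF RelaxNG schema. Schema file name and stream list are assumptions.
import sys
import zipfile
from lxml import etree

ODF_SCHEMA = "OpenDocument-schema-v1.1.rng"  # assumed local copy of the schema
STREAMS = ["content.xml", "styles.xml", "meta.xml", "settings.xml"]

def validate_package(path):
    schema = etree.RelaxNG(etree.parse(ODF_SCHEMA))
    ok = True
    with zipfile.ZipFile(path) as package:
        for name in STREAMS:
            if name not in package.namelist():
                continue
            doc = etree.parse(package.open(name))
            if not schema.validate(doc):
                ok = False
                for error in schema.error_log:
                    print("%s: %s: %s" % (path, name, error.message))
    return ok

if __name__ == "__main__":
    sys.exit(0 if validate_package(sys.argv[1]) else 1)

Schema validation of course only covers part of the conformance
requirements; the "should"s and the prose requirements of the
specification would need additional, hand-written checks.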

> 2) An Acid-style test of ODF implementations, i.e., feature and 
> rendering-oriented, essentially highlighting ODF features that are not 
> widely implemented (or implemented correctly) but are desired (by whom???)

As for "rendering-oriented": This is only applicable to text documents,
and maybe graphical documents, but not to spreadsheets and database
frontend documents (new in ODF 1.2). If we provide tests, then we should
provide tests that are applicable to all kind of documents. Another
issue with rendering-based tests is that rendering is system dependent.
On the one hand documents may render differently because of different
fonts that are installed, or different hyphenation rules, and so on. On
the other hand, and more important, documents can be rendered on very
different devices. On a small device, you may for instance want to
render a document differently than on a desktop. In a browser, you may
want to render a document without page breaks. Test cases in my opinion
should consider that.
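
To illustrate what a test that takes this into account could look like:
instead of comparing rendered output, a test could compare the document
model, for example the paragraph text of a reference document with the
document an implementation writes back after a load/save round trip.
The following is only a sketch of the idea; the file names and the
restriction to paragraph text are assumptions I made for the example:

# Sketch only: compare paragraph text at the document-model level, so
# the result does not depend on fonts, hyphenation or the output device.
import zipfile
from lxml import etree

TEXT_NS = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

def paragraph_texts(path):
    with zipfile.ZipFile(path) as package:
        root = etree.parse(package.open("content.xml")).getroot()
    # Plain text of every paragraph, ignoring all layout information.
    return ["".join(p.itertext()) for p in root.iter("{%s}p" % TEXT_NS)]

reference = paragraph_texts("reference.odt")  # hand-authored test document
roundtrip = paragraph_texts("roundtrip.odt")  # after load/save in the implementation under test

if reference == roundtrip:
    print("PASS: paragraph text preserved")
else:
    print("FAIL: paragraph text differs")

Such a check gives the same result regardless of which fonts are
installed, which hyphenation rules apply, or on which device the
document is rendered.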

As for "not widely implemented": Most vendors implement those features
that are important for their users. If a feature is not widely
implemented, then this is a hint that the feature is maybe not so
important. I think we should focus on those features actually that are
widely used rather than on test cases where we know that the features
are not widely implemented.

> 3) A comprehensive test suite of atomic (single feature) tests
> 4) A formal profile of ODF for portability and archiving, aka ODF/A
> 5) A formal profile of ODF for browser-based applications
> 6) A formal profile of ODF for mobile devices

I would suggest combining 4-6 into a single deliverable, and adding a
work item to define what exactly a profile is and how profiles could be
specified. Something like:

6a) A definition of an "ODF profile" concept, where an "ODF profile" is
a well-defined subset of ODF suitable for representing a certain class
of documents.
6b) A set of ODF profiles, including but not limited to
- A formal profile of ODF for portability and archiving, aka ODF/A
- A formal profile of ODF for browser-based applications
- A formal profile of ODF for mobile devices
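
Just to make the "well defined subset" idea a bit more tangible: a
profile could, for example, be expressed as the set of elements a
document is allowed to use, and a document could then be checked
against it mechanically. The tiny whitelist below is invented purely
for illustration; an actual profile would of course be defined by the
proposed TC, possibly as a restricted RelaxNG schema rather than a
list:

# Sketch only: check a document against a hypothetical "minimal text"
# profile, understood as a well-defined subset of ODF elements.
import zipfile
from lxml import etree

OFFICE_NS = "urn:oasis:names:tc:opendocument:xmlns:office:1.0"
TEXT_NS = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

# Invented whitelist, for illustration only.
ALLOWED = {
    "{%s}document-content" % OFFICE_NS,
    "{%s}body" % OFFICE_NS,
    "{%s}text" % OFFICE_NS,
    "{%s}p" % TEXT_NS,
    "{%s}span" % TEXT_NS,
}

def check_profile(path):
    with zipfile.ZipFile(path) as package:
        root = etree.parse(package.open("content.xml")).getroot()
    violations = {el.tag for el in root.iter()
                  if isinstance(el.tag, str) and el.tag not in ALLOWED}
    for tag in sorted(violations):
        print("not in profile: %s" % tag)
    return not violations

print("conforms" if check_profile("sample.odt") else "outside the profile")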


> 7) A report on best practices for authoring portable documents
> 8) A periodic interoperability report on the state of ODF 
> interoperability, with specific recommendations for implementors.
> 
> What did I miss?

What I'm missing a little bit is guidance for implementors. Simply
speaking, the best way to achieve interoperability between ODF
applications is that these applications implement as much of ODF as is
possible and reasonable for the specific application, and with as few
bugs as possible. Tests are helpful to measure the quality of an
implementation, but they don't help implementors with the
implementation itself.

So far we have suggestions for tests, but we do not have suggestions
for how we can help implementors in their implementation work.

It would probably be too simple to just put an "ODF Implementors
Guidelines" document on the list of deliverables, since we don't know
whether implementors have issues with implementing ODF, and if so,
where. Preparing guidelines, which is a huge effort, without knowing
where the issues are runs the risk that we are doing something no one
needs. But what about having a mailing list or a forum at
opendocument.xml.org that the oiic TC moderates and that could be used
as input for guidelines?

That is:

7) A mailing list or forum for implementors that is moderated by this TC
8) Optionally, an implementation guidelines document that summarizes
the important points discussed in the mailing list or forum.


Michael



> 
> Let's get the list complete, and then we can have a round of scoring, 
> where we each prioritize each deliverable on a scale of 1-10 (10 is 
> highest priority).  We can use that to inform the overall priorities and 
> eventually the projected schedule.
> 
> -Rob


-- 
Michael Brauer, Technical Architect Software Engineering
StarOffice/OpenOffice.org
Sun Microsystems GmbH             Nagelsweg 55
D-20097 Hamburg, Germany          michael.brauer@sun.com
http://sun.com/staroffice         +49 40 23646 500
http://blogs.sun.com/GullFOSS





