Subject: Re: [oiic-formation-discuss] (1)(d) A list of deliverables, with projected completion dates.


2008/6/13 Michael Brauer - Sun Germany - ham02 - Hamburg
<Michael.Brauer@sun.com>:


>> 2) An Acid-style test of ODF implementations, i.e., feature and
>> rendering-oriented, essentially highlighting ODF features that are not
>> widely implemented (or implemented correctly) but are desired (by whom???)
>
> As for "rendering-oriented": This is only applicable to text documents,
> and maybe graphical documents, but not to spreadsheets and database
> frontend documents (new in ODF 1.2).


You sound very positive and authoritative there, Michael.
On what basis do you make that statement?
Since you don't provide any definitions to back it up I'll let it pass,
but please note that I disagree with the statement.


> If we provide tests, then we should
> provide tests that are applicable to all kinds of documents.

+1

> Another
> issue with rendering-based tests is that rendering is system dependent.

Another assumption, Michael? We haven't tied it down to an objective
test, but we are trying. Perhaps you can help?
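
To make the idea concrete, here is a rough sketch (Python, using
Pillow) of what an objective rendering comparison *could* look like.
It assumes each implementation can render an agreed test document to a
bitmap and that a reference render exists for the profile under test;
the file names are placeholders and a real test would need an agreed
tolerance:

    from PIL import Image, ImageChops

    # Both renders are assumed to be the same size and resolution;
    # the file names are placeholders.
    reference = Image.open("reference_render.png").convert("RGB")
    candidate = Image.open("candidate_render.png").convert("RGB")

    # Pixel-wise difference; getbbox() is None when the images match exactly.
    diff = ImageChops.difference(reference, candidate)
    if diff.getbbox() is None:
        print("PASS: renders are identical")
    else:
        # In practice a tolerance would be needed (font hinting, anti-aliasing).
        print("DIFF: renders differ in region", diff.getbbox())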



> On the one hand documents may render differently because of different
> fonts that are installed, or different hyphenation rules, and so on. On
> the other hand, and more important, documents can be rendered on very
> different devices. On a small device, you may for instance want to
> render a document differently than on a desktop. In a browser, you may
> want to render a document without page breaks. Test cases in my opinion
> should consider that.

We haven't finished yet, but profiles (a term we haven't tied down as
yet) seem an option that would handle that. Rendering may be
profile-based, e.g. different for Braille output compared to a 24-inch
plasma display compared to an A4 printout compared to browser X.


>
> As for "not widely implemented": Most vendors implement those features
> that are important for their users. If a feature is not widely
> implemented, then this is a hint that the feature is maybe not so
> important.

How about "not so important to the vendor who made that decision",
or some variant of that? Compliance might be a better term, and
compliance will be with respect to the standard (1.0, 1.1 or 1.2,
which also has not yet been decided).


> I think we should focus on those features that are actually
> widely used, rather than on test cases where we know that the features
> are not widely implemented.

With what objective, please?
A conformance check covering only the easily implemented (or weakly
defined) features?

>
>> 3) A comprehensive test suite of atomic (single feature) tests
>> 4) A formal profile of ODF for portability and archiving, aka ODF/A
>> 5) A formal profile of ODF for browser-based applications
>> 6) A formal profile of ODF for mobile devices
>
> I would suggest combining 4-6 into a single deliverable, and adding the
> task of working out a definition of what exactly a profile is, and how
> it could be defined. Something like
>
> 6a) A definition of an "ODF profile" concept, where an "ODF profile" is
> a well defined subset of ODF suitable to represent a certain class of
> documents.

I hadn't thought of a profile as a subset of the standard. Thanks.



> 6b) A set of ODF profiles, including but not limited to
> - A formal profile of ODF for portability and archiving, aka ODF/A
> - A formal profile of ODF for browser-based applications
> - A formal profile of ODF for mobile devices

The second and third examples match what I have heard so far as
examples of profiles, i.e. implementation areas for applications.
Do they map to a subset of ODF features?
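
If it helps pin the term down, here is a rough sketch (Python,
standard library only) of how "a profile as a subset of ODF features"
might be checked mechanically. The whitelist and the file name are
invented purely for illustration; a real profile would be defined by
the TC:

    import zipfile
    import xml.etree.ElementTree as ET

    TEXT_NS = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

    # Illustrative only: text-level elements a hypothetical "mobile"
    # profile might allow. Not a real profile definition.
    ALLOWED_TEXT_ELEMENTS = {"p", "h", "span", "list", "list-item"}

    with zipfile.ZipFile("document.odt") as odf:  # placeholder file name
        root = ET.fromstring(odf.read("content.xml"))

    # Collect the text: elements the document actually uses.
    used = {
        el.tag.split("}")[1]
        for el in root.iter()
        if el.tag.startswith("{" + TEXT_NS + "}")
    }

    outside_profile = used - ALLOWED_TEXT_ELEMENTS
    if outside_profile:
        print("Outside the profile subset:", sorted(outside_profile))
    else:
        print("Document stays within the profile subset")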




> What I'm missing a little bit is guidance for implementors.
> Simply speaking, the best way to achieve interoperability between ODF
> applications is for these applications to implement as much of ODF as
> is possible and reasonable for the specific application, and with as
> few bugs as possible. Tests are helpful to measure the quality of an
> implementation, but they don't help implementors with the
> implementation itself.


I hope we can prove you wrong there, Michael. Passing well-designed
tests (of a document's compliance with the standard) will, I hope,
help with both implementation and interoperability.
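
As a minimal example of the kind of atomic, repeatable test I have in
mind (a sketch only; the schema and document file names are
placeholders, and lxml is just one way to run a RELAX NG check):

    import zipfile
    from lxml import etree

    # Placeholder name for the published ODF RELAX NG schema file.
    schema = etree.RelaxNG(etree.parse("OpenDocument-schema-v1.1.rng"))

    with zipfile.ZipFile("document.odt") as odf:  # placeholder document
        content = etree.fromstring(odf.read("content.xml"))

    if schema.validate(content):
        print("content.xml validates against the schema")
    else:
        for error in schema.error_log:
            print(error)

A document-level check like this doesn't prove interoperability on its
own, but it does give implementors a concrete, repeatable target.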


>
> So far we have suggestions for tests, but we do not have suggestions
> for how we can help implementors in their implementation work.

I didn't read that in the charter, Michael. Could you point it out, please?


>
> It would probably be too simple to just put an "ODF Implementors
> Guidelines" on the list of deliverables, since we don't know if
> implementors have issues with implementing ODF, and if so where.

Out of scope for this group?


> So, preparing guidelines, which is a huge effort, without knowing
> where the issues are runs the risk that we are doing something no one
> needs.

Yes. Perhaps the main TC could help there by finding out where
implementation difficulties are?


> But what about having a mailing list or a forum at
> opendocument.xml.org that the oiic TC moderates, and that could be
> used as input for guidelines?


Only if it is added to this group's charter.


>
> That is:
>
> 7) A mailing list or forum for implementors that is moderated by this TC
> 8) Optionally an implementation guideline document that summarizes
> important guidelines discussed in the mailing list or forum.

We only have 90 days, Michael. Perhaps the implementers could do
that job on their own? They have the most to gain from it.


regards




-- 
Dave Pawson
XSLT XSL-FO FAQ.
http://www.dpawson.co.uk

