

Subject: Re: [oiic-formation-discuss] (1)(d) A list of deliverables, with projected completion dates.



Michael.Brauer@Sun.COM wrote on 06/13/2008 07:38:55 AM:

> robert_weir@us.ibm.com wrote:
> >
> > Another piece I suggest we start working on:  "(1)(d) A list of
> > deliverables, with projected completion dates".
> >
> > However, I'd suggest we discuss this as if it said "a prioritized
> > list of deliverables".  From a practical standpoint, it is
> > impossible to project completion dates until we have a good idea who
> > will be joining the proposed TC.  Those who do line up to join the TC
> > can huddle before we submit the charter and turn the prioritization into
> > projected dates.
> >
> > So far I've heard the following items (in no particular order)
> >
> > 1) A conformance test of ODF documents and implementations, i.e., test
> > the formal shall's and should's, etc.
>
> Dave has asked that already: What is this, exactly? A piece of software?
>


Sorry, I stated it too loosely.  The base deliverable is a "conformity assessment methodology" (I've also heard it called a "test requirements document"), a document that details the requirements of a conformance testing tool.  This would mainly be a task of collecting and collating the normative provisions of each ODF version, along with provisions in referenced standards, and putting them in a logical order, noting dependencies, assigning each one an ID, etc.  It would also define scoring and reporting requirements for a conformance test.
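
To make that concrete, here's a minimal sketch of what one entry in such a test-requirements document might look like, modeled as a Python data structure. The field names and the sample provision are my own invention for illustration, not drawn from any actual methodology document or ODF clause:

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical record format for one normative provision. The fields
    # and the sample values below are illustrative only.
    @dataclass
    class TestRequirement:
        req_id: str        # stable ID assigned by the methodology
        odf_version: str   # which ODF version the provision comes from
        clause: str        # spec section (or referenced standard) cited
        level: str         # "shall" (mandatory) or "should" (recommended)
        provision: str     # the normative text, quoted or paraphrased
        depends_on: List[str] = field(default_factory=list)  # prerequisites

    # An invented example entry:
    req = TestRequirement(
        req_id="ODF11-PKG-001",
        odf_version="1.1",
        clause="17.4",
        level="shall",
        provision="The 'mimetype' stream shall be the first file of the package.",
        depends_on=[],
    )

Stable IDs and explicit dependencies along these lines would let a testing tool order its runs and report per-provision results mechanically, which is where the scoring and reporting requirements come in.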

Then, we obviously would want an implementation of that assessment methodology. The W3C develops and hosts validators for many of its standards online. However, I have not seen this done at OASIS. I don't know if this is for lack of interest or because it is somehow problematic.

But in general, whether we develop the tool in OASIS or outside of OASIS, we start with the assessment methodology document.


> > 2) An Acid-style test of ODF implementations, i.e., feature and
> > rendering-oriented, essentially highlighting ODF features that are not
> > widely implemented (or implemented correctly) but are desired (by whom???)
>
> As for "rendering-oriented": This is only applicable to text documents,
> and maybe graphical documents, but not to spreadsheets and database
> frontend documents (new in ODF 1.2). If we provide tests, then we should
> provide tests that are applicable to all kind of documents. Another
> issue with rendering-based tests is that rendering is system dependent.
> On the one hand documents may render differently because of different
> fonts that are installed, or different hyphenation rules, and so on. On
> the other hand, and more important, documents can be rendered on very
> different devices. On a small device, you may for instance want to
> render a document differently than on a desktop. In a browser, you may
> want to render a document without page breaks. Test cases in my opinion
> should consider that.
>


I agree.  Whatever the TC produces, whether conformance tests, acid tests, or interoperability tests, should be traceable to a provision of the ODF standard, or to a profile of the ODF standard that the proposed TC creates.  There is no value in testing what is essentially implementation-defined behavior.

But with spreadsheets, I could imagine some things that could work for an acid test.  For example, couldn't you resize rows and columns and give cells color fills that make a picture, so that each cell is like a pixel?  Of course, I think the average user would be more interested in formula calculations than cell colors, but you could combine the two: a cell does a calculation and, using an IF() statement, shows a character or not depending on whether the calculation is correct.  A passing test suite would then show an "ASCII art" picture.  Weird, but it could be done.
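
Just to show the mechanics are workable, here's a toy Python simulation of what such a sheet would compute. The individual calculations and expected values are invented for illustration; a real test sheet would use ODF spreadsheet formulas like =IF(SUM(A1:A3)=6;"#";" ") rather than Python expressions:

    # Toy simulation of the "spreadsheet acid test" idea: each cell runs
    # a calculation, and an IF()-style check decides whether to paint a
    # block character. A conforming implementation renders the full row;
    # a broken formula leaves a hole in the picture.
    tests = [
        ("SUM",     sum([1, 2, 3]), 6),
        ("PRODUCT", 2 * 3 * 4,      24),
        ("MOD",     17 % 5,         2),
        ("MIN",     min(4, 7, 1),   1),
        ("MAX",     max(4, 7, 1),   7),
    ]

    # One row of the "picture": every passing cell paints a "#" pixel.
    print("".join("#" if got == want else " " for _, got, want in tests))

    for name, got, want in tests:
        print(f"{name:8} {'pass' if got == want else 'FAIL'}")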

> As for "not widely implemented": Most vendors implement those features
> that are important for their users. If a feature is not widely
> implemented, then this is a hint that the feature is maybe not so
> important. I think we should focus on those features actually that are
> widely used rather than on test cases where we know that the features
> are not widely implemented.
>


In many cases, yes. It could be a deliberate choice by the vendor.  

> > 3) A comprehensive test suite of atomic (single feature) tests
> > 4) A formal profile of ODF for portability and archiving, aka ODF/A
> > 5) A formal profile of ODF for browser-based applications
> > 6) A formal profile of ODF for mobile devices
>
> I would suggest combining 4-6 into a single deliverable, and adding a
> deliverable that works out what exactly a profile is and how one could
> be defined. Something like:
>
> 6a) A definition of an "ODF profile" concept, where an "ODF profile" is
> a well-defined subset of ODF suitable for representing a certain class
> of documents.
> 6b) A set of ODF profiles, including but not limited to
> - A formal profile of ODF for portability and archiving, aka ODF/A
> - A formal profile of ODF for browser-based applications
> - A formal profile of ODF for mobile devices
>
>


The concept of a profile is well-defined.  The W3C does them all the time.  OASIS has them as well.  But it is a good point that we would benefit from having a document that outlines how to profile ODF.
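
Once the subset is pinned down, checking a document against a profile can be mechanically simple. Here's a minimal sketch, assuming a hypothetical "minimal text" profile expressed as a whitelist of allowed elements; the element set is invented for illustration, and a real profile would be normatively defined and far larger:

    import xml.etree.ElementTree as ET

    OFFICE = "urn:oasis:names:tc:opendocument:xmlns:office:1.0"
    TEXT = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

    # Invented "minimal text" profile: the only elements a document may use.
    ALLOWED = {
        f"{{{OFFICE}}}document-content",
        f"{{{OFFICE}}}body",
        f"{{{OFFICE}}}text",
        f"{{{TEXT}}}p",
        f"{{{TEXT}}}h",
    }

    def profile_violations(content_xml):
        """List every element in a content.xml that falls outside the profile."""
        root = ET.fromstring(content_xml)
        return [el.tag for el in root.iter() if el.tag not in ALLOWED]

    sample = (
        f'<office:document-content xmlns:office="{OFFICE}" '
        f'xmlns:text="{TEXT}"><office:body><office:text>'
        "<text:p>Hello</text:p></office:text></office:body>"
        "</office:document-content>"
    )
    print(profile_violations(sample))  # [] -- stays inside the profile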


> > 7) A report on best practices for authoring portable documents
> > 8) A periodic interoperability report on the state of ODF
> > interoperability, with specific recommendations for implementors.
> >
> > What did I miss?
>
> What I'm missing a little bit is guidance for implementors.
> Simply speaking, the best way to achieve interoperability between ODF
> applications is for these applications to implement as much of ODF as
> is possible and reasonable for the specific application, with as few
> bugs as possible. Tests are helpful for measuring the quality of an
> implementation, but they don't help implementors with the
> implementation itself.
>

> So far we have suggestions for tests, but we do not have suggestions
> for how we can help implementors in their implementation work.
>
> It would probably be too simple to just put "ODF Implementors
> Guidelines" on the list of deliverables, since we don't know whether
> implementors have issues with implementing ODF, and if so, where.
> Preparing guidelines, which is a huge effort, without knowing where the
> issues are risks producing something no one needs. But what about
> having a mailing list or a forum at opendocument.xml.org that the
> oiic TC moderates and that could be used as input for guidelines?
>
> That is:
>
> 7) A mailing list or forum for implementors that is moderated by this TC
> 8) Optionally an implementation guideline document that summarizes
> important guidelines discussed in the mailing list or forum.
>
>


Implementation guidelines for interoperability we can certainly do, listing best practices in that area.

But I'm not sure what implementation guidelines in general would look like.  Any ideas?  We could, for example, give references that explain how to accomplish certain tasks in ODF.  So, for Bézier curves we could give a reference to an article that explains the most efficient way to do the calculations, and so on.  But this seems like a lot of work that might not have an audience.  ODF is really an encoding of conventional office documents; the applications already know how to do all these things.  They just, for the most part, need to figure out how to encode them in ODF.  So a text on "How to write a spreadsheet" would be overkill, but "How to add ODF support to your application" might be useful.
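
On that last point, the encoding side is easy to demonstrate. Here's a minimal sketch, using only the Python standard library, of about the smallest plausible ODF text package: a zip archive whose first entry is an uncompressed "mimetype" stream, plus a manifest and a content.xml holding one paragraph. (A real application would also write styles, settings, and metadata; the file name is arbitrary.)

    import zipfile

    CONTENT = """<?xml version="1.0" encoding="UTF-8"?>
    <office:document-content
        xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0"
        xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0"
        office:version="1.1">
      <office:body>
        <office:text>
          <text:p>Hello from a hand-built ODF package.</text:p>
        </office:text>
      </office:body>
    </office:document-content>
    """

    MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
    <manifest:manifest
        xmlns:manifest="urn:oasis:names:tc:opendocument:xmlns:manifest:1.0">
      <manifest:file-entry manifest:full-path="/"
          manifest:media-type="application/vnd.oasis.opendocument.text"/>
      <manifest:file-entry manifest:full-path="content.xml"
          manifest:media-type="text/xml"/>
    </manifest:manifest>
    """

    with zipfile.ZipFile("hello.odt", "w", zipfile.ZIP_DEFLATED) as zf:
        # The "mimetype" stream goes first and is stored uncompressed.
        zf.writestr(zipfile.ZipInfo("mimetype"),
                    "application/vnd.oasis.opendocument.text",
                    compress_type=zipfile.ZIP_STORED)
        zf.writestr("META-INF/manifest.xml", MANIFEST)
        zf.writestr("content.xml", CONTENT)

Something at that level, walking up from the package structure through styles and metadata, is roughly what I'd picture a "How to add ODF support to your application" text covering.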


