
oiic-formation-discuss message


Subject: Re: [oiic-formation-discuss] Reference implementation strikes gold

--- On Mon, 6/16/08, jose lorenzo <hozelda@yahoo.com> wrote:

> From: jose lorenzo <hozelda@yahoo.com>
> Subject: [oiic-formation-discuss] Reference implementation strikes gold
> To: oiic-formation-discuss@lists.oasis-open.org
> Date: Monday, June 16, 2008, 1:29 AM
> [Reference that may be useful for adding context to: writing
> "twists" into documents, "cheating", and
> other parts below:
> http://lists.oasis-open.org/archives/oiic-formation-discuss/200806/msg00265.html
> ]


> There is one thing I haven't seen mentioned yet in what
> I have read. The reference implementation (eg, OO.o as you
> stated, but whatever it is) is not only so that people can
> see the source to study it or to provide some error
> redundancy to the spec and test suites. Very importantly,
> the reference implementation serves as THE tool that people
> should all have in order to display (doublecheck) their ODF
> documents made with their vendor's tools. This process
> should eventually be automated (through contributed
> scripts) to increase the likelihood users will make testing
> of their documents virtually automatic (Save + Test).


Following up on two points.


> ... What I mean (or one possibility
> for a round of tests) is that the macros could run so that
> the document is exposed in various ways and the user then
> gets to decide if that behavior is correct (as expected).
> So this could definitely involve the user's subjective
> call, but can possibly be made as simple as the smiley acid
> test.

Contrary to the confidence expressed in the quote above, I am not sure how much of this can be automated into a simple smiley-style test or an automatic pass/fail. There is one obvious, trivial test: just opening the document in the ref impl. Beyond that, I am thinking that other actions (scrolling, changing worksheets, editing a cell, changing colors, editing titles, etc.) will help reveal whether the internals of the data are represented as an ODF ref impl expects.
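Before the ref impl is even involved, the trivial "does it open" test can be approximated by a package-level sanity check. A minimal sketch using only the Python standard library (the helper name is mine, and passing it proves only that the package is not grossly broken, not that a reference implementation would render it correctly):

```python
import zipfile
import xml.etree.ElementTree as ET

def basic_odf_check(path):
    """Package-level sanity check: is this at least a plausible ODF file?

    This does NOT prove the document opens correctly in a reference
    implementation; it only rules out gross breakage before deeper tests.
    """
    if not zipfile.is_zipfile(path):
        return False
    with zipfile.ZipFile(path) as z:
        names = z.namelist()
        # An ODF package carries a mimetype entry and a content.xml.
        if "mimetype" not in names or "content.xml" not in names:
            return False
        if not z.read("mimetype").startswith(b"application/vnd.oasis.opendocument"):
            return False
        try:
            ET.fromstring(z.read("content.xml"))  # content must be well-formed XML
        except ET.ParseError:
            return False
    return True
```

Hooking something like this into a "Save + Test" script would give the fully automatic first round; the interactive rounds discussed below would follow it.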

The idea is to test *handling* and, indirectly, the internal data representation. This approach may help address the example you gave for pixel matching, where a vector drawing is rasterized -- a "bad" outcome that a good test would ideally catch.
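Part of the rasterization case could even be checked mechanically, by scanning content.xml and flagging drawings that appear only as embedded bitmaps. A rough sketch (the element names come from the ODF drawing namespace; the heuristic itself is my assumption, not anything in a spec or existing suite):

```python
import xml.etree.ElementTree as ET

DRAW_NS = "urn:oasis:names:tc:opendocument:xmlns:drawing:1.0"

# Elements that carry real vector geometry (a non-exhaustive sample).
VECTOR_TAGS = {f"{{{DRAW_NS}}}{t}" for t in
               ("line", "polyline", "polygon", "path",
                "custom-shape", "ellipse", "rect")}
IMAGE_TAG = f"{{{DRAW_NS}}}image"

def drawing_kinds(content_xml):
    """Count vector shapes vs embedded bitmaps in a content.xml string.

    Many bitmaps with no vector shapes is a hint (only a hint) that a
    producer rasterized its drawings instead of emitting ODF geometry.
    """
    root = ET.fromstring(content_xml)
    vectors = sum(1 for el in root.iter() if el.tag in VECTOR_TAGS)
    bitmaps = sum(1 for el in root.iter() if el.tag == IMAGE_TAG)
    return vectors, bitmaps
```

A result of many bitmaps and zero vectors would not be a failure by itself, but it would tell a human tester where to look.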

We would also be testing the interface's responsiveness to the user. A document might look fine superficially, and its data representation might also pass isolated pattern-matching tests (Schematron, perhaps), yet the two might be linked only haphazardly or not at all. For example, a document might contain vector drawings that render something familiar on screen, but trying to edit those on-screen vectors fails, or does not connect properly to the vector model stored in the document. Separately and superficially, the model and the visuals each seem "probably correct"; tested together (a user editing a known property with keyboard or mouse), a major break may be revealed.
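For the isolated pattern-matching side, Schematron (mentioned above) is a natural fit. A hedged sketch of what one such rule might look like -- the pattern and assertion are purely illustrative, not taken from any existing suite:

```xml
<schema xmlns="http://purl.oclc.org/dsdl/schematron">
  <ns prefix="draw" uri="urn:oasis:names:tc:opendocument:xmlns:drawing:1.0"/>
  <pattern>
    <rule context="draw:frame">
      <!-- Illustrative rule: a frame should contain some recognized
           ODF content element, not just opaque filler. -->
      <assert test="draw:image or draw:object or draw:text-box">
        draw:frame carries no recognized ODF content element.
      </assert>
    </rule>
  </pattern>
</schema>
```

Rules like this only test the model in isolation, which is exactly why the interactive editing tests above would still be needed alongside them.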

Anyway, I am not sure what can be done here to factor out the subjective aspects and to simplify whatever subjective checks remain.


The other item I want to mention is that this type of testing can have some success in verifying "that capabilities/features explicit in ODF are not bypassed (or less so).. that any capabilities/features that resemble those in ODF are actually implemented in legal ODF."

Without a defense against this, an app can generate strictly conforming documents one after another that are nevertheless void of any useful ODF structure usable and understood by other apps, because the ODF-like features would be implemented through different markup. (This can be the case even if "foreign element tags" are avoided, as discussed in the link mentioned in the parent post.)
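A crude first-pass check for this failure mode might scan content.xml for namespaces outside the ODF family. A sketch using only the Python standard library (the whitelist is deliberately partial and is my assumption, not a spec list; and, as noted, a document can be ODF-poor even with no foreign elements at all):

```python
import re
import xml.etree.ElementTree as ET

ODF_NS_PREFIX = "urn:oasis:names:tc:opendocument:xmlns:"
# Non-ODF namespaces that ODF documents legitimately reuse (partial list).
KNOWN_OK = {
    "http://www.w3.org/1999/xlink",
    "http://purl.org/dc/elements/1.1/",
    "http://www.w3.org/1998/Math/MathML",
}

def foreign_namespaces(content_xml):
    """Return element namespaces that are neither ODF nor whitelisted.

    Hits are only a starting point for human review, not a verdict.
    """
    found = set()
    for el in ET.fromstring(content_xml).iter():
        m = re.match(r"\{(.+)\}", el.tag)
        if m:
            found.add(m.group(1))
    return {ns for ns in found
            if not ns.startswith(ODF_NS_PREFIX) and ns not in KNOWN_OK}
```

Again, an empty result proves nothing by itself; the interactive ref-impl tests are what would expose ODF-like features smuggled in through other means.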

OK, so the ref impl tests help test these aspects of ODF interop. The question then becomes (assuming this all works out in the TC, with working examples): would it be a good move to codify what is being done? That is, should the ODF TC be contacted, if necessary, to add language that codifies it? A particular profile, or perhaps ODF proper, might gain from wording saying that ODF-like features must be implemented through ODF-defined elements. That requirement is somewhat open-ended and subject to ambiguity, so it may be better to simply push that aspect of interoperability informally by marketing the testing. [Certainly, if we could find suitable wording, that would be very desirable.]

