Subject: Re: [oic] Thoughts about Interoperability (Part 1)


Hi Svante,

When the OIC TC first started, we had thoughts similar to this.  We talked 
about reviewing the text of the standard and developing "test assertions" 
which are testable statements, derived from normative statements in the 
specification.  We thought that these test assertions could then be 
automated. 

This all works very well in theory.  And even in practice, it would find, 
via automated means, nearly 100% of the statically testable defects in an 
implementation.  Note that I make a distinction here between 
statically-testable and dynamically-testable.  A statically-testable test 
assertion can be tested by examining the document itself; it is a 
statement about the structure, validity, syntax, encryption, packaging, etc. 
Dynamic tests, on the other hand, require a runtime environment.  They deal 
with issues of layout, presentation, calculation, etc., and are much 
harder to automate.
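
To make the distinction concrete: one statically-testable assertion from 
the packaging part of the spec is that the "mimetype" stream has to be 
the first entry in the package and must not be compressed.  A rough 
sketch of how that single assertion could be automated in plain Java (no 
ODFDOM involved; the class and method names here are just illustrative):

    import java.io.FileInputStream;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;

    // Statically-testable assertion: the first entry of the ODF package
    // must be a file named "mimetype", stored without compression.
    public class MimetypeAssertion {

        public static boolean check(String odfPath) throws Exception {
            try (ZipInputStream zip =
                    new ZipInputStream(new FileInputStream(odfPath))) {
                ZipEntry first = zip.getNextEntry();
                return first != null
                        && "mimetype".equals(first.getName())
                        && first.getMethod() == ZipEntry.STORED;
            }
        }

        public static void main(String[] args) throws Exception {
            System.out.println(check(args[0]) ? "PASS" : "FAIL");
        }
    }

A dynamic assertion, say about line breaking or recalculation, cannot be 
checked this way; it needs a running implementation.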

When we then looked at the kinds of interoperability problems that were 
popping up in the real world, what users were actually reporting, we saw 
that most of them are on the dynamic side.  Sure, there are some 
static defects as well, but they are few and generally not causing a 
lot of trouble. 

So that was why we focused on Plugfests.  It was a question of efficiency: 
how can we improve interoperability for users the most in a given amount 
of time?  If you look at this as a QA question, that is always the big 
question: what technique or combination of techniques will give the best 
"defect yield", meaning the number of defects found per unit of testing 
effort?  Generally, some combination of techniques will be best, since any 
single technique tends to give diminishing returns. 

So I think a static approach is worth pursuing in parallel.  Especially if 
we use ODFDOM, this will help find defects that impact processing of 
documents via automated means, which is a new class of users, and 
presumably will uncover a new class of bugs.

-Rob

Svante.Schubert@Sun.COM wrote on 01/26/2010 07:00:02 PM:

> 
> To achieve interoperability there are certainly many approaches. A 
> very good one is to have Plugfests, 
> where ODF application maintainers come together, discuss annoying 
> problems and show their improvements since the last Plugfest.
> 
> Unfortunately this approach does not maximize management 
> satisfaction, as such point-in-time examinations do not align with 
> management goals, which have to be planned a year ahead and reviewed 
> afterward.
> 
> Even more, it is hard to give a status, i.e.
> a) to explain what is still missing for perfect interoperability and 
> b) how much time is needed to finish the interoperability 
> functionality.
> 
> For this reason my colleagues and I would like to pursue, alongside 
> Plugfests, a more formal approach (certainly taking over work already 
> done by the OIC TC):
> First of all we would like to have a strategic liaison between the 
> OIC TC and the ODF Toolkit Union, especially the ODFDOM library.
> The idea: while in the OIC TC we are specifying tests and test 
> documents, ODFDOM will fill the theory with life and use these test 
> documents and test blueprints to implement tests working on the 
> documents, becoming some kind of reference implementation and a 
> proof of concept.
> The idea is to do as much automation as possible, in the end hopefully 
> even loading test specifications into a tool, but step by step.
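> 
> A first rough sketch of what one such automated test might look like 
> (using plain JAXP here instead of ODFDOM, with purely illustrative 
> class names), checking the assertion that content.xml has the 
> office:document-content root element:
> 
>     import java.util.zip.ZipEntry;
>     import java.util.zip.ZipFile;
>     import javax.xml.parsers.DocumentBuilderFactory;
>     import org.w3c.dom.Document;
>     import org.w3c.dom.Element;
> 
>     // Test blueprint turned into code: the root of content.xml must
>     // be office:document-content in the ODF office namespace.
>     public class ContentRootTest {
>         static final String OFFICE_NS =
>                 "urn:oasis:names:tc:opendocument:xmlns:office:1.0";
> 
>         public static boolean check(String odfPath) throws Exception {
>             try (ZipFile zip = new ZipFile(odfPath)) {
>                 ZipEntry content = zip.getEntry("content.xml");
>                 if (content == null) {
>                     return false;
>                 }
>                 DocumentBuilderFactory dbf =
>                         DocumentBuilderFactory.newInstance();
>                 dbf.setNamespaceAware(true);
>                 Document dom = dbf.newDocumentBuilder()
>                         .parse(zip.getInputStream(content));
>                 Element root = dom.getDocumentElement();
>                 return OFFICE_NS.equals(root.getNamespaceURI())
>                         && "document-content".equals(root.getLocalName());
>             }
>         }
>     }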
> 
> Why are tests necessary? 
> Not only with regard to interoperability: trust is good, but control 
> is better. 
> In the end we might come up with something like the ACID3 tests, 
> which test CSS in browsers and push standardization among browsers. 
> 
> So how do we start? What is the formal approach?
> One thing is certain: this formal approach will be anything but 
> easy. Therefore the first task is to break down complexity:
> 1) For a start I would like to neglect layout completely; even 
> without layout the task is complex enough.
> 2) Second, I would like to focus only on the latest ODF 1.2; it is 
> likely that we won't be finished with this task by the time ODF 1.2 
> is already an ISO standard.
> 3) Aside from neglecting things, there is a further way to lower 
> complexity: we can modularize ODF 1.2 into smaller, disjoint, 
> testable pieces.
> What can these pieces be?
> Taking a close look at ODF 1.2, it becomes clear that it is already 
> modularized. There are three parts of the spec:
> a) The Package Layer can be tested without the other two (see the 
> sketch below)
> b) The ODF Schema can be tested without Formula, but requires Package 
> Layer functionality
> c) Formula... I neglect it for the sake of easiness.
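> 
> To illustrate a): here is a sketch of a purely Package Layer assertion 
> that needs no ODF Schema knowledge at all (again plain Java with 
> illustrative names): META-INF/manifest.xml must exist and must be 
> well-formed XML.
> 
>     import java.util.zip.ZipEntry;
>     import java.util.zip.ZipFile;
>     import javax.xml.parsers.DocumentBuilderFactory;
>     import org.xml.sax.SAXException;
> 
>     // Package Layer assertion only: META-INF/manifest.xml must exist
>     // and be well-formed XML. No ODF Schema knowledge is needed.
>     public class ManifestTest {
> 
>         public static boolean check(String odfPath) throws Exception {
>             try (ZipFile zip = new ZipFile(odfPath)) {
>                 ZipEntry manifest = zip.getEntry("META-INF/manifest.xml");
>                 if (manifest == null) {
>                     return false;
>                 }
>                 DocumentBuilderFactory dbf =
>                         DocumentBuilderFactory.newInstance();
>                 dbf.setNamespaceAware(true);
>                 try {
>                     dbf.newDocumentBuilder()
>                             .parse(zip.getInputStream(manifest));
>                     return true;
>                 } catch (SAXException notWellFormed) {
>                     return false;
>                 }
>             }
>         }
>     }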
> 
> Remember, we already started to find out that the Package Layer and 
> the ODF Schema spec can be modularized into components; I will call 
> all components "features" in the following.
> Package features are indicated by the table of contents, and a draft 
> of package features was mentioned before. 
> Mandatory features are indicated in the spec by ISO/IEC directives 
> (i.e. SHALL/SHALL NOT).
> 
> The biggest modules of the ODF Schema part are certainly the 
> different document MIME types that ODF 1.2 provides.
> But even the XML nodes within can be grouped/modularized into 
> features, such as an image, a hyperlink or a table.  Most often a 
> set of elements and attributes forms such a commonly known object; 
> these objects are often mentioned by product managers, also exist in 
> other file formats such as HTML, DocBook, etc., and some of them 
> have their origin in print media.
> 
> The possible set of features can be guessed by taking a look at the 
> headings of the ODF 1.1 spec.
> (NOTE: As XML nodes are reused across different MIME types, a feature 
> like a 'table' exists in both a text and a spreadsheet document, so 
> similar tests can be reused; see the sketch below.)
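> 
> As a sketch of such a reusable feature test (plain JAXP again, 
> illustrative names): the same 'table' check works on the content.xml 
> of a text document and of a spreadsheet document, because table:table 
> lives in the same namespace in both.
> 
>     import java.util.zip.ZipFile;
>     import javax.xml.parsers.DocumentBuilderFactory;
>     import org.w3c.dom.Document;
> 
>     // Feature test reusable across MIME types: does the document
>     // contain at least one table:table element?
>     // (Assumes content.xml exists in the package.)
>     public class TableFeatureTest {
>         static final String TABLE_NS =
>                 "urn:oasis:names:tc:opendocument:xmlns:table:1.0";
> 
>         public static boolean hasTable(String odfPath) throws Exception {
>             try (ZipFile zip = new ZipFile(odfPath)) {
>                 DocumentBuilderFactory dbf =
>                         DocumentBuilderFactory.newInstance();
>                 dbf.setNamespaceAware(true);
>                 Document dom = dbf.newDocumentBuilder()
>                         .parse(zip.getInputStream(zip.getEntry("content.xml")));
>                 return dom.getElementsByTagNameNS(TABLE_NS, "table")
>                         .getLength() > 0;
>             }
>         }
>     }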
> 
> ....to be continued
> 
> (Next time: how ODFDOM might help, Bart's tests and the additional 
> test tools we already work on, what a test may look like, and what 
> unexpected positive side effects may occur.)
> 
> Regards,
> Svante
> 
> 


