

Subject: Re: Idea about how to create conformance testing documents


Dear TC members,

I am continuing the topic raised 2 months ago; the initial message about how to create test suites is here: http://www.oasis-open.org/apps/org/workgroup/oic/email/archives/200810/msg00024.html. Both previous messages were immature drafts, and apart from Bart there were no other responses.

Based on the previous ideas, Bart's comments, and my further thinking, I summarize here:

Some OASIS facilities, such as Subversion and Jira, are now ready for us to use, and a test metadata proposal is ready. This is the TC's current achievement. The next steps:

First we need to solve 2 issues sequentially:

Issue #1: We may all agree to create test documents feature by feature (this needs to be confirmed), but how do we define the feature unit? The basic principles should be: (i) avoid duplicated effort; (ii) be convenient to test; (iii) be convenient to maintain. The following options could be considered before making a choice:
(a) One test document per MIME type, so we would only create and maintain N test documents in total, where N is the number of MIME types.

(b) One test document per independent feature defined in the ODF specification. Here "independent" is not absolute; I think a feature defined at the Heading 2 level in the ODF specification can be regarded as an independent feature. In ODF 1.2 Part 1 there are about 160 such features. I do not count chapters 18 and 19, because those two chapters are lists of attributes and properties. ODF 1.2 Part 2, OpenFormula, is not counted here because the sub-TC already has a self-test document. For ODF 1.2 Part 3, the package is fundamental to ODF and does not need separate testing, while digital signatures can be regarded as one independent feature.
If we choose this option to define the feature unit, we will often need to create multiple test documents for the same feature unit for different MIME types, because most features in ODF are common rather than limited to one specific MIME type. The number of test documents per feature unit is not fixed and should be decided by TC review. For example, most features may only need 3 test documents (text document, spreadsheet, and presentation), but Data Pilot applies only to spreadsheets, so it needs just 1 spreadsheet test document.
So under this option we would create and maintain about 500 test documents in total.

(c) One test document per element defined in the ODF specification. The final number of test documents would be very large.
(d) Other definitions proposed by TC members?

I prefer option (b) since it meets the 3 basic principles, but let's discuss it and make a decision.


Issue #2: We need to set up an operable process for test suite creation and maintenance and submit it for TC approval. Since test suite creation and maintenance is a long-term activity, I suggest this process be approved as one of the TC standing rules; the TC first needs to make a decision on this. The process should be simple, operable, and efficient. Your suggestions are welcome.

Below is the draft process:
(1) Following the feature unit definition, a member creates one or more test documents for a feature unit. The member should do enough unit testing before submitting to the TC;
(2) The member uploads the test documents to the Subversion repository;
(3) The member sends a mail to the TC mailing list to announce his/her intention to contribute the test documents, providing the document link in the Subversion repository;
(4) TC members review the test documents and send comments through the mailing list. The review shall at least cover:
(4.1) The test document shall not contain extra information, e.g. vendor-specific information;
(4.2) The test document shall be schema-valid. This can be checked with a validation tool, e.g. Jing or the ODF Validator at odftoolkit.org (see the validation sketch after this list);
(4.3) The test document shall pass semantic verification, i.e. its behavior is consistent when the test document is opened and rendered by different existing ODF applications. If there is a conflict, the TC decides which behavior is correct by majority vote.
(5) The member requests discussion/approval of the proposal at least 2 days before the next TC meeting;
(6) The TC meeting discusses/approves the proposal, or requests the member to modify and resubmit it, or transfers it to the ODF TC if it relates to the ODF specification;
(7) Steps (4)-(6) repeat until the proposal is approved.
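To make step (4.2) concrete, here is a rough Python sketch of how the schema check could be scripted around Jing. The jing.jar location and the ODF 1.2 schema file name are placeholders I made up for illustration, not TC decisions:

"""Schema-validate the XML streams of an ODF test document with Jing.
A sketch only: JING_JAR and ODF_SCHEMA are assumed local paths."""
import subprocess
import sys
import tempfile
import zipfile
from pathlib import Path

JING_JAR = "jing.jar"                           # assumed path to Jing
ODF_SCHEMA = "OpenDocument-v1.2-os-schema.rng"  # assumed schema file

def validate(odf_path: str) -> bool:
    """Unpack the ODF package and run Jing on each XML stream."""
    ok = True
    with tempfile.TemporaryDirectory() as tmp:
        with zipfile.ZipFile(odf_path) as pkg:
            pkg.extractall(tmp)
        for stream in ("content.xml", "styles.xml", "meta.xml"):
            xml_file = Path(tmp) / stream
            if not xml_file.exists():
                continue
            result = subprocess.run(
                ["java", "-jar", JING_JAR, ODF_SCHEMA, str(xml_file)],
                capture_output=True, text=True)
            if result.returncode != 0:  # non-zero exit means invalid
                print(f"{stream}: INVALID")
                print(result.stdout or result.stderr)
                ok = False
    return ok

if __name__ == "__main__":
    sys.exit(0 if validate(sys.argv[1]) else 1)

Running it would look like: python validate_odf.py some-test-document.odt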

A question here: I know test metadata is important, but it is not yet clear to me where the test metadata is appropriate to use in test document creation. My understanding is that test metadata should be used in interoperability testing to record test cases, right? Maybe an ignorant question :)

Second, besides the above 2 issues, we have the following tasks to follow up:
(1) Define the Subversion structure for storing the test documents. Should it mirror the ODF specification analysis, or should there be one unified storage location, with each SpecAnalysis node linking to the corresponding document location?
(2) Develop or borrow a tool to erase extra information from test documents. This is not mandatory, but better if we have it;
(3) Provide the application name in meta.xml; I propose to use "OASIS OIC TC", and propose to use a tool to process it (a sketch of such a tool follows this list).
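For tasks (2) and (3), here is a minimal Python sketch of what such a tool might look like, using only the standard library. The list of fields to strip is my assumption of what counts as "extra info" and is open to review; the sketch rewrites meta.xml so vendor-identifying fields are removed and meta:generator is set to "OASIS OIC TC":

"""Scrub vendor info from an ODF package's meta.xml and set the
generator to "OASIS OIC TC". A sketch; STRIP is an assumed list."""
import xml.etree.ElementTree as ET
import zipfile

OFFICE = "urn:oasis:names:tc:opendocument:xmlns:office:1.0"
META = "urn:oasis:names:tc:opendocument:xmlns:meta:1.0"
DC = "http://purl.org/dc/elements/1.1/"
for prefix, uri in (("office", OFFICE), ("meta", META), ("dc", DC)):
    ET.register_namespace(prefix, uri)

# Fields assumed to be vendor- or author-identifying "extra info".
STRIP = (f"{{{META}}}initial-creator", f"{{{DC}}}creator",
         f"{{{META}}}printed-by", f"{{{META}}}user-defined")

def scrub(src: str, dst: str) -> None:
    """Write a copy of the package with a cleaned meta.xml."""
    with zipfile.ZipFile(src) as pkg:
        root = ET.fromstring(pkg.read("meta.xml"))
        office_meta = root.find(f"{{{OFFICE}}}meta")
        for tag in STRIP:
            for el in office_meta.findall(tag):
                office_meta.remove(el)
        gen = office_meta.find(f"{{{META}}}generator")
        if gen is None:
            gen = ET.SubElement(office_meta, f"{{{META}}}generator")
        gen.text = "OASIS OIC TC"
        new_meta = ET.tostring(root, encoding="UTF-8")
        with zipfile.ZipFile(dst, "w") as out:
            # Keep member order so mimetype stays first and uncompressed.
            for item in pkg.infolist():
                data = new_meta if item.filename == "meta.xml" \
                    else pkg.read(item)
                out.writestr(item, data)

Usage would be, e.g., scrub("input.ods", "cleaned.ods").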





From: Ming Fei Jia/China/IBM
To: oic@lists.oasis-open.org
Date: 12/20/2008 10:46 PM
Subject: Idea about how to create conformance testing documents




Dear TC members,

On the last TC call, I mentioned this idea. Here I state it on the mailing list.

We know that one of the TC deliverables is the test corpus documents. I've seen that Bart has proposed the test metadata, uploaded some example documents to the wiki, created some document categories (e.g. atomic tests, complex tests), and called for volunteers to contribute. But it seems we still do not know clearly how to start. ODF is an XML-based document format, containing a great many elements and attributes. I think the targets of the test corpus documents are: (1) to cover all the elements (and attributes) defined in the ODF specification; (2) that these test corpus documents can be opened/saved by different real ODF applications; and (3) that the behaviors in different ODF applications are the same, or differ only as allowed by the conformance clause.
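Target (1) can be made measurable. Below is a rough Python sketch that compares the element names defined in the ODF RELAX NG schema with the elements actually used in a test document's content.xml. For simplicity it compares local names only (a real tool should map namespaces properly), and the schema file name is a placeholder:

"""Rough element-coverage check for target (1). The schema path is an
assumption; comparing local names only is a deliberate simplification."""
import xml.etree.ElementTree as ET
import zipfile

RNG_ELEMENT = "{http://relaxng.org/ns/structure/1.0}element"

def local(name: str) -> str:
    """Reduce "prefix:local" or "{uri}local" to the local name."""
    return name.split("}")[-1].split(":")[-1]

def schema_elements(rng_path: str) -> set:
    """Every named element the RELAX NG schema defines."""
    root = ET.parse(rng_path).getroot()
    return {local(el.get("name")) for el in root.iter(RNG_ELEMENT)
            if el.get("name")}

def used_elements(odf_path: str) -> set:
    """Every element actually present in the document's content.xml."""
    with zipfile.ZipFile(odf_path) as pkg:
        root = ET.fromstring(pkg.read("content.xml"))
    return {local(el.tag) for el in root.iter()}

if __name__ == "__main__":
    defined = schema_elements("OpenDocument-v1.2-os-schema.rng")
    used = used_elements("test-document.odt")
    print(f"covered {len(defined & used)} of {len(defined)} elements")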

To satisfy the above 3 targets, one method is to create very many test documents, e.g. one test document per element. But so many test documents are difficult to manage and also difficult for ODF applications to test against.

So another method is to create test documents according to the kinds of applications. For example, ODF currently defines 5 application types (text document, presentation, spreadsheet, drawing, database), so we create only 5 test documents, each of which is one whole, big test document. The process of creating such a test document could be:
(1) Someone, e.g. Bart, first creates the basic document framework, e.g. packaging, root document element, and metadata (a minimal sketch of such a framework follows below);
(2) Put the test document into a version management tool, e.g. Subversion;
(3) Call for volunteers to contribute to the test document; the contribution unit could be one or more elements;
(4) TC members review the contribution. Two things shall be done during review: one is to validate the XML against the schema, the other is to verify the semantics (open/save/behavior) in at least 2 real ODF applications;
(5) The TC meeting ballots on the contribution. If it passes (it generally will, since this is not like a specification), the contributor is responsible for checking the contribution into the test document via the version management tool;
(6) The TC chair or secretary marks which elements have been contributed and maintains a statistics table; or maybe the version management tool can produce such statistics, so that other TC members can check the status before contributing;
(7) Repeat the above 6 steps over many cycles until all the defined elements are written into the test document, so that a complete test document for one kind of ODF application is built up.
Finally, our test corpus would consist of only 5 test documents, each already verified step by step in real ODF applications. This process is just like iterative product development.
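To show what step (1) amounts to, here is a minimal Python sketch that builds the bare framework of an ODF text document: the mimetype entry (which the packaging rules require to be first and stored uncompressed), a manifest, and a root document element. The placeholder paragraph text is only mine:

"""Create the bare framework of an ODF text test document:
packaging, manifest, and root document element (step (1))."""
import zipfile

MIMETYPE = "application/vnd.oasis.opendocument.text"

MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
<manifest:manifest xmlns:manifest="urn:oasis:names:tc:opendocument:xmlns:manifest:1.0">
 <manifest:file-entry manifest:media-type="application/vnd.oasis.opendocument.text" manifest:full-path="/"/>
 <manifest:file-entry manifest:media-type="text/xml" manifest:full-path="content.xml"/>
</manifest:manifest>
"""

CONTENT = """<?xml version="1.0" encoding="UTF-8"?>
<office:document-content
    xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0"
    xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0"
    office:version="1.2">
 <office:body>
  <office:text>
   <text:p>Framework document; feature contributions go here.</text:p>
  </office:text>
 </office:body>
</office:document-content>
"""

def create_framework(path: str) -> None:
    with zipfile.ZipFile(path, "w") as pkg:
        # ODF packaging rule: mimetype first, stored uncompressed.
        pkg.writestr("mimetype", MIMETYPE,
                     compress_type=zipfile.ZIP_STORED)
        pkg.writestr("META-INF/manifest.xml", MANIFEST,
                     compress_type=zipfile.ZIP_DEFLATED)
        pkg.writestr("content.xml", CONTENT,
                     compress_type=zipfile.ZIP_DEFLATED)

if __name__ == "__main__":
    create_framework("framework.odt")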

The advantages I can see in the above method:
(1) The TC can easily manage the test corpus documents;
(2) It avoids duplicated work when creating common document elements;
(3) It is convenient for ODF applications to verify their conformance.

The limitations or troubles that the above method may bring:
(1) The generated test corpus documents are used to verify ODF application conformance; they cannot be used to validate an individual ODF document, which the existing ODF validators already handle;
(2) It requires each feature unit (possibly composed of one or more elements) to be independent within the whole big test document; that is, when an ODF application renders the test document and some feature unit fails, the application can skip that feature unit and continue rendering the next one;
(3) It seems difficult to record the verification result (pass or fail for each feature unit) automatically if the ODF application does not support that. Maybe we can provide some metadata in the document. I need TC members' help to work out a solution.

This is only my immature idea; any comments are welcome. One reason to raise it now is that we need to define the test corpus contribution process ASAP, since time flies and the end of 2008 is approaching :) If the process is not defined, the actual work will be difficult to move forward.

