
Subject: Re: [oic] some thoughts on test documents


It seems difficult to embed comments in the last mail, so based on the previous discussion I will put my understanding here:

The reference documents should conform to the standard, so the procedure for creating test documents is essentially a conformance test procedure:

1. Conformance test
1.1 Test environment preparation
1.1.1. Select the ODF version, e.g. ODF 1.1/1.2?
1.1.2. Select the testing platforms, e.g. Windows, Linux
1.1.3. Select one reference ODF implementation, e.g. OpenOffice.org
1.1.4. Select the ODF implementation to compare against, e.g. Symphony
1.1.5. Figure out the test case format
1.1.6. Set up a central test case management repository (OASIS help needed)
1.1.6.1 How to upload reference ODF documents
1.1.6.2 Where to store the test case descriptions
1.1.7. Build a script for removing unnecessary markup from reference documents
1.1.8. Build a universal schema validation tool or service. Can we borrow the OpenDocument Fellowship ODF validator or the OpenOffice.org ODF validator, or do we plan to build another one?
1.2 Test case creation
1.2.1. Prioritize the features to test, e.g. error-prone features have top priority; the test cases can be delivered in multiple phases
1.2.2. TC members cooperate to create test cases, e.g. each member takes on different features
1.2.2.1 Create reference documents using the reference implementation
1.2.2.2 Run the script to remove unnecessary markup from the reference ODF documents
1.2.2.3 Write the test case in the central repository, e.g. upload the reference document and write the test description
1.3 Test case verification
1.3.1. TC members cooperate to verify each test case, e.g. tasks are assigned as in the test case creation phase
1.3.1.1 Run the schema validation tool to verify syntactic conformance
1.3.1.2 Run the test case to verify semantic conformance
1.3.1.2.1 Open the reference document separately in the reference ODF implementation and the compared ODF implementation
1.3.1.2.2 If the behaviors of the reference ODF implementation and the compared ODF implementation are the same, the case passes
1.3.1.2.3 Otherwise, report a defect against the test case
1.4 Fix test defects
1.4.1 TC members work out a universal method for fixing the defects, e.g. manually compare the definitions in the specification with the reference documents
1.4.2 TC members cooperate to fix the defects according to the task assignments
1.4.2.1 Is it a reference implementation issue, or an issue in the compared implementation? If one of the implementations is correct, use that implementation to re-create the reference document. If both have issues, we can refer to a third (or more) implementation.
1.4.2.2 Is it a standard issue? If yes, report a defect to the ODF TC, and also record the issue in our internal report for future tracking
1.4.3. When almost all of the test cases pass, our first goal is achieved :)
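As a rough illustration of the syntax-conformance check in step 1.3.1.1, here is a minimal sketch assuming the reference document is an ordinary ODF package (a ZIP archive). A real validator would check each XML stream against the ODF RELAX NG schema; this only verifies the package layout and XML well-formedness, which catches the grossest defects before schema validation is even attempted. The `REQUIRED_PARTS` list is an assumption about which streams a test document will contain:

```python
# Sketch only: a real conformance check validates against the ODF RELAX NG
# schema. This first pass flags a broken package or malformed XML.
import zipfile
import xml.etree.ElementTree as ET

REQUIRED_PARTS = ["content.xml", "styles.xml", "meta.xml"]  # assumed streams

def check_odf_package(path):
    """Return a list of problems found in the ODF package at `path`."""
    problems = []
    with zipfile.ZipFile(path) as pkg:
        names = set(pkg.namelist())
        for part in REQUIRED_PARTS:
            if part not in names:
                problems.append("missing part: " + part)
                continue
            try:
                ET.fromstring(pkg.read(part))  # well-formedness check only
            except ET.ParseError as exc:
                problems.append("%s: %s" % (part, exc))
    return problems
```

A document that passes this check would still need to be run through a schema validator (1.1.8) before being accepted as a reference document.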

In the above I prefixed the numbers with "1" because I would like to write a "2" later, covering methodologies and best practices for interoperability issues. But of course that is not the topic of this thread, so I will stop here :)
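For the markup-stripping script in 1.1.7 / 1.2.2.2, one possible approach is sketched below. It assumes the "unnecessary markup" is mostly elements and attributes in implementation-specific extension namespaces; in practice we would first build a list of the namespaces the reference implementation actually emits:

```python
# Sketch of the clean-up step: drop every element and attribute in a given
# namespace from a parsed content.xml tree. The namespace URI to strip is
# an assumption, not something defined by the TC.
import xml.etree.ElementTree as ET

def strip_namespace(root, ns_uri):
    """Remove every element and attribute in the namespace `ns_uri`."""
    prefix = "{%s}" % ns_uri
    for elem in list(root.iter()):  # copy the node list: we mutate the tree
        for name in [a for a in elem.attrib if a.startswith(prefix)]:
            del elem.attrib[name]
        for child in [c for c in elem
                      if isinstance(c.tag, str) and c.tag.startswith(prefix)]:
            elem.remove(child)
    return root
```

The stripped tree would then be re-serialized into the package and re-validated, since removing markup can itself introduce conformance defects.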

Best Regards,

Mingfei Jia (贾明飞)
IBM Lotus Symphony Development
IBM China Software Development LAB, Beijing
Tel: 86-10-82452493 Fax: 86-10-82452887
NOTES: Ming Fei Jia/China/IBM E-mail: jiamingf@cn.ibm.com
Address: No.28 Building, Zhong Guan Cun Software Park, No.8 Dong Bei Wang West Road, ShangDi, Haidian District, Beijing 100193, P.R.China

"Alan Clark" <aclark@novell.com> wrote on 10/28/2008 02:52:32 AM:

> Re: [oic] some thoughts on test documents

>
> Comments embedded below...
>
> >>> On 10/25/2008 at 04:21 PM, in message
> <OFD39A5E44.7A27BFB2-ON852574ED.0077AFF9-852574ED.007AA173@lotus.com>,
> <robert_weir@us.ibm.com> wrote:
> > Bart Hanssens <bart.hanssens@skynet.be> wrote on 10/25/2008 03:40:46 PM:
> >
> >> Some thoughts and questions on getting started with the test documents.
> >>
> >> Dave, Rob and others already made several detailed comments about this
> >> topic on the formation mailing list, see also
> >> http://sites.google.com/a/odfiic.org/tc/Home/odf-interoperability-
> >> and-conformance
> >>
> >> IMHO, we should start with the atomic documents.
> >> Now, the opendocumentfellowship has already created a test suite:
> >> http://develop.opendocumentfellowship.com/testsuite
> >>
> >> Maybe we can reuse it (if the fellowship agrees with this) or do
> >> something similar, minus the comments on specific implementations, but
> >> adding remarks on ODF or the testing itself (like "spec not clear")
> >>
> >
> > http://develop.opendocumentfellowship.com/testsuite/
> >
> > It would be good to get the Fellowship to contribute their test suite to
> > the TC so we can work with it.  I know that it is under a Creative Commons
> > Attribution 2.5 licence, but this is weaker than the required OASIS
> > Feedback Licence, since CC waives copyright only, but does not grant any
> > protection against patents.
> >
> >
> >> We could create a Wiki with a page per test document, and an upload
> >> directory on the OASIS website (as mentioned during the TC conf call,
> >> the OASIS Wiki doesn't support file upload) like:
> >>
> >> ODF_1_1/
> >> - atomic/
> >> -- 5_Para_Elements/
> >> --- 5_3_1_Note_Element.odt
> >> - complex/
> >> ...
> >> ODF_1_2/
> >> - atomic/
> >>
> >>
> >> And if anyone feels like creating a document for a specific item, he/she
> >> can mention it on the wiki, create a test and upload the file.
> >>
> >> Once a week/month/..., a snapshot can be made and archived in a ZIP file
> >> for convenient downloading.
> >>
> >
> > That sounds like a reasonable way to structure it.  But it would be good
> > to first define exactly what we mean by a "test case", maybe agree on a
> > single test case as an example.
> >
>
> In trying to think about this from a product tester's point of view,
> each "test case" has at least two potential test scenarios:
> - Validate that product X conforms to the ODF spec as demonstrated
> through the TC generated reference .odf file downloaded from the wiki
> - Validate that product X interoperates with products Y and Z.
>
> The test steps would then be:
> A. Using product X, open the reference .odf file and verify that it
> reads the file correctly
> B. Using product X, write a .odf file from step A, comparing the
> result with the reference .odf file
> C. Using product X, create the reference file from scratch, i.e. using
> product X's visual commands, macros, scripts, etc.
> D. Using product X, write the resulting .odf file from step C,
> comparing the result with the reference .odf file
> E. Using product X, open a related .odf file that was created by
> product Y or Z, verifying that product X reads the file correctly
> F. Using product X, write the resulting .odf file from step E,
> comparing the result with the reference .odf file (and the original
> .odf file from product Y or Z?)
>
> To execute those steps the tester would need:
> 1. A reference ODF file
> 2. Manual steps or automated scripts to create an equivalent
> reference odf file (I recognize this may be product or project specific)
> 3. Access to equivalent odf files from other products
> 4. Information, steps or scripts that will enable the tester to
> validate test results
>
> As a TC, we should build #1.  To make it useful and of interest to
> our audience we should enable them to build and contribute #2, #3 and #4
>
> > In my mind any test cases should have:
> > 1) A unique name or identifier
>
> 1a) A reference odf file
>
> > 2) An indication of what version(s) of ODF the test case applies to
> > 3) A list of pre-conditions for executing the test.  What must be done
> > first to set up the test case environment?  In most cases it will be
> > simple:  load the document in an ODF editor.  But even simple cases may
> > presuppose the installation of a particular font, for example.  There may
> > be a common set of pre-conditions shared by all or most test cases.  We can
> > list those once.  But we may have test cases that involve special
> > requirements, say a connection with an external database which may require
> > that such a database is first created
>
> 3a) A list of steps to create an equivalent odf reference file, and/
> or scripts that could be used to create an equivalent reference odf
> file  using product X, Y, Z
> 3b) Ability to store product X,Y,Z generated odf files that equate
> to the reference odf file
>
> > 4) A list of post-conditions used to judge whether the test case passed.
> > This could be described in English, or in a formal language.
>
> 4a) or scripts to automate the verification
>
> >
> > As you see, this is similar to how unit tests are commonly done:
> >
> > double sqrt(double x)
> > {
> >     assert(x >= 0);
> >
> >     double ret = doCalculation();
> >
> >     assert(ret * ret == x);
> >     return ret;
> > }
> >
> > (and yes, I know one shouldn't really do equality tests of floating point
> > calculations...)
> >
> > This isn't the only model for how we can do test cases, but that is one
> > pattern of testing which has proven itself useful in other contexts.
> >
> >>
> >> This doesn't necessarily mean going through the whole spec in sequential
> >> order, we can start with areas that are known to have issues (I would
> >> hint: forms, gradients, slide effects, charts...) or easy to test
> >>
> >>
> >> Questions:
> >> - do we create atomic tests with an office suite, or craft it by hand to
> >> make it really as small as possible ?
> >>
> >
> > It might be easier to start with output from, say OpenOffice.  But we
> > could have a tool that then strips out all the unnecessary markup in order
> > to reduce the file to the bare essentials.  We should also examine the
> > markup by hand and verify that it indeed is correct according to the ODF
> > standard.  So at the very least we should validate it with an XML
> > validator.
> >
> >> - should we do ODF 1.0, now that many (all ?) vendors use 1.1 or 1.2
> >> draft ? Personally, I think we should start with issue-prone parts of
> >> 1.1 and 1.2's OpenFormula.
> >>
> >
> > There is a lot of overlap between the releases.  So most test cases we do
> > will apply to all three versions.  Of course, open formula would only
> > apply to ODF 1.2.  But I agree that the "issue-prone" areas are the ones
> > to start with.
> >
> >> - who will run the tests and report the results, so that the OIC can
> >> create a summary and general recommendations ? Per charter, the OIC
> >> won't be commenting or identifying implementations (at least not in
> >> reports, but isn't a wiki some kind of report ?)
> >>
> >>
> >
> > That's the sensitive issue.  We, as an OASIS TC, cannot issue a report on
> > implementations and their interoperability results.  
>
> I agree, and that may make it difficult to do what I suggested above, but
> I thought I would throw it out in the hope of generating ideas.
>
> > But as individual
> > members, especially members who are also ODF vendors, I'd expect that we
> > all run these test cases ourselves against our own implementations.  
>
> That's what started me thinking about this from a tester's perspective. If
> we can ensure that our results are highly useful, easily consumable
> and re-usable then the vendors will pick up on our work and the TC
> will thrive.
>
> > If an
> > individual member then wishes to report on their own product's results,
> > then that is fine.  And if a third party, having access to our test suite,
> > wishes to test and compare multiple implementations, then that is fine as
> > well.
> >
> > Maybe we want to talk to the OpenDocument Fellowship on this?  Maybe they
> > would relinquish the test case creation work to us, and instead take on
> > the testing of the applications against our test suite?  I think that
> > gives a needed separation of interests.
> >
> > -Rob
> >
> > ---------------------------------------------------------------------
> > To unsubscribe from this mail list, you must leave the OASIS TC that
> > generates this mail.  Follow this link to all your TCs in OASIS at:
> > https://www.oasis-open.org/apps/org/workgroup/portal/my_workgroups.php
>
>
>


