Subject: Test suite work
- From: robert_weir@us.ibm.com
- To: odf-adoption@lists.oasis-open.org
- Date: Tue, 29 Apr 2008 14:30:20 -0400
So this is what I'm proposing: we identify a core group of 4-6 technical contributors who are willing to put in the time over the next 9 months or so. We shouldn't fool ourselves -- a test suite for ODF is a large undertaking. Obviously, I'd like to have more people involved eventually, but I think we need an up-front commitment of 4-6 people to get this moving.
Where do we do the work? We have a few options.
1) In the ODF Adoption TC, or in a subcommittee
2) In the ODF TC, or in a subcommittee
3) In a new TC
4) In a new organization, i.e., not in OASIS
I'd note that there have been previous efforts within OASIS to create test suites and conformance assessment methodologies, and these have been done in their own TCs. For example, see the ebXML Implementation Interoperability and Conformance (IIC) TC:
http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=ebxml-iic
I'm inclined to think that 3) is the way to go on this. This would require the initial participation of at least 5 OASIS members.
The primary deliverable would be an ODF Conformance Assessment Methodology. One form this might take would look like this:
1) A comprehensive set of ODF documents that exercise functionality described in the ODF standard.
2) Each test case would be atomic, i.e., exercise the smallest testable unit of functionality. So one document would test center alignment, a different document right alignment, etc.
3) Each test case should be self-describing, i.e., it includes text that explains the expected appearance of the test when executed correctly.
4) We would follow good QA practices in creating tests for both positive and negative scenarios, including error detection.
5) We would also have a standard template for reporting the results of a conformance test.
6) It is possible for us to automate some parts of this. For example, we could have a user interface that asks the assessor to choose a category to test, say numbered lists. It could then launch the 20 different documents that comprise that test suite and prompt the user for whether or not each displayed correctly. The scores could be recorded to a local ODF spreadsheet instance, etc. There is probably also automation that would be useful for the test case authors. So we would ideally have a "tool smith" as part of the effort from the start.
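To make the automation idea in 6) concrete, here is a minimal sketch of what such a harness might look like. Everything in it is hypothetical -- the file names, the category, and the pass/fail callback are illustrative, it writes CSV rather than an ODF spreadsheet for brevity, and a real tool would actually open each document in the implementation under test:

```python
import csv
from pathlib import Path

def run_assessment(test_docs, ask, results_path):
    """Walk a category's atomic test documents, ask the assessor whether
    each one rendered correctly, and record the verdicts to a file."""
    with open(results_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["test_case", "result"])
        for doc in test_docs:
            # A real harness would launch `doc` in the implementation
            # under test here; this sketch only records the verdict.
            verdict = "pass" if ask(doc) else "fail"
            writer.writerow([Path(doc).name, verdict])

# Hypothetical test documents for a "numbered lists" category.
docs = ["list-decimal.odt", "list-roman.odt", "list-restart.odt"]

# Stand-in for the interactive prompt: pretend the assessor marked
# everything except the list-restart case as displaying correctly.
run_assessment(docs, lambda d: d != "list-restart.odt", "results.csv")
```

The point of separating the prompt (`ask`) from the walking and recording logic is that the same harness could later drive a GUI prompt, a command-line prompt, or an automated comparison, without changing how results are gathered.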
What we would not do is officially assess any implementations. That would be out of scope. Of course, any members could go home and assess their own implementations (self-assessment), but we need to avoid becoming the assessors, because of the associated liabilities.
With a broad enough charter, like the ebXML IIC TC mentioned above, we could also undertake interoperability work, including with other markup standards. But I think the first task is developing the conformance assessment instruments.
Regards,
-Rob
___________________________
Rob Weir
Software Architect
Workplace, Portal and Collaboration Software
IBM Software Group
email: robert_weir@us.ibm.com
phone: 1-978-399-7122
blog: http://www.robweir.com/blog/