oiic-formation-discuss message
Subject: Re: [oiic-formation-discuss] (1)(d) A list of deliverables, with projected completion dates.
- From: robert_weir@us.ibm.com
- To: <oiic-formation-discuss@lists.oasis-open.org>
- Date: Fri, 13 Jun 2008 17:13:20 -0400
"Dave Pawson" <dave.pawson@gmail.com>
wrote on 06/13/2008 12:19:15 PM:
> 2008/6/13 <robert_weir@us.ibm.com>:
> >
> > Michael.Brauer@Sun.COM wrote on 06/13/2008 07:38:55 AM:
>
> RW:
> > Sorry, I stated it too loosely. The base deliverable is a "conformity
> > assessment methodology" (I've also heard it called a "test requirements
> > document"), a document that details the requirements of a conformance
> > testing tool. This would mainly be a task of collecting and collating the
> > normative provisions of each ODF version, along with provisions in
> > referenced standards, and putting them in a logical order, noting
> > dependencies, assigning each one an ID, etc. It would also define scoring
> > and reporting requirements for a conformance test.
>
> "Each ODF version"? You earlier suggested working from 1.0?
> So would you expect 3 so far? 1.0 1.1 1.2?
>
I didn't imply a sequence. I didn't imply that
all versions would have conformity assessment methodology documents created
for them. There are several reasonable interpretations possible.
I think this is a detail safely left to the proposed TC to determine.
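For concreteness only, the collecting-and-collating task described above (assigning each normative provision an ID, noting its source and its dependencies, putting them in a logical order) might produce entries like this sketch. The field names and the example provisions are invented for illustration, not taken from any actual ODF document:

```python
# A hypothetical sketch of one way to catalog normative provisions:
# each gets an assigned ID, a source reference, and its dependencies.
# All field names and example entries here are invented.
from dataclasses import dataclass, field

@dataclass
class Provision:
    pid: str                   # assigned identifier
    source: str                # clause in the (hypothetical) spec
    text: str                  # the normative requirement itself
    depends_on: list = field(default_factory=list)

catalog = [
    Provision("P-0002", "ODF 1.1 clause 17.5",
              "Relative paths shall conform to the IRI syntax.",
              depends_on=["P-0001"]),
    Provision("P-0001", "ODF 1.1 clause 1.5",
              "Documents shall validate against the schema."),
]

# A simple logical ordering for this tiny example: provisions with
# fewer dependencies come first.
ordered = sorted(catalog, key=lambda p: len(p.depends_on))
print([p.pid for p in ordered])  # ['P-0001', 'P-0002']
```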
> Could you suggest a logical order?
I would probably start with ODF 1.1, and then add
on the ODF 1.2 bits once that draft is completed. But I do not see
any value in fixing the exact order at this point.
> Mine might be for common test groups? I.e. those that might sensibly
> be run together.
> What are 'scoring' requirements please Rob?
>
I believe I said the conformity assessment methodology
document would define scoring requirements. Thus the definition of
"scoring" would go there, not in the charter we're defining now,
so we don't need to agree on a definition now. But for sake of argument,
one could say that total score = number of tests which pass / total number
of tests defined in the suite.
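That pass-ratio rule, offered above only for the sake of argument, would compute like this; the test IDs and results are invented for illustration:

```python
# Hypothetical sketch of the pass-ratio scoring rule suggested above:
# score = number of tests that pass / total number of tests in the suite.
# Test IDs and results are invented.
suite_results = {
    "test-0001": True,
    "test-0002": False,
    "test-0003": True,
    "test-0004": True,
}

def score(results):
    """Fraction of tests in the suite that passed."""
    return sum(results.values()) / len(results)

print(score(suite_results))  # 3 of 4 pass -> 0.75
```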
>
> MB:
> >> As for "rendering-oriented": This is only applicable to text documents,
> >> and maybe graphical documents, but not to spreadsheets and database
> >> frontend documents (new in ODF 1.2). If we provide tests, then we should
> >> provide tests that are applicable to all kind of documents. Another
> >> issue with rendering-based tests is that rendering is system dependent.
> >> On the one hand documents may render differently because of different
> >> fonts that are installed, or different hyphenation rules, and so on. On
> >> the other hand, and more important, documents can be rendered on very
> >> different devices. On a small device, you may for instance want to
> >> render a document differently than on a desktop. In a browser, you may
> >> want to render a document without page breaks. Test cases in my opinion
> >> should consider that.
> >>
> >
>
> RW:
>
> > I agree. Whatever the TC produces, whether conformance tests, acid tests,
> > interoperability tests should be traceable to a provision of the ODF
> > standard, or to a profile of the ODF standard that the proposed TC creates.
> > There is no value in testing what is essentially implementation-defined
> > behavior.
>
>
> It would appear that you are talking about different subjects Rob?
> Do you agree with Michael on this one? Do you understand the 'rendering'
> aspect?
> Agree it is so limited?
>
In fact, I do agree with what I said, despite your
suggestion otherwise.
> I'm quite confused by this response.
>
I was saying that if we ensure that our test cases
are traceable to provisions of the standard, then this will also ensure
that we avoid test cases that rely on system-dependent behavior, since
the ODF standard does not define system-dependent behavior.
>
> MB:
> >> As for "not widely implemented": Most vendors implement those features
> >> that are important for their users. If a feature is not widely
> >> implemented, then this is a hint that the feature is maybe not so
> >> important. I think we should focus on those features actually that are
> >> widely used rather than on test cases where we know that the features
> >> are not widely implemented.
> >>
> >
> > In many cases, yes. It could be a deliberate choice by the vendor.
>
> This contradicts your (RW) statement above about responding to
> the ODF standard with a test spec? Are you now proposing to
> restrict it in some way to an MB/Sun definition of 'implemented features'
>
There is no contradiction. Both approaches
have a use and both are valid. Imagine it from the perspective of
a vendor. You've worked 6 months on a new release of an ODF editor.
Your big partner conference is coming up in 3 weeks. You are
announcing the availability of the editor at that event. The date
cannot slip. The contents of your release are already set. The
final tests are being executed. You check your email and see a new
announcement from OASIS, that the ODF IIC TC has released a new test. Yeah!
But then you find out it is an Acid-test of 30 features that no ODF
editor implements. Useless! Much better at that stage would
be to have version 2.1 of the atomic test suite, since the vendor had been
testing with version 2.0 and could get some bug fixes (yes, Virginia, test
suites have bugs too).
On the other hand, another vendor at the start of
a product planning cycle, might find that email announcement of the Acid-test
very useful.
Sometimes you want to test what you have, and sometimes
you want to be reminded of what you don't have.
> Again I'm left confused by the politics.
And I'm confused that you think that any politics
are involved here.
>
> >
> >> > 3) A comprehensive test suite of atomic (single feature) tests
> >> > 4) A formal profile of ODF for portability and archiving, aka ODF/A
> >> > 5) A formal profile of ODF for browser-based applications
> >> > 6) A formal profile of ODF for mobile devices
> >>
> MB:
> >> I would suggest to combine 4-6 into a single deliverable, and add to
> >> work out a definition what a profile exactly is, and how it could be
> >> defined. Something like
> >>
> >> 6a) A definition of an "ODF profile" concept, where an "ODF profile" is
> >> a well defined subset of ODF suitable to represent a certain class of
> >> documents.
> >> 6b) A set of ODF profiles, including but not limited to
> >> - A formal profile of ODF for portability and archiving, aka ODF/A
> >> - A formal profile of ODF for browser-based applications
> >> - A formal profile of ODF for mobile devices
> >>
> >>
> >
> > The concept of a profile is well-defined.
>
> I'd love to see a definition that this group can agree on. Do you
> have one Rob?
>
No. And I don't have a definition for "definition"
either. Do you? I'm an engineer, not a lexicographer. I
have yet to see a man made healthy by a physician defining the word "health",
or a man made rich by defining the word "wealth", or a set of ODF
editors made interoperable by defining the word "profile".
If you are interested, ISO defines a profile as a
"harmonised document which identifies a standard or group of standards,
together with options and parameters, necessary to accomplish a function
or set of functions". And no, I am not going to define "harmonised".
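Under that ISO reading, a profile could be modeled, purely as an illustration, as a named subset of the standard's features plus some fixed options. The feature names and the profile itself below are invented, not any real ODF profile:

```python
# Purely illustrative model of the ISO definition quoted above: a
# profile names a well-defined subset of features plus fixed options.
# Feature names and the profile are invented for illustration.
mobile_profile = {
    "name": "illustrative mobile profile",
    "allowed_features": {"text:p", "text:span", "table:table"},
    "options": {"pagination": False},
}

def fits_profile(document_features, profile):
    """A document fits the profile if it uses only allowed features."""
    return set(document_features) <= profile["allowed_features"]

print(fits_profile(["text:p", "text:span"], mobile_profile))   # True
print(fits_profile(["text:p", "draw:frame"], mobile_profile))  # False
```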
> The W3C does them all the time.
> > OASIS has them as well. But it is a good point that we would benefit from
> > having a document that outlines how to profile ODF.
>
> Then do you agree that this is work for the TC, to define which profiles
> will be addressed?
> I.e. it is not a task for this group?
>
If there are specific profiles we know now that we
need, then let's list them, while at the same time ensuring the charter
accommodates the creation of others that also serve the stated purpose
of the TC.
>
>
> >> > What did I miss?
> >>
> MB:
> >> What I'm missing a little bit is to provide guidance for implementors.
> >> Simply speaking, the best way to achieve interoperability between ODF
> >> applications is that these application implement as many of ODF as
> >> possible and reasonable for the specific application, and with as little
> >> bugs as possible. Tests are helpful to measure the quality of an
> >> implementation, but they don't help implementors with the implementation
> >> itself.
> >>
> >> So far we have suggestion for tests, but we do not have suggestion how
> >> we can help implementors in their implementation work.
>
> <sniip/>
>
> RW:
> > So implementation guidelines for interoperability, we can certainly do that,
> > listing best practices in that area.
>
> which is orthogonal to the MB comment I think?
>
>
> >
> > But I'm not sure what implementation guidelines in general would look like.
>
> Rob, do you believe the TC should author ODF implementor guidelines,
> not "how to author interop best practice guidelines", but ODF
> implementor guidelines?
>
>
No strong preference. I think there should be
ODF Implementation guides, but I am indifferent to whether it is done in
this TC or the ODF TC. It would reasonably fit in the charter of
either committee (or in fact the ODF Adoption TC).
>
> > Any ideas? We could, for example, give references that explain how to
> > accomplish certain tasks in ODF. So, for bezier curves we give a reference
> > to an article that explains the most efficient way to do the calculations,
> > etc. But this seems like a lot of work that might not have an audience.
> > ODF is really an encoding of conventional office documents. The
> > applications already know how to do all these things. They just, for the
> > most part, need to figure out how to encode it in ODF. So, text on "How to
> > write a spreadsheet" would be overkill. But "How to add ODF support to your
> > Application" might be useful.
>
>
> IMHO this is a totally different ballgame.
>
Well, we are discussing a proposed charter for an
ODF Implementation, Interoperability and Conformance TC. These three
concerns have gone together in other TCs, such as the ebXML IIC TC.
>
> Is anyone on this list expecting to deliver this to Michael/the main TC?
>
I believe he is subscribed to this list, right? So
no need to deliver anything.
-Rob