Subject: RE: [oiic-formation-discuss] Deliverable: odf-diff?
-----Original Message-----
From: marbux [mailto:firstname.lastname@example.org]
Sent: Sunday, June 22, 2008 10:56 AM
To: email@example.com
Subject: Re: [oiic-formation-discuss] Deliverable: odf-diff?

On Sat, Jun 21, 2008 at 4:11 AM, Sander Marechal <firstname.lastname@example.org> wrote:
> marbux wrote:
>>
>> But I don't see that as highly relevant to ODF app <> ODF app interop
>> before app dependencies are removed from the ODF spec in a core profile.
>
> As I've written previously, IMHO the two sources of interop problems are
> bugs in the spec and bugs in the implementations. This is squarely aimed
> at fixing bugs in the implementation.

O.K.

> It would help ODF application and tool developers
> to get better conformance and interop. We'd be able to check if two
> applications produce the same document (given the same data). We could
> integrate it into an application's automated regression test suite.
> Etcetera.

At least for elements and attributes, just about every conformance requirement is already tested by validation against the schema after all foreign elements are removed. The only remaining requirements without a test are the small parts of the conformance section governing the processing of foreign elements and attributes in specific situations.

That, plus the fact that there are no interop conformance requirements, is why I raised the issue of timing in what you want, i.e., do we build tools for the existing spec or for the repaired version? I'm not against what you want, but I don't see the need for it before the spec is fixed, unless the goal is application-level interop hacks rather than a tool based on conformance requirements. To the extent that interop is hacked at the application level before app-neutral interop conformance requirements are established, we just introduce more app dependencies on the under-specified parts of the spec, and more developer incentives for those parts to stay under-specified.
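The "strip foreign elements, then compare" step discussed above could be sketched roughly as follows. This is only a toy illustration, not any real odf-diff tool: the namespace whitelist stands in for the full ODF namespace set, the helper names are invented, and real conformance checking would additionally validate against the ODF RelaxNG schema (not shown). It assumes Python 3.8+ for xml.etree.ElementTree.canonicalize.

```python
import xml.etree.ElementTree as ET

# Toy whitelist standing in for the real set of ODF namespaces (assumption).
ODF_NAMESPACES = {
    "urn:oasis:names:tc:opendocument:xmlns:office:1.0",
    "urn:oasis:names:tc:opendocument:xmlns:text:1.0",
}

def namespace(name):
    """Return the namespace URI of a Clark-notation name like '{uri}local'."""
    return name[1:].split("}", 1)[0] if name.startswith("{") else ""

def strip_foreign(elem):
    """Recursively drop elements and attributes from unknown namespaces."""
    for child in list(elem):
        if namespace(child.tag) not in ODF_NAMESPACES:
            elem.remove(child)          # drop the foreign subtree entirely
        else:
            strip_foreign(child)
    for attr in list(elem.attrib):
        ns = namespace(attr)
        if ns and ns not in ODF_NAMESPACES:
            del elem.attrib[attr]

def canonical(xml_text):
    """Strip foreign content, then canonicalize for a stable comparison."""
    root = ET.fromstring(xml_text)
    strip_foreign(root)
    return ET.canonicalize(ET.tostring(root, encoding="unicode"))

# Two documents that differ only in a foreign-namespace extension element:
doc_a = ('<office:text '
         'xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0" '
         'xmlns:foo="urn:example:foo"><foo:hint/></office:text>')
doc_b = ('<office:text '
         'xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0"/>')
print(canonical(doc_a) == canonical(doc_b))  # the stripped forms match
```

Canonicalization here absorbs prefix and whitespace differences, so only substantive disagreements between two producers' output would surface.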
So I see creating the interop conformance requirements as a high-priority item. As I said, my question goes to the timing of what you want, not to opposition to it. I see a prioritization issue in the relevant deliverables: do we develop the tool you want for a badly broken spec, or for the repaired version? There is no way to repair the under-specification without breaking compatibility with the apps' present coding, and the same would be true of the tool you want if we create it before the spec is repaired.

Were it up to me, ODF 1.2 would be put on hold until we get the chunks inherited from earlier versions fixed, and then we would see what has to be done to add the new features in ODF 1.2. An ODF designed for interop will be profoundly different from the present spec. Both the spec and the apps have to change, and the longer we put it off, the bigger the mess that has to be cleaned up.

--
Universal Interoperability Council <http://www.universal-interop-council.org>

---------------------------------------------------------------------
To unsubscribe, e-mail: email@example.com
For additional commands, e-mail: firstname.lastname@example.org

Paul,

I find it obvious from the tone and direction of this and other emails you have sent that you have little or no formal background in software development. This is fine. However, I think you are missing the point of what Sander originally proposed. He suggested that we follow one of the best practices of software engineering: develop your tests from the specifications, then build the apps from the specifications. If you do both properly, the applications will always pass the tests, or the tests will reveal design flaws in the specifications. There are names for this approach: the Unified Process and its very popular specialization, the Rational Unified Process (RUP), are examples. Another popular term is Test-Driven Development (TDD).
In both approaches, the use-case is specified first (I hyphenate the word here for clarity, though strictly speaking it is improper to hyphenate the term), then a test developer sits down with the specs and writes tests for all possible execution paths through the use case. Meanwhile, back at the ranch, an application developer sits down with the spec and starts to build the application. Since both parties, located separately, have their own copies of the specification, neither is directly aware of what the other is doing, but both know that with the inputs A, B, and C, you should get the output Q. The application developer writes his code to produce Q from A, B, and C, and the tester writes code to submit A, B, and C to the application and expect Q in return. If the specification is written properly, both put A, B, and C in the same order. If not, each puts the parameters in his own chosen order, the test fails, and a defect in the spec is revealed.

What Sander is saying makes sense. You cannot determine the proper approach to fixing a problem until you examine what the problem actually is. Think back to your days as an attorney. Would you knowingly go to court if your client failed to provide you with information that could be damaging or helpful to your case? If so, what would the outcome be? Aren't your odds of winning higher if you have all of the information on your side? In that sense, software development is no different. We need to know how tall the cliff is before we jump off (that might have saved me a broken ankle eighteen years ago).

Garry L. Hurley Jr.
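The A-B-C-in, Q-out workflow described above can be sketched in a few lines. The function name and the concrete A, B, C, Q values are illustrative assumptions; the point is only that the test is derived from the spec, independently of the implementation.

```python
def application(a, b, c):
    # The application developer's reading of the spec: Q is defined as A + B + C.
    return a + b + c

def test_from_spec():
    # The tester's reading of the same spec, written without seeing the code.
    # If the spec were ambiguous about parameter order or meaning, the two
    # readings could diverge, and this check would fail -- exposing a defect
    # in the spec rather than (or as well as) in the implementation.
    A, B, C, Q = 2, 3, 5, 10
    assert application(A, B, C) == Q

test_from_spec()
print("spec-derived test passed")
```

In a real project the same idea would live in a test framework such as unittest or pytest and cover every execution path the spec allows, not a single input triple.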
Application Developer 2
Office of Information Technology - Bureau of Application Development
PA Department of Labor & Industry
651 Boas Street, Harrisburg, PA 17121
Phone: 717.506.9373 (UCMS) or 717.346.9799 (Harrisburg)
Fax: 717.506.0798
Mobile: 717.649.0597
<http://www.dli.state.pa.us>

My comments do not reflect those of the Commonwealth of Pennsylvania, its various agencies and departments, or its citizens. They are my own, and may or may not be shared by those in positions of authority over me.