Subject: Re: [dss-dev] Data format guessing of data within Base64Data tag


Dear Carlos,

> Actually, the more I think about it, the less I am sure there is such a big
> ambiguity... in the text:
> 
> "<Base64Data> [Optional]
> This contains a base64 encoding of data that are not XML."
>
> I do not see much room for ambiguity: it says that this element
> contains base64-encoded data that **is not** XML...

The ambiguity lies in how the sentence is understood. IMHO the sentence
should be read with common sense, since data is always subject to
interpretation. There is no benefit in prohibiting the use of
<Base64Data> for streams that the implementation happens to interpret
as XML. For example, suppose somebody needs to digitally sign a MAC
address, 3c:68:69:74:2f:3e, base64-encoded as follows:

  Original octet stream (MAC address): [0x3c, 0x68, 0x69, 0x74, 0x2f, 0x3e]
  Base64-encoded data: "PGhpdC8+"
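
For illustration, here is the same round trip as a minimal Python
sketch (the byte values are taken straight from the example above):

  import base64

  # The six octets of the MAC address 3c:68:69:74:2f:3e
  mac_octets = bytes([0x3c, 0x68, 0x69, 0x74, 0x2f, 0x3e])

  print(base64.b64encode(mac_octets).decode("ascii"))  # PGhpdC8+
  print(mac_octets.decode("ascii"))  # <hit/> -- the octets spell an XML tag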

The platform would refuse to sign the MAC address. "Why?", one would ask.
Because that interpretation of the <Base64Data> tag forces the
implementation to spend time guessing what kind of data the user is
trying to sign. While guessing, it decodes the base64 string and finds
that the MAC address happens to read as the text "<hit/>". Since this
text can be interpreted as XML (and this is the point), the
implementation refuses to perform the signing.
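
In code terms, the behaviour I am objecting to would look roughly like
this (a hypothetical Python sketch; looks_like_xml and the sign
callback are illustrative names, not part of any real implementation):

  import base64
  import xml.etree.ElementTree as ET

  def looks_like_xml(octets):
      # The disputed guessing step: try to parse the raw octets as XML.
      try:
          ET.fromstring(octets)
          return True
      except ET.ParseError:
          return False

  def sign_base64_data(b64_string, sign):
      octets = base64.b64decode(b64_string)
      if looks_like_xml(octets):
          # Under the strict reading the request is refused here, even
          # though the caller never intended to send XML.
          raise ValueError("refused: Base64Data decodes to XML")
      return sign(octets)

  # sign_base64_data("PGhpdC8+", my_sign) raises: the MAC address is rejected.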

"But", the user would say, "I didn't send XML and obviously I didn't want the
platform to spend time in guessing". Just a digital signature is the
expected response. It makes no sense for the programmer to invoke a <Base64XML> signature request over a
MAC address (or any arbitrary binary stream). And even worse, it makes no sense to code a "if-then-else"
for those cases where the implementation interprets the binary stream as XML.
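
Spelled out, that pointless defensive code would look something like
this (hypothetical names throughout, with made-up request builders;
it reuses looks_like_xml from the sketch above):

  import base64

  # Hypothetical request builders, named for illustration only.
  def build_base64_xml_request(octets):
      return {"Base64XML": octets.decode("utf-8")}

  def build_base64_data_request(octets):
      return {"Base64Data": base64.b64encode(octets).decode("ascii")}

  def build_signing_request(raw_octets):
      # The pointless branch: which element gets used depends on how
      # the platform might guess, not on what the caller actually has.
      if looks_like_xml(raw_octets):
          return build_base64_xml_request(raw_octets)
      return build_base64_data_request(raw_octets)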

I must insist: I see no benefit in trying to interpret the raw data.

> to me this is the same as saying that the data **must not be** XML....

I don't think it's the same: "MUST NOT" clauses are usually written out
explicitly.

> there is no "should not", which would have allowed this to be XML...
> so if the encoded data **is** XML, then it breaks the requirement
> expressed in this sentence,

I don't see the "requirement". I see a sentence in the general
definitions part that can be read in different ways, whereas the
precise algorithm (which includes clarifications for errors and the
like) clearly states that the octet stream is not to be processed:
simply applying the signature to it is enough.
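
Under that reading, the server side reduces to something like this
(again a sketch; sign stands in for whatever signing primitive the
implementation provides):

  import base64

  def sign_base64_data(b64_string, sign):
      # Decode and sign; the octet stream itself is never inspected.
      return sign(base64.b64decode(b64_string))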

> which, as I understand it, is not a recommendation (no "should") or
> an alternative. I do not see where this sentence leaves room to think
> that one may include data that is XML in a way that does not break
> this requirement...

The example above shows non-XML data which, when interpreted as XML,
turns the "requirement" into nonsense.

To summarize, the main point IMHO is this: is there any advantage,
benefit or simplification for implementations in rejecting raw-data
signing requests when the raw data happens to look like XML?

Kind regards,
Daniel M. Lambea


