Subject: RE: [dss] t+/-dt


Colleagues - Two questions, then ...

1. Do we expect relying parties to use the "accuracy" indication purely and
simply to evaluate the quality of the service, or do we expect them to use
it to calculate the most conservative estimate of time?

2. If the former, is putting this information in each timestamp the
right approach, or should it simply be a mandatory component of the
service's policy?  There are (after all) many components of quality,
including which source of time the service uses (e.g. which national time
standard), and we are not proposing to include this information explicitly
in each timestamp.  Do we expect authorities to continually evaluate their
own accuracy and adjust the accuracy indicator accordingly?

I don't see the relevance of the "network delay" argument.  Naturally, an
authority can only make assertions about information it has fully received.
We should not expect it to speculate about what might have happened
beforehand.  This is an inevitable (and hopefully acceptable) consequence of
our architecture.

I could certainly live with a model in which relying parties always add or
subtract a delta from the asserted time and only recognize authorities or
timestamps whose accuracy value is less than that delta.
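
For illustration only, a minimal sketch of that check (the names and the
one-second delta are hypothetical, not anything from the DSS schema):

    from datetime import datetime, timedelta

    # Relying-party policy: the fixed delta it always adds/subtracts.
    DELTA = timedelta(seconds=1)

    def conservative_bounds(t: datetime, dt: timedelta):
        # Recognize the timestamp only if its accuracy is less than the
        # policy delta, then bound the event time by t +/- DELTA.
        if dt >= DELTA:
            raise ValueError("accuracy not within relying-party delta")
        return t - DELTA, t + DELTA   # earliest, latest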

All the best.  Tim.

PS.  The final paragraph suggests we should call the indicator "estimated
error", rather than "accuracy".

-----Original Message-----
From: Nick Pope [mailto:pope@secstan.com]
Sent: Tuesday, August 26, 2003 10:02 AM
To: Tim Moses; 'DSS'
Subject: RE: [dss] t+/-dt


Tim et al. on DSS

As I mentioned on yesterday's conference call, my view is that bringing in
two times adds a whole new set of complexities for applications in deciding
which time is relevant.  If the difference between the two possible times
becomes significant to an application, then other factors, such as network
delay, come into play.  Accuracy gives a simple indication of the quality of
the time-source, and provided that it is within a level that is not
significant to an application, the difference between t+dt and t-dt is not
significant either.  If it is significant, then other factors which affect
the accuracy (e.g. variance in network delay) also need to be taken into
account.

Nick

> -----Original Message-----
> From: Tim Moses [mailto:tim.moses@entrust.com]
> Sent: 25 August 2003 21:10
> To: 'DSS'
> Subject: [dss] t+/-dt
>
>
> Colleagues - There are two common ways of expressing the time
> when there is
> a degree of uncertainty.
>
> 1. a=t+dt, b=t-dt, or
> 2. a=t, b=dt
>
> Of course, one can readily convert values between the two styles.
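>
> Purely for illustration, a small sketch of that conversion, assuming a
> datetime/timedelta representation (the function names are made up):
>
>     from datetime import datetime, timedelta
>
>     def style1_to_style2(a: datetime, b: datetime):
>         # Style 1 gives the bounds (a = t+dt, b = t-dt); recover (t, dt).
>         dt = (a - b) / 2
>         return b + dt, dt
>
>     def style2_to_style1(t: datetime, dt: timedelta):
>         # Style 2 gives (t, dt); produce the bounds (a = t+dt, b = t-dt).
>         return t + dt, t - dt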
>
> I can think of a couple of reasons for choosing style 2.  The
> first is that
> paradoxical statements cannot be made in style 2 (one cannot say before
> 2:00pm on Friday, but after 2:30pm on Friday).  The other is that
> style 2 is
> more compact, or stated differently, in most cases, the year, month, date,
> hour, minute and time-zone offset values in t+dt and t-dt will be
> identical.
>
> But, it seems to me that the most appropriate way to choose
> between the two
> styles is to consider whether the client or the server should bear the
> computational burden in the most common applications.  In keeping with the
> philosophy of the architecture (simplifying matters for the
> client) it seems
> apparent that any computational burden should fall on the server.
>
> Except in cases where the client is not concerned about the accuracy, I
> cannot think of an application in which the client is interested
> separately
> in the server's estimate of the time and its estimate of the
> uncertainty in
> that time.
>
> Some of the most common situations are those where the relying party
> wants to know
> definitively whether t is earlier or later than some t'.  In the former
> case, it should compare t' with t+dt and in the latter case, it should
> compare t' with t-dt.
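>
> As a rough sketch of those two checks (hypothetical helper names;
> nothing here is DSS syntax):
>
>     from datetime import datetime, timedelta
>
>     def definitely_earlier(t: datetime, dt: timedelta,
>                            t_prime: datetime) -> bool:
>         # Even the latest possible event time, t + dt, precedes t'.
>         return t + dt < t_prime
>
>     def definitely_later(t: datetime, dt: timedelta,
>                          t_prime: datetime) -> bool:
>         # Even the earliest possible event time, t - dt, follows t'.
>         return t - dt > t_prime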
>
> Here are some examples where the relying party compares times:
>
> - Determining if a signed document was created within the
> validity interval
> of a private key.
> - Determining if a signed document was created before a certificate was
> revoked.
> - Determining whether a bank deposit was made before a withdrawal.
>
> There are also situations in which the relying-party wants to know
> definitively the earliest or latest time at which a certain event
> could have
> occurred (e.g. creation of a document).  In the former case the relying
> party is interested in t+dt and in the latter case in t-dt.
>
> If the relying-party is not concerned about accuracy, then it can simply
> choose to use t-dt or t+dt in place of t.  We could make it
> mandatory to include t+dt in a timestamp token and optional to
> include t-dt.
>
> It has been argued that style 2 is more conventional or intuitive; it is
> more consistent with a service that simply asserts that it performed a
> certain action at a certain time, regardless of how one might use that
> information in a given business context.  Personally, I don't find that
> argument persuasive.
>
> What do others think?  All the best.  Tim.
>
>
> -----------------------------------------------------------------
> Tim Moses
> 613.270.3183
>