Subject: Re: [cti-users] Timestamp sub-second precision


Provided the timestamp value is not changed and no accuracy is lost, I don't see any problem with generating a new string value for the timestamp.

 

The primary reason for the text is to ensure you don't end up losing precision/accuracy.

 

So .133Z must *not* become .1Z. That would be a loss of precision and should *not* occur.
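For example (a minimal Python sketch of my own; the helper name and the digit-handling approach are assumptions, not anything from the spec), a serializer can carry the source's fractional digits through verbatim when it regenerates the string:

from datetime import datetime, timezone

# Hypothetical helper: regenerate a timestamp string without losing
# sub-second precision, by keeping the source's fractional digits as-is.
def reserialize(source):
    base, _, frac = source.rstrip("Z").partition(".")
    dt = datetime.strptime(base, "%Y-%m-%dT%H:%M:%S").replace(tzinfo=timezone.utc)
    out = dt.strftime("%Y-%m-%dT%H:%M:%S")
    if frac:
        out += "." + frac  # ".133" stays ".133", never collapses to ".1"
    return out + "Z"

assert reserialize("2019-04-12T07:31:00.133Z") == "2019-04-12T07:31:00.133Z"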

 

Regards

 

Allan

 

From: "cti-users@lists.oasis-open.org" <cti-users@lists.oasis-open.org> on behalf of Stephen Russett <stephen@digitalstate.ca>
Date: Friday, April 12, 2019 at 7:31 AM
To: "cti-users@lists.oasis-open.org" <cti-users@lists.oasis-open.org>
Subject: [cti-users] Timestamp sub-second precision

 

Hi all

 

For all timestamps/dates, the spec says that sub-second precision is optional and, when used, can be 1 to 3 digits.
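(I read that as something like the following sketch regex, which is my own paraphrase, not the spec's ABNF:)

import re

# Optional fractional part of 1 to 3 digits before the trailing "Z".
TIMESTAMP_RE = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d{1,3})?Z$")

for ts in ("2019-04-12T07:31:00Z",
           "2019-04-12T07:31:00.1Z",
           "2019-04-12T07:31:00.133Z"):
    assert TIMESTAMP_RE.match(ts)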

 

Is a parser expected to maintain the digit count? Or can the parser return a standard 3-digit output regardless of the sub-second digit count in the original source?

 

Example:

 

Example: if you parsed a timestamp whose sub-second portion is .100Z versus one like .1Z, and the parser accepted both, is it expected to use the original digit count when converting back to JSON, so that the second example would return .1Z rather than .100Z?
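To make that concrete (a Python sketch; the fixed 3-digit formatter at the end is my assumption about what a "standard" output would look like, not anything the spec mandates), both strings parse to the same instant, so the original digit count is no longer recoverable from the parsed value alone:

from datetime import datetime

fmt = "%Y-%m-%dT%H:%M:%S.%f%z"  # %f accepts 1-6 digits; %z accepts "Z" (Python 3.7+)
a = datetime.strptime("2019-04-12T07:31:00.1Z", fmt)
b = datetime.strptime("2019-04-12T07:31:00.100Z", fmt)
assert a == b  # same instant: 100 ms past the second

# A fixed 3-digit serializer emits ".100Z" for both inputs,
# so the ".1" digit count of the first source string is lost:
print(a.strftime("%Y-%m-%dT%H:%M:%S.") + "%03dZ" % (a.microsecond // 1000))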

 

If the parser is expected to keep the digit count originally used in the source JSON string, can you please explain the specific reasons for this?

 

Thanks.   

 


