[Date Prev] | [Thread Prev] | [Thread Next] | [Date Next] -- [Date Index] | [Thread Index] | [List Home]
Subject: Timestamp sub-second precision
Hi all,
For all timestamps/dates, the spec says that sub-second precision is optional and, when used, can be 1 to 3 digits.
Is a parser expected to preserve the original digit count? Or can the parser return a standard 3-digit output regardless of the sub-second digit count in the original source?
Example:
Suppose the parser accepts a date whose sub-second part is .100Z as well as one ending in .1Z. When the parser converts back to JSON, is it expected to reproduce the original digit count, so that the second example
would serialize as .1Z rather than .100Z?
If the parser is expected to keep the digit count originally used in the source JSON string, can you please explain the specific reasons for this?
Thanks.