OASIS Mailing List Archives
cti-users message



Subject: Re: [cti-users] Timestamp sub-second precision


Stephen et al,

As one of the passionate advocates for temporality/causality representation in the STIX Language, I can address your question quickly and succinctly:

What was the original reason to accept variations in digits?  Why is it not just set at 3 digits?


(1) There was a strong belief in the Community that there should only be "One way of Doing Things".  Period.

(2) This tenet got applied to "Timestamps"

So while it makes absolutely ***NO SENSE*** to assign picosecond granularity to something like when a STIX Package or STIX Incident was created, it makes ***EVERY SENSE*** to convey picosecond granularity in the temporality/causality aspects of system/networking events and relationships when sharing these events/patterns in a real-time ecosystem.


For the more esoteric aspects of the basis for the assertion that sub-millisecond granularity is required, one can go back to the three major "eruptions" of this temporal discourse since 2012. A simple experiment would be to take the inverse of a 10-Gigabit packet stream to get the required periodicity granularity.
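Working that experiment through as a back-of-the-envelope sketch (the minimum-frame and overhead figures below are standard Ethernet assumptions, not from this thread):

```python
# Inverse of a 10 Gb/s packet stream: how far apart can packets arrive?
# Assumes minimum-size Ethernet frames (64 bytes) plus the 20 bytes of
# preamble + inter-frame gap that occupy the wire between frames.
LINE_RATE_BPS = 10e9                 # 10 Gigabit Ethernet
MIN_FRAME_BITS = (64 + 20) * 8       # 672 bits on the wire per packet

packets_per_second = LINE_RATE_BPS / MIN_FRAME_BITS   # ~14.88 million pps
inter_arrival_s = 1.0 / packets_per_second            # ~67 nanoseconds

print(f"{packets_per_second / 1e6:.2f} Mpps -> "
      f"{inter_arrival_s * 1e9:.1f} ns between packets")
```

Ordering events on a saturated 10 GbE link therefore needs tens-of-nanoseconds resolution, far finer than the 3-digit (millisecond) ceiling under discussion.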

Pat Maroney

PS - I can actually send you copies of the discourse if you are truly interested.  Or just search the historical Nabble forum.  Reach out to me directly and I'll send you the keywords.




On Apr 12, 2019, at 6:03 PM, Stephen Russett <stephen@digitalstate.ca> wrote:

Hey Jason, can you link to the specific thread that is the most relevant to the conversation?





 

From: Jason Keirstead <jason.keirstead@ca.ibm.com>
Sent: Friday, April 12, 2019 6:01 PM
To: Stephen Russett
Cc: cti-users@lists.oasis-open.org; Allan Thomson
Subject: Re: [cti-users] Timestamp sub-second precision
 
Stephen et al, before we dive into this on GitHub, can we ensure all involved have read through the TC email history on this issue.

All of these things were discussed at extreme, extreme length in the past. Many, many months of discussion went into this timestamp format; it was one of the most debated single types in STIX 2. No one got exactly what they wanted, but consensus on something that was workable for all was eventually reached.

I don't know why we're re-raising these things based on theoreticals that we have already covered many times before.

Sent from IBM Verse


Stephen Russett --- Re: [cti-users] Timestamp sub-second precision ---

From:"Stephen Russett" <stephen@digitalstate.ca>
To:cti-users@lists.oasis-open.org, "Allan Thomson" <athomson@lookingglasscyber.com>
Date:Fri, Apr 12, 2019 5:02 PM
Subject:Re: [cti-users] Timestamp sub-second precision


Created an issue to continue the discussion:






From: Allan Thomson <athomson@lookingglasscyber.com>
Reply: Allan Thomson <athomson@lookingglasscyber.com>
Date: April 12, 2019 at 4:18:04 PM
To: Stephen Russett <stephen@digitalstate.ca>, cti-users@lists.oasis-open.org <cti-users@lists.oasis-open.org>
Subject:  Re: [cti-users] Timestamp sub-second precision 

Stephen, instead of over email, I suggest you file a GitHub issue with the specific change you would like to see in the specification, and the TC can review and discuss adopting your proposed change.

 

Regards

 

Allan Thomson

CTO (+1-408-331-6646)

LookingGlass Cyber Solutions

 

From: Stephen Russett <stephen@digitalstate.ca>
Date: Friday, April 12, 2019 at 1:16 PM
To: Allan Thomson <athomson@lookingglasscyber.com>, "cti-users@lists.oasis-open.org" <cti-users@lists.oasis-open.org>
Subject: Re: [cti-users] Timestamp sub-second precision

 

But if we are fine with .1Z becoming .100Z (as mentioned in the previous emails), then why do we need to know whether the source system was not capable of that precision or whether the zero was just added for format purposes?

 
 

From: Allan Thomson <athomson@lookingglasscyber.com>
Sent: Friday, April 12, 2019 4:14 PM
To: Stephen Russett; cti-users@lists.oasis-open.org
Subject: Re: [cti-users] Timestamp sub-second precision 

 

Not all use cases require, or are capable of, that precision.

 

It's also true that for some higher-level intelligence you only require day or hour accuracy, whereas for the more real-time intelligence objects you likely want timestamps as accurate as you can get, down to milliseconds.

 

Allan

 

From: "cti-users@lists.oasis-open.org" <cti-users@lists.oasis-open.org> on behalf of Stephen Russett <stephen@digitalstate.ca>
Date: Friday, April 12, 2019 at 1:10 PM
To: "cti-users@lists.oasis-open.org" <cti-users@lists.oasis-open.org>
Subject: Re: [cti-users] Timestamp sub-second precision

 

What was the original reason to accept variations in digits?  Why is it not just set at 3 digits?

 

Steve

 
 

From: Allan Thomson <athomson@lookingglasscyber.com>
Sent: Friday, April 12, 2019 12:16 PM
To: jordan2175@gmail.com
Cc: Stephen Russett; cti-users@lists.oasis-open.org
Subject: Re: [cti-users] Timestamp sub-second precision 

 

Yup. This is one of the challenges of digital signatures based on original content: this requirement forces many systems to not only ingest/export from their own internal formats but *also* keep the original content.

 

I see this as one of the reasons why a digital signature approach based purely on the textual content is flawed and won't work for all use cases.

 

Allan Thomson

CTO (+1-408-331-6646)

LookingGlass Cyber Solutions

 

From: Bret Jordan <jordan2175@gmail.com>
Date: Friday, April 12, 2019 at 8:54 AM
To: Allan Thomson <athomson@lookingglasscyber.com>
Cc: Stephen Russett <stephen@digitalstate.ca>, "cti-users@lists.oasis-open.org" <cti-users@lists.oasis-open.org>
Subject: Re: [cti-users] Timestamp sub-second precision

 

The one thing to be mindful of is that when we do digital signatures, the number of digits cannot change if you retransmit the data; otherwise the signature will break.
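To make the breakage concrete, here is a minimal sketch (hypothetical payload, not a real STIX object) showing that re-serializing .1Z as .100Z changes the bytes, so any hash or signature computed over the original no longer verifies:

```python
import hashlib
import json

# Same instant, two serializations: only the sub-second digit count differs.
original = json.dumps(
    {"type": "indicator", "created": "2019-04-12T18:03:00.1Z"},
    separators=(",", ":"), sort_keys=True)
retransmitted = json.dumps(
    {"type": "indicator", "created": "2019-04-12T18:03:00.100Z"},
    separators=(",", ":"), sort_keys=True)

digest_a = hashlib.sha256(original.encode()).hexdigest()
digest_b = hashlib.sha256(retransmitted.encode()).hexdigest()
assert digest_a != digest_b  # a signature over the original bytes would not verify
```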

 

Thanks,

Bret

PGP Fingerprint: 63B4 FC53 680A 6B7D 1447  F2C0 74F8 ACAE 7415 0050

"Without cryptography vihv vivc ce xhrnrw, however, the only thing that can not be unscrambled is an egg."

 

On Apr 12, 2019, at 8:37 AM, Allan Thomson <athomson@lookingglasscyber.com> wrote:

 

Provided the timestamp value is not changed and does not lose accuracy, I don't see any problem generating a new string value for the timestamp.

 

The primary reason for the text is to ensure you don't end up losing precision/accuracy.

 

So .133Z must *not* become .1Z. That would be a loss of precision and should *not* occur.

 

Regards

 

Allan

 

From: "cti-users@lists.oasis-open.org" <cti-users@lists.oasis-open.org> on behalf of Stephen Russett <stephen@digitalstate.ca>
Date: Friday, April 12, 2019 at 7:31 AM
To: "cti-users@lists.oasis-open.org" <cti-users@lists.oasis-open.org>
Subject: [cti-users] Timestamp sub-second precision

 

Hi all

 

For all timestamps/dates, the spec says that sub-second precision is optional and, when used, can be 1 to 3 digits.

 

Is it expected that a parser maintain the digit count? Or can the parser return a standard 3-digit output regardless of the sub-second digit count the original source provided?

 

Example:

 

If you parsed a date whose sub-second portion was .100Z versus something like .1Z: if the parser accepted both, then when the parser converts back to JSON, is it expected to use the original digit count, so that the second example would return .1Z rather than .100Z?

 

If it is expected to keep the digit count originally used by the source JSON string, can you please explain the specific reasons for this?
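One way a parser could satisfy such a requirement is to record the producer's digit count at parse time and reuse it on output. A minimal Python sketch (the helper names are illustrative, not from any STIX library):

```python
import re
from datetime import datetime, timezone

# Timestamp with an optional 1-3 digit sub-second part, e.g. 2019-04-12T18:03:00.1Z
TS_RE = re.compile(r"^(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})(?:\.(\d{1,3}))?Z$")

def parse_ts(text):
    """Return (datetime, digit_count) so a round-trip can reproduce the input."""
    m = TS_RE.match(text)
    if not m:
        raise ValueError(f"not a valid timestamp: {text!r}")
    base, frac = m.groups()
    micros = int((frac or "0").ljust(6, "0"))   # ".1" -> 100000 microseconds
    dt = datetime.strptime(base, "%Y-%m-%dT%H:%M:%S").replace(
        tzinfo=timezone.utc, microsecond=micros)
    return dt, len(frac or "")

def format_ts(dt, digits):
    """Serialize back using the original digit count."""
    base = dt.strftime("%Y-%m-%dT%H:%M:%S")
    if digits == 0:
        return base + "Z"
    frac = f"{dt.microsecond:06d}"[:digits]
    return f"{base}.{frac}Z"

dt, digits = parse_ts("2019-04-12T18:03:00.1Z")
assert format_ts(dt, digits) == "2019-04-12T18:03:00.1Z"  # .1Z stays .1Z, not .100Z
```

Without carrying `digits` alongside the parsed value, a naive round-trip through a datetime type would normalize .1Z to .100Z, which is exactly the signature-breaking rewrite discussed earlier in the thread.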

 

Thanks.   



