Subject: Re: [cti] Patterning Feedback -Timescales/Periodicity

Pat – I have yet to see from you any real examples of patterns around 10 GbE metadata/PCAP captures, just the theoretical notion that it may be possible. As such, I don't think this meets the bar of being MVP for patterning.

Regards,
Ivan

From: Patrick Maroney <Pmaroney@Specere.org>

@Eric: I was going to argue for Planck-time precision, as we will indeed one day be dealing with covert channels and remote sensing using quantum entanglement. However, I thought I -might- get pushback on this and compromised with picoseconds. 😬
@Ivan: 10 Gigabit Ethernet NetFlows and metadata from PCAP captures and network/security appliances that operate at these speeds are the current application/use case.

@All: I also agree with Eric. My argument is only for the need, not for how we meet the need. Adding three words to the spec seemed a reasonable compromise.
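
For a sense of the timescales in that use case: at 10 GbE line rate, back-to-back minimum-size frames arrive roughly every 67 ns, so timestamps with only microsecond resolution cannot even order adjacent packets. A minimal sketch of the arithmetic, in Python:

    # Inter-arrival time of minimum-size frames at 10 GbE line rate.
    # Assumes standard wire overhead: 64-byte frame + 8-byte preamble/SFD
    # + 12-byte inter-frame gap = 84 bytes per frame slot.
    LINK_RATE_BPS = 10 * 10**9          # 10 Gb/s
    WIRE_BYTES_PER_FRAME = 64 + 8 + 12

    frame_time = WIRE_BYTES_PER_FRAME * 8 / LINK_RATE_BPS
    print(f"time per frame slot: {frame_time * 1e9:.1f} ns")        # 67.2 ns
    print(f"max packet rate:     {1 / frame_time / 1e6:.2f} Mpps")  # 14.88 Mpps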

From: cti@lists.oasis-open.org <cti@lists.oasis-open.org> on behalf of Kirillov, Ivan A. <ikirillov@mitre.org>

I completely agree with Eric on his comments regarding precision being arbitrary. Also, given that our current set of Cyber Observable Objects does not model entities such as CPU cache, where signal latency is in the realm of microseconds/nanoseconds/etc., can anyone (Pat or others) actually come up with a real pattern for detecting malice using our current set of Objects that requires such precision? I suspect not, though I'd be happy to be proven wrong.

Regards,
Ivan

From: cti@lists.oasis-open.org <cti@lists.oasis-open.org> on behalf of Eric Burger <Eric.Burger@georgetown.edu>

I thought this topic was discussed, argued, beaten to death, and the consensus was that precision was arbitrary and could be infinitesimal.

In a machine-readable language, why are we using English words to describe units? In a machine-readable language, designed for exchanging information at machine speeds, why are we using human speeds to describe time?

I disagree with Patrick here that we should add three more decimal multiples. I do agree that in a machine-readable language operating at machine speeds, we will have to make sure our language scales as those speeds get faster and faster. The current proposal fails to do that. As such, unless we are willing to use the Planck time (~5 x 10^-44 s) as the base unit, we will always be wishing we had smaller decimal multiples.

Here is a counter-proposal:

1. Pick exactly two time units: one for humans, one for machines. I would offer seconds and picoseconds.
2. Allow integer, float, and scientific notation for the time specification.

Integer seconds cover 136 years of time in just a 32-bit integer. That should be more than enough time to correlate two events at human scale. Integer picoseconds cover 213 days of time in a 64-bit integer. That should be more than enough time to correlate any two events at machine scale. For that matter, it really should be enough time to correlate most events at human scale.
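
Checking the arithmetic behind those ranges (a quick sketch; assuming unsigned integers, which is what the 136-year figure implies):

    # Range of the two proposed fixed-width representations.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    SECONDS_PER_DAY = 24 * 3600

    # Unsigned 32-bit counter of whole seconds
    print(f"2**32 seconds     = {2**32 / SECONDS_PER_YEAR:.1f} years")        # ~136.1
    # Unsigned 64-bit counter of whole picoseconds
    print(f"2**64 picoseconds = {2**64 * 1e-12 / SECONDS_PER_DAY:.1f} days")  # ~213.5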

Is this future-proof? When we have optimal quantum computers, we can still have expressions that say "the wavelength changes within 2.5E-42 seconds" or "the wavelength changes within 2.5E-30 picoseconds". Saying we will have to add femtoseconds, attoseconds, yadda, yadda is not future-proof.
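
Those two expressions denote the same duration in the two proposed units (1 second = 10^12 picoseconds); for scale, it can also be compared against the Planck-time figure cited above:

    # Same duration expressed in both proposed units, plus a Planck-time comparison.
    PICOSECONDS_PER_SECOND = 1e12
    PLANCK_TIME_S = 5.39e-44      # ~5 x 10^-44 s, as cited above

    dt_s = 2.5e-42
    print(f"{dt_s} s = {dt_s * PICOSECONDS_PER_SECOND:.1e} ps")      # 2.5e-30 ps
    print(f"that is about {dt_s / PLANCK_TIME_S:.0f} Planck times")  # ~46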