
Subject: Re: [was] Updated WAS Classification Scheme

A ten point scale is certainly granular enough for each axis.  I kind of
like the approach where you start with zero points and add factors in to get
your rating.  So if the vulnerability has 2 or 3 of the items, it gets 5 or 7
points (e.g.).  If you go over 10, it's a 10, right?
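A minimal sketch of that additive idea, with made-up factor names and point values (nothing here is from the spec; it just illustrates "start at zero, add points, cap at 10"):

```python
# Hypothetical additive rating: start at zero, add points for each
# applicable factor, cap the total at 10. Factor names and point
# values are invented for illustration.
FACTOR_POINTS = {
    "admin_access": 5,
    "data_read": 3,
    "dos": 2,
    "tools_available": 3,
}

def rate(factors):
    """Sum points for the applicable factors, capped at 10."""
    return min(10, sum(FACTOR_POINTS[f] for f in factors))

print(rate(["data_read", "dos"]))                             # 3 + 2 = 5
print(rate(["admin_access", "data_read", "tools_available"])) # 11 -> capped at 10
```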

The more specific we can get with these, the better it's going to work.  I
worry about things like "Data read access" because it totally depends on
what is read.  If the vulnerability allows attackers to read unused image
files, who cares?  But if it's customer account files or something, then
it's a big deal.  I think this is the "Business Impact" dimension, and I
don't know how to split it out and still have something meaningful left behind.

Perhaps we could at least indicate whether the factor applies to the OS, App
Server, or application assets.  So it would say OS file read, OS file write,
App Server file read, etc...  Would that help?  It would also be nice to get
some coverage here in terms of confidentiality, availability, and integrity.

Likelihood (aka Threat Prevalence -- don't like this name)
 - difficulty to discover the vulnerability
 - window of opportunity and motive (but these are sort of
organization-specific)
 - Local or remote -- this seems more like an impact factor to me.  How does
this affect likelihood?
 - prerequisites for the attack (must have account, must have certificate,
must have physical access)

On your VNC + local buffer overflow example, this is an extremely difficult
problem.  This is essentially the composition problem (see the NSA Technical
Paper by Mario Tinto).  It means that you have to re-evaluate every
vulnerability in light of all the others in a system.

I see where you're going with the correlation idea, but I can't make it
work. We'd have to make the categories much more specific.  Right now, a
vulnerability category could easily apply to a whole range of
vulnerabilities -- from really risky to pretty harmless.


----- Original Message ----- 
From: "David Raphael" <draphael@citadel.com>
To: <was@lists.oasis-open.org>
Sent: Wednesday, April 21, 2004 1:43 PM
Subject: RE: [was] Updated WAS Classification Scheme

I am going to proceed with these assumptions if no one else has any
feedback for the first Draft of this document.


-----Original Message-----
From: David Raphael
Sent: Monday, April 19, 2004 10:20 AM
To: Mark Curphey; Jeff Williams; was@lists.oasis-open.org
Subject: RE: [was] Updated WAS Classification Scheme

Hi everyone,

I would like to get a little more feedback on which factors should play
into the Vulnerability Severity.

Additionally, I was thinking that a 10 point scale for each Axis would
provide sufficient granularity in determining the severity.  There is
another issue as well.  I am not sure as to where the curves should fall
on the graph.  With the draft that I sent out, I placed the endpoints of
the curves at equal points on the graph axes.  This is OK if the
Prevalence Factor balances correctly.  But I suppose we are just looking
for an accurate final output.  So as long as we weight the prevalence
factor components accurately, we will get an accurate result.

Here is another situation that I thought of:  What if you have two low
severity vulns?  One is a low severity REMOTE VNC exploit, and the other
is a low severity LOCAL BUFFER OVERFLOW that provides root access.  Both
are low severity, but combining the 2 exploits provides REMOTE SYSTEM
COMPROMISE.  Just curious if there is any way to think about this.

Here is what I am looking at so far:

Technical Impact Factor Components:
- Administrator Access (Total System Compromise): +5
- User Access (Partial System Compromise): +2
- DoS: +2
- Data Read Access: +3

Threat Prevalence Factor Components:
- Tools Readily Available: +3
- Difficulty of exploit construction:  (+1)-(+3) Note:  I think that
this is a sliding scale of difficulty.  +1 would be very difficult, +3
would be very easy.
- Local or Remote: (x.5) for Local, (x1) for Remote
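The components above can be sketched in code. Note the combination rule at the end (multiplying the two factors together) is my assumption for illustration; the thread leaves it open:

```python
# The proposed components: Technical Impact points are summed, Threat
# Prevalence adds tool availability and exploit ease, then applies the
# local (x0.5) / remote (x1) multiplier. Combining the two factors by
# multiplication is an assumption, not settled in the thread.
TECH_IMPACT = {"admin_access": 5, "user_access": 2, "dos": 2, "data_read": 3}

def technical_impact(consequences):
    # Sum the impact points, capped at 10 to stay on the 10-point axis.
    return min(10, sum(TECH_IMPACT[c] for c in consequences))

def threat_prevalence(tools_available, exploit_ease, is_remote):
    # exploit_ease: sliding scale from 1 (very difficult) to 3 (very easy).
    base = (3 if tools_available else 0) + exploit_ease
    # Local exploits are halved (x0.5); remote ones keep full weight (x1).
    return base * (1.0 if is_remote else 0.5)

impact = technical_impact(["admin_access", "data_read"])              # 8
prevalence = threat_prevalence(True, exploit_ease=2, is_remote=False) # 2.5
severity = impact * prevalence  # one possible combination: 20.0
```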

Should we correlate the components of the factors with the VulnTypes?
E.g. Buffer Overflows, DoS etc...?

Feedback appreciated.


-----Original Message-----
From: Mark Curphey [mailto:mark.curphey@foundstone.com]
Sent: Friday, April 02, 2004 1:39 PM
To: David Raphael; Jeff Williams; was@lists.oasis-open.org
Subject: RE: [was] Updated WAS Classification Scheme

100% agree. Risk is where this data becomes valuable. Focusing on System
Impact for WAS 1.0 seems like the right approach.

-----Original Message-----
From: David Raphael [mailto:draphael@citadel.com]
Sent: Friday, April 02, 2004 10:59 AM
To: Mark Curphey; Jeff Williams; was@lists.oasis-open.org
Subject: RE: [was] Updated WAS Classification Scheme

I think that the Risk model would be a valuable addition.  I think we
should definitely put it in the 2.0 roadmap.  See my other email
suggesting this as a separate component of the profile element.


-----Original Message-----
From: Mark Curphey [mailto:mark.curphey@foundstone.com]
Sent: Friday, April 02, 2004 9:49 AM
To: Jeff Williams; David Raphael; was@lists.oasis-open.org
Subject: RE: [was] Updated WAS Classification Scheme

Whilst I totally agree with what you are saying (and we all know the
ultimate value of this is in risk management, not vuln management), WAS
would then become a risk management format, not a vuln management format.
That's a pretty massive scope difference: defining all of the elements
needed for a risk management format.

Also, a lot of people are starting to show interest in this concept
beyond the realm of just App Sec vulns, i.e. an enterprise vuln
management language. Again, I think we can all see that's where this will
end up, but we need to keep track of scope here so we can make sure we
get WAS 1.0 out in the timeframe we planned (end of April for VulnTypes
and the Vuln Ranking Model, and August for the final spec).

Any merit in tabling for WAS 2.0 ?

-----Original Message-----
From: Jeff Williams [mailto:jeff.williams@aspectsecurity.com]
Sent: Friday, April 02, 2004 10:43 AM
To: Mark Curphey; David Raphael; was@lists.oasis-open.org
Subject: Re: [was] Updated WAS Classification Scheme


Great summary of the difficulty here. I think our scheme should allow us
to express as much information as possible, but it needs to be clear about
what parts of risk are not covered.

I'm thinking that in many cases, we WILL know quite a lot about the
business impact of a vulnerability -- especially if you're an employee
or consultant working with the company closely and want to use WAS to
describe and track the issue.

Even if it's some outside researcher testing myPhpCreditCardStore who
finds a SQL injection, he'll want to be able to say that this will
disclose all the CC's in the DB, that Visa and the FTC will levy fines,
and that the company's reputation will be shot.

So I'm leaning towards a system where we CAN specify as much as we know
about impact, but it's not required.  The real trick is prioritizing items
where you don't know enough about the impact.  But that has to be up to
the business.


----- Original Message -----
From: "Mark Curphey" <mark.curphey@foundstone.com>
To: "David Raphael" <draphael@citadel.com>; <was@lists.oasis-open.org>
Sent: Friday, April 02, 2004 10:05 AM
Subject: RE: [was] Updated WAS Classification Scheme


I know we just touched on discussing various models at the face to face
(i.e. we all know this needs a lot more thought), but here are some of my
thoughts.

I think we need to understand and plan this as our contribution to the
bigger picture of risk, i.e. risk is what people ultimately want to
measure. What's the risk to my business? WAS is about vulns, which is
only a part of that risk equation, and therefore I think we need to find
a way to rank the severity of the vulnerabilities in such a way that we
can feed risk systems with meaningful, useful data, and companies can use
other data to calculate the risk in their own way. But we shouldn't
venture that way ourselves. It's very complex and well outside our scope.

If we focus on NIST 800-30 as a high level way of determining risk, it
may help to bring some clarity. It categorizes vulnerabilities into
operational, technical and management. The things we are dealing with
are obviously technical vulnerabilities (although the root cause element
may also help indicate management or operational, I guess). But the point
is, IMHO we need to be careful to define the scope to the vuln and part
of threat only. Part of threat is explained in a second.

I think someone building a sig management system around WAS-only data
would want to have several views into the system: # of vulns, # of vulns
of a particular severity, # of vulns by type (VulnType), # of vulns within
dates, etc.

If we think of risk = vuln x threat x business impact as a basic model
we can understand how this vuln data would be used.
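That basic model reads directly as a product of three terms. A trivial sketch (the numbers are illustrative; WAS would supply the vuln, and part of the threat, while business impact stays with the organization):

```python
# risk = vuln x threat x business impact (basic model from the thread).
# WAS can supply the vuln term (and some of the threat term); the
# business impact term has to come from the organization using the data.
def risk(vuln_severity, threat, business_impact):
    return vuln_severity * threat * business_impact

# Same vuln and threat, very different risk depending on the asset:
print(risk(7, 5, business_impact=1))   # 35  (low-value asset)
print(risk(7, 5, business_impact=10))  # 350 (critical asset)
```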

The real value of this data at a high level to me is when someone is
able to apply the vuln data to an asset and know what the impact to
their business is if that asset was exploited (i.e. threat matured).

We have no idea about the impact to the business, so we can't feed that
in anyway. We have an idea about the impact to a system (i.e. root
compromise etc.) but not to the business.

We can feed part of the Vuln (i.e. technical and maybe should influence
operational / management through root cause under certain circumstances)

We do not know the real threat (i.e. the threat of an overflow vuln
being used on a power utility company after the NE blackouts is very much
higher than before, but a WAS endpoint system wouldn't know the threat
model, as we don't know (or shouldn't try to define) the end environment).
That said, we should feed into the threat portion of a risk model things
like whether exploit code exists, whether it can be automated, etc. I think
that's useful data we should capture, and it allows people to build better models.

So I think this model should produce a vulnerability severity and a threat
indicator which on their own are useful but are really intended to be
fed into risk management systems, which is where that stuff is truly
valuable.
In your overview we started defining data that a researcher or vuln
analyst may not know about.

*         Quantity of data (%)

*         ...TODO:  Add more consequence factors

I think we can define the vuln severity as a form of potential impact to
the system (i.e. data modification, partial system compromise, total
system compromise, exploited remotely or locally, etc.) and place a
weighting on those factors.

An example (and this is just for illustrative purposes) might be a local
buffer overflow where you needed to be on the local system to exploit it.

(0.5 negative, 1.0 neutral, 1.5 positive)

The effect is total system compromise (therefore 1.5) but locally
exploitable (0.5).

I.e. Vuln Severity is a function of a Technical Impact Factor and a
Threat Prevalence Factor.
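Reading those weights as multipliers, the local buffer overflow example works out like this (multiplying the effect weight by the access weight is one possible reading, not a settled rule):

```python
# Weights from the illustration: 0.5 negative, 1.0 neutral, 1.5 positive.
# Multiplying effect weight by access weight is an assumed combination
# rule; the category names are invented for this sketch.
EFFECT_WEIGHT = {"total_compromise": 1.5, "partial_compromise": 1.0}
ACCESS_WEIGHT = {"local": 0.5, "remote": 1.0}

def severity_weight(effect, access):
    return EFFECT_WEIGHT[effect] * ACCESS_WEIGHT[access]

# Total system compromise (1.5) but only locally exploitable (0.5):
print(severity_weight("total_compromise", "local"))  # 0.75
```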

That said there are loads of ways to do this.

I am open to any suggestions but would like to keep it simple !

Mark Curphey
Consulting Director
Foundstone, Inc.
Strategic Security

949.297.5600 x2070 Tel
781.738.0857 Cell
949.297.5575 Fax

http://www.foundstone.com

This email may contain confidential and privileged information for the
sole use of the intended recipient. Any review or distribution by others
is strictly prohibited. If you are not the intended recipient, please
contact the sender and delete all copies of this message. Thank you.


From: David Raphael [mailto:draphael@citadel.com]
Sent: Thursday, April 01, 2004 6:09 PM
To: was@lists.oasis-open.org
Subject: [was] Updated WAS Classification Scheme

Hello Everyone,

I've updated this document with a rough draft of the Vulnerability
Ranking model.  Please review and pass along any comments you have.  I
will continue to update it this weekend with more detail.


David Raphael

To unsubscribe from this mailing list (and be removed from the roster of
the OASIS TC), go to

