

Subject: Minutes from First Face-to-face on November 10th


Minutes for the 6th meeting of the Electronic Identity Credential Trust Elevation Methods (Trust Elevation) Technical Committee and the first Face-to-Face meeting

10 November, 2011

 

1. Call to Order and Welcome.

 

2. Roll Call

Attending (please notify me if you attended the meeting but are not on the list below)

Abbie Barbir, Bank of America  - y

Anil Saldhana, Red Hat  - y

Brendan Peter, CA - y

Colin Wallis, New Zealand Government  - y

Dale Rickards, Verizon Business  - y

David Bossard, Axiomatics - y

Dazza Greenwood, - y

Debbie Bucci, NIH  - y

Don Thibeau, Open Identity Exchange - y

Ed Coyne, Dept Veterans Affairs - y

John Bradley - y

Kevin Mangold, NIST - y

Peter Alterman, NIST - y

Mary Ruddy, Identity Commons - Y

Marty Schleiff, The Boeing Company

Shaheen Abdul Jabbar, JPMorgan Chase Bank, N.A.  - Y

Shahrokh Shahidzadeh (Intel Corp)  - y

Thomas Hardjono, M.I.T.  - y

 

Observers

Cathy Tilton, Daon

 

Guests

Dan Combs, eCitizens

Sarbari Gupta, Electrosoft

 

66 percent of the voting members were present at the meeting.  We had quorum.

 

We used the following chat room for the call: http://webconf.soaphub.org/conf/room/trust-el - chat room text is included at the end of the minutes

Abbie thanked people for coming. There was an introduction of the people present. He explained that the first document is a comprehensive view of methods, and we need to agree on how to deliver that. We need to review the use cases to adopt method points. Before starting on the first deliverable, we need to have an open discussion to get agreement on key points:

1)      What do we mean by trust elevation, i.e., in this context, what we mean by trust-el and auth strength. (The use cases justify methods, but we need to know what each technique means.)

2)      Method factors.

Peter expressed a concern that it smells and feels like a very deep rat hole. What is likely to happen, if we open the discussion to define and clarify, is that we will have rat hole after rat hole. We should review the charter and post a definition of what is meant.

Abbie said this was fine. No fundamental difference. We need to reach quickly a bare minimum agreement.  There are 3 general purpose factors – what you are, what you know and what you have.  There is also a fourth category, which is contextual. These are pretty well accepted. To increase strength of auth you need to add factors from the same or different categories that don’t have the same vulnerabilities.  Vulnerabilities are a factor of what you are trying to protect.   The risk factor to you is relative to the use case and user. Some are willing to take higher or lower risk.  Whether a factor is implemented correctly, vs. incorrectly, changes risk. When a password is unprotected… [it can be stolen.]

Peter said we can certainly define trust-el that way, based upon the 4 factors or 3 plus the phantom factor. This is legitimate.  So we are defining trust-el in a way that is generic enough to be used in the wild.  The definition of trust-el is a statement, not an agenda item.

Abbie agreed strongly. So no rat hole. The FFIEC said to use a layered approach. Auth is contextual; it is time sensitive. Financial services institutions (FSI) have failed to implement strong auth for the general consumer in a meaningful way. We have what we claim is [strong auth] but it is not. We are playing catch up with the health IT people. The end user is lost. We give them a false sense of security. A regular website provides a false sense of security. One legitimate outcome is a movement towards the elimination of passwords completely.

Abbie summarized this in one phrase. The goal is to kill passwords.  Ideally this is part of deliverable 3.

Shaheen commented, when Abbie mentioned getting rid of passwords, that Financial Service Institutions (FSI) don't like storing passwords on their end. For an FSI, storing a password is a liability.

Abbie asked that this be noted. (Noted in methods spreadsheet.)

Comments were made about Sony and other breaches.

Peter said that certainly NSTIC wants that. The death of user id and password is one of the program's goals. What we are going to do in this TC is create ammunition to do this.

Abbie commented that using this ammunition is stage two. 

Peter said that a federation or framework of frameworks is what NSTIC is pulling together. The government's recommendations are almost complete and are going through clearance.

Brendan asked a foundational question. Are we considering ourselves bound by the NIST LOAs or are we conceptualizing breaking these down into sublevels? We could have an entirely different discussion if not bound by those restrictions.

Brendan commented that the street identity projects don’t cross thresholds, but they enhance trust within a level. We might want to consider being more inclusive.

Shaheen agreed.

Cathy indicated that this subject is near and dear.  NIST 800-63 gives a very narrow role to biometrics.  To allow for an expanded set of methods may be useful.

Abbie said instead of mentioning NIST 800-63, we should mention ISO/IEC 29115 or ITU-T x.1254. That is what we have moved up to. NIST has a fundamental flaw of reliance on in-person proofing. Now things are much more flexible.

Abbie will send it after the meeting.

*Action item for Abbie to send relevant ITU-T standards.

Abbie indicated that NIST is a good starting point, but we should move beyond it. A credential could be certified at one LOA, but you might not trust it because of context. Trust is part of a relationship.

Peter said, one of the things we hope to discover is the number of different approaches various RPs use to determine trustworthiness. To peg trustworthiness, some FSI employ a numeric [risk] system: if my trust is above 76, it is acceptable for authorization. As advisors to E-AUTH 2004, we were writing papers about other approaches. This is where certain segments of the industry are going. There are alternatives. Some companies have CIOs telling them to avoid risk. At other places, such as PayPal, a working credit card account is sufficient for a high LOA. At the end of the day, he wants to see who is doing what, and how. The Feds are a big part of this world, so we can't discount that perspective, but also at NIST we understand that there are other perspectives.

Abbie concluded that we are reaching consensus. He doesn't know if we should make it a position, or wait. There is no running away from LOA 1-4. It is becoming an international standard in a few months and we need this for many situations.

Abbie noted that it is the in-betweens of LOA-3 implementations that can be full of holes, or not. It is highly variable: 3+, 3- or 2+ based on how you use it and how much you do. The binding becomes part of the trust. We have to go with the spirit of the LOAs. We have to find a way to elevate trust for a given interaction.

Peter said that even within LOA-1 it is a useful exercise to demarcate strategies and tasks that can enhance LOA-1, not LOA-2 or 3. This is the sense of what we are talking about. We need to look at a practice irrespective of those conditions. We need to look at multiple levels and combinations.

Abbie totally agreed. Trust-el is within the context of the transaction, even at LOA-1.

Brendan is concerned about LOA-1-4 and below that.

Ed said there are two things to look at: methods and criteria. So we need to have a (semi-)mathematical way to show how these factors combine. It may be fuzzy. We need to understand the factors, the way they are used and their value (i.e., the trust went from a 2 to a 3), so that we have an objective evaluation function.

Abbie replied yes, we are on the same page here. The addition of methods should reduce the overall vulnerability. If you add something with the same vulnerabilities you are really not elevating the trust. If you are trying to protect against key logging, a longer password doesn't help. Gartner says that if you add a drop down with an OTP, you have mitigated this risk. But you are still using a password. It is the vulnerability. The additional factor should reduce vulnerabilities.
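To make the overlapping-vulnerabilities point concrete, here is a minimal illustrative sketch in Python (the vulnerability labels and factor sets are invented for the example and are not a TC taxonomy): an attack breaks a combination of required factors only if it defeats every one of them, so an added factor elevates trust against a threat only if it is not itself vulnerable to that threat.

# Illustrative only: model each factor by the set of attacks that defeat it.
PASSWORD        = {"keylogging", "phishing", "guessing"}
LONGER_PASSWORD = {"keylogging", "phishing"}
OTP_TOKEN       = {"real-time phishing", "device theft"}

def residual_vulnerabilities(*factors):
    """Attacks that still succeed when all of the listed factors are required."""
    return set.intersection(*factors)

def elevates_against(threat, *factors):
    """True if the combination is no longer vulnerable to the given threat."""
    return threat not in residual_vulnerabilities(*factors)

# A longer password shares the keylogging exposure, so nothing is gained against it:
print(elevates_against("keylogging", PASSWORD, LONGER_PASSWORD))  # False
# An OTP token is not defeated by keylogging alone, so the combination is protected:
print(elevates_against("keylogging", PASSWORD, OTP_TOKEN))        # True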

Peter asked if we could put together a brief statement of trust-el.

Abbie asked if there were any objections to having an open meeting after lunch.

There were none.

Peter is happy to have them participate.

We will review use cases in the afternoon.

Peter asked if we can ask for methods directly.

Abbie replied yes.

Peter remarked it seems use cases are more appealing to participants than methods?  Not clear.  He is concerned about having two conversations.

Abbie said in the afternoon, we can talk use cases: have you used this method before? What is so different about using these methods, etc.

Passwords can be clear, hashed, under SSL, or hash encrypted and signed. There are strong and weak passwords.  We have to come up with a way to ensure these factors.

Peter said I could get the user name and validate the password, then throw some KBA at it. We have a use case like that. We hand out username/password, and manage it. When it is used, we do an IP check, we check cookies, and send KBA questions, one of which is a negative.

What triggers the KBA varies. Cookie value depends on the OS (desktop, mobile) used, whether it is banded or not, whether it is an SSL session or not, and whether we can validate the IP address. There are various ways to evaluate trust components in the risk engine.
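As a rough, purely illustrative sketch of the kind of gate just described (the signals, weights and threshold below are invented; production risk engines keep these details proprietary):

# Hypothetical risk gate: decide whether a returning username/password session
# should be challenged with KBA. Signals, weights and the threshold are invented.
def needs_kba(ip_seen_before, cookie_valid, mutually_authenticated_ssl, known_device_os):
    risk = 0
    if not ip_seen_before:
        risk += 2   # new network location
    if not cookie_valid:
        risk += 2   # no recognizable device cookie
    if not mutually_authenticated_ssl:
        risk += 1   # weaker channel binding
    if not known_device_os:
        risk += 1   # unfamiliar OS / device type
    return risk >= 3

# A familiar device on a known IP goes straight through; anything riskier gets KBA.
print(needs_kba(True, True, False, True))     # False
print(needs_kba(False, False, True, False))   # True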

The details of the validation are out of scope. The way B of A does its risk engine is secret.

Shaheen completely agreed. He doesn’t want to disclose proprietary information. 

Peter said the reason we don’t ask for details in the phase 1 questions is to address this concern. Within SSL it is mutually authenticated or not.  We can broadly describe it.  There are variants on how it is implemented.  You can check on where the user is coming from and decide yes or no on KBA.

Abbie said we are in violent agreement.

Shaheen asked if we need to define criteria for each LOA.

Abbie returned to the topic of vulnerabilities. He likes the approach in the Gartner paper.

We need to go down to the details of validating the IP address, protecting against man-in-the-middle and man-in-the-router attacks, hijacking, etc.

Peter is concerned about false positives and false negatives.

……..

There was a discussion of bank security.  You can’t put a machine on his bank network with a USB card.  It is banned and tested for.  Hardware is encrypted at boot time. There are frequent scans on all machines.

Token source and jurisdiction were mentioned.

We reviewed the definition of trust elevation we had been discussing and that was posted to the chat room.

Trust elevation is increasing the strength of trust by adding factors from the same or different categories of methods that don’t have the same vulnerabilities.  There are four categories of methods: who you are, what you know, what you have and the context. Elevation can be within the classic four NIST or ITU-T levels of assurance or across levels of assurance.

 

Shaheen didn’t want to restrict it to NIST specifications. He would rather go with ITU-T.  One of the international standards editors is from NIST. NIST has contributed a lot.

It was also commented that we should include provisioning information. A 6 month certificate is too long.  We need to note initial state and duration.

Revised text:

Trust elevation is increasing the strength of trust by adding factors from the same or different categories of methods that don’t have the same vulnerabilities. There are four categories of methods: who you are, what you know, what you have and the context. Context includes location, time, party, prior relationship, social relationship and source. Elevation can be within the classic four NIST and ISO/ITU-T levels of assurance or across levels of assurance.

 

We will start with this.  This is a jump start. It will be edited.

Cathy asked where the role of identity proofing comes in for LOA. Trust depends on certifiers and TFPs.

Break.

Abbie summarized that we are setting the stage:

1)      Working statement on trust-el

2)      Next we need to look at various factors – what structure should we adopt so we can do the analysis. He wants to open the discussion.

3)      We need a table of contents and main sections [of the draft document] and to start to fill in spaces.

Peter said we got an introduction to the methods. What we need is to get everyone participating to move forward and write down the methods they use, at whatever level of specificity they are comfortable with, to add to the knowledge base. We are moving forward with the services. The next step is about making this information acquisition happen, and putting it in writing.

Abbie said we have a statement of work and it is being progressed. We will have someone who we will know. The way the SOW is structured, the TC will get updates at each phone call. He doesn't want to wait. There are people in the room, and we can identify factors: UN/PW, software and hardware tokens, biometrics. We can look at the LOA levels, PIV, etc. UN/PW can come from the RP or be federated. KBA is also an elevation method. Also IP checking - have we seen it before? Is it trusted not to infect us? These are risk factors. There is a fine line between trust and security. There are numerical scoring approaches that are built upon trusted credentials and trust in KBA (KBA is not all the same). The AMA's KBA database is proprietary and more trustworthy than public databases.

Brendan asked if we should make these observations.

Peter replied yes. Reliability and reality are not value judgments, but analytic information. Data aggregators of publicly available data on individuals have lower accuracy in data elements than proprietary providers (e.g. KBA on age).

Abbie said at the end of day, it is how vulnerable you are to attack.

Peter said it isn’t about risk it is about trust.

Abbie said it is a choice of technique. The technique has to remedy certain weaknesses. Contextual knowledge base is more important, especially with biometrics.  Phone and voice print is another example.  Voice recording is used in the FSI. Can do voice comparison.

Abbie said sometimes a bank can request an in house interview.  What you can do depends on the laws and jurisdiction.

Abbie asked, if you want more confidence, which attribute can you use to enhance trust? Usually KBA is used when the context has changed. So the KBA is secondary, i.e. convince me it is you. Some KBA is static and trivial. Some could be very dynamic. You could ask Verizon whether the customer is really roaming with them.

Peter remarked that some of these aren’t credentials.  Some of these may not be trust elevating, but rather credential validating. Sometimes you are validating the credential without increasing the trust.

Shaheen said the key is username/password first. KBA is secondary. It is used when variables have changed. UN/PW comes in and the risk engine takes the IP address, cookie, and location, and sees if these are different. If different, we go to KBA. It depends on how smart or accurate the KBA is. First pet is not as good as a voice print. Biometrics are almost always better than public KBA. If done right (if the KBA is via SSL and contains relationship/private data, like the AMA database, which is proprietary and not public), RPs are more likely to rely on the KBA responses. If the KBA aggregator is in a secure environment it helps.

Brendan doesn't know if that issue is actually resolved. Competitors providing a redundancy of data records derived from multiple sources help to reduce vulnerability. One can use a hybrid approach using customer-specific and public data. We need to be careful. There is a slippery slope of value judgment, rather than different tactics that can be used.

Peter said, first we should gather the raw data, then we should analyze it.  Yes there is a value judgment for some.

Abbie said there is a rat hole.  We need context to judge quality.

Peter commented there are varying levels in KBA.

Abbie gave examples of varying risk context: a person in an ambulance who you need to know is allergic to insulin vs. transferring money. The risk is different. So we don't want to get into a discussion of risk score. Score, location, and the method of capture and aggregation vary for KBA.

Peter agreed that KBA is not a monolithic thing. It is a variety of methods and systems. It is all this same general technique.

Abbie said that a second factor is a token, hardware or software.

Cathy commented that a biometric is not a token according to 800-63, but she thinks it is. For biometrics, integrity, authenticity and privacy (not secrecy) are key. There are known security mechanisms that can make a biometric very viable.

Peter said the issue is which bucket to use for biometrics. Biometrics is something you are.

Abbie commented that biometrics can be thwarted if biometric data can be injected into the transport.

Shaheen said if you make biometrics a knowledge base, you could use it as a secondary [factor] only. Never use it as a primary.

A voice print is a biometric, and so is a heartbeat.

Cathy discussed security vs. authenticity. She didn't disagree. If looking at authenticity and integrity, the key is that it comes from a live human and has not been tampered with. If you don't have liveness detection, then the reliability is lower than with liveness.

Peter noted that physical presence is a context of biometrics. A biometric is not just a one-point thing; it has a broad range.

Abbie agreed. So need to be very careful.

Peter commented that biometric techniques run the gamut of trustworthiness based on context and implementation.

Cathy said that voice is less accurate than iris scan, so we need to categorize it that way.

Peter is looking for information on the lowest false positives and false negatives.

Cathy said that she has info on this. The ISO 19092 biometric security framework for FSI covers this. This is one of the things that needs to be put on the list.

Peter said we can do a direct link. The bibliography will be part of the first or second deliverable.

*Action item to call again for key documents to be identified to put on the bibliography on the TC site.

*Action item for Peter to send a message to Tony R.

*Action item for Cathy to send a list of biometric references to the list. (Done).

Peter suggested that biometrics be listed as a separate category as there are so many issues.

Abbie asked if there were any objections if we separate biometrics as a factor.

There were none.

Shaheen said it is something that you are.

Abbie said so now we have three.

Peter talked about methods of verifying your experience with the device.

Abbie called this endpoint identity. It includes the device and access totality (cell phone or laptop, router). The issue is how confident we are that we are getting information from a known location.

Peter recommended we call this endpoint.

Abbie: endpoint identity.

Peter said we are looking to see what endpoint is.  That is a broad spread (SSL, cloud, or mutually authenticated SSL, device or cloud, how much we trust our cloud provider.) 

Abbie stated that trust-el needs to take the cloud into account. A user may have tokens from various providers (i.e. T-Mobile or Verizon). We should build that into the system. We need to trust the info coming to us about the endpoint.

Peter said so this is the 4th factor.

Abbie listed KBA, tokens (hardware, software, certificate), biometrics, and endpoint (geo) doing binding.

Shahrokh mentioned trusted execution, the environment and secure storage. If you leave the credential open to man-in-the-middle attacks, it defeats the whole purpose.

Peter said it is all about the context in which the method is implemented.

Abbie said this is critical to what we will do to evaluate the trust.

Peter stated that the three factors are analogous to the three dimensions. Context is analogous to time (4th dimension.)

Ed asked: what is the difference between knowledge-based authentication and passwords?

Peter commented on passwords and KBA. UN/PW is a credential. The questions and answers are transactional. Someone challenges you with a question and you answer with an attribute. So it is how it is used.

Brendan commented that knowledge can be verified rather than self asserted.

Cathy said passwords are dynamic.  KBA is typically more static. See SSA example.

Peter said the baseline is a credential.

Abbie said Hotmail doesn’t accept weak passwords any more.

Abbie said the credential is only as good as the weakest link (e.g. password reset via email.)

Abbie related how the bank doesn't allow bank employees to use the bank password outside of the bank.

Peter said there are only so many static defenses.

Ed asked whether we have covered challenge-response, or respond-with-transform.

Peter asked: is challenge-response KBA? What bucket is CAPTCHA in?

Shahrokh said that challenge-response can be identity of the device. Is it provisioned by a whitelisted, banded platform, for example?

Cathy commented on in-band vs. out-of-band.

Abbie said OTP is a token. An added dimension is the delivery mechanism or channel of delivery. Decoupling the channel adds value regardless of the strength of the factors.

Peter said delivery is a context.

Abbie asked whether we should have context as an endpoint identity.

Peter said no, context affects all of the buckets. It is one of the factors you use. It is one of the elements you use.

Cathy commented on one wrinkle. If there is a challenge response with what you know, sometimes you repeat this with an out of band response or with voice i.e. speaker verification and voice content.

Peter said context is a condition for each bucket. Implementation practice is a condition for each bucket, though that may be too broad.

Cathy said in 800-63, OTP is separate.

Peter commented it does a good job of describing the current crop of tokens, but it is not a good model for future tokens.

Cathy agreed.  Do we start with what they have?

Peter said yes, definitely a separate token.   UN/PW is a token, SAML is a token.

Abbie said what we need to do now is just start looking at what types of tokens we have and their bindings. Certificate binding is a context.

Lunch break.

This part of the meeting was open.

Cathy provided the following references:

·         ANSI X9.84:2003, “Biometric Information Management and Security for the Financial Services Industry.”

 

·         ISO 19092:2008, Financial services - Biometrics - Security framework

 

·         InterNational Committee for Information Technology Standards (INCITS) Technical Committee M1 (Biometrics), Study Report on Bio-metrics in e-Authentication, INCITS M1/07-0185rev, 30 March 2007, http://www.incits.org/tc_home/m1htm/m1070185rev.pdf

 

·         ISO/IEC 24761:2009, Information technology — Security techniques — Authentication context for biometrics (ACBio)

 

·         ISO/IEC 24745:2011, Information technology — Security techniques — Biometric information protection

 

Peter recommended we title the initial deliverable:  Survey of Methods of Trust Elevation.

We did a recap for those who just joined.   KBA, token, biometrics and endpoint identity – context goes across all of these.

We began to review the method use cases we have received.

Shaheen’s use case:

1.User wishes to subscribe online services of Bank-A
2.Bank-A requests the user to obtain a credential that satisfies its level of assurance for the service it offers. The user could obtain the credential from any commercial Identity Provider (IdP) permitted by Bank-A.
3.User chooses IdP-X as his/her Identity Provider and obtains a credential (IDFIN) for online financial transaction from IdP-X.
4.Bank-A confirms acceptability of IDFIN and User registers his/her IDFIN with Bank-A
5.User then wishes to subscribe a different type of service from Bank-B
6.Bank-B requests the user to obtain a credential that satisfies its level of assurance for the service it offers. The user could obtain the credential from any commercial Identity Provider (IdP) permitted by Bank-B. 
7.User already has IDFIN from IdP-X that would satisfy Bank-B level of assurance requirement.
8.Bank-B confirms acceptability of IDFIN and User registers his/her IDFIN with Bank-B
 

Shaheen said the use case is a third-party identity credential: the same credential used for different functions at different banks. This is the first testing ground for factors.
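A minimal sketch of the acceptability check in steps 4 and 8 of the use case (the bank and IdP names come from the use case; the permitted-IdP lists and LOA numbers are invented for illustration):

# Hypothetical per-bank policy: accept an IdP-issued credential only if the
# issuer is permitted and the credential's LOA meets the bank's requirement.
BANK_POLICIES = {
    "Bank-A": {"permitted_idps": {"IdP-X", "IdP-Y"}, "min_loa": 3},
    "Bank-B": {"permitted_idps": {"IdP-X"}, "min_loa": 2},
}

def credential_acceptable(bank, issuer, loa):
    policy = BANK_POLICIES[bank]
    return issuer in policy["permitted_idps"] and loa >= policy["min_loa"]

# The same IDFIN credential (issued by IdP-X, here assumed to be LOA 3) is
# registered with both banks.
idfin = {"issuer": "IdP-X", "loa": 3}
for bank in ("Bank-A", "Bank-B"):
    print(bank, credential_acceptable(bank, idfin["issuer"], idfin["loa"]))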

Is the credential in the form of a token?

Shaheen said to assume it is a token.  Assume one RP is retail banking and one is a commercial service.

Abbie described a software token within a trust framework (or frameworks). The token is to be validated by multiple RPs. This is token reuse. Entitlements will be different at each bank.

Dale said if someone comes in and they do not want to auth [directly to the RP], they auth to Verizon and Verizon sends an assertion to the RP that it is Dale and she is ok.

John commented there are two scenarios. If you have an RSA token that works with multiple RPs, you can get it and register it with multiple RPs. This is different from SSO, which is an identity assertion about a data subject.

Abbie commented that it could be an unverified token. 

John said that in Canada, Canadian banks make an assertion to the hub that you are the same and strongly authenticated, but then deal with you pseudonymously.  This is about reuse of primary authenticators, pseudonymous or otherwise.

Abbie asked are we putting requirements on the format of the token or what is the impact?

Dale said we need to know who the verifier is: the Bank or Identity provider. 

Sarbari said it is better to discuss the use case, not discuss how. She heard credential reuse [in the use case]. So in this use case there is the same assertion but it is allowed to do different things.

Abbie said it is a software-based token.  Need to verify token and understand what it asserts.

Shaheen said he would use a hardware token.

Abbie asked if the use case would change if it was a software token.

How do you do shared seed at two different banks?

John said this is theoretically possible using a primary authenticator that can register at multiple banks to use as primary means of authentication.

Abbie brought the conversation back to elevation. Starting with a strong authenticator is better than UN/PW.

Cathy said one of the big benefits of doing use cases is to bring out issues. This one has to do with identity proofing, and who is responsible for the ID proofing. Banks already have know-your-customer requirements. NIST 800-63 puts everything on the identity provider. You could have shared responsibility for this.

Sarbari commented that the backdrop is NIST 800-63 or broader: ISO/IEC 29115 or ITU-T x.1254.

She has a use case also.

NIST 800-63 uses primary authenticator language.

Shaheen said one could use the same credential at different levels of assurance.

Abbie said we need to separate the proofing (authentication) trust level from the policy of the provider or consumer. If we look back at the table of factors, it could be hardware or software.

LOA will vary based on token and initial seed and binding.

David has a customer retention use case.  Initially user may not be authenticated at all.

The challenge: user / customer conversion
 
The story: a business, typically a bank with an online app, wants to convert visitors to their websites to actual customers. Depending on the authentication level of the users, they can see more or less information. A user not logged in at all would see publicly available information. A user would have the choice to authenticate using a publicly available IdP such as Google via common standards such as in this case OpenID. The fact they are logged in gives them a better experience and grants them access to more content.
In a final step, a user could request to become a customer to create an account with the said business. Since they are already authenticated, some information could already be filled in.
 
How we did it:
I implemented the scenario with a colleague from Ping Identity. We used Ping Federate and the Axiomatics XACML Policy Server to achieve context-based access control (depending on the source of the authentication).
 
In the demo, the way attributes were collected and converted was via code we wrote - there is currently no standard there. There is no standard in XACML on how to take into account trust elevation (or augmented credentials)
 
Also, Google (for instance) doesn't release a lot of information because it doesn't trust the requestor (in this case the decision point or 'PDP'). The PDP would need to strengthen its trust relationship with the IdP in order to retrieve more attributes.

 

How much info you give to the end user depends on the level of auth.

Sarbari said there is a third level if you use a bank ID.

David said the XACML TC eventually needs to support trust elevation.

Trust-el is a different means to authentication. You could have a conditional policy: if IdP ABC and in home state, then yes; if Google in the UK, no; etc. All this is info that can be used in access control policies.

Relate this back to the table.

David said trust happens way before authorization. If a user logs into an account using a user name and password, providing some information is ok, but a more sensitive transfer is a no unless they use an OTP to strengthen it. For example, the risk for an overseas transaction is greater than for an internal transaction.
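A simplified stand-in for the kind of conditional policy David described (the demo used XACML via the Axiomatics policy server; this Python sketch only shows the shape of the rules, and the IdP names and actions are invented):

# Hypothetical context-based access rules: what a user may do depends on which
# IdP authenticated them and whether an OTP step-up has been completed.
def authorize(action, idp, in_home_state, otp_verified):
    if action == "view_public_pages":
        return True                                     # anonymous visitors allowed
    if action == "view_account_summary":
        # e.g. "if IdP ABC and in home state, then yes; if Google in the UK, no"
        return idp in {"idp_abc", "bank_internal"} and in_home_state
    if action == "transfer_funds":
        # sensitive transfer: the bank's own credential plus an OTP step-up
        return idp == "bank_internal" and otp_verified
    return False

print(authorize("view_account_summary", "google_openid", True, False))   # False
print(authorize("view_account_summary", "idp_abc", True, False))         # True
print(authorize("transfer_funds", "bank_internal", True, False))         # False until stepped up
print(authorize("transfer_funds", "bank_internal", False, True))         # True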

Sarbari commented that this was nice. It is the same RP. To get more and more services, you need higher and higher levels of assurance. So she sees three different credentials to do this.

Each time it is elevated.

Very nice.

Abbie asked is this one credential decoded or three credentials?

David said there were four separate ones in four separate assertions: one is nothing, the second is OpenID (Google or Yahoo), the third is internal LDAP, and the fourth requires bank two-factor.

Abbie asked how this relates to the chart.

Shaheen said he used a combination of two methods for any particular level.

1-      UN/PW

2-      UN/PW and token.

3-      Credential from a marketing business – trusted because it was accepted.

Sarbari commented that for the Feds, LOA-2 and LOA-3 have different requirements. Three is two-factor. She advocates staying at the function level.

John echoed that.  The utility of the use case is showing why you want to do trust-el.  Specific methods are interchangeable.

Abbie said this is precisely the purpose of this use case.

David said OpenID is KBA.

Abbie disagreed.  A UN/PW is a shared secret, not KBA.

Shaheen said context is a condition we will use for every credential.

Sarbari said we use classic factors and augment with context.

David said the decision engine will need context. 

Sarbari said that in context there is strength: the level of auth and other metadata are context.

*Action item to create dictionary.

Abbie said this was a statement of operation rather than definitional.

Next use case is from Thomas:

A user on a client computer seeks to gain access to resources located at 
Cloud Provider (eg. Saas, PaaS).  In addition to being authenticated by 
an Identity Provider (IdP), the client computer needs to be 
integrity-evaluated by the a trusted Integrity Measurement Service 
(IMS). The IMS is assumed to be a participant under the same Trust 
Framework.
 
As part of the trust level evaluation by the IdP, the IdP re-directs the 
client to the IMS service.  The client and the IMS service then execute 
the integrity measurement protocol (single round or multi-round), 
resulting in the IMS service establishing (assigning) a "trust score" 
for the client platform (hardware and software). The IMS service then 
returns the trust score to the IdP (eg. via back channel), in the form 
of a signed assertion.
 
The IdP then includes the client's trust score when the IdP computes the 
trust level (eg. LOA) assigned to the user on the client computer.
 
This approach allows the consumer of the LOA assertions/claims (eg. a 
service provider) to obtain a better picture about the human user (eg. 
her/his identity) as well as the different client platforms that she/he 
is connecting form (eg. PC computer, iPad, mobile phone, etc).
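
A hedged sketch of the final step of this use case, in which the IdP folds the IMS trust score into the assurance level it asserts (the score scale, the capping rule and the thresholds are invented here; the use case itself does not prescribe how the combination is computed):

# Hypothetical combination rule: the platform's integrity score caps the LOA
# that the IdP will assert for the user on that client computer.
def effective_loa(credential_loa, platform_trust_score):
    if platform_trust_score >= 0.8:
        cap = 4     # well-measured, healthy platform: no practical cap
    elif platform_trust_score >= 0.5:
        cap = 2     # partially measured platform
    else:
        cap = 1     # unknown or compromised platform
    return min(credential_loa, cap)

print(effective_loa(3, 0.9))   # 3: strong credential on a trusted machine
print(effective_loa(3, 0.4))   # 1: same credential from an untrusted machine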

 

John said so you want a context assertion about the user's environment to further evaluate trust, independent of the LOA of the credential. For example, on a compromised machine, you may not want to trust.

John said that Cisco has had this stuff for a long time. Given the state of the Internet, it may be time to move some of these [approaches] out into the general Internet.

Sarbari asked how the RP will know you are on a secured platform. Are there quality parameters?

Thomas said you can define some components on the platform: BIOS, antivirus version, etc. The schema needs to be delivered with the client platform state, the last time of the Symantec virus scan, etc.

John said there are a number of these. There is the platform as context vs. the platform as what you have.  So one scenario is a web cafe. If the PIV card is at a web cafe on a machine with spyware, you shouldn’t download classified information, even if it really is John.

So we need to trust the individual and trust the conduit.

We then revisited Abbie’s use case.

Date: 11/02/2011
Version: V1
Title: KBA use case
1. Background
Knowledge based authentication (KBA) is an authentication scheme that asks a user one or more secret questions in order to confirm the user's identity. This type of authentication is often used as a component in multifactor authentication (MFA). KBA is widely used in self-service password retrieval requests. Current KBA schemes use static information in order to help compose the secret questions for the users.
This use case assumes that the secret questions and their answers are collected by the identity provider at the identity enrolment stage.
2. Use Case explanation
Consider a user that is trying to log on to his/her bank account. The bank notices that the user is logging in from a new location (a new IP address). The bank decides that it needs to elevate the trust in the authentication step and asks the user to respond to a secret question (for example, what is your date of birth). The bank will either grant or deny access based on the user's response to the question. In some cases multiple questions could be asked.
This case illustrates a trust elevation case.
The case tries to protect access to account information.
This method is vulnerable to all kinds of social engineering attacks and is not secure.
Usability Issues:
o How many questions should be used?
o Should prepared questions be used at registration, or can users make their own questions?
3. Use Case Variation
The bank decides to improve on the above case by providing limited access at initial log on. For example static KBA can be used to logon to the account with the ability to just view the account balance. The bank will require additional authentication and trust elevation if the user would like to do a payment or money transfer.
The bank can provide the following options for elevating trust. The bank can use a risk analysis engine that can provide access to the user based on risk factors. For example, if the device is identified the user can be asked to
1. Type a new password for performing a payment or transfer
2. Different passwords can be used based on which transactions are to be used
3. Based on the risk of the transaction, the bank can use an out-of-band means to validate the customer. Examples include:
OTP through SMS or email
Phone call from a rep or through voice recognition software
4. Trust Elevation Analysis
Some questions arise about the use of KBA
1. Does KBA meet the new FFIEC Guidance?
2. What are the pitfalls of static KBA?
3. What are the alternatives?
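
A small sketch of the use case variation above, in which static KBA yields view-only access and a payment or transfer requires an additional out-of-band step-up (the operation names and the two-stage rule are illustrative only):

# Hypothetical mapping from completed checks to permitted operations.
def allowed_operations(kba_passed, oob_otp_passed):
    ops = set()
    if kba_passed:
        ops.add("view_balance")                   # limited access after static KBA
    if kba_passed and oob_otp_passed:
        ops.update({"make_payment", "transfer"})  # elevated access after OTP via SMS/phone
    return ops

print(allowed_operations(kba_passed=True, oob_otp_passed=False))  # {'view_balance'}
print(allowed_operations(kba_passed=True, oob_otp_passed=True))   # adds payments and transfers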

 

There was a comment about KBA plus context.

John asked if this includes the decline in effectiveness of KBA with use. The trust elevation is a function of how stupid the question is and how deep the question pool is. It could also be a transactional/smart/dynamic knowledge base.

John said in the use case we should point out that KBA can be overused and done badly. If not done correctly, it can hurt you.

Brendan made a comment about value judgments.

John remarked that KBA is scoped to sessions.  It can’t cross sessions. We need to scope the bounds of trust elevation.

It helps if it is more intelligent and dynamic. If you ask the same question multiple times, it doesn't help. Knowledge of static public information does not identify a public figure.

Sarbari's use case:

The next use case is for the VA. Imagine a veteran goes to the VA portal. When an active duty person becomes a veteran they get a UN/PW credential (and they turn in their common access card). They log in with DS Logon = shared secret and LOA-2. Then they have some access. Then they want to access health records; the portal says they need LOA-3 (hardware token). At that point there are a couple of possibilities, i.e. mobile phone OTP. If they had already registered the phone, it could be used to bump up a level for that session. Or a new secret or code could be sent to the phone and the user shows possession by typing the code into an online channel. Then they are at level three. But it is a session-based bump-up.
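A minimal sketch of the session-scoped bump-up just described (the session object, LOA numbers and OTP check are illustrative, not the VA's actual design): the elevation applies only to the current session, and a fresh login starts back at LOA 2.

# Hypothetical session-scoped elevation from LOA 2 to LOA 3 via a registered phone.
class Session:
    def __init__(self, loa):
        self.loa = loa

def bump_up(session, phone_registered, otp_correct):
    if phone_registered and otp_correct:
        session.loa = max(session.loa, 3)   # holds only for this session
    return session

s = Session(loa=2)                          # username/password login gives LOA 2
print(bump_up(s, phone_registered=True, otp_correct=True).loa)   # 3
print(Session(loa=2).loa)                   # a new session starts at 2 again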

Debbie said there are many items with DS Logon. They are identity proofed, even though they are only LOA-2 credentials.

Sarbari commented that she is a co-author of 800-63-1.

John asked what the frequency of password reset was for that portal, given that passwords need to be rotated. What percentage of users go through password reset? If the first step is a password reset, then we need to factor that into the context. Maybe it is better to use the device as the primary credential.

Sarbari said they want the process to be easy. UN/PW is socialized. NIST 800-63 talks about credential management at certain LOAs as part of the overall context for the credential.

If you perform ID proofing at registration but use a weak token, it is still a low LOA. The token is the weak link. We need to consider registration ID proofing, the type of token and how it is managed.

John commented on the password recovery issue. It is a big deal. If there is a random email account in part of the path, then it is only as secure as that. We need to consider the entire password lifecycle. That is part of the context.

Abbie agreed.

*Action item for Sarbari to send a write-up of the use case.

John’s use case:

John described using LOA-proofed, know-your-customer credentials from Canadian banks, with the banks asserting them pseudonymously. It starts with a log in to the agency and a re-direct to an identity hub that is an aggregator. That presents the user with a list of institutions. They log into their institution with SAML and provide their online banking credential (org credential). They come back to the hub and the agency performs additional KBA. If doing a sensitive transaction, the agency can send them back through the hub, where the person uses their contactless debit/credit card as an additional factor. So there are 3 things. A bank makes an initial assertion. The agency sets this up with additional attributes, from transaction-based KBA for example. So they are using government info and credit bureau info to attach identity to a pseudonymous credential.

The Canadian government has in the past used a system provided by Entrust, but given that everyone needed to have their credential reset each year, they are moving to credential service providers via a hub. Initially three Canadian banks are involved. So the bank sends an anonymous token back – this is the same person as last time. The goal is that all Canadian government sites will accept federated credentials via the hub. (There may be multiple hubs. The hub is not a bank.)

Sarbari asked why it is pseudonymous.

John said it is because the hubs don't want to take the liability of asserting to the government. At some point this may expand for the hub to also assert attributes.

Dan's use case: step-up auth (trust elevation)

We expect the user to come to the service with a lower-level credential. Then we expect they may need to elevate to LOA-3 for certain transactions. There may be technical and business elevations that are needed.

Dan explained this was just presented to the Kantara health WG.  Currently it specifies OpenID. We may use OpenID Connect. 

Abbie asked to walk through the example. So first of all, think online transactions. Start the transaction off with an OpenID. At some point, mid-transaction, there may be a desire to prove consent. At that point there would be a challenge. If the PIV was already bound to it, there would be a challenge to prove that. If the challenge is met, then they could move forward.

Abbie commented so you can choose from among approved providers.  Each RP can recognize the providers it wants.  This is justification for what we are doing.

John took a stab at stating the use case: the health club is a personal aggregator for health attributes and various credentials used for health services. For the health institution, it provides attributes for mapping to health records they may not have access to. It is also acting somewhat as an authorizing service. So: I need this LOA, can you use the appropriate stuff to elevate me for what I need? The goal for the majority of people is to get in simply, use a token like a Gmail account to start, and elevate it as needed to access the more sensitive stuff.

This would be done at a Policy enforcement point (PEP).

Deb talked about the need for personal info that isn’t necessarily LOA-3, but is definitely needed like blood type. She uses OpenID to access that type of info.

Dan asked what you should do when someone comes with a credential that is not LOA-3, just an insurance card.

John commented that the portal both provides multi-protocol auth, can provide step-up auth, and provides some storage of personal data values for some personal stuff, maybe blood glucose levels. It then has other, higher-LOA APIs that interface to the SAML/WS-Fed world of institutional health records. And it gives sufficient step-ups to give individuals access to those records.

It is a multi-directional hub that empowers the health care users.

*Action item for Dazza to send the written use case.

David asked if he'd looked at what is done in Sweden. Are you tackling fine-grained access control?

Dan commented that he had been in contact with Sweden informally via IDC at IIW. One part has XACML; this is under-specified now. It may take identity from the PIV system, and as a discoverable service it can be leveraged by an XACML service.

Abbie said that we need to establish a dialogue with the Kantara Attribute Management discussion group and asked if there were any objections.

There were no objections. 

*Action item to establish a dialogue with the Kantara Attribute Management discussion group.

Abbie also wants to establish a liaison with Q10, asking for x.1254.

There were no objections

Abbie made a motion

Brendan seconded it. 

There were no objections.

*An action item was taken to liaise with ITU-T for x.1254.

Abbie made a motion to liaise with the Kantara Working Group.

Dazza seconded the motion.  There were no objections.

*Action item for Dazza to liaise with the Kantara working group.

Abbie asked if we should keep the call next week.

He moved to keep it.  There were no objections.  We will have a call on November 17.

Abbie asked Colin if the current meeting time was tolerable.

Colin replied that it was ok and asked to revisit it at Summer time.

*Action item to revisit meeting time with Colin at Spring time change.

Abbie agreed and asked Colin to please raise this as an issue if it becomes inconvenient.

Peter asked for a summary of action items.

Peter said that the OASIS IDtrust member section is planning its 2012 budget.

* An action item was taken to make a tentative budget request for three categories of activities:

1.       Phase 2 deliverable

2.       Two face-to-face meetings

3.       Travel funding for independents.

*An action item was noted to get Trust-el on the agenda for the NIST IDtrust event in March.

*The next Face-2-Face meeting is most likely in March in conjunction with the NIST IDTrust workshop.  Hopefully the document will be mature enough to discuss at that event. An action item was noted to coordinate this meeting.

Dale moved to Adjourn and Peter seconded it.

There were no objections.  The meeting was adjourned.

>>>>>>>>>>>>>>>>>>>>>>>> 

Text from chat room
 
Mary Ruddy: Hello
Mary Ruddy: We will be starting in a few minutes at ~9:00
abbie: working on the bridge
Mary Ruddy: bridge is open
abbie: will fixing the bridge...
abbie: Passcode: 637 218 8139
1 866 222 6652
Int'l Toll: 1-980-939-6928
Dial-In Numbers - (Please see Conference Shortcuts Below)
Int'l Toll: 1-980-939-6928 Local
Mary Ruddy: we have started
abbie: test
Shaheen Abdul Jabbar: we can see you
Mary Ruddy: Trust elevation is increasing the strength of trust by adding factors from the same or different categories of methods that dont have the same vulnerabilities.  There are four categories of methods: who you are, what you know, what you have and the context. Elevation can be within the classic four NIST or ITU-T levels of assurance or across levels of assurance.
Thomas Hardjono (MIT): Hello all.  I will be joining the call on and off today (due to meetings)
Mary Ruddy: Welcome Thomas
Thomas Hardjono (MIT): I'm not sure what criteria this falls under, but I would like to introduce the notion of "what is your state" (ie. machine or platform state).
Mary Ruddy: Trust elevation is increasing the strength of trust by adding factors from the same or different categories of methods that dont have the same vulnerabilities. There are four categories of methods: who you are, what you know, what you have and the context. Elevation can be within the classic four NIST or ITU-T levels of assurance or across levels of assurance.
Context includes location, time, party, platform state, prior relationship and source.
Thomas Hardjono (MIT): Thanks Mary http://webconf.soaphub.org/conf/images/smile.gif
Mary Ruddy: Trust elevation is increasing the strength of trust by adding factors from the same or different categories of methods that dont have the same vulnerabilities. There are four categories of methods: who you are, what you know, what you have and the context. Context includes location, time, party, prior relationship and source. Elevation can be within the classic four NIST and ISO/ITU-T levels of assurance or across levels of assurance.
Thomas Hardjono (MIT): Could we use the term "technical trust" instead of just "trust"
Mary Ruddy: ok
Mary Ruddy: more discussion on that last comment
Mary Ruddy: Trust elevation is increasing the strength of trust by adding factors from the same or different categories of methods that dont have the same vulnerabilities. There are four categories of methods: who you are, what you know, what you have and the context. Context includes location, time, party, prior relationship, social relationship and source. Elevation can be within the classic four NIST and ISO/ITU-T levels of assurance or across levels of assurance.
Mary Ruddy: added social relationship
Thomas Hardjono (MIT): So perhaps we could capture business trust as policy.
Thomas Hardjono (MIT): Is "context" too wide or too fuzzy ?
Mary Ruddy: policy is not a factor is how you compute the trust.
Thomas Hardjono (MIT): [Folks, I need to drop off for about an hour or so. Apologies.]
Mary Ruddy: are going to break for 15 minutes
Mary Ruddy: will leave the bridge open
Mary Ruddy: we are resuming
Carl Mattocks: rejoined the meeting
anonymous1 morphed into Shahrokh-Intel
Mary Ruddy: we are resuming at
Mary Ruddy: 1
Mary Ruddy: We are resuming at 1:15
Shaheen Abdul Jabbar: are we back?
Mary Ruddy: yes
abbie: we are warming up...
Mary Ruddy: we are going to start in another 5 minutes
Mary Ruddy: We are really resuming now
Mary Ruddy: Shaheen's use case
Mary Ruddy: Here is a use case for our discussion -
 
1.User wishes to subscribe online services of Bank-A
2.Bank-A requests the user to obtain a credential that satisfies its level of assurance for the service it offers. The user could obtain the credential from any commercial Identity Provider (IdP) permitted by Bank-A.
3.User chooses IdP-X as his/her Identity Provider and obtains a credential (IDFIN) for online financial transaction from IdP-X.
4.Bank-A confirms acceptability of IDFIN and User registers his/her IDFIN with Bank-A
5.User then wishes to subscribe a different type of service from Bank-B
6.Bank-B requests the user to obtain a credential that satisfies its level of assurance for the service it offers. The user could obtain the credential from any commercial Identity Provider (IdP) permitted by Bank-B. 
7.User already has IDFIN from IdP-X that would satisfy Bank-B level of assurance requirement.
8.Bank-B confirms acceptability of IDFIN and User registers his/her IDFIN with Bank-B
Thomas Hardjono (MIT): This sounds like a good use-case. Especially if Bank-A and Bank-B already have some kind of business relationship or work under some umbrella org.
Mary Ruddy: next use case
Mary Ruddy: Here is a skeleton of a use case I demoed at the European Identity Conference last May 2011.
 
The challenge: user / customer conversion
 
The story: a business, typically a bank with an online app, wants to convert visitors to their websites to actual customers. Depending on the authentication level of the users, they can see more or less information. A user not logged in at all would see publicly available information. A user would have the choice to authenticate using a publicly available IdP such as Google via common standards such as in this case OpenID. The fact they are logged in gives them a better experience and grants them access to more content.
In a final step, a user could request to become a customer to create an account with the said business. Since they are already authenticated, some information could already be filled in.
 
How we did it:
I implemented the scenario with a colleague from Ping Identity. We used Ping Federate and the Axiomatics XACML Policy Server to achieve context-based access control (depending on the source of the authentication).
 
In the demo, the way attributes were collected and converted was via code we wrote - there is currently no standard there. There is no standard in XACML on how to take into account trust elevation (or augmented credentials)
 
Also, Google (for instance) doesn't release a lot of information because it doesn't trust the requestor (in this case the decision point or 'PDP'). The PDP would need to strengthen its trust relationship with the IdP in order to retrieve more attributes.
Mary Ruddy: next use case
Mary Ruddy: Thomas
A user on a client computer seeks to gain access to resources located at 
Cloud Provider (eg. Saas, PaaS).  In addition to being authenticated by 
an Identity Provider (IdP), the client computer needs to be 
integrity-evaluated by the a trusted Integrity Measurement Service 
(IMS). The IMS is assumed to be a participant under the same Trust 
Framework.
 
As part of the trust level evaluation by the IdP, the IdP re-directs the 
client to the IMS service.  The client and the IMS service then execute 
the integrity measurement protocol (single round or multi-round), 
resulting in the IMS service establishing (assigning) a "trust score" 
for the client platform (hardware and software). The IMS service then 
returns the trust score to the IdP (eg. via back channel), in the form 
of a signed assertion.
 
The IdP then includes the client's trust score when the IdP computes the 
trust level (eg. LOA) assigned to the user on the client computer.
 
This approach allows the consumer of the LOA assertions/claims (eg. a 
service provider) to obtain a better picture about the human user (eg. 
her/his identity) as well as the different client platforms that she/he 
is connecting form (eg. PC computer, iPad, mobile phone, etc).
Thomas Hardjono (MIT): Summary: Trust Elevation Based on Integrity Measurements
 
A user on a client computer seeks to gain access to resources located at 
Cloud Provider (eg. Saas, PaaS).  In addition to being authenticated by 
an Identity Provider (IdP), the client computer needs to be 
integrity-evaluated by the a trusted Integrity Measurement Service 
(IMS). The IMS is assumed to be a participant under the same Trust 
Framework.
 
As part of the trust level evaluation by the IdP, the IdP re-directs the 
client to the IMS service.  The client and the IMS service then execute 
the integrity measurement protocol (single round or multi-round), 
resulting in the IMS service establishing (assigning) a "trust score" 
for the client platform (hardware and software). The IMS service then 
returns the trust score to the IdP (eg. via back channel), in the form 
of a signed assertion.
 
The IdP then includes the client's trust score when the IdP computes the 
trust level (eg. LOA) assigned to the user on the client computer.
 
This approach allows the consumer of the LOA assertions/claims (eg. a 
service provider) to obtain a better picture about the human user (eg. 
her/his identity) as well as the different client platforms that she/he 
is connecting form (eg. PC computer, iPad, mobile phone, etc).
Thomas Hardjono (MIT): http://www.trustedcomputinggroup.org/resources/infrastructure_work_group_core_integrity_schema_specification_version_101/
Mary Ruddy: next use case
Mary Ruddy: Thomas
A user on a client computer seeks to gain access to resources located at 
Cloud Provider (eg. Saas, PaaS).  In addition to being authenticated by 
an Identity Provider (IdP), the client computer needs to be 
integrity-evaluated by the a trusted Integrity Measurement Service 
(IMS). The IMS is assumed to be a participant under the same Trust 
Framework.
 
As part of the trust level evaluation by the IdP, the IdP re-directs the 
client to the IMS service.  The client and the IMS service then execute 
the integrity measurement protocol (single round or multi-round), 
resulting in the IMS service establishing (assigning) a "trust score" 
for the client platform (hardware and software). The IMS service then 
returns the trust score to the IdP (eg. via back channel), in the form 
of a signed assertion.
 
The IdP then includes the client's trust score when the IdP computes the 
trust level (eg. LOA) assigned to the user on the client computer.
 
This approach allows the consumer of the LOA assertions/claims (eg. a 
service provider) to obtain a better picture about the human user (eg. 
her/his identity) as well as the different client platforms that she/he 
is connecting form (eg. PC computer, iPad, mobile phone, etc).
Mary Ruddy: the real next use case
Mary Ruddy: Submitter: Abbie Barbir
Date: 11/02/2011
Version: V1
Title: KBA use case
1. Background
Knowledge based authentication (KBA) is an authentication scheme that asks a user one or more secret question in order to confirm the user identity. This type of authentication is often used  as a component in multifactor authentication (MFA). KBA is widely used in self-service password retrieval requests. Current KBA schemes use static information in order to help compose the secret question for the users.
This use case assumes that the secret questions and their answers are collected by the identity provider at the identity enrolment stage.
2. Use Case explanation
Consider a user that is trying to log on to his/her bank account. The bank notice that the user is logging in from a new location (a new IP address). The bank decides that it needs to elevate the trust in the authentication step and asks the user to respond to a secret questions (for example, what is you date of birth). The bank will either grant or deny access based on the user response to the question. In some cases multiple questions could be asked.
This case illustrate a trust elevation case
Case tries to protect access to account information
This method is vulnerable to all kind of social engineering attacks and is not secure.
Usability Issues:
oHow many questions should be used? 
oShould prepared questions be used at registration, or can make their own questions?
3. Use Case Variation
The bank decides to improve on the above case by providing limited access at initial log on. For example static KBA can be used to logon to the account with the ability to just view the account balance. The bank will require additional authentication and trust elevation if the user would like to do a payment or money transfer.
The bank can provide the following options for elevating trust. The bank can use a risk analysis engine that can provide access to the user based on risk factors. For example, if the device is identified the user can be asked to
1. Type a new password for performing a payment or transfer
2. Different passwords can be used based on which transactions are to be used
3. Based on the risk of the transaction, the bank can use an out of band mean to validate the customer. Examples include:
OTP through 
SMS, email 
Phone call from a rep or through a voice recognition sofwtare
4. Trust Elevation Analysis
Some questions arise about the use of KBA
1. Does KBA meats new FFIEC Guidance?
2. What are the pitfalls of static KBA
3. What are the alternaties
Mary Ruddy: we have quorum
anonymous morphed into Colin - NZ Gov
abbie: colin are you on the phone also
Mary Ruddy: please join on the phone if you can

 

 


