
Jan 4, 2013

Step-up authentication as-a-service for SURFnet

Two-factor authentication used to be the domain of secret services and the military. The enterprise and consumer e-Banking and e-Government domains have since embraced two-factor (or: step-up) authentication. More recently social network sites such as Facebook and Google have started offering two-factor to protect their (free) services. Federations of higher education and research (operated by the NRENs) are still largely basing their authentication on username/password.

A parallel development is that service providers in federations for higher education and research are starting to offer services that deal with highly sensitive information; for instance privacy-sensitive administrative, research, or medical data. This is both a consequence of the success of federations and of the "move to the cloud", in which traditional in-house applications such as accounting systems are increasingly outsourced to cloud providers. Because of the sensitive nature of the data in such systems, stronger forms of authentication are necessary.

NRENs such as SURFnet have noticed these trends, and the discussion of how to best approach two-factor within a federated setting is now in full progress. An Identity Provider (IdP) within a federation is ultimately responsible for providing the identity of its users. This includes authentication, and the IdP can of course make authentication as strong as it wishes. The case for two-factor in a true federation is, however, significantly more complex than rolling out two-factor in a situation where Identity Provider and Service Provider coincide (such as in e-Banking or the enterprise), because information about the Level of Assurance is shared with and interpreted by Service Providers in the federation. Introducing multi-factor authentication within a federation is really only sensible if the registration procedures (enrollment of an authentication token) also warrant a correspondingly higher level of assurance.

Novay, in close collaboration with SURFnet, has made an initial design for a service to assist Identity Providers in the introduction of two-factor authentication solutions that can be used across the SURFconext federation. The report describing the design is available for download from the SURFnet website. The report describes both the technical (architecture, standards) and the procedural (registration, logging, de-registration) challenges.

Architecture
The two main architectural challenges to focus on are:

  • the best location for a multi-factor-authentication service, such that it can support multiple Identity Providers;
  • which standards (and what choices within those standards) should be used for uniformly signaling the level of assurance from Identity Providers to Service Providers.

As for the location of the service: in a SAML-based hub-and-spoke federation (such as SURFconext) it makes sense to base (the initial version of) the service as a transparent proxy on the Service Provider-bound exit of the hub (as shown above). This separates the service from the hub. It also means that Identity Providers and Service Providers can remain unchanged, except for Service Providers that need to deal with higher levels of authentication. The paper builds a case for this simple architecture.

As for the standards to use: there are many levels-of-assurance frameworks. To name just a few: NIST SP 800-63 for the US and STORK for the EU. The best option, at this moment, would be to standardize on the upcoming ISO/IEC 29115 standard, which will unify some of these frameworks. SAML 2.0 has had support for signaling details about the authentication process (and related processes) since its inception, in the form of the so-called Authentication Context. This concept, however, leaves a lot of implementation freedom (and therefore interpretation freedom) to Identity Providers and Service Providers. Attempts to merge the Authentication Context concept with ISO 29115-style levels of assurance are relatively recent, and also appear in other authentication protocols such as OpenID Connect. The paper gives recommendations for how to best apply these standards.
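To make the SAML mechanism concrete: a Service Provider signals the minimum level of assurance it requires by putting a RequestedAuthnContext element in its AuthnRequest. The sketch below builds such an element in Python; the LoA URI is a hypothetical example (federations define their own, or adopt ISO 29115-derived ones), not an official SURFconext value.

```python
# Sketch: how a Service Provider could signal a required level of
# assurance in a SAML 2.0 AuthnRequest via RequestedAuthnContext.
# The LoA URI "urn:example:loa:2" is a hypothetical placeholder.
import xml.etree.ElementTree as ET

SAMLP = "urn:oasis:names:tc:SAML:2.0:protocol"
SAML = "urn:oasis:names:tc:SAML:2.0:assertion"

def requested_authn_context(loa_uri, comparison="minimum"):
    """Build a RequestedAuthnContext asking for at least loa_uri."""
    rac = ET.Element(ET.QName(SAMLP, "RequestedAuthnContext"),
                     {"Comparison": comparison})
    ref = ET.SubElement(rac, ET.QName(SAML, "AuthnContextClassRef"))
    ref.text = loa_uri
    return rac

elem = requested_authn_context("urn:example:loa:2")
xml = ET.tostring(elem, encoding="unicode")
print(xml)
```

With Comparison="minimum", the Identity Provider may authenticate at the requested level or any stronger one, which matches the step-up use case: ordinary logins stay password-based, and only sensitive Service Providers trigger the second factor.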

Registration process
The level of assurance of an authentication token is not only determined by characteristics of the token itself, but also by the process by which the token is bound to an individual user by the Identity Provider. The paper recommends appointing a Registration Authority within the institute of higher education or research and making that person responsible for binding authentication tokens to users (staff, students) of the institute. The paper gives precise guidelines for setting up a Registration Authority. The most important recommendation is that individual users should visit the Registration Authority in person for face-to-face binding to the authentication token. The user should bring the token and a valid passport or identity card. The Registration Authority will check that these match with the user and with the attributes as issued by the Identity Provider. The Registration Authority also oversees an authentication attempt (with the second factor only) to make sure the user actually controls the token.
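The checks the Registration Authority performs can be summarized as a simple conjunction: the binding only happens if every check passes. A minimal sketch, with illustrative data structures and names (the real service would of course integrate with the IdP and the token back-end):

```python
# Sketch of the face-to-face registration checks described above.
# All field names and the example data are illustrative.

def register_token(user, id_document, token, idp_attributes, otp_ok):
    """Return True if the Registration Authority may bind token to user.

    user           -- identity claimed by the person at the desk
    id_document    -- passport/identity card presented in person
    token          -- the second-factor token the user brought
    idp_attributes -- attributes the Identity Provider issued for user
    otp_ok         -- whether the supervised authentication attempt
                      (second factor only) succeeded
    """
    checks = [
        id_document["valid"],                    # document not expired
        id_document["name"] == user["name"],     # document matches person
        idp_attributes["name"] == user["name"],  # IdP attributes match
        otp_ok,                                  # user controls the token
    ]
    return all(checks)

granted = register_token(
    {"name": "A. Student"},
    {"valid": True, "name": "A. Student"},
    {"serial": "TKN-001"},
    {"name": "A. Student"},
    otp_ok=True,
)
print(granted)
```

The point of the sketch is that a failure of any single check (expired document, attribute mismatch, failed supervised login) must block the binding; the level of assurance is only as strong as the weakest registration step.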

The registration process is supported by an online service hosted by the federation operator. The service contains both a self-service user interface for end-users (so that most of the administrative process can be dealt with before the user actually visits the Registration Authority) as well as a user interface for the Registration Authority. The paper shows mock-ups for both user interfaces.

It is highly likely that the service proposed in the paper has broader applications beyond the boundaries of SURFconext. The architecture was described with portability in mind, so that the service can be re-used in other federations. The paper makes some concrete choices (to make it relatively easy to actually start building such a service), but these choices are explicitly documented so that other federations can revisit them.

Acknowledgements
The authors would like to acknowledge Ruud Kosman of Novay for designing the mock-ups and other colleagues from SURFnet and Novay for reviewing early drafts of the paper.

Jul 13, 2011

Digipass Nano

I recently had an opportunity (thanks SURFnet, and VASCO) to get some hands-on experience with a novel class of authentication tokens. In a project for SURFnet, my colleague Maarten Wegdam and I looked at so-called SIM augmented authentication tokens, and the VASCO Digipass Nano in particular. The results of our analysis, in the form of a more detailed report, are available from the SURFnet website.

About the technology: A SIM augmented solution sits between the SIM and the handset (the ME) and consists of a very thin chip (see the image) in a sticker. It basically relays all traffic, consisting of so-called ISO 7816-4 APDUs, from ME to SIM and back, while intercepting certain APDUs and injecting certain other APDUs. The user can interact with this benign man-in-the-middle through the SIM application toolkit (GSM 11.14, see also my earlier post on Mobile PKI), which is implemented in any GSM handset. VASCO's Digipass Nano uses this trick to implement an event-based One Time Password token that is accessed by navigating the SIM menu in the handset, yet is fully secure (if GSM 11.14 is implemented securely) from snooping malware.
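Event-based one-time-password tokens like this typically compute HOTP values as specified in RFC 4226: an HMAC-SHA-1 over a shared secret and a counter that increments on every use, dynamically truncated to a short decimal code. A minimal implementation (verifiable against the RFC's own test vectors):

```python
# RFC 4226 HOTP: the algorithm behind event-based OTP tokens.
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Compute an RFC 4226 HOTP value for the given counter."""
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Test vector from RFC 4226, Appendix D (secret "12345678901234567890"):
print(hotp(b"12345678901234567890", 0))  # 755224
```

In a token like the Digipass Nano the secret never leaves the secure element; only the 6-digit result is shown via the SIM toolkit menu, which is why malware on the handset cannot extract anything useful.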

The man-in-the-middle characterization of SIM augmented solutions sounds scary, if you think about it, especially with respect to the trust that the ME (through GSM 11.14) puts in the SIM. On the other hand:
  • The (security, usability, and business model) advantages of secure storage of credentials may outweigh the (security, usability, and business model) disadvantages of asking the user to place a hardware device between SIM and ME. (I.e., the security should not be analyzed in isolation, and there are both security advantages and disadvantages.)
  • An attack which asks the user to place a (not-to-be-trusted) SIM augmented solution in their handset doesn't scale (and there is so much more low-hanging fruit for attackers, which scales much better). For a full threat analysis, see the report.
  • The average user isn't too concerned about what the SIM augmented solution can do, as suggested by a small-scale user test we did as part of our research.
  • SIM augmentation based on GSM 11.14 allows, in principle, multiple secure elements (or secure cores, in Du Castel speak) within a single handset. Multiple secure elements, representing multiple stakeholders, break the Mobile Network Operator dominated model for (very secure) credential storage. We also did a brief business model analysis as part of the report.
Whether we will see SIM augmented solutions in the short term remains to be seen. But it's certainly interesting technology to analyze.

Mar 31, 2009

Security in the workspace - Part 3


It seems that we will have to learn to live and work in a de-perimeterized world. Acceptance of the problem is often the first step towards a solution. So, what alternatives to perimeter defense are there? And what is the impact of these alternatives on the future workspace and vice versa? Below are some thoughts. I hesitate to call these conclusions. Please consider these as starting points for a discussion.
  • Defense in depth is the complete opposite of perimeter defense (when considering the location where controls are implemented). This security principle advises applying multiple layers of security controls, so that if one layer fails, other layers take over.
    • Unfortunately, complete defense in depth is increasingly expensive, as it is difficult to maintain,
    • and because too many layers of security get in the way. (Is there a usability vs. security trade-off? I'm not sure. But usability is probably not helped by adding multiple layers of security.)

  • Most experts see a shift from perimeter defense (and other location based defenses) to data oriented security. (Perhaps that should be information oriented security?)
    • Because of the multiple contexts in which employees now process data, this requires some sort of watermarking of sensitive and valuable data. If, for example, lost information can be traced back to the employees responsible for that information, then those employees can be held accountable for the loss. But wasn't DRM declared dead?
    • Moreover, data oriented security makes valuation of information necessary: relative sensitivity and value to the organization should be made explicit. Valuation of assets should be done anyway (as part of information risk management), but that doesn't mean that it is easy, cheap or common practice today!
    • Related to the above point: information should be stored and processed with a clear goal in mind (for reasons of Governance, Regulations, Compliance). This is at least as difficult as valuation.

  • Accountability (the other A-word) may be an alternative to access control. Access control is somewhat problematic in the absence of a perimeter after all. Access control is expensive in the future workspace since employees join and leave the organization on a more regular basis (access credentials management is costly). Accountability certainly seems to be more compatible with the greater responsibility given to employees as part of the future workspace trends.

  • Identity management is necessary, as accountability usually means a great deal of logging (of actions of employees). Logging obviously requires the capability to distinguish between employees (try holding individuals accountable for their actions when you can't tell them apart). However, since we left the perimeter behind us, we can't rely on the classical identity management process which involves provisioning, authentication, and authorization.
    • The provisioning problem could be overcome if we could use an established identity provider's infrastructure which extends beyond the bounds of the organization. Such an existing identity provider (I'm thinking of national governments) already has the infrastructure for issuing authentication means to individuals in place. If such a global identity provider is not (yet) possible, federated identity management and user-centric identity management may be alternatives (in the meantime).
    • Authentication has to be done in a decentralized fashion (in the absence of a perimeter with checkpoints), and preferably as often as possible yet as unobtrusively as possible. Perhaps context information could help here?
    • Authorization, on the other hand, is better done centrally, so as to achieve consistent rules which are easy to manage. Well-defined roles could be useful here.
Other points? Leave a comment!

Feb 2, 2009

The ePassport helps fight online identity fraud


This translation of an article in the Dutch newspaper Het Financieele Dagblad, 2 February 2009, p. 7 was created by Google Translate (with only slight modifications by hand). The Dutch version also appears elsewhere on the interwebs.

A new tool in the fight against identity fraud has arrived. The Dutch ePassport with chip can be used as additional identification technology for Internet transactions. Without loss of privacy.

This is evident from NLnet Foundation-funded research that Martijn Oostdijk and Dirk-Jan van Dijk of the Telematica Institute have done. Using a simple card reader, the chip can be read on any PC. The standards to do this are public. The researchers developed software for an identity provider - a trusted party that creates digital identities and provides these to other parties - which they run on a server at the institute. Furthermore, the duo developed software that must be installed on the client's PC.

With passport in hand, the user may enter a web shop. The shop might need to know whether the buyer is older than 18. The identity provider filters out only the information from the passport that is required for the purchase and forwards that information to the shop. The buyer remains in charge of his own data and can terminate the transaction at any moment. The ePassport is intended as additional evidence: often, a user needs various account names and passwords to use various online services, but such credentials, like credit cards, may fall into the wrong hands. Of course, a passport can also be stolen. "This is why the passport by itself should not be used as identification. But in combination with other authentication means it could stop simple forms of identity theft," said Michiel Leenaars, strategy director of Stichting NLnet. According to Martijn Oostdijk, the system is suitable for all forms of identification. "It's not just for online purchases. The system might play a role in safe surfing by children or patient access to electronic health records, etc."
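The privacy-preserving step described here is attribute filtering: the identity provider derives only the claim the shop asked for (here, "older than 18") and releases that, not the underlying passport data. A minimal sketch, with invented field names and example data (and ignoring the 29 February edge case):

```python
# Sketch of the attribute filtering performed by the identity provider:
# the shop asks only whether the buyer is over 18, so only that derived
# yes/no claim is released, never the full passport contents.
# All names and example values are illustrative.
from datetime import date

def is_over_18(birth_date: date, today: date) -> bool:
    """Derive the 'older than 18' claim from the passport birth date."""
    eighteenth = date(birth_date.year + 18, birth_date.month, birth_date.day)
    return today >= eighteenth

def filter_attributes(passport, requested):
    """Release only the derived claims the relying party asked for."""
    derived = {
        "over18": is_over_18(passport["birth_date"], date(2009, 2, 2)),
    }
    return {k: v for k, v in derived.items() if k in requested}

passport = {"name": "J. Jansen", "birth_date": date(1985, 6, 1)}
print(filter_attributes(passport, {"over18"}))
```

The shop never sees the name or birth date, only the boolean answer; a real implementation would of course read and verify the chip's (digitally signed) data groups before deriving anything.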

Identity fraud is costing society billions of euros per year. In the U.S., the damage last year was 31 billion euro. At present, slightly less than half of all Dutch citizens have a passport or identity card with chip. In 2011, that will be the case for all citizens.

NLnet Foundation is committed to an open information society and financially supports projects that contribute to it. Software developed within these projects is published as "open source" and is freely available to parties who wish to develop it further. The Telematica Institute combines innovation power and knowledge of IT to achieve breakthroughs in how we live.