Thursday, 16 November 2017

Using IDM and DS to synchronise hashed passwords


In this post I will describe a technique for synchronising a hashed password from ForgeRock IDM to DS.
Out of the box, IDM has a Managed User object that encrypts the password using symmetric (reversible) encryption.  One reason for this is that it is sometimes necessary to pass the password, in clear text, to a destination directory so that it can perform its own hashing before storing it.  Therefore the out of the box synchronisation model for IDM is to take the encrypted password from its own store, decrypt it, and pass it in clear text (typically over a secure channel!) to DS, which then hashes and stores it.
You can see this in the samples for IDM.

However, there are times when storing an encrypted, rather than hashed, value for a password is not acceptable.  IDM includes the capability to hash properties (such as passwords), not just encrypt them.  In that scenario, given that password hashes are one-way, it's not possible to decrypt the password before synchronising it with other systems such as DS.

Fortunately, DS offers the capability of accepting pre-hashed passwords so IDM is able to pass the hash to DS for storage.  DS obviously needs to know this value is a hash, otherwise it will try to hash the hash!

So, what are the steps required?

  1. Ensure that DS is configured to accept hashed passwords.
  2. Ensure the IDM data model uses 'Hashing' for the password property.
  3. Ensure the IDM mapping is set up correctly.

Ensure DS is configured to accept hashed passwords

This topic is covered excellently by Mark Craig in this article here:

I'm using ForgeRock DS v5.0 here, but Mark references the old name for DS (OpenDJ) because this capability has been around for a while.  The key thing to note about the steps in the article is that the allow-pre-encoded-passwords advanced password policy property must be set for the appropriate password policy.  I'm only going to be dealing with one password policy - the default one - so Mark's article covers everything I need.

(I will be using a Salted SHA-512 algorithm so if you want to follow all the steps, including testing out the change of a user's password, then specify {SSHA512} in the userPassword value, rather than {SSHA}.  This test isn't necessary for the later steps in this article, but may help you understand what's going on).
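To make the {SSHA512} scheme concrete, here is a hedged Python sketch (not part of the post's toolchain) of how an LDAP-style salted SHA-512 userPassword value is constructed and verified: the scheme name, then base64 of digest-plus-salt. The 8-byte salt length is an illustrative assumption; the length DS actually uses may differ.

```python
import base64
import hashlib
import os

def encode_ssha512(password: str, salt: bytes = None) -> str:
    """Build an LDAP-style value: '{SSHA512}' + base64(sha512(password + salt) + salt)."""
    if salt is None:
        salt = os.urandom(8)  # salt length is illustrative; DS may use a different size
    digest = hashlib.sha512(password.encode("utf-8") + salt).digest()
    return "{SSHA512}" + base64.b64encode(digest + salt).decode("ascii")

def verify_ssha512(password: str, stored: str) -> bool:
    """Check a clear-text password against a stored {SSHA512} value."""
    raw = base64.b64decode(stored[len("{SSHA512}"):])
    digest, salt = raw[:64], raw[64:]  # SHA-512 digests are 64 bytes
    return hashlib.sha512(password.encode("utf-8") + salt).digest() == digest
```

Because the salt is stored alongside the digest, DS can re-derive the digest from an incoming clear-text bind and compare, without ever storing a reversible value.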

Ensure IDM uses hashing

Like everything in IDM, you can modify configuration by changing the various .json config files, or via the UI (which updates the JSON config files for you!).
I'll use IDM v5.0 here and show the UI.

By default, the Managed User object includes a password property that is defined as 'Encrypted':
We need to change this to be Hashed:

I'm using the SHA-512 algorithm here (which is, in fact, a salted SHA-512 algorithm).

Note that making this change does not update all the user passwords that exist.  It will only take effect when a new value is saved to the property.

Now the value of a password, when it is saved, is a string representation of a complex JSON object (just like it is when encrypted) and will look something like:

Ensure the IDM mapping is set up correctly

Now we need to configure the mapping.
As you may have noted in the 1st step, DS is told that the password is pre-hashed by the presence of {SSHA512} at the beginning of the password hash value.  Therefore we need a transformation script that takes the algorithm and hash value from IDM and concatenates them in the form DS expects.
The script is fairly simple, but does need some logic to convert the IDM algorithm representation (SHA-512) into the DS representation ({SSHA512}).
This is the transformation script (in Groovy) I used, which can of course be extended for other algorithms:
String strHash;
if (source.$crypto.value.algorithm == "SHA-512") {
  // Prepend the DS scheme name to IDM's stored hash data ('data' field assumed)
  strHash = "{SSHA512}" + source.$crypto.value.data;
}
strHash;

This script replaces the default IDM script that does the decryption of the password.
(You might want to extend the script to cope with both hashed and encrypted password values if you already have data.  Look at functions such as openidm.isHashed and openidm.isEncrypted in the IDM Integrator's Guide.)

Now, when a password is changed, it is stored in hashed form in IDM.  The mapping is then triggered to synchronise to DS, applying the transformation script that passes on the pre-hashed password value.

Now there is no need to store passwords in reversible encryption!

Wednesday, 21 June 2017

ForgeRock Self-Service Custom Stage


A while ago I blogged an article describing how to add custom stages to the ForgeRock IDM self-service config.  At the time I used the sample custom stage available from the ForgeRock Commons Self-Service code base.  I left it as a task for the reader to build their own stage!  However, I recently had cause to build a custom stage for a proof of concept I was working on.

It's for IDM v5 and I've detailed the steps here.

Business Logic

The requirement for the stage was to validate that a registering user had ownership of the provided phone number.  The phone number could be either a mobile or a landline.  The approach taken was to use Twilio (a 3rd party) to send out either an SMS to a mobile, or text-to-speech to a landline.  The content of the message is a code based on HOTP.

Get the code for the module

Building the module

Follow the instructions in

After deploying the .jar file you must restart IDM for the bundle to be correctly recognised.

The module is targeted at IDM v5.  It uses Maven repositories to get the binary dependencies.
See this article in order to access the ForgeRock 'private-releases' maven repo:

It also uses appropriate pom.xml directives to ensure the final .jar file is packaged as an OSGi bundle so that it can be dropped into IDM.

Technical details

The code consists of a few files.  The first two in this list are the key files for any stage; they implement the necessary stage interfaces.  The remaining files contain the specific business logic for this stage.
  •  This class manages reading the configuration data from the configuration file.  It simply represents each configuration item for the stage as properties of the class.
  •  This is the main orchestration file for the stage.  It copes with both registration and password reset scenarios.  It manages the 'state' of the flow within this stage and generates the appropriate callbacks to the user, but relies on the other classes to do the real work.  If you want to learn how a 'stage' works then this is the file to study in detail.
  •  This is taken from the OATH Initiative work and is unchanged by me.  It is a Java class that generates a code based on the HOTP algorithm.
  • This class manages the process of sending the code.  It generates the code then decides whether to send it using SMS or TTS.  (In the UK, all mobile phone numbers start 07... so it's very simple logic for my purpose!)  This class also provides a method to validate the code entered by the user.  
  •  The class provides the utility functions that interact directly with the Twilio APIs for sending either an SMS or TTS message.
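The code-generation and routing logic described above can be sketched briefly. This is an illustrative Python rendering (the module itself is Java): HOTP generation per RFC 4226, plus the simple UK-centric mobile/landline decision. Function names here are mine, not the module's.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Generate an HOTP code per RFC 4226: HMAC-SHA1 plus dynamic truncation."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                     # low nibble of last byte picks the window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def delivery_channel(phone: str) -> str:
    """UK-centric heuristic from the post: mobile numbers start 07, so use SMS;
    anything else gets text-to-speech."""
    return "sms" if phone.startswith(("07", "+447")) else "tts"
```

The RFC 4226 test vectors (secret "12345678901234567890") give 755224 for counter 0, so the truncation logic is easy to check.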


There are also two sample config files for registration and password reset.  You should include the JSON section relating to this class in your self-service configuration files for IDM.
For example:
            "class" : "org.forgerock.selfservice.twilio.TwilioStageConfig",
            "codeValidityDuration" : "6000",
            "codeLength" : "5",
            "controlUrl" : "",
            "fromPhone" : "+441412803033",
            "accountSid" : "<Enter accountSid>",
            "tokenId" : "<Enter tokenId>",
            "telephoneField" : "telephoneNumber",
            "skipSend" : false

Most configuration items should be self-explanatory.  However, the 'skipSend' option is worthy of special note.  When true, it causes the stage to skip calling the Twilio APIs and instead return the code as part of the callback data.  This means that if you're using the OOTB UI, the 'placeholder' HTML attribute of the input box will show you the code to enter.  This is really useful for testing the stage if you don't have access to a Twilio account, as it also means the Twilio account-specific configuration items are ignored.

Of course, now you need to deploy it as per my previous article!

Thursday, 17 November 2016

Calling external REST endpoints in OpenAMv13.5 scripted authorization policies


It is often useful to be able to call external services as part of an authorisation policy in OpenAM.  One such example is a policy that checks whether the IP address of the calling user is located in the same country as the registered address for the user.  There's an out of the box scripted policy condition that does just this, relying on external services it calls using 'GET' requests.  I thought it might be nice to add some functionality to this policy that sent me a text message (SMS) when the policy determined that it was being evaluated from an IP address in a country other than my own.  This could act as a warning that my account has been compromised and is being used by someone else, somewhere else in the world.  A colleague had been doing a little integration work with Twilio, who happen to provide RESTful endpoints for sending SMS, so I decided to adopt Twilio for this purpose.  The fact that Twilio is the endpoint here is of little consequence - this approach will work for any service provider - but it gave me a real service for SMS notifications.

The solution

Well, that's easy isn't it... there's a Developers Guide that explains the scripting API for OpenAM:!/docs/openam/13.5/dev-guide#scripting-api
We just use that, looking at how the existing external calls work in the policy, and we're done, right?
Err, no, wrong, as it turns out!
Tried that and it didn't work :(

The problem

The Twilio endpoint requires a POST of data including an Authorization and Content-Type header.  This should be fine.  But the method as described in the guide simply wouldn't send the required HTTP headers.
It turns out the 'post' and 'get' methods use RESTlet under the covers, which has very specific methods for including standardised HTTP headers.  Unfortunately these methods aren't exposed by the httpClient object available in the script.  The OpenAM developer documentation suggests that you should just be able to set these as headers in the 'requestdata' parameter, but because the underlying code does not use the specific RESTlet methods for adding them, the RESTlet framework discards them.

As an example, from the guide, you might try to write code like this:
var response ="" + username, "", { cookies:[ { "domain": "", "field": "iPlanetDirectoryPro", "value": "E8cDkvlad83kd....KDodkIEIx*DLEDLK...JKD09d" } ], headers:[ { "field": "Content-Type", "value": "application/json" } ] });

If you do, then you'll see that the Content-Type header is not sent because it is considered standard.  However, if the header was a custom header then it would be passed to the destination.

The other thing you might notice in the logs is that the methods indicated by the developer guide are now marked as deprecated.  They still work - after a fashion - but an alternative method, 'send', is recommended.  Unfortunately the guides don't describe this method... hence this blog post!

The real solution

So the real solution is to use the new 'send' method of the httpClient object.  This accepts one parameter 'request' which is defined as the following type:


So within our script we should define a Request object and set the appropriate parameters before passing it as a parameter to the httpClient.send method.

Great, easy right? Well, err, that depends...

As I was using a copy of the default policy authorization script, this was defined as Javascript.  So I needed to import the necessary class using Javascript.  And, as I discovered, I was using Java 8, which changed the scripting engine from Rhino to Nashorn, which recommends the 'JavaImporter' mechanism for importing packages.

So, with this is at the top of my script:
    var fr = new JavaImporter(org.forgerock.http.protocol)

I can now instantiate a new Request object like this:
    with (fr) {
      var request = new fr.Request();
Note the use of the 'with' block.

Now I can set the necessary properties of the request object so that I can call the Twilio API.  This API requires the HTTP headers specified, to be POSTed, with url-encoded body contents that describe the various details of the message Twilio will issue.  All this needs to be done within the 'with' block highlighted above:
    request.method = 'POST';
    request.getHeaders().add('Content-Type', 'application/x-www-form-urlencoded');
    request.getHeaders().add('Authorization', 'Basic abcde12345');

Ok, so now I can send my request parameter?
Well, yes, but I also need to handle the response, which for the 'send' method is a Promise (defined as org.forgerock.util.promise.Promise) of a Response object (defined as org.forgerock.http.protocol.Response).

So this is another package I need to import into my Javascript file in order to access the Promise class.  JavaImporter takes multiple parameters to make them all available to the assigned variable so you can use them all within the same 'with' block.  Therefore my import line now looks like:
 var fr = new JavaImporter(org.forgerock.http.protocol, org.forgerock.util.promise)

And, within the 'with' block, I now include:
    promise = httpClient.send(request);
    var response = promise.get();

Which will execute the desired external call and return the response into the response variable.

So now I can use the Response properties and methods to check the call was successful e.g:

So my script for calling an external service looks something like this:
var fr = new JavaImporter(org.forgerock.http.protocol, org.forgerock.util.promise);
function postSMS() {
  with (fr) {
    var request = new Request();
    request.method = 'POST';
    request.getHeaders().add('Content-Type', 'application/x-www-form-urlencoded');
    request.getHeaders().add('Authorization', 'Basic abcde12345');
    promise = httpClient.send(request);
    var response = promise.get();
    logger.message("Twilio Call. Status: " + response.getStatus() + ", Cause: " + response.getCause());
  }
}

Great, so we're done now?

Well, almost!

The various classes defined by the imported packages are not all 'whitelisted'. This is an OpenAM feature that controls which classes can be used within scripts. We therefore need to add the necessary classes to the 'whitelist' for Policy scripts, which can be found in the Global Services page of the admin console. As you run the script, the log files will produce an error if a necessary class is not whitelisted. You can use this approach to see what is required, then add the highlighted class to the list.

And, now we're done... happy RESTing!

Wednesday, 27 July 2016

Fun with OpenAM13 Authz Policies over REST - the ‘jwt’ parameter of the ‘Subject’


I've previously blogged about the 'claims' and 'ssoToken' parameters of the 'subject' item used in the REST call to evaluate a policy for a resource. These articles are:
Now we're going to look at the 'jwt' parameter.  

For reference, the REST call we'll be using is documented in the developer guide, here:

The 'JWT' Parameter

The documentation describes the 'jwt' parameter as:
The value is a JWT string
What does that mean?
Firstly, it's worth understanding the JWT specification: RFC7519
To summarise, a JWT is a URL-safe encoded, signed (and possibly encrypted) representation of a 'JWT Claims Set'. The JWT specification defines the 'JWT Claims Set' as:
A JSON object that contains the claims conveyed by the JWT.

Where 'claims' are name/value pairs about the 'subject' of the JWT.  Typically a 'subject' might be an identity representing a person, and the 'claims' might be attributes about that person, such as their name, email address, and phone number.

So a JWT is a generic way of representing a subject's claims.

OpenID Connect (OIDC)

OIDC makes use of the JWT specification by stating that the id_token must be a JWT.  It also defines a set of claims that must be present within the JWT when generated by an OpenID Provider.  See:

The specification also says that additional claims may be present in the token.  Just hang on to that thought for the moment...we'll come back to it.

OpenAM OIDC configuration

For the purposes of investigating the 'jwt' parameter, let's configure OpenAM to generate OIDC id_tokens.  I'm not going to cover that here, but we'll assume you've followed the wizard to set up an OIDC provider for the realm.  We'll also assume you've created/updated the OAuth2/OIDC Client Agent profile to allow the 'profile' and 'openid' scopes.  I'm also going to use an 'invoices' scope so the config must allow me to request that too.

Now I can issue:
curl --request POST --user "apiclient:password" --data "grant_type=password&username=bob&password=password&scope=invoices openid profile"

Note the request for the openid and profile scopes in order to ensure I get the OpenID Connect response.

And I should get something similar to the following:
  "scope":"invoices openid profile",

Note the lengthy id_token field.  This is the OIDC JWT, made up according to the specification.  Also note that, by default, OpenAM will sign this JWT with the 1024-bit 'test' certificate using the RS256 algorithm.  I've updated my instance to use a new 2048-bit certificate called 'test1', so my response will be longer than the default.  I've used a 2048-bit certificate because I want to use this tool to inspect the JWT and its signature:  This tool only seems to support 2048-bit certificates, which is probably due to the JWS specification.  (There is another tool I could have used to inspect the JWT, but it does not support verification of RSA-based signatures.)

So, in the JWT tool linked above you can paste the full value of the id_token field into 'Step 3', then click the 'Just Decode JWT' button.  You should see the decoded JWT claims in the 'Payload' box:

You can also see that the header field shows how the signature was generated in order to allow clients to verify this signature.
In order to get this tool to verify the signature, you need to get the PEM formatted version of the public key of the signing certificate.  i.e. 'test1' in my case.
I've got this from the KeyStoreExplorer tool, and now I can paste it into the 'Step 4' box, using the 'X.509 certificate for RSA' option.  Now I can click 'Verify It':

The tool tells me the signature is valid, and also decodes the token as before.  If I were to change the content of the message, or the signature of the JWT, then the tool would tell me that the signature is not valid.  For example, changing one character of the message would return this:

Note that the message box says that the signature is *Invalid*, as well as the Payload now being incorrect.

The 'jwt' Parameter 

Now that we've understood that the id_token field of the OIDC response is a JWT, we can use it as the 'jwt' parameter of the 'subject' field in the policy evaluation call.

For example, a call like this:
curl --request POST --header "iPlanetDirectoryPro: AQIC5wM2LY4Sfcx-sATGr4BojcF5viQOrP-1IeLDz2Un8VM.*AAJTSQACMDEAAlNLABQtMjUwMzE4OTQxMDA1NDk1MTAyNwACUzEAAA..*" --header "Content-Type: application/json" --data '{"resources":["invoices"],"application":"api","subject":{"jwt":"eyAidHlwIjogIkpXVCIsICJraWQiOiAidWFtVkdtRktmejFaVGliVjU2dXlsT2dMOVEwPSIsICJhbGciOiAiUlMyNTYiIH0.eyAiYXRfaGFzaCI6ICJVbmFHMk0ydU5kS1JZMk5UOGlqcFRRIiwgInN1YiI6ICJib2IiLCAiaXNzIjogImh0dHA6Ly9hcy51bWEuY29tOjgwODAvb3BlbmFtL29hdXRoMi9TY29wZUF6IiwgInRva2VuTmFtZSI6ICJpZF90b2tlbiIsICJhdWQiOiBbICJhcGljbGllbnQiIF0sICJvcmcuZm9yZ2Vyb2NrLm9wZW5pZGNvbm5lY3Qub3BzIjogIjhjOWNhNTU3LTk0OTgtNGU2Yy04ZjZmLWY2ZjYwZjNlOWM4NyIsICJhenAiOiAiYXBpY2xpZW50IiwgImF1dGhfdGltZSI6IDE0NjkwMjc1MTMsICJyZWFsbSI6ICIvU2NvcGVBeiIsICJleHAiOiAxNDY5MDMxMTEzLCAidG9rZW5UeXBlIjogIkpXVFRva2VuIiwgImlhdCI6IDE0NjkwMjc1MTMgfQ.MS6jnMoeQ19y1DQky4UdD3Mqp28T0JYigNQ0d0tdm04HjicQb4ha818qdaErSxuKyXODaTmtqkGbBnELyrckkl7m2aJki9akbJ5vXVox44eaRMmQjdm4EcC9vmdNZSVORKi1gK6uNGscarBBmFOjvJWBBBPhdeOPKApV0lDIzX7xP8JoAtxCr8cnNAngmle6MyTnVQvhFGWIFjmEyumD6Bsh3TZz8Fjkw6xqOyYSwfCaOrG8BxsH4BQTCp9FgsEjI52dZd7J0otKLIk0EVmZIkI4-hgRIcrM1Rfiz9LMHvjAWY97JBMcGBciS8fLHjWWiLDqMHEE0Wn5haYkMSsHYg"}}'

might return:

This assumes the following policy definition:

Note that in this case I am using the 'iss' claim within the token in order to ensure I trust the issuer of the token when evaluating the policy condition.

As mentioned in previous articles, it is imperative that the id_token claims include a 'sub' field.  Fortunately, the OIDC specification makes this mandatory, so using an OIDC token here will work just fine.

It's also worth noting that OpenAM does *not* verify the signature of the id_token submitted in 'jwt' field.  This means that you could shorten the 'curl' call above to remove the signature component of the 'jwt'. For example, this works just the same as above:
curl --request POST --header "iPlanetDirectoryPro: AQIC5wM2LY4Sfcx-sATGr4BojcF5viQOrP-1IeLDz2Un8VM.*AAJTSQACMDEAAlNLABQtMjUwMzE4OTQxMDA1NDk1MTAyNwACUzEAAA..*" --header "Content-Type: application/json" --data '{"resources":["invoices"],"application":"api","subject":{"jwt":"eyAidHlwIjogIkpXVCIsICJraWQiOiAidWFtVkdtRktmejFaVGliVjU2dXlsT2dMOVEwPSIsICJhbGciOiAiUlMyNTYiIH0.eyAiYXRfaGFzaCI6ICJVbmFHMk0ydU5kS1JZMk5UOGlqcFRRIiwgInN1YiI6ICJib2IiLCAiaXNzIjogImh0dHA6Ly9hcy51bWEuY29tOjgwODAvb3BlbmFtL29hdXRoMi9TY29wZUF6IiwgInRva2VuTmFtZSI6ICJpZF90b2tlbiIsICJhdWQiOiBbICJhcGljbGllbnQiIF0sICJvcmcuZm9yZ2Vyb2NrLm9wZW5pZGNvbm5lY3Qub3BzIjogIjhjOWNhNTU3LTk0OTgtNGU2Yy04ZjZmLWY2ZjYwZjNlOWM4NyIsICJhenAiOiAiYXBpY2xpZW50IiwgImF1dGhfdGltZSI6IDE0NjkwMjc1MTMsICJyZWFsbSI6ICIvU2NvcGVBeiIsICJleHAiOiAxNDY5MDMxMTEzLCAidG9rZW5UeXBlIjogIkpXVFRva2VuIiwgImlhdCI6IDE0NjkwMjc1MTMgfQ."}}'

Note that the 'jwt' string needs to have two dots '.' in it to conform to the JWT specification.  The content following the second dot is the signature, which has been removed entirely in this second curl example.  i.e. this is an unsigned JWT, which is completely valid.
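The three-segment, two-dot structure makes this easy to do programmatically. Here is a small illustrative Python sketch (not an OpenAM API) that strips the signature segment to produce an unsigned JWT, and decodes the claims set from the payload segment:

```python
import base64
import json

def strip_signature(jwt: str) -> str:
    """Drop the third (signature) segment, keeping both dots intact."""
    header, payload, _signature = jwt.split(".")
    return header + "." + payload + "."

def jwt_claims(jwt: str) -> dict:
    """Decode the claims set from the JWT's middle (payload) segment."""
    payload = jwt.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))
```

Passing the output of strip_signature as the 'jwt' parameter would behave exactly like the shortened curl example above.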

But, just to prove that OpenAM does *not* validate signed JWTs, you could attempt a curl call that includes garbage for the signature.  For example:
curl --request POST --header "iPlanetDirectoryPro: AQIC5wM2LY4Sfcx-sATGr4BojcF5viQOrP-1IeLDz2Un8VM.*AAJTSQACMDEAAlNLABQtMjUwMzE4OTQxMDA1NDk1MTAyNwACUzEAAA..*" --header "Content-Type: application/json" --data '{"resources":["invoices"],"application":"api","subject":{"jwt":"eyAidHlwIjogIkpXVCIsICJraWQiOiAidWFtVkdtRktmejFaVGliVjU2dXlsT2dMOVEwPSIsICJhbGciOiAiUlMyNTYiIH0.eyAiYXRfaGFzaCI6ICJVbmFHMk0ydU5kS1JZMk5UOGlqcFRRIiwgInN1YiI6ICJib2IiLCAiaXNzIjogImh0dHA6Ly9hcy51bWEuY29tOjgwODAvb3BlbmFtL29hdXRoMi9TY29wZUF6IiwgInRva2VuTmFtZSI6ICJpZF90b2tlbiIsICJhdWQiOiBbICJhcGljbGllbnQiIF0sICJvcmcuZm9yZ2Vyb2NrLm9wZW5pZGNvbm5lY3Qub3BzIjogIjhjOWNhNTU3LTk0OTgtNGU2Yy04ZjZmLWY2ZjYwZjNlOWM4NyIsICJhenAiOiAiYXBpY2xpZW50IiwgImF1dGhfdGltZSI6IDE0NjkwMjc1MTMsICJyZWFsbSI6ICIvU2NvcGVBeiIsICJleHAiOiAxNDY5MDMxMTEzLCAidG9rZW5UeXBlIjogIkpXVFRva2VuIiwgImlhdCI6IDE0NjkwMjc1MTMgfQ.garbage!!"}}'
...would still successfully be authorised.

It's also worth noting that the id_token claims of an OIDC token includes an 'exp' field signifying the 'expiry time' of the id_token.  OpenAM does not evaluate this field in this call.

Signature Verification

You might be wondering if it is possible to verify the signature and other aspects, such as the 'exp' field.  Yes, it is!  With a little bit of clever scripting - of course!

The first thing to note is that we need to ensure the jwt token can be parsed by a script.  Unfortunately, simply passing it in the jwt parameter does not permit this.  But we can *also* pass the jwt token in the 'environment' field of the policy decision request.  I'll shorten the jwt tokens in the following curl command to make it easier to read, but you should supply the full signed jwt in the 'environment' field:
curl --request POST --header "iPlanetDirectoryPro: "AQIC....*" --header "Content-Type: application/json" --data '{"resources":["invoices"],"application":"api","subject":{"jwt":"eyAidHlw...MyNTYiIH0.eyAiYXRfa...MTMgfQ.MS6jn...sHYg"},"environment":{"jwt":["eyAidHlw...MyNTYiIH0.eyAiYXRfa...MTMgfQ.MS6jn...sHYg"]}}'

Note in this that the 'environment' field now includes a 'jwt' field whose data can be utilised in a script.  And what would such a policy condition script look like?
Well, head over to and take a look at the 'ExternalJWTVerifier.groovy' script.  The associated blog post from my colleague, Simon Moffatt, will set this script into context:  This will validate either an HMAC-signed JWT - if you enter the appropriate shared secret - or an RS256-signed OIDC JWT - if you specify the jwks_uri for the OpenID Connect Provider.
And, now that you have claims accessible to the scripting engine you can pretty much apply any form of logic to them to validate the token - including validating the 'exp' field.
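For instance, once the claims are in the scripting engine, an expiry check amounts to comparing the 'exp' claim (seconds since the epoch) with the current time. A hedged Python illustration follows; in practice this logic would sit inside the Groovy policy script, and the leeway value is my own choice:

```python
import time

def validate_exp(claims: dict, leeway: int = 30) -> bool:
    """Accept the token only if its 'exp' claim (seconds since epoch)
    is still in the future, allowing a small clock-skew leeway."""
    exp = claims.get("exp")
    return exp is not None and time.time() < exp + leeway
```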

Tuesday, 19 July 2016

Fun with OpenAM13 Authz Policies over REST - the ‘ssoToken’ parameter of the ‘Subject’

I recently blogged about using the 'claims' parameter of the subject item in a REST call for policy evaluation in OpenAM 13.  In that article I blithely stated that using the 'ssoToken' parameter was fairly obvious.  However, I thought I'd take the time to explore this in a little more detail to ensure my understanding is complete.  This is partly because I started thinking about OIDC JWT tokens, and the fact that OpenAM stateless sessions (nothing to do with OIDC) also use JWT tokens.

Let's first ensure we understand stateful and stateless sessions.
(It's documented here, in the Admin guide:!/docs/openam/13.5/admin-guide#chap-session-state)

Stateful sessions are your typical OpenAM sessions.  When a user successfully authenticates with OpenAM they establish a session.  A stateful session means that all the details about that session are held by the server-side OpenAM services.  By default this is in-memory, but sessions can be persisted to an OpenDJ instance in order to support high availability and scalability across geographically disperse datacentres.  The client of the authentication request receives a session identifier, typically stored by a web application as a session cookie, that is passed back to the OpenAM servers so that the session details can be retrieved.  It's called 'stateful' because the server needs to maintain the state of the session.
A session identifier for a stateful session might look something like this:
Basically, it's just a unique key to the session state.

Stateless sessions are new in OpenAM 13.  These alleviate the need for servers to maintain and store state, which avoids the need to replicate persisted state across multiple datacentres.  Of course, there is still session 'state'; it's just no longer stored on the server.  Instead, all state information is packaged into a JWT and passed to the client to maintain.  Now, on each request, the client can send the complete session information back to an OpenAM server in order for it to be processed.  OpenAM does not need to look the session information up in the stateful repository because all the information is right there in the JWT.  This means that, for a realm configured to operate with stateless sessions, the client will receive a much bigger token on successful authentication.
Therefore, a stateless session token might look something like:

Obviously, this is much larger and looks more complex.  This token is essentially made up of two parts:
1. a fake stateful session identifier
2. a JWT
OpenAM always prepends a fake stateful session identifier to this JWT for backwards compatibility. So, the actual JWT starts *after* the second asterisk (*).  i.e. from the bit that begins eyAidH... right through to the end.

You can use tools like and to unpack and read this JWT.
e.g. for the JWT above, you can see the payload data, which is how OpenAM represents the session information:

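The split described above is easy to reproduce. Here is a small illustrative Python sketch (not an OpenAM API): everything after the second asterisk is the JWT, and its middle segment is the base64url-encoded session payload.

```python
import base64
import json

def session_jwt_claims(token: str) -> dict:
    """Extract and decode the payload of an OpenAM stateless session token."""
    jwt = token.split("*", 2)[2]          # the real JWT starts after the 2nd '*'
    payload = jwt.split(".")[1]           # middle JWT segment holds the claims
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))
```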
Now, turning our attention to the policy evaluation REST calls we see that there is an option to use 'ssoToken' as a parameter to the 'subject' item.

In a realm that uses the default 'stateful' sessions then any policy evaluation REST call that uses the 'ssoToken' parameter should use a stateful session identifier.  The policy will then have full access to the session information as well the profile data of the user identified by the session.

A stateless realm works in exactly the same way.  You now need to provide the *full* stateless token (the 'fake' stateful identifier along with the JWT component) and the policy will have access to the state information from the JWT, as well as information about the user from the datastore (such as group membership).

For example:
curl --request POST --header "iPlanetDirectoryPro: AQIC5wM2LY4SfcxxJaG7LFOia1TVHZuJ4_OVm9lq5Ih5uXA.*AAJTSQACMDEAAlNLABQtMjU4MDgxNTIwMzk1NzA5NDg0MwACUzEAAA..*" --header "Content-Type: application/json" --data '{"resources":["orders"],"application":"api","subject":{"ssoToken":"AQIC5wM2LY4SfcyRBqm_r02CEJ5luC4k9A6HPqDitS9T5-0.*AAJTSQACMDEAAlNLABQtNTc4MzI5MTk2NjQzMjUxOTc2MAACUzEAAA..*eyAidHlwIjogIkpXVCIsICJhbGciOiAiSFMyNTYiIH0.eyAic2VyaWFsaXplZF9zZXNzaW9uIjogIntcInNlY3JldFwiOlwiN2RiODdhMjQtMjk5Ni00YzkxLTkyNTUtOGIwNzdmZDEyYmFkXCIsXCJleHBpcnlUaW1lXCI6MTQ2ODkzNTgyODUyNSxcImxhc3RBY3Rpdml0eVRpbWVcIjoxNDY4OTI4NjI4NTI1LFwic3RhdGVcIjpcInZhbGlkXCIsXCJwcm9wZXJ0aWVzXCI6e1wiQ2hhclNldFwiOlwiVVRGLThcIixcIlVzZXJJZFwiOlwiYm9iXCIsXCJGdWxsTG9naW5VUkxcIjpcIi9vcGVuYW0vVUkvTG9naW4_cmVhbG09U2NvcGVBelwiLFwic3VjY2Vzc1VSTFwiOlwiL29wZW5hbS9jb25zb2xlXCIsXCJjb29raWVTdXBwb3J0XCI6XCJ0cnVlXCIsXCJBdXRoTGV2ZWxcIjpcIjVcIixcIlNlc3Npb25IYW5kbGVcIjpcInNoYW5kbGU6QVFJQzV3TTJMWTRTZmN3Y3YzMFFJTGF0Z3E3d3NJMWM4RThqRmZkTDMzTlZVQjAuKkFBSlRTUUFDTURFQUFsTkxBQk15TVRNME9USTRPVFk0TmpBNE1qSTFNelF3QUFKVE1RQUEqXCIsXCJVc2VyVG9rZW5cIjpcImJvYlwiLFwibG9naW5VUkxcIjpcIi9vcGVuYW0vVUkvTG9naW5cIixcIlByaW5jaXBhbHNcIjpcImJvYlwiLFwiU2VydmljZVwiOlwibGRhcFNlcnZpY2VcIixcInN1bi5hbS5Vbml2ZXJzYWxJZGVudGlmaWVyXCI6XCJpZD1ib2Isb3U9dXNlcixvPXNjb3BlYXosb3U9c2VydmljZXMsZGM9b3BlbmFtLGRjPWZvcmdlcm9jayxkYz1vcmdcIixcImFtbGJjb29raWVcIjpcIjAxXCIsXCJPcmdhbml6YXRpb25cIjpcIm89c2NvcGVheixvdT1zZXJ2aWNlcyxkYz1vcGVuYW0sZGM9Zm9yZ2Vyb2NrLGRjPW9yZ1wiLFwiTG9jYWxlXCI6XCJlbl9VU1wiLFwiSG9zdE5hbWVcIjpcIjEyNy4wLjAuMVwiLFwiQXV0aFR5cGVcIjpcIkRhdGFTdG9yZVwiLFwiSG9zdFwiOlwiMTI3LjAuMC4xXCIsXCJVc2VyUHJvZmlsZVwiOlwiQ3JlYXRlXCIsXCJBTUN0eElkXCI6XCI2MzE2MDI4YjcyYWU5MWMyMDFcIixcImNsaWVudFR5cGVcIjpcImdlbmVyaWNIVE1MXCIsXCJhdXRoSW5zdGFudFwiOlwiMjAxNi0wNy0xOVQxMTo0Mzo0OFpcIixcIlByaW5jaXBhbFwiOlwiaWQ9Ym9iLG91PXVzZXIsbz1zY29wZWF6LG91PXNlcnZpY2VzLGRjPW9wZW5hbSxkYz1mb3JnZXJvY2ssZGM9b3JnXCJ9LFwiY2xpZW50SURcIjpcImlkPWJvYixvdT11c2VyLG89c2NvcGVheixvdT1zZXJ2aWNlcyxkYz1vcGVuYW0sZGM9Zm9yZ2Vyb2
NrLGRjPW9yZ1wiLFwic2Vzc2lvbklEXCI6bnVsbCxcImNsaWVudERvbWFpblwiOlwibz1zY29wZWF6LG91PXNlcnZpY2VzLGRjPW9wZW5hbSxkYz1mb3JnZXJvY2ssZGM9b3JnXCIsXCJzZXNzaW9uVHlwZVwiOlwidXNlclwiLFwibWF4SWRsZVwiOjMwLFwibWF4Q2FjaGluZ1wiOjMsXCJuZXZlckV4cGlyaW5nXCI6ZmFsc2UsXCJtYXhUaW1lXCI6MTIwfSIgfQ.Dnjk-9MgANmhX4jOez12HcYAW9skck-HFuTPnzEmIq8"}}'

Might return:

Assuming the policy looks something like this:

...and, in this specific case, that the authentication level for the 'subject' of the ssoToken is set to two or greater, as well as the 'subject' being a member of the 'api_order' group in the datastore.

Next up, we'll look at using OIDC tokens in the subject parameter of the REST call.

Wednesday, 13 July 2016

Fun with OpenAM13 Authz Policies over REST - the ‘Claims’ parameter of the ‘Subject’

Consider this API for a moment:!/docs/openam/13/dev-guide#rest-api-authz-policy-decision-concrete

In particular consider the ‘subject’ field to be passed to the endpoint.

Using the ssoToken as the ‘subject’ is fairly easy to understand. You have full access to the subject’s datastore properties in a policy condition, including things like group membership.
But what do the other ‘subject’ options mean?
The documentation isn’t particularly expansive on how to use the ‘jwt’ and ‘claims’ subject parameters that are available in the REST interface.

This article will focus on the ‘claims’ parameter.

Firstly, it’s worth noting that these options are *not* mutually exclusive. You can specify multiple subject parameters and they will be combined. We’ll come back to this point later in this article.

Let’s look at the ‘claims’ parameter in detail.
This needs to be specified as an object map. e.g.
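A minimal claims map looks like this (the values here are placeholders — as discussed below, only the presence of the 'sub' key is mandatory):

```json
{
  "claims": {
    "sub": "demo",
    "iss": "https://openam.example.com"
  }
}
```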

The critical thing that the docs omit is that you must include a ‘sub’ key. Without this you will receive an ‘Invalid value subject’ error.
The value of the ‘sub’ key seems largely irrelevant in that it doesn’t automatically hydrate this value into a corresponding subject in the datastore. It is, however, a valid ‘claim’ that can be used in the policy. For example:

Of course, the ‘sub’ claim value could be empty.  All that’s important is that it is included as a key.  

So, with a policy defined like this:

You could specify a policy evaluation request like this:
curl --request POST --header "iPlanetDirectoryPro: AQIC5…*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"claims":{"sub":"","iss":""}}}'

And you would receive a successful policy evaluation response.

Note that the iPlanetDirectoryPro header must be an ssoToken of a user with permissions to execute this REST API call.  In all these samples I’m using the ssoToken of amadmin here.

I might want to augment this policy so that only members of a given datastore group would be considered.  I might think this is as easy as ensuring I include the correct 'sub' value of a user in the datastore that has a group membership, and updating the policy 'subjects' tab to include the group membership.  For example:

Then issue (note the ‘demo’ value for the ‘sub’ key):
curl --request POST --header "iPlanetDirectoryPro: AQIC5…*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"claims":{"sub":"demo","iss":""}}}'

But, as stated before, the ‘sub’ value is not hydrated to a datastore subject so the policy cannot know about the group memberships.

To prove this, and also to show that multiple ‘subjects’ are combined by the policy evaluation we can get an ssoToken for the ‘demo’ user using the ‘authenticate’ api in OpenAM.  Then issue something like:
curl --request POST --header "iPlanetDirectoryPro: AQIC5..*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"ssoToken":"AQIC5..*","claims":{"sub":"demo","iss":""}}}'

And you will get a successful response.

Note that I have two separate ssoTokens here.  One is the iPlanetDirectoryPro token which corresponds to a user with permissions to execute the REST API call (I’m using amadmin here).  The other, used in the ssoToken parameter, is that of the ‘demo’ user.

If I was to execute the same call, but remove the ‘claims’ parameter, e.g.
curl --request POST --header "iPlanetDirectoryPro: AQIC5..*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"ssoToken":"AQIC5..*"}}'
then the policy would return false.

This proves that both the ‘ssoToken’ and ‘claims’ parameters are being considered together as part of the policy evaluation of the ‘subjects’ tab.  i.e. the group membership is being evaluated based on the ssoToken (which is a representation of the user in the datastore) as well as the ‘iss’ claim value passed by the REST API call.

So can I use the ‘claims’ parameter *only* (i.e. without an ssoToken) in order to evaluate a policy based on group membership where the ‘sub’ claim is assumed to be a user in the datastore?

Well, yes!
Let me return my ‘subjects’ tab to the original:

Now let’s consider the ‘environments’ tab.

Maybe we could try an LDAP Filter condition?  Well, not exactly.  The LDAP Filter condition implicitly adds the uid of the user in the datastore to the query by relying on the contents of the 'subject' ssoToken.  And, as we now know, the 'sub' claim is not rehydrated to a datastore subject/ssoToken, so the implicit addition of the uid means the query fails to return a result.

But, what if we look at the ‘Scripted Authorisation Condition’?
One thing we do have available in a scripted condition is a ‘username’ property.  Now this *is* populated from the ‘sub’ claim.
(This also makes it very clear that the ‘identity’ object, which is a reference to the subject in the datastore, is *only* available if the ssoToken is used).

So, with the ‘username’ available perhaps we can write a script that connects to a directory (the datastore, for arguments sake) and reads properties, such as group membership for that subject?
It turns out that, yes, we can!

Head to the ‘Scripts’ page and create a ‘Policy Condition’ script like this:

The script would be something like this:

import org.forgerock.openam.ldap.LDAPUtils
import org.forgerock.util.Options
import org.forgerock.opendj.ldap.Connection
import org.forgerock.opendj.ldap.LDAPConnectionFactory
import org.forgerock.opendj.ldap.SearchScope

//default to false
authorized = false

//log the username, which will be the 'subject' as defined by the claims.sub param
logger.message('username: ' + username)

//set up the ldap server connection params (change these to match your environment)
ldapServer = 'localhost'
ldapPort = 1389
ldapBindUser = 'cn=Directory Manager'
ldapBindPassword = 'password'

//connect to ldap server
LDAPUtils ldapUtils = new LDAPUtils()
Options options = Options.defaultOptions()
LDAPConnectionFactory factory = ldapUtils.createFailoverConnectionFactory(ldapServer, ldapPort, ldapBindUser, ldapBindPassword, options)
Connection connection = factory.getConnection()

//define the filter: the user must exist and be a member of the group
groupFilter = '(isMemberOf=cn=api_customer,ou=groups,dc=openam,dc=forgerock,dc=org)'
userFilter = '(uid=' + username + ')'
filter = '(&' + userFilter + groupFilter + ')'
baseDN = 'ou=people,dc=openam,dc=forgerock,dc=org'

//issue the search; a single matching entry means the user is in the group
try {
  result = connection.searchSingleEntry(baseDN, SearchScope.SUBORDINATES, filter, 'uid')
  logger.message('result: ' + result.getAttribute('uid').toString())
  authorized = true
} catch (any) {
  logger.message('Assume filter returned no results, so assume that user is not a member of the group')
} finally {
  connection.close()
}

You will want to ensure that the groupFilter variable matches your environment.
You may also need to change the connection details.
You may also want to improve the error handling, or use a different method for querying the ldap.
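As a sanity check on the filter construction, you can reproduce the same string concatenation outside the script (this is just shell, using my example group DN):

```shell
username="bob"
groupFilter='(isMemberOf=cn=api_customer,ou=groups,dc=openam,dc=forgerock,dc=org)'
userFilter="(uid=${username})"
filter="(&${userFilter}${groupFilter})"
echo "$filter"
```

The resulting composite filter only matches an entry that both has the given uid and is a member of the group, which is exactly the authorisation question we want answered.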

Note the ‘imports’ lines at the top of the script.  Some of these classes are not on the script module ‘whitelist’ for Policy Condition scripts.  In order for the script to run, they need to be added to the whitelist.
So, head to Configuration -> Global -> Scripting -> Policy condition -> Engine Configuration
and ensure the imported class names are permitted by the whitelist patterns.
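For example, whitelist patterns along these lines would cover the imports used above (these are illustrative — broad wildcards are convenient but permissive, so tighten them to taste):

```
org.forgerock.openam.ldap.*
org.forgerock.util.Options
org.forgerock.opendj.ldap.*
```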

Now, head back to the ‘Environments’ tab of the Policy and configure it like this:

Now let’s assume you have two users in your datastore: Alice and Bob.
Let’s also assume you have a group called ‘api_customer’.
Now Bob is made a member of the group, but Alice is not.

So, issuing:
curl --request POST --header "iPlanetDirectoryPro: AQIC5..*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"claims":{"sub":"alice","iss":""}}}'
will return failure, i.e.:

curl --request POST --header "iPlanetDirectoryPro: AQIC5..*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"claims":{"sub":"bob","iss":""}}}'
will return success, i.e.:
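The success case for bob returns something along these lines (a sketch based on this policy's 'permit' action and 'hello:world' static attribute — exact output may vary by version; the failure case for alice has the same shape but with an empty actions map):

```json
[
  {
    "resource": "customers",
    "actions": {
      "permit": true
    },
    "attributes": {
      "hello": ["world"]
    },
    "advices": {}
  }
]
```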

Note that in my policy, I have an action of 'permit' defined, and also static response attributes of 'hello:world'.

Don’t try to use ‘Subject Attributes’ because these depend on the SSOToken referencing the datastore identity.  You could, however, update the script to return attributes returned from the call to the directory if you wanted to emulate this functionality.  See:!/docs/openam/13/dev-guide#scripting-api-authz-response

Friday, 1 July 2016

Custom Stages to User Self-Service

Commons Project

One of the great features of the OpenAM v13 and OpenIDM v4 releases is actually not a feature of those products at all.  I'm talking about the Self-Service capability, which is actually part of the 'COMMONS' project at ForgeRock.  As the name suggests, the functionality in 'Commons' may be found in more than one of the final products.  Commons includes capabilities such as audit logging, the CREST framework, the UI framework, as well as the User Self-Service functionality.

Self-Service Configuration

Now there is lots of good documentation about how to make use of the User Self-Service functionality as exposed through the OpenAM and OpenIDM products.
For example, for OpenAM:!/docs/openam/13/admin-guide#chap-usr-selfservices, and for OpenIDM:!/docs/openidm/4/integrators-guide#ui-configuring.
Whilst the end-user functionality is the same, the way you configure the functionality in OpenAM and OpenIDM is slightly different.  This is the OpenIDM configuration view:

One thing you might notice from the documentation is that the User Self-Service flows (Registration, Forgotten Password, and Forgotten Username) have the ability to use custom 'plugins' - sometimes called 'stages'.  However, another thing you might notice is a distinct lack of documentation on how to go about creating such a plugin/stage and adding it to your configuration.

Note that there is an outstanding JIRA logged for ForgeRock to provide documentation but, in the meantime, this post attempts to serve as an interim guide.  I'm only going to use OpenIDM as the target here, not OpenAM.

Fortunately, the JIRA linked above highlights that within the code base there already exists the code for a custom module, and a sample configuration.  So, in this post I'll explain the steps required to build, deploy, and configure that pre-existing sample.

The easiest way to do this is to get the code of the Standalone Self-Service application!

Get the Standalone Self-Service code

(*** EDIT : the code referenced below is now available in the forgerock-selfservice-public repo of the LEGACYCOMMONS project: ***)

You'll need to head over to the 'self-service' repository in the 'COMMONS' project of the ForgeRock Stash server:
(You may need to register, but go ahead, it's free!)
If you're following the instructions in this post, and are targeting OpenIDMv4  (as opposed to any later releases) then you'll specifically want v1.0.3 of this SelfService repository.

 Now download the code to your local drive so we can build it.

Compile and run it

You can see that the 'readme.txt' provides the details of how to compile this project.  Note that this will compile the full 'Commons Self-Service' functionality, including the custom stage, in a standalone harness.

Once it's built you can browse to the web site and play with the functionality.  Any users registered are held in memory of this harness, and therefore flushed each time you stop the jetty container.

It's also worth noting that the readme.txt instructs you to enter an email username and password.  These are used to connect to the configured email service of this test harness in order to actually send registration emails.  (The implementations in OpenAM and OpenIDM will use the configured email services for those products.)  By default, the configured email service is gmail.  And, by default, gmail stops this type of activity unless you change your gmail account settings.  However, you may instead choose to run a dummy SMTP service to capture the sent emails.  One such utility that I'll use here is FakeSMTP:
So, once you have an accessible SMTP service, you might now need to change the email service config of the User Self-Service functionality.  You'll find this for the test harness - assuming the mvn build has worked successfully - here:


If you're running FakeSMTP on port 2525, then this might look like:
  "mailserver": {
    "host": "localhost",
    "port": "2525"
  }
Now when you run the example webapp, emails will be delivered to FakeSMTP (and it will ignore whatever you use for username and password).

So, go ahead, register a user. The first thing you should see is a "Math Problem" stage.  Eh? What? Where did that come from? Well, that's the custom stage!!  Yes, this standalone version of Self-Service includes the custom stage!

Step1. Math Problem
Assuming you can add 12 and 4 together, complete the 'puzzle'.  Then follow the remaining steps of the registration (noting that the email gets delivered to FakeSMTP, where you can open it and click the link to complete the registration).
Step 2. Email Verification
Email link
Step 3. Register details
Step 4. KBA

Inspect the configuration

Now, if we take a look at the 'Registration Stage' configuration for this example app, which we can find here:
we will see it begins like this:
  "stageConfigs" : [
    {
      "class" : "org.forgerock.selfservice.custom.MathProblemStageConfig",
      "leftValue" : 12,
      "rightValue" : 4
    },
    {
      "name" : "emailValidation",
      "emailServiceUrl" : "/email",
      "from" : "",
      ...

Brilliant!  That first item in the stageConfigs array is the "Math Problem" with its configuration (i.e. which two numbers to sum!).  The remaining array items use 'name' to reference the native/in-built modules, along with their necessary configuration.
So, what we've achieved so far is:

  • Compiled the custom stage (Math Problem)
  • Compiled standalone version of Common User Self-Service
  • Tested a Registration flow that includes the custom stage, along with some native stages.

And what's left to do?

  • Deploy and configure the custom stage in OpenIDM

OpenIDM Deployment

Simply copy the custom stage JAR file to the 'bundle' directory of your OpenIDM deployment
cp forgerock-selfservice-custom-stage/target/forgerock-selfservice-custom-stage-1.0.3.jar <openidm>/bundle

And update the 'selfservice-registration.json' file in your 'conf' folder.
This is OpenIDM's name for the 'registration.json' file of the standalone SelfService app, so use the same configuration snippet you saw in the standalone app.
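For example, the entry to add to the stageConfigs array of 'selfservice-registration.json' is the same math stage object as in the standalone sample (insert it wherever in the flow you want the puzzle to appear):

```json
{
  "class" : "org.forgerock.selfservice.custom.MathProblemStageConfig",
  "leftValue" : 12,
  "rightValue" : 4
}
```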

It seems that this method will not allow you to see the custom stage in the Admin UI for OpenIDM.  Happily, changes made within the Admin UI do not remove the custom stage from the file.  However, be warned that if you disable, then re-enable, Self-Service Registration there is no guarantee that the custom stage will be added into the re-created 'selfservice-registration.json' file in the correct place.

So, with User Self Registration enabled, and the custom stage included in the config, when a user tries to register they will be prompted for the calculation at the appropriate point in the flow.
Custom Stage in OpenIDM flow!

Exercise for the Reader!

As you can see, I have used the pre-supplied custom stage, showing one approach to building, deploying, and configuring it.  If you need a custom stage to do something different, then you'll have to investigate the interfaces that need implementing in order to develop your own.