Thursday, 17 November 2016

Calling external REST endpoints in OpenAMv13.5 scripted authorization policies


It is often useful to be able to call external services as part of an authorisation policy in OpenAM.  One such example is a policy that checks whether the IP address of the calling user is located in the same country as the registered address for the user.  There's an out-of-the-box scripted policy condition that does just this, relying on external services that it calls using 'GET' requests.  I thought it might be nice to add some functionality to this policy that sent me a text message (SMS) when the policy determined that it was being evaluated from an IP address in a country other than my own.  This could act as a warning to me that my account had been compromised and was being used by someone else, somewhere else in the world.  A colleague had also been doing a little integration work with Twilio, who happen to provide RESTful endpoints for sending SMS, so I decided to adopt Twilio for this purpose.  That Twilio is the endpoint here is of little consequence, as this approach will work for any service provider, but it gave me a real service for SMS notifications.

The solution

Well, that's easy isn't it... there's a Developers Guide that explains the scripting API for OpenAM:!/docs/openam/13.5/dev-guide#scripting-api
We just use that, looking at how the existing external calls work in the policy, and we're done, right?
Err, no, wrong, as it turns out!
Tried that and it didn't work :(

The problem

The Twilio endpoint requires a POST of data including an Authorization and Content-Type header.  This should be fine.  But the method as described in the guide simply wouldn't send the required HTTP headers.
It turns out the 'post' and 'get' methods use Restlet under the covers, which has very specific methods for including standardised HTTP headers.  Unfortunately, these methods aren't exposed by the httpClient object available in the script.  The OpenAM developer documentation suggests that you should just be able to set these as headers in the 'requestData' parameter, but because the underlying code does not use the specific Restlet methods for adding them, the Restlet framework discards them.

As an example, based on the guide, you might try to write code like this (the URL here is illustrative):
var response = httpClient.post("https://example.com/users/" + username, "", { cookies:[ { "domain": "", "field": "iPlanetDirectoryPro", "value": "E8cDkvlad83kd....KDodkIEIx*DLEDLK...JKD09d" } ], headers:[ { "field": "Content-Type", "value": "application/json" } ] });

If you do, then you'll see that the Content-Type header is not sent because it is considered standard.  However, if the header was a custom header then it would be passed to the destination.

The other thing you might notice in the logs is that the methods indicated by the developer guide are now marked as deprecated.  They still work, after a fashion, but an alternative method, 'send', is recommended.  Unfortunately, the guides don't describe this method... hence this blog post!

The real solution

So the real solution is to use the new 'send' method of the httpClient object.  This accepts one parameter, 'request', which is of the type org.forgerock.http.protocol.Request.

So within our script we should define a Request object and set the appropriate parameters before passing it as a parameter to the httpClient.send method.

Great, easy right? Well, err, that depends...

As I was using a copy of the default policy authorization script, this was defined as Javascript, so I needed to import the necessary class using Javascript.  And, as I discovered, I was using Java 8, which changed the scripting engine from Rhino to Nashorn; Nashorn recommends the 'JavaImporter' mechanism for importing packages.

So, with this at the top of my script:
    var fr = new JavaImporter(org.forgerock.http.protocol);

I can now instantiate a new Request object like this:
    with (fr) {
      var request = new fr.Request();
Note the use of the 'with' block.

Now I can set the necessary properties of the request object so that I can call the Twilio API.  This API requires specific HTTP headers, a POST method, and url-encoded body contents that describe the various details of the message Twilio will issue.  All of this needs to be done within the 'with' block highlighted above:
    request.method = 'POST';
    request.getHeaders().add('Content-Type', 'application/x-www-form-urlencoded');
    request.getHeaders().add('Authorization', 'Basic abcde12345');
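The 'Basic abcde12345' value above is a placeholder.  In case it's useful, here's a small sketch in plain (Node-style) JavaScript of how an HTTP Basic Authorization value and a url-encoded form body can be produced — the account SID, token, and field names are illustrative, not real Twilio credentials:

```javascript
// Basic auth is 'Basic ' + base64("username:password").
// For Twilio the username/password would be the account SID and auth token.
function basicAuthHeader(user, pass) {
  return 'Basic ' + Buffer.from(user + ':' + pass).toString('base64');
}

// application/x-www-form-urlencoded: percent-encode each key and value,
// join pairs with '&'.
function formEncode(fields) {
  return Object.keys(fields)
    .map(function (k) { return encodeURIComponent(k) + '=' + encodeURIComponent(fields[k]); })
    .join('&');
}

console.log(basicAuthHeader('ACCOUNT_SID', 'AUTH_TOKEN'));
console.log(formEncode({ To: '+447700900000', Body: 'Policy alert' }));
```

The second function produces the kind of body string the Twilio message endpoint expects in the POST entity.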

Ok, so now I can send my request parameter?
Well, yes, but I also need to handle the response which, for the 'send' method, is a Promise (defined as org.forgerock.util.promise.Promise) of a Response object (defined as org.forgerock.http.protocol.Response).

So this is another package I need to import into my Javascript file in order to access the Promise class.  JavaImporter takes multiple parameters to make them all available to the assigned variable so you can use them all within the same 'with' block.  Therefore my import line now looks like:
 var fr = new JavaImporter(org.forgerock.http.protocol, org.forgerock.util.promise)

And, within the 'with' block, I now include:
    var promise = httpClient.send(request);
    var response = promise.get();

Which will execute the desired external call and return the response into the response variable.

So now I can use the Response properties and methods to check the call was successful, e.g. by examining response.getStatus() and response.getCause().

So my script for calling an external service looks something like this:
var fr = new JavaImporter(org.forgerock.http.protocol, org.forgerock.util.promise);
function postSMS() {
  with (fr) {
    var request = new Request();
    request.method = 'POST';
    // set request.uri to the Twilio messages endpoint and the request entity
    // to the url-encoded form body (omitted here)
    request.getHeaders().add('Content-Type', 'application/x-www-form-urlencoded');
    request.getHeaders().add('Authorization', 'Basic abcde12345');
    var promise = httpClient.send(request);
    var response = promise.get();
    logger.message("Twilio Call. Status: " + response.getStatus() + ", Cause: " + response.getCause());
  }
}

Great, so we're done now?

Well, almost!

The various classes defined by the imported packages are not all 'whitelisted'. This is an OpenAM feature that controls which classes can be used within scripts. We therefore need to add the necessary classes to the 'whitelist' for Policy scripts, which can be found in the Global Services page of the admin console. As you run the script, the log files will produce an error if a necessary class is not whitelisted. You can use this approach to see what is required, then add the reported class to the list.

And, now we're done... happy RESTing!

Wednesday, 27 July 2016

Fun with OpenAM13 Authz Policies over REST - the ‘jwt’ parameter of the ‘Subject’


I've previously blogged about the 'claims' and 'ssoToken' parameters of the 'subject' item used in the REST call to evaluate a policy for a resource.
Now we're going to look at the 'jwt' parameter.  

For reference, the REST call we'll be using is documented in the developer guide.

The 'JWT' Parameter

The documentation describes the 'jwt' parameter as:
The value is a JWT string
What does that mean?
Firstly, it's worth understanding the JWT specification: RFC7519
To summarise, a JWT is a URL-safe encoded, signed (and possibly encrypted) representation of a 'JWT Claims Set'. The JWT specification defines the 'JWT Claims Set' as:
A JSON object that contains the claims conveyed by the JWT.

Where 'claims' are name/value pairs about the 'subject' of the JWT.  Typically a 'subject' might be an identity representing a person, and the 'claims' might be attributes about that person, such as their name, email address, and phone number.

So a JWT is a generic way of representing a subject's claims.
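To make that concrete, here's a minimal (Node-style, recent Node) JavaScript sketch that decodes — without verifying — the claims set of a compact JWT; the sample claims are invented for illustration:

```javascript
// A compact JWT is three base64url segments: header.payload.signature.
// Decoding the middle segment yields the JWT Claims Set as JSON.
function decodeJwtClaims(jwt) {
  var payload = jwt.split('.')[1];
  return JSON.parse(Buffer.from(payload, 'base64url').toString('utf8'));
}

// Build a throwaway unsigned JWT to demonstrate:
var header = Buffer.from(JSON.stringify({ typ: 'JWT', alg: 'none' })).toString('base64url');
var claims = Buffer.from(JSON.stringify({ sub: 'bob', iss: 'openam' })).toString('base64url');
var jwt = header + '.' + claims + '.';

console.log(decodeJwtClaims(jwt).sub); // 'bob'
```

Note that decoding is not verification: anyone can read (or forge) these claims unless the signature is checked.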

OpenID Connect (OIDC)

OIDC makes use of the JWT specification by stating that the id_token must be a JWT.  It also defines a set of claims that must be present within the JWT when generated by an OpenID Provider.

The specification also says that additional claims may be present in the token.  Just hang on to that thought for the moment...we'll come back to it.

OpenAM OIDC configuration

For the purposes of investigating the 'jwt' parameter, let's configure OpenAM to generate OIDC id_tokens.  I'm not going to cover that here, but we'll assume you've followed the wizard to setup up an OIDC provider for the realm.  We'll also assume you've created/updated the OAuth2/OIDC Client Agent profile to allow the 'profile' and 'openid' scopes.  I'm also going to use an 'invoices' scope so the config must allow me to request that too.

Now I can issue:
curl --request POST --user "apiclient:password" --data "grant_type=password&username=bob&password=password&scope=invoices openid profile"

Note the request for the openid and profile scopes in order to ensure I get the OpenID Connect response.

And I should get something similar to the following:
  "scope":"invoices openid profile",

Note the lengthy id_token field.  This is the OIDC JWT, made up according to the specification.  Also note that, by default, OpenAM will sign this JWT with the 1024-bit 'test' certificate using the RS256 algorithm.  I've updated my instance to use a new 2048-bit certificate called 'test1', so my response will be longer than the default.  I've used a 2048-bit certificate because I want to use an online JWS tool to inspect the JWT and its signature, and that tool only seems to support 2048-bit certificates, which is probably due to the JWS specification.  (I could have used another decoder to inspect the JWT, but it does not support verification of RSA-based signatures.)

So, in the JWT tool linked above you can paste the full value of the id_token field into 'Step 3', then click the 'Just Decode JWT' button.  You should see the decoded JWT claims in the 'Payload' box:

You can also see that the header field shows how the signature was generated in order to allow clients to verify this signature.
In order to get this tool to verify the signature, you need to get the PEM formatted version of the public key of the signing certificate.  i.e. 'test1' in my case.
I've got this from the KeyStoreExplorer tool, and now I can paste it into the 'Step 4' box, using the 'X.509 certificate for RSA' option.  Now I can click 'Verify It':

The tool tells me the signature is valid, and also decodes the token as before.  If I were to change the content of the message, or the signature, of the JWT, then the tool would tell me that the signature is not valid. For example, changing one character of the message would return this:

Note that the message box says that the signature is *Invalid*, as well as the Payload now being incorrect.

The 'jwt' Parameter 

So now we've understood that the id_token field of the OIDC response is a JWT, we can use this as the 'jwt' parameter of the 'subject' field in the policy evaluation call.

For example, a call like this:
curl --request POST --header "iPlanetDirectoryPro: AQIC5wM2LY4Sfcx-sATGr4BojcF5viQOrP-1IeLDz2Un8VM.*AAJTSQACMDEAAlNLABQtMjUwMzE4OTQxMDA1NDk1MTAyNwACUzEAAA..*" --header "Content-Type: application/json" --data '{"resources":["invoices"],"application":"api","subject":{"jwt":"eyAidHlwIjogIkpXVCIsICJraWQiOiAidWFtVkdtRktmejFaVGliVjU2dXlsT2dMOVEwPSIsICJhbGciOiAiUlMyNTYiIH0.eyAiYXRfaGFzaCI6ICJVbmFHMk0ydU5kS1JZMk5UOGlqcFRRIiwgInN1YiI6ICJib2IiLCAiaXNzIjogImh0dHA6Ly9hcy51bWEuY29tOjgwODAvb3BlbmFtL29hdXRoMi9TY29wZUF6IiwgInRva2VuTmFtZSI6ICJpZF90b2tlbiIsICJhdWQiOiBbICJhcGljbGllbnQiIF0sICJvcmcuZm9yZ2Vyb2NrLm9wZW5pZGNvbm5lY3Qub3BzIjogIjhjOWNhNTU3LTk0OTgtNGU2Yy04ZjZmLWY2ZjYwZjNlOWM4NyIsICJhenAiOiAiYXBpY2xpZW50IiwgImF1dGhfdGltZSI6IDE0NjkwMjc1MTMsICJyZWFsbSI6ICIvU2NvcGVBeiIsICJleHAiOiAxNDY5MDMxMTEzLCAidG9rZW5UeXBlIjogIkpXVFRva2VuIiwgImlhdCI6IDE0NjkwMjc1MTMgfQ.MS6jnMoeQ19y1DQky4UdD3Mqp28T0JYigNQ0d0tdm04HjicQb4ha818qdaErSxuKyXODaTmtqkGbBnELyrckkl7m2aJki9akbJ5vXVox44eaRMmQjdm4EcC9vmdNZSVORKi1gK6uNGscarBBmFOjvJWBBBPhdeOPKApV0lDIzX7xP8JoAtxCr8cnNAngmle6MyTnVQvhFGWIFjmEyumD6Bsh3TZz8Fjkw6xqOyYSwfCaOrG8BxsH4BQTCp9FgsEjI52dZd7J0otKLIk0EVmZIkI4-hgRIcrM1Rfiz9LMHvjAWY97JBMcGBciS8fLHjWWiLDqMHEE0Wn5haYkMSsHYg"}}'

might return:

This assumes the following policy definition:

Note that in this case I am using the 'iss' claim within the token in order to ensure I trust the issuer of the token when evaluating the policy condition.

As mentioned in previous articles, it is imperative that the id_token claims include a 'sub' field.  Fortunately, the OIDC specification makes this mandatory, so using an OIDC token here will work just fine.

It's also worth noting that OpenAM does *not* verify the signature of the id_token submitted in 'jwt' field.  This means that you could shorten the 'curl' call above to remove the signature component of the 'jwt'. For example, this works just the same as above:
curl --request POST --header "iPlanetDirectoryPro: AQIC5wM2LY4Sfcx-sATGr4BojcF5viQOrP-1IeLDz2Un8VM.*AAJTSQACMDEAAlNLABQtMjUwMzE4OTQxMDA1NDk1MTAyNwACUzEAAA..*" --header "Content-Type: application/json" --data '{"resources":["invoices"],"application":"api","subject":{"jwt":"eyAidHlwIjogIkpXVCIsICJraWQiOiAidWFtVkdtRktmejFaVGliVjU2dXlsT2dMOVEwPSIsICJhbGciOiAiUlMyNTYiIH0.eyAiYXRfaGFzaCI6ICJVbmFHMk0ydU5kS1JZMk5UOGlqcFRRIiwgInN1YiI6ICJib2IiLCAiaXNzIjogImh0dHA6Ly9hcy51bWEuY29tOjgwODAvb3BlbmFtL29hdXRoMi9TY29wZUF6IiwgInRva2VuTmFtZSI6ICJpZF90b2tlbiIsICJhdWQiOiBbICJhcGljbGllbnQiIF0sICJvcmcuZm9yZ2Vyb2NrLm9wZW5pZGNvbm5lY3Qub3BzIjogIjhjOWNhNTU3LTk0OTgtNGU2Yy04ZjZmLWY2ZjYwZjNlOWM4NyIsICJhenAiOiAiYXBpY2xpZW50IiwgImF1dGhfdGltZSI6IDE0NjkwMjc1MTMsICJyZWFsbSI6ICIvU2NvcGVBeiIsICJleHAiOiAxNDY5MDMxMTEzLCAidG9rZW5UeXBlIjogIkpXVFRva2VuIiwgImlhdCI6IDE0NjkwMjc1MTMgfQ."}}'

Note that the 'jwt' string needs to have two dots '.' in it to conform to the JWT specification.  The content following the second dot is the signature, which has been removed entirely in this second curl example, i.e. this is an unsigned JWT, which is completely valid.
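In (Node-style) JavaScript, stripping the signature while keeping both required dots is trivial — a quick sketch:

```javascript
// Keep 'header.payload.' — an unsigned JWT still has two dots,
// but an empty signature segment.
function stripSignature(jwt) {
  var parts = jwt.split('.');
  return parts[0] + '.' + parts[1] + '.';
}

console.log(stripSignature('aaa.bbb.ccc')); // 'aaa.bbb.'
```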

But, just to prove that OpenAM does *not* validate signed JWTs, you could attempt a curl call that includes garbage for the signature.  For example:
curl --request POST --header "iPlanetDirectoryPro: AQIC5wM2LY4Sfcx-sATGr4BojcF5viQOrP-1IeLDz2Un8VM.*AAJTSQACMDEAAlNLABQtMjUwMzE4OTQxMDA1NDk1MTAyNwACUzEAAA..*" --header "Content-Type: application/json" --data '{"resources":["invoices"],"application":"api","subject":{"jwt":"eyAidHlwIjogIkpXVCIsICJraWQiOiAidWFtVkdtRktmejFaVGliVjU2dXlsT2dMOVEwPSIsICJhbGciOiAiUlMyNTYiIH0.eyAiYXRfaGFzaCI6ICJVbmFHMk0ydU5kS1JZMk5UOGlqcFRRIiwgInN1YiI6ICJib2IiLCAiaXNzIjogImh0dHA6Ly9hcy51bWEuY29tOjgwODAvb3BlbmFtL29hdXRoMi9TY29wZUF6IiwgInRva2VuTmFtZSI6ICJpZF90b2tlbiIsICJhdWQiOiBbICJhcGljbGllbnQiIF0sICJvcmcuZm9yZ2Vyb2NrLm9wZW5pZGNvbm5lY3Qub3BzIjogIjhjOWNhNTU3LTk0OTgtNGU2Yy04ZjZmLWY2ZjYwZjNlOWM4NyIsICJhenAiOiAiYXBpY2xpZW50IiwgImF1dGhfdGltZSI6IDE0NjkwMjc1MTMsICJyZWFsbSI6ICIvU2NvcGVBeiIsICJleHAiOiAxNDY5MDMxMTEzLCAidG9rZW5UeXBlIjogIkpXVFRva2VuIiwgImlhdCI6IDE0NjkwMjc1MTMgfQ.garbage!!"}}'
...would still successfully be authorised.

It's also worth noting that the id_token claims of an OIDC token include an 'exp' field signifying the expiry time of the id_token.  OpenAM does not evaluate this field in this call.
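Since OpenAM won't check it for you in this call, the caller (or a policy script) can check 'exp' itself.  Per RFC 7519, 'exp' is a NumericDate in seconds since the Unix epoch — a minimal sketch:

```javascript
// 'exp' is seconds (not milliseconds) since the Unix epoch.
// e.g. the decoded token above carries "exp": 1469031113.
function isExpired(claims, nowSeconds) {
  return typeof claims.exp === 'number' && claims.exp < nowSeconds;
}

console.log(isExpired({ exp: 1469031113 }, Math.floor(Date.now() / 1000))); // true
```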

Signature Verification

You might be wondering if it is possible to verify the signature and other aspects, such as the 'exp' field.  Yes, it is!  With a little bit of clever scripting - of course!

The first thing is that we need to ensure that the JWT token can be parsed by a script.  Unfortunately, simply passing it in the jwt parameter does not permit this.  But, we can *also* pass the JWT token in the 'environment' field of the policy decision request.  I'll shorten the JWT tokens in the following curl command to make it easier to read, but you should supply the full signed JWT in the 'environment' field:
curl --request POST --header "iPlanetDirectoryPro: AQIC....*" --header "Content-Type: application/json" --data '{"resources":["invoices"],"application":"api","subject":{"jwt":"eyAidHlw...MyNTYiIH0.eyAiYXRfa...MTMgfQ.MS6jn...sHYg"},"environment":{"jwt":["eyAidHlw...MyNTYiIH0.eyAiYXRfa...MTMgfQ.MS6jn...sHYg"]}}'

Note in this that the 'environment' field now includes a 'jwt' field whose data can be utilised in a script.  And what would such a policy condition script look like?
Well, head over to the script repository and take a look at the 'ExternalJWTVerifier.groovy' script.  The associated blog post from my colleague, Simon Moffatt, sets this script in context.  It will validate either an HMAC-signed JWT - if you enter the appropriate shared secret - or an RS256-signed OIDC JWT - if you specify the jwks_uri for the OpenID Connect Provider.
And, now that you have claims accessible to the scripting engine you can pretty much apply any form of logic to them to validate the token - including validating the 'exp' field.

Tuesday, 19 July 2016

Fun with OpenAM13 Authz Policies over REST - the ‘ssoToken’ parameter of the ‘Subject’

I recently blogged about using the 'claims' parameter of the subject item in a REST call for policy evaluation in OpenAM 13.  In that article I blithely stated that using the 'ssoToken' parameter was fairly obvious.  However, I thought I'd take the time to explore this in a little more detail to ensure my understanding is complete.  This is partly because I started thinking about OIDC JWT tokens, and the fact that OpenAM stateless sessions (nothing to do with OIDC) also use JWT tokens.

Let's first ensure we understand stateful and stateless sessions.
(It's documented here, in the Admin guide:!/docs/openam/13.5/admin-guide#chap-session-state)

Stateful sessions are your typical OpenAM session.  When a user successfully authenticates with OpenAM they will establish a session.  A stateful session means that all the details about that session are held by the OpenAM server-side services.  By default, this is 'in-memory', but state can be persisted to an OpenDJ instance in order to support high availability and scalability across geographically dispersed datacentres.  The client of the authentication request receives a session identifier, typically stored by a web application as a session cookie, that is passed back to the OpenAM servers so that the session details can be retrieved.  It's called 'stateful' because the server needs to maintain the state of the session.
A session identifier for a stateful session might look something like this:
Basically, it's just a unique key to the session state.

Stateless sessions are new in OpenAM 13.  These alleviate the need for servers to maintain and store state, which avoids the need to replicate persisted state across multiple datacentres.  Of course, there is still session 'state'; it's just no longer stored on the server.  Instead, all state information is packaged up into a JWT and passed to the client to maintain.  Now, on each request, the client can send the complete session information back to an OpenAM server in order for it to be processed.  OpenAM does not need to perform a lookup of the session information from the stateful repository because all the information is right there in the JWT.  This means that, for a realm configured to operate with stateless sessions, the client will receive a much bigger token on successful authentication.
Therefore, a stateless session token might look something like:

Obviously, this is much larger and looks more complex.  This token is essentially made up of two parts:
1. a fake stateful session identifier
2. a JWT
OpenAM always prepends a fake stateful session identifier to this JWT for backwards compatibility. So, the actual JWT starts *after* the second asterisk (*).  i.e. from the bit that begins eyAidH... right through to the end.
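A quick (Node-style) JavaScript sketch of pulling the JWT out of a stateless session token, assuming — as described above — that the 'fake' identifier always contains exactly two asterisks before the JWT begins:

```javascript
// The stateless token is '<fake stateful id>*<...>*<JWT>';
// the JWT is everything after the second '*'.
function jwtFromStatelessToken(token) {
  var second = token.indexOf('*', token.indexOf('*') + 1);
  return token.substring(second + 1);
}

console.log(jwtFromStatelessToken('AQIC...*AAJT...*eyAidHlw')); // 'eyAidHlw'
```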

You can use standard online JWT tools to unpack and read this JWT.
e.g, for the JWT above, you can see the payload data which is how OpenAM represents the session information:

Now, turning our attention to the policy evaluation REST calls we see that there is an option to use 'ssoToken' as a parameter to the 'subject' item.

In a realm that uses the default 'stateful' sessions then any policy evaluation REST call that uses the 'ssoToken' parameter should use a stateful session identifier.  The policy will then have full access to the session information as well the profile data of the user identified by the session.

A stateless realm works exactly the same way.  You now need to provide the *full* stateless token (including the 'fake' stateful identifier with the JWT component) and the policy will have access to the state information from the JWT, as well as information about the user from the datastore (such as group membership).

For example:
curl --request POST --header "iPlanetDirectoryPro: AQIC5wM2LY4SfcxxJaG7LFOia1TVHZuJ4_OVm9lq5Ih5uXA.*AAJTSQACMDEAAlNLABQtMjU4MDgxNTIwMzk1NzA5NDg0MwACUzEAAA..*" --header "Content-Type: application/json" --data '{"resources":["orders"],"application":"api","subject":{"ssoToken":"AQIC5wM2LY4SfcyRBqm_r02CEJ5luC4k9A6HPqDitS9T5-0.*AAJTSQACMDEAAlNLABQtNTc4MzI5MTk2NjQzMjUxOTc2MAACUzEAAA..*eyAidHlwIjogIkpXVCIsICJhbGciOiAiSFMyNTYiIH0.eyAic2VyaWFsaXplZF9zZXNzaW9uIjogIntcInNlY3JldFwiOlwiN2RiODdhMjQtMjk5Ni00YzkxLTkyNTUtOGIwNzdmZDEyYmFkXCIsXCJleHBpcnlUaW1lXCI6MTQ2ODkzNTgyODUyNSxcImxhc3RBY3Rpdml0eVRpbWVcIjoxNDY4OTI4NjI4NTI1LFwic3RhdGVcIjpcInZhbGlkXCIsXCJwcm9wZXJ0aWVzXCI6e1wiQ2hhclNldFwiOlwiVVRGLThcIixcIlVzZXJJZFwiOlwiYm9iXCIsXCJGdWxsTG9naW5VUkxcIjpcIi9vcGVuYW0vVUkvTG9naW4_cmVhbG09U2NvcGVBelwiLFwic3VjY2Vzc1VSTFwiOlwiL29wZW5hbS9jb25zb2xlXCIsXCJjb29raWVTdXBwb3J0XCI6XCJ0cnVlXCIsXCJBdXRoTGV2ZWxcIjpcIjVcIixcIlNlc3Npb25IYW5kbGVcIjpcInNoYW5kbGU6QVFJQzV3TTJMWTRTZmN3Y3YzMFFJTGF0Z3E3d3NJMWM4RThqRmZkTDMzTlZVQjAuKkFBSlRTUUFDTURFQUFsTkxBQk15TVRNME9USTRPVFk0TmpBNE1qSTFNelF3QUFKVE1RQUEqXCIsXCJVc2VyVG9rZW5cIjpcImJvYlwiLFwibG9naW5VUkxcIjpcIi9vcGVuYW0vVUkvTG9naW5cIixcIlByaW5jaXBhbHNcIjpcImJvYlwiLFwiU2VydmljZVwiOlwibGRhcFNlcnZpY2VcIixcInN1bi5hbS5Vbml2ZXJzYWxJZGVudGlmaWVyXCI6XCJpZD1ib2Isb3U9dXNlcixvPXNjb3BlYXosb3U9c2VydmljZXMsZGM9b3BlbmFtLGRjPWZvcmdlcm9jayxkYz1vcmdcIixcImFtbGJjb29raWVcIjpcIjAxXCIsXCJPcmdhbml6YXRpb25cIjpcIm89c2NvcGVheixvdT1zZXJ2aWNlcyxkYz1vcGVuYW0sZGM9Zm9yZ2Vyb2NrLGRjPW9yZ1wiLFwiTG9jYWxlXCI6XCJlbl9VU1wiLFwiSG9zdE5hbWVcIjpcIjEyNy4wLjAuMVwiLFwiQXV0aFR5cGVcIjpcIkRhdGFTdG9yZVwiLFwiSG9zdFwiOlwiMTI3LjAuMC4xXCIsXCJVc2VyUHJvZmlsZVwiOlwiQ3JlYXRlXCIsXCJBTUN0eElkXCI6XCI2MzE2MDI4YjcyYWU5MWMyMDFcIixcImNsaWVudFR5cGVcIjpcImdlbmVyaWNIVE1MXCIsXCJhdXRoSW5zdGFudFwiOlwiMjAxNi0wNy0xOVQxMTo0Mzo0OFpcIixcIlByaW5jaXBhbFwiOlwiaWQ9Ym9iLG91PXVzZXIsbz1zY29wZWF6LG91PXNlcnZpY2VzLGRjPW9wZW5hbSxkYz1mb3JnZXJvY2ssZGM9b3JnXCJ9LFwiY2xpZW50SURcIjpcImlkPWJvYixvdT11c2VyLG89c2NvcGVheixvdT1zZXJ2aWNlcyxkYz1vcGVuYW0sZGM9Zm9yZ2Vyb2
NrLGRjPW9yZ1wiLFwic2Vzc2lvbklEXCI6bnVsbCxcImNsaWVudERvbWFpblwiOlwibz1zY29wZWF6LG91PXNlcnZpY2VzLGRjPW9wZW5hbSxkYz1mb3JnZXJvY2ssZGM9b3JnXCIsXCJzZXNzaW9uVHlwZVwiOlwidXNlclwiLFwibWF4SWRsZVwiOjMwLFwibWF4Q2FjaGluZ1wiOjMsXCJuZXZlckV4cGlyaW5nXCI6ZmFsc2UsXCJtYXhUaW1lXCI6MTIwfSIgfQ.Dnjk-9MgANmhX4jOez12HcYAW9skck-HFuTPnzEmIq8"}}'

Might return:

Assuming the policy looks something like this:

...and, in this specific case, that the authentication level for the 'subject' of the ssoToken is set to two or greater, as well as the 'subject' being a member of the 'api_order' group in the datastore.

Next up, we'll look at using OIDC tokens in the subject parameter of the REST call.

Wednesday, 13 July 2016

Fun with OpenAM13 Authz Policies over REST - the ‘Claims’ parameter of the ‘Subject’

Consider this API for a moment:!/docs/openam/13/dev-guide#rest-api-authz-policy-decision-concrete

In particular consider the ‘subject’ field to be passed to the endpoint.

Using the ssoToken as the ‘subject’ is fairly easy to understand. You have full access to the subject’s datastore properties in a policy condition, including things like group membership.
But what do the other ‘subject’ options mean?
The documentation isn’t particularly expansive on how to use the ‘jwt’ and ‘claims’ subject parameters that are available in the REST interface.

This article will focus on the ‘claims’ parameter.

Firstly, it’s worth noting that these options are *not* mutually exclusive. You can specify multiple subject parameters and they will be combined. We’ll come back to this point later in this article.

Let’s look at the ‘claims’ parameter in detail.
This needs to be specified as an object map. e.g.

The critical thing that the docs omit is that you must include a ‘sub’ key. Without this you will receive an ‘Invalid value subject’ error.
The value of the ‘sub’ key seems largely irrelevant in that it doesn’t automatically hydrate this value into a corresponding subject in the datastore. It is, however, a valid ‘claim’ that can be used in the policy. For example:

Of course, the ‘sub’ claim value could be empty.  All that’s important is that it is included as a key.  
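The rule behind the 'Invalid value subject' error can be stated as a one-line check — a sketch of the validation OpenAM appears to apply (inferred behaviour, not taken from the OpenAM source):

```javascript
// OpenAM rejects a 'claims' subject without a 'sub' key;
// the value itself may be empty.
function hasSubKey(claims) {
  return Object.prototype.hasOwnProperty.call(claims, 'sub');
}

console.log(hasSubKey({ sub: '', iss: '' })); // true
console.log(hasSubKey({ iss: 'example' }));   // false
```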

So, with a policy defined like this:

You could specify a policy evaluation request like this:
curl --request POST --header "iPlanetDirectoryPro: AQIC5…*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"claims":{"sub":"","iss":""}}}'

And you would receive a successful policy evaluation response.

Note that the iPlanetDirectoryPro header must be an ssoToken of a user with permissions to execute this REST API call.  In all these samples I’m using the ssoToken of amadmin here.

I might want to augment this policy so that only members of a given datastore group would be considered.  I might think this is as easy as ensuring I include the correct 'sub' value of a user in the datastore that has a group membership, and updating the policy 'subjects' tab to include the group membership.  For example:

Then issue (note the ‘demo’ value for the ‘sub’ key):
curl --request POST --header "iPlanetDirectoryPro: AQIC5…*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"claims":{"sub":"demo","iss":""}}}'

But, as stated before, the ‘sub’ value is not hydrated to a datastore subject so the policy cannot know about the group memberships.

To prove this, and also to show that multiple ‘subjects’ are combined by the policy evaluation we can get an ssoToken for the ‘demo’ user using the ‘authenticate’ api in OpenAM.  Then issue something like:
curl --request POST --header "iPlanetDirectoryPro: AQIC5..*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"ssoToken":"AQIC5..*","claims":{"sub":"demo","iss":""}}}'

And you will get a successful response.

Note that I have two separate ssoTokens here.  One is the iPlanetDirectoryPro token which corresponds to a user with permissions to execute the REST API call (I’m using amadmin here).  The other, used in the ssoToken parameter, is that of the ‘demo’ user.

If I was to execute the same call, but remove the ‘claims’ parameter, e.g.
curl --request POST --header "iPlanetDirectoryPro: AQIC5..*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"ssoToken":"AQIC5..*"}}'
then the policy would return false.

This proves that both the ‘ssoToken’ and ‘claims’ parameters are being considered together as part of the policy evaluation of the ‘subjects’ tab.  i.e. the group membership is being evaluated based on the ssoToken (which is a representation of the user in the datastore) as well as the ‘iss’ claim value passed by the REST API call.

So can I use the ‘claims’ parameter *only* (i.e. without an ssoToken) in order to evaluate a policy based on group membership where the ‘sub’ claim is assumed to be a user in the datastore?

Well, yes!
Let me return my ‘subjects’ tab to the original:

Now let’s consider the ‘environments’ tab.

Maybe we could try an LDAP Filter condition?  Well, not exactly.  The LDAP Filter condition implicitly adds the uid of the user in the datastore to the query by relying on the contents of the 'subject' ssoToken.  And, as we now know, the 'sub' claim is not rehydrated to a datastore subject/ssoToken, so the implicit addition of the uid means the query fails to return a result.

But, what if we look at the ‘Scripted Authorisation Condition’?
One thing we do have available in a scripted condition is a ‘username’ property.  Now this *is* populated from the ‘sub’ claim.
(This also makes it very clear that the ‘identity’ object, which is a reference to the subject in the datastore, is *only* available if the ssoToken is used).

So, with the ‘username’ available perhaps we can write a script that connects to a directory (the datastore, for arguments sake) and reads properties, such as group membership for that subject?
It turns out that, yes, we can!

Head to the ‘Scripts’ page and create a ‘Policy Condition’ script like this:

The script would be something like this:

import org.forgerock.openam.ldap.LDAPUtils
import org.forgerock.util.Options
import org.forgerock.opendj.ldap.Connection
import org.forgerock.opendj.ldap.LDAPConnectionFactory
import org.forgerock.opendj.ldap.SearchScope

//default to false
authorized = false

//log the username, which will be the 'subject' as defined by the claims.sub param
logger.message('username: ' + username)

//set up the ldap server connection params (placeholders - change for your environment)
ldapServer = 'localhost'
ldapPort = 389
ldapBindUser = 'cn=Directory Manager'
ldapBindPassword = 'password'

//connect to ldap server
LDAPUtils ldapUtils = new LDAPUtils()
Options options = new Options()
LDAPConnectionFactory factory = ldapUtils.createFailoverConnectionFactory(ldapServer, ldapPort, ldapBindUser, ldapBindPassword, options.defaultOptions())
Connection connection = factory.getConnection()

//define the filter
groupFilter = '(isMemberOf=cn=api_customer,ou=groups,dc=openam,dc=forgerock,dc=org)'
userFilter = '(uid=' + username + ')'
filter = '(&' + userFilter + groupFilter + ')'
baseDN = 'ou=people,dc=openam,dc=forgerock,dc=org'

//issue the search; a match means the user is in the group, so permit
try {
  result = connection.searchSingleEntry(baseDN, SearchScope.SUBORDINATES, filter, 'uid')
  logger.message('result: ' + result.getAttribute('uid').toString())
  authorized = true
} catch (any) {
  logger.message('Assume filter returned no results, so the user is not a member of the group')
} finally {
  connection.close()
}
You will want to ensure that the groupFilter variable matches your environment.
You may also need to change the connection details.
You may also want to improve the error handling, or use a different method for querying the ldap.

Note the ‘imports’ lines at the top of the script.  Some of these classes are not on the script module ‘whitelist’ for Policy Condition scripts.  In order for the script to run, they need to be added to the whitelist.
So, head to Configuration -> Global -> Scripting -> Policy condition -> Engine Configuration
and ensure the imported class names are permitted by the whitelist patterns.

Now, head back to the ‘Environments’ tab of the Policy and configure it like this:

Now let’s assume you have two users in your datastore: Alice and Bob.
Let’s also assume you have a group called ‘api_customer’.
Now Bob is made a member of the group, but Alice is not.

So, issuing:
curl --request POST --header "iPlanetDirectoryPro: AQIC5..*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"claims":{"sub":"alice","iss":""}}}'
will return failure, i.e.:

curl --request POST --header "iPlanetDirectoryPro: AQIC5..*" --header "Content-Type: application/json" --data '{"resources":["customers"],"application":"api","subject":{"claims":{"sub":"bob","iss":""}}}'
will return success, i.e.:

Note that in my policy, I have an action of ‘permit’ defined, and also Static Response attributes of ‘hello:world’.

Don’t try to use ‘Subject Attributes’ because these depend on the SSOToken referencing the datastore identity.  You could, however, update the script to return attributes returned from the call to the directory if you wanted to emulate this functionality.  See:!/docs/openam/13/dev-guide#scripting-api-authz-response

Friday, 1 July 2016

Custom Stages to User Self-Service

Commons Project

One of the great features of the OpenAMv13 and OpenIDMv4 releases is actually not a feature of those products at all.  I'm talking about the Self-Service capability, which is actually part of the 'COMMONS' project at ForgeRock.  As the name suggests, the functionality in 'Commons' may be found in more than one of the final products.  Commons includes capabilities such as audit logging, the CREST framework, the UI framework, as well as the User Self-Service functionality.

Self-Service Configuration

Now there is lots of good documentation about how to make use of the User Self-Service functionality as exposed through the OpenAM and OpenIDM products.
For example, for OpenAM:!/docs/openam/13/admin-guide#chap-usr-selfservices, and for OpenIDM:!/docs/openidm/4/integrators-guide#ui-configuring.
Whilst the end-user functionality is the same, the way you configure the functionality in OpenAM and OpenIDM is slightly different.  This is the OpenIDM configuration view:

One thing you might notice from the documentation is that the User Self-Service flows (Registration, Forgotten Password, and Forgotten Username) have the ability to use custom 'plugins' - sometimes called 'stages'.  However, another thing you might notice is a distinct lack of documentation on how to go about creating such a plugin/stage and adding it to your configuration.

Note that there is an outstanding JIRA logged for ForgeRock to provide documentation but, in the meantime, this post attempts to serve as an interim guide.  But, I'm only going to use OpenIDM as the target here, not OpenAM.

Fortunately, the JIRA mentioned above highlights that within the code base there already exists the code for a custom module, and a sample configuration.  So, in this post I'll explain the steps required to build, deploy, and configure that pre-existing sample.

The easiest way to do this is to get the code of the Standalone Self-Service application!

Get the Standalone Self-Service code

(*** EDIT : the code referenced below is now available in the forgerock-selfservice-public repo of the LEGACYCOMMONS project: ***)

You'll need to head over to the 'self-service' repository in the 'COMMONS' project of the ForgeRock Stash server:
(You may need to register, but go ahead, it's free!)
If you're following the instructions in this post, and are targeting OpenIDMv4  (as opposed to any later releases) then you'll specifically want v1.0.3 of this SelfService repository.

 Now download the code to your local drive so we can build it.

Compile and run it

You can see that the 'readme.txt' provides the details of how to compile this project.  Note that this will compile the full 'Commons Self-Service' functionality, including the custom stage, in a standalone harness.

Once it's built you can browse to the web site and play with the functionality.  Any registered users are held in the memory of this harness, and are therefore flushed each time you stop the Jetty container.

It's also worth noting that the readme.txt instructs you to enter an email username and password.  These are used to connect to the configured email service of this test harness in order to actually send registration emails.  (The implementations in OpenAM and OpenIDM will use the configured email services of those products.)  By default, the configured email service is Gmail.  And, by default, Gmail blocks this type of activity unless you change your Gmail account settings.  However, you may instead choose to run a dummy SMTP service to capture the sent emails.  One such utility that I'll use here is FakeSMTP:
So, once you have an accessible SMTP service, you might now need to change the email service config of the User Self-Service functionality.  You'll find this for the test harness - assuming the mvn build has worked successfully - here:


If you're running FakeSMTP on port 2525, then this might look like:
  "mailserver": {
    "host": "localhost",
    "port": "2525"
  }

Now when you run the example webapp, emails will be delivered to FakeSMTP (and it will ignore whatever you use for username and password).

So, go ahead, register a user. The first thing you should see is a "Math Problem" stage.  Eh? What? Where did that come from? Well, that's the custom stage!!  Yes, this standalone version of Self-Service includes the custom stage!

Step 1. Math Problem
Assuming you can add 12 and 4 together, complete the 'puzzle'.  Then follow the remaining steps of the registration (noting that the email gets delivered to FakeSMTP, where you can open it and click the link to complete the registration).
Step 2. Email Verification
Email link
Step 3. Register details
Step 4. KBA

Inspect the configuration

Now, if we take a look at the 'Registration Stage' configuration for this example app, which we can find here:
we will see it begins like this:
  "stageConfigs": [
    {
      "class" : "org.forgerock.selfservice.custom.MathProblemStageConfig",
      "leftValue" : 12,
      "rightValue" : 4
    },
    {
      "name" : "emailValidation",
      "emailServiceUrl": "/email",
      "from": "", 

Brilliant!  That first item in the stageConfigs array is the "Math Problem" with its configuration (i.e. which two numbers to sum!)  The remaining array items use 'name' to reference the native/in-built modules, along with their necessary configuration.
So, what we've achieved so far is:

  • Compiled the custom stage (Math Problem)
  • Compiled the standalone version of Commons User Self-Service
  • Tested a Registration flow that includes the custom stage, along with some native stages.

And what's left to do?

  • Deploy and configure the custom stage in OpenIDM

OpenIDM Deployment

Simply copy the custom stage JAR file to the 'bundle' directory of your OpenIDM deployment:
cp forgerock-selfservice-custom-stage/target/forgerock-selfservice-custom-stage-1.0.3.jar <openidm>/bundle

And update the 'selfservice-registration.json' file in your 'conf' folder.
This is OpenIDM's name for the 'registration.json' file of the standalone SelfService app, so use the same configuration snippet you saw in the standalone app.

It seems that this method will not allow you to see the custom stage in the Admin UI for OpenIDM.  Happily, changes made within the Admin UI do not remove the custom stage from the file.  However, be warned that if you disable, then re-enable, Self-Service Registration there is no guarantee that the custom stage will be added back into the re-created 'selfservice-registration.json' file in the correct place.

So, with User Self Registration enabled, and the custom stage included in the config, when a user tries to register they will be prompted for the calculation at the appropriate point in the flow.
Custom Stage in OpenIDM flow!

Exercise for the Reader!

As you can see, I have used the pre-supplied custom stage, showing one approach to building, deploying, and configuring it.  If you need a custom stage to do something different, then you'll have to investigate the interfaces that need implementing in order to develop a custom stage.

Thursday, 30 June 2016

Dynamic Profiles in OpenAM 13

I recently had cause to play with 'Dynamic Profiles' in OpenAMv13.

If you don't know, 'Dynamic Profile' is a realm-based configuration setting that can dynamically create a user profile in the configured DataStore.  This can be useful in many circumstances where the authentication of a user takes place in a service other than the DataStore.  Examples of this include using OpenAM as a Social Media/OAuth2 client (allowing users to sign in with Facebook), or as a SAML Service Provider (e.g. allowing users to sign in with federated credentials).  Having authenticated the credentials, it might now be useful to maintain a profile of that user that is used throughout their interactions with the services protected by OpenAM.
The specific scenario that triggered this investigation (and therefore this blog post!) was one where the user is authenticated by credentials in an Active Directory, but the user profile (DataStore) was a separate OpenDJ instance.

Now, before I go any further: it is entirely possible to make the AD your DataStore, which keeps things simple.  However, there are many occasions where the schema changes needed in a directory to provide the full range of DataStore capabilities are simply not allowed to be applied to an Active Directory due to business or security policy.  Therefore it is necessary to configure a separate DataStore using, say, OpenDJ, as well as allow users to authenticate against AD with their AD credentials.

By default OpenAM configures a realm to 'Require' a profile entry in the DataStore for an authenticated user.  Therefore a user profile has to exist in the DataStore in order for authentication to complete.

Now you could provision a set of user profiles into the DataStore using something like OpenIDM to ensure the required profile exists.  Or you can set 'Dynamic' profiles in OpenAM.  This causes OpenAM to dynamically create a profile in the configured DataStore if one does not exist.
So, if we're 'dynamically' creating profiles, what data does OpenAM use to populate the DataStore?   Good question...and one that this post is designed to answer!

First, the basics...

  • Authentication Chain
    As described we want the user to authenticate using their AD credentials.  Therefore we need to define the Authentication Chain to contain an LDAP Authentication Module (or the AD module if you prefer).  Let's assume you can get this working (you may choose to set profile as 'Ignored' whilst you set this up so that authentication can complete without the need for a profile at all).

  • DataStore
    We'll be using OpenDJ as the DataStore.  Let's assume you can setup OpenDJ and configure it as a DataStore.  The OpenAM documentation on Backstage provides guidance on this.

Ok, so now we need to tie these together.  The general idea is that a user logs in with their AD credentials using the Authentication Chain/Module.  OpenAM checks to see if there is an associated user profile record in the DataStore and creates it if not.

Let's first of all consider the Authentication Module.  There are two key properties here:

  1. Attribute Used to Retrieve User Profile
  2. User Creation Attributes

Attribute Used to Retrieve User Profile

In my experience the description and helptext for this is not entirely accurate.  In the scenario I'm describing, OpenAM will retrieve the value of this attribute from the AD/LDAP.  It will then be placed into memory for use later on.  Let's call this 'AttributeX'.  Typically the attribute you want to retrieve is the unique identifier for the user, so might be sAMAccountName or uid, for example.  We'll use the value in 'AttributeX' later when we search for and find the user profile in the DataStore.

User Creation Attributes

This is a list of attribute values to be retrieved from the AD/LDAP authentication source that we wish to pass through to the user profile creation step.
You can specify these attributes either as singletons, such as 'sn' or 'mail', or as pipe-separated maps, such as phoneNum|telephoneNumber.
Again, the description of the pipe mapping syntax is confusing.
In actual fact it should be read as:
OpenAM property|AD/LDAP property
i.e. the setting of phoneNum|telephoneNumber would take the value of telephoneNumber from AD/LDAP and store it in an OpenAM property called phoneNum.  We can then use the OpenAM property later when we create the record in the DataStore.
Note that the singleton syntax such as 'sn' will essentially be interpreted as 'sn|sn'.
Also note that the list that appears here is explicit.  If a property is not in this list then the DataStore will not be able to access it during creation.
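To make that concrete, a hypothetical 'User Creation Attributes' list mixing both syntaxes (the attribute names here are illustrative) might be:

```
sn
mail
phoneNum|telephoneNumber
```

Here 'sn' and 'mail' are copied as-is, while the AD/LDAP 'telephoneNumber' value is stored in an OpenAM property called 'phoneNum'.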

Ok, so now we know how to extract information from the AD/LDAP that we authenticate against and store it in OpenAM properties.  Now let's look at the DataStore configuration.

There are a couple of things to look at here, but they're mostly in the 'User Configuration' section:
1. LDAP Users Search Attribute
2. LDAP People Container Naming Attribute
3. LDAP People Container Value
4. Create User Attribute Mapping
and, in the Authentication Configuration section,
5. Authentication Naming Attribute

LDAP Users Search Attribute

This is the attribute that will contain the value of 'AttributeX' i.e. the unique key for the user.  Typically in OpenDJ this is often 'uid', but could be something else depending on your configuration.  For example, I want the DN of a person record to be something like:
cn=jdoe,cn=users,dc=example,dc=com
whereas the default might be:
uid=jdoe,ou=people,dc=example,dc=com
Therefore I need the search attribute to be 'cn' not 'uid'.

LDAP People Container Naming Attribute

As you can see from my desired DN, the container for the 'people' records is 'users' named by the 'cn' property.  Hence the value I specify here is 'cn'.  The default (for OpenDJ) is 'ou' here.

LDAP People Container Value

Again, from the desired DN you can see that I need to specify 'users' here, whereas the default is 'people'.

These settings are used to both find a user as well as set the DN when the user is dynamically created.
So, with my settings, a user - say 'jdoe' - will be created thus:
cn=jdoe,cn=users,dc=example,dc=com
(The dc=example,dc=com section is defined as the 'LDAP Organization DN' elsewhere in the DataStore config but you should already have that setup correctly!)

Create User Attribute Mapping

Now this is the interesting bit!  This is where the values we retrieved from the AD/LDAP Authentication module and placed in to OpenAM properties can be mapped to attributes in the DataStore.
By default the list contains two singletons: cn and sn

But the helptext says you must specify the list as 'TargetAttributeName=SourceAttributeName' which the defaults don't follow.
Remember that 'AttributeX' we collected?  Well any singleton attribute in this list will take the value of AttributeX...if it was not explicitly defined in the Authentication Module 'User Creation Attributes' map.
i.e. if User Creation Attributes included 'sn' then the value of that would be used for the 'sn' value here.  If there was no mapping then 'sn' here would take the value of AttributeX.
This particular nuance allows you to configure a setting to ensure that attributes that are defined as 'mandatory' in the DataStore to always have an attribute value.  This avoids record creation errors due to missing data constraints defined in the DataStore.
Now, what happens if we use the 'TargetAttributeName=SourceAttributeName' format?  Well, in this case 'TargetAttributeName' refers to the name of the attribute in the DataStore, whereas 'SourceAttributeName' refers to the OpenAM property as specified in the 'User Creation Attributes' of the Authentication Module.

Oh, and there's one extra consideration here...
If the OpenAM property name (as defined in the Authentication Module's 'User Creation Attributes') exactly matches the name of a DataStore property then you don't need to specify it at all in this list!!

For example, say you want to take the value of 'givenName' from the AD/LDAP and place it in the attribute called givenName in the DataStore, then all you need to do is specify 'givenName' in the Authentication Module's 'User Creation Attributes' settings.  There is no need to explicitly define it in this list.

However, if you do define it in this list then you must define it as givenName=givenName.
If you were to just specify givenName then its value will be that of the mysterious 'AttributeX'.
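To make the fallback behaviour easier to follow, here is a toy sketch - my reading of the behaviour described above, not OpenAM's actual code - of how a 'Create User Attribute Mapping' list might be resolved.  The function name and sample data are invented for illustration:

```javascript
// Toy model of Create User Attribute Mapping resolution (an assumption
// based on observed behaviour, not the OpenAM source):
//  - "target=source" copies the OpenAM property 'source' into DataStore 'target'
//  - a bare "target" is treated as "target=target", and falls back to
//    AttributeX if User Creation Attributes supplied no value for it
function resolveCreateMapping(mappingList, openamProps, attributeX) {
  var result = {};
  mappingList.forEach(function (entry) {
    var parts = entry.split('=');
    var target = parts[0];
    var source = parts.length > 1 ? parts[1] : target;
    if (openamProps.hasOwnProperty(source)) {
      result[target] = openamProps[source];   // value collected at authn time
    } else {
      result[target] = attributeX;            // singleton fallback
    }
  });
  return result;
}

// 'sn' and 'givenName' were collected by User Creation Attributes; 'cn' was not.
var profile = resolveCreateMapping(
  ['cn', 'sn', 'givenName=givenName'],
  { sn: 'Smith', givenName: 'John' },
  'jsmith');  // AttributeX
console.log(JSON.stringify(profile));
// -> {"cn":"jsmith","sn":"Smith","givenName":"John"}
```

Note how the bare 'cn' entry picks up AttributeX, while the mapped and collected attributes keep their AD/LDAP values.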

and finally...

Authentication Naming Attribute

For my configuration I set this to 'cn'.  This is a bit of conjecture, but I believe this configuration is used when the DataStore Authentication Module is used.  The DataStore Authn Module will take the username entered by the user on the login page and try to find a user record where the 'cn' equals it.
Now in my scenario this should never happen...I never intend the DataStore to be used as an authentication source...but, as they say, YMMV.

Friday, 3 June 2016

Webinar: Mid-year release of the ForgeRock Identity Platform

We're preparing the ForgeRock Identity platform for a new release imminently. To find out more details, attend this webinar hosted by John Barco, VP of Global Product Marketing:

Tuesday, 10 May 2016

OpenIDM: Allocate from a Pool

The problem

During a recent Proof of Concept we had a need to allocate 'serial numbers' of hardware tokens to users as the user was created.  The serial number would come from a 'pool' of available tokens and should be allocated automatically on user creation.  It should also be possible to remove serial numbers from users which would be returned to the pool.

The solution

We decided to create a custom managed object in OpenIDM to hold the 'pool' of token serial numbers.  We could then leverage the new (in v4) relationship model to allocate serial numbers to users.
So we needed a few bits to this jigsaw:

  1. Define the custom managed object
  2. Populate the new object with the available serial numbers
  3. Modify the user creation logic to retrieve an available serial number and associate it with the user

Define the custom managed object

We were using FrejaId tokens, so we defined a 'managed object' like this:
SerialNo is just defined as a string and uses the OOTB policy validation capability to ensure uniqueness (take a look at the 'managed user' object and its 'userName' property to see how validation policies can be applied to properties).  Also, it is defined as 'Viewable' and 'Searchable'.

AssignedUser is defined as a 'reverse relationship' which matches up with a property on the managed user object called 'AssignedSerialNo'.
We therefore also have to modify the 'managed user' object to include the 'AssignedSerialNo' property, which is defined as a reverse relationship to the 'AssignedUser' property on the FrejaToken object.

Populate the token object with SerialNos

There are several ways you could achieve this depending on the source of the information.  I just had a list of serial numbers from the 3rd party hardware vendor.  The easiest way for me was to create a command line script that iterated through the list and called the OpenIDM REST API for the new managed object.  So I created a file called 'tokens.csv' with each SerialNo on a new line.  There was no need for a header to be included.  The script then used this file as the source of input to a REST call to populate the object.  This was a bash script saved in '' and required the CSV file as an input parameter when it runs e.g.:
#!/bin/bash
#  import the specified token serial numbers into OpenIDM
E_NOARGS=85
if [ -z "$1" ]; then   # Exit if no argument given.
  echo "Usage: `basename $0` file"
  exit $E_NOARGS
fi
while read line; do
  curl --header 'X-OpenIDM-Username: openidm-admin' --header 'X-OpenIDM-Password: openidm-admin' --header 'Content-Type: application/json' --header 'If-None-Match: *' --data '{"SerialNo":"'"$line"'"}' --request PUT http://localhost:8080/openidm/managed/FrejaToken/"$line"
done < "$1"
exit 0

You can run this multiple times.  The 'unique' policy validation constraint on the managed object 'SerialNo' property will stop duplicates being saved.  Also note I'm using a 'PUT' request and specifying the _id of the item being created.  (The _id being the same as the 'SerialNo').
Note also that I'm using the default OpenIDM-Admin username and password, and running this script on the same machine as OpenIDM.

Modify user creation logic

As you may be aware, OpenIDM provides 'hooks' in many places to allow custom logic to be triggered at various points.  One such place is when a specific instance of a managed object is created.  In fact, the out-of-the-box definition for a managed user object includes an 'onCreate' script.  This script is called 'onCreate-user-set-default-fields.js' and exists in openidm/bin/defaults/script/ui.  You should never ever modify this script...but you can make a copy of it and modify the copy!  So take a copy and place it in:
(you may need to create the 'ui' directory)
Then add this to the bottom of the file:
// allocate first available Freja token
var availableTokens = openidm.query('managed/FrejaToken', {_queryFilter: 'true'}, ['*', 'AssignedUser']).result.filter(function (token) { return !(token.AssignedUser); });
if ((availableTokens) && (availableTokens.length > 0)) {
  object.AssignedSerialNo = {"_ref": "managed/FrejaToken/" + availableTokens[0]._id};
}

The first line queries all tokens, and filters out the tokens that are already assigned.  The availableTokens array therefore contains all available tokens.
We then check that there are actually tokens in the array, and then get the first one, saving it as a 'relationship' to the 'AssignedSerialNo' property of the managed user object that we created earlier.

Note that as this is defined as a 'reverse property' you would also see that the 'AssignedUser' property of the FrejaToken is now automagically populated.  Which also means that the next time this query is run then this token will already be 'allocated' and therefore not exist in the availableTokens array.
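To see the allocation logic in isolation, here's a standalone sketch you can run with Node.js.  The openidm.query call is mocked with invented token data (inside OpenIDM, 'openidm' and 'object' are provided by the script engine), but the filter and assignment lines mirror the onCreate script above:

```javascript
// Mock of the openidm binding, with three pretend tokens - one already
// assigned via its 'AssignedUser' reverse relationship.
var openidm = {
  query: function (resource, params, fields) {
    return { result: [
      { _id: 'SN-001', SerialNo: 'SN-001', AssignedUser: { _ref: 'managed/user/bob' } },
      { _id: 'SN-002', SerialNo: 'SN-002' },
      { _id: 'SN-003', SerialNo: 'SN-003' }
    ]};
  }
};

var object = {};  // stands in for the managed user being created

// Same query-and-filter as the onCreate script: keep only unassigned tokens.
var availableTokens = openidm.query('managed/FrejaToken', { _queryFilter: 'true' }, ['*', 'AssignedUser'])
  .result.filter(function (token) { return !(token.AssignedUser); });
if (availableTokens && availableTokens.length > 0) {
  object.AssignedSerialNo = { "_ref": "managed/FrejaToken/" + availableTokens[0]._id };
}

console.log(object.AssignedSerialNo._ref);
// -> managed/FrejaToken/SN-002
```

Running it allocates SN-002: the first token with no 'AssignedUser' relationship.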


Because of the 'reverse property' relationship, deallocation is straightforward.  You can remove the link between the user and SerialNo using REST calls or the User Interface.  This means the token becomes available again.  Also, if you delete a user then the relationship is removed, returning the assigned SerialNo to the pool.