Wednesday, 17 January 2018

Simplifying a planning process with help again from Groovy

I recently had the chance to look at improving an error-prone and overcomplicated on-premise process to run FDMEE data load rules from Planning business rules.

The solution started with a business rule using the Calculation Manager custom MaxL function to call a batch file with the SHELL command.

The batch script then called another script using PsExec to allow remote execution, as the script needed to run on the FDMEE server and not on the Essbase server. This was required because the FDMEE load rule was being run with the standard batch scripts, which must be executed from the FDMEE server.

The process included calling MaxL scripts to read and update substitution variables, and the output of these scripts also needed parsing as part of the batch.

To add to all the complication, there was a major issue where multiple users could run the same process while it was already active.

As you can imagine, there was plenty of room for improvement with the current solution. Instead of trying to fix individual areas of the process, I took a step back and looked at the solution as a whole, and this is where Groovy came in to help.

If you are not aware, the ability to use Groovy scripting with Planning business rules or Essbase calculation scripts arrived in a Calculation Manager patch; it provides much more flexibility than Essbase custom defined functions and is easy to get up and running.

Anyway, I knew straight away that Groovy could help with running the FDMEE load rules, as I have tackled this problem before. Instead of covering old ground again, there are a couple of posts I have written in the past that go through the steps.

The first post looks at using Groovy to call a Java servlet, which is basically the same method the FDMEE batch scripts use behind the scenes. The advantage of the Groovy method is that the script does not have to be run from the FDMEE server, as it just requires HTTP access to the web server.

Planning to FDMEE with a little help from Groovy

The above post was written before the FDMEE REST API appeared; not to worry though, as I have covered using the REST API with Groovy in part of the following post:

Loading Planning metadata from a relational source through FDMEE and Planning

If you have not already, I recommend going through those posts, as in this post I am going to assume an understanding of what Groovy scripting with business rules is all about.

By creating a Groovy script to run FDMEE rules from a business rule, I have already improved the original solution and removed the need to run batch scripts across servers with PsExec.

Now on to the next problem: stopping the rule from running if it is already active. The original solution involved a user selecting an entity with a runtime prompt, so I am going to stick with that in my example.

Basically, you wouldn’t want the rule to run if it was already running for the selected entity. In Planning, one technique is to set a flag within the database and, if the flag is active, stop running the rule.

In the following example, a fix is made down to a single block in a restricted part of the database, with the entity defined by an RTP.

A member named “FDMEE_Run_Flag” controls whether the process is already running; if the value is 1 it is running, and if it is 0 it is not.

If the flag is set to 1, the rule is stopped using the @RETURN function. If it is not set to 1, the flag is set to 1, the Groovy script is called using the @CalcMgrGroovyNumber function to run the FDMEE process, and after it completes the flag is set back to 0.

To demonstrate, I updated the value of the flag member to 1, or “Running”, as the member has been assigned a Smart List.

The business rule was launched and the same entity selected.

An error is returned and the rule then exits.

This type of solution can be perfectly acceptable, but there can be times when the flag is not reset, for example if the rule did not complete. The end user would then receive the same error message each time the rule is run until the flag was reset. This method also requires some minor changes to the Planning application and process.

I was looking for a solution with minimal impact on the Planning application and processes. As I was already executing a Groovy script, maybe it could also help to restrict the FDMEE data load rule from being run.

Before I go through the solution, I want to point out that this is just one method out of many possibilities, and it doesn’t mean it is right for everyone.

Whenever a business rule is launched, information is captured in the Job Console.

This means you can tell whether a rule is running or not; the information is stored in and read from a database table in the Planning application’s repository called “HSP_JOB_STATUS”.

The table has a column called “RUN_STATUS” which defines whether the rule is running; the table also captures RTP details and the values selected.

Groovy SQL makes it easy to query a database table, so the idea was to check the table to see if a rule is running for the specified entity. If it is running, return an error; if it is not, carry on with the processing and run the FDMEE load rule.

Just before I get on to the script, it is worth pointing out that I have been testing with the latest version of Calc Manager at the time of writing. Groovy was made available in an earlier patch, but there were a few changes which I take advantage of, so I recommend being either on the latest patch or at least 008.

If you are just starting out, then download the latest 2.4 version of the Groovy jar, because from version 2.5+ the requirement is for Java 7, and unfortunately Essbase still implements Java 6.

To get up and running, check out my earlier blog, but in summary you only need to download the Groovy jar, copy it to the Essbase server, create a properties file, restart Essbase and off you go, easy.

From Calc Manager it is possible to create a properties file in the same location as the Groovy jar.


You probably want to set the “staticcompile” parameter to false, unless you want to go down the route of having to define every variable and assign it the correct type, which means you would lose some of the Groovy goodness. Though saying that, I think Groovy in EPM Cloud has it enabled by default.

Setting the “sandbox” parameter to false overrides the whitelist file of packages and classes, which means you instantly have access to all the packages within the classpath. This is great for developing, though you might want to consider restricting the packages when you move to a production state.
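For illustration, a minimal properties file covering just the two parameters discussed might look like this (the filename is not shown here, and other parameters may be available):

```properties
# Allow dynamic typing rather than forcing every variable to be declared
staticcompile=false
# Disable the whitelist so all packages on the classpath are accessible
sandbox=false
```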

On to the script. I am not going to include every line of code, as it is there to provide an idea of how it was achieved. I have changed the script around a little so it hopefully makes more sense, so before the purists hound me, that is the reason why :)

The script could be simplified, but I have written it so that no database connection information, including passwords, is included. This adds more steps but at the same time makes it reusable, requiring no changes when migrating across environments.

I will break the script down into bite-sized chunks. I am going to show the business rule script later, but all you need to know is that three variables are passed into the Groovy script from the rule: one for the rule name and the others for the RTP name and value, so in this case it would be “Entity” and the entity name. This is required when querying the job console table.

The first section converts the entity name to include quotes, because the RTP values in the job console database table are surrounded by quotes.

The passed-in variables are written to the log.
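A minimal sketch of this first section (the helper name and the sample rule/entity values are my own for illustration):

```groovy
// Hypothetical helper: wrap the RTP value in double quotes to match
// how the job console table stores runtime prompt values.
static String quoteRtpValue(String value) {
    value.startsWith('"') ? value : '"' + value + '"'
}

def ruleName = 'FDMEE_Load'   // passed in from the business rule
def rtpName  = 'Entity'       // passed in from the business rule
def rtpValue = quoteRtpValue('ENT_100')

// Write the passed-in variables to the log
println "Rule: ${ruleName}, RTP: ${rtpName} = ${rtpValue}"
```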

Next, a function is called which directly queries the EPM registry and returns the connection information to the planning system database.

Now that the planning system database connection details have been gathered, a query can be made to return the connection information to the planning application database.

The table “HSPSYS_DATASOURCE” holds the connection information for all the planning applications.

Basically, a query is made to return the following information:

This is done simply using Groovy SQL, and once the information has been retrieved it is stored.
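The lookup could be sketched along these lines; the credentials, driver, JDBC URL format and query column are placeholders of my own (the real script pulls the connection details from the registry, and the actual “HSPSYS_DATASOURCE” columns may differ by release):

```groovy
import groovy.sql.Sql

// Hypothetical helper: assemble a JDBC URL from the server/database
// details returned by the registry lookup (Oracle thin driver assumed).
static String buildJdbcUrl(String server, int port, String service) {
    "jdbc:oracle:thin:@${server}:${port}/${service}"
}

try {
    // Connect to the Planning system database and look up the
    // application's datasource record.
    def sysDb = Sql.newInstance(buildJdbcUrl('dbserver', 1521, 'EPMDB'),
                                'hypuser', 'password',
                                'oracle.jdbc.OracleDriver')
    def row = sysDb.firstRow(
        'SELECT * FROM HSPSYS_DATASOURCE WHERE NAME = ?', ['DEMOAPP'])
    if (row == null) {
        println 'No datasource record found - return an error code to the rule'
    }
} catch (Exception e) {
    // No database (or JDBC driver) available in this sketch
    println "Planning system database not reachable: ${e.message}"
}
```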

If for some reason no records are returned the script ends and returns an error number back to the business rule.

If you wanted to hardcode the connection information to the Planning application database in the script, then the above would not be required and you would only need the next section.

Now a new database connection can be made to the planning application and the job status table queried to see if the rule is already running for the same entity.

If the query returns a count greater than one it means the rule is already running against the same entity so return an error code back to the business rule.

The count value should always be at least one because there will be an entry in the table for the current execution of the rule.

If the query doesn’t return a count greater than one, processing continues and the FDMEE REST API is called to run the data rule.
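As a rough sketch of the REST call, the job payload and endpoint below follow the FDMEE REST API as I have used it in the posts linked earlier, but the server name, port, credentials, periods and modes are all placeholders to adapt:

```groovy
import groovy.json.JsonOutput

// Build the JSON payload to run a data load rule over REST.
// Period and mode values here are assumptions for illustration.
static String buildJobPayload(String ruleName) {
    JsonOutput.toJson([jobType    : 'DATARULE',
                       jobName    : ruleName,
                       startPeriod: 'Jan-18',
                       endPeriod  : 'Jan-18',
                       importMode : 'REPLACE',
                       exportMode : 'STORE_DATA'])
}

try {
    // POST the payload to the FDMEE jobs resource with basic auth
    def conn = new URL('http://fdmeeserver:19000/aif/rest/V1/jobs').openConnection()
    conn.connectTimeout = 5000
    conn.requestMethod = 'POST'
    conn.doOutput = true
    conn.setRequestProperty('Content-Type', 'application/json')
    conn.setRequestProperty('Authorization',
        'Basic ' + 'admin:password'.bytes.encodeBase64().toString())
    conn.outputStream.withWriter { it << buildJobPayload('FDMEE_Load') }
    println "Job submitted, HTTP status: ${conn.responseCode}"
} catch (Exception e) {
    // No FDMEE server available in this sketch
    println "FDMEE server not reachable: ${e.message}"
}
```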

Taking an example of the rule running, the job console would display a run status of “Processing”.

The script would run the following SQL to check if the rule is running for the specified entity.

The query returns a count of 1 which means there is only one instance of the rule running.
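The check could be sketched as follows; “RUN_STATUS” is the column described above, but the status code, the RTP column name and the connection details are assumptions of mine, so treat the query shape as illustrative only:

```groovy
import groovy.sql.Sql

// The current execution always contributes one row to the job console,
// so a count greater than one means another instance is active.
static boolean alreadyRunning(int runCount) {
    runCount > 1
}

// Assumed shape of the job console check (column names are placeholders)
def checkSql = '''SELECT COUNT(*) AS CNT FROM HSP_JOB_STATUS
                  WHERE JOB_NAME = ? AND RUN_STATUS = 1
                  AND RUN_TIME_PARAMETERS LIKE ?'''

def runCount = 1
try {
    def appDb = Sql.newInstance('jdbc:oracle:thin:@dbserver:1521/PLANAPP',
                                'hypuser', 'password',
                                'oracle.jdbc.OracleDriver')
    runCount = appDb.firstRow(checkSql, ['FDMEE_Load', '%"ENT_100"%']).CNT as int
} catch (Exception e) {
    // No database available in this sketch
    println "Planning application database not reachable: ${e.message}"
}

println(alreadyRunning(runCount) ? 'Already running - return error code 2'
                                 : 'Only this instance - carry on')
```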

On to the business rule, which fixes down to a single block and one member so the Groovy script is only called once.

The Groovy script is then executed using the @CalcMgrGroovyNumber function and Calc Manager RTP variables for rule name, RTP name and value are passed into it.

A temporary Essbase variable stores the value returned from the Groovy script; if the value is 2, the process is already running, and if the value is 3, an error occurred in the script. Both values generate an error back to the user.

The RTP name and value variables are set to be hidden so they are not shown at runtime.

Now on to running the rule.

If the rule is not currently running for the same entity, the script should run through to the end.

The log will contain:

If the rule is run again for the same entity while the process is still running the following error will be displayed and the process will stop.

This time the log will contain the following:

If an error occurs in the process the following error will be returned to the user.

Earlier I mentioned that the original solution called MaxL scripts to read and update substitution variables, as well as having to parse the output log to achieve this.

With Groovy, life is much simpler, as sub vars can be passed into the script. For example, say we are interested in these variables.

The sub vars can be passed into the Groovy script in the following way.

The Groovy script would automatically be able to access them.

You can then process them however you like, in this example they are just written to the log.
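To illustrate the idea, the snippet below simulates how passed-in values arrive as bound variables in the script and are simply referenced by name (the “curYear” and “curMonth” names are the ones used later in the post, and the sample values are my own):

```groovy
import groovy.lang.Binding
import groovy.lang.GroovyShell

// Simulate the binding the calling rule would provide: each passed-in
// sub var becomes a variable the script can reference directly.
def binding = new Binding(curYear: 'FY18', curMonth: 'Jan')
def shell = new GroovyShell(binding)

// Inside the real script, the variables are just written to the log
shell.evaluate('println "curYear: ${curYear}, curMonth: ${curMonth}"')
```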

You don’t have to pass them into the script, as Groovy should be able to directly access the Essbase Java API packages.

With the following script, the variables can be retrieved using the API or just as easily be created/updated.

The above script calls a function which returns an SSO token, so there is no need to hardcode passwords; alternatively, the password can be included in the connection to Essbase.

The script reads the “curYear” and “curMonth” variables and writes them to the log.

A new sub var has also been created.
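A rough sketch of reading the variables through the Essbase Java API follows; the server names and credentials are placeholders, the signOn signature and variable calls are from memory (verify against the JAPI docs), and unlike the post's version the password is passed directly rather than via an SSO token. Creating or updating a variable uses similar API calls:

```groovy
// Guard so the sketch degrades gracefully when the JAPI jar is absent
def japiAvailable = { ->
    try { Class.forName('com.essbase.api.session.IEssbase'); return true }
    catch (ClassNotFoundException e) { return false }
}()

if (japiAvailable) {
    // Sign on via the provider services URL, then read the sub vars
    def ess = com.essbase.api.session.IEssbase.Home.create(
                  com.essbase.api.session.IEssbase.JAPI_VERSION)
    def olapSvr = ess.signOn('admin', 'password', false, null,
                             'http://apsserver:13080/aps/JAPI', 'essbaseserver')
    println "curYear:  ${olapSvr.getSubstitutionVariableValue('curYear')}"
    println "curMonth: ${olapSvr.getSubstitutionVariableValue('curMonth')}"
    olapSvr.disconnect()
    ess.signOff()
} else {
    println 'Essbase Java API not on the classpath - skipping'
}
```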

When you put this all together, one Groovy script has replaced the original overcomplicated process. The new solution can restrict rules from running concurrently, execute FDMEE data load rules and manage substitution variables, and it removes the requirement to run scripts across servers.

If you want to find out more then please feel free to get in touch.


  1. Where did you find the function: reg_db_details?

    I wasn't aware we had access to this data without creating a properties file but this is very cool!

  2. I created the function, it calls a hidden Java method to extract the information.

  3. Hi John, wonderful post again!

I've another question about Groovy usage in an on-prem environment.
    Can I get the updated cell information from a data form if it is an on-prem app?

In EPBCS, it's possible to get edited cells and process them using a Groovy script, but how can I enable the same feature for on-prem applications? Will it work if I simply put ".edited"?

In addition to putting the groovy_abc.jar file on the Essbase server, I think we need to touch the Planning server too, am I right? (as when we used to deploy ValidateData.js once upon a time)

    Your insights much appreciated.

  4. The functionality is not the same as in the cloud. The on-premise functionality basically runs against Essbase so you don't get any of the planning features you do with EPBCS.

  5. Thanks John!

REST API functionality is also different in EPBCS and on-prem.
    It's easy to pass multiple RTPs in EPBCS; however, I'm not able to pass a single RTP to an on-prem application yet. I found your posts regarding the same...
    Will post here if I make it work.

