Wednesday 28 March 2018

EPM Cloud – Limiting the use of an application through automation

The question of how to automate putting an application into maintenance mode has been raised on a few occasions, so I thought I would put a post together to cover the topic.

If you are not aware, it is possible to restrict access to an application to administrators only; maintenance tasks or processes can then be carried out, and once they are complete, access can be returned to all users.

This is not new functionality and has been around in the on-premise planning world for a long time.

Let us first cover off the manual method and then move on to different ways to accomplish it using automation.

Selecting “System Settings and Defaults” from the navigator will provide the options to set the application maintenance mode.


There are two options available: “All users” or “Administrators”.

This is similar to on-premise planning, with the exception that there is no option to limit access to only the application owner.


Staying with on-premise for the moment, there is a command line utility which provides the ability to automate the process of setting the application maintenance mode.

An example to restrict an application to only administrators would be:
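
Something along the lines of the following, using the MaintenanceMode utility; the application name and password are placeholders, and the parameter syntax is from the utility’s documentation, so verify it against your version:

MaintenanceMode.cmd /A=PLANAPP /U=admin /P=password /LL=ADMINISTRATORS

The “/LL” login level parameter accepts ALL_USERS, ADMINISTRATORS or OWNER.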


The output log provides a good insight into what happened when running the above command.


If a user tries to log into the application, they are greeted with a message informing them that the application is in maintenance mode.


Going back to EPM Cloud, you would expect an EPM Automate command to be able to restrict application access. Well, not so fast: it is possible, but not with a direct command, though this may well change in the future.

I will now cover a couple of methods to automate the process of limiting application access, the first being through a refresh database job.

From the navigator, if you select overview, there is an action available to refresh the database.


If create is then selected, the refresh database window is displayed; you will notice there are options to limit access to the application before and after the refresh.



I have selected to limit the application to administrators only; this can now be saved as a job.


If this job is run, a refresh will take place and the application will only be accessible to administrators even after the refresh completes.

To return access to all users, the process is repeated, but this time the option after the refresh is set to enable the application for all users.


Once again this can be saved as a job.


Now we have two jobs: one will put the application into maintenance mode and the other will take it out of maintenance mode, though the downside is that two database refreshes will also be carried out.


In terms of automation, one way to handle this could be to create two schedules, one against the job to set the application into maintenance mode and the other against the job that takes the application out of maintenance mode.


This is fine if you know the exact times and frequencies at which you want to change the application maintenance mode, but it is more likely that you will want to incorporate the change into an existing process or script, and this can be achieved using EPM Automate or the REST API.

EPM Automate has a “refreshcube” command to which you pass the job name, so with a simple PowerShell script you could put the application into maintenance mode:
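
A minimal sketch of such a script, assuming EPM Automate is installed locally and the refresh job that limits access was saved with the name “Admin Mode” (the credentials, URL and job name are just placeholders):

# path to the locally installed EPM Automate utility
$epmautomate = "C:\Oracle\EPM Automate\bin\epmautomate.bat"
# log in to the EPM Cloud instance
& $epmautomate login serviceadmin password https://planning-test-domain.pbcs.em2.oraclecloud.com
# run the saved refresh database job that limits the application to administrators
& $epmautomate refreshcube "Admin Mode"
# close the session
& $epmautomate logout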


After running the script, the application has been limited to admin users.


If the application is accessed with a non-admin account the following page is displayed:


To take the application out of maintenance mode, the script just needs to be updated to call the other refresh database job.


Once the script has completed, the application has been enabled for all users.


To take EPM Automate out of the picture, the same can be achieved using the REST API.

I have covered the REST API in detail in the past; there is a REST resource to run jobs, and all that is required is that the job type of “CUBE_REFRESH” and the name of the job are passed as JSON in the body of the request.

Here is an example of how to achieve that with a script:
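
A cut-down sketch of such a script is below; the instance URL, credentials, application name and job name are placeholders, and the REST API version may differ on your instance:

# build the basic authentication header (identitydomain.username format)
$user = "epmdomain.serviceadmin"
$password = "password"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("${user}:${password}"))
$headers = @{"Authorization" = "Basic $encoded"}
# REST resource to run jobs against the planning application
$restURL = "https://planning-test-domain.pbcs.em2.oraclecloud.com/HyperionPlanning/rest/v3/applications/Demo/jobs"
# the job type and the name of the saved refresh job are passed as JSON in the body
$body = @{"jobType" = "CUBE_REFRESH"; "jobName" = "Admin Mode"} | ConvertTo-Json
Invoke-RestMethod -Uri $restURL -Method Post -Headers $headers -Body $body -ContentType "application/json"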


To take the application out of maintenance mode you just need to change the job name in the script.

To be honest, I am not a fan of having to perform a database refresh just to put the application into maintenance mode; it is not an efficient way to do it, and I feel there should be a direct command available. Until that happens, I looked for an alternative solution.

I will take you through the solution I came up with.

If you go to migration, you can export only the application settings.


This will create a snapshot which can be downloaded.


If the snapshot is extracted there will be an xml file which contains all the application settings.


The xml file contains a node called “LoginLevel” which defines whether the application is limited to only administrators or is available to all users.


Now I know I could have created two snapshots of the application settings: one with the login level set to all users and one for administrators, and then automated importing the relevant snapshot to limit the application access. The problem is that if any of the other application settings changed, they would be overwritten when importing the snapshot; I could keep taking new snapshots, but that didn’t feel optimal.

I updated the xml to include just the login level setting, creating one file for administrators.


Then a second xml for all users.


The directory structure of the snapshot was kept identical so there would be no issues when importing; these were then compressed into zip files.


The two snapshots were uploaded to the application.


To automate the process of limiting access to the application with EPM Automate, the “importsnapshot” command can be used.

To limit the application to administrators the “maintenance_mode_admin” snapshot can be imported.
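
A sketch of the EPM Automate calls, following the same pattern as the earlier script:

$epmautomate = "C:\Oracle\EPM Automate\bin\epmautomate.bat"
& $epmautomate login serviceadmin password https://planning-test-domain.pbcs.em2.oraclecloud.com
# import the snapshot that sets the login level to administrators
& $epmautomate importsnapshot maintenance_mode_admin
& $epmautomate logout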


After the script has completed, the application maintenance mode will be set to administrators.



To confirm, logging in with a non-admin user produced the maintenance mode message.


To revert to all users, all that is required is to change the snapshot name when calling the “importsnapshot” command.


The application access level has been correctly returned to all users.


It wouldn’t be complete if I didn’t provide an example script on how to go about this with the REST API.
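
A sketch of the call using the migration REST resource; the resource path and API version are based on the documented LCM REST API, so treat them as assumptions to check against your instance:

# reuse the basic authentication header from the earlier script
$user = "epmdomain.serviceadmin"
$password = "password"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("${user}:${password}"))
$headers = @{"Authorization" = "Basic $encoded"}
# LCM REST resource to import the uploaded snapshot that limits access to administrators
$restURL = "https://planning-test-domain.pbcs.em2.oraclecloud.com/interop/rest/11.1.2.3.600/applicationsnapshots/maintenance_mode_admin/migration?type=import"
Invoke-RestMethod -Uri $restURL -Method Post -Headers $headers -ContentType "application/json"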



To change the maintenance mode to all users in the above script, all that would be required is to update the snapshot name.

So there we go, a few different options to automate the process of limiting access to an EPM Cloud application.

Monday 12 March 2018

Planning audit reports with help from Groovy - Part 2

In the previous post I went through a possible solution for producing audit reports through the planning UI with help from business rules and Groovy. I mainly covered the background to planning audit reports and went through a demo of the solution; in this post I am going to break down how it was all put together.

Let us start with running the audit report business rule, which can be accessed through the standard or simplified user interface. I am going to stick with the simplified interface again, as it provides the ability to download the audit report once it has been generated.


Once launched this will present the user with runtime prompts to define how the audit report will be generated.


The audit type and date range are built from Smart Lists within the planning application.

As I explained in the previous post, I have limited the list of options for demo purposes, but the final solution has all the different audit types available.
 

The above list is built from the following Smart List.


In Calculation Manager a variable has been created as a runtime prompt with the type set to integer and the planning Smart List selected.


This means that when the rule is launched, the audit type drop-down will be displayed as shown above; the selection that is passed into the business rule is the integer value that matches the Smart List ID.

It is the same concept for the date range, there is a planning Smart List.
 

The reason the IDs have been created this way is down to how the SQL will be generated in the Groovy script; apart from the “All” option, the IDs match the number of days.

Just like with the audit type, a variable has been created to display the Smart List when launching the rule.
 

The option to select a delimiter for the audit report is handled using a Calculation Manager runtime prompt string variable with a default value set.

There is also a hidden variable which passes the name of the planning application into the business rule.
 

On to the business rule, which basically just calls a Groovy script and passes in the variable names and the values from the runtime prompts.


If you want more detail about getting up and running with Groovy then please refer to one of my previous posts which you can read about here or here.

Before I get on to the Groovy script there are a few points to mention.

I have disabled static compile and sandbox in groovycdf.properties in order to relax the static type checking and allow access to Java packages; no additional Java files are required other than the Groovy jar.

The SQL in the script is based on Oracle but wouldn’t require much updating for it to run against SQL Server.

The script does not contain any hardcoded database connection information, including passwords, as these are generated at runtime.

I will break the script down into chunks; the variables would all have been defined at the beginning of the script, but I have moved them around so they make more sense.

The beginning section of the script generates the connection information for the planning application database; a method is called to return the planning system database connection details from the EPM registry.

A connection is made to the planning system database and a SQL query is executed to return the planning application database connection information; the planning application name used in the query was passed in from the business rule, having been defined in a Calculation Manager variable.

Once the planning application connection details have been returned, a connection is made to the application database.
 

If you are not interested in dynamically generating the database connection details then the above can be simply replaced with something like:

import groovy.sql.Sql
sql = Sql.newInstance("jdbc:oracle:thin:@dbserver:port/servicename", "username", "password", "oracle.jdbc.OracleDriver")

The next section manages the variables, which, as I said, would usually be at the start of the script.

The values selected from the drop downs for the list of audit types and date ranges are passed into the script as strings so they are converted to integers.

A date is generated that is then used to form the names for the exported audit text file and zip.

A temporary directory is defined which will be the location where the audit file will be generated before it is compressed and then moved to the planning inbox/outbox explorer location.

The Groovy script is executed by the Essbase process so the temporary directory is located on the Essbase server.

Next, a lookup is made to the EPM Registry to retrieve the planning inbox/outbox explorer location, which, if you are not aware, is the same location as the LCM import/export directory.
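
A rough Groovy sketch of the variable handling described above; the runtime prompt variable names and the temporary directory are assumptions, and the inbox/outbox path is shown hardcoded rather than being returned from the registry lookup:

// runtime prompt selections arrive as strings, so convert them to integers
auditList = auditTypeRtp.toInteger()
dateRange = dateRangeRtp.toInteger()

// timestamp used to build unique names for the audit text file and zip
def timeStamp = new Date().format("ddMMyyyyHHmm")
auditFile = "audit" + timeStamp + ".txt"
zipFileName = "audit" + timeStamp + ".zip"

// temporary directory on the Essbase server where the file is generated
tmpFileDir = "E:\\Temp\\"

// in the full script this is returned from the EPM Registry lookup described above
planInbox = "\\\\FILESERVER\\epmshare\\import_export\\"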
 

If I ran the audit business rule with the following runtime prompt selections:


The variables in the Groovy script would be assigned with the following:

auditList=2
dateRange=30
delimiter=|
auditFile=audit110320181534.txt
zipFileName=audit110320181534.zip
planInbox=\\FILESERVER\epmshare\import_export\

The next section builds a SQL statement to retrieve the records from the planning audit table based on the values selected from the business rule runtime prompts.

There are two SQL statements built; they are practically the same, except one of them returns a count of the number of audit records based on the selected criteria. This is because we don’t want to generate an audit file if there are no records returned.
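
As a simplified sketch of how the two statements might be built, assuming a hypothetical map from the Smart List ID to the audit type name:

// map the Smart List ID passed from the rule to the audit type stored in the table
def auditTypes = [1: "All", 2: "Data", 3: "Launch Business Rules"]
def auditTypeName = auditTypes[auditList]

// shared criteria based on the runtime prompt selections
def whereClause = " FROM hsp_audit_records WHERE 1=1"
if (auditTypeName != "All")
    whereClause += " AND type = '" + auditTypeName + "'"
// the "All" date range option is assumed to skip the date filter
if (dateRange > 0)
    whereClause += " AND TRUNC(time_posted) >= TRUNC(sysdate) -" + dateRange

// one statement just returns a count, the other returns the audit records
def countSQL = "SELECT count(*) as numrecords" + whereClause
def auditSQL = "SELECT *" + whereClause + " order by time_posted desc"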
 

Based on the same selection from the previous example the SQL generated for the count would be:

SELECT count(*) as numrecords FROM hsp_audit_records WHERE 1=1 AND type = 'Data' AND TRUNC(time_posted) >= TRUNC(sysdate) -30

The query is returning a count of the number of audit records where the audit type is data for the last 30 days.

The SQL is then executed, and if the number of records returned equals zero, the business rule will terminate with an error; the error message “No records returned” will be available in the job console.
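
A sketch of that check, assuming the script fails the rule by throwing an exception so the message surfaces in the job console:

// execute the count statement and stop with an error if there is nothing to export
def numRecords = sql.firstRow(countSQL).numrecords
if (numRecords == 0)
    throw new Exception("No records returned")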
  

The SQL to return the audit records is then generated; it is based on the same criteria and would produce:

SELECT * FROM hsp_audit_records WHERE 1=1 AND type = 'Data' AND TRUNC(time_posted) >= TRUNC(sysdate) -30 order by time_posted desc

If the above SQL query is run using a tool like SQL Developer, it would produce the following results:


The Groovy script executes the SQL, and the column names and rows returned are separated by the delimiter and written to a text file.
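
A simplified sketch of that section; the column handling is condensed compared with the full script:

// write the audit records to the temporary text file using the selected delimiter
def outFile = new File(tmpFileDir + auditFile)
outFile.withWriter { writer ->
    def headerWritten = false
    sql.eachRow(auditSQL) { row ->
        def meta = row.getMetaData()
        def cols = meta.getColumnCount()
        if (!headerWritten) {
            // the first record holds the column names
            writer.writeLine((1..cols).collect { meta.getColumnName(it) }.join(delimiter))
            headerWritten = true
        }
        writer.writeLine((1..cols).collect { "${row.getObject(it) ?: ''}" }.join(delimiter))
    }
}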
 

The file is temporarily created in the directory defined in the “tmpFileDir” variable.


The contents of the file will be similar to the results shown in the earlier query.


The audit text file is then compressed into a zip file.
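
A sketch of the compression step using the standard java.util.zip classes:

// compress the audit text file into a zip in the same temporary directory
def zipFile = new File(tmpFileDir + zipFileName)
new java.util.zip.ZipOutputStream(new FileOutputStream(zipFile)).withStream { zipOut ->
    zipOut.putNextEntry(new java.util.zip.ZipEntry(auditFile))
    zipOut.write(new File(tmpFileDir + auditFile).bytes)
    zipOut.closeEntry()
}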


The zip file is created in the same temporary directory.


Finally, the text file is deleted.


The zip file is moved to the planning inbox/outbox location.
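
A sketch of those final steps, using the variables defined earlier:

// delete the temporary audit text file now it has been zipped
new File(tmpFileDir + auditFile).delete()
// copy the zip to the planning inbox/outbox location and remove the temporary copy
new File(planInbox + zipFileName).bytes = new File(tmpFileDir + zipFileName).bytes
new File(tmpFileDir + zipFileName).delete()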


The file is then accessible from the planning simplified interface where it can be downloaded or deleted.

As shown in the last post, the solution also has the ability to archive the records in the audit table, and it includes a planning form to show the number of records in the audit table and when it was last archived. If you would like to find out more information, feel free to get in touch.