Sunday, 24 June 2018

Essbase REST API - Part 1

I am no stranger to web services, and in particular REST APIs, which I have been covering for a number of years now. I am a strong believer that they play an important part in integration and automation, and with the rise of the cloud, REST APIs are integral to process automation between different cloud services.

In the past I have covered the on-premise Essbase Web Services, which are based on the SOAP protocol. The preferred method these days is REST, which is prevalent across most cloud services. I definitely prefer working with REST over SOAP, mainly for its simplicity and ease of use.

The 11.1.2.4 Essbase patch releases have contained the following in the readme:

"JAPI SOAP web services will be removed in a future release and they will be replaced with REST APIs."

Unfortunately for on-premise Essbase this has not yet happened and it is unclear when it will happen.

It is a different case for the Essbase side of Oracle Analytics Cloud (OAC), where a REST API is available, though unless I am missing something it does not yet seem to be well documented. I am sure that will change over time.

I have to assume that one day the REST API will also be pushed down from cloud to on-premise, but for now I will be covering what I have found available in the Essbase cloud service. In this first part I am going to go through a really quick introduction to getting started with the REST API.

Just in case you are not aware, the OAC – Essbase Command-Line Interface (EssCLI) is built on top of the REST API so you may have already inadvertently been using the API.

Getting started with the REST API is simple: all you need is a REST client. There are lots of REST clients available as add-ons for most browsers, or alternatively standalone apps such as Postman or Insomnia.

If you want to take REST a stage further, say for automation, then most scripting languages provide the functionality to achieve this.

I am going to stick with what I have been doing for the last few years and use the Boomerang app for Chrome as the REST client and PowerShell for scripting, mainly for demo purposes and ease of use. The concept will be the same whichever client or scripting language you choose to use.

I won’t go into detail on what I have already covered in the past but here is a description of REST from my first post on the planning REST API:

REST describes any simple interface that transmits data over a standardized interface (such as HTTP) without an additional messaging layer, such as SOAP.

REST provides a set of design rules for creating stateless services that are viewed as resources, or sources of specific information, and can be identified by their unique URIs.

RESTful web services are services that are built according to REST principles and, as such, are designed to work well on the Web. Typically, RESTful web services are built on the HTTP protocol and implement operations that map to the common HTTP methods, such as GET, POST, PUT, and DELETE to retrieve, create, update, and delete resources, respectively.



Examples of each method in terms of Essbase could be:

GET – Retrieve a list of applications, databases or logged in users.
POST – Execute a calculation, MDX or MaxL script.
PUT – Update a calc script, substitution variable or filter.
DELETE – Delete an application, script, variable or filter.

The URL structure to access Essbase REST resources is along the lines of:

https://<oac_essbase_instance>/rest/{api_version}/{path}

Currently the api_version is v1.

So let’s start out with one of the most basic resources, which returns the version information. This is the same as selecting ‘about’ in the web UI.


The EssCLI also has a command to return the version.


Using a REST API client, you would just need to enter the base REST API URL plus “about”, for example:
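https://<oac_essbase_instance>/rest/v1/about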


Before you run the request, you will also need to enter your credentials.


This will add a basic authorization header to the request with the username and password base64 encoded.


Now you are ready to execute the request and receive a response back in JSON format.


To convert this into a scripting equivalent does not take much effort.


Basically, the user credentials are encoded into base64 and added to the request header; this is done automatically for you when using a REST client.

The URL for the about REST resource is defined, a GET request is made to the URL, and the response is stored and then output.
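A minimal PowerShell sketch of the above would look something like this; the instance URL and credentials are placeholders to replace with your own:

$user = "essbase.user"
$pass = "password"
# base64 encode the credentials and add them to a basic authorization header
$encodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("${user}:${pass}"))
$headers = @{"Authorization" = "Basic $encodedCreds"}
# define the URL for the about resource
$aboutUrl = "https://<oac_essbase_instance>/rest/v1/about"
# make a GET request and store the response, Invoke-RestMethod converts the JSON response automatically
$response = Invoke-RestMethod -Uri $aboutUrl -Method Get -Headers $headers
# output the response
$response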

To return session information about the user making the REST request, all that is required is “session” suffixed to the base URL.


In terms of scripting, the existing script only requires the URL to be updated to include “session”.
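As a quick sketch, re-using the headers from the previous script, only the resource in the URL changes:

# same GET request, just with "session" as the resource
$sessionUrl = "https://<oac_essbase_instance>/rest/v1/session"
Invoke-RestMethod -Uri $sessionUrl -Method Get -Headers $headers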


To retrieve a list of Essbase provisioned users, the “users” resource can be accessed.


This will return a list of user information in text format. There is an equivalent resource for returning group information, which, no surprise, is accessed with “/groups”.
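Again re-using the same headers, a quick sketch:

$usersUrl = "https://<oac_essbase_instance>/rest/v1/users"
Invoke-RestMethod -Uri $usersUrl -Method Get -Headers $headers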

To retrieve information on which users are logged into the Essbase instance, the “sessions” resource can be requested.


The same script could be updated to return a list of logged in users and the time they have been logged in.
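A rough sketch of that update; the property names selected from the response are assumptions, so check them against the JSON returned by your instance:

# request the sessions resource to return who is currently logged in
$sessionsUrl = "https://<oac_essbase_instance>/rest/v1/sessions"
$sessions = Invoke-RestMethod -Uri $sessionsUrl -Method Get -Headers $headers
# pick out the user and the time they have been logged in (illustrative property names)
$sessions | Select-Object userName, loginTime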


I am going to leave it there for this post and in the next part I will get into some of the more interesting REST API resources available.

Sunday, 3 June 2018

Data Management now supports executing business rules

In Data Management, an area of functionality that has not been available until now is the ability to execute business rules. If you wanted to run a business rule to, say, aggregate the loaded data, you would have to run the data load in Data Management and then switch over to planning to run the rule.

This was not a problem if you were running processes outside of the Data Management UI, because with EPM Automate or the REST API you have the ability to run a data load rule and a business rule separately.
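For example, the two steps could be chained outside of the UI with EPM Automate along these lines; the rule names, periods and load modes below are purely illustrative, so check the parameters against the EPM Automate documentation:

epmautomate login epm.user password https://<pbcs_instance>
# run the Data Management load rule followed by the business rule
epmautomate rundatarule "DM_ACTUALS_LOAD" Jan-18 Jan-18 REPLACE STORE_DATA
epmautomate runbusinessrule "MTL_AGG_1"
epmautomate logout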

Before I get on to the new functionality in Data Management I think it is worth briefly going through what is currently available in on-premise FDMEE.

With on-premise FDMEE there is no option to execute business rules, but it is possible to execute calculation scripts; this is true for both Essbase and Planning target applications.

Within the target application details there is a tab for calculation scripts which allows scripts to be run at Data Rule, Location, Category or Application level.


The scripts can be executed before or after a data load, or before or after check events.


The functionality works alongside Essbase runtime substitution variables in calculation scripts, which allows a large selection of values to be passed from FDMEE into the Essbase calc script.


So on-premise FDMEE provides quite a bit of flexibility for running scripts, and even if it doesn’t meet your requirements, custom scripting can help achieve whatever you need.

There is not really any direct access to Essbase in EPM Cloud, which means the new functionality introduced in the 18.06 release is directed towards executing business rules.

Now that I have quickly summarised what is available on-premise, it is time to compare this to the new feature in PBCS and EPBCS Data Management.

There are the following important notes from the documentation for this release:
  • Business rules are registered in the Target Application option.
  • Business rules are available only for Planning applications.
  • Business rules with runtime prompts are not supported when executed from Data Management.
  • Business rules can be assigned an Application scope or to a specific data load rule.
If you are running EPBCS, or PBCS with the additional module option, then the rules can be either a standard calculation script or a Groovy script.

Currently business rules are only executed after data load rules have completed; there are no options to change this behaviour like there are with on-premise.

There are some other points that are not yet mentioned in the documentation which will become apparent as I test out the functionality.

From the above notes, the stand-out one for me is that there is no ability to pass values to runtime prompts, which does limit which rules can be used. This is disappointing, as it is possible to pass runtime parameters when running rules using EPM Automate or the REST API.

Anyway, let’s get on with it and test out executing business rules in Data Management.

If you select a target planning application in data management there will be a new “Business Rules” tab.


As described in the notes, the scope of the rules can only be at application or data load rule level.

There is no option to retrieve the available business rules from planning, so the name must be entered manually and it is not validated.


Personally, I think it would have been nice to be able to select from a list of available rules; it is possible to return a list of rules using the REST API, so I don’t understand why this could not have been included.

I have entered a valid business rule name and set it to application scope, which means any data load rules that are run will execute the business rule after the data load completes.


The load method is set to the default of “Numeric Data Only”, which loads data directly to Essbase using an Essbase load rule.


The data management load rule is then executed and the process logs indicate it was successful.


Looking at the planning job console, no rule was executed.


In the data management process logs there is no mention of the business rule.

I updated the load method in the data load rule to “All data types with security”, which means data is loaded through the planning layer using the Outline Load Utility (OLU).


The data load rule is run again and this time there is an entry in the job console to show the business rule has been run.


In the data management process log there is the following entry:

Property file arguments: /DF:MM-DD-YYYY /DL:tab /PDR:MTL_AGG_1 /I:***Vision_882.dat /TR

So it looks like there is a new property (/PDR) being used in the OLU which defines the business rule to run.

There is no further information in the process log about the business rule or whether it ran; the assumption is that if there is no error, the rule ran successfully.

It is a shame if the business rule functionality only works when the load type is set to all data types; at the time of writing, the documentation does not specify this.

On to the next test, the documentation specifies:

“The Application scope rules does not run if a Data Rule scope exists for the data load rule.”

I added a new valid rule at data rule level and assigned a data load rule to it.


This should mean only the rule defined at data rule level should run and the one at application level will be ignored.

After running the data load rule, the process log proves this theory to be correct.

Property file arguments: /DF:MM-DD-YYYY /DL:tab /PDR:MTL_AGG_2 /I:***Vision_883.dat /TR

The planning job console shows that the rule was run.


For the next test, I wanted to check whether the following information in the documentation was correct:

“If the scope is Data Rule, only rules for the running data rule run in sequential order.”

The statement should also apply to business rules added at application level.

I added a new rule at data load rule level and set the sequence so “MTL_AGG_3” should run first.


The data load rule was successful.


This time the process log had the two rules in the OLU arguments, and in the same order as defined in data management.

Property file arguments: /DF:MM-DD-YYYY /DL:tab /PDR:MTL_AGG_3,MTL_AGG_2 /I:***Vision_884.dat /TR

The job console showed that both the rules ran, and after checking the times, they ran in the correct order.


For the next test I wanted to see what happens if an invalid rule name is entered: would the data management rule fail?


The data load rule ran successfully.


The process log contained the rule name as an argument.

/DF:MM-DD-YYYY /DL:tab /PDR:InvalidRule /I:***Vision_885.dat /TR

So if an invalid rule name is entered there is no failure.

Next, to test what happens if a rule errors, I added a rule which requires a runtime prompt value.


The good news is the data load rule failed.


There was the following entry in the process log that provided the reason for the failure.

com.hyperion.planning.HspCallbackInvocationException: Business rule failed to execute. See the job console page for error details.

As expected, the planning job console shows an error against the rule.


Finally, I wanted to test what happens with a non-admin user and the load method in the rule set to “All data types with security”. This means the data is loaded from data management to planning using the REST API and the import data slice resource; a grid of data is generated in JSON format and posted to planning.

A business rule was defined in the target application details.


The user does not have access to run the rule in planning.


The non-admin user ran the data management rule and it completed with no errors.


This time the process log actually mentions the business rule name, and you can see that the REST API import data slice resource is in operation.

DEBUG [AIF]: businessRuleName: MTL_AGG_1
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice


The job console shows the business rule was run; even though the user does not have access to the rule in planning, it can still be run using the REST API resource.


What is interesting is that in the payload of the REST API request there is a new parameter that is not currently in the REST API documentation.


   "aggregateEssbaseData":false,
   "dateFormat":"MM-DD-YYYY",
   "customParams":{ 
     
"PostDataImportRuleNames":"MTL_AGG_1"

The parameter “PostDataImportRuleNames” allows business rules to be executed after submitting a grid of data.
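To illustrate how that could be called directly, here is a rough PowerShell sketch posting a grid with the new parameter. The single-cell grid and member names are made up, and everything beyond the properties shown above follows the documented import data slice payload, so treat it as a guide rather than a working example:

# basic authorization header for the PBCS instance (placeholders)
$encodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("identitydomain.epm.user:password"))
$headers = @{"Authorization" = "Basic $encodedCreds"}
$url = "https://<pbcs_instance>/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice"
# build the payload - customParams carries the business rule(s) to run after the data is submitted
$payload = @{
    aggregateEssbaseData = $false
    dateFormat = "MM-DD-YYYY"
    customParams = @{ PostDataImportRuleNames = "MTL_AGG_1" }
    dataGrid = @{
        pov = @("Jan","Actual","Working","FY18","P_000")
        columns = @(,@("Units"))
        rows = @(@{ headers = @("110"); data = @("150") })
    }
} | ConvertTo-Json -Depth 10
# post the grid, the rule(s) in PostDataImportRuleNames run after the import completes
Invoke-RestMethod -Uri $url -Method Post -Headers $headers -Body $payload -ContentType "application/json"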

Just to prove that multiple business rules can be executed in a set sequence, I added a new rule and defined the sequence.


The data load rule ran successfully and the process log had entries for both rules in the correct order.

DEBUG [AIF]: businessRuleName: MTL_AGG_2,MTL_AGG_1

The JSON posted with the REST API had both rules, in the correct order.


   "aggregateEssbaseData":false,
   "dateFormat":"MM-DD-YYYY",
   "customParams":{ 
     
"PostDataImportRuleNames":"MTL_AGG_2,MTL_AGG_1"

I checked the job console and the rules did run in the correct sequence.

That completes my initial look at the Data Management functionality for executing business rules; it is certainly not without limitations, and there is room for improvement.