Sunday 3 June 2018

Data Management now supports executing business rules

An area of functionality that has not been available in Data Management until now is the ability to execute business rules. If you wanted to run a business rule to, say, aggregate the loaded data, you would have to run the data load in Data Management and then switch over to planning to run the rule.

This was not a problem if you were running processes outside of the Data Management UI, because with EPM Automate or the REST API you have the ability to run a data load rule and a business rule separately.
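
As an example, a minimal sketch of scripting the two steps with the REST API is shown below; the URL, credentials, data load rule name, periods and modes are purely illustrative, and the payloads are based on my understanding of the Data Management and Planning jobs resources, so treat it as an outline rather than a definitive implementation.

import requests

# illustrative values only - replace with your own instance, credentials and names
dm_jobs_url = "https://cloud-instance/aif/rest/V1/jobs"
planning_jobs_url = "https://cloud-instance/HyperionPlanning/rest/v3/applications/Vision/jobs"
auth = ("identitydomain.user", "password")

# step 1 - run the Data Management data load rule
data_rule_payload = {
    "jobType": "DATARULE",
    "jobName": "Vision_DLR",
    "startPeriod": "Jun-18",
    "endPeriod": "Jun-18",
    "importMode": "REPLACE",
    "exportMode": "STORE_DATA"
}
requests.post(dm_jobs_url, json=data_rule_payload, auth=auth)

# step 2 - run the business rule as a planning job
# (in practice the job status returned from step 1 would be polled until the load completes)
rule_payload = {
    "jobType": "RULES",
    "jobName": "MTL_AGG_1"
}
requests.post(planning_jobs_url, json=rule_payload, auth=auth)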

Before I get on to the new functionality in Data Management I think it is worth briefly going through what is currently available in on-premise FDMEE.

With on-premise FDMEE there is no option to execute business rules, but it is possible to execute calculation scripts; this is true for both Essbase and Planning target applications.

Within the target application details there is a tab for calculation scripts which allows scripts to be run at Data Rule, Location, Category or Application level.


The scripts can be executed before or after a data load, or before or after check events.


The functionality works alongside Essbase runtime substitution variables in calculation scripts, which allows a large selection of values to be passed from FDMEE into the Essbase calc script.


So on-premise FDMEE provides quite a bit of flexibility for running scripts, and even if that doesn’t meet your requirements, custom scripting can help achieve whatever you need.

There is not really any direct access to Essbase in EPM Cloud, which means the new functionality introduced in the 18.06 release is directed towards executing business rules.

Now that I have quickly provided a summary of what is available on-premise, it is time to compare this to the new feature in PBCS and EPBCS Data Management.

There are the following important notes from the documentation for this release:
  • Business rules are registered in the Target Application option.
  • Business rules are available only for Planning applications.
  • Business rules with runtime prompts are not supported when executed from Data Management.
  • Business rules can be assigned an Application scope or to a specific data load rule.
If you are running EPBCS, or PBCS plus an additional module, then the rules can be either a standard calculation script or a Groovy script.

Currently business rules are only executed after data load rules have completed; there are no options to change this behaviour like there are with on-premise.

There are some other points that are not yet mentioned in the documentation which will become apparent as I test out the functionality.

From the above notes, the standout one for me is that there is no ability to pass values to runtime prompts, which does limit the rules that can be run. This is disappointing, as it is possible to pass runtime parameters when running rules using EPM Automate or the REST API.
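
For comparison, when a rule is run as a job through the Planning REST API, runtime prompt values can be supplied as name/value pairs in the request body, along the lines of the sketch below; the rule name, prompt name and member are examples only.

# illustrative jobs payload - the runtime prompt name and member value are examples only
rule_payload = {
    "jobType": "RULES",
    "jobName": "MTL_AGG_1",
    "parameters": {
        "Entity": "410"
    }
}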

Anyway, let’s get on with it and test out executing business rules in Data Management.

If you select a target planning application in data management there will be a new “Business Rules” tab.


As described in the notes, the scope of the rules can only be at application or data load rule level.

There is no option to retrieve the available business rules in planning so the name must be entered manually and it is not validated.


Personally, I think it would have been nice to be able to select from a list of available rules; it is possible to return a list of rules using the REST API, so I don’t understand why this could not have been included.
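
To illustrate, a rough sketch of retrieving the rules with the REST API is shown below; I am assuming the business rules resource in the Planning REST API and that the response contains an items array with the rule names, so take it as an outline only.

import requests

# illustrative values only - replace with your own instance and credentials
url = "https://cloud-instance/HyperionPlanning/rest/v3/applications/Vision/businessrules"
auth = ("identitydomain.user", "password")

response = requests.get(url, auth=auth)
# print the name of each rule returned, which could be used to populate a selection list
for rule in response.json().get("items", []):
    print(rule.get("name"))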

I have entered a valid business rule name and set it to application scope, which means any data load rule that is run will execute the business rule after the data load completes.


The load method is set to the default of “Numeric Data Only” which loads data directly to Essbase using an Essbase load rule.


The data management load rule is then executed and the process logs indicate it was successful.


Looking at the planning job console, no rule was executed.


In the data management process logs there is no mention of the business rule.

I updated the load method in the data load rule to “All data types with security” which means data is loaded through the planning layer using the Outline Load Utility (OLU).


The data load rule is run again and this time there is an entry in the job console to show the business rule has been run.


In the data management process log there is the following entry:

Property file arguments: /DF:MM-DD-YYYY /DL:tab /PDR:MTL_AGG_1 /I:***Vision_882.dat /TR

So it looks like there is a new property being used in the OLU that defines the business rule to be run.

There is no further information in the process log about the business rule or whether it ran; the assumption is that if there is no error, the rule ran successfully.

It is a shame if the business rule functionality only works when the load method is set to all data types; at the time of writing, the documentation does not specify this.

On to the next test, the documentation specifies:

“The Application scope rules does not run if a Data Rule scope exists for the data load rule.”

I added a new valid rule at data rule level and applied a data load rule to it.


This should mean only the rule defined at data rule level should run and the one at application level will be ignored.

After running the data load rule, the process log proves this theory to be correct.

Property file arguments: /DF:MM-DD-YYYY /DL:tab /PDR:MTL_AGG_2 /I:***Vision_883.dat /TR

The planning job console shows that the rule was run.


For the next test, I wanted to check whether the following information in the documentation was correct:

“If the scope is Data Rule, only rules for the running data rule run in sequential order.”

The statement should also apply to business rules added at application level.

I added a new rule at data load rule level and set the sequence so “MTL_AGG_3” should run first.


The data load rule was successful.


This time the process log had the two rules in the OLU arguments, and in the same order as defined in data management.

Property file arguments: /DF:MM-DD-YYYY /DL:tab
/PDR:MTL_AGG_3,MTL_AGG_2 /I:***Vision_884.dat /TR

The job console showed that both the rules ran, and after checking the times, they ran in the correct order.


For the next test I wanted to see what happens if an invalid rule name is entered: would the data management rule fail?


The data load rule ran successfully.


The process log contained the rule name as an argument.

/DF:MM-DD-YYYY /DL:tab /PDR:InvalidRule /I:***Vision_885.dat /TR

So if an invalid rule name is entered there is no failure.

Next, to test what happens if a rule errors, I added a rule which requires a runtime prompt value.


The good news is the data load rule failed.


There was the following entry in the process log that provided the reason for the failure.

com.hyperion.planning.HspCallbackInvocationException: Business rule failed to execute. See the job console page for error details.

As expected the planning job console shows an error against the rule.


Finally, I wanted to test what happens with a non-admin user and the load method in the rule set to “All data types with security”. This means the data is loaded from data management to planning using the REST API and the import data slice resource: a grid of data is generated in JSON format and posted to planning.

A business rule was defined in the target application details.


The user does not have access to run the rule in planning.


The non-admin user ran the data management rule and it completed with no errors.


This time the process log actually mentions the business rule name, and you can see that the REST API import data slice resource is in operation.

DEBUG [AIF]: businessRuleName: MTL_AGG_1
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice


The job console shows the business rule was run; even though the user does not have access to the rule in planning, it can still be run using the REST API resource.


What is interesting is that in the payload of the REST API request there is a new parameter that is not currently in the REST API documentation.


   "aggregateEssbaseData":false,
   "dateFormat":"MM-DD-YYYY",
   "customParams":{ 
     
"PostDataImportRuleNames":"MTL_AGG_1"

The parameter “PostDataImportRuleNames” allows business rules to be executed after submitting a grid of data.
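
To put that into context, below is a minimal sketch of replicating the call outside of Data Management; the grid definition and member names are only an illustration, and the “PostDataImportRuleNames” parameter is taken from the captured payload, so again treat it as an outline rather than a definitive implementation.

import requests

# illustrative values only - replace with your own instance, credentials and members
url = "https://cloud-instance/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice"
auth = ("identitydomain.user", "password")

payload = {
    "aggregateEssbaseData": False,
    "dateFormat": "MM-DD-YYYY",
    "customParams": {
        # run the business rule after the grid of data has been submitted
        "PostDataImportRuleNames": "MTL_AGG_1"
    },
    # an example grid with one cell of data - the members are purely illustrative
    "dataGrid": {
        "pov": ["FY18", "Plan", "Working", "P_410"],
        "columns": [["Jun"]],
        "rows": [{"headers": ["110", "4110"], "data": ["500"]}]
    }
}

response = requests.post(url, json=payload, auth=auth)
print(response.status_code)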

Just to prove that multiple business rules can be executed in a set sequence, I added a new rule and defined the sequence.


The data load rule ran successfully and the process log had entries for both rules in the correct order.

DEBUG [AIF]: businessRuleName: MTL_AGG_2,MTL_AGG_1

The JSON posted with the REST API had both the rules and in the correct order.


   "aggregateEssbaseData":false,
   "dateFormat":"MM-DD-YYYY",
   "customParams":{ 
     
"PostDataImportRuleNames":"MTL_AGG_2,MTL_AGG_1"

I checked the job console and the rules did run in the correct sequence.

That completes my initial look at the data management functionality for executing business rules; it is certainly not without limitations, and there is room for improvement.

6 comments:

  1. Thanks for the detailed explanation and steps

  2. Great post, as usual :-)

  3. Thanks It Works!!!

    I have a VERY important question:

    John Goodwin are you a Real person? Or a group of persons?!!!

    Thanks in advance!!

  4. I am definitely real and it is only me :)

  5. Thank you, John, for the detailed explanation.

    Can we launch a business rule before the data has been loaded to the application? Like clear data rule.

  6. Currently they can only be run after a data load
