Wednesday, 15 March 2017

FDMEE – diving into the Essbase and Planning security mystery – Part 1

It is not an unusual FDMEE requirement for a user to be able to load data to a target application and have their access permissions honoured.

FDMEE is a data management product after all, so you would think this should not be a problem. If your target application is Financial Management then it isn't an issue: security classes are checked when loading data, so data can only be loaded to a member combination that the user has access to.

When the target is Essbase or Planning, things get a little more interesting, and if you compare on-premise, hybrid and cloud it can all get a little confusing.

I thought it would be a good idea to try and clear up any confusion and go through some examples of what happens when loading data at user level compared to admin level and what options are available to overcome these differences.

I am going to break this into two posts: in this first post I will look at the current situation with on-premise and hybrid, and in the next part I will concentrate on EPM cloud.

The on-premise examples will be based on the latest 11.1.2.4 patches for FDMEE, Essbase and Planning. I am fully aware that the functionality in the cloud will be pushed down to on-premise at some point, but you never know when that will happen, and it is good to understand where we currently are.

I will try to update this post to highlight any changes when they do occur.

So, let us start with Essbase and before even going near FDMEE I want to set the scene with simple examples of a user loading data and the security setup behind this.

First, we start with the Shared Services provisioning and the user has been granted the Filter role for the Essbase Sample application


A filter has been created that will only allow write access to the member combination 'Sales, 100-10, Florida, Actual'; there will be no access to any other part of the database.


The filter is then applied to the provisioned user.


To show the filter is in operation I created a simple retrieve in Smart View.


The user should be able to submit data to the Sales member but not COGS.


Once submitted the data for Sales has been loaded and as expected no data has been loaded to COGS.


The user can also load data using a load rule and the same member combination is in the data load file.


The information window confirms data has been loaded to 1 cell and there were errors.


The data load error file contains the rejected row with the COGS member, as the user has no access to it.


A quick retrieve in Smart View confirms this.


Now this is what you would expect to be possible in FDMEE but let us see.

A data load rule has been created to load data to the same member combination as the previous example.


The load method has been set to file.


The options available for ‘Load Method’ are ‘File’ or ‘SQL’. If ‘File’ is selected, a flat file is generated from the FDMEE database repository and then loaded to the target Essbase database using an Essbase load rule; if ‘SQL’ is chosen, data is loaded to Essbase directly from the FDMEE database repository using an Essbase SQL data load rule.

Let us first test the data load process using an admin user.


The full process ran through without any problems, and in the process log you can see what is happening.

INFO  [AIF]: Creating data file: \\fileshare\EPMSHARE\FDMEE\outbox\Sample_2682.dat
INFO  [AIF]: Data file creation complete
DEBUG [AIF]: Created rule file: AIF0062
DEBUG [AIF]: Locked rule file: AIF0062
INFO  [AIF]: Saved rule file to essbase server
DEBUG [AIF]: Unlocked rule file: AIF0062

The data load file is created in the FDMEE outbox location and then an Essbase data load rule is created so the file can be loaded to the Essbase database.

Further down the log you can see the process to load the data using the load rule.

INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: admin
DEBUG [AIF]: Obtained connection to essbase cube: Basic
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0062
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0062
DEBUG [AIF]: Locked rule file: AIF0062
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0062

Another retrieve shows the data has been successfully loaded.


Now on to using the same user as earlier which we know can load data to the Essbase database.


There is no problem loading and mapping the file, and in theory, if FDMEE worked the way you would expect, the Sales record should load and the COGS record should fail.

Running the export indicates there was a failure.


Investigating the process logs provides the reason for the failure.

INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
ERROR [AIF]: com.essbase.api.base.EssException: Cannot open cube outline. Essbase Error(1022002): User [lhowlett@FUSIONAD] Does Not Have Correct Access for Command [OpenOutlineEdit]

The error is generated at the point where the data load rule is created, due to insufficient access permissions. It is correct that the user does not have access to create rules and open the outline in edit mode; the user only needs to load the data.

The errors can be replicated in EAS when a user tries to save a rules file or open the outline in edit mode.


To me, the answer would be for an admin user to create the rule behind the scenes in FDMEE and then have the standard user load the data using that rule; I have shown earlier in this post that this is possible. I don’t quite know why it has been developed this way, but I am sure there must be a reason.

I thought a possible workaround would be to use a custom Essbase load rule, so that FDMEE would not need to create one.


The Essbase load rule was added to target options in the FDMEE load rule.

The export failed again.


Checking the logs this time shows that the process got further and the data file was created.

INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
INFO  [AIF]: The default rule file will not be used as a custom rule file has been specified: FDMEE
INFO  [AIF]: Creating data file: \\fileshare\EPMSHARE\FDMEE\outbox\Sample_2687.dat
INFO  [AIF]: Data file creation complete

Later in the log the reason for the failure is clear.

INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
DEBUG [AIF]: Obtained connection to essbase cube: Basic
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0062
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0062
ERROR [AIF]: com.essbase.api.base.EssException: Cannot lock olap file object. Essbase Error(1051041): Insufficient privilege for this operation

This time it fails due to the user not having access to lock and unlock Essbase load rules; this error can be replicated in EAS by trying to lock a rule.


In the message panel the following error is generated.

Error: 1051041 Insufficient privilege for this operation

The user should not have to lock and unlock the rule; the admin user could lock the rule, the user could load the data using it, and the admin user could then unlock it.

Once again I am not sure why it has been developed in this way in FDMEE but to me it should be possible to load data at user level and honour the security filter.

What about a workaround? Well, up to 11.1.2.3.520 the user would need to be provisioned with at least the ‘Database Manager’ role for the Essbase application.

You wouldn’t want to be giving out that role to users who just want to load data, but that is how it was before that release; luckily most should be running a newer release by now.

From 11.1.2.3.520 there is a property called 'Global User for Application Access' which can be set at target application level in FDMEE.

First I provisioned a user with the ‘Database Manager’ role for the Sample application in Shared Services.


Now the user can be added to target application options in FDMEE.


Let us run the export again in the workbench with the same user as before.


This time the outcome is more positive and in the process logs:

DEBUG [AIF]: GlobalUserForAppAccess from Profile: SampleDBManager
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: SampleDBManager
DEBUG [AIF]: Obtained connection to essbase cube: Basic
INFO  [AIF]: The default rule file will not be used as a custom rule file has been specified: FDMEE
DEBUG [AIF]: Resolved essbase rule file name for loading: FDMEE
DEBUG [AIF]: Fetching rule file from essbase server for data loading: FDMEE
DEBUG [AIF]: Locked rule file: FDMEE
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: FDMEE

We can see that the data load has now been overridden by the global user; this user has the permissions to create rules and to lock and unlock them, so the data load runs through successfully.

The issue here is that the global user can load data to any member combination, so we have lost the filter restriction we set earlier.

Unfortunately, that is the best that can be offered presently which I know is not ideal.

Moving on to Planning, let us take a similar example with the same user.

The user is provisioned with the ‘Planner’ and ‘Essbase Write Access’ roles for the sample Vision application.


Access permissions have been applied within the planning application and for direct Essbase access a filter has been automatically created for the user.

This time write access has been defined for the member combination ‘1110,110,Actual,Working’ and read access for ‘1150’.


The access permissions for the planning layer are confirmed to be working with a form.


A retrieve using an Essbase connection confirms the filters are working as expected.


On to FDMEE: a load rule was created which will load two rows of data from a flat file to the above member combinations.

The load method was set to ‘Numeric Data only – File’.


The property values available for a target planning application are:


I will go through the ‘All Data Types’ option shortly.

The load process was first tested with a user that is an administrator of the planning application.


The process log shows that the method of loading data is the same as when the target is an Essbase application.

DEBUG [AIF]: GlobalUserForAppAccess from Profile: null
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: admin
DEBUG [AIF]: Obtained connection to essbase cube: Plan1
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0069
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0069
DEBUG [AIF]: Locked rule file: AIF0069
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0069

I think you know what is coming when we try to run the FDMEE export with a user provisioned as a planner.



Yes, it failed and no surprises with the error message.

DEBUG [AIF]: GlobalUserForAppAccess from Profile: null
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
DEBUG [AIF]: Obtained connection to essbase cube: Plan1
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0069
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0069
ERROR [AIF]: com.essbase.api.base.EssException: Cannot lock olap file object. Essbase Error(1051041): Insufficient privilege for this operation

We are in the same position as when loading directly to Essbase target applications.

Just like with Essbase there is the option to set a global user.


A global user was added that has the administrator role assigned for the planning application.

The export was run again.


This time the export was successful and the process log confirms what is happening.

DEBUG [AIF]: GlobalUserForAppAccess from Profile: planadmin
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: planadmin
DEBUG [AIF]: Obtained connection to essbase cube: Plan1
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0069
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0069
DEBUG [AIF]: Locked rule file: AIF0069
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0069

The global user overrides and loads the data; as it is an administrator loading the data, the access permissions are ignored and all member combinations can be loaded.

There is another data load method available at data load rule level, ‘All Data Types’, which uses the outline load utility to load data through the planning layer.


The global user was removed and the export run again with the standard user.


Failed again, the process logs provide the reason behind the failure.

DEBUG [AIF]: GlobalUserForAppAccess from Profile: null
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
DEBUG [AIF]: loadMethod: OLU
Unable to obtain dimension information and/or perform a data load: java.lang.RuntimeException: You must be an Administrator to use the Hyperion Planning Adapter.

Back to square one again as you need to be an administrator to operate the outline load utility.

If the planning application administrator is added back in as the global user, then the export is successful.


DEBUG [AIF]: GlobalUserForAppAccess from Profile: planadmin
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: planadmin
DEBUG [AIF]: loadMethod: OLU
INFO  [AIF]: Number of rows loaded: 2, Number of rows rejected: 0

The global user overrides and loads the data, but once again you lose the member-level security, so not great. In the next post you will see how this load method differs quite considerably in the cloud.

Finally, I want to quickly cover off the user requirements for hybrid, meaning any integrations between on-premise FDMEE and EPM Cloud.

When you add a cloud application as a target, the cloud user credentials are entered, and whenever authentication with the cloud is required these credentials are used.


As an example, I added a user which has been provisioned with the power user role for the cloud application.

I ran a simple integration which extracts data from an on-premise application and loads it to a cloud application; the process failed and the logs contained the following information.

INFO  [AIF]: Uploading data file to PBCS: \\fileshare\EPMSHARE\FDMEE\outbox\Vision_1991.dat
ERROR [AIF]: java.io.FileNotFoundException: Response: 401: Unauthorized for url: https://cloudinstance/interop/rest/11.1.2.3.600/applicationsnapshots/Vision_1991.dat/contents?q={chunkSize:63,isFirst:true,isLast:true,extDirPath:"inbox"}

The process failed at the point where the extracted on-premise file is uploaded to the cloud instance using the REST API; the 401 error gives an indication that it might be user related.
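From the log entry the shape of the upload call can be reconstructed. The sketch below only rebuilds the URL; the chunk parameter handling is an assumption for illustration rather than the documented interop API contract.

```python
# Rebuild the upload URL seen in the failing log entry. The q fragment
# identifies the chunk of the file being sent; chunking behaviour here is
# an assumption based on the single log line above.
def build_upload_url(instance, api_version, file_name,
                     chunk_size, is_first, is_last, ext_dir_path="inbox"):
    q = '{chunkSize:%d,isFirst:%s,isLast:%s,extDirPath:"%s"}' % (
        chunk_size, str(is_first).lower(), str(is_last).lower(), ext_dir_path)
    return "https://%s/interop/rest/%s/applicationsnapshots/%s/contents?q=%s" % (
        instance, api_version, file_name, q)

# Reproduces the URL from the log; in the real process this is called with
# basic authentication for the configured cloud user, and a 401 comes back
# when that user lacks the required role.
url = build_upload_url("cloudinstance", "11.1.2.3.600", "Vision_1991.dat",
                       63, True, True)
```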

Uploading a file using EPM Automate with the same user returns an error that the user has insufficient privileges.


The documentation states the following.

“EPM Automate Utility enables Service Administrators to automate many repeatable tasks”

So, to be able to upload the file from on-premise to the cloud requires the service administrator role.

After updating the cloud user to one that has the service administrator role the hybrid integration was successful.

The process logs show that once the file has been loaded to the cloud instance, the remaining steps are performed by the default cloud admin account, which overrides the user configured in FDMEE.

INFO  [AIF]: Cloud Mode: CLOUD, Resolved user name for application access: epm_default_cloud_admin

Well, that about covers all I wanted to in this post. In summary, if you are looking to implement user-level security for data loading to Essbase and Planning then you’re going to be disappointed; this is no doubt going to change in the future, and in the next post I will cover how EPM cloud currently differs from on-premise in this respect.

Monday, 27 February 2017

Loading Planning metadata from a relational source through FDMEE and Planning

If you are using FDMEE to load data to a planning application then great, but what about metadata? Not so great at the moment; yes, you can load metadata from a supported ERP source, but that is still pretty limited.

Wouldn’t it be nice to be able to initiate a metadata load to a planning application from FDMEE before loading data? With a custom solution this can be made possible.

In this post I am going to concentrate on loading metadata from a relational source but the concept is the same and can be achieved if the source is file based.

Loading metadata to planning from a relational source has been available in the outline load utility (OLU) since 11.1.2.2.300; I wrote a detailed blog about the functionality which you can read here.

I think the functionality is great as there are many situations where you want to load from a source system or a staging area instead of relying on flat files.

The problem is that it has never been incorporated into the planning UI like flat file loads have, and it has to be run from the command line using the outline load utility. I didn’t want the FDMEE custom solution to have to resort to calling batch scripts on the file system, as that would just overcomplicate matters.

The main requirements for the FDMEE solution were:

Ability to load metadata from a relational source to a planning application.
Allow selection of the dimension to load metadata to.
Option to perform a database refresh if required.
Non-planning administrators should be able to run the metadata load.
Be able to run from a client machine without the need to log into FDMEE.
No calling of batch scripts on the file system.
No changes to the default deployment of FDMEE, such as adding jars.
No hard coding of clear text passwords.
Log records loaded and rejected.

So let me go through the solution I came up with in FDMEE and then move on to Planning.

You are probably aware that FDMEE allows custom Jython scripts to be registered and executed, so this seemed like the logical route for the solution. After lots of research I found it is possible to use Jython to integrate directly with the OLU, so there is no need for additional configuration or access to the file system to run the utility; it is pretty much the same concept that already exists within FDMEE.

Before getting on to the Jython script I registered a new custom script in FDMEE.


The custom script definition was set up with two query parameters; the first is a SQL query to return the target planning application dimensions from the FDMEE repository.


The other query was to determine whether a planning refresh should be run by the simple selection of ‘Yes’ or ‘No’.
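As a sketch, the two parameter queries could look something like this. The repository table and column names (AIF_TARGET_APPL_DIMENSIONS and so on) are assumptions about a typical FDMEE schema rather than anything taken from this post, and the Yes/No query assumes an Oracle repository database.

```python
# Hypothetical versions of the two query parameters. The table and column
# names are assumptions about the FDMEE (AIF) repository schema.
def dimension_query(application_id):
    # Returns the dimension names registered for the target application
    return ("SELECT DIMENSION_NAME FROM AIF_TARGET_APPL_DIMENSIONS "
            "WHERE APPLICATION_ID = %d ORDER BY DIMENSION_NAME" % application_id)

# The Yes/No lookup only needs to return two fixed values
# (DUAL assumes an Oracle repository database)
REFRESH_QUERY = "SELECT 'Yes' FROM DUAL UNION ALL SELECT 'No' FROM DUAL"
```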

Moving on to the Jython script: I am not going to provide every line of the script here, but if you are interested in this type of solution please feel free to get in touch.


The next section of the script stores the target application name and retrieves, from the execution parameters, the dimension to load to and whether to run a refresh.

Part of the SQL is dynamically generated from the dimension name; in this example the SQL is extremely simple for demo purposes.

The process log full path and file name is generated from a combination of the FDMEE outbox, target application, dimension name and the current process ID.
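As a rough illustration of those two steps: the staging table naming below is purely invented for the demo, and in the real script the outbox path and process ID come from the FDMEE context rather than function arguments.

```python
# Sketch of the dimension-driven SQL and the OLU log path derivation.
# STG_<DIMENSION>_METADATA is a hypothetical staging table name.
import os

def metadata_sql(dimension):
    # Deliberately as simple as the demo query described in the post
    return ("SELECT PARENT, MEMBER, ALIAS, DATASTORAGE "
            "FROM STG_%s_METADATA" % dimension.upper())

def olu_log_path(outbox, target_app, dimension, process_id):
    # Outbox + target application + dimension + current process ID
    return os.path.join(outbox, "%s_%s_%s.log" % (target_app, dimension, process_id))
```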


What I have not shown above is that an SSO token for a planning application administrator is generated at run time, which means the password is not hard-coded in the script. The SSO token is required when calling the outline load utility, and this also satisfies one of my requirements: non-planning admins can run the metadata load.

The next section sets the properties for the outline load utility, such as database connection, authentication, SQL and dimension information; if you want to understand these properties in more detail then take a look at a previous blog on the outline load utility.


The last section of the script calls a Jython function to update the FDMEE process log to a status of running.

A function is called to run the OLU with passed in parameters for the SSO token, application name, OLU properties and log file information.

The function returns the status of the metadata load with the number of records loaded and rejected.

If no records were loaded or there were rejected records, then the FDMEE process log is updated to warning status.

Finally, a link to the OLU log is added to the FDMEE process ID so it can be downloaded from process details.


I know you are not interested in the code and much prefer to see the solution in action :)

To start with, I have an as-basic-as-it-gets database table containing account member information for parent, member name, alias and data storage; obviously it would have additional properties in a real-life scenario.


The aim is to load the members in the above table as children of the following member in the account dimension.


As the script has been registered it is now available from the script execution area in FDMEE.


Executing the script opens up a window with the option to select the target planning application dimension and whether to refresh the database.


Selecting the magnifying glass for target dimension will run the predefined SQL query to return the available dimensions in the target application.


Selecting the refresh database will also run a SQL query to populate the Yes/No values.


In my example I have selected “Account” as the target dimension and not to refresh the database.


Once executed the process will be started and the Jython script called.



One of the process steps includes details that account dimension members have successfully been loaded to the planning application.

You will notice that there is an option to download a file, this is the OLU log which I will show a little later.

The main process log contains details on the target application, dimension, whether a refresh was performed and more importantly the number of rows loaded and rejected.


If I run the custom script again but this time set it to refresh the database.


The OLU will perform a database refresh from Planning to Essbase, and this can be viewed in the OLU log, which can be downloaded from process details.


If I go back into the planning application I can confirm the account members have definitely been loaded.


All very nice, but what if you want to run the metadata load outside of FDMEE? Well, from 11.1.2.4.210 you have the option to use the REST API to run FDMEE custom scripts.

If you want to understand all about REST and FDMEE then have a look at the two-part series I wrote, part 1 and part 2.

With a REST client it is easy to run a custom script with parameters, the URL format for the REST resource is:

http(s)://<webserver>:<port>/aif/rest/V1/jobs


The response will include the process ID and current status.


It is possible to keep checking the status of the job using the format:

http(s)://<webserver>:<port>/aif/rest/V1/jobs/<jobid>

One of the requirements for the solution was the ability to run the load from a client machine without the need to log into the FDMEE UI, and with REST this is now possible. You wouldn’t really want a power user messing with REST clients, though, so with a little help from PowerShell this can be made user friendly.
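The same wrapper logic can be sketched in Python (the actual script was PowerShell). The payload keys and the status values below are assumptions about the FDMEE REST contract for this release, so treat this as an outline rather than the exact API.

```python
# Submit the custom script job and poll /jobs/<jobid> until it finishes.
# jobType/jobName/parameters and the "RUNNING" status are assumptions.
import time

BASE_URL = "http://webserver:6550/aif/rest/V1/jobs"  # hypothetical server/port

def build_payload(script_name, dimension, refresh):
    # Parameter names mirror the custom script's two query parameters
    return {"jobType": "CUSTOM",
            "jobName": script_name,
            "parameters": {"Target Dimension": dimension,
                           "Refresh Database": refresh}}

def wait_for_job(job_id, fetch_status, interval=10, max_polls=60):
    # fetch_status would issue the GET to BASE_URL/<job_id>; it is injected
    # here so the polling logic stays independent of the HTTP client used
    for _ in range(max_polls):
        status = fetch_status(job_id)
        if status != "RUNNING":
            return status
        time.sleep(interval)
    return "TIMED_OUT"
```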


I have displayed some of the code for the script, but the end user would not need to get involved with that; they would only need to run the script. They would be prompted to enter the dimension to load and whether to run a refresh, and the script would then keep checking the status until the load was complete.

Looking back at my original set of requirements, I believe that all of them have been covered by this solution.

Moving on to the next piece: the ability to call the FDMEE custom script from within planning. In a previous post I wrote about running an FDMEE data load rule from a planning form using Groovy.

I am going to use the same concept here, but things have moved on a little; at the time of writing that blog the REST API was not available in FDMEE, so I had to call a servlet. I have now updated the Groovy script to call the REST API instead of the servlet.

I am not going to go into detail as much about how Groovy can be implemented in planning so please refer to my other post.

What I did was create two business rules which are attached to a planning form; the first rule runs the FDMEE custom script to load metadata and the second checks the status of the process.

The business rule to run the custom script basically calls a Groovy script and passes in variables to define the dimension to load to, whether to perform a refresh, and a flag to let the script know this is a metadata load; the flag is so I could use the same Groovy script to check the status of the job by just changing the value of the variable.


The main section of the rule has a FIX to narrow down the intersection of data on the planning form.

Member ‘FDMEE_Run_Date’ will store the current date.

‘FDMEE_Load_Status’ will hold the value returned from the Groovy script; the value determines the status of the FDMEE rule run, and in planning the member has a Smart List assigned to provide a friendly status name instead of a number.

‘FDMEE_Process_ID’ will contain the FDMEE process ID returned from the Groovy script.

In my example I am using the @CalcMgrGroovyNumber function, which returns a number; the number is then processed to store the correct values for both the load status and process ID members.

The dimension variable will be picked up from the form and passed into the Groovy script, the ‘runRefresh’ variable is a runtime prompt and the value is entered when the rule is run.


The rule to check the status of the FDMEE process is very similar, except it passes “C” for check and the FDMEE process ID into the Groovy script.

As for the Groovy script, once again there is no need for me to provide every line; an understanding of what it is doing is what matters.

User credentials are encoded into base64 to be sent in the header of the REST request, the URL of the FDMEE REST resource is defined, and the job ID is included if the script is being run to check the status of the FDMEE process.
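For reference, the credential handling amounts to a few lines; note that base64 is an encoding rather than encryption, so the script and credentials still need to be protected. A Python sketch of the same idea:

```python
# Build the basic authentication header used by the REST request.
import base64

def basic_auth_header(user, password):
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    return {"Authorization": "Basic " + token}
```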


If the script is being run to load metadata, then the payload of the REST request is built using JSON, including the parameters for the target planning dimension and whether to refresh the database, and the method for the request is set to POST; if the script is being run to check the status, the method is set to GET.


Next, a Groovy method is called to execute the REST request; the REST URL, method, payload and content type are passed in.

The response of the REST request is stored and then the JSON parsed.


Finally, the status of the FDMEE process is stored and converted into a format that can be returned to the business rule.

I use this format so I can pass back both the job ID and the status of the job as one value; the format is jobid.jobstatus, so if the job ID was 2520 with a status of -1, meaning in process, then 2520.01 is passed back into the business rule.

The business rule then takes 2520 and stores it against member 'FDMEE_Process_ID', and the '01' value is converted in the business rule and stored against member 'FDMEE_Load_Status'; this member has a Smart List attached so the number is displayed as meaningful text in the form.
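The packing and unpacking of the jobid.jobstatus value can be sketched as follows; only the mapping of status -1 (in process) to ‘.01’ is shown in the post, so the general mapping of status values to two-digit codes here is an assumption.

```python
# Pack a job ID and a two-digit status code into one number, and unpack it
# again on the business rule side.
STATUS_RUNNING = 1  # FDMEE status -1 packed as the two-digit code 01

def encode(job_id, status_code):
    # e.g. job 2520 still running -> 2520.01
    return float("%d.%02d" % (job_id, status_code))

def decode(packed):
    job_id = int(packed)
    status_code = int(round((packed - job_id) * 100))
    return job_id, status_code
```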


You may be wondering what the hell that is all about, so it is much nicer to see a simple demonstration.

I created the following read only form, security has been applied so the user will only have access to dimension loads that they should be allowed to run.


Two business rules have been attached to the form, one to run the metadata load and one to check the status.


If I select to run the metadata load the option to run a refresh will be displayed in the runtime prompts window.


The rule ran successfully so hopefully the account dimension build will have been started.


The form is now populated with the load status, run date and the process ID.


As the status was shown as running the second rule can be run to check the status.


This time the form has been updated with a successful status.


To prove it is working as expected, I can cross-check the process ID in the form against the one in FDMEE.


All good. So what about when there are rejected records in the load? To show this I have updated the database table to set an invalid parent member in the account dimension.


If I run the load metadata rule again and then check the status, this time it is shown as a warning.


In FDMEE the process step for loading the members is shown as a warning; this is because in the Jython script I set the process status to warning if any records are rejected.



Checking the main process log confirms that one record has been rejected.


As I mentioned earlier, in the Jython script I attach the OLU log so it can be downloaded from the output file column in process details.


The OLU log contains the reason why the record was rejected.


So there we go, I know there was a lot to take in but if it is the sort of solution you would like to know more about then please feel free to contact me.