Sunday, 16 April 2017

EPM Cloud - masking data

One of the new features in release 17.04 is the ability to mask data using the EPM Automate utility, which also means it should be available through the REST API. The release documentation provides the following information:

“A new version of the EPM Automate Utility is available with this update. This version includes the maskData command, which masks all application data to ensure data privacy. You use this command only on test instances to hide sensitive data from users; for example, application developers, who should not access your business data.”

While the documentation says to use the command only on test instances, it should be possible to run it on a production instance, though in reality you probably wouldn’t want to do that anyway.

The command is available for FCCS, PBCS, ePBCS and TRCS.

It is worth pointing out before using the command that it will update all data and make it meaningless, so make sure you have a backup such as a maintenance snapshot, and that you run it against the correct instance, as there is no going back.

The EPM Automate command syntax is:

epmautomate maskdata -f

-f is optional and suppresses the prompt asking the user to confirm whether to run the command; you would only really use this parameter if you are automating the data masking process.
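As a minimal illustration of using the flag in automation, the command line could be assembled as below (Python is used here purely as a neutral scripting language; invoking the utility via subprocess is my assumption, not something from the release notes):

```python
import subprocess

def maskdata_command(force=True):
    """Build the EPM Automate argument list; include -f only for
    unattended runs so the confirmation prompt is suppressed."""
    cmd = ["epmautomate", "maskdata"]
    if force:
        cmd.append("-f")
    return cmd

# In an automated job (after an epmautomate login) you would run:
# subprocess.run(maskdata_command(), check=True)
print(maskdata_command())  # ['epmautomate', 'maskdata', '-f']
```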

So let’s give it a test drive and run the command.


The output from the command gives us a little more insight into what is going on behind the scenes; as this is Software as a Service, we rarely get to see the full story of what is happening.

The process flow is for each database in the application:
  • Export all data in masked format
  • Clear all data from cube
  • Load masked data back into the cube
This applies to all cubes within the application, including ASO. For BSO it is a full export, so depending on the size of the databases the command could take a while to complete. I will go into more detail shortly on how I believe the masking of data is being achieved.

The following form shows an example of the data before running the mask data command.


After running the command the data has been masked and is now totally meaningless.


At the start of this post I said that the functionality should be available through REST. If you are not aware, the EPM Automate utility is basically built on top of the REST API, so if the functionality is in the utility then it should be possible to access it through REST.

The URL format for the REST resource is:

https://<cloudinstance>/interop/rest/v1/services/maskdata

I can replicate this with a REST client using a POST method.


The above response includes a URL which can then be accessed to check the status of the data masking.


A status of -1 means the process is running and a status of 0 indicates the process has completed; any other value means the process has failed.

For automation purposes this can easily be scripted in your preferred language. I have put together an example PowerShell script that calls the mask data REST resource and then keeps checking the status until the process has completed.
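The essential logic can be sketched in a few lines (Python here rather than PowerShell, purely for illustration; the user/password basic authentication and the exact shape of the status response, the "links"/"href" fields in particular, are my assumptions based on the output shown in this post):

```python
import base64
import json
import time
import urllib.request

def interpret_status(status):
    """Map the numeric job status: -1 running, 0 completed, anything else failed."""
    if status == -1:
        return "running"
    if status == 0:
        return "completed"
    return "failed"

def call_rest(url, user, password, method="GET", body=None):
    """Minimal REST helper using basic authentication."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(url, data=data, method=method)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def mask_data(instance, user, password):
    """Submit the mask data job, then poll until the job leaves the running state."""
    job = call_rest(f"https://{instance}/interop/rest/v1/services/maskdata",
                    user, password, method="POST")
    while interpret_status(job["status"]) == "running":
        time.sleep(15)
        # the response includes a URL that can be accessed to check the
        # status of the masking; the JSON shape here is an assumption
        job = call_rest(job["links"][0]["href"], user, password)
    return interpret_status(job["status"])
```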


So how is the masking of data achieved? I know some people believe that if you put the word cloud into any sentence then some kind of magic occurs, but unfortunately I can’t always accept that. The clue was there when running the EPM Automate “maskdata” command:

“This command will export the data in masked format”

In the on-premise world a new feature was introduced into 11.1.2.4 MaxL and the following is taken from the Essbase new features readme.

“The MaxL export data statement includes grammar you can use to make exported data anonymous, wherein real data is replaced with generated values. This removes the risk of sensitive data disclosure, and can be used in case a model needs to be provided to technical support for reproduction of certain issues.”

The Essbase tech ref provides additional information on how the data is masked; for BSO:

“Export data in anonymized format. Anonymization removes the risk of sensitive data disclosure, and can be used in case sample data needs to be provided for technical support. Essbase replaces real data values with incremental values beginning with 0, increasing by 1 for each value in the block.”

If I take the Sample Basic BSO database for simplicity I can demonstrate what is happening in the cloud using MaxL.


The above example is a full data export using the anonymous syntax and the output shows how the cells in each block have been replaced with incremental values.


I know the basic database usually has scenario as a dense dimension but I updated it to sparse for this example.

Each block in the database will have the same incremental values including upper level blocks which means the aggregated values for stored upper level members will be incorrect.
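The numbering scheme is easy to picture with a toy example (this is only an illustration of the pattern; the nested lists stand in for blocks and are nothing like Essbase’s actual storage):

```python
def anonymize_blocks(blocks):
    """Replace every cell in each block with incremental values starting
    at 0, as the anonymous BSO export does per block."""
    return [[float(i) for i in range(len(block))] for block in blocks]

# two "blocks" of real data values
blocks = [[500.0, 262.5, 3017.5], [620.0, 310.0, 2448.0]]
print(anonymize_blocks(blocks))  # [[0.0, 1.0, 2.0], [0.0, 1.0, 2.0]]
```

Every block ends up with the same 0, 1, 2… sequence, which is exactly why stored upper-level aggregations no longer make sense afterwards.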

For on-premise if you wanted the data to be aggregated correctly you could run the anonymous export for level 0, clear, load and then aggregate. For the cloud, you don’t have that control so you could run the mask data command, clear upper level data and then aggregate with a business rule.

A spreadsheet retrieve highlights the difference before and after masking the data.


Moving on to ASO, which uses a slightly different approach to masking the data; the documentation provides further details:

“Export data in anonymized format. Anonymization removes the risk of sensitive data disclosure, and can be used in case sample data needs to be provided for technical support. Essbase replaces real data values with 1, for each value in the block.”

I am not sure I agree with the statement about blocks and would prefer it to say input-level cells; also the way the data is anonymized is quite primitive, but at least it gives a true reflection of where the data did exist.

Anyway, let us take another example of masking the data by using the anonymous syntax against the ASO sample application.


The exported data contains all the level 0 data values replaced with a 1.
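The ASO behaviour is even simpler to mimic (again just a sketch of the pattern, not Essbase internals; the tuple keys standing in for intersections are my own representation):

```python
def anonymize_aso(cells):
    """Replace every input-level cell value with 1, as the anonymous ASO
    export does; intersections keep their place, values do not."""
    return {intersection: 1 for intersection in cells}

cells = {("Cola", "Jan", "Sales"): 1812.0, ("Cola", "Feb", "Sales"): 1754.0}
print(anonymize_aso(cells))  # every value becomes 1
```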


Another spreadsheet retrieve shows the difference before and after anonymizing the data.



I am going to leave it there, but hopefully you now understand the concept of how the data mask functionality works; even though it is a cloud command, the process can be replicated on-premise with a greater level of flexibility.

Thursday, 30 March 2017

FDMEE – diving into the Essbase and Planning security mystery – Part 2

In the last part I took an in-depth look at the current situation with on-premise FDMEE and user-based security when the target is either Essbase or Planning. I pretty much concluded that a user’s security is not honoured and a global admin user must be defined, which basically means no security restrictions on the data that is being loaded.

In this part I am going to go through the same process for EPM Cloud to see how it differs; the cloud products I will be concentrating on are PBCS (including enterprise) and FCCS.

Let us dive right in and look at loading data to a target planning application, at present there are two load methods available and we will start with “Numeric Data Only”


This method is the equivalent of the on-premise load method “Numeric Data only – File”, but in the cloud the difference is that it can only be defined at application level; for on-premise it can be set at load rule level, which makes more sense. I am not sure why it has not been implemented like that in the cloud, but I am sure it will change, probably without any warning.

As always I am keeping things nice and simple and will be loading two rows of data.


I will start with a user that has been provisioned with the role of “Service Administrator” which is the same as a planning application administrator.


As expected the process ran through smoothly and the process log provides details on how the data is being loaded.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0029
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0029
DEBUG [AIF]: Locked rule file: AIF0029
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0029

The process is the same as with on-premise where a data file and load rule are created then the data is loaded using the load rule.

The difference between cloud and on-premise is that the user loading the data is the default cloud admin and not the user running the export; ‘epm_default_cloud_admin’ is a system-generated account that carries out admin-type duties, and there is no way to change this behaviour.

To be sure I provisioned a user with the power user role and ran the export again.


The process log confirms that the user running the export is being overridden with the default cloud admin account.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0029
INFO  [AIF]: The data has been loaded by the rule file.

In the cloud there is no option to set a global user, so when loading data using the "Numeric Data Only" method the user will be ignored and the admin account will take control of the process; this also means it is not possible to honour the user’s data-level security with this method.

So, in some respects there is a similarity between cloud and on-premise when using this method as for it to function correctly it requires an admin type role and there is no control on restricting the security at data level.

Let us switch back to a service administrator user and set the load method to “All data types with security”


You would usually set this method if you are planning on loading non-numeric data, which I previously wrote a post about. When this method was first introduced it was called “HPL”, and it was subsequently renamed.

The method’s name includes a reference to security, so maybe that is an indication that it operates differently.

The export was successfully run again.



The process log contains important details about what is happening behind the scenes.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: John.Goodwin@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the admin user: OLU
Outline data store load process finished. 2 data records were read, 3 data records were processed, 3 were accepted for loading (verify actual load with Essbase log files), 0 were rejected.

This time the default cloud admin has not overridden the user running the export, and the method used to load the data is the Outline Load Utility (OLU). As this user has the service administrator role, security restrictions on the data are not important so the OLU can be used; in the last part I showed what happens if a non-admin user tries to load data using the OLU.

Now let’s look at what happens with a user that has the power user role assigned, which I believe is the minimum role requirement for Data Management access. Please note I have not yet assigned any access permissions for the user in the planning application, as I just want to show what happens.

The export is run and it fails.


This time in the process log there are noticeable differences to the way the data is being loaded.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: l.howlett@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice
ERROR [AIF]: The rest service request has failed: 400 Bad Request - {"status":400,"message":"java.lang.NullPointerException","localizedMessage":"java.lang.NullPointerException"}

There is no overriding of the user performing the export, and as the user is not an admin the method for loading data is the REST API.

If you are not aware, there are REST resources available for importing, exporting and deleting data by building up a data grid, which I will go into in more detail shortly.

The REST call failed, and even though the error message is not clear, the reason is that no access permissions have been defined.

Now I am going to add access permissions but not for all dimensions and run the export again.



The export failed again which was to be expected and now the error message that is produced is something you should be familiar with if you have worked with planning.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: l.howlett@mtl.com 
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice
ERROR [AIF]: The rest service request has failed: 400 Bad Request - {"status":400,"detail":"You are trying to open the form, but cannot because all of the required dimensions are not present. Possible causes may be that you do not have access to at least one member of a required dimension, or the member selection resulted in no members present. Contact your administrator."}

The error message is the same one you receive when a user opens a form and does not have all the required access permissions to the members in the form, so it looks like the REST resource sits on top of existing functionality built into forms.

I updated the access permissions so the user has write access to all the members contained in the data load.


This time the export was successful and the process log confirms the number of rows loaded.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: l.howlett@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice
INFO  [AIF]: Number of rows loaded: 2, Number of rows rejected: 0

Opening a form with the same user shows that the data was correctly loaded.


So what is happening behind the scenes to load the data using the REST API? Well, when an export is initiated a file is created which contains JSON and a grid based on the data that is being loaded.


The JSON in the file is then posted to the "importdataslice" REST resource, and the details of the user running the export are passed in so that their security is honoured.

The URL format for the REST resource is:
https://<cloudinstance>/HyperionPlanning/rest/v3/applications/<app>/plantypes/<cube>/importdataslice
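A small helper shows how such a grid payload can be assembled; the dataGrid/pov/columns/rows field names below follow the published grid format for this resource, but the POV layout and member values are only illustrative, not the actual generated file:

```python
import json

def build_grid(pov, columns, rows):
    """Assemble an importdataslice payload; each row is a (headers, data)
    pair, mirroring a row of a form grid."""
    return {
        "aggregateEssbaseData": False,
        "dataGrid": {
            "pov": pov,
            "columns": columns,
            "rows": [{"headers": h, "data": d} for h, d in rows],
        },
    }

payload = build_grid(
    pov=["Actual", "No Version", "BaseData", "P_000", "FY17"],
    columns=[["Mar"]],
    rows=[(["1520", "110"], ["50"]), (["2210", "110"], ["100"])],
)
print(json.dumps(payload, indent=2))
```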

An example of the JSON that is being generated for the two rows of data is:


The response that is returned is in JSON format and contains information about the number of cells that were accepted and rejected.


The JSON grid that is being generated in my example would look like the following in terms of a planning form.


To demonstrate what happens with rejected cells I updated the JSON to include an account member “4110” which the user does not have access to.


The response shows that one cell was rejected and contains the row of the grid that was rejected, it does not provide the actual member that was rejected though.


I will take another example in Data Management and load four rows of data, the user does not have access to entity member “111” so in theory two rows of data should be rejected.


I was expecting the export process to be successful but contain warnings, but it looks like if any invalid data is encountered the process status is shown as a failure.



Investigating the process logs shows that two rows were loaded and two were rejected, which is what I would expect; the list of rejected rows is written to the log and generated from the REST response, as in my previous example.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: l.howlett@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice
INFO  [AIF]: Number of rows loaded: 2, Number of rows rejected: 2
INFO  [AIF]: List of rejected cells: ["[Actual, 1520, 111, No Version, BaseData, P_000, FY17, Mar]","[Actual, 2210, 111, No Version, BaseData, P_000, FY17, Mar]"]
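If you need to report on the rejected rows, the strings in the log can be parsed back into member lists with a small helper based on the format shown above (the position of the entity member in each list is specific to this example’s dimension order):

```python
def parse_rejected(cells):
    """Turn strings like "[Actual, 1520, 111, ...]" into lists of members."""
    return [c.strip("[]").split(", ") for c in cells]

rejected = ["[Actual, 1520, 111, No Version, BaseData, P_000, FY17, Mar]",
            "[Actual, 2210, 111, No Version, BaseData, P_000, FY17, Mar]"]
# in this dimension order the third member of each rejected row is the entity
print({row[2] for row in parse_rejected(rejected)})  # {'111'}
```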

I was interested to know whether valid intersections would restrict the data being loaded; as the REST functionality looks to be built on top of form data grids, the same logic should apply.

I updated the user’s access permissions so they could write to entity members “110” and “111”, and then restricted permissions to entity “111” using a valid combination.


The export failed.



The rows containing entity member “111” were rejected so valid intersections are honoured when loading data using the REST method.

INFO  [AIF]: Number of rows loaded: 2, Number of rows rejected: 2
INFO  [AIF]: List of rejected cells: ["[Actual, 2210, 111, No Version, BaseData, P_000, FY17, Mar]","[Actual, 1520, 111, No Version, BaseData, P_000, FY17, Mar]"]

My previous examples have all been using PBCS, which also applies to E-PBCS, so how about FCCS? With FCCS the load type can be set at rule level.


There are only two options available which are Data and Journal.


I am not going to bother covering the journal option as setting this option will generate a Journal in FCCS and this does not relate to what this post is about.

I loaded the following data set with a user that has the Service Administrator role applied.


The process log confirms that when using the “Data” load type in FCCS it acts in the same way as “All data types with security” in PBCS: if the user has the Service Administrator role, the data is loaded using the Outline Load Utility.

INFO  [AIF]: cloudServiceType: FCCS, Resolved user name for application access: John.Goodwin@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the admin user: OLU
INFO  [AIF]: Number of rows loaded: 1, Number of rows rejected: 0

If I switch to a power user with the correct access permissions for the data that is being loaded, then the export is successful.


INFO  [AIF]: cloudServiceType: FCCS, Resolved user name for application access: l.howlett@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/FCCS/plantypes/Consol/importdataslice
INFO  [AIF]: Number of rows loaded: 1, Number of rows rejected: 0

The method used to load the data is the REST API; this is basically the same as PBCS, which means the user’s access permissions will be honoured.

The only concern I have about using the REST method is the performance implication of loading a large amount of data, as it will be the equivalent of creating a huge form; I have not yet had the chance to test whether it does impact performance.
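If performance does become a concern, one obvious mitigation to experiment with would be splitting the load into several smaller grids rather than posting one huge one; this is my speculation rather than anything documented:

```python
def chunk_rows(rows, size):
    """Split the rows of a large data load into batches, so each REST
    call posts a grid of at most `size` rows."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

rows = list(range(10))
print([len(batch) for batch in chunk_rows(rows, 4)])  # [4, 4, 2]
```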

That covers when the target is Planning, so how about when an ASO Essbase cube is the target?

The load methods are the same for Essbase as they are for Planning, and currently can only be set in the target application options. I am going to start with “Numeric Data Only”.


The following data set is loaded by a Service Administrator.


When using the numeric data method the concept is the same as with Planning: the default cloud admin overrides the user, a data file and load rule are created, and the data is loaded with the rule.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0031
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0031
DEBUG [AIF]: Locked rule file: AIF0031
INFO  [AIF]: Getting load buffer for ASO data load...
INFO  [AIF]: Initializing load buffer [1]
INFO  [AIF]: Successfully initialized the load buffer
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The load buffer [1] has been closed.
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0031

Now to switch over to a power user and repeat the process.


Once again the process is the same as when Planning is the target: the default admin overrides the user and loads the data.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0031
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0031
DEBUG [AIF]: Locked rule file: AIF0031
INFO  [AIF]: Getting load buffer for ASO data load...
INFO  [AIF]: Initializing load buffer [1]
INFO  [AIF]: Successfully initialized the load buffer
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The load buffer [1] has been closed.
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0031

How about setting the load method to “All data types with security”?


I will try with the Service Administrator user as there shouldn’t be any problems.


Spoke too soon; the process failed, so time to look at the logs.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
ERROR [AIF]: Essbase ASO application is not created from any Planning application, please use other data load methods.

Hold on, that error does not make sense; this is PBCS, so the cube must have been created from the Planning application, there is no other way.

I tried with the power user and received the same error message, so I decided to test whether I could load data to the ASO cube using the REST API; it should be possible, because I can create a form and enter data against the ASO cube.


The response confirms that it is possible to use the REST resource to load data to an ASO cube.


At the moment I am not sure why it is possible to select the “All data types with security” method if it doesn’t work; maybe I am missing something, or it is a bug, or a feature that will be implemented at a later stage. If I find out any further information I will update this post.

That is all I am going to cover on the different security type behaviour between admin and non-admin users for on-premise, hybrid and cloud. It certainly can be confusing with all the possible scenarios but hopefully I have cleared it up over the two posts.

Wednesday, 15 March 2017

FDMEE – diving into the Essbase and Planning security mystery – Part 1

It is not an unusual FDMEE requirement for a user to be able to load data to a target application and have their access permissions honoured.

FDMEE is a data management product after all, so it should not be a problem. Well, you would think so. If your target application is Financial Management then this is not an issue, as security classes are checked when loading data, so data can only be loaded to a member combination that the user has access to.

When the target is Essbase or Planning then things get a little more interesting, if you compare on-premise, hybrid and cloud then it can all get a little confusing.

I thought it would be a good idea to try and clear up any confusion and go through some examples of what happens when loading data at user level compared to admin level and what options are available to overcome these differences.

I am going to break this into two posts: in this first post I will look at the current situation with on-premise and hybrid, and in the next part concentrate on EPM Cloud.

The examples I will be providing in terms of on-premise will be based on the latest 11.1.2.4 patches for FDMEE, Essbase and Planning. I am fully aware that the functionality in the cloud will be pushed down to on-premise at some point but you never know when that will happen and it is good to get an understanding of where we are currently at.

I will try to update this post to highlight any changes when they do occur.

So, let us start with Essbase and before even going near FDMEE I want to set the scene with simple examples of a user loading data and the security setup behind this.

First, we start with the Shared Services provisioning; the user has been granted the Filter role for the Essbase Sample application.


A filter has been created that will only allow write access to the member combination 'Sales, 100-10, Florida, Actual'; there will be no access to any other part of the database.


The filter is then applied to the provisioned user.


To show the filter is in operation I created a simple retrieve in Smart View.


The user should be able to submit data to the Sales member but not COGS.


Once submitted the data for Sales has been loaded and as expected no data has been loaded to COGS.


The user can also load data using a load rule and the same member combination is in the data load file.


The information window confirms data has been loaded to 1 cell and that there were errors.


The data load error file has rejected the row containing the COGS member as the user has no access to it.


A quick retrieve in Smart View confirms this.


Now this is what you would expect to be possible in FDMEE but let us see.

A data load rule has been created to load data to the same member combination as the previous example.


The load method has been set to file.


The options available for ‘Load Method’ are either ‘File’ or ‘SQL’. If file is selected, a flat file is generated from the FDMEE database repository and then loaded to the target Essbase database using an Essbase load rule; if SQL is chosen, data is loaded to Essbase directly from the FDMEE database repository using an Essbase SQL data load rule.

Let us first test the data load process using an admin user.


The full process ran through without any problems, and in the process log you can see what is happening.

INFO  [AIF]: Creating data file: \\fileshare\EPMSHARE\FDMEE\outbox\Sample_2682.dat
INFO  [AIF]: Data file creation complete
DEBUG [AIF]: Created rule file: AIF0062
DEBUG [AIF]: Locked rule file: AIF0062
INFO  [AIF]: Saved rule file to essbase server
DEBUG [AIF]: Unlocked rule file: AIF0062

The data load file is created in the FDMEE outbox location and then an Essbase data load rule is created so the file can be loaded to the Essbase database.

Further down the log you can see the process to load the data using the load rule.

INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: admin
DEBUG [AIF]: Obtained connection to essbase cube: Basic
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0062
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0062
DEBUG [AIF]: Locked rule file: AIF0062
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0062

Another retrieve shows the data has been successfully loaded.


Now on to using the same user as earlier which we know can load data to the Essbase database.


There is no problem loading and mapping the file, and in theory, if FDMEE worked like we would expect it to, the Sales record should load and the COGS record should fail.

Running the export indicates there was a failure.


Investigating the process logs provides the reason for the failure.

INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
ERROR [AIF]: com.essbase.api.base.EssException: Cannot open cube outline. Essbase Error(1022002): User [lhowlett@FUSIONAD] Does Not Have Correct Access for Command [OpenOutlineEdit]

At the point where the data load rule is created, the error is generated due to incorrect access permissions. This is correct: the user does not have access to create rules and open the outline in edit mode; the user only needs to load the data.

The errors can be replicated in EAS when a user tries to save a rules file or open the outline in edit mode.


To me the answer would be for an admin user to create the rule behind the scenes in FDMEE and for the standard user to then load the data using the rule; I have shown earlier in this post that this is possible. I don’t quite know why it has been developed this way, but I am sure there must be a reason.

I thought maybe a possible workaround would be to use a custom Essbase load rule so the load rule would not be created.


The Essbase load rule was added to the target options in the FDMEE load rule.

The export failed again.


Checking the logs this time shows that the process got further and the data file was created.

INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
INFO  [AIF]: The default rule file will not be used as a custom rule file has been specified: FDMEE
INFO  [AIF]: Creating data file: \\fileshare\EPMSHARE\FDMEE\outbox\Sample_2687.dat
INFO  [AIF]: Data file creation complete

Later in the log the reason for the failure is clear.

INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
DEBUG [AIF]: Obtained connection to essbase cube: Basic
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0062
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0062
ERROR [AIF]: com.essbase.api.base.EssException: Cannot lock olap file object. Essbase Error(1051041): Insufficient privilege for this operation

This time it fails due to the user not having access to lock and unlock Essbase load rules; this error can be replicated in EAS by trying to lock a rule.


In the message panel the following error is generated.

Error: 1051041 Insufficient privilege for this operation

The user should not have to lock and unlock the rule; the admin user could lock the rule, the user could load the data using the rule, and the rule could then be unlocked by the admin user.

Once again I am not sure why it has been developed in this way in FDMEE but to me it should be possible to load data at user level and honour the security filter.

What about a workaround? Well, up to 11.1.2.3.520 the user would need to be provisioned with at least the ‘Database Manager’ role for the Essbase application.

You wouldn’t want to be giving out that role to users that just want to load data, but that is the way it goes before that release; luckily most should be running a newer release by now.

From 11.1.2.3.520 there is a property called 'Global User for Application Access' which can be set at target application level in FDMEE.

If I provision a user with the ‘Database Manager’ role for the Sample application in Shared Services.


Now the user can be added to target application options in FDMEE.


Let us run the export again in the workbench with the same user as before.


This time the outcome is more positive and in the process logs:

DEBUG [AIF]: GlobalUserForAppAccess from Profile: SampleDBManager
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: SampleDBManager
DEBUG [AIF]: Obtained connection to essbase cube: Basic
INFO  [AIF]: The default rule file will not be used as a custom rule file has been specified: FDMEE
DEBUG [AIF]: Resolved essbase rule file name for loading: FDMEE
DEBUG [AIF]: Fetching rule file from essbase server for data loading: FDMEE
DEBUG [AIF]: Locked rule file: FDMEE
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: FDMEE

We can see that the loading of data has now been taken over by the global user; this user has the permissions to create rules and to lock and unlock them, so the data load runs through successfully.

The issue we have here is that the global user can load data to any member combination so we have lost that filter restriction we set earlier.

Unfortunately, that is the best that can be offered presently which I know is not ideal.

Moving on to planning and let us take a similar example with the same user.

The user is provisioned with the ‘Planner’ and ‘Essbase Write Access’ role for the sample Vision application.


Access permissions have been applied within the planning application and for direct Essbase access a filter has been automatically created for the user.

This time write access has been defined for the member combination ‘1110,110,Actual,Working’ and read access for ‘1150’.


The access permissions for the planning layer are confirmed to be working with a form.


A retrieve using an Essbase connection confirms the filters are working as expected.
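To make the filter behaviour just confirmed a little more concrete, here is a small model of the rules above (this is not Essbase code, just an illustrative sketch): write access only on the exact combination ‘1110, 110, Actual, Working’, read access on anything involving ‘1150’, and no access otherwise.

```python
# Illustrative model of the automatically generated filter rows
WRITE_COMBOS = {("1110", "110", "Actual", "Working")}
READ_MEMBERS = {"1150"}

def access_level(members):
    """Return the effective access for a member combination under the filter."""
    if tuple(members) in WRITE_COMBOS:
        return "write"
    if READ_MEMBERS & set(members):
        return "read"
    return "none"
```

So a retrieve against ‘1150’ succeeds read-only, a lock-and-send to ‘1110, 110, Actual, Working’ succeeds, and anything else is blocked, which is what the form and the retrieve demonstrated.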


On to FDMEE: a data load rule was created which will load two rows of data from a flat file to the above member combination.

The load method was set to ‘Numeric Data only – File’.


The property values available for a target planning application are:


I will go through the ‘All Data Types’ option shortly.

The load process was first tested with a user that is an administrator of the planning application.


The process log shows that the method of loading data is the same as when the target is an Essbase application.

DEBUG [AIF]: GlobalUserForAppAccess from Profile: null
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: admin
DEBUG [AIF]: Obtained connection to essbase cube: Plan1
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0069
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0069
DEBUG [AIF]: Locked rule file: AIF0069
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0069

I think you know what is coming when we try to run the FDMEE export with a user provisioned as a planner.



Yes, it failed and no surprises with the error message.

DEBUG [AIF]: GlobalUserForAppAccess from Profile: null
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
DEBUG [AIF]: Obtained connection to essbase cube: Plan1
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0069
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0069
ERROR [AIF]: com.essbase.api.base.EssException: Cannot lock olap file object. Essbase Error(1051041): Insufficient privilege for this operation

We are in the same position as when loading directly to Essbase target applications.

Just like with Essbase there is the option to set a global user.


A global user was added that has the administrator role assigned for the planning application.

The export was run again.


This time the export was successful and the process log confirms what is happening.

DEBUG [AIF]: GlobalUserForAppAccess from Profile: planadmin
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: planadmin
DEBUG [AIF]: Obtained connection to essbase cube: Plan1
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0069
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0069
DEBUG [AIF]: Locked rule file: AIF0069
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0069

The global user overrides and loads the data; as it is an administrator loading the data, the access permissions are ignored and all member combinations can be loaded.

There is another data load method available at data load rule level: ‘All Data Types’, which uses the outline load utility to load data through the planning layer.


The global user was removed and the export was run again with the standard user.


Failed again; the process log provides the reason behind the failure.

DEBUG [AIF]: GlobalUserForAppAccess from Profile: null
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: lhowlett
DEBUG [AIF]: loadMethod: OLU
Unable to obtain dimension information and/or perform a data load: java.lang.RuntimeException: You must be an Administrator to use the Hyperion Planning Adapter.

Back to square one again, as you need to be an administrator to operate the outline load utility.

If the planning application administrator is added back in as the global user, then the export is successful.


DEBUG [AIF]: GlobalUserForAppAccess from Profile: planadmin
INFO  [AIF]: Cloud Mode: NONE, Resolved user name for application access: planadmin
DEBUG [AIF]: loadMethod: OLU
INFO  [AIF]: Number of rows loaded: 2, Number of rows rejected: 0

The global user overrides and loads the data, but once again you lose out on the member-level security, which is not great. In the next post you will see how this load method differs quite considerably in the cloud.

Finally, I want to quickly cover off the user requirements for hybrid, meaning any integrations between on-premise FDMEE and EPM Cloud.

When you add a cloud application as a target, the cloud user credentials are entered, and whenever authentication with the cloud is required these credentials are used.


As an example, I added a user which has been provisioned with the power user role for the cloud application.

I ran a simple integration which extracts data from an on-premise application and loads it to a cloud application; the process failed and the logs contained the following information.

INFO  [AIF]: Uploading data file to PBCS: \\fileshare\EPMSHARE\FDMEE\outbox\Vision_1991.dat
ERROR [AIF]: java.io.FileNotFoundException: Response: 401: Unauthorized for url: https://cloudinstance/interop/rest/11.1.2.3.600/applicationsnapshots/Vision_1991.dat/contents?q={chunkSize:63,isFirst:true,isLast:true,extDirPath:"inbox"}

The process failed at the point where the on-premise extracted file is uploaded to the cloud instance using the REST API, the 401 error gives an indication it might be user related.
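For reference, the failing request is just a call to the interop REST API’s application snapshot upload resource. A sketch of how the URL in the log line is put together, and of treating a 401 as a privilege problem rather than a file problem (the endpoint shape is taken directly from the log above; the function names are mine):

```python
def snapshot_upload_url(instance, file_name, chunk_size, is_first, is_last,
                        ext_dir="inbox"):
    """Reconstruct the interop upload URL seen in the failing log entry."""
    query = ('{chunkSize:%d,isFirst:%s,isLast:%s,extDirPath:"%s"}'
             % (chunk_size, str(is_first).lower(), str(is_last).lower(), ext_dir))
    return ("https://%s/interop/rest/11.1.2.3.600/applicationsnapshots/%s/contents?q=%s"
            % (instance, file_name, query))

def is_privilege_error(status_code):
    """HTTP 401 from the upload points at the user's role, not the payload."""
    return status_code == 401
```

With a power user the upload returns 401, so the failure happens before any data is processed on the cloud side.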

Uploading a file using EPM Automate with the same user returns an error that the user has insufficient privileges.


The documentation states the following.

“EPM Automate Utility enables Service Administrators to automate many repeatable tasks”

So, uploading the file from on-premise to the cloud requires the service administrator role.

After updating the cloud user to one that has the service administrator role the hybrid integration was successful.

The process logs show that once the file has been uploaded to the cloud instance, the remaining steps are performed by the default cloud admin account, which overrides the user configured in FDMEE.

INFO  [AIF]: Cloud Mode: CLOUD, Resolved user name for application access: epm_default_cloud_admin

Well, that about covers all I wanted to in this post. In summary, if you are looking to implement user-defined security for data loading to Essbase and Planning then you’re going to be disappointed. This is no doubt going to change in the future, and in the next post I will cover how EPM Cloud currently differs from on-premise in this respect.