Tuesday 26 September 2017

FDMEE – Load rules driven by substitution variables

A question has come up a few times around using Essbase substitution variables to define the period for an FDMEE data load rule, so I thought I would put a post together on the subject.

Unfortunately, at present, whether it is on-premise or the cloud, it is not possible to use sub vars in period mappings.

The following type of period mapping would be nice for an Essbase/Planning target application but will generate errors if you try to use it.


Even though it is not currently possible to put the sub vars directly into period mappings, there are alternatives, and I am going to go through one of the possible methods.

Let us start with EPM Cloud. Data Management in the cloud does not yet allow any kind of custom Jython scripting, so the options are limited from the UI perspective, but it does have the benefit of the EPM Automate utility and the REST API.

Both EPM Automate and REST have commands to run data load rules and retrieve sub vars, so it should be possible to run a rule based on sub var values. I am not going to spend time going into these commands as I have covered both of them in previous blogs, and you can read about them in greater detail here and here.

Right, I am going to assume you have read the previous posts or already know all about the commands.

Within the planning application I have the following sub vars that define the current month and year.


The period name defined in data management is a combination of the short name for the month and the last two digits of the year.


With EPM Automate I can use the getsubstvar command to retrieve the sub var values.

The format for getsubstvar is:

epmautomate getsubstvar CUBE_NAME|ALL [name=VARIABLE_NAME]


Then with the rundatarule command, run the load rule with the current period name.

The format for rundatarule is:

epmautomate rundatarule RULE_NAME START_PERIOD END_PERIOD IMPORT_MODE EXPORT_MODE [FILE_NAME]

In my example I am going to load for a single period but could easily handle multi period loads.
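
To make that concrete, the calls could look something like the following; the cube name, sub var names, rule name, period and modes are purely illustrative and would need to match your own application:

epmautomate getsubstvar Plan1 name=CurrMonth
epmautomate getsubstvar Plan1 name=CurrYear
epmautomate rundatarule "Actuals Load" Jul-17 Jul-17 REPLACE STORE_DATA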


To achieve the above, some custom scripting is required, as the sub vars need to be retrieved, the period name constructed and the load rule called.

I know I say this every time, but it is up to you which scripting language you opt for; it all depends on which one you feel most comfortable with.

I am going to stick with PowerShell as it is easily accessible, not difficult to pick up, and it is definitely time to put the old Windows command-line scripting to bed.

The first part of the script sets all the variables and should be self-explanatory.


The main section of the script does the following:

Logs into EPM Cloud using EPM Automate based on the variables that have been defined.

Checks whether the login was successful and exits if not.

Retrieves and stores the value of the current month variable using the EPM Automate getsubstvar command.

The month is then extracted as the getsubstvar command returns an array of three lines.


The variable value is in the second line, which I then split on the equals sign to store the month name.

The same method is then repeated to extract the year from the current year variable.

The period name is then constructed based on the month and year.

Finally the data load rule is executed using the rundatarule command and passing in the parameters from the stored variables.
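
To give an idea of how it hangs together, here is a cut-down sketch of the kind of PowerShell involved; the utility path, URL, identity domain, credentials, sub var names, rule name and period name format are all just examples and would need adjusting for your own environment:

# example values only
$epmautomate = "E:\EPMAutomate\bin\epmautomate.bat"
$url = "https://planning-test-domain.pbcs.us2.oraclecloud.com"
$domain = "testdomain"
$user = "epm_admin"
$password = "password"
$cube = "Plan1"
$monthVar = "CurrMonth"
$yearVar = "CurrYear"
$ruleName = "Actuals Load"

# log into EPM Cloud and exit if the login fails
& $epmautomate login $user $password $url $domain
if ($LASTEXITCODE -ne 0) { Write-Host "Login failed"; exit }

# retrieve the current month sub var, the value is on the second line in name=value format
$monthOutput = & $epmautomate getsubstvar $cube "name=$monthVar"
$month = ($monthOutput[1] -split "=")[1]

# repeat for the current year sub var
$yearOutput = & $epmautomate getsubstvar $cube "name=$yearVar"
$year = ($yearOutput[1] -split "=")[1]

# build the period name from the month and the last two digits of the year, e.g. Jul-17 (separator assumed)
$period = $month + "-" + $year.Substring($year.Length-2)

# run the data load rule for the single period
& $epmautomate rundatarule $ruleName $period $period REPLACE STORE_DATA

# log out of EPM Cloud
& $epmautomate logout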


So with a simple script, data load rules can be executed based on sub vars.

Moving on to REST, it is possible to achieve the same results without having the EPM Automate utility installed, which also removes the need to keep installing and updating newer versions of the utility.

I have already provided the links to previous posts where I have gone into detail on using the REST API, but in summary there is a resource available to return substitution variables; the format is:

https://<cloud_instance>/HyperionPlanning/rest/v3/applications/<appname>/substitutionvariables/<subvarname>

An example using a REST client with a GET method request:


The sub var details are returned in JSON format.
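
As a rough PowerShell sketch of that request, the following uses basic authentication with a GET method; the instance URL, identity domain, credentials, application and sub var names are made up, and the property path into the returned JSON assumes the value comes back in an items collection:

# example values only - encode identitydomain.username:password for basic authentication
$creds = "testdomain.epm_admin:password"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($creds))
$headers = @{"Authorization" = "Basic " + $encoded}

# GET request to the substitution variables resource
$subVarUrl = "https://planning-test-domain.pbcs.us2.oraclecloud.com/HyperionPlanning/rest/v3/applications/Demo/substitutionvariables/CurrMonth"
$response = Invoke-RestMethod -Uri $subVarUrl -Method Get -Headers $headers

# pull the value out of the returned JSON (structure assumed)
$month = $response.items[0].value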

The format for running a data load rule using REST is:

https://<cloud_instance>/aif/rest/V1/jobs

A POST method is required with the data rule parameters in the body of the request in JSON format.


Running the request will return details about the job including the status and ID.
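
A hedged PowerShell equivalent, reusing the authentication header from the previous example; the rule name, periods and modes are illustrative, and the parameter names follow the Data Management REST documentation as I understand it:

# body of the request with the data load rule parameters
$jobsUrl = "https://planning-test-domain.pbcs.us2.oraclecloud.com/aif/rest/V1/jobs"
$payload = @{
    "jobType" = "DATARULE"
    "jobName" = "Actuals Load"
    "startPeriod" = "Jul-17"
    "endPeriod" = "Jul-17"
    "importMode" = "REPLACE"
    "exportMode" = "STORE_DATA"
    "fileName" = ""
} | ConvertTo-Json

# POST request starts the rule and returns the job details including the status and job ID
$response = Invoke-RestMethod -Uri $jobsUrl -Method Post -Headers $headers -Body $payload -ContentType "application/json"
$jobId = $response.jobId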


Converting this to a script does not require much more effort than the last example. First of all, the variables are defined.


The main section of the script follows the same process as the previous script, and with the comments I have provided I don’t feel I need to go into any more detail.


If I wanted, I could expand the script to keep checking the status of the data load rule until it completes, which I have done in other posts.
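
If you did want to do that, a simple polling loop along the lines of the following would work, building on the variables from the previous example; the assumption here is that a status of -1 means the job is still in progress:

# keep checking the job until it is no longer in progress
do {
    Start-Sleep -Seconds 10
    $job = Invoke-RestMethod -Uri "$jobsUrl/$jobId" -Method Get -Headers $headers
} while ($job.status -eq -1)

Write-Host "Data load rule finished with status:" $job.jobStatus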

So that is one possible method covered for EPM Cloud using either EPM Automate or REST; what about on-premise?

From FDMEE 11.1.2.4.210 the REST API is available to run data load rules, but what is not yet available is the ability to return sub var information with the REST API; this requires an update to Planning, and your guess is as good as mine as to when that will happen.

There are different ways to obtain sub var details, such as MaxL or one of the available APIs, and this is a solution I have come up with using custom Jython scripting in FDMEE.

I will get on to the script shortly, but the idea is to have the same parameters available as when running a rule, while also including ones for the current month and year sub vars.

I registered a custom script which has parameters for the data load rule name and the rule import and export modes, which have SQL queries defined to return the possible values.


There are two parameters for sub vars which define which variables hold the current month and year.

In my example the target is going to be a Planning application, but the concept would be exactly the same for an Essbase application.

Once registered the script is available for execution.


When the script is run, the parameters are displayed and values can be entered.


Selecting either the Import or Export mode will run a SQL query behind the scenes and generate a list of possible values.


The jython script is then called and the parameters are passed into the script.

The script itself uses a combination of the Essbase Java API to retrieve the sub var values and then the REST API to run the data load rule.

There is no additional configuration required to run the script, such as adding JAR files, as the Essbase API is directly accessible from FDMEE; the same goes for calling the REST API and handling JSON. There are also no hardcoded passwords, as authentication is handled by an SSO token.

I am not going to show every line of the script, but in summary the required Java classes are imported, then the variables are defined using a combination of static values and ones passed in from running the script.


With the use of the Essbase Java API the current month and year sub var values are retrieved, and from these the period name is constructed.

JSON is generated for the body of the REST request and then the FDMEE REST resource is called to run the data load rule.


In my example I have kept the code as simple as possible; the full version handles exceptions, such as problems calling the REST resource.

Once the custom script has been run there are two process log entries, one for the custom script execution and one for the running of the data load rule.


The process log provides further details on running the rule using REST.


The custom script can be run from the FDMEE UI or it can be run remotely as it is possible to run custom scripts using the REST API.

Here is an example using a REST client to run a FDMEE custom script:


I also put together a PowerShell script that takes user input to define parameters and then calls the REST resource to run the custom jython script.


So now we have the ability to run an FDMEE data load rule where the period name is based on Essbase substitution variables, without having to log into FDMEE through Workspace.

Well that wraps another post, I hope you found it useful.

Saturday 2 September 2017

EPM Cloud - Connections are here

In the latest EPM Cloud release there is some new functionality that I thought would be worth covering; it was originally titled EPM Connect in the presentations I have previously seen, and it allows seamless connectivity between different EPM Cloud services.

Before the latest release, if you had multiple EPM Cloud services such as PBCS, EPBCS and FCCS, each would act independently; to access the UI you would have to use a separate URL for each application and there was no link between them. In the on-premise world you can access all your applications through a single point, which is Workspace.

With this EPM Cloud release it is now possible to connect to multiple services from a source application and, with the use of navigation flows, make it all feel unified by embedding clusters or cards from different services into the source application.

To be able to create a connection the following sources are currently supported.
  • Oracle Planning and Budgeting Cloud
  • Oracle Enterprise Planning and Budgeting Cloud
  • Oracle Financial Consolidation and Close Cloud
  • Oracle Tax Reporting Cloud
From this source, you are then able to make a connection to any of the following EPM Cloud services.
  • Oracle Planning and Budgeting Cloud
  • Oracle Enterprise Planning and Budgeting Cloud
  • Oracle Financial Consolidation and Close Cloud
  • Oracle Tax Reporting Cloud
  • Oracle Account Reconciliation Cloud
  • Oracle Profitability and Cost Management Cloud
  • Oracle Enterprise Performance Reporting Cloud
The matrix from the documentation provides a clearer picture:


If the services are in the same domain, then the connection can be configured using a service administrator account.

From information in the announcements and new features update document, it also looks possible to connect to services that are in different domains and in different data centers; this is achieved by configuring the domains to use single sign-on through the same identity provider.

With this release it is not possible to configure connections in different domains within the same data center.

I am not sure yet why it is currently only supported across different data centers and not in the same data center, but no doubt there is a technical reason behind it (I have now been informed it is due to current restrictions in shared identity management). I was hoping to test out setting up connections between different domains, but unfortunately the domains I have access to are all in the same data center, so I will have to wait until it is supported.

This slightly conflicts with what is described in the administrator documentation:

“If the target and source subscription instances are not on the same identity domain, then you might not be able to establish a connection between them. You must relocate one of the instances so that the source and target instances share an identity domain.”

I believe the above statement about relocating instances is going to be removed from the documentation, as relocating should only be done in exceptional circumstances; once it has been removed I will update this post.

Anyway, I will go through setting up connections as hopefully it will become clearer what this new functionality is all about.

To create a new connection to a cloud service there is a new card called “Connections”, which is part of the Tools cluster.


Alternatively connections can be accessed from the navigator.


If you don’t see “Connections” it may be because you are using a custom navigation flow, it should be available in the default flow and can be added to any custom flows.

Once you go to the Connections card you can manage connections, so let us add a new connection.


You will then be presented with the list of EPM cloud services that you can make a connection to.


In this example I am going to make a connection from a source PBCS application to a target EPBCS application, but it could just as easily be any of the EPM Cloud services; the functionality will operate in the same way.


Once the connection has been validated the target application name will be populated. You may be wondering what happens when the service administrator password is changed; well, I will get on to that later in the post.

Please note the instance you are connecting to will have to be upgraded to the latest release in order for the connection to be successful.

I did test what would happen if I tried to create a connection to an EPM Cloud service which is in a different domain but in the same data center, and received an error about the connection having to belong to the same domain.


Once the connection has been saved it will be added under manage connections, where it can be edited or deleted.


If you click the home icon and then go to the navigator you will see the new connection.


To navigate to the target application, you just have to click the connection name and the application will open in the same browser tab; alternatively, click the open window icon and the application will open in a new browser window. So a user can now access multiple cloud services through a single access point with one login.

You can build up connections to all the EPM cloud services you have access to.


The above is a true reflection of what the connections will look like, unlike what is currently in the documentation, where connections have been created to a PBCS instance but named as if they are to different cloud services.

I am going to stick with a single connection to EPBCS.

To add a card/cluster from the target application to your source application, go to navigation flows and select the navigation flow you want to update.


Select “Add Existing Card/Cluster”; this will open the usual library window, but now on the left side you will have the option to select any of your connections.

From the library I selected the EPBCS connection and then the Revenue card.


It is worth mentioning that:

“For this release, cards cannot be directly selected from Oracle Account Reconciliation Cloud, Oracle Enterprise Performance Reporting Cloud, and Oracle Profitability and Cost Management Cloud using the Add Existing Card/Cluster option.”

The Revenue card is added to the navigation flow with the connection name “EPBCS” and the navigation flow name “Default”.


Once the navigation flow has been saved and is active, if you select the home icon and then “Reload Navigation Flow”


The Revenue card from the target EPBCS application is now included in the source application’s main screen or, as Oracle likes to call it, the springboard.


Selecting the Revenue card will seamlessly connect to the target application and display as if it was part of the source application.


I could have just as easily picked an existing cluster from the target EPBCS application to add in.


Once the flow has been saved and reloaded the cluster is displayed from the target application.


It is possible to mix it up and create a cluster which has cards from the local application or any of the EPM Cloud services; Oracle likes to term this a mash-up, but that saying means something totally different to me.

So let’s add a new cluster.


Now create a new card against the local application and the newly created cluster, and assign a form to the card.


Next, create another card against the same cluster, but this time assign a dashboard from the EPBCS application; the same types of artifact can be selected whether it is a local or remote connection.


The navigation flow now has a new cluster with a card pointing to a form in the local PBCS application and a card pointing to a dashboard in the remote EPBCS application.

Save the flow, activate and reload, and the cluster and cards are displayed in the springboard.


Once again it is possible to seamlessly flip between viewing a form in the local application


and a dashboard in the EPBCS application.


As you would expect, the clusters/cards also show up in the navigator.


We can take it a step further and create a card with tabs that mixes local and target cloud services. In the following example I have added vertical tabs to a new card, with one tab from the local PBCS application and one from the remote EPBCS application; the card is assigned to the existing cluster.


You know the drill: save the flow, activate, reload, and the new card is available in the springboard and from the navigator.


Opening the card now allows the viewing of vertical tabs which look at forms, dashboards and reports across the local and remote applications.


It is not limited to vertical tabs and we can mix it up with horizontal tabs; for example, in the above “Mash it up” card I have added a new tab, which has been set to tabular, automatically setting the orientation to horizontal.


Next I added two new sub tabs, one looking at a form in the local PBCS application and the other at a form in the EPBCS application.


This was then saved and you can see it has been added to the card I created earlier.


Now I have a card that consists of vertical and horizontal tabs which work across cloud services. This is great for the end user, who does not need to be concerned about logging into multiple applications, and this is where you can appreciate the power of navigation flows across multiple services.


Remember, in these examples I am just looking at PBCS and EPBCS; you can build flows across any of the EPM Cloud services which you have access to.

For example I can add a connection to EPRCS and select from Reports, Snapshots or Packages.


Just like all the previous examples, the cards/clusters/tabs will act in the same way.


You may be asking about security: do the navigation flows honour access permissions across the cloud services? Well, the answer is yes, and really it has to for the functionality to be of any use.

From my initial testing, a user will be able to see all the defined connections in the navigator even if they don't have access to some of the services; if the user tries to open a connection they do not have access to, they will not be able to and will receive an error.

Up to now I have been using the service administrator account to demo the flows, so let us take a quick look at the difference when a standard user accesses them.

Back to the Revenue card, which is part of the EPBCS connection; the admin user can access multiple horizontal and vertical tabs.


Now for a standard user who has only been given access to the Revenue dashboard, which is part of the first vertical tab; the user does not have access to any of the other forms or dashboards that are part of the tabs.


When the user accesses the Revenue card they can only see the Revenue dashboard, so the access permissions are being honoured.

One final example with the “Connect Demo” cluster: the admin has access to all the cards that are part of the cluster.


The user does not have access to the form that is part of the “From EPBCS” card, so when they view the cluster they don’t see the card.


The same goes for the “Mash it up” card with the mix of vertical and horizontal tabs across cloud services; the admin has full access.


The user will only see what they have been given access to in the local and target applications.


This applies to all the access permissions for the user, so if they access a form across a service they will only see the members they have been granted access to in the target application.

Right, going back to what I mentioned earlier when setting up a connection: what happens when the service administrator password is changed?

Well, initially everything looks OK, but that must be because of caching; if the service is restarted, or after the maintenance window, the connections will be greyed out in the navigator and inaccessible, and clusters/cards/tabs from the target applications will no longer be visible in the source application until the connection has been updated with valid credentials.


I can see that being a bit of a pain, because the administrator will have to remember to update all the connections each time their password is changed; it is easy to forget, and it is yet another thing to add to the list, like having to update the password in EPM Automate or API scripts.

It would be nice if there was some kind of sync option; it would be great if there was a command in EPM Automate or the API to manage the service administrator password, and also the ability to find out how many days are left until the password expires, which would help with scripting. I am not sure how feasible that is though, due to security concerns. Oracle, if you are watching, let me know your views on this :)

Finally, on to migrations. If you take a look at the artifact list you will see there is a new addition under Global Artifacts called “Service Connections”; this will contain each of the connections that have been configured.


If you export the snapshot there will be an XML file for each connection.


The XML is quite simple, but what you need to be aware of is that it does not contain a password, so if you are migrating or restoring then after the import you will need to edit each connection and enter the password.


Well, I definitely think that is enough for this post. I am pretty impressed with the new functionality and it is definitely a step in the right direction. Until next time...