Saturday, 4 May 2019

EPM Cloud 19.05 – REST API and EPM Automate new functionality

It has been a quiet year up to now for new functionality in EPM Cloud, but that has all changed with the 19.05 release. In this post I am going to concentrate on the new features that have made their way to the REST API and EPM Automate.

There are new REST API jobs to import and export security, export audit records, export job console information, and finally the ability to sort dimension members. Not all of these are available in EPM Automate yet, but no doubt they will be included in future releases.

In the post I am going to go through each of the new REST resources, providing an example using a REST client and a simple script which can easily be updated for the different job types. I have covered the REST API many times in previous posts, so I am going to assume that you understand the basics.

Let us begin with the resources that I feel will be most welcomed – the ability to import and export access permissions. If you would like detailed information on access permissions, please refer to the documentation.

All the new REST API functionality is available through the jobs resource, which uses a post method and has the URL format of:

https://<epm_cloud_instance>/HyperionPlanning/rest/v3/applications/<application_name>/jobs

The body or payload of the REST request is where the job information is defined using parameters.

So, let us take an example of exporting access permissions; this can simply be achieved with the following:


The “jobType” parameter for exporting access permissions should be “Export Security”

The “jobName” parameter can be anything relevant, it is just the name that is going to be shown in the job console.

The “filename” parameter is optional. If it is not supplied, a CSV file will be created with a name made up of the administrator who ran the request, a description and the current date/time.
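To make this concrete, here is a minimal Python sketch of assembling the job payload. Nesting the optional file name under a “parameters” key is my assumption about the shape of the jobs resource body, and the job and file names are just illustrations.

```python
import json

def build_export_security_payload(job_name, filename=None):
    # body for the jobs resource described above
    payload = {
        "jobType": "Export Security",
        "jobName": job_name,
    }
    if filename:
        # "filename" is optional; if omitted, EPM Cloud generates a name itself
        payload["parameters"] = {"filename": filename}
    return payload

body = json.dumps(build_export_security_payload("Export All Access",
                                                "access_perms.csv"))
```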

Once you have the basic authentication details you are ready to execute the request.
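As a hedged sketch, the request itself can be sent with nothing more than the Python standard library; the host, application name and credentials below are placeholders.

```python
import base64
import json
import urllib.request

def build_job_request(base_url, app, payload, user, password):
    # jobs resource URL, in the format shown above
    url = f"{base_url}/HyperionPlanning/rest/v3/applications/{app}/jobs"
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        method="POST",
        headers={"Authorization": f"Basic {creds}",
                 "Content-Type": "application/json"})

req = build_job_request(
    "https://epm-test.example.com", "Vision",
    {"jobType": "Export Security", "jobName": "Export Access"},
    "user@domain.com", "Password123")
# urllib.request.urlopen(req) would submit the job and return the job details
```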


After the request has been executed, a response will be returned containing all the job information.


A status of -1 indicates the job is in progress, to keep checking the job status, a get request can be made which includes the job ID.
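A sketch of that polling pattern, assuming -1 means in progress as described above; the fetch function stands in for the GET request that includes the job ID.

```python
import time

def wait_for_job(fetch_job_status, job_id, poll_seconds=5, max_polls=60):
    """fetch_job_status(job_id) should return the job's integer status."""
    for _ in range(max_polls):
        status = fetch_job_status(job_id)
        if status != -1:          # -1 means the job is still in progress
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"Job {job_id} still running after {max_polls} polls")
```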


This time the job has completed, and the export was successful. You can see that a CSV file has been generated with the filename I provided.

The file is available within the EPM Cloud instance and can be seen through the UI in the inbox/outbox explorer.


If I had executed the REST request without the filename parameter:


The file would have been generated with the following naming convention:


The file can be downloaded from the UI, with EPM Automate, or by using the REST API, an example being:
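As an illustration only, the download can be made against the interop file resource. The path and version segment below are assumptions that may differ between releases, so verify them against the REST API documentation.

```python
import urllib.parse

def build_download_url(base_url, filename, api_version="11.1.2.3.600"):
    # interop "contents" resource for files in the inbox/outbox
    quoted = urllib.parse.quote(filename)
    return (f"{base_url}/interop/rest/{api_version}"
            f"/applicationsnapshots/{quoted}/contents")

url = build_download_url("https://epm-test.example.com", "access_perms.csv")
# a GET request against this URL with basic auth streams the file contents
```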


As the file is a CSV type, it can be opened and viewed in a much more readable format through Excel, or a similar application.


Most of the headings should be self-explanatory. The name column is the name of the user or group, and the “Is User” column defines whether it is a user or a group.

The object type is an artefact within the application and can be any of the following:
  • SL_FORM – Form
  • SL_COMPOSITE – Composite Form
  • SL_TASKLIST – Tasklist
  • SL_CALCRULE – Rule
  • SL_FORMFOLDER – Form Folder
  • SL_CALCFOLDER – Rule Folder
  • SL_DIMENSION – Dimension
  • SL_REPORT – Financial Report
  • SL_REPORTSSHOT – Financial Report Snapshot
The access type is the type of access permission; this can be: NONE, READ, WRITE, READWRITE or LAUNCH.

The access mode is the inheriting access permission; this can be: MEMBER, CHILDREN, @ICHILDREN, @DESCENDANTS or @IDESCENDANTS. The documentation provides further details on these.


I will provide details on the “Remove” column later in this post.

In the previous example the access permissions were exported for all users and groups. It is possible to restrict the export to specified users or groups.

To export access by group, the “exportGroups” parameter can be used and assigned a comma separated list of groups.
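A sketch of that payload; note the “exportGroups” value is a single comma separated string rather than a JSON array, and the “parameters” nesting is my assumption.

```python
def build_group_export_payload(job_name, groups, filename=None):
    # groups are joined into one comma separated string
    params = {"exportGroups": ",".join(groups)}
    if filename:
        params["filename"] = filename
    return {"jobType": "Export Security", "jobName": job_name,
            "parameters": params}

payload = build_group_export_payload("Export Group Access",
                                     ["Analyst", "Reviewer"])
```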


The above example will export access permissions for only the “Analyst” and “Reviewer” groups.


To export access permissions by user, the “exportUsers” parameter can be used and defined by a list of comma separated users.


The above example exports security for a single user.


To automate the process of exporting the access permissions, you can easily convert this into a script. Most programming languages support REST in some form, so pick the one you feel most comfortable with.

In the following PowerShell script, a REST request is made to export access permissions for a group named “Planner”.


Once the REST request has been made, the status can be repeatedly checked until the job is no longer in progress.


Finally, a separate REST resource can be executed to download the file from the EPM Cloud instance.


The file that has been downloaded contains the access permissions for the “Planner” group.



The script can be modified to work with any of the REST API job resources contained in this post.

Moving on to importing access permissions, I will start off with a CSV file that defines permissions for a user and a group.


The “Planner” group will be defined with read access to the form folder “Long Range Plan”.

The user will be granted read/write permissions to the “Total Entity” member and all its descendants.

To demonstrate, the “Total Entity” member currently has no user access defined.


The “Long Range Plan” form folder has no group access defined.


The CSV file must first be uploaded to the Cloud instance, and this can be achieved using a REST resource, an example being:
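As a rough sketch, the upload could look like the following. The interop endpoint, version segment and chunk flags in the “q” query parameter are assumptions based on the upload resource, so treat this as illustrative rather than definitive.

```python
import json
import urllib.parse
import urllib.request

def build_upload_request(base_url, filename, data, auth_header,
                         api_version="11.1.2.3.600"):
    # single-chunk upload; q flags this as both the first and last chunk
    chunk_info = json.dumps({"isFirst": True, "isLast": True,
                             "chunkSize": len(data)})
    url = (f"{base_url}/interop/rest/{api_version}/applicationsnapshots/"
           f"{urllib.parse.quote(filename)}/contents"
           f"?q={urllib.parse.quote(chunk_info)}")
    return urllib.request.Request(
        url, data=data, method="POST",
        headers={"Authorization": auth_header,
                 "Content-Type": "application/octet-stream"})

req = build_upload_request("https://epm-test.example.com",
                           "importSecurity.csv", b"Name,Is User\n",
                           "Basic <encoded credentials>")
```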


Once the request has been successfully executed the file should be available in the Cloud instance.


The access permissions can then be imported using the REST resource with the same jobs URL and post method.


The “jobType” is “Import Security”. The “jobName” can be anything relevant.

Just like with the export security resource the “filename” is optional.

Once the request has been sent, a response containing the job information is returned.


This can then be checked by including the job ID in the URL.


The import failed as the filename was incorrect. The filename is case sensitive, and I incorrectly set the first character as upper case.

After rectifying the filename, the import was successful.


The response includes the details of how many records were imported and if any were ignored.

To confirm the import, the form folder was correctly assigned with the group.


The member permissions were assigned to the user.


It is possible to remove access permissions, and this can be achieved by setting the value in the remove column to “Y”.

In the following example I am going to remove access to the “Planner” group for the member “BS”.
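A sketch of building such a removal record with Python’s csv module. The column layout mirrors the export file described earlier, but check the header row of your own export before importing; the access type and mode values here are illustrative.

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Name", "Is User", "Object Type", "Object Name",
                 "Access Type", "Access Mode", "Remove"])
# setting the Remove flag to "Y" deletes the matching permission on import
writer.writerow(["Planner", "N", "SL_DIMENSION", "BS",
                 "READWRITE", "MEMBER", "Y"])
csv_content = buf.getvalue()
```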


Currently access has been assigned to the group.


The CSV file was uploaded, and the access permissions imported using the REST resource.


The response confirms 1 record was successfully imported.


Checking in the UI confirms the access permissions have been removed for the group.


On to the next REST resource and that is the ability to export audit records. Please note that you will have to enable auditing in the UI if you have not already done so.

Once again, the URL and method are the same.


The “jobType” parameter should be “Export Audit”. The “jobName” can be anything relevant you decide on.

The “filename” parameter is optional. If it is not included in the body of the request, it will be generated in the same format as previously explained; the only difference is that the filename will include “AuditRecords”.

The “ndays” parameter defines the number of days for which audit records should be exported. The possible values are 1, 2, 7, 30, 60, 180 or All. If not included, the default is 7 days.
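A sketch of the “Export Audit” payload, including a guard for the ndays values listed above. The “userNames” parameter restricts the export to specific users, and the “parameters” nesting is my assumption.

```python
VALID_NDAYS = {"1", "2", "7", "30", "60", "180", "All"}

def build_export_audit_payload(job_name, ndays="7", filename=None, users=None):
    if str(ndays) not in VALID_NDAYS:
        raise ValueError(f"ndays must be one of: {sorted(VALID_NDAYS)}")
    params = {"ndays": str(ndays)}
    if filename:
        params["filename"] = filename
    if users:
        # comma separated list restricting the export to these users
        params["userNames"] = ",".join(users)
    return {"jobType": "Export Audit", "jobName": job_name,
            "parameters": params}
```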

Once the request has been sent the audit records will be exported and a zip file produced which can be downloaded.


The zip file contains a CSV file with the same name.


The CSV file is in the same format as that viewed in the UI.


It is possible to export audit data for specified users.


The “userNames” parameter can contain a comma separated list of users.

The generated CSV file will contain only audit records for the specified user(s).


The next REST resource allows the export of job console information.


The “jobType” should be “JOBCONSOLE_EXPORT”. The “jobName” can be set to anything you want.

Again, the “filename” parameter is optional. If it is not included in the body of the request, it will be generated in the same format as previously explained; the only difference is that the filename will include “PlanningJobConsoleRecords”.

Just like with the other resources the job can be repeatedly checked until the status changes.


A zip file will be created which can be downloaded using either the UI, EPM Automate or REST API.


Unless parameters are specified in the body of the REST request, the only job console information exported will be for rules with a completed status.


By default, 7 days of job console information will be exported. This can be changed using the “ndays” parameter; the possible values are 1, 2, 7, 30, 60, 180 or All.


You can define the job types included in the export with the “jobTypes” parameter. This can be a comma separated list, please refer to the documentation for the full list.

In the above example, job console information will be exported for all records and job types.


To export the job console information for importing and exporting security the following parameter can be defined in the body of the request.


This will produce a CSV file with job console information for only import and export security jobs.


It is possible to restrict the export to the job name.


In this example the job console information is only exported for jobs that have been named “MTL Import Security” and “MTL Export Security”.


By default, only jobs with a completed status are exported. This can be changed by using the “jobStatusCodes” parameter and a list of comma separated codes.

The possible values are:
  • 1 Processing: Current jobs under execution
  • 2 Completed: Successfully completed jobs
  • 3 Error: Jobs failed with errors
  • 4 Unknown: Jobs completed with unknown status
  • 5 Completed with Threshold Violation: Jobs completed with threshold violation status
  • 6 Pending Cancel: Jobs with Pending cancel status
  • 7 Cancelled: Cancelled jobs
  • 8 Completed with Errors: Jobs completed with errors
  • 9 Completed with Warnings: Jobs completed with warnings
  • All: Export the jobs with any status
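The codes above can be captured as a small lookup table, with a helper to build the comma separated “jobStatusCodes” value:

```python
JOB_STATUS_CODES = {
    "1": "Processing", "2": "Completed", "3": "Error", "4": "Unknown",
    "5": "Completed with Threshold Violation", "6": "Pending Cancel",
    "7": "Cancelled", "8": "Completed with Errors",
    "9": "Completed with Warnings",
}

def job_status_codes_param(*codes):
    # validate each code and join into the comma separated parameter value
    for code in codes:
        if code != "All" and code not in JOB_STATUS_CODES:
            raise ValueError(f"Unknown job status code: {code}")
    return ",".join(codes)

# e.g. export only errored jobs
param = job_status_codes_param("3")
```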

In this example, job console information has been exported only for jobs that have errored.


In the zip file that is produced there is a folder called “errorlogs”.


This contains additional error files that match by job ID.


So for the job ID 18 the error file includes the following information.


If the error files are not required, the “exportErrorDetails” parameter can be used with a value of “false”.


This means the folder in the zip file will not include the error files.

On to the final new REST API resource, which allows dimension members to be sorted.

The same jobs URL and method are used.


The “jobType” should be “Sort Members” and the “jobName” can be set to anything relevant.

The “member” parameter is the parent member whose children or descendants are being sorted.

By default, the sort will be applied to the children of the specified member and in ascending order.

The above example will sort all the children of member “SortMe” in ascending order.


Just like with all the other resources the job status can be checked.


The children of “SortMe” have been sorted in ascending order.


Now let’s add some children to the “Z_Product” member.


The body of the REST request has been updated to include the “order” and “type” parameters.
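A sketch of what that body could look like; by default the job sorts the children of the member in ascending order, so both parameters below are overriding the defaults. The “parameters” nesting is my assumption.

```python
def build_sort_payload(job_name, member, order=None, sort_type=None):
    params = {"member": member}
    if order:                  # "ascending" (default) or "descending"
        params["order"] = order
    if sort_type:              # "children" (default) or "descendants"
        params["type"] = sort_type
    return {"jobType": "Sort Members", "jobName": job_name,
            "parameters": params}

payload = build_sort_payload("Sort Product", "SortMe",
                             order="descending", sort_type="descendants")
```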


The descendants of member “SortMe” have been sorted in descending order.


This sort functionality has also been added to EPM Automate with the “sortmember” command.

As you would expect it is very simple to use and the command has the following syntax.


The following will sort the descendants of the member “SortMe” in ascending order.


I didn’t need to include the “order” parameter as the default is to sort in ascending order.


Another new command to be introduced into EPM Automate allows the application to be put into administration mode or returned to all users.

This functionality has been available in the REST API for a while; I covered it in a previous blog post.

The new EPM Automate command is named “ApplicationAdminMode” and only requires a value of either “true” or “false”.


Before running the command, the application is available for all users.


Running the command with the “true” parameter puts the application into administration mode.


This is confirmed by checking the settings section in the UI.


The application can be returned to all users by issuing the command with the “false” parameter.


The application is now enabled for all users.


That completes my look at all the new REST API functionality and some of the more useful new EPM Automate commands.

I hope you have found it useful!

Sunday, 24 March 2019

Hybrid FDMEE and upcoming Oracle Cloud changes

If you are not already aware, there are security protocol changes being made in the 19.05 Cloud release that will affect hybrid FDMEE implementations.

An excerpt from the Cloud Readiness document:

“Starting with the May 2019 update (Release 19.05) to EPM Services, Oracle will support only Transport Layer Security protocol version 1.2 (TLS 1.2) to communicate with EPM Cloud. To ensure the highest level of authentication and data encryption security, TLS 1.0 and TLS 1.1 will no longer be supported after your environments are updated on May 3, 2019.”

Now I have already read an article on this topic and Oracle support documents which outline how to be compliant with the TLS changes, but in my opinion the guidance has been made overcomplicated and confusing.

In this post I am going to provide my take on what is required to be ready for when the security changes are implemented. The objective is not to overly complicate matters, nor to advise applying every critical patch update across your whole EPM estate. The idea is that you already have a functioning hybrid FDMEE implementation and you need to know what requires updating, with minimal impact to your environment.

Please note this is just my opinion, so feel free to disagree and challenge my view.

Basically, FDMEE integrations with EPM Cloud are achieved using the REST API, and communication between FDMEE and the Cloud comes only from the FDMEE Java web application. This gives a clue to where to focus first: Java. If there have been no updates to the version of Java deployed with FDMEE 11.1.2.4, then it will be running Java 6 update 35. This version of Java is so old that it does not support TLS 1.2, so when the upcoming changes are made, FDMEE integrations with Oracle Cloud will fail.

The Oracle Java roadmap highlights just how far behind 11.1.2.4 is. To put it into perspective, Java 12 has recently been released, while 11.1.2.4 is running on Java 6, for which extended support ended in December 2018, so there will be no further patch updates.

If you want to read about this in more detail, then take a look at the Oracle support document – “Oracle Java SE 6 and JRockit End of Support (Doc ID 2244851.1)”.


So, what are the options when it comes to updating the version of Java to support TLS 1.2? Well, 11.1.2.4 only supports Java 6 and 7 so you can forget about anything higher.

If you have already updated your EPM environment to Java 6 update 121+ (don’t hold me to that exact version though) or to Java 7, then you should be ready for TLS 1.2, though to be sure you should really test to confirm.

Before making any changes, I double checked the supported protocols against the EPM Cloud instance with the help of an SSL labs test.
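If you prefer to check from a script rather than SSL Labs, a minimal sketch using Python’s ssl module is below; it refuses anything below TLS 1.2 and reports the protocol the server negotiates. The hostname is a placeholder for your instance.

```python
import socket
import ssl

def negotiated_protocol(host, port=443):
    ctx = ssl.create_default_context()
    # refuse anything below TLS 1.2, matching the 19.05 requirement
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()   # e.g. "TLSv1.2" or "TLSv1.3"

# negotiated_protocol("epm-test.example.com")
```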


Only TLS 1.2 is enabled which is what I wanted to see. If you have time and would like to test out on an Oracle Cloud instance, then raise an SR and they will provide temporary access.

In FDMEE I added a new target Cloud application.


Instead of the application selection window being displayed, an SSL error was generated.


With TLS debugging enabled, the logs confirmed that the issue was related to communication being initiated with the TLS 1.0 protocol.

[ACTIVE] ExecuteThread: '6' for queue: 'weblogic.kernel.Default (self-tuning)', READ: TLSv1 Alert, length = 2
[ACTIVE] ExecuteThread: '6' for queue: 'weblogic.kernel.Default (self-tuning)', RECV TLSv1 ALERT:  fatal, handshake_failure

I will go through what I believe is a simple method for updating Java to a version that should support TLS 1.2. It is not the one Oracle sometimes advises, which involves finding and replacing Java versions in files.

I am going to start out with Java 7 as that makes more sense to upgrade to. My example will be on Windows, but the concept should be similar for other operating systems.

If you look in the middleware home directory, you will likely see the following Java versions.


It is only the JDK which will require updating to Java 7; this is because JRockit does not exist after Java 6, so we will configure the environment to use only the JDK.

If you are looking to download Java versions from Oracle support, then I recommend the document – “All Java SE Downloads on MOS (Doc ID 1439822.1)”.

I am going to be updating Java to the following version.


The patch is simple to install as it is just an executable.


I set the install path to be the middleware home directory, so it is in the same location as the existing Java versions.


There are now three Java versions.


The Java 6 directories were renamed.


Now the next step is to use “mklink” to create a symbolic link to the new Java location. The old Java 6 directories will be set as the link name. This means anything referring to jrockit_160_37 and jdk160_35 will be directed to the new Java installation, so no need to mess about trying to find and replace Java versions in files.  Also, if you update the version of Java in the future, it is as simple as recreating the symbolic links.

The syntax for “mklink” is:

mklink /d <link> <target>

/d - Creates a directory symbolic link.
<link> - Specifies the name of the symbolic link that is being created.
<target> - Specifies the path that the new symbolic link refers to.

It is as simple as running the following for the JDK.

mklink /d E:\Oracle\Middleware\jdk160_35 E:\Oracle\Middleware\jdk170_211


Then for JRockit:

mklink /d E:\Oracle\Middleware\jrockit_160_37 E:\Oracle\Middleware\jdk170_211


In the middleware home directory there are now two symbolic links created.


The properties show that the target is the Java 7 directory.


For example, if I go to the Java 6 JDK directory it might look like I am in that directory, but it is pointing to the Java 7 directory.


This makes life much easier as there is no need to go searching through files to repoint to the new Java installation.

Please note if your environment is across multiple servers you will need to update Java in each of them.

There is one more change that needs to be made before I can test. Each of the Java web applications will be accessing a Java file in a JRockit directory which will need pointing to the equivalent JDK directory.

To make this change, the Windows registry editor is required; there is a key for each of the Java web applications, for example:


There will be a “JavaDll” string pointing to a JRockit directory.


This can be updated to the equivalent JDK directory, notice that this is kept as the symbolic link location so no modifications are required if Java is upgraded again.


This change will need to be made in the registry for each of the Java web applications.

All done and ready to test once the services have started.

In FDMEE a new Cloud application is added.


This time the result is much better, and no error is displayed.


With TLS debugging enabled, the log this time confirms the communication is using the TLS 1.2 protocol.

[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)', READ: TLSv1.2 Handshake, length = 87

To be sure everything is working as expected I tested out refreshing metadata.


This was successful.


Next, refreshing members.


No problems with this.


Now for a data load from on-premise FDMEE to EPM Cloud.


I am testing out with a numeric load first which will create a file that is in the format for a direct load to Essbase.


The file will be uploaded using the REST API, and then, again with the REST API, a job will be run to load the data file.



I tested out the same data load but this time using the all data types method. This will create a file in the Outline Load Utility (OLU) format and the utility will be called to load the file through the planning layer.


Once again, no problems.


Next, on to extracting data from EPM Cloud and loading to on-premise FDMEE.


All good. I did also test a custom Jython script which makes REST API calls to EPM Cloud and there were no issues.

So all I have done is update the version of Java and everything seems to be working well.

If you are adamant about staying on Java 6, then it is a similar process to update. The only difference is that JRockit can also be updated.

The final Java 6 patch versions are:

Patch 28414647: Oracle JDK 6 Update 211
Patch 28414796: JDK160_211 Oracle JRockit R28.3.20

If you follow the same process I have previously been through, then you should end up with something like:


Create the symbolic links using “mklink”


You should end up with the following:


There is no need to update the Windows registry as JRockit is still being used.

I carried out the same tests in FDMEE and did not come across any issues.


Oracle do suggest patching WebLogic, but from my testing it didn’t look like it was required for TLS 1.2. I can only assume it is for enhanced security reasons.

If you look at the web application logs, you will notice how old the version of WebLogic is that is deployed by default with 11.1.2.4.

<Version: WebLogic Server 10.3.6.0  Tue Nov 15 08:52:36 PST 2011 1441050 >

If you want to patch WebLogic, the process is pretty simple and there are lots of examples on the internet. Patches are applied using the BEA Smart Update (bsu) command line utility.

The patch I am going to apply is:

Patch 28710912: SU Patch 7HKN: WLS PATCH SET UPDATE 10.3.6.0.190115

This was downloaded and extracted.

Then the patch can be applied with the BSU utility through command line. I will leave it to you to find out what each of the parameters are :)


If you have Java web applications deployed across multiple servers then the patch should be applied to each of these.

Once you start up a web application the log should contain the patch information.

<Version: WebLogic Server 10.3.6.0.190115 PSU Patch for BUG28710912 Tue Nov 13 06:00:38 PST 2018
WebLogic Server 10.3.6.0  Tue Nov 15 08:52:36 PST 2011 1441050 >

So there you go, updating Java might be enough to get you ready for the changes. Let me know if you see different results.