Sunday 30 September 2018

Automating data flows between EPM Cloud and OAC – Part 1

In past blogs I have covered automation in EPM Cloud using the REST API in detail. Recently I have blogged comprehensively on the Essbase REST API in OAC, so I thought I would combine the two and go through an example of automating the process of moving data between EPM Cloud and OAC Essbase.

The example will be based on extracting forecast data from PBCS using Data Management, downloading the data and then loading it to an OAC Essbase database. I will provide an option of downloading data directly to OAC from PBCS for those who have a customer-managed OAC instance; alternatively, for autonomous OAC the data can be downloaded from PBCS to a client/server before loading to Essbase.

I am going to break this into two parts, with the first part covering the setup and the manual steps in the process, and the second part getting into the detail of automating the full process with the REST API and scripting.

Before I start I would like to point out that this is not the only way to achieve the objective, and I am not stating that this is the way it should be done; it is just an example to provide an idea of what is possible.

To start with, I am going to want to extract forecast data from PBCS, and here is a sample of the data that will be extracted:


To extract the data, I am going to use Data Management; once the integration has been defined, I can add automation to extract the data using the REST API.

As it is EPM Cloud, I will need to extract the data to a file and this can be achieved by creating a custom target application in Data Management.


The dimensions have been created to match those of the OAC Essbase database. I could have included scenario, but that is always going to be static so it can be handled on the Essbase side.


There are slight differences between the format of the Year in PBCS


and that in the Essbase database.


Aliases could be used but I want to provide an example of how the difference can be handled with period mappings in Data Management.


This will mean any data against, say FY19, in PBCS will be mapped to 2019 in the target output file.

If there are any differences between other members these can be handled in data load mappings in DM.

In the DM data load rule, source filters are created to define the data that will be extracted.


In the target options, a fixed filename has been added; this is just to make the process of downloading the file easier. If this is not done, you would need to either capture the process ID from the REST response to generate the filename or read the filename from the job's REST response. Both methods produce the same outcome but, in this example, I am going for the simpler option.


Before running the integration, I will need to know which start and end period to select.

For the automated process I am going to pick this up from a substitution variable in Essbase; it would be the same concept if the variable were held in PBCS, as both have a REST resource available to extract the information.


The data will be extracted for a full year, so based on the above sub var, the start period would be Jan-19 and the end period Dec-19.
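To give a flavour of how the automation will pick this up, here is a minimal Python sketch that reads a substitution variable over the Essbase REST API and derives the start and end periods. The instance URL, application, database and variable name are all hypothetical placeholders, and the variables resource path should be checked against your OAC version.

# Hedged sketch: fetch a substitution variable from OAC Essbase and derive
# the start/end periods for the full year. All names are placeholders.
import requests

OAC_URL = "https://<oac_instance>/essbase/rest/v1"
APP, DB, SUB_VAR = "Demo", "GL", "FcstYear"        # hypothetical names

resp = requests.get(f"{OAC_URL}/applications/{APP}/databases/{DB}/variables/{SUB_VAR}",
                    auth=("essbase_user", "password"))
resp.raise_for_status()
year = resp.json()["value"]                        # e.g. "2019"

start_period = f"Jan-{year[-2:]}"                  # "Jan-19"
end_period = f"Dec-{year[-2:]}"                    # "Dec-19"
print(start_period, end_period)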


Running the rule will extract the data from PBCS, map and then produce an output file.
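As a preview of what the next part will automate, running the rule could look something like the following Python sketch against the Data Management REST resource; the rule name, import/export modes and credentials are assumptions to be replaced with your own.

# Hedged sketch: run the Data Management data load rule over REST.
import requests

DM_URL = "https://<cloud_instance>/aif/rest/V1/jobs"
payload = {
    "jobType": "DATARULE",
    "jobName": "OAC_FCST_EXTRACT",                 # hypothetical rule name
    "startPeriod": "Jan-19",
    "endPeriod": "Dec-19",
    "importMode": "REPLACE",
    "exportMode": "STORE_DATA"
}
resp = requests.post(DM_URL, json=payload, auth=("user@domain", "password"))
resp.raise_for_status()
print(resp.json())                                 # contains the process ID and job status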


The rule ran successfully so the exported file will be available in the inbox/outbox explorer.


If I download the file you can see the format of the exported data.
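The exported file can also be pulled down with the migration REST resource rather than through the UI, which is the approach the automation will take; a minimal sketch follows, assuming the fixed filename set in the target options was forecast.dat (a made-up name, which may or may not sit under an outbox folder).

# Hedged sketch: download the extract file from the inbox/outbox explorer.
import requests

LCM_URL = "https://<cloud_instance>/interop/rest/11.1.2.3.600/applicationsnapshots"
filename = "outbox/forecast.dat"                   # assumed fixed name from the target options

resp = requests.get(f"{LCM_URL}/{filename}/contents",
                    auth=("user@domain", "password"), stream=True)
resp.raise_for_status()
with open("forecast.dat", "wb") as f:
    for chunk in resp.iter_content(chunk_size=65536):
        f.write(chunk)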


When I cover the automation in the next part I will provide two options: the first one will download the data file directly to OAC from PBCS and then load the data, and the second will download the file from PBCS to a machine running the automation script and then stream load it to Essbase.

As this post is about manually going through the process, I have downloaded the file from PBCS and uploaded it to OAC Essbase.


The file has been uploaded to the Essbase database directory.


Now an Essbase data load rule is required to load the above file.

A new rule was created, and the uploaded data file selected.


The columns in the data file were mapped to the corresponding dimensions.


The data is always going to be loaded to the forecast scenario member, which is not contained in the file, so this was added to the data source information.


As I mentioned earlier, I could have easily included scenario in the data export file by adding the dimension to the target application in Data Management; it is up to you to decide which method you prefer.

Once created, the rule will be available from the scripts tab under rules.


To run the rule, head over to jobs in the user interface and select “Load Data”


The application, database, rule and data file can then be selected.
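The same load can also be submitted through the Essbase REST API jobs resource, which is what the automated version will do. A rough sketch follows; the application, cube, rule and parameter names are assumptions that should be checked against the REST documentation for your OAC version.

# Hedged sketch: trigger the Essbase data load rule as a job over REST.
import requests

JOBS_URL = "https://<oac_instance>/essbase/rest/v1/jobs"
payload = {
    "application": "Demo",                         # hypothetical application
    "db": "GL",                                    # hypothetical database
    "jobtype": "dataload",
    "parameters": {
        "file": "forecast.dat",                    # file uploaded to the database directory
        "rule": "LoadFcst",                        # hypothetical rule name
        "abortOnError": "true"
    }
}
resp = requests.post(JOBS_URL, json=payload, auth=("essbase_user", "password"))
resp.raise_for_status()
print(resp.json())                                 # returns a job ID and status to poll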


The status of the data load can then be checked.


As this is a hybrid database, there is no need to run a calculation script to aggregate the data; if aggregations or calculations were required, they could simply be added into the process.


A retrieve on the data confirms the process from extracting data from PBCS to OAC Essbase has been successful.


You could apply this process in reverse, extracting data from OAC and loading it to EPM Cloud. One way to do this could be to run an Essbase data export script; the export file could then be uploaded to EPM Cloud, and a Data Management rule run to map and load to the target application.

We have a process in place, but nobody wants to live in a manual world, so it is time to streamline with automation which I will cover in detail in the next part. Stay tuned!

Sunday 9 September 2018

EPM Cloud – Managing users with EPM Automate and REST API

New functionality has been added in the EPM Cloud 18.09 release to provide the ability to manage users and roles at an identity domain level with either EPM Automate or the REST API. In this post I am going to cover this functionality starting off with EPM Automate.

Four new commands have been added to EPM Automate and these are:
  • addusers – Creates new users in the identity domain based on the contents of a comma separated file.
  • removeusers – Deletes identity domain accounts based on the contents of a comma separated file.
  • assignrole – Assigns an identity domain role to all users that are contained in a comma separated file.
  • unassignrole – Unassigns an identity domain role to all users that are contained in a comma separated file.
Please note, to be able to use these commands you will need to be logged in with an account that has the “Identity Domain Administrator” role.

The comma separated files have to be uploaded to the cloud instance first using the "uploadfile" command before you can use the new commands. I would have preferred it if this could have been done in a single command without having to upload files, but unfortunately that is not the way it has been developed.

I am quickly going to go through each command and provide an example.

Let’s start off with adding new users to the identity domain.

Before you can use the “addusers” command, you will need a file containing the new user information in the correct format.

The file can contain as many users as you would like to add; for demo purposes I am just going to be adding one user.

The file is required to be in the following format:


The format is the same as if you were importing a batch of users through Oracle Cloud My Services.
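Based on the standard identity domain batch import layout, the file would look something along these lines, with a header row followed by one line per user (the details here are made up):

First Name,Last Name,Email,User Login
John,Smith,john.smith@example.com,john.smith@example.com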

Now that I have the file ready I can upload it using EPM Automate. I have assumed the user has been logged in and a file with the same name does not already exist in the cloud instance; you can easily use the "deletefile" command first to remove it if needed.
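For example, to upload the "newusers.csv" file:

epmautomate uploadfile newusers.csv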


The file will then be available in the “Inbox/Outbox Explorer”


The format to add users with EPM Automate is:

epmautomate addusers <FILE_NAME> <userPassword=PASSWORD> <resetPassword=true|false>

FILE_NAME is the name of the comma separated file containing the new user information which I just uploaded.

userPassword is a default password that is assigned to the new users. It will need to meet the minimal password requirements for identity domain passwords.

resetPassword defines whether the new users must change the password the first time they log in. I recommend this is always set to true.

An example to create the users contained in the file that was just uploaded is:
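Something along the lines of the following, with the default password shown as a placeholder that has to be replaced with one meeting the identity domain password policy:

epmautomate addusers newusers.csv userPassword=<DEFAULT_PASSWORD> resetPassword=true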


The response from issuing the command will include how many new users were processed, including the number of successful and failed user creations.

If successful, the user should then be available in the identity domain. Oracle cloud should also send out an email to the new user with their account information.


As I set the reset password parameter to true, the first time the user logs in they will be redirected to the "Identity Self Service" and required to change their password.


Now that the user has been created we can assign a new identity domain role.

The format for the EPM Automate command is:

epmautomate assignrole <FILE_NAME> <ROLE>

FILE_NAME is the name of a comma separated file containing the user login for the users that you want to assign an identity domain role to.

ROLE is one of the predefined identity domain roles which are:
  • Service Administrator
  • Power User
  • User
  • Viewer
The file needs to be in the following format:
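In essence just a "User Login" header followed by one login per line, for example (the login is made up):

User Login
john.smith@example.com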


I hoped I could use the same file used for creating the users as it also contains the “User Login” information, but when I tried that I received the following EPM Automate error:

EPMAT-1:Failed to assign role for users. File does not have valid header. Please provide a valid header in file.

Before being able to use the “assignrole” command the above file was uploaded using EPM Automate.


After the file has been uploaded, the "assignrole" command can be executed; in my example I am assigning the "Power User" role to the user in the "assignUsers.csv" file.
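The command would be along the lines of:

epmautomate assignrole assignUsers.csv "Power User"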


The response is the same as when adding users; as the command was successful, the user has been assigned the role.


Unassigning a role is no different from assigning one; the format for the command is:

epmautomate unassignrole <FILE_NAME> <ROLE>

I don’t feel I need to explain what the parameters are this time.

In the following example I am using the same file I uploaded for assigning roles to users, this time I am going to unassign the “Power User” role.
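So the command this time would be something like:

epmautomate unassignrole assignUsers.csv "Power User"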


The EPM Automate command completed successfully and a quick check in “My Services” confirms the role has been removed.


On to the final command to delete users from the identity domain.

The format for the EPM Automate command is:

epmautomate removeusers FILE_NAME

The file should contain all the users you want to remove, and the file format is the same as when assigning/unassigning roles.


In the following example using EPM Automate, I upload the file containing the users to remove and then remove them with the “removeusers” command.
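For illustration, with a made-up filename of "removeUsers.csv", that would be:

epmautomate uploadfile removeUsers.csv
epmautomate removeusers removeUsers.csv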


Back in “My Services” the user has been deleted.


As the EPM Automate utility is built on top of the REST API, all the above commands are also available using REST.

So let us repeat the available functionality using a REST client. There are lots of free clients out there so pick the one you prefer. As usual I will be using the Boomerang REST client for Chrome.

First, I am going to delete the CSV file in the cloud instance containing the users and then upload a new one.

The REST URL format to delete files is:

https://<cloud_instance>/interop/rest/11.1.2.3.600/applicationsnapshots/<filename>

A DELETE method is used, so to delete the “newusers.csv” file a request would be made to the following:
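For anyone preferring to script it rather than use a REST client, a minimal Python sketch of the same call might be (credentials are placeholders):

# Hedged sketch: delete "newusers.csv" from the cloud instance over REST.
import requests

url = "https://<cloud_instance>/interop/rest/11.1.2.3.600/applicationsnapshots/newusers.csv"
resp = requests.delete(url, auth=("user@domain", "password"))
print(resp.json())                                 # a status of 0 indicates success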


If the deletion was successful a status of 0 will be returned.


If the file does not exist, a status of 8 will be returned and an error message.


Now to upload a new file containing the users to add in the identity domain. I have covered uploading files using the REST API in the past which you can read about here, so there is no need for me to go into much detail again.

The REST URL to upload files is:

https://<cloud_instance>/interop/rest/11.1.2.3.600/applicationsnapshots/<filename>/contents?q={"isLast":<true/false>,"chunkSize":<size_in_bytes>,"isFirst":<true/false>}

A POST method is required, for example:



In the body of the post I added the user information for the file.


You can include as many users as you want to add.

The request header content type is set to "application/octet-stream".
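Putting that together, a minimal Python sketch of the upload, assuming the file is small enough to send in a single chunk, might be:

# Hedged sketch: upload "newusers.csv" in one chunk with the REST API.
import json, os, requests

base = "https://<cloud_instance>/interop/rest/11.1.2.3.600/applicationsnapshots"
filename = "newusers.csv"
q = json.dumps({"isLast": True, "chunkSize": os.path.getsize(filename), "isFirst": True})

with open(filename, "rb") as f:
    resp = requests.post(f"{base}/{filename}/contents",
                         params={"q": q},
                         data=f.read(),
                         headers={"Content-Type": "application/octet-stream"},
                         auth=("user@domain", "password"))
print(resp.json())                                 # a status of 0 indicates success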

A response status of 0 will be returned if the file was uploaded.


If the file already exists, you would receive something like:


The file is now available in the "Inbox/Outbox Explorer".


On to adding the user contained in the file using the REST API.

The URL format for managing users is:

https://<cloud_instance>/interop/rest/security/v1/users

A POST method is required, and the body of the request should contain the filename and the default and reset password values. These are the same parameters which are used with EPM Automate commands.
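Going from the EPM Automate parameters, the request might look something like the following sketch; the JSON property names are assumptions that should be checked against the REST API documentation.

# Hedged sketch: add the users contained in "newusers.csv".
# Property names mirror the EPM Automate parameters and are assumptions.
import requests

url = "https://<cloud_instance>/interop/rest/security/v1/users"
body = {
    "filename": "newusers.csv",
    "userpassword": "<DEFAULT_PASSWORD>",
    "resetpassword": "true"
}
resp = requests.post(url, json=body, auth=("idm_admin", "password"))
print(resp.json())                                 # includes a link to check the job status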


It is a shame that the user information could not have been included in the body of the request instead of having to upload a file.

The response will contain job information for adding new users. It includes a URL which can be accessed to check the job status.


A GET request can be made to keep checking the job status until it completes.
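In a script this would simply be a polling loop against the returned URL; a minimal sketch, assuming the usual EPM Cloud convention where -1 means the job is still processing:

# Hedged sketch: poll the job status URL until the job completes.
import time
import requests

def wait_for_job(job_url, auth):
    while True:
        job = requests.get(job_url, auth=auth).json()
        if job.get("status") != -1:                # assumption: -1 = still processing
            return job                             # 0 = success, anything else = failure
        time.sleep(5)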


A status of 0 means the operation was successful; just like with EPM Automate, details are included to show how many new user creations were processed and how many succeeded or failed.

As the job was successful the new user has been added and is available in “My Services”


We can now move on to assigning a role for the new user.

I uploaded the following file:


In my example the file only contains a single user but it can contain as many as you want to assign a role to.

To assign a role to the users contained in a file the same URL format is required as when adding users.

A PUT method is required, and the body of the request should include the filename, the role name and a job type of “ASSIGN_ROLE”


This time I am going to assign the “User” identity domain role.
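A sketch of the request, again with the property names being assumptions based on the documented parameters:

# Hedged sketch: assign the "User" identity domain role to the users in "assignUsers.csv".
import requests

url = "https://<cloud_instance>/interop/rest/security/v1/users"
body = {
    "filename": "assignUsers.csv",
    "rolename": "User",
    "jobtype": "ASSIGN_ROLE"
}
resp = requests.put(url, json=body, auth=("idm_admin", "password"))
print(resp.json())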

The current job status for assigning the role is returned, once again it includes a URL to check the job status.


The job status can then be checked until it completes.


As the operation was successful, the “User” role has been assigned to the user contained in the file.


Unassigning roles is very similar to assigning them. A PUT method is required, and the body of the request should contain the filename containing the users that the role should be unassigned from, the role name and a job type of "UNASSIGN_ROLE"
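Reusing the earlier sketch, only the request body changes (property names again being assumptions):

body = {"filename": "assignUsers.csv", "rolename": "User", "jobtype": "UNASSIGN_ROLE"}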


The current job status is returned in the response.


The job status can be checked until it completes.


As the job was successful the “User” role has been removed for the user contained in the specified file.


To remove users, a file should be uploaded containing the user login details of the users to remove.

I uploaded the following file.


A DELETE method is required, and the URL should include the filename containing the users to remove.
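As a rough sketch, with the filename passed on the URL (exactly how the filename is supplied, and the filename itself, are assumptions to verify against the documentation):

# Hedged sketch: remove the users listed in the uploaded file.
import requests

url = "https://<cloud_instance>/interop/rest/security/v1/users/removeUsers.csv"
resp = requests.delete(url, auth=("idm_admin", "password"))
print(resp.json())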


No surprise that job status information is returned in the response.


The job status can be checked until it completes.


The response confirms the user was removed. A look in “My Services” confirms the user has been removed.


I was going to include an example using scripting, but I think you should get the idea. I have covered scripting with the REST API many times in the past so just have a look at my previous posts if you are unclear.

Update - New functionality to manage users was added in the 18.11 release, you can read about it here.