Tuesday, 10 January 2017

FDMEE and the REST is up to you – Part 2

In the last post I went through two new REST API resources for running data load rules and batches, which are available from FDMEE 11.1.2.4.210. These resources have been available for some time in the cloud, which I wrote about here.

In this post I am going to continue where I left off and look at another three REST resources which were available in Data Management in the cloud from the 10.16 release and for on-premise from FDMEE 11.1.2.4.210.

I will also cover the equivalent commands using the EPM Automate utility, which is currently only cloud-based.

The first piece of functionality I am going to look at is the ability to import data member mappings, and let us start with how this can be achieved using the EPM Automate utility.

If you are using the web UI to import mappings you would go to “Data Load Mapping”, select import for either a single dimension or all dimensions and then select or upload a file containing the mappings.


In reality you want to keep manual tasks down to a minimum, so to do this using the EPM Automate utility the command you will need is “importmapping” and the syntax is:

epmautomate importmapping DIMENSION_NAME|ALL FILE_NAME IMPORT_MODE VALIDATION_MODE LOCATION

DIMENSION_NAME|ALL - The dimension name for a specific dimension to import, such as ACCOUNT, or ALL to import all dimensions.

FILE_NAME - The file and path from which to import mappings. The file format can be .CSV, .TXT, .XLS, or .XLSX. The file must be uploaded prior to importing, either to the inbox or to a sub-directory of the inbox.

IMPORT_MODE - The import mode: MERGE to add new rules or replace existing rules, or REPLACE to clear prior mapping rules before import.

VALIDATION_MODE - Whether to use validation mode (true or false). An entry of true validates the target members against the target application; false loads the mapping file without any validations.

LOCATION - The location to which the mapping rules should be loaded.

For an example I am going to load the following account member mappings in csv format using the utility.


First the mapping file needs to be uploaded to the Data Management inbox using the utility.


Next the mappings are imported using the “importMapping” command.
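For illustration, the two commands would be along these lines; the local path and file name are just examples, and the location name is one used later in this post:

epmautomate uploadfile C:\Temp\AccountMappings.csv inbox
epmautomate importmapping ACCOUNT inbox/AccountMappings.csv MERGE false DelimitedLoc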


A quick check in the UI confirms the mappings are definitely available for the specified location.


So with two commands the process of loading mappings can be automated.

To do this using a REST resource is an easy task as well; the resource is available for both EPM Cloud and FDMEE from 11.1.2.4.210.

The URL for the resource is the same as when running rules or batches and follows the format:

On-premise: http(s)://<webserver>:<port>/aif/rest/V1

Cloud: https://<cloud_instance>/aif/rest/V1

The resource uses a POST method and has a number of input parameters which are required to be in JSON format in the body of the request.

For an example I am going to import the mappings in the following csv file for all dimensions.


The mapping file has already been uploaded to the inbox directory.


In this example I am importing mappings to an on-premise FDMEE instance but it would be exactly the same for cloud.


The “jobType” is always “MAPPINGIMPORT” and the “jobName” will be either the name of the dimension or ALL. I think the rest of the parameters in the body of the POST should be self-explanatory and follow the same logic as the EPM Automate utility.
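Putting that together, the body of the request would look something like this; the file and location names are illustrative, and the parameter names follow the documented Data Management REST API:

{
  "jobType":"MAPPINGIMPORT",
  "jobName":"ALL",
  "fileName":"inbox/AllMappings.csv",
  "importMode":"MERGE",
  "validationMode":"false",
  "locationName":"DelimitedLoc"
}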

Once the request has been sent a response should be returned in JSON format which will contain information about the import mapping process.


The job status has been returned as running; to keep checking the status, a GET request can be made to the URL held in the href parameter.


If you take a look at process details in the FDMEE UI you can see that the import mapping process was indeed successful.



The imported mappings are now available in the data load mapping area.


As usual I also like to convert this into the scripting world, and with a little bit of PowerShell member mappings can be imported.
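A minimal sketch of that script might look like the following; the server, port, credentials and payload values are all illustrative, so adjust them to match your environment:

$user = "admin"
$password = "password"
$restURL = "http://fdmeeserver:6550/aif/rest/V1/jobs"

$payload = '{"jobType":"MAPPINGIMPORT","jobName":"ALL","fileName":"inbox/AllMappings.csv","importMode":"MERGE","validationMode":"false","locationName":"DelimitedLoc"}'

# build a basic authorization header from the username and password
$encoded = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($password)"))
$headers = @{"Authorization"="Basic $encoded"}

# POST the JSON payload to the jobs resource and display the returned job information
$response = Invoke-RestMethod -Uri $restURL -Method Post -Headers $headers -ContentType "application/json" -Body $payload
$response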


To keep checking the status, the URL in the returned href parameter can be stored and then a request made using the URL.
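Continuing the sketch above, that could be as simple as:

# the href property in the response holds the status URL (adjust if your response nests it differently)
$statusURL = $response.href
Invoke-RestMethod -Uri $statusURL -Method Get -Headers $headers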


Hopefully you agree that it is not too difficult to script the process but feel free to use a scripting language you feel most comfortable with.

So let us now reverse the process and export mappings from FDMEE/Data Management, first with help from the EPM Automate utility.

The command to use this time is “exportmapping” and the syntax is similar to using the import functionality.

epmautomate exportmapping DIMENSION_NAME|ALL FILE_NAME LOCATION

DIMENSION_NAME|ALL - The dimension name for a specific dimension to export or ALL to export all dimensions.

FILE_NAME - The file and path to which to export mappings. The file format can be .CSV, .TXT, .XLS, or .XLSX.

LOCATION - The location from which the mapping rules should be exported.

So for this example let us export all mappings for the location “DelimitedLoc”.
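The command would be something like this (the output file name is illustrative):

epmautomate exportmapping ALL outbox/AllMappings.csv DelimitedLoc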


The exported mapping file can then be downloaded.


Now we have local access to the exported mapping file.


To export mappings using REST, the concept is pretty much the same as when importing.

The resource has the same URL and POST method.

This time I am going to export account dimension mappings to a text file for the “ALLDATA_LOC” location.


The “jobType” will always be “MAPPINGEXPORT” and the remaining parameters shouldn’t need any explanation.
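As an illustration, the body of the request would be along these lines; the output file name is an example, and the parameter names are assumed to mirror the import resource:

{
  "jobType":"MAPPINGEXPORT",
  "jobName":"ACCOUNT",
  "fileName":"outbox/AccountMappings.txt",
  "locationName":"ALLDATA_LOC"
}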

The returned response once again contains the job information.


Process details in the web UI will show that an export mapping process has been run.


If no path was included in the REST request, the exported mapping file will be available in the root directory.


Another new feature in FDMEE 11.1.2.4.210 is the ability to select the delimiter for the mapping export file.

The delimiter can be set at system, application or user setting level.


The options besides the default of comma are exclamation mark, semi-colon or pipe.


So if I select pipe and send the same REST call again the output mapping file should now honour that setting.


The only difference when importing from or exporting to Excel format is the file extension.


The mappings will then be in the Excel template format.


In scripting terms there is not much change to the import example I gave; a quick update to the payload parameters and you are ready to run an export.


On to the final command for today, and that is the ability to run predefined reports in FDMEE/Data Management; once the report has run it can be downloaded from the “outbox/reports” directory.

The EPM Automate utility command is “runDMReport” and has the following syntax:

epmautomate runDMReport REPORT_NAME PARAMETER=Value "Report Output Format=[PDF|HTML|XLS|XLSX]"

REPORT_NAME is the name of the report to be executed.

PARAMETER=Value are the report parameters, if any, and their values.

Report Output Format is the report output format and valid options are PDF, HTML, XLS, and XLSX.
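For example, to run the “Import Formats By Location” report covered later in this post and output it as a PDF, the command would be along the lines of:

epmautomate runDMReport "Import Formats By Location" "Report Output Format=PDF"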

The syntax for the EPM Automate utility can be generated by going to “Report Execution” in Data Management.

Select the report and then “Create Report Script” and a window will be displayed with the full command to run.


This can be copied and then executed using the utility; if a report has parameters then you can select them and they will be included in the generated command.


Using the first example, after copying and pasting the command the report can be generated.


Unfortunately, the report is named after the process number, which you are not informed of at generation time.

If you take a look in process details you can see the process ID; that will be the name of the report file, and the extension depends on the output format.

The report can then be downloaded using the “downloadfile” command.


The “Import Formats By Location” report is now available locally in PDF format.

Let us move on to the REST resource, which has the advantage of providing the name for the generated report.

For an example I am going to be running the “Dimension Map (Dimension)” report with the following parameters.

The URL and method for the resource are the same as in the previous examples; the above report translates into the following request.


The “jobType” is set as “REPORT” and “jobName” is the name of the report.
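A representative request body would be something like the following, assuming the same parameter names as the cloud Data Management REST API; the entries in the “parameters” object are the report’s own prompt names and values, so the ones shown here are purely illustrative:

{
  "jobType":"REPORT",
  "jobName":"Dimension Map (Dimension)",
  "reportFormatType":"PDF",
  "parameters":{
    "Dimension Name":"ACCOUNT",
    "Location":"ALLDATA_LOC"
  }
}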

Just like with the other resources the response contains information about the job.


As the process is still running the output filename is not available; the status can be repeatedly checked until it has been updated.


Once the process is complete the output filename is included in the response, and the report can then be accessed from the “outbox/reports” directory.


Converting this into a script only requires a minimal amount of change to the previous example.


The status can then be checked by making a GET request to the URL held in the href parameter.


The advantage of using the REST resource is that the output filename is returned in the response, which means the report can be accessed and then processed further if need be.

Well, I think that covers as much as I need to on this new functionality. Hopefully you found it useful, until next time…

Saturday, 31 December 2016

FDMEE and the REST is up to you – Part 1

I have previously written a couple of posts around FDMEE Web Services and at the time of writing them the options for on-premise did not include any RESTful services.

The first part covered Web Services available in FDMEE using the SOAP protocol, which requires the arduous task of configuring Oracle Web Services Manager (OWSM); the amount of effort involved has never made this a viable option, and personally I prefer REST over SOAP for ease of use.

The post also covered an alternative way of executing processes in FDMEE by calling a Java Servlet which behind the scenes is basically the same method the command batch scripts use.

In the second part I went through the REST resources available at the time for Data Management in PBCS.

With the release of FDMEE 11.1.2.4.210 many of the EPM cloud features have finally been pushed down to on-premise. The readme for the FDMEE 210 PSU does not include any reference to the REST API, and the full documentation is yet to be released.

The good news is that the REST API has made its way down to on-premise and since I wrote the original piece on the cloud there have been some additional resources added to the API.

I feel this is a good opportunity to go through what is available in terms of the REST API for both on-premise and cloud.

In this first part I will show how simple it is to execute data load rules and batches using the REST API. I did cover this previously for the cloud, and it is pretty much the same principle for on-premise, but I thought I would go over it again to clear any confusion.

In the examples I will use a REST client (Boomerang for Chrome) and PowerShell for scripting. I have said this before, but I have only picked them because I find them easy to use and good for demonstration purposes; the beauty is that there are many different clients and scripting languages that can take advantage of REST, so just use the one you are most comfortable with.

The URL structure for accessing the REST resources is:

http(s)://<webserver>:<port>/aif/rest/<api_version>/<path>

The API version is currently V1 for both cloud and on-premise.

The path identifies the resource, and for running rules and batches the path is jobs.

To test that the resources are accessible using a REST client, a GET request can be made to:

http(s)://<webserver>:<port>/aif/rest/V1


Before sending, the header of the request will require authentication information containing the username and password of a provisioned FDMEE user.


The REST client automatically encodes the credentials into base64 and adds them as a basic authorization header.


After sending the request, a valid response should contain the REST API version information in JSON format.
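The same test can be scripted; a minimal PowerShell sketch (with illustrative server, port and credentials) would be:

$user = "admin"
$password = "password"
$restURL = "http://fdmeeserver:6550/aif/rest/V1"

# encode the credentials into base64 and add them as a basic authorization header
$encoded = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($password)"))
$headers = @{"Authorization"="Basic $encoded"}

# a GET request to the base URL should return the API version information
Invoke-RestMethod -Uri $restURL -Method Get -Headers $headers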


So let us go through an example of running an FDMEE data load rule.

I have the following data load rule which has a source file and target planning application.


Another of the new features in the 210 PSU is the ability to load non-numeric data; once again I have covered this in the cloud, but this is a good time to test out the functionality for on-premise FDMEE.

In the target options of the load rule there is the option to set the load method to “All Data Types” and the date format for date data.


For this example, I have created a Text, Date and Smart List member in the target planning application.


The source file has the three members and data with the equivalent data types.


There is also a form to verify that the non-numeric data has been loaded correctly.


Now that everything is in place I can look at using the REST resource to execute the load rule.

The resource uses a POST method and has a number of input parameters which are required to be in JSON format in the body of the request.

The parameters are:

jobType which for load rules will always be “DATARULE”

jobName will be the name of the data load rule.

startPeriod will be the first period for the data.

endPeriod will be the last period for the data.

importMode defines how the data is imported into FDMEE; the possible values are: APPEND, REPLACE, RECALCULATE, NONE

exportMode defines how the data is loaded into the target application; the possible values depend on the target application.

For Planning these are STORE_DATA, ADD_DATA, SUBSTRACT_DATA, REPLACE_DATA, NONE

If your target application is HFM then this is where you hit an issue: as the REST resources have been pushed down from the cloud, and there is no HFM in the cloud, not all of the export options are available. Well, I think they are not there; I will update this post if I find them or if anything changes.

The available values are REPLACE, MERGE, NONE.

This means that accumulate and replace by security are not there; if you need to use those export values then as a workaround you can create a batch in FDMEE and use the REST resource to run the batch.

fileName (optional) - if not supplied, the file which is defined in the data load rule will be used.

For the data load rule I am going to execute, the parameters translate into the following:
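As an illustration, a request body would look something like this; the rule name and periods are examples, and fileName is left out so the file defined in the rule is used:

{
  "jobType":"DATARULE",
  "jobName":"DelimitedFileDLR",
  "startPeriod":"Jan-17",
  "endPeriod":"Jan-17",
  "importMode":"REPLACE",
  "exportMode":"STORE_DATA"
}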


The request can be sent and a response will be returned with parameters and values in JSON format which provide information about the job.


The jobID parameter is the equivalent of the process ID in FDMEE.

jobStatus will contain one of “RUNNING”, “SUCCESS” or “FAILED”.

status will be one of the following values: -1 = in progress; 0 = success; 1 = error; 2 = cancel pending; 3 = cancelled; 4 = invalid parameter

The job status has been returned as running; to keep checking the status, a GET request can be made to the URL held in the href parameter.


The job status has now been returned as “SUCCESS” so the full process is complete.

If the job ID is removed from the URL then all job statuses will be returned.

If you take a look in process details in the FDMEE UI then you can see that the process ID and job ID do match and the status is the same.


Back to my form in planning and the non-numeric data has been loaded correctly, so that is two pieces of new functionality tested in one go.


Converting this into a script is a simple task; with not much effort and a small amount of code an FDMEE data load rule can be executed.
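A minimal PowerShell sketch of that (server, credentials, rule name and periods are all illustrative) could be:

$user = "admin"
$password = "password"
$restURL = "http://fdmeeserver:6550/aif/rest/V1/jobs"

# build the payload as a hashtable and convert it to JSON
$payload = @{
  jobType = "DATARULE"
  jobName = "DelimitedFileDLR"
  startPeriod = "Jan-17"
  endPeriod = "Jan-17"
  importMode = "REPLACE"
  exportMode = "STORE_DATA"
} | ConvertTo-Json

# basic authorization header
$encoded = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($password)"))
$headers = @{"Authorization"="Basic $encoded"}

# POST the request to execute the rule and display the returned job information
$response = Invoke-RestMethod -Uri $restURL -Method Post -Headers $headers -ContentType "application/json" -Body $payload
$response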


To keep checking the status, the URL in the returned href parameter can be stored and then a request made using the URL.
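Continuing the sketch, a simple polling loop might look like this (again assuming the href is returned at the top level of the response):

$statusURL = $response.href
do {
  Start-Sleep -Seconds 5
  $response = Invoke-RestMethod -Uri $statusURL -Method Get -Headers $headers
} while ($response.jobStatus -eq "RUNNING")
$response.jobStatus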


So now rules can be easily executed using the REST API.

I know I am going to get asked about uploading files to the FDMEE directory because it is possible using REST in the cloud. Currently this is not possible, as the upload/download functionality in the cloud uses the migration REST API and this has not made it down to on-premise. I am not sure if it will or not because of the difference between the directory structure in the cloud compared to on-premise; we will see.

Really it shouldn’t be too much of a problem with on-premise though, as the FDMEE directories are on a file system, so there are many ways to transfer files using scripting languages.

Moving on to the next REST resource and that is the ability to run batches.

For my example I have the following batch created which just executes the same load rule as the previous example.


The REST resource for running batches is the same URL as for running data loads and once again with a POST method.

Only a couple of input parameters are required in the body of the request to the resource.

jobType which for batches will always be “BATCH”

jobName will be the name of the batch.

For my example this translates to the following request in the REST client:
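The body of the request would simply be something like this (the batch name is illustrative):

{
  "jobType":"BATCH",
  "jobName":"LOAD_ALL_DATA"
}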


Just like with the data rule, a response is returned containing all the job information.


The response will only contain the details of the main batch process and not the IDs of the jobs that are part of the batch; this is not really a problem, as you would mainly be interested in the status of the overall batch process.


As the job ID of the batch is returned and you will already know the order of the jobs, you could easily check the details of each of the jobs in the batch.


In terms of scripting, the code would be pretty similar to running a data load, the only difference being that fewer parameters are required in the body of the request.


Using a script is also a nice and simple way of monitoring the status of FDMEE processes without having to log into the UI.


One thing that would be beneficial that is not included in the REST responses is the start and end times of the processes; maybe Oracle will include them at some point in the future.

Right, I am going to leave it there for today and in the next part I will cover importing/exporting mappings and running reports using REST.