Monday, 23 January 2017

FDMEE Hybrid update - Part 1

The 11.1.2.4.210 patch set update for FDMEE adds new hybrid functionality to allow integration with either Oracle FCCS or Fusion Cloud. The readme for the PSU has the following information:

“FDMEE can be used as a primary gateway to integrate data between on-premise and the cloud based Financial Consolidation and Close Cloud (FCCS) and the Fusion GL Cloud Balances applications. This feature allows EPM customers to adapt FCCS and Fusion GL Cloud Balances deployments into their existing EPM portfolio. You might use the integration to import data from existing on-premise ERP applications, or synchronize data between FCCS applications or Fusion GL Cloud Balances and on-premise applications.”

FDMEE 11.1.2.4.210 now supports these types of hybrid integrations:
  • On-premise ERP applications from supported sources with Oracle Cloud
  • On-premise EPM applications with Oracle Cloud
  • Oracle Cloud with on-premise EPM applications
  • Oracle Cloud with EBS/PeopleSoft GL
  • Oracle Cloud with on-premise external sources using custom application
  • Oracle Fusion Financials Cloud (GL) with on-premise EPM application
  • Oracle Financial Consolidation and Close Cloud applications with on-premise applications
In this post I am going to concentrate on the area of integration between on-premise and FCCS. In the past I wrote a three-part series looking at FDMEE hybrid functionality which was focused on PBCS.

If you have not already read my hybrid posts then I recommend taking a look; the first part is a must in order to get going with hybrid integrations.

Part 1 - Necessary configuration to get up and running with the hybrid functionality.

Part 2 - Loading data from an on-premise EPM application to a PBCS application and looking at what happens behind the scenes with the REST API.

Part 3 - Exporting data from a PBCS application and loading it into an on-premise EPM application with focus on what is happening with the REST API.

The aim for today is to export data from an on-premise planning application and load that into a FCCS application.

As usual I am going to try and keep things as simple as possible, and I have created a planning form to show the data which will be exported.


To be able to get anywhere the FCCS application has to be added as a target.


After selecting the cloud option credentials are entered.


From 11.1.2.4.210 there is a new option of “Consolidation” when selecting the target application.


As long as the FCCS application is available as a target in Data Management in the cloud then it should be displayed.


The application name is returned by making a REST call; a GET request is made to the following URL:

https://<cloud_instance>/aif/rest/V1/applications

The response contains the application name.


Once the application has been selected, REST comes into play again to return the dimension information.

A GET request is made to:

https://<cloud_instance>/aif/rest/V1/applications/<app_name>

The response contains information about the dimensions in the application.
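As a rough sketch of how these two calls could be scripted, a few lines of PowerShell will do the job; the instance URL, credentials and application name below are placeholders rather than real values:

# placeholder instance and credentials for illustration only
$baseURL = "https://cloudinstance.oraclecloud.com/aif/rest/V1"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("user@domain:password"))
$headers = @{ "Authorization" = "Basic $auth" }
# GET the list of target applications available in Data Management
$apps = Invoke-RestMethod -Uri "$baseURL/applications" -Method Get -Headers $headers
# GET the dimension details for a named application
$dims = Invoke-RestMethod -Uri "$baseURL/applications/FCCS" -Method Get -Headers $headers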


Now that the dimensions are known the dimension details can be populated in FDMEE.


You will notice that in the target application summary there is an option to “Refresh Metadata” and “Refresh Members”.


I will get to refreshing members later, but say you add a new dimension in the target FCCS application, you would want this to be synchronized back to on-premise FDMEE; this option refreshes Data Management in the cloud and then pushes the changes down to FDMEE.

Behind the scenes it is all done again with REST; if you click the “Refresh Metadata” button then an FDMEE process will be executed.

In the on-premise process details you can see the steps that have been carried out.



In data management in the cloud you will see a process has been run to extract the FCCS metadata and update the target application.


To update the metadata in Data Management a POST request is made to a REST resource.


Once this process has successfully completed, the same REST resource used when adding the target application is called again.

https://<cloud_instance>/aif/rest/V1/applications/<app_name>

This will extract the dimension information from DM in the cloud and update the target on-premise FDMEE application.

The target FCCS application options contain the cloud credentials and the default FCCS system members which are used when loading to the FCCS application.


The “Load Type” property has two possible values.


For today I am only going to be covering the standard “Data” load type; possibly in the future I will go over loading journals.

These load type values can be overridden at data load rule level.

From this point it is the same process you would follow in FDMEE to load to a target on-premise EPM application.

An import format is created, setting the source as the on-premise planning application and the target as the FCCS application.


The source planning dimensions are then mapped to the target FCCS dimensions.


A location is created and the newly created import format is selected.


By default, the FCCS currency member “Entity Currency” is automatically added for the functional currency.


The same goes for the Consolidation system member “FCCS_Entity Input” which is generated for “Data Value”.


On to the data load rule: the FCCS scenario to load to is selected, and the scenarios already match what I had configured in the category mapping.

The source and target Essbase cubes were selected; I hope that soon they are just going to be called cubes instead of causing confusion between cube and plan type.


Next, filters were added to define the source intersection to extract data from; this is defined to match the data in the form that I showed earlier.


In the target options of the rule there is the option to override the application level load type.


Just like at the target application level the value can be set to either “Data” or “Journal”.

Moving on to the mappings, for the source planning application members there is an option to refresh members.


For the target FCCS application members there is no button available to refresh members and initially the member list view will be empty.


The “Refresh Values” option is not selectable in the mapping UI.


This is because currently it is not possible to refresh members at a dimension level for cloud applications; it has to be done at target application level.


Selecting “Refresh Members” will launch a new process.

In process details there will be a “Refresh Members From Cloud” process.



The steps first involve refreshing the members from FCCS to Data Management in the cloud.

There is an equivalent process in data management.



Once again, refreshing members in the FCCS Data Management cloud instance is achieved using a REST resource.


Once the process has completed successfully the member information is pushed down from data management to on-premise FDMEE using REST.

A GET request is made for each dimension in the target application using the following URL format.

https://<cloud_instance>/aif/rest/V1/applications/<app_name>/dimension/<dimension>
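Sticking with PowerShell, the per-dimension requests could be sketched like this; the dimension names, instance and credentials are again only placeholders:

# placeholder instance and credentials for illustration only
$baseURL = "https://cloudinstance.oraclecloud.com/aif/rest/V1"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("user@domain:password"))
$headers = @{ "Authorization" = "Basic $auth" }
# illustrative list of dimensions in the target application
$dimensions = "Account","Entity","Movement","Data Source"
foreach ($dim in $dimensions) {
    # URL encode the dimension name in case it contains spaces
    $encoded = [uri]::EscapeDataString($dim)
    # GET the member information for the dimension
    $members = Invoke-RestMethod -Uri "$baseURL/applications/FCCS/dimension/$encoded" -Method Get -Headers $headers
}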


After all dimensions have been processed the members will be available in the FDMEE mappings member selector.


There will be automatically created mappings for the FCCS system members in the “Data Source”, “Movement” and “Multi-GAAP” (if enabled) dimensions; these member definitions originate from the target application options which were shown earlier.


Now the rule can be run and I will start off by only running the import in the workbench.


The steps taken are exactly the same as running an on-premise EPM to EPM integration.

The data will be extracted from the source Essbase application to a text file based on the filters in the load rule; an Essbase DATAEXPORT calculation script is generated to export the data, which means currently it is only possible to extract numerical data from a planning application. Maybe a future release will provide the ability to extract all data types.

The export file is then loaded into the FDMEE relational database and source members mapped to target members.

So now on to the export.


Depending on the amount of data it is probably best to run in offline mode.


The export completed successfully and process details give a very high-level summary of the steps undertaken.



So what is happening behind the scenes? First of all, data is extracted from the FDMEE relational database to a text file, which can be seen in the process log.

INFO  [AIF]: Creating data file: \\fileshare\EPMSHARE\FDMEE\outbox\FCCS_1869.dat
INFO  [AIF]: Data file creation complete


What is interesting in the log is details on the method that will be used to load data to FCCS.

DEBUG [AIF]: Overrode info.loadMethod for the available mode: OLU

So by default the way data is loaded to FCCS is by using the Outline Load Utility (OLU). This differs from loading to PBCS, where it is possible to select “Numeric Data only – File” in the FDMEE data load rule, which means data will be loaded directly to the Essbase database using a load rule.

Alternatively, for PBCS the “All Data Types” load method can be selected, which means the Outline Load Utility will be used to load data through the planning layer.

If you open the exported file before it is loaded to FCCS you can see that it is in the format required by the OLU.
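I won't reproduce my exact file here, but broadly speaking an OLU data load file is a CSV with a column for the load dimension member, a driver member column, a “Point-of-View” column holding the remaining members and a “Data Load Cube Name” column; a purely illustrative example (all member names made up) would be:

Account,Jan,Point-of-View,Data Load Cube Name
Sales,1000,"Actual,FY17,Entity Currency,FCCS_Entity Input,E01",Consol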


The next steps in the log show the exported file is then uploaded.

INFO  [AIF]: Uploading data file to PBCS: \\fileshare\EPMSHARE\FDMEE\outbox\FCCS_1869.dat
INFO  [AIF]: Completed data file upload to PBCS.


It looks like the same code is used as for PBCS, since even though the file is being uploaded to FCCS it is logged as PBCS.

The file is uploaded using a REST resource which I covered in a previous post.

Another REST resource is called upon to load data from the uploaded file, an example being:


The file is then loaded to the FCCS application using the outline load utility.

Outline data store load process finished. 3 data records were read, 4 data records were processed, 4 were accepted for loading (verify actual load with Essbase log files), 0 were rejected.
INFO  [AIF]: Number of rows loaded: 3, Number of rows rejected: 0


It is also worth understanding what happens if invalid rows of data are loaded, so as a test I added another account to the source filter in the data load rule and mapped it to a member that does not exist in the FCCS application.


The account member “Securities” does not exist in the target application and running the export caused the FDMEE process to fail.



Looking at the process log shows that the correctly mapped data did actually load and the incorrectly mapped record was rejected.

Fetch of Driver Member "Securities" failed.
Outline data store load process finished. 4 data records were read, 5 data records were processed, 4 were accepted for loading (verify actual load with Essbase log files), 1 was rejected.
INFO  [AIF]: Number of rows loaded: 3, Number of rows rejected: 1


The full details of the record that was rejected are not displayed in the log but it does contain the invalid member.

Anyway, I created a form in FCCS to confirm the data had loaded successfully and to test out drill through.


If the create drill region option has been enabled in the data load rule then you should be able to select “Drill Through to source”.


This opens a new browser window which redirects to the on-premise instance of workspace passing in the drill through details.

Once authenticated the FDMEE Drill Through Summary page will be displayed.


I am going to leave it there for today and in the next part I will reverse the process and extract data from an FCCS application and load into an on-premise EPM application.

Tuesday, 10 January 2017

FDMEE and the REST is up to you – Part 2

In the last post I went through two new REST API resources for running data load rules and batches which are available from FDMEE 11.1.2.4.210; these resources have been available for some time in the cloud, which I wrote about here.

In this post I am going to continue where I left off and look at another three REST resources which were available in Data Management in the cloud from the 10.16 release and for on-premise from FDMEE 11.1.2.4.210.

I will also cover the equivalent commands using the EPM Automate utility which is currently only cloud based.

The first piece of functionality I am going to look at is the ability to import data member mappings, and let us start with how this can be achieved using the EPM Automate utility.

If you are using the web UI to import mappings you would go to “Data Load Mapping”, select import for either a single dimension or all dimensions, and then select or upload a file containing the mappings.


In reality you want to keep manual tasks down to a minimum, so to do this using the EPM Automate utility the command you will need is “importmapping” and the syntax is:

epmautomate importmapping DIMENSION_NAME|ALL FILE_NAME IMPORT_MODE VALIDATION_MODE LOCATION

DIMENSION_NAME|ALL - The dimension name for a specific dimension to import, such as ACCOUNT, or ALL to import all dimensions

FILE_NAME - The file and path from which to import mappings. The file format can be .CSV, .TXT, .XLS, or .XLSX. The file must be uploaded prior to importing, either to the inbox or to a sub-directory of the inbox.

IMPORT_MODE - The import mode: MERGE to add new rules or replace existing rules, or REPLACE to clear prior mapping rules before import

VALIDATION_MODE - Whether to use validation mode, true or false, an entry of true validates the target members against the target application; false loads the mapping file without any validations.

LOCATION – The location where the mapping rules should be loaded to.

For an example I am going to load the following account member mappings in csv format using the utility.


First the mapping file needs to be uploaded to the Data Management inbox using the utility.


Next the mappings are imported using the “importMapping” command.


A quick check in the UI confirms the mappings are definitely available for the specified location.


So with two commands the process of loading mappings can be automated.
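Putting it together, the end-to-end sequence might look something like this; the login details, file and location names are made up, and the inbox folder argument to “uploadfile” is my assumption:

epmautomate login user@domain password https://cloudinstance.oraclecloud.com mydomain
epmautomate uploadfile accountmap.csv inbox
epmautomate importmapping ACCOUNT inbox/accountmap.csv MERGE false DelimitedLoc
epmautomate logout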

To do this using a REST resource is an easy task as well; the resource is available for both EPM Cloud and FDMEE from 11.1.2.4.210.

The URL for the resource is the same as when running rules or batches and follows the format:

On-premise : http(s)://<webserver>:<port>/aif/rest/V1

Cloud: https://<cloud_instance>/aif/rest/V1

The resource uses a POST method and has a number of input parameters which are required to be in JSON format in the body of the request.

For an example I am going to import the mappings in the following csv file for all dimensions.


The mapping file has already been uploaded to the inbox directory.


In this example I am importing mappings to an on-premise FDMEE instance but it would be exactly the same for cloud.


The “jobType” is always “MAPPINGIMPORT” and the “jobName” will be either the name of the dimension or ALL; I think the rest of the parameters in the body of the POST should be self-explanatory and follow the same logic as the EPM Automate utility.

Once the request has been sent a response should be returned in JSON format which will contain information about the import mapping process.


The job status has been returned as running; to keep checking the status, a GET request can be made to the URL held in the href parameter.


If you take a look at process details in the FDMEE UI you can see that the import mapping process was indeed successful.



The imported mappings are now available in the data load mapping area.


As usual, I also like to convert this into the scripting world, and with a little bit of PowerShell the member mappings can be imported.
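A minimal sketch of the kind of script I mean is below; the payload property names mirror the EPM Automate parameters and, along with the jobs resource path, instance and credentials, should be treated as assumptions for illustration:

# placeholder instance and credentials for illustration only
$baseURL = "https://cloudinstance.oraclecloud.com/aif/rest/V1"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("user@domain:password"))
$headers = @{ "Authorization" = "Basic $auth" }
# build the JSON payload for the mapping import job
$payload = @{
    jobType        = "MAPPINGIMPORT"
    jobName        = "ALL"
    fileName       = "inbox/mappings.csv"
    importMode     = "MERGE"
    validationMode = "false"
    locationName   = "DelimitedLoc"
} | ConvertTo-Json
# POST the request to the jobs resource
$response = Invoke-RestMethod -Uri "$baseURL/jobs" -Method Post -Headers $headers -Body $payload -ContentType "application/json"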


To keep checking the status, the URL in the returned href parameter can be stored and then a request made using the URL.
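Carrying on from the sketch above, and assuming the response exposes the jobStatus and href properties shown earlier, the polling could look like:

# keep requesting the job status until it is no longer running
while ($response.jobStatus -eq "RUNNING") {
    Start-Sleep -Seconds 5
    $response = Invoke-RestMethod -Uri $response.href -Method Get -Headers $headers
}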


Hopefully you agree that it is not too difficult to script the process but feel free to use a scripting language you feel most comfortable with.

So let us now reverse the process and export mappings from FDMEE/Data Management, first with help from the EPM Automate utility.

The command to use this time is “exportmapping” and the syntax is similar to using the import functionality.

epmautomate exportmapping DIMENSION_NAME|ALL FILE_NAME LOCATION

DIMENSION_NAME|ALL - The dimension name for a specific dimension to export or ALL to export all dimensions

FILE_NAME - The file name and path to which the mappings are exported. The file format can be .CSV, .TXT, .XLS, or .XLSX.

LOCATION – The location where the mapping rules should be exported from.

So for this example let us export all mappings for the location “DelimitedLoc”.
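Using the syntax above, the command would be along these lines (the output file name is made up):

epmautomate exportmapping ALL outbox/delimitedloc_maps.csv DelimitedLoc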


The exported mapping file can then be downloaded.
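For example, with the “downloadfile” command (again the path is only illustrative):

epmautomate downloadfile outbox/delimitedloc_maps.csv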


Now we have local access to the exported mapping file.


To export mappings using REST, the concept is pretty much the same as when importing.

The resource has the same URL and POST method.

This time I am going to export account dimension mappings to a text file for the “ALLDATA_LOC” location.


The “jobType” will always be “MAPPINGEXPORT” and the remaining parameters shouldn’t need any explanation.

The returned response once again contains the job information.


Process details in the web UI will show that an export mapping process has been run.


If no path was included in the REST request, the exported mapping file will be available in the root directory.


Another new feature in FDMEE 11.1.2.4.210 is the ability to select the delimiter for the mapping export file.

The delimiter can be set at system, application or user setting level.


The options besides the default of comma are exclamation mark, semi-colon or pipe.


So if I select pipe and send the same REST call again the output mapping file should now honour that setting.


The only difference when importing from or exporting to Excel format is to change the file extension.


The mappings will then be in the Excel template format.


In scripting terms there is not much change to the import example I gave; a quick update to the payload parameters and you are ready to run an export.
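Under the same assumptions as the import sketch (placeholder instance and credentials, property names mirroring the EPM Automate parameters), the payload simply becomes:

# reuse $baseURL and $headers from the import sketch
$payload = @{
    jobType      = "MAPPINGEXPORT"
    jobName      = "ACCOUNT"
    fileName     = "outbox/accountmaps.txt"
    locationName = "ALLDATA_LOC"
} | ConvertTo-Json
$response = Invoke-RestMethod -Uri "$baseURL/jobs" -Method Post -Headers $headers -Body $payload -ContentType "application/json"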


On to the final command for today, and that is the ability to run predefined reports in FDMEE/Data Management; once a report has run it can be downloaded from the “outbox/reports” directory.

The EPM Automate utility command is “runDMReport” and has the following syntax:

epmautomate runDMReport REPORT_NAME PARAMETER=Value "Report Output Format=[PDF|HTML|XLS|XLSX]"

REPORT_NAME is the name of the report to be executed.

PARAMETER=Value are the report parameters if any and their values.

Report Output Format is the report output format and valid options are PDF, HTML, XLS, and XLSX.

The syntax for the EPM Automate utility can be generated by going to “Report Execution” in Data Management.

Select the report and then “Create Report Script” and a window will be displayed with the full command to run.


This can be copied and then executed using the utility; if a report has parameters then you can select them and they will be included in the generated command.


Using the first example, after copying and pasting the command the report can be generated.
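For example, the generated command will be something along these lines; the parameter name and value here are purely illustrative:

epmautomate runDMReport "Import Formats By Location" "Location=DelimitedLoc" "Report Output Format=PDF"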


Unfortunately, the report is named after the process number, which you are not informed of at generation time.

If you take a look in process details you can see the process id; that will be the name of the report, and the extension depends on the output format.

The report can then be downloaded using the “downloadfile” command.
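For example, if the process id happened to be 2168 and the output format PDF (both made up for illustration):

epmautomate downloadfile outbox/reports/2168.pdf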


The “Import Formats By Location” report is now available locally in PDF format.

Let us move on to the REST resource, which has the advantage of providing the name for the generated report.

For an example I am going to be running the “Dimension Map (Dimension)” report with the following parameters.

The URL and method for the resource are the same as in the previous examples; the above report translates into the following request.


The “jobType” is set as “REPORT” and “jobName” is the name of the report.

Just like with the other resources the response contains information about the job.


As the process is still running, the output filename is not available; the status can be repeatedly checked until it has been updated.


Once the process is complete the output filename is included in the response; the report can then be accessed from the “outbox/reports” directory.


Converting this into a script only requires a minimal amount of change to the previous example.
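A sketch under the same assumptions as before follows; the “jobType” and “jobName” values are as described above, but the format property and the shape of the parameters property are my assumptions rather than confirmed details:

# reuse $baseURL and $headers from the earlier sketches
$payload = @{
    jobType          = "REPORT"
    jobName          = "Dimension Map (Dimension)"
    reportFormatType = "PDF"
    parameters       = @{ "Location" = "ALLDATA_LOC"; "Dimension Name" = "ACCOUNT" }
} | ConvertTo-Json
$response = Invoke-RestMethod -Uri "$baseURL/jobs" -Method Post -Headers $headers -Body $payload -ContentType "application/json"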


The status can then be checked by making a GET request to the URL held in the href parameter.


The advantage of using the REST resource is the output filename is returned in the response which means the report can be accessed and then processed further if need be.

Well, I think that covers as much as I need to on this new functionality. Hopefully you found it useful, until next time…