Saturday, 7 July 2018

EPM Cloud - Data Integration comes to the simplified interface

A new piece of functionality was added in the EPM Cloud 18.07 update that I thought I should cover.

The cloud readiness monthly update document introduced the new feature in the following way:

"In addition to the standard Data Management interface, the 18.07 update provides a new, simplified interface to work with integrations in the Oracle’s Planning and Budgeting Cloud Service. The new simplified interface, called Data Integration is the front-end for all integration-related activities. Using this interface, the workflow to create mapping rules to translate and transform source data into required target formats, and to execute and manage the periodic data loading process is now streamlined and simplified."

So basically, Data Management uses the legacy standard interface and Oracle's end game is to move everything into the simplified interface, so this is the first step in that process for Data Management.

What you will also notice is that the name is changing to Data Integration, which means for now we have on-premise FDMEE, Data Management and Data Integration. Just to add to the confusion, there is the unrelated Enterprise Data Management cloud service, which is probably why Oracle wants to move away from the Data Management name.

Before you get too excited about this release, it is early days and it is definitely nowhere near ready to replace Data Management.

Here is an excerpt from the documentation:

“Currently, Data Integrations is available as a preview version only for Planning and Oracle Enterprise Planning and Budgeting Cloud system administrators. The Data Integration component must be used in conjunction with Data Management to complete setup tasks such as registering source systems or target applications. Data Management is still fully supported and remains available as a menu selection on the Navigator menu.”


“Data Integration does not replace the legacy Data Management, it is an additional feature that supports the same workflow with a subset of legacy features. Data Integration will continue to be enhanced until it has one hundred per cent parity with Data Management.”

What we can extract from the above statements is that it is a preview version and only available for standard and enterprise PBCS. It also does not contain all the functionality, so the missing pieces still need to be handled in Data Management.

The areas that still need to be setup in Data Management are:
  • Register Source System
  • Register Target Application
  • Period Mapping
  • Category Mapping
There are some important terminology changes with Data Integration compared to Data Management.

I am sure this is going to cause some added confusion while both Data Management and Data Integration exist, plus the fact there is on-premise FDMEE.

It is worth going through the list of functionality that is not supported or available in this release, for now it is quite a big list but obviously this will change over time.
  • Only supported in standard and enterprise PBCS
  • Only available for Service Administrators
  • Location attributes like currency, check entities and groups cannot be defined.
  • Logic groups are not available.
  • Fixed length files are not supported
  • Import expressions must be entered manually.
  • In the workbench the following is unavailable:
    • Import/Validate/Export/Check workflow processes
    • Validation errors
    • Displays only dimensions in the target application and columns cannot be added
    • Drill to source
    • View mappings
    • Only the Source and Target view is available.
    • Import/Export to Excel
  • In map members (data load mappings):
    • Rule Name is replaced with Processing order
    • Mappings cannot be assigned to a specific integration (data load rule)
    • Exporting is not available
    • Mapping scripts are unavailable
    • Multi-dimensional mappings are visible as an option but cannot be defined.
  • Column headers for multi-period loads are unavailable
  • Scheduling is unavailable.
With all of the current limitations, maybe you are starting to understand that it is definitely a preview version.

I think it is about time we take the new functionality for a test drive; my objective is to create a simple file-based data load.

Before I start out I should point out that the issues I hit might only be because it is extremely new; you may not have the same experience, and if you do, I am sure any bugs will be ironed out over time.

You can access Data Integration through the navigator by selecting Data Exchange.

Alternatively, you can select the Application cluster and then Data Exchange.

This takes you to the Data Integration homepage where all the available integrations are displayed; you can also access Data Maps.

What I did experience is that the sort option did not work correctly, especially when sorting by last executed. Also, all the executions displayed a time of 12:00:00 AM.

It is nice though that all the integrations are available and can be executed or edited from this homepage.

Let us start out by creating a new integration by selecting the plus icon.

This opens an integration workflow where you are taken through the steps in defining an integration in a simplified way.

The general section allows you to define a name for the integration and either create a new location by typing in a name or select an existing location.

After providing a name for the integration and location, you can select the source icon, which then allows you to select from all the available sources.

As I am creating a file-based load I selected File which then opens a file browser window.

This allows you to select a file, create or delete folders, and upload a file. It operates in the same way as the file browser in Data Management, but I actually prefer this one in the simplified interface.

I opened the inbox and the browser displays the directory you are in.

Selecting inbox provides the option to create or delete a folder.

I selected to create a folder and provided a name.

Now the source file can be uploaded.

The file will then be displayed in the browser where you have the option to download or delete it.

After clicking OK, the source details are populated in the integration configuration.

Below the filename there is a file options button; selecting this opens a file import window, which is like the first part of creating an import format in Data Management.

This provides the option to select a different file and to set the file type and delimiter. These were automatically selected correctly and I didn’t need to change them, so they must be determined by Data Integration reading the file.

The options for type and delimiter match those in Data Management.

You also get to see a preview of the file, which is a nice feature, and you can select the header row to use as the column names.
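Data Integration presumably detects the type and delimiter by sniffing the start of the file. As a rough illustration of the idea (and certainly not Oracle's actual implementation), Python's standard library can do the same kind of detection; the sample file content below is made up:

```python
import csv

def detect_delimiter(sample: str) -> str:
    """Guess the delimiter of a delimited file from a sample of its content.

    Mimics the behaviour observed in Data Integration, which appears to
    work out the delimiter by reading the file.
    """
    sniffer = csv.Sniffer()
    # Restrict the guess to the delimiters Data Management supports
    dialect = sniffer.sniff(sample, delimiters=",;|\t")
    return dialect.delimiter

# Example: a hypothetical comma-delimited load file with a header row
sample = "Account,Entity,Product,Amount\n4110,100,P100,1000\n"
print(detect_delimiter(sample))            # ,
print(csv.Sniffer().has_header(sample))    # True
```

The `has_header` check is the same trick that would let the UI pre-tick the "use header as column names" option.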

The next screen is the file column mapping. If selected in the previous screen, the header in the file will populate the column names; it is possible to override the column names by simply entering new ones.

Moving on to the target.

Selecting Target allows you to select the target for the integration which provides the same look and feel as when selecting a source.

I selected the planning application, which completes the general section of the workflow.

Select “Save and Continue” to move on to the map dimensions section, which is equivalent to the source and target mappings when creating an import format.

Just like with an import format you can select a source dimension and map it to a target.

If you select the cog icon you have the following options.

This pretty much mirrors the ‘add’ functionality in the standard Data Management interface.

In the simplified interface you have to manually type any required expression, whereas in the standard interface you have the option to select an expression type from a dropdown.

Now that the map dimensions have been configured, save and continue can be selected.

For me, it didn’t continue in the workflow and after saving I had to select “Map Members”.

The map members section of the workflow is the equivalent of creating data load mappings.

There is the option to import mappings but not export.

So let’s add some simple mappings.

In the source you have the option to define the type of mapping; this differs from the standard interface, where there are tabs for the different mapping types.

The concept of the mapping types and order of precedence is exactly the same as in Data Management, it wouldn’t make any sense if the logic had changed.

You will see “Multi Dimensional” is in the list but I don’t believe you can define the mappings in this release.

There is a column for processing order, which replaces rule name in Data Management; it operates in the same way and defines the order of precedence within a mapping type based on an alphanumeric value.
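As a rough illustration of the precedence logic, here is a few lines of Python. The type order follows the documented Data Management precedence (Explicit first, Like last); the sample mappings themselves are made up for the example:

```python
# Order of precedence across mapping types in Data Management
# (Explicit is evaluated first, Like last). Within a type, the
# alphanumeric processing order (formerly rule name) decides
# which mapping applies first.
TYPE_PRECEDENCE = {"Explicit": 0, "Between": 1, "In": 2,
                   "Multi Dimension": 3, "Like": 4}

mappings = [  # hypothetical sample mappings
    {"type": "Like", "order": "B_catchall", "source": "*", "target": "No_Entity"},
    {"type": "Explicit", "order": "A_100", "source": "100", "target": "E100"},
    {"type": "Like", "order": "A_prefix", "source": "1*", "target": "E1000"},
]

# Sort into the order the mappings would be evaluated in
evaluated = sorted(mappings,
                   key=lambda m: (TYPE_PRECEDENCE[m["type"]], m["order"]))
for m in evaluated:
    print(m["type"], m["order"], "->", m["target"])
```

The explicit mapping wins regardless of where it sits in the list, and the two Like mappings are ordered by their processing order values.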

Now this is the point where I started to hit issues: even though I saved the mappings for each dimension, when I returned to them they were blank.

Update: the following issues were identified as a bug and have been fixed in the 18.12 release.

When I got to the next part of the workflow to define the options I could not select a category or plan type.

The options section of the workflow is the equivalent to the options available in a data load rule in Data Management.

When I went back and edited the integration I received the following error message when I tried to save.

I went into Data Management and could see the import format had been created but there was no location.

I tried to create a location with the same name as the one in the simplified interface and was hit with another error.

I now have a location that does exist, but I can’t see it in the standard interface and it doesn’t seem to save properly in the simplified interface.

I did have a couple of attempts at it and hit the same problem; maybe it was because I was trying to create a location from the simplified interface instead of using an existing one. Once I get the opportunity I will look at it in more detail and update this post if anything changes.

The integrations I created do appear in the Data Integration homepage.

Though unless I am missing something, I don’t seem to be able to delete them in the simplified interface and they don’t appear in Data Management so I am stuck with them.

Instead of dwelling on this problem as I might have just been unlucky, I decided to create an integration in Data Management and then edit it in Data Integration.

The integration is available from the Data Integration homepage.

Selecting Actions provides the following options for the integration:

I selected “Map Members” and this time the mappings were available.

You will notice that a multi-dimensional mapping is displayed with a value of #SCRIPT; in this release, even though it is an option, it is not possible to fully define a multi-dimensional mapping in the simplified interface.

In the options section of the workflow, the category and plan type were now available and I could update them to other available ones if needed.

The filename and directory are also displayed which didn’t happen when I configured the integration through the simplified interface.

From the Data Integration homepage, the integration can be run.

This provides similar options to running a load rule in Data Management in the standard interface.

A confirmation window is displayed and provides the process ID.

As I mentioned earlier, the time of the last execution is not correct and for some reason is displayed as 12:00:00 AM.

It is possible to view the process details from the actions menu.

Process details in the simplified interface is basically a cut-down version of the one in Data Management; at least the execution times are correct and the process log can be downloaded.
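If you want to kick off and monitor integrations without opening either interface, the existing Data Management REST API can be scripted against. Here is a minimal Python sketch of the run-and-poll pattern; the base URL, rule name, and period values are placeholders, and you should check Oracle's REST API documentation for the exact payload your version expects:

```python
import json
import urllib.request

# Placeholder instance URL - substitute your own EPM Cloud instance
BASE_URL = "https://instance.oraclecloud.com/aif/rest/V1"

def run_payload(rule, start_period, end_period):
    """Build the job payload used to execute a data load rule."""
    return {
        "jobType": "DATARULE",
        "jobName": rule,
        "startPeriod": start_period,
        "endPeriod": end_period,
        "importMode": "REPLACE",
        "exportMode": "STORE_DATA",
        "fileName": "",
    }

def run_rule(rule, start_period, end_period, auth_header):
    """POST the job; the response includes the process ID."""
    req = urllib.request.Request(
        BASE_URL + "/jobs",
        data=json.dumps(run_payload(rule, start_period, end_period)).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": auth_header},
        method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def job_status(job_id, auth_header):
    """GET the status of a previously submitted job."""
    req = urllib.request.Request(
        f"{BASE_URL}/jobs/{job_id}",
        headers={"Authorization": auth_header})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The process ID returned from the POST is the same one shown in the confirmation window and in process details, so a script can poll `job_status` until the run completes.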

The workbench is an extremely simplified version of the one in Data Management.

I only seemed to be able to change the period, and selecting the icons did not do anything, so at the moment it looks like they are just there to indicate whether an import/validate/export/check has been run.

An annoyance for me was that filtering on an integration in the homepage – something you will do if you have lots of integrations – did not stick; if I went to any section in Data Integration and returned to the homepage, the filter had been removed.

For instance, if I filtered down on the integration I was interested in.

I then opened the workbench and closed it; the filter was gone even though I was still technically in Data Integration.

Another thing I noticed: even though I was active in Data Integration, I would still get session expiration warnings.

As stated earlier, it is a long way off parity with Data Management and there are lots of improvements and functionality additions that need to happen. At the moment I could only see using it to run integrations or to quickly access process details without having to open Data Management.

I am sure this will change over time and no doubt I will be back with further updates.


Suresh said...

Thanks for the detailed summary of the test you have performed and its outcome. I just want to add two points.

1. You will not be able to delete newly created or existing Data Load Rules in the Data Integration option. As per the development team, the delete option is not available yet.

2. In case you are unable to see the newly created data load rule in the Data Management UI, please perform the below action:

Go to the options page (the last page in the setup wizard) and make sure to assign a category and plan type; it will then show in the old UI.

John Goodwin said...

Hi Suresh,

On point 1, I know it is preview but it should have been implemented with an option to delete.

On point 2, unfortunately this is not working for me, I think because it has not saved and created a location in Data Management, if I edit the integrations and go to options I get the following error
"Import Format not available in current rule! Please save import format to the rule to work on Options."

If I go to map dimensions and try to save the import format, I get the following error:
"Status Message : Invalid Input Saving Integration."

So I still am unable to delete the integrations.

I really don't feel that in its current state it is ready to move to production, but that is just my opinion.