Tuesday, 31 October 2017

FDMEE - building Essbase dimensions - Part 2

In the last part I covered a possible solution for building Essbase dimensions through FDMEE. It involved creating a custom application to allow metadata from a file to be loaded to FDMEE; this was then mapped and exported to a file, and finally a jython event script was called and the dimension was built using an Essbase load rule with help from the Java API.

In this post, I am going to carry on where I left off and go through an example using the same concept, but this time the source will be a relational database and the Essbase dimension will be built through a SQL load rule. This means there will be no file created in the export stage, as the dimension will be built directly from the FDMEE repository.

If you have not read the first part then I recommend doing so as I am going to go straight into the solution without repeating what was previously covered.

The idea is to load metadata from a relational source using the Universal Data Adaptor (UDA) available in FDMEE. I am not going to go into any detail on setting up the adaptor, as all the steps have been covered before by Francisco in a great three-part blog.

The source comprises a view which queries two tables; the tables are in parent/child format and hold member properties for a Measures and a Market dimension. Yes, the Essbase target is going to be the one and only Sample Basic.

I have included two dimensions (it could be more) just to demonstrate the possibility of loading multiple dimensions at once using an Essbase incremental dimension build.

The view does not contain all member properties, as it is only there to provide an example and not an exhaustive list. The properties are already in the correct format for Essbase, but it doesn't have to be that way, as the mapping stage in FDMEE could easily handle any transformations that are required.
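As a rough illustration, the source view might look something like the following; the table and column names here are made up for the example, and the set of properties would depend on what you want to build:

```sql
-- Hypothetical sketch of the source view: two parent/child tables,
-- one per dimension, unioned with a dimension name column
CREATE OR REPLACE VIEW DIM_METADATA_V AS
SELECT 'Measures' AS DIMENSION, PARENT, CHILD, STORAGE, CONSOLIDATION, FORMULA
FROM   MEASURES_METADATA
UNION ALL
SELECT 'Market', PARENT, CHILD, STORAGE, CONSOLIDATION, NULL
FROM   MARKET_METADATA;
```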

The view is included as part of the setup of the source adaptor in FDMEE.

Once the table definition has been imported the columns from the view are populated in the adaptor configuration.

The classification has been left blank for all columns as it is not required for this solution.

Now that the source has been defined, the target needs creating, which is done once again using a custom application.

Just like the example in the previous part, one of the properties needs to be assigned a target class of Account for the solution to work; it is not important which one, and the remaining ones can be set as Generic.

On to the import format, where the source is defined as the UDA and the target as the custom application.

The source columns from the view are then mapped to the target application.

You will notice I have the formula mapped to "Description 1"; this is because the UDx columns in the FDMEE database tables are limited to 80 characters and the description column is larger in size, so I have used that instead. The amount column has not been mapped, as it is not required and is going to be ignored.

There is nothing really to note about the location, as it just uses the default values with the import format selected.

There are a few things to point out with the data load rule. First, "Zero Balances" has been set to "Include"; this is because the source does not have any data values, so if this was set to "Exclude" then none of the metadata would be loaded to FDMEE.

In the target options, the property that defines whether to export to file has been set to "No", as there is no requirement for a file to be created; the dimensions are going to be built by directly querying an existing FDMEE view using Essbase SQL load rules.

In the custom options tab, a couple of values have been defined: one for the target Essbase application/database name and one for the name of a substitution variable which will hold the FDMEE load ID. This will become clearer later.

The majority of the data load mappings are set as like for like, as in this example there is no need for much transformation. However, because some of the member properties are null, I added a SQL mapping to convert them to a blank space; this is basically to make the validation stage successful and for visual aesthetics in the workbench. The spaces will be removed in the Essbase SQL load rules.

On to running the rule from the workbench.

With the help of the UDA and database view, the members and properties for both the Market and Measures dimensions have successfully been loaded to FDMEE.

Now that the members and properties have been mapped the next step is to build the Essbase dimensions using SQL load rules.

Before creating the load rule you need the SQL statement to retrieve the members and properties, luckily there are default views in the FDMEE database which make this easy.

There is a view called "AIF_HS_BALANCES" which returns mapped data for the latest FDMEE process ID. I would have used this view, but it does not include the description column, which I need because I am storing the member formula in it. There is a more detailed view available named "AIF_HS_BALANCES_DETAIL_V" which does include the column.

I created two load rules, one for each dimension using the parent/child build method.

The SQL in the rules is a simple select from the “AIF_HS_BALANCES_DETAIL_V” view, an example in the measures rule is:
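Something along these lines; the column-to-property assignments and the substitution variable name below are assumptions, as the actual columns depend on how the import format mapped the source:

```sql
-- Hypothetical sketch of the SQL in the measures load rule
SELECT TRIM(ENTITY)  AS PARENT,
       TRIM(ACCOUNT) AS CHILD,
       TRIM(UD1)     AS STORAGE,
       TRIM(UD2)     AS CONSOLIDATION,
       TRIM(DESC1)   AS FORMULA
FROM   AIF_HS_BALANCES_DETAIL_V
WHERE  LOADID = &FDMEE_LOADID
```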

The TRIM function removes the blank spaces that were created as part of the mapping in FDMEE; the load ID (process ID) is read from an Essbase substitution variable.

The value of this variable will be set in a jython event script that will also run the Essbase load rules.

The finished version of the load rule has all the columns returned from the SQL query mapped to the relevant dimension build properties.

The rule is created with the naming convention of the first four characters of the dimension name suffixed with "SQL"; this is so the rule name can be derived when building the dimension.
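The naming logic is trivial; as a sketch (the helper name is mine, not part of FDMEE):

```python
def rule_name(dimension):
    """Derive the SQL load rule name from a dimension name:
    first four characters plus the 'SQL' suffix,
    e.g. Measures -> MeasSQL, Market -> MarkSQL."""
    return dimension[:4] + "SQL"
```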

On to the final piece of the solution and that is the jython event script which will handle building the dimensions.

In the FDMEE process log there are references to jython event scripts that are called at different stages throughout the export process.

For example:

INFO  [AIF]: Executing the following script: <application_root_folder>/data/scripts/event/AftCheck.py

As stated in the previous blog post, the scripts are not there by default, so you may get a warning saying the script does not exist. If they don't exist, it is just a matter of creating the script in the event script directory and it will be executed the next time the process is run.

I am going to use the above event script to carry out the dimension builds using the Essbase Java API.

As usual I am not going to include and go through every line of code, as this is just an example of what can be achieved and it is really up to you to write the code to fit your requirements.

The basic idea of the script is to update the Essbase substitution variable with the current load ID, this is so the SQL query in the load rule will retrieve the correct dataset.

An incremental dimension build will be run for each dimension that has been loaded to FDMEE; an incremental build is being used for efficiency, so that all the dimensions can be built before restructuring the database.

The script does not contain any hardcoded passwords as the connection to Essbase is made using a generated SSO token and the credentials for the SQL load rule to connect to the FDMEE database are retrieved using the FDM API.

So let us break down the script into manageable chunks.

The above section imports the required Java classes; the majority of these are for the Essbase Java API, and the classes are accessible by default from FDMEE.

There is a SQL query defined against the FDMEE view which I mentioned earlier; this query will return each dimension to run a build for, based on the current load ID.

The rest of the above section should be self-explanatory as comments are included.

The next section executes the query to return each dimension and stores the results in a list.
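As a sketch of the result handling (the helper name is mine; in the real script the result set would come from something like fdmAPI.executeQuery):

```python
def fetch_dimensions(rs):
    """Walk a JDBC-style result set and collect the dimension names
    returned by the query for the current load ID."""
    dims = []
    while rs.next():  # JDBC cursors advance with next()
        dims.append(rs.getString(1))
    rs.close()
    return dims
```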

The custom options in the FDMEE load rule are retrieved and stored; the first option holds the Essbase target application/database and the second the name of the substitution variable which stores the current load ID.

A login to the Essbase server is made and the substitution variable is updated with the current load ID.

The final section starts the incremental dimension build and cycles through each dimension that requires a dim build.

A custom function is called which adds some additional logging to the process steps log.

The Essbase load rule name is generated from the first four characters of the dimension name suffixed with "SQL".

An incremental dimension build is then carried out by calling the Essbase data load for the current dimension.

If any errors exist, these are written to the current process log, and the dimension build error log is included as a download option in process details.

A custom function is called to update the process step with a warning or success status.

Once all the dimensions have been built, the incremental build is ended, meaning the database is restructured.
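Putting those steps together, the core of the event script might look something like the sketch below. This is a Jython outline rather than a drop-in script: the sign-on pattern is standard Essbase Java API usage, but the substitution variable and incremental build calls are placeholders, so check the IEssCube/IEssOlapServer javadoc for the exact method names in your release.

```python
# Jython sketch (placeholders marked) -- requires the Essbase Java API
from com.essbase.api.session import IEssbase

essHome = IEssbase.Home.create(IEssbase.JAPI_VERSION)
# Sign on using a generated SSO token rather than a stored password;
# the third argument flags that the password parameter is a token
olapSvr = essHome.signOn(essUser, ssoToken, True, None, apsUrl)
cube = olapSvr.getApplication(essApp).getCube(essDb)

# placeholder: update the substitution variable with the current
# load ID so the SQL in the load rules picks up the right records

cube.beginIncrementalBuildDim()        # defer the restructure
for dimension in dimensions:
    ruleName = dimension[:4] + "SQL"   # naming convention from earlier
    # placeholder: run the SQL load rule as an incremental dimension
    # build for this dimension, capturing any errors to an error file
cube.endIncrementalBuildDim()          # single restructure at the end

olapSvr.disconnect()
essHome.signOff()
```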

Before running the export step, let us look at the database outline; the Measures and Market dimensions are currently empty.

Run the export step.

The custom message indicates the dimension build was successful; process details also confirms that each dimension build step completed successfully.

A check back at the database outline and all looks good.

To demonstrate what happens with build errors I added an invalid record to the source database table and ran the import again.

The record is trying to load a parent to the wrong dimension which should generate an error.

After running the export again, process details shows a warning, and the process steps show a warning for the Market dimension build.

The dimension build error file can be downloaded from process details under "Output File".

The errors are also contained in the main process log.

If you combine this dimension build method with standard FDMEE data load functionality, you have a full solution for loading metadata and data to a target Essbase database.

I am not saying you must go about it in this way; just having a custom jython script to build Essbase dimensions could be perfectly acceptable. I wanted to include as much as I could in this solution to demonstrate what is achievable.

Until next time..